Codebase list matrix-synapse / c1e3497
Merge branch 'debian/unstable' into debian/buster-backports Andrej Shadura 3 years ago
156 changed file(s) with 5277 addition(s) and 1754 deletion(s).
99
1010 export LANG="C.UTF-8"
1111
12 # Prevent virtualenv from auto-updating pip to an incompatible version
13 export VIRTUALENV_NO_DOWNLOAD=1
14
1215 exec tox -e py35-old,combine
0 Synapse 1.27.0 (2021-02-16)
1 ===========================
2
3 Note that this release includes a change in Synapse to use Redis as a cache, as well as a pub/sub mechanism, if Redis support is enabled for workers. No action is needed by server administrators, and we do not expect resource usage of the Redis instance to change dramatically.
4
5 This release also changes the callback URI for OpenID Connect (OIDC) identity providers. If your server is configured to use single sign-on via an OIDC/OAuth2 IdP, you may need to make configuration changes. Please review [UPGRADE.rst](UPGRADE.rst) for more details on these changes.
6
7 This release also changes escaping of variables in the HTML templates for SSO or email notifications. If you have customised these templates, please review [UPGRADE.rst](UPGRADE.rst) for more details on these changes.
8
9
10 Bugfixes
11 --------
12
13 - Fix building Docker images for armv7. ([\#9405](https://github.com/matrix-org/synapse/issues/9405))
14
15
16 Synapse 1.27.0rc2 (2021-02-11)
17 ==============================
18
19 Features
20 --------
21
22 - Further improvements to the user experience of registration via single sign-on. ([\#9297](https://github.com/matrix-org/synapse/issues/9297))
23
24
25 Bugfixes
26 --------
27
28 - Fix ratelimiting introduced in v1.27.0rc1 for invites to respect the `ratelimit` flag on application services. ([\#9302](https://github.com/matrix-org/synapse/issues/9302))
29 - Do not automatically calculate `public_baseurl` since it can be wrong in some situations. Reverts behaviour introduced in v1.26.0. ([\#9313](https://github.com/matrix-org/synapse/issues/9313))
30
31
32 Improved Documentation
33 ----------------------
34
35 - Clarify the sample configuration for changes made to the template loading code. ([\#9310](https://github.com/matrix-org/synapse/issues/9310))
36
37
38 Synapse 1.27.0rc1 (2021-02-02)
39 ==============================
40
41 Features
42 --------
43
44 - Add an admin API for getting and deleting forward extremities for a room. ([\#9062](https://github.com/matrix-org/synapse/issues/9062))
45 - Add an admin API for retrieving the current room state of a room. ([\#9168](https://github.com/matrix-org/synapse/issues/9168))
46 - Add experimental support for allowing clients to pick an SSO Identity Provider ([MSC2858](https://github.com/matrix-org/matrix-doc/pull/2858)). ([\#9183](https://github.com/matrix-org/synapse/issues/9183), [\#9242](https://github.com/matrix-org/synapse/issues/9242))
47 - Add an admin API endpoint for shadow-banning users. ([\#9209](https://github.com/matrix-org/synapse/issues/9209))
48 - Add ratelimits to the 3PID `/requestToken` APIs. ([\#9238](https://github.com/matrix-org/synapse/issues/9238))
49 - Add support to the OpenID Connect integration for adding the user's email address. ([\#9245](https://github.com/matrix-org/synapse/issues/9245))
50 - Add ratelimits to invites in rooms and to specific users. ([\#9258](https://github.com/matrix-org/synapse/issues/9258))
51 - Improve the user experience of setting up an account via single-sign on. ([\#9262](https://github.com/matrix-org/synapse/issues/9262), [\#9272](https://github.com/matrix-org/synapse/issues/9272), [\#9275](https://github.com/matrix-org/synapse/issues/9275), [\#9276](https://github.com/matrix-org/synapse/issues/9276), [\#9277](https://github.com/matrix-org/synapse/issues/9277), [\#9286](https://github.com/matrix-org/synapse/issues/9286), [\#9287](https://github.com/matrix-org/synapse/issues/9287))
52 - Add phone home stats for encrypted messages. ([\#9283](https://github.com/matrix-org/synapse/issues/9283))
53 - Update the redirect URI for OIDC authentication. ([\#9288](https://github.com/matrix-org/synapse/issues/9288))
54
55
56 Bugfixes
57 --------
58
59 - Fix spurious errors in logs when deleting a non-existent pusher. ([\#9121](https://github.com/matrix-org/synapse/issues/9121))
60 - Fix a long-standing bug where Synapse would return a 500 error when a thumbnail did not exist (and auto-generation of thumbnails was not enabled). ([\#9163](https://github.com/matrix-org/synapse/issues/9163))
61 - Fix a long-standing bug where an internal server error was raised when attempting to preview an HTML document in an unknown character encoding. ([\#9164](https://github.com/matrix-org/synapse/issues/9164))
62 - Fix a long-standing bug where invalid data could cause errors when calculating the presentable room name for push. ([\#9165](https://github.com/matrix-org/synapse/issues/9165))
63 - Fix bug where we sometimes didn't detect that Redis connections had died, causing workers to not see new data. ([\#9218](https://github.com/matrix-org/synapse/issues/9218))
64 - Fix a bug where `None` was passed to Synapse modules instead of an empty dictionary if an empty module `config` block was provided in the homeserver config. ([\#9229](https://github.com/matrix-org/synapse/issues/9229))
65 - Fix a bug in the `make_room_admin` admin API where it failed if the admin with the greatest power level was not in the room. Contributed by Pankaj Yadav. ([\#9235](https://github.com/matrix-org/synapse/issues/9235))
66 - Prevent password hashes from getting dropped if a client failed threepid validation during a User Interactive Auth stage. Removes a workaround for an ancient bug in Riot Web <v0.7.4. ([\#9265](https://github.com/matrix-org/synapse/issues/9265))
67 - Fix single-sign-on when the endpoints are routed to synapse workers. ([\#9271](https://github.com/matrix-org/synapse/issues/9271))
68
69
70 Improved Documentation
71 ----------------------
72
73 - Add docs for using Gitea as OpenID provider. ([\#9134](https://github.com/matrix-org/synapse/issues/9134))
74 - Add link to Matrix VoIP tester for turn-howto. ([\#9135](https://github.com/matrix-org/synapse/issues/9135))
75 - Add notes on integrating with Facebook for SSO login. ([\#9244](https://github.com/matrix-org/synapse/issues/9244))
76
77
78 Deprecations and Removals
79 -------------------------
80
81 - The `service_url` parameter in `cas_config` is deprecated in favor of `public_baseurl`. ([\#9199](https://github.com/matrix-org/synapse/issues/9199))
82 - Add new endpoint `/_synapse/client/saml2` for SAML2 authentication callbacks, and deprecate the old endpoint `/_matrix/saml2`. ([\#9289](https://github.com/matrix-org/synapse/issues/9289))
83
84
85 Internal Changes
86 ----------------
87
88 - Add tests to `test_user.UsersListTestCase` for List Users Admin API. ([\#9045](https://github.com/matrix-org/synapse/issues/9045))
89 - Various improvements to the federation client. ([\#9129](https://github.com/matrix-org/synapse/issues/9129))
90 - Speed up chain cover calculation when persisting a batch of state events at once. ([\#9176](https://github.com/matrix-org/synapse/issues/9176))
91 - Add a `long_description_type` to the package metadata. ([\#9180](https://github.com/matrix-org/synapse/issues/9180))
92 - Speed up batch insertion when using PostgreSQL. ([\#9181](https://github.com/matrix-org/synapse/issues/9181), [\#9188](https://github.com/matrix-org/synapse/issues/9188))
93 - Emit an error at startup if different Identity Providers are configured with the same `idp_id`. ([\#9184](https://github.com/matrix-org/synapse/issues/9184))
94 - Improve performance of concurrent use of `StreamIDGenerators`. ([\#9190](https://github.com/matrix-org/synapse/issues/9190))
95 - Add some missing source directories to the automatic linting script. ([\#9191](https://github.com/matrix-org/synapse/issues/9191))
96 - Precompute joined hosts and store in Redis. ([\#9198](https://github.com/matrix-org/synapse/issues/9198), [\#9227](https://github.com/matrix-org/synapse/issues/9227))
97 - Clean-up template loading code. ([\#9200](https://github.com/matrix-org/synapse/issues/9200))
98 - Fix the Python 3.5 old dependencies build. ([\#9217](https://github.com/matrix-org/synapse/issues/9217))
99 - Update `isort` to v5.7.0 to bypass a bug where it would disagree with `black` about formatting. ([\#9222](https://github.com/matrix-org/synapse/issues/9222))
100 - Add type hints to handlers code. ([\#9223](https://github.com/matrix-org/synapse/issues/9223), [\#9232](https://github.com/matrix-org/synapse/issues/9232))
101 - Fix Debian package building on Ubuntu 16.04 LTS (Xenial). ([\#9254](https://github.com/matrix-org/synapse/issues/9254))
102 - Minor performance improvement during TLS handshake. ([\#9255](https://github.com/matrix-org/synapse/issues/9255))
103 - Refactor the generation of summary text for email notifications. ([\#9260](https://github.com/matrix-org/synapse/issues/9260))
104 - Restore PyPy compatibility by not calling CPython-specific GC methods when under PyPy. ([\#9270](https://github.com/matrix-org/synapse/issues/9270))
105
106
0107 Synapse 1.26.0 (2021-01-27)
1108 ===========================
2109
8383 # replace `1.3.0` and `stretch` accordingly:
8484 wget https://packages.matrix.org/debian/pool/main/m/matrix-synapse-py3/matrix-synapse-py3_1.3.0+stretch1_amd64.deb
8585 dpkg -i matrix-synapse-py3_1.3.0+stretch1_amd64.deb
86
87 Upgrading to v1.27.0
88 ====================
89
90 Changes to callback URI for OAuth2 / OpenID Connect
91 ---------------------------------------------------
92
93 This version changes the URI used for callbacks from OAuth2 identity providers. If
94 your server is configured for single sign-on via an OpenID Connect or OAuth2 identity
95 provider, you will need to add ``[synapse public baseurl]/_synapse/client/oidc/callback``
96 to the list of permitted "redirect URIs" at the identity provider.
97
98 See `docs/openid.md <docs/openid.md>`_ for more information on setting up OpenID
99 Connect.
100
101 (Note: a similar change is being made for SAML2; in this case the old URI
102 ``[synapse public baseurl]/_matrix/saml2`` is being deprecated, but will continue to
103 work, so no immediate changes are required for existing installations.)
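
As a minimal sketch (not Synapse code; the helper name is hypothetical), the
new callback URI can be derived from the server's public base URL like so:

.. code-block:: python

    from urllib.parse import urljoin

    def oidc_callback_uri(public_baseurl: str) -> str:
        # Ensure a trailing slash so urljoin appends to the path
        # instead of replacing its last component.
        if not public_baseurl.endswith("/"):
            public_baseurl += "/"
        return urljoin(public_baseurl, "_synapse/client/oidc/callback")

    print(oidc_callback_uri("https://matrix.example.com"))

This is the value to add to the list of permitted redirect URIs at the
identity provider.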
104
105 Changes to HTML templates
106 -------------------------
107
108 The HTML templates for SSO and email notifications now have `Jinja2's autoescape <https://jinja.palletsprojects.com/en/2.11.x/api/#autoescaping>`_
109 enabled for files ending in ``.html``, ``.htm``, and ``.xml``. If you have customised
110 these templates and see issues when viewing them, you might need to update them.
111 It is expected that most configurations will need no changes.
112
113 If you have customised the *names* of these templates, it is recommended
114 to verify that they end in ``.html`` to ensure autoescape is enabled.
115
116 The above applies to the following templates:
117
118 * ``add_threepid.html``
119 * ``add_threepid_failure.html``
120 * ``add_threepid_success.html``
121 * ``notice_expiry.html``
123 * ``notif_mail.html`` (which, by default, includes ``room.html`` and ``notif.html``)
124 * ``password_reset.html``
125 * ``password_reset_confirmation.html``
126 * ``password_reset_failure.html``
127 * ``password_reset_success.html``
128 * ``registration.html``
129 * ``registration_failure.html``
130 * ``registration_success.html``
131 * ``sso_account_deactivated.html``
132 * ``sso_auth_bad_user.html``
133 * ``sso_auth_confirm.html``
134 * ``sso_auth_success.html``
135 * ``sso_error.html``
136 * ``sso_login_idp_picker.html``
137 * ``sso_redirect_confirm.html``
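
To illustrate what autoescaping does to a template variable, here is a
standard-library sketch (Synapse uses Jinja2's autoescape, but the effect on a
substituted variable is the same kind of HTML entity escaping):

.. code-block:: python

    from html import escape

    # A malicious display name that would otherwise be injected as markup:
    payload = '<script>alert("hi")</script>'
    print(escape(payload))

If a customised template previously escaped variables by hand, they will now
be escaped twice, which is the main kind of breakage to look for.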
86138
87139 Upgrading to v1.26.0
88140 ====================
197249
198250 return {"localpart": localpart}
199251
252 Removal of historical Synapse Admin API
201253 ---------------------------------------
202254
203255 Historically, the Synapse Admin API has been accessible under:
0 matrix-synapse (1.27.0-1) unstable; urgency=medium
1
2 * New upstream release.
3
4 -- Andrej Shadura <andrewsh@debian.org> Wed, 17 Feb 2021 17:36:17 +0100
5
6 matrix-synapse (1.26.0-3) unstable; urgency=medium
7
8 * Downgrade the level of sandboxing to prevent confusion.
9 Strict sandboxing enabled by default caused issues for users running
10 appservices: https://github.com/matrix-org/synapse/issues/9327
11 It’s best to leave it enabled but relax a bit and let users opt in
12 for stricter protection if they wish.
13
14 -- Andrej Shadura <andrewsh@debian.org> Tue, 09 Feb 2021 23:25:50 +0100
15
16 matrix-synapse (1.26.0-2) unstable; urgency=medium
17
18 * Make psycopg2 a dependency.
19
20 -- Andrej Shadura <andrewsh@debian.org> Thu, 04 Feb 2021 11:21:55 +0100
21
022 matrix-synapse (1.26.0-1~bpo10+3) buster-backports; urgency=medium
123
224 * Downgrade the level of sandboxing to prevent confusion.
2727 libwebp-dev \
2828 libxml++2.6-dev \
2929 libxslt1-dev \
30 rustc \
3031 zlib1g-dev \
3132 && rm -rf /var/lib/apt/lists/*
3233
3334 # Build dependencies that are not available as wheels, to speed up rebuilds
3435 RUN pip install --prefix="/install" --no-warn-script-location \
36 cryptography \
3537 frozendict \
3638 jaeger-client \
3739 opentracing \
2626 wget
2727
2828 # fetch and unpack the package
29 # TODO: Upgrade to 1.2.2 once xenial is dropped
2930 RUN mkdir /dh-virtualenv
3031 RUN wget -q -O /dh-virtualenv.tar.gz https://github.com/spotify/dh-virtualenv/archive/ac6e1b1.tar.gz
3132 RUN tar -xv --strip-components=1 -C /dh-virtualenv -f /dh-virtualenv.tar.gz
88 * [Response](#response)
99 * [Undoing room shutdowns](#undoing-room-shutdowns)
1010 - [Make Room Admin API](#make-room-admin-api)
11 - [Forward Extremities Admin API](#forward-extremities-admin-api)
1112
1213 # List Room API
1314
366367 }
367368 ```
368369
370 # Room State API
371
372 The Room State admin API allows server admins to get a list of all state events in a room.
373
374 The response includes the following fields:
375
376 * `state` - The current state of the room at the time of request.
377
378 ## Usage
379
380 A standard request:
381
382 ```
383 GET /_synapse/admin/v1/rooms/<room_id>/state
384
385 {}
386 ```
387
388 Response:
389
390 ```json
391 {
392 "state": [
393 {"type": "m.room.create", "state_key": "", "etc": true},
394 {"type": "m.room.power_levels", "state_key": "", "etc": true},
395 {"type": "m.room.name", "state_key": "", "etc": true}
396 ]
397 }
398 ```
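
As a hedged client-side sketch (not part of Synapse; the helper is
hypothetical), note that room IDs contain characters such as `!` and `:` that
must be percent-encoded when placed in the URL path:

```python
from urllib.parse import quote

def room_state_url(base_url: str, room_id: str) -> str:
    # safe='' forces '!' and ':' in the room ID to be percent-encoded.
    return f"{base_url}/_synapse/admin/v1/rooms/{quote(room_id, safe='')}/state"

print(room_state_url("https://matrix.example.com", "!room:example.com"))
```

The request itself must carry an admin access token, as with the other admin
APIs.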
399
369400 # Delete Room API
370401
371402 The Delete Room admin API allows server admins to remove rooms from server
510541 "user_id": "@foo:example.com"
511542 }
512543 ```
544
545 # Forward Extremities Admin API
546
547 Enables querying and deleting forward extremities from rooms. When a lot of forward
548 extremities accumulate in a room, performance can become degraded. For details, see
549 [#1760](https://github.com/matrix-org/synapse/issues/1760).
550
551 ## Check for forward extremities
552
553 To check the status of forward extremities for a room:
554
555 ```
556 GET /_synapse/admin/v1/rooms/<room_id_or_alias>/forward_extremities
557 ```
558
559 The following response will be returned:
560
561 ```json
562 {
563 "count": 1,
564 "results": [
565 {
566 "event_id": "$M5SP266vsnxctfwFgFLNceaCo3ujhRtg_NiiHabcdefgh",
567 "state_group": 439,
568 "depth": 123,
569 "received_ts": 1611263016761
570 }
571 ]
572 }
573 ```
574
575 ## Deleting forward extremities
576
577 **WARNING**: Please ensure you know what you're doing and have read
578 the related issue [#1760](https://github.com/matrix-org/synapse/issues/1760).
579 Under no circumstances should this API be executed as an automated maintenance task!
580
581 If a room has many forward extremities, the excess can be
582 deleted as follows:
583
584 ```
585 DELETE /_synapse/admin/v1/rooms/<room_id_or_alias>/forward_extremities
586 ```
587
588 The following response will be returned, indicating the number of forward
589 extremities that were deleted:
590
591 ```json
592 {
593 "deleted": 1
594 }
595 ```
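
For illustration, the check response shown earlier can be inspected like this
(a sketch using that example payload; per the warning above, any deletion
should remain a deliberate, manual step):

```python
import json

# The example response from the check endpoint above.
response_body = '''
{
  "count": 1,
  "results": [
    {
      "event_id": "$M5SP266vsnxctfwFgFLNceaCo3ujhRtg_NiiHabcdefgh",
      "state_group": 439,
      "depth": 123,
      "received_ts": 1611263016761
    }
  ]
}
'''
data = json.loads(response_body)
print("forward extremities:", data["count"])
for ext in data["results"]:
    print(ext["event_id"], "at depth", ext["depth"])
```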
759759 - ``total`` - integer - Number of pushers.
760760
761761 See also `Client-Server API Spec <https://matrix.org/docs/spec/client_server/latest#get-matrix-client-r0-pushers>`_
762
763 Shadow-banning users
764 ====================
765
766 Shadow-banning is a useful tool for moderating malicious or egregiously abusive users.
767 A shadow-banned user receives successful responses to their client-server API requests,
768 but the events are not propagated into rooms. This can be an effective tool as it
769 (hopefully) takes the user longer to realise they are being moderated before
770 they pivot to another account.
771
772 Shadow-banning a user should be used as a tool of last resort and may lead to confusing
773 or broken behaviour for the client. A shadow-banned user will not receive any
774 notification and it is generally more appropriate to ban or kick abusive users.
775 A shadow-banned user will be unable to contact anyone on the server.
776
777 The API is::
778
779 POST /_synapse/admin/v1/users/<user_id>/shadow_ban
780
781 To use it, you will need to authenticate by providing an ``access_token`` for a
782 server admin: see `README.rst <README.rst>`_.
783
784 An empty JSON dict is returned.
785
786 **Parameters**
787
788 The following parameters should be set in the URL:
789
790 - ``user_id`` - The fully qualified MXID: for example, ``@user:server.com``. The user must
791 be local.
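
A caller might sanity-check the ``user_id`` parameter before issuing the
request; this is an illustrative sketch, not Synapse code:

.. code-block:: python

    def is_local_mxid(user_id: str, server_name: str) -> bool:
        # An MXID has the form @localpart:server; the shadow-ban API
        # only accepts users on this homeserver.
        return user_id.startswith("@") and user_id.endswith(":" + server_name)

    assert is_local_mxid("@user:server.com", "server.com")
    assert not is_local_mxid("@user:other.example", "server.com")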
4343
4444 To enable the OpenID integration, you should then add a section to the `oidc_providers`
4545 setting in your configuration file (or uncomment one of the existing examples).
46 See [sample_config.yaml](./sample_config.yaml) for some sample settings, as well as
4747 the text below for example configurations for specific providers.
4848
4949 ## Sample configs
5151 Here are a few configs for providers that should work with Synapse.
5252
5353 ### Microsoft Azure Active Directory
54 Azure AD can act as an OpenID Connect Provider. Register a new application under
5555 *App registrations* in the Azure AD management console. The RedirectURI for your
56 application should point to your matrix server: `[synapse public baseurl]/_synapse/oidc/callback`
57
58 Go to *Certificates & secrets* and register a new client secret. Make note of your
56 application should point to your matrix server:
57 `[synapse public baseurl]/_synapse/client/oidc/callback`
58
59 Go to *Certificates & secrets* and register a new client secret. Make note of your
5960 Directory (tenant) ID as it will be used in the Azure links.
6061 Edit your Synapse config file and change the `oidc_config` section:
6162
9394 - id: synapse
9495 secret: secret
9596 redirectURIs:
96 - '[synapse public baseurl]/_synapse/oidc/callback'
97 - '[synapse public baseurl]/_synapse/client/oidc/callback'
9798 name: 'Synapse'
9899 ```
99100
117118 ```
118119 ### [Keycloak][keycloak-idp]
119120
121 [Keycloak][keycloak-idp] is an open-source IdP maintained by Red Hat.
121122
122123 Follow the [Getting Started Guide](https://www.keycloak.org/getting-started) to install Keycloak and set up a realm.
123124
139140 | Enabled | `On` |
140141 | Client Protocol | `openid-connect` |
141142 | Access Type | `confidential` |
142 | Valid Redirect URIs | `[synapse public baseurl]/_synapse/oidc/callback` |
143 | Valid Redirect URIs | `[synapse public baseurl]/_synapse/client/oidc/callback` |
143144
144145 5. Click `Save`
145146 6. On the Credentials tab, update the fields:
167168 ### [Auth0][auth0]
168169
169170 1. Create a regular web application for Synapse
170 2. Set the Allowed Callback URLs to `[synapse public baseurl]/_synapse/oidc/callback`
171 2. Set the Allowed Callback URLs to `[synapse public baseurl]/_synapse/client/oidc/callback`
171172 3. Add a rule to add the `preferred_username` claim.
172173 <details>
173174 <summary>Code sample</summary>
193194
194195 ```yaml
195196 oidc_providers:
197 - idp_id: auth0
197198 idp_name: Auth0
198199 issuer: "https://your-tier.eu.auth0.com/" # TO BE FILLED
199200 client_id: "your-client-id" # TO BE FILLED
216217 does not return a `sub` property, an alternative `subject_claim` has to be set.
217218
218219 1. Create a new OAuth application: https://github.com/settings/applications/new.
219 2. Set the callback URL to `[synapse public baseurl]/_synapse/oidc/callback`.
220 2. Set the callback URL to `[synapse public baseurl]/_synapse/client/oidc/callback`.
220221
221222 Synapse config:
222223
224225 oidc_providers:
225226 - idp_id: github
226227 idp_name: Github
228 idp_brand: "org.matrix.github" # optional: styling hint for clients
227229 discover: false
228230 issuer: "https://github.com/"
229231 client_id: "your-client-id" # TO BE FILLED
249251 oidc_providers:
250252 - idp_id: google
251253 idp_name: Google
254 idp_brand: "org.matrix.google" # optional: styling hint for clients
252255 issuer: "https://accounts.google.com/"
253256 client_id: "your-client-id" # TO BE FILLED
254257 client_secret: "your-client-secret" # TO BE FILLED
259262 display_name_template: "{{ user.name }}"
260263 ```
261264 4. Back in the Google console, add this Authorized redirect URI: `[synapse
262 public baseurl]/_synapse/oidc/callback`.
265 public baseurl]/_synapse/client/oidc/callback`.
263266
264267 ### Twitch
265268
266269 1. Setup a developer account on [Twitch](https://dev.twitch.tv/)
267270 2. Obtain the OAuth 2.0 credentials by [creating an app](https://dev.twitch.tv/console/apps/)
268 3. Add this OAuth Redirect URL: `[synapse public baseurl]/_synapse/oidc/callback`
271 3. Add this OAuth Redirect URL: `[synapse public baseurl]/_synapse/client/oidc/callback`
269272
270273 Synapse config:
271274
287290
288291 1. Create a [new application](https://gitlab.com/profile/applications).
289292 2. Add the `read_user` and `openid` scopes.
290 3. Add this Callback URL: `[synapse public baseurl]/_synapse/oidc/callback`
293 3. Add this Callback URL: `[synapse public baseurl]/_synapse/client/oidc/callback`
291294
292295 Synapse config:
293296
295298 oidc_providers:
296299 - idp_id: gitlab
297300 idp_name: Gitlab
301 idp_brand: "org.matrix.gitlab" # optional: styling hint for clients
298302 issuer: "https://gitlab.com/"
299303 client_id: "your-client-id" # TO BE FILLED
300304 client_secret: "your-client-secret" # TO BE FILLED
306310 localpart_template: '{{ user.nickname }}'
307311 display_name_template: '{{ user.name }}'
308312 ```
313
314 ### Facebook
315
316 Like Github, Facebook provides a custom OAuth2 API rather than an OIDC-compliant
317 one, so it requires a little more configuration.
318
319 0. You will need a Facebook developer account. You can register for one
320 [here](https://developers.facebook.com/async/registration/).
321 1. On the [apps](https://developers.facebook.com/apps/) page of the developer
322 console, "Create App", and choose "Build Connected Experiences".
323 2. Once the app is created, add "Facebook Login" and choose "Web". You don't
324 need to go through the whole form here.
325 3. In the left-hand menu, open "Products"/"Facebook Login"/"Settings".
326 * Add `[synapse public baseurl]/_synapse/client/oidc/callback` as an OAuth Redirect
327 URL.
328 4. In the left-hand menu, open "Settings/Basic". Here you can copy the "App ID"
329 and "App Secret" for use below.
330
331 Synapse config:
332
333 ```yaml
334 - idp_id: facebook
335 idp_name: Facebook
336 idp_brand: "org.matrix.facebook" # optional: styling hint for clients
337 discover: false
338 issuer: "https://facebook.com"
339 client_id: "your-client-id" # TO BE FILLED
340 client_secret: "your-client-secret" # TO BE FILLED
341 scopes: ["openid", "email"]
342 authorization_endpoint: https://facebook.com/dialog/oauth
343 token_endpoint: https://graph.facebook.com/v9.0/oauth/access_token
344 user_profile_method: "userinfo_endpoint"
345 userinfo_endpoint: "https://graph.facebook.com/v9.0/me?fields=id,name,email,picture"
346 user_mapping_provider:
347 config:
348 subject_claim: "id"
349 display_name_template: "{{ user.name }}"
350 ```
351
352 Relevant documents:
353 * https://developers.facebook.com/docs/facebook-login/manually-build-a-login-flow
354 * Using Facebook's Graph API: https://developers.facebook.com/docs/graph-api/using-graph-api/
355 * Reference to the User endpoint: https://developers.facebook.com/docs/graph-api/reference/user
356
357 ### Gitea
358
359 Gitea is, like Github, not an OpenID provider, but just an OAuth2 provider.
360
361 The [`/user` API endpoint](https://try.gitea.io/api/swagger#/user/userGetCurrent)
362 can be used to retrieve information on the authenticated user. As the Synapse
363 login mechanism needs an attribute to uniquely identify users, and that endpoint
364 does not return a `sub` property, an alternative `subject_claim` has to be set.
365
366 1. Create a new application.
367 2. Add this Callback URL: `[synapse public baseurl]/_synapse/client/oidc/callback`
368
369 Synapse config:
370
371 ```yaml
372 oidc_providers:
373 - idp_id: gitea
374 idp_name: Gitea
375 discover: false
376 issuer: "https://your-gitea.com/"
377 client_id: "your-client-id" # TO BE FILLED
378 client_secret: "your-client-secret" # TO BE FILLED
379 client_auth_method: client_secret_post
380 scopes: [] # Gitea doesn't support Scopes
381 authorization_endpoint: "https://your-gitea.com/login/oauth/authorize"
382 token_endpoint: "https://your-gitea.com/login/oauth/access_token"
383 userinfo_endpoint: "https://your-gitea.com/api/v1/user"
384 user_mapping_provider:
385 config:
386 subject_claim: "id"
387 localpart_template: "{{ user.login }}"
388 display_name_template: "{{ user.full_name }}"
389 ```
7272 # reverse proxy, this should be the URL to reach Synapse via the proxy.
7373 # Otherwise, it should be the URL to reach Synapse's client HTTP listener (see
7474 # 'listeners' below).
75 #
76 # If this is left unset, it defaults to 'https://<server_name>/'. (Note that
77 # that will not work unless you configure Synapse or a reverse-proxy to listen
78 # on port 443.)
7975 #
8076 #public_baseurl: https://example.com/
8177
823819 # users are joining rooms the server is already in (this is cheap) vs
824820 # "remote" for when users are trying to join rooms not on the server (which
825821 # can be more expensive)
822 # - one for ratelimiting how often a user or IP can attempt to validate a 3PID.
823 # - two for ratelimiting how often invites can be sent in a room or to a
824 # specific user.
826825 #
827826 # The defaults are as shown below.
828827 #
856855 # remote:
857856 # per_second: 0.01
858857 # burst_count: 3
859
858 #
859 #rc_3pid_validation:
860 # per_second: 0.003
861 # burst_count: 5
862 #
863 #rc_invites:
864 # per_room:
865 # per_second: 0.3
866 # burst_count: 10
867 # per_user:
868 # per_second: 0.003
869 # burst_count: 5
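# (Illustrative note, not part of the upstream sample: with the per_user
# defaults above, a user can send a burst of 5 invites at once, after which
# capacity refills at 0.003 invites per second, i.e. roughly one invite
# every five and a half minutes.)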
860870
861871 # Ratelimiting settings for incoming federation
862872 #
11541164 # send an email to the account's email address with a renewal link. By
11551165 # default, no such emails are sent.
11561166 #
1157 # If you enable this setting, you will also need to fill out the 'email'
1158 # configuration section. You should also check that 'public_baseurl' is set
1159 # correctly.
1167 # If you enable this setting, you will also need to fill out the 'email' and
1168 # 'public_baseurl' configuration sections.
11601169 #
11611170 #renew_at: 1w
11621171
12471256 # The identity server which we suggest that clients should use when users log
12481257 # in on this server.
12491258 #
1250 # (By default, no suggestion is made, so it is left up to the client.)
1259 # (By default, no suggestion is made, so it is left up to the client.
1260 # This setting is ignored unless public_baseurl is also set.)
12511261 #
12521262 #default_identity_server: https://matrix.org
12531263
12711281 # Servers handling the these requests must answer the `/requestToken` endpoints defined
12721282 # by the Matrix Identity Service API specification:
12731283 # https://matrix.org/docs/spec/identity_service/latest
1284 #
1285 # If a delegate is specified, the config option public_baseurl must also be filled out.
12741286 #
12751287 account_threepid_delegates:
12761288 #email: https://example.com # Delegate email sending to example.com
15511563 # enable SAML login.
15521564 #
15531565 # Once SAML support is enabled, a metadata file will be exposed at
1554 # https://<server>:<port>/_matrix/saml2/metadata.xml, which you may be able to
1566 # https://<server>:<port>/_synapse/client/saml2/metadata.xml, which you may be able to
15551567 # use to configure your SAML IdP with. Alternatively, you can manually configure
15561568 # the IdP to use an ACS location of
1557 # https://<server>:<port>/_matrix/saml2/authn_response.
1569 # https://<server>:<port>/_synapse/client/saml2/authn_response.
15581570 #
15591571 saml2_config:
15601572 # `sp_config` is the configuration for the pysaml2 Service Provider.
17261738 # offer the user a choice of login mechanisms.
17271739 #
17281740 # idp_icon: An optional icon for this identity provider, which is presented
1729 # by identity picker pages. If given, must be an MXC URI of the format
1730 # mxc://<server-name>/<media-id>. (An easy way to obtain such an MXC URI
1731 # is to upload an image to an (unencrypted) room and then copy the "url"
1732 # from the source of the event.)
1741 # by clients and Synapse's own IdP picker page. If given, must be an
1742 # MXC URI of the format mxc://<server-name>/<media-id>. (An easy way to
1743 # obtain such an MXC URI is to upload an image to an (unencrypted) room
1744 # and then copy the "url" from the source of the event.)
1745 #
1746 # idp_brand: An optional brand for this identity provider, allowing clients
1747 # to style the login flow according to the identity provider in question.
1748 # See the spec for possible options here.
17331749 #
17341750 # discover: set to 'false' to disable the use of the OIDC discovery mechanism
17351751 # to discover endpoints. Defaults to true.
17901806 #
17911807 # For the default provider, the following settings are available:
17921808 #
1793 # sub: name of the claim containing a unique identifier for the
1794 # user. Defaults to 'sub', which OpenID Connect compliant
1795 # providers should provide.
1809 # subject_claim: name of the claim containing a unique identifier
1810 # for the user. Defaults to 'sub', which OpenID Connect
1811 # compliant providers should provide.
17961812 #
17971813 # localpart_template: Jinja2 template for the localpart of the MXID.
17981814 # If this is not set, the user will be prompted to choose their
1799 # own username.
1815 # own username (see 'sso_auth_account_details.html' in the 'sso'
1816 # section of this file).
18001817 #
18011818 # display_name_template: Jinja2 template for the display name to set
18021819 # on first login. If unset, no displayname will be set.
1820 #
1821 # email_template: Jinja2 template for the email address of the user.
1822 # If unset, no email address will be added to the account.
18031823 #
18041824 # extra_attributes: a map of Jinja2 templates for extra attributes
18051825 # to send back to the client during login.
18361856 # userinfo_endpoint: "https://accounts.example.com/userinfo"
18371857 # jwks_uri: "https://accounts.example.com/.well-known/jwks.json"
18381858 # skip_verification: true
1859 # user_mapping_provider:
1860 # config:
1861 # subject_claim: "id"
1862 # localpart_template: "{ user.login }"
1863 # display_name_template: "{ user.name }"
1864 # email_template: "{ user.email }"
18391865
18401866 # For use with Keycloak
18411867 #
18501876 #
18511877 #- idp_id: github
18521878 # idp_name: Github
1879 # idp_brand: org.matrix.github
18531880 # discover: false
18541881 # issuer: "https://github.com/"
18551882 # client_id: "your-client-id" # TO BE FILLED
18771904 #
18781905 #server_url: "https://cas-server.com"
18791906
1880 # The public URL of the homeserver.
1881 #
1882 #service_url: "https://homeserver.domain.com:8448"
1883
18841907 # The attribute of the CAS response to use as the display name.
18851908 #
18861909 # If unset, no displayname will be set.
19121935 # phishing attacks from evil.site. To avoid this, include a slash after the
19131936 # hostname: "https://my.client/".
19141937 #
1915 # The login fallback page (used by clients that don't natively support the
1916 # required login flows) is automatically whitelisted in addition to any URLs
1917 # in this list.
1938 # If public_baseurl is set, then the login fallback page (used by clients
1939 # that don't natively support the required login flows) is whitelisted in
1940 # addition to any URLs in this list.
19181941 #
19191942 # By default, this list is empty.
19201943 #
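The prefix-matching pitfall described above can be sketched in Python (a hypothetical helper, not Synapse's actual implementation):

```python
def is_whitelisted(url: str, whitelist: list) -> bool:
    # Whitelist entries behave as simple URL prefixes.
    return any(url.startswith(entry) for entry in whitelist)

# Without a trailing slash, a look-alike hostname also matches:
print(is_whitelisted("https://my.client.evil.site/", ["https://my.client"]))   # True
# With the slash, only the intended client matches:
print(is_whitelisted("https://my.client.evil.site/", ["https://my.client/"]))  # False
print(is_whitelisted("https://my.client/path", ["https://my.client/"]))        # True
```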
19351958 #
19361959 # When rendering, this template is given the following variables:
19371960 # * redirect_url: the URL that the user will be redirected to after
1938 # login. Needs manual escaping (see
1939 # https://jinja.palletsprojects.com/en/2.11.x/templates/#html-escaping).
1961 # login.
19401962 #
19411963 # * server_name: the homeserver's name.
19421964 #
19431965 # * providers: a list of available Identity Providers. Each element is
19441966 # an object with the following attributes:
1967 #
19451968 # * idp_id: unique identifier for the IdP
19461969 # * idp_name: user-facing name for the IdP
1970 # * idp_icon: if specified in the IdP config, an MXC URI for an icon
1971 # for the IdP
1972 # * idp_brand: if specified in the IdP config, a textual identifier
1973 # for the brand of the IdP
19471974 #
19481975 # The rendered HTML page should contain a form which submits its results
19491976 # back as a GET request, with the following query parameters:
19531980 #
19541981 # * idp: the 'idp_id' of the chosen IDP.
19551982 #
1983 # * HTML page to prompt new users to enter a userid and confirm other
1984 # details: 'sso_auth_account_details.html'. This is only shown if the
1985 # SSO implementation (with any user_mapping_provider) does not return
1986 # a localpart.
1987 #
1988 # When rendering, this template is given the following variables:
1989 #
1990 # * server_name: the homeserver's name.
1991 #
1992 # * idp: details of the SSO Identity Provider that the user logged in
1993 # with: an object with the following attributes:
1994 #
1995 # * idp_id: unique identifier for the IdP
1996 # * idp_name: user-facing name for the IdP
1997 # * idp_icon: if specified in the IdP config, an MXC URI for an icon
1998 # for the IdP
1999 # * idp_brand: if specified in the IdP config, a textual identifier
2000 # for the brand of the IdP
2001 #
2002 # * user_attributes: an object containing details about the user that
2003 # we received from the IdP. May have the following attributes:
2004 #
2005 # * display_name: the user's display_name
2006 # * emails: a list of email addresses
2007 #
2008 # The template should render a form which submits the following fields:
2009 #
2010 # * username: the localpart of the user's chosen user id
2011 #
2012 # * HTML page allowing the user to consent to the server's terms and
2013 # conditions. This is only shown for new users, and only if
2014 # `user_consent.require_at_registration` is set.
2015 #
2016 # When rendering, this template is given the following variables:
2017 #
2018 # * server_name: the homeserver's name.
2019 #
2020 # * user_id: the user's proposed matrix ID.
2021 #
2022 # * user_profile.display_name: the user's proposed display name, if any.
2023 #
2024 # * consent_version: the version of the terms that the user will be
2025 # shown
2026 #
2027 # * terms_url: a link to the page showing the terms.
2028 #
2029 # The template should render a form which submits the following fields:
2030 #
2031 # * accepted_version: the version of the terms accepted by the user
2032 # (ie, 'consent_version' from the input variables).
2033 #
19562034 # * HTML page for a confirmation step before redirecting back to the client
19572035 # with the login token: 'sso_redirect_confirm.html'.
19582036 #
1959 # When rendering, this template is given three variables:
1960 # * redirect_url: the URL the user is about to be redirected to. Needs
1961 # manual escaping (see
1962 # https://jinja.palletsprojects.com/en/2.11.x/templates/#html-escaping).
2037 # When rendering, this template is given the following variables:
2038 #
2039 # * redirect_url: the URL the user is about to be redirected to.
19632040 #
19642041 # * display_url: the same as `redirect_url`, but with the query
19652042 # parameters stripped. The intention is to have a
19662043 # human-readable URL to show to users, not to use it as
1967 # the final address to redirect to. Needs manual escaping
1968 # (see https://jinja.palletsprojects.com/en/2.11.x/templates/#html-escaping).
2044 # the final address to redirect to.
19692045 #
19702046 # * server_name: the homeserver's name.
2047 #
2048 # * new_user: a boolean indicating whether this is the user's first time
2049 # logging in.
2050 #
2051 # * user_id: the user's matrix ID.
2052 #
2053 # * user_profile.avatar_url: an MXC URI for the user's avatar, if any.
2054 # None if the user has not set an avatar.
2055 #
2056 # * user_profile.display_name: the user's display name. None if the user
2057 # has not set a display name.
19712058 #
19722059 # * HTML page which notifies the user that they are authenticating to confirm
19732060 # an operation on their account during the user interactive authentication
19742061 # process: 'sso_auth_confirm.html'.
19752062 #
19762063 # When rendering, this template is given the following variables:
1977 # * redirect_url: the URL the user is about to be redirected to. Needs
1978 # manual escaping (see
1979 # https://jinja.palletsprojects.com/en/2.11.x/templates/#html-escaping).
2064 # * redirect_url: the URL the user is about to be redirected to.
19802065 #
19812066 # * description: the operation which the user is being asked to confirm
2067 #
2068 # * idp: details of the Identity Provider that we will use to confirm
2069 # the user's identity: an object with the following attributes:
2070 #
2071 # * idp_id: unique identifier for the IdP
2072 # * idp_name: user-facing name for the IdP
2073 # * idp_icon: if specified in the IdP config, an MXC URI for an icon
2074 # for the IdP
2075 # * idp_brand: if specified in the IdP config, a textual identifier
2076 # for the brand of the IdP
19822077 #
19832078 # * HTML page shown after a successful user interactive authentication session:
19842079 # 'sso_auth_success.html'.
231231
232232 (Understanding the output is beyond the scope of this document!)
233233
234 * You can test your Matrix homeserver TURN setup with https://test.voip.librepush.net/.
235 Note that this test is not fully reliable yet, so don't be discouraged if
236 the test fails.
237 The source code of the tester is available [on GitHub](https://github.com/matrix-org/voip-tester),
238 where you can file bug reports.
239
234240 * There is a WebRTC test tool at
235241 https://webrtc.github.io/samples/src/content/peerconnection/trickle-ice/. To
236242 use it, you will need a username/password for your TURN server. You can
3838 which relays replication commands between processes. This can give a significant
3939 cpu saving on the main process and will be a prerequisite for upcoming
4040 performance improvements.
41
42 If Redis support is enabled Synapse will use it as a shared cache, as well as a
43 pub/sub mechanism.
4144
4245 See the [Architectural diagram](#architectural-diagram) section at the end for
4346 a visualisation of what this looks like.
224227 ^/_matrix/client/(api/v1|r0|unstable)/joined_groups$
225228 ^/_matrix/client/(api/v1|r0|unstable)/publicised_groups$
226229 ^/_matrix/client/(api/v1|r0|unstable)/publicised_groups/
227 ^/_synapse/client/password_reset/email/submit_token$
228230
229231 # Registration/login requests
230232 ^/_matrix/client/(api/v1|r0|unstable)/login$
255257 to use SSO (you only need to include the ones for whichever SSO provider you're
256258 using):
257259
260 # for all SSO providers
261 ^/_matrix/client/(api/v1|r0|unstable)/login/sso/redirect
262 ^/_synapse/client/pick_idp$
263 ^/_synapse/client/pick_username
264 ^/_synapse/client/new_user_consent$
265 ^/_synapse/client/sso_register$
266
258267 # OpenID Connect requests.
259 ^/_matrix/client/(api/v1|r0|unstable)/login/sso/redirect$
260 ^/_synapse/oidc/callback$
268 ^/_synapse/client/oidc/callback$
261269
262270 # SAML requests.
263 ^/_matrix/client/(api/v1|r0|unstable)/login/sso/redirect$
264 ^/_matrix/saml2/authn_response$
271 ^/_synapse/client/saml2/authn_response$
265272
266273 # CAS requests.
267 ^/_matrix/client/(api/v1|r0|unstable)/login/(cas|sso)/redirect$
268274 ^/_matrix/client/(api/v1|r0|unstable)/login/cas/ticket$
275
276 Ensure that all SSO logins go to a single process.
277 If the SSO endpoints are spread across multiple workers, logins can fail; see
278 [#7530](https://github.com/matrix-org/synapse/issues/7530).
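For illustration, the endpoint patterns above are ordinary regular expressions, and a reverse proxy must route every matching path to that one process. A hedged sketch of the matching (pattern list abridged):

```python
import re

# A few of the SSO endpoint patterns listed above (abridged).
SSO_WORKER_PATTERNS = [
    r"^/_matrix/client/(api/v1|r0|unstable)/login/sso/redirect",
    r"^/_synapse/client/pick_idp$",
    r"^/_synapse/client/oidc/callback$",
    r"^/_synapse/client/saml2/authn_response$",
]

def is_sso_path(path: str) -> bool:
    # Every request matching one of these patterns must reach
    # the same worker process.
    return any(re.match(pattern, path) for pattern in SSO_WORKER_PATTERNS)

print(is_sso_path("/_synapse/client/oidc/callback"))         # True
print(is_sso_path("/_matrix/client/r0/login/sso/redirect"))  # True
print(is_sso_path("/_matrix/client/r0/sync"))                # False
```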
269279
270280 Note that an HTTP listener with `client` and `federation` resources must be
271281 configured in the `worker_listeners` option in the worker config.
272
273 Ensure that all SSO logins go to a single process (usually the main process).
274 For multiple workers not handling the SSO endpoints properly, see
275 [#7530](https://github.com/matrix-org/synapse/issues/7530).
276282
277283 #### Load balancing
278284
2222 synapse/events/validator.py,
2323 synapse/events/spamcheck.py,
2424 synapse/federation,
25 synapse/handlers/_base.py,
26 synapse/handlers/account_data.py,
27 synapse/handlers/account_validity.py,
28 synapse/handlers/admin.py,
29 synapse/handlers/appservice.py,
30 synapse/handlers/auth.py,
31 synapse/handlers/cas_handler.py,
32 synapse/handlers/deactivate_account.py,
33 synapse/handlers/device.py,
34 synapse/handlers/devicemessage.py,
35 synapse/handlers/directory.py,
36 synapse/handlers/events.py,
37 synapse/handlers/federation.py,
38 synapse/handlers/identity.py,
39 synapse/handlers/initial_sync.py,
40 synapse/handlers/message.py,
41 synapse/handlers/oidc_handler.py,
42 synapse/handlers/pagination.py,
43 synapse/handlers/password_policy.py,
44 synapse/handlers/presence.py,
45 synapse/handlers/profile.py,
46 synapse/handlers/read_marker.py,
47 synapse/handlers/receipts.py,
48 synapse/handlers/register.py,
49 synapse/handlers/room.py,
50 synapse/handlers/room_list.py,
51 synapse/handlers/room_member.py,
52 synapse/handlers/room_member_worker.py,
53 synapse/handlers/saml_handler.py,
54 synapse/handlers/sso.py,
55 synapse/handlers/sync.py,
56 synapse/handlers/user_directory.py,
57 synapse/handlers/ui_auth,
25 synapse/handlers,
5826 synapse/http/client.py,
5927 synapse/http/federation/matrix_federation_agent.py,
6028 synapse/http/federation/well_known_resolver.py,
193161
194162 [mypy-hiredis]
195163 ignore_missing_imports = True
164
165 [mypy-josepy.*]
166 ignore_missing_imports = True
167
168 [mypy-txacme.*]
169 ignore_missing_imports = True
7979 # then lint everything!
8080 if [[ -z ${files+x} ]]; then
8181 # Lint all source code files and directories
82 files=("synapse" "tests" "scripts-dev" "scripts" "contrib" "synctl" "setup.py" "synmark")
82 # Note: this list aims to mirror the one in tox.ini
83 files=("synapse" "docker" "tests" "scripts-dev" "scripts" "contrib" "synctl" "setup.py" "synmark" "stubs" ".buildkite")
8384 fi
8485 fi
8586
9595 #
9696 # We pin black so that our tests don't start failing on new releases.
9797 CONDITIONAL_REQUIREMENTS["lint"] = [
98 "isort==5.0.3",
98 "isort==5.7.0",
9999 "black==19.10b0",
100100 "flake8-comprehensions",
101101 "flake8",
120120 include_package_data=True,
121121 zip_safe=False,
122122 long_description=long_description,
123 long_description_content_type="text/x-rst",
123124 python_requires="~=3.5",
124125 classifiers=[
125126 "Development Status :: 5 - Production/Stable",
1414
1515 """Contains *incomplete* type hints for txredisapi.
1616 """
17
18 from typing import List, Optional, Type, Union
17 from typing import Any, List, Optional, Type, Union
1918
2019 class RedisProtocol:
2120 def publish(self, channel: str, message: bytes): ...
21 async def ping(self) -> None: ...
22 async def set(
23 self,
24 key: str,
25 value: Any,
26 expire: Optional[int] = None,
27 pexpire: Optional[int] = None,
28 only_if_not_exists: bool = False,
29 only_if_exists: bool = False,
30 ) -> None: ...
31 async def get(self, key: str) -> Any: ...
2232
23 class SubscriberProtocol:
33 class SubscriberProtocol(RedisProtocol):
2434 def __init__(self, *args, **kwargs): ...
2535 password: Optional[str]
2636 def subscribe(self, channels: Union[str, List[str]]): ...
3949 convertNumbers: bool = ...,
4050 ) -> RedisProtocol: ...
4151
42 class SubscriberFactory:
43 def buildProtocol(self, addr): ...
44
4552 class ConnectionHandler: ...
4653
4754 class RedisFactory:
4855 continueTrying: bool
4956 handler: RedisProtocol
57 pool: List[RedisProtocol]
58 replyTimeout: Optional[int]
5059 def __init__(
5160 self,
5261 uuid: str,
5968 replyTimeout: Optional[int] = None,
6069 convertNumbers: Optional[int] = True,
6170 ): ...
71 def buildProtocol(self, addr) -> RedisProtocol: ...
72
73 class SubscriberFactory(RedisFactory):
74 def __init__(self): ...
4747 except ImportError:
4848 pass
4949
50 __version__ = "1.26.0"
50 __version__ = "1.27.0"
5151
5252 if bool(os.environ.get("SYNAPSE_TEST_PATCH_LOG_CONTEXTS", False)):
5353 # We import here so that we don't have to install a bunch of deps when
4141 """
4242 if hs_config.form_secret is None:
4343 raise ConfigError("form_secret not set in config")
44 if hs_config.public_baseurl is None:
45 raise ConfigError("public_baseurl not set in config")
4446
4547 self._hmac_secret = hs_config.form_secret.encode("utf-8")
4648 self._public_baseurl = hs_config.public_baseurl
1515 import gc
1616 import logging
1717 import os
18 import platform
1819 import signal
1920 import socket
2021 import sys
338339 # rest of time. Doing so means less work each GC (hopefully).
339340 #
340341 # This only works on Python 3.7
341 if sys.version_info >= (3, 7):
342 if platform.python_implementation() == "CPython" and sys.version_info >= (3, 7):
342343 gc.collect()
343344 gc.freeze()
344345
2121 from typing_extensions import ContextManager
2222
2323 from twisted.internet import address
24 from twisted.web.resource import IResource
2425
2526 import synapse
2627 import synapse.events
8990 ToDeviceStream,
9091 )
9192 from synapse.rest.admin import register_servlets_for_media_repo
92 from synapse.rest.client.v1 import events, room
93 from synapse.rest.client.v1 import events, login, room
9394 from synapse.rest.client.v1.initial_sync import InitialSyncRestServlet
94 from synapse.rest.client.v1.login import LoginRestServlet
9595 from synapse.rest.client.v1.profile import (
9696 ProfileAvatarURLRestServlet,
9797 ProfileDisplaynameRestServlet,
126126 from synapse.rest.client.versions import VersionsRestServlet
127127 from synapse.rest.health import HealthResource
128128 from synapse.rest.key.v2 import KeyApiV2Resource
129 from synapse.rest.synapse.client import build_synapse_client_resource_tree
129130 from synapse.server import HomeServer, cache_in_self
130131 from synapse.storage.databases.main.censor_events import CensorEventsStore
131132 from synapse.storage.databases.main.client_ips import ClientIpWorkerStore
506507 site_tag = port
507508
508509 # We always include a health resource.
509 resources = {"/health": HealthResource()}
510 resources = {"/health": HealthResource()} # type: Dict[str, IResource]
510511
511512 for res in listener_config.http_options.resources:
512513 for name in res.names:
516517 resource = JsonResource(self, canonical_json=False)
517518
518519 RegisterRestServlet(self).register(resource)
519 LoginRestServlet(self).register(resource)
520 login.register_servlets(self, resource)
520521 ThreepidRestServlet(self).register(resource)
521522 DevicesRestServlet(self).register(resource)
522523 KeyQueryServlet(self).register(resource)
556557 groups.register_servlets(self, resource)
557558
558559 resources.update({CLIENT_API_PREFIX: resource})
560
561 resources.update(build_synapse_client_resource_tree(self))
559562 elif name == "federation":
560563 resources.update({FEDERATION_PREFIX: TransportLayerServer(self)})
561564 elif name == "media":
5959 from synapse.rest.admin import AdminRestResource
6060 from synapse.rest.health import HealthResource
6161 from synapse.rest.key.v2 import KeyApiV2Resource
62 from synapse.rest.synapse.client.pick_idp import PickIdpResource
63 from synapse.rest.synapse.client.pick_username import pick_username_resource
62 from synapse.rest.synapse.client import build_synapse_client_resource_tree
6463 from synapse.rest.well_known import WellKnownResource
6564 from synapse.server import HomeServer
6665 from synapse.storage import DataStore
189188 "/_matrix/client/versions": client_resource,
190189 "/.well-known/matrix/client": WellKnownResource(self),
191190 "/_synapse/admin": AdminRestResource(self),
192 "/_synapse/client/pick_username": pick_username_resource(self),
193 "/_synapse/client/pick_idp": PickIdpResource(self),
191 **build_synapse_client_resource_tree(self),
194192 }
195193 )
196
197 if self.get_config().oidc_enabled:
198 from synapse.rest.oidc import OIDCResource
199
200 resources["/_synapse/oidc"] = OIDCResource(self)
201
202 if self.get_config().saml2_enabled:
203 from synapse.rest.saml2 import SAML2Resource
204
205 resources["/_matrix/saml2"] = SAML2Resource(self)
206194
207195 if self.get_config().threepid_behaviour_email == ThreepidBehaviour.LOCAL:
208196 from synapse.rest.synapse.client.password_reset import (
9292
9393 stats["daily_active_users"] = await hs.get_datastore().count_daily_users()
9494 stats["monthly_active_users"] = await hs.get_datastore().count_monthly_users()
95 daily_active_e2ee_rooms = await hs.get_datastore().count_daily_active_e2ee_rooms()
96 stats["daily_active_e2ee_rooms"] = daily_active_e2ee_rooms
97 stats["daily_e2ee_messages"] = await hs.get_datastore().count_daily_e2ee_messages()
98 daily_sent_e2ee_messages = await hs.get_datastore().count_daily_sent_e2ee_messages()
99 stats["daily_sent_e2ee_messages"] = daily_sent_e2ee_messages
95100 stats["daily_active_rooms"] = await hs.get_datastore().count_daily_active_rooms()
96101 stats["daily_messages"] = await hs.get_datastore().count_daily_messages()
102 daily_sent_messages = await hs.get_datastore().count_daily_sent_messages()
103 stats["daily_sent_messages"] = daily_sent_messages
97104
98105 r30_results = await hs.get_datastore().count_r30_users()
99106 for name, count in r30_results.items():
100107 stats["r30_users_" + name] = count
101108
102 daily_sent_messages = await hs.get_datastore().count_daily_sent_messages()
103 stats["daily_sent_messages"] = daily_sent_messages
104109 stats["cache_factor"] = hs.config.caches.global_factor
105110 stats["event_cache_size"] = hs.config.caches.event_cache_size
106111
1717 import argparse
1818 import errno
1919 import os
20 import time
21 import urllib.parse
2220 from collections import OrderedDict
2321 from hashlib import sha256
2422 from textwrap import dedent
25 from typing import Any, Callable, Iterable, List, MutableMapping, Optional
23 from typing import Any, Iterable, List, MutableMapping, Optional
2624
2725 import attr
2826 import jinja2
2927 import pkg_resources
3028 import yaml
29
30 from synapse.util.templates import _create_mxc_to_http_filter, _format_ts_filter
3131
3232
3333 class ConfigError(Exception):
202202 with open(file_path) as file_stream:
203203 return file_stream.read()
204204
205 def read_template(self, filename: str) -> jinja2.Template:
206 """Load a template file from disk.
207
208 This function will attempt to load the given template from the default Synapse
209 template directory.
210
211 Files read are treated as Jinja templates. The template is not rendered yet
212 and has autoescape enabled.
213
214 Args:
215 filename: A template filename to read.
216
217 Raises:
218 ConfigError: if the file's path is incorrect or otherwise cannot be read.
219
220 Returns:
221 A jinja2 template.
222 """
223 return self.read_templates([filename])[0]
224
205225 def read_templates(
206 self,
207 filenames: List[str],
208 custom_template_directory: Optional[str] = None,
209 autoescape: bool = False,
226 self, filenames: List[str], custom_template_directory: Optional[str] = None,
210227 ) -> List[jinja2.Template]:
211228 """Load a list of template files from disk.
212229
214231 template directory. If `custom_template_directory` is supplied, that directory
215232 is tried first.
216233
217 Files read are treated as Jinja templates. These templates are not rendered yet.
234 Files read are treated as Jinja templates. The templates are not rendered yet
235 and have autoescape enabled.
218236
219237 Args:
220238 filenames: A list of template filenames to read.
222240 custom_template_directory: A directory to try to look for the templates
223241 before using the default Synapse template directory instead.
224242
225 autoescape: Whether to autoescape variables before inserting them into the
226 template.
227
228243 Raises:
229244 ConfigError: if the file's path is incorrect or otherwise cannot be read.
230245
231246 Returns:
232247 A list of jinja2 templates.
233248 """
234 templates = []
235249 search_directories = [self.default_template_dir]
236250
237251 # The loader will first look in the custom template directory (if specified) for the
247261 # Search the custom template directory as well
248262 search_directories.insert(0, custom_template_directory)
249263
264 # TODO: switch to synapse.util.templates.build_jinja_env
250265 loader = jinja2.FileSystemLoader(search_directories)
251 env = jinja2.Environment(loader=loader, autoescape=autoescape)
266 env = jinja2.Environment(loader=loader, autoescape=jinja2.select_autoescape(),)
252267
253268 # Update the environment with our custom filters
254269 env.filters.update(
258273 }
259274 )
260275
261 for filename in filenames:
262 # Load the template
263 template = env.get_template(filename)
264 templates.append(template)
265
266 return templates
267
268
269 def _format_ts_filter(value: int, format: str):
270 return time.strftime(format, time.localtime(value / 1000))
271
272
273 def _create_mxc_to_http_filter(public_baseurl: str) -> Callable:
274 """Create and return a jinja2 filter that converts MXC urls to HTTP
275
276 Args:
277 public_baseurl: The public, accessible base URL of the homeserver
278 """
279
280 def mxc_to_http_filter(value, width, height, resize_method="crop"):
281 if value[0:6] != "mxc://":
282 return ""
283
284 server_and_media_id = value[6:]
285 fragment = None
286 if "#" in server_and_media_id:
287 server_and_media_id, fragment = server_and_media_id.split("#", 1)
288 fragment = "#" + fragment
289
290 params = {"width": width, "height": height, "method": resize_method}
291 return "%s_matrix/media/v1/thumbnail/%s?%s%s" % (
292 public_baseurl,
293 server_and_media_id,
294 urllib.parse.urlencode(params),
295 fragment or "",
296 )
297
298 return mxc_to_http_filter
276 # Load the templates
277 return [env.get_template(filename) for filename in filenames]
299278
300279
301280 class RootConfig:
88 consent_config,
99 database,
1010 emailconfig,
11 experimental,
1112 groups,
1213 jwt_config,
1314 key,
1718 password_auth_providers,
1819 push,
1920 ratelimiting,
21 redis,
2022 registration,
2123 repository,
2224 room_directory,
4749
4850 class RootConfig:
4951 server: server.ServerConfig
52 experimental: experimental.ExperimentalConfig
5053 tls: tls.TlsConfig
5154 database: database.DatabaseConfig
5255 logging: logger.LoggingConfig
53 ratelimit: ratelimiting.RatelimitConfig
56 ratelimiting: ratelimiting.RatelimitConfig
5457 media: repository.ContentRepositoryConfig
5558 captcha: captcha.CaptchaConfig
5659 voip: voip.VoipConfig
7881 roomdirectory: room_directory.RoomDirectoryConfig
7982 thirdpartyrules: third_party_event_rules.ThirdPartyRulesConfig
8083 tracer: tracer.TracerConfig
84 redis: redis.RedisConfig
8185
8286 config_classes: List = ...
8387 def __init__(self) -> None: ...
2727 "recaptcha_siteverify_api",
2828 "https://www.recaptcha.net/recaptcha/api/siteverify",
2929 )
30 self.recaptcha_template = self.read_templates(
31 ["recaptcha.html"], autoescape=True
32 )[0]
30 self.recaptcha_template = self.read_template("recaptcha.html")
3331
3432 def generate_config_section(self, **kwargs):
3533 return """\
1212 # See the License for the specific language governing permissions and
1313 # limitations under the License.
1414
15 from ._base import Config
15 from ._base import Config, ConfigError
1616
1717
1818 class CasConfig(Config):
2929
3030 if self.cas_enabled:
3131 self.cas_server_url = cas_config["server_url"]
32 self.cas_service_url = cas_config["service_url"]
32
33 # The public baseurl is required because it is used by the redirect
34 # template.
35 public_baseurl = self.public_baseurl
36 if not public_baseurl:
37 raise ConfigError("cas_config requires a public_baseurl to be set")
38
39 # TODO Update this to a _synapse URL.
40 self.cas_service_url = public_baseurl + "_matrix/client/r0/login/cas/ticket"
3341 self.cas_displayname_attribute = cas_config.get("displayname_attribute")
3442 self.cas_required_attributes = cas_config.get("required_attributes") or {}
3543 else:
5260 #
5361 #server_url: "https://cas-server.com"
5462
55 # The public URL of the homeserver.
56 #
57 #service_url: "https://homeserver.domain.com:8448"
58
5963 # The attribute of the CAS response to use as the display name.
6064 #
6165 # If unset, no displayname will be set.
8888
8989 def read_config(self, config, **kwargs):
9090 consent_config = config.get("user_consent")
91 self.terms_template = self.read_templates(["terms.html"], autoescape=True)[0]
91 self.terms_template = self.read_template("terms.html")
9292
9393 if consent_config is None:
9494 return
164164 missing = []
165165 if not self.email_notif_from:
166166 missing.append("email.notif_from")
167
168 # public_baseurl is required to build password reset and validation links that
169 # will be emailed to users
170 if config.get("public_baseurl") is None:
171 missing.append("public_baseurl")
167172
168173 if missing:
169174 raise ConfigError(
263268 if not self.email_notif_from:
264269 missing.append("email.notif_from")
265270
271 if config.get("public_baseurl") is None:
272 missing.append("public_baseurl")
273
266274 if missing:
267275 raise ConfigError(
268276 "email.enable_notifs is True but required keys are missing: %s"
0 # -*- coding: utf-8 -*-
1 # Copyright 2021 The Matrix.org Foundation C.I.C.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 from synapse.config._base import Config
16 from synapse.types import JsonDict
17
18
19 class ExperimentalConfig(Config):
20 """Config section for enabling experimental features"""
21
22 section = "experimental"
23
24 def read_config(self, config: JsonDict, **kwargs):
25 experimental = config.get("experimental_features") or {}
26
27 # MSC2858 (multiple SSO identity providers)
28 self.msc2858_enabled = experimental.get("msc2858_enabled", False) # type: bool
2323 from .consent_config import ConsentConfig
2424 from .database import DatabaseConfig
2525 from .emailconfig import EmailConfig
26 from .experimental import ExperimentalConfig
2627 from .federation import FederationConfig
2728 from .groups import GroupsConfig
2829 from .jwt_config import JWTConfig
5657
5758 config_classes = [
5859 ServerConfig,
60 ExperimentalConfig,
5961 TlsConfig,
6062 FederationConfig,
6163 CacheConfig,
1313 # See the License for the specific language governing permissions and
1414 # limitations under the License.
1515
16 import string
16 from collections import Counter
1717 from typing import Iterable, Optional, Tuple, Type
1818
1919 import attr
4242 except DependencyException as e:
4343 raise ConfigError(e.message) from e
4444
45 # check we don't have any duplicate idp_ids now. (The SSO handler will also
46 # check for duplicates when the REST listeners get registered, but that happens
47 # after synapse has forked so doesn't give nice errors.)
48 c = Counter([i.idp_id for i in self.oidc_providers])
49 for idp_id, count in c.items():
50 if count > 1:
51 raise ConfigError(
52 "Multiple OIDC providers have the idp_id %r." % idp_id
53 )
54
4555 public_baseurl = self.public_baseurl
46 self.oidc_callback_url = public_baseurl + "_synapse/oidc/callback"
56 if public_baseurl is None:
57 raise ConfigError("oidc_config requires a public_baseurl to be set")
58 self.oidc_callback_url = public_baseurl + "_synapse/client/oidc/callback"
4759
4860 @property
4961 def oidc_enabled(self) -> bool:
6779 # offer the user a choice of login mechanisms.
6880 #
6981 # idp_icon: An optional icon for this identity provider, which is presented
70 # by identity picker pages. If given, must be an MXC URI of the format
71 # mxc://<server-name>/<media-id>. (An easy way to obtain such an MXC URI
72 # is to upload an image to an (unencrypted) room and then copy the "url"
73 # from the source of the event.)
82 # by clients and Synapse's own IdP picker page. If given, must be an
83 # MXC URI of the format mxc://<server-name>/<media-id>. (An easy way to
84 # obtain such an MXC URI is to upload an image to an (unencrypted) room
85 # and then copy the "url" from the source of the event.)
86 #
87 # idp_brand: An optional brand for this identity provider, allowing clients
88 # to style the login flow according to the identity provider in question.
89 # See the spec for possible options here.
7490 #
7591 # discover: set to 'false' to disable the use of the OIDC discovery mechanism
7692 # to discover endpoints. Defaults to true.
131147 #
132148 # For the default provider, the following settings are available:
133149 #
134 # sub: name of the claim containing a unique identifier for the
135 # user. Defaults to 'sub', which OpenID Connect compliant
136 # providers should provide.
150 # subject_claim: name of the claim containing a unique identifier
151 # for the user. Defaults to 'sub', which OpenID Connect
152 # compliant providers should provide.
137153 #
138154 # localpart_template: Jinja2 template for the localpart of the MXID.
139155 # If this is not set, the user will be prompted to choose their
140 # own username.
156 # own username (see 'sso_auth_account_details.html' in the 'sso'
157 # section of this file).
141158 #
142159 # display_name_template: Jinja2 template for the display name to set
143160 # on first login. If unset, no displayname will be set.
161 #
162 # email_template: Jinja2 template for the email address of the user.
163 # If unset, no email address will be added to the account.
144164 #
145165 # extra_attributes: a map of Jinja2 templates for extra attributes
146166 # to send back to the client during login.
177197 # userinfo_endpoint: "https://accounts.example.com/userinfo"
178198 # jwks_uri: "https://accounts.example.com/.well-known/jwks.json"
179199 # skip_verification: true
200 # user_mapping_provider:
201 # config:
202 # subject_claim: "id"
203 # localpart_template: "{{ user.login }}"
204 # display_name_template: "{{ user.name }}"
205 # email_template: "{{ user.email }}"
180206
181207 # For use with Keycloak
182208 #
191217 #
192218 #- idp_id: github
193219 # idp_name: GitHub
220 # idp_brand: org.matrix.github
194221 # discover: false
195222 # issuer: "https://github.com/"
196223 # client_id: "your-client-id" # TO BE FILLED
214241 "type": "object",
215242 "required": ["issuer", "client_id", "client_secret"],
216243 "properties": {
217 # TODO: fix the maxLength here depending on what MSC2528 decides
218 # remember that we prefix the ID given here with `oidc-`
219 "idp_id": {"type": "string", "minLength": 1, "maxLength": 128},
244 "idp_id": {
245 "type": "string",
246 "minLength": 1,
247 # MSC2858 allows a maxlen of 255, but we prefix with "oidc-"
248 "maxLength": 250,
249 "pattern": "^[A-Za-z0-9._~-]+$",
250 },
220251 "idp_name": {"type": "string"},
221252 "idp_icon": {"type": "string"},
253 "idp_brand": {
254 "type": "string",
255 # MSC2758-style namespaced identifier
256 "minLength": 1,
257 "maxLength": 255,
258 "pattern": "^[a-z][a-z0-9_.-]*$",
259 },
222260 "discover": {"type": "boolean"},
223261 "issuer": {"type": "string"},
224262 "client_id": {"type": "string"},
337375 config_path + ("user_mapping_provider", "module"),
338376 )
339377
340 # MSC2858 will apply certain limits in what can be used as an IdP id, so let's
341 # enforce those limits now.
342 # TODO: factor out this stuff to a generic function
343378 idp_id = oidc_config.get("idp_id", "oidc")
344
345 # TODO: update this validity check based on what MSC2858 decides.
346 valid_idp_chars = set(string.ascii_lowercase + string.digits + "-._")
347
348 if any(c not in valid_idp_chars for c in idp_id):
349 raise ConfigError(
350 'idp_id may only contain a-z, 0-9, "-", ".", "_"',
351 config_path + ("idp_id",),
352 )
353
354 if idp_id[0] not in string.ascii_lowercase:
355 raise ConfigError(
356 "idp_id must start with a-z", config_path + ("idp_id",),
357 )
358379
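The schema above replaces the old ad-hoc character check: `idp_id` must match `^[A-Za-z0-9._~-]+$` and stay within 250 characters, leaving room for the `oidc-` prefix under MSC2858's 255-character cap. As a stand-alone sketch (the helper name is hypothetical, not Synapse API):

```python
import re

# Pattern and length cap from the jsonschema above; 250 leaves headroom for
# the "oidc-" prefix that Synapse adds under MSC2858's 255-character limit.
_IDP_ID_RE = re.compile(r"^[A-Za-z0-9._~-]+$")

def is_valid_idp_id(idp_id: str) -> bool:
    """Hypothetical helper: check an idp_id against the schema's limits."""
    return 1 <= len(idp_id) <= 250 and _IDP_ID_RE.fullmatch(idp_id) is not None

print(is_valid_idp_id("github"))   # True
print(is_valid_idp_id("my idp"))   # False: space is not an allowed character
```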
359380 # prefix the given IDP with a prefix specific to the SSO mechanism, to avoid
360381 # clashes with other mechs (such as SAML, CAS).
381402 idp_id=idp_id,
382403 idp_name=oidc_config.get("idp_name", "OIDC"),
383404 idp_icon=idp_icon,
405 idp_brand=oidc_config.get("idp_brand"),
384406 discover=oidc_config.get("discover", True),
385407 issuer=oidc_config["issuer"],
386408 client_id=oidc_config["client_id"],
411433 # Optional MXC URI for icon for this IdP.
412434 idp_icon = attr.ib(type=Optional[str])
413435
436 # Optional brand identifier for this IdP.
437 idp_brand = attr.ib(type=Optional[str])
438
414439 # whether the OIDC discovery mechanism is used to discover endpoints
415440 discover = attr.ib(type=bool)
416441
2323 defaults={"per_second": 0.17, "burst_count": 3.0},
2424 ):
2525 self.per_second = config.get("per_second", defaults["per_second"])
26 self.burst_count = config.get("burst_count", defaults["burst_count"])
26 self.burst_count = int(config.get("burst_count", defaults["burst_count"]))
2727
2828
2929 class FederationRateLimitConfig:
9999 self.rc_joins_remote = RateLimitConfig(
100100 config.get("rc_joins", {}).get("remote", {}),
101101 defaults={"per_second": 0.01, "burst_count": 3},
102 )
103
104 self.rc_3pid_validation = RateLimitConfig(
105 config.get("rc_3pid_validation") or {},
106 defaults={"per_second": 0.003, "burst_count": 5},
107 )
108
109 self.rc_invites_per_room = RateLimitConfig(
110 config.get("rc_invites", {}).get("per_room", {}),
111 defaults={"per_second": 0.3, "burst_count": 10},
112 )
113 self.rc_invites_per_user = RateLimitConfig(
114 config.get("rc_invites", {}).get("per_user", {}),
115 defaults={"per_second": 0.003, "burst_count": 5},
102116 )
103117
104118 def generate_config_section(self, **kwargs):
130144 # users are joining rooms the server is already in (this is cheap) vs
131145 # "remote" for when users are trying to join rooms not on the server (which
132146 # can be more expensive)
147 # - one for ratelimiting how often a user or IP can attempt to validate a 3PID.
148 # - two for ratelimiting how often invites can be sent in a room or to a
149 # specific user.
133150 #
134151 # The defaults are as shown below.
135152 #
163180 # remote:
164181 # per_second: 0.01
165182 # burst_count: 3
166
183 #
184 #rc_3pid_validation:
185 # per_second: 0.003
186 # burst_count: 5
187 #
188 #rc_invites:
189 # per_room:
190 # per_second: 0.3
191 # burst_count: 10
192 # per_user:
193 # per_second: 0.003
194 # burst_count: 5
167195
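Each `per_second`/`burst_count` pair above describes leaky-bucket semantics: a client may burst up to `burst_count` actions, after which it is limited to roughly `per_second` actions per second. A toy stand-alone sketch of that behaviour (not Synapse's actual `Ratelimiter` class):

```python
import time

class Bucket:
    """Toy leaky-bucket limiter mirroring per_second/burst_count semantics."""

    def __init__(self, per_second: float, burst_count: int):
        self.per_second = per_second
        self.burst_count = burst_count
        self.tokens = float(burst_count)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill tokens at per_second, capped at burst_count.
        self.tokens = min(
            self.burst_count, self.tokens + (now - self.last) * self.per_second
        )
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# With the rc_invites per_user defaults (0.003/s, burst 5), the first five
# invites pass immediately and the sixth is throttled.
b = Bucket(per_second=0.003, burst_count=5)
print([b.allow() for _ in range(6)])  # [True, True, True, True, True, False]
```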
168196 # Ratelimiting settings for incoming federation
169197 #
4848
4949 self.startup_job_max_delta = self.period * 10.0 / 100.0
5050
51 if self.renew_by_email_enabled:
52 if "public_baseurl" not in synapse_config:
53 raise ConfigError("Can't send renewal emails without 'public_baseurl'")
54
5155 template_dir = config.get("template_dir")
5256
5357 if not template_dir:
104108 account_threepid_delegates = config.get("account_threepid_delegates") or {}
105109 self.account_threepid_delegate_email = account_threepid_delegates.get("email")
106110 self.account_threepid_delegate_msisdn = account_threepid_delegates.get("msisdn")
111 if self.account_threepid_delegate_msisdn and not self.public_baseurl:
112 raise ConfigError(
113 "The configuration option `public_baseurl` is required if "
114 "`account_threepid_delegate.msisdn` is set, such that "
115 "clients know where to submit validation tokens to. Please "
116 "configure `public_baseurl`."
117 )
107118
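The new check above fails fast at startup when `account_threepid_delegates.msisdn` is set without `public_baseurl`, since clients would have nowhere to submit validation tokens. The shape of that guard, sketched stand-alone (function name is hypothetical):

```python
from typing import Optional

class ConfigError(Exception):
    """Stand-in for synapse.config._base.ConfigError."""

def check_msisdn_delegate(
    delegate_msisdn: Optional[str], public_baseurl: Optional[str]
) -> None:
    """Hypothetical helper mirroring the startup check above."""
    if delegate_msisdn and not public_baseurl:
        raise ConfigError(
            "The configuration option `public_baseurl` is required if "
            "`account_threepid_delegate.msisdn` is set."
        )

check_msisdn_delegate(None, None)  # no delegate configured: fine
try:
    check_msisdn_delegate("https://sms.example.com", None)
except ConfigError as e:
    print("rejected:", e)
```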
108119 self.default_identity_server = config.get("default_identity_server")
109120 self.allow_guest_access = config.get("allow_guest_access", False)
175186 self.session_lifetime = session_lifetime
176187
177188 # The success template used during fallback auth.
178 self.fallback_success_template = self.read_templates(
179 ["auth_success.html"], autoescape=True
180 )[0]
189 self.fallback_success_template = self.read_template("auth_success.html")
181190
182191 def generate_config_section(self, generate_secrets=False, **kwargs):
183192 if generate_secrets:
228237 # send an email to the account's email address with a renewal link. By
229238 # default, no such emails are sent.
230239 #
231 # If you enable this setting, you will also need to fill out the 'email'
232 # configuration section. You should also check that 'public_baseurl' is set
233 # correctly.
240 # If you enable this setting, you will also need to fill out the 'email' and
241 # 'public_baseurl' configuration sections.
234242 #
235243 #renew_at: 1w
236244
321329 # The identity server which we suggest that clients should use when users log
322330 # in on this server.
323331 #
324 # (By default, no suggestion is made, so it is left up to the client.)
332 # (By default, no suggestion is made, so it is left up to the client.
333 # This setting is ignored unless public_baseurl is also set.)
325334 #
326335 #default_identity_server: https://matrix.org
327336
345354 # Servers handling these requests must answer the `/requestToken` endpoints defined
346355 # by the Matrix Identity Service API specification:
347356 # https://matrix.org/docs/spec/identity_service/latest
357 #
358 # If a delegate is specified, the config option public_baseurl must also be filled out.
348359 #
349360 account_threepid_delegates:
350361 #email: https://example.com # Delegate email sending to example.com
188188 import saml2
189189
190190 public_baseurl = self.public_baseurl
191 if public_baseurl is None:
192 raise ConfigError("saml2_config requires a public_baseurl to be set")
191193
192194 if self.saml2_grandfathered_mxid_source_attribute:
193195 optional_attributes.add(self.saml2_grandfathered_mxid_source_attribute)
194196 optional_attributes -= required_attributes
195197
196 metadata_url = public_baseurl + "_matrix/saml2/metadata.xml"
197 response_url = public_baseurl + "_matrix/saml2/authn_response"
198 metadata_url = public_baseurl + "_synapse/client/saml2/metadata.xml"
199 response_url = public_baseurl + "_synapse/client/saml2/authn_response"
198200 return {
199201 "entityid": metadata_url,
200202 "service": {
232234 # enable SAML login.
233235 #
234236 # Once SAML support is enabled, a metadata file will be exposed at
235 # https://<server>:<port>/_matrix/saml2/metadata.xml, which you may be able to
237 # https://<server>:<port>/_synapse/client/saml2/metadata.xml, which you may be able to
236238 # use to configure your SAML IdP with. Alternatively, you can manually configure
237239 # the IdP to use an ACS location of
238 # https://<server>:<port>/_matrix/saml2/authn_response.
240 # https://<server>:<port>/_synapse/client/saml2/authn_response.
239241 #
240242 saml2_config:
241243 # `sp_config` is the configuration for the pysaml2 Service Provider.
160160 self.print_pidfile = config.get("print_pidfile")
161161 self.user_agent_suffix = config.get("user_agent_suffix")
162162 self.use_frozen_dicts = config.get("use_frozen_dicts", False)
163 self.public_baseurl = config.get("public_baseurl") or "https://%s/" % (
164 self.server_name,
165 )
166 if self.public_baseurl[-1] != "/":
167 self.public_baseurl += "/"
163 self.public_baseurl = config.get("public_baseurl")
168164
169165 # Whether to enable user presence.
170166 self.use_presence = config.get("use_presence", True)
320316 # Always blacklist 0.0.0.0, ::
321317 self.federation_ip_range_blacklist.update(["0.0.0.0", "::"])
322318
319 if self.public_baseurl is not None:
320 if self.public_baseurl[-1] != "/":
321 self.public_baseurl += "/"
323322 self.start_pushers = config.get("start_pushers", True)
324323
325324 # (undocumented) option for torturing the worker-mode replication a bit,
747746 # Otherwise, it should be the URL to reach Synapse's client HTTP listener (see
748747 # 'listeners' below).
749748 #
750 # If this is left unset, it defaults to 'https://<server_name>/'. (Note that
751 # that will not work unless you configure Synapse or a reverse-proxy to listen
752 # on port 443.)
753 #
754749 #public_baseurl: https://example.com/
755750
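With `public_baseurl` now optional (no more implicit `https://<server_name>/` default), the trailing-slash normalisation only runs when the option is actually set. The logic can be sketched as:

```python
from typing import Optional

def normalize_public_baseurl(url: Optional[str]) -> Optional[str]:
    """Sketch of the config logic above: leave unset alone, else ensure
    exactly one trailing slash so paths can be appended directly."""
    if url is None:
        return None
    return url if url.endswith("/") else url + "/"

print(normalize_public_baseurl("https://example.com"))  # https://example.com/
print(normalize_public_baseurl(None))                   # None
```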
756751 # Set the soft limit on the number of file descriptors synapse can use
2626 sso_config = config.get("sso") or {} # type: Dict[str, Any]
2727
2828 # The sso-specific template_dir
29 template_dir = sso_config.get("template_dir")
29 self.sso_template_dir = sso_config.get("template_dir")
3030
3131 # Read templates from disk
3232 (
4747 "sso_auth_success.html",
4848 "sso_auth_bad_user.html",
4949 ],
50 template_dir,
50 self.sso_template_dir,
5151 )
5252
5353 # These templates have no placeholders, so render them here
6363 # gracefully to the client). This would make it pointless to ask the user for
6464 # confirmation, since the URL the confirmation page would be showing wouldn't be
6565 # the client's.
66 login_fallback_url = self.public_baseurl + "_matrix/static/client/login"
67 self.sso_client_whitelist.append(login_fallback_url)
66 # public_baseurl is an optional setting, so we only add the fallback's URL to the
67 # list if it's provided (because we can't figure out what that URL is otherwise).
68 if self.public_baseurl:
69 login_fallback_url = self.public_baseurl + "_matrix/static/client/login"
70 self.sso_client_whitelist.append(login_fallback_url)
6871
6972 def generate_config_section(self, **kwargs):
7073 return """\
8285 # phishing attacks from evil.site. To avoid this, include a slash after the
8386 # hostname: "https://my.client/".
8487 #
85 # The login fallback page (used by clients that don't natively support the
86 # required login flows) is automatically whitelisted in addition to any URLs
87 # in this list.
88 # If public_baseurl is set, then the login fallback page (used by clients
89 # that don't natively support the required login flows) is whitelisted in
90 # addition to any URLs in this list.
8891 #
8992 # By default, this list is empty.
9093 #
105108 #
106109 # When rendering, this template is given the following variables:
107110 # * redirect_url: the URL that the user will be redirected to after
108 # login. Needs manual escaping (see
109 # https://jinja.palletsprojects.com/en/2.11.x/templates/#html-escaping).
111 # login.
110112 #
111113 # * server_name: the homeserver's name.
112114 #
113115 # * providers: a list of available Identity Providers. Each element is
114116 # an object with the following attributes:
117 #
115118 # * idp_id: unique identifier for the IdP
116119 # * idp_name: user-facing name for the IdP
120 # * idp_icon: if specified in the IdP config, an MXC URI for an icon
121 # for the IdP
122 # * idp_brand: if specified in the IdP config, a textual identifier
123 # for the brand of the IdP
117124 #
118125 # The rendered HTML page should contain a form which submits its results
119126 # back as a GET request, with the following query parameters:
123130 #
124131 # * idp: the 'idp_id' of the chosen IDP.
125132 #
133 # * HTML page to prompt new users to enter a userid and confirm other
134 # details: 'sso_auth_account_details.html'. This is only shown if the
135 # SSO implementation (with any user_mapping_provider) does not return
136 # a localpart.
137 #
138 # When rendering, this template is given the following variables:
139 #
140 # * server_name: the homeserver's name.
141 #
142 # * idp: details of the SSO Identity Provider that the user logged in
143 # with: an object with the following attributes:
144 #
145 # * idp_id: unique identifier for the IdP
146 # * idp_name: user-facing name for the IdP
147 # * idp_icon: if specified in the IdP config, an MXC URI for an icon
148 # for the IdP
149 # * idp_brand: if specified in the IdP config, a textual identifier
150 # for the brand of the IdP
151 #
152 # * user_attributes: an object containing details about the user that
153 # we received from the IdP. May have the following attributes:
154 #
155 # * display_name: the user's display_name
156 # * emails: a list of email addresses
157 #
158 # The template should render a form which submits the following fields:
159 #
160 # * username: the localpart of the user's chosen user id
161 #
162 # * HTML page allowing the user to consent to the server's terms and
163 # conditions. This is only shown for new users, and only if
164 # `user_consent.require_at_registration` is set.
165 #
166 # When rendering, this template is given the following variables:
167 #
168 # * server_name: the homeserver's name.
169 #
170 # * user_id: the user's proposed Matrix ID.
171 #
172 # * user_profile.display_name: the user's proposed display name, if any.
173 #
174 # * consent_version: the version of the terms that the user will be
175 # shown
176 #
177 # * terms_url: a link to the page showing the terms.
178 #
179 # The template should render a form which submits the following fields:
180 #
181 # * accepted_version: the version of the terms accepted by the user
182 # (i.e. 'consent_version' from the input variables).
183 #
126184 # * HTML page for a confirmation step before redirecting back to the client
127185 # with the login token: 'sso_redirect_confirm.html'.
128186 #
129 # When rendering, this template is given three variables:
130 # * redirect_url: the URL the user is about to be redirected to. Needs
131 # manual escaping (see
132 # https://jinja.palletsprojects.com/en/2.11.x/templates/#html-escaping).
187 # When rendering, this template is given the following variables:
188 #
189 # * redirect_url: the URL the user is about to be redirected to.
133190 #
134191 # * display_url: the same as `redirect_url`, but with the query
135192 # parameters stripped. The intention is to have a
136193 # human-readable URL to show to users, not to use it as
137 # the final address to redirect to. Needs manual escaping
138 # (see https://jinja.palletsprojects.com/en/2.11.x/templates/#html-escaping).
139 #
140 # * server_name: the homeserver's name.
194 # the final address to redirect to.
195 #
196 # * server_name: the homeserver's name.
197 #
198 # * new_user: a boolean indicating whether this is the user's first time
199 # logging in.
200 #
201 # * user_id: the user's matrix ID.
202 #
203 # * user_profile.avatar_url: an MXC URI for the user's avatar, if any.
204 # None if the user has not set an avatar.
205 #
206 # * user_profile.display_name: the user's display name. None if the user
207 # has not set a display name.
141208 #
142209 # * HTML page which notifies the user that they are authenticating to confirm
143210 # an operation on their account during the user interactive authentication
144211 # process: 'sso_auth_confirm.html'.
145212 #
146213 # When rendering, this template is given the following variables:
147 # * redirect_url: the URL the user is about to be redirected to. Needs
148 # manual escaping (see
149 # https://jinja.palletsprojects.com/en/2.11.x/templates/#html-escaping).
214 # * redirect_url: the URL the user is about to be redirected to.
150215 #
151216 # * description: the operation which the user is being asked to confirm
217 #
218 # * idp: details of the Identity Provider that we will use to confirm
219 # the user's identity: an object with the following attributes:
220 #
221 # * idp_id: unique identifier for the IdP
222 # * idp_name: user-facing name for the IdP
223 # * idp_icon: if specified in the IdP config, an MXC URI for an icon
224 # for the IdP
225 # * idp_brand: if specified in the IdP config, a textual identifier
226 # for the brand of the IdP
152227 #
153228 # * HTML page shown after a successful user interactive authentication session:
154229 # 'sso_auth_success.html'.
124124 self._no_verify_ssl_context = _no_verify_ssl.getContext()
125125 self._no_verify_ssl_context.set_info_callback(_context_info_cb)
126126
127 self._should_verify = self._config.federation_verify_certificates
128
129 self._federation_certificate_verification_whitelist = (
130 self._config.federation_certificate_verification_whitelist
131 )
132
127133 def get_options(self, host: bytes):
128
129134 # IPolicyForHTTPS.get_options takes bytes, but we want to compare
130135 # against the str whitelist. The hostnames in the whitelist are already
131136 # IDNA-encoded like the hosts will be here.
132137 ascii_host = host.decode("ascii")
133138
134139 # Check if certificate verification has been enabled
135 should_verify = self._config.federation_verify_certificates
140 should_verify = self._should_verify
136141
137142 # Check if we've disabled certificate verification for this host
138 if should_verify:
139 for regex in self._config.federation_certificate_verification_whitelist:
143 if self._should_verify:
144 for regex in self._federation_certificate_verification_whitelist:
140145 if regex.match(ascii_host):
141146 should_verify = False
142147 break
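The hunk above hoists two config attributes into `__init__` but keeps the same per-host decision: verify federation certificates unless the host matches an entry in the verification whitelist. A stand-alone sketch of that decision (names are placeholders):

```python
import re
from typing import List, Pattern

federation_verify_certificates = True
# Hypothetical whitelist of compiled host regexes, as in the config.
whitelist: List[Pattern[str]] = [re.compile(r".*\.example\.com")]

def should_verify(host: str) -> bool:
    """Mirror get_options above: skip verification for whitelisted hosts."""
    if not federation_verify_certificates:
        return False
    return not any(regex.match(host) for regex in whitelist)

print(should_verify("matrix.example.com"))  # False: whitelisted
print(should_verify("other.org"))           # True: verified normally
```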
1717 import itertools
1818 import logging
1919 from typing import (
20 TYPE_CHECKING,
2021 Any,
2122 Awaitable,
2223 Callable,
2526 List,
2627 Mapping,
2728 Optional,
28 Sequence,
2929 Tuple,
3030 TypeVar,
3131 Union,
6060 from synapse.util.caches.expiringcache import ExpiringCache
6161 from synapse.util.retryutils import NotRetryingDestination
6262
63 if TYPE_CHECKING:
64 from synapse.app.homeserver import HomeServer
65
6366 logger = logging.getLogger(__name__)
6467
6568 sent_queries_counter = Counter("synapse_federation_client_sent_queries", "", ["type"])
7982
8083
8184 class FederationClient(FederationBase):
82 def __init__(self, hs):
85 def __init__(self, hs: "HomeServer"):
8386 super().__init__(hs)
8487
85 self.pdu_destination_tried = {}
88 self.pdu_destination_tried = {} # type: Dict[str, Dict[str, int]]
8689 self._clock.looping_call(self._clear_tried_cache, 60 * 1000)
8790 self.state = hs.get_state_handler()
8891 self.transport_layer = hs.get_federation_transport_client()
115118 self.pdu_destination_tried[event_id] = destination_dict
116119
117120 @log_function
118 def make_query(
121 async def make_query(
119122 self,
120 destination,
121 query_type,
122 args,
123 retry_on_dns_fail=False,
124 ignore_backoff=False,
125 ):
123 destination: str,
124 query_type: str,
125 args: dict,
126 retry_on_dns_fail: bool = False,
127 ignore_backoff: bool = False,
128 ) -> JsonDict:
126129 """Sends a federation Query to a remote homeserver of the given type
127130 and arguments.
128131
129132 Args:
130 destination (str): Domain name of the remote homeserver
131 query_type (str): Category of the query type; should match the
133 destination: Domain name of the remote homeserver
134 query_type: Category of the query type; should match the
132135 handler name used in register_query_handler().
133 args (dict): Mapping of strings to strings containing the details
136 args: Mapping of strings to strings containing the details
134137 of the query request.
135 ignore_backoff (bool): true to ignore the historical backoff data
138 ignore_backoff: true to ignore the historical backoff data
136139 and try the request anyway.
137140
138141 Returns:
139 a Awaitable which will eventually yield a JSON object from the
140 response
142 The JSON object from the response
141143 """
142144 sent_queries_counter.labels(query_type).inc()
143145
144 return self.transport_layer.make_query(
146 return await self.transport_layer.make_query(
145147 destination,
146148 query_type,
147149 args,
150152 )
151153
152154 @log_function
153 def query_client_keys(self, destination, content, timeout):
155 async def query_client_keys(
156 self, destination: str, content: JsonDict, timeout: int
157 ) -> JsonDict:
154158 """Query device keys for a device hosted on a remote server.
155159
156160 Args:
157 destination (str): Domain name of the remote homeserver
158 content (dict): The query content.
159
160 Returns:
161 an Awaitable which will eventually yield a JSON object from the
162 response
161 destination: Domain name of the remote homeserver
162 content: The query content.
163
164 Returns:
165 The JSON object from the response
163166 """
164167 sent_queries_counter.labels("client_device_keys").inc()
165 return self.transport_layer.query_client_keys(destination, content, timeout)
168 return await self.transport_layer.query_client_keys(
169 destination, content, timeout
170 )
166171
167172 @log_function
168 def query_user_devices(self, destination, user_id, timeout=30000):
173 async def query_user_devices(
174 self, destination: str, user_id: str, timeout: int = 30000
175 ) -> JsonDict:
169176 """Query the device keys for a list of user ids hosted on a remote
170177 server.
171178 """
172179 sent_queries_counter.labels("user_devices").inc()
173 return self.transport_layer.query_user_devices(destination, user_id, timeout)
180 return await self.transport_layer.query_user_devices(
181 destination, user_id, timeout
182 )
174183
175184 @log_function
176 def claim_client_keys(self, destination, content, timeout):
185 async def claim_client_keys(
186 self, destination: str, content: JsonDict, timeout: int
187 ) -> JsonDict:
177188 """Claims one-time keys for a device hosted on a remote server.
178189
179190 Args:
180 destination (str): Domain name of the remote homeserver
181 content (dict): The query content.
182
183 Returns:
184 an Awaitable which will eventually yield a JSON object from the
185 response
191 destination: Domain name of the remote homeserver
192 content: The query content.
193
194 Returns:
195 The JSON object from the response
186196 """
187197 sent_queries_counter.labels("client_one_time_keys").inc()
188 return self.transport_layer.claim_client_keys(destination, content, timeout)
198 return await self.transport_layer.claim_client_keys(
199 destination, content, timeout
200 )
189201
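The conversions in this file follow one mechanical pattern: a method that returned the transport layer's awaitable directly becomes an `async def` that `await`s it, so callers receive a resolved `JsonDict` rather than an awaitable, and the docstring's return type can be stated plainly. A minimal sketch with a stubbed transport layer (all names are placeholders, not Synapse API):

```python
import asyncio

class FakeTransport:
    """Stub standing in for the federation transport client."""

    async def query_user_devices(self, destination, user_id, timeout):
        return {"user_id": user_id, "devices": []}

class Client:
    def __init__(self, transport):
        self.transport_layer = transport

    async def query_user_devices(self, destination, user_id, timeout=30000):
        # After the change: explicitly await and return the JSON response,
        # instead of handing the caller an un-awaited awaitable.
        return await self.transport_layer.query_user_devices(
            destination, user_id, timeout
        )

result = asyncio.run(
    Client(FakeTransport()).query_user_devices("example.com", "@alice:example.com")
)
print(result["devices"])  # []
```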
190202 async def backfill(
191203 self, dest: str, room_id: str, limit: int, extremities: Iterable[str]
194206 given destination server.
195207
196208 Args:
197 dest (str): The remote homeserver to ask.
198 room_id (str): The room_id to backfill.
199 limit (int): The maximum number of events to return.
200 extremities (list): our current backwards extremities, to backfill from
209 dest: The remote homeserver to ask.
210 room_id: The room_id to backfill.
211 limit: The maximum number of events to return.
212 extremities: our current backwards extremities, to backfill from
201213 """
202214 logger.debug("backfill extrem=%s", extremities)
203215
369381 for events that have failed their checks
370382
371383 Returns:
372 Deferred : A list of PDUs that have valid signatures and hashes.
384 A list of PDUs that have valid signatures and hashes.
373385 """
374386 deferreds = self._check_sigs_and_hashes(room_version, pdus)
375387
417429 else:
418430 return [p for p in valid_pdus if p]
419431
420 async def get_event_auth(self, destination, room_id, event_id):
432 async def get_event_auth(
433 self, destination: str, room_id: str, event_id: str
434 ) -> List[EventBase]:
421435 res = await self.transport_layer.get_event_auth(destination, room_id, event_id)
422436
423437 room_version = await self.store.get_room_version(room_id)
699713
700714 return await self._try_destination_list("send_join", destinations, send_request)
701715
702 async def _do_send_join(self, destination: str, pdu: EventBase):
716 async def _do_send_join(self, destination: str, pdu: EventBase) -> JsonDict:
703717 time_now = self._clock.time_msec()
704718
705719 try:
706 content = await self.transport_layer.send_join_v2(
720 return await self.transport_layer.send_join_v2(
707721 destination=destination,
708722 room_id=pdu.room_id,
709723 event_id=pdu.event_id,
710724 content=pdu.get_pdu_json(time_now),
711725 )
712
713 return content
714726 except HttpResponseException as e:
715727 if e.code in [400, 404]:
716728 err = e.to_synapse_error()
768780 time_now = self._clock.time_msec()
769781
770782 try:
771 content = await self.transport_layer.send_invite_v2(
783 return await self.transport_layer.send_invite_v2(
772784 destination=destination,
773785 room_id=pdu.room_id,
774786 event_id=pdu.event_id,
778790 "invite_room_state": pdu.unsigned.get("invite_room_state", []),
779791 },
780792 )
781 return content
782793 except HttpResponseException as e:
783794 if e.code in [400, 404]:
784795 err = e.to_synapse_error()
798809 "User's homeserver does not support this room version",
799810 Codes.UNSUPPORTED_ROOM_VERSION,
800811 )
801 elif e.code == 403:
812 elif e.code in (403, 429):
802813 raise e.to_synapse_error()
803814 else:
804815 raise
841852 "send_leave", destinations, send_request
842853 )
843854
844 async def _do_send_leave(self, destination, pdu):
855 async def _do_send_leave(self, destination: str, pdu: EventBase) -> JsonDict:
845856 time_now = self._clock.time_msec()
846857
847858 try:
848 content = await self.transport_layer.send_leave_v2(
859 return await self.transport_layer.send_leave_v2(
849860 destination=destination,
850861 room_id=pdu.room_id,
851862 event_id=pdu.event_id,
852863 content=pdu.get_pdu_json(time_now),
853864 )
854
855 return content
856865 except HttpResponseException as e:
857866 if e.code in [400, 404]:
858867 err = e.to_synapse_error()
878887 # content.
879888 return resp[1]
880889
881 def get_public_rooms(
890 async def get_public_rooms(
882891 self,
883892 remote_server: str,
884893 limit: Optional[int] = None,
886895 search_filter: Optional[Dict] = None,
887896 include_all_networks: bool = False,
888897 third_party_instance_id: Optional[str] = None,
889 ):
898 ) -> JsonDict:
890899 """Get the list of public rooms from a remote homeserver
891900
892901 Args:
900909 party instance
901910
902911 Returns:
903 Awaitable[Dict[str, Any]]: The response from the remote server, or None if
904 `remote_server` is the same as the local server_name
912 The response from the remote server.
905913
906914 Raises:
907915 HttpResponseException: There was an exception returned from the remote server
909917 requests over federation
910918
911919 """
912 return self.transport_layer.get_public_rooms(
920 return await self.transport_layer.get_public_rooms(
913921 remote_server,
914922 limit,
915923 since_token,
922930 self,
923931 destination: str,
924932 room_id: str,
925 earliest_events_ids: Sequence[str],
933 earliest_events_ids: Iterable[str],
926934 latest_events: Iterable[EventBase],
927935 limit: int,
928936 min_depth: int,
973981
974982 return signed_events
975983
976 async def forward_third_party_invite(self, destinations, room_id, event_dict):
984 async def forward_third_party_invite(
985 self, destinations: Iterable[str], room_id: str, event_dict: JsonDict
986 ) -> None:
977987 for destination in destinations:
978988 if destination == self.server_name:
979989 continue
982992 await self.transport_layer.exchange_third_party_invite(
983993 destination=destination, room_id=room_id, event_dict=event_dict
984994 )
985 return None
995 return
986996 except CodeMessageException:
987997 raise
988998 except Exception as e:
9941004
9951005 async def get_room_complexity(
9961006 self, destination: str, room_id: str
997 ) -> Optional[dict]:
1007 ) -> Optional[JsonDict]:
9981008 """
9991009 Fetch the complexity of a remote room from another server.
10001010
10071017 could not fetch the complexity.
10081018 """
10091019 try:
1010 complexity = await self.transport_layer.get_room_complexity(
1020 return await self.transport_layer.get_room_complexity(
10111021 destination=destination, room_id=room_id
10121022 )
1013 return complexity
10141023 except CodeMessageException as e:
10151024 # We didn't manage to get it -- probably a 404. We are okay if other
10161025 # servers don't give it to us.
141141 self._wake_destinations_needing_catchup,
142142 )
143143
144 self._external_cache = hs.get_external_cache()
145
144146 def _get_per_destination_queue(self, destination: str) -> PerDestinationQueue:
145147 """Get or create a PerDestinationQueue for the given destination
146148
196198 if not event.internal_metadata.should_proactively_send():
197199 return
198200
199 try:
200 # Get the state from before the event.
201 # We need to make sure that this is the state from before
202 # the event and not from after it.
203 # Otherwise if the last member on a server in a room is
204 # banned then it won't receive the event because it won't
205 # be in the room after the ban.
206 destinations = await self.state.get_hosts_in_room_at_events(
207 event.room_id, event_ids=event.prev_event_ids()
201 destinations = None # type: Optional[Set[str]]
202 if not event.prev_event_ids():
203 # If there are no prev event IDs then the state is empty
204 # and so there are no remote servers in the room
205 destinations = set()
206 else:
207 # We check the external cache for the destinations, which is
208 # stored per state group.
209
210 sg = await self._external_cache.get(
211 "event_to_prev_state_group", event.event_id
208212 )
209 except Exception:
210 logger.exception(
211 "Failed to calculate hosts in room for event: %s",
212 event.event_id,
213 )
214 return
213 if sg:
214 destinations = await self._external_cache.get(
215 "get_joined_hosts", str(sg)
216 )
217
218 if destinations is None:
219 try:
220 # Get the state from before the event.
221 # We need to make sure that this is the state from before
222 # the event and not from after it.
223 # Otherwise if the last member on a server in a room is
224 # banned then it won't receive the event because it won't
225 # be in the room after the ban.
226 destinations = await self.state.get_hosts_in_room_at_events(
227 event.room_id, event_ids=event.prev_event_ids()
228 )
229 except Exception:
230 logger.exception(
231 "Failed to calculate hosts in room for event: %s",
232 event.event_id,
233 )
234 return
215235
216236 destinations = {
217237 d
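The new destination lookup above is a cache-aside pattern: consult the external (Redis) cache keyed by the event's previous state group, and only fall back to computing the hosts from room state on a miss. A stand-alone sketch with plain dicts standing in for the external cache (all names hypothetical):

```python
from typing import Dict, Optional, Set

# In-memory stand-ins for the two external cache namespaces used above.
event_to_prev_state_group: Dict[str, int] = {"$event1": 42}
joined_hosts_by_group: Dict[str, Set[str]] = {"42": {"example.com", "other.org"}}

def get_destinations(event_id: str, fallback: Set[str]) -> Set[str]:
    """Cache-aside: use cached hosts for the event's state group if present,
    otherwise fall back to the (expensive) state computation result."""
    sg = event_to_prev_state_group.get(event_id)
    if sg is not None:
        cached = joined_hosts_by_group.get(str(sg))
        if cached is not None:
            return cached
    return fallback

print(sorted(get_destinations("$event1", set())))        # cache hit
print(sorted(get_destinations("$event2", {"hs.test"})))  # cache miss: fallback
```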
1313 # limitations under the License.
1414
1515 import logging
16 from typing import TYPE_CHECKING
1617
1718 import twisted
1819 import twisted.internet.error
2021 from twisted.web.resource import Resource
2122
2223 from synapse.app import check_bind_error
24
25 if TYPE_CHECKING:
26 from synapse.app.homeserver import HomeServer
2327
2428 logger = logging.getLogger(__name__)
2529
3438
3539
3640 class AcmeHandler:
37 def __init__(self, hs):
41 def __init__(self, hs: "HomeServer"):
3842 self.hs = hs
3943 self.reactor = hs.get_reactor()
4044 self._acme_domain = hs.config.acme_domain
4145
42 async def start_listening(self):
46 async def start_listening(self) -> None:
4347 from synapse.handlers import acme_issuing_service
4448
4549 # Configure logging for txacme, if you need to debug
8488 logger.error(ACME_REGISTER_FAIL_ERROR)
8589 raise
8690
87 async def provision_certificate(self):
91 async def provision_certificate(self) -> None:
8892
8993 logger.warning("Reprovisioning %s", self._acme_domain)
9094
109113 except Exception:
110114 logger.exception("Failed saving!")
111115 raise
112
113 return True
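The ACME hunks above (and several handler files later in this diff) use the same idiom for typing `hs`: import `HomeServer` only under `typing.TYPE_CHECKING` and quote the annotation, so the type checker sees the type while no circular import happens at runtime. A generic sketch of the idiom (module names here are illustrative, not Synapse's):

```python
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Only evaluated by the type checker; at runtime this import never
    # runs, which avoids a circular import with the app module.
    from myapp.server import HomeServer  # hypothetical module


class AcmeLikeHandler:
    def __init__(self, hs: "HomeServer") -> None:
        # The string annotation is resolved lazily, so it is legal even
        # though HomeServer is never imported at runtime.
        self.hs = hs
```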
2121 imported conditionally.
2222 """
2323 import logging
24 from typing import Dict, Iterable, List
2425
2526 import attr
27 import pem
2628 from cryptography.hazmat.backends import default_backend
2729 from cryptography.hazmat.primitives import serialization
2830 from josepy import JWKRSA
3537 from zope.interface import implementer
3638
3739 from twisted.internet import defer
40 from twisted.internet.interfaces import IReactorTCP
3841 from twisted.python.filepath import FilePath
3942 from twisted.python.url import URL
43 from twisted.web.resource import IResource
4044
4145 logger = logging.getLogger(__name__)
4246
4347
44 def create_issuing_service(reactor, acme_url, account_key_file, well_known_resource):
48 def create_issuing_service(
49 reactor: IReactorTCP,
50 acme_url: str,
51 account_key_file: str,
52 well_known_resource: IResource,
53 ) -> AcmeIssuingService:
4554 """Create an ACME issuing service, and attach it to a web Resource
4655
4756 Args:
4857 reactor: twisted reactor
49 acme_url (str): URL to use to request certificates
50 account_key_file (str): where to store the account key
51 well_known_resource (twisted.web.IResource): web resource for .well-known.
58 acme_url: URL to use to request certificates
59 account_key_file: where to store the account key
60 well_known_resource: web resource for .well-known.
5261 we will attach a child resource for "acme-challenge".
5362
5463 Returns:
8291 A store that only stores in memory.
8392 """
8493
85 certs = attr.ib(default=attr.Factory(dict))
94 certs = attr.ib(type=Dict[bytes, List[bytes]], default=attr.Factory(dict))
8695
87 def store(self, server_name, pem_objects):
96 def store(
97 self, server_name: bytes, pem_objects: Iterable[pem.AbstractPEMObject]
98 ) -> defer.Deferred:
8899 self.certs[server_name] = [o.as_bytes() for o in pem_objects]
89100 return defer.succeed(None)
90101
91102
92 def load_or_create_client_key(key_file):
103 def load_or_create_client_key(key_file: str) -> JWKRSA:
93104 """Load the ACME account key from a file, creating it if it does not exist.
94105
95106 Args:
96 key_file (str): name of the file to use as the account key
107 key_file: name of the file to use as the account key
97108 """
98109 # this is based on txacme.endpoint.load_or_create_client_key, but doesn't
99110 # hardcode the 'client.key' filename
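The `certs = attr.ib(type=Dict[bytes, List[bytes]], ...)` change above passes the type via `attr.ib(type=...)` rather than a class-level annotation, which fits the `@attr.s` decorator style used throughout these files. A small sketch of the same pattern (the class name is illustrative):

```python
from typing import Dict, List

import attr


@attr.s
class MemoryCertStore:
    # Typed attribute with a per-instance dict default; attr.Factory
    # prevents all instances from sharing one mutable dict.
    certs = attr.ib(type=Dict[bytes, List[bytes]], default=attr.Factory(dict))

    def store(self, server_name: bytes, pem_blobs: List[bytes]) -> None:
        self.certs[server_name] = list(pem_blobs)
```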
6060 from synapse.logging.context import defer_to_thread
6161 from synapse.metrics.background_process_metrics import run_as_background_process
6262 from synapse.module_api import ModuleApi
63 from synapse.storage.roommember import ProfileInfo
6364 from synapse.types import JsonDict, Requester, UserID
6465 from synapse.util import stringutils as stringutils
6566 from synapse.util.async_helpers import maybe_awaitable
566567 session.session_id, login_type, result
567568 )
568569 except LoginError as e:
569 if login_type == LoginType.EMAIL_IDENTITY:
570 # riot used to have a bug where it would request a new
571 # validation token (thus sending a new email) each time it
572 # got a 401 with a 'flows' field.
573 # (https://github.com/vector-im/vector-web/issues/2447).
574 #
575 # Grandfather in the old behaviour for now to avoid
576 # breaking old riot deployments.
577 raise
578
579570 # this step failed. Merge the error dict into the response
580571 # so that the client can have another go.
581572 errordict = e.error_dict()
13861377 )
13871378
13881379 return self._sso_auth_confirm_template.render(
1389 description=session.description, redirect_url=redirect_url,
1380 description=session.description,
1381 redirect_url=redirect_url,
1382 idp=sso_auth_provider,
13901383 )
13911384
13921385 async def complete_sso_login(
13951388 request: Request,
13961389 client_redirect_url: str,
13971390 extra_attributes: Optional[JsonDict] = None,
1391 new_user: bool = False,
13981392 ):
13991393 """Having figured out a mxid for this user, complete the HTTP request
14001394
14051399 process.
14061400 extra_attributes: Extra attributes which will be passed to the client
14071401 during successful login. Must be JSON serializable.
1402 new_user: True if we should use wording appropriate to a user who has just
1403 registered.
14081404 """
14091405 # If the account has been deactivated, do not proceed with the login
14101406 # flow.
14131409 respond_with_html(request, 403, self._sso_account_deactivated_template)
14141410 return
14151411
1412 profile = await self.store.get_profileinfo(
1413 UserID.from_string(registered_user_id).localpart
1414 )
1415
14161416 self._complete_sso_login(
1417 registered_user_id, request, client_redirect_url, extra_attributes
1417 registered_user_id,
1418 request,
1419 client_redirect_url,
1420 extra_attributes,
1421 new_user=new_user,
1422 user_profile_data=profile,
14181423 )
14191424
14201425 def _complete_sso_login(
14231428 request: Request,
14241429 client_redirect_url: str,
14251430 extra_attributes: Optional[JsonDict] = None,
1431 new_user: bool = False,
1432 user_profile_data: Optional[ProfileInfo] = None,
14261433 ):
14271434 """
14281435 The synchronous portion of complete_sso_login.
14291436
14301437 This exists purely for backwards compatibility of synapse.module_api.ModuleApi.
14311438 """
1439
1440 if user_profile_data is None:
1441 user_profile_data = ProfileInfo(None, None)
1442
14321443 # Store any extra attributes which will be passed in the login response.
14331444 # Note that this is per-user so it may overwrite a previous value, this
14341445 # is considered OK since the newest SSO attributes should be most valid.
14661477 display_url=redirect_url_no_params,
14671478 redirect_url=redirect_url,
14681479 server_name=self._server_name,
1480 new_user=new_user,
1481 user_id=registered_user_id,
1482 user_profile=user_profile_data,
14691483 )
14701484 respond_with_html(request, 200, html)
14711485
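The `_complete_sso_login` change above makes the profile data optional and substitutes an empty `ProfileInfo(None, None)` when the caller (e.g. the module API compatibility path) supplies none, so the template always receives a profile object. A minimal sketch of that defaulting, with a stand-in `ProfileInfo` tuple and a hypothetical context-building helper:

```python
from collections import namedtuple
from typing import Optional

# Stand-in for synapse.storage.roommember.ProfileInfo.
ProfileInfo = namedtuple("ProfileInfo", ["avatar_url", "display_name"])


def build_sso_template_context(
    user_id: str,
    new_user: bool = False,
    user_profile_data: Optional[ProfileInfo] = None,
) -> dict:
    # Older callers pass no profile; normalise to an empty one so the
    # template never has to handle a missing object.
    if user_profile_data is None:
        user_profile_data = ProfileInfo(None, None)
    return {
        "user_id": user_id,
        "new_user": new_user,
        "user_profile": user_profile_data,
    }
```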
7979 # user-facing name of this auth provider
8080 self.idp_name = "CAS"
8181
82 # we do not currently support icons for CAS auth, but this is required by
82 # we do not currently support brands/icons for CAS auth, but this is required by
8383 # the SsoIdentityProvider protocol type.
8484 self.idp_icon = None
85 self.idp_brand = None
8586
8687 self._sso_handler = hs.get_sso_handler()
8788
9899 Returns:
99100 The URL to use as a "service" parameter.
100101 """
101 return "%s%s?%s" % (
102 self._cas_service_url,
103 "/_matrix/client/r0/login/cas/ticket",
104 urllib.parse.urlencode(args),
105 )
102 return "%s?%s" % (self._cas_service_url, urllib.parse.urlencode(args),)
106103
107104 async def _validate_ticket(
108105 self, ticket: str, service_args: Dict[str, str]
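The CAS change above stops appending a hard-coded `/_matrix/client/r0/login/cas/ticket` path and uses the configured service URL directly, with the query string built by `urllib.parse.urlencode`. A sketch of the resulting URL construction:

```python
import urllib.parse
from typing import Dict


def build_service_url(cas_service_url: str, args: Dict[str, str]) -> str:
    # urlencode percent-escapes the parameter values for us.
    return "%s?%s" % (cas_service_url, urllib.parse.urlencode(args))
```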
1414 # See the License for the specific language governing permissions and
1515 # limitations under the License.
1616 import logging
17 from typing import TYPE_CHECKING, Any, Dict, Iterable, List, Optional, Set, Tuple
17 from typing import TYPE_CHECKING, Dict, Iterable, List, Optional, Set, Tuple
1818
1919 from synapse.api import errors
2020 from synapse.api.constants import EventTypes
6161 self._auth_handler = hs.get_auth_handler()
6262
6363 @trace
64 async def get_devices_by_user(self, user_id: str) -> List[Dict[str, Any]]:
64 async def get_devices_by_user(self, user_id: str) -> List[JsonDict]:
6565 """
6666 Retrieve the given user's devices
6767
8484 return devices
8585
8686 @trace
87 async def get_device(self, user_id: str, device_id: str) -> Dict[str, Any]:
87 async def get_device(self, user_id: str, device_id: str) -> JsonDict:
8888 """ Retrieve the given device
8989
9090 Args:
597597
598598
599599 def _update_device_from_client_ips(
600 device: Dict[str, Any], client_ips: Dict[Tuple[str, str], Dict[str, Any]]
600 device: JsonDict, client_ips: Dict[Tuple[str, str], JsonDict]
601601 ) -> None:
602602 ip = client_ips.get((device["user_id"], device["device_id"]), {})
603603 device.update({"last_seen_ts": ip.get("last_seen"), "last_seen_ip": ip.get("ip")})
945945 async def process_cross_signing_key_update(
946946 self,
947947 user_id: str,
948 master_key: Optional[Dict[str, Any]],
949 self_signing_key: Optional[Dict[str, Any]],
948 master_key: Optional[JsonDict],
949 self_signing_key: Optional[JsonDict],
950950 ) -> List[str]:
951951 """Process the given new master and self-signing key for the given remote user.
952952
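The device-handler hunks above replace `Dict[str, Any]` with the `JsonDict` alias; `_update_device_from_client_ips` merges last-seen client IP data, keyed by `(user_id, device_id)`, into a device dict. A self-contained sketch of that merge (with a local `JsonDict` alias mirroring `synapse.types`):

```python
from typing import Any, Dict, Tuple

# Synapse defines this alias in synapse.types; reproduced locally here.
JsonDict = Dict[str, Any]


def update_device_from_client_ips(
    device: JsonDict, client_ips: Dict[Tuple[str, str], JsonDict]
) -> None:
    # Look up the last-seen record for this (user, device) pair, falling
    # back to an empty dict so missing entries yield None fields.
    ip = client_ips.get((device["user_id"], device["device_id"]), {})
    device.update(
        {"last_seen_ts": ip.get("last_seen"), "last_seen_ip": ip.get("ip")}
    )
```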
1515 # limitations under the License.
1616
1717 import logging
18 from typing import Dict, List, Optional, Tuple
18 from typing import TYPE_CHECKING, Dict, Iterable, List, Optional, Tuple
1919
2020 import attr
2121 from canonicaljson import encode_canonical_json
3030 from synapse.logging.opentracing import log_kv, set_tag, tag_args, trace
3131 from synapse.replication.http.devices import ReplicationUserDevicesResyncRestServlet
3232 from synapse.types import (
33 JsonDict,
3334 UserID,
3435 get_domain_from_id,
3536 get_verify_key_from_cross_signing_key,
3940 from synapse.util.caches.expiringcache import ExpiringCache
4041 from synapse.util.retryutils import NotRetryingDestination
4142
43 if TYPE_CHECKING:
44 from synapse.app.homeserver import HomeServer
45
4246 logger = logging.getLogger(__name__)
4347
4448
4549 class E2eKeysHandler:
46 def __init__(self, hs):
50 def __init__(self, hs: "HomeServer"):
4751 self.store = hs.get_datastore()
4852 self.federation = hs.get_federation_client()
4953 self.device_handler = hs.get_device_handler()
7781 )
7882
7983 @trace
80 async def query_devices(self, query_body, timeout, from_user_id):
84 async def query_devices(
85 self, query_body: JsonDict, timeout: int, from_user_id: str
86 ) -> JsonDict:
8187 """ Handle a device key query from a client
8288
8389 {
97103 }
98104
99105 Args:
100 from_user_id (str): the user making the query. This is used when
106 from_user_id: the user making the query. This is used when
101107 adding cross-signing signatures to limit what signatures users
102108 can see.
103109 """
104110
105 device_keys_query = query_body.get("device_keys", {})
111 device_keys_query = query_body.get(
112 "device_keys", {}
113 ) # type: Dict[str, Iterable[str]]
106114
107115 # separate users by domain.
108116 # make a map from domain to user_id to device_ids
120128 set_tag("remote_key_query", remote_queries)
121129
122130 # First get local devices.
123 failures = {}
131 # A map of destination -> failure response.
132 failures = {} # type: Dict[str, JsonDict]
124133 results = {}
125134 if local_query:
126135 local_result = await self.query_local_devices(local_query)
134143 )
135144
136145 # Now attempt to get any remote devices from our local cache.
137 remote_queries_not_in_cache = {}
146 # A map of destination -> user ID -> device IDs.
147 remote_queries_not_in_cache = {} # type: Dict[str, Dict[str, Iterable[str]]]
138148 if remote_queries:
139 query_list = []
149 query_list = [] # type: List[Tuple[str, Optional[str]]]
140150 for user_id, device_ids in remote_queries.items():
141151 if device_ids:
142152 query_list.extend((user_id, device_id) for device_id in device_ids)
283293 return ret
284294
285295 async def get_cross_signing_keys_from_cache(
286 self, query, from_user_id
296 self, query: Iterable[str], from_user_id: Optional[str]
287297 ) -> Dict[str, Dict[str, dict]]:
288298 """Get cross-signing keys for users from the database
289299
290300 Args:
291 query (Iterable[string]) an iterable of user IDs. A dict whose keys
301 query: an iterable of user IDs. A dict whose keys
292302 are user IDs satisfies this, so the query format used for
293303 query_devices can be used here.
294 from_user_id (str): the user making the query. This is used when
304 from_user_id: the user making the query. This is used when
295305 adding cross-signing signatures to limit what signatures users
296306 can see.
297307
314324 if "self_signing" in user_info:
315325 self_signing_keys[user_id] = user_info["self_signing"]
316326
317 if (
318 from_user_id in keys
319 and keys[from_user_id] is not None
320 and "user_signing" in keys[from_user_id]
321 ):
322 # users can see other users' master and self-signing keys, but can
323 # only see their own user-signing keys
324 user_signing_keys[from_user_id] = keys[from_user_id]["user_signing"]
327 # users can see other users' master and self-signing keys, but can
328 # only see their own user-signing keys
329 if from_user_id:
330 from_user_key = keys.get(from_user_id)
331 if from_user_key and "user_signing" in from_user_key:
332 user_signing_keys[from_user_id] = from_user_key["user_signing"]
325333
326334 return {
327335 "master_keys": master_keys,
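The rewritten block above encodes the visibility rule: master and self-signing keys are visible to everyone, but a user-signing key is only returned to its own user, and the rewrite additionally guards against `from_user_id` being absent. A sketch of that filter in isolation:

```python
from typing import Dict, Optional


def visible_user_signing_keys(
    keys: Dict[str, dict], from_user_id: Optional[str]
) -> Dict[str, dict]:
    # Users may see other users' master and self-signing keys, but only
    # their own user-signing key; queries with no user see none.
    result = {}  # type: Dict[str, dict]
    if from_user_id:
        from_user_key = keys.get(from_user_id)
        if from_user_key and "user_signing" in from_user_key:
            result[from_user_id] = from_user_key["user_signing"]
    return result
```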
343351 A map from user_id -> device_id -> device details
344352 """
345353 set_tag("local_query", query)
346 local_query = []
347
348 result_dict = {}
354 local_query = [] # type: List[Tuple[str, Optional[str]]]
355
356 result_dict = {} # type: Dict[str, Dict[str, dict]]
349357 for user_id, device_ids in query.items():
350358 # we use UserID.from_string to catch invalid user ids
351359 if not self.is_mine(UserID.from_string(user_id)):
379387 log_kv(results)
380388 return result_dict
381389
382 async def on_federation_query_client_keys(self, query_body):
390 async def on_federation_query_client_keys(
391 self, query_body: Dict[str, Dict[str, Optional[List[str]]]]
392 ) -> JsonDict:
383393 """ Handle a device key query from a federated server
384394 """
385 device_keys_query = query_body.get("device_keys", {})
395 device_keys_query = query_body.get(
396 "device_keys", {}
397 ) # type: Dict[str, Optional[List[str]]]
386398 res = await self.query_local_devices(device_keys_query)
387399 ret = {"device_keys": res}
388400
396408 return ret
397409
398410 @trace
399 async def claim_one_time_keys(self, query, timeout):
400 local_query = []
401 remote_queries = {}
402
403 for user_id, device_keys in query.get("one_time_keys", {}).items():
411 async def claim_one_time_keys(
412 self, query: Dict[str, Dict[str, Dict[str, str]]], timeout: int
413 ) -> JsonDict:
414 local_query = [] # type: List[Tuple[str, str, str]]
415 remote_queries = {} # type: Dict[str, Dict[str, Dict[str, str]]]
416
417 for user_id, one_time_keys in query.get("one_time_keys", {}).items():
404418 # we use UserID.from_string to catch invalid user ids
405419 if self.is_mine(UserID.from_string(user_id)):
406 for device_id, algorithm in device_keys.items():
420 for device_id, algorithm in one_time_keys.items():
407421 local_query.append((user_id, device_id, algorithm))
408422 else:
409423 domain = get_domain_from_id(user_id)
410 remote_queries.setdefault(domain, {})[user_id] = device_keys
424 remote_queries.setdefault(domain, {})[user_id] = one_time_keys
411425
412426 set_tag("local_key_query", local_query)
413427 set_tag("remote_key_query", remote_queries)
414428
415429 results = await self.store.claim_e2e_one_time_keys(local_query)
416430
417 json_result = {}
418 failures = {}
431 # A map of user ID -> device ID -> key ID -> key.
432 json_result = {} # type: Dict[str, Dict[str, Dict[str, JsonDict]]]
433 failures = {} # type: Dict[str, JsonDict]
419434 for user_id, device_keys in results.items():
420435 for device_id, keys in device_keys.items():
421 for key_id, json_bytes in keys.items():
436 for key_id, json_str in keys.items():
422437 json_result.setdefault(user_id, {})[device_id] = {
423 key_id: json_decoder.decode(json_bytes)
438 key_id: json_decoder.decode(json_str)
424439 }
425440
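The claim loop above assembles the nested `one_time_keys` response (user ID → device ID → key ID → decoded key) with `setdefault`, decoding each stored JSON string as it goes. A sketch of that assembly (using stdlib `json` in place of Synapse's `json_decoder`):

```python
import json
from typing import Any, Dict


def build_claim_result(
    results: Dict[str, Dict[str, Dict[str, str]]]
) -> Dict[str, Dict[str, Dict[str, Any]]]:
    # results maps user ID -> device ID -> key ID -> key as a JSON string.
    json_result = {}  # type: Dict[str, Dict[str, Dict[str, Any]]]
    for user_id, device_keys in results.items():
        for device_id, keys in device_keys.items():
            for key_id, json_str in keys.items():
                # setdefault creates the per-user dict on first use.
                json_result.setdefault(user_id, {})[device_id] = {
                    key_id: json.loads(json_str)
                }
    return json_result
```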
426441 @trace
467482 return {"one_time_keys": json_result, "failures": failures}
468483
469484 @tag_args
470 async def upload_keys_for_user(self, user_id, device_id, keys):
485 async def upload_keys_for_user(
486 self, user_id: str, device_id: str, keys: JsonDict
487 ) -> JsonDict:
471488
472489 time_now = self.clock.time_msec()
473490
542559 return {"one_time_key_counts": result}
543560
544561 async def _upload_one_time_keys_for_user(
545 self, user_id, device_id, time_now, one_time_keys
546 ):
562 self, user_id: str, device_id: str, time_now: int, one_time_keys: JsonDict
563 ) -> None:
547564 logger.info(
548565 "Adding one_time_keys %r for device %r for user %r at %d",
549566 one_time_keys.keys(),
584601 log_kv({"message": "Inserting new one_time_keys.", "keys": new_keys})
585602 await self.store.add_e2e_one_time_keys(user_id, device_id, time_now, new_keys)
586603
587 async def upload_signing_keys_for_user(self, user_id, keys):
604 async def upload_signing_keys_for_user(
605 self, user_id: str, keys: JsonDict
606 ) -> JsonDict:
588607 """Upload signing keys for cross-signing
589608
590609 Args:
591 user_id (string): the user uploading the keys
592 keys (dict[string, dict]): the signing keys
610 user_id: the user uploading the keys
611 keys: the signing keys
593612 """
594613
595614 # if a master key is uploaded, then check it. Otherwise, load the
666685
667686 return {}
668687
669 async def upload_signatures_for_device_keys(self, user_id, signatures):
688 async def upload_signatures_for_device_keys(
689 self, user_id: str, signatures: JsonDict
690 ) -> JsonDict:
670691 """Upload device signatures for cross-signing
671692
672693 Args:
673 user_id (string): the user uploading the signatures
674 signatures (dict[string, dict[string, dict]]): map of users to
675 devices to signed keys. This is the submission from the user; an
676 exception will be raised if it is malformed.
694 user_id: the user uploading the signatures
695 signatures: map of users to devices to signed keys. This is the submission
696 from the user; an exception will be raised if it is malformed.
677697 Returns:
678 dict: response to be sent back to the client. The response will have
698 The response to be sent back to the client. The response will have
679699 a "failures" key, which will be a dict mapping users to devices
680700 to errors for the signatures that failed.
681701 Raises:
718738
719739 return {"failures": failures}
720740
721 async def _process_self_signatures(self, user_id, signatures):
741 async def _process_self_signatures(
742 self, user_id: str, signatures: JsonDict
743 ) -> Tuple[List["SignatureListItem"], Dict[str, Dict[str, dict]]]:
722744 """Process uploaded signatures of the user's own keys.
723745
724746 Signatures of the user's own keys from this API come in two forms:
730752 signatures (dict[string, dict]): map of devices to signed keys
731753
732754 Returns:
733 (list[SignatureListItem], dict[string, dict[string, dict]]):
734 a list of signatures to store, and a map of users to devices to failure
735 reasons
755 A tuple of a list of signatures to store, and a map of users to
756 devices to failure reasons
736757
737758 Raises:
738759 SynapseError: if the input is malformed
739760 """
740 signature_list = []
741 failures = {}
761 signature_list = [] # type: List[SignatureListItem]
762 failures = {} # type: Dict[str, Dict[str, JsonDict]]
742763 if not signatures:
743764 return signature_list, failures
744765
833854 return signature_list, failures
834855
835856 def _check_master_key_signature(
836 self, user_id, master_key_id, signed_master_key, stored_master_key, devices
837 ):
857 self,
858 user_id: str,
859 master_key_id: str,
860 signed_master_key: JsonDict,
861 stored_master_key: JsonDict,
862 devices: Dict[str, Dict[str, JsonDict]],
863 ) -> List["SignatureListItem"]:
838864 """Check signatures of a user's master key made by their devices.
839865
840866 Args:
841 user_id (string): the user whose master key is being checked
842 master_key_id (string): the ID of the user's master key
843 signed_master_key (dict): the user's signed master key that was uploaded
844 stored_master_key (dict): our previously-stored copy of the user's master key
845 devices (iterable(dict)): the user's devices
867 user_id: the user whose master key is being checked
868 master_key_id: the ID of the user's master key
869 signed_master_key: the user's signed master key that was uploaded
870 stored_master_key: our previously-stored copy of the user's master key
871 devices: the user's devices
846872
847873 Returns:
848 list[SignatureListItem]: a list of signatures to store
874 A list of signatures to store
849875
850876 Raises:
851877 SynapseError: if a signature is invalid
876902
877903 return master_key_signature_list
878904
879 async def _process_other_signatures(self, user_id, signatures):
905 async def _process_other_signatures(
906 self, user_id: str, signatures: Dict[str, dict]
907 ) -> Tuple[List["SignatureListItem"], Dict[str, Dict[str, dict]]]:
880908 """Process uploaded signatures of other users' keys. These will be the
881909 target user's master keys, signed by the uploading user's user-signing
882910 key.
883911
884912 Args:
885 user_id (string): the user uploading the keys
886 signatures (dict[string, dict]): map of users to devices to signed keys
913 user_id: the user uploading the keys
914 signatures: map of users to devices to signed keys
887915
888916 Returns:
889 (list[SignatureListItem], dict[string, dict[string, dict]]):
890 a list of signatures to store, and a map of users to devices to failure
 917 A tuple of a list of signatures to store, and a map of users to devices to failure
891918 reasons
892919
893920 Raises:
894921 SynapseError: if the input is malformed
895922 """
896 signature_list = []
897 failures = {}
923 signature_list = [] # type: List[SignatureListItem]
924 failures = {} # type: Dict[str, Dict[str, JsonDict]]
898925 if not signatures:
899926 return signature_list, failures
900927
9821009
9831010 async def _get_e2e_cross_signing_verify_key(
9841011 self, user_id: str, key_type: str, from_user_id: str = None
985 ):
1012 ) -> Tuple[JsonDict, str, VerifyKey]:
9861013 """Fetch locally or remotely query for a cross-signing public key.
9871014
9881015 First, attempt to fetch the cross-signing public key from storage.
9961023 This affects what signatures are fetched.
9971024
9981025 Returns:
999 dict, str, VerifyKey: the raw key data, the key ID, and the
1000 signedjson verify key
1026 The raw key data, the key ID, and the signedjson verify key
10011027
10021028 Raises:
10031029 NotFoundError: if the key is not found
11341160 return desired_key, desired_key_id, desired_verify_key
11351161
11361162
1137 def _check_cross_signing_key(key, user_id, key_type, signing_key=None):
1163 def _check_cross_signing_key(
1164 key: JsonDict, user_id: str, key_type: str, signing_key: Optional[VerifyKey] = None
1165 ) -> None:
11381166 """Check a cross-signing key uploaded by a user. Performs some basic sanity
11391167 checking, and ensures that it is signed, if a signature is required.
11401168
11411169 Args:
1142 key (dict): the key data to verify
1143 user_id (str): the user whose key is being checked
1144 key_type (str): the type of key that the key should be
1145 signing_key (VerifyKey): (optional) the signing key that the key should
1146 be signed with. If omitted, signatures will not be checked.
1170 key: the key data to verify
1171 user_id: the user whose key is being checked
1172 key_type: the type of key that the key should be
1173 signing_key: the signing key that the key should be signed with. If
1174 omitted, signatures will not be checked.
11471175 """
11481176 if (
11491177 key.get("user_id") != user_id
11611189 )
11621190
11631191
1164 def _check_device_signature(user_id, verify_key, signed_device, stored_device):
1192 def _check_device_signature(
1193 user_id: str,
1194 verify_key: VerifyKey,
1195 signed_device: JsonDict,
1196 stored_device: JsonDict,
1197 ) -> None:
11651198 """Check that a signature on a device or cross-signing key is correct and
11661199 matches the copy of the device/key that we have stored. Throws an
11671200 exception if an error is detected.
11681201
11691202 Args:
1170 user_id (str): the user ID whose signature is being checked
1171 verify_key (VerifyKey): the key to verify the device with
1172 signed_device (dict): the uploaded signed device data
1173 stored_device (dict): our previously stored copy of the device
1203 user_id: the user ID whose signature is being checked
1204 verify_key: the key to verify the device with
1205 signed_device: the uploaded signed device data
1206 stored_device: our previously stored copy of the device
11741207
11751208 Raises:
11761209 SynapseError: if the signature was invalid or the sent device is not the
12001233 raise SynapseError(400, "Invalid signature", Codes.INVALID_SIGNATURE)
12011234
12021235
1203 def _exception_to_failure(e):
1236 def _exception_to_failure(e: Exception) -> JsonDict:
12041237 if isinstance(e, SynapseError):
12051238 return {"status": e.code, "errcode": e.errcode, "message": str(e)}
12061239
12171250 return {"status": 503, "message": str(e)}
12181251
12191252
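`_exception_to_failure` above converts an exception into the JSON `failures` entry returned to clients: a `SynapseError` keeps its HTTP status and errcode, while anything else (the branches elided from this hunk) degrades to a generic 503. A stand-in sketch (the `SynapseError` class here is a hypothetical simplification of `synapse.api.errors.SynapseError`):

```python
from typing import Any, Dict


class SynapseError(Exception):
    """Simplified stand-in for synapse.api.errors.SynapseError."""

    def __init__(self, code: int, msg: str, errcode: str = "M_UNKNOWN") -> None:
        super().__init__(msg)
        self.code = code
        self.errcode = errcode


def exception_to_failure(e: Exception) -> Dict[str, Any]:
    if isinstance(e, SynapseError):
        return {"status": e.code, "errcode": e.errcode, "message": str(e)}
    # Unknown errors (e.g. federation failures in the elided branches)
    # become a generic service-unavailable failure.
    return {"status": 503, "message": str(e)}
```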
1220 def _one_time_keys_match(old_key_json, new_key):
1253 def _one_time_keys_match(old_key_json: str, new_key: JsonDict) -> bool:
12211254 old_key = json_decoder.decode(old_key_json)
12221255
12231256 # if either is a string rather than an object, they must match exactly
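`_one_time_keys_match` decides whether a re-uploaded one-time key equals the stored one: per the comment above, if either side is a bare string the comparison is exact; the remainder of the function is elided from this hunk, so the object-comparison part of this sketch (ignoring `signatures`) is an assumption about those elided lines:

```python
import json
from typing import Any


def _strip_signatures(key: dict) -> dict:
    return {k: v for k, v in key.items() if k != "signatures"}


def one_time_keys_match(old_key_json: str, new_key: Any) -> bool:
    old_key = json.loads(old_key_json)
    # If either is a string rather than an object, they must match exactly.
    if not isinstance(old_key, dict) or not isinstance(new_key, dict):
        return old_key == new_key
    # Assumed behaviour of the elided lines: compare everything except
    # the signatures, which may differ between uploads of the same key.
    return _strip_signatures(old_key) == _strip_signatures(new_key)
```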
12381271 """An item in the signature list as used by upload_signatures_for_device_keys.
12391272 """
12401273
1241 signing_key_id = attr.ib()
1242 target_user_id = attr.ib()
1243 target_device_id = attr.ib()
1244 signature = attr.ib()
1274 signing_key_id = attr.ib(type=str)
1275 target_user_id = attr.ib(type=str)
1276 target_device_id = attr.ib(type=str)
1277 signature = attr.ib(type=JsonDict)
12451278
12461279
12471280 class SigningKeyEduUpdater:
12481281 """Handles incoming signing key updates from federation and updates the DB"""
12491282
1250 def __init__(self, hs, e2e_keys_handler):
1283 def __init__(self, hs: "HomeServer", e2e_keys_handler: E2eKeysHandler):
12511284 self.store = hs.get_datastore()
12521285 self.federation = hs.get_federation_client()
12531286 self.clock = hs.get_clock()
12561289 self._remote_edu_linearizer = Linearizer(name="remote_signing_key")
12571290
12581291 # user_id -> list of updates waiting to be handled.
1259 self._pending_updates = {}
1292 self._pending_updates = {} # type: Dict[str, List[Tuple[JsonDict, JsonDict]]]
12601293
12611294 # Recently seen stream ids. We don't bother keeping these in the DB,
12621295 # but they're useful to have them about to reduce the number of spurious
12691302 iterable=True,
12701303 )
12711304
1272 async def incoming_signing_key_update(self, origin, edu_content):
1305 async def incoming_signing_key_update(
1306 self, origin: str, edu_content: JsonDict
1307 ) -> None:
12731308 """Called on incoming signing key update from federation. Responsible for
12741309 parsing the EDU and adding to pending updates list.
12751310
12761311 Args:
1277 origin (string): the server that sent the EDU
1278 edu_content (dict): the contents of the EDU
1312 origin: the server that sent the EDU
1313 edu_content: the contents of the EDU
12791314 """
12801315
12811316 user_id = edu_content.pop("user_id")
12981333
12991334 await self._handle_signing_key_updates(user_id)
13001335
1301 async def _handle_signing_key_updates(self, user_id):
1336 async def _handle_signing_key_updates(self, user_id: str) -> None:
13021337 """Actually handle pending updates.
13031338
13041339 Args:
1305 user_id (string): the user whose updates we are processing
1340 user_id: the user whose updates we are processing
13061341 """
13071342
13081343 device_handler = self.e2e_keys_handler.device_handler
13141349 # This can happen since we batch updates
13151350 return
13161351
1317 device_ids = []
1352 device_ids = [] # type: List[str]
13181353
13191354 logger.info("pending updates: %r", pending_updates)
13201355
1414 # limitations under the License.
1515
1616 import logging
17 from typing import TYPE_CHECKING, List, Optional
1718
1819 from synapse.api.errors import (
1920 Codes,
2324 SynapseError,
2425 )
2526 from synapse.logging.opentracing import log_kv, trace
27 from synapse.types import JsonDict
2628 from synapse.util.async_helpers import Linearizer
29
30 if TYPE_CHECKING:
31 from synapse.app.homeserver import HomeServer
2732
2833 logger = logging.getLogger(__name__)
2934
3641 The actual payload of the encrypted keys is completely opaque to the handler.
3742 """
3843
39 def __init__(self, hs):
44 def __init__(self, hs: "HomeServer"):
4045 self.store = hs.get_datastore()
4146
4247 # Used to lock whenever a client is uploading key data. This prevents collisions
4752 self._upload_linearizer = Linearizer("upload_room_keys_lock")
4853
4954 @trace
50 async def get_room_keys(self, user_id, version, room_id=None, session_id=None):
55 async def get_room_keys(
56 self,
57 user_id: str,
58 version: str,
59 room_id: Optional[str] = None,
60 session_id: Optional[str] = None,
61 ) -> List[JsonDict]:
5162 """Bulk get the E2E room keys for a given backup, optionally filtered to a given
5263 room, or a given session.
5364 See EndToEndRoomKeyStore.get_e2e_room_keys for full details.
5465
5566 Args:
56 user_id(str): the user whose keys we're getting
57 version(str): the version ID of the backup we're getting keys from
58 room_id(string): room ID to get keys for, for None to get keys for all rooms
59 session_id(string): session ID to get keys for, for None to get keys for all
67 user_id: the user whose keys we're getting
68 version: the version ID of the backup we're getting keys from
 69 room_id: room ID to get keys for, or None to get keys for all rooms
 70 session_id: session ID to get keys for, or None to get keys for all
6071 sessions
6172 Raises:
6273 NotFoundError: if the backup version does not exist
6374 Returns:
64 A deferred list of dicts giving the session_data and message metadata for
75 A list of dicts giving the session_data and message metadata for
6576 these room keys.
6677 """
6778
8596 return results
8697
8798 @trace
88 async def delete_room_keys(self, user_id, version, room_id=None, session_id=None):
99 async def delete_room_keys(
100 self,
101 user_id: str,
102 version: str,
103 room_id: Optional[str] = None,
104 session_id: Optional[str] = None,
105 ) -> JsonDict:
89106 """Bulk delete the E2E room keys for a given backup, optionally filtered to a given
90107 room or a given session.
91108 See EndToEndRoomKeyStore.delete_e2e_room_keys for full details.
92109
93110 Args:
94 user_id(str): the user whose backup we're deleting
95 version(str): the version ID of the backup we're deleting
96 room_id(string): room ID to delete keys for, for None to delete keys for all
111 user_id: the user whose backup we're deleting
112 version: the version ID of the backup we're deleting
 113 room_id: room ID to delete keys for, or None to delete keys for all
97114 rooms
98 session_id(string): session ID to delete keys for, for None to delete keys
 115 session_id: session ID to delete keys for, or None to delete keys
99116 for all sessions
100117 Raises:
101118 NotFoundError: if the backup version does not exist
127144 return {"etag": str(version_etag), "count": count}
128145
129146 @trace
130 async def upload_room_keys(self, user_id, version, room_keys):
147 async def upload_room_keys(
148 self, user_id: str, version: str, room_keys: JsonDict
149 ) -> JsonDict:
131150 """Bulk upload a list of room keys into a given backup version, asserting
132151 that the given version is the current backup version. room_keys are merged
133152 into the current backup as described in RoomKeysServlet.on_PUT().
134153
135154 Args:
136 user_id(str): the user whose backup we're setting
137 version(str): the version ID of the backup we're updating
138 room_keys(dict): a nested dict describing the room_keys we're setting:
155 user_id: the user whose backup we're setting
156 version: the version ID of the backup we're updating
157 room_keys: a nested dict describing the room_keys we're setting:
139158
140159 {
141160 "rooms": {
253272 return {"etag": str(version_etag), "count": count}
254273
255274 @staticmethod
256 def _should_replace_room_key(current_room_key, room_key):
275 def _should_replace_room_key(
276 current_room_key: Optional[JsonDict], room_key: JsonDict
277 ) -> bool:
257278 """
258279 Determine whether to replace a given current_room_key (if any)
259280 with a newly uploaded room_key backup
260281
261282 Args:
262 current_room_key (dict): Optional, the current room_key dict if any
263 room_key (dict): The new room_key dict which may or may not be fit to
283 current_room_key: Optional, the current room_key dict if any
284 room_key: The new room_key dict which may or may not be fit to
264285 replace the current_room_key
265286
266287 Returns:
285306 return True
286307
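The decision sketched in this docstring can be illustrated with a minimal, self-contained policy function. This is a hypothetical sketch only: the actual criteria live in the elided function body, and the field names (`is_verified`, `first_message_index`, `forwarded_count`) are assumed from the room-key backup format.

```python
from typing import Optional


def should_replace_room_key(current: Optional[dict], new: dict) -> bool:
    """Illustrative replacement policy for e2e room key backups.

    Prefers verified keys, then keys covering earlier messages, then
    keys forwarded fewer times. A sketch, not the real implementation.
    """
    if current is None:
        # No existing key for this session: always accept the upload.
        return True
    if new.get("is_verified") != current.get("is_verified"):
        # A verified key beats an unverified one, regardless of indices.
        return bool(new.get("is_verified"))
    if new["first_message_index"] != current["first_message_index"]:
        # A key that can decrypt earlier messages is more useful.
        return new["first_message_index"] < current["first_message_index"]
    if new["forwarded_count"] != current["forwarded_count"]:
        # Fewer forwards means the key is closer to the original sender.
        return new["forwarded_count"] < current["forwarded_count"]
    # Equivalent key: keep the current one.
    return False
```

The ordering of the checks matters: verification status dominates, and the index/count comparisons only break ties.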
287308 @trace
288 async def create_version(self, user_id, version_info):
309 async def create_version(self, user_id: str, version_info: JsonDict) -> str:
289310 """Create a new backup version. This automatically becomes the new
290311 backup version for the user's keys; previous backups will no longer be
291312 writeable to.
292313
293314 Args:
294 user_id(str): the user whose backup version we're creating
295 version_info(dict): metadata about the new version being created
315 user_id: the user whose backup version we're creating
316 version_info: metadata about the new version being created
296317
297318 {
298319 "algorithm": "m.megolm_backup.v1",
300321 }
301322
302323 Returns:
303 A deferred of a string that gives the new version number.
324 The new version number.
304325 """
305326
306327 # TODO: Validate the JSON to make sure it has the right keys.
312333 )
313334 return new_version
314335
315 async def get_version_info(self, user_id, version=None):
336 async def get_version_info(
337 self, user_id: str, version: Optional[str] = None
338 ) -> JsonDict:
316339 """Get the info about a given version of the user's backup
317340
318341 Args:
319 user_id(str): the user whose current backup version we're querying
320 version(str): Optional; if None gives the most recent version
342 user_id: the user whose current backup version we're querying
343 version: Optional; if None gives the most recent version
321344 otherwise a historical one.
322345 Raises:
323346 NotFoundError: if the requested backup version doesn't exist
324347 Returns:
325 A deferred of a info dict that gives the info about the new version.
348 An info dict that gives the info about the requested version.
326349
327350 {
328351 "version": "1234",
345368 return res
346369
347370 @trace
348 async def delete_version(self, user_id, version=None):
371 async def delete_version(self, user_id: str, version: Optional[str] = None) -> None:
349372 """Deletes a given version of the user's e2e_room_keys backup
350373
351374 Args:
365388 raise
366389
367390 @trace
368 async def update_version(self, user_id, version, version_info):
391 async def update_version(
392 self, user_id: str, version: str, version_info: JsonDict
393 ) -> JsonDict:
369394 """Update the info about a given version of the user's backup
370395
371396 Args:
372 user_id(str): the user whose current backup version we're updating
373 version(str): the backup version we're updating
374 version_info(dict): the new information about the backup
397 user_id: the user whose current backup version we're updating
398 version: the backup version we're updating
399 version_info: the new information about the backup
375400 Raises:
376401 NotFoundError: if the requested backup version doesn't exist
377402 Returns:
378 A deferred of an empty dict.
403 An empty dict.
379404 """
380405 if "version" not in version_info:
381406 version_info["version"] = version
16161616 if event.state_key == self._server_notices_mxid:
16171617 raise SynapseError(HTTPStatus.FORBIDDEN, "Cannot invite this user")
16181618
1619 # We retrieve the room member handler here as to not cause a cyclic dependency
1620 member_handler = self.hs.get_room_member_handler()
1621 # We don't rate limit based on room ID, as that should be done by
1622 # sending server.
1623 member_handler.ratelimit_invite(None, event.state_key)
1624
16191625 # keep a record of the room version, if we don't yet know it.
16201626 # (this may get overwritten if we later get a different room version in a
16211627 # join dance).
20912097
20922098 if event.type == EventTypes.GuestAccess and not context.rejected:
20932099 await self.maybe_kick_guest_users(event)
2100
2101 # If we are going to send this event over federation we precalculate
2102 # the joined hosts.
2103 if event.internal_metadata.get_send_on_behalf_of():
2104 await self.event_creation_handler.cache_joined_hosts_for_event(event)
20942105
20952106 return context
20962107
1414 # limitations under the License.
1515
1616 import logging
17 from typing import TYPE_CHECKING, Dict, Iterable, List, Set
1718
1819 from synapse.api.errors import HttpResponseException, RequestSendFailed, SynapseError
19 from synapse.types import GroupID, get_domain_from_id
20 from synapse.types import GroupID, JsonDict, get_domain_from_id
21
22 if TYPE_CHECKING:
23 from synapse.app.homeserver import HomeServer
2024
2125 logger = logging.getLogger(__name__)
2226
5559
5660
5761 class GroupsLocalWorkerHandler:
58 def __init__(self, hs):
62 def __init__(self, hs: "HomeServer"):
5963 self.hs = hs
6064 self.store = hs.get_datastore()
6165 self.room_list_handler = hs.get_room_list_handler()
8387 get_group_role = _create_rerouter("get_group_role")
8488 get_group_roles = _create_rerouter("get_group_roles")
8589
86 async def get_group_summary(self, group_id, requester_user_id):
90 async def get_group_summary(
91 self, group_id: str, requester_user_id: str
92 ) -> JsonDict:
8793 """Get the group summary for a group.
8894
8995 If the group is remote we check that the users have valid attestations.
136142
137143 return res
138144
139 async def get_users_in_group(self, group_id, requester_user_id):
145 async def get_users_in_group(
146 self, group_id: str, requester_user_id: str
147 ) -> JsonDict:
140148 """Get users in a group
141149 """
142150 if self.is_mine_id(group_id):
143 res = await self.groups_server_handler.get_users_in_group(
151 return await self.groups_server_handler.get_users_in_group(
144152 group_id, requester_user_id
145153 )
146 return res
147154
148155 group_server_name = get_domain_from_id(group_id)
149156
177184
178185 return res
179186
180 async def get_joined_groups(self, user_id):
187 async def get_joined_groups(self, user_id: str) -> JsonDict:
181188 group_ids = await self.store.get_joined_groups(user_id)
182189 return {"groups": group_ids}
183190
184 async def get_publicised_groups_for_user(self, user_id):
191 async def get_publicised_groups_for_user(self, user_id: str) -> JsonDict:
185192 if self.hs.is_mine_id(user_id):
186193 result = await self.store.get_publicised_groups_for_user(user_id)
187194
205212 # TODO: Verify attestations
206213 return {"groups": result}
207214
208 async def bulk_get_publicised_groups(self, user_ids, proxy=True):
209 destinations = {}
215 async def bulk_get_publicised_groups(
216 self, user_ids: Iterable[str], proxy: bool = True
217 ) -> JsonDict:
218 destinations = {} # type: Dict[str, Set[str]]
210219 local_users = set()
211220
212221 for user_id in user_ids:
219228 raise SynapseError(400, "Some user_ids are not local")
220229
221230 results = {}
222 failed_results = []
231 failed_results = [] # type: List[str]
223232 for destination, dest_user_ids in destinations.items():
224233 try:
225234 r = await self.transport_client.bulk_get_publicised_groups(
241250
242251
243252 class GroupsLocalHandler(GroupsLocalWorkerHandler):
244 def __init__(self, hs):
253 def __init__(self, hs: "HomeServer"):
245254 super().__init__(hs)
246255
247256 # Ensure attestations get renewed
270279
271280 set_group_join_policy = _create_rerouter("set_group_join_policy")
272281
273 async def create_group(self, group_id, user_id, content):
282 async def create_group(
283 self, group_id: str, user_id: str, content: JsonDict
284 ) -> JsonDict:
274285 """Create a group
275286 """
276287
283294 local_attestation = None
284295 remote_attestation = None
285296 else:
286 local_attestation = self.attestations.create_attestation(group_id, user_id)
287 content["attestation"] = local_attestation
288
289 content["user_profile"] = await self.profile_handler.get_profile(user_id)
290
291 try:
292 res = await self.transport_client.create_group(
293 get_domain_from_id(group_id), group_id, user_id, content
294 )
295 except HttpResponseException as e:
296 raise e.to_synapse_error()
297 except RequestSendFailed:
298 raise SynapseError(502, "Failed to contact group server")
299
300 remote_attestation = res["attestation"]
301 await self.attestations.verify_attestation(
302 remote_attestation,
303 group_id=group_id,
304 user_id=user_id,
305 server_name=get_domain_from_id(group_id),
306 )
297 raise SynapseError(400, "Unable to create remote groups")
307298
308299 is_publicised = content.get("publicise", False)
309300 token = await self.store.register_user_group_membership(
319310
320311 return res
321312
322 async def join_group(self, group_id, user_id, content):
313 async def join_group(
314 self, group_id: str, user_id: str, content: JsonDict
315 ) -> JsonDict:
323316 """Request to join a group
324317 """
325318 if self.is_mine_id(group_id):
364357
365358 return {}
366359
367 async def accept_invite(self, group_id, user_id, content):
360 async def accept_invite(
361 self, group_id: str, user_id: str, content: JsonDict
362 ) -> JsonDict:
368363 """Accept an invite to a group
369364 """
370365 if self.is_mine_id(group_id):
409404
410405 return {}
411406
412 async def invite(self, group_id, user_id, requester_user_id, config):
407 async def invite(
408 self, group_id: str, user_id: str, requester_user_id: str, config: JsonDict
409 ) -> JsonDict:
413410 """Invite a user to a group
414411 """
415412 content = {"requester_user_id": requester_user_id, "config": config}
433430
434431 return res
435432
436 async def on_invite(self, group_id, user_id, content):
433 async def on_invite(
434 self, group_id: str, user_id: str, content: JsonDict
435 ) -> JsonDict:
437436 """One of our users was invited to a group
438437 """
439438 # TODO: Support auto join and rejection
464463 return {"state": "invite", "user_profile": user_profile}
465464
466465 async def remove_user_from_group(
467 self, group_id, user_id, requester_user_id, content
468 ):
466 self, group_id: str, user_id: str, requester_user_id: str, content: JsonDict
467 ) -> JsonDict:
469468 """Remove a user from a group
470469 """
471470 if user_id == requester_user_id:
498497
499498 return res
500499
501 async def user_removed_from_group(self, group_id, user_id, content):
500 async def user_removed_from_group(
501 self, group_id: str, user_id: str, content: JsonDict
502 ) -> None:
502503 """One of our users was removed/kicked from a group
503504 """
504505 # TODO: Check if user in group
2626 HttpResponseException,
2727 SynapseError,
2828 )
29 from synapse.api.ratelimiting import Ratelimiter
2930 from synapse.config.emailconfig import ThreepidBehaviour
3031 from synapse.http import RequestTimedOutError
3132 from synapse.http.client import SimpleHttpClient
33 from synapse.http.site import SynapseRequest
3234 from synapse.types import JsonDict, Requester
3335 from synapse.util import json_decoder
3436 from synapse.util.hash import sha256_and_url_safe_base64
5557 self.hs = hs
5658
5759 self._web_client_location = hs.config.invite_client_location
60
61 # Ratelimiters for `/requestToken` endpoints.
62 self._3pid_validation_ratelimiter_ip = Ratelimiter(
63 clock=hs.get_clock(),
64 rate_hz=hs.config.ratelimiting.rc_3pid_validation.per_second,
65 burst_count=hs.config.ratelimiting.rc_3pid_validation.burst_count,
66 )
67 self._3pid_validation_ratelimiter_address = Ratelimiter(
68 clock=hs.get_clock(),
69 rate_hz=hs.config.ratelimiting.rc_3pid_validation.per_second,
70 burst_count=hs.config.ratelimiting.rc_3pid_validation.burst_count,
71 )
72
73 def ratelimit_request_token_requests(
74 self, request: SynapseRequest, medium: str, address: str,
75 ):
76 """Used to ratelimit requests to `/requestToken` by IP and address.
77
78 Args:
79 request: The associated request
80 medium: The type of threepid, e.g. "msisdn" or "email"
81 address: The actual threepid ID, e.g. the phone number or email address
82 """
83
84 self._3pid_validation_ratelimiter_ip.ratelimit((medium, request.getClientIP()))
85 self._3pid_validation_ratelimiter_address.ratelimit((medium, address))
5886
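The hunk above applies two independent limiters to the same request, one keyed by `(medium, client IP)` and one by `(medium, threepid address)`. A toy token-bucket version shows the shape of that scheme; `TokenBucket` and `ratelimit_request_token` are illustrative stand-ins, not synapse's actual `Ratelimiter` API.

```python
import time


class TokenBucket:
    """Toy token-bucket limiter: allows `burst` actions per key,
    refilling at `rate_hz` tokens per second."""

    def __init__(self, rate_hz: float, burst: int):
        self.rate_hz = rate_hz
        self.burst = burst
        self._state = {}  # key -> (tokens remaining, last timestamp)

    def allow(self, key, now=None) -> bool:
        now = time.monotonic() if now is None else now
        tokens, last = self._state.get(key, (float(self.burst), now))
        # Refill proportionally to elapsed time, capped at the burst size.
        tokens = min(self.burst, tokens + (now - last) * self.rate_hz)
        if tokens < 1:
            self._state[key] = (tokens, now)
            return False
        self._state[key] = (tokens - 1, now)
        return True


def ratelimit_request_token(ip_limiter, addr_limiter, medium, client_ip, address):
    # Both buckets must have capacity: one keyed by (medium, IP), one
    # by (medium, threepid address), mirroring the handler above.
    return ip_limiter.allow((medium, client_ip)) and addr_limiter.allow((medium, address))
```

Keying each bucket on the medium as well means email and msisdn validation requests are limited independently for the same client.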
5987 async def threepid_from_creds(
6088 self, id_server: str, creds: Dict[str, str]
474502 raise e.to_synapse_error()
475503 except RequestTimedOutError:
476504 raise SynapseError(500, "Timed out contacting identity server")
505
506 # It is already checked that public_baseurl is configured since this code
507 # should only be used if account_threepid_delegate_msisdn is true.
508 assert self.hs.config.public_baseurl
477509
478510 # we need to tell the client to send the token back to us, since it doesn't
479511 # otherwise know where to send it, so add submit_url response parameter
173173 raise NotFoundError("Can't find event for token %s" % (at_token,))
174174
175175 visible_events = await filter_events_for_client(
176 self.storage, user_id, last_events, filter_send_to_client=False
176 self.storage, user_id, last_events, filter_send_to_client=False,
177177 )
178178
179179 event = last_events[0]
430430 self._message_handler = hs.get_message_handler()
431431
432432 self._ephemeral_events_enabled = hs.config.enable_ephemeral_messages
433
434 self._external_cache = hs.get_external_cache()
433435
434436 async def create_event(
435437 self,
938940
939941 await self.action_generator.handle_push_actions_for_event(event, context)
940942
943 await self.cache_joined_hosts_for_event(event)
944
941945 try:
942946 # If we're a worker we need to hit out to the master.
943947 writer_instance = self._events_shard_config.get_instance(event.room_id)
976980 # staging area, if we calculated them.
977981 await self.store.remove_push_actions_from_staging(event.event_id)
978982 raise
983
984 async def cache_joined_hosts_for_event(self, event: EventBase) -> None:
985 """Precalculate the joined hosts at the event, when using Redis, so that
986 external federation senders don't have to recalculate it themselves.
987 """
988
989 if not self._external_cache.is_enabled():
990 return
991
992 # We actually store two mappings, event ID -> prev state group,
993 # state group -> joined hosts, which is much more space efficient
994 # than event ID -> joined hosts.
995 #
996 # Note: We have to cache event ID -> prev state group, as we don't
997 # store that in the DB.
998 #
999 # Note: We always set the state group -> joined hosts cache, even if
1000 # we already set it, so that the expiry time is reset.
1001
1002 state_entry = await self.state.resolve_state_groups_for_events(
1003 event.room_id, event_ids=event.prev_event_ids()
1004 )
1005
1006 if state_entry.state_group:
1007 joined_hosts = await self.store.get_joined_hosts(event.room_id, state_entry)
1008
1009 await self._external_cache.set(
1010 "event_to_prev_state_group",
1011 event.event_id,
1012 state_entry.state_group,
1013 expiry_ms=60 * 60 * 1000,
1014 )
1015 await self._external_cache.set(
1016 "get_joined_hosts",
1017 str(state_entry.state_group),
1018 list(joined_hosts),
1019 expiry_ms=60 * 60 * 1000,
1020 )
9791021
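The space saving described in the comment above comes from the indirection: many events share a state group, so the (potentially large) host list is stored once per state group rather than once per event. A minimal in-memory sketch of the two mappings, with hypothetical names:

```python
class JoinedHostsCache:
    """Illustrative two-level cache: event_id -> state group, and
    state group -> joined hosts. Events sharing a state group share
    one host list instead of each storing its own copy."""

    def __init__(self):
        self.event_to_state_group = {}
        self.state_group_to_hosts = {}

    def set(self, event_id: str, state_group: int, hosts: list) -> None:
        self.event_to_state_group[event_id] = state_group
        # Always (re)set the host list; in the real cache this also
        # resets the entry's expiry time.
        self.state_group_to_hosts[state_group] = list(hosts)

    def get(self, event_id: str):
        # Two lookups: event -> state group, then state group -> hosts.
        sg = self.event_to_state_group.get(event_id)
        if sg is None:
            return None
        return self.state_group_to_hosts.get(sg)
```

In the Redis-backed version each mapping is a separate external cache with its own expiry, which is why both `set` calls above carry an `expiry_ms`.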
9801022 async def _validate_canonical_alias(
9811023 self, directory_handler, room_alias_str: str, expected_room_id: str
101101 ) from e
102102
103103 async def handle_oidc_callback(self, request: SynapseRequest) -> None:
104 """Handle an incoming request to /_synapse/oidc/callback
104 """Handle an incoming request to /_synapse/client/oidc/callback
105105
106106 Since we might want to display OIDC-related errors in a user-friendly
107107 way, we don't raise SynapseError from here. Instead, we call
272272
273273 # MXC URI for icon for this auth provider
274274 self.idp_icon = provider.idp_icon
275
276 # optional brand identifier for this auth provider
277 self.idp_brand = provider.idp_brand
275278
276279 self._sso_handler = hs.get_sso_handler()
277280
639642
640643 - ``client_id``: the client ID set in ``oidc_config.client_id``
641644 - ``response_type``: ``code``
642 - ``redirect_uri``: the callback URL ; ``{base url}/_synapse/oidc/callback``
645 - ``redirect_uri``: the callback URL; ``{base url}/_synapse/client/oidc/callback``
643646 - ``scope``: the list of scopes set in ``oidc_config.scopes``
644647 - ``state``: a random string
645648 - ``nonce``: a random string
680683 request.addCookie(
681684 SESSION_COOKIE_NAME,
682685 cookie,
683 path="/_synapse/oidc",
686 path="/_synapse/client/oidc",
684687 max_age="3600",
685688 httpOnly=True,
686689 sameSite="lax",
701704 async def handle_oidc_callback(
702705 self, request: SynapseRequest, session_data: "OidcSessionData", code: str
703706 ) -> None:
704 """Handle an incoming request to /_synapse/oidc/callback
707 """Handle an incoming request to /_synapse/client/oidc/callback
705708
706709 By this time we have already validated the session on the synapse side, and
707710 now need to do the provider-specific operations. This includes:
10551058
10561059
10571060 UserAttributeDict = TypedDict(
1058 "UserAttributeDict", {"localpart": Optional[str], "display_name": Optional[str]}
1061 "UserAttributeDict",
1062 {"localpart": Optional[str], "display_name": Optional[str], "emails": List[str]},
10591063 )
10601064 C = TypeVar("C")
10611065
11341138 env = Environment(finalize=jinja_finalize)
11351139
11361140
1137 @attr.s
1141 @attr.s(slots=True, frozen=True)
11381142 class JinjaOidcMappingConfig:
11391143 subject_claim = attr.ib(type=str)
11401144 localpart_template = attr.ib(type=Optional[Template])
11411145 display_name_template = attr.ib(type=Optional[Template])
1146 email_template = attr.ib(type=Optional[Template])
11421147 extra_attributes = attr.ib(type=Dict[str, Template])
11431148
11441149
11551160 def parse_config(config: dict) -> JinjaOidcMappingConfig:
11561161 subject_claim = config.get("subject_claim", "sub")
11571162
1158 localpart_template = None # type: Optional[Template]
1159 if "localpart_template" in config:
1163 def parse_template_config(option_name: str) -> Optional[Template]:
1164 if option_name not in config:
1165 return None
11601166 try:
1161 localpart_template = env.from_string(config["localpart_template"])
1167 return env.from_string(config[option_name])
11621168 except Exception as e:
1163 raise ConfigError(
1164 "invalid jinja template", path=["localpart_template"]
1165 ) from e
1166
1167 display_name_template = None # type: Optional[Template]
1168 if "display_name_template" in config:
1169 try:
1170 display_name_template = env.from_string(config["display_name_template"])
1171 except Exception as e:
1172 raise ConfigError(
1173 "invalid jinja template", path=["display_name_template"]
1174 ) from e
1169 raise ConfigError("invalid jinja template", path=[option_name]) from e
1170
1171 localpart_template = parse_template_config("localpart_template")
1172 display_name_template = parse_template_config("display_name_template")
1173 email_template = parse_template_config("email_template")
11751174
11761175 extra_attributes = {} # type: Dict[str, Template]
11771176 if "extra_attributes" in config:
11911190 subject_claim=subject_claim,
11921191 localpart_template=localpart_template,
11931192 display_name_template=display_name_template,
1193 email_template=email_template,
11941194 extra_attributes=extra_attributes,
11951195 )
11961196
12121212 # a usable mxid.
12131213 localpart += str(failures) if failures else ""
12141214
1215 display_name = None # type: Optional[str]
1216 if self._config.display_name_template is not None:
1217 display_name = self._config.display_name_template.render(
1218 user=userinfo
1219 ).strip()
1220
1221 if display_name == "":
1222 display_name = None
1223
1224 return UserAttributeDict(localpart=localpart, display_name=display_name)
1215 def render_template_field(template: Optional[Template]) -> Optional[str]:
1216 if template is None:
1217 return None
1218 return template.render(user=userinfo).strip()
1219
1220 display_name = render_template_field(self._config.display_name_template)
1221 if display_name == "":
1222 display_name = None
1223
1224 emails = [] # type: List[str]
1225 email = render_template_field(self._config.email_template)
1226 if email:
1227 emails.append(email)
1228
1229 return UserAttributeDict(
1230 localpart=localpart, display_name=display_name, emails=emails
1231 )
12251232
12261233 async def get_extra_attributes(self, userinfo: UserInfo, token: Token) -> JsonDict:
12271234 extras = {} # type: Dict[str, str]
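The refactor above factors the "optional template" pattern into two helpers: a missing config option yields `None`, and a missing template renders to `None` (a present one renders and strips whitespace). A dependency-free sketch of that pattern, using `str.format` as a hypothetical stand-in for the Jinja templates:

```python
from typing import Callable, Dict, Optional

# Stand-in for parse_template_config: absent option -> None,
# present option -> a callable that renders against a userinfo dict.
def parse_template_config(
    config: Dict[str, str], option_name: str
) -> Optional[Callable[[dict], str]]:
    if option_name not in config:
        return None
    fmt = config[option_name]
    return lambda user: fmt.format(user=user)


# Stand-in for render_template_field: None template -> None,
# otherwise render and strip surrounding whitespace.
def render_template_field(
    template: Optional[Callable[[dict], str]], user: dict
) -> Optional[str]:
    if template is None:
        return None
    return template(user).strip()
```

The real code additionally wraps template-compilation failures in a `ConfigError` pointing at the offending option name, which is what collapsing the three near-identical `try`/`except` blocks into one helper buys.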
1313 # limitations under the License.
1414
1515 """Contains functions for registering clients."""
16
1617 import logging
17 from typing import TYPE_CHECKING, List, Optional, Tuple
18 from typing import TYPE_CHECKING, Iterable, List, Optional, Tuple
1819
1920 from synapse import types
2021 from synapse.api.constants import MAX_USERID_LENGTH, EventTypes, JoinRules, LoginType
151152 user_type: Optional[str] = None,
152153 default_display_name: Optional[str] = None,
153154 address: Optional[str] = None,
154 bind_emails: List[str] = [],
155 bind_emails: Iterable[str] = [],
155156 by_admin: bool = False,
156157 user_agent_ips: Optional[List[Tuple[str, str]]] = None,
157158 ) -> str:
692693 access_token: The access token of the newly logged in device, or
693694 None if `inhibit_login` enabled.
694695 """
696 # TODO: 3pid registration can actually happen on the workers. Consider
697 # refactoring it.
695698 if self.hs.config.worker_app:
696699 await self._post_registration_client(
697700 user_id=user_id, auth_result=auth_result, access_token=access_token
125125
126126 self.third_party_event_rules = hs.get_third_party_event_rules()
127127
128 self._invite_burst_count = (
129 hs.config.ratelimiting.rc_invites_per_room.burst_count
130 )
131
128132 async def upgrade_room(
129133 self, requester: Requester, old_room_id: str, new_version: RoomVersion
130134 ) -> str:
660664 # Allow the request to go through, but remove any associated invites.
661665 invite_3pid_list = []
662666 invite_list = []
667
668 if len(invite_list) + len(invite_3pid_list) > self._invite_burst_count:
669 raise SynapseError(400, "Cannot invite so many users at once")
663670
664671 await self.event_creation_handler.assert_accepted_privacy_policy(requester)
665672
8484 burst_count=hs.config.ratelimiting.rc_joins_remote.burst_count,
8585 )
8686
87 self._invites_per_room_limiter = Ratelimiter(
88 clock=self.clock,
89 rate_hz=hs.config.ratelimiting.rc_invites_per_room.per_second,
90 burst_count=hs.config.ratelimiting.rc_invites_per_room.burst_count,
91 )
92 self._invites_per_user_limiter = Ratelimiter(
93 clock=self.clock,
94 rate_hz=hs.config.ratelimiting.rc_invites_per_user.per_second,
95 burst_count=hs.config.ratelimiting.rc_invites_per_user.burst_count,
96 )
97
8798 # This is only used to get at ratelimit function, and
8899 # maybe_kick_guest_users. It's fine there are multiple of these as
89100 # it doesn't store state.
142153 room_id
143154 """
144155 raise NotImplementedError()
156
157 def ratelimit_invite(self, room_id: Optional[str], invitee_user_id: str):
158 """Ratelimit invites by room and by target user.
159
160 If room ID is missing then we just rate limit by target user.
161 """
162 if room_id:
163 self._invites_per_room_limiter.ratelimit(room_id)
164
165 self._invites_per_user_limiter.ratelimit(invitee_user_id)
145166
146167 async def _local_membership_update(
147168 self,
386407 raise SynapseError(403, "This room has been blocked on this server")
387408
388409 if effective_membership_state == Membership.INVITE:
410 target_id = target.to_string()
411 if ratelimit:
412 # Don't ratelimit application services.
413 if not requester.app_service or requester.app_service.is_rate_limited():
414 self.ratelimit_invite(room_id, target_id)
415
389416 # block any attempts to invite the server notices mxid
390 if target.to_string() == self._server_notices_mxid:
417 if target_id == self._server_notices_mxid:
391418 raise SynapseError(HTTPStatus.FORBIDDEN, "Cannot invite this user")
392419
393420 block_invite = False
411438 block_invite = True
412439
413440 if not await self.spam_checker.user_may_invite(
414 requester.user.to_string(), target.to_string(), room_id
441 requester.user.to_string(), target_id, room_id
415442 ):
416443 logger.info("Blocking invite due to spam checker")
417444 block_invite = True
7777 # user-facing name of this auth provider
7878 self.idp_name = "SAML"
7979
80 # we do not currently support icons for SAML auth, but this is required by
80 # we do not currently support icons/brands for SAML auth, but this is required by
8181 # the SsoIdentityProvider protocol type.
8282 self.idp_icon = None
83 self.idp_brand = None
8384
8485 # a map from saml session id to Saml2SessionData object
8586 self._outstanding_requests_dict = {} # type: Dict[str, Saml2SessionData]
131132 raise Exception("prepare_for_authenticate didn't return a Location header")
132133
133134 async def handle_saml_response(self, request: SynapseRequest) -> None:
134 """Handle an incoming request to /_matrix/saml2/authn_response
135 """Handle an incoming request to /_synapse/client/saml2/authn_response
135136
136137 Args:
137138 request: the incoming request from the browser. We'll
1414
1515 import itertools
1616 import logging
17 from typing import Iterable
17 from typing import TYPE_CHECKING, Dict, Iterable, List, Optional
1818
1919 from unpaddedbase64 import decode_base64, encode_base64
2020
2121 from synapse.api.constants import EventTypes, Membership
2222 from synapse.api.errors import NotFoundError, SynapseError
2323 from synapse.api.filtering import Filter
24 from synapse.events import EventBase
2425 from synapse.storage.state import StateFilter
26 from synapse.types import JsonDict, UserID
2527 from synapse.visibility import filter_events_for_client
2628
2729 from ._base import BaseHandler
2830
31 if TYPE_CHECKING:
32 from synapse.app.homeserver import HomeServer
33
2934 logger = logging.getLogger(__name__)
3035
3136
3237 class SearchHandler(BaseHandler):
33 def __init__(self, hs):
38 def __init__(self, hs: "HomeServer"):
3439 super().__init__(hs)
3540 self._event_serializer = hs.get_event_client_serializer()
3641 self.storage = hs.get_storage()
8691
8792 return historical_room_ids
8893
89 async def search(self, user, content, batch=None):
94 async def search(
95 self, user: UserID, content: JsonDict, batch: Optional[str] = None
96 ) -> JsonDict:
9097 """Performs a full text search for a user.
9198
9299 Args:
93 user (UserID)
94 content (dict): Search parameters
95 batch (str): The next_batch parameter. Used for pagination.
100 user
101 content: Search parameters
102 batch: The next_batch parameter. Used for pagination.
96103
97104 Returns:
98105 dict to be returned to the client with results of search
185192 # If doing a subset of all rooms search, check if any of the rooms
186193 # are from an upgraded room, and search their contents as well
187194 if search_filter.rooms:
188 historical_room_ids = []
195 historical_room_ids = [] # type: List[str]
189196 for room_id in search_filter.rooms:
190197 # Add any previous rooms to the search if they exist
191198 ids = await self.get_old_rooms_from_upgraded_room(room_id)
208215
209216 rank_map = {} # event_id -> rank of event
210217 allowed_events = []
211 room_groups = {} # Holds result of grouping by room, if applicable
212 sender_group = {} # Holds result of grouping by sender, if applicable
218 # Holds result of grouping by room, if applicable
219 room_groups = {} # type: Dict[str, JsonDict]
220 # Holds result of grouping by sender, if applicable
221 sender_group = {} # type: Dict[str, JsonDict]
213222
214223 # Holds the next_batch for the entire result set if one of those exists
215224 global_next_batch = None
253262 s["results"].append(e.event_id)
254263
255264 elif order_by == "recent":
256 room_events = []
265 room_events = [] # type: List[EventBase]
257266 i = 0
258267
259268 pagination_token = batch_token
417426
418427 state_results = {}
419428 if include_state:
420 rooms = {e.room_id for e in allowed_events}
421 for room_id in rooms:
429 for room_id in {e.room_id for e in allowed_events}:
422430 state = await self.state_handler.get_current_state(room_id)
423431 state_results[room_id] = list(state.values())
424
425 state_results.values()
426432
427433 # We're now about to serialize the events. We should not make any
428434 # blocking calls after this. Otherwise the 'age' will be wrong
447453
448454 if state_results:
449455 s = {}
450 for room_id, state in state_results.items():
456 for room_id, state_events in state_results.items():
451457 s[room_id] = await self._event_serializer.serialize_events(
452 state, time_now
458 state_events, time_now
453459 )
454460
455461 rooms_cat_res["state"] = s
1212 # See the License for the specific language governing permissions and
1313 # limitations under the License.
1414 import logging
15 from typing import Optional
15 from typing import TYPE_CHECKING, Optional
1616
1717 from synapse.api.errors import Codes, StoreError, SynapseError
1818 from synapse.types import Requester
1919
2020 from ._base import BaseHandler
21
22 if TYPE_CHECKING:
23 from synapse.app.homeserver import HomeServer
2124
2225 logger = logging.getLogger(__name__)
2326
2528 class SetPasswordHandler(BaseHandler):
2629 """Handler which deals with changing user account passwords"""
2730
28 def __init__(self, hs):
31 def __init__(self, hs: "HomeServer"):
2932 super().__init__(hs)
3033 self._auth_handler = hs.get_auth_handler()
3134 self._device_handler = hs.get_device_handler()
32 self._password_policy_handler = hs.get_password_policy_handler()
3335
3436 async def set_password(
3537 self,
3739 password_hash: str,
3840 logout_devices: bool,
3941 requester: Optional[Requester] = None,
40 ):
42 ) -> None:
4143 if not self.hs.config.password_localdb_enabled:
4244 raise SynapseError(403, "Password change disabled", errcode=Codes.FORBIDDEN)
4345
1313 # limitations under the License.
1414 import abc
1515 import logging
16 from typing import TYPE_CHECKING, Awaitable, Callable, Dict, List, Mapping, Optional
16 from typing import (
17 TYPE_CHECKING,
18 Awaitable,
19 Callable,
20 Dict,
21 Iterable,
22 Mapping,
23 Optional,
24 Set,
25 )
1726 from urllib.parse import urlencode
1827
1928 import attr
2029 from typing_extensions import NoReturn, Protocol
2130
2231 from twisted.web.http import Request
32 from twisted.web.iweb import IRequest
2333
2434 from synapse.api.constants import LoginType
25 from synapse.api.errors import Codes, RedirectException, SynapseError
35 from synapse.api.errors import Codes, NotFoundError, RedirectException, SynapseError
2636 from synapse.handlers.ui_auth import UIAuthSessionDataConstants
2737 from synapse.http import get_request_user_agent
28 from synapse.http.server import respond_with_html
38 from synapse.http.server import respond_with_html, respond_with_redirect
2939 from synapse.http.site import SynapseRequest
30 from synapse.types import JsonDict, UserID, contains_invalid_mxid_characters
40 from synapse.types import Collection, JsonDict, UserID, contains_invalid_mxid_characters
3141 from synapse.util.async_helpers import Linearizer
3242 from synapse.util.stringutils import random_string
3343
7787 @property
7888 def idp_icon(self) -> Optional[str]:
7989 """Optional MXC URI for user-facing icon"""
90 return None
91
92 @property
93 def idp_brand(self) -> Optional[str]:
94 """Optional branding identifier"""
8095 return None
8196
8297 @abc.abstractmethod
108123 # enter one.
109124 localpart = attr.ib(type=Optional[str])
110125 display_name = attr.ib(type=Optional[str], default=None)
111 emails = attr.ib(type=List[str], default=attr.Factory(list))
126 emails = attr.ib(type=Collection[str], default=attr.Factory(list))
112127
113128
114129 @attr.s(slots=True)
123138
124139 # attributes returned by the ID mapper
125140 display_name = attr.ib(type=Optional[str])
126 emails = attr.ib(type=List[str])
141 emails = attr.ib(type=Collection[str])
127142
128143 # An optional dictionary of extra attributes to be provided to the client in the
129144 # login response.
134149
135150 # expiry time for the session, in milliseconds
136151 expiry_time_ms = attr.ib(type=int)
152
153 # choices made by the user
154 chosen_localpart = attr.ib(type=Optional[str], default=None)
155 use_display_name = attr.ib(type=bool, default=True)
156 emails_to_use = attr.ib(type=Collection[str], default=())
157 terms_accepted_version = attr.ib(type=Optional[str], default=None)
137158
138159
139160 # the HTTP cookie used to track the mapping session id
168189
169190 # map from idp_id to SsoIdentityProvider
170191 self._identity_providers = {} # type: Dict[str, SsoIdentityProvider]
192
193 self._consent_at_registration = hs.config.consent.user_consent_at_registration
171194
172195 def register_identity_provider(self, p: SsoIdentityProvider):
173196 p_id = p.idp_id
234257 respond_with_html(request, code, html)
235258
236259 async def handle_redirect_request(
237 self, request: SynapseRequest, client_redirect_url: bytes,
260 self,
261 request: SynapseRequest,
262 client_redirect_url: bytes,
263 idp_id: Optional[str],
238264 ) -> str:
239265 """Handle a request to /login/sso/redirect
240266
242268 request: incoming HTTP request
243269 client_redirect_url: the URL that we should redirect the
244270 client to after login.
271 idp_id: optional identity provider chosen by the client
245272
246273 Returns:
247274 the URI to redirect to
251278 400, "Homeserver not configured for SSO.", errcode=Codes.UNRECOGNIZED
252279 )
253280
281 # if the client chose an IdP, use that
282 idp = None # type: Optional[SsoIdentityProvider]
283 if idp_id:
284 idp = self._identity_providers.get(idp_id)
285 if not idp:
286 raise NotFoundError("Unknown identity provider")
287
254288 # if we only have one auth provider, redirect to it directly
255 if len(self._identity_providers) == 1:
256 ap = next(iter(self._identity_providers.values()))
257 return await ap.handle_redirect_request(request, client_redirect_url)
289 elif len(self._identity_providers) == 1:
290 idp = next(iter(self._identity_providers.values()))
291
292 if idp:
293 return await idp.handle_redirect_request(request, client_redirect_url)
258294
259295 # otherwise, redirect to the IDP picker
260296 return "/_synapse/client/pick_idp?" + urlencode(
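The hunk above changes `handle_redirect_request` so that an explicit `idp_id` from the client wins, a sole configured provider is used implicitly, and anything else falls through to the picker page. A minimal standalone sketch of that selection order (the function and exception names here are illustrative, not Synapse's actual API):

```python
from typing import Dict, Optional


class UnknownIdpError(Exception):
    """Raised when the client requests an idp_id that is not configured."""


def choose_idp(providers: Dict[str, str], idp_id: Optional[str]) -> Optional[str]:
    """Return the provider to redirect to, or None to show the IdP picker."""
    if idp_id:
        # the client chose an IdP explicitly; an unknown one is a 404-style error
        idp = providers.get(idp_id)
        if idp is None:
            raise UnknownIdpError(idp_id)
        return idp
    if len(providers) == 1:
        # only one provider configured: skip the picker
        return next(iter(providers.values()))
    # multiple providers and no explicit choice: caller redirects to the picker
    return None
```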
368404 to an additional page. (e.g. to prompt for more information)
369405
370406 """
407 new_user = False
408
371409 # grab a lock while we try to find a mapping for this user. This seems...
372410 # optimistic, especially for implementations that end up redirecting to
373411 # interstitial pages.
408446 get_request_user_agent(request),
409447 request.getClientIP(),
410448 )
449 new_user = True
411450
412451 await self._auth_handler.complete_sso_login(
413 user_id, request, client_redirect_url, extra_login_attributes
452 user_id,
453 request,
454 client_redirect_url,
455 extra_login_attributes,
456 new_user=new_user,
414457 )
415458
416459 async def _call_attribute_mapper(
500543 logger.info("Recorded registration session id %s", session_id)
501544
502545 # Set the cookie and redirect to the username picker
503 e = RedirectException(b"/_synapse/client/pick_username")
546 e = RedirectException(b"/_synapse/client/pick_username/account_details")
504547 e.cookies.append(
505548 b"%s=%s; path=/"
506549 % (USERNAME_MAPPING_SESSION_COOKIE_NAME, session_id.encode("ascii"))
628671 )
629672 respond_with_html(request, 200, html)
630673
674 def get_mapping_session(self, session_id: str) -> UsernameMappingSession:
675 """Look up the given username mapping session
676
677 If it is not found, raises a SynapseError with an http code of 400
678
679 Args:
680 session_id: session to look up
681 Returns:
682 active mapping session
683 Raises:
684 SynapseError if the session is not found/has expired
685 """
686 self._expire_old_sessions()
687 session = self._username_mapping_sessions.get(session_id)
688 if session:
689 return session
690 logger.info("Couldn't find session id %s", session_id)
691 raise SynapseError(400, "unknown session")
692
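The new `get_mapping_session` helper centralises a pattern that was previously duplicated at each call site: prune expired sessions first, then treat a miss as a client error. A condensed sketch of that pattern, assuming a simple in-memory store and `time.monotonic()` as the clock (both our stand-ins):

```python
import time
from typing import Dict


class SessionError(Exception):
    """Stand-in for Synapse's SynapseError(400, "unknown session")."""


class SessionStore:
    def __init__(self) -> None:
        # session id -> absolute expiry time in seconds
        self._sessions = {}  # type: Dict[str, float]

    def add(self, session_id: str, ttl: float) -> None:
        self._sessions[session_id] = time.monotonic() + ttl

    def get(self, session_id: str) -> str:
        # expire old sessions before looking up, so a stale id can't be reused
        now = time.monotonic()
        for sid in [s for s, expiry in self._sessions.items() if expiry <= now]:
            del self._sessions[sid]
        if session_id not in self._sessions:
            raise SessionError("unknown session")
        return session_id
```

Funnelling every lookup through one method also means the anti-enumeration check in `check_username_availability` gets expiry handling for free.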
631693 async def check_username_availability(
632694 self, localpart: str, session_id: str,
633695 ) -> bool:
644706
645707 # make sure that there is a valid mapping session, to stop people dictionary-
646708 # scanning for accounts
647
648 self._expire_old_sessions()
649 session = self._username_mapping_sessions.get(session_id)
650 if not session:
651 logger.info("Couldn't find session id %s", session_id)
652 raise SynapseError(400, "unknown session")
709 self.get_mapping_session(session_id)
653710
654711 logger.info(
655712 "[session %s] Checking for availability of username %s",
666723 return not user_infos
667724
668725 async def handle_submit_username_request(
669 self, request: SynapseRequest, localpart: str, session_id: str
726 self,
727 request: SynapseRequest,
728 session_id: str,
729 localpart: str,
730 use_display_name: bool,
731 emails_to_use: Iterable[str],
670732 ) -> None:
671733 """Handle a request to the username-picker 'submit' endpoint
672734
676738 request: HTTP request
677739 localpart: localpart requested by the user
678740 session_id: ID of the username mapping session, extracted from a cookie
679 """
680 self._expire_old_sessions()
681 session = self._username_mapping_sessions.get(session_id)
682 if not session:
683 logger.info("Couldn't find session id %s", session_id)
684 raise SynapseError(400, "unknown session")
685
686 logger.info("[session %s] Registering localpart %s", session_id, localpart)
741 use_display_name: whether the user wants to use the suggested display name
742 emails_to_use: emails that the user would like to use
743 """
744 session = self.get_mapping_session(session_id)
745
746 # update the session with the user's choices
747 session.chosen_localpart = localpart
748 session.use_display_name = use_display_name
749
750 emails_from_idp = set(session.emails)
751 filtered_emails = set() # type: Set[str]
752
753 # we iterate through the list rather than just building a set conjunction, so
754 # that we can log attempts to use unknown addresses
755 for email in emails_to_use:
756 if email in emails_from_idp:
757 filtered_emails.add(email)
758 else:
759 logger.warning(
760 "[session %s] ignoring user request to use unknown email address %r",
761 session_id,
762 email,
763 )
764 session.emails_to_use = filtered_emails
765
766 # we may now need to collect consent from the user, in which case, redirect
767 # to the consent-extraction-unit
768 if self._consent_at_registration:
769 redirect_url = b"/_synapse/client/new_user_consent"
770
771 # otherwise, redirect to the completion page
772 else:
773 redirect_url = b"/_synapse/client/sso_register"
774
775 respond_with_redirect(request, redirect_url)
776
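The email handling added to `handle_submit_username_request` deliberately iterates rather than taking a set intersection, so that any address the client requests but the identity provider never asserted gets logged instead of silently trusted. A self-contained sketch of that vetting step (logger setup is ours):

```python
import logging
from typing import Iterable, Set

logger = logging.getLogger(__name__)


def filter_emails(emails_from_idp: Iterable[str], emails_to_use: Iterable[str]) -> Set[str]:
    """Keep only the requested addresses that the IdP actually asserted."""
    known = set(emails_from_idp)
    chosen = set()  # type: Set[str]
    # iterate instead of `known & set(emails_to_use)` so rejects are logged
    for email in emails_to_use:
        if email in known:
            chosen.add(email)
        else:
            logger.warning("ignoring request to use unknown email address %r", email)
    return chosen
```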
777 async def handle_terms_accepted(
778 self, request: Request, session_id: str, terms_version: str
779 ):
780 """Handle a request to the new-user 'consent' endpoint
781
782 Will serve an HTTP response to the request.
783
784 Args:
785 request: HTTP request
786 session_id: ID of the username mapping session, extracted from a cookie
787 terms_version: the version of the terms which the user viewed and consented
788 to
789 """
790 logger.info(
791 "[session %s] User consented to terms version %s",
792 session_id,
793 terms_version,
794 )
795 session = self.get_mapping_session(session_id)
796 session.terms_accepted_version = terms_version
797
798 # we're done; now we can register the user
799 respond_with_redirect(request, b"/_synapse/client/sso_register")
800
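The post-username flow now branches on whether consent must be collected at registration: if so, the user is redirected to the consent page first, otherwise straight to the completion endpoint. The two URLs below are the real endpoints from the diff; the wrapper function is only illustrative:

```python
def next_step(consent_at_registration: bool) -> bytes:
    """Pick the redirect target after the username has been chosen."""
    if consent_at_registration:
        # collect consent before completing registration
        return b"/_synapse/client/new_user_consent"
    # no consent step configured: go straight to completion
    return b"/_synapse/client/sso_register"
```

`handle_terms_accepted` then records the accepted version on the mapping session and redirects to the same completion endpoint.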
801 async def register_sso_user(self, request: Request, session_id: str) -> None:
802 """Called once we have all the info we need to register a new user.
803
804 Does so and serves an HTTP response
805
806 Args:
807 request: HTTP request
808 session_id: ID of the username mapping session, extracted from a cookie
809 """
810 session = self.get_mapping_session(session_id)
811
812 logger.info(
813 "[session %s] Registering localpart %s",
814 session_id,
815 session.chosen_localpart,
816 )
687817
688818 attributes = UserAttributes(
689 localpart=localpart,
690 display_name=session.display_name,
691 emails=session.emails,
692 )
819 localpart=session.chosen_localpart, emails=session.emails_to_use,
820 )
821
822 if session.use_display_name:
823 attributes.display_name = session.display_name
693824
694825 # the following will raise a 400 error if the username has been taken in the
695826 # meantime.
701832 request.getClientIP(),
702833 )
703834
704 logger.info("[session %s] Registered userid %s", session_id, user_id)
835 logger.info(
836 "[session %s] Registered userid %s with attributes %s",
837 session_id,
838 user_id,
839 attributes,
840 )
705841
706842 # delete the mapping session and the cookie
707843 del self._username_mapping_sessions[session_id]
714850 path=b"/",
715851 )
716852
853 auth_result = {}
854 if session.terms_accepted_version:
855 # TODO: make this less awful.
856 auth_result[LoginType.TERMS] = True
857
858 await self._registration_handler.post_registration_actions(
859 user_id, auth_result, access_token=None
860 )
861
717862 await self._auth_handler.complete_sso_login(
718863 user_id,
719864 request,
720865 session.client_redirect_url,
721866 session.extra_login_attributes,
867 new_user=True,
722868 )
723869
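`register_sso_user` now assembles the account from the choices stored on the mapping session: the chosen localpart, the IdP display name only if the user opted to keep it, and the vetted email list. A sketch of that assembly, with dataclasses standing in for the attrs classes used in the diff:

```python
from dataclasses import dataclass
from typing import Collection, Optional


@dataclass
class UserAttributes:
    localpart: Optional[str]
    display_name: Optional[str] = None
    emails: Collection[str] = ()


@dataclass
class Session:
    display_name: Optional[str]
    chosen_localpart: Optional[str] = None
    use_display_name: bool = True
    emails_to_use: Collection[str] = ()


def build_attributes(session: Session) -> UserAttributes:
    attributes = UserAttributes(
        localpart=session.chosen_localpart, emails=session.emails_to_use
    )
    # the display name is only carried over if the user kept the suggestion
    if session.use_display_name:
        attributes.display_name = session.display_name
    return attributes
```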
724870 def _expire_old_sessions(self):
732878 for session_id in to_expire:
733879 logger.info("Expiring mapping session %s", session_id)
734880 del self._username_mapping_sessions[session_id]
881
882
883 def get_username_mapping_session_cookie_from_request(request: IRequest) -> str:
884 """Extract the session ID from the cookie
885
886 Raises a SynapseError if the cookie isn't found
887 """
888 session_id = request.getCookie(USERNAME_MAPPING_SESSION_COOKIE_NAME)
889 if not session_id:
890 raise SynapseError(code=400, msg="missing session_id")
891 return session_id.decode("ascii", errors="replace")
1313 # limitations under the License.
1414
1515 import logging
16 from typing import TYPE_CHECKING, Optional
17
18 if TYPE_CHECKING:
19 from synapse.app.homeserver import HomeServer
1620
1721 logger = logging.getLogger(__name__)
1822
1923
2024 class StateDeltasHandler:
21 def __init__(self, hs):
25 def __init__(self, hs: "HomeServer"):
2226 self.store = hs.get_datastore()
2327
24 async def _get_key_change(self, prev_event_id, event_id, key_name, public_value):
28 async def _get_key_change(
29 self,
30 prev_event_id: Optional[str],
31 event_id: Optional[str],
32 key_name: str,
33 public_value: str,
34 ) -> Optional[bool]:
2535 """Given two events check if the `key_name` field in content changed
2636 from not matching `public_value` to doing so.
2737
1111 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
1212 # See the License for the specific language governing permissions and
1313 # limitations under the License.
14
1514 import logging
1615 from collections import Counter
16 from typing import TYPE_CHECKING, Any, Dict, Iterable, Optional, Tuple
17
18 from typing_extensions import Counter as CounterType
1719
1820 from synapse.api.constants import EventTypes, Membership
1921 from synapse.metrics import event_processing_positions
2022 from synapse.metrics.background_process_metrics import run_as_background_process
23 from synapse.types import JsonDict
24
25 if TYPE_CHECKING:
26 from synapse.app.homeserver import HomeServer
2127
2228 logger = logging.getLogger(__name__)
2329
3036 Heavily derived from UserDirectoryHandler
3137 """
3238
33 def __init__(self, hs):
39 def __init__(self, hs: "HomeServer"):
3440 self.hs = hs
3541 self.store = hs.get_datastore()
3642 self.state = hs.get_state_handler()
4349 self.stats_enabled = hs.config.stats_enabled
4450
4551 # The current position in the current_state_delta stream
46 self.pos = None
52 self.pos = None # type: Optional[int]
4753
4854 # Guard to ensure we only process deltas one at a time
4955 self._is_processing = False
5561 # we start populating stats
5662 self.clock.call_later(0, self.notify_new_event)
5763
58 def notify_new_event(self):
64 def notify_new_event(self) -> None:
5965 """Called when there may be more deltas to process
6066 """
6167 if not self.stats_enabled or self._is_processing:
7177
7278 run_as_background_process("stats.notify_new_event", process)
7379
74 async def _unsafe_process(self):
80 async def _unsafe_process(self) -> None:
7581 # If self.pos is None then means we haven't fetched it from DB
7682 if self.pos is None:
7783 self.pos = await self.store.get_stats_positions()
109115 )
110116
111117 for room_id, fields in room_count.items():
112 room_deltas.setdefault(room_id, {}).update(fields)
118 room_deltas.setdefault(room_id, Counter()).update(fields)
113119
114120 for user_id, fields in user_count.items():
115 user_deltas.setdefault(user_id, {}).update(fields)
121 user_deltas.setdefault(user_id, Counter()).update(fields)
116122
117123 logger.debug("room_deltas: %s", room_deltas)
118124 logger.debug("user_deltas: %s", user_deltas)
130136
131137 self.pos = max_pos
132138
133 async def _handle_deltas(self, deltas):
139 async def _handle_deltas(
140 self, deltas: Iterable[JsonDict]
141 ) -> Tuple[Dict[str, CounterType[str]], Dict[str, CounterType[str]]]:
134142 """Called with the state deltas to process
135143
136144 Returns:
137 tuple[dict[str, Counter], dict[str, counter]]
138145 Two dicts: the room deltas and the user deltas,
139146 mapping from room/user ID to changes in the various fields.
140147 """
141148
142 room_to_stats_deltas = {}
143 user_to_stats_deltas = {}
144
145 room_to_state_updates = {}
149 room_to_stats_deltas = {} # type: Dict[str, CounterType[str]]
150 user_to_stats_deltas = {} # type: Dict[str, CounterType[str]]
151
152 room_to_state_updates = {} # type: Dict[str, Dict[str, Any]]
146153
147154 for delta in deltas:
148155 typ = delta["type"]
172179 )
173180 continue
174181
175 event_content = {}
182 event_content = {} # type: JsonDict
176183
177184 sender = None
178185 if event_id is not None:
256263 )
257264
258265 if has_changed_joinedness:
259 delta = +1 if membership == Membership.JOIN else -1
266 membership_delta = +1 if membership == Membership.JOIN else -1
260267
261268 user_to_stats_deltas.setdefault(user_id, Counter())[
262269 "joined_rooms"
263 ] += delta
264
265 room_stats_delta["local_users_in_room"] += delta
270 ] += membership_delta
271
272 room_stats_delta["local_users_in_room"] += membership_delta
266273
267274 elif typ == EventTypes.Create:
268275 room_state["is_federatable"] = (
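The stats handler's switch from plain dicts to `collections.Counter` matters because `Counter.update()` with a mapping *adds* counts, whereas `dict.update()` would overwrite them. A minimal illustration of the `setdefault(..., Counter()).update(fields)` merge used above:

```python
from collections import Counter
from typing import Dict

room_deltas = {}  # type: Dict[str, Counter]

batches = [
    ("!a:example.org", {"joined_members": 1}),
    ("!a:example.org", {"joined_members": 2, "invited_members": 1}),
]

for room_id, fields in batches:
    # Counter.update() accumulates per-field deltas across batches
    room_deltas.setdefault(room_id, Counter()).update(fields)
```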
1414 import logging
1515 import random
1616 from collections import namedtuple
17 from typing import TYPE_CHECKING, List, Set, Tuple
17 from typing import TYPE_CHECKING, Dict, Iterable, List, Optional, Set, Tuple
1818
1919 from synapse.api.errors import AuthError, ShadowBanError, SynapseError
2020 from synapse.appservice import ApplicationService
2121 from synapse.metrics.background_process_metrics import run_as_background_process
2222 from synapse.replication.tcp.streams import TypingStream
23 from synapse.types import JsonDict, UserID, get_domain_from_id
23 from synapse.types import JsonDict, Requester, UserID, get_domain_from_id
2424 from synapse.util.caches.stream_change_cache import StreamChangeCache
2525 from synapse.util.metrics import Measure
2626 from synapse.util.wheel_timer import WheelTimer
6464 )
6565
6666 # map room IDs to serial numbers
67 self._room_serials = {}
67 self._room_serials = {} # type: Dict[str, int]
6868 # map room IDs to sets of users currently typing
69 self._room_typing = {}
70
71 self._member_last_federation_poke = {}
69 self._room_typing = {} # type: Dict[str, Set[str]]
70
71 self._member_last_federation_poke = {} # type: Dict[RoomMember, int]
7272 self.wheel_timer = WheelTimer(bucket_size=5000)
7373 self._latest_room_serial = 0
7474
7575 self.clock.looping_call(self._handle_timeouts, 5000)
7676
77 def _reset(self):
77 def _reset(self) -> None:
7878 """Reset the typing handler's data caches.
7979 """
8080 # map room IDs to serial numbers
8585 self._member_last_federation_poke = {}
8686 self.wheel_timer = WheelTimer(bucket_size=5000)
8787
88 def _handle_timeouts(self):
88 def _handle_timeouts(self) -> None:
8989 logger.debug("Checking for typing timeouts")
9090
9191 now = self.clock.time_msec()
9595 for member in members:
9696 self._handle_timeout_for_member(now, member)
9797
98 def _handle_timeout_for_member(self, now: int, member: RoomMember):
98 def _handle_timeout_for_member(self, now: int, member: RoomMember) -> None:
9999 if not self.is_typing(member):
100100 # Nothing to do if they're no longer typing
101101 return
113113 # each person typing.
114114 self.wheel_timer.insert(now=now, obj=member, then=now + 60 * 1000)
115115
116 def is_typing(self, member):
116 def is_typing(self, member: RoomMember) -> bool:
117117 return member.user_id in self._room_typing.get(member.room_id, [])
118118
119 async def _push_remote(self, member, typing):
119 async def _push_remote(self, member: RoomMember, typing: bool) -> None:
120120 if not self.federation:
121121 return
122122
147147
148148 def process_replication_rows(
149149 self, token: int, rows: List[TypingStream.TypingStreamRow]
150 ):
150 ) -> None:
151151 """Should be called whenever we receive updates for typing stream.
152152 """
153153
177177
178178 async def _send_changes_in_typing_to_remotes(
179179 self, room_id: str, prev_typing: Set[str], now_typing: Set[str]
180 ):
180 ) -> None:
181181 """Process a change in typing of a room from replication, sending EDUs
182182 for any local users.
183183 """
193193 if self.is_mine_id(user_id):
194194 await self._push_remote(RoomMember(room_id, user_id), False)
195195
196 def get_current_token(self):
196 def get_current_token(self) -> int:
197197 return self._latest_room_serial
198198
199199
200200 class TypingWriterHandler(FollowerTypingHandler):
201 def __init__(self, hs):
201 def __init__(self, hs: "HomeServer"):
202202 super().__init__(hs)
203203
204204 assert hs.config.worker.writers.typing == hs.get_instance_name()
212212
213213 hs.get_distributor().observe("user_left_room", self.user_left_room)
214214
215 self._member_typing_until = {} # clock time we expect to stop
215 # clock time we expect to stop
216 self._member_typing_until = {} # type: Dict[RoomMember, int]
216217
217218 # caches which room_ids changed at which serials
218219 self._typing_stream_change_cache = StreamChangeCache(
219220 "TypingStreamChangeCache", self._latest_room_serial
220221 )
221222
222 def _handle_timeout_for_member(self, now: int, member: RoomMember):
223 def _handle_timeout_for_member(self, now: int, member: RoomMember) -> None:
223224 super()._handle_timeout_for_member(now, member)
224225
225226 if not self.is_typing(member):
232233 self._stopped_typing(member)
233234 return
234235
235 async def started_typing(self, target_user, requester, room_id, timeout):
236 async def started_typing(
237 self, target_user: UserID, requester: Requester, room_id: str, timeout: int
238 ) -> None:
236239 target_user_id = target_user.to_string()
237240 auth_user_id = requester.user.to_string()
238241
262265
263266 if was_present:
264267 # No point sending another notification
265 return None
268 return
266269
267270 self._push_update(member=member, typing=True)
268271
269 async def stopped_typing(self, target_user, requester, room_id):
272 async def stopped_typing(
273 self, target_user: UserID, requester: Requester, room_id: str
274 ) -> None:
270275 target_user_id = target_user.to_string()
271276 auth_user_id = requester.user.to_string()
272277
289294
290295 self._stopped_typing(member)
291296
292 def user_left_room(self, user, room_id):
297 def user_left_room(self, user: UserID, room_id: str) -> None:
293298 user_id = user.to_string()
294299 if self.is_mine_id(user_id):
295300 member = RoomMember(room_id=room_id, user_id=user_id)
296301 self._stopped_typing(member)
297302
298 def _stopped_typing(self, member):
303 def _stopped_typing(self, member: RoomMember) -> None:
299304 if member.user_id not in self._room_typing.get(member.room_id, set()):
300305 # No point
301 return None
306 return
302307
303308 self._member_typing_until.pop(member, None)
304309 self._member_last_federation_poke.pop(member, None)
305310
306311 self._push_update(member=member, typing=False)
307312
308 def _push_update(self, member, typing):
313 def _push_update(self, member: RoomMember, typing: bool) -> None:
309314 if self.hs.is_mine_id(member.user_id):
310315 # Only send updates for changes to our own users.
311316 run_as_background_process(
314319
315320 self._push_update_local(member=member, typing=typing)
316321
317 async def _recv_edu(self, origin, content):
322 async def _recv_edu(self, origin: str, content: JsonDict) -> None:
318323 room_id = content["room_id"]
319324 user_id = content["user_id"]
320325
339344 self.wheel_timer.insert(now=now, obj=member, then=now + FEDERATION_TIMEOUT)
340345 self._push_update_local(member=member, typing=content["typing"])
341346
342 def _push_update_local(self, member, typing):
347 def _push_update_local(self, member: RoomMember, typing: bool) -> None:
343348 room_set = self._room_typing.setdefault(member.room_id, set())
344349 if typing:
345350 room_set.add(member.user_id)
385390
386391 changed_rooms = self._typing_stream_change_cache.get_all_entities_changed(
387392 last_id
388 )
393 ) # type: Optional[Iterable[str]]
389394
390395 if changed_rooms is None:
391396 changed_rooms = self._room_serials
411416
412417 def process_replication_rows(
413418 self, token: int, rows: List[TypingStream.TypingStreamRow]
414 ):
419 ) -> None:
415420 # The writing process should never get updates from replication.
416421 raise Exception("Typing writer instance got typing info over replication")
417422
418423
419424 class TypingNotificationEventSource:
420 def __init__(self, hs):
425 def __init__(self, hs: "HomeServer"):
421426 self.hs = hs
422427 self.clock = hs.get_clock()
423428 # We can't call get_typing_handler here because there's a cycle:
426431 #
427432 self.get_typing_handler = hs.get_typing_handler
428433
429 def _make_event_for(self, room_id):
434 def _make_event_for(self, room_id: str) -> JsonDict:
430435 typing = self.get_typing_handler()._room_typing[room_id]
431436 return {
432437 "type": "m.typing",
461466
462467 return (events, handler._latest_room_serial)
463468
464 async def get_new_events(self, from_key, room_ids, **kwargs):
469 async def get_new_events(
470 self, from_key: int, room_ids: Iterable[str], **kwargs
471 ) -> Tuple[List[JsonDict], int]:
465472 with Measure(self.clock, "typing.get_new_events"):
466473 from_key = int(from_key)
467474 handler = self.get_typing_handler()
477484
478485 return (events, handler._latest_room_serial)
479486
480 def get_current_key(self):
487 def get_current_key(self) -> int:
481488 return self.get_typing_handler()._latest_room_serial
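The typing handler's caches gain explicit types above: `_room_typing` is a `Dict[str, Set[str]]` mapping each room to its currently-typing users, which `_push_update_local` mutates and `is_typing` queries. A condensed, standalone sketch of that state machine (the class is our simplification, not Synapse's):

```python
from typing import Dict, Set


class TypingState:
    def __init__(self) -> None:
        # room ID -> set of user IDs currently typing there
        self._room_typing = {}  # type: Dict[str, Set[str]]

    def push_update(self, room_id: str, user_id: str, typing: bool) -> None:
        room_set = self._room_typing.setdefault(room_id, set())
        if typing:
            room_set.add(user_id)
        else:
            room_set.discard(user_id)

    def is_typing(self, room_id: str, user_id: str) -> bool:
        return user_id in self._room_typing.get(room_id, set())
```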
144144 if self.pos is None:
145145 self.pos = await self.store.get_user_directory_stream_pos()
146146
147 # If still None then the initial background update hasn't happened yet
148 if self.pos is None:
149 return None
150
151147 # Loop round handling deltas until we're up to date
152148 while True:
153149 with Measure(self.clock, "user_dir_delta"):
232228
233229 if change: # The user joined
234230 event = await self.store.get_event(event_id, allow_none=True)
231 # It isn't expected for this event to not exist, but we
232 # don't want the entire background process to break.
233 if event is None:
234 continue
235
235236 profile = ProfileInfo(
236237 avatar_url=event.content.get("avatar_url"),
237238 display_name=event.content.get("displayname"),
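The user-directory fix above follows a simple rule: a dangling event reference should skip one iteration of a background update, not abort the whole process. A sketch of that defensive pattern with a plain dict standing in for the datastore's `get_event(..., allow_none=True)`:

```python
from typing import Dict, List, Optional


def collect_display_names(
    store: Dict[str, Dict[str, str]], event_ids: List[str]
) -> List[Optional[str]]:
    names = []  # type: List[Optional[str]]
    for event_id in event_ids:
        event = store.get(event_id)  # analogous to get_event(..., allow_none=True)
        if event is None:
            # not expected, but don't let one missing event break the loop
            continue
        names.append(event.get("displayname"))
    return names
```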
2121 import urllib
2222 from http import HTTPStatus
2323 from io import BytesIO
24 from typing import Any, Callable, Dict, Iterator, List, Tuple, Union
24 from typing import (
25 Any,
26 Awaitable,
27 Callable,
28 Dict,
29 Iterable,
30 Iterator,
31 List,
32 Pattern,
33 Tuple,
34 Union,
35 )
2536
2637 import jinja2
2738 from canonicaljson import iterencode_canonical_json
39 from typing_extensions import Protocol
2840 from zope.interface import implementer
2941
3042 from twisted.internet import defer, interfaces
167179 return preserve_fn(wrapped_async_request_handler)
168180
169181
170 class HttpServer:
182 # Type of a callback method for processing requests
183 # it is actually called with a SynapseRequest and a kwargs dict for the params,
184 # but I can't figure out how to represent that.
185 ServletCallback = Callable[
186 ..., Union[None, Awaitable[None], Tuple[int, Any], Awaitable[Tuple[int, Any]]]
187 ]
188
189
190 class HttpServer(Protocol):
171191 """ Interface for registering callbacks on a HTTP server
172192 """
173193
174 def register_paths(self, method, path_patterns, callback):
194 def register_paths(
195 self,
196 method: str,
197 path_patterns: Iterable[Pattern],
198 callback: ServletCallback,
199 servlet_classname: str,
200 ) -> None:
175201 """ Register a callback that gets fired if we receive a http request
176202 with the given method for a path that matches the given regex.
177203
179205 an unpacked tuple.
180206
181207 Args:
182 method (str): The method to listen to.
183 path_patterns (list<SRE_Pattern>): The regex used to match requests.
184 callback (function): The function to fire if we receive a matched
208 method: The HTTP method to listen to.
209 path_patterns: The regex used to match requests.
210 callback: The function to fire if we receive a matched
185211 request. The first argument will be the request object and
186212 subsequent arguments will be any matched groups from the regex.
187 This should return a tuple of (code, response).
213 This should return either tuple of (code, response), or None.
214 servlet_classname (str): The name of the handler to be used in prometheus
215 and opentracing logs.
188216 """
189217 pass
190218
353381
354382 def _get_handler_for_request(
355383 self, request: SynapseRequest
356 ) -> Tuple[Callable, str, Dict[str, str]]:
384 ) -> Tuple[ServletCallback, str, Dict[str, str]]:
357385 """Finds a callback method to handle the given request.
358386
359387 Returns:
732760 request.setHeader(b"Content-Security-Policy", b"frame-ancestors 'none';")
733761
734762
763 def respond_with_redirect(request: Request, url: bytes) -> None:
764 """Write a 302 response to the request, if it is still alive."""
765 logger.debug("Redirect to %s", url.decode("utf-8"))
766 request.redirect(url)
767 finish_request(request)
768
769
735770 def finish_request(request: Request):
736771 """ Finish writing the response to the request.
737772
790790
791791 @wraps(func)
792792 def _tag_args_inner(*args, **kwargs):
793 argspec = inspect.getargspec(func)
793 argspec = inspect.getfullargspec(func)
794794 for i, arg in enumerate(argspec.args[1:]):
795795 set_tag("ARG_" + arg, args[i])
796796 set_tag("args", args[len(argspec.args) :])
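The one-line opentracing fix above matters because `inspect.getargspec` has been deprecated since Python 3.0, raises `ValueError` for functions with keyword-only arguments, and is removed in 3.11; `getfullargspec` is the drop-in replacement. A demonstration of the tag-building loop it feeds (the handler signature is an illustrative example):

```python
import inspect


def handler(self, room_id, user_id, *, limit=10):
    pass


argspec = inspect.getfullargspec(handler)
# positional parameters after `self`, as iterated by the tracing decorator
tag_names = ["ARG_" + arg for arg in argspec.args[1:]]
```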
278278 )
279279
280280 async def complete_sso_login_async(
281 self, registered_user_id: str, request: SynapseRequest, client_redirect_url: str
281 self,
282 registered_user_id: str,
283 request: SynapseRequest,
284 client_redirect_url: str,
285 new_user: bool = False,
282286 ):
283287 """Complete a SSO login by redirecting the user to a page to confirm whether they
284288 want their access token sent to `client_redirect_url`, or redirect them to that
290294 request: The request to respond to.
291295 client_redirect_url: The URL to which to offer to redirect the user (or to
292296 redirect them directly if whitelisted).
297 new_user: set to true to use wording for the consent appropriate to a user
298 who has just registered.
293299 """
294300 await self._auth_handler.complete_sso_login(
295 registered_user_id, request, client_redirect_url,
301 registered_user_id, request, client_redirect_url, new_user=new_user
296302 )
297303
298304 @defer.inlineCallbacks
266266 fallback_to_members=True,
267267 )
268268
269 summary_text = await self.make_summary_text(
270 notifs_by_room, state_by_room, notif_events, user_id, reason
271 )
269 if len(notifs_by_room) == 1:
270 # Only one room has new stuff
271 room_id = list(notifs_by_room.keys())[0]
272
273 summary_text = await self.make_summary_text_single_room(
274 room_id,
275 notifs_by_room[room_id],
276 state_by_room[room_id],
277 notif_events,
278 user_id,
279 )
280 else:
281 summary_text = await self.make_summary_text(
282 notifs_by_room, state_by_room, notif_events, reason
283 )
272284
273285 template_vars = {
274286 "user_display_name": user_display_name,
491503 if "url" in event.content:
492504 messagevars["image_url"] = event.content["url"]
493505
506 async def make_summary_text_single_room(
507 self,
508 room_id: str,
509 notifs: List[Dict[str, Any]],
510 room_state_ids: StateMap[str],
511 notif_events: Dict[str, EventBase],
512 user_id: str,
513 ) -> str:
514 """
515 Make a summary text for the email when only a single room has notifications.
516
517 Args:
518 room_id: The ID of the room.
519 notifs: The notifications for this room.
520 room_state_ids: The state map for the room.
521 notif_events: A map of event ID -> notification event.
522 user_id: The user receiving the notification.
523
524 Returns:
525 The summary text.
526 """
527 # If the room has some kind of name, use it, but we don't
528 # want the generated-from-names one here otherwise we'll
529 # end up with, "new message from Bob in the Bob room"
530 room_name = await calculate_room_name(
531 self.store, room_state_ids, user_id, fallback_to_members=False
532 )
533
534 # See if one of the notifs is an invite event for the user
535 invite_event = None
536 for n in notifs:
537 ev = notif_events[n["event_id"]]
538 if ev.type == EventTypes.Member and ev.state_key == user_id:
539 if ev.content.get("membership") == Membership.INVITE:
540 invite_event = ev
541 break
542
543 if invite_event:
544 inviter_member_event_id = room_state_ids.get(
545 ("m.room.member", invite_event.sender)
546 )
547 inviter_name = invite_event.sender
548 if inviter_member_event_id:
549 inviter_member_event = await self.store.get_event(
550 inviter_member_event_id, allow_none=True
551 )
552 if inviter_member_event:
553 inviter_name = name_from_member_event(inviter_member_event)
554
555 if room_name is None:
556 return self.email_subjects.invite_from_person % {
557 "person": inviter_name,
558 "app": self.app_name,
559 }
560
561 return self.email_subjects.invite_from_person_to_room % {
562 "person": inviter_name,
563 "room": room_name,
564 "app": self.app_name,
565 }
566
567 if len(notifs) == 1:
568 # There is just the one notification, so give some detail
569 sender_name = None
570 event = notif_events[notifs[0]["event_id"]]
571 if ("m.room.member", event.sender) in room_state_ids:
572 state_event_id = room_state_ids[("m.room.member", event.sender)]
573 state_event = await self.store.get_event(state_event_id)
574 sender_name = name_from_member_event(state_event)
575
576 if sender_name is not None and room_name is not None:
577 return self.email_subjects.message_from_person_in_room % {
578 "person": sender_name,
579 "room": room_name,
580 "app": self.app_name,
581 }
582 elif sender_name is not None:
583 return self.email_subjects.message_from_person % {
584 "person": sender_name,
585 "app": self.app_name,
586 }
587
588 # The sender is unknown, just use the room name (or ID).
589 return self.email_subjects.messages_in_room % {
590 "room": room_name or room_id,
591 "app": self.app_name,
592 }
593 else:
594 # There's more than one notification for this room, so just
595 # say there are several
596 if room_name is not None:
597 return self.email_subjects.messages_in_room % {
598 "room": room_name,
599 "app": self.app_name,
600 }
601
602 return await self.make_summary_text_from_member_events(
603 room_id, notifs, room_state_ids, notif_events
604 )
605
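`make_summary_text_single_room` picks the email subject in a fixed priority order: an invite wins, then a single message with a known sender gets a detailed subject, and everything else falls back to a generic "messages in room". A distilled sketch of that decision tree (the subject strings are stand-ins for the configurable `email_subjects` templates):

```python
from typing import Optional


def pick_subject(
    invite_from: Optional[str],
    room_name: Optional[str],
    notif_count: int,
    sender_name: Optional[str],
) -> str:
    if invite_from is not None:
        if room_name is None:
            return "%s invited you" % invite_from
        return "%s invited you to %s" % (invite_from, room_name)
    if notif_count == 1 and sender_name is not None:
        # one notification with a known sender: give some detail
        if room_name is not None:
            return "Message from %s in %s" % (sender_name, room_name)
        return "Message from %s" % sender_name
    # several notifications, or an unknown sender: stay generic
    return "Messages in %s" % (room_name or "a room")
```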
494606 async def make_summary_text(
495607 self,
496608 notifs_by_room: Dict[str, List[Dict[str, Any]]],
497609 room_state_ids: Dict[str, StateMap[str]],
498610 notif_events: Dict[str, EventBase],
499 user_id: str,
500611 reason: Dict[str, Any],
501 ):
502 if len(notifs_by_room) == 1:
503 # Only one room has new stuff
504 room_id = list(notifs_by_room.keys())[0]
505
506 # If the room has some kind of name, use it, but we don't
507 # want the generated-from-names one here otherwise we'll
508 # end up with, "new message from Bob in the Bob room"
509 room_name = await calculate_room_name(
510 self.store, room_state_ids[room_id], user_id, fallback_to_members=False
511 )
512
513 # See if one of the notifs is an invite event for the user
514 invite_event = None
515 for n in notifs_by_room[room_id]:
516 ev = notif_events[n["event_id"]]
517 if ev.type == EventTypes.Member and ev.state_key == user_id:
518 if ev.content.get("membership") == Membership.INVITE:
519 invite_event = ev
520 break
521
522 if invite_event:
523 inviter_member_event_id = room_state_ids[room_id].get(
524 ("m.room.member", invite_event.sender)
525 )
526 inviter_name = invite_event.sender
527 if inviter_member_event_id:
528 inviter_member_event = await self.store.get_event(
529 inviter_member_event_id, allow_none=True
530 )
531 if inviter_member_event:
532 inviter_name = name_from_member_event(inviter_member_event)
533
534 if room_name is None:
535 return self.email_subjects.invite_from_person % {
536 "person": inviter_name,
537 "app": self.app_name,
538 }
539 else:
540 return self.email_subjects.invite_from_person_to_room % {
541 "person": inviter_name,
542 "room": room_name,
543 "app": self.app_name,
544 }
545
546 sender_name = None
547 if len(notifs_by_room[room_id]) == 1:
548 # There is just the one notification, so give some detail
549 event = notif_events[notifs_by_room[room_id][0]["event_id"]]
550 if ("m.room.member", event.sender) in room_state_ids[room_id]:
551 state_event_id = room_state_ids[room_id][
552 ("m.room.member", event.sender)
553 ]
554 state_event = await self.store.get_event(state_event_id)
555 sender_name = name_from_member_event(state_event)
556
557 if sender_name is not None and room_name is not None:
558 return self.email_subjects.message_from_person_in_room % {
559 "person": sender_name,
560 "room": room_name,
561 "app": self.app_name,
562 }
563 elif sender_name is not None:
564 return self.email_subjects.message_from_person % {
565 "person": sender_name,
566 "app": self.app_name,
567 }
568 else:
569 # There's more than one notification for this room, so just
570 # say there are several
571 if room_name is not None:
572 return self.email_subjects.messages_in_room % {
573 "room": room_name,
574 "app": self.app_name,
575 }
576 else:
577 # If the room doesn't have a name, say who the messages
578 # are from explicitly to avoid, "messages in the Bob room"
579 sender_ids = list(
580 {
581 notif_events[n["event_id"]].sender
582 for n in notifs_by_room[room_id]
583 }
584 )
585
586 member_events = await self.store.get_events(
587 [
588 room_state_ids[room_id][("m.room.member", s)]
589 for s in sender_ids
590 ]
591 )
592
593 return self.email_subjects.messages_from_person % {
594 "person": descriptor_from_member_events(member_events.values()),
595 "app": self.app_name,
596 }
597 else:
598 # Stuff's happened in multiple different rooms
599
600 # ...but we still refer to the 'reason' room which triggered the mail
601 if reason["room_name"] is not None:
602 return self.email_subjects.messages_in_room_and_others % {
603 "room": reason["room_name"],
604 "app": self.app_name,
605 }
606 else:
607 # If the reason room doesn't have a name, say who the messages
608 # are from explicitly to avoid, "messages in the Bob room"
609 room_id = reason["room_id"]
610
611 sender_ids = list(
612 {
613 notif_events[n["event_id"]].sender
614 for n in notifs_by_room[room_id]
615 }
616 )
617
618 member_events = await self.store.get_events(
619 [room_state_ids[room_id][("m.room.member", s)] for s in sender_ids]
620 )
621
622 return self.email_subjects.messages_from_person_and_others % {
623 "person": descriptor_from_member_events(member_events.values()),
624 "app": self.app_name,
625 }
612 ) -> str:
613 """
614 Make a summary text for the email when multiple rooms have notifications.
615
616 Args:
617 notifs_by_room: A map of room ID to the notifications for that room.
618 room_state_ids: A map of room ID to the state map for that room.
619 notif_events: A map of event ID -> notification event.
620 reason: The reason this notification is being sent.
621
622 Returns:
623 The summary text.
624 """
625 # Stuff's happened in multiple different rooms
626 # ...but we still refer to the 'reason' room which triggered the mail
627 if reason["room_name"] is not None:
628 return self.email_subjects.messages_in_room_and_others % {
629 "room": reason["room_name"],
630 "app": self.app_name,
631 }
632
633 room_id = reason["room_id"]
634 return await self.make_summary_text_from_member_events(
635 room_id, notifs_by_room[room_id], room_state_ids[room_id], notif_events
636 )
637
638 async def make_summary_text_from_member_events(
639 self,
640 room_id: str,
641 notifs: List[Dict[str, Any]],
642 room_state_ids: StateMap[str],
643 notif_events: Dict[str, EventBase],
644 ) -> str:
645 """
646 Make a summary text for the email when only a single room has notifications.
647
648 Args:
649 room_id: The ID of the room.
650 notifs: The notifications for this room.
651 room_state_ids: The state map for the room.
652 notif_events: A map of event ID -> notification event.
653
654 Returns:
655 The summary text.
656 """
657 # If the room doesn't have a name, say who the messages
658 # are from explicitly to avoid, "messages in the Bob room"
659 sender_ids = {notif_events[n["event_id"]].sender for n in notifs}
660
661 member_events = await self.store.get_events(
662 [room_state_ids[("m.room.member", s)] for s in sender_ids]
663 )
664
665 # There was a single sender.
666 if len(sender_ids) == 1:
667 return self.email_subjects.messages_from_person % {
668 "person": descriptor_from_member_events(member_events.values()),
669 "app": self.app_name,
670 }
671
672 # There was more than one sender, use the first one and a tweaked template.
673 return self.email_subjects.messages_from_person_and_others % {
674 "person": descriptor_from_member_events(list(member_events.values())[:1]),
675 "app": self.app_name,
676 }
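The `email_subjects.*` attributes interpolated throughout these methods are old-style `%` templates with named placeholders (`person`, `room`, `app`). A minimal sketch of how one resolves — the template string below is illustrative, not necessarily Synapse's shipped default:

```python
# Old-style %-formatting against a dict of named placeholders, as used by
# the email_subjects templates above. This template string is illustrative;
# it is not necessarily Synapse's shipped default.
messages_from_person_and_others = "[%(app)s] You have messages from %(person)s and others"

subject = messages_from_person_and_others % {
    "person": "Alice",
    "app": "Matrix",
}
print(subject)  # [Matrix] You have messages from Alice and others
```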
626677
627678 def make_room_link(self, room_id: str) -> str:
628679 if self.hs.config.email_riot_base_url:
667718
668719
669720 def safe_markup(raw_html: str) -> jinja2.Markup:
721 """
722 Sanitise a raw HTML string, restricting it to a set of allowed tags and attributes, and linkify any bare URLs.
723
724 Args:
725 raw_html: Unsafe HTML.
726
727 Returns:
728 A Markup object ready to safely use in a Jinja template.
729 """
670730 return jinja2.Markup(
671731 bleach.linkify(
672732 bleach.clean(
683743
684744 def safe_text(raw_text: str) -> jinja2.Markup:
685745 """
686 Process text: treat it as HTML but escape any tags (ie. just escape the
687 HTML) then linkify it.
746 Sanitise text (escape any HTML tags), and then linkify any bare URLs.
747
748 Args:
749 raw_text: Unsafe text which might include HTML markup.
750
751 Returns:
752 A Markup object ready to safely use in a Jinja template.
688753 """
689754 return jinja2.Markup(
690755 bleach.linkify(bleach.clean(raw_text, tags=[], attributes={}, strip=False))
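`safe_text` above escapes first and linkifies second, so user-supplied markup can never survive into the rendered template. A rough stdlib-only sketch of the same escape-then-linkify idea (Synapse uses `bleach`; the URL regex here is a deliberate simplification):

```python
import html
import re

# Simplified URL matcher -- bleach.linkify handles many more edge cases.
URL_RE = re.compile(r"https?://[^\s<]+")

def naive_safe_text(raw_text: str) -> str:
    # 1. Escape any HTML so tags render as literal text.
    escaped = html.escape(raw_text)
    # 2. Wrap bare URLs in anchor tags.
    return URL_RE.sub(
        lambda m: '<a href="%s">%s</a>' % (m.group(0), m.group(0)), escaped
    )

print(naive_safe_text('<b>hi</b> see https://example.com'))
```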
1616 import re
1717 from typing import TYPE_CHECKING, Dict, Iterable, Optional
1818
19 from synapse.api.constants import EventTypes
19 from synapse.api.constants import EventTypes, Membership
2020 from synapse.events import EventBase
2121 from synapse.types import StateMap
2222
6262 m_room_name = await store.get_event(
6363 room_state_ids[(EventTypes.Name, "")], allow_none=True
6464 )
65 if m_room_name and m_room_name.content and m_room_name.content["name"]:
65 if m_room_name and m_room_name.content and m_room_name.content.get("name"):
6666 return m_room_name.content["name"]
6767
6868 # does it have a canonical alias?
7373 if (
7474 canon_alias
7575 and canon_alias.content
76 and canon_alias.content["alias"]
76 and canon_alias.content.get("alias")
7777 and _looks_like_an_alias(canon_alias.content["alias"])
7878 ):
7979 return canon_alias.content["alias"]
80
81 # at this point we're going to need to search the state by all state keys
82 # for an event type, so rearrange the data structure
83 room_state_bytype_ids = _state_as_two_level_dict(room_state_ids)
8480
8581 if not fallback_to_members:
8682 return None
9389
9490 if (
9591 my_member_event is not None
96 and my_member_event.content["membership"] == "invite"
92 and my_member_event.content.get("membership") == Membership.INVITE
9793 ):
9894 if (EventTypes.Member, my_member_event.sender) in room_state_ids:
9995 inviter_member_event = await store.get_event(
110106 else:
111107 return "Room Invite"
112108
109 # at this point we're going to need to search the state by all state keys
110 # for an event type, so rearrange the data structure
111 room_state_bytype_ids = _state_as_two_level_dict(room_state_ids)
112
113113 # we're going to have to generate a name based on who's in the room,
114114 # so find out who is in the room that isn't the user.
115115 if EventTypes.Member in room_state_bytype_ids:
119119 all_members = [
120120 ev
121121 for ev in member_events.values()
122 if ev.content["membership"] == "join"
123 or ev.content["membership"] == "invite"
122 if ev.content.get("membership") == Membership.JOIN
123 or ev.content.get("membership") == Membership.INVITE
124124 ]
125125 # Sort the member events oldest-first so that we name people in the
126126 # order they joined (it should at least be deterministic rather than
193193
194194
195195 def name_from_member_event(member_event: EventBase) -> str:
196 if (
197 member_event.content
198 and "displayname" in member_event.content
199 and member_event.content["displayname"]
200 ):
196 if member_event.content and member_event.content.get("displayname"):
201197 return member_event.content["displayname"]
202198 return member_event.state_key
203199
0 # -*- coding: utf-8 -*-
1 # Copyright 2021 The Matrix.org Foundation C.I.C.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 import logging
16 from typing import TYPE_CHECKING, Any, Optional
17
18 from prometheus_client import Counter
19
20 from synapse.logging.context import make_deferred_yieldable
21 from synapse.util import json_decoder, json_encoder
22
23 if TYPE_CHECKING:
24 from synapse.server import HomeServer
25
26 set_counter = Counter(
27 "synapse_external_cache_set",
28 "Number of times we set a cache",
29 labelnames=["cache_name"],
30 )
31
32 get_counter = Counter(
33 "synapse_external_cache_get",
34 "Number of times we get a cache",
35 labelnames=["cache_name", "hit"],
36 )
37
38
39 logger = logging.getLogger(__name__)
40
41
42 class ExternalCache:
43 """A cache backed by an external Redis. Does nothing if no Redis is
44 configured.
45 """
46
47 def __init__(self, hs: "HomeServer"):
48 self._redis_connection = hs.get_outbound_redis_connection()
49
50 def _get_redis_key(self, cache_name: str, key: str) -> str:
51 return "cache_v1:%s:%s" % (cache_name, key)
52
53 def is_enabled(self) -> bool:
54 """Whether the external cache is used or not.
55
56 It's safe to use the cache when this returns false; the methods will
57 just no-op. The function is useful for avoiding unnecessary work.
58 """
59 return self._redis_connection is not None
60
61 async def set(self, cache_name: str, key: str, value: Any, expiry_ms: int) -> None:
62 """Add the key/value to the named cache, with the expiry time given.
63 """
64
65 if self._redis_connection is None:
66 return
67
68 set_counter.labels(cache_name).inc()
69
70 # txredisapi requires the value to be a string, bytes, or a number, so we
71 # encode stuff in JSON.
72 encoded_value = json_encoder.encode(value)
73
74 logger.debug("Caching %s %s: %r", cache_name, key, encoded_value)
75
76 return await make_deferred_yieldable(
77 self._redis_connection.set(
78 self._get_redis_key(cache_name, key), encoded_value, pexpire=expiry_ms,
79 )
80 )
81
82 async def get(self, cache_name: str, key: str) -> Optional[Any]:
83 """Look up a key/value in the named cache.
84 """
85
86 if self._redis_connection is None:
87 return None
88
89 result = await make_deferred_yieldable(
90 self._redis_connection.get(self._get_redis_key(cache_name, key))
91 )
92
93 logger.debug("Got cache result %s %s: %r", cache_name, key, result)
94
95 get_counter.labels(cache_name, result is not None).inc()
96
97 if not result:
98 return None
99
100 # txredisapi converts numeric strings back to integers for us, so no JSON decode is needed.
101 if isinstance(result, int):
102 return result
103
104 return json_decoder.decode(result)
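The cache JSON-encodes values before storing them, and the `isinstance(result, int)` check above exists because the Redis client layer can hand back numeric strings already converted to ints. A stdlib sketch of the decode path (using `json` in place of Synapse's `json_encoder`/`json_decoder` wrappers):

```python
import json
from typing import Any, Optional

def decode_cached(result: Optional[Any]) -> Optional[Any]:
    # Mirrors ExternalCache.get's post-processing: empty results are a
    # miss, ints come back already decoded, everything else is JSON.
    if not result:
        return None
    if isinstance(result, int):
        return result
    return json.loads(result)

assert decode_cached(None) is None            # cache miss
assert decode_cached(42) == 42                # already an int
assert decode_cached('{"a": 1}') == {"a": 1}  # JSON payload
```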
1414 # limitations under the License.
1515 import logging
1616 from typing import (
17 TYPE_CHECKING,
1718 Any,
1819 Awaitable,
1920 Dict,
6263 TypingStream,
6364 )
6465
66 if TYPE_CHECKING:
67 from synapse.server import HomeServer
68
6569 logger = logging.getLogger(__name__)
6670
6771
8791 back out to connections.
8892 """
8993
90 def __init__(self, hs):
94 def __init__(self, hs: "HomeServer"):
9195 self._replication_data_handler = hs.get_replication_data_handler()
9296 self._presence_handler = hs.get_presence_handler()
9397 self._store = hs.get_datastore()
281285 if hs.config.redis.redis_enabled:
282286 from synapse.replication.tcp.redis import (
283287 RedisDirectTcpReplicationClientFactory,
284 lazyConnection,
285 )
286
287 logger.info(
288 "Connecting to redis (host=%r port=%r)",
289 hs.config.redis_host,
290 hs.config.redis_port,
291288 )
292289
293290 # First let's ensure that we have a ReplicationStreamer started.
298295 # connection after SUBSCRIBE is called).
299296
300297 # First create the connection for sending commands.
301 outbound_redis_connection = lazyConnection(
302 reactor=hs.get_reactor(),
303 host=hs.config.redis_host,
304 port=hs.config.redis_port,
305 password=hs.config.redis.redis_password,
306 reconnect=True,
307 )
298 outbound_redis_connection = hs.get_outbound_redis_connection()
308299
309300 # Now create the factory/connection for the subscription stream.
310301 self._factory = RedisDirectTcpReplicationClientFactory(
1414
1515 import logging
1616 from inspect import isawaitable
17 from typing import TYPE_CHECKING, Optional
17 from typing import TYPE_CHECKING, Optional, Type, cast
1818
1919 import txredisapi
2020
2222 from synapse.metrics.background_process_metrics import (
2323 BackgroundProcessLoggingContext,
2424 run_as_background_process,
25 wrap_as_background_process,
2526 )
2627 from synapse.replication.tcp.commands import (
2728 Command,
5859 immediately after initialisation.
5960
6061 Attributes:
61 handler: The command handler to handle incoming commands.
62 stream_name: The *redis* stream name to subscribe to and publish from
63 (not anything to do with Synapse replication streams).
64 outbound_redis_connection: The connection to redis to use to send
62 synapse_handler: The command handler to handle incoming commands.
63 synapse_stream_name: The *redis* stream name to subscribe to and publish
64 from (not anything to do with Synapse replication streams).
65 synapse_outbound_redis_connection: The connection to redis to use to send
6566 commands.
6667 """
6768
68 handler = None # type: ReplicationCommandHandler
69 stream_name = None # type: str
70 outbound_redis_connection = None # type: txredisapi.RedisProtocol
69 synapse_handler = None # type: ReplicationCommandHandler
70 synapse_stream_name = None # type: str
71 synapse_outbound_redis_connection = None # type: txredisapi.RedisProtocol
7172
7273 def __init__(self, *args, **kwargs):
7374 super().__init__(*args, **kwargs)
8788 # it's important to make sure that we only send the REPLICATE command once we
8889 # have successfully subscribed to the stream - otherwise we might miss the
8990 # POSITION response sent back by the other end.
90 logger.info("Sending redis SUBSCRIBE for %s", self.stream_name)
91 await make_deferred_yieldable(self.subscribe(self.stream_name))
91 logger.info("Sending redis SUBSCRIBE for %s", self.synapse_stream_name)
92 await make_deferred_yieldable(self.subscribe(self.synapse_stream_name))
9293 logger.info(
9394 "Successfully subscribed to redis stream, sending REPLICATE command"
9495 )
95 self.handler.new_connection(self)
96 self.synapse_handler.new_connection(self)
9697 await self._async_send_command(ReplicateCommand())
9798 logger.info("REPLICATE successfully sent")
9899
99100 # We send out our positions when there is a new connection in case the
100101 # other side missed updates. We do this for Redis connections as the
101102 # other side won't know we've connected and so won't issue a REPLICATE.
102 self.handler.send_positions_to_connection(self)
103 self.synapse_handler.send_positions_to_connection(self)
103104
104105 def messageReceived(self, pattern: str, channel: str, message: str):
105106 """Received a message from redis.
136137 cmd: received command
137138 """
138139
139 cmd_func = getattr(self.handler, "on_%s" % (cmd.NAME,), None)
140 cmd_func = getattr(self.synapse_handler, "on_%s" % (cmd.NAME,), None)
140141 if not cmd_func:
141142 logger.warning("Unhandled command: %r", cmd)
142143 return
154155 def connectionLost(self, reason):
155156 logger.info("Lost connection to redis")
156157 super().connectionLost(reason)
157 self.handler.lost_connection(self)
158 self.synapse_handler.lost_connection(self)
158159
159160 # mark the logging context as finished
160161 self._logging_context.__exit__(None, None, None)
182183 tcp_outbound_commands_counter.labels(cmd.NAME, "redis").inc()
183184
184185 await make_deferred_yieldable(
185 self.outbound_redis_connection.publish(self.stream_name, encoded_string)
186 )
187
188
189 class RedisDirectTcpReplicationClientFactory(txredisapi.SubscriberFactory):
186 self.synapse_outbound_redis_connection.publish(
187 self.synapse_stream_name, encoded_string
188 )
189 )
190
191
192 class SynapseRedisFactory(txredisapi.RedisFactory):
193 """A subclass of RedisFactory that periodically sends pings to ensure that
194 we detect dead connections.
195 """
196
197 def __init__(
198 self,
199 hs: "HomeServer",
200 uuid: str,
201 dbid: Optional[int],
202 poolsize: int,
203 isLazy: bool = False,
204 handler: Type = txredisapi.ConnectionHandler,
205 charset: str = "utf-8",
206 password: Optional[str] = None,
207 replyTimeout: int = 30,
208 convertNumbers: Optional[int] = True,
209 ):
210 super().__init__(
211 uuid=uuid,
212 dbid=dbid,
213 poolsize=poolsize,
214 isLazy=isLazy,
215 handler=handler,
216 charset=charset,
217 password=password,
218 replyTimeout=replyTimeout,
219 convertNumbers=convertNumbers,
220 )
221
222 hs.get_clock().looping_call(self._send_ping, 30 * 1000)
223
224 @wrap_as_background_process("redis_ping")
225 async def _send_ping(self):
226 for connection in self.pool:
227 try:
228 await make_deferred_yieldable(connection.ping())
229 except Exception:
230 logger.warning("Failed to send ping to a redis connection")
231
232
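The looping ping above is a generic keepalive pattern: a timer fires every 30 seconds and pings each pooled connection, logging rather than raising on failure. A synchronous, stdlib-only sketch of the same idea (Synapse's real version runs on the Twisted reactor via `looping_call`):

```python
import threading
import time

def start_keepalive(ping_fn, interval_s: float) -> threading.Timer:
    # Schedule ping_fn every interval_s seconds, swallowing failures
    # (the real version logs a warning and keeps going).
    def tick():
        try:
            ping_fn()
        except Exception:
            pass
        start_keepalive(ping_fn, interval_s)
    timer = threading.Timer(interval_s, tick)
    timer.daemon = True
    timer.start()
    return timer

pings = []
start_keepalive(lambda: pings.append(1), 0.05)
time.sleep(0.2)  # several pings have fired by now
```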
233 class RedisDirectTcpReplicationClientFactory(SynapseRedisFactory):
190234 """This is a reconnecting factory that connects to redis and immediately
191235 subscribes to a stream.
192236
205249 self, hs: "HomeServer", outbound_redis_connection: txredisapi.RedisProtocol
206250 ):
207251
208 super().__init__()
209
210 # This sets the password on the RedisFactory base class (as
211 # SubscriberFactory constructor doesn't pass it through).
212 self.password = hs.config.redis.redis_password
213
214 self.handler = hs.get_tcp_replication()
215 self.stream_name = hs.hostname
216
217 self.outbound_redis_connection = outbound_redis_connection
252 super().__init__(
253 hs,
254 uuid="subscriber",
255 dbid=None,
256 poolsize=1,
257 replyTimeout=30,
258 password=hs.config.redis.redis_password,
259 )
260
261 self.synapse_handler = hs.get_tcp_replication()
262 self.synapse_stream_name = hs.hostname
263
264 self.synapse_outbound_redis_connection = outbound_redis_connection
218265
219266 def buildProtocol(self, addr):
220 p = super().buildProtocol(addr) # type: RedisSubscriber
267 p = super().buildProtocol(addr)
268 p = cast(RedisSubscriber, p)
221269
222270 # We do this here rather than add it to the constructor of `RedisSubscriber`
223271 # as to do so would involve overriding `buildProtocol` entirely, however
224272 # the base method does some other things than just instantiating the
225273 # protocol.
226 p.handler = self.handler
227 p.outbound_redis_connection = self.outbound_redis_connection
228 p.stream_name = self.stream_name
229 p.password = self.password
274 p.synapse_handler = self.synapse_handler
275 p.synapse_outbound_redis_connection = self.synapse_outbound_redis_connection
276 p.synapse_stream_name = self.synapse_stream_name
230277
231278 return p
232279
233280
234281 def lazyConnection(
235 reactor,
282 hs: "HomeServer",
236283 host: str = "localhost",
237284 port: int = 6379,
238285 dbid: Optional[int] = None,
239286 reconnect: bool = True,
240 charset: str = "utf-8",
241287 password: Optional[str] = None,
242 connectTimeout: Optional[int] = None,
243 replyTimeout: Optional[int] = None,
244 convertNumbers: bool = True,
288 replyTimeout: int = 30,
245289 ) -> txredisapi.RedisProtocol:
246 """Equivalent to `txredisapi.lazyConnection`, except allows specifying a
247 reactor.
290 """Creates a connection to Redis that is lazily set up and reconnects if the
291 connection is lost.
248292 """
249293
250 isLazy = True
251 poolsize = 1
252
253294 uuid = "%s:%d" % (host, port)
254 factory = txredisapi.RedisFactory(
255 uuid,
256 dbid,
257 poolsize,
258 isLazy,
259 txredisapi.ConnectionHandler,
260 charset,
261 password,
262 replyTimeout,
263 convertNumbers,
295 factory = SynapseRedisFactory(
296 hs,
297 uuid=uuid,
298 dbid=dbid,
299 poolsize=1,
300 isLazy=True,
301 handler=txredisapi.ConnectionHandler,
302 password=password,
303 replyTimeout=replyTimeout,
264304 )
265305 factory.continueTrying = reconnect
266 for x in range(poolsize):
267 reactor.connectTCP(host, port, factory, connectTimeout)
306
307 reactor = hs.get_reactor()
308 reactor.connectTCP(host, port, factory, 30)
268309
269310 return factory.handler
0 body {
1 font-family: "Inter", "Helvetica", "Arial", sans-serif;
2 font-size: 14px;
3 color: #17191C;
4 }
5
6 header {
7 max-width: 480px;
8 width: 100%;
9 margin: 24px auto;
10 text-align: center;
11 }
12
13 header p {
14 color: #737D8C;
15 line-height: 24px;
16 }
17
18 h1 {
19 font-size: 24px;
20 }
21
22 .error_page h1 {
23 color: #FE2928;
24 }
25
26 h2 {
27 font-size: 14px;
28 }
29
30 h2 img {
31 vertical-align: middle;
32 margin-right: 8px;
33 width: 24px;
34 height: 24px;
35 }
36
37 label {
38 cursor: pointer;
39 }
40
41 main {
42 max-width: 360px;
43 width: 100%;
44 margin: 24px auto;
45 }
46
47 .primary-button {
48 border: none;
49 text-decoration: none;
50 padding: 12px;
51 color: white;
52 background-color: #418DED;
53 font-weight: bold;
54 display: block;
55 border-radius: 12px;
56 width: 100%;
57 box-sizing: border-box;
58 margin: 16px 0;
59 cursor: pointer;
60 text-align: center;
61 }
62
63 .profile {
64 display: flex;
65 justify-content: center;
66 margin: 24px 0;
67 }
68
69 .profile .avatar {
70 width: 36px;
71 height: 36px;
72 border-radius: 100%;
73 display: block;
74 margin-right: 8px;
75 }
76
77 .profile .display-name {
78 font-weight: bold;
79 margin-bottom: 4px;
80 }
81 .profile .user-id {
82 color: #737D8C;
83 }
84
85 .profile .display-name, .profile .user-id {
86 line-height: 18px;
87 }
00 <!DOCTYPE html>
11 <html lang="en">
2 <head>
3 <meta charset="UTF-8">
4 <title>SSO account deactivated</title>
5 </head>
6 <body>
7 <p>This account has been deactivated.</p>
2 <head>
3 <meta charset="UTF-8">
4 <title>SSO account deactivated</title>
5 <meta name="viewport" content="width=device-width, user-scalable=no">
6 <style type="text/css">
7 {% include "sso.css" without context %}
8 </style>
9 </head>
10 <body class="error_page">
11 <header>
12 <h1>Your account has been deactivated</h1>
13 <p>
14 <strong>No account found</strong>
15 </p>
16 <p>
17 Your account might have been deactivated by the server administrator.
18 You can either try to create a new account or contact the server’s
19 administrator.
20 </p>
21 </header>
822 </body>
923 </html>
0 <!DOCTYPE html>
1 <html lang="en">
2 <head>
3 <title>Synapse Login</title>
4 <meta charset="utf-8">
5 <meta name="viewport" content="width=device-width, user-scalable=no">
6 <style type="text/css">
7 {% include "sso.css" without context %}
8
9 .username_input {
10 display: flex;
11 border: 2px solid #418DED;
12 border-radius: 8px;
13 padding: 12px;
14 position: relative;
15 margin: 16px 0;
16 align-items: center;
17 font-size: 12px;
18 }
19
20 .username_input.invalid {
21 border-color: #FE2928;
22 }
23
24 .username_input.invalid input, .username_input.invalid label {
25 color: #FE2928;
26 }
27
28 .username_input div, .username_input input {
29 line-height: 18px;
30 font-size: 14px;
31 }
32
33 .username_input label {
34 position: absolute;
35 top: -8px;
36 left: 14px;
37 font-size: 80%;
38 background: white;
39 padding: 2px;
40 }
41
42 .username_input input {
43 flex: 1;
44 display: block;
45 min-width: 0;
46 border: none;
47 }
48
49 .username_input div {
50 color: #8D99A5;
51 }
52
53 .idp-pick-details {
54 border: 1px solid #E9ECF1;
55 border-radius: 8px;
56 margin: 24px 0;
57 }
58
59 .idp-pick-details h2 {
60 margin: 0;
61 padding: 8px 12px;
62 }
63
64 .idp-pick-details .idp-detail {
65 border-top: 1px solid #E9ECF1;
66 padding: 12px;
67 }
68 .idp-pick-details .check-row {
69 display: flex;
70 align-items: center;
71 }
72
73 .idp-pick-details .check-row .name {
74 flex: 1;
75 }
76
77 .idp-pick-details .use, .idp-pick-details .idp-value {
78 color: #737D8C;
79 }
80
81 .idp-pick-details .idp-value {
82 margin: 0;
83 margin-top: 8px;
84 }
85
86 .idp-pick-details .avatar {
87 width: 53px;
88 height: 53px;
89 border-radius: 100%;
90 display: block;
91 margin-top: 8px;
92 }
93
94 output {
95 padding: 0 14px;
96 display: block;
97 }
98
99 output.error {
100 color: #FE2928;
101 }
102 </style>
103 </head>
104 <body>
105 <header>
106 <h1>Your account is nearly ready</h1>
107 <p>Check your details before creating an account on {{ server_name }}</p>
108 </header>
109 <main>
110 <form method="post" class="form__input" id="form">
111 <div class="username_input" id="username_input">
112 <label for="field-username">Username</label>
113 <div class="prefix">@</div>
114 <input type="text" name="username" id="field-username" autofocus>
115 <div class="postfix">:{{ server_name }}</div>
116 </div>
117 <output for="username_input" id="field-username-output"></output>
118 <input type="submit" value="Continue" class="primary-button">
119 {% if user_attributes %}
120 <section class="idp-pick-details">
121 <h2><img src="{{ idp.idp_icon | mxc_to_http(24, 24) }}"/>Information from {{ idp.idp_name }}</h2>
122 {% if user_attributes.avatar_url %}
123 <div class="idp-detail idp-avatar">
124 <div class="check-row">
125 <label for="idp-avatar" class="name">Avatar</label>
126 <label for="idp-avatar" class="use">Use</label>
127 <input type="checkbox" name="use_avatar" id="idp-avatar" value="true" checked>
128 </div>
129 <img src="{{ user_attributes.avatar_url }}" class="avatar" />
130 </div>
131 {% endif %}
132 {% if user_attributes.display_name %}
133 <div class="idp-detail">
134 <div class="check-row">
135 <label for="idp-displayname" class="name">Display name</label>
136 <label for="idp-displayname" class="use">Use</label>
137 <input type="checkbox" name="use_display_name" id="idp-displayname" value="true" checked>
138 </div>
139 <p class="idp-value">{{ user_attributes.display_name }}</p>
140 </div>
141 {% endif %}
142 {% for email in user_attributes.emails %}
143 <div class="idp-detail">
144 <div class="check-row">
145 <label for="idp-email{{ loop.index }}" class="name">E-mail</label>
146 <label for="idp-email{{ loop.index }}" class="use">Use</label>
147 <input type="checkbox" name="use_email" id="idp-email{{ loop.index }}" value="{{ email }}" checked>
148 </div>
149 <p class="idp-value">{{ email }}</p>
150 </div>
151 {% endfor %}
152 </section>
153 {% endif %}
154 </form>
155 </main>
156 <script type="text/javascript">
157 {% include "sso_auth_account_details.js" without context %}
158 </script>
159 </body>
160 </html>
0 const usernameField = document.getElementById("field-username");
1 const usernameOutput = document.getElementById("field-username-output");
2 const form = document.getElementById("form");
3
4 // needed to validate on change event when no input was changed
5 let needsValidation = true;
6 let isValid = false;
7
8 function throttle(fn, wait) {
9 let timeout;
10 const throttleFn = function() {
11 const args = Array.from(arguments);
12 if (timeout) {
13 clearTimeout(timeout);
14 }
15 timeout = setTimeout(fn.bind.apply(fn, [null].concat(args)), wait);
16 };
17 throttleFn.cancelQueued = function() {
18 clearTimeout(timeout);
19 };
20 return throttleFn;
21 }
22
23 function checkUsernameAvailable(username) {
24 let check_uri = 'check?username=' + encodeURIComponent(username);
25 return fetch(check_uri, {
26 // include the cookie
27 "credentials": "same-origin",
28 }).then(function(response) {
29 if(!response.ok) {
30 // for non-200 responses, raise the body of the response as an exception
31 return response.text().then((text) => { throw new Error(text); });
32 } else {
33 return response.json();
34 }
35 }).then(function(json) {
36 if(json.error) {
37 return {message: json.error};
38 } else if(json.available) {
39 return {available: true};
40 } else {
41 return {message: username + " is not available, please choose another."};
42 }
43 });
44 }
45
46 const allowedUsernameCharacters = new RegExp("^[a-z0-9\\.\\_\\-\\/\\=]+$");
47 const allowedCharactersString = "lowercase letters, digits, ., _, -, /, =";
48
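The `allowedUsernameCharacters` regex above mirrors the Matrix localpart grammar. The same check translated to Python (a sketch; the server-side availability check remains authoritative):

```python
import re

# Python translation of the template's allowedUsernameCharacters regex.
# Client-side only in the real flow: the server remains authoritative.
ALLOWED_USERNAME = re.compile(r"^[a-z0-9._\-/=]+$")

assert ALLOWED_USERNAME.match("alice.bob-1")
assert not ALLOWED_USERNAME.match("Alice")  # uppercase is rejected
assert not ALLOWED_USERNAME.match("bob!")   # '!' is not allowed
```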
49 function reportError(error) {
50 throttledCheckUsernameAvailable.cancelQueued();
51 usernameOutput.innerText = error;
52 usernameOutput.classList.add("error");
53 usernameField.parentElement.classList.add("invalid");
54 usernameField.focus();
55 }
56
57 function validateUsername(username) {
58 isValid = false;
59 needsValidation = false;
60 usernameOutput.innerText = "";
61 usernameField.parentElement.classList.remove("invalid");
62 usernameOutput.classList.remove("error");
63 if (!username) {
64 return reportError("Please provide a username");
65 }
66 if (username.length > 255) {
67 return reportError("Too long, please choose something shorter");
68 }
69 if (!allowedUsernameCharacters.test(username)) {
70 return reportError("Invalid username, please only use " + allowedCharactersString);
71 }
72 usernameOutput.innerText = "Checking if username is available …";
73 throttledCheckUsernameAvailable(username);
74 }
75
76 const throttledCheckUsernameAvailable = throttle(function(username) {
77 const handleError = function(err) {
78 // don't prevent form submission on error
79 usernameOutput.innerText = "";
80 isValid = true;
81 };
82 try {
83 checkUsernameAvailable(username).then(function(result) {
84 if (!result.available) {
85 reportError(result.message);
86 } else {
87 isValid = true;
88 usernameOutput.innerText = "";
89 }
90 }, handleError);
91 } catch (err) {
92 handleError(err);
93 }
94 }, 500);
95
96 form.addEventListener("submit", function(evt) {
97 if (needsValidation) {
98 validateUsername(usernameField.value);
99 evt.preventDefault();
100 return;
101 }
102 if (!isValid) {
103 evt.preventDefault();
104 usernameField.focus();
105 return;
106 }
107 });
108 usernameField.addEventListener("input", function(evt) {
109 validateUsername(usernameField.value);
110 });
111 usernameField.addEventListener("change", function(evt) {
112 if (needsValidation) {
113 validateUsername(usernameField.value);
114 }
115 });
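The `throttle` helper in the script above is really a trailing-edge debounce: each new call cancels the pending timer, so the availability check only fires once the user stops typing. The same pattern as a Python sketch (the JS version additionally exposes `cancelQueued`):

```python
import threading
import time

def debounce(fn, wait_s: float):
    # Trailing-edge debounce, like the template's throttle() helper:
    # each call cancels the pending timer, so fn only runs once the
    # caller has been quiet for wait_s seconds.
    timer = None
    def wrapper(*args):
        nonlocal timer
        if timer is not None:
            timer.cancel()
        timer = threading.Timer(wait_s, fn, args)
        timer.daemon = True
        timer.start()
    return wrapper

calls = []
debounced = debounce(calls.append, 0.05)
for partial in ("a", "ab", "abc"):
    debounced(partial)  # earlier pending calls are cancelled
time.sleep(0.15)
print(calls)  # only the last argument survives
```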
0 <html>
1 <head>
2 <title>Authentication Failed</title>
3 </head>
4 <body>
5 <div>
0 <!DOCTYPE html>
1 <html lang="en">
2 <head>
3 <meta charset="UTF-8">
4 <title>Authentication failed</title>
5 <meta name="viewport" content="width=device-width, user-scalable=no">
6 <style type="text/css">
7 {% include "sso.css" without context %}
8 </style>
9 </head>
10 <body class="error_page">
11 <header>
12 <h1>That doesn't look right</h1>
613 <p>
7 We were unable to validate your <tt>{{server_name | e}}</tt> account via
8 single-sign-on (SSO), because the SSO Identity Provider returned
9 different details than when you logged in.
14 <strong>We were unable to validate your {{ server_name }} account</strong>
15 via single&nbsp;sign&#8209;on&nbsp;(SSO), because the SSO Identity
16 Provider returned different details than when you logged in.
1017 </p>
1118 <p>
1219 Try the operation again, and ensure that you use the same details on
1320 the Identity Provider as when you log into your account.
1421 </p>
15 </div>
22 </header>
1623 </body>
1724 </html>
0 <html>
1 <head>
2 <title>Authentication</title>
3 </head>
0 <!DOCTYPE html>
1 <html lang="en">
2 <head>
3 <meta charset="UTF-8">
4 <title>Authentication</title>
5 <meta name="viewport" content="width=device-width, user-scalable=no">
6 <style type="text/css">
7 {% include "sso.css" without context %}
8 </style>
9 </head>
410 <body>
5 <div>
11 <header>
12 <h1>Confirm it's you to continue</h1>
613 <p>
7 A client is trying to {{ description | e }}. To confirm this action,
8 <a href="{{ redirect_url | e }}">re-authenticate with single sign-on</a>.
9 If you did not expect this, your account may be compromised!
14 A client is trying to {{ description }}. To confirm this action,
15 re-authorize your account with single sign-on.
1016 </p>
11 </div>
17 <p><strong>
18 If you did not expect this, your account may be compromised.
19 </strong></p>
20 </header>
21 <main>
22 <a href="{{ redirect_url }}" class="primary-button">
23 Continue with {{ idp.idp_name }}
24 </a>
25 </main>
1226 </body>
1327 </html>
0 <html>
1 <head>
2 <title>Authentication Successful</title>
3 <script>
4 if (window.onAuthDone) {
5 window.onAuthDone();
6 } else if (window.opener && window.opener.postMessage) {
7 window.opener.postMessage("authDone", "*");
8 }
9 </script>
10 </head>
0 <!DOCTYPE html>
1 <html lang="en">
2 <head>
3 <meta charset="UTF-8">
4 <title>Authentication successful</title>
5 <meta name="viewport" content="width=device-width, user-scalable=no">
6 <style type="text/css">
7 {% include "sso.css" without context %}
8 </style>
9 <script>
10 if (window.onAuthDone) {
11 window.onAuthDone();
12 } else if (window.opener && window.opener.postMessage) {
13 window.opener.postMessage("authDone", "*");
14 }
15 </script>
16 </head>
1117 <body>
12 <div>
13 <p>Thank you</p>
14 <p>You may now close this window and return to the application</p>
15 </div>
18 <header>
19 <h1>Thank you</h1>
20 <p>
21 Now we know it’s you, you can close this window and return to the
22 application.
23 </p>
24 </header>
1625 </body>
1726 </html>
00 <!DOCTYPE html>
11 <html lang="en">
2 <head>
3 <meta charset="UTF-8">
4 <title>SSO error</title>
5 </head>
6 <body>
2 <head>
3 <meta charset="UTF-8">
4 <title>Authentication failed</title>
5 <meta name="viewport" content="width=device-width, user-scalable=no">
6 <style type="text/css">
7 {% include "sso.css" without context %}
8
9 #error_code {
10 margin-top: 56px;
11 }
12 </style>
13 </head>
14 <body class="error_page">
715 {# If an error of unauthorised is returned it means we have actively rejected their login #}
816 {% if error == "unauthorised" %}
9 <p>You are not allowed to log in here.</p>
17 <header>
18 <p>You are not allowed to log in here.</p>
19 </header>
1020 {% else %}
11 <p>
12 There was an error during authentication:
13 </p>
14 <div id="errormsg" style="margin:20px 80px">{{ error_description | e }}</div>
15 <p>
16 If you are seeing this page after clicking a link sent to you via email, make
17 sure you only click the confirmation link once, and that you open the
18 validation link in the same client you're logging in from.
19 </p>
20 <p>
21 Try logging in again from your Matrix client and if the problem persists
22 please contact the server's administrator.
23 </p>
24 <p>Error: <code>{{ error }}</code></p>
21 <header>
22 <h1>There was an error</h1>
23 <p>
24 <strong id="errormsg">{{ error_description }}</strong>
25 </p>
26 <p>
27 If you are seeing this page after clicking a link sent to you via email,
28 make sure you only click the confirmation link once, and that you open
29 the validation link in the same client you're logging in from.
30 </p>
31 <p>
32 Try logging in again from your Matrix client and if the problem persists
33 please contact the server's administrator.
34 </p>
35 <div id="error_code">
36 <p><strong>Error code</strong></p>
37 <p>{{ error }}</p>
38 </div>
39 </header>
2540
26 <script type="text/javascript">
27 // Error handling to support Auth0 errors that we might get through a GET request
28 // to the validation endpoint. If an error is provided, it's either going to be
29 // located in the query string or in a query string-like URI fragment.
30 // We try to locate the error from any of these two locations, but if we can't
31 // we just don't print anything specific.
32 let searchStr = "";
33 if (window.location.search) {
34 // window.location.searchParams isn't always defined when
35 // window.location.search is, so it's more reliable to parse the latter.
36 searchStr = window.location.search;
37 } else if (window.location.hash) {
38 // Replace the # with a ? so that URLSearchParams does the right thing and
39 // doesn't parse the first parameter incorrectly.
40 searchStr = window.location.hash.replace("#", "?");
41 }
41 <script type="text/javascript">
42 // Error handling to support Auth0 errors that we might get through a GET request
43 // to the validation endpoint. If an error is provided, it's either going to be
44 // located in the query string or in a query string-like URI fragment.
45 // We try to locate the error from any of these two locations, but if we can't
46 // we just don't print anything specific.
47 let searchStr = "";
48 if (window.location.search) {
49 // window.location.searchParams isn't always defined when
50 // window.location.search is, so it's more reliable to parse the latter.
51 searchStr = window.location.search;
52 } else if (window.location.hash) {
53 // Replace the # with a ? so that URLSearchParams does the right thing and
54 // doesn't parse the first parameter incorrectly.
55 searchStr = window.location.hash.replace("#", "?");
56 }
4257
43 // We might end up with no error in the URL, so we need to check if we have one
44 // to print one.
45 let errorDesc = new URLSearchParams(searchStr).get("error_description")
46 if (errorDesc) {
47 document.getElementById("errormsg").innerText = errorDesc;
48 }
49 </script>
58 // We might end up with no error in the URL, so we need to check if we have one
59 // to print one.
60 let errorDesc = new URLSearchParams(searchStr).get("error_description")
61 if (errorDesc) {
62 document.getElementById("errormsg").innerText = errorDesc;
63 }
64 </script>
5065 {% endif %}
5166 </body>
5267 </html>
22 <head>
33 <meta charset="UTF-8">
44 <link rel="stylesheet" href="/_matrix/static/client/login/style.css">
5 <title>{{server_name | e}} Login</title>
5 <title>{{ server_name }} Login</title>
66 </head>
77 <body>
88 <div id="container">
9 <h1 id="title">{{server_name | e}} Login</h1>
9 <h1 id="title">{{ server_name }} Login</h1>
1010 <div class="login_flow">
1111 <p>Choose one of the following identity providers:</p>
1212 <form>
13 <input type="hidden" name="redirectUrl" value="{{redirect_url | e}}">
13 <input type="hidden" name="redirectUrl" value="{{ redirect_url }}">
1414 <ul class="radiobuttons">
1515 {% for p in providers %}
1616 <li>
17 <input type="radio" name="idp" id="prov{{loop.index}}" value="{{p.idp_id}}">
18 <label for="prov{{loop.index}}">{{p.idp_name | e}}</label>
17 <input type="radio" name="idp" id="prov{{ loop.index }}" value="{{ p.idp_id }}">
18 <label for="prov{{ loop.index }}">{{ p.idp_name }}</label>
1919 {% if p.idp_icon %}
20 <img src="{{p.idp_icon | mxc_to_http(32, 32)}}"/>
20 <img src="{{ p.idp_icon | mxc_to_http(32, 32) }}"/>
2121 {% endif %}
2222 </li>
2323 {% endfor %}
0 <!DOCTYPE html>
1 <html lang="en">
2 <head>
3 <meta charset="UTF-8">
4 <title>SSO redirect confirmation</title>
5 <meta name="viewport" content="width=device-width, user-scalable=no">
6 <style type="text/css">
7 {% include "sso.css" without context %}
8
9 #consent_form {
10 margin-top: 56px;
11 }
12 </style>
13 </head>
14 <body>
15 <header>
16 <h1>Your account is nearly ready</h1>
17 <p>Agree to the terms to create your account.</p>
18 </header>
19 <main>
20 <!-- {% if user_profile.avatar_url and user_profile.display_name %} -->
21 <div class="profile">
22 <img src="{{ user_profile.avatar_url | mxc_to_http(64, 64) }}" class="avatar" />
23 <div class="profile-details">
24 <div class="display-name">{{ user_profile.display_name }}</div>
25 <div class="user-id">{{ user_id }}</div>
26 </div>
27 </div>
28 <!-- {% endif %} -->
29 <form method="post" action="{{ my_url }}" id="consent_form">
30 <p>
31 <input id="accepted_version" type="checkbox" name="accepted_version" value="{{ consent_version }}" required>
32 <label for="accepted_version">I have read and agree to the <a href="{{ terms_url }}" target="_blank">terms and conditions</a>.</label>
33 </p>
34 <input type="submit" class="primary-button" value="Continue"/>
35 </form>
36 </main>
37 </body>
38 </html>
22 <head>
33 <meta charset="UTF-8">
44 <title>SSO redirect confirmation</title>
5 <meta name="viewport" content="width=device-width, user-scalable=no">
6 <style type="text/css">
7 {% include "sso.css" without context %}
8 </style>
59 </head>
610 <body>
7 <p>The application at <span style="font-weight:bold">{{ display_url | e }}</span> is requesting full access to your <span style="font-weight:bold">{{ server_name }}</span> Matrix account.</p>
8 <p>If you don't recognise this address, you should ignore this and close this tab.</p>
9 <p>
10 <a href="{{ redirect_url | e }}">I trust this address</a>
11 </p>
11 <header>
12 {% if new_user %}
13 <h1>Your account is now ready</h1>
14 <p>You've made your account on {{ server_name }}.</p>
15 {% else %}
16 <h1>Log in</h1>
17 {% endif %}
18 <p>Continue to confirm you trust <strong>{{ display_url }}</strong>.</p>
19 </header>
20 <main>
21 {% if user_profile.avatar_url %}
22 <div class="profile">
23 <img src="{{ user_profile.avatar_url | mxc_to_http(64, 64) }}" class="avatar" />
24 <div class="profile-details">
25 {% if user_profile.display_name %}
26 <div class="display-name">{{ user_profile.display_name }}</div>
27 {% endif %}
28 <div class="user-id">{{ user_id }}</div>
29 </div>
30 </div>
31 {% endif %}
32 <a href="{{ redirect_url }}" class="primary-button">Continue</a>
33 </main>
1234 </body>
13 </html>
35 </html>
+0
-19
synapse/res/username_picker/index.html
0 <!DOCTYPE html>
1 <html lang="en">
2 <head>
3 <title>Synapse Login</title>
4 <link rel="stylesheet" href="style.css" type="text/css" />
5 </head>
6 <body>
7 <div class="card">
8 <form method="post" class="form__input" id="form" action="submit">
9 <label for="field-username">Please pick your username:</label>
10 <input type="text" name="username" id="field-username" autofocus="">
11 <input type="submit" class="button button--full-width" id="button-submit" value="Submit">
12 </form>
13 <!-- this is used for feedback -->
14 <div role=alert class="tooltip hidden" id="message"></div>
15 <script src="script.js"></script>
16 </div>
17 </body>
18 </html>
+0
-95
synapse/res/username_picker/script.js
0 let inputField = document.getElementById("field-username");
1 let inputForm = document.getElementById("form");
2 let submitButton = document.getElementById("button-submit");
3 let message = document.getElementById("message");
4
5 // Submit username and receive response
6 function showMessage(messageText) {
7 // Unhide the message text
8 message.classList.remove("hidden");
9
10 message.textContent = messageText;
11 };
12
13 function doSubmit() {
14 showMessage("Success. Please wait a moment for your browser to redirect.");
15
16 // remove the event handler before re-submitting the form.
17 delete inputForm.onsubmit;
18 inputForm.submit();
19 }
20
21 function onResponse(response) {
22 // Display message
23 showMessage(response);
24
25 // Enable submit button and input field
26 submitButton.classList.remove('button--disabled');
27 submitButton.value = "Submit";
28 };
29
30 let allowedUsernameCharacters = RegExp("[^a-z0-9\\.\\_\\=\\-\\/]");
31 function usernameIsValid(username) {
32 return !allowedUsernameCharacters.test(username);
33 }
34 let allowedCharactersString = "lowercase letters, digits, ., _, -, /, =";
35
36 function buildQueryString(params) {
37 return Object.keys(params)
38 .map(k => encodeURIComponent(k) + '=' + encodeURIComponent(params[k]))
39 .join('&');
40 }
41
42 function submitUsername(username) {
43 if(username.length == 0) {
44 onResponse("Please enter a username.");
45 return;
46 }
47 if(!usernameIsValid(username)) {
48 onResponse("Invalid username. Only the following characters are allowed: " + allowedCharactersString);
49 return;
50 }
51
52 // if this browser doesn't support fetch, skip the availability check.
53 if(!window.fetch) {
54 doSubmit();
55 return;
56 }
57
58 let check_uri = 'check?' + buildQueryString({"username": username});
59 fetch(check_uri, {
60 // include the cookie
61 "credentials": "same-origin",
62 }).then((response) => {
63 if(!response.ok) {
64 // for non-200 responses, raise the body of the response as an exception
65 return response.text().then((text) => { throw text; });
66 } else {
67 return response.json();
68 }
69 }).then((json) => {
70 if(json.error) {
71 throw json.error;
72 } else if(json.available) {
73 doSubmit();
74 } else {
75 onResponse("This username is not available, please choose another.");
76 }
77 }).catch((err) => {
78 onResponse("Error checking username availability: " + err);
79 });
80 }
81
82 function clickSubmit() {
83 event.preventDefault();
84 if(submitButton.classList.contains('button--disabled')) { return; }
85
86 // Disable submit button and input field
87 submitButton.classList.add('button--disabled');
88
89 // Submit username
90 submitButton.value = "Checking...";
91 submitUsername(inputField.value);
92 };
93
94 inputForm.onsubmit = clickSubmit;
+0
-27
synapse/res/username_picker/style.css
0 input[type="text"] {
1 font-size: 100%;
2 background-color: #ededf0;
3 border: 1px solid #fff;
4 border-radius: .2em;
5 padding: .5em .9em;
6 display: block;
7 width: 26em;
8 }
9
10 .button--disabled {
11 border-color: #fff;
12 background-color: transparent;
13 color: #000;
14 text-transform: none;
15 }
16
17 .hidden {
18 display: none;
19 }
20
21 .tooltip {
22 background-color: #f9f9fa;
23 padding: 1em;
24 margin: 1em 0;
25 }
26
00 # -*- coding: utf-8 -*-
11 # Copyright 2014-2016 OpenMarket Ltd
22 # Copyright 2018-2019 New Vector Ltd
3 # Copyright 2020, 2021 The Matrix.org Foundation C.I.C.
4
35 #
46 # Licensed under the Apache License, Version 2.0 (the "License");
57 # you may not use this file except in compliance with the License.
3537 from synapse.rest.admin.purge_room_servlet import PurgeRoomServlet
3638 from synapse.rest.admin.rooms import (
3739 DeleteRoomRestServlet,
40 ForwardExtremitiesRestServlet,
3841 JoinRoomAliasServlet,
3942 ListRoomRestServlet,
4043 MakeRoomAdminRestServlet,
4144 RoomMembersRestServlet,
4245 RoomRestServlet,
46 RoomStateRestServlet,
4347 ShutdownRoomRestServlet,
4448 )
4549 from synapse.rest.admin.server_notice_servlet import SendServerNoticeServlet
5054 PushersRestServlet,
5155 ResetPasswordRestServlet,
5256 SearchUsersRestServlet,
57 ShadowBanRestServlet,
5358 UserAdminServlet,
5459 UserMediaRestServlet,
5560 UserMembershipRestServlet,
208213 """
209214 register_servlets_for_client_rest_resource(hs, http_server)
210215 ListRoomRestServlet(hs).register(http_server)
216 RoomStateRestServlet(hs).register(http_server)
211217 RoomRestServlet(hs).register(http_server)
212218 RoomMembersRestServlet(hs).register(http_server)
213219 DeleteRoomRestServlet(hs).register(http_server)
229235 EventReportsRestServlet(hs).register(http_server)
230236 PushersRestServlet(hs).register(http_server)
231237 MakeRoomAdminRestServlet(hs).register(http_server)
238 ShadowBanRestServlet(hs).register(http_server)
239 ForwardExtremitiesRestServlet(hs).register(http_server)
232240
233241
234242 def register_servlets_for_client_rest_resource(hs, http_server):
00 # -*- coding: utf-8 -*-
1 # Copyright 2019 The Matrix.org Foundation C.I.C.
1 # Copyright 2019-2021 The Matrix.org Foundation C.I.C.
22 #
33 # Licensed under the Apache License, Version 2.0 (the "License");
44 # you may not use this file except in compliance with the License.
291291 return 200, ret
292292
293293
294 class RoomStateRestServlet(RestServlet):
295 """
296 Get full state within a room.
297 """
298
299 PATTERNS = admin_patterns("/rooms/(?P<room_id>[^/]+)/state")
300
301 def __init__(self, hs: "HomeServer"):
302 self.hs = hs
303 self.auth = hs.get_auth()
304 self.store = hs.get_datastore()
305 self.clock = hs.get_clock()
306 self._event_serializer = hs.get_event_client_serializer()
307
308 async def on_GET(
309 self, request: SynapseRequest, room_id: str
310 ) -> Tuple[int, JsonDict]:
311 requester = await self.auth.get_user_by_req(request)
312 await assert_user_is_admin(self.auth, requester.user)
313
314 ret = await self.store.get_room(room_id)
315 if not ret:
316 raise NotFoundError("Room not found")
317
318 event_ids = await self.store.get_current_state_ids(room_id)
319 events = await self.store.get_events(event_ids.values())
320 now = self.clock.time_msec()
321 room_state = await self._event_serializer.serialize_events(
322 events.values(),
323 now,
324 # We don't bother bundling aggregations in when asked for state
325 # events, as clients won't use them.
326 bundle_aggregations=False,
327 )
328 ret = {"state": room_state}
329
330 return 200, ret
331
332
294333 class JoinRoomAliasServlet(RestServlet):
295334
296335 PATTERNS = admin_patterns("/join/(?P<room_identifier>[^/]*)")
430469 if not admin_users:
431470 raise SynapseError(400, "No local admin user in room")
432471
433 admin_user_id = admin_users[-1]
472 admin_user_id = None
473
474 for admin_user in reversed(admin_users):
475 if room_state.get((EventTypes.Member, admin_user)):
476 admin_user_id = admin_user
477 break
478
479 if not admin_user_id:
480 raise SynapseError(
481 400, "No local admin user in room",
482 )
434483
435484 pl_content = power_levels.content
436485 else:
498547 )
499548
500549 return 200, {}
550
551
552 class ForwardExtremitiesRestServlet(RestServlet):
553 """Allows a server admin to get or clear forward extremities.
554
555 Clearing does not require restarting the server.
556
557 Clear forward extremities:
558 DELETE /_synapse/admin/v1/rooms/<room_id_or_alias>/forward_extremities
559
560 Get forward_extremities:
561 GET /_synapse/admin/v1/rooms/<room_id_or_alias>/forward_extremities
562 """
563
564 PATTERNS = admin_patterns("/rooms/(?P<room_identifier>[^/]*)/forward_extremities")
565
566 def __init__(self, hs: "HomeServer"):
567 self.hs = hs
568 self.auth = hs.get_auth()
569 self.room_member_handler = hs.get_room_member_handler()
570 self.store = hs.get_datastore()
571
572 async def resolve_room_id(self, room_identifier: str) -> str:
573 """Resolve to a room ID, if necessary."""
574 if RoomID.is_valid(room_identifier):
575 resolved_room_id = room_identifier
576 elif RoomAlias.is_valid(room_identifier):
577 room_alias = RoomAlias.from_string(room_identifier)
578 room_id, _ = await self.room_member_handler.lookup_room_alias(room_alias)
579 resolved_room_id = room_id.to_string()
580 else:
581 raise SynapseError(
582 400, "%s was not a legal room ID or room alias" % (room_identifier,)
583 )
584 if not resolved_room_id:
585 raise SynapseError(
586 400, "Unknown room ID or room alias %s" % room_identifier
587 )
588 return resolved_room_id
589
590 async def on_DELETE(self, request, room_identifier):
591 requester = await self.auth.get_user_by_req(request)
592 await assert_user_is_admin(self.auth, requester.user)
593
594 room_id = await self.resolve_room_id(room_identifier)
595
596 deleted_count = await self.store.delete_forward_extremities_for_room(room_id)
597 return 200, {"deleted": deleted_count}
598
599 async def on_GET(self, request, room_identifier):
600 requester = await self.auth.get_user_by_req(request)
601 await assert_user_is_admin(self.auth, requester.user)
602
603 room_id = await self.resolve_room_id(room_identifier)
604
605 extremities = await self.store.get_forward_extremities_for_room(room_id)
606 return 200, {"count": len(extremities), "results": extremities}
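The new `ForwardExtremitiesRestServlet` serves both GET and DELETE on the same path. A minimal sketch of how an admin client might construct that path (the helper and the example IDs below are hypothetical, not part of Synapse):

```python
from urllib.parse import quote


def forward_extremities_url(base_url: str, room_id_or_alias: str) -> str:
    """Build the admin endpoint for a room's forward extremities.

    GET returns {"count": N, "results": [...]}; DELETE returns {"deleted": N}.
    Room IDs contain '!' and ':', so percent-encode them for the path.
    """
    return "%s/_synapse/admin/v1/rooms/%s/forward_extremities" % (
        base_url,
        quote(room_id_or_alias, safe=""),
    )


url = forward_extremities_url("https://synapse.example.com", "!abcdef:example.com")
```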
8282 The parameter `deactivated` can be used to include deactivated users.
8383 """
8484
85 def __init__(self, hs):
85 def __init__(self, hs: "HomeServer"):
8686 self.hs = hs
8787 self.store = hs.get_datastore()
8888 self.auth = hs.get_auth()
8989 self.admin_handler = hs.get_admin_handler()
9090
91 async def on_GET(self, request):
91 async def on_GET(self, request: SynapseRequest) -> Tuple[int, JsonDict]:
9292 await assert_requester_is_admin(self.auth, request)
9393
9494 start = parse_integer(request, "from", default=0)
9595 limit = parse_integer(request, "limit", default=100)
96
97 if start < 0:
98 raise SynapseError(
99 400,
100 "Query parameter from must be a string representing a positive integer.",
101 errcode=Codes.INVALID_PARAM,
102 )
103
104 if limit < 0:
105 raise SynapseError(
106 400,
107 "Query parameter limit must be a string representing a positive integer.",
108 errcode=Codes.INVALID_PARAM,
109 )
110
96111 user_id = parse_string(request, "user_id", default=None)
97112 name = parse_string(request, "name", default=None)
98113 guests = parse_boolean(request, "guests", default=True)
102117 start, limit, user_id, name, guests, deactivated
103118 )
104119 ret = {"users": users, "total": total}
105 if len(users) >= limit:
120 if (start + limit) < total:
106121 ret["next_token"] = str(start + len(users))
107122
108123 return 200, ret
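The replaced condition fixes a pagination bug: previously `next_token` was emitted whenever a full page came back, even when it was the last page. The corrected rule, as a standalone sketch (a hypothetical helper, not Synapse code):

```python
from typing import List, Optional


def next_token(start: int, limit: int, total: int, page: List[dict]) -> Optional[str]:
    # Only advertise another page when rows remain beyond start + limit;
    # the token itself is the offset of the first row of the next page.
    if (start + limit) < total:
        return str(start + len(page))
    return None
```

With `total=250`, a page starting at 0 with `limit=100` yields `"100"`, while a final page starting at 200 yields no token.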
874889 )
875890
876891 return 200, {"access_token": token}
892
893
894 class ShadowBanRestServlet(RestServlet):
895 """An admin API for shadow-banning a user.
896
897 A shadow-banned user receives successful responses to their client-server
898 API requests, but the events are not propagated into rooms.
899
900 Shadow-banning a user should be used as a tool of last resort and may lead
901 to confusing or broken behaviour for the client.
902
903 Example:
904
905 POST /_synapse/admin/v1/users/@test:example.com/shadow_ban
906 {}
907
908 200 OK
909 {}
910 """
911
912 PATTERNS = admin_patterns("/users/(?P<user_id>[^/]*)/shadow_ban")
913
914 def __init__(self, hs: "HomeServer"):
915 self.hs = hs
916 self.store = hs.get_datastore()
917 self.auth = hs.get_auth()
918
919 async def on_POST(self, request, user_id):
920 await assert_requester_is_admin(self.auth, request)
921
922 if not self.hs.is_mine_id(user_id):
923 raise SynapseError(400, "Only local users can be shadow-banned")
924
925 await self.store.set_shadow_banned(UserID.from_string(user_id), True)
926
927 return 200, {}
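The servlet rejects remote users via `hs.is_mine_id`; conceptually that check compares the domain part of the Matrix user ID against the local server name. A simplified sketch of that idea (not Synapse's actual implementation):

```python
def is_local_user(user_id: str, server_name: str) -> bool:
    # Matrix user IDs look like @localpart:domain; a user is local when
    # the domain part matches this homeserver's configured server_name.
    if not user_id.startswith("@") or ":" not in user_id:
        return False
    return user_id.split(":", 1)[1] == server_name
```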
1818 from synapse.api.errors import Codes, LoginError, SynapseError
1919 from synapse.api.ratelimiting import Ratelimiter
2020 from synapse.appservice import ApplicationService
21 from synapse.http.server import finish_request
21 from synapse.handlers.sso import SsoIdentityProvider
22 from synapse.http.server import HttpServer, finish_request
2223 from synapse.http.servlet import (
2324 RestServlet,
2425 parse_json_object_from_request,
5960 self.saml2_enabled = hs.config.saml2_enabled
6061 self.cas_enabled = hs.config.cas_enabled
6162 self.oidc_enabled = hs.config.oidc_enabled
63 self._msc2858_enabled = hs.config.experimental.msc2858_enabled
6264
6365 self.auth = hs.get_auth()
6466
6567 self.auth_handler = self.hs.get_auth_handler()
6668 self.registration_handler = hs.get_registration_handler()
69 self._sso_handler = hs.get_sso_handler()
70
6771 self._well_known_builder = WellKnownBuilder(hs)
6872 self._address_ratelimiter = Ratelimiter(
6973 clock=hs.get_clock(),
8892 flows.append({"type": LoginRestServlet.CAS_TYPE})
8993
9094 if self.cas_enabled or self.saml2_enabled or self.oidc_enabled:
91 flows.append({"type": LoginRestServlet.SSO_TYPE})
92 # While its valid for us to advertise this login type generally,
95 sso_flow = {"type": LoginRestServlet.SSO_TYPE} # type: JsonDict
96
97 if self._msc2858_enabled:
98 sso_flow["org.matrix.msc2858.identity_providers"] = [
99 _get_auth_flow_dict_for_idp(idp)
100 for idp in self._sso_handler.get_identity_providers().values()
101 ]
102
103 flows.append(sso_flow)
104
105 # While it's valid for us to advertise this login type generally,
93106 # synapse currently only gives out these tokens as part of the
94107 # SSO login flow.
95108 # Generally we don't want to advertise login flows that clients
310323 return result
311324
312325
326 def _get_auth_flow_dict_for_idp(idp: SsoIdentityProvider) -> JsonDict:
327 """Return an entry for the login flow dict
328
329 Returns an entry suitable for inclusion in "identity_providers" in the
330 response to GET /_matrix/client/r0/login
331 """
332 e = {"id": idp.idp_id, "name": idp.idp_name} # type: JsonDict
333 if idp.idp_icon:
334 e["icon"] = idp.idp_icon
335 if idp.idp_brand:
336 e["brand"] = idp.idp_brand
337 return e
338
339
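Each entry `_get_auth_flow_dict_for_idp` contributes to the MSC2858 `identity_providers` list omits optional fields entirely rather than sending them as null. A standalone sketch of that construction (a hypothetical helper mirroring the function above):

```python
from typing import Optional


def auth_flow_entry(
    idp_id: str,
    idp_name: str,
    idp_icon: Optional[str] = None,
    idp_brand: Optional[str] = None,
) -> dict:
    # "icon" and "brand" are only included when the IdP defines them.
    entry = {"id": idp_id, "name": idp_name}
    if idp_icon:
        entry["icon"] = idp_icon
    if idp_brand:
        entry["brand"] = idp_brand
    return entry
```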
313340 class SsoRedirectServlet(RestServlet):
314 PATTERNS = client_patterns("/login/(cas|sso)/redirect", v1=True)
341 PATTERNS = client_patterns("/login/(cas|sso)/redirect$", v1=True)
315342
316343 def __init__(self, hs: "HomeServer"):
317344 # make sure that the relevant handlers are instantiated, so that they
323350 if hs.config.oidc_enabled:
324351 hs.get_oidc_handler()
325352 self._sso_handler = hs.get_sso_handler()
326
327 async def on_GET(self, request: SynapseRequest):
353 self._msc2858_enabled = hs.config.experimental.msc2858_enabled
354
355 def register(self, http_server: HttpServer) -> None:
356 super().register(http_server)
357 if self._msc2858_enabled:
358 # expose additional endpoint for MSC2858 support
359 http_server.register_paths(
360 "GET",
361 client_patterns(
362 "/org.matrix.msc2858/login/sso/redirect/(?P<idp_id>[A-Za-z0-9_.~-]+)$",
363 releases=(),
364 unstable=True,
365 ),
366 self.on_GET,
367 self.__class__.__name__,
368 )
369
370 async def on_GET(
371 self, request: SynapseRequest, idp_id: Optional[str] = None
372 ) -> None:
328373 client_redirect_url = parse_string(
329374 request, "redirectUrl", required=True, encoding=None
330375 )
331376 sso_url = await self._sso_handler.handle_redirect_request(
332 request, client_redirect_url
377 request, client_redirect_url, idp_id,
333378 )
334379 logger.info("Redirecting to %s", sso_url)
335380 request.redirect(sso_url)
5353 class EmailPasswordRequestTokenRestServlet(RestServlet):
5454 PATTERNS = client_patterns("/account/password/email/requestToken$")
5555
56 def __init__(self, hs):
56 def __init__(self, hs: "HomeServer"):
5757 super().__init__()
5858 self.hs = hs
5959 self.datastore = hs.get_datastore()
101101 if next_link:
102102 # Raise if the provided next_link value isn't valid
103103 assert_valid_next_link(self.hs, next_link)
104
105 self.identity_handler.ratelimit_request_token_requests(request, "email", email)
104106
105107 # The email will be sent to the stored address.
106108 # This avoids a potential account hijack by requesting a password reset to
378380 Codes.THREEPID_DENIED,
379381 )
380382
383 self.identity_handler.ratelimit_request_token_requests(request, "email", email)
384
381385 if next_link:
382386 # Raise if the provided next_link value isn't valid
383387 assert_valid_next_link(self.hs, next_link)
429433 class MsisdnThreepidRequestTokenRestServlet(RestServlet):
430434 PATTERNS = client_patterns("/account/3pid/msisdn/requestToken$")
431435
432 def __init__(self, hs):
436 def __init__(self, hs: "HomeServer"):
433437 self.hs = hs
434438 super().__init__()
435439 self.store = self.hs.get_datastore()
456460 "Account phone numbers are not authorized on this server",
457461 Codes.THREEPID_DENIED,
458462 )
463
464 self.identity_handler.ratelimit_request_token_requests(
465 request, "msisdn", msisdn
466 )
459467
460468 if next_link:
461469 # Raise if the provided next_link value isn't valid
125125 Codes.THREEPID_DENIED,
126126 )
127127
128 self.identity_handler.ratelimit_request_token_requests(request, "email", email)
129
128130 existing_user_id = await self.hs.get_datastore().get_user_id_by_threepid(
129131 "email", email
130132 )
203205 "Phone numbers are not authorized to register on this server",
204206 Codes.THREEPID_DENIED,
205207 )
208
209 self.identity_handler.ratelimit_request_token_requests(
210 request, "msisdn", msisdn
211 )
206212
207213 existing_user_id = await self.hs.get_datastore().get_user_id_by_threepid(
208214 "msisdn", msisdn
9999
100100 consent_template_directory = hs.config.user_consent_template_dir
101101
102 # TODO: switch to synapse.util.templates.build_jinja_env
102103 loader = jinja2.FileSystemLoader(consent_template_directory)
103104 self._jinja_env = jinja2.Environment(
104105 loader=loader, autoescape=jinja2.select_autoescape(["html", "htm", "xml"])
299299 thumbnail_height (int)
300300 thumbnail_method (str)
301301 thumbnail_type (str): Content type of thumbnail, e.g. image/png
302 thumbnail_length (int): The size of the thumbnail, in bytes.
302303 """
303304
304305 def __init__(
311312 thumbnail_height=None,
312313 thumbnail_method=None,
313314 thumbnail_type=None,
315 thumbnail_length=None,
314316 ):
315317 self.server_name = server_name
316318 self.file_id = file_id
320322 self.thumbnail_height = thumbnail_height
321323 self.thumbnail_method = thumbnail_method
322324 self.thumbnail_type = thumbnail_type
325 self.thumbnail_length = thumbnail_length
323326
324327
325328 def get_filename_from_headers(headers: Dict[bytes, List[bytes]]) -> Optional[str]:
385385 """
386386 Check whether the URL should be downloaded as oEmbed content instead.
387387
388 Params:
388 Args:
389389 url: The URL to check.
390390
391391 Returns:
402402 """
403403 Request content from an oEmbed endpoint.
404404
405 Params:
405 Args:
406406 endpoint: The oEmbed API endpoint.
407407 url: The URL to pass to the API.
408408
691691 def decode_and_calc_og(
692692 body: bytes, media_uri: str, request_encoding: Optional[str] = None
693693 ) -> Dict[str, Optional[str]]:
694 """
695 Calculate metadata for an HTML document.
696
697 This uses lxml to parse the HTML document into the OG response. If errors
698 occur during processing of the document, an empty response is returned.
699
700 Args:
701 body: The HTML document, as bytes.
702 media_url: The URI used to download the body.
703 request_encoding: The character encoding of the body, as a string.
704
705 Returns:
706 The OG response as a dictionary.
707 """
694708 # If there's no body, nothing useful is going to be found.
695709 if not body:
696710 return {}
697711
698712 from lxml import etree
699713
714 # Create an HTML parser. If this fails, log and return no metadata.
700715 try:
701716 parser = etree.HTMLParser(recover=True, encoding=request_encoding)
702 tree = etree.fromstring(body, parser)
703 og = _calc_og(tree, media_uri)
717 except LookupError:
718 # blindly consider the encoding as utf-8.
719 parser = etree.HTMLParser(recover=True, encoding="utf-8")
720 except Exception as e:
721 logger.warning("Unable to create HTML parser: %s" % (e,))
722 return {}
723
724 def _attempt_calc_og(body_attempt: Union[bytes, str]) -> Dict[str, Optional[str]]:
725 # Attempt to parse the body. If this fails, log and return no metadata.
726 tree = etree.fromstring(body_attempt, parser)
727 return _calc_og(tree, media_uri)
728
729 # Attempt to parse the body. If this fails, log and return no metadata.
730 try:
731 return _attempt_calc_og(body)
704732 except UnicodeDecodeError:
705733 # blindly try decoding the body as utf-8, which seems to fix
706734 # the charset mismatches on https://google.com
707 parser = etree.HTMLParser(recover=True, encoding=request_encoding)
708 tree = etree.fromstring(body.decode("utf-8", "ignore"), parser)
709 og = _calc_og(tree, media_uri)
710
711 return og
712
713
714 def _calc_og(tree, media_uri: str) -> Dict[str, Optional[str]]:
735 return _attempt_calc_og(body.decode("utf-8", "ignore"))
736
737
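The two-stage recovery above (unknown charset when building the parser, then a lenient re-decode on `UnicodeDecodeError`) can be sketched without lxml using only the stdlib. `decode_with_fallback` is an illustrative helper, not part of Synapse:

```python
import codecs
from typing import Optional


def decode_with_fallback(body: bytes, declared: Optional[str]) -> str:
    """Decode an HTML body, mirroring the parser fallbacks above.

    An unknown declared charset raises LookupError (fall back to utf-8);
    a mis-declared charset raises UnicodeDecodeError (re-decode leniently).
    """
    encoding = declared or "utf-8"
    try:
        codecs.lookup(encoding)  # validate the charset name
    except LookupError:
        # Same "blindly consider the encoding as utf-8" branch as above.
        encoding = "utf-8"
    try:
        return body.decode(encoding)
    except UnicodeDecodeError:
        # Same "blindly try decoding as utf-8" recovery as
        # decode_and_calc_og's UnicodeDecodeError handler.
        return body.decode("utf-8", "ignore")
```

As in the real code, the second fallback silently drops undecodable bytes rather than failing the whole preview.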
738 def _calc_og(tree: "etree.Element", media_uri: str) -> Dict[str, Optional[str]]:
715739 # suck our tree into lxml and define our OG response.
716740
717741 # if we see any image URLs in the OG response, then spider them
1515
1616
1717 import logging
18 from typing import TYPE_CHECKING
18 from typing import TYPE_CHECKING, Any, Dict, List, Optional
1919
2020 from twisted.web.http import Request
2121
105105 return
106106
107107 thumbnail_infos = await self.store.get_local_media_thumbnails(media_id)
108
109 if thumbnail_infos:
110 thumbnail_info = self._select_thumbnail(
111 width, height, method, m_type, thumbnail_infos
112 )
113
114 file_info = FileInfo(
115 server_name=None,
116 file_id=media_id,
117 url_cache=media_info["url_cache"],
118 thumbnail=True,
119 thumbnail_width=thumbnail_info["thumbnail_width"],
120 thumbnail_height=thumbnail_info["thumbnail_height"],
121 thumbnail_type=thumbnail_info["thumbnail_type"],
122 thumbnail_method=thumbnail_info["thumbnail_method"],
123 )
124
125 t_type = file_info.thumbnail_type
126 t_length = thumbnail_info["thumbnail_length"]
127
128 responder = await self.media_storage.fetch_media(file_info)
129 await respond_with_responder(request, responder, t_type, t_length)
130 else:
131 logger.info("Couldn't find any generated thumbnails")
132 respond_404(request)
108 await self._select_and_respond_with_thumbnail(
109 request,
110 width,
111 height,
112 method,
113 m_type,
114 thumbnail_infos,
115 media_id,
116 url_cache=media_info["url_cache"],
117 server_name=None,
118 )
133119
134120 async def _select_or_generate_local_thumbnail(
135121 self,
275261 thumbnail_infos = await self.store.get_remote_media_thumbnails(
276262 server_name, media_id
277263 )
278
264 await self._select_and_respond_with_thumbnail(
265 request,
266 width,
267 height,
268 method,
269 m_type,
270 thumbnail_infos,
271 media_info["filesystem_id"],
272 url_cache=None,
273 server_name=server_name,
274 )
275
276 async def _select_and_respond_with_thumbnail(
277 self,
278 request: Request,
279 desired_width: int,
280 desired_height: int,
281 desired_method: str,
282 desired_type: str,
283 thumbnail_infos: List[Dict[str, Any]],
284 file_id: str,
285 url_cache: Optional[str] = None,
286 server_name: Optional[str] = None,
287 ) -> None:
288 """
289 Respond to a request with an appropriate thumbnail from the previously generated thumbnails.
290
291 Args:
292 request: The incoming request.
293 desired_width: The desired width; the returned thumbnail may be larger than this.
294 desired_height: The desired height; the returned thumbnail may be larger than this.
295 desired_method: The desired method used to generate the thumbnail.
296 desired_type: The desired content-type of the thumbnail.
297 thumbnail_infos: A list of dictionaries of candidate thumbnails.
298 file_id: The ID of the media that a thumbnail is being requested for.
299 url_cache: The URL cache value.
300 server_name: The server name, if this is a remote thumbnail.
301 """
279302 if thumbnail_infos:
280 thumbnail_info = self._select_thumbnail(
281 width, height, method, m_type, thumbnail_infos
303 file_info = self._select_thumbnail(
304 desired_width,
305 desired_height,
306 desired_method,
307 desired_type,
308 thumbnail_infos,
309 file_id,
310 url_cache,
311 server_name,
282312 )
283 file_info = FileInfo(
313 if not file_info:
314 logger.info("Couldn't find a thumbnail matching the desired inputs")
315 respond_404(request)
316 return
317
318 responder = await self.media_storage.fetch_media(file_info)
319 await respond_with_responder(
320 request, responder, file_info.thumbnail_type, file_info.thumbnail_length
321 )
322 else:
323 logger.info("Failed to find any generated thumbnails")
324 respond_404(request)
325
326 def _select_thumbnail(
327 self,
328 desired_width: int,
329 desired_height: int,
330 desired_method: str,
331 desired_type: str,
332 thumbnail_infos: List[Dict[str, Any]],
333 file_id: str,
334 url_cache: Optional[str],
335 server_name: Optional[str],
336 ) -> Optional[FileInfo]:
337 """
338 Choose an appropriate thumbnail from the previously generated thumbnails.
339
340 Args:
341 desired_width: The desired width; the returned thumbnail may be larger than this.
342 desired_height: The desired height; the returned thumbnail may be larger than this.
343 desired_method: The desired method used to generate the thumbnail.
344 desired_type: The desired content-type of the thumbnail.
345 thumbnail_infos: A list of dictionaries of candidate thumbnails.
346 file_id: The ID of the media that a thumbnail is being requested for.
347 url_cache: The URL cache value.
348 server_name: The server name, if this is a remote thumbnail.
349
350 Returns:
351 The thumbnail which best matches the desired parameters.
352 """
353 desired_method = desired_method.lower()
354
355 # The chosen thumbnail.
356 thumbnail_info = None
357
358 d_w = desired_width
359 d_h = desired_height
360
361 if desired_method == "crop":
362 # Thumbnails that match equal or larger sizes of desired width/height.
363 crop_info_list = []
364 # Other thumbnails.
365 crop_info_list2 = []
366 for info in thumbnail_infos:
367 # Skip thumbnails generated with different methods.
368 if info["thumbnail_method"] != "crop":
369 continue
370
371 t_w = info["thumbnail_width"]
372 t_h = info["thumbnail_height"]
373 aspect_quality = abs(d_w * t_h - d_h * t_w)
374 min_quality = 0 if d_w <= t_w and d_h <= t_h else 1
375 size_quality = abs((d_w - t_w) * (d_h - t_h))
376 type_quality = desired_type != info["thumbnail_type"]
377 length_quality = info["thumbnail_length"]
378 if t_w >= d_w or t_h >= d_h:
379 crop_info_list.append(
380 (
381 aspect_quality,
382 min_quality,
383 size_quality,
384 type_quality,
385 length_quality,
386 info,
387 )
388 )
389 else:
390 crop_info_list2.append(
391 (
392 aspect_quality,
393 min_quality,
394 size_quality,
395 type_quality,
396 length_quality,
397 info,
398 )
399 )
400 if crop_info_list:
401 thumbnail_info = min(crop_info_list)[-1]
402 elif crop_info_list2:
403 thumbnail_info = min(crop_info_list2)[-1]
404 elif desired_method == "scale":
405 # Thumbnails that match equal or larger sizes of desired width/height.
406 info_list = []
407 # Other thumbnails.
408 info_list2 = []
409
410 for info in thumbnail_infos:
411 # Skip thumbnails generated with different methods.
412 if info["thumbnail_method"] != "scale":
413 continue
414
415 t_w = info["thumbnail_width"]
416 t_h = info["thumbnail_height"]
417 size_quality = abs((d_w - t_w) * (d_h - t_h))
418 type_quality = desired_type != info["thumbnail_type"]
419 length_quality = info["thumbnail_length"]
420 if t_w >= d_w or t_h >= d_h:
421 info_list.append((size_quality, type_quality, length_quality, info))
422 else:
423 info_list2.append(
424 (size_quality, type_quality, length_quality, info)
425 )
426 if info_list:
427 thumbnail_info = min(info_list)[-1]
428 elif info_list2:
429 thumbnail_info = min(info_list2)[-1]
430
431 if thumbnail_info:
432 return FileInfo(
433 file_id=file_id,
434 url_cache=url_cache,
284435 server_name=server_name,
285 file_id=media_info["filesystem_id"],
286436 thumbnail=True,
287437 thumbnail_width=thumbnail_info["thumbnail_width"],
288438 thumbnail_height=thumbnail_info["thumbnail_height"],
289439 thumbnail_type=thumbnail_info["thumbnail_type"],
290440 thumbnail_method=thumbnail_info["thumbnail_method"],
441 thumbnail_length=thumbnail_info["thumbnail_length"],
291442 )
292443
293 t_type = file_info.thumbnail_type
294 t_length = thumbnail_info["thumbnail_length"]
295
296 responder = await self.media_storage.fetch_media(file_info)
297 await respond_with_responder(request, responder, t_type, t_length)
298 else:
299 logger.info("Failed to find any generated thumbnails")
300 respond_404(request)
301
302 def _select_thumbnail(
303 self,
304 desired_width: int,
305 desired_height: int,
306 desired_method: str,
307 desired_type: str,
308 thumbnail_infos,
309 ) -> dict:
310 d_w = desired_width
311 d_h = desired_height
312
313 if desired_method.lower() == "crop":
314 crop_info_list = []
315 crop_info_list2 = []
316 for info in thumbnail_infos:
317 t_w = info["thumbnail_width"]
318 t_h = info["thumbnail_height"]
319 t_method = info["thumbnail_method"]
320 if t_method == "crop":
321 aspect_quality = abs(d_w * t_h - d_h * t_w)
322 min_quality = 0 if d_w <= t_w and d_h <= t_h else 1
323 size_quality = abs((d_w - t_w) * (d_h - t_h))
324 type_quality = desired_type != info["thumbnail_type"]
325 length_quality = info["thumbnail_length"]
326 if t_w >= d_w or t_h >= d_h:
327 crop_info_list.append(
328 (
329 aspect_quality,
330 min_quality,
331 size_quality,
332 type_quality,
333 length_quality,
334 info,
335 )
336 )
337 else:
338 crop_info_list2.append(
339 (
340 aspect_quality,
341 min_quality,
342 size_quality,
343 type_quality,
344 length_quality,
345 info,
346 )
347 )
348 if crop_info_list:
349 return min(crop_info_list)[-1]
350 else:
351 return min(crop_info_list2)[-1]
352 else:
353 info_list = []
354 info_list2 = []
355 for info in thumbnail_infos:
356 t_w = info["thumbnail_width"]
357 t_h = info["thumbnail_height"]
358 t_method = info["thumbnail_method"]
359 size_quality = abs((d_w - t_w) * (d_h - t_h))
360 type_quality = desired_type != info["thumbnail_type"]
361 length_quality = info["thumbnail_length"]
362 if t_method == "scale" and (t_w >= d_w or t_h >= d_h):
363 info_list.append((size_quality, type_quality, length_quality, info))
364 elif t_method == "scale":
365 info_list2.append(
366 (size_quality, type_quality, length_quality, info)
367 )
368 if info_list:
369 return min(info_list)[-1]
370 else:
371 return min(info_list2)[-1]
444 # No matching thumbnail was found.
445 return None
+0
-27
synapse/rest/oidc/__init__.py
0 # -*- coding: utf-8 -*-
1 # Copyright 2020 Quentin Gliech
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14 import logging
15
16 from twisted.web.resource import Resource
17
18 from synapse.rest.oidc.callback_resource import OIDCCallbackResource
19
20 logger = logging.getLogger(__name__)
21
22
23 class OIDCResource(Resource):
24 def __init__(self, hs):
25 Resource.__init__(self)
26 self.putChild(b"callback", OIDCCallbackResource(hs))
+0
-30
synapse/rest/oidc/callback_resource.py
0 # -*- coding: utf-8 -*-
1 # Copyright 2020 Quentin Gliech
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14 import logging
15
16 from synapse.http.server import DirectServeHtmlResource
17
18 logger = logging.getLogger(__name__)
19
20
21 class OIDCCallbackResource(DirectServeHtmlResource):
22 isLeaf = 1
23
24 def __init__(self, hs):
25 super().__init__()
26 self._oidc_handler = hs.get_oidc_handler()
27
28 async def _async_render_GET(self, request):
29 await self._oidc_handler.handle_oidc_callback(request)
+0
-29
synapse/rest/saml2/__init__.py
0 # -*- coding: utf-8 -*-
1 # Copyright 2018 New Vector Ltd
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14 import logging
15
16 from twisted.web.resource import Resource
17
18 from synapse.rest.saml2.metadata_resource import SAML2MetadataResource
19 from synapse.rest.saml2.response_resource import SAML2ResponseResource
20
21 logger = logging.getLogger(__name__)
22
23
24 class SAML2Resource(Resource):
25 def __init__(self, hs):
26 Resource.__init__(self)
27 self.putChild(b"metadata.xml", SAML2MetadataResource(hs))
28 self.putChild(b"authn_response", SAML2ResponseResource(hs))
+0
-36
synapse/rest/saml2/metadata_resource.py
0 # -*- coding: utf-8 -*-
1 # Copyright 2018 New Vector Ltd
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15
16 import saml2.metadata
17
18 from twisted.web.resource import Resource
19
20
21 class SAML2MetadataResource(Resource):
22 """A Twisted web resource which renders the SAML metadata"""
23
24 isLeaf = 1
25
26 def __init__(self, hs):
27 Resource.__init__(self)
28 self.sp_config = hs.config.saml2_sp_config
29
30 def render_GET(self, request):
31 metadata_xml = saml2.metadata.create_metadata_string(
32 configfile=None, config=self.sp_config
33 )
34 request.setHeader(b"Content-Type", b"text/xml; charset=utf-8")
35 return metadata_xml
+0
-39
synapse/rest/saml2/response_resource.py
0 # -*- coding: utf-8 -*-
1 #
2 # Copyright 2018 New Vector Ltd
3 #
4 # Licensed under the Apache License, Version 2.0 (the "License");
5 # you may not use this file except in compliance with the License.
6 # You may obtain a copy of the License at
7 #
8 # http://www.apache.org/licenses/LICENSE-2.0
9 #
10 # Unless required by applicable law or agreed to in writing, software
11 # distributed under the License is distributed on an "AS IS" BASIS,
12 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
13 # See the License for the specific language governing permissions and
14 # limitations under the License.
15
16 from synapse.http.server import DirectServeHtmlResource
17
18
19 class SAML2ResponseResource(DirectServeHtmlResource):
20 """A Twisted web resource which handles the SAML response"""
21
22 isLeaf = 1
23
24 def __init__(self, hs):
25 super().__init__()
26 self._saml_handler = hs.get_saml_handler()
27
28 async def _async_render_GET(self, request):
29 # We're not expecting any GET request on that resource if everything goes right,
30 # but some IdPs sometimes end up responding with a 302 redirect on this endpoint.
31 # In this case, just tell the user that something went wrong and they should
32 # try to authenticate again.
33 self._saml_handler._render_error(
34 request, "unexpected_get", "Unexpected GET request on /saml2/authn_response"
35 )
36
37 async def _async_render_POST(self, request):
38 await self._saml_handler.handle_saml_response(request)
00 # -*- coding: utf-8 -*-
1 # Copyright 2020 The Matrix.org Foundation C.I.C.
1 # Copyright 2021 The Matrix.org Foundation C.I.C.
22 #
33 # Licensed under the Apache License, Version 2.0 (the "License");
44 # you may not use this file except in compliance with the License.
1111 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
1212 # See the License for the specific language governing permissions and
1313 # limitations under the License.
14
15 from typing import TYPE_CHECKING, Mapping
16
17 from twisted.web.resource import Resource
18
19 from synapse.rest.synapse.client.new_user_consent import NewUserConsentResource
20 from synapse.rest.synapse.client.pick_idp import PickIdpResource
21 from synapse.rest.synapse.client.pick_username import pick_username_resource
22 from synapse.rest.synapse.client.sso_register import SsoRegisterResource
23
24 if TYPE_CHECKING:
25 from synapse.server import HomeServer
26
27
28 def build_synapse_client_resource_tree(hs: "HomeServer") -> Mapping[str, Resource]:
29 """Builds a resource tree to include synapse-specific client resources
30
31 These are resources which should be loaded on all workers which expose a C-S API:
32 ie, the main process, and any generic workers so configured.
33
34 Returns:
35 map from path to Resource.
36 """
37 resources = {
38 # SSO bits. These are always loaded, whether or not SSO login is actually
39 # enabled (they just won't work very well if it's not)
40 "/_synapse/client/pick_idp": PickIdpResource(hs),
41 "/_synapse/client/pick_username": pick_username_resource(hs),
42 "/_synapse/client/new_user_consent": NewUserConsentResource(hs),
43 "/_synapse/client/sso_register": SsoRegisterResource(hs),
44 }
45
46 # provider-specific SSO bits. Only load these if they are enabled, since they
47 # rely on optional dependencies.
48 if hs.config.oidc_enabled:
49 from synapse.rest.synapse.client.oidc import OIDCResource
50
51 resources["/_synapse/client/oidc"] = OIDCResource(hs)
52
53 if hs.config.saml2_enabled:
54 from synapse.rest.synapse.client.saml2 import SAML2Resource
55
56 res = SAML2Resource(hs)
57 resources["/_synapse/client/saml2"] = res
58
59 # This is also mounted under '/_matrix' for backwards-compatibility.
60 resources["/_matrix/saml2"] = res
61
62 return resources
63
64
65 __all__ = ["build_synapse_client_resource_tree"]
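The conditional mounting above can be illustrated with a dependency-free sketch: always-on paths go straight into the map, while provider-specific ones are added only behind their config flags. The handlers here are placeholder callables standing in for Twisted `Resource` objects, and `build_resource_tree` is illustrative, not Synapse's API:

```python
from typing import Callable, Dict

Handler = Callable[[], str]


def build_resource_tree(oidc_enabled: bool, saml2_enabled: bool) -> Dict[str, Handler]:
    # Always-mounted SSO endpoints: present whether or not SSO is configured.
    resources: Dict[str, Handler] = {
        "/_synapse/client/pick_idp": lambda: "pick_idp",
        "/_synapse/client/pick_username": lambda: "pick_username",
    }
    # Provider-specific endpoints rely on optional dependencies, so they
    # are only mounted when the corresponding feature is enabled.
    if oidc_enabled:
        resources["/_synapse/client/oidc"] = lambda: "oidc"
    if saml2_enabled:
        saml: Handler = lambda: "saml2"
        resources["/_synapse/client/saml2"] = saml
        # The same object is also mounted under /_matrix for
        # backwards-compatibility, exactly as in the code above.
        resources["/_matrix/saml2"] = saml
    return resources
```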
0 # -*- coding: utf-8 -*-
1 # Copyright 2021 The Matrix.org Foundation C.I.C.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14 import logging
15 from typing import TYPE_CHECKING
16
17 from twisted.web.http import Request
18
19 from synapse.api.errors import SynapseError
20 from synapse.handlers.sso import get_username_mapping_session_cookie_from_request
21 from synapse.http.server import DirectServeHtmlResource, respond_with_html
22 from synapse.http.servlet import parse_string
23 from synapse.types import UserID
24 from synapse.util.templates import build_jinja_env
25
26 if TYPE_CHECKING:
27 from synapse.server import HomeServer
28
29 logger = logging.getLogger(__name__)
30
31
32 class NewUserConsentResource(DirectServeHtmlResource):
33 """A resource which collects consent to the server's terms from a new user
34
35 This resource gets mounted at /_synapse/client/new_user_consent, and is shown
36 when we are automatically creating a new user due to an SSO login.
37
38 It shows a template which prompts the user to go and read the Ts and Cs, and click
39 a clickybox if they have done so.
40 """
41
42 def __init__(self, hs: "HomeServer"):
43 super().__init__()
44 self._sso_handler = hs.get_sso_handler()
45 self._server_name = hs.hostname
46 self._consent_version = hs.config.consent.user_consent_version
47
48 def template_search_dirs():
49 if hs.config.sso.sso_template_dir:
50 yield hs.config.sso.sso_template_dir
51 yield hs.config.sso.default_template_dir
52
53 self._jinja_env = build_jinja_env(template_search_dirs(), hs.config)
54
55 async def _async_render_GET(self, request: Request) -> None:
56 try:
57 session_id = get_username_mapping_session_cookie_from_request(request)
58 session = self._sso_handler.get_mapping_session(session_id)
59 except SynapseError as e:
60 logger.warning("Error fetching session: %s", e)
61 self._sso_handler.render_error(request, "bad_session", e.msg, code=e.code)
62 return
63
64 user_id = UserID(session.chosen_localpart, self._server_name)
65 user_profile = {
66 "display_name": session.display_name,
67 }
68
69 template_params = {
70 "user_id": user_id.to_string(),
71 "user_profile": user_profile,
72 "consent_version": self._consent_version,
73 "terms_url": "/_matrix/consent?v=%s" % (self._consent_version,),
74 }
75
76 template = self._jinja_env.get_template("sso_new_user_consent.html")
77 html = template.render(template_params)
78 respond_with_html(request, 200, html)
79
80 async def _async_render_POST(self, request: Request):
81 try:
82 session_id = get_username_mapping_session_cookie_from_request(request)
83 except SynapseError as e:
84 logger.warning("Error fetching session cookie: %s", e)
85 self._sso_handler.render_error(request, "bad_session", e.msg, code=e.code)
86 return
87
88 try:
89 accepted_version = parse_string(request, "accepted_version", required=True)
90 except SynapseError as e:
91 self._sso_handler.render_error(request, "bad_param", e.msg, code=e.code)
92 return
93
94 await self._sso_handler.handle_terms_accepted(
95 request, session_id, accepted_version
96 )
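`template_search_dirs()` above yields the custom `sso_template_dir` (when configured) before the default directory, so custom templates shadow the defaults. A stdlib sketch of that first-match-wins lookup (`find_template` is illustrative; the real code hands the directories to `build_jinja_env`):

```python
import os
from typing import Iterable, Optional


def find_template(name: str, search_dirs: Iterable[str]) -> Optional[str]:
    # Directories are tried in order, so an earlier (custom) directory
    # shadows a later (default) one holding a template of the same name.
    for directory in search_dirs:
        candidate = os.path.join(directory, name)
        if os.path.isfile(candidate):
            return candidate
    return None
```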
0 # -*- coding: utf-8 -*-
1 # Copyright 2020 Quentin Gliech
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 import logging
16
17 from twisted.web.resource import Resource
18
19 from synapse.rest.synapse.client.oidc.callback_resource import OIDCCallbackResource
20
21 logger = logging.getLogger(__name__)
22
23
24 class OIDCResource(Resource):
25 def __init__(self, hs):
26 Resource.__init__(self)
27 self.putChild(b"callback", OIDCCallbackResource(hs))
28
29
30 __all__ = ["OIDCResource"]
0 # -*- coding: utf-8 -*-
1 # Copyright 2020 Quentin Gliech
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14 import logging
15
16 from synapse.http.server import DirectServeHtmlResource
17
18 logger = logging.getLogger(__name__)
19
20
21 class OIDCCallbackResource(DirectServeHtmlResource):
22 isLeaf = 1
23
24 def __init__(self, hs):
25 super().__init__()
26 self._oidc_handler = hs.get_oidc_handler()
27
28 async def _async_render_GET(self, request):
29 await self._oidc_handler.handle_oidc_callback(request)
1111 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
1212 # See the License for the specific language governing permissions and
1313 # limitations under the License.
14 from typing import TYPE_CHECKING
1514
16 import pkg_resources
15 import logging
16 from typing import TYPE_CHECKING, List
1717
1818 from twisted.web.http import Request
1919 from twisted.web.resource import Resource
20 from twisted.web.static import File
2120
2221 from synapse.api.errors import SynapseError
23 from synapse.handlers.sso import USERNAME_MAPPING_SESSION_COOKIE_NAME
24 from synapse.http.server import DirectServeHtmlResource, DirectServeJsonResource
25 from synapse.http.servlet import parse_string
22 from synapse.handlers.sso import get_username_mapping_session_cookie_from_request
23 from synapse.http.server import (
24 DirectServeHtmlResource,
25 DirectServeJsonResource,
26 respond_with_html,
27 )
28 from synapse.http.servlet import parse_boolean, parse_string
2629 from synapse.http.site import SynapseRequest
30 from synapse.util.templates import build_jinja_env
2731
2832 if TYPE_CHECKING:
2933 from synapse.server import HomeServer
34
35 logger = logging.getLogger(__name__)
3036
3137
3238 def pick_username_resource(hs: "HomeServer") -> Resource:
3339 """Factory method to generate the username picker resource.
3440
35 This resource gets mounted under /_synapse/client/pick_username. The top-level
36 resource is just a File resource which serves up the static files in the resources
37 "res" directory, but it has a couple of children:
41 This resource gets mounted under /_synapse/client/pick_username and has two
42 children:
3843
39 * "submit", which does the mechanics of registering the new user, and redirects the
40 browser back to the client URL
41
42 * "check": checks if a userid is free.
44 * "account_details": renders the form and handles the POSTed response
45 * "check": a JSON endpoint which checks if a userid is free.
4346 """
4447
45 # XXX should we make this path customisable so that admins can restyle it?
46 base_path = pkg_resources.resource_filename("synapse", "res/username_picker")
47
48 res = File(base_path)
49 res.putChild(b"submit", SubmitResource(hs))
48 res = Resource()
49 res.putChild(b"account_details", AccountDetailsResource(hs))
5050 res.putChild(b"check", AvailabilityCheckResource(hs))
5151
5252 return res
6060 async def _async_render_GET(self, request: Request):
6161 localpart = parse_string(request, "username", required=True)
6262
63 session_id = request.getCookie(USERNAME_MAPPING_SESSION_COOKIE_NAME)
64 if not session_id:
65 raise SynapseError(code=400, msg="missing session_id")
63 session_id = get_username_mapping_session_cookie_from_request(request)
6664
6765 is_available = await self._sso_handler.check_username_availability(
68 localpart, session_id.decode("ascii", errors="replace")
66 localpart, session_id
6967 )
7068 return 200, {"available": is_available}
7169
7270
73 class SubmitResource(DirectServeHtmlResource):
71 class AccountDetailsResource(DirectServeHtmlResource):
7472 def __init__(self, hs: "HomeServer"):
7573 super().__init__()
7674 self._sso_handler = hs.get_sso_handler()
7775
76 def template_search_dirs():
77 if hs.config.sso.sso_template_dir:
78 yield hs.config.sso.sso_template_dir
79 yield hs.config.sso.default_template_dir
80
81 self._jinja_env = build_jinja_env(template_search_dirs(), hs.config)
82
83 async def _async_render_GET(self, request: Request) -> None:
84 try:
85 session_id = get_username_mapping_session_cookie_from_request(request)
86 session = self._sso_handler.get_mapping_session(session_id)
87 except SynapseError as e:
88 logger.warning("Error fetching session: %s", e)
89 self._sso_handler.render_error(request, "bad_session", e.msg, code=e.code)
90 return
91
92 idp_id = session.auth_provider_id
93 template_params = {
94 "idp": self._sso_handler.get_identity_providers()[idp_id],
95 "user_attributes": {
96 "display_name": session.display_name,
97 "emails": session.emails,
98 },
99 }
100
101 template = self._jinja_env.get_template("sso_auth_account_details.html")
102 html = template.render(template_params)
103 respond_with_html(request, 200, html)
104
78105 async def _async_render_POST(self, request: SynapseRequest):
79 localpart = parse_string(request, "username", required=True)
106 try:
107 session_id = get_username_mapping_session_cookie_from_request(request)
108 except SynapseError as e:
109 logger.warning("Error fetching session cookie: %s", e)
110 self._sso_handler.render_error(request, "bad_session", e.msg, code=e.code)
111 return
80112
81 session_id = request.getCookie(USERNAME_MAPPING_SESSION_COOKIE_NAME)
82 if not session_id:
83 raise SynapseError(code=400, msg="missing session_id")
113 try:
114 localpart = parse_string(request, "username", required=True)
115 use_display_name = parse_boolean(request, "use_display_name", default=False)
116
117 try:
118 emails_to_use = [
119 val.decode("utf-8") for val in request.args.get(b"use_email", [])
120 ] # type: List[str]
121 except ValueError:
122 raise SynapseError(400, "Query parameter use_email must be utf-8")
123 except SynapseError as e:
124 logger.warning("[session %s] bad param: %s", session_id, e)
125 self._sso_handler.render_error(request, "bad_param", e.msg, code=e.code)
126 return
84127
85128 await self._sso_handler.handle_submit_username_request(
86 request, localpart, session_id.decode("ascii", errors="replace")
129 request, session_id, localpart, use_display_name, emails_to_use
87130 )
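The `use_email` handling above decodes each repeated query value as UTF-8 inside a `try`/`except ValueError`; that works because `UnicodeDecodeError` is a subclass of `ValueError`. A stdlib sketch of the same pattern using `urllib.parse.parse_qs` in place of `request.args` (`parse_emails` is an illustrative name):

```python
from typing import List
from urllib.parse import parse_qs


def parse_emails(raw_query: bytes) -> List[str]:
    # parse_qs keeps repeated parameters as a list, like request.args does.
    args = parse_qs(raw_query)
    try:
        # UnicodeDecodeError subclasses ValueError, so the broad except
        # below catches bad byte sequences, as in the handler above.
        return [val.decode("utf-8") for val in args.get(b"use_email", [])]
    except ValueError:
        raise ValueError("Query parameter use_email must be utf-8")
```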
0 # -*- coding: utf-8 -*-
1 # Copyright 2018 New Vector Ltd
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 import logging
16
17 from twisted.web.resource import Resource
18
19 from synapse.rest.synapse.client.saml2.metadata_resource import SAML2MetadataResource
20 from synapse.rest.synapse.client.saml2.response_resource import SAML2ResponseResource
21
22 logger = logging.getLogger(__name__)
23
24
25 class SAML2Resource(Resource):
26 def __init__(self, hs):
27 Resource.__init__(self)
28 self.putChild(b"metadata.xml", SAML2MetadataResource(hs))
29 self.putChild(b"authn_response", SAML2ResponseResource(hs))
30
31
32 __all__ = ["SAML2Resource"]
0 # -*- coding: utf-8 -*-
1 # Copyright 2018 New Vector Ltd
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15
16 import saml2.metadata
17
18 from twisted.web.resource import Resource
19
20
21 class SAML2MetadataResource(Resource):
22 """A Twisted web resource which renders the SAML metadata"""
23
24 isLeaf = 1
25
26 def __init__(self, hs):
27 Resource.__init__(self)
28 self.sp_config = hs.config.saml2_sp_config
29
30 def render_GET(self, request):
31 metadata_xml = saml2.metadata.create_metadata_string(
32 configfile=None, config=self.sp_config
33 )
34 request.setHeader(b"Content-Type", b"text/xml; charset=utf-8")
35 return metadata_xml
0 # -*- coding: utf-8 -*-
1 #
2 # Copyright 2018 New Vector Ltd
3 #
4 # Licensed under the Apache License, Version 2.0 (the "License");
5 # you may not use this file except in compliance with the License.
6 # You may obtain a copy of the License at
7 #
8 # http://www.apache.org/licenses/LICENSE-2.0
9 #
10 # Unless required by applicable law or agreed to in writing, software
11 # distributed under the License is distributed on an "AS IS" BASIS,
12 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
13 # See the License for the specific language governing permissions and
14 # limitations under the License.
15
16 from synapse.http.server import DirectServeHtmlResource
17
18
19 class SAML2ResponseResource(DirectServeHtmlResource):
20 """A Twisted web resource which handles the SAML response"""
21
22 isLeaf = 1
23
24 def __init__(self, hs):
25 super().__init__()
26 self._saml_handler = hs.get_saml_handler()
27
28 async def _async_render_GET(self, request):
 29 # We're not expecting any GET request on this resource if everything goes right,
30 # but some IdPs sometimes end up responding with a 302 redirect on this endpoint.
31 # In this case, just tell the user that something went wrong and they should
32 # try to authenticate again.
33 self._saml_handler._render_error(
34 request, "unexpected_get", "Unexpected GET request on /saml2/authn_response"
35 )
36
37 async def _async_render_POST(self, request):
38 await self._saml_handler.handle_saml_response(request)
0 # -*- coding: utf-8 -*-
1 # Copyright 2021 The Matrix.org Foundation C.I.C.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 import logging
16 from typing import TYPE_CHECKING
17
18 from twisted.web.http import Request
19
20 from synapse.api.errors import SynapseError
21 from synapse.handlers.sso import get_username_mapping_session_cookie_from_request
22 from synapse.http.server import DirectServeHtmlResource
23
24 if TYPE_CHECKING:
25 from synapse.server import HomeServer
26
27 logger = logging.getLogger(__name__)
28
29
30 class SsoRegisterResource(DirectServeHtmlResource):
31 """A resource which completes SSO registration
32
33 This resource gets mounted at /_synapse/client/sso_register, and is shown
34 after we collect username and/or consent for a new SSO user. It (finally) registers
 35 the user, and confirms the redirect to the client.
36 """
37
38 def __init__(self, hs: "HomeServer"):
39 super().__init__()
40 self._sso_handler = hs.get_sso_handler()
41
42 async def _async_render_GET(self, request: Request) -> None:
43 try:
44 session_id = get_username_mapping_session_cookie_from_request(request)
45 except SynapseError as e:
46 logger.warning("Error fetching session cookie: %s", e)
47 self._sso_handler.render_error(request, "bad_session", e.msg, code=e.code)
48 return
49 await self._sso_handler.register_sso_user(request, session_id)
3333 self._config = hs.config
3434
3535 def get_well_known(self):
36 # if we don't have a public_baseurl, we can't help much here.
37 if self._config.public_baseurl is None:
38 return None
39
3640 result = {"m.homeserver": {"base_url": self._config.public_baseurl}}
3741
3842 if self._config.default_identity_server:
102102 from synapse.push.action_generator import ActionGenerator
103103 from synapse.push.pusherpool import PusherPool
104104 from synapse.replication.tcp.client import ReplicationDataHandler
105 from synapse.replication.tcp.external_cache import ExternalCache
105106 from synapse.replication.tcp.handler import ReplicationCommandHandler
106107 from synapse.replication.tcp.resource import ReplicationStreamer
107108 from synapse.replication.tcp.streams import STREAMS_MAP, Stream
127128 logger = logging.getLogger(__name__)
128129
129130 if TYPE_CHECKING:
131 from txredisapi import RedisProtocol
132
130133 from synapse.handlers.oidc_handler import OidcHandler
131134 from synapse.handlers.saml_handler import SamlHandler
132135
715718 def get_account_data_handler(self) -> AccountDataHandler:
716719 return AccountDataHandler(self)
717720
721 @cache_in_self
722 def get_external_cache(self) -> ExternalCache:
723 return ExternalCache(self)
724
725 @cache_in_self
726 def get_outbound_redis_connection(self) -> Optional["RedisProtocol"]:
727 if not self.config.redis.redis_enabled:
728 return None
729
730 # We only want to import redis module if we're using it, as we have
731 # `txredisapi` as an optional dependency.
732 from synapse.replication.tcp.redis import lazyConnection
733
734 logger.info(
735 "Connecting to redis (host=%r port=%r) for external cache",
736 self.config.redis_host,
737 self.config.redis_port,
738 )
739
740 return lazyConnection(
741 hs=self,
742 host=self.config.redis_host,
743 port=self.config.redis_port,
744 password=self.config.redis.redis_password,
745 reconnect=True,
746 )
747
718748 async def remove_pusher(self, app_id: str, push_key: str, user_id: str):
719749 return await self.get_pusherpool().remove_pusher(app_id, push_key, user_id)
720750
309309 state_group_before_event = None
310310 state_group_before_event_prev_group = None
311311 deltas_to_state_group_before_event = None
312 entry = None
312313
313314 else:
314315 # otherwise, we'll need to resolve the state across the prev_events.
339340 current_state_ids=state_ids_before_event,
340341 )
341342
342 # XXX: can we update the state cache entry for the new state group? or
343 # could we set a flag on resolve_state_groups_for_events to tell it to
344 # always make a state group?
343 # Assign the new state group to the cached state entry.
344 #
345 # Note that this can race in that we could generate multiple state
346 # groups for the same state entry, but that is just inefficient
347 # rather than dangerous.
348 if entry and entry.state_group is None:
349 entry.state_group = state_group_before_event
345350
346351 #
347352 # now if it's not a state event, we're done
261261 return self.txn.description
262262
263263 def execute_batch(self, sql: str, args: Iterable[Iterable[Any]]) -> None:
264 """Similar to `executemany`, except `txn.rowcount` will not be correct
265 afterwards.
266
 267 More efficient than `executemany` on PostgreSQL.
268 """
269
264270 if isinstance(self.database_engine, PostgresEngine):
265271 from psycopg2.extras import execute_batch # type: ignore
266272
267273 self._do_execute(lambda *x: execute_batch(self.txn, *x), sql, args)
268274 else:
269 for val in args:
270 self.execute(sql, val)
275 self.executemany(sql, args)
271276
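The `execute_batch` helper above dispatches to psycopg2's `execute_batch` on PostgreSQL and to the driver's own `executemany` otherwise, which is why the diff replaces per-row `execute()` loops with batched calls throughout. As a standalone sketch of the non-Postgres path (plain `sqlite3` with a made-up table and rows, not Synapse's wrapper):

```python
import sqlite3

# Minimal sketch of batched execution via executemany (the non-Postgres
# branch of execute_batch above); table name and rows are illustrative.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE device_lists (user_id TEXT, device_id TEXT)")

rows = [("@alice:example.org", "DEV1"), ("@bob:example.org", "DEV2")]
# One executemany call replaces a Python-level loop of execute() calls.
cur.executemany("INSERT INTO device_lists VALUES (?, ?)", rows)

cur.execute("SELECT COUNT(*) FROM device_lists")
count = cur.fetchone()[0]
```

As the docstring warns, `txn.rowcount` is not guaranteed to be meaningful after a batched call on the PostgreSQL path.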
272277 def execute_values(self, sql: str, *args: Any) -> List[Tuple]:
273278 """Corresponds to psycopg2.extras.execute_values. Only available when
887892 ", ".join("?" for _ in keys[0]),
888893 )
889894
890 txn.executemany(sql, vals)
895 txn.execute_batch(sql, vals)
891896
892897 async def simple_upsert(
893898 self,
00 # -*- coding: utf-8 -*-
11 # Copyright 2014-2016 OpenMarket Ltd
22 # Copyright 2018 New Vector Ltd
3 # Copyright 2019 The Matrix.org Foundation C.I.C.
3 # Copyright 2019-2021 The Matrix.org Foundation C.I.C.
44 #
55 # Licensed under the Apache License, Version 2.0 (the "License");
66 # you may not use this file except in compliance with the License.
4242 from .event_federation import EventFederationStore
4343 from .event_push_actions import EventPushActionsStore
4444 from .events_bg_updates import EventsBackgroundUpdatesStore
45 from .events_forward_extremities import EventForwardExtremitiesStore
4546 from .filtering import FilteringStore
4647 from .group_server import GroupServerStore
4748 from .keys import KeyStore
117118 UIAuthStore,
118119 CacheInvalidationWorkerStore,
119120 ServerMetricsStore,
121 EventForwardExtremitiesStore,
120122 ):
121123 def __init__(self, database: DatabasePool, db_conn, hs):
122124 self.hs = hs
896896 DELETE FROM device_lists_outbound_last_success
897897 WHERE destination = ? AND user_id = ?
898898 """
899 txn.executemany(sql, ((row[0], row[1]) for row in rows))
899 txn.execute_batch(sql, ((row[0], row[1]) for row in rows))
900900
901901 logger.info("Pruned %d device list outbound pokes", count)
902902
13421342
13431343 # Delete older entries in the table, as we really only care about
13441344 # when the latest change happened.
1345 txn.executemany(
1345 txn.execute_batch(
13461346 """
13471347 DELETE FROM device_lists_stream
13481348 WHERE user_id = ? AND device_id = ? AND stream_id < ?
633633
634634 async def get_e2e_cross_signing_keys_bulk(
635635 self, user_ids: List[str], from_user_id: Optional[str] = None
636 ) -> Dict[str, Dict[str, dict]]:
636 ) -> Dict[str, Optional[Dict[str, dict]]]:
637637 """Returns the cross-signing keys for a set of users.
638638
639639 Args:
723723
724724 async def claim_e2e_one_time_keys(
725725 self, query_list: Iterable[Tuple[str, str, str]]
726 ) -> Dict[str, Dict[str, Dict[str, bytes]]]:
726 ) -> Dict[str, Dict[str, Dict[str, str]]]:
727727 """Take a list of one time keys out of the database.
728728
729729 Args:
486486 VALUES (?, ?, ?, ?, ?, ?)
487487 """
488488
489 txn.executemany(
489 txn.execute_batch(
490490 sql,
491491 (
492492 _gen_entry(user_id, actions)
802802 ],
803803 )
804804
805 txn.executemany(
805 txn.execute_batch(
806806 """
807807 UPDATE event_push_summary
808808 SET notif_count = ?, unread_count = ?, stream_ordering = ?
472472 txn, self.db_pool, event_to_room_id, event_to_types, event_to_auth_chain,
473473 )
474474
475 @staticmethod
475 @classmethod
476476 def _add_chain_cover_index(
477 cls,
477478 txn,
478479 db_pool: DatabasePool,
479480 event_to_room_id: Dict[str, str],
613614 if not events_to_calc_chain_id_for:
614615 return
615616
616 # We now calculate the chain IDs/sequence numbers for the events. We
617 # do this by looking at the chain ID and sequence number of any auth
618 # event with the same type/state_key and incrementing the sequence
619 # number by one. If there was no match or the chain ID/sequence
620 # number is already taken we generate a new chain.
621 #
622 # We need to do this in a topologically sorted order as we want to
623 # generate chain IDs/sequence numbers of an event's auth events
624 # before the event itself.
625 chains_tuples_allocated = set() # type: Set[Tuple[int, int]]
626 new_chain_tuples = {} # type: Dict[str, Tuple[int, int]]
627 for event_id in sorted_topologically(
628 events_to_calc_chain_id_for, event_to_auth_chain
629 ):
630 existing_chain_id = None
631 for auth_id in event_to_auth_chain.get(event_id, []):
632 if event_to_types.get(event_id) == event_to_types.get(auth_id):
633 existing_chain_id = chain_map[auth_id]
634 break
635
636 new_chain_tuple = None
637 if existing_chain_id:
638 # We found a chain ID/sequence number candidate, check its
639 # not already taken.
640 proposed_new_id = existing_chain_id[0]
641 proposed_new_seq = existing_chain_id[1] + 1
642 if (proposed_new_id, proposed_new_seq) not in chains_tuples_allocated:
643 already_allocated = db_pool.simple_select_one_onecol_txn(
644 txn,
645 table="event_auth_chains",
646 keyvalues={
647 "chain_id": proposed_new_id,
648 "sequence_number": proposed_new_seq,
649 },
650 retcol="event_id",
651 allow_none=True,
652 )
653 if already_allocated:
654 # Mark it as already allocated so we don't need to hit
655 # the DB again.
656 chains_tuples_allocated.add((proposed_new_id, proposed_new_seq))
657 else:
658 new_chain_tuple = (
659 proposed_new_id,
660 proposed_new_seq,
661 )
662
663 if not new_chain_tuple:
664 new_chain_tuple = (db_pool.event_chain_id_gen.get_next_id_txn(txn), 1)
665
666 chains_tuples_allocated.add(new_chain_tuple)
667
668 chain_map[event_id] = new_chain_tuple
669 new_chain_tuples[event_id] = new_chain_tuple
617 # Allocate chain ID/sequence numbers to each new event.
618 new_chain_tuples = cls._allocate_chain_ids(
619 txn,
620 db_pool,
621 event_to_room_id,
622 event_to_types,
623 event_to_auth_chain,
624 events_to_calc_chain_id_for,
625 chain_map,
626 )
627 chain_map.update(new_chain_tuples)
670628
671629 db_pool.simple_insert_many_txn(
672630 txn,
793751 ],
794752 )
795753
754 @staticmethod
755 def _allocate_chain_ids(
756 txn,
757 db_pool: DatabasePool,
758 event_to_room_id: Dict[str, str],
759 event_to_types: Dict[str, Tuple[str, str]],
760 event_to_auth_chain: Dict[str, List[str]],
761 events_to_calc_chain_id_for: Set[str],
762 chain_map: Dict[str, Tuple[int, int]],
763 ) -> Dict[str, Tuple[int, int]]:
764 """Allocates, but does not persist, chain ID/sequence numbers for the
765 events in `events_to_calc_chain_id_for`. (c.f. _add_chain_cover_index
766 for info on args)
767 """
768
769 # We now calculate the chain IDs/sequence numbers for the events. We do
770 # this by looking at the chain ID and sequence number of any auth event
771 # with the same type/state_key and incrementing the sequence number by
772 # one. If there was no match or the chain ID/sequence number is already
773 # taken we generate a new chain.
774 #
775 # We try to reduce the number of times that we hit the database by
776 # batching up calls, to make this more efficient when persisting large
777 # numbers of state events (e.g. during joins).
778 #
779 # We do this by:
780 # 1. Calculating for each event which auth event will be used to
781 # inherit the chain ID, i.e. converting the auth chain graph to a
782 # tree that we can allocate chains on. We also keep track of which
783 # existing chain IDs have been referenced.
784 # 2. Fetching the max allocated sequence number for each referenced
785 # existing chain ID, generating a map from chain ID to the max
786 # allocated sequence number.
787 # 3. Iterating over the tree and allocating a chain ID/seq no. to the
788 # new event, by incrementing the sequence number from the
789 # referenced event's chain ID/seq no. and checking that the
790 # incremented sequence number hasn't already been allocated (by
791 # looking in the map generated in the previous step). We generate a
792 # new chain if the sequence number has already been allocated.
793 #
794
795 existing_chains = set() # type: Set[int]
796 tree = [] # type: List[Tuple[str, Optional[str]]]
797
798 # We need to do this in a topologically sorted order as we want to
799 # generate chain IDs/sequence numbers of an event's auth events before
800 # the event itself.
801 for event_id in sorted_topologically(
802 events_to_calc_chain_id_for, event_to_auth_chain
803 ):
804 for auth_id in event_to_auth_chain.get(event_id, []):
805 if event_to_types.get(event_id) == event_to_types.get(auth_id):
806 existing_chain_id = chain_map.get(auth_id)
807 if existing_chain_id:
808 existing_chains.add(existing_chain_id[0])
809
810 tree.append((event_id, auth_id))
811 break
812 else:
813 tree.append((event_id, None))
814
815 # Fetch the current max sequence number for each existing referenced chain.
816 sql = """
817 SELECT chain_id, MAX(sequence_number) FROM event_auth_chains
818 WHERE %s
819 GROUP BY chain_id
820 """
821 clause, args = make_in_list_sql_clause(
822 db_pool.engine, "chain_id", existing_chains
823 )
824 txn.execute(sql % (clause,), args)
825
826 chain_to_max_seq_no = {row[0]: row[1] for row in txn} # type: Dict[Any, int]
827
828 # Allocate the new events chain ID/sequence numbers.
829 #
830 # To reduce the number of calls to the database we don't allocate a
831 # chain ID number in the loop, instead we use a temporary `object()` for
832 # each new chain ID. Once we've done the loop we generate the necessary
833 # number of new chain IDs in one call, replacing all temporary
834 # objects with real allocated chain IDs.
835
836 unallocated_chain_ids = set() # type: Set[object]
837 new_chain_tuples = {} # type: Dict[str, Tuple[Any, int]]
838 for event_id, auth_event_id in tree:
839 # If we reference an auth_event_id we fetch the allocated chain ID,
840 # either from the existing `chain_map` or the newly generated
841 # `new_chain_tuples` map.
842 existing_chain_id = None
843 if auth_event_id:
844 existing_chain_id = new_chain_tuples.get(auth_event_id)
845 if not existing_chain_id:
846 existing_chain_id = chain_map[auth_event_id]
847
848 new_chain_tuple = None # type: Optional[Tuple[Any, int]]
849 if existing_chain_id:
 850 # We found a chain ID/sequence number candidate; check it's
 851 # not already taken.
852 proposed_new_id = existing_chain_id[0]
853 proposed_new_seq = existing_chain_id[1] + 1
854
855 if chain_to_max_seq_no[proposed_new_id] < proposed_new_seq:
856 new_chain_tuple = (
857 proposed_new_id,
858 proposed_new_seq,
859 )
860
861 # If we need to start a new chain we allocate a temporary chain ID.
862 if not new_chain_tuple:
863 new_chain_tuple = (object(), 1)
864 unallocated_chain_ids.add(new_chain_tuple[0])
865
866 new_chain_tuples[event_id] = new_chain_tuple
867 chain_to_max_seq_no[new_chain_tuple[0]] = new_chain_tuple[1]
868
869 # Generate new chain IDs for all unallocated chain IDs.
870 newly_allocated_chain_ids = db_pool.event_chain_id_gen.get_next_mult_txn(
871 txn, len(unallocated_chain_ids)
872 )
873
874 # Map from potentially temporary chain ID to real chain ID
875 chain_id_to_allocated_map = dict(
876 zip(unallocated_chain_ids, newly_allocated_chain_ids)
877 ) # type: Dict[Any, int]
878 chain_id_to_allocated_map.update((c, c) for c in existing_chains)
879
880 return {
881 event_id: (chain_id_to_allocated_map[chain_id], seq)
882 for event_id, (chain_id, seq) in new_chain_tuples.items()
883 }
884
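The three-step strategy described in the comment at the top of `_allocate_chain_ids` can be modelled with a toy allocator. This is a hedged sketch, not Synapse's implementation: it ignores the type/state_key matching and all database lookups, and `allocate_chains` and its event graph are invented for illustration.

```python
from graphlib import TopologicalSorter
from itertools import count

def allocate_chains(event_to_auth):
    """Toy model: each event either extends the chain of one of its auth
    events (sequence number + 1) or starts a brand-new chain."""
    new_chain_ids = count(1)
    chains = {}   # event_id -> (chain_id, seq)
    max_seq = {}  # chain_id -> highest sequence number allocated so far
    # Auth events must be allocated before the events that reference them,
    # hence the topological order.
    for event_id in TopologicalSorter(event_to_auth).static_order():
        candidate = None
        for auth_id in event_to_auth.get(event_id, []):
            chain_id, seq = chains[auth_id]
            # Only extend the chain if the next slot is still free.
            if max_seq[chain_id] == seq:
                candidate = (chain_id, seq + 1)
                break
        if candidate is None:
            candidate = (next(new_chain_ids), 1)
        chains[event_id] = candidate
        max_seq[candidate[0]] = candidate[1]
    return chains

# Two events sharing one auth event: only one of them can extend its chain,
# the other is forced onto a new chain.
chains = allocate_chains({"A": [], "B": ["A"], "C": ["A"]})
```

The real code additionally defers chain-ID generation (using temporary `object()` placeholders) so all new IDs can be fetched from the database in a single call.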
796885 def _persist_transaction_ids_txn(
797886 self,
798887 txn: LoggingTransaction,
875964 WHERE room_id = ? AND type = ? AND state_key = ?
876965 )
877966 """
878 txn.executemany(
967 txn.execute_batch(
879968 sql,
880969 (
881970 (
894983 )
895984 # Now we actually update the current_state_events table
896985
897 txn.executemany(
986 txn.execute_batch(
898987 "DELETE FROM current_state_events"
899988 " WHERE room_id = ? AND type = ? AND state_key = ?",
900989 (
906995 # We include the membership in the current state table, hence we do
907996 # a lookup when we insert. This assumes that all events have already
908997 # been inserted into room_memberships.
909 txn.executemany(
998 txn.execute_batch(
910999 """INSERT INTO current_state_events
9111000 (room_id, type, state_key, event_id, membership)
9121001 VALUES (?, ?, ?, ?, (SELECT membership FROM room_memberships WHERE event_id = ?))
9261015 # we have no record of the fact the user *was* a member of the
9271016 # room but got, say, state reset out of it.
9281017 if to_delete or to_insert:
929 txn.executemany(
1018 txn.execute_batch(
9301019 "DELETE FROM local_current_membership"
9311020 " WHERE room_id = ? AND user_id = ?",
9321021 (
9371026 )
9381027
9391028 if to_insert:
940 txn.executemany(
1029 txn.execute_batch(
9411030 """INSERT INTO local_current_membership
9421031 (room_id, user_id, event_id, membership)
9431032 VALUES (?, ?, ?, (SELECT membership FROM room_memberships WHERE event_id = ?))
17371826 """
17381827
17391828 if events_and_contexts:
1740 txn.executemany(
1829 txn.execute_batch(
17411830 sql,
17421831 (
17431832 (
17661855
17671856 # Now we delete the staging area for *all* events that were being
17681857 # persisted.
1769 txn.executemany(
1858 txn.execute_batch(
17701859 "DELETE FROM event_push_actions_staging WHERE event_id = ?",
17711860 ((event.event_id,) for event, _ in all_events_and_contexts),
17721861 )
18851974 " )"
18861975 )
18871976
1888 txn.executemany(
1977 txn.execute_batch(
18891978 query,
18901979 [
18911980 (e_id, ev.room_id, e_id, ev.room_id, e_id, ev.room_id, False)
18991988 "DELETE FROM event_backward_extremities"
19001989 " WHERE event_id = ? AND room_id = ?"
19011990 )
1902 txn.executemany(
1991 txn.execute_batch(
19031992 query,
19041993 [
19051994 (ev.event_id, ev.room_id)
138138 max_stream_id = progress["max_stream_id_exclusive"]
139139 rows_inserted = progress.get("rows_inserted", 0)
140140
141 INSERT_CLUMP_SIZE = 1000
142
143141 def reindex_txn(txn):
144142 sql = (
145143 "SELECT stream_ordering, event_id, json FROM events"
177175
178176 sql = "UPDATE events SET sender = ?, contains_url = ? WHERE event_id = ?"
179177
180 for index in range(0, len(update_rows), INSERT_CLUMP_SIZE):
181 clump = update_rows[index : index + INSERT_CLUMP_SIZE]
182 txn.executemany(sql, clump)
178 txn.execute_batch(sql, update_rows)
183179
184180 progress = {
185181 "target_min_stream_id_inclusive": target_min_stream_id,
208204 target_min_stream_id = progress["target_min_stream_id_inclusive"]
209205 max_stream_id = progress["max_stream_id_exclusive"]
210206 rows_inserted = progress.get("rows_inserted", 0)
211
212 INSERT_CLUMP_SIZE = 1000
213207
214208 def reindex_search_txn(txn):
215209 sql = (
255249
256250 sql = "UPDATE events SET origin_server_ts = ? WHERE event_id = ?"
257251
258 for index in range(0, len(rows_to_update), INSERT_CLUMP_SIZE):
259 clump = rows_to_update[index : index + INSERT_CLUMP_SIZE]
260 txn.executemany(sql, clump)
252 txn.execute_batch(sql, rows_to_update)
261253
262254 progress = {
263255 "target_min_stream_id_inclusive": target_min_stream_id,
0 # -*- coding: utf-8 -*-
1 # Copyright 2021 The Matrix.org Foundation C.I.C.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 import logging
16 from typing import Dict, List
17
18 from synapse.api.errors import SynapseError
19 from synapse.storage._base import SQLBaseStore
20
21 logger = logging.getLogger(__name__)
22
23
24 class EventForwardExtremitiesStore(SQLBaseStore):
25 async def delete_forward_extremities_for_room(self, room_id: str) -> int:
26 """Delete any extra forward extremities for a room.
27
28 Invalidates the "get_latest_event_ids_in_room" cache if any forward
29 extremities were deleted.
30
 31 Returns the number of extremities deleted.
32 """
33
34 def delete_forward_extremities_for_room_txn(txn):
 35 # First, find the one event_id we want to keep (the most recent extremity).
36 sql = """
37 SELECT event_id FROM event_forward_extremities
38 INNER JOIN events USING (room_id, event_id)
39 WHERE room_id = ?
40 ORDER BY stream_ordering DESC
41 LIMIT 1
42 """
43 txn.execute(sql, (room_id,))
44 rows = txn.fetchall()
45 try:
46 event_id = rows[0][0]
47 logger.debug(
48 "Found event_id %s as the forward extremity to keep for room %s",
49 event_id,
50 room_id,
51 )
 52 except IndexError:
53 msg = "No forward extremity event found for room %s" % room_id
54 logger.warning(msg)
55 raise SynapseError(400, msg)
56
57 # Now delete the extra forward extremities
58 sql = """
59 DELETE FROM event_forward_extremities
60 WHERE event_id != ? AND room_id = ?
61 """
62
63 txn.execute(sql, (event_id, room_id))
64 logger.info(
65 "Deleted %s extra forward extremities for room %s",
66 txn.rowcount,
67 room_id,
68 )
69
70 if txn.rowcount > 0:
71 # Invalidate the cache
72 self._invalidate_cache_and_stream(
73 txn, self.get_latest_event_ids_in_room, (room_id,),
74 )
75
76 return txn.rowcount
77
78 return await self.db_pool.runInteraction(
79 "delete_forward_extremities_for_room",
80 delete_forward_extremities_for_room_txn,
81 )
82
83 async def get_forward_extremities_for_room(self, room_id: str) -> List[Dict]:
84 """Get list of forward extremities for a room."""
85
86 def get_forward_extremities_for_room_txn(txn):
87 sql = """
88 SELECT event_id, state_group, depth, received_ts
89 FROM event_forward_extremities
90 INNER JOIN event_to_state_groups USING (event_id)
91 INNER JOIN events USING (room_id, event_id)
92 WHERE room_id = ?
93 """
94
95 txn.execute(sql, (room_id,))
96 return self.db_pool.cursor_to_dict(txn)
97
98 return await self.db_pool.runInteraction(
99 "get_forward_extremities_for_room", get_forward_extremities_for_room_txn,
100 )
416416 " WHERE media_origin = ? AND media_id = ?"
417417 )
418418
419 txn.executemany(
419 txn.execute_batch(
420420 sql,
421421 (
422422 (time_ms, media_origin, media_id)
429429 " WHERE media_id = ?"
430430 )
431431
432 txn.executemany(sql, ((time_ms, media_id) for media_id in local_media))
432 txn.execute_batch(sql, ((time_ms, media_id) for media_id in local_media))
433433
434434 return await self.db_pool.runInteraction(
435435 "update_cached_last_access_time", update_cache_txn
556556 sql = "DELETE FROM local_media_repository_url_cache WHERE media_id = ?"
557557
558558 def _delete_url_cache_txn(txn):
559 txn.executemany(sql, [(media_id,) for media_id in media_ids])
559 txn.execute_batch(sql, [(media_id,) for media_id in media_ids])
560560
561561 return await self.db_pool.runInteraction(
562562 "delete_url_cache", _delete_url_cache_txn
585585 def _delete_url_cache_media_txn(txn):
586586 sql = "DELETE FROM local_media_repository WHERE media_id = ?"
587587
588 txn.executemany(sql, [(media_id,) for media_id in media_ids])
588 txn.execute_batch(sql, [(media_id,) for media_id in media_ids])
589589
590590 sql = "DELETE FROM local_media_repository_thumbnails WHERE media_id = ?"
591591
592 txn.executemany(sql, [(media_id,) for media_id in media_ids])
592 txn.execute_batch(sql, [(media_id,) for media_id in media_ids])
593593
594594 return await self.db_pool.runInteraction(
595595 "delete_url_cache_media", _delete_url_cache_media_txn
8585
8686 _excess_state_events_collecter.update_data(
8787 (x[0] - 1) * x[1] for x in res if x[1]
88 )
89
90 async def count_daily_e2ee_messages(self):
91 """
92 Returns an estimate of the number of messages sent in the last day.
93
94 If it has been significantly less or more than one day since the last
95 call to this function, it will return None.
96 """
97
98 def _count_messages(txn):
99 sql = """
100 SELECT COALESCE(COUNT(*), 0) FROM events
101 WHERE type = 'm.room.encrypted'
102 AND stream_ordering > ?
103 """
104 txn.execute(sql, (self.stream_ordering_day_ago,))
105 (count,) = txn.fetchone()
106 return count
107
108 return await self.db_pool.runInteraction("count_e2ee_messages", _count_messages)
109
110 async def count_daily_sent_e2ee_messages(self):
111 def _count_messages(txn):
 112 # This is good enough, since if you have silly characters in your own
 113 # hostname then that's your own fault.
114 like_clause = "%:" + self.hs.hostname
115
116 sql = """
117 SELECT COALESCE(COUNT(*), 0) FROM events
118 WHERE type = 'm.room.encrypted'
119 AND sender LIKE ?
120 AND stream_ordering > ?
121 """
122
123 txn.execute(sql, (like_clause, self.stream_ordering_day_ago))
124 (count,) = txn.fetchone()
125 return count
126
127 return await self.db_pool.runInteraction(
128 "count_daily_sent_e2ee_messages", _count_messages
129 )
130
131 async def count_daily_active_e2ee_rooms(self):
132 def _count(txn):
133 sql = """
134 SELECT COALESCE(COUNT(DISTINCT room_id), 0) FROM events
135 WHERE type = 'm.room.encrypted'
136 AND stream_ordering > ?
137 """
138 txn.execute(sql, (self.stream_ordering_day_ago,))
139 (count,) = txn.fetchone()
140 return count
141
142 return await self.db_pool.runInteraction(
143 "count_daily_active_e2ee_rooms", _count
88144 )
89145
90146 async def count_daily_messages(self):
171171 )
172172
173173 # Update backward extremeties
174 txn.executemany(
174 txn.execute_batch(
175175 "INSERT INTO event_backward_extremities (room_id, event_id)"
176176 " VALUES (?, ?)",
177177 [(room_id, event_id) for event_id, in new_backwards_extrems],
343343 txn, self.get_if_user_has_pusher, (user_id,)
344344 )
345345
346 self.db_pool.simple_delete_one_txn(
346 # It is expected that there is exactly one pusher to delete, but
347 # if it isn't there (or there are multiple) delete them all.
348 self.db_pool.simple_delete_txn(
347349 txn,
348350 "pushers",
349351 {"app_id": app_id, "pushkey": pushkey, "user_name": user_id},
359359
360360 await self.db_pool.runInteraction("set_server_admin", set_server_admin_txn)
361361
362 async def set_shadow_banned(self, user: UserID, shadow_banned: bool) -> None:
 363 """Sets whether a user is shadow-banned.
364
365 Args:
 366 user: user ID of the user to update
367 shadow_banned: true iff the user is to be shadow-banned, false otherwise.
368 """
369
370 def set_shadow_banned_txn(txn):
371 self.db_pool.simple_update_one_txn(
372 txn,
373 table="users",
374 keyvalues={"name": user.to_string()},
375 updatevalues={"shadow_banned": shadow_banned},
376 )
377 # In order for this to apply immediately, clear the cache for this user.
378 tokens = self.db_pool.simple_select_onecol_txn(
379 txn,
380 table="access_tokens",
381 keyvalues={"user_id": user.to_string()},
382 retcol="token",
383 )
384 for token in tokens:
385 self._invalidate_cache_and_stream(
386 txn, self.get_user_by_access_token, (token,)
387 )
388
389 await self.db_pool.runInteraction("set_shadow_banned", set_shadow_banned_txn)
390
362391 def _query_for_auth(self, txn, token: str) -> Optional[TokenLookupResult]:
363392 sql = """
364393 SELECT users.name as user_id,
441470 return dict(txn)
442471
443472 return await self.db_pool.runInteraction("get_users_by_id_case_insensitive", f)
473
474 async def record_user_external_id(
475 self, auth_provider: str, external_id: str, user_id: str
476 ) -> None:
477 """Record a mapping from an external user id to a mxid
478
479 Args:
480 auth_provider: identifier for the remote auth provider
481 external_id: id on that system
482 user_id: complete mxid that it is mapped to
483 """
484 await self.db_pool.simple_insert(
485 table="user_external_ids",
486 values={
487 "auth_provider": auth_provider,
488 "external_id": external_id,
489 "user_id": user_id,
490 },
491 desc="record_user_external_id",
492 )
444493
445494 async def get_user_by_external_id(
446495 self, auth_provider: str, external_id: str
11031152 FROM user_threepids
11041153 """
11051154
1106 txn.executemany(sql, [(id_server,) for id_server in id_servers])
1155 txn.execute_batch(sql, [(id_server,) for id_server in id_servers])
11071156
11081157 if id_servers:
11091158 await self.db_pool.runInteraction(
13701419
13711420 self._invalidate_cache_and_stream(txn, self.get_user_by_id, (user_id,))
13721421
1373 async def record_user_external_id(
1374 self, auth_provider: str, external_id: str, user_id: str
1375 ) -> None:
1376 """Record a mapping from an external user id to a mxid
1377
1378 Args:
1379 auth_provider: identifier for the remote auth provider
1380 external_id: id on that system
1381 user_id: complete mxid that it is mapped to
1382 """
1383 await self.db_pool.simple_insert(
1384 table="user_external_ids",
1385 values={
1386 "auth_provider": auth_provider,
1387 "external_id": external_id,
1388 "user_id": user_id,
1389 },
1390 desc="record_user_external_id",
1391 )
1392
13931422 async def user_set_password_hash(
13941423 self, user_id: str, password_hash: Optional[str]
13951424 ) -> None:
872872 "max_stream_id_exclusive", self._stream_order_on_start + 1
873873 )
874874
875 INSERT_CLUMP_SIZE = 1000
876
877875 def add_membership_profile_txn(txn):
878876 sql = """
879877 SELECT stream_ordering, event_id, events.room_id, event_json.json
914912 UPDATE room_memberships SET display_name = ?, avatar_url = ?
915913 WHERE event_id = ? AND room_id = ?
916914 """
917 for index in range(0, len(to_update), INSERT_CLUMP_SIZE):
918 clump = to_update[index : index + INSERT_CLUMP_SIZE]
919 txn.executemany(to_update_sql, clump)
915 txn.execute_batch(to_update_sql, to_update)
920916
921917 progress = {
922918 "target_min_stream_id_inclusive": target_min_stream_id,
5454 # { "ignored_users": "@someone:example.org": {} }
5555 ignored_users = content.get("ignored_users", {})
5656 if isinstance(ignored_users, dict) and ignored_users:
57 cur.executemany(insert_sql, [(user_id, u) for u in ignored_users])
57 cur.execute_batch(insert_sql, [(user_id, u) for u in ignored_users])
5858
5959 # Add indexes after inserting data for efficiency.
6060 logger.info("Adding constraints to ignored_users table")
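The hunks above replace `txn.executemany` with `txn.execute_batch`. On PostgreSQL the latter maps onto psycopg2's `execute_batch`, which groups many parameter sets into far fewer server round trips; on engines without that helper it can simply fall back to `executemany`. A minimal sketch of such a wrapper, run here against SQLite (the class and flag names are illustrative, not Synapse's actual `LoggingTransaction`):

```python
import sqlite3
from typing import Iterable, Sequence


class BatchingCursor:
    """Wraps a DB-API cursor and batches repeated statements.

    With psycopg2 this would call psycopg2.extras.execute_batch();
    otherwise it falls back to plain executemany().
    """

    def __init__(self, cursor, use_psycopg2_batching: bool = False):
        self.cursor = cursor
        self.use_psycopg2_batching = use_psycopg2_batching

    def execute_batch(self, sql: str, args: Iterable[Sequence]) -> None:
        if self.use_psycopg2_batching:
            # Real code would do: psycopg2.extras.execute_batch(self.cursor, sql, args)
            raise NotImplementedError("psycopg2 not available in this sketch")
        self.cursor.executemany(sql, args)


# Usage against an in-memory SQLite database:
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE ignored_users (ignorer TEXT, ignored TEXT)")
txn = BatchingCursor(cur)
txn.execute_batch(
    "INSERT INTO ignored_users VALUES (?, ?)",
    [("@a:test", u) for u in ("@x:test", "@y:test")],
)
```

The call sites change mechanically, which is why the diff touches so many storage files at once.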
2323 from synapse.storage.database import DatabasePool
2424 from synapse.storage.databases.main.events_worker import EventRedactBehaviour
2525 from synapse.storage.engines import PostgresEngine, Sqlite3Engine
26 from synapse.types import Collection
2627
2728 logger = logging.getLogger(__name__)
2829
6263 for entry in entries
6364 )
6465
65 txn.executemany(sql, args)
66 txn.execute_batch(sql, args)
6667
6768 elif isinstance(self.database_engine, Sqlite3Engine):
6869 sql = (
7475 for entry in entries
7576 )
7677
77 txn.executemany(sql, args)
78 txn.execute_batch(sql, args)
7879 else:
7980 # This should be unreachable.
8081 raise Exception("Unrecognized database engine")
459460
460461 async def search_rooms(
461462 self,
462 room_ids: List[str],
463 room_ids: Collection[str],
463464 search_term: str,
464465 keys: List[str],
465466 limit,
1414 # limitations under the License.
1515
1616 import logging
17 from collections import Counter
1817 from enum import Enum
1918 from itertools import chain
2019 from typing import Any, Dict, List, Optional, Tuple
20
21 from typing_extensions import Counter
2122
2223 from twisted.internet.defer import DeferredLock
2324
318319 return slice_list
319320
320321 @cached()
321 async def get_earliest_token_for_stats(self, stats_type: str, id: str) -> int:
322 async def get_earliest_token_for_stats(
323 self, stats_type: str, id: str
324 ) -> Optional[int]:
322325 """
323326 Fetch the "earliest token". This is used by the room stats delta
324327 processor to ignore deltas that have been processed between the
338341 )
339342
340343 async def bulk_update_stats_delta(
341 self, ts: int, updates: Dict[str, Dict[str, Dict[str, Counter]]], stream_id: int
344 self, ts: int, updates: Dict[str, Dict[str, Counter[str]]], stream_id: int
342345 ) -> None:
343346 """Bulk update stats tables for a given stream_id and update the stats
344347 incremental position.
664667
665668 async def get_changes_room_total_events_and_bytes(
666669 self, min_pos: int, max_pos: int
667 ) -> Dict[str, Dict[str, int]]:
670 ) -> Tuple[Dict[str, Dict[str, int]], Dict[str, Dict[str, int]]]:
668671 """Fetches the counts of events in the given range of stream IDs.
669672
670673 Args:
682685 max_pos,
683686 )
684687
685 def get_changes_room_total_events_and_bytes_txn(self, txn, low_pos, high_pos):
688 def get_changes_room_total_events_and_bytes_txn(
689 self, txn, low_pos: int, high_pos: int
690 ) -> Tuple[Dict[str, Dict[str, int]], Dict[str, Dict[str, int]]]:
686691 """Gets the total_events and total_event_bytes counts for rooms and
687692 senders, in a range of stream_orderings (including backfilled events).
688693
689694 Args:
690695 txn
691 low_pos (int): Low stream ordering
692 high_pos (int): High stream ordering
696 low_pos: Low stream ordering
697 high_pos: High stream ordering
693698
694699 Returns:
695 tuple[dict[str, dict[str, int]], dict[str, dict[str, int]]]: The
696 room and user deltas for total_events/total_event_bytes in the
700 The room and user deltas for total_events/total_event_bytes in the
697701 format of `stats_id` -> fields
698702 """
699703
539539 desc="get_user_in_directory",
540540 )
541541
542 async def update_user_directory_stream_pos(self, stream_id: str) -> None:
542 async def update_user_directory_stream_pos(self, stream_id: int) -> None:
543543 await self.db_pool.simple_update_one(
544544 table="user_directory_stream_pos",
545545 keyvalues={},
564564 )
565565
566566 logger.info("[purge] removing redundant state groups")
567 txn.executemany(
567 txn.execute_batch(
568568 "DELETE FROM state_groups_state WHERE state_group = ?",
569569 ((sg,) for sg in state_groups_to_delete),
570570 )
571 txn.executemany(
571 txn.execute_batch(
572572 "DELETE FROM state_groups WHERE id = ?",
573573 ((sg,) for sg in state_groups_to_delete),
574574 )
1414 import heapq
1515 import logging
1616 import threading
17 from collections import deque
17 from collections import OrderedDict
1818 from contextlib import contextmanager
1919 from typing import Dict, List, Optional, Set, Tuple, Union
2020
2121 import attr
22 from typing_extensions import Deque
2322
2423 from synapse.metrics.background_process_metrics import run_as_background_process
2524 from synapse.storage.database import DatabasePool, LoggingTransaction
10099 self._current = (max if step > 0 else min)(
101100 self._current, _load_current_id(db_conn, table, column, step)
102101 )
103 self._unfinished_ids = deque() # type: Deque[int]
102
103 # We use this as an ordered set, as we want to efficiently append items,
104 # remove items and get the first item. Since we insert IDs in order, the
105 # insertion ordering will ensure it's in the correct order.
106 #
107 # The key and values are the same, but we never look at the values.
108 self._unfinished_ids = OrderedDict() # type: OrderedDict[int, int]
104109
105110 def get_next(self):
106111 """
112117 self._current += self._step
113118 next_id = self._current
114119
115 self._unfinished_ids.append(next_id)
120 self._unfinished_ids[next_id] = next_id
116121
117122 @contextmanager
118123 def manager():
120125 yield next_id
121126 finally:
122127 with self._lock:
123 self._unfinished_ids.remove(next_id)
128 self._unfinished_ids.pop(next_id)
124129
125130 return _AsyncCtxManagerWrapper(manager())
126131
139144 self._current += n * self._step
140145
141146 for next_id in next_ids:
142 self._unfinished_ids.append(next_id)
147 self._unfinished_ids[next_id] = next_id
143148
144149 @contextmanager
145150 def manager():
148153 finally:
149154 with self._lock:
150155 for next_id in next_ids:
151 self._unfinished_ids.remove(next_id)
156 self._unfinished_ids.pop(next_id)
152157
153158 return _AsyncCtxManagerWrapper(manager())
154159
161166 """
162167 with self._lock:
163168 if self._unfinished_ids:
164 return self._unfinished_ids[0] - self._step
169 return next(iter(self._unfinished_ids)) - self._step
165170
166171 return self._current
167172
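As the new comment explains, the `deque` is replaced with an `OrderedDict` used as an ordered set: adding and removing an in-flight ID are both O(1) (removal from the middle of a `deque` is O(n)), while the oldest outstanding ID is still reachable as the first key. A standalone sketch of the same pattern:

```python
from collections import OrderedDict

# Used as an ordered set: keys and values are the same IDs.
unfinished = OrderedDict()
for next_id in (101, 102, 103):
    unfinished[next_id] = next_id

# A request finishing out of order removes its ID in O(1)...
unfinished.pop(102)

# ...while the oldest outstanding ID is still the first key.
oldest = next(iter(unfinished))
```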
6969 ...
7070
7171 @abc.abstractmethod
72 def get_next_mult_txn(self, txn: Cursor, n: int) -> List[int]:
73 """Get the next `n` IDs in the sequence"""
74 ...
75
76 @abc.abstractmethod
7277 def check_consistency(
7378 self,
7479 db_conn: "LoggingDatabaseConnection",
218223 self._current_max_id += 1
219224 return self._current_max_id
220225
226 def get_next_mult_txn(self, txn: Cursor, n: int) -> List[int]:
227 with self._lock:
228 if self._current_max_id is None:
229 assert self._callback is not None
230 self._current_max_id = self._callback(txn)
231 self._callback = None
232
233 first_id = self._current_max_id + 1
234 self._current_max_id += n
235 return [first_id + i for i in range(n)]
236
221237 def check_consistency(
222238 self,
223239 db_conn: Connection,
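The new `get_next_mult_txn` reserves a contiguous block of `n` IDs by bumping the cached maximum once under the lock, instead of looping over `get_next_txn`. The allocation logic in isolation looks roughly like this (the class name is illustrative and omits the lazy `_callback` initialisation):

```python
import threading
from typing import List


class LocalIdAllocator:
    """Hands out monotonically increasing IDs, singly or in blocks."""

    def __init__(self, current_max: int):
        self._lock = threading.Lock()
        self._current_max_id = current_max

    def get_next(self) -> int:
        with self._lock:
            self._current_max_id += 1
            return self._current_max_id

    def get_next_mult(self, n: int) -> List[int]:
        with self._lock:
            # Reserve the whole block in one step so concurrent callers
            # can never interleave IDs within it.
            first_id = self._current_max_id + 1
            self._current_max_id += n
            return [first_id + i for i in range(n)]


alloc = LocalIdAllocator(10)
single = alloc.get_next()
block = alloc.get_next_mult(3)
```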
4848 module = importlib.import_module(module)
4949 provider_class = getattr(module, clz)
5050
51 module_config = provider.get("config")
51 # Load the module config. If None, pass an empty dictionary instead
52 module_config = provider.get("config") or {}
5253 try:
5354 provider_config = provider_class.parse_config(module_config)
5455 except jsonschema.ValidationError as e:
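The change above uses `provider.get("config") or {}` rather than `provider.get("config", {})` because a YAML key written as a bare `config:` deserialises to `None`, and a `.get()` default only applies when the key is missing entirely:

```python
# "config:" with no value in YAML deserialises to None, not {}.
provider = {"module": "my_module.MyProvider", "config": None}

with_default = provider.get("config", {})   # still None: the key exists
with_or = provider.get("config") or {}      # {}: falsy values are replaced
```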
0 # -*- coding: utf-8 -*-
1 # Copyright 2021 The Matrix.org Foundation C.I.C.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 """Utilities for dealing with jinja2 templates"""
16
17 import time
18 import urllib.parse
19 from typing import TYPE_CHECKING, Callable, Iterable, Optional, Union
20
21 import jinja2
22
23 if TYPE_CHECKING:
24 from synapse.config.homeserver import HomeServerConfig
25
26
27 def build_jinja_env(
28 template_search_directories: Iterable[str],
29 config: "HomeServerConfig",
30 autoescape: Union[bool, Callable[[str], bool], None] = None,
31 ) -> jinja2.Environment:
32 """Set up a Jinja2 environment to load templates from the given search path
33
34 The returned environment defines the following filters:
35 - format_ts: formats timestamps as strings in the server's local timezone
36 (XXX: why is that useful??)
37 - mxc_to_http: converts mxc: uris to http URIs. Args are:
38 (uri, width, height, resize_method="crop")
39
40 and the following global variables:
41 - server_name: matrix server name
42
43 Args:
44 template_search_directories: directories to search for templates
45
46 config: homeserver config, for things like `server_name` and `public_baseurl`
47
48 autoescape: whether template variables should be autoescaped. bool, or
49 a function mapping from template name to bool. Defaults to escaping templates
50 whose names end in .html, .xml or .htm.
51
52 Returns:
53 jinja environment
54 """
55
56 if autoescape is None:
57 autoescape = jinja2.select_autoescape()
58
59 loader = jinja2.FileSystemLoader(template_search_directories)
60 env = jinja2.Environment(loader=loader, autoescape=autoescape)
61
62 # Update the environment with our custom filters
63 env.filters.update(
64 {
65 "format_ts": _format_ts_filter,
66 "mxc_to_http": _create_mxc_to_http_filter(config.public_baseurl),
67 }
68 )
69
70 # common variables for all templates
71 env.globals.update({"server_name": config.server_name})
72
73 return env
74
75
76 def _create_mxc_to_http_filter(
77 public_baseurl: Optional[str],
78 ) -> Callable[[str, int, int, str], str]:
79 """Create and return a jinja2 filter that converts MXC urls to HTTP
80
81 Args:
82 public_baseurl: The public, accessible base URL of the homeserver
83 """
84
85 def mxc_to_http_filter(
86 value: str, width: int, height: int, resize_method: str = "crop"
87 ) -> str:
88 if not public_baseurl:
89 raise RuntimeError(
90 "public_baseurl must be set in the homeserver config to convert MXC URLs to HTTP URLs."
91 )
92
93 if value[0:6] != "mxc://":
94 return ""
95
96 server_and_media_id = value[6:]
97 fragment = None
98 if "#" in server_and_media_id:
99 server_and_media_id, fragment = server_and_media_id.split("#", 1)
100 fragment = "#" + fragment
101
102 params = {"width": width, "height": height, "method": resize_method}
103 return "%s_matrix/media/v1/thumbnail/%s?%s%s" % (
104 public_baseurl,
105 server_and_media_id,
106 urllib.parse.urlencode(params),
107 fragment or "",
108 )
109
110 return mxc_to_http_filter
111
112
113 def _format_ts_filter(value: int, format: str):
114 return time.strftime(format, time.localtime(value / 1000))
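The `mxc_to_http` filter above rewrites an `mxc://server/media_id` URI into a thumbnail URL under `public_baseurl`, preserving any `#fragment`. The transformation can be reproduced with just the standard library (the base URL below is an example value):

```python
import urllib.parse


def mxc_to_http(public_baseurl: str, value: str, width: int, height: int,
                resize_method: str = "crop") -> str:
    if value[0:6] != "mxc://":
        return ""
    server_and_media_id = value[6:]
    fragment = ""
    if "#" in server_and_media_id:
        server_and_media_id, frag = server_and_media_id.split("#", 1)
        fragment = "#" + frag
    params = {"width": width, "height": height, "method": resize_method}
    return "%s_matrix/media/v1/thumbnail/%s?%s%s" % (
        public_baseurl, server_and_media_id,
        urllib.parse.urlencode(params), fragment,
    )


url = mxc_to_http("https://synapse/", "mxc://example.org/abcdef", 32, 32)
```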
6161
6262 # check that the auth handler got called as expected
6363 auth_handler.complete_sso_login.assert_called_once_with(
64 "@test_user:test", request, "redirect_uri", None
64 "@test_user:test", request, "redirect_uri", None, new_user=True
6565 )
6666
6767 def test_map_cas_user_to_existing_user(self):
8484
8585 # check that the auth handler got called as expected
8686 auth_handler.complete_sso_login.assert_called_once_with(
87 "@test_user:test", request, "redirect_uri", None
87 "@test_user:test", request, "redirect_uri", None, new_user=False
8888 )
8989
9090 # Subsequent calls should map to the same mxid.
9393 self.handler._handle_cas_response(request, cas_response, "redirect_uri", "")
9494 )
9595 auth_handler.complete_sso_login.assert_called_once_with(
96 "@test_user:test", request, "redirect_uri", None
96 "@test_user:test", request, "redirect_uri", None, new_user=False
9797 )
9898
9999 def test_map_cas_user_to_invalid_localpart(self):
111111
112112 # check that the auth handler got called as expected
113113 auth_handler.complete_sso_login.assert_called_once_with(
114 "@f=c3=b6=c3=b6:test", request, "redirect_uri", None
114 "@f=c3=b6=c3=b6:test", request, "redirect_uri", None, new_user=True
115115 )
116116
117117
1515 from unittest import TestCase
1616
1717 from synapse.api.constants import EventTypes
18 from synapse.api.errors import AuthError, Codes, SynapseError
18 from synapse.api.errors import AuthError, Codes, LimitExceededError, SynapseError
1919 from synapse.api.room_versions import RoomVersions
2020 from synapse.events import EventBase
2121 from synapse.federation.federation_base import event_from_pdu_json
189189 sg2 = self.successResultOf(self.store._get_state_group_for_event(ev.event_id))
190190
191191 self.assertEqual(sg, sg2)
192
193 @unittest.override_config(
194 {"rc_invites": {"per_user": {"per_second": 0.5, "burst_count": 3}}}
195 )
196 def test_invite_by_user_ratelimit(self):
197 """Tests that invites from federation to a particular user are
198 actually rate-limited.
199 """
200 other_server = "otherserver"
201 other_user = "@otheruser:" + other_server
202
203 # create the room
204 user_id = self.register_user("kermit", "test")
205 tok = self.login("kermit", "test")
206
207 def create_invite():
208 room_id = self.helper.create_room_as(room_creator=user_id, tok=tok)
209 room_version = self.get_success(self.store.get_room_version(room_id))
210 return event_from_pdu_json(
211 {
212 "type": EventTypes.Member,
213 "content": {"membership": "invite"},
214 "room_id": room_id,
215 "sender": other_user,
216 "state_key": "@user:test",
217 "depth": 32,
218 "prev_events": [],
219 "auth_events": [],
220 "origin_server_ts": self.clock.time_msec(),
221 },
222 room_version,
223 )
224
225 for i in range(3):
226 event = create_invite()
227 self.get_success(
228 self.handler.on_invite_request(other_server, event, event.room_version,)
229 )
230
231 event = create_invite()
232 self.get_failure(
233 self.handler.on_invite_request(other_server, event, event.room_version,),
234 exc=LimitExceededError,
235 )
192236
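The new test exercises the `rc_invites.per_user` limits, which combine a sustained rate (`per_second`) with a `burst_count`: three invites in quick succession are allowed, the fourth raises `LimitExceededError`. That behaviour can be modelled as a token bucket (an illustration, not Synapse's actual `Ratelimiter`):

```python
class TokenBucket:
    """Allows bursts up to burst_count, refilling at per_second tokens/sec."""

    def __init__(self, per_second: float, burst_count: int):
        self.per_second = per_second
        self.burst_count = burst_count
        self.tokens = float(burst_count)
        self.last_ts = 0.0

    def can_do_action(self, now: float) -> bool:
        # Refill tokens for the elapsed time, capped at the burst size.
        elapsed = now - self.last_ts
        self.last_ts = now
        self.tokens = min(self.burst_count, self.tokens + elapsed * self.per_second)
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False


limiter = TokenBucket(per_second=0.5, burst_count=3)
results = [limiter.can_do_action(0.0) for _ in range(4)]  # burst of 3, then denied
later = limiter.can_do_action(2.0)  # two seconds later, one token has refilled
```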
193237 def _build_and_send_join_event(self, other_server, other_user, room_id):
194238 join_event = self.get_success(
3939 CLIENT_ID = "test-client-id"
4040 CLIENT_SECRET = "test-client-secret"
4141 BASE_URL = "https://synapse/"
42 CALLBACK_URL = BASE_URL + "_synapse/oidc/callback"
42 CALLBACK_URL = BASE_URL + "_synapse/client/oidc/callback"
4343 SCOPES = ["openid"]
4444
4545 AUTHORIZATION_ENDPOINT = ISSUER + "authorize"
5555 "token_endpoint": TOKEN_ENDPOINT,
5656 "jwks_uri": JWKS_URI,
5757 }
58
59
60 # The cookie name and path don't really matter, just that it has to be coherent
61 # between the callback & redirect handlers.
62 COOKIE_NAME = b"oidc_session"
63 COOKIE_PATH = "/_synapse/oidc"
6458
6559
6660 class TestMappingProvider:
339333 # For some reason, call.args does not work with python3.5
340334 args = calls[0][0]
341335 kwargs = calls[0][1]
342 self.assertEqual(args[0], COOKIE_NAME)
343 self.assertEqual(kwargs["path"], COOKIE_PATH)
336
337 # The cookie name and path don't really matter, only that they are consistent
338 # between the callback & redirect handlers.
339 self.assertEqual(args[0], b"oidc_session")
340 self.assertEqual(kwargs["path"], "/_synapse/client/oidc")
344341 cookie = args[1]
345342
346343 macaroon = pymacaroons.Macaroon.deserialize(cookie)
418415 self.get_success(self.handler.handle_oidc_callback(request))
419416
420417 auth_handler.complete_sso_login.assert_called_once_with(
421 expected_user_id, request, client_redirect_url, None,
418 expected_user_id, request, client_redirect_url, None, new_user=True
422419 )
423420 self.provider._exchange_code.assert_called_once_with(code)
424421 self.provider._parse_id_token.assert_called_once_with(token, nonce=nonce)
449446 self.get_success(self.handler.handle_oidc_callback(request))
450447
451448 auth_handler.complete_sso_login.assert_called_once_with(
452 expected_user_id, request, client_redirect_url, None,
449 expected_user_id, request, client_redirect_url, None, new_user=False
453450 )
454451 self.provider._exchange_code.assert_called_once_with(code)
455452 self.provider._parse_id_token.assert_not_called()
622619 self.get_success(self.handler.handle_oidc_callback(request))
623620
624621 auth_handler.complete_sso_login.assert_called_once_with(
625 "@foo:test", request, client_redirect_url, {"phone": "1234567"},
622 "@foo:test",
623 request,
624 client_redirect_url,
625 {"phone": "1234567"},
626 new_user=True,
626627 )
627628
628629 def test_map_userinfo_to_user(self):
636637 }
637638 self.get_success(_make_callback_with_userinfo(self.hs, userinfo))
638639 auth_handler.complete_sso_login.assert_called_once_with(
639 "@test_user:test", ANY, ANY, None,
640 "@test_user:test", ANY, ANY, None, new_user=True
640641 )
641642 auth_handler.complete_sso_login.reset_mock()
642643
647648 }
648649 self.get_success(_make_callback_with_userinfo(self.hs, userinfo))
649650 auth_handler.complete_sso_login.assert_called_once_with(
650 "@test_user_2:test", ANY, ANY, None,
651 "@test_user_2:test", ANY, ANY, None, new_user=True
651652 )
652653 auth_handler.complete_sso_login.reset_mock()
653654
684685 }
685686 self.get_success(_make_callback_with_userinfo(self.hs, userinfo))
686687 auth_handler.complete_sso_login.assert_called_once_with(
687 user.to_string(), ANY, ANY, None,
688 user.to_string(), ANY, ANY, None, new_user=False
688689 )
689690 auth_handler.complete_sso_login.reset_mock()
690691
691692 # Subsequent calls should map to the same mxid.
692693 self.get_success(_make_callback_with_userinfo(self.hs, userinfo))
693694 auth_handler.complete_sso_login.assert_called_once_with(
694 user.to_string(), ANY, ANY, None,
695 user.to_string(), ANY, ANY, None, new_user=False
695696 )
696697 auth_handler.complete_sso_login.reset_mock()
697698
706707 }
707708 self.get_success(_make_callback_with_userinfo(self.hs, userinfo))
708709 auth_handler.complete_sso_login.assert_called_once_with(
709 user.to_string(), ANY, ANY, None,
710 user.to_string(), ANY, ANY, None, new_user=False
710711 )
711712 auth_handler.complete_sso_login.reset_mock()
712713
742743
743744 self.get_success(_make_callback_with_userinfo(self.hs, userinfo))
744745 auth_handler.complete_sso_login.assert_called_once_with(
745 "@TEST_USER_2:test", ANY, ANY, None,
746 "@TEST_USER_2:test", ANY, ANY, None, new_user=False
746747 )
747748
748749 def test_map_userinfo_to_invalid_localpart(self):
778779
779780 # test_user is already taken, so test_user1 gets registered instead.
780781 auth_handler.complete_sso_login.assert_called_once_with(
781 "@test_user1:test", ANY, ANY, None,
782 "@test_user1:test", ANY, ANY, None, new_user=True
782783 )
783784 auth_handler.complete_sso_login.reset_mock()
784785
130130
131131 # check that the auth handler got called as expected
132132 auth_handler.complete_sso_login.assert_called_once_with(
133 "@test_user:test", request, "redirect_uri", None
133 "@test_user:test", request, "redirect_uri", None, new_user=True
134134 )
135135
136136 @override_config({"saml2_config": {"grandfathered_mxid_source_attribute": "mxid"}})
156156
157157 # check that the auth handler got called as expected
158158 auth_handler.complete_sso_login.assert_called_once_with(
159 "@test_user:test", request, "", None
159 "@test_user:test", request, "", None, new_user=False
160160 )
161161
162162 # Subsequent calls should map to the same mxid.
165165 self.handler._handle_authn_response(request, saml_response, "")
166166 )
167167 auth_handler.complete_sso_login.assert_called_once_with(
168 "@test_user:test", request, "", None
168 "@test_user:test", request, "", None, new_user=False
169169 )
170170
171171 def test_map_saml_response_to_invalid_localpart(self):
213213
214214 # test_user is already taken, so test_user1 gets registered instead.
215215 auth_handler.complete_sso_login.assert_called_once_with(
216 "@test_user1:test", request, "", None
216 "@test_user1:test", request, "", None, new_user=True
217217 )
218218 auth_handler.complete_sso_login.reset_mock()
219219
186186 # We should get emailed about those messages
187187 self._check_for_mail()
188188
189 def test_multiple_rooms(self):
190 # We want to test multiple notifications from multiple rooms, so we pause
191 # processing of push while we send messages.
192 self.pusher._pause_processing()
193
194 # Create two rooms, each shared with one of the other users
195 rooms = [
196 self.helper.create_room_as(self.user_id, tok=self.access_token),
197 self.helper.create_room_as(self.user_id, tok=self.access_token),
198 ]
199
200 for r, other in zip(rooms, self.others):
201 self.helper.invite(
202 room=r, src=self.user_id, tok=self.access_token, targ=other.id
203 )
204 self.helper.join(room=r, user=other.id, tok=other.token)
205
206 # The other users send some messages
207 self.helper.send(rooms[0], body="Hi!", tok=self.others[0].token)
208 self.helper.send(rooms[1], body="There!", tok=self.others[1].token)
209 self.helper.send(rooms[1], body="There!", tok=self.others[1].token)
210
211 # Nothing should have happened yet, as we're paused.
212 assert not self.email_attempts
213
214 self.pusher._resume_processing()
215
216 # We should get emailed about those messages
217 self._check_for_mail()
218
189219 def test_encrypted_message(self):
190220 room = self.helper.create_room_as(self.user_id, tok=self.access_token)
191221 self.helper.invite(
0 # Copyright 2021 The Matrix.org Foundation C.I.C.
1 #
2 # Licensed under the Apache License, Version 2.0 (the "License");
3 # you may not use this file except in compliance with the License.
4 # You may obtain a copy of the License at
5 #
6 # http://www.apache.org/licenses/LICENSE-2.0
7 #
8 # Unless required by applicable law or agreed to in writing, software
9 # distributed under the License is distributed on an "AS IS" BASIS,
10 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
11 # See the License for the specific language governing permissions and
12 # limitations under the License.
13
14 from typing import Iterable, Optional, Tuple
15
16 from synapse.api.constants import EventTypes, Membership
17 from synapse.api.room_versions import RoomVersions
18 from synapse.events import FrozenEvent
19 from synapse.push.presentable_names import calculate_room_name
20 from synapse.types import StateKey, StateMap
21
22 from tests import unittest
23
24
25 class MockDataStore:
26 """
27 A fake data store which stores a mapping of state key to event content.
28 (I.e. the state key is used as the event ID.)
29 """
30
31 def __init__(self, events: Iterable[Tuple[StateKey, dict]]):
32 """
33 Args:
34 events: A state map to event contents.
35 """
36 self._events = {}
37
38 for i, (event_id, content) in enumerate(events):
39 self._events[event_id] = FrozenEvent(
40 {
41 "event_id": "$event_id",
42 "type": event_id[0],
43 "sender": "@user:test",
44 "state_key": event_id[1],
45 "room_id": "#room:test",
46 "content": content,
47 "origin_server_ts": i,
48 },
49 RoomVersions.V1,
50 )
51
52 async def get_event(
53 self, event_id: StateKey, allow_none: bool = False
54 ) -> Optional[FrozenEvent]:
55 assert allow_none, "Mock not configured for allow_none = False"
56
57 return self._events.get(event_id)
58
59 async def get_events(self, event_ids: Iterable[StateKey]):
60 # This is cheating since it just returns all events.
61 return self._events
62
63
64 class PresentableNamesTestCase(unittest.HomeserverTestCase):
65 USER_ID = "@test:test"
66 OTHER_USER_ID = "@user:test"
67
68 def _calculate_room_name(
69 self,
70 events: StateMap[dict],
71 user_id: str = "",
72 fallback_to_members: bool = True,
73 fallback_to_single_member: bool = True,
74 ):
75 # This isn't 100% accurate, but works with MockDataStore.
76 room_state_ids = {k[0]: k[0] for k in events}
77
78 return self.get_success(
79 calculate_room_name(
80 MockDataStore(events),
81 room_state_ids,
82 user_id or self.USER_ID,
83 fallback_to_members,
84 fallback_to_single_member,
85 )
86 )
87
88 def test_name(self):
89 """A room name event should be used."""
90 events = [
91 ((EventTypes.Name, ""), {"name": "test-name"}),
92 ]
93 self.assertEqual("test-name", self._calculate_room_name(events))
94
95 # Check if the event content has garbage.
96 events = [((EventTypes.Name, ""), {"foo": 1})]
97 self.assertEqual("Empty Room", self._calculate_room_name(events))
98
99 events = [((EventTypes.Name, ""), {"name": 1})]
100 self.assertEqual(1, self._calculate_room_name(events))
101
102 def test_canonical_alias(self):
103 """A canonical alias should be used."""
104 events = [
105 ((EventTypes.CanonicalAlias, ""), {"alias": "#test-name:test"}),
106 ]
107 self.assertEqual("#test-name:test", self._calculate_room_name(events))
108
109 # Check if the event content has garbage.
110 events = [((EventTypes.CanonicalAlias, ""), {"foo": 1})]
111 self.assertEqual("Empty Room", self._calculate_room_name(events))
112
113 events = [((EventTypes.CanonicalAlias, ""), {"alias": "test-name"})]
114 self.assertEqual("Empty Room", self._calculate_room_name(events))
115
116 def test_invite(self):
117 """An invite has special behaviour."""
118 events = [
119 ((EventTypes.Member, self.USER_ID), {"membership": Membership.INVITE}),
120 ((EventTypes.Member, self.OTHER_USER_ID), {"displayname": "Other User"}),
121 ]
122 self.assertEqual("Invite from Other User", self._calculate_room_name(events))
123 self.assertIsNone(
124 self._calculate_room_name(events, fallback_to_single_member=False)
125 )
126 # Ensure this logic is skipped if we don't fallback to members.
127 self.assertIsNone(self._calculate_room_name(events, fallback_to_members=False))
128
129 # Check if the event content has garbage.
130 events = [
131 ((EventTypes.Member, self.USER_ID), {"membership": Membership.INVITE}),
132 ((EventTypes.Member, self.OTHER_USER_ID), {"foo": 1}),
133 ]
134 self.assertEqual("Invite from @user:test", self._calculate_room_name(events))
135
136 # No member event for sender.
137 events = [
138 ((EventTypes.Member, self.USER_ID), {"membership": Membership.INVITE}),
139 ]
140 self.assertEqual("Room Invite", self._calculate_room_name(events))
141
142 def test_no_members(self):
143 """Behaviour of an empty room."""
144 events = []
145 self.assertEqual("Empty Room", self._calculate_room_name(events))
146
147 # Note that events with invalid (or missing) membership are ignored.
148 events = [
149 ((EventTypes.Member, self.OTHER_USER_ID), {"foo": 1}),
150 ((EventTypes.Member, "@foo:test"), {"membership": "foo"}),
151 ]
152 self.assertEqual("Empty Room", self._calculate_room_name(events))
153
154 def test_no_other_members(self):
155 """Behaviour of a room with no other members in it."""
156 events = [
157 (
158 (EventTypes.Member, self.USER_ID),
159 {"membership": Membership.JOIN, "displayname": "Me"},
160 ),
161 ]
162 self.assertEqual("Me", self._calculate_room_name(events))
163
164 # Check if the event content has no displayname.
165 events = [
166 ((EventTypes.Member, self.USER_ID), {"membership": Membership.JOIN}),
167 ]
168 self.assertEqual("@test:test", self._calculate_room_name(events))
169
170 # 3pid invite, use the other user (who is set as the sender).
171 events = [
172 ((EventTypes.Member, self.OTHER_USER_ID), {"membership": Membership.JOIN}),
173 ]
174 self.assertEqual(
175 "nobody", self._calculate_room_name(events, user_id=self.OTHER_USER_ID)
176 )
177
178 events = [
179 ((EventTypes.Member, self.OTHER_USER_ID), {"membership": Membership.JOIN}),
180 ((EventTypes.ThirdPartyInvite, self.OTHER_USER_ID), {}),
181 ]
182 self.assertEqual(
183 "Inviting email address",
184 self._calculate_room_name(events, user_id=self.OTHER_USER_ID),
185 )
186
187 def test_one_other_member(self):
188 """Behaviour of a room with a single other member."""
189 events = [
190 ((EventTypes.Member, self.USER_ID), {"membership": Membership.JOIN}),
191 (
192 (EventTypes.Member, self.OTHER_USER_ID),
193 {"membership": Membership.JOIN, "displayname": "Other User"},
194 ),
195 ]
196 self.assertEqual("Other User", self._calculate_room_name(events))
197 self.assertIsNone(
198 self._calculate_room_name(events, fallback_to_single_member=False)
199 )
200
201 # Check if the event content has no displayname and is an invite.
202 events = [
203 ((EventTypes.Member, self.USER_ID), {"membership": Membership.JOIN}),
204 (
205 (EventTypes.Member, self.OTHER_USER_ID),
206 {"membership": Membership.INVITE},
207 ),
208 ]
209 self.assertEqual("@user:test", self._calculate_room_name(events))
210
211 def test_other_members(self):
212 """Behaviour of a room with multiple other members."""
213 # Two other members.
214 events = [
215 ((EventTypes.Member, self.USER_ID), {"membership": Membership.JOIN}),
216 (
217 (EventTypes.Member, self.OTHER_USER_ID),
218 {"membership": Membership.JOIN, "displayname": "Other User"},
219 ),
220 ((EventTypes.Member, "@foo:test"), {"membership": Membership.JOIN}),
221 ]
222 self.assertEqual("Other User and @foo:test", self._calculate_room_name(events))
223
224 # Three or more other members.
225 events.append(
226 ((EventTypes.Member, "@fourth:test"), {"membership": Membership.INVITE})
227 )
228 self.assertEqual("Other User and 2 others", self._calculate_room_name(events))
2828 "type": "m.room.history_visibility",
2929 "sender": "@user:test",
3030 "state_key": "",
31 "room_id": "@room:test",
31 "room_id": "#room:test",
3232 "content": content,
3333 },
3434 RoomVersions.V1,
211211 # Fake in memory Redis server that servers can connect to.
212212 self._redis_server = FakeRedisPubSubServer()
213213
214 # We may have an attempt to connect to redis for the external cache already.
215 self.connect_any_redis_attempts()
216
214217 store = self.hs.get_datastore()
215218 self.database_pool = store.db_pool
216219
400403 fake one.
401404 """
402405 clients = self.reactor.tcpClients
403 self.assertEqual(len(clients), 1)
404 (host, port, client_factory, _timeout, _bindAddress) = clients.pop(0)
405 self.assertEqual(host, "localhost")
406 self.assertEqual(port, 6379)
407
408 client_protocol = client_factory.buildProtocol(None)
409 server_protocol = self._redis_server.buildProtocol(None)
410
411 client_to_server_transport = FakeTransport(
412 server_protocol, self.reactor, client_protocol
413 )
414 client_protocol.makeConnection(client_to_server_transport)
415
416 server_to_client_transport = FakeTransport(
417 client_protocol, self.reactor, server_protocol
418 )
419 server_protocol.makeConnection(server_to_client_transport)
420
421 return client_to_server_transport, server_to_client_transport
406 while clients:
407 (host, port, client_factory, _timeout, _bindAddress) = clients.pop(0)
408 self.assertEqual(host, "localhost")
409 self.assertEqual(port, 6379)
410
411 client_protocol = client_factory.buildProtocol(None)
412 server_protocol = self._redis_server.buildProtocol(None)
413
414 client_to_server_transport = FakeTransport(
415 server_protocol, self.reactor, client_protocol
416 )
417 client_protocol.makeConnection(client_to_server_transport)
418
419 server_to_client_transport = FakeTransport(
420 client_protocol, self.reactor, server_protocol
421 )
422 server_protocol.makeConnection(server_to_client_transport)
422423
423424
424425 class TestReplicationDataHandler(GenericWorkerReplicationHandler):
623624 (channel,) = args
624625 self._server.add_subscriber(self)
625626 self.send(["subscribe", channel, 1])
627
628 # Since we use SET/GET to cache things we can safely no-op them.
629 elif command == b"SET":
630 self.send("OK")
631 elif command == b"GET":
632 self.send(None)
626633 else:
627634 raise Exception("Unknown command")
628635
644651 # We assume bytes are just unicode strings.
645652 obj = obj.decode("utf-8")
646653
654 if obj is None:
655 return "$-1\r\n"
647656 if isinstance(obj, str):
648657 return "${len}\r\n{str}\r\n".format(len=len(obj), str=obj)
649658 if isinstance(obj, int):
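The fake Redis server now answers `GET` with `None`, so its encoder needs the RESP null bulk string (`$-1\r\n`) alongside the existing string and integer cases. A minimal encoder covering the value types the fake server emits, following the RESP wire format:

```python
def resp_encode(obj) -> str:
    """Encode a Python value in the Redis RESP wire format."""
    if obj is None:
        return "$-1\r\n"               # null bulk string
    if isinstance(obj, bytes):
        obj = obj.decode("utf-8")      # treat bytes as UTF-8 text
    if isinstance(obj, str):
        return "$%d\r\n%s\r\n" % (len(obj), obj)
    if isinstance(obj, int):
        return ":%d\r\n" % obj
    if isinstance(obj, list):
        return "*%d\r\n%s" % (len(obj), "".join(resp_encode(o) for o in obj))
    raise TypeError("cannot encode %r" % (obj,))
```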
11781178 ["@admin:test", "@bar:test", "@foobar:test"], channel.json_body["members"]
11791179 )
11801180 self.assertEqual(channel.json_body["total"], 3)
1181
1182 def test_room_state(self):
1183 """Test that room state can be requested correctly"""
1184 # Create a test room
1185 room_id = self.helper.create_room_as(self.admin_user, tok=self.admin_user_tok)
1186
1187 url = "/_synapse/admin/v1/rooms/%s/state" % (room_id,)
1188 channel = self.make_request(
1189 "GET", url.encode("ascii"), access_token=self.admin_user_tok,
1190 )
1191 self.assertEqual(200, channel.code, msg=channel.json_body)
1192 self.assertIn("state", channel.json_body)
1193 # Testing that the state events match is painful and is not done here. We assume
1194 # that create_room already does the right thing, so there is no need to verify the
1195 # state events it created.
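If a light sanity check on the returned state were wanted, scanning the list for the `m.room.create` event would be a cheap middle ground (a hypothetical helper, not part of the test above):

```python
def has_create_event(state_events) -> bool:
    """Hypothetical sanity check on a /state response: look only for
    the m.room.create event instead of comparing the full event list."""
    return any(ev.get("type") == "m.room.create" for ev in state_events)


# Minimal example of the shape of a state response body's "state" list.
state = [
    {"type": "m.room.create", "state_key": "", "content": {"creator": "@admin:test"}},
    {"type": "m.room.member", "state_key": "@admin:test", "content": {"membership": "join"}},
]
```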
11811196
11821197
11831198 class JoinAliasRoomTestCase(unittest.HomeserverTestCase):
2727 from synapse.api.room_versions import RoomVersions
2828 from synapse.rest.client.v1 import login, logout, profile, room
2929 from synapse.rest.client.v2_alpha import devices, sync
30 from synapse.types import JsonDict
3031
3132 from tests import unittest
3233 from tests.test_utils import make_awaitable
467468 self.admin_user = self.register_user("admin", "pass", admin=True)
468469 self.admin_user_tok = self.login("admin", "pass")
469470
470 self.user1 = self.register_user(
471 "user1", "pass1", admin=False, displayname="Name 1"
472 )
473 self.user2 = self.register_user(
474 "user2", "pass2", admin=False, displayname="Name 2"
475 )
476
477471 def test_no_auth(self):
478472 """
479473 Try to list users without authentication.
487481 """
488482 If the user is not a server admin, an error is returned.
489483 """
484 self._create_users(1)
490485 other_user_token = self.login("user1", "pass1")
491486
492487 channel = self.make_request("GET", self.url, access_token=other_user_token)
498493 """
499494 List all users, including deactivated users.
500495 """
496 self._create_users(2)
497
501498 channel = self.make_request(
502499 "GET",
503500 self.url + "?deactivated=true",
510507 self.assertEqual(3, channel.json_body["total"])
511508
512509 # Check that all fields are available
513 for u in channel.json_body["users"]:
514 self.assertIn("name", u)
515 self.assertIn("is_guest", u)
516 self.assertIn("admin", u)
517 self.assertIn("user_type", u)
518 self.assertIn("deactivated", u)
519 self.assertIn("displayname", u)
520 self.assertIn("avatar_url", u)
510 self._check_fields(channel.json_body["users"])
521511
522512 def test_search_term(self):
523513 """Test that searching for users works correctly"""
548538
549539 # Check that users were returned
550540 self.assertTrue("users" in channel.json_body)
541 self._check_fields(channel.json_body["users"])
551542 users = channel.json_body["users"]
552543
553544 # Check that the expected number of users were returned
560551 u = users[0]
561552 self.assertEqual(expected_user_id, u["name"])
562553
554 self._create_users(2)
555
556 user1 = "@user1:test"
557 user2 = "@user2:test"
558
563559 # Perform search tests
564 _search_test(self.user1, "er1")
565 _search_test(self.user1, "me 1")
566
567 _search_test(self.user2, "er2")
568 _search_test(self.user2, "me 2")
569
570 _search_test(self.user1, "er1", "user_id")
571 _search_test(self.user2, "er2", "user_id")
560 _search_test(user1, "er1")
561 _search_test(user1, "me 1")
562
563 _search_test(user2, "er2")
564 _search_test(user2, "me 2")
565
566 _search_test(user1, "er1", "user_id")
567 _search_test(user2, "er2", "user_id")
572568
573569 # Test case insensitive
574 _search_test(self.user1, "ER1")
575 _search_test(self.user1, "NAME 1")
576
577 _search_test(self.user2, "ER2")
578 _search_test(self.user2, "NAME 2")
579
580 _search_test(self.user1, "ER1", "user_id")
581 _search_test(self.user2, "ER2", "user_id")
570 _search_test(user1, "ER1")
571 _search_test(user1, "NAME 1")
572
573 _search_test(user2, "ER2")
574 _search_test(user2, "NAME 2")
575
576 _search_test(user1, "ER1", "user_id")
577 _search_test(user2, "ER2", "user_id")
582578
583579 _search_test(None, "foo")
584580 _search_test(None, "bar")
585581
586582 _search_test(None, "foo", "user_id")
587583 _search_test(None, "bar", "user_id")
584
585 def test_invalid_parameter(self):
586 """
587 If parameters are invalid, an error is returned.
588 """
589
590 # negative limit
591 channel = self.make_request(
592 "GET", self.url + "?limit=-5", access_token=self.admin_user_tok,
593 )
594
595 self.assertEqual(400, int(channel.result["code"]), msg=channel.result["body"])
596 self.assertEqual(Codes.INVALID_PARAM, channel.json_body["errcode"])
597
598 # negative from
599 channel = self.make_request(
600 "GET", self.url + "?from=-5", access_token=self.admin_user_tok,
601 )
602
603 self.assertEqual(400, int(channel.result["code"]), msg=channel.result["body"])
604 self.assertEqual(Codes.INVALID_PARAM, channel.json_body["errcode"])
605
606 # invalid guests
607 channel = self.make_request(
608 "GET", self.url + "?guests=not_bool", access_token=self.admin_user_tok,
609 )
610
611 self.assertEqual(400, int(channel.result["code"]), msg=channel.result["body"])
612 self.assertEqual(Codes.UNKNOWN, channel.json_body["errcode"])
613
614 # invalid deactivated
615 channel = self.make_request(
616 "GET", self.url + "?deactivated=not_bool", access_token=self.admin_user_tok,
617 )
618
619 self.assertEqual(400, int(channel.result["code"]), msg=channel.result["body"])
620 self.assertEqual(Codes.UNKNOWN, channel.json_body["errcode"])
621
622 def test_limit(self):
623 """
624 Testing list of users with limit
625 """
626
627 number_users = 20
628 # Create one less user (since there's already an admin user).
629 self._create_users(number_users - 1)
630
631 channel = self.make_request(
632 "GET", self.url + "?limit=5", access_token=self.admin_user_tok,
633 )
634
635 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
636 self.assertEqual(channel.json_body["total"], number_users)
637 self.assertEqual(len(channel.json_body["users"]), 5)
638 self.assertEqual(channel.json_body["next_token"], "5")
639 self._check_fields(channel.json_body["users"])
640
641 def test_from(self):
642 """
643 Testing list of users with a defined starting point (from)
644 """
645
646 number_users = 20
647 # Create one less user (since there's already an admin user).
648 self._create_users(number_users - 1)
649
650 channel = self.make_request(
651 "GET", self.url + "?from=5", access_token=self.admin_user_tok,
652 )
653
654 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
655 self.assertEqual(channel.json_body["total"], number_users)
656 self.assertEqual(len(channel.json_body["users"]), 15)
657 self.assertNotIn("next_token", channel.json_body)
658 self._check_fields(channel.json_body["users"])
659
660 def test_limit_and_from(self):
661 """
662 Testing list of users with a defined starting point and limit
663 """
664
665 number_users = 20
666 # Create one less user (since there's already an admin user).
667 self._create_users(number_users - 1)
668
669 channel = self.make_request(
670 "GET", self.url + "?from=5&limit=10", access_token=self.admin_user_tok,
671 )
672
673 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
674 self.assertEqual(channel.json_body["total"], number_users)
675 self.assertEqual(channel.json_body["next_token"], "15")
676 self.assertEqual(len(channel.json_body["users"]), 10)
677 self._check_fields(channel.json_body["users"])
678
679 def test_next_token(self):
680 """
681 Testing that `next_token` appears in the right place
682 """
683
684 number_users = 20
685 # Create one less user (since there's already an admin user).
686 self._create_users(number_users - 1)
687
688 # `next_token` does not appear
689 # Number of results is the number of entries
690 channel = self.make_request(
691 "GET", self.url + "?limit=20", access_token=self.admin_user_tok,
692 )
693
694 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
695 self.assertEqual(channel.json_body["total"], number_users)
696 self.assertEqual(len(channel.json_body["users"]), number_users)
697 self.assertNotIn("next_token", channel.json_body)
698
699 # `next_token` does not appear
700 # The max number of results is larger than the number of entries
701 channel = self.make_request(
702 "GET", self.url + "?limit=21", access_token=self.admin_user_tok,
703 )
704
705 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
706 self.assertEqual(channel.json_body["total"], number_users)
707 self.assertEqual(len(channel.json_body["users"]), number_users)
708 self.assertNotIn("next_token", channel.json_body)
709
710 # `next_token` does appear
711 # The max number of results is smaller than the number of entries
712 channel = self.make_request(
713 "GET", self.url + "?limit=19", access_token=self.admin_user_tok,
714 )
715
716 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
717 self.assertEqual(channel.json_body["total"], number_users)
718 self.assertEqual(len(channel.json_body["users"]), 19)
719 self.assertEqual(channel.json_body["next_token"], "19")
720
721 # Check
722 # Set `from` to the value of `next_token` to request the remaining entries
723 # `next_token` does not appear
724 channel = self.make_request(
725 "GET", self.url + "?from=19", access_token=self.admin_user_tok,
726 )
727
728 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
729 self.assertEqual(channel.json_body["total"], number_users)
730 self.assertEqual(len(channel.json_body["users"]), 1)
731 self.assertNotIn("next_token", channel.json_body)
732
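The `from`/`limit`/`next_token` tests above pin down the paging contract: `next_token` is present only while entries remain, and its value is the `from` for the next request. A client-side sketch of following that contract (`fake_fetch` is a hypothetical stand-in for the users-list endpoint, not a real HTTP call):

```python
def paginate(fetch_page, limit=5):
    """Collect all users by following `next_token`, as the tests above
    expect clients to do. `fetch_page(from_, limit)` stands in for
    GET .../users?from=...&limit=... and returns the JSON body."""
    users, from_ = [], 0
    while True:
        body = fetch_page(from_, limit)
        users.extend(body["users"])
        if "next_token" not in body:  # absent on the final page
            return users
        from_ = int(body["next_token"])


# A fake server over 12 users, mirroring the token rules the tests assert:
# next_token is emitted only when entries remain past this page.
ALL = ["user%d" % i for i in range(12)]


def fake_fetch(from_, limit):
    body = {"users": ALL[from_:from_ + limit], "total": len(ALL)}
    if from_ + limit < len(ALL):
        body["next_token"] = str(from_ + limit)
    return body
```

Note that, per the tests, `next_token` is omitted (not empty) when `limit` meets or exceeds the remaining entries.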
733 def _check_fields(self, content: JsonDict):
734 """Checks that the expected user attributes are present in content
735 Args:
736 content: List that is checked for content
737 """
738 for u in content:
739 self.assertIn("name", u)
740 self.assertIn("is_guest", u)
741 self.assertIn("admin", u)
742 self.assertIn("user_type", u)
743 self.assertIn("deactivated", u)
744 self.assertIn("displayname", u)
745 self.assertIn("avatar_url", u)
746
747 def _create_users(self, number_users: int):
748 """
749 Create a number of users
750 Args:
751 number_users: Number of users to be created
752 """
753 for i in range(1, number_users + 1):
754 self.register_user(
755 "user%d" % i, "pass%d" % i, admin=False, displayname="Name %d" % i,
756 )
588757
589758
590759 class DeactivateAccountTestCase(unittest.HomeserverTestCase):
22102379 self.assertEqual(200, channel.code, msg=channel.json_body)
22112380 self.assertEqual(self.other_user, channel.json_body["user_id"])
22122381 self.assertIn("devices", channel.json_body)
2382
2383
2384 class ShadowBanRestTestCase(unittest.HomeserverTestCase):
2385
2386 servlets = [
2387 synapse.rest.admin.register_servlets,
2388 login.register_servlets,
2389 ]
2390
2391 def prepare(self, reactor, clock, hs):
2392 self.store = hs.get_datastore()
2393
2394 self.admin_user = self.register_user("admin", "pass", admin=True)
2395 self.admin_user_tok = self.login("admin", "pass")
2396
2397 self.other_user = self.register_user("user", "pass")
2398
2399 self.url = "/_synapse/admin/v1/users/%s/shadow_ban" % urllib.parse.quote(
2400 self.other_user
2401 )
2402
2403 def test_no_auth(self):
2404 """
2405 Try to shadow-ban a user without authentication.
2406 """
2407 channel = self.make_request("POST", self.url)
2408 self.assertEqual(401, int(channel.result["code"]), msg=channel.result["body"])
2409 self.assertEqual(Codes.MISSING_TOKEN, channel.json_body["errcode"])
2410
2411 def test_requester_is_not_admin(self):
2412 """
2413 If the user is not a server admin, an error is returned.
2414 """
2415 other_user_token = self.login("user", "pass")
2416
2417 channel = self.make_request("POST", self.url, access_token=other_user_token)
2418 self.assertEqual(403, int(channel.result["code"]), msg=channel.result["body"])
2419 self.assertEqual(Codes.FORBIDDEN, channel.json_body["errcode"])
2420
2421 def test_user_is_not_local(self):
2422 """
2423 Tests that shadow-banning a user that is not local returns a 400
2424 """
2425 url = "/_synapse/admin/v1/users/@unknown_person:unknown_domain/shadow_ban"
2426
2427 channel = self.make_request("POST", url, access_token=self.admin_user_tok)
2428 self.assertEqual(400, channel.code, msg=channel.json_body)
2429
2430 def test_success(self):
2431 """
2432 Shadow-banning should succeed for an admin.
2433 """
2434 # The user starts off as not shadow-banned.
2435 other_user_token = self.login("user", "pass")
2436 result = self.get_success(self.store.get_user_by_access_token(other_user_token))
2437 self.assertFalse(result.shadow_banned)
2438
2439 channel = self.make_request("POST", self.url, access_token=self.admin_user_tok)
2440 self.assertEqual(200, channel.code, msg=channel.json_body)
2441 self.assertEqual({}, channel.json_body)
2442
2443 # Ensure the user is shadow-banned (and the cache was cleared).
2444 result = self.get_success(self.store.get_user_by_access_token(other_user_token))
2445 self.assertTrue(result.shadow_banned)
1717 from synapse.api.constants import EventTypes
1818 from synapse.rest.client.v1 import directory, login, profile, room
1919 from synapse.rest.client.v2_alpha import room_upgrade_rest_servlet
20 from synapse.types import UserID
2021
2122 from tests import unittest
2223
3031 self.store = self.hs.get_datastore()
3132
3233 self.get_success(
33 self.store.db_pool.simple_update(
34 table="users",
35 keyvalues={"name": self.banned_user_id},
36 updatevalues={"shadow_banned": True},
37 desc="shadow_ban",
38 )
34 self.store.set_shadow_banned(UserID.from_string(self.banned_user_id), True)
3935 )
4036
4137 self.other_user_id = self.register_user("otheruser", "pass")
2828 from synapse.rest.client.v1 import login, logout
2929 from synapse.rest.client.v2_alpha import devices, register
3030 from synapse.rest.client.v2_alpha.account import WhoamiRestServlet
31 from synapse.rest.synapse.client.pick_idp import PickIdpResource
32 from synapse.rest.synapse.client.pick_username import pick_username_resource
31 from synapse.rest.synapse.client import build_synapse_client_resource_tree
3332 from synapse.types import create_requester
3433
3534 from tests import unittest
7372
7473 # the query params in TEST_CLIENT_REDIRECT_URL
7574 EXPECTED_CLIENT_REDIRECT_URL_PARAMS = [("<ab c>", ""), ('q" =+"', '"fö&=o"')]
75
76 # (possibly experimental) login flows we expect to appear in the list after the normal
77 # ones
78 ADDITIONAL_LOGIN_FLOWS = [{"type": "uk.half-shot.msc2778.login.application_service"}]
7679
7780
7881 class LoginRestServletTestCase(unittest.HomeserverTestCase):
418421 return config
419422
420423 def create_resource_dict(self) -> Dict[str, Resource]:
421 from synapse.rest.oidc import OIDCResource
422
423424 d = super().create_resource_dict()
424 d["/_synapse/client/pick_idp"] = PickIdpResource(self.hs)
425 d["/_synapse/oidc"] = OIDCResource(self.hs)
425 d.update(build_synapse_client_resource_tree(self.hs))
426426 return d
427
428 def test_get_login_flows(self):
429 """GET /login should return password and SSO flows"""
430 channel = self.make_request("GET", "/_matrix/client/r0/login")
431 self.assertEqual(channel.code, 200, channel.result)
432
433 expected_flows = [
434 {"type": "m.login.cas"},
435 {"type": "m.login.sso"},
436 {"type": "m.login.token"},
437 {"type": "m.login.password"},
438 ] + ADDITIONAL_LOGIN_FLOWS
439
440 self.assertCountEqual(channel.json_body["flows"], expected_flows)
441
442 @override_config({"experimental_features": {"msc2858_enabled": True}})
443 def test_get_msc2858_login_flows(self):
444 """The SSO flow should include IdP info if MSC2858 is enabled"""
445 channel = self.make_request("GET", "/_matrix/client/r0/login")
446 self.assertEqual(channel.code, 200, channel.result)
447
448 # stick the flows results in a dict by type
449 flow_results = {} # type: Dict[str, Any]
450 for f in channel.json_body["flows"]:
451 flow_type = f["type"]
452 self.assertNotIn(
453 flow_type, flow_results, "duplicate flow type %s" % (flow_type,)
454 )
455 flow_results[flow_type] = f
456
457 self.assertIn("m.login.sso", flow_results, "m.login.sso was not returned")
458 sso_flow = flow_results.pop("m.login.sso")
459 # we should have a set of IdPs
460 self.assertCountEqual(
461 sso_flow["org.matrix.msc2858.identity_providers"],
462 [
463 {"id": "cas", "name": "CAS"},
464 {"id": "saml", "name": "SAML"},
465 {"id": "oidc-idp1", "name": "IDP1"},
466 {"id": "oidc", "name": "OIDC"},
467 ],
468 )
469
470 # the rest of the flows are simple
471 expected_flows = [
472 {"type": "m.login.cas"},
473 {"type": "m.login.token"},
474 {"type": "m.login.password"},
475 ] + ADDITIONAL_LOGIN_FLOWS
476
477 self.assertCountEqual(flow_results.values(), expected_flows)
427478
428479 def test_multi_sso_redirect(self):
429480 """/login/sso/redirect should redirect to an identity picker"""
563614 )
564615 self.assertEqual(channel.code, 400, channel.result)
565616
617 def test_client_idp_redirect_msc2858_disabled(self):
618 """If the client tries to pick an IdP but MSC2858 is disabled, return a 400"""
619 channel = self.make_request(
620 "GET",
621 "/_matrix/client/unstable/org.matrix.msc2858/login/sso/redirect/oidc?redirectUrl="
622 + urllib.parse.quote_plus(TEST_CLIENT_REDIRECT_URL),
623 )
624 self.assertEqual(channel.code, 400, channel.result)
625 self.assertEqual(channel.json_body["errcode"], "M_UNRECOGNIZED")
626
627 @override_config({"experimental_features": {"msc2858_enabled": True}})
628 def test_client_idp_redirect_to_unknown(self):
629 """If the client tries to pick an unknown IdP, return a 404"""
630 channel = self.make_request(
631 "GET",
632 "/_matrix/client/unstable/org.matrix.msc2858/login/sso/redirect/xxx?redirectUrl="
633 + urllib.parse.quote_plus(TEST_CLIENT_REDIRECT_URL),
634 )
635 self.assertEqual(channel.code, 404, channel.result)
636 self.assertEqual(channel.json_body["errcode"], "M_NOT_FOUND")
637
638 @override_config({"experimental_features": {"msc2858_enabled": True}})
639 def test_client_idp_redirect_to_oidc(self):
639 """If the client picks a known IdP, redirect to it"""
641 channel = self.make_request(
642 "GET",
643 "/_matrix/client/unstable/org.matrix.msc2858/login/sso/redirect/oidc?redirectUrl="
644 + urllib.parse.quote_plus(TEST_CLIENT_REDIRECT_URL),
645 )
646
647 self.assertEqual(channel.code, 302, channel.result)
648 oidc_uri = channel.headers.getRawHeaders("Location")[0]
649 oidc_uri_path, oidc_uri_query = oidc_uri.split("?", 1)
650
651 # it should redirect us to the auth page of the OIDC server
652 self.assertEqual(oidc_uri_path, TEST_OIDC_AUTH_ENDPOINT)
653
566654 @staticmethod
567655 def _get_value_from_macaroon(macaroon: pymacaroons.Macaroon, key: str) -> str:
568656 prefix = key + " = "
583671 self.redirect_path = "_synapse/client/login/sso/redirect/confirm"
584672
585673 config = self.default_config()
674 config["public_baseurl"] = (
675 config.get("public_baseurl") or "https://matrix.goodserver.com:8448"
676 )
586677 config["cas_config"] = {
587678 "enabled": True,
588679 "server_url": CAS_SERVER,
589 "service_url": "https://matrix.goodserver.com:8448",
590680 }
591681
592682 cas_user_id = "username"
11181208 return config
11191209
11201210 def create_resource_dict(self) -> Dict[str, Resource]:
1121 from synapse.rest.oidc import OIDCResource
1122
11231211 d = super().create_resource_dict()
1124 d["/_synapse/client/pick_username"] = pick_username_resource(self.hs)
1125 d["/_synapse/oidc"] = OIDCResource(self.hs)
1212 d.update(build_synapse_client_resource_tree(self.hs))
11261213 return d
11271214
11281215 def test_username_picker(self):
11361223 # that should redirect to the username picker
11371224 self.assertEqual(channel.code, 302, channel.result)
11381225 picker_url = channel.headers.getRawHeaders("Location")[0]
1139 self.assertEqual(picker_url, "/_synapse/client/pick_username")
1226 self.assertEqual(picker_url, "/_synapse/client/pick_username/account_details")
11401227
11411228 # ... with a username_mapping_session cookie
11421229 cookies = {} # type: Dict[str,str]
11601247 self.assertApproximates(session.expiry_time_ms, expected_expiry, tolerance=1000)
11611248
11621249 # Now, submit a username to the username picker, which should serve a redirect
1163 # back to the client
1164 submit_path = picker_url + "/submit"
1250 # to the completion page
11651251 content = urlencode({b"username": b"bobby"}).encode("utf8")
11661252 chan = self.make_request(
11671253 "POST",
1168 path=submit_path,
1254 path=picker_url,
11691255 content=content,
11701256 content_is_form=True,
11711257 custom_headers=[
11771263 )
11781264 self.assertEqual(chan.code, 302, chan.result)
11791265 location_headers = chan.headers.getRawHeaders("Location")
1266
1267 # send a request to the completion page, which should 302 to the client redirectUrl
1268 chan = self.make_request(
1269 "GET",
1270 path=location_headers[0],
1271 custom_headers=[("Cookie", "username_mapping_session=" + session_id)],
1272 )
1273 self.assertEqual(chan.code, 302, chan.result)
1274 location_headers = chan.headers.getRawHeaders("Location")
1275
11801276 # ensure that the returned location matches the requested redirect URL
11811277 path, query = location_headers[0].split("?", 1)
11821278 self.assertEqual(path, "https://x")
615615 self.assertEquals(json.loads(content), channel.json_body)
616616
617617
618 class RoomInviteRatelimitTestCase(RoomBase):
619 user_id = "@sid1:red"
620
621 servlets = [
622 admin.register_servlets,
623 profile.register_servlets,
624 room.register_servlets,
625 ]
626
627 @unittest.override_config(
628 {"rc_invites": {"per_room": {"per_second": 0.5, "burst_count": 3}}}
629 )
630 def test_invites_by_rooms_ratelimit(self):
631 """Tests that invites in a room are actually rate-limited."""
632 room_id = self.helper.create_room_as(self.user_id)
633
634 for i in range(3):
635 self.helper.invite(room_id, self.user_id, "@user-%s:red" % (i,))
636
637 self.helper.invite(room_id, self.user_id, "@user-4:red", expect_code=429)
638
639 @unittest.override_config(
640 {"rc_invites": {"per_user": {"per_second": 0.5, "burst_count": 3}}}
641 )
642 def test_invites_by_users_ratelimit(self):
643 """Tests that invites to a specific user are actually rate-limited."""
644
645 for i in range(3):
646 room_id = self.helper.create_room_as(self.user_id)
647 self.helper.invite(room_id, self.user_id, "@other-users:red")
648
649 room_id = self.helper.create_room_as(self.user_id)
650 self.helper.invite(room_id, self.user_id, "@other-users:red", expect_code=429)
651
652
618653 class RoomJoinRatelimitTestCase(RoomBase):
619654 user_id = "@sid1:red"
620655
2323
2424 import synapse.rest.admin
2525 from synapse.api.constants import LoginType, Membership
26 from synapse.api.errors import Codes
26 from synapse.api.errors import Codes, HttpResponseException
2727 from synapse.rest.client.v1 import login, room
2828 from synapse.rest.client.v2_alpha import account, register
2929 from synapse.rest.synapse.client.password_reset import PasswordResetSubmitTokenResource
111111 # Assert we can't log in with the old password
112112 self.attempt_wrong_password_login("kermit", old_password)
113113
114 @override_config({"rc_3pid_validation": {"burst_count": 3}})
115 def test_ratelimit_by_email(self):
116 """Test that we ratelimit /requestToken for the same email.
117 """
118 old_password = "monkey"
119 new_password = "kangeroo"
120
121 user_id = self.register_user("kermit", old_password)
122 self.login("kermit", old_password)
123
124 email = "test1@example.com"
125
126 # Add a threepid
127 self.get_success(
128 self.store.user_add_threepid(
129 user_id=user_id,
130 medium="email",
131 address=email,
132 validated_at=0,
133 added_at=0,
134 )
135 )
136
137 def reset(ip):
138 client_secret = "foobar"
139 session_id = self._request_token(email, client_secret, ip)
140
141 self.assertEquals(len(self.email_attempts), 1)
142 link = self._get_link_from_email()
143
144 self._validate_token(link)
145
146 self._reset_password(new_password, session_id, client_secret)
147
148 self.email_attempts.clear()
149
150 # We expect to be able to make three requests before getting rate
151 # limited.
152 #
153 # We change IPs to ensure that we're not being ratelimited due to the
154 # same IP
155 reset("127.0.0.1")
156 reset("127.0.0.2")
157 reset("127.0.0.3")
158
159 with self.assertRaises(HttpResponseException) as cm:
160 reset("127.0.0.4")
161
162 self.assertEqual(cm.exception.code, 429)
163
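`burst_count: 3` means a bucket of three tokens refilled at `per_second`, so three immediate requests pass and the fourth is rejected with 429 until tokens refill. A minimal token-bucket sketch with that behaviour (an illustration of the semantics, not Synapse's actual `Ratelimiter` class):

```python
class TokenBucket:
    """Minimal token-bucket ratelimiter: `burst_count` tokens,
    refilled at `per_second` tokens per second. Shows why the test
    above expects three resets to succeed and the fourth to 429."""

    def __init__(self, per_second: float, burst_count: int):
        self.rate = per_second
        self.capacity = burst_count
        self.tokens = float(burst_count)
        self.last = 0.0

    def allow(self, now: float) -> bool:
        # Refill based on elapsed time, capped at the burst capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False


# Three immediate requests drain the bucket; the fourth is refused.
bucket = TokenBucket(per_second=0.5, burst_count=3)
results = [bucket.allow(now=0.0) for _ in range(4)]
```

After waiting, tokens accrue again at `per_second`, so a later request succeeds.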
114164 def test_basic_password_reset_canonicalise_email(self):
115165 """Test basic password reset flow
116166 Request password reset with different spelling
238288
239289 self.assertIsNotNone(session_id)
240290
241 def _request_token(self, email, client_secret):
291 def _request_token(self, email, client_secret, ip="127.0.0.1"):
242292 channel = self.make_request(
243293 "POST",
244294 b"account/password/email/requestToken",
245295 {"client_secret": client_secret, "email": email, "send_attempt": 1},
246 )
247 self.assertEquals(200, channel.code, channel.result)
296 client_ip=ip,
297 )
298
299 if channel.code != 200:
300 raise HttpResponseException(
301 channel.code, channel.result["reason"], channel.result["body"],
302 )
248303
249304 return channel.json_body["sid"]
250305
508563 def test_address_trim(self):
509564 self.get_success(self._add_email(" foo@test.bar ", "foo@test.bar"))
510565
566 @override_config({"rc_3pid_validation": {"burst_count": 3}})
567 def test_ratelimit_by_ip(self):
568 """Tests that adding emails is ratelimited by IP
569 """
570
571 # We expect to be able to set three emails before getting ratelimited.
572 self.get_success(self._add_email("foo1@test.bar", "foo1@test.bar"))
573 self.get_success(self._add_email("foo2@test.bar", "foo2@test.bar"))
574 self.get_success(self._add_email("foo3@test.bar", "foo3@test.bar"))
575
576 with self.assertRaises(HttpResponseException) as cm:
577 self.get_success(self._add_email("foo4@test.bar", "foo4@test.bar"))
578
579 self.assertEqual(cm.exception.code, 429)
580
511581 def test_add_email_if_disabled(self):
512582 """Test adding email to profile when doing so is disallowed
513583 """
776846 body["next_link"] = next_link
777847
778848 channel = self.make_request("POST", b"account/3pid/email/requestToken", body,)
779 self.assertEquals(expect_code, channel.code, channel.result)
849
850 if channel.code != expect_code:
851 raise HttpResponseException(
852 channel.code, channel.result["reason"], channel.result["body"],
853 )
780854
781855 return channel.json_body.get("sid")
782856
822896 def _add_email(self, request_email, expected_email):
823897 """Test adding an email to profile
824898 """
899 previous_email_attempts = len(self.email_attempts)
900
825901 client_secret = "foobar"
826902 session_id = self._request_token(request_email, client_secret)
827903
828 self.assertEquals(len(self.email_attempts), 1)
904 self.assertEquals(len(self.email_attempts) - previous_email_attempts, 1)
829905 link = self._get_link_from_email()
830906
831907 self._validate_token(link)
854930
855931 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
856932 self.assertEqual("email", channel.json_body["threepids"][0]["medium"])
857 self.assertEqual(expected_email, channel.json_body["threepids"][0]["address"])
933
934 threepids = {threepid["address"] for threepid in channel.json_body["threepids"]}
935 self.assertIn(expected_email, threepids)
2121 from synapse.handlers.ui_auth.checkers import UserInteractiveAuthChecker
2222 from synapse.rest.client.v1 import login
2323 from synapse.rest.client.v2_alpha import auth, devices, register
24 from synapse.rest.oidc import OIDCResource
24 from synapse.rest.synapse.client import build_synapse_client_resource_tree
2525 from synapse.types import JsonDict, UserID
2626
2727 from tests import unittest
172172
173173 def create_resource_dict(self):
174174 resource_dict = super().create_resource_dict()
175 if HAS_OIDC:
176 # mount the OIDC resource at /_synapse/oidc
177 resource_dict["/_synapse/oidc"] = OIDCResource(self.hs)
175 resource_dict.update(build_synapse_client_resource_tree(self.hs))
178176 return resource_dict
179177
180178 def prepare(self, reactor, clock, hs):
201201
202202 config = self.default_config()
203203 config["media_store_path"] = self.media_store_path
204 config["thumbnail_requirements"] = {}
205204 config["max_image_pixels"] = 2000000
206205
207206 provider_config = {
312311 self.assertEqual(headers.getRawHeaders(b"Content-Disposition"), None)
313312
314313 def test_thumbnail_crop(self):
314 """Test that a cropped remote thumbnail is available."""
315315 self._test_thumbnail(
316316 "crop", self.test_image.expected_cropped, self.test_image.expected_found
317317 )
318318
319319 def test_thumbnail_scale(self):
320 """Test that a scaled remote thumbnail is available."""
320321 self._test_thumbnail(
321322 "scale", self.test_image.expected_scaled, self.test_image.expected_found
322323 )
324
325 def test_invalid_type(self):
326 """An invalid thumbnail type is never available."""
327 self._test_thumbnail("invalid", None, False)
328
329 @unittest.override_config(
330 {"thumbnail_sizes": [{"width": 32, "height": 32, "method": "scale"}]}
331 )
332 def test_no_thumbnail_crop(self):
333 """
334 Override the config to generate only scaled thumbnails, but request a cropped one.
335 """
336 self._test_thumbnail("crop", None, False)
337
338 @unittest.override_config(
339 {"thumbnail_sizes": [{"width": 32, "height": 32, "method": "crop"}]}
340 )
341 def test_no_thumbnail_scale(self):
342 """
343 Override the config to generate only cropped thumbnails, but request a scaled one.
344 """
345 self._test_thumbnail("scale", None, False)
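The overrides above pin down that there is no cross-method substitution: a thumbnail generated by `scale` never satisfies a `crop` request, and vice versa. A sketch of that lookup rule (the `(width, height, method)` tuples are an illustrative model, not the real media-repo data structures, which also weigh candidate sizes):

```python
def find_thumbnail(generated, width, height, method):
    """Return a generated thumbnail matching the requested method, or
    None. Mirrors the rule the tests above assert: an unknown method,
    or a method no thumbnail was generated with, is never "found"."""
    for (w, h, m) in generated:
        if m == method:
            return (w, h, m)
    return None


# With only scaled thumbnails generated, crop requests find nothing.
generated = [(32, 32, "scale")]
```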
323346
324347 def _test_thumbnail(self, method, expected_body, expected_found):
325348 params = "?width=32&height=32&method=" + method
3939 "m.identity_server": {"base_url": "https://testis"},
4040 },
4141 )
42
43 def test_well_known_no_public_baseurl(self):
44 self.hs.config.public_baseurl = None
45
46 channel = self.make_request(
47 "GET", "/.well-known/matrix/client", shorthand=False
48 )
49
50 self.assertEqual(channel.code, 404)
4646 site = attr.ib(type=Site)
4747 _reactor = attr.ib()
4848 result = attr.ib(type=dict, default=attr.Factory(dict))
49 _ip = attr.ib(type=str, default="127.0.0.1")
4950 _producer = None
5051
5152 @property
119120 def getPeer(self):
120121 # We give an address so that getClientIP returns a non null entry,
121122 # causing us to record the MAU
122 return address.IPv4Address("TCP", "127.0.0.1", 3423)
123 return address.IPv4Address("TCP", self._ip, 3423)
123124
124125 def getHost(self):
125126 return None
195196 custom_headers: Optional[
196197 Iterable[Tuple[Union[bytes, str], Union[bytes, str]]]
197198 ] = None,
199 client_ip: str = "127.0.0.1",
198200 ) -> FakeChannel:
199201 """
200202 Make a web request using the given method, path and content, and render it
222224 will pump the reactor until the renderer tells the channel the request
223225 is finished.
224226
227 client_ip: The IP to use as the requesting IP. Useful for testing
228 ratelimiting.
229
225230 Returns:
226231 channel
227232 """
249254 if isinstance(content, str):
250255 content = content.encode("utf8")
251256
252 channel = FakeChannel(site, reactor)
257 channel = FakeChannel(site, reactor, ip=client_ip)
253258
254259 req = request(channel)
255260 req.content = BytesIO(content)
260260 html = ""
261261 og = decode_and_calc_og(html, "http://example.com/test.html")
262262 self.assertEqual(og, {})
263
264 def test_invalid_encoding(self):
265 """An invalid character encoding should be ignored and treated as UTF-8, if possible."""
266 html = """
267 <html>
268 <head><title>Foo</title></head>
269 <body>
270 Some text.
271 </body>
272 </html>
273 """
274 og = decode_and_calc_og(
275 html, "http://example.com/test.html", "invalid-encoding"
276 )
277 self.assertEqual(og, {"og:title": "Foo", "og:description": "Some text."})
278
279 def test_invalid_encoding2(self):
280 """A body which doesn't match the sent character encoding."""
281 # Note that this contains an invalid UTF-8 sequence in the title.
282 html = b"""
283 <html>
284 <head><title>\xff\xff Foo</title></head>
285 <body>
286 Some text.
287 </body>
288 </html>
289 """
290 og = decode_and_calc_og(html, "http://example.com/test.html")
291 self.assertEqual(og, {"og:title": "ÿÿ Foo", "og:description": "Some text."})
385385 custom_headers: Optional[
386386 Iterable[Tuple[Union[bytes, str], Union[bytes, str]]]
387387 ] = None,
388 client_ip: str = "127.0.0.1",
388389 ) -> FakeChannel:
389390 """
390391 Create a SynapseRequest at the path using the method and containing the
408409 tells the channel the request is finished.
409410
410411 custom_headers: (name, value) pairs to add as request headers
412
413 client_ip: The IP to use as the requesting IP. Useful for testing
414 ratelimiting.
411415
412416 Returns:
413417 The FakeChannel object which stores the result of the request.
425429 content_is_form,
426430 await_result,
427431 custom_headers,
432 client_ip,
428433 )
429434
430435 def setup_test_homeserver(self, *args, **kwargs):
3232 from synapse.config.database import DatabaseConnectionConfig
3333 from synapse.config.homeserver import HomeServerConfig
3434 from synapse.config.server import DEFAULT_ROOM_VERSION
35 from synapse.http.server import HttpServer
3635 from synapse.logging.context import current_context, set_current_context
3736 from synapse.server import HomeServer
3837 from synapse.storage import DataStore
157156 "local": {"per_second": 10000, "burst_count": 10000},
158157 "remote": {"per_second": 10000, "burst_count": 10000},
159158 },
159 "rc_3pid_validation": {"per_second": 10000, "burst_count": 10000},
160160 "saml2_enabled": False,
161 "public_baseurl": None,
161162 "default_identity_server": None,
162163 "key_refresh_interval": 24 * 60 * 60 * 1000,
163164 "old_signing_keys": {},
350351
351352
352353 # This is a mock /resource/ not an entire server
353 class MockHttpResource(HttpServer):
354 class MockHttpResource:
354355 def __init__(self, prefix=""):
355356 self.callbacks = [] # 3-tuple of method/pattern/function
356357 self.prefix = prefix
1717 # installed on that).
1818 #
1919 # anyway, make sure that we have a recent enough setuptools.
20 setuptools>=18.5
20 setuptools>=18.5 ; python_version >= '3.6'
21 setuptools>=18.5,<51.0.0 ; python_version < '3.6'
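The split pins above use PEP 508 environment markers to keep Python 3.5 on an older setuptools (setuptools 51.0.0 requires Python ≥ 3.6). A hypothetical helper, not part of the repo, showing the version gate those markers encode:

```python
# Illustrative only: the same gate the two requirement lines express with
# "; python_version >= '3.6'" / "; python_version < '3.6'" markers.
from typing import Tuple


def setuptools_pin(python_version: Tuple[int, int]) -> str:
    if python_version < (3, 6):
        # Cap below 51.0.0, the first setuptools release to drop Python 3.5
        return "setuptools>=18.5,<51.0.0"
    return "setuptools>=18.5"


print(setuptools_pin((3, 5)))  # -> setuptools>=18.5,<51.0.0
```

The same pattern is repeated for the `pip>=10` / `pip>=10,<21.0` pair below, since pip 21.0 likewise dropped Python 3.5 support.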
2122
2223 # we also need a semi-recent version of pip, because old ones fail to
2324 # install the "enum34" dependency of cryptography.
24 pip>=10
25 pip>=10 ; python_version >= '3.6'
26 pip>=10,<21.0 ; python_version < '3.6'
2527
26 # directories/files we run the linters on
28 # directories/files we run the linters on.
29 # if you update this list, make sure to do the same in scripts-dev/lint.sh
2730 lint_targets =
2831 setup.py
2932 synapse
102105 [testenv:py35-old]
103106 skip_install=True
104107 deps =
105 # Ensure a version of setuptools that supports Python 3.5 is installed.
106 setuptools < 51.0.0
107
108108 # Old automat version for Twisted
109109 Automat == 0.3.0
110
111110 lxml
112 coverage
113 coverage-enable-subprocess==1.0
111 {[base]deps}
114112
115113 commands =
116114 # Make all greater-thans equals so we test the oldest version of our direct
167165 skip_install = True
168166 deps =
169167 coverage
168 pip>=10 ; python_version >= '3.6'
169 pip>=10,<21.0 ; python_version < '3.6'
170170 commands=
171171 coverage combine
172172 coverage report
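The `{[base]deps}` reference introduced in the `py35-old` hunk above uses tox's section substitution, so shared dependencies are declared once and reused across envs. An illustrative fragment of that pattern (the section contents here are a sketch inferred from the removed lines; the authoritative list is in the repo's tox.ini):

```ini
# Shared deps declared once in a [base] section…
[base]
deps =
    coverage
    coverage-enable-subprocess==1.0

# …and pulled into individual envs via {[base]deps} substitution.
[testenv:py35-old]
deps =
    Automat == 0.3.0
    lxml
    {[base]deps}
```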