Codebase list matrix-synapse / 56044ca
New upstream version 1.25.0 (Andrej Shadura, 3 years ago)
251 changed files with 8040 additions and 3734 deletions.
`.circleci/config.yml`:

```yaml
      - image: docker:git
    steps:
      - checkout
      - docker_prepare
      - run: docker login --username $DOCKER_HUB_USERNAME --password $DOCKER_HUB_PASSWORD
      # for release builds, we want to get the amd64 image out asap, so first
      # we do an amd64-only build, before following up with a multiarch build.
      - docker_build:
          tag: -t matrixdotorg/synapse:${CIRCLE_TAG}
          platforms: linux/amd64

      # […]

      - image: docker:git
    steps:
      - checkout
      - docker_prepare
      - run: docker login --username $DOCKER_HUB_USERNAME --password $DOCKER_HUB_PASSWORD
      # for `latest`, we don't want the arm images to disappear, so don't update the tag
      # until all of the platforms are built.
      - docker_build:
          tag: -t matrixdotorg/synapse:latest
          platforms: linux/amd64,linux/arm/v7,linux/arm64

# […]

commands:
  docker_prepare:
    description: Sets up a remote docker server, downloads the buildx cli plugin, and enables multiarch images
    parameters:
      buildx_version:
        type: string
        default: "v0.4.1"
    steps:
      - setup_remote_docker:
          # 19.03.13 was the most recent available on circleci at the time of
          # writing.
          version: 19.03.13
      - run: apk add --no-cache curl
      - run: mkdir -vp ~/.docker/cli-plugins/ ~/dockercache
      - run: curl --silent -L "https://github.com/docker/buildx/releases/download/<< parameters.buildx_version >>/buildx-<< parameters.buildx_version >>.linux-amd64" > ~/.docker/cli-plugins/docker-buildx
```
`CHANGES.md`:

Synapse 1.25.0 (2021-01-13)
===========================

Ending Support for Python 3.5 and Postgres 9.5
----------------------------------------------

With this release, the Synapse team is announcing a formal deprecation policy for our platform dependencies, like Python and PostgreSQL:

All future releases of Synapse will follow the upstream end-of-life schedules.

Which means:

* This is the last release which guarantees support for Python 3.5.
* We will end support for PostgreSQL 9.5 early next month.
* We will end support for Python 3.6 and PostgreSQL 9.6 near the end of the year.

Crucially, this means __we will not produce .deb packages for Debian 9 (Stretch) or Ubuntu 16.04 (Xenial)__ beyond the transition period described below.

The website https://endoflife.date/ has convenient summaries of the support schedules for projects like [Python](https://endoflife.date/python) and [PostgreSQL](https://endoflife.date/postgresql).

If you are unable to upgrade your environment to a supported version of Python or Postgres, we encourage you to consider using the [Synapse Docker images](./INSTALL.md#docker-images-and-ansible-playbooks) instead.

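To see where an existing host stands relative to these cut-offs, the versions can be checked locally (a quick sketch using only standard tools; the PostgreSQL line assumes a Postgres-backed install and is commented out):

```sh
# Report the local Python version; 3.5 loses support with this release,
# and 3.6 follows near the end of the year.
python3 -c 'import sys; print("Python", ".".join(map(str, sys.version_info[:3])))'

# For PostgreSQL-backed installs, ask the server for its version
# (9.5 loses support early next month):
# psql -tAc 'SHOW server_version;'
```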
### Transition Period

We will make a good faith attempt to avoid breaking compatibility in all releases through the end of March 2021. However, critical security vulnerabilities in dependencies or other unanticipated circumstances may arise which necessitate breaking compatibility earlier.

We intend to continue producing .deb packages for Debian 9 (Stretch) and Ubuntu 16.04 (Xenial) through the transition period.

Removal warning
---------------

The old [Purge Room API](https://github.com/matrix-org/synapse/tree/master/docs/admin_api/purge_room.md)
and [Shutdown Room API](https://github.com/matrix-org/synapse/tree/master/docs/admin_api/shutdown_room.md)
are deprecated and will be removed in a future release. They will be replaced by the
[Delete Room API](https://github.com/matrix-org/synapse/tree/master/docs/admin_api/rooms.md#delete-room-api).

`POST /_synapse/admin/v1/rooms/<room_id>/delete` replaces `POST /_synapse/admin/v1/purge_room` and
`POST /_synapse/admin/v1/shutdown_room/<room_id>`.

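Put concretely, a single call replaces the two old ones (a sketch; `<room_id>` is a placeholder, and the `block` and `purge` flags are two of the options accepted by the Delete Room API; see its documentation, linked above, for the full parameter list):

```
POST /_synapse/admin/v1/rooms/<room_id>/delete
{
    "block": false,
    "purge": true
}
```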
Bugfixes
--------

- Fix HTTP proxy support when using a proxy that is on a blacklisted IP. Introduced in v1.25.0rc1. Contributed by @Bubu. ([\#9084](https://github.com/matrix-org/synapse/issues/9084))


Synapse 1.25.0rc1 (2021-01-06)
==============================

Features
--------

- Add an admin API that lets server admins get power in rooms in which local users have power. ([\#8756](https://github.com/matrix-org/synapse/issues/8756))
- Add optional HTTP authentication to replication endpoints. ([\#8853](https://github.com/matrix-org/synapse/issues/8853))
- Improve the error messages printed as a result of configuration problems for extension modules. ([\#8874](https://github.com/matrix-org/synapse/issues/8874))
- Add the number of local devices to Room Details Admin API. Contributed by @dklimpel. ([\#8886](https://github.com/matrix-org/synapse/issues/8886))
- Add `X-Robots-Tag` header to stop web crawlers from indexing media. Contributed by Aaron Raimist. ([\#8887](https://github.com/matrix-org/synapse/issues/8887))
- Spam-checkers may now define their methods as `async`. ([\#8890](https://github.com/matrix-org/synapse/issues/8890))
- Add support for allowing users to pick their own user ID during a single-sign-on login. ([\#8897](https://github.com/matrix-org/synapse/issues/8897), [\#8900](https://github.com/matrix-org/synapse/issues/8900), [\#8911](https://github.com/matrix-org/synapse/issues/8911), [\#8938](https://github.com/matrix-org/synapse/issues/8938), [\#8941](https://github.com/matrix-org/synapse/issues/8941), [\#8942](https://github.com/matrix-org/synapse/issues/8942), [\#8951](https://github.com/matrix-org/synapse/issues/8951))
- Add an `email.invite_client_location` configuration option to send a web client location to the invite endpoint on the identity server which allows customisation of the email template. ([\#8930](https://github.com/matrix-org/synapse/issues/8930))
- The search term in the list room and list user Admin APIs is now treated as case-insensitive. ([\#8931](https://github.com/matrix-org/synapse/issues/8931))
- Apply an IP range blacklist to push and key revocation requests. ([\#8821](https://github.com/matrix-org/synapse/issues/8821), [\#8870](https://github.com/matrix-org/synapse/issues/8870), [\#8954](https://github.com/matrix-org/synapse/issues/8954))
- Add an option to allow re-use of user-interactive authentication sessions for a period of time. ([\#8970](https://github.com/matrix-org/synapse/issues/8970))
- Allow running the redact endpoint on workers. ([\#8994](https://github.com/matrix-org/synapse/issues/8994))

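As an illustration of the async spam-checker change ([\#8890](https://github.com/matrix-org/synapse/issues/8890)): `check_event_for_spam` is the hook name used by Synapse's spam-checker interface, but the class below is purely illustrative, and the dict-shaped event stands in for Synapse's real event object. A minimal sketch:

```python
import asyncio


class ExampleSpamChecker:
    """Illustrative spam checker whose hook is defined as a coroutine."""

    def __init__(self, config, api=None):
        self.config = config

    async def check_event_for_spam(self, event) -> bool:
        # As of this release the hook may be async, so it can await I/O
        # (e.g. querying a reputation service) before deciding.
        body = event.get("content", {}).get("body", "")
        return "buy cheap" in body


# Example: exercising the async hook outside of Synapse.
checker = ExampleSpamChecker(config={})
event = {"content": {"body": "buy cheap followers"}}
print(asyncio.run(checker.check_event_for_spam(event)))  # prints True
```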

Bugfixes
--------

- Fix bug where we might not correctly calculate the current state for rooms with multiple extremities. ([\#8827](https://github.com/matrix-org/synapse/issues/8827))
- Fix a long-standing bug in the register admin endpoint (`/_synapse/admin/v1/register`) when the `mac` field was not provided. The endpoint now properly returns a 400 error. Contributed by @edwargix. ([\#8837](https://github.com/matrix-org/synapse/issues/8837))
- Fix a long-standing bug on Synapse instances supporting Single-Sign-On, where users would be prompted to enter their password to confirm certain actions, even though they had not set a password. ([\#8858](https://github.com/matrix-org/synapse/issues/8858))
- Fix a long-standing bug where a 500 error would be returned if the `Content-Length` header was not provided to the upload media resource. ([\#8862](https://github.com/matrix-org/synapse/issues/8862))
- Add additional validation to pusher URLs to be compliant with the specification. ([\#8865](https://github.com/matrix-org/synapse/issues/8865))
- Fix the error code that is returned when a user tries to register on a homeserver on which new-user registration has been disabled. ([\#8867](https://github.com/matrix-org/synapse/issues/8867))
- Fix a bug where `PUT /_synapse/admin/v2/users/<user_id>` failed to create a new user when `avatar_url` is specified. Bug introduced in Synapse v1.9.0. ([\#8872](https://github.com/matrix-org/synapse/issues/8872))
- Fix a 500 error when attempting to preview an empty HTML file. ([\#8883](https://github.com/matrix-org/synapse/issues/8883))
- Fix occasional deadlock when handling SIGHUP. ([\#8918](https://github.com/matrix-org/synapse/issues/8918))
- Fix login API to not ratelimit application services that have ratelimiting disabled. ([\#8920](https://github.com/matrix-org/synapse/issues/8920))
- Fix bug where we ratelimited auto joining of rooms on registration (using `auto_join_rooms` config). ([\#8921](https://github.com/matrix-org/synapse/issues/8921))
- Fix a bug where deactivated users appeared in the user directory when their profile information was updated. ([\#8933](https://github.com/matrix-org/synapse/issues/8933), [\#8964](https://github.com/matrix-org/synapse/issues/8964))
- Fix bug introduced in Synapse v1.24.0 which would cause an exception on startup if both `enabled` and `localdb_enabled` were set to `False` in the `password_config` setting of the configuration file. ([\#8937](https://github.com/matrix-org/synapse/issues/8937))
- Fix a bug where 500 errors would be returned if the `m.room_history_visibility` event had invalid content. ([\#8945](https://github.com/matrix-org/synapse/issues/8945))
- Fix a bug causing common English words to not be considered for a user directory search. ([\#8959](https://github.com/matrix-org/synapse/issues/8959))
- Fix bug where application services couldn't register new ghost users if the server had reached its MAU limit. ([\#8962](https://github.com/matrix-org/synapse/issues/8962))
- Fix a long-standing bug where a `m.image` event without a `url` would cause errors on push. ([\#8965](https://github.com/matrix-org/synapse/issues/8965))
- Fix a small bug in the v2 state resolution algorithm, which could also cause performance issues for rooms with large numbers of power levels. ([\#8971](https://github.com/matrix-org/synapse/issues/8971))
- Add validation to the `sendToDevice` API to raise a missing parameters error instead of a 500 error. ([\#8975](https://github.com/matrix-org/synapse/issues/8975))
- Add validation of group IDs to raise a 400 error instead of a 500 error. ([\#8977](https://github.com/matrix-org/synapse/issues/8977))

Improved Documentation
----------------------

- Fix the "Event persist rate" section of the included grafana dashboard by adding missing prometheus rules. ([\#8802](https://github.com/matrix-org/synapse/issues/8802))
- Combine related media admin API docs. ([\#8839](https://github.com/matrix-org/synapse/issues/8839))
- Fix an error in the documentation for the SAML username mapping provider. ([\#8873](https://github.com/matrix-org/synapse/issues/8873))
- Clarify comments around template directories in `sample_config.yaml`. ([\#8891](https://github.com/matrix-org/synapse/issues/8891))
- Move instructions for database setup, adjust heading levels, and improve syntax highlighting in [INSTALL.md](../INSTALL.md). Contributed by @fossterer. ([\#8987](https://github.com/matrix-org/synapse/issues/8987))
- Update the example value of `group_creation_prefix` in the sample configuration. ([\#8992](https://github.com/matrix-org/synapse/issues/8992))
- Link the Synapse developer room to the development section in the docs. ([\#9002](https://github.com/matrix-org/synapse/issues/9002))


Deprecations and Removals
-------------------------

- Deprecate Shutdown Room and Purge Room Admin APIs. ([\#8829](https://github.com/matrix-org/synapse/issues/8829))


Internal Changes
----------------

- Properly store the mapping of external ID to Matrix ID for CAS users. ([\#8856](https://github.com/matrix-org/synapse/issues/8856), [\#8958](https://github.com/matrix-org/synapse/issues/8958))
- Remove some unnecessary stubbing from unit tests. ([\#8861](https://github.com/matrix-org/synapse/issues/8861))
- Remove unused `FakeResponse` class from unit tests. ([\#8864](https://github.com/matrix-org/synapse/issues/8864))
- Pass `room_id` to `get_auth_chain_difference`. ([\#8879](https://github.com/matrix-org/synapse/issues/8879))
- Add type hints to push module. ([\#8880](https://github.com/matrix-org/synapse/issues/8880), [\#8882](https://github.com/matrix-org/synapse/issues/8882), [\#8901](https://github.com/matrix-org/synapse/issues/8901), [\#8940](https://github.com/matrix-org/synapse/issues/8940), [\#8943](https://github.com/matrix-org/synapse/issues/8943), [\#9020](https://github.com/matrix-org/synapse/issues/9020))
- Simplify logic for handling user-interactive-auth via single-sign-on servers. ([\#8881](https://github.com/matrix-org/synapse/issues/8881))
- Skip the SAML tests if the requirements (`pysaml2` and `xmlsec1`) aren't available. ([\#8905](https://github.com/matrix-org/synapse/issues/8905))
- Fix multiarch docker image builds. ([\#8906](https://github.com/matrix-org/synapse/issues/8906))
- Don't publish `latest` docker image until all archs are built. ([\#8909](https://github.com/matrix-org/synapse/issues/8909))
- Various clean-ups to the structured logging and logging context code. ([\#8916](https://github.com/matrix-org/synapse/issues/8916), [\#8935](https://github.com/matrix-org/synapse/issues/8935))
- Automatically drop stale forward-extremities under some specific conditions. ([\#8929](https://github.com/matrix-org/synapse/issues/8929))
- Refactor test utilities for injecting HTTP requests. ([\#8946](https://github.com/matrix-org/synapse/issues/8946))
- Add a maximum size of 50 kilobytes to .well-known lookups. ([\#8950](https://github.com/matrix-org/synapse/issues/8950))
- Fix bug in `generate_log_config` script which made it write empty files. ([\#8952](https://github.com/matrix-org/synapse/issues/8952))
- Clean up tox.ini file; disable coverage checking for non-test runs. ([\#8963](https://github.com/matrix-org/synapse/issues/8963))
- Add type hints to the admin and room list handlers. ([\#8973](https://github.com/matrix-org/synapse/issues/8973))
- Add type hints to the receipts and user directory handlers. ([\#8976](https://github.com/matrix-org/synapse/issues/8976))
- Drop the unused `local_invites` table. ([\#8979](https://github.com/matrix-org/synapse/issues/8979))
- Add type hints to the base storage code. ([\#8980](https://github.com/matrix-org/synapse/issues/8980))
- Support using PyJWT v2.0.0 in the test suite. ([\#8986](https://github.com/matrix-org/synapse/issues/8986))
- Fix `tests.federation.transport.RoomDirectoryFederationTests` and ensure it runs in CI. ([\#8998](https://github.com/matrix-org/synapse/issues/8998))
- Add type hints to the crypto module. ([\#8999](https://github.com/matrix-org/synapse/issues/8999))

Synapse 1.24.0 (2020-12-09)
===========================

[…]

* Administrators who have installed Synapse from distribution packages should
  consult the information from their distributions.

Internal Changes
----------------

- Add a maximum version for pysaml2 on Python 3.5. ([\#8898](https://github.com/matrix-org/synapse/issues/8898))


Synapse 1.23.1 (2020-12-09)
===========================

Due to the two security issues highlighted below, server administrators are
encouraged to update Synapse. We are not aware of these vulnerabilities being
exploited in the wild.

Security advisory
-----------------

The following issues are fixed in v1.23.1 and v1.24.0.

- There is a denial of service attack
  ([CVE-2020-26257](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-26257))
  against the federation APIs in which future events will not be correctly sent
  to other servers over federation. This affects all servers that participate in
  open federation. (Fixed in [#8776](https://github.com/matrix-org/synapse/pull/8776)).

- Synapse may be affected by OpenSSL
  [CVE-2020-1971](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-1971).
  Synapse administrators should ensure that they have the latest versions of
  the cryptography Python package installed.

To upgrade Synapse along with the cryptography package:

* Administrators using the [`matrix.org` Docker
  image](https://hub.docker.com/r/matrixdotorg/synapse/) or the [Debian/Ubuntu
  packages from
  `matrix.org`](https://github.com/matrix-org/synapse/blob/master/INSTALL.md#matrixorg-packages)
  should ensure that they have version 1.24.0 or 1.23.1 installed: these images include
  the updated packages.
* Administrators who have [installed Synapse from
  source](https://github.com/matrix-org/synapse/blob/master/INSTALL.md#installing-from-source)
  should upgrade the cryptography package within their virtualenv by running:
  ```sh
  <path_to_virtualenv>/bin/pip install 'cryptography>=3.3'
  ```
* Administrators who have installed Synapse from distribution packages should
  consult the information from their distributions.

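Since CVE-2020-1971 lives in OpenSSL itself, it can also be useful to see which OpenSSL build is in play. As a rough sanity check, the standard-library `ssl` module reports the OpenSSL the Python interpreter links against (note that the `cryptography` wheels bundle their own copy of OpenSSL, which is why upgrading the package as above is the actual fix):

```sh
# Print the OpenSSL version the interpreter itself is linked against.
python3 -c 'import ssl; print(ssl.OPENSSL_VERSION)'
```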
Bugfixes
--------

- Fix a bug in some federation APIs which could lead to unexpected behaviour if different parameters were set in the URI and the request body. ([\#8776](https://github.com/matrix-org/synapse/issues/8776))


Internal Changes
----------------
`INSTALL.md`:

# Installation Instructions

There are 3 steps to follow under **Installation Instructions**.

- [Installation Instructions](#installation-instructions)
  - [Choosing your server name](#choosing-your-server-name)
  - [Installing Synapse](#installing-synapse)
    - [Installing from source](#installing-from-source)
      - [Platform-Specific Instructions](#platform-specific-instructions)
        - [Debian/Ubuntu/Raspbian](#debianubunturaspbian)
        - [ArchLinux](#archlinux)
        - [CentOS/Fedora](#centosfedora)
        - [macOS](#macos)
        - [OpenSUSE](#opensuse)
        - [OpenBSD](#openbsd)
        - [Windows](#windows)
    - [Prebuilt packages](#prebuilt-packages)
      - [Docker images and Ansible playbooks](#docker-images-and-ansible-playbooks)
      - [Debian/Ubuntu](#debianubuntu)
        - [Matrix.org packages](#matrixorg-packages)
        - [Downstream Debian packages](#downstream-debian-packages)
        - [Downstream Ubuntu packages](#downstream-ubuntu-packages)
      - [Fedora](#fedora)
      - [OpenSUSE](#opensuse-1)
      - [SUSE Linux Enterprise Server](#suse-linux-enterprise-server)
      - [ArchLinux](#archlinux-1)
      - [Void Linux](#void-linux)
      - [FreeBSD](#freebsd)
      - [OpenBSD](#openbsd-1)
      - [NixOS](#nixos)
  - [Setting up Synapse](#setting-up-synapse)
    - [Using PostgreSQL](#using-postgresql)
    - [TLS certificates](#tls-certificates)
    - [Client Well-Known URI](#client-well-known-uri)
    - [Email](#email)
    - [Registering a user](#registering-a-user)
    - [Setting up a TURN server](#setting-up-a-turn-server)
    - [URL previews](#url-previews)
  - [Troubleshooting Installation](#troubleshooting-installation)

## Choosing your server name

It is important to choose the name for your server before you install Synapse,
because it cannot be changed later.

[…]

`user@email.example.com`) - but doing so may require more advanced setup: see
[Setting up Federation](docs/federate.md).

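As a taste of what that more advanced setup involves: federation for user IDs on a bare domain is typically delegated by serving a small JSON document at `https://email.example.com/.well-known/matrix/server`, pointing at the host that actually runs Synapse (hostnames here are placeholders; see [Setting up Federation](docs/federate.md) for the details):

```json
{
    "m.server": "synapse.example.com:443"
}
```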
## Installing Synapse

### Installing from source

(Prebuilt packages are available for some platforms - see [Prebuilt packages](#prebuilt-packages).)

[…]

To install the Synapse homeserver run:

```sh
mkdir -p ~/synapse
virtualenv -p python3 ~/synapse/env
source ~/synapse/env/bin/activate
…
```

This Synapse installation can then be later upgraded by using pip again with the
update flag:

```sh
source ~/synapse/env/bin/activate
pip install -U matrix-synapse
```

Before you can start Synapse, you will need to generate a configuration
file. To do this, run (in your virtualenv, as before):

```sh
cd ~/synapse
python -m synapse.app.homeserver \
    --server-name my.domain.name \
    …
```
[…] change your homeserver's keys, you may find that other homeservers have the
old key cached. If you update the signing key, you should change the name of the
key in the `<server name>.signing.key` file (the second word) to something
different. See the [spec](https://matrix.org/docs/spec/server_server/latest.html#retrieving-server-keys) for more information on key management).

To actually run your new homeserver, pick a working directory for Synapse to
run (e.g. `~/synapse`), and:

```sh
cd ~/synapse
source env/bin/activate
synctl start
```

#### Platform-Specific Instructions

##### Debian/Ubuntu/Raspbian

Installing prerequisites on Ubuntu or Debian:

```sh
sudo apt install build-essential python3-dev libffi-dev \
                 python3-pip python3-setuptools sqlite3 \
                 libssl-dev virtualenv libjpeg-dev libxslt1-dev
```

##### ArchLinux

Installing prerequisites on ArchLinux:

```sh
sudo pacman -S base-devel python python-pip \
               python-setuptools python-virtualenv sqlite3
```

##### CentOS/Fedora

Installing prerequisites on CentOS 8 or Fedora>26:

```sh
sudo dnf install libtiff-devel libjpeg-devel libzip-devel freetype-devel \
                 libwebp-devel tk-devel redhat-rpm-config \
                 python3-virtualenv libffi-devel openssl-devel
```

Installing prerequisites on CentOS 7 or Fedora<=25:

```sh
sudo yum install libtiff-devel libjpeg-devel libzip-devel freetype-devel \
                 lcms2-devel libwebp-devel tcl-devel tk-devel redhat-rpm-config \
                 python3-virtualenv libffi-devel openssl-devel
```

[…] recent SQLite version, but it is recommended that you instead use a Postgres
database: see [docs/postgres.md](docs/postgres.md).

##### macOS

Installing prerequisites on macOS:

```sh
xcode-select --install
sudo easy_install pip
sudo pip install virtualenv
…
```

On macOS Catalina (10.15) you may need to explicitly install OpenSSL
via brew and inform `pip` about it so that `psycopg2` builds:

```sh
brew install openssl@1.1
export LDFLAGS=-L/usr/local/Cellar/openssl\@1.1/1.1.1d/lib/
```

##### OpenSUSE

Installing prerequisites on openSUSE:

```sh
sudo zypper in -t pattern devel_basis
sudo zypper in python-pip python-setuptools sqlite3 python-virtualenv \
               python-devel libffi-devel libopenssl-devel libjpeg62-devel
```

##### OpenBSD

A port of Synapse is available under `net/synapse`. The filesystem
underlying the homeserver directory (defaults to `/var/synapse`) has to be
[…]

Creating a `WRKOBJDIR` for building python under `/usr/local` (which on a
default OpenBSD installation is mounted with `wxallowed`):

```sh
doas mkdir /usr/local/pobj_wxallowed
```

Assuming `PORTS_PRIVSEP=Yes` (cf. `bsd.port.mk(5)`) and `SUDO=doas` are
configured in `/etc/mk.conf`:

```sh
doas chown _pbuild:_pbuild /usr/local/pobj_wxallowed
```

Setting the `WRKOBJDIR` for building python:

```sh
echo WRKOBJDIR_lang/python/3.7=/usr/local/pobj_wxallowed \\nWRKOBJDIR_lang/python/2.7=/usr/local/pobj_wxallowed >> /etc/mk.conf
```

Building Synapse:

```sh
cd /usr/ports/net/synapse
make install
```

##### Windows

If you wish to run or develop Synapse on Windows, the Windows Subsystem For
Linux provides a Linux environment on Windows 10 which is capable of using the
Debian, Fedora, or source installation methods. More information about WSL can
be found at <https://docs.microsoft.com/en-us/windows/wsl/install-win10> for
Windows 10 and <https://docs.microsoft.com/en-us/windows/wsl/install-on-server>
for Windows Server.

### Prebuilt packages

As an alternative to installing from source, prebuilt packages are available
for a number of platforms.

#### Docker images and Ansible playbooks

There is an official synapse image available at
<https://hub.docker.com/r/matrixdotorg/synapse> which can be used with
the docker-compose file available at [contrib/docker](contrib/docker). Further
information on this including configuration options is available in the README
on hub.docker.com.

Alternatively, Andreas Peters (previously Silvio Fricke) has contributed a
Dockerfile to automate a synapse server in a single Docker image, at
<https://hub.docker.com/r/avhost/docker-matrix/tags/>

Slavi Pantaleev has created an Ansible playbook,
which installs the official Docker image of Matrix Synapse
along with many other Matrix-related services (Postgres database, Element, coturn,
ma1sd, SSL support, etc.).
For more details, see
<https://github.com/spantaleev/matrix-docker-ansible-deploy>

#### Debian/Ubuntu

##### Matrix.org packages

Matrix.org provides Debian/Ubuntu packages of the latest stable version of
Synapse via <https://packages.matrix.org/debian/>. They are available for Debian
9 (Stretch), Ubuntu 16.04 (Xenial), and later. To use them:

```sh
sudo apt install -y lsb-release wget apt-transport-https
sudo wget -O /usr/share/keyrings/matrix-org-archive-keyring.gpg https://packages.matrix.org/debian/matrix-org-archive-keyring.gpg
echo "deb [signed-by=/usr/share/keyrings/matrix-org-archive-keyring.gpg] https://packages.matrix.org/debian/ $(lsb_release -cs) main" |
…
```

[…] `/usr/share/keyrings/matrix-org-archive-keyring.gpg`) is
`AAF9AE843A7584B5A3E4CD2BCF45A512DE2DA058`.

##### Downstream Debian packages

We do not recommend using the packages from the default Debian `buster`
repository at this time, as they are old and suffer from known security
vulnerabilities.

[…]

If you are using Debian `sid` or testing, Synapse is available in the default
repositories and it should be possible to install it simply with:

```sh
sudo apt install matrix-synapse
```

##### Downstream Ubuntu packages

We do not recommend using the packages in the default Ubuntu repository
at this time, as they are old and suffer from known security vulnerabilities.
The latest version of Synapse can be installed from [our repository](#matrixorg-packages).

#### Fedora

Synapse is in the Fedora repositories as `matrix-synapse`:

```sh
sudo dnf install matrix-synapse
```

Oleg Girko provides Fedora RPMs at
<https://obs.infoserver.lv/project/monitor/matrix-synapse>

#### OpenSUSE

Synapse is in the OpenSUSE repositories as `matrix-synapse`:

```sh
sudo zypper install matrix-synapse
```

#### SUSE Linux Enterprise Server

Unofficial packages are built for SLES 15 in the openSUSE:Backports:SLE-15 repository at
<https://download.opensuse.org/repositories/openSUSE:/Backports:/SLE-15/standard/>

#### ArchLinux

The quickest way to get up and running with ArchLinux is probably with the community package
<https://www.archlinux.org/packages/community/any/matrix-synapse/>, which should pull in most of
the necessary dependencies.

pip may be outdated (6.0.7-1 and needs to be upgraded to 6.0.8-1):

```sh
sudo pip install --upgrade pip
```

[…]
compile it under the right architecture. (This should not be needed if
installing under virtualenv):

```sh
sudo pip uninstall py-bcrypt
sudo pip install py-bcrypt
```

#### Void Linux

Synapse can be found in the void repositories as 'synapse':

```sh
xbps-install -Su
xbps-install -S synapse
```

#### FreeBSD

Synapse can be installed via FreeBSD Ports or Packages contributed by Brendan Molloy from:

- Ports: `cd /usr/ports/net-im/py-matrix-synapse && make install clean`
- Packages: `pkg install py37-matrix-synapse`

#### OpenBSD

387390 As of OpenBSD 6.7 Synapse is available as a pre-compiled binary. The filesystem
388391 underlying the homeserver directory (defaults to `/var/synapse`) has to be
391394
392395 Installing Synapse:
393396
394 ```
397 ```sh
395398 doas pkg_add synapse
396399 ```
397400
398 ### NixOS
401 #### NixOS
399402
400403 Robin Lambertz has packaged Synapse for NixOS at:
401 https://github.com/NixOS/nixpkgs/blob/master/nixos/modules/services/misc/matrix-synapse.nix
402
403 # Setting up Synapse
404 <https://github.com/NixOS/nixpkgs/blob/master/nixos/modules/services/misc/matrix-synapse.nix>
405
406 ## Setting up Synapse
404407
405408 Once you have installed Synapse as above, you will need to configure it.
406409
407 ## TLS certificates
410 ### Using PostgreSQL
411
412 By default Synapse uses [SQLite](https://sqlite.org/) and in doing so trades performance for convenience.
413 SQLite is only recommended in Synapse for testing purposes or for servers with
414 very light workloads.
415
416 Almost all installations should opt to use [PostgreSQL](https://www.postgresql.org). Advantages include:
417
418 - significant performance improvements due to the superior threading and
419 caching model, and a smarter query optimiser
420 - allowing the DB to be run on separate hardware
421
422 For information on how to install and use PostgreSQL in Synapse, please see
423 [docs/postgres.md](docs/postgres.md).
424
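As a rough sketch of what switching entails (the authoritative walkthrough is in [docs/postgres.md](docs/postgres.md); the user, password, and database names below are placeholders), pointing Synapse at PostgreSQL is a matter of replacing the `database` section of `homeserver.yaml`:

```yaml
# homeserver.yaml -- illustrative values only; see docs/postgres.md
database:
  name: psycopg2          # PostgreSQL driver, instead of sqlite3
  args:
    user: synapse_user
    password: secretpassword
    database: synapse
    host: localhost
    cp_min: 5             # connection pool bounds
    cp_max: 10
```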
425 ### TLS certificates
408426
409427 The default configuration exposes a single HTTP port on the local
410428 interface: `http://localhost:8008`. It is suitable for local testing,
418436 Alternatively, you can configure Synapse to expose an HTTPS port. To do
419437 so, you will need to edit `homeserver.yaml`, as follows:
420438
421 * First, under the `listeners` section, uncomment the configuration for the
439 - First, under the `listeners` section, uncomment the configuration for the
422440 TLS-enabled listener. (Remove the hash sign (`#`) at the start of
423441 each line). The relevant lines are like this:
424442
443 ```yaml
444 - port: 8448
445 type: http
446 tls: true
447 resources:
448 - names: [client, federation]
425449 ```
426 - port: 8448
427 type: http
428 tls: true
429 resources:
430 - names: [client, federation]
431 ```
432
433 * You will also need to uncomment the `tls_certificate_path` and
450
451 - You will also need to uncomment the `tls_certificate_path` and
434452 `tls_private_key_path` lines under the `TLS` section. You will need to manage
435453 provisioning of these certificates yourself — Synapse had built-in ACME
436454 support, but the ACMEv1 protocol Synapse implements is deprecated, not
445463 For a more detailed guide to configuring your server for federation, see
446464 [federate.md](docs/federate.md).
447465
448 ## Client Well-Known URI
466 ### Client Well-Known URI
449467
450468 Setting up the client Well-Known URI is optional but if you set it up, it will
451469 allow users to enter their full username (e.g. `@user:<server_name>`) into clients
456474 The URL `https://<server_name>/.well-known/matrix/client` should return JSON in
457475 the following format.
458476
459 ```
477 ```json
460478 {
461479 "m.homeserver": {
462480 "base_url": "https://<matrix.example.com>"
466484
467485 It can optionally contain identity server information as well.
468486
469 ```
487 ```json
470488 {
471489 "m.homeserver": {
472490 "base_url": "https://<matrix.example.com>"
483501 view it.
484502
485503 In nginx this would be something like:
486 ```
504
505 ```nginx
487506 location /.well-known/matrix/client {
488507 return 200 '{"m.homeserver": {"base_url": "https://<matrix.example.com>"}}';
489508 default_type application/json;
496515 connect to your server. This is the same URL you put for the `m.homeserver`
497516 `base_url` above.
498517
499 ```
518 ```yaml
500519 public_baseurl: "https://<matrix.example.com>"
501520 ```
502521
503 ## Email
522 ### Email
504523
505524 It is desirable for Synapse to have the capability to send email. This allows
506525 Synapse to send password reset emails, send verifications when an email address
515534 If email is not configured, password reset, registration and notifications via
516535 email will be disabled.
517536
518 ## Registering a user
537 ### Registering a user
519538
520539 The easiest way to create a new user is to do so from a client like [Element](https://element.io/).
521540
523542
524543 This can be done as follows:
525544
526 ```
545 ```sh
527546 $ source ~/synapse/env/bin/activate
528547 $ synctl start # if not already running
529548 $ register_new_matrix_user -c homeserver.yaml http://localhost:8008
541560 anyone with knowledge of it can register users, including admin accounts,
542561 on your server even if `enable_registration` is `false`.
543562
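For example, a minimal `homeserver.yaml` fragment for this setup might look like the following sketch (the secret value is a placeholder; generate your own long random string):

```yaml
# Keep open registration disabled...
enable_registration: false
# ...but let register_new_matrix_user create accounts via the shared secret.
# Anyone who knows this value can register users, including admin accounts.
registration_shared_secret: "<long random string>"
```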
544 ## Setting up a TURN server
563 ### Setting up a TURN server
545564
546565 For reliable VoIP calls to be routed via this homeserver, you MUST configure
547566 a TURN server. See [docs/turn-howto.md](docs/turn-howto.md) for details.
548567
549 ## URL previews
568 ### URL previews
550569
551570 Synapse includes support for previewing URLs, which is disabled by default. To
552571 turn it on you must enable the `url_preview_enabled: True` config parameter
556575 spidering 'internal' URLs on your network. At the very least we recommend that
557576 your loopback and RFC1918 IP addresses are blacklisted.
558577
559 This also requires the optional `lxml` and `netaddr` python dependencies to be
560 installed. This in turn requires the `libxml2` library to be available - on
561 Debian/Ubuntu this means `apt-get install libxml2-dev`, or equivalent for
562 your OS.
563
564 # Troubleshooting Installation
578 This also requires the optional `lxml` python dependency to be installed. This
579 in turn requires the `libxml2` library to be available - on Debian/Ubuntu this
580 means `apt-get install libxml2-dev`, or equivalent for your OS.
581
582 ### Troubleshooting Installation
565583
566584 `pip` seems to leak *lots* of memory during installation. For instance, a Linux
567585 host with 512MB of RAM may run out of memory whilst installing Twisted. If this
568586 happens, you will have to individually install the dependencies which are
569587 failing, e.g.:
570588
571 ```
589 ```sh
572590 pip install twisted
573591 ```
574592
242242 Synapse Development
243243 ===================
244244
245 Join our developer community on Matrix: [#synapse-dev:matrix.org](https://matrix.to/#/#synapse-dev:matrix.org)
246
245247 Before setting up a development environment for synapse, make sure you have the
246248 system dependencies (such as the python header files) installed - see
247249 `Installing from source <INSTALL.md#installing-from-source>`_.
44 version you currently have installed to the current version of Synapse. The extra
55 instructions that may be required are listed later in this document.
66
7 * Check that your versions of Python and PostgreSQL are still supported.
8
9 Synapse follows upstream lifecycles for `Python`_ and `PostgreSQL`_, and
10 removes support for versions which are no longer maintained.
11
12 The website https://endoflife.date also offers convenient summaries.
13
14 .. _Python: https://devguide.python.org/devcycle/#end-of-life-branches
15 .. _PostgreSQL: https://www.postgresql.org/support/versioning/
16
717 * If Synapse was installed using `prebuilt packages
818 <INSTALL.md#prebuilt-packages>`_, you will need to follow the normal process
919 for upgrading those packages.
7383 # replace `1.3.0` and `stretch` accordingly:
7484 wget https://packages.matrix.org/debian/pool/main/m/matrix-synapse-py3/matrix-synapse-py3_1.3.0+stretch1_amd64.deb
7585 dpkg -i matrix-synapse-py3_1.3.0+stretch1_amd64.deb
86
87 Upgrading to v1.25.0
88 ====================
89
90 Last release supporting Python 3.5
91 ----------------------------------
92
93 This is the last release of Synapse which guarantees support for Python 3.5,
94 which passed its upstream End of Life date several months ago.
95
96 We will attempt to maintain support through March 2021, but without guarantees.
97
98 In the future, Synapse will follow upstream schedules for ending support of
99 older versions of Python and PostgreSQL. Please upgrade to at least Python 3.6
100 and PostgreSQL 9.6 as soon as possible.
101
102 Blacklisting IP ranges
103 ----------------------
104
105 Synapse v1.25.0 includes new settings, ``ip_range_blacklist`` and
106 ``ip_range_whitelist``, for controlling outgoing requests from Synapse for federation,
107 identity servers, push, and for checking key validity for third-party invite events.
108 The previous setting, ``federation_ip_range_blacklist``, is deprecated. The new
109 ``ip_range_blacklist`` defaults to private IP ranges if it is not defined.
110
111 If you have never customised ``federation_ip_range_blacklist`` it is recommended
112 that you remove that setting.
113
114 If you have customised ``federation_ip_range_blacklist`` you should update the
115 setting name to ``ip_range_blacklist``.
116
117 If you have a custom push server that is reached via private IP space you may
118 need to customise ``ip_range_blacklist`` or ``ip_range_whitelist``.
76119
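Concretely, a migrated configuration might look like the following sketch (the
CIDR values here are illustrative, not a recommended list)::

    # homeserver.yaml
    # renamed from federation_ip_range_blacklist; same values as before
    ip_range_blacklist:
      - '10.0.0.0/8'

    # optional exceptions to the blacklist, e.g. a push server that is
    # only reachable via private IP space
    ip_range_whitelist:
      - '192.168.1.1'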
77120 Upgrading to v1.24.0
78121 ====================
5757 labels:
5858 type: "PDU"
5959 expr: 'synapse_federation_transaction_queue_pending_pdus + 0'
60
61 - record: synapse_storage_events_persisted_by_source_type
62 expr: sum without(type, origin_type, origin_entity) (synapse_storage_events_persisted_events_sep{origin_type="remote"})
63 labels:
64 type: remote
65 - record: synapse_storage_events_persisted_by_source_type
66 expr: sum without(type, origin_type, origin_entity) (synapse_storage_events_persisted_events_sep{origin_entity="*client*",origin_type="local"})
67 labels:
68 type: local
69 - record: synapse_storage_events_persisted_by_source_type
70 expr: sum without(type, origin_type, origin_entity) (synapse_storage_events_persisted_events_sep{origin_entity!="*client*",origin_type="local"})
71 labels:
72 type: bridges
73 - record: synapse_storage_events_persisted_by_event_type
74 expr: sum without(origin_entity, origin_type) (synapse_storage_events_persisted_events_sep)
75 - record: synapse_storage_events_persisted_by_origin
76 expr: sum without(type) (synapse_storage_events_persisted_events_sep)
77
0 matrix-synapse-py3 (1.25.0) stable; urgency=medium
1
2 [ Dan Callahan ]
3 * Update dependencies to account for the removal of the transitional
4 dh-systemd package from Debian Bullseye.
5
6 [ Synapse Packaging team ]
7 * New synapse release 1.25.0.
8
9 -- Synapse Packaging team <packages@matrix.org> Wed, 13 Jan 2021 10:14:55 +0000
10
011 matrix-synapse-py3 (1.24.0) stable; urgency=medium
112
213 * New synapse release 1.24.0.
314
415 -- Synapse Packaging team <packages@matrix.org> Wed, 09 Dec 2020 10:14:30 +0000
16
17 matrix-synapse-py3 (1.23.1) stable; urgency=medium
18
19 * New synapse release 1.23.1.
20
21 -- Synapse Packaging team <packages@matrix.org> Wed, 09 Dec 2020 10:40:39 +0000
522
623 matrix-synapse-py3 (1.23.0) stable; urgency=medium
724
22 Priority: extra
33 Maintainer: Synapse Packaging team <packages@matrix.org>
44 # keep this list in sync with the build dependencies in docker/Dockerfile-dhvirtualenv.
5 # TODO: Remove the dependency on dh-systemd after dropping support for Ubuntu xenial
6 # On all other supported releases, it's merely a transitional package which
7 # does nothing but depends on debhelper (> 9.20160709)
58 Build-Depends:
6 debhelper (>= 9),
7 dh-systemd,
9 debhelper (>= 9.20160709) | dh-systemd,
810 dh-virtualenv (>= 1.1),
911 libsystemd-dev,
1012 libpq-dev,
4949 ARG distro=""
5050 ENV distro ${distro}
5151
52 # Python < 3.7 assumes LANG="C" means ASCII-only and throws on printing unicode
53 # http://bugs.python.org/issue19846
54 ENV LANG C.UTF-8
55
5256 # Install the build dependencies
5357 #
5458 # NB: keep this list in sync with the list of build-deps in debian/control
5559 # TODO: it would be nice to do that automatically.
60 # TODO: Remove the dh-systemd stanza after dropping support for Ubuntu xenial
61 # it's a transitional package on all other, more recent releases
5662 RUN apt-get update -qq -o Acquire::Languages=none \
5763 && env DEBIAN_FRONTEND=noninteractive apt-get install \
5864 -yqq --no-install-recommends -o Dpkg::Options::=--force-unsafe-io \
5965 build-essential \
6066 debhelper \
6167 devscripts \
62 dh-systemd \
6368 libsystemd-dev \
6469 lsb-release \
6570 pkg-config \
6873 python3-setuptools \
6974 python3-venv \
7075 sqlite3 \
71 libpq-dev
76 libpq-dev \
77 xmlsec1 \
78 && ( env DEBIAN_FRONTEND=noninteractive apt-get install \
79 -yqq --no-install-recommends -o Dpkg::Options::=--force-unsafe-io \
80 dh-systemd || true )
7281
7382 COPY --from=builder /dh-virtualenv_1.2~dev-1_all.deb /
7483
0 # Contents
1 - [List all media in a room](#list-all-media-in-a-room)
2 - [Quarantine media](#quarantine-media)
3 * [Quarantining media by ID](#quarantining-media-by-id)
4 * [Quarantining media in a room](#quarantining-media-in-a-room)
5 * [Quarantining all media of a user](#quarantining-all-media-of-a-user)
6 - [Delete local media](#delete-local-media)
7 * [Delete a specific local media](#delete-a-specific-local-media)
8 * [Delete local media by date or size](#delete-local-media-by-date-or-size)
9 - [Purge Remote Media API](#purge-remote-media-api)
10
011 # List all media in a room
112
213 This API gets a list of known media in a room.
1021 server admin: see [README.rst](README.rst).
1122
1223 The API returns a JSON body like the following:
13 ```
14 {
15 "local": [
16 "mxc://localhost/xwvutsrqponmlkjihgfedcba",
17 "mxc://localhost/abcdefghijklmnopqrstuvwx"
18 ],
19 "remote": [
20 "mxc://matrix.org/xwvutsrqponmlkjihgfedcba",
21 "mxc://matrix.org/abcdefghijklmnopqrstuvwx"
22 ]
24 ```json
25 {
26 "local": [
27 "mxc://localhost/xwvutsrqponmlkjihgfedcba",
28 "mxc://localhost/abcdefghijklmnopqrstuvwx"
29 ],
30 "remote": [
31 "mxc://matrix.org/xwvutsrqponmlkjihgfedcba",
32 "mxc://matrix.org/abcdefghijklmnopqrstuvwx"
33 ]
2334 }
2435 ```
2536
4758
4859 Response:
4960
50 ```
61 ```json
5162 {}
5263 ```
5364
6778
6879 Response:
6980
70 ```
71 {
72 "num_quarantined": 10 # The number of media items successfully quarantined
73 }
74 ```
81 ```json
82 {
83 "num_quarantined": 10
84 }
85 ```
86
87 The following fields are returned in the JSON response body:
88
89 * `num_quarantined`: integer - The number of media items successfully quarantined
7590
7691 Note that there is a legacy endpoint, `POST
77 /_synapse/admin/v1/quarantine_media/<room_id >`, that operates the same.
92 /_synapse/admin/v1/quarantine_media/<room_id>`, that operates the same.
7893 However, it is deprecated and may be removed in a future release.
7994
8095 ## Quarantining all media of a user
91106 {}
92107 ```
93108
94 Where `user_id` is in the form of `@bob:example.org`.
95
96 Response:
97
98 ```
99 {
100 "num_quarantined": 10 # The number of media items successfully quarantined
101 }
102 ```
109 URL Parameters
110
111 * `user_id`: string - User ID in the form of `@bob:example.org`
112
113 Response:
114
115 ```json
116 {
117 "num_quarantined": 10
118 }
119 ```
120
121 The following fields are returned in the JSON response body:
122
123 * `num_quarantined`: integer - The number of media items successfully quarantined
103124
104125 # Delete local media
105126 This API deletes the *local* media from the disk of your own server.
107128 remote homeservers.
108129 This API will not affect media that has been uploaded to external
109130 media repositories (e.g. https://github.com/turt2live/matrix-media-repo/).
110 See also [purge_remote_media.rst](purge_remote_media.rst).
131 See also [Purge Remote Media API](#purge-remote-media-api).
111132
112133 ## Delete a specific local media
113134 Delete a specific `media_id`.
128149 Response:
129150
130151 ```json
131 {
132 "deleted_media": [
133 "abcdefghijklmnopqrstuvwx"
134 ],
135 "total": 1
136 }
152 {
153 "deleted_media": [
154 "abcdefghijklmnopqrstuvwx"
155 ],
156 "total": 1
157 }
137158 ```
138159
139160 The following fields are returned in the JSON response body:
166187 Response:
167188
168189 ```json
169 {
170 "deleted_media": [
171 "abcdefghijklmnopqrstuvwx",
172 "abcdefghijklmnopqrstuvwz"
173 ],
174 "total": 2
175 }
190 {
191 "deleted_media": [
192 "abcdefghijklmnopqrstuvwx",
193 "abcdefghijklmnopqrstuvwz"
194 ],
195 "total": 2
196 }
176197 ```
177198
178199 The following fields are returned in the JSON response body:
179200
180201 * `deleted_media`: an array of strings - List of deleted `media_id`
181202 * `total`: integer - Total number of deleted `media_id`
203
204 # Purge Remote Media API
205
206 The purge remote media API allows server admins to purge old cached remote media.
207
208 The API is:
209
210 ```
211 POST /_synapse/admin/v1/purge_media_cache?before_ts=<unix_timestamp_in_ms>
212
213 {}
214 ```
215
216 URL Parameters
217
218 * `unix_timestamp_in_ms`: string representing a positive integer - Unix timestamp in ms.
219 All cached media that was last accessed before this timestamp will be removed.
220
221 Response:
222
223 ```json
224 {
225 "deleted": 10
226 }
227 ```
228
229 The following fields are returned in the JSON response body:
230
231 * `deleted`: integer - The number of media items successfully deleted
232
233 To use it, you will need to authenticate by providing an `access_token` for a
234 server admin: see [README.rst](README.rst).
235
236 If the user re-requests purged remote media, synapse will re-request the media
237 from the originating server.
+0
-20
docs/admin_api/purge_remote_media.rst less more
0 Purge Remote Media API
1 ======================
2
3 The purge remote media API allows server admins to purge old cached remote
4 media.
5
6 The API is::
7
8 POST /_synapse/admin/v1/purge_media_cache?before_ts=<unix_timestamp_in_ms>
9
10 {}
11
12 \... which will remove all cached media that was last accessed before
13 ``<unix_timestamp_in_ms>``.
14
15 To use it, you will need to authenticate by providing an ``access_token`` for a
16 server admin: see `README.rst <README.rst>`_.
17
18 If the user re-requests purged remote media, synapse will re-request the media
19 from the originating server.
0 Purge room API
1 ==============
0 Deprecated: Purge room API
1 ==========================
2
3 **The old Purge room API is deprecated and will be removed in a future release.
4 See the new [Delete Room API](rooms.md#delete-room-api) for more details.**
25
36 This API will remove all trace of a room from your database.
47
58 All local users must have left the room before it can be removed.
6
7 See also: [Delete Room API](rooms.md#delete-room-api)
89
910 The API is:
1011
0 # Contents
1 - [List Room API](#list-room-api)
2 * [Parameters](#parameters)
3 * [Usage](#usage)
4 - [Room Details API](#room-details-api)
5 - [Room Members API](#room-members-api)
6 - [Delete Room API](#delete-room-api)
7 * [Parameters](#parameters-1)
8 * [Response](#response)
9 * [Undoing room shutdowns](#undoing-room-shutdowns)
10 - [Make Room Admin API](#make-room-admin-api)
11
012 # List Room API
113
214 The List Room admin API allows server admins to get a list of rooms on their
7587
7688 Response:
7789
78 ```
90 ```jsonc
7991 {
8092 "rooms": [
8193 {
127139
128140 Response:
129141
130 ```
142 ```json
131143 {
132144 "rooms": [
133145 {
162174
163175 Response:
164176
165 ```
177 ```jsonc
166178 {
167179 "rooms": [
168180 {
218230
219231 Response:
220232
221 ```
233 ```jsonc
222234 {
223235 "rooms": [
224236 {
225237 "room_id": "!mscvqgqpHYjBGDxNym:matrix.org",
226238 "name": "Music Theory",
227239 "canonical_alias": "#musictheory:matrix.org",
228 "joined_members": 127
240 "joined_members": 127,
229241 "joined_local_members": 2,
230242 "version": "1",
231243 "creator": "@foo:matrix.org",
242254 "room_id": "!twcBhHVdZlQWuuxBhN:termina.org.uk",
243255 "name": "weechat-matrix",
244256 "canonical_alias": "#weechat-matrix:termina.org.uk",
245 "joined_members": 137
257 "joined_members": 137,
246258 "joined_local_members": 20,
247259 "version": "4",
248260 "creator": "@foo:termina.org.uk",
277289 * `canonical_alias` - The canonical (main) alias address of the room.
278290 * `joined_members` - How many users are currently in the room.
279291 * `joined_local_members` - How many local users are currently in the room.
292 * `joined_local_devices` - How many local devices are currently in the room.
280293 * `version` - The version of the room as a string.
281294 * `creator` - The `user_id` of the room creator.
282295 * `encryption` - Algorithm of end-to-end encryption of messages. Is `null` if encryption is not active.
299312
300313 Response:
301314
302 ```
315 ```json
303316 {
304317 "room_id": "!mscvqgqpHYjBGDxNym:matrix.org",
305318 "name": "Music Theory",
306319 "avatar": "mxc://matrix.org/AQDaVFlbkQoErdOgqWRgiGSV",
307320 "topic": "Theory, Composition, Notation, Analysis",
308321 "canonical_alias": "#musictheory:matrix.org",
309 "joined_members": 127
322 "joined_members": 127,
310323 "joined_local_members": 2,
324 "joined_local_devices": 2,
311325 "version": "1",
312326 "creator": "@foo:matrix.org",
313327 "encryption": null,
341355
342356 Response:
343357
344 ```
358 ```json
345359 {
346360 "members": [
347361 "@foo:matrix.org",
348362 "@bar:matrix.org",
349 "@foobar:matrix.org
350 ],
363 "@foobar:matrix.org"
364 ],
351365 "total": 3
352366 }
353367 ```
356370
357371 The Delete Room admin API allows server admins to remove rooms from server
358372 and block these rooms.
359 It is a combination and improvement of "[Shutdown room](shutdown_room.md)"
360 and "[Purge room](purge_room.md)" API.
361373
362374 Shuts down a room. Moves all local users and room aliases automatically to a
363375 new room if `new_room_user_id` is set. Otherwise local users only
454466 * `local_aliases` - An array of strings representing the local aliases that were migrated from
455467 the old room to the new.
456468 * `new_room_id` - A string representing the room ID of the new room.
469
470
471 ## Undoing room shutdowns
472
473 *Note*: This guide may be outdated by the time you read it. By nature of room shutdowns being performed at the database level,
474 the structure can and does change without notice.
475
476 First, it's important to understand that a room shutdown is very destructive. Undoing a shutdown is not as simple as pretending it
477 never happened - work has to be done to move forward instead of resetting the past. In fact, in some cases it might not be possible
478 to recover at all:
479
480 * If the room was invite-only, your users will need to be re-invited.
481 * If the room no longer has any members at all, it'll be impossible to rejoin.
482 * The first user to rejoin will have to do so via an alias on a different server.
483
484 With all that being said, if you still want to try and recover the room:
485
486 1. For safety reasons, shut down Synapse.
487 2. In the database, run `DELETE FROM blocked_rooms WHERE room_id = '!example:example.org';`
488 * As a precaution, it's recommended to run this in a transaction: `BEGIN; DELETE ...;`, verify you got 1 result, then `COMMIT;`.
489 * The room ID is the same one supplied to the shutdown room API, not the Content Violation room.
490 3. Restart Synapse.
491
492 You will have to manually handle, if you so choose, the following:
493
494 * Aliases that would have been redirected to the Content Violation room.
495 * Users that would have been booted from the room (and will have been force-joined to the Content Violation room).
496 * Removal of the Content Violation room if desired.
497
498
499 # Make Room Admin API
500
501 Grants another user the highest power available to a local user who is in the room.
502 If the user is not in the room, and it is not publicly joinable, then invite the user.
503
504 By default the server admin (the caller) is granted power, but another user can
505 optionally be specified, e.g.:
506
507 ```
508 POST /_synapse/admin/v1/rooms/<room_id_or_alias>/make_room_admin
509 {
510 "user_id": "@foo:example.com"
511 }
512 ```
0 # Shutdown room API
0 # Deprecated: Shutdown room API
1
2 **The old Shutdown room API is deprecated and will be removed in a future release.
3 See the new [Delete Room API](rooms.md#delete-room-api) for more details.**
14
25 Shuts down a room, preventing new joins, and moves local users and room aliases automatically
36 to a new room. The new room will be created with the user specified by the
811
912 The local server will only have the power to move local users and room aliases to
1013 the new room. Users on other servers will be unaffected.
11
12 See also: [Delete Room API](rooms.md#delete-room-api)
1314
1415 ## API
1516
2929 ],
3030 "avatar_url": "<avatar_url>",
3131 "admin": false,
32 "deactivated": false
32 "deactivated": false,
33 "password_hash": "$2b$12$p9B4GkqYdRTPGD",
34 "creation_ts": 1560432506,
35 "appservice_id": null,
36 "consent_server_notice_sent": null,
37 "consent_version": null
3338 }
3439
3540 URL parameters:
138143 "users": [
139144 {
140145 "name": "<user_id1>",
141 "password_hash": "<password_hash1>",
142146 "is_guest": 0,
143147 "admin": 0,
144148 "user_type": null,
147151 "avatar_url": null
148152 }, {
149153 "name": "<user_id2>",
150 "password_hash": "<password_hash2>",
151154 "is_guest": 0,
152155 "admin": 1,
153156 "user_type": null,
3030 You should now have a Django project configured to serve CAS authentication with
3131 a single user created.
3232
33 ## Configure Synapse (and Riot) to use CAS
33 ## Configure Synapse (and Element) to use CAS
3434
3535 1. Modify your `homeserver.yaml` to enable CAS and point it to your locally
3636 running Django test server:
5050
5151 ## Testing the configuration
5252
53 Then in Riot:
53 Then in Element:
5454
55 1. Visit the login page with a Riot pointing at your homeserver.
55 1. Visit the login page with an Element client pointing at your homeserver.
5656 2. Click the Single Sign-On button.
5757 3. Login using the credentials created with `createsuperuser`.
5858 4. You should be logged in.
143143 #
144144 #enable_search: false
145145
146 # Prevent outgoing requests from being sent to the following blacklisted IP address
147 # CIDR ranges. If this option is not specified then it defaults to private IP
148 # address ranges (see the example below).
149 #
150 # The blacklist applies to the outbound requests for federation, identity servers,
151 # push servers, and for checking key validity for third-party invite events.
152 #
153 # (0.0.0.0 and :: are always blacklisted, whether or not they are explicitly
154 # listed here, since they correspond to unroutable addresses.)
155 #
156 # This option replaces federation_ip_range_blacklist in Synapse v1.25.0.
157 #
158 #ip_range_blacklist:
159 # - '127.0.0.0/8'
160 # - '10.0.0.0/8'
161 # - '172.16.0.0/12'
162 # - '192.168.0.0/16'
163 # - '100.64.0.0/10'
164 # - '192.0.0.0/24'
165 # - '169.254.0.0/16'
166 # - '198.18.0.0/15'
167 # - '192.0.2.0/24'
168 # - '198.51.100.0/24'
169 # - '203.0.113.0/24'
170 # - '224.0.0.0/4'
171 # - '::1/128'
172 # - 'fe80::/10'
173 # - 'fc00::/7'
174
175 # List of IP address CIDR ranges that should be allowed for federation,
176 # identity servers, push servers, and for checking key validity for
177 # third-party invite events. This is useful for specifying exceptions to
178 # wide-ranging blacklisted target IP ranges - e.g. for communication with
179 # a push server only visible in your network.
180 #
181 # This whitelist overrides ip_range_blacklist and defaults to an empty
182 # list.
183 #
184 #ip_range_whitelist:
185 # - '192.168.1.1'
186
146187 # List of ports that Synapse should listen on, their purpose and their
147188 # configuration.
148189 #
640681 # - lon.example.com
641682 # - nyc.example.com
642683 # - syd.example.com
643
644 # Prevent federation requests from being sent to the following
645 # blacklist IP address CIDR ranges. If this option is not specified, or
646 # specified with an empty list, no ip range blacklist will be enforced.
647 #
648 # As of Synapse v1.4.0 this option also affects any outbound requests to identity
649 # servers provided by user input.
650 #
651 # (0.0.0.0 and :: are always blacklisted, whether or not they are explicitly
652 # listed here, since they correspond to unroutable addresses.)
653 #
654 federation_ip_range_blacklist:
655 - '127.0.0.0/8'
656 - '10.0.0.0/8'
657 - '172.16.0.0/12'
658 - '192.168.0.0/16'
659 - '100.64.0.0/10'
660 - '169.254.0.0/16'
661 - '::1/128'
662 - 'fe80::/64'
663 - 'fc00::/7'
664684
665685 # Report prometheus metrics on the age of PDUs being sent to and received from
666686 # the following domains. This can be used to give an idea of "delay" on inbound
952972 # - '172.16.0.0/12'
953973 # - '192.168.0.0/16'
954974 # - '100.64.0.0/10'
975 # - '192.0.0.0/24'
955976 # - '169.254.0.0/16'
977 # - '198.18.0.0/15'
978 # - '192.0.2.0/24'
979 # - '198.51.100.0/24'
980 # - '203.0.113.0/24'
981 # - '224.0.0.0/4'
956982 # - '::1/128'
957 # - 'fe80::/64'
983 # - 'fe80::/10'
958984 # - 'fc00::/7'
959985
960986 # List of IP address CIDR ranges that the URL preview spider is allowed
17981824 # * user: The claims returned by the UserInfo Endpoint and/or in the ID
17991825 # Token
18001826 #
1801 # This must be configured if using the default mapping provider.
1827 # If this is not set, the user will be prompted to choose their
1828 # own username.
18021829 #
1803 localpart_template: "{{ user.preferred_username }}"
1830 #localpart_template: "{{ user.preferred_username }}"
18041831
18051832 # Jinja2 template for the display name to set on first login.
18061833 #
18761903 # - https://my.custom.client/
18771904
18781905 # Directory in which Synapse will try to find the template files below.
1879 # If not set, default templates from within the Synapse package will be used.
1880 #
1881 # DO NOT UNCOMMENT THIS SETTING unless you want to customise the templates.
1882 # If you *do* uncomment it, you will need to make sure that all the templates
1883 # below are in the directory.
1906 # If not set, or the files named below are not found within the template
1907 # directory, default templates from within the Synapse package will be used.
18841908 #
18851909 # Synapse will look for the following templates in this directory:
18861910 #
20442068 #
20452069 #require_uppercase: true
20462070
2071 ui_auth:
2072 # The number of milliseconds to allow a user-interactive authentication
2073 # session to be active.
2074 #
2075 # This defaults to 0, meaning the user is queried for their credentials
2076 # before every action, but this can be overridden to allow a single
2077 # validation to be re-used. This weakens the protections afforded by
2078 # the user-interactive authentication process, by allowing for multiple
2079 # (and potentially different) operations to use the same validation session.
2080 #
2081 # Uncomment below to allow for credential validation to last for 15
2082 # seconds.
2083 #
2084 #session_timeout: 15000
2085
20472086
20482087 # Configuration for sending emails from Synapse.
20492088 #
21092148 #
21102149 #validation_token_lifetime: 15m
21112150
2151 # The web client location to direct users to during an invite. This is passed
2152 # to the identity server as the org.matrix.web_client_location key. Defaults
2153 # to unset, giving no guidance to the identity server.
2154 #
2155 #invite_client_location: https://app.element.io
2156
21122157 # Directory in which Synapse will try to find the template files below.
2113 # If not set, default templates from within the Synapse package will be used.
2114 #
2115 # Do not uncomment this setting unless you want to customise the templates.
2158 # If not set, or the files named below are not found within the template
2159 # directory, default templates from within the Synapse package will be used.
21162160 #
21172161 # Synapse will look for the following templates in this directory:
21182162 #
23212365 # If enabled, non server admins can only create groups with local parts
23222366 # starting with this prefix
23232367 #
2324 #group_creation_prefix: "unofficial/"
2368 #group_creation_prefix: "unofficial_"
23252369
23262370
23272371
25862630 #
25872631 #run_background_tasks_on: worker1
25882632
2633 # A shared secret used by the replication APIs to authenticate HTTP requests
2634 # from workers.
2635 #
2636 # By default this is unused and traffic is not authenticated.
2637 #
2638 #worker_replication_secret: ""
2639
25892640
25902641 # Configuration for Redis when using workers. This *must* be enabled when
25912642 # using workers (unless using old style direct TCP configuration).
2121 * `user_may_create_room`
2222 * `user_may_create_room_alias`
2323 * `user_may_publish_room`
24 * `check_username_for_spam`
25 * `check_registration_for_spam`
2426
2527 The details of each of these methods (as well as their inputs and outputs)
2628 are documented in the `synapse.events.spamcheck.SpamChecker` class.
3133 ### Example
3234
3335 ```python
36 from synapse.spam_checker_api import RegistrationBehaviour
37
3438 class ExampleSpamChecker:
3539 def __init__(self, config, api):
3640 self.config = config
3741 self.api = api
3842
39 def check_event_for_spam(self, foo):
43 async def check_event_for_spam(self, foo):
4044 return False # allow all events
4145
42 def user_may_invite(self, inviter_userid, invitee_userid, room_id):
46 async def user_may_invite(self, inviter_userid, invitee_userid, room_id):
4347 return True # allow all invites
4448
45 def user_may_create_room(self, userid):
49 async def user_may_create_room(self, userid):
4650 return True # allow all room creations
4751
48 def user_may_create_room_alias(self, userid, room_alias):
52 async def user_may_create_room_alias(self, userid, room_alias):
4953 return True # allow all room aliases
5054
51 def user_may_publish_room(self, userid, room_id):
55 async def user_may_publish_room(self, userid, room_id):
5256 return True # allow publishing of all rooms
5357
54 def check_username_for_spam(self, user_profile):
58 async def check_username_for_spam(self, user_profile):
5559 return False # allow all usernames
60
61 async def check_registration_for_spam(self, email_threepid, username, request_info):
62 return RegistrationBehaviour.ALLOW # allow all registrations
5663 ```
5764
5865 ## Configuration
1414 SSO mapping providers are currently supported for OpenID and SAML SSO
1515 configurations. Please see the details below for how to implement your own.
1616
17 It is the responsibility of the mapping provider to normalise the SSO attributes
18 and map them to a valid Matrix ID. The
19 [specification for Matrix IDs](https://matrix.org/docs/spec/appendices#user-identifiers)
20 has some information about what is considered valid. Alternately an easy way to
21 ensure it is valid is to use a Synapse utility function:
22 `synapse.types.map_username_to_mxid_localpart`.
17 It is up to the mapping provider whether the user should be assigned a predefined
18 Matrix ID based on the SSO attributes, or allowed to choose their own
19 username.
20
21 In the first case - where users are automatically allocated a Matrix ID - it is
22 the responsibility of the mapping provider to normalise the SSO attributes and
23 map them to a valid Matrix ID. The [specification for Matrix
24 IDs](https://matrix.org/docs/spec/appendices#user-identifiers) has some
25 information about what is considered valid.
26
27 If the mapping provider does not assign a Matrix ID, then Synapse will
28 automatically serve an HTML page allowing the user to pick their own username.
2329
2430 External mapping providers are provided to Synapse in the form of an external
2531 Python module. You can retrieve this module from [PyPI](https://pypi.org) or elsewhere,
7985 with failures=1. The method should then return a different
8086 `localpart` value, such as `john.doe1`.
8187 - Returns a dictionary with two keys:
82 - localpart: A required string, used to generate the Matrix ID.
83 - displayname: An optional string, the display name for the user.
88 - `localpart`: A string, used to generate the Matrix ID. If this is
89 `None`, the user is prompted to pick their own username.
90 - `displayname`: An optional string, the display name for the user.
8491 * `get_extra_attributes(self, userinfo, token)`
8592 - This method must be async.
8693 - Arguments:
115122
116123 A custom mapping provider must specify the following methods:
117124
118 * `__init__(self, parsed_config)`
125 * `__init__(self, parsed_config, module_api)`
119126 - Arguments:
120127 - `parsed_config` - A configuration object that is the return value of the
121128 `parse_config` method. You should set any configuration options needed by
122129 the module here.
130 - `module_api` - a `synapse.module_api.ModuleApi` object which provides the
131 stable API available for extension modules.
123132 * `parse_config(config)`
124133 - This method should have the `@staticmethod` decoration.
125134 - Arguments:
162171 redirected to.
163172 - This method must return a dictionary, which will then be used by Synapse
164173 to build a new user. The following keys are allowed:
165 * `mxid_localpart` - Required. The mxid localpart of the new user.
174 * `mxid_localpart` - The mxid localpart of the new user. If this is
175 `None`, the user is prompted to pick their own username.
166176 * `displayname` - The displayname of the new user. If not provided, will default to
167177 the value of `mxid_localpart`.
168178 * `emails` - A list of emails for the new user. If not provided, will
169179 default to an empty list.
170
180
171181 Alternatively it can raise a `synapse.api.errors.RedirectException` to
172182 redirect the user to another page. This is useful to prompt the user for
173183 additional information, e.g. if you want them to provide their own username.
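As a sketch of the mapping-provider interface described above (the class and method names here are illustrative, not Synapse's; a real provider receives Synapse's `parsed_config` and `module_api` objects, and the normalisation rule below is a simplified assumption):

```python
import re


class ExampleMappingProvider:
    """Illustrative mapping provider: normalises an SSO username to a
    Matrix localpart, de-duplicating on collision as the docs suggest."""

    def __init__(self, parsed_config, module_api=None):
        self.config = parsed_config
        self.api = module_api  # stable extension-module API; unused here

    @staticmethod
    def parse_config(config):
        # Real providers would validate options here.
        return config or {}

    def map_attributes(self, username, failures=0):
        # Keep only characters that are valid in a Matrix localpart.
        localpart = re.sub(r"[^a-z0-9._=\-/]", "", username.lower())
        if failures:
            # On collision, Synapse retries with failures incremented,
            # e.g. john.doe -> john.doe1.
            localpart += str(failures)
        return {
            "mxid_localpart": localpart or None,  # None => user picks a name
            "displayname": username,
        }
```
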
8888 Normally, only a couple of changes are needed to make an existing configuration
8989 file suitable for use with workers. First, you need to enable an "HTTP replication
9090 listener" for the main process; and secondly, you need to enable redis-based
91 replication. For example:
91 replication. Optionally, a shared secret can be used to authenticate HTTP
92 traffic between workers. For example:
9293
9394
9495 ```yaml
101102 type: http
102103 resources:
103104 - names: [replication]
105
106 # Add a random shared secret to authenticate traffic.
107 worker_replication_secret: ""
104108
105109 redis:
106110 enabled: true
224228 ^/_matrix/client/(r0|unstable)/auth/.*/fallback/web$
225229
226230 # Event sending requests
231 ^/_matrix/client/(api/v1|r0|unstable)/rooms/.*/redact
227232 ^/_matrix/client/(api/v1|r0|unstable)/rooms/.*/send
228233 ^/_matrix/client/(api/v1|r0|unstable)/rooms/.*/state/
229234 ^/_matrix/client/(api/v1|r0|unstable)/rooms/.*/(join|invite|leave|ban|unban|kick)$
66 show_traceback = True
77 mypy_path = stubs
88 warn_unreachable = True
9
10 # To find all folders that pass mypy you run:
11 #
12 # find synapse/* -type d -not -name __pycache__ -exec bash -c "mypy '{}' > /dev/null" \; -print
13
914 files =
1015 scripts-dev/sign_json,
1116 synapse/api,
1217 synapse/appservice,
1318 synapse/config,
19 synapse/crypto,
1420 synapse/event_auth.py,
1521 synapse/events/builder.py,
1622 synapse/events/validator.py,
1925 synapse/handlers/_base.py,
2026 synapse/handlers/account_data.py,
2127 synapse/handlers/account_validity.py,
28 synapse/handlers/admin.py,
2229 synapse/handlers/appservice.py,
2330 synapse/handlers/auth.py,
2431 synapse/handlers/cas_handler.py,
3744 synapse/handlers/presence.py,
3845 synapse/handlers/profile.py,
3946 synapse/handlers/read_marker.py,
47 synapse/handlers/receipts.py,
4048 synapse/handlers/register.py,
4149 synapse/handlers/room.py,
50 synapse/handlers/room_list.py,
4251 synapse/handlers/room_member.py,
4352 synapse/handlers/room_member_worker.py,
4453 synapse/handlers/saml_handler.py,
54 synapse/handlers/sso.py,
4555 synapse/handlers/sync.py,
56 synapse/handlers/user_directory.py,
4657 synapse/handlers/ui_auth,
4758 synapse/http/client.py,
4859 synapse/http/federation/matrix_federation_agent.py,
5465 synapse/metrics,
5566 synapse/module_api,
5667 synapse/notifier.py,
57 synapse/push/pusherpool.py,
58 synapse/push/push_rule_evaluator.py,
68 synapse/push,
5969 synapse/replication,
6070 synapse/rest,
6171 synapse/server.py,
6272 synapse/server_notices,
6373 synapse/spam_checker_api,
6474 synapse/state,
75 synapse/storage/__init__.py,
76 synapse/storage/_base.py,
77 synapse/storage/background_updates.py,
6578 synapse/storage/databases/main/appservice.py,
6679 synapse/storage/databases/main/events.py,
80 synapse/storage/databases/main/keys.py,
81 synapse/storage/databases/main/pusher.py,
6782 synapse/storage/databases/main/registration.py,
6883 synapse/storage/databases/main/stream.py,
6984 synapse/storage/databases/main/ui_auth.py,
7085 synapse/storage/database.py,
7186 synapse/storage/engines,
87 synapse/storage/keys.py,
7288 synapse/storage/persist_events.py,
89 synapse/storage/prepare_database.py,
90 synapse/storage/purge_events.py,
91 synapse/storage/push_rule.py,
92 synapse/storage/relations.py,
93 synapse/storage/roommember.py,
7394 synapse/storage/state.py,
95 synapse/storage/types.py,
7496 synapse/storage/util,
7597 synapse/streams,
7698 synapse/types.py,
105127 ignore_missing_imports = True
106128
107129 [mypy-h11]
130 ignore_missing_imports = True
131
132 [mypy-msgpack]
108133 ignore_missing_imports = True
109134
110135 [mypy-opentracing]
3939 )
4040
4141 args = parser.parse_args()
42 args.output_file.write(DEFAULT_LOG_CONFIG.substitute(log_file=args.log_file))
42 out = args.output_file
43 out.write(DEFAULT_LOG_CONFIG.substitute(log_file=args.log_file))
44 out.flush()
3030 ) -> Optional[Callable[[MethodSigContext], CallableType]]:
3131 if fullname.startswith(
3232 "synapse.util.caches.descriptors._CachedFunction.__call__"
33 ) or fullname.startswith(
34 "synapse.util.caches.descriptors._LruCachedFunction.__call__"
3335 ):
3436 return cached_function_method_signature
3537 return None
4747 except ImportError:
4848 pass
4949
50 __version__ = "1.24.0"
50 __version__ = "1.25.0"
5151
5252 if bool(os.environ.get("SYNAPSE_TEST_PATCH_LOG_CONTEXTS", False)):
5353 # We import here so that we don't have to install a bunch of deps when
2222 import synapse.types
2323 from synapse import event_auth
2424 from synapse.api.auth_blocking import AuthBlocking
25 from synapse.api.constants import EventTypes, Membership
25 from synapse.api.constants import EventTypes, HistoryVisibility, Membership
2626 from synapse.api.errors import (
2727 AuthError,
2828 Codes,
3030 MissingClientTokenError,
3131 )
3232 from synapse.api.room_versions import KNOWN_ROOM_VERSIONS
33 from synapse.appservice import ApplicationService
3334 from synapse.events import EventBase
35 from synapse.http.site import SynapseRequest
3436 from synapse.logging import opentracing as opentracing
3537 from synapse.storage.databases.main.registration import TokenLookupResult
3638 from synapse.types import StateMap, UserID
473475 now = self.hs.get_clock().time_msec()
474476 return now < expiry
475477
476 def get_appservice_by_req(self, request):
478 def get_appservice_by_req(self, request: SynapseRequest) -> ApplicationService:
477479 token = self.get_access_token_from_request(request)
478480 service = self.store.get_app_service_by_token(token)
479481 if not service:
645647 )
646648 if (
647649 visibility
648 and visibility.content["history_visibility"] == "world_readable"
650 and visibility.content.get("history_visibility")
651 == HistoryVisibility.WORLD_READABLE
649652 ):
650653 return Membership.JOIN, None
651654 raise AuthError(
3535 self._limit_usage_by_mau = hs.config.limit_usage_by_mau
3636 self._mau_limits_reserved_threepids = hs.config.mau_limits_reserved_threepids
3737 self._server_name = hs.hostname
38 self._track_appservice_user_ips = hs.config.appservice.track_appservice_user_ips
3839
3940 async def check_auth_blocking(
4041 self,
7475 elif requester.authenticated_entity == self._server_name:
7576 # We never block the server from doing actions on behalf of
7677 # users.
78 return
79 elif requester.app_service and not self._track_appservice_user_ips:
80 # If we're authenticated as an appservice then we only block
81 # auth if `track_appservice_user_ips` is set, as that option
82 # implicitly means that application services are part of MAU
83 # limits.
7784 return
7885
7986 # Never fail an auth check for the server notices users or support user
9494
9595 Presence = "m.presence"
9696
97 Dummy = "org.matrix.dummy_event"
98
9799
98100 class RejectedReason:
99101 AUTH_ERROR = "auth_error"
159161 class AccountDataTypes:
160162 DIRECT = "m.direct"
161163 IGNORED_USER_LIST = "m.ignored_user_list"
164
165
166 class HistoryVisibility:
167 INVITED = "invited"
168 JOINED = "joined"
169 SHARED = "shared"
170 WORLD_READABLE = "world_readable"
244244 # Set up the SIGHUP machinery.
245245 if hasattr(signal, "SIGHUP"):
246246
247 reactor = hs.get_reactor()
248
247249 @wrap_as_background_process("sighup")
248250 def handle_sighup(*args, **kwargs):
249251 # Tell systemd our state, if we're using it. This will silently fail if
259261 # is so that we're in a sane state, e.g. flushing the logs may fail
260262 # if the sighup happens in the middle of writing a log entry.
261263 def run_sighup(*args, **kwargs):
262 hs.get_clock().call_later(0, handle_sighup, *args, **kwargs)
264 # `callFromThread` should be "signal safe" as well as thread
265 # safe.
266 reactor.callFromThread(handle_sighup, *args, **kwargs)
263267
264268 signal.signal(signal.SIGHUP, run_sighup)
265269
8888 ToDeviceStream,
8989 )
9090 from synapse.rest.admin import register_servlets_for_media_repo
91 from synapse.rest.client.v1 import events
91 from synapse.rest.client.v1 import events, room
9292 from synapse.rest.client.v1.initial_sync import InitialSyncRestServlet
9393 from synapse.rest.client.v1.login import LoginRestServlet
9494 from synapse.rest.client.v1.profile import (
9797 ProfileRestServlet,
9898 )
9999 from synapse.rest.client.v1.push_rule import PushRuleRestServlet
100 from synapse.rest.client.v1.room import (
101 JoinedRoomMemberListRestServlet,
102 JoinRoomAliasServlet,
103 PublicRoomListRestServlet,
104 RoomEventContextServlet,
105 RoomInitialSyncRestServlet,
106 RoomMemberListRestServlet,
107 RoomMembershipRestServlet,
108 RoomMessageListRestServlet,
109 RoomSendEventRestServlet,
110 RoomStateEventRestServlet,
111 RoomStateRestServlet,
112 RoomTypingRestServlet,
113 )
114100 from synapse.rest.client.v1.voip import VoipRestServlet
115101 from synapse.rest.client.v2_alpha import groups, sync, user_directory
116102 from synapse.rest.client.v2_alpha._base import client_patterns
265251 super().__init__(hs)
266252 self.hs = hs
267253 self.is_mine_id = hs.is_mine_id
268 self.http_client = hs.get_simple_http_client()
269254
270255 self._presence_enabled = hs.config.use_presence
271256
512497 elif name == "client":
513498 resource = JsonResource(self, canonical_json=False)
514499
515 PublicRoomListRestServlet(self).register(resource)
516 RoomMemberListRestServlet(self).register(resource)
517 JoinedRoomMemberListRestServlet(self).register(resource)
518 RoomStateRestServlet(self).register(resource)
519 RoomEventContextServlet(self).register(resource)
520 RoomMessageListRestServlet(self).register(resource)
521500 RegisterRestServlet(self).register(resource)
522501 LoginRestServlet(self).register(resource)
523502 ThreepidRestServlet(self).register(resource)
526505 VoipRestServlet(self).register(resource)
527506 PushRuleRestServlet(self).register(resource)
528507 VersionsRestServlet(self).register(resource)
529 RoomSendEventRestServlet(self).register(resource)
530 RoomMembershipRestServlet(self).register(resource)
531 RoomStateEventRestServlet(self).register(resource)
532 JoinRoomAliasServlet(self).register(resource)
508
533509 ProfileAvatarURLRestServlet(self).register(resource)
534510 ProfileDisplaynameRestServlet(self).register(resource)
535511 ProfileRestServlet(self).register(resource)
536512 KeyUploadServlet(self).register(resource)
537513 AccountDataServlet(self).register(resource)
538514 RoomAccountDataServlet(self).register(resource)
539 RoomTypingRestServlet(self).register(resource)
540515
541516 sync.register_servlets(self, resource)
542517 events.register_servlets(self, resource)
518 room.register_servlets(self, resource, True)
519 room.register_deprecated_servlets(self, resource)
543520 InitialSyncRestServlet(self).register(resource)
544 RoomInitialSyncRestServlet(self).register(resource)
545521
546522 user_directory.register_servlets(self, resource)
547523
1818 import logging
1919 import os
2020 import sys
21 from typing import Iterable
21 from typing import Iterable, Iterator
2222
2323 from twisted.application import service
2424 from twisted.internet import defer, reactor
6262 from synapse.rest.admin import AdminRestResource
6363 from synapse.rest.health import HealthResource
6464 from synapse.rest.key.v2 import KeyApiV2Resource
65 from synapse.rest.synapse.client.pick_username import pick_username_resource
6566 from synapse.rest.well_known import WellKnownResource
6667 from synapse.server import HomeServer
6768 from synapse.storage import DataStore
8990 tls = listener_config.tls
9091 site_tag = listener_config.http_options.tag
9192 if site_tag is None:
92 site_tag = port
93 site_tag = str(port)
9394
9495 # We always include a health resource.
9596 resources = {"/health": HealthResource()}
106107 logger.debug("Configuring additional resources: %r", additional_resources)
107108 module_api = self.get_module_api()
108109 for path, resmodule in additional_resources.items():
109 handler_cls, config = load_module(resmodule)
110 handler_cls, config = load_module(
111 resmodule,
112 ("listeners", site_tag, "additional_resources", "<%s>" % (path,)),
113 )
110114 handler = handler_cls(config, module_api)
111115 if IResource.providedBy(handler):
112116 resource = handler
188192 "/_matrix/client/versions": client_resource,
189193 "/.well-known/matrix/client": WellKnownResource(self),
190194 "/_synapse/admin": AdminRestResource(self),
195 "/_synapse/client/pick_username": pick_username_resource(self),
191196 }
192197 )
193198
341346 "Synapse Homeserver", config_options
342347 )
343348 except ConfigError as e:
344 sys.stderr.write("\nERROR: %s\n" % (e,))
349 sys.stderr.write("\n")
350 for f in format_config_error(e):
351 sys.stderr.write(f)
352 sys.stderr.write("\n")
345353 sys.exit(1)
346354
347355 if not config:
444452 return hs
445453
446454
455 def format_config_error(e: ConfigError) -> Iterator[str]:
456 """
457 Formats a config error neatly
458
459 The idea is to format the immediate error, plus the "causes" of those errors,
460 hopefully in a way that makes sense to the user. For example:
461
462 Error in configuration at 'oidc_config.user_mapping_provider.config.display_name_template':
463 Failed to parse config for module 'JinjaOidcMappingProvider':
464 invalid jinja template:
465 unexpected end of template, expected 'end of print statement'.
466
467 Args:
468 e: the error to be formatted
469
470 Returns: An iterator which yields string fragments to be formatted
471 """
472 yield "Error in configuration"
473
474 if e.path:
475 yield " at '%s'" % (".".join(e.path),)
476
477 yield ":\n %s" % (e.msg,)
478
479 e = e.__cause__
480 indent = 1
481 while e:
482 indent += 1
483 yield ":\n%s%s" % (" " * indent, str(e))
484 e = e.__cause__
485
486
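A self-contained sketch of how the formatter above renders a chain of `__cause__` exceptions (the `ConfigError` here is a minimal stand-in for Synapse's, and the indentation widths are an assumption):

```python
class ConfigError(Exception):
    """Minimal stand-in: a message plus an optional path in the config."""

    def __init__(self, msg, path=None):
        self.msg = msg
        self.path = path


def format_config_error(e):
    # Yields string fragments: the error, its path, then each cause
    # at increasing indentation.
    yield "Error in configuration"
    if e.path:
        yield " at '%s'" % (".".join(e.path),)
    yield ":\n  %s" % (e.msg,)
    cause = e.__cause__
    indent = 1
    while cause:
        indent += 1
        yield ":\n%s%s" % ("  " * indent, str(cause))
        cause = cause.__cause__


try:
    try:
        raise ValueError("unexpected end of template")
    except ValueError as inner:
        raise ConfigError(
            "Failed to parse config for module 'JinjaOidcMappingProvider'",
            path=("oidc_config", "user_mapping_provider"),
        ) from inner
except ConfigError as err:
    message = "".join(format_config_error(err))
```
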
447487 class SynapseService(service.Service):
448488 """
449489 A twisted Service class that will start synapse. Used to run synapse
2222 from collections import OrderedDict
2323 from hashlib import sha256
2424 from textwrap import dedent
25 from typing import Any, Callable, List, MutableMapping, Optional
25 from typing import Any, Callable, Iterable, List, MutableMapping, Optional
2626
2727 import attr
2828 import jinja2
3131
3232
3333 class ConfigError(Exception):
34 pass
34 """Represents a problem parsing the configuration
35
36 Args:
37 msg: A textual description of the error.
38 path: Where appropriate, an indication of where in the configuration
39 the problem lies.
40 """
41
42 def __init__(self, msg: str, path: Optional[Iterable[str]] = None):
43 self.msg = msg
44 self.path = path
3545
3646
3747 # We split these messages out to allow packages to override with package
0 from typing import Any, List, Optional
0 from typing import Any, Iterable, List, Optional
11
22 from synapse.config import (
33 api,
44 appservice,
5 auth,
56 captcha,
67 cas,
78 consent_config,
1314 logger,
1415 metrics,
1516 oidc_config,
16 password,
1717 password_auth_providers,
1818 push,
1919 ratelimiting,
3434 workers,
3535 )
3636
37 class ConfigError(Exception): ...
37 class ConfigError(Exception):
38 def __init__(self, msg: str, path: Optional[Iterable[str]] = None):
39 self.msg = msg
40 self.path = path
3841
3942 MISSING_REPORT_STATS_CONFIG_INSTRUCTIONS: str
4043 MISSING_REPORT_STATS_SPIEL: str
6164 sso: sso.SSOConfig
6265 oidc: oidc_config.OIDCConfig
6366 jwt: jwt_config.JWTConfig
64 password: password.PasswordConfig
67 auth: auth.AuthConfig
6568 email: emailconfig.EmailConfig
6669 worker: workers.WorkerConfig
6770 authproviders: password_auth_providers.PasswordAuthProviderConfig
3737 try:
3838 jsonschema.validate(config, json_schema)
3939 except jsonschema.ValidationError as e:
40 # copy `config_path` before modifying it.
41 path = list(config_path)
42 for p in list(e.path):
43 if isinstance(p, int):
44 path.append("<item %i>" % p)
45 else:
46 path.append(str(p))
40 raise json_error_to_config_error(e, config_path)
4741
48 raise ConfigError(
49 "Unable to parse configuration: %s at %s" % (e.message, ".".join(path))
50 )
42
43 def json_error_to_config_error(
44 e: jsonschema.ValidationError, config_path: Iterable[str]
45 ) -> ConfigError:
46 """Converts a json validation error to a user-readable ConfigError
47
48 Args:
49 e: the exception to be converted
50 config_path: the path within the config file. This will be used as a basis
51 for the error message.
52
53 Returns:
54 a ConfigError
55 """
56 # copy `config_path` before modifying it.
57 path = list(config_path)
58 for p in list(e.path):
59 if isinstance(p, int):
60 path.append("<item %i>" % p)
61 else:
62 path.append(str(p))
63 return ConfigError(e.message, path)
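The path translation above turns list indices from the jsonschema error into `<item N>` markers. In isolation (the helper name is ours, and no `jsonschema` is needed to see the behaviour):

```python
def build_config_path(config_path, error_path):
    # Mirrors the loop in json_error_to_config_error: integer components
    # (list indices in the JSON document) are rendered as "<item N>",
    # everything else is stringified.
    path = list(config_path)
    for p in error_path:
        if isinstance(p, int):
            path.append("<item %i>" % p)
        else:
            path.append(str(p))
    return path
```

So an error at index 0 of `federation_metrics_domains` is reported at the path `federation_metrics_domains.<item 0>`.
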
0 # -*- coding: utf-8 -*-
1 # Copyright 2015, 2016 OpenMarket Ltd
2 # Copyright 2020 The Matrix.org Foundation C.I.C.
3 #
4 # Licensed under the Apache License, Version 2.0 (the "License");
5 # you may not use this file except in compliance with the License.
6 # You may obtain a copy of the License at
7 #
8 # http://www.apache.org/licenses/LICENSE-2.0
9 #
10 # Unless required by applicable law or agreed to in writing, software
11 # distributed under the License is distributed on an "AS IS" BASIS,
12 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
13 # See the License for the specific language governing permissions and
14 # limitations under the License.
15
16 from ._base import Config
17
18
19 class AuthConfig(Config):
20 """Password and login configuration
21 """
22
23 section = "auth"
24
25 def read_config(self, config, **kwargs):
26 password_config = config.get("password_config", {})
27 if password_config is None:
28 password_config = {}
29
30 self.password_enabled = password_config.get("enabled", True)
31 self.password_localdb_enabled = password_config.get("localdb_enabled", True)
32 self.password_pepper = password_config.get("pepper", "")
33
34 # Password policy
35 self.password_policy = password_config.get("policy") or {}
36 self.password_policy_enabled = self.password_policy.get("enabled", False)
37
38 # User-interactive authentication
39 ui_auth = config.get("ui_auth") or {}
40 self.ui_auth_session_timeout = ui_auth.get("session_timeout", 0)
41
42 def generate_config_section(self, config_dir_path, server_name, **kwargs):
43 return """\
44 password_config:
45 # Uncomment to disable password login
46 #
47 #enabled: false
48
49 # Uncomment to disable authentication against the local password
50 # database. This is ignored if `enabled` is false, and is only useful
51 # if you have other password_providers.
52 #
53 #localdb_enabled: false
54
55 # Uncomment and change to a secret random string for extra security.
56 # DO NOT CHANGE THIS AFTER INITIAL SETUP!
57 #
58 #pepper: "EVEN_MORE_SECRET"
59
60 # Define and enforce a password policy. Each parameter is optional.
61 # This is an implementation of MSC2000.
62 #
63 policy:
64 # Whether to enforce the password policy.
65 # Defaults to 'false'.
66 #
67 #enabled: true
68
69 # Minimum accepted length for a password.
70 # Defaults to 0.
71 #
72 #minimum_length: 15
73
74 # Whether a password must contain at least one digit.
75 # Defaults to 'false'.
76 #
77 #require_digit: true
78
79 # Whether a password must contain at least one symbol.
80 # A symbol is any character that's not a number or a letter.
81 # Defaults to 'false'.
82 #
83 #require_symbol: true
84
85 # Whether a password must contain at least one lowercase letter.
86 # Defaults to 'false'.
87 #
88 #require_lowercase: true
89
90 # Whether a password must contain at least one uppercase letter.
91 # Defaults to 'false'.
92 #
93 #require_uppercase: true
94
95 ui_auth:
96 # The number of milliseconds to allow a user-interactive authentication
97 # session to be active.
98 #
99 # This defaults to 0, meaning the user is queried for their credentials
100 # before every action, but this can be overridden to allow a single
101 # validation to be re-used. This weakens the protections afforded by
102 # the user-interactive authentication process, by allowing for multiple
103 # (and potentially different) operations to use the same validation session.
104 #
105 # Uncomment below to allow credential validation to last for 15
106 # seconds.
107 #
108 #session_timeout: 15000
109 """
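The `session_timeout` semantics read from this config — 0 means every action re-prompts, while a positive value lets one validation be re-used within the window — can be sketched as (a hypothetical helper, not Synapse code):

```python
def can_reuse_validation(validated_at_ms, now_ms, session_timeout_ms=0):
    """True if a prior user-interactive auth validation is still usable.

    With the default session_timeout of 0 this is never true, so the
    user is queried for their credentials before every action.
    """
    return now_ms - validated_at_ms < session_timeout_ms
```
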
321321
322322 self.email_subjects = EmailSubjectConfig(**subjects)
323323
324 # The invite client location should be an HTTP(S) URL or None.
325 self.invite_client_location = email_config.get("invite_client_location") or None
326 if self.invite_client_location:
327 if not isinstance(self.invite_client_location, str):
328 raise ConfigError(
329 "Config option email.invite_client_location must be type str"
330 )
331 if not (
332 self.invite_client_location.startswith("http://")
333 or self.invite_client_location.startswith("https://")
334 ):
335 raise ConfigError(
336 "Config option email.invite_client_location must be a http or https URL",
337 path=("email", "invite_client_location"),
338 )
339
324340 def generate_config_section(self, config_dir_path, server_name, **kwargs):
325341 return (
326342 """\
388404 #
389405 #validation_token_lifetime: 15m
390406
407 # The web client location to direct users to during an invite. This is passed
408 # to the identity server as the org.matrix.web_client_location key. Defaults
409 # to unset, giving no guidance to the identity server.
410 #
411 #invite_client_location: https://app.element.io
412
391413 # Directory in which Synapse will try to find the template files below.
392 # If not set, default templates from within the Synapse package will be used.
393 #
394 # Do not uncomment this setting unless you want to customise the templates.
414 # If not set, or the files named below are not found within the template
415 # directory, default templates from within the Synapse package will be used.
395416 #
396417 # Synapse will look for the following templates in this directory:
397418 #
1111 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
1212 # See the License for the specific language governing permissions and
1313 # limitations under the License.
14
1514 from typing import Optional
1615
17 from netaddr import IPSet
18
19 from synapse.config._base import Config, ConfigError
16 from synapse.config._base import Config
2017 from synapse.config._util import validate_config
2118
2219
3431
3532 for domain in federation_domain_whitelist:
3633 self.federation_domain_whitelist[domain] = True
37
38 self.federation_ip_range_blacklist = config.get(
39 "federation_ip_range_blacklist", []
40 )
41
42 # Attempt to create an IPSet from the given ranges
43 try:
44 self.federation_ip_range_blacklist = IPSet(
45 self.federation_ip_range_blacklist
46 )
47
48 # Always blacklist 0.0.0.0, ::
49 self.federation_ip_range_blacklist.update(["0.0.0.0", "::"])
50 except Exception as e:
51 raise ConfigError(
52 "Invalid range(s) provided in federation_ip_range_blacklist: %s" % e
53 )
5434
5535 federation_metrics_domains = config.get("federation_metrics_domains") or []
5636 validate_config(
7555 # - nyc.example.com
7656 # - syd.example.com
7757
78 # Prevent federation requests from being sent to the following
79 # blacklist IP address CIDR ranges. If this option is not specified, or
80 # specified with an empty list, no ip range blacklist will be enforced.
81 #
82 # As of Synapse v1.4.0 this option also affects any outbound requests to identity
83 # servers provided by user input.
84 #
85 # (0.0.0.0 and :: are always blacklisted, whether or not they are explicitly
86 # listed here, since they correspond to unroutable addresses.)
87 #
88 federation_ip_range_blacklist:
89 - '127.0.0.0/8'
90 - '10.0.0.0/8'
91 - '172.16.0.0/12'
92 - '192.168.0.0/16'
93 - '100.64.0.0/10'
94 - '169.254.0.0/16'
95 - '::1/128'
96 - 'fe80::/64'
97 - 'fc00::/7'
98
9958 # Report prometheus metrics on the age of PDUs being sent to and received from
10059 # the following domains. This can be used to give an idea of "delay" on inbound
10160 # and outbound federation, though be aware that any delay can be due to problems
3131 # If enabled, non server admins can only create groups with local parts
3232 # starting with this prefix
3333 #
34 #group_creation_prefix: "unofficial/"
34 #group_creation_prefix: "unofficial_"
3535 """
1616 from ._base import RootConfig
1717 from .api import ApiConfig
1818 from .appservice import AppServiceConfig
19 from .auth import AuthConfig
1920 from .cache import CacheConfig
2021 from .captcha import CaptchaConfig
2122 from .cas import CasConfig
2930 from .logger import LoggingConfig
3031 from .metrics import MetricsConfig
3132 from .oidc_config import OIDCConfig
32 from .password import PasswordConfig
3333 from .password_auth_providers import PasswordAuthProviderConfig
3434 from .push import PushConfig
3535 from .ratelimiting import RatelimitConfig
7575 CasConfig,
7676 SSOConfig,
7777 JWTConfig,
78 PasswordConfig,
78 AuthConfig,
7979 EmailConfig,
8080 PasswordAuthProviderConfig,
8181 PushConfig,
205205 # filter options, but care must when using e.g. MemoryHandler to buffer
206206 # writes.
207207
208 log_context_filter = LoggingContextFilter(request="")
208 log_context_filter = LoggingContextFilter()
209209 log_metadata_filter = MetadataFilter({"server_name": config.server_name})
210210 old_factory = logging.getLogRecordFactory()
211211
6565 (
6666 self.oidc_user_mapping_provider_class,
6767 self.oidc_user_mapping_provider_config,
68 ) = load_module(ump_config)
68 ) = load_module(ump_config, ("oidc_config", "user_mapping_provider"))
6969
7070 # Ensure loaded user mapping module has defined all necessary methods
7171 required_methods = [
202202 # * user: The claims returned by the UserInfo Endpoint and/or in the ID
203203 # Token
204204 #
205 # This must be configured if using the default mapping provider.
206 #
207 localpart_template: "{{{{ user.preferred_username }}}}"
205 # If this is not set, the user will be prompted to choose their
206 # own username.
207 #
208 #localpart_template: "{{{{ user.preferred_username }}}}"
208209
209210 # Jinja2 template for the display name to set on first login.
210211 #
+0
-90
synapse/config/password.py
0 # -*- coding: utf-8 -*-
1 # Copyright 2015, 2016 OpenMarket Ltd
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 from ._base import Config
16
17
18 class PasswordConfig(Config):
19 """Password login configuration
20 """
21
22 section = "password"
23
24 def read_config(self, config, **kwargs):
25 password_config = config.get("password_config", {})
26 if password_config is None:
27 password_config = {}
28
29 self.password_enabled = password_config.get("enabled", True)
30 self.password_localdb_enabled = password_config.get("localdb_enabled", True)
31 self.password_pepper = password_config.get("pepper", "")
32
33 # Password policy
34 self.password_policy = password_config.get("policy") or {}
35 self.password_policy_enabled = self.password_policy.get("enabled", False)
36
37 def generate_config_section(self, config_dir_path, server_name, **kwargs):
38 return """\
39 password_config:
40 # Uncomment to disable password login
41 #
42 #enabled: false
43
44 # Uncomment to disable authentication against the local password
45 # database. This is ignored if `enabled` is false, and is only useful
46 # if you have other password_providers.
47 #
48 #localdb_enabled: false
49
50 # Uncomment and change to a secret random string for extra security.
51 # DO NOT CHANGE THIS AFTER INITIAL SETUP!
52 #
53 #pepper: "EVEN_MORE_SECRET"
54
55 # Define and enforce a password policy. Each parameter is optional.
56 # This is an implementation of MSC2000.
57 #
58 policy:
59 # Whether to enforce the password policy.
60 # Defaults to 'false'.
61 #
62 #enabled: true
63
64 # Minimum accepted length for a password.
65 # Defaults to 0.
66 #
67 #minimum_length: 15
68
69 # Whether a password must contain at least one digit.
70 # Defaults to 'false'.
71 #
72 #require_digit: true
73
74 # Whether a password must contain at least one symbol.
75 # A symbol is any character that's not a number or a letter.
76 # Defaults to 'false'.
77 #
78 #require_symbol: true
79
80 # Whether a password must contain at least one lowercase letter.
81 # Defaults to 'false'.
82 #
83 #require_lowercase: true
84
85 # Whether a password must contain at least one uppercase letter.
86 # Defaults to 'false'.
87 #
88 #require_uppercase: true
89 """
3535 providers.append({"module": LDAP_PROVIDER, "config": ldap_config})
3636
3737 providers.extend(config.get("password_providers") or [])
38 for provider in providers:
38 for i, provider in enumerate(providers):
3939 mod_name = provider["module"]
4040
4141 # This is for backwards compat when the ldap auth provider resided
4444 mod_name = LDAP_PROVIDER
4545
4646 (provider_class, provider_config) = load_module(
47 {"module": mod_name, "config": provider["config"]}
47 {"module": mod_name, "config": provider["config"]},
48 ("password_providers", "<item %i>" % i),
4849 )
4950
5051 self.password_providers.append((provider_class, provider_config))
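Several hunks in this diff thread a config-path tuple such as `("password_providers", "<item %i>" % i)` into `load_module`. A sketch of why that helps, with illustrative helpers (this is not Synapse's actual `load_module` implementation): the path tuple lets a `ConfigError` name the exact offending entry in a list of providers.

```python
# Hedged sketch: threading a config path through module loading so that
# errors point at the exact config entry. ConfigError and load_module here
# are simplified stand-ins for Synapse's versions.
import importlib
from typing import Any, Tuple


class ConfigError(Exception):
    def __init__(self, msg: str, path: Tuple[str, ...] = ()):
        super().__init__(msg)
        self.path = path


def load_module(provider: dict, config_path: Tuple[str, ...]) -> Tuple[Any, Any]:
    modulename = provider.get("module")
    if not isinstance(modulename, str):
        # The accumulated path pinpoints which list item was malformed.
        raise ConfigError("expected a string", config_path + ("module",))
    module_name, _, clz = modulename.rpartition(".")
    module = importlib.import_module(module_name)
    return getattr(module, clz), provider.get("config")


try:
    load_module({"module": 123}, ("password_providers", "<item 0>"))
except ConfigError as e:
    assert e.path == ("password_providers", "<item 0>", "module")
```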
1616 from collections import namedtuple
1717 from typing import Dict, List
1818
19 from netaddr import IPSet
20
21 from synapse.config.server import DEFAULT_IP_RANGE_BLACKLIST
1922 from synapse.python_dependencies import DependencyException, check_requirements
2023 from synapse.util.module_loader import load_module
2124
141144 # them to be started.
142145 self.media_storage_providers = [] # type: List[tuple]
143146
144 for provider_config in storage_providers:
147 for i, provider_config in enumerate(storage_providers):
145148 # We special case the module "file_system" so as not to need to
146149 # expose FileStorageProviderBackend
147150 if provider_config["module"] == "file_system":
150153 ".FileStorageProviderBackend"
151154 )
152155
153 provider_class, parsed_config = load_module(provider_config)
156 provider_class, parsed_config = load_module(
157 provider_config, ("media_storage_providers", "<item %i>" % i)
158 )
154159
155160 wrapper_config = MediaStorageProviderConfig(
156161 provider_config.get("store_local", False),
181186 "to work"
182187 )
183188
184 # netaddr is a dependency for url_preview
185 from netaddr import IPSet
186
187189 self.url_preview_ip_range_blacklist = IPSet(
188190 config["url_preview_ip_range_blacklist"]
189191 )
211213 )
212214 # strip final NL
213215 formatted_thumbnail_sizes = formatted_thumbnail_sizes[:-1]
216
217 ip_range_blacklist = "\n".join(
218 " # - '%s'" % ip for ip in DEFAULT_IP_RANGE_BLACKLIST
219 )
214220
215221 return (
216222 r"""
282288 # you uncomment the following list as a starting point.
283289 #
284290 #url_preview_ip_range_blacklist:
285 # - '127.0.0.0/8'
286 # - '10.0.0.0/8'
287 # - '172.16.0.0/12'
288 # - '192.168.0.0/16'
289 # - '100.64.0.0/10'
290 # - '169.254.0.0/16'
291 # - '::1/128'
292 # - 'fe80::/64'
293 # - 'fc00::/7'
291 %(ip_range_blacklist)s
294292
295293 # List of IP address CIDR ranges that the URL preview spider is allowed
296294 # to access even if they are specified in url_preview_ip_range_blacklist.
179179 self._alias_regex = glob_to_regex(alias)
180180 self._room_id_regex = glob_to_regex(room_id)
181181 except Exception as e:
182 raise ConfigError("Failed to parse glob into regex: %s", e)
182 raise ConfigError("Failed to parse glob into regex") from e
183183
184184 def matches(self, user_id, room_id, aliases):
185185 """Tests if this rule matches the given user_id, room_id and aliases.
124124 (
125125 self.saml2_user_mapping_provider_class,
126126 self.saml2_user_mapping_provider_config,
127 ) = load_module(ump_dict)
127 ) = load_module(ump_dict, ("saml2_config", "user_mapping_provider"))
128128
129129 # Ensure loaded user mapping module has defined all necessary methods
130130 # Note parse_config() is already checked during the call to load_module
2222
2323 import attr
2424 import yaml
25 from netaddr import IPSet
2526
2627 from synapse.api.room_versions import KNOWN_ROOM_VERSIONS
2728 from synapse.http.endpoint import parse_and_validate_server_name
3738 # We later check for errors when binding to 0.0.0.0 and ignore them if :: is also in
3839 # in the list.
3940 DEFAULT_BIND_ADDRESSES = ["::", "0.0.0.0"]
41
42 DEFAULT_IP_RANGE_BLACKLIST = [
43 # Localhost
44 "127.0.0.0/8",
45 # Private networks.
46 "10.0.0.0/8",
47 "172.16.0.0/12",
48 "192.168.0.0/16",
49 # Carrier grade NAT.
50 "100.64.0.0/10",
51 # Address registry.
52 "192.0.0.0/24",
53 # Link-local networks.
54 "169.254.0.0/16",
55 # Testing networks.
56 "198.18.0.0/15",
57 "192.0.2.0/24",
58 "198.51.100.0/24",
59 "203.0.113.0/24",
60 # Multicast.
61 "224.0.0.0/4",
62 # Localhost
63 "::1/128",
64 # Link-local addresses.
65 "fe80::/10",
66 # Unique local addresses.
67 "fc00::/7",
68 ]
4069
4170 DEFAULT_ROOM_VERSION = "6"
4271
254283 # Admin uri to direct users at should their instance become blocked
255284 # due to resource constraints
256285 self.admin_contact = config.get("admin_contact", None)
286
287 ip_range_blacklist = config.get(
288 "ip_range_blacklist", DEFAULT_IP_RANGE_BLACKLIST
289 )
290
291 # Attempt to create an IPSet from the given ranges
292 try:
293 self.ip_range_blacklist = IPSet(ip_range_blacklist)
294 except Exception as e:
295 raise ConfigError("Invalid range(s) provided in ip_range_blacklist.") from e
296 # Always blacklist 0.0.0.0, ::
297 self.ip_range_blacklist.update(["0.0.0.0", "::"])
298
299 try:
300 self.ip_range_whitelist = IPSet(config.get("ip_range_whitelist", ()))
301 except Exception as e:
302 raise ConfigError("Invalid range(s) provided in ip_range_whitelist.") from e
303
304 # The federation_ip_range_blacklist is used for backwards-compatibility
305 # and only applies to federation and identity servers. If it is not given,
306 # default to ip_range_blacklist.
307 federation_ip_range_blacklist = config.get(
308 "federation_ip_range_blacklist", ip_range_blacklist
309 )
310 try:
311 self.federation_ip_range_blacklist = IPSet(federation_ip_range_blacklist)
312 except Exception as e:
313 raise ConfigError(
314 "Invalid range(s) provided in federation_ip_range_blacklist."
315 ) from e
316 # Always blacklist 0.0.0.0, ::
317 self.federation_ip_range_blacklist.update(["0.0.0.0", "::"])
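The new `ip_range_blacklist` handling above builds a netaddr `IPSet` from the configured ranges and always adds the unroutable `0.0.0.0` and `::`. A stdlib-only sketch of the same logic, using `ipaddress` in place of netaddr so it is self-contained (Synapse itself uses `IPSet`):

```python
# Hedged sketch of the blacklist logic using only the stdlib ipaddress
# module: parse the configured CIDR ranges and always include 0.0.0.0
# and ::, as the diff's read_config does.
import ipaddress

DEFAULT_IP_RANGE_BLACKLIST = ["127.0.0.0/8", "10.0.0.0/8", "192.168.0.0/16", "fe80::/10"]


def build_blacklist(ranges):
    nets = [ipaddress.ip_network(r) for r in ranges]
    # Always blacklist the unroutable 0.0.0.0 and ::.
    nets += [ipaddress.ip_network("0.0.0.0/32"), ipaddress.ip_network("::/128")]
    return nets


def is_blacklisted(addr, nets):
    ip = ipaddress.ip_address(addr)
    return any(ip in n for n in nets if n.version == ip.version)


nets = build_blacklist(DEFAULT_IP_RANGE_BLACKLIST)
assert is_blacklisted("10.1.2.3", nets)
assert is_blacklisted("0.0.0.0", nets)
assert not is_blacklisted("8.8.8.8", nets)
```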
257318
258319 if self.public_baseurl is not None:
259320 if self.public_baseurl[-1] != "/":
560621 def generate_config_section(
561622 self, server_name, data_dir_path, open_private_ports, listeners, **kwargs
562623 ):
624 ip_range_blacklist = "\n".join(
625 " # - '%s'" % ip for ip in DEFAULT_IP_RANGE_BLACKLIST
626 )
627
563628 _, bind_port = parse_and_validate_server_name(server_name)
564629 if bind_port is not None:
565630 unsecure_port = bind_port - 400
751816 #
752817 #enable_search: false
753818
819 # Prevent outgoing requests from being sent to the following blacklisted IP address
820 # CIDR ranges. If this option is not specified then it defaults to private IP
821 # address ranges (see the example below).
822 #
823 # The blacklist applies to the outbound requests for federation, identity servers,
824 # push servers, and for checking key validity for third-party invite events.
825 #
826 # (0.0.0.0 and :: are always blacklisted, whether or not they are explicitly
827 # listed here, since they correspond to unroutable addresses.)
828 #
829 # This option replaces federation_ip_range_blacklist in Synapse v1.25.0.
830 #
831 #ip_range_blacklist:
832 %(ip_range_blacklist)s
833
834 # List of IP address CIDR ranges that should be allowed for federation,
835 # identity servers, push servers, and for checking key validity for
836 # third-party invite events. This is useful for specifying exceptions to
837 # wide-ranging blacklisted target IP ranges - e.g. for communication with
838 # a push server only visible in your network.
839 #
840 # This whitelist overrides ip_range_blacklist and defaults to an empty
841 # list.
842 #
843 #ip_range_whitelist:
844 # - '192.168.1.1'
845
754846 # List of ports that Synapse should listen on, their purpose and their
755847 # configuration.
756848 #
3232 # spam checker, and thus was simply a dictionary with module
3333 # and config keys. Support this old behaviour by checking
3434 # to see if the option resolves to a dictionary
35 self.spam_checkers.append(load_module(spam_checkers))
35 self.spam_checkers.append(load_module(spam_checkers, ("spam_checker",)))
3636 elif isinstance(spam_checkers, list):
37 for spam_checker in spam_checkers:
37 for i, spam_checker in enumerate(spam_checkers):
38 config_path = ("spam_checker", "<item %i>" % i)
3839 if not isinstance(spam_checker, dict):
39 raise ConfigError("spam_checker syntax is incorrect")
40 raise ConfigError("expected a mapping", config_path)
4041
41 self.spam_checkers.append(load_module(spam_checker))
42 self.spam_checkers.append(load_module(spam_checker, config_path))
4243 else:
4344 raise ConfigError("spam_checker syntax is incorrect")
4445
9292 # - https://my.custom.client/
9393
9494 # Directory in which Synapse will try to find the template files below.
95 # If not set, default templates from within the Synapse package will be used.
96 #
97 # DO NOT UNCOMMENT THIS SETTING unless you want to customise the templates.
98 # If you *do* uncomment it, you will need to make sure that all the templates
99 # below are in the directory.
95 # If not set, or the files named below are not found within the template
96 # directory, default templates from within the Synapse package will be used.
10097 #
10198 # Synapse will look for the following templates in this directory:
10299 #
2525
2626 provider = config.get("third_party_event_rules", None)
2727 if provider is not None:
28 self.third_party_event_rules = load_module(provider)
28 self.third_party_event_rules = load_module(
29 provider, ("third_party_event_rules",)
30 )
2931
3032 def generate_config_section(self, **kwargs):
3133 return """\
8484 # The port on the main synapse for HTTP replication endpoint
8585 self.worker_replication_http_port = config.get("worker_replication_http_port")
8686
87 # The shared secret used for authentication when connecting to the main synapse.
88 self.worker_replication_secret = config.get("worker_replication_secret", None)
89
8790 self.worker_name = config.get("worker_name", self.worker_app)
8891
8992 self.worker_main_http_uri = config.get("worker_main_http_uri", None)
184187 # data). If not provided this defaults to the main process.
185188 #
186189 #run_background_tasks_on: worker1
190
191 # A shared secret used by the replication APIs to authenticate HTTP requests
192 # from workers.
193 #
194 # By default this is unused and traffic is not authenticated.
195 #
196 #worker_replication_secret: ""
187197 """
188198
189199 def read_arguments(self, args):
226226
227227 # This code is based on twisted.internet.ssl.ClientTLSOptions.
228228
229 def __init__(self, hostname: bytes, verify_certs):
229 def __init__(self, hostname: bytes, verify_certs: bool):
230230 self._verify_certs = verify_certs
231231
232232 _decoded = hostname.decode("ascii")
1717 import collections.abc
1818 import hashlib
1919 import logging
20 from typing import Dict
20 from typing import Any, Callable, Dict, Tuple
2121
2222 from canonicaljson import encode_canonical_json
2323 from signedjson.sign import sign_json
2626
2727 from synapse.api.errors import Codes, SynapseError
2828 from synapse.api.room_versions import RoomVersion
29 from synapse.events import EventBase
2930 from synapse.events.utils import prune_event, prune_event_dict
3031 from synapse.types import JsonDict
3132
3233 logger = logging.getLogger(__name__)
3334
35 Hasher = Callable[[bytes], "hashlib._Hash"]
3436
35 def check_event_content_hash(event, hash_algorithm=hashlib.sha256):
37
38 def check_event_content_hash(
39 event: EventBase, hash_algorithm: Hasher = hashlib.sha256
40 ) -> bool:
3641 """Check whether the hash for this PDU matches the contents"""
3742 name, expected_hash = compute_content_hash(event.get_pdu_json(), hash_algorithm)
3843 logger.debug(
6671 return message_hash_bytes == expected_hash
6772
6873
69 def compute_content_hash(event_dict, hash_algorithm):
74 def compute_content_hash(
75 event_dict: Dict[str, Any], hash_algorithm: Hasher
76 ) -> Tuple[str, bytes]:
7077 """Compute the content hash of an event, which is the hash of the
7178 unredacted event.
7279
7380 Args:
74 event_dict (dict): The unredacted event as a dict
81 event_dict: The unredacted event as a dict
7582 hash_algorithm: A hasher from `hashlib`, e.g. hashlib.sha256, to use
7683 to hash the event
7784
7885 Returns:
79 tuple[str, bytes]: A tuple of the name of hash and the hash as raw
80 bytes.
86 A tuple of the name of hash and the hash as raw bytes.
8187 """
8288 event_dict = dict(event_dict)
8389 event_dict.pop("age_ts", None)
9399 return hashed.name, hashed.digest()
94100
95101
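The newly typed `compute_content_hash` above hashes the unredacted event after dropping fields that are excluded from the content hash. A sketch of the idea, using `json.dumps` with sorted keys as a stand-in for the canonicaljson package Synapse actually uses, and an illustrative subset of the excluded fields:

```python
# Hedged sketch of compute_content_hash: strip non-hashed fields,
# canonicalise the JSON, and hash the bytes.
import hashlib
import json


def compute_content_hash(event_dict, hash_algorithm=hashlib.sha256):
    event_dict = dict(event_dict)
    # Fields excluded from the content hash (illustrative subset).
    for key in ("age_ts", "unsigned", "signatures", "hashes"):
        event_dict.pop(key, None)
    event_json = json.dumps(
        event_dict, sort_keys=True, separators=(",", ":")
    ).encode("utf-8")
    hashed = hash_algorithm(event_json)
    return hashed.name, hashed.digest()


name, digest = compute_content_hash({"type": "m.room.message", "age_ts": 123})
assert name == "sha256"
assert len(digest) == 32
```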
96 def compute_event_reference_hash(event, hash_algorithm=hashlib.sha256):
102 def compute_event_reference_hash(
103 event, hash_algorithm: Hasher = hashlib.sha256
104 ) -> Tuple[str, bytes]:
97105 """Computes the event reference hash. This is the hash of the redacted
98106 event.
99107
100108 Args:
101 event (FrozenEvent)
109 event
102110 hash_algorithm: A hasher from `hashlib`, e.g. hashlib.sha256, to use
103111 to hash the event
104112
105113 Returns:
106 tuple[str, bytes]: A tuple of the name of hash and the hash as raw
107 bytes.
114 A tuple of the name of hash and the hash as raw bytes.
108115 """
109116 tmp_event = prune_event(event)
110117 event_dict = tmp_event.get_pdu_json()
155162 event_dict: JsonDict,
156163 signature_name: str,
157164 signing_key: SigningKey,
158 ):
165 ) -> None:
159166 """Add content hash and sign the event
160167
161168 Args:
1313 # See the License for the specific language governing permissions and
1414 # limitations under the License.
1515
16 import abc
1617 import logging
1718 import urllib
1819 from collections import defaultdict
20 from typing import TYPE_CHECKING, Dict, Iterable, List, Optional, Set, Tuple
1921
2022 import attr
2123 from signedjson.key import (
3941 RequestSendFailed,
4042 SynapseError,
4143 )
44 from synapse.config.key import TrustedKeyServer
4245 from synapse.logging.context import (
4346 PreserveLoggingContext,
4447 make_deferred_yieldable,
4649 run_in_background,
4750 )
4851 from synapse.storage.keys import FetchKeyResult
52 from synapse.types import JsonDict
4953 from synapse.util import unwrapFirstError
5054 from synapse.util.async_helpers import yieldable_gather_results
5155 from synapse.util.metrics import Measure
5256 from synapse.util.retryutils import NotRetryingDestination
5357
58 if TYPE_CHECKING:
59 from synapse.app.homeserver import HomeServer
60
5461 logger = logging.getLogger(__name__)
5562
5663
6067 A request to verify a JSON object.
6168
6269 Attributes:
63 server_name(str): The name of the server to verify against.
64
65 key_ids(set[str]): The set of key_ids to that could be used to verify the
66 JSON object
67
68 json_object(dict): The JSON object to verify.
69
70 minimum_valid_until_ts (int): time at which we require the signing key to
70 server_name: The name of the server to verify against.
71
72 json_object: The JSON object to verify.
73
74 minimum_valid_until_ts: time at which we require the signing key to
7175 be valid. (0 implies we don't care)
76
77 request_name: The name of the request.
78
79 key_ids: The set of key_ids that could be used to verify the JSON object
7280
7381 key_ready (Deferred[str, str, nacl.signing.VerifyKey]):
7482 A deferred (server_name, key_id, verify_key) tuple that resolves when
7987 errbacks with an M_UNAUTHORIZED SynapseError.
8088 """
8189
82 server_name = attr.ib()
83 json_object = attr.ib()
84 minimum_valid_until_ts = attr.ib()
85 request_name = attr.ib()
86 key_ids = attr.ib(init=False)
87 key_ready = attr.ib(default=attr.Factory(defer.Deferred))
90 server_name = attr.ib(type=str)
91 json_object = attr.ib(type=JsonDict)
92 minimum_valid_until_ts = attr.ib(type=int)
93 request_name = attr.ib(type=str)
94 key_ids = attr.ib(init=False, type=List[str])
95 key_ready = attr.ib(default=attr.Factory(defer.Deferred), type=defer.Deferred)
8896
8997 def __attrs_post_init__(self):
9098 self.key_ids = signature_ids(self.json_object, self.server_name)
95103
96104
97105 class Keyring:
98 def __init__(self, hs, key_fetchers=None):
106 def __init__(
107 self, hs: "HomeServer", key_fetchers: "Optional[Iterable[KeyFetcher]]" = None
108 ):
99109 self.clock = hs.get_clock()
100110
101111 if key_fetchers is None:
111121 # completes.
112122 #
113123 # These are regular, logcontext-agnostic Deferreds.
114 self.key_downloads = {}
124 self.key_downloads = {} # type: Dict[str, defer.Deferred]
115125
116126 def verify_json_for_server(
117 self, server_name, json_object, validity_time, request_name
118 ):
127 self,
128 server_name: str,
129 json_object: JsonDict,
130 validity_time: int,
131 request_name: str,
132 ) -> defer.Deferred:
119133 """Verify that a JSON object has been signed by a given server
120134
121135 Args:
122 server_name (str): name of the server which must have signed this object
123
124 json_object (dict): object to be checked
125
126 validity_time (int): timestamp at which we require the signing key to
136 server_name: name of the server which must have signed this object
137
138 json_object: object to be checked
139
140 validity_time: timestamp at which we require the signing key to
127141 be valid. (0 implies we don't care)
128142
129 request_name (str): an identifier for this json object (eg, an event id)
143 request_name: an identifier for this json object (eg, an event id)
130144 for logging.
131145
132146 Returns:
137151 requests = (req,)
138152 return make_deferred_yieldable(self._verify_objects(requests)[0])
139153
140 def verify_json_objects_for_server(self, server_and_json):
154 def verify_json_objects_for_server(
155 self, server_and_json: Iterable[Tuple[str, dict, int, str]]
156 ) -> List[defer.Deferred]:
141157 """Bulk verifies signatures of json objects, bulk fetching keys as
142158 necessary.
143159
144160 Args:
145 server_and_json (iterable[Tuple[str, dict, int, str]):
161 server_and_json:
146162 Iterable of (server_name, json_object, validity_time, request_name)
147163 tuples.
148164
163179 for server_name, json_object, validity_time, request_name in server_and_json
164180 )
165181
166 def _verify_objects(self, verify_requests):
182 def _verify_objects(
183 self, verify_requests: Iterable[VerifyJsonRequest]
184 ) -> List[defer.Deferred]:
167185 """Does the work of verify_json_[objects_]for_server
168186
169187
170188 Args:
171 verify_requests (iterable[VerifyJsonRequest]):
172 Iterable of verification requests.
189 verify_requests: Iterable of verification requests.
173190
174191 Returns:
175192 List<Deferred[None]>: for each input item, a deferred indicating success
181198 key_lookups = []
182199 handle = preserve_fn(_handle_key_deferred)
183200
184 def process(verify_request):
201 def process(verify_request: VerifyJsonRequest) -> defer.Deferred:
185202 """Process an entry in the request list
186203
187204 Adds a key request to key_lookups, and returns a deferred which
221238
222239 return results
223240
224 async def _start_key_lookups(self, verify_requests):
241 async def _start_key_lookups(
242 self, verify_requests: List[VerifyJsonRequest]
243 ) -> None:
225244 """Sets off the key fetches for each verify request
226245
227246 Once each fetch completes, verify_request.key_ready will be resolved.
228247
229248 Args:
230 verify_requests (List[VerifyJsonRequest]):
249 verify_requests:
231250 """
232251
233252 try:
234253 # map from server name to a set of outstanding request ids
235 server_to_request_ids = {}
254 server_to_request_ids = {} # type: Dict[str, Set[int]]
236255
237256 for verify_request in verify_requests:
238257 server_name = verify_request.server_name
274293 except Exception:
275294 logger.exception("Error starting key lookups")
276295
277 async def wait_for_previous_lookups(self, server_names) -> None:
296 async def wait_for_previous_lookups(self, server_names: Iterable[str]) -> None:
278297 """Waits for any previous key lookups for the given servers to finish.
279298
280299 Args:
281 server_names (Iterable[str]): list of servers which we want to look up
300 server_names: list of servers which we want to look up
282301
283302 Returns:
284303 Resolves once all key lookups for the given servers have
303322
304323 loop_count += 1
305324
306 def _get_server_verify_keys(self, verify_requests):
325 def _get_server_verify_keys(self, verify_requests: List[VerifyJsonRequest]) -> None:
307326 """Tries to find at least one key for each verify request
308327
309328 For each verify_request, verify_request.key_ready is called back with
311330 with a SynapseError if none of the keys are found.
312331
313332 Args:
314 verify_requests (list[VerifyJsonRequest]): list of verify requests
333 verify_requests: list of verify requests
315334 """
316335
317336 remaining_requests = {rq for rq in verify_requests if not rq.key_ready.called}
365384
366385 run_in_background(do_iterations)
367386
368 async def _attempt_key_fetches_with_fetcher(self, fetcher, remaining_requests):
387 async def _attempt_key_fetches_with_fetcher(
388 self, fetcher: "KeyFetcher", remaining_requests: Set[VerifyJsonRequest]
389 ):
369390 """Use a key fetcher to attempt to satisfy some key requests
370391
371392 Args:
372 fetcher (KeyFetcher): fetcher to use to fetch the keys
373 remaining_requests (set[VerifyJsonRequest]): outstanding key requests.
393 fetcher: fetcher to use to fetch the keys
394 remaining_requests: outstanding key requests.
374395 Any successfully-completed requests will be removed from the list.
375396 """
376 # dict[str, dict[str, int]]: keys to fetch.
397 # The keys to fetch.
377398 # server_name -> key_id -> min_valid_ts
378 missing_keys = defaultdict(dict)
399 missing_keys = defaultdict(dict) # type: Dict[str, Dict[str, int]]
379400
380401 for verify_request in remaining_requests:
381402 # any completed requests should already have been removed
437458 remaining_requests.difference_update(completed)
438459
439460
440 class KeyFetcher:
441 async def get_keys(self, keys_to_fetch):
442 """
443 Args:
444 keys_to_fetch (dict[str, dict[str, int]]):
461 class KeyFetcher(metaclass=abc.ABCMeta):
462 @abc.abstractmethod
463 async def get_keys(
464 self, keys_to_fetch: Dict[str, Dict[str, int]]
465 ) -> Dict[str, Dict[str, FetchKeyResult]]:
466 """
467 Args:
468 keys_to_fetch:
445469 the keys to be fetched. server_name -> key_id -> min_valid_ts
446470
447471 Returns:
448 Deferred[dict[str, dict[str, synapse.storage.keys.FetchKeyResult|None]]]:
449 map from server_name -> key_id -> FetchKeyResult
472 Map from server_name -> key_id -> FetchKeyResult
450473 """
451474 raise NotImplementedError
452475
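The change above turns `KeyFetcher` into an abstract base class with `get_keys` as an `@abc.abstractmethod`, so a subclass that forgets to implement it fails at instantiation rather than at call time. A simplified, synchronous sketch of that pattern (Synapse's interface is async and returns `FetchKeyResult` objects):

```python
# Hedged sketch of the KeyFetcher ABC: abstract base classes cannot be
# instantiated, and subclasses must implement get_keys.
import abc


class KeyFetcher(metaclass=abc.ABCMeta):
    @abc.abstractmethod
    def get_keys(self, keys_to_fetch):
        """keys_to_fetch: server_name -> key_id -> min_valid_ts"""
        raise NotImplementedError


class StoreKeyFetcher(KeyFetcher):
    def get_keys(self, keys_to_fetch):
        # Illustrative: pretend every requested key was found.
        return {s: {k: "key" for k in ids} for s, ids in keys_to_fetch.items()}


fetcher = StoreKeyFetcher()
assert fetcher.get_keys({"example.org": {"ed25519:a": 0}}) == {
    "example.org": {"ed25519:a": "key"}
}
```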
454477 class StoreKeyFetcher(KeyFetcher):
455478 """KeyFetcher impl which fetches keys from our data store"""
456479
457 def __init__(self, hs):
480 def __init__(self, hs: "HomeServer"):
458481 self.store = hs.get_datastore()
459482
460 async def get_keys(self, keys_to_fetch):
483 async def get_keys(
484 self, keys_to_fetch: Dict[str, Dict[str, int]]
485 ) -> Dict[str, Dict[str, FetchKeyResult]]:
461486 """see KeyFetcher.get_keys"""
462487
463 keys_to_fetch = (
488 key_ids_to_fetch = (
464489 (server_name, key_id)
465490 for server_name, keys_for_server in keys_to_fetch.items()
466491 for key_id in keys_for_server.keys()
467492 )
468493
469 res = await self.store.get_server_verify_keys(keys_to_fetch)
470 keys = {}
494 res = await self.store.get_server_verify_keys(key_ids_to_fetch)
495 keys = {} # type: Dict[str, Dict[str, FetchKeyResult]]
471496 for (server_name, key_id), key in res.items():
472497 keys.setdefault(server_name, {})[key_id] = key
473498 return keys
474499
475500
476 class BaseV2KeyFetcher:
477 def __init__(self, hs):
501 class BaseV2KeyFetcher(KeyFetcher):
502 def __init__(self, hs: "HomeServer"):
478503 self.store = hs.get_datastore()
479504 self.config = hs.get_config()
480505
481 async def process_v2_response(self, from_server, response_json, time_added_ms):
506 async def process_v2_response(
507 self, from_server: str, response_json: JsonDict, time_added_ms: int
508 ) -> Dict[str, FetchKeyResult]:
482509 """Parse a 'Server Keys' structure from the result of a /key request
483510
484511 This is used to parse either the entirety of the response from
492519 to /_matrix/key/v2/query.
493520
494521 Args:
495 from_server (str): the name of the server producing this result: either
522 from_server: the name of the server producing this result: either
496523 the origin server for a /_matrix/key/v2/server request, or the notary
497524 for a /_matrix/key/v2/query.
498525
499 response_json (dict): the json-decoded Server Keys response object
500
501 time_added_ms (int): the timestamp to record in server_keys_json
526 response_json: the json-decoded Server Keys response object
527
528 time_added_ms: the timestamp to record in server_keys_json
502529
503530 Returns:
504 Deferred[dict[str, FetchKeyResult]]: map from key_id to result object
531 Map from key_id to result object
505532 """
506533 ts_valid_until_ms = response_json["valid_until_ts"]
507534
574601 class PerspectivesKeyFetcher(BaseV2KeyFetcher):
575602 """KeyFetcher impl which fetches keys from the "perspectives" servers"""
576603
577 def __init__(self, hs):
604 def __init__(self, hs: "HomeServer"):
578605 super().__init__(hs)
579606 self.clock = hs.get_clock()
580 self.client = hs.get_http_client()
607 self.client = hs.get_federation_http_client()
581608 self.key_servers = self.config.key_servers
582609
583 async def get_keys(self, keys_to_fetch):
610 async def get_keys(
611 self, keys_to_fetch: Dict[str, Dict[str, int]]
612 ) -> Dict[str, Dict[str, FetchKeyResult]]:
584613 """see KeyFetcher.get_keys"""
585614
586 async def get_key(key_server):
615 async def get_key(key_server: TrustedKeyServer) -> Dict:
587616 try:
588 result = await self.get_server_verify_key_v2_indirect(
617 return await self.get_server_verify_key_v2_indirect(
589618 keys_to_fetch, key_server
590619 )
591 return result
592620 except KeyLookupError as e:
593621 logger.warning(
594622 "Key lookup failed from %r: %s", key_server.server_name, e
610638 ).addErrback(unwrapFirstError)
611639 )
612640
613 union_of_keys = {}
641 union_of_keys = {} # type: Dict[str, Dict[str, FetchKeyResult]]
614642 for result in results:
615643 for server_name, keys in result.items():
616644 union_of_keys.setdefault(server_name, {}).update(keys)
617645
618646 return union_of_keys
619647
620 async def get_server_verify_key_v2_indirect(self, keys_to_fetch, key_server):
621 """
622 Args:
623 keys_to_fetch (dict[str, dict[str, int]]):
648 async def get_server_verify_key_v2_indirect(
649 self, keys_to_fetch: Dict[str, Dict[str, int]], key_server: TrustedKeyServer
650 ) -> Dict[str, Dict[str, FetchKeyResult]]:
651 """
652 Args:
653 keys_to_fetch:
624654 the keys to be fetched. server_name -> key_id -> min_valid_ts
625655
626 key_server (synapse.config.key.TrustedKeyServer): notary server to query for
627 the keys
656 key_server: notary server to query for the keys
628657
629658 Returns:
630 dict[str, dict[str, synapse.storage.keys.FetchKeyResult]]: map
631 from server_name -> key_id -> FetchKeyResult
659 Map from server_name -> key_id -> FetchKeyResult
632660
633661 Raises:
634662 KeyLookupError if there was an error processing the entire response from
661689 except HttpResponseException as e:
662690 raise KeyLookupError("Remote server returned an error: %s" % (e,))
663691
664 keys = {}
665 added_keys = []
692 keys = {} # type: Dict[str, Dict[str, FetchKeyResult]]
693 added_keys = [] # type: List[Tuple[str, str, FetchKeyResult]]
666694
667695 time_now_ms = self.clock.time_msec()
668696
697 assert isinstance(query_response, dict)
669698 for response in query_response["server_keys"]:
670699 # do this first, so that we can give useful errors thereafter
671700 server_name = response.get("server_name")
703732
704733 return keys
705734
706 def _validate_perspectives_response(self, key_server, response):
735 def _validate_perspectives_response(
736 self, key_server: TrustedKeyServer, response: JsonDict
737 ) -> None:
707738 """Optionally check the signature on the result of a /key/query request
708739
709740 Args:
710 key_server (synapse.config.key.TrustedKeyServer): the notary server that
711 produced this result
712
713 response (dict): the json-decoded Server Keys response object
741 key_server: the notary server that produced this result
742
743 response: the json-decoded Server Keys response object
714744 """
715745 perspective_name = key_server.server_name
716746 perspective_keys = key_server.verify_keys
744774 class ServerKeyFetcher(BaseV2KeyFetcher):
745775 """KeyFetcher impl which fetches keys from the origin servers"""
746776
747 def __init__(self, hs):
777 def __init__(self, hs: "HomeServer"):
748778 super().__init__(hs)
749779 self.clock = hs.get_clock()
750 self.client = hs.get_http_client()
751
752 async def get_keys(self, keys_to_fetch):
753 """
754 Args:
755 keys_to_fetch (dict[str, iterable[str]]):
780 self.client = hs.get_federation_http_client()
781
782 async def get_keys(
783 self, keys_to_fetch: Dict[str, Dict[str, int]]
784 ) -> Dict[str, Dict[str, FetchKeyResult]]:
785 """
786 Args:
787 keys_to_fetch:
756788 the keys to be fetched. server_name -> key_ids
757789
758790 Returns:
759 dict[str, dict[str, synapse.storage.keys.FetchKeyResult|None]]:
760 map from server_name -> key_id -> FetchKeyResult
791 Map from server_name -> key_id -> FetchKeyResult
761792 """
762793
763794 results = {}
764795
765 async def get_key(key_to_fetch_item):
796 async def get_key(key_to_fetch_item: Tuple[str, Dict[str, int]]) -> None:
766797 server_name, key_ids = key_to_fetch_item
767798 try:
768799 keys = await self.get_server_verify_key_v2_direct(server_name, key_ids)
777808 await yieldable_gather_results(get_key, keys_to_fetch.items())
778809 return results
779810
780 async def get_server_verify_key_v2_direct(self, server_name, key_ids):
781 """
782
783 Args:
784 server_name (str):
785 key_ids (iterable[str]):
811 async def get_server_verify_key_v2_direct(
812 self, server_name: str, key_ids: Iterable[str]
813 ) -> Dict[str, FetchKeyResult]:
814 """
815
816 Args:
 817 server_name: the server to fetch the keys from
 818 key_ids: the IDs of the keys to fetch
786819
787820 Returns:
788 dict[str, FetchKeyResult]: map from key ID to lookup result
821 Map from key ID to lookup result
789822
790823 Raises:
791824 KeyLookupError if there was a problem making the lookup
792825 """
793 keys = {} # type: dict[str, FetchKeyResult]
826 keys = {} # type: Dict[str, FetchKeyResult]
794827
795828 for requested_key_id in key_ids:
796829 # we may have found this key as a side-effect of asking for another.
824857 except HttpResponseException as e:
825858 raise KeyLookupError("Remote server returned an error: %s" % (e,))
826859
860 assert isinstance(response, dict)
827861 if response["server_name"] != server_name:
828862 raise KeyLookupError(
829863 "Expected a response for server %r not %r"
845879 return keys
846880
847881
848 async def _handle_key_deferred(verify_request) -> None:
882 async def _handle_key_deferred(verify_request: VerifyJsonRequest) -> None:
849883 """Waits for the key to become available, and then performs a verification
850884
851885 Args:
852 verify_request (VerifyJsonRequest):
 886 verify_request: the verification request to complete once the key is available
853887
854888 Raises:
855889 SynapseError if there was a problem performing the verification
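The `get_keys` rewrite above fans out one lookup per server and collects the results into a shared dict via `yieldable_gather_results`, so a failure for one server cannot abort the others. A rough standalone sketch of that pattern using `asyncio.gather` (the fetch helper and key values here are invented, and the real code runs on Twisted rather than asyncio):

```python
import asyncio
from typing import Dict, Tuple

async def fetch_server_keys(server_name: str, key_ids: Dict[str, int]) -> Dict[str, str]:
    # Hypothetical stand-in for get_server_verify_key_v2_direct.
    return {key_id: "verify-key-for-" + server_name for key_id in key_ids}

async def get_keys(keys_to_fetch: Dict[str, Dict[str, int]]) -> Dict[str, Dict[str, str]]:
    results: Dict[str, Dict[str, str]] = {}

    async def get_key(item: Tuple[str, Dict[str, int]]) -> None:
        server_name, key_ids = item
        try:
            results[server_name] = await fetch_server_keys(server_name, key_ids)
        except Exception:
            # A failure for one server must not abort the other lookups.
            results[server_name] = {}

    # Run all per-server lookups concurrently, as yieldable_gather_results does.
    await asyncio.gather(*(get_key(item) for item in keys_to_fetch.items()))
    return results

out = asyncio.run(get_keys({"example.org": {"ed25519:a1": 0}}))
```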
1414 # limitations under the License.
1515
1616 import inspect
17 from typing import TYPE_CHECKING, Any, Dict, List, Optional, Tuple
17 from typing import TYPE_CHECKING, Any, Dict, List, Optional, Tuple, Union
1818
1919 from synapse.spam_checker_api import RegistrationBehaviour
2020 from synapse.types import Collection
21 from synapse.util.async_helpers import maybe_awaitable
2122
2223 if TYPE_CHECKING:
2324 import synapse.events
3839 else:
3940 self.spam_checkers.append(module(config=config))
4041
41 def check_event_for_spam(self, event: "synapse.events.EventBase") -> bool:
42 async def check_event_for_spam(
43 self, event: "synapse.events.EventBase"
44 ) -> Union[bool, str]:
4245 """Checks if a given event is considered "spammy" by this server.
4346
4447 If the server considers an event spammy, then it will be rejected if
4952 event: the event to be checked
5053
5154 Returns:
52 True if the event is spammy.
53 """
54 for spam_checker in self.spam_checkers:
55 if spam_checker.check_event_for_spam(event):
55 True or a string if the event is spammy. If a string is returned it
56 will be used as the error message returned to the user.
57 """
58 for spam_checker in self.spam_checkers:
59 if await maybe_awaitable(spam_checker.check_event_for_spam(event)):
5660 return True
5761
5862 return False
5963
60 def user_may_invite(
64 async def user_may_invite(
6165 self, inviter_userid: str, invitee_userid: str, room_id: str
6266 ) -> bool:
6367 """Checks if a given user may send an invite
7478 """
7579 for spam_checker in self.spam_checkers:
7680 if (
77 spam_checker.user_may_invite(inviter_userid, invitee_userid, room_id)
78 is False
79 ):
80 return False
81
82 return True
83
84 def user_may_create_room(self, userid: str) -> bool:
81 await maybe_awaitable(
82 spam_checker.user_may_invite(
83 inviter_userid, invitee_userid, room_id
84 )
85 )
86 is False
87 ):
88 return False
89
90 return True
91
92 async def user_may_create_room(self, userid: str) -> bool:
8593 """Checks if a given user may create a room
8694
8795 If this method returns false, the creation request will be rejected.
93101 True if the user may create a room, otherwise False
94102 """
95103 for spam_checker in self.spam_checkers:
96 if spam_checker.user_may_create_room(userid) is False:
97 return False
98
99 return True
100
101 def user_may_create_room_alias(self, userid: str, room_alias: str) -> bool:
104 if (
105 await maybe_awaitable(spam_checker.user_may_create_room(userid))
106 is False
107 ):
108 return False
109
110 return True
111
112 async def user_may_create_room_alias(self, userid: str, room_alias: str) -> bool:
102113 """Checks if a given user may create a room alias
103114
104115 If this method returns false, the association request will be rejected.
111122 True if the user may create a room alias, otherwise False
112123 """
113124 for spam_checker in self.spam_checkers:
114 if spam_checker.user_may_create_room_alias(userid, room_alias) is False:
115 return False
116
117 return True
118
119 def user_may_publish_room(self, userid: str, room_id: str) -> bool:
125 if (
126 await maybe_awaitable(
127 spam_checker.user_may_create_room_alias(userid, room_alias)
128 )
129 is False
130 ):
131 return False
132
133 return True
134
135 async def user_may_publish_room(self, userid: str, room_id: str) -> bool:
120136 """Checks if a given user may publish a room to the directory
121137
122138 If this method returns false, the publish request will be rejected.
129145 True if the user may publish the room, otherwise False
130146 """
131147 for spam_checker in self.spam_checkers:
132 if spam_checker.user_may_publish_room(userid, room_id) is False:
133 return False
134
135 return True
136
137 def check_username_for_spam(self, user_profile: Dict[str, str]) -> bool:
148 if (
149 await maybe_awaitable(
150 spam_checker.user_may_publish_room(userid, room_id)
151 )
152 is False
153 ):
154 return False
155
156 return True
157
158 async def check_username_for_spam(self, user_profile: Dict[str, str]) -> bool:
138159 """Checks if a user ID or display name are considered "spammy" by this server.
139160
140161 If the server considers a username spammy, then it will not be included in
156177 if checker:
157178 # Make a copy of the user profile object to ensure the spam checker
158179 # cannot modify it.
159 if checker(user_profile.copy()):
180 if await maybe_awaitable(checker(user_profile.copy())):
160181 return True
161182
162183 return False
163184
164 def check_registration_for_spam(
185 async def check_registration_for_spam(
165186 self,
166187 email_threepid: Optional[dict],
167188 username: Optional[str],
184205 # spam checker
185206 checker = getattr(spam_checker, "check_registration_for_spam", None)
186207 if checker:
187 behaviour = checker(email_threepid, username, request_info)
208 behaviour = await maybe_awaitable(
209 checker(email_threepid, username, request_info)
210 )
188211 assert isinstance(behaviour, RegistrationBehaviour)
189212 if behaviour != RegistrationBehaviour.ALLOW:
190213 return behaviour
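The spam-checker changes above wrap every callback in `maybe_awaitable`, so existing third-party modules can stay synchronous while Synapse's call sites become async. A minimal sketch of that compatibility shim (the real `synapse.util.async_helpers.maybe_awaitable` also handles Twisted Deferreds; the checker classes below are invented):

```python
import asyncio
import inspect
from typing import Any

async def maybe_awaitable(value: Any) -> Any:
    # Await the value if it is awaitable, otherwise return it unchanged.
    if inspect.isawaitable(value):
        return await value
    return value

class SyncChecker:
    def check_event_for_spam(self, event):  # legacy, synchronous module
        return False

class AsyncChecker:
    async def check_event_for_spam(self, event):  # new-style, async module
        return "looks spammy"

async def check(checkers, event):
    # Both styles of checker are driven through the same await.
    for checker in checkers:
        result = await maybe_awaitable(checker.check_event_for_spam(event))
        if result:
            return result
    return False

verdict = asyncio.run(check([SyncChecker(), AsyncChecker()], {"type": "m.room.message"}))
```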
7777
7878 ctx = current_context()
7979
80 @defer.inlineCallbacks
8081 def callback(_, pdu: EventBase):
8182 with PreserveLoggingContext(ctx):
8283 if not check_event_content_hash(pdu):
104105 )
105106 return redacted_event
106107
107 if self.spam_checker.check_event_for_spam(pdu):
108 result = yield defer.ensureDeferred(
109 self.spam_checker.check_event_for_spam(pdu)
110 )
111
112 if result:
108113 logger.warning(
109114 "Event contains spam, redacting %s: %s",
110115 pdu.event_id,
844844
845845 def __init__(self, hs: "HomeServer"):
846846 self.config = hs.config
847 self.http_client = hs.get_simple_http_client()
848847 self.clock = hs.get_clock()
849848 self._instance_name = hs.get_instance_name()
850849
3434
3535 def __init__(self, hs):
3636 self.server_name = hs.hostname
37 self.client = hs.get_http_client()
37 self.client = hs.get_federation_http_client()
3838
3939 @log_function
4040 def get_room_state_ids(self, destination, room_id, event_id):
143143 ):
144144 raise FederationDeniedError(origin)
145145
146 if not json_request["signatures"]:
146 if origin is None or not json_request["signatures"]:
147147 raise NoAuthenticationError(
148148 401, "Missing Authorization headers", Codes.UNAUTHORIZED
149149 )
14611461
14621462 Args:
14631463 hs (synapse.server.HomeServer): homeserver
1464 resource (TransportLayerServer): resource class to register to
1464 resource (JsonResource): resource class to register to
14651465 authenticator (Authenticator): authenticator to use
14661466 ratelimiter (util.ratelimitutils.FederationRateLimiter): ratelimiter to use
14671467 servlet_groups (list[str], optional): List of servlet groups to register.
3131 class BaseHandler:
3232 """
3333 Common base class for the event handlers.
34
35 Deprecated: new code should not use this. Instead, Handler classes should define the
36 fields they actually need. The utility methods should either be factored out to
37 standalone helper functions, or to different Handler classes.
3438 """
3539
3640 def __init__(self, hs: "HomeServer"):
1212 # See the License for the specific language governing permissions and
1313 # limitations under the License.
1414
15 import abc
1516 import logging
16 from typing import List
17 from typing import TYPE_CHECKING, Any, Dict, List, Optional, Set
1718
1819 from synapse.api.constants import Membership
19 from synapse.events import FrozenEvent
20 from synapse.types import RoomStreamToken, StateMap
20 from synapse.events import EventBase
21 from synapse.types import JsonDict, RoomStreamToken, StateMap, UserID
2122 from synapse.visibility import filter_events_for_client
2223
2324 from ._base import BaseHandler
2425
26 if TYPE_CHECKING:
27 from synapse.app.homeserver import HomeServer
28
2529 logger = logging.getLogger(__name__)
2630
2731
2832 class AdminHandler(BaseHandler):
29 def __init__(self, hs):
33 def __init__(self, hs: "HomeServer"):
3034 super().__init__(hs)
3135
3236 self.storage = hs.get_storage()
3337 self.state_store = self.storage.state
3438
35 async def get_whois(self, user):
39 async def get_whois(self, user: UserID) -> JsonDict:
3640 connections = []
3741
3842 sessions = await self.store.get_user_ip_and_agents(user)
5256
5357 return ret
5458
55 async def get_user(self, user):
59 async def get_user(self, user: UserID) -> Optional[JsonDict]:
5660 """Function to get user details"""
5761 ret = await self.store.get_user_by_id(user.to_string())
5862 if ret:
6367 ret["threepids"] = threepids
6468 return ret
6569
66 async def export_user_data(self, user_id, writer):
70 async def export_user_data(self, user_id: str, writer: "ExfiltrationWriter") -> Any:
6771 """Write all data we have on the user to the given writer.
6872
6973 Args:
70 user_id (str)
71 writer (ExfiltrationWriter)
74 user_id: The user ID to fetch data of.
75 writer: The writer to write to.
7276
7377 Returns:
7478 Resolves when all data for a user has been written.
127131 from_key = RoomStreamToken(0, 0)
128132 to_key = RoomStreamToken(None, stream_ordering)
129133
130 written_events = set() # Events that we've processed in this room
134 # Events that we've processed in this room
135 written_events = set() # type: Set[str]
131136
132137 # We need to track gaps in the events stream so that we can then
133138 # write out the state at those events. We do this by keeping track
139144
140145 # The reverse mapping to above, i.e. map from unseen event to events
141146 # that have the unseen event in their prev_events, i.e. the unseen
142 # events "children". dict[str, set[str]]
143 unseen_to_child_events = {}
147 # events "children".
148 unseen_to_child_events = {} # type: Dict[str, Set[str]]
144149
145150 # We fetch events in the room the user could see by fetching *all*
146151 # events that we have and then filtering, this isn't the most
196201 return writer.finished()
197202
198203
199 class ExfiltrationWriter:
204 class ExfiltrationWriter(metaclass=abc.ABCMeta):
200205 """Interface used to specify how to write exported data.
201206 """
202207
203 def write_events(self, room_id: str, events: List[FrozenEvent]):
208 @abc.abstractmethod
209 def write_events(self, room_id: str, events: List[EventBase]) -> None:
204210 """Write a batch of events for a room.
205211 """
206 pass
207
208 def write_state(self, room_id: str, event_id: str, state: StateMap[FrozenEvent]):
212 raise NotImplementedError()
213
214 @abc.abstractmethod
215 def write_state(
216 self, room_id: str, event_id: str, state: StateMap[EventBase]
217 ) -> None:
209218 """Write the state at the given event in the room.
210219
211220 This only gets called for backward extremities rather than for each
212221 event.
213222 """
214 pass
215
216 def write_invite(self, room_id: str, event: FrozenEvent, state: StateMap[dict]):
223 raise NotImplementedError()
224
225 @abc.abstractmethod
226 def write_invite(
227 self, room_id: str, event: EventBase, state: StateMap[dict]
228 ) -> None:
217229 """Write an invite for the room, with associated invite state.
218230
219231 Args:
220 room_id
221 event
222 state: A subset of the state at the
223 invite, with a subset of the event keys (type, state_key
224 content and sender)
225 """
226
227 def finished(self):
232 room_id: The room ID the invite is for.
233 event: The invite event.
234 state: A subset of the state at the invite, with a subset of the
 235 event keys (type, state_key, content and sender).
236 """
237 raise NotImplementedError()
238
239 @abc.abstractmethod
240 def finished(self) -> Any:
228241 """Called when all data has successfully been exported and written.
229242
230243 This functions return value is passed to the caller of
231244 `export_user_data`.
232245 """
233 pass
246 raise NotImplementedError()
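The switch to `abc.ABCMeta` above turns `ExfiltrationWriter` from a class of silent no-op `pass` methods into a real interface: instantiating the base class directly, or subclassing it without implementing every method, now fails loudly. A trimmed-down illustration (the `InMemoryWriter` subclass is hypothetical):

```python
import abc
from typing import Any, Dict, List

class ExfiltrationWriter(metaclass=abc.ABCMeta):
    # Two of the interface's methods, for brevity.
    @abc.abstractmethod
    def write_events(self, room_id: str, events: List[dict]) -> None:
        raise NotImplementedError()

    @abc.abstractmethod
    def finished(self) -> Any:
        raise NotImplementedError()

class InMemoryWriter(ExfiltrationWriter):
    """Hypothetical writer that collects exported events per room."""

    def __init__(self) -> None:
        self.rooms = {}  # type: Dict[str, List[dict]]

    def write_events(self, room_id: str, events: List[dict]) -> None:
        self.rooms.setdefault(room_id, []).extend(events)

    def finished(self) -> Any:
        return self.rooms

try:
    ExfiltrationWriter()  # type: ignore[abstract]
    abstract_rejected = False
except TypeError:
    # Abstract methods mean the base class can no longer be instantiated.
    abstract_rejected = True

writer = InMemoryWriter()
writer.write_events("!room:example.org", [{"event_id": "$1"}])
data = writer.finished()
```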
1313 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
1414 # See the License for the specific language governing permissions and
1515 # limitations under the License.
16 import inspect
1716 import logging
1817 import time
1918 import unicodedata
2120 from typing import (
2221 TYPE_CHECKING,
2322 Any,
23 Awaitable,
2424 Callable,
2525 Dict,
2626 Iterable,
3434 import attr
3535 import bcrypt
3636 import pymacaroons
37
38 from twisted.web.http import Request
3739
3840 from synapse.api.constants import LoginType
3941 from synapse.api.errors import (
5557 from synapse.module_api import ModuleApi
5658 from synapse.types import JsonDict, Requester, UserID
5759 from synapse.util import stringutils as stringutils
60 from synapse.util.async_helpers import maybe_awaitable
5861 from synapse.util.msisdn import phone_number_to_msisdn
5962 from synapse.util.threepids import canonicalise_email
6063
192195 self.hs = hs # FIXME better possibility to access registrationHandler later?
193196 self.macaroon_gen = hs.get_macaroon_generator()
194197 self._password_enabled = hs.config.password_enabled
195 self._sso_enabled = (
196 hs.config.cas_enabled or hs.config.saml2_enabled or hs.config.oidc_enabled
197 )
198
199 # we keep this as a list despite the O(N^2) implication so that we can
200 # keep PASSWORD first and avoid confusing clients which pick the first
201 # type in the list. (NB that the spec doesn't require us to do so and
202 # clients which favour types that they don't understand over those that
203 # they do are technically broken)
198 self._password_localdb_enabled = hs.config.password_localdb_enabled
204199
205200 # start out by assuming PASSWORD is enabled; we will remove it later if not.
206 login_types = []
207 if hs.config.password_localdb_enabled:
208 login_types.append(LoginType.PASSWORD)
201 login_types = set()
202 if self._password_localdb_enabled:
203 login_types.add(LoginType.PASSWORD)
209204
210205 for provider in self.password_providers:
211 if hasattr(provider, "get_supported_login_types"):
212 for t in provider.get_supported_login_types().keys():
213 if t not in login_types:
214 login_types.append(t)
206 login_types.update(provider.get_supported_login_types().keys())
215207
216208 if not self._password_enabled:
209 login_types.discard(LoginType.PASSWORD)
210
211 # Some clients just pick the first type in the list. In this case, we want
212 # them to use PASSWORD (rather than token or whatever), so we want to make sure
213 # that comes first, where it's present.
214 self._supported_login_types = []
215 if LoginType.PASSWORD in login_types:
216 self._supported_login_types.append(LoginType.PASSWORD)
217217 login_types.remove(LoginType.PASSWORD)
218
219 self._supported_login_types = login_types
220
221 # Login types and UI Auth types have a heavy overlap, but are not
222 # necessarily identical. Login types have SSO (and other login types)
223 # added in the rest layer, see synapse.rest.client.v1.login.LoginRestServerlet.on_GET.
224 ui_auth_types = login_types.copy()
225 if self._sso_enabled:
226 ui_auth_types.append(LoginType.SSO)
227 self._supported_ui_auth_types = ui_auth_types
218 self._supported_login_types.extend(login_types)
228219
229220 # Ratelimiter for failed auth during UIA. Uses same ratelimit config
230221 # as per `rc_login.failed_attempts`.
233224 rate_hz=self.hs.config.rc_login_failed_attempts.per_second,
234225 burst_count=self.hs.config.rc_login_failed_attempts.burst_count,
235226 )
227
228 # The number of seconds to keep a UI auth session active.
229 self._ui_auth_session_timeout = hs.config.ui_auth_session_timeout
236230
 237231 # Ratelimiter for failed /login attempts
238232 self._failed_login_attempts_ratelimiter = Ratelimiter(
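The constructor hunk above switches `login_types` to a set for de-duplication, then rebuilds an ordered list with `PASSWORD` first, because some clients simply pick the first type offered. That ordering step in isolation (standalone sketch; the constant value is the Matrix login type identifier):

```python
from typing import List, Set

PASSWORD = "m.login.password"

def order_login_types(login_types: Set[str]) -> List[str]:
    # Some clients pick the first entry in the list, so make sure
    # PASSWORD comes first whenever it is supported at all.
    ordered = []  # type: List[str]
    remaining = set(login_types)
    if PASSWORD in remaining:
        ordered.append(PASSWORD)
        remaining.remove(PASSWORD)
    ordered.extend(remaining)
    return ordered

flows = order_login_types({"m.login.sso", PASSWORD, "m.login.token"})
```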
291285 request_body: Dict[str, Any],
292286 clientip: str,
293287 description: str,
294 ) -> Tuple[dict, str]:
288 ) -> Tuple[dict, Optional[str]]:
295289 """
296290 Checks that the user is who they claim to be, via a UI auth.
297291
318312 have been given only in a previous call).
319313
320314 'session_id' is the ID of this session, either passed in by the
321 client or assigned by this call
315 client or assigned by this call. This is None if UI auth was
316 skipped (by re-using a previous validation).
322317
323318 Raises:
324319 InteractiveAuthIncompleteError if the client has not yet completed
332327
333328 """
334329
330 if self._ui_auth_session_timeout:
331 last_validated = await self.store.get_access_token_last_validated(
332 requester.access_token_id
333 )
334 if self.clock.time_msec() - last_validated < self._ui_auth_session_timeout:
335 # Return the input parameters, minus the auth key, which matches
336 # the logic in check_ui_auth.
337 request_body.pop("auth", None)
338 return request_body, None
339
335340 user_id = requester.user.to_string()
336341
337342 # Check if we should be ratelimited due to too many previous failed attempts
338343 self._failed_uia_attempts_ratelimiter.ratelimit(user_id, update=False)
339344
340345 # build a list of supported flows
341 flows = [[login_type] for login_type in self._supported_ui_auth_types]
346 supported_ui_auth_types = await self._get_available_ui_auth_types(
347 requester.user
348 )
349 flows = [[login_type] for login_type in supported_ui_auth_types]
342350
343351 try:
344352 result, params, session_id = await self.check_ui_auth(
350358 raise
351359
352360 # find the completed login type
353 for login_type in self._supported_ui_auth_types:
361 for login_type in supported_ui_auth_types:
354362 if login_type not in result:
355363 continue
356364
364372 if user_id != requester.user.to_string():
365373 raise AuthError(403, "Invalid auth")
366374
375 # Note that the access token has been validated.
376 await self.store.update_access_token_last_validated(requester.access_token_id)
377
367378 return params, session_id
379
380 async def _get_available_ui_auth_types(self, user: UserID) -> Iterable[str]:
381 """Get a list of the authentication types this user can use
382 """
383
384 ui_auth_types = set()
385
386 # if the HS supports password auth, and the user has a non-null password, we
387 # support password auth
388 if self._password_localdb_enabled and self._password_enabled:
389 lookupres = await self._find_user_id_and_pwd_hash(user.to_string())
390 if lookupres:
391 _, password_hash = lookupres
392 if password_hash:
393 ui_auth_types.add(LoginType.PASSWORD)
394
395 # also allow auth from password providers
396 for provider in self.password_providers:
397 for t in provider.get_supported_login_types().keys():
398 if t == LoginType.PASSWORD and not self._password_enabled:
399 continue
400 ui_auth_types.add(t)
401
402 # if sso is enabled, allow the user to log in via SSO iff they have a mapping
403 # from sso to mxid.
404 if self.hs.config.saml2.saml2_enabled or self.hs.config.oidc.oidc_enabled:
405 if await self.store.get_external_ids_by_user(user.to_string()):
406 ui_auth_types.add(LoginType.SSO)
407
408 # Our CAS impl does not (yet) correctly register users in user_external_ids,
409 # so always offer that if it's available.
410 if self.hs.config.cas.cas_enabled:
411 ui_auth_types.add(LoginType.SSO)
412
413 return ui_auth_types
368414
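The new `ui_auth_session_timeout` check above lets a request skip user-interactive auth entirely when the access token was validated recently enough. The decision in isolation (standalone sketch with invented millisecond values; a timeout of 0 is the default and disables the shortcut):

```python
def should_skip_ui_auth(
    now_ms: int, last_validated_ms: int, session_timeout_ms: int
) -> bool:
    # A timeout of 0 disables the shortcut entirely.
    if not session_timeout_ms:
        return False
    # Otherwise, skip UI auth if the token was validated within the window.
    return now_ms - last_validated_ms < session_timeout_ms

recent = should_skip_ui_auth(now_ms=100_000, last_validated_ms=95_000, session_timeout_ms=15_000)
stale = should_skip_ui_auth(now_ms=100_000, last_validated_ms=50_000, session_timeout_ms=15_000)
disabled = should_skip_ui_auth(now_ms=100_000, last_validated_ms=99_000, session_timeout_ms=0)
```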
369415 def get_enabled_auth_types(self):
370416 """Return the enabled user-interactive authentication types
422468 all the stages in any of the permitted flows.
423469 """
424470
425 authdict = None
426471 sid = None # type: Optional[str]
427 if clientdict and "auth" in clientdict:
428 authdict = clientdict["auth"]
429 del clientdict["auth"]
430 if "session" in authdict:
431 sid = authdict["session"]
472 authdict = clientdict.pop("auth", {})
473 if "session" in authdict:
474 sid = authdict["session"]
432475
433476 # Convert the URI and method to strings.
434477 uri = request.uri.decode("utf-8")
533576
534577 creds = await self.store.get_completed_ui_auth_stages(session.session_id)
535578 for f in flows:
579 # If all the required credentials have been supplied, the user has
580 # successfully completed the UI auth process!
536581 if len(set(f) - set(creds)) == 0:
537582 # it's very useful to know what args are stored, but this can
538583 # include the password in the case of registering, so only log
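The newly commented loop above decides whether any permitted flow has been fully satisfied: a flow completes once every one of its stages appears among the stored credentials. As a standalone function (sketch; the function name is invented):

```python
from typing import Iterable, List, Optional

def first_completed_flow(
    flows: Iterable[List[str]], creds: Iterable[str]
) -> Optional[List[str]]:
    # Mirrors `len(set(f) - set(creds)) == 0`: no stage of the flow
    # may be missing from the completed credentials.
    done = set(creds)
    for f in flows:
        if not set(f) - done:
            return f
    return None

incomplete = first_completed_flow([["m.login.password", "m.login.sso"]], ["m.login.sso"])
complete = first_completed_flow([["m.login.password"]], ["m.login.password", "extra"])
```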
708753 device_id: Optional[str],
709754 valid_until_ms: Optional[int],
710755 puppets_user_id: Optional[str] = None,
756 is_appservice_ghost: bool = False,
711757 ) -> str:
712758 """
713759 Creates a new access token for the user with the given user ID.
724770 we should always have a device ID)
725771 valid_until_ms: when the token is valid until. None for
726772 no expiry.
 773 is_appservice_ghost: Whether the user is an application service "ghost" user
727774 Returns:
728775 The access token for the user's session.
729776 Raises:
744791 "Logging in user %s on device %s%s", user_id, device_id, fmt_expiry
745792 )
746793
747 await self.auth.check_auth_blocking(user_id)
794 if (
795 not is_appservice_ghost
796 or self.hs.config.appservice.track_appservice_user_ips
797 ):
798 await self.auth.check_auth_blocking(user_id)
748799
749800 access_token = self.macaroon_gen.generate_access_token(user_id)
750801 await self.store.add_access_token_to_user(
830881
831882 async def validate_login(
832883 self, login_submission: Dict[str, Any], ratelimit: bool = False,
833 ) -> Tuple[str, Optional[Callable[[Dict[str, str]], None]]]:
884 ) -> Tuple[str, Optional[Callable[[Dict[str, str]], Awaitable[None]]]]:
834885 """Authenticates the user for the /login API
835886
836887 Also used by the user-interactive auth flow to validate auth types which don't
9731024
9741025 async def _validate_userid_login(
9751026 self, username: str, login_submission: Dict[str, Any],
976 ) -> Tuple[str, Optional[Callable[[Dict[str, str]], None]]]:
1027 ) -> Tuple[str, Optional[Callable[[Dict[str, str]], Awaitable[None]]]]:
9771028 """Helper for validate_login
9781029
9791030 Handles login, once we've mapped 3pids onto userids
10281079 if result:
10291080 return result
10301081
1031 if login_type == LoginType.PASSWORD and self.hs.config.password_localdb_enabled:
1082 if login_type == LoginType.PASSWORD and self._password_localdb_enabled:
10321083 known_login_type = True
10331084
10341085 # we've already checked that there is a (valid) password field
10511102
10521103 async def check_password_provider_3pid(
10531104 self, medium: str, address: str, password: str
1054 ) -> Tuple[Optional[str], Optional[Callable[[Dict[str, str]], None]]]:
1105 ) -> Tuple[Optional[str], Optional[Callable[[Dict[str, str]], Awaitable[None]]]]:
10551106 """Check if a password provider is able to validate a thirdparty login
10561107
10571108 Args:
13021353 )
13031354
13041355 async def complete_sso_ui_auth(
1305 self, registered_user_id: str, session_id: str, request: SynapseRequest,
1356 self, registered_user_id: str, session_id: str, request: Request,
13061357 ):
13071358 """Having figured out a mxid for this user, complete the HTTP request
13081359
13091360 Args:
13101361 registered_user_id: The registered user ID to complete SSO login for.
1362 session_id: The ID of the user-interactive auth session.
13111363 request: The request to complete.
1312 client_redirect_url: The URL to which to redirect the user at the end of the
1313 process.
13141364 """
13151365 # Mark the stage of the authentication as successful.
13161366 # Save the user who authenticated with SSO, this will be used to ensure
13261376 async def complete_sso_login(
13271377 self,
13281378 registered_user_id: str,
1329 request: SynapseRequest,
1379 request: Request,
13301380 client_redirect_url: str,
13311381 extra_attributes: Optional[JsonDict] = None,
13321382 ):
13541404 def _complete_sso_login(
13551405 self,
13561406 registered_user_id: str,
1357 request: SynapseRequest,
1407 request: Request,
13581408 client_redirect_url: str,
13591409 extra_attributes: Optional[JsonDict] = None,
13601410 ):
16081658
16091659 # This might return an awaitable, if it does block the log out
16101660 # until it completes.
1611 result = g(user_id=user_id, device_id=device_id, access_token=access_token,)
1612 if inspect.isawaitable(result):
1613 await result
1661 await maybe_awaitable(
1662 g(user_id=user_id, device_id=device_id, access_token=access_token,)
1663 )
1212 # See the License for the specific language governing permissions and
1313 # limitations under the License.
1414 import logging
15 import urllib
16 from typing import TYPE_CHECKING, Dict, Optional, Tuple
15 import urllib.parse
16 from typing import TYPE_CHECKING, Dict, Optional
1717 from xml.etree import ElementTree as ET
1818
19 import attr
20
1921 from twisted.web.client import PartialDownloadError
2022
21 from synapse.api.errors import Codes, LoginError
23 from synapse.api.errors import HttpResponseException
24 from synapse.handlers.sso import MappingException, UserAttributes
2225 from synapse.http.site import SynapseRequest
2326 from synapse.types import UserID, map_username_to_mxid_localpart
2427
2629 from synapse.app.homeserver import HomeServer
2730
2831 logger = logging.getLogger(__name__)
32
33
34 class CasError(Exception):
35 """Used to catch errors when validating the CAS ticket.
36 """
37
38 def __init__(self, error, error_description=None):
39 self.error = error
40 self.error_description = error_description
41
42 def __str__(self):
43 if self.error_description:
44 return "{}: {}".format(self.error, self.error_description)
45 return self.error
46
47
48 @attr.s(slots=True, frozen=True)
49 class CasResponse:
50 username = attr.ib(type=str)
51 attributes = attr.ib(type=Dict[str, Optional[str]])
2952
3053
3154 class CasHandler:
3962 def __init__(self, hs: "HomeServer"):
4063 self.hs = hs
4164 self._hostname = hs.hostname
65 self._store = hs.get_datastore()
4266 self._auth_handler = hs.get_auth_handler()
4367 self._registration_handler = hs.get_registration_handler()
4468
4872 self._cas_required_attributes = hs.config.cas_required_attributes
4973
5074 self._http_client = hs.get_proxied_http_client()
75
76 # identifier for the external_ids table
77 self._auth_provider_id = "cas"
78
79 self._sso_handler = hs.get_sso_handler()
5180
5281 def _build_service_param(self, args: Dict[str, str]) -> str:
5382 """
6897
6998 async def _validate_ticket(
7099 self, ticket: str, service_args: Dict[str, str]
71 ) -> Tuple[str, Optional[str]]:
72 """
73 Validate a CAS ticket with the server, parse the response, and return the user and display name.
100 ) -> CasResponse:
101 """
 102 Validate a CAS ticket with the server, and return the parsed response.
74103
75104 Args:
76105 ticket: The CAS ticket from the client.
77106 service_args: Additional arguments to include in the service URL.
78107 Should be the same as those passed to `get_redirect_url`.
108
109 Raises:
110 CasError: If there's an error parsing the CAS response.
111
112 Returns:
113 The parsed CAS response.
79114 """
80115 uri = self._cas_server_url + "/proxyValidate"
81116 args = {
88123 # Twisted raises this error if the connection is closed,
89124 # even if that's being used old-http style to signal end-of-data
90125 body = pde.response
91
92 user, attributes = self._parse_cas_response(body)
93 displayname = attributes.pop(self._cas_displayname_attribute, None)
94
95 for required_attribute, required_value in self._cas_required_attributes.items():
96 # If required attribute was not in CAS Response - Forbidden
97 if required_attribute not in attributes:
98 raise LoginError(401, "Unauthorized", errcode=Codes.UNAUTHORIZED)
99
100 # Also need to check value
101 if required_value is not None:
102 actual_value = attributes[required_attribute]
103 # If required attribute value does not match expected - Forbidden
104 if required_value != actual_value:
105 raise LoginError(401, "Unauthorized", errcode=Codes.UNAUTHORIZED)
106
107 return user, displayname
108
109 def _parse_cas_response(
110 self, cas_response_body: bytes
111 ) -> Tuple[str, Dict[str, Optional[str]]]:
126 except HttpResponseException as e:
 127 description = (
 128 'CAS server responded with a "{status}" error '
 129 "while validating the ticket."
 130 ).format(status=e.code)
133 raise CasError("server_error", description) from e
134
135 return self._parse_cas_response(body)
136
137 def _parse_cas_response(self, cas_response_body: bytes) -> CasResponse:
112138 """
113139 Retrieve the user and other parameters from the CAS response.
114140
115141 Args:
116142 cas_response_body: The response from the CAS query.
117143
144 Raises:
145 CasError: If there's an error parsing the CAS response.
146
118147 Returns:
119 A tuple of the user and a mapping of other attributes.
120 """
148 The parsed CAS response.
149 """
150
151 # Ensure the response is valid.
152 root = ET.fromstring(cas_response_body)
153 if not root.tag.endswith("serviceResponse"):
154 raise CasError(
155 "missing_service_response",
156 "root of CAS response is not serviceResponse",
157 )
158
159 success = root[0].tag.endswith("authenticationSuccess")
160 if not success:
161 raise CasError("unsucessful_response", "Unsuccessful CAS response")
162
163 # Iterate through the nodes and pull out the user and any extra attributes.
121164 user = None
122165 attributes = {}
123 try:
124 root = ET.fromstring(cas_response_body)
125 if not root.tag.endswith("serviceResponse"):
126 raise Exception("root of CAS response is not serviceResponse")
127 success = root[0].tag.endswith("authenticationSuccess")
128 for child in root[0]:
129 if child.tag.endswith("user"):
130 user = child.text
131 if child.tag.endswith("attributes"):
132 for attribute in child:
133 # ElementTree library expands the namespace in
134 # attribute tags to the full URL of the namespace.
135 # We don't care about namespace here and it will always
136 # be encased in curly braces, so we remove them.
137 tag = attribute.tag
138 if "}" in tag:
139 tag = tag.split("}")[1]
140 attributes[tag] = attribute.text
141 if user is None:
142 raise Exception("CAS response does not contain user")
143 except Exception:
144 logger.exception("Error parsing CAS response")
145 raise LoginError(401, "Invalid CAS response", errcode=Codes.UNAUTHORIZED)
146 if not success:
147 raise LoginError(
148 401, "Unsuccessful CAS response", errcode=Codes.UNAUTHORIZED
149 )
150 return user, attributes
166 for child in root[0]:
167 if child.tag.endswith("user"):
168 user = child.text
169 if child.tag.endswith("attributes"):
170 for attribute in child:
171 # ElementTree library expands the namespace in
172 # attribute tags to the full URL of the namespace.
173 # We don't care about namespace here and it will always
174 # be encased in curly braces, so we remove them.
175 tag = attribute.tag
176 if "}" in tag:
177 tag = tag.split("}")[1]
178 attributes[tag] = attribute.text
179
180 # Ensure a user was found.
181 if user is None:
182 raise CasError("no_user", "CAS response does not contain user")
183
184 return CasResponse(user, attributes)
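The namespace-stripping above relies on how ElementTree expands XML namespaces into the tag itself; a minimal sketch (the CAS namespace URL and payload here are illustrative):

```python
import xml.etree.ElementTree as ET

# ElementTree rewrites namespaced tags as "{namespace-url}localname",
# so splitting on "}" recovers the bare tag name.
xml_body = (
    '<cas:serviceResponse xmlns:cas="http://www.yale.edu/tp/cas">'
    '<cas:authenticationSuccess><cas:user>alice</cas:user>'
    '</cas:authenticationSuccess></cas:serviceResponse>'
)

root = ET.fromstring(xml_body)
assert root.tag.endswith("serviceResponse")

def strip_ns(tag: str) -> str:
    # Drop the "{...}" namespace prefix, if any.
    return tag.split("}")[1] if "}" in tag else tag

tags = [strip_ns(child.tag) for child in root[0]]
print(tags)  # ['user']
```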
151185
152186 def get_redirect_url(self, service_args: Dict[str, str]) -> str:
153187 """
200234 args["redirectUrl"] = client_redirect_url
201235 if session:
202236 args["session"] = session
203 username, user_display_name = await self._validate_ticket(ticket, args)
204
205 # Pull out the user-agent and IP from the request.
206 user_agent = request.get_user_agent("")
207 ip_address = self.hs.get_ip_from_request(request)
208
209 # Get the matrix ID from the CAS username.
210 user_id = await self._map_cas_user_to_matrix_user(
211 username, user_display_name, user_agent, ip_address
237
238 try:
239 cas_response = await self._validate_ticket(ticket, args)
240 except CasError as e:
241 logger.exception("Could not validate ticket")
242 self._sso_handler.render_error(request, e.error, e.error_description, 401)
243 return
244
245 await self._handle_cas_response(
246 request, cas_response, client_redirect_url, session
212247 )
213248
249 async def _handle_cas_response(
250 self,
251 request: SynapseRequest,
252 cas_response: CasResponse,
253 client_redirect_url: Optional[str],
254 session: Optional[str],
255 ) -> None:
256 """Handle a CAS response to a ticket request.
257
258 Assumes that the response has been validated. Maps the user onto an MXID,
259 registering them if necessary, and returns a response to the browser.
260
261 Args:
262 request: the incoming request from the browser. We'll respond to it with an
263 HTML page or a redirect
264
265 cas_response: The parsed CAS response.
266
267 client_redirect_url: the redirectUrl parameter from the `/cas/ticket` HTTP request, if given.
268 This should be the same as the redirectUrl from the original `/login/sso/redirect` request.
269
270 session: The session parameter from the `/cas/ticket` HTTP request, if given.
271 This should be the UI Auth session id.
272 """
273
274 # first check if we're doing a UIA
214275 if session:
215 await self._auth_handler.complete_sso_ui_auth(
216 user_id, session, request,
217 )
218 else:
 219 # If this is not a UI auth request then there must be a redirect URL.
220 assert client_redirect_url
221
222 await self._auth_handler.complete_sso_login(
223 user_id, request, client_redirect_url
224 )
225
226 async def _map_cas_user_to_matrix_user(
276 return await self._sso_handler.complete_sso_ui_auth_request(
277 self._auth_provider_id, cas_response.username, session, request,
278 )
279
280 # otherwise, we're handling a login request.
281
282 # Ensure that the attributes of the logged in user meet the required
283 # attributes.
284 for required_attribute, required_value in self._cas_required_attributes.items():
285 # If required attribute was not in CAS Response - Forbidden
286 if required_attribute not in cas_response.attributes:
287 self._sso_handler.render_error(
288 request,
289 "unauthorised",
290 "You are not authorised to log in here.",
291 401,
292 )
293 return
294
295 # Also need to check value
296 if required_value is not None:
297 actual_value = cas_response.attributes[required_attribute]
298 # If required attribute value does not match expected - Forbidden
299 if required_value != actual_value:
300 self._sso_handler.render_error(
301 request,
302 "unauthorised",
303 "You are not authorised to log in here.",
304 401,
305 )
306 return
307
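The two-step gate above (attribute must be present, and must match exactly when a value is required) can be factored as a small pure function; a sketch, where `required` maps attribute names to an expected value or None for presence-only checks:

```python
from typing import Mapping, Optional

def attributes_allowed(
    attributes: Mapping[str, Optional[str]],
    required: Mapping[str, Optional[str]],
) -> bool:
    """Return True iff every required attribute is present and, where an
    expected value is given (not None), matches exactly."""
    for name, expected in required.items():
        # Missing attribute: forbidden.
        if name not in attributes:
            return False
        # Present but wrong value: forbidden.
        if expected is not None and attributes[name] != expected:
            return False
    return True

assert attributes_allowed({"group": "staff"}, {"group": "staff"})
assert not attributes_allowed({"group": "guest"}, {"group": "staff"})
assert attributes_allowed({"group": "guest"}, {"group": None})  # presence only
```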
308 # Call the mapper to register/login the user
309
 310 # If this is not a UI auth request then there must be a redirect URL.
311 assert client_redirect_url is not None
312
313 try:
314 await self._complete_cas_login(cas_response, request, client_redirect_url)
315 except MappingException as e:
316 logger.exception("Could not map user")
317 self._sso_handler.render_error(request, "mapping_error", str(e))
318
319 async def _complete_cas_login(
227320 self,
228 remote_user_id: str,
229 display_name: Optional[str],
230 user_agent: str,
231 ip_address: str,
232 ) -> str:
233 """
234 Given a CAS username, retrieve the user ID for it and possibly register the user.
235
236 Args:
237 remote_user_id: The username from the CAS response.
238 display_name: The display name from the CAS response.
239 user_agent: The user agent of the client making the request.
240 ip_address: The IP address of the client making the request.
241
242 Returns:
243 The user ID associated with this response.
244 """
245
246 localpart = map_username_to_mxid_localpart(remote_user_id)
247 user_id = UserID(localpart, self._hostname).to_string()
248 registered_user_id = await self._auth_handler.check_user_exists(user_id)
249
250 # If the user does not exist, register it.
251 if not registered_user_id:
252 registered_user_id = await self._registration_handler.register_user(
253 localpart=localpart,
254 default_display_name=display_name,
255 user_agent_ips=[(user_agent, ip_address)],
256 )
257
258 return registered_user_id
321 cas_response: CasResponse,
322 request: SynapseRequest,
323 client_redirect_url: str,
324 ) -> None:
325 """
326 Given a CAS response, complete the login flow
327
328 Retrieves the remote user ID, registers the user if necessary, and serves
329 a redirect back to the client with a login-token.
330
331 Args:
332 cas_response: The parsed CAS response.
333 request: The request to respond to
334 client_redirect_url: The redirect URL passed in by the client.
335
336 Raises:
337 MappingException if there was a problem mapping the response to a user.
338 RedirectException: some mapping providers may raise this if they need
339 to redirect to an interstitial page.
340 """
341 # Note that CAS does not support a mapping provider, so the logic is hard-coded.
342 localpart = map_username_to_mxid_localpart(cas_response.username)
343
344 async def cas_response_to_user_attributes(failures: int) -> UserAttributes:
345 """
346 Map from CAS attributes to user attributes.
347 """
 348 # Because the grandfathering logic matches any previously registered
 349 # mxids, it isn't expected that there will be any failures.
350 if failures:
351 raise RuntimeError("CAS is not expected to de-duplicate Matrix IDs")
352
353 display_name = cas_response.attributes.get(
354 self._cas_displayname_attribute, None
355 )
356
357 return UserAttributes(localpart=localpart, display_name=display_name)
358
359 async def grandfather_existing_users() -> Optional[str]:
 360 # Since CAS did not always use the user_external_ids table, always
 361 # attempt to map to existing users.
362 user_id = UserID(localpart, self._hostname).to_string()
363
364 logger.debug(
365 "Looking for existing account based on mapped %s", user_id,
366 )
367
368 users = await self._store.get_users_by_id_case_insensitive(user_id)
369 if users:
370 registered_user_id = list(users.keys())[0]
371 logger.info("Grandfathering mapping to %s", registered_user_id)
372 return registered_user_id
373
374 return None
375
376 await self._sso_handler.complete_sso_login_request(
377 self._auth_provider_id,
378 cas_response.username,
379 request,
380 client_redirect_url,
381 cas_response_to_user_attributes,
382 grandfather_existing_users,
383 )
132132 403, "You must be in the room to create an alias for it"
133133 )
134134
135 if not self.spam_checker.user_may_create_room_alias(user_id, room_alias):
135 if not await self.spam_checker.user_may_create_room_alias(
136 user_id, room_alias
137 ):
136138 raise AuthError(403, "This user is not permitted to create this alias")
137139
138140 if not self.config.is_alias_creation_allowed(
408410 """
409411 user_id = requester.user.to_string()
410412
411 if not self.spam_checker.user_may_publish_room(user_id, room_id):
413 if not await self.spam_checker.user_may_publish_room(user_id, room_id):
412414 raise AuthError(
413415 403, "This user is not permitted to publish rooms to the room list"
414416 )
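Across these hunks the spam-checker calls become awaited. A hypothetical sketch (not Synapse's actual implementation) of how a facade can support both plain and coroutine callbacks, awaiting the result only when it is awaitable:

```python
import asyncio
import inspect

# Hypothetical facade: invoke a spam-check callback that may be either a
# plain function or a coroutine function.
async def call_checker(callback, *args):
    result = callback(*args)
    if inspect.isawaitable(result):
        result = await result
    return result

def sync_check(user_id: str) -> bool:
    return not user_id.startswith("@spam")

async def async_check(user_id: str) -> bool:
    return not user_id.startswith("@spam")

assert asyncio.run(call_checker(sync_check, "@alice:example.com"))
assert not asyncio.run(call_checker(async_check, "@spammer:example.com"))
```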
139139 self._message_handler = hs.get_message_handler()
140140 self._server_notices_mxid = hs.config.server_notices_mxid
141141 self.config = hs.config
142 self.http_client = hs.get_simple_http_client()
142 self.http_client = hs.get_proxied_blacklisted_http_client()
143143 self._instance_name = hs.get_instance_name()
144144 self._replication = hs.get_replication_data_handler()
145145
15921592 if self.hs.config.block_non_admin_invites:
15931593 raise SynapseError(403, "This server does not accept room invites")
15941594
1595 if not self.spam_checker.user_may_invite(
1595 if not await self.spam_checker.user_may_invite(
15961596 event.sender, event.state_key, event.room_id
15971597 ):
15981598 raise SynapseError(
2828
2929 async def f(self, group_id, *args, **kwargs):
3030 if not GroupID.is_valid(group_id):
31 raise SynapseError(400, "%s was not legal group ID" % (group_id,))
31 raise SynapseError(400, "%s is not a legal group ID" % (group_id,))
3232
3333 if self.is_mine_id(group_id):
3434 return await getattr(self.groups_server_handler, func_name)(
4545 def __init__(self, hs):
4646 super().__init__(hs)
4747
48 # An HTTP client for contacting trusted URLs.
4849 self.http_client = SimpleHttpClient(hs)
49 # We create a blacklisting instance of SimpleHttpClient for contacting identity
50 # servers specified by clients
50 # An HTTP client for contacting identity servers specified by clients.
5151 self.blacklisting_http_client = SimpleHttpClient(
5252 hs, ip_blacklist=hs.config.federation_ip_range_blacklist
5353 )
54 self.federation_http_client = hs.get_http_client()
54 self.federation_http_client = hs.get_federation_http_client()
5555 self.hs = hs
56
57 self._web_client_location = hs.config.invite_client_location
5658
5759 async def threepid_from_creds(
5860 self, id_server: str, creds: Dict[str, str]
802804 "sender_display_name": inviter_display_name,
803805 "sender_avatar_url": inviter_avatar_url,
804806 }
807 # If a custom web client location is available, include it in the request.
808 if self._web_client_location:
809 invite_config["org.matrix.web_client_location"] = self._web_client_location
805810
806811 # Add the identity service access token to the JSON body and use the v2
807812 # Identity Service endpoints if id_access_token is present
322322 member_event_id: str,
323323 is_peeking: bool,
324324 ) -> JsonDict:
325 room_state = await self.state_store.get_state_for_events([member_event_id])
326
327 room_state = room_state[member_event_id]
325 room_state = await self.state_store.get_state_for_event(member_event_id)
328326
329327 limit = pagin_config.limit if pagin_config else None
330328 if limit is None:
743743 event.sender,
744744 )
745745
746 spam_error = self.spam_checker.check_event_for_spam(event)
746 spam_error = await self.spam_checker.check_event_for_spam(event)
747747 if spam_error:
748748 if not isinstance(spam_error, str):
749749 spam_error = "Spam is not permitted here"
12601260 event, context = await self.create_event(
12611261 requester,
12621262 {
1263 "type": "org.matrix.dummy_event",
1263 "type": EventTypes.Dummy,
12641264 "content": {},
12651265 "room_id": room_id,
12661266 "sender": user_id,
114114 self._allow_existing_users = hs.config.oidc_allow_existing_users # type: bool
115115
116116 self._http_client = hs.get_proxied_http_client()
117 self._auth_handler = hs.get_auth_handler()
118 self._registration_handler = hs.get_registration_handler()
119117 self._server_name = hs.config.server_name # type: str
120118 self._macaroon_secret_key = hs.config.macaroon_secret_key
121119
673671 self._sso_handler.render_error(request, "invalid_token", str(e))
674672 return
675673
676 # Pull out the user-agent and IP from the request.
677 user_agent = request.get_user_agent("")
678 ip_address = self.hs.get_ip_from_request(request)
674 # first check if we're doing a UIA
675 if ui_auth_session_id:
676 try:
677 remote_user_id = self._remote_id_from_userinfo(userinfo)
678 except Exception as e:
679 logger.exception("Could not extract remote user id")
680 self._sso_handler.render_error(request, "mapping_error", str(e))
681 return
682
683 return await self._sso_handler.complete_sso_ui_auth_request(
684 self._auth_provider_id, remote_user_id, ui_auth_session_id, request
685 )
686
687 # otherwise, it's a login
679688
680689 # Call the mapper to register/login the user
681690 try:
682 user_id = await self._map_userinfo_to_user(
683 userinfo, token, user_agent, ip_address
691 await self._complete_oidc_login(
692 userinfo, token, request, client_redirect_url
684693 )
685694 except MappingException as e:
686695 logger.exception("Could not map user")
687696 self._sso_handler.render_error(request, "mapping_error", str(e))
688 return
689
690 # Mapping providers might not have get_extra_attributes: only call this
691 # method if it exists.
692 extra_attributes = None
693 get_extra_attributes = getattr(
694 self._user_mapping_provider, "get_extra_attributes", None
695 )
696 if get_extra_attributes:
697 extra_attributes = await get_extra_attributes(userinfo, token)
698
699 # and finally complete the login
700 if ui_auth_session_id:
701 await self._auth_handler.complete_sso_ui_auth(
702 user_id, ui_auth_session_id, request
703 )
704 else:
705 await self._auth_handler.complete_sso_login(
706 user_id, request, client_redirect_url, extra_attributes
707 )
708697
709698 def _generate_oidc_session_token(
710699 self,
827816 now = self.clock.time_msec()
828817 return now < expiry
829818
830 async def _map_userinfo_to_user(
831 self, userinfo: UserInfo, token: Token, user_agent: str, ip_address: str
832 ) -> str:
833 """Maps a UserInfo object to a mxid.
819 async def _complete_oidc_login(
820 self,
821 userinfo: UserInfo,
822 token: Token,
823 request: SynapseRequest,
824 client_redirect_url: str,
825 ) -> None:
826 """Given a UserInfo response, complete the login flow
834827
835828 UserInfo should have a claim that uniquely identifies users. This claim
836829 is usually `sub`, but can be configured with `oidc_config.subject_claim`.
842835 If a user already exists with the mxid we've mapped and allow_existing_users
843836 is disabled, raise an exception.
844837
838 Otherwise, render a redirect back to the client_redirect_url with a loginToken.
839
845840 Args:
846841 userinfo: an object representing the user
847842 token: a dict with the tokens obtained from the provider
848 user_agent: The user agent of the client making the request.
849 ip_address: The IP address of the client making the request.
843 request: The request to respond to
844 client_redirect_url: The redirect URL passed in by the client.
850845
851846 Raises:
852847 MappingException: if there was an error while mapping some properties
853
854 Returns:
855 The mxid of the user
856848 """
857849 try:
858 remote_user_id = self._user_mapping_provider.get_remote_user_id(userinfo)
850 remote_user_id = self._remote_id_from_userinfo(userinfo)
859851 except Exception as e:
860852 raise MappingException(
861853 "Failed to extract subject from OIDC response: %s" % (e,)
862854 )
863 # Some OIDC providers use integer IDs, but Synapse expects external IDs
864 # to be strings.
865 remote_user_id = str(remote_user_id)
866855
867856 # Older mapping providers don't accept the `failures` argument, so we
868857 # try and detect support.
923912
924913 return None
925914
926 return await self._sso_handler.get_mxid_from_sso(
915 # Mapping providers might not have get_extra_attributes: only call this
916 # method if it exists.
917 extra_attributes = None
918 get_extra_attributes = getattr(
919 self._user_mapping_provider, "get_extra_attributes", None
920 )
921 if get_extra_attributes:
922 extra_attributes = await get_extra_attributes(userinfo, token)
923
924 await self._sso_handler.complete_sso_login_request(
927925 self._auth_provider_id,
928926 remote_user_id,
929 user_agent,
930 ip_address,
927 request,
928 client_redirect_url,
931929 oidc_response_to_user_attributes,
932930 grandfather_existing_users,
933 )
931 extra_attributes,
932 )
933
934 def _remote_id_from_userinfo(self, userinfo: UserInfo) -> str:
935 """Extract the unique remote id from an OIDC UserInfo block
936
937 Args:
938 userinfo: An object representing the user given by the OIDC provider
939 Returns:
940 remote user id
941 """
942 remote_user_id = self._user_mapping_provider.get_remote_user_id(userinfo)
943 # Some OIDC providers use integer IDs, but Synapse expects external IDs
944 # to be strings.
945 return str(remote_user_id)
934946
935947
936948 UserAttributeDict = TypedDict(
937 "UserAttributeDict", {"localpart": str, "display_name": Optional[str]}
949 "UserAttributeDict", {"localpart": Optional[str], "display_name": Optional[str]}
938950 )
939951 C = TypeVar("C")
940952
10151027
10161028 @attr.s
10171029 class JinjaOidcMappingConfig:
1018 subject_claim = attr.ib() # type: str
1019 localpart_template = attr.ib() # type: Template
1020 display_name_template = attr.ib() # type: Optional[Template]
1021 extra_attributes = attr.ib() # type: Dict[str, Template]
1030 subject_claim = attr.ib(type=str)
1031 localpart_template = attr.ib(type=Optional[Template])
1032 display_name_template = attr.ib(type=Optional[Template])
1033 extra_attributes = attr.ib(type=Dict[str, Template])
10221034
10231035
10241036 class JinjaOidcMappingProvider(OidcMappingProvider[JinjaOidcMappingConfig]):
10341046 def parse_config(config: dict) -> JinjaOidcMappingConfig:
10351047 subject_claim = config.get("subject_claim", "sub")
10361048
1037 if "localpart_template" not in config:
1038 raise ConfigError(
1039 "missing key: oidc_config.user_mapping_provider.config.localpart_template"
1040 )
1041
1042 try:
1043 localpart_template = env.from_string(config["localpart_template"])
1044 except Exception as e:
1045 raise ConfigError(
1046 "invalid jinja template for oidc_config.user_mapping_provider.config.localpart_template: %r"
1047 % (e,)
1048 )
1049 localpart_template = None # type: Optional[Template]
1050 if "localpart_template" in config:
1051 try:
1052 localpart_template = env.from_string(config["localpart_template"])
1053 except Exception as e:
1054 raise ConfigError(
1055 "invalid jinja template", path=["localpart_template"]
1056 ) from e
10491057
10501058 display_name_template = None # type: Optional[Template]
10511059 if "display_name_template" in config:
10531061 display_name_template = env.from_string(config["display_name_template"])
10541062 except Exception as e:
10551063 raise ConfigError(
1056 "invalid jinja template for oidc_config.user_mapping_provider.config.display_name_template: %r"
1057 % (e,)
1058 )
1064 "invalid jinja template", path=["display_name_template"]
1065 ) from e
10591066
 10601067 extra_attributes = {} # type: Dict[str, Template]
10611068 if "extra_attributes" in config:
10621069 extra_attributes_config = config.get("extra_attributes") or {}
10631070 if not isinstance(extra_attributes_config, dict):
1064 raise ConfigError(
1065 "oidc_config.user_mapping_provider.config.extra_attributes must be a dict"
1066 )
1071 raise ConfigError("must be a dict", path=["extra_attributes"])
10671072
10681073 for key, value in extra_attributes_config.items():
10691074 try:
10701075 extra_attributes[key] = env.from_string(value)
10711076 except Exception as e:
10721077 raise ConfigError(
1073 "invalid jinja template for oidc_config.user_mapping_provider.config.extra_attributes.%s: %r"
1074 % (key, e)
1075 )
1078 "invalid jinja template", path=["extra_attributes", key]
1079 ) from e
10761080
10771081 return JinjaOidcMappingConfig(
10781082 subject_claim=subject_claim,
10871091 async def map_user_attributes(
10881092 self, userinfo: UserInfo, token: Token, failures: int
10891093 ) -> UserAttributeDict:
1090 localpart = self._config.localpart_template.render(user=userinfo).strip()
1091
1092 # Ensure only valid characters are included in the MXID.
1093 localpart = map_username_to_mxid_localpart(localpart)
1094
1095 # Append suffix integer if last call to this function failed to produce
1096 # a usable mxid.
1097 localpart += str(failures) if failures else ""
1094 localpart = None
1095
1096 if self._config.localpart_template:
1097 localpart = self._config.localpart_template.render(user=userinfo).strip()
1098
1099 # Ensure only valid characters are included in the MXID.
1100 localpart = map_username_to_mxid_localpart(localpart)
1101
1102 # Append suffix integer if last call to this function failed to produce
1103 # a usable mxid.
1104 localpart += str(failures) if failures else ""
10981105
10991106 display_name = None # type: Optional[str]
11001107 if self._config.display_name_template is not None:
1212 # See the License for the specific language governing permissions and
1313 # limitations under the License.
1414 import logging
15 from typing import List, Tuple
15 from typing import TYPE_CHECKING, List, Optional, Tuple
1616
1717 from synapse.appservice import ApplicationService
1818 from synapse.handlers._base import BaseHandler
1919 from synapse.types import JsonDict, ReadReceipt, get_domain_from_id
20 from synapse.util.async_helpers import maybe_awaitable
20
21 if TYPE_CHECKING:
22 from synapse.app.homeserver import HomeServer
2123
2224 logger = logging.getLogger(__name__)
2325
2426
2527 class ReceiptsHandler(BaseHandler):
26 def __init__(self, hs):
28 def __init__(self, hs: "HomeServer"):
2729 super().__init__(hs)
2830
2931 self.server_name = hs.config.server_name
3638 self.clock = self.hs.get_clock()
3739 self.state = hs.get_state_handler()
3840
39 async def _received_remote_receipt(self, origin, content):
41 async def _received_remote_receipt(self, origin: str, content: JsonDict) -> None:
4042 """Called when we receive an EDU of type m.receipt from a remote HS.
4143 """
4244 receipts = []
6365
6466 await self._handle_new_receipts(receipts)
6567
66 async def _handle_new_receipts(self, receipts):
68 async def _handle_new_receipts(self, receipts: List[ReadReceipt]) -> bool:
6769 """Takes a list of receipts, stores them and informs the notifier.
6870 """
69 min_batch_id = None
70 max_batch_id = None
71 min_batch_id = None # type: Optional[int]
72 max_batch_id = None # type: Optional[int]
7173
7274 for receipt in receipts:
7375 res = await self.store.insert_receipt(
8991 if max_batch_id is None or max_persisted_id > max_batch_id:
9092 max_batch_id = max_persisted_id
9193
92 if min_batch_id is None:
94 # Either both of these should be None or neither.
95 if min_batch_id is None or max_batch_id is None:
9396 # no new receipts
9497 return False
9598
97100
98101 self.notifier.on_new_event("receipt_key", max_batch_id, rooms=affected_room_ids)
99102 # Note that the min here shouldn't be relied upon to be accurate.
100 await maybe_awaitable(
101 self.hs.get_pusherpool().on_new_receipts(
102 min_batch_id, max_batch_id, affected_room_ids
103 )
103 await self.hs.get_pusherpool().on_new_receipts(
104 min_batch_id, max_batch_id, affected_room_ids
104105 )
105106
106107 return True
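The Optional min/max bookkeeping above can be seen in isolation: track the smallest and largest persisted stream id across a batch, with None meaning nothing has been persisted yet. A minimal sketch:

```python
from typing import Iterable, Optional, Tuple

def batch_range(stream_ids: Iterable[int]) -> Optional[Tuple[int, int]]:
    # Both bounds start as None and are set together on the first id,
    # which is why the handler checks "min is None or max is None" as one.
    min_id: Optional[int] = None
    max_id: Optional[int] = None
    for stream_id in stream_ids:
        if min_id is None or stream_id < min_id:
            min_id = stream_id
        if max_id is None or stream_id > max_id:
            max_id = stream_id
    if min_id is None or max_id is None:
        return None  # no new receipts
    return (min_id, max_id)

print(batch_range([7, 3, 9]))  # (3, 9)
print(batch_range([]))         # None
```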
107108
108 async def received_client_receipt(self, room_id, receipt_type, user_id, event_id):
109 async def received_client_receipt(
110 self, room_id: str, receipt_type: str, user_id: str, event_id: str
111 ) -> None:
109112 """Called when a client tells us a local user has read up to the given
110113 event_id in the room.
111114 """
125128
126129
127130 class ReceiptEventSource:
128 def __init__(self, hs):
131 def __init__(self, hs: "HomeServer"):
129132 self.store = hs.get_datastore()
130133
131 async def get_new_events(self, from_key, room_ids, **kwargs):
134 async def get_new_events(
135 self, from_key: int, room_ids: List[str], **kwargs
136 ) -> Tuple[List[JsonDict], int]:
132137 from_key = int(from_key)
133138 to_key = self.get_current_key()
134139
173178
174179 return (events, to_key)
175180
176 def get_current_key(self, direction="f"):
181 def get_current_key(self, direction: str = "f") -> int:
177182 return self.store.get_max_receipt_stream_id()
186186 """
187187 self.check_registration_ratelimit(address)
188188
189 result = self.spam_checker.check_registration_for_spam(
189 result = await self.spam_checker.check_registration_for_spam(
190190 threepid, localpart, user_agent_ips or [],
191191 )
192192
629629 device_id: Optional[str],
630630 initial_display_name: Optional[str],
631631 is_guest: bool = False,
632 is_appservice_ghost: bool = False,
632633 ) -> Tuple[str, str]:
633634 """Register a device for a user and generate an access token.
634635
650651 device_id=device_id,
651652 initial_display_name=initial_display_name,
652653 is_guest=is_guest,
654 is_appservice_ghost=is_appservice_ghost,
653655 )
654656 return r["device_id"], r["access_token"]
655657
671673 )
672674 else:
673675 access_token = await self._auth_handler.get_access_token_for_user_id(
674 user_id, device_id=registered_device_id, valid_until_ms=valid_until_ms
676 user_id,
677 device_id=registered_device_id,
678 valid_until_ms=valid_until_ms,
679 is_appservice_ghost=is_appservice_ghost,
675680 )
676681
677682 return (registered_device_id, access_token)
2626
2727 from synapse.api.constants import (
2828 EventTypes,
29 HistoryVisibility,
2930 JoinRules,
3031 Membership,
3132 RoomCreationPreset,
8081 self._presets_dict = {
8182 RoomCreationPreset.PRIVATE_CHAT: {
8283 "join_rules": JoinRules.INVITE,
83 "history_visibility": "shared",
84 "history_visibility": HistoryVisibility.SHARED,
8485 "original_invitees_have_ops": False,
8586 "guest_can_join": True,
8687 "power_level_content_override": {"invite": 0},
8788 },
8889 RoomCreationPreset.TRUSTED_PRIVATE_CHAT: {
8990 "join_rules": JoinRules.INVITE,
90 "history_visibility": "shared",
91 "history_visibility": HistoryVisibility.SHARED,
9192 "original_invitees_have_ops": True,
9293 "guest_can_join": True,
9394 "power_level_content_override": {"invite": 0},
9495 },
9596 RoomCreationPreset.PUBLIC_CHAT: {
9697 "join_rules": JoinRules.PUBLIC,
97 "history_visibility": "shared",
98 "history_visibility": HistoryVisibility.SHARED,
9899 "original_invitees_have_ops": False,
99100 "guest_can_join": False,
100101 "power_level_content_override": {},
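This hunk replaces bare `"shared"` literals with `HistoryVisibility` constants. A sketch of the pattern: string-valued class attributes keep wire compatibility while making typos fail loudly at attribute access (values here are the Matrix history-visibility strings):

```python
# String-valued constants: a misspelled attribute raises AttributeError,
# whereas a misspelled literal would silently mismatch.
class HistoryVisibility:
    INVITED = "invited"
    JOINED = "joined"
    SHARED = "shared"
    WORLD_READABLE = "world_readable"

preset = {"history_visibility": HistoryVisibility.SHARED}
assert preset["history_visibility"] == "shared"
```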
357358 """
358359 user_id = requester.user.to_string()
359360
360 if not self.spam_checker.user_may_create_room(user_id):
361 if not await self.spam_checker.user_may_create_room(user_id):
361362 raise SynapseError(403, "You are not permitted to create rooms")
362363
363364 creation_content = {
439440 invite_list=[],
440441 initial_state=initial_state,
441442 creation_content=creation_content,
443 ratelimit=False,
442444 )
443445
444446 # Transfer membership events
607609 403, "You are not permitted to create rooms", Codes.FORBIDDEN
608610 )
609611
610 if not is_requester_admin and not self.spam_checker.user_may_create_room(
612 if not is_requester_admin and not await self.spam_checker.user_may_create_room(
611613 user_id
612614 ):
613615 raise SynapseError(403, "You are not permitted to create rooms")
734736 room_alias=room_alias,
735737 power_level_content_override=power_level_content_override,
736738 creator_join_profile=creator_join_profile,
739 ratelimit=ratelimit,
737740 )
738741
739742 if "name" in config:
837840 room_alias: Optional[RoomAlias] = None,
838841 power_level_content_override: Optional[JsonDict] = None,
839842 creator_join_profile: Optional[JsonDict] = None,
843 ratelimit: bool = True,
840844 ) -> int:
841845 """Sends the initial events into a new room.
842846
883887 creator.user,
884888 room_id,
885889 "join",
886 ratelimit=False,
890 ratelimit=ratelimit,
887891 content=creator_join_profile,
888892 )
889893
1414
1515 import logging
1616 from collections import namedtuple
17 from typing import Any, Dict, Optional
17 from typing import TYPE_CHECKING, Optional, Tuple
1818
1919 import msgpack
2020 from unpaddedbase64 import decode_base64, encode_base64
2121
22 from synapse.api.constants import EventTypes, JoinRules
22 from synapse.api.constants import EventTypes, HistoryVisibility, JoinRules
2323 from synapse.api.errors import Codes, HttpResponseException
24 from synapse.types import ThirdPartyInstanceID
24 from synapse.types import JsonDict, ThirdPartyInstanceID
2525 from synapse.util.caches.descriptors import cached
2626 from synapse.util.caches.response_cache import ResponseCache
2727
2828 from ._base import BaseHandler
2929
30 if TYPE_CHECKING:
31 from synapse.app.homeserver import HomeServer
32
3033 logger = logging.getLogger(__name__)
3134
3235 REMOTE_ROOM_LIST_POLL_INTERVAL = 60 * 1000
3639
3740
3841 class RoomListHandler(BaseHandler):
39 def __init__(self, hs):
42 def __init__(self, hs: "HomeServer"):
4043 super().__init__(hs)
4144 self.enable_room_list_search = hs.config.enable_room_list_search
42 self.response_cache = ResponseCache(hs, "room_list")
45 self.response_cache = ResponseCache(
46 hs, "room_list"
47 ) # type: ResponseCache[Tuple[Optional[int], Optional[str], ThirdPartyInstanceID]]
4348 self.remote_response_cache = ResponseCache(
4449 hs, "remote_room_list", timeout_ms=30 * 1000
45 )
50 ) # type: ResponseCache[Tuple[str, Optional[int], Optional[str], bool, Optional[str]]]
4651
4752 async def get_local_public_room_list(
4853 self,
49 limit=None,
50 since_token=None,
51 search_filter=None,
52 network_tuple=EMPTY_THIRD_PARTY_ID,
53 from_federation=False,
54 ):
54 limit: Optional[int] = None,
55 since_token: Optional[str] = None,
56 search_filter: Optional[dict] = None,
57 network_tuple: ThirdPartyInstanceID = EMPTY_THIRD_PARTY_ID,
58 from_federation: bool = False,
59 ) -> JsonDict:
5560 """Generate a local public room list.
5661
5762 There are multiple different lists: the main one plus one per third
5863 party network. A client can ask for a specific list or to return all.
5964
6065 Args:
61 limit (int|None)
62 since_token (str|None)
63 search_filter (dict|None)
64 network_tuple (ThirdPartyInstanceID): Which public list to use.
66 limit
67 since_token
68 search_filter
69 network_tuple: Which public list to use.
6570 This can be (None, None) to indicate the main list, or a particular
6671 appservice and network id to use an appservice specific one.
6772 Setting to None returns all public rooms across all lists.
68 from_federation (bool): true iff the request comes from the federation
69 API
73 from_federation: true iff the request comes from the federation API
7074 """
7175 if not self.enable_room_list_search:
7276 return {"chunk": [], "total_room_count_estimate": 0}
106110 self,
107111 limit: Optional[int] = None,
108112 since_token: Optional[str] = None,
109 search_filter: Optional[Dict] = None,
113 search_filter: Optional[dict] = None,
110114 network_tuple: ThirdPartyInstanceID = EMPTY_THIRD_PARTY_ID,
111115 from_federation: bool = False,
112 ) -> Dict[str, Any]:
116 ) -> JsonDict:
113117 """Generate a public room list.
114118 Args:
115119 limit: Maximum amount of rooms to return.
130134 if since_token:
131135 batch_token = RoomListNextBatch.from_token(since_token)
132136
133 bounds = (batch_token.last_joined_members, batch_token.last_room_id)
137 bounds = (
138 batch_token.last_joined_members,
139 batch_token.last_room_id,
140 ) # type: Optional[Tuple[int, str]]
134141 forwards = batch_token.direction_is_forward
142 has_batch_token = True
135143 else:
136 batch_token = None
137144 bounds = None
138145
139146 forwards = True
147 has_batch_token = False
140148
141149 # we request one more than wanted to see if there are more pages to come
142150 probing_limit = limit + 1 if limit is not None else None
158166 "canonical_alias": room["canonical_alias"],
159167 "num_joined_members": room["joined_members"],
160168 "avatar_url": room["avatar"],
161 "world_readable": room["history_visibility"] == "world_readable",
169 "world_readable": room["history_visibility"]
170 == HistoryVisibility.WORLD_READABLE,
162171 "guest_can_join": room["guest_access"] == "can_join",
163172 }
164173
167176
168177 results = [build_room_entry(r) for r in results]
169178
170 response = {}
179 response = {} # type: JsonDict
171180 num_results = len(results)
172181 if limit is not None:
173182 more_to_come = num_results == probing_limit
185194 initial_entry = results[0]
186195
187196 if forwards:
188 if batch_token:
197 if has_batch_token:
189198 # If there was a token given then we assume that there
190199 # must be previous results.
191200 response["prev_batch"] = RoomListNextBatch(
201210 direction_is_forward=True,
202211 ).to_token()
203212 else:
204 if batch_token:
213 if has_batch_token:
205214 response["next_batch"] = RoomListNextBatch(
206215 last_joined_members=final_entry["num_joined_members"],
207216 last_room_id=final_entry["room_id"],
291300 return None
292301
293302 # Return whether this room is open to federation users or not
294 create_event = current_state.get((EventTypes.Create, ""))
303 create_event = current_state[EventTypes.Create, ""]
295304 result["m.federate"] = create_event.content.get("m.federate", True)
296305
297306 name_event = current_state.get((EventTypes.Name, ""))
316325 visibility = None
317326 if visibility_event:
318327 visibility = visibility_event.content.get("history_visibility", None)
319 result["world_readable"] = visibility == "world_readable"
328 result["world_readable"] = visibility == HistoryVisibility.WORLD_READABLE
320329
321330 guest_event = current_state.get((EventTypes.GuestAccess, ""))
322331 guest = None
334343
335344 async def get_remote_public_room_list(
336345 self,
337 server_name,
338 limit=None,
339 since_token=None,
340 search_filter=None,
341 include_all_networks=False,
342 third_party_instance_id=None,
343 ):
346 server_name: str,
347 limit: Optional[int] = None,
348 since_token: Optional[str] = None,
349 search_filter: Optional[dict] = None,
350 include_all_networks: bool = False,
351 third_party_instance_id: Optional[str] = None,
352 ) -> JsonDict:
344353 if not self.enable_room_list_search:
345354 return {"chunk": [], "total_room_count_estimate": 0}
346355
397406
398407 async def _get_remote_list_cached(
399408 self,
400 server_name,
401 limit=None,
402 since_token=None,
403 search_filter=None,
404 include_all_networks=False,
405 third_party_instance_id=None,
406 ):
409 server_name: str,
410 limit: Optional[int] = None,
411 since_token: Optional[str] = None,
412 search_filter: Optional[dict] = None,
413 include_all_networks: bool = False,
414 third_party_instance_id: Optional[str] = None,
415 ) -> JsonDict:
407416 repl_layer = self.hs.get_federation_client()
408417 if search_filter:
409418 # We can't cache when asking for search
454463 REVERSE_KEY_DICT = {v: k for k, v in KEY_DICT.items()}
455464
456465 @classmethod
457 def from_token(cls, token):
466 def from_token(cls, token: str) -> "RoomListNextBatch":
458467 decoded = msgpack.loads(decode_base64(token), raw=False)
459468 return RoomListNextBatch(
460469 **{cls.REVERSE_KEY_DICT[key]: val for key, val in decoded.items()}
461470 )
462471
463 def to_token(self):
472 def to_token(self) -> str:
464473 return encode_base64(
465474 msgpack.dumps(
466475 {self.KEY_DICT[key]: val for key, val in self._asdict().items()}
467476 )
468477 )
469478
470 def copy_and_replace(self, **kwds):
479 def copy_and_replace(self, **kwds) -> "RoomListNextBatch":
471480 return self._replace(**kwds)
472481
473482
474 def _matches_room_entry(room_entry, search_filter):
483 def _matches_room_entry(room_entry: JsonDict, search_filter: dict) -> bool:
475484 if search_filter and search_filter.get("generic_search_term", None):
476485 generic_search_term = search_filter["generic_search_term"].upper()
477486 if generic_search_term in room_entry.get("name", "").upper():
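The `RoomListNextBatch` token scheme above (shortened keys, serialised, base64-encoded) can be sketched in a self-contained way. Synapse serialises with msgpack; this illustration uses `json` to stay dependency-free, and the field names mirror the handler's but the class is a stand-in, not Synapse's real API:

```python
import base64
import json
from typing import NamedTuple

# short wire keys keep the token compact, as in the real KEY_DICT
KEY_DICT = {
    "last_joined_members": "m",
    "last_room_id": "r",
    "direction_is_forward": "d",
}
REVERSE_KEY_DICT = {v: k for k, v in KEY_DICT.items()}


class NextBatch(NamedTuple):
    last_joined_members: int
    last_room_id: str
    direction_is_forward: bool


def to_token(batch: NextBatch) -> str:
    # shorten the keys, serialise, then base64 so the token is URL-safe
    short = {KEY_DICT[k]: v for k, v in batch._asdict().items()}
    return base64.urlsafe_b64encode(json.dumps(short).encode()).decode()


def from_token(token: str) -> NextBatch:
    decoded = json.loads(base64.urlsafe_b64decode(token))
    return NextBatch(**{REVERSE_KEY_DICT[k]: v for k, v in decoded.items()})


batch = NextBatch(42, "!abc:example.com", True)
assert from_token(to_token(batch)) == batch
```

The token is opaque to clients: they hand it back verbatim as `since_token`, and the server reconstructs the pagination bounds from it without keeping per-client state.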
202202
203203 # Only rate-limit if the user actually joined the room, otherwise we'll end
204204 # up blocking profile updates.
205 if newly_joined:
205 if newly_joined and ratelimit:
206206 time_now_s = self.clock.time()
207207 (
208208 allowed,
407407 )
408408 block_invite = True
409409
410 if not self.spam_checker.user_may_invite(
410 if not await self.spam_checker.user_may_invite(
411411 requester.user.to_string(), target.to_string(), room_id
412412 ):
413413 logger.info("Blocking invite due to spam checker")
487487 raise AuthError(403, "Guest access not allowed")
488488
489489 if not is_host_in_room:
490 time_now_s = self.clock.time()
491 (
492 allowed,
493 time_allowed,
494 ) = self._join_rate_limiter_remote.can_requester_do_action(requester,)
495
496 if not allowed:
497 raise LimitExceededError(
498 retry_after_ms=int(1000 * (time_allowed - time_now_s))
490 if ratelimit:
491 time_now_s = self.clock.time()
492 (
493 allowed,
494 time_allowed,
495 ) = self._join_rate_limiter_remote.can_requester_do_action(
496 requester,
499497 )
498
499 if not allowed:
500 raise LimitExceededError(
501 retry_after_ms=int(1000 * (time_allowed - time_now_s))
502 )
500503
501504 inviter = await self._get_inviter(target.to_string(), room_id)
502505 if inviter and not self.hs.is_mine(inviter):
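The hunks above gate the rate-limit checks behind `if ratelimit:` and use a `(allowed, time_allowed)` pair to build a `LimitExceededError`. A minimal, illustrative sketch of that check shape, assuming a simple decaying-count limiter (the class and method names mirror the handler, but the implementation here is a stand-in, not Synapse's real `Ratelimiter`):

```python
class LimitExceededError(Exception):
    def __init__(self, retry_after_ms: int):
        super().__init__("Too Many Requests")
        self.retry_after_ms = retry_after_ms


class Ratelimiter:
    def __init__(self, rate_hz: float, burst_count: int):
        self.rate_hz = rate_hz          # actions allowed per second, sustained
        self.burst_count = burst_count  # actions allowed in a burst
        self._actions = {}              # key -> (count, last_update_time)

    def can_do_action(self, key, time_now_s: float):
        count, last = self._actions.get(key, (0.0, time_now_s))
        # decay previously-recorded actions at `rate_hz` per second
        count = max(0.0, count - (time_now_s - last) * self.rate_hz)
        if count >= self.burst_count:
            # earliest time at which one slot will have decayed
            time_allowed = time_now_s + (count - self.burst_count + 1) / self.rate_hz
            return False, time_allowed
        self._actions[key] = (count + 1, time_now_s)
        return True, time_now_s


limiter = Ratelimiter(rate_hz=1.0, burst_count=2)
time_now_s = 0.0
allowed, time_allowed = limiter.can_do_action("@user:example.com", time_now_s)
if not allowed:
    raise LimitExceededError(retry_after_ms=int(1000 * (time_allowed - time_now_s)))
```

The `retry_after_ms` computation at the end matches the pattern in the diff: the difference between when the action will next be allowed and now, converted to milliseconds.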
3333 map_username_to_mxid_localpart,
3434 mxid_localpart_allowed_characters,
3535 )
36 from synapse.util.async_helpers import Linearizer
3736 from synapse.util.iterutils import chunk_seq
3837
3938 if TYPE_CHECKING:
5857 super().__init__(hs)
5958 self._saml_client = Saml2Client(hs.config.saml2_sp_config)
6059 self._saml_idp_entityid = hs.config.saml2_idp_entityid
61 self._auth_handler = hs.get_auth_handler()
62 self._registration_handler = hs.get_registration_handler()
6360
6461 self._saml2_session_lifetime = hs.config.saml2_session_lifetime
6562 self._grandfathered_mxid_source_attribute = (
7976
8077 # a map from saml session id to Saml2SessionData object
8178 self._outstanding_requests_dict = {} # type: Dict[str, Saml2SessionData]
82
83 # a lock on the mappings
84 self._mapping_lock = Linearizer(name="saml_mapping", clock=self.clock)
8579
8680 self._sso_handler = hs.get_sso_handler()
8781
166160 return
167161
168162 logger.debug("SAML2 response: %s", saml2_auth.origxml)
163
164 await self._handle_authn_response(request, saml2_auth, relay_state)
165
166 async def _handle_authn_response(
167 self,
168 request: SynapseRequest,
169 saml2_auth: saml2.response.AuthnResponse,
170 relay_state: str,
171 ) -> None:
172 """Handle an AuthnResponse, having parsed it from the request params
173
174 Assumes that the signature on the response object has been checked. Maps
175 the user onto an MXID, registering them if necessary, and returns a response
176 to the browser.
177
178 Args:
179 request: the incoming request from the browser. We'll respond to it with an
180 HTML page or a redirect
181 saml2_auth: the parsed AuthnResponse object
182 relay_state: the RelayState query param, which encodes the URI to redirect
183 back to
184 """
185
169186 for assertion in saml2_auth.assertions:
170187 # kibana limits the length of a log field, whereas this is all rather
171188 # useful, so split it up.
182199 saml2_auth.in_response_to, None
183200 )
184201
202 # first check if we're doing a UIA
203 if current_session and current_session.ui_auth_session_id:
204 try:
205 remote_user_id = self._remote_id_from_saml_response(saml2_auth, None)
206 except MappingException as e:
207 logger.exception("Failed to extract remote user id from SAML response")
208 self._sso_handler.render_error(request, "mapping_error", str(e))
209 return
210
211 return await self._sso_handler.complete_sso_ui_auth_request(
212 self._auth_provider_id,
213 remote_user_id,
214 current_session.ui_auth_session_id,
215 request,
216 )
217
218 # otherwise, we're handling a login request.
219
185220 # Ensure that the attributes of the logged in user meet the required
186221 # attributes.
187222 for requirement in self._saml2_attribute_requirements:
191226 )
192227 return
193228
194 # Pull out the user-agent and IP from the request.
195 user_agent = request.get_user_agent("")
196 ip_address = self.hs.get_ip_from_request(request)
197
198229 # Call the mapper to register/login the user
199230 try:
200 user_id = await self._map_saml_response_to_user(
201 saml2_auth, relay_state, user_agent, ip_address
202 )
231 await self._complete_saml_login(saml2_auth, request, relay_state)
203232 except MappingException as e:
204233 logger.exception("Could not map user")
205234 self._sso_handler.render_error(request, "mapping_error", str(e))
206 return
207
208 # Complete the interactive auth session or the login.
209 if current_session and current_session.ui_auth_session_id:
210 await self._auth_handler.complete_sso_ui_auth(
211 user_id, current_session.ui_auth_session_id, request
212 )
213
214 else:
215 await self._auth_handler.complete_sso_login(user_id, request, relay_state)
216
217 async def _map_saml_response_to_user(
235
236 async def _complete_saml_login(
218237 self,
219238 saml2_auth: saml2.response.AuthnResponse,
239 request: SynapseRequest,
220240 client_redirect_url: str,
221 user_agent: str,
222 ip_address: str,
223 ) -> str:
224 """
225 Given a SAML response, retrieve the user ID for it and possibly register the user.
241 ) -> None:
242 """
243 Given a SAML response, complete the login flow
244
245 Retrieves the remote user ID, registers the user if necessary, and serves
246 a redirect back to the client with a login-token.
226247
227248 Args:
228249 saml2_auth: The parsed SAML2 response.
250 request: The request to respond to
229251 client_redirect_url: The redirect URL passed in by the client.
230 user_agent: The user agent of the client making the request.
231 ip_address: The IP address of the client making the request.
232
233 Returns:
234 The user ID associated with this response.
235252
236253 Raises:
237254 MappingException if there was a problem mapping the response to a user.
238255 RedirectException: some mapping providers may raise this if they need
239256 to redirect to an interstitial page.
240257 """
241
242 remote_user_id = self._user_mapping_provider.get_remote_user_id(
258 remote_user_id = self._remote_id_from_saml_response(
243259 saml2_auth, client_redirect_url
244260 )
245
246 if not remote_user_id:
247 raise MappingException(
248 "Failed to extract remote user id from SAML response"
249 )
250261
251262 async def saml_response_to_remapped_user_attributes(
252263 failures: int,
293304
294305 return None
295306
296 with (await self._mapping_lock.queue(self._auth_provider_id)):
297 return await self._sso_handler.get_mxid_from_sso(
298 self._auth_provider_id,
299 remote_user_id,
300 user_agent,
301 ip_address,
302 saml_response_to_remapped_user_attributes,
303 grandfather_existing_users,
304 )
307 await self._sso_handler.complete_sso_login_request(
308 self._auth_provider_id,
309 remote_user_id,
310 request,
311 client_redirect_url,
312 saml_response_to_remapped_user_attributes,
313 grandfather_existing_users,
314 )
315
316 def _remote_id_from_saml_response(
317 self,
318 saml2_auth: saml2.response.AuthnResponse,
319 client_redirect_url: Optional[str],
320 ) -> str:
321 """Extract the unique remote id from a SAML2 AuthnResponse
322
323 Args:
324 saml2_auth: The parsed SAML2 response.
325 client_redirect_url: The redirect URL passed in by the client.
326 Returns:
327 remote user id
328
329 Raises:
330 MappingException if there was an error extracting the user id
331 """
332 # It's not obvious why we need to pass in the redirect URI to the mapping
333 # provider, but we do :/
334 remote_user_id = self._user_mapping_provider.get_remote_user_id(
335 saml2_auth, client_redirect_url
336 )
337
338 if not remote_user_id:
339 raise MappingException(
340 "Failed to extract remote user id from SAML response"
341 )
342
343 return remote_user_id
305344
306345 def expire_sessions(self):
307346 expire_before = self.clock.time_msec() - self._saml2_session_lifetime
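The `_remote_id_from_saml_response` helper factored out above is an extract-and-validate wrapper: ask the mapping provider for a remote id, and fail loudly with a `MappingException` if it returns nothing. A condensed sketch, with a stand-in provider (the `uid` attribute is purely illustrative, not part of any real SAML API):

```python
class MappingException(Exception):
    pass


class FakeMappingProvider:
    """Stand-in for a SAML user-mapping provider plugin."""

    def get_remote_user_id(self, saml2_auth, client_redirect_url):
        # a real provider would read an attribute from the AuthnResponse
        return getattr(saml2_auth, "uid", None)


def remote_id_from_saml_response(provider, saml2_auth, client_redirect_url=None) -> str:
    remote_user_id = provider.get_remote_user_id(saml2_auth, client_redirect_url)
    if not remote_user_id:
        raise MappingException("Failed to extract remote user id from SAML response")
    return remote_user_id
```

Centralising the null-check means both callers in the diff (the UIA branch and the login branch) get the same error behaviour without duplicating it.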
1212 # See the License for the specific language governing permissions and
1313 # limitations under the License.
1414 import logging
15 from typing import TYPE_CHECKING, Awaitable, Callable, List, Optional
15 from typing import TYPE_CHECKING, Awaitable, Callable, Dict, List, Optional
1616
1717 import attr
18
19 from synapse.api.errors import RedirectException
20 from synapse.handlers._base import BaseHandler
18 from typing_extensions import NoReturn
19
20 from twisted.web.http import Request
21
22 from synapse.api.errors import RedirectException, SynapseError
2123 from synapse.http.server import respond_with_html
22 from synapse.types import UserID, contains_invalid_mxid_characters
24 from synapse.http.site import SynapseRequest
25 from synapse.types import JsonDict, UserID, contains_invalid_mxid_characters
26 from synapse.util.async_helpers import Linearizer
27 from synapse.util.stringutils import random_string
2328
2429 if TYPE_CHECKING:
2530 from synapse.server import HomeServer
3641
3742 @attr.s
3843 class UserAttributes:
39 localpart = attr.ib(type=str)
44 # the localpart of the mxid that the mapper has assigned to the user.
45 # if `None`, the mapper has not picked a userid, and the user should be prompted to
46 # enter one.
47 localpart = attr.ib(type=Optional[str])
4048 display_name = attr.ib(type=Optional[str], default=None)
4149 emails = attr.ib(type=List[str], default=attr.Factory(list))
4250
4351
44 class SsoHandler(BaseHandler):
52 @attr.s(slots=True)
53 class UsernameMappingSession:
54 """Data we track about SSO sessions"""
55
56 # A unique identifier for this SSO provider, e.g. "oidc" or "saml".
57 auth_provider_id = attr.ib(type=str)
58
59 # user ID on the IdP server
60 remote_user_id = attr.ib(type=str)
61
62 # attributes returned by the ID mapper
63 display_name = attr.ib(type=Optional[str])
64 emails = attr.ib(type=List[str])
65
66 # An optional dictionary of extra attributes to be provided to the client in the
67 # login response.
68 extra_login_attributes = attr.ib(type=Optional[JsonDict])
69
70 # where to redirect the client back to
71 client_redirect_url = attr.ib(type=str)
72
73 # expiry time for the session, in milliseconds
74 expiry_time_ms = attr.ib(type=int)
75
76
77 # the HTTP cookie used to track the mapping session id
78 USERNAME_MAPPING_SESSION_COOKIE_NAME = b"username_mapping_session"
79
80
81 class SsoHandler:
4582 # The number of attempts to ask the mapping provider for when generating an MXID.
4683 _MAP_USERNAME_RETRIES = 1000
4784
85 # the time a UsernameMappingSession remains valid for
86 _MAPPING_SESSION_VALIDITY_PERIOD_MS = 15 * 60 * 1000
87
4888 def __init__(self, hs: "HomeServer"):
49 super().__init__(hs)
89 self._clock = hs.get_clock()
90 self._store = hs.get_datastore()
91 self._server_name = hs.hostname
5092 self._registration_handler = hs.get_registration_handler()
5193 self._error_template = hs.config.sso_error_template
94 self._auth_handler = hs.get_auth_handler()
95
96 # a lock on the mappings
97 self._mapping_lock = Linearizer(name="sso_user_mapping", clock=hs.get_clock())
98
99 # a map from session id to session data
100 self._username_mapping_sessions = {} # type: Dict[str, UsernameMappingSession]
52101
53102 def render_error(
54 self, request, error: str, error_description: Optional[str] = None
103 self,
104 request: Request,
105 error: str,
106 error_description: Optional[str] = None,
107 code: int = 400,
55108 ) -> None:
56109 """Renders the error template and responds with it.
57110
63116 We'll respond with an HTML page describing the error.
64117 error: A technical identifier for this error.
65118 error_description: A human-readable description of the error.
119 code: The integer error code (an HTTP response code)
66120 """
67121 html = self._error_template.render(
68122 error=error, error_description=error_description
69123 )
70 respond_with_html(request, 400, html)
124 respond_with_html(request, code, html)
71125
72126 async def get_sso_user_by_remote_user_id(
73127 self, auth_provider_id: str, remote_user_id: str
94148 )
95149
96150 # Check if we already have a mapping for this user.
97 previously_registered_user_id = await self.store.get_user_by_external_id(
151 previously_registered_user_id = await self._store.get_user_by_external_id(
98152 auth_provider_id, remote_user_id,
99153 )
100154
111165 # No match.
112166 return None
113167
114 async def get_mxid_from_sso(
168 async def complete_sso_login_request(
115169 self,
116170 auth_provider_id: str,
117171 remote_user_id: str,
118 user_agent: str,
119 ip_address: str,
172 request: SynapseRequest,
173 client_redirect_url: str,
120174 sso_to_matrix_id_mapper: Callable[[int], Awaitable[UserAttributes]],
121 grandfather_existing_users: Optional[Callable[[], Awaitable[Optional[str]]]],
122 ) -> str:
175 grandfather_existing_users: Callable[[], Awaitable[Optional[str]]],
176 extra_login_attributes: Optional[JsonDict] = None,
177 ) -> None:
123178 """
124179 Given an SSO ID, retrieve the user ID for it and possibly register the user.
125180
138193 given user-agent and IP address and the SSO ID is linked to this matrix
139194 ID for subsequent calls.
140195
196 Finally, we generate a redirect to the supplied redirect uri, with a login token
197
141198 Args:
142199 auth_provider_id: A unique identifier for this SSO provider, e.g.
143200 "oidc" or "saml".
201
144202 remote_user_id: The unique identifier from the SSO provider.
145 user_agent: The user agent of the client making the request.
146 ip_address: The IP address of the client making the request.
203
204 request: The request to respond to
205
206 client_redirect_url: The redirect URL passed in by the client.
207
147208 sso_to_matrix_id_mapper: A callable to generate the user attributes.
148209 The only parameter is an integer which represents the amount of
149210 times the returned mxid localpart mapping has failed.
155216 to the user.
156217 RedirectException to redirect to an additional page (e.g.
157218 to prompt the user for more information).
219
158220 grandfather_existing_users: A callable which can return a previously
159221 existing matrix ID. The SSO ID is then linked to the returned
160222 matrix ID.
161223
162 Returns:
163 The user ID associated with the SSO response.
224 extra_login_attributes: An optional dictionary of extra
225 attributes to be provided to the client in the login response.
164226
165227 Raises:
166228 MappingException if there was a problem mapping the response to a user.
168230 to an additional page. (e.g. to prompt for more information)
169231
170232 """
171 # first of all, check if we already have a mapping for this user
172 previously_registered_user_id = await self.get_sso_user_by_remote_user_id(
173 auth_provider_id, remote_user_id,
174 )
175 if previously_registered_user_id:
176 return previously_registered_user_id
177
178 # Check for grandfathering of users.
179 if grandfather_existing_users:
180 previously_registered_user_id = await grandfather_existing_users()
181 if previously_registered_user_id:
182 # Future logins should also match this user ID.
183 await self.store.record_user_external_id(
184 auth_provider_id, remote_user_id, previously_registered_user_id
233 # grab a lock while we try to find a mapping for this user. This seems...
234 # optimistic, especially for implementations that end up redirecting to
235 # interstitial pages.
236 with await self._mapping_lock.queue(auth_provider_id):
237 # first of all, check if we already have a mapping for this user
238 user_id = await self.get_sso_user_by_remote_user_id(
239 auth_provider_id, remote_user_id,
240 )
241
242 # Check for grandfathering of users.
243 if not user_id:
244 user_id = await grandfather_existing_users()
245 if user_id:
246 # Future logins should also match this user ID.
247 await self._store.record_user_external_id(
248 auth_provider_id, remote_user_id, user_id
249 )
250
251 # Otherwise, generate a new user.
252 if not user_id:
253 attributes = await self._call_attribute_mapper(sso_to_matrix_id_mapper)
254
255 if attributes.localpart is None:
256 # the mapper doesn't return a username. bail out with a redirect to
257 # the username picker.
258 await self._redirect_to_username_picker(
259 auth_provider_id,
260 remote_user_id,
261 attributes,
262 client_redirect_url,
263 extra_login_attributes,
264 )
265
266 user_id = await self._register_mapped_user(
267 attributes,
268 auth_provider_id,
269 remote_user_id,
270 request.get_user_agent(""),
271 request.getClientIP(),
185272 )
186 return previously_registered_user_id
187
188 # Otherwise, generate a new user.
273
274 await self._auth_handler.complete_sso_login(
275 user_id, request, client_redirect_url, extra_login_attributes
276 )
277
278 async def _call_attribute_mapper(
279 self, sso_to_matrix_id_mapper: Callable[[int], Awaitable[UserAttributes]],
280 ) -> UserAttributes:
281 """Call the attribute mapper function in a loop, until we get a unique userid"""
189282 for i in range(self._MAP_USERNAME_RETRIES):
190283 try:
191284 attributes = await sso_to_matrix_id_mapper(i)
207300 )
208301
209302 if not attributes.localpart:
210 raise MappingException(
211 "Error parsing SSO response: SSO mapping provider plugin "
212 "did not return a localpart value"
213 )
303 # the mapper has not picked a localpart
304 return attributes
214305
215306 # Check if this mxid already exists
216 user_id = UserID(attributes.localpart, self.server_name).to_string()
217 if not await self.store.get_users_by_id_case_insensitive(user_id):
307 user_id = UserID(attributes.localpart, self._server_name).to_string()
308 if not await self._store.get_users_by_id_case_insensitive(user_id):
218309 # This mxid is free
219310 break
220311 else:
223314 raise MappingException(
224315 "Unable to generate a Matrix ID from the SSO response"
225316 )
317 return attributes
318
319 async def _redirect_to_username_picker(
320 self,
321 auth_provider_id: str,
322 remote_user_id: str,
323 attributes: UserAttributes,
324 client_redirect_url: str,
325 extra_login_attributes: Optional[JsonDict],
326 ) -> NoReturn:
327 """Creates a UsernameMappingSession and redirects the browser
328
329 Called if the user mapping provider doesn't return a localpart for a new user.
330 Raises a RedirectException which redirects the browser to the username picker.
331
332 Args:
333 auth_provider_id: A unique identifier for this SSO provider, e.g.
334 "oidc" or "saml".
335
336 remote_user_id: The unique identifier from the SSO provider.
337
338 attributes: the user attributes returned by the user mapping provider.
339
340 client_redirect_url: The redirect URL passed in by the client, which we
341 will eventually redirect back to.
342
343 extra_login_attributes: An optional dictionary of extra
344 attributes to be provided to the client in the login response.
345
346 Raises:
347 RedirectException
348 """
349 session_id = random_string(16)
350 now = self._clock.time_msec()
351 session = UsernameMappingSession(
352 auth_provider_id=auth_provider_id,
353 remote_user_id=remote_user_id,
354 display_name=attributes.display_name,
355 emails=attributes.emails,
356 client_redirect_url=client_redirect_url,
357 expiry_time_ms=now + self._MAPPING_SESSION_VALIDITY_PERIOD_MS,
358 extra_login_attributes=extra_login_attributes,
359 )
360
361 self._username_mapping_sessions[session_id] = session
362 logger.info("Recorded registration session id %s", session_id)
363
364 # Set the cookie and redirect to the username picker
365 e = RedirectException(b"/_synapse/client/pick_username")
366 e.cookies.append(
367 b"%s=%s; path=/"
368 % (USERNAME_MAPPING_SESSION_COOKIE_NAME, session_id.encode("ascii"))
369 )
370 raise e
371
372 async def _register_mapped_user(
373 self,
374 attributes: UserAttributes,
375 auth_provider_id: str,
376 remote_user_id: str,
377 user_agent: str,
378 ip_address: str,
379 ) -> str:
380 """Register a new SSO user.
381
382 This is called once we have successfully mapped the remote user id onto a local
383 user id, one way or another.
384
385 Args:
386 attributes: user attributes returned by the user mapping provider,
387 including a non-empty localpart.
388
389 auth_provider_id: A unique identifier for this SSO provider, e.g.
390 "oidc" or "saml".
391
392 remote_user_id: The unique identifier from the SSO provider.
393
394 user_agent: The user-agent in the HTTP request (used for potential
395 shadow-banning.)
396
397 ip_address: The IP address of the requester (used for potential
398 shadow-banning.)
399
400 Raises:
401 a MappingException if the localpart is invalid.
402
403 a SynapseError with code 400 and errcode Codes.USER_IN_USE if the localpart
404 is already taken.
405 """
226406
227407 # Since the localpart is provided via a potentially untrusted module,
228408 # ensure the MXID is valid before registering.
229 if contains_invalid_mxid_characters(attributes.localpart):
409 if not attributes.localpart or contains_invalid_mxid_characters(
410 attributes.localpart
411 ):
230412 raise MappingException("localpart is invalid: %s" % (attributes.localpart,))
231413
232414 logger.debug("Mapped SSO user to local part %s", attributes.localpart)
237419 user_agent_ips=[(user_agent, ip_address)],
238420 )
239421
240 await self.store.record_user_external_id(
422 await self._store.record_user_external_id(
241423 auth_provider_id, remote_user_id, registered_user_id
242424 )
243425 return registered_user_id
426
427 async def complete_sso_ui_auth_request(
428 self,
429 auth_provider_id: str,
430 remote_user_id: str,
431 ui_auth_session_id: str,
432 request: Request,
433 ) -> None:
434 """
435 Given an SSO ID, retrieve the user ID for it and complete UIA.
436
437 Note that this requires that the user is mapped in the "user_external_ids"
438 table. This will be the case if they have ever logged in via SAML or OIDC in
439 recentish synapse versions, but may not be for older users.
440
441 Args:
442 auth_provider_id: A unique identifier for this SSO provider, e.g.
443 "oidc" or "saml".
444 remote_user_id: The unique identifier from the SSO provider.
445 ui_auth_session_id: The ID of the user-interactive auth session.
446 request: The request to complete.
447 """
448
449 user_id = await self.get_sso_user_by_remote_user_id(
450 auth_provider_id, remote_user_id,
451 )
452
453 if not user_id:
454 logger.warning(
455 "Remote user %s/%s has not previously logged in here: UIA will fail",
456 auth_provider_id,
457 remote_user_id,
458 )
459 # Let the UIA flow handle this the same as if they presented creds for a
460 # different user.
461 user_id = ""
462
463 await self._auth_handler.complete_sso_ui_auth(
464 user_id, ui_auth_session_id, request
465 )
466
467 async def check_username_availability(
468 self, localpart: str, session_id: str,
469 ) -> bool:
470 """Handle an "is username available" callback check
471
472 Args:
473 localpart: desired localpart
474 session_id: the session id for the username picker
475 Returns:
476 True if the username is available
477 Raises:
478 SynapseError if the localpart is invalid or the session is unknown
479 """
480
481 # make sure that there is a valid mapping session, to stop people dictionary-
482 # scanning for accounts
483
484 self._expire_old_sessions()
485 session = self._username_mapping_sessions.get(session_id)
486 if not session:
487 logger.info("Couldn't find session id %s", session_id)
488 raise SynapseError(400, "unknown session")
489
490 logger.info(
491 "[session %s] Checking for availability of username %s",
492 session_id,
493 localpart,
494 )
495
496 if contains_invalid_mxid_characters(localpart):
497 raise SynapseError(400, "localpart is invalid: %s" % (localpart,))
498 user_id = UserID(localpart, self._server_name).to_string()
499 user_infos = await self._store.get_users_by_id_case_insensitive(user_id)
500
501 logger.info("[session %s] users: %s", session_id, user_infos)
502 return not user_infos
503
504 async def handle_submit_username_request(
505 self, request: SynapseRequest, localpart: str, session_id: str
506 ) -> None:
507 """Handle a request to the username-picker 'submit' endpoint
508
509 Will serve an HTTP response to the request.
510
511 Args:
512 request: HTTP request
513 localpart: localpart requested by the user
514 session_id: ID of the username mapping session, extracted from a cookie
515 """
516 self._expire_old_sessions()
517 session = self._username_mapping_sessions.get(session_id)
518 if not session:
519 logger.info("Couldn't find session id %s", session_id)
520 raise SynapseError(400, "unknown session")
521
522 logger.info("[session %s] Registering localpart %s", session_id, localpart)
523
524 attributes = UserAttributes(
525 localpart=localpart,
526 display_name=session.display_name,
527 emails=session.emails,
528 )
529
530 # the following will raise a 400 error if the username has been taken in the
531 # meantime.
532 user_id = await self._register_mapped_user(
533 attributes,
534 session.auth_provider_id,
535 session.remote_user_id,
536 request.get_user_agent(""),
537 request.getClientIP(),
538 )
539
540 logger.info("[session %s] Registered userid %s", session_id, user_id)
541
542 # delete the mapping session and the cookie
543 del self._username_mapping_sessions[session_id]
544
545 # delete the cookie
546 request.addCookie(
547 USERNAME_MAPPING_SESSION_COOKIE_NAME,
548 b"",
549 expires=b"Thu, 01 Jan 1970 00:00:00 GMT",
550 path=b"/",
551 )
552
553 await self._auth_handler.complete_sso_login(
554 user_id,
555 request,
556 session.client_redirect_url,
557 session.extra_login_attributes,
558 )
559
560 def _expire_old_sessions(self):
561 to_expire = []
562 now = int(self._clock.time_msec())
563
564 for session_id, session in self._username_mapping_sessions.items():
565 if session.expiry_time_ms <= now:
566 to_expire.append(session_id)
567
568 for session_id in to_expire:
569 logger.info("Expiring mapping session %s", session_id)
570 del self._username_mapping_sessions[session_id]
553553 event.event_id, state_filter=state_filter
554554 )
555555 if event.is_state():
556 state_ids = state_ids.copy()
556 state_ids = dict(state_ids)
557557 state_ids[(event.type, event.state_key)] = event.event_id
558558 return state_ids
559559
1313 # limitations under the License.
1414
1515 import logging
16 from typing import TYPE_CHECKING, Any, Dict, List, Optional
1617
1718 import synapse.metrics
18 from synapse.api.constants import EventTypes, JoinRules, Membership
19 from synapse.api.constants import EventTypes, HistoryVisibility, JoinRules, Membership
1920 from synapse.handlers.state_deltas import StateDeltasHandler
2021 from synapse.metrics.background_process_metrics import run_as_background_process
2122 from synapse.storage.roommember import ProfileInfo
23 from synapse.types import JsonDict
2224 from synapse.util.metrics import Measure
25
26 if TYPE_CHECKING:
27 from synapse.app.homeserver import HomeServer
2328
2429 logger = logging.getLogger(__name__)
2530
3540 be in the directory or not when necessary.
3641 """
3742
38 def __init__(self, hs):
43 def __init__(self, hs: "HomeServer"):
3944 super().__init__(hs)
4045
4146 self.store = hs.get_datastore()
4853 self.search_all_users = hs.config.user_directory_search_all_users
4954 self.spam_checker = hs.get_spam_checker()
5055 # The current position in the current_state_delta stream
51 self.pos = None
56 self.pos = None # type: Optional[int]
5257
5358 # Guard to ensure we only process deltas one at a time
5459 self._is_processing = False
6065 # we start populating the user directory
6166 self.clock.call_later(0, self.notify_new_event)
6267
63 async def search_users(self, user_id, search_term, limit):
68 async def search_users(
69 self, user_id: str, search_term: str, limit: int
70 ) -> JsonDict:
6471 """Searches for users in directory
6572
6673 Returns:
8087 results = await self.store.search_user_dir(user_id, search_term, limit)
8188
8289 # Remove any spammy users from the results.
83 results["results"] = [
84 user
85 for user in results["results"]
86 if not self.spam_checker.check_username_for_spam(user)
87 ]
90 non_spammy_users = []
91 for user in results["results"]:
92 if not await self.spam_checker.check_username_for_spam(user):
93 non_spammy_users.append(user)
94 results["results"] = non_spammy_users
8895
8996 return results
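The hunk above replaces a synchronous list comprehension with an explicit loop because `check_username_for_spam` became an async callback, which cannot be awaited inside a plain comprehension. A minimal, self-contained illustration of the same transformation (the spam rule here is a stand-in):

```python
import asyncio


async def check_username_for_spam(user: dict) -> bool:
    await asyncio.sleep(0)  # stands in for an awaited spam-checker module call
    return "spam" in user["user_id"]


async def filter_spammy_users(results: dict) -> dict:
    # the comprehension becomes an explicit loop so each check can be awaited
    non_spammy_users = []
    for user in results["results"]:
        if not await check_username_for_spam(user):
            non_spammy_users.append(user)
    results["results"] = non_spammy_users
    return results


out = asyncio.run(
    filter_spammy_users(
        {"results": [{"user_id": "@spammer:x"}, {"user_id": "@alice:x"}]}
    )
)
assert out["results"] == [{"user_id": "@alice:x"}]
```

(An async comprehension `[u async for u in ...]` only works over async iterators; awaiting a coroutine per element, as here, needs either the explicit loop or `asyncio.gather`.)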
9097
91 def notify_new_event(self):
98 def notify_new_event(self) -> None:
9299 """Called when there may be more deltas to process
93100 """
94101 if not self.update_user_directory:
106113 self._is_processing = True
107114 run_as_background_process("user_directory.notify_new_event", process)
108115
109 async def handle_local_profile_change(self, user_id, profile):
116 async def handle_local_profile_change(
117 self, user_id: str, profile: ProfileInfo
118 ) -> None:
110119 """Called to update index of our local user profiles when they change
111120 irrespective of any rooms the user may be in.
112121 """
113122 # FIXME(#3714): We should probably do this in the same worker as all
114123 # the other changes.
124
125 # Support users are for diagnostics and should not appear in the user directory.
115126 is_support = await self.store.is_support_user(user_id)
116 # Support users are for diagnostics and should not appear in the user directory.
117 if not is_support:
127 # When the profile information of a deactivated user changes, they should not appear in the user directory.
128 is_deactivated = await self.store.get_user_deactivated_status(user_id)
129
130 if not (is_support or is_deactivated):
118131 await self.store.update_profile_in_user_dir(
119132 user_id, profile.display_name, profile.avatar_url
120133 )
121134
122 async def handle_user_deactivated(self, user_id):
135 async def handle_user_deactivated(self, user_id: str) -> None:
123136 """Called when a user ID is deactivated
124137 """
125138 # FIXME(#3714): We should probably do this in the same worker as all
126139 # the other changes.
127140 await self.store.remove_from_user_dir(user_id)
128141
129 async def _unsafe_process(self):
142 async def _unsafe_process(self) -> None:
130143 # If self.pos is None then means we haven't fetched it from DB
131144 if self.pos is None:
132145 self.pos = await self.store.get_user_directory_stream_pos()
161174
162175 await self.store.update_user_directory_stream_pos(max_pos)
163176
164 async def _handle_deltas(self, deltas):
177 async def _handle_deltas(self, deltas: List[Dict[str, Any]]) -> None:
165178 """Called with the state deltas to process
166179 """
167180 for delta in deltas:
231244 logger.debug("Ignoring irrelevant type: %r", typ)
232245
233246 async def _handle_room_publicity_change(
234 self, room_id, prev_event_id, event_id, typ
235 ):
247 self,
248 room_id: str,
249 prev_event_id: Optional[str],
250 event_id: Optional[str],
251 typ: str,
252 ) -> None:
236253 """Handle a room having potentially changed from/to world_readable/publicly
237254 joinable.
238255
239256 Args:
240 room_id (str)
241 prev_event_id (str|None): The previous event before the state change
242 event_id (str|None): The new event after the state change
243 typ (str): Type of the event
257 room_id: The ID of the room which changed.
258 prev_event_id: The previous event before the state change
259 event_id: The new event after the state change
260 typ: Type of the event
244261 """
245262 logger.debug("Handling change for %s: %s", typ, room_id)
246263
249266 prev_event_id,
250267 event_id,
251268 key_name="history_visibility",
252 public_value="world_readable",
269 public_value=HistoryVisibility.WORLD_READABLE,
253270 )
254271 elif typ == EventTypes.JoinRules:
255272 change = await self._get_key_change(
298315 for user_id, profile in users_with_profile.items():
299316 await self._handle_new_user(room_id, user_id, profile)
300317
301 async def _handle_new_user(self, room_id, user_id, profile):
318 async def _handle_new_user(
319 self, room_id: str, user_id: str, profile: ProfileInfo
320 ) -> None:
302321 """Called when we might need to add user to directory
303322
304323 Args:
305 room_id (str): room_id that user joined or started being public
306 user_id (str)
324 room_id: The ID of the room that the user joined or that started being public
325 user_id: The ID of the user
307326 """
308327 logger.debug("Adding new user to dir, %r", user_id)
309328
351370 if to_insert:
352371 await self.store.add_users_who_share_private_room(room_id, to_insert)
353372
354 async def _handle_remove_user(self, room_id, user_id):
373 async def _handle_remove_user(self, room_id: str, user_id: str) -> None:
355374 """Called when we might need to remove user from directory
356375
357376 Args:
358 room_id (str): room_id that user left or stopped being public that
359 user_id (str)
377 room_id: The ID of the room that the user left or that stopped being public
378 user_id: The ID of the user
360379 """
361380 logger.debug("Removing user %r", user_id)
362381
369388 if len(rooms_user_is_in) == 0:
370389 await self.store.remove_from_user_dir(user_id)
371390
372 async def _handle_profile_change(self, user_id, room_id, prev_event_id, event_id):
391 async def _handle_profile_change(
392 self,
393 user_id: str,
394 room_id: str,
395 prev_event_id: Optional[str],
396 event_id: Optional[str],
397 ) -> None:
373398 """Check member event changes for any profile changes and update the
374399 database if there are.
375400 """
124124 return _scheduler
125125
126126
127 class IPBlacklistingResolver:
127 class _IPBlacklistingResolver:
128128 """
129129 A proxy for reactor.nameResolver which only produces non-blacklisted IP
130130 addresses, preventing DNS rebinding attacks on URL preview.
196196 )
197197
198198 return r
199
200
201 @implementer(IReactorPluggableNameResolver)
202 class BlacklistingReactorWrapper:
203 """
204 A Reactor wrapper which will prevent DNS resolution to blacklisted IP
205 addresses, to prevent DNS rebinding.
206 """
207
208 def __init__(
209 self,
210 reactor: IReactorPluggableNameResolver,
211 ip_whitelist: Optional[IPSet],
212 ip_blacklist: IPSet,
213 ):
214 self._reactor = reactor
215
216 # We need to use a DNS resolver which filters out blacklisted IP
217 # addresses, to prevent DNS rebinding.
218 self._nameResolver = _IPBlacklistingResolver(
219 self._reactor, ip_whitelist, ip_blacklist
220 )
221
222 def __getattr__(self, attr: str) -> Any:
223 # Passthrough to the real reactor except for the DNS resolver.
224 if attr == "nameResolver":
225 return self._nameResolver
226 else:
227 return getattr(self._reactor, attr)
199228
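The new `BlacklistingReactorWrapper` relies on the fact that Python only invokes `__getattr__` when normal attribute lookup fails, so overriding one attribute and delegating the rest is enough to proxy a whole reactor. A minimal sketch of that delegation pattern, using illustrative names rather than Synapse's:

```python
class OverridingProxy:
    """Delegate every attribute to a wrapped object, except one override."""

    def __init__(self, wrapped, name_resolver):
        self._wrapped = wrapped
        self._name_resolver = name_resolver

    def __getattr__(self, attr):
        # __getattr__ only fires when normal lookup fails, so the
        # instance attributes set in __init__ are served directly.
        if attr == "nameResolver":
            return self._name_resolver
        return getattr(self._wrapped, attr)


class FakeReactor:
    nameResolver = "unfiltered-resolver"

    @staticmethod
    def seconds():
        return 42


proxy = OverridingProxy(FakeReactor(), "filtering-resolver")
print(proxy.nameResolver)  # → filtering-resolver
print(proxy.seconds())     # → 42
```

Synapse's real wrapper additionally declares `@implementer(IReactorPluggableNameResolver)` so Twisted accepts it anywhere a reactor is expected.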
200229
201230 class BlacklistingAgentWrapper(Agent):
291320 self.user_agent = self.user_agent.encode("ascii")
292321
293322 if self._ip_blacklist:
294 real_reactor = hs.get_reactor()
295323 # If we have an IP blacklist, we need to use a DNS resolver which
296324 # filters out blacklisted IP addresses, to prevent DNS rebinding.
297 nameResolver = IPBlacklistingResolver(
298 real_reactor, self._ip_whitelist, self._ip_blacklist
325 self.reactor = BlacklistingReactorWrapper(
326 hs.get_reactor(), self._ip_whitelist, self._ip_blacklist
299327 )
300
301 @implementer(IReactorPluggableNameResolver)
302 class Reactor:
303 def __getattr__(_self, attr):
304 if attr == "nameResolver":
305 return nameResolver
306 else:
307 return getattr(real_reactor, attr)
308
309 self.reactor = Reactor()
310328 else:
311329 self.reactor = hs.get_reactor()
312330
322340
323341 self.agent = ProxyAgent(
324342 self.reactor,
343 hs.get_reactor(),
325344 connectTimeout=15,
326345 contextFactory=self.hs.get_http_client_context_factory(),
327346 pool=pool,
701720
702721 try:
703722 length = await make_deferred_yieldable(
704 readBodyToFile(response, output_stream, max_size)
723 read_body_with_max_size(response, output_stream, max_size)
705724 )
706 except SynapseError:
707 # This can happen e.g. because the body is too large.
708 raise
725 except BodyExceededMaxSize:
726 raise SynapseError(
727 502,
728 "Requested file is too large > %r bytes" % (max_size,),
729 Codes.TOO_LARGE,
730 )
709731 except Exception as e:
710732 raise SynapseError(502, ("Failed to download remote body: %s" % e)) from e
711733
729751 return f
730752
731753
732 class _ReadBodyToFileProtocol(protocol.Protocol):
754 class BodyExceededMaxSize(Exception):
755 """The maximum allowed size of the HTTP body was exceeded."""
756
757
758 class _ReadBodyWithMaxSizeProtocol(protocol.Protocol):
733759 def __init__(
734760 self, stream: BinaryIO, deferred: defer.Deferred, max_size: Optional[int]
735761 ):
742768 self.stream.write(data)
743769 self.length += len(data)
744770 if self.max_size is not None and self.length >= self.max_size:
745 self.deferred.errback(
746 SynapseError(
747 502,
748 "Requested file is too large > %r bytes" % (self.max_size,),
749 Codes.TOO_LARGE,
750 )
751 )
771 self.deferred.errback(BodyExceededMaxSize())
752772 self.deferred = defer.Deferred()
753773 self.transport.loseConnection()
754774
763783 self.deferred.errback(reason)
764784
765785
766 def readBodyToFile(
786 def read_body_with_max_size(
767787 response: IResponse, stream: BinaryIO, max_size: Optional[int]
768788 ) -> defer.Deferred:
769789 """
769790 Read an HTTP response body into a file-object, optionally enforcing a maximum file size.
791
792 If the maximum file size is reached, the returned Deferred will resolve to a
793 Failure with a BodyExceededMaxSize exception.
771794
772795 Args:
773796 response: The HTTP response to read from.
779802 """
780803
781804 d = defer.Deferred()
782 response.deliverBody(_ReadBodyToFileProtocol(stream, d, max_size))
805 response.deliverBody(_ReadBodyWithMaxSizeProtocol(stream, d, max_size))
783806 return d
784807
785808
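`read_body_with_max_size` feeds the body through `_ReadBodyWithMaxSizeProtocol` and errbacks with `BodyExceededMaxSize` once the cap is crossed. The cap-while-copying idea, reduced to a synchronous sketch over plain file-like objects (the names here are stand-ins, not the Twisted implementation):

```python
import io


class BodyExceededMaxSize(Exception):
    """Raised once more than the allowed number of bytes has been seen."""


def copy_with_max_size(source, sink, max_size):
    """Copy `source` into `sink`; return bytes copied, or raise on overflow."""
    length = 0
    while True:
        chunk = source.read(4096)
        if not chunk:
            return length
        sink.write(chunk)
        length += len(chunk)
        # Like the protocol, check after writing: the cap bounds how much
        # we will accept, not the exact bytes written before detection.
        if max_size is not None and length >= max_size:
            raise BodyExceededMaxSize()


print(copy_with_max_size(io.BytesIO(b"abc"), io.BytesIO(), 100))  # → 3
```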
1515 import urllib.parse
1616 from typing import List, Optional
1717
18 from netaddr import AddrFormatError, IPAddress
18 from netaddr import AddrFormatError, IPAddress, IPSet
1919 from zope.interface import implementer
2020
2121 from twisted.internet import defer
3030 from twisted.web.iweb import IAgent, IAgentEndpointFactory, IBodyProducer
3131
3232 from synapse.crypto.context_factory import FederationPolicyForHTTPS
33 from synapse.http.client import BlacklistingAgentWrapper
3334 from synapse.http.federation.srv_resolver import Server, SrvResolver
3435 from synapse.http.federation.well_known_resolver import WellKnownResolver
3536 from synapse.logging.context import make_deferred_yieldable, run_in_background
6970 reactor: IReactorCore,
7071 tls_client_options_factory: Optional[FederationPolicyForHTTPS],
7172 user_agent: bytes,
73 ip_blacklist: IPSet,
7274 _srv_resolver: Optional[SrvResolver] = None,
7375 _well_known_resolver: Optional[WellKnownResolver] = None,
7476 ):
8991 self.user_agent = user_agent
9092
9193 if _well_known_resolver is None:
94 # Note that the name resolver has already been wrapped in an
95 # IPBlacklistingResolver by MatrixFederationHttpClient.
9296 _well_known_resolver = WellKnownResolver(
9397 self._reactor,
94 agent=Agent(
98 agent=BlacklistingAgentWrapper(
99 Agent(
100 self._reactor,
101 pool=self._pool,
102 contextFactory=tls_client_options_factory,
103 ),
95104 self._reactor,
96 pool=self._pool,
97 contextFactory=tls_client_options_factory,
105 ip_blacklist=ip_blacklist,
98106 ),
99107 user_agent=self.user_agent,
100108 )
1414 import logging
1515 import random
1616 import time
17 from io import BytesIO
1718 from typing import Callable, Dict, Optional, Tuple
1819
1920 import attr
2021
2122 from twisted.internet import defer
2223 from twisted.internet.interfaces import IReactorTime
23 from twisted.web.client import RedirectAgent, readBody
24 from twisted.web.client import RedirectAgent
2425 from twisted.web.http import stringToDatetime
2526 from twisted.web.http_headers import Headers
2627 from twisted.web.iweb import IAgent, IResponse
2728
29 from synapse.http.client import BodyExceededMaxSize, read_body_with_max_size
2830 from synapse.logging.context import make_deferred_yieldable
2931 from synapse.util import Clock, json_decoder
3032 from synapse.util.caches.ttlcache import TTLCache
5153
5254 # lower bound for .well-known cache period
5355 WELL_KNOWN_MIN_CACHE_PERIOD = 5 * 60
56
57 # The maximum size (in bytes) to allow a well-known file to be.
58 WELL_KNOWN_MAX_SIZE = 50 * 1024 # 50 KiB
5459
5560 # Attempt to refetch a cached well-known N% of the TTL before it expires.
5661 # e.g. if set to 0.2 and we have a cached entry with a TTL of 5mins, then
228233 server_name: name of the server, from the requested url
229234 retry: Whether to retry the request if it fails.
230235
236 Raises:
237 _FetchWellKnownFailure if we fail to look up a result
238
231239 Returns:
232240 Returns the response object and body. Response may be a non-200 response.
233241 """
249257 b"GET", uri, headers=Headers(headers)
250258 )
251259 )
252 body = await make_deferred_yieldable(readBody(response))
260 body_stream = BytesIO()
261 await make_deferred_yieldable(
262 read_body_with_max_size(response, body_stream, WELL_KNOWN_MAX_SIZE)
263 )
264 body = body_stream.getvalue()
253265
254266 if 500 <= response.code < 600:
255267 raise Exception("Non-200 response %s" % (response.code,))
258270 except defer.CancelledError:
259271 # Bail if we've been cancelled
260272 raise
273 except BodyExceededMaxSize:
274 # If the well-known file was too large, do not keep attempting
275 # to download it, but consider it a temporary error.
276 logger.warning(
277 "Requested .well-known file for %s is too large > %r bytes",
278 server_name.decode("ascii"),
279 WELL_KNOWN_MAX_SIZE,
280 )
281 raise _FetchWellKnownFailure(temporary=True)
261282 except Exception as e:
262283 if not retry or i >= WELL_KNOWN_RETRY_ATTEMPTS:
263284 logger.info("Error fetching %s: %s", uri_str, e)
2525 from canonicaljson import encode_canonical_json
2626 from prometheus_client import Counter
2727 from signedjson.sign import sign_json
28 from zope.interface import implementer
2928
3029 from twisted.internet import defer
3130 from twisted.internet.error import DNSLookupError
32 from twisted.internet.interfaces import IReactorPluggableNameResolver, IReactorTime
31 from twisted.internet.interfaces import IReactorTime
3332 from twisted.internet.task import _EPSILON, Cooperator
3433 from twisted.web.http_headers import Headers
3534 from twisted.web.iweb import IBodyProducer, IResponse
3736 import synapse.metrics
3837 import synapse.util.retryutils
3938 from synapse.api.errors import (
39 Codes,
4040 FederationDeniedError,
4141 HttpResponseException,
4242 RequestSendFailed,
43 SynapseError,
4344 )
4445 from synapse.http import QuieterFileBodyProducer
4546 from synapse.http.client import (
4647 BlacklistingAgentWrapper,
47 IPBlacklistingResolver,
48 BlacklistingReactorWrapper,
49 BodyExceededMaxSize,
4850 encode_query_args,
49 readBodyToFile,
51 read_body_with_max_size,
5052 )
5153 from synapse.http.federation.matrix_federation_agent import MatrixFederationAgent
5254 from synapse.logging.context import make_deferred_yieldable
220222 self.signing_key = hs.signing_key
221223 self.server_name = hs.hostname
222224
223 real_reactor = hs.get_reactor()
224
225225 # We need to use a DNS resolver which filters out blacklisted IP
226226 # addresses, to prevent DNS rebinding.
227 nameResolver = IPBlacklistingResolver(
228 real_reactor, None, hs.config.federation_ip_range_blacklist
229 )
230
231 @implementer(IReactorPluggableNameResolver)
232 class Reactor:
233 def __getattr__(_self, attr):
234 if attr == "nameResolver":
235 return nameResolver
236 else:
237 return getattr(real_reactor, attr)
238
239 self.reactor = Reactor()
227 self.reactor = BlacklistingReactorWrapper(
228 hs.get_reactor(), None, hs.config.federation_ip_range_blacklist
229 )
240230
241231 user_agent = hs.version_string
242232 if hs.config.user_agent_suffix:
244234 user_agent = user_agent.encode("ascii")
245235
246236 self.agent = MatrixFederationAgent(
247 self.reactor, tls_client_options_factory, user_agent
237 self.reactor,
238 tls_client_options_factory,
239 user_agent,
240 hs.config.federation_ip_range_blacklist,
248241 )
249242
250243 # Use a BlacklistingAgentWrapper to prevent circumventing the IP
984977 headers = dict(response.headers.getAllRawHeaders())
985978
986979 try:
987 d = readBodyToFile(response, output_stream, max_size)
980 d = read_body_with_max_size(response, output_stream, max_size)
988981 d.addTimeout(self.default_timeout, self.reactor)
989982 length = await make_deferred_yieldable(d)
983 except BodyExceededMaxSize:
984 msg = "Requested file is too large > %r bytes" % (max_size,)
985 logger.warning(
986 "{%s} [%s] %s", request.txn_id, request.destination, msg,
987 )
988 raise SynapseError(502, msg, Codes.TOO_LARGE)
990989 except Exception as e:
991990 logger.warning(
992991 "{%s} [%s] Error reading response: %s",
3838 reactor: twisted reactor to place outgoing
3939 connections.
4040
41 proxy_reactor: twisted reactor to use for connections to the proxy
42 server. `reactor` might have some blacklisting applied (e.g. for DNS
43 queries), but we need unblocked access to the proxy.
44
4145 contextFactory (IPolicyForHTTPS): A factory for TLS contexts, to control the
4246 verification parameters of OpenSSL. The default is to use a
4347 `BrowserLikePolicyForHTTPS`, so unless you have special
5862 def __init__(
5963 self,
6064 reactor,
65 proxy_reactor=None,
6166 contextFactory=BrowserLikePolicyForHTTPS(),
6267 connectTimeout=None,
6368 bindAddress=None,
6772 ):
6873 _AgentBase.__init__(self, reactor, pool)
6974
75 if proxy_reactor is None:
76 self.proxy_reactor = reactor
77 else:
78 self.proxy_reactor = proxy_reactor
79
7080 self._endpoint_kwargs = {}
7181 if connectTimeout is not None:
7282 self._endpoint_kwargs["timeout"] = connectTimeout
7484 self._endpoint_kwargs["bindAddress"] = bindAddress
7585
7686 self.http_proxy_endpoint = _http_proxy_endpoint(
77 http_proxy, reactor, **self._endpoint_kwargs
87 http_proxy, self.proxy_reactor, **self._endpoint_kwargs
7888 )
7989
8090 self.https_proxy_endpoint = _http_proxy_endpoint(
81 https_proxy, reactor, **self._endpoint_kwargs
91 https_proxy, self.proxy_reactor, **self._endpoint_kwargs
8292 )
8393
8494 self._policy_for_https = contextFactory
136146 request_path = uri
137147 elif parsed_uri.scheme == b"https" and self.https_proxy_endpoint:
138148 endpoint = HTTPConnectProxyEndpoint(
139 self._reactor,
149 self.proxy_reactor,
140150 self.https_proxy_endpoint,
141151 parsed_uri.host,
142152 parsed_uri.port,
274274 formatting responses and errors as JSON.
275275 """
276276
277 def __init__(self, canonical_json=False, extract_context=False):
278 super().__init__(extract_context)
279 self.canonical_json = canonical_json
280
277281 def _send_response(
278282 self, request: Request, code: int, response_object: Any,
279283 ):
317321 )
318322
319323 def __init__(self, hs, canonical_json=True, extract_context=False):
320 super().__init__(extract_context)
321
322 self.canonical_json = canonical_json
324 super().__init__(canonical_json, extract_context)
323325 self.clock = hs.get_clock()
324326 self.path_regexs = {}
325327 self.hs = hs
127127
128128 # create a LogContext for this request
129129 request_id = self.get_request_id()
130 logcontext = self.logcontext = LoggingContext(request_id)
131 logcontext.request = request_id
130 self.logcontext = LoggingContext(request_id, request=request_id)
132131
133132 # override the Server header which is set by twisted
134133 self.setHeader("Server", self.site.server_version_string)
202202 def copy_to(self, record):
203203 pass
204204
205 def copy_to_twisted_log_entry(self, record):
206 record["request"] = None
207 record["scope"] = None
208
209205 def start(self, rusage: "Optional[resource._RUsage]"):
210206 pass
211207
371367 # we also track the current scope:
372368 record.scope = self.scope
373369
374 def copy_to_twisted_log_entry(self, record) -> None:
375 """
376 Copy logging fields from this context to a Twisted log record.
377 """
378 record["request"] = self.request
379 record["scope"] = self.scope
380
381370 def start(self, rusage: "Optional[resource._RUsage]") -> None:
382371 """
383372 Record that this logcontext is currently running.
541530 class LoggingContextFilter(logging.Filter):
542531 """Logging filter that adds values from the current logging context to each
543532 record.
544 Args:
545 **defaults: Default values to avoid formatters complaining about
546 missing fields
547 """
548
549 def __init__(self, **defaults) -> None:
550 self.defaults = defaults
533 """
534
535 def __init__(self, request: str = ""):
536 self._default_request = request
551537
552538 def filter(self, record) -> Literal[True]:
553539 """Add each fields from the logging contexts to the record.
555541 True to include the record in the log output.
556542 """
557543 context = current_context()
558 for key, value in self.defaults.items():
559 setattr(record, key, value)
544 record.request = self._default_request
560545
561546 # context should never be None, but if it somehow ends up being, then
562547 # we end up in a death spiral of infinite loops, so let's check, for
563548 # robustness' sake.
564549 if context is not None:
565 context.copy_to(record)
550 # Logging is interested in the request.
551 record.request = context.request
566552
567553 return True
568554
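The reworked `LoggingContextFilter` above guarantees that every record carries a `request` attribute, defaulting when no context is active, so formatters using `%(request)s` never fail. A stdlib-only sketch of that guarantee (`RequestFilter` is an illustrative name, not Synapse's class):

```python
import io
import logging


class RequestFilter(logging.Filter):
    """Ensure every record has a `request` field for the formatter."""

    def __init__(self, request=""):
        super().__init__()
        self._default_request = request

    def filter(self, record):
        # Always returns True: this filter annotates records rather
        # than dropping them.
        if not hasattr(record, "request"):
            record.request = self._default_request
        return True


stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setFormatter(logging.Formatter("%(request)s|%(message)s"))
logger = logging.getLogger("request-filter-sketch")
logger.addHandler(handler)
logger.addFilter(RequestFilter("GET-12345"))
logger.warning("hello")
print(stream.getvalue().strip())  # → GET-12345|hello
```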
1212 # See the License for the specific language governing permissions and
1313 # limitations under the License.
1414
15 import inspect
1615 import logging
1716 import threading
1817 from functools import wraps
2423
2524 from synapse.logging.context import LoggingContext, PreserveLoggingContext
2625 from synapse.logging.opentracing import noop_context_manager, start_active_span
26 from synapse.util.async_helpers import maybe_awaitable
2727
2828 if TYPE_CHECKING:
2929 import resource
198198 _background_process_start_count.labels(desc).inc()
199199 _background_process_in_flight_count.labels(desc).inc()
200200
201 with BackgroundProcessLoggingContext(desc) as context:
202 context.request = "%s-%i" % (desc, count)
201 with BackgroundProcessLoggingContext(desc, "%s-%i" % (desc, count)) as context:
203202 try:
204203 ctx = noop_context_manager()
205204 if bg_start_span:
206205 ctx = start_active_span(desc, tags={"request_id": context.request})
207206 with ctx:
208 result = func(*args, **kwargs)
209
210 if inspect.isawaitable(result):
211 result = await result
212
213 return result
207 return await maybe_awaitable(func(*args, **kwargs))
214208 except Exception:
215209 logger.exception(
216210 "Background process '%s' threw an exception", desc,
248242
249243 __slots__ = ["_proc"]
250244
251 def __init__(self, name: str):
252 super().__init__(name)
245 def __init__(self, name: str, request: Optional[str] = None):
246 super().__init__(name, request=request)
253247
254248 self._proc = _BackgroundProcess(name, self)
255249
3333 from twisted.internet import defer
3434
3535 import synapse.server
36 from synapse.api.constants import EventTypes, Membership
36 from synapse.api.constants import EventTypes, HistoryVisibility, Membership
3737 from synapse.api.errors import AuthError
3838 from synapse.events import EventBase
3939 from synapse.handlers.presence import format_user_presence_state
610610 room_id, EventTypes.RoomHistoryVisibility, ""
611611 )
612612 if state and "history_visibility" in state.content:
613 return state.content["history_visibility"] == "world_readable"
613 return (
614 state.content["history_visibility"] == HistoryVisibility.WORLD_READABLE
615 )
614616 else:
615617 return False
616618
1212 # See the License for the specific language governing permissions and
1313 # limitations under the License.
1414
15 import abc
16 from typing import TYPE_CHECKING, Any, Dict, Optional
17
18 import attr
19
20 from synapse.types import JsonDict, RoomStreamToken
21
22 if TYPE_CHECKING:
23 from synapse.app.homeserver import HomeServer
24
25
26 @attr.s(slots=True)
27 class PusherConfig:
28 """Parameters necessary to configure a pusher."""
29
30 id = attr.ib(type=Optional[str])
31 user_name = attr.ib(type=str)
32 access_token = attr.ib(type=Optional[int])
33 profile_tag = attr.ib(type=str)
34 kind = attr.ib(type=str)
35 app_id = attr.ib(type=str)
36 app_display_name = attr.ib(type=str)
37 device_display_name = attr.ib(type=str)
38 pushkey = attr.ib(type=str)
39 ts = attr.ib(type=int)
40 lang = attr.ib(type=Optional[str])
41 data = attr.ib(type=Optional[JsonDict])
42 last_stream_ordering = attr.ib(type=int)
43 last_success = attr.ib(type=Optional[int])
44 failing_since = attr.ib(type=Optional[int])
45
46 def as_dict(self) -> Dict[str, Any]:
47 """Information that can be retrieved about a pusher after creation."""
48 return {
49 "app_display_name": self.app_display_name,
50 "app_id": self.app_id,
51 "data": self.data,
52 "device_display_name": self.device_display_name,
53 "kind": self.kind,
54 "lang": self.lang,
55 "profile_tag": self.profile_tag,
56 "pushkey": self.pushkey,
57 }
58
59
60 @attr.s(slots=True)
61 class ThrottleParams:
62 """Parameters for controlling the rate of sending pushes via email."""
63
64 last_sent_ts = attr.ib(type=int)
65 throttle_ms = attr.ib(type=int)
66
67
68 class Pusher(metaclass=abc.ABCMeta):
69 def __init__(self, hs: "HomeServer", pusher_config: PusherConfig):
70 self.hs = hs
71 self.store = self.hs.get_datastore()
72 self.clock = self.hs.get_clock()
73
74 self.pusher_id = pusher_config.id
75 self.user_id = pusher_config.user_name
76 self.app_id = pusher_config.app_id
77 self.pushkey = pusher_config.pushkey
78
79 self.last_stream_ordering = pusher_config.last_stream_ordering
80
81 # This is the highest stream ordering we know it's safe to process.
82 # When new events arrive, we'll be given a window of new events: we
83 # should honour this rather than just looking for anything higher
84 # because of potential out-of-order event serialisation.
85 self.max_stream_ordering = self.store.get_room_max_stream_ordering()
86
87 def on_new_notifications(self, max_token: RoomStreamToken) -> None:
88 # We just use the minimum stream ordering and ignore the vector clock
89 # component. This is safe to do as long as we *always* ignore the vector
90 # clock components.
91 max_stream_ordering = max_token.stream
92
93 self.max_stream_ordering = max(max_stream_ordering, self.max_stream_ordering)
94 self._start_processing()
95
96 @abc.abstractmethod
97 def _start_processing(self):
98 """Start processing push notifications."""
99 raise NotImplementedError()
100
101 @abc.abstractmethod
102 def on_new_receipts(self, min_stream_id: int, max_stream_id: int) -> None:
103 raise NotImplementedError()
104
105 @abc.abstractmethod
106 def on_started(self, have_notifs: bool) -> None:
107 """Called when this pusher has been started.
108
109 Args:
110 have_notifs: Whether we should immediately
111 check for push to send. Set to False only if it's known there
112 is nothing to send.
113 """
114 raise NotImplementedError()
115
116 @abc.abstractmethod
117 def on_stop(self) -> None:
118 raise NotImplementedError()
119
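`Pusher.on_new_notifications` advances `max_stream_ordering` with `max()`, so a stale, lower stream position can never move the high-water mark backwards. The invariant in isolation (an illustrative class, not part of Synapse):

```python
class Watermark:
    """Track the highest stream ordering seen so far."""

    def __init__(self, initial=0):
        self.max_stream_ordering = initial

    def advance(self, stream_ordering):
        # max() makes updates idempotent and order-insensitive, which
        # is what makes out-of-order delivery safe to ignore.
        self.max_stream_ordering = max(stream_ordering, self.max_stream_ordering)
        return self.max_stream_ordering


w = Watermark()
print(w.advance(5), w.advance(3), w.advance(7))  # → 5 5 7
```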
15120
16121 class PusherConfigException(Exception):
17 def __init__(self, msg):
18 super().__init__(msg)
122 """An error occurred when creating a pusher."""
1313 # limitations under the License.
1414
1515 import logging
16 from typing import TYPE_CHECKING
1617
18 from synapse.events import EventBase
19 from synapse.events.snapshot import EventContext
20 from synapse.push.bulk_push_rule_evaluator import BulkPushRuleEvaluator
1721 from synapse.util.metrics import Measure
1822
19 from .bulk_push_rule_evaluator import BulkPushRuleEvaluator
23 if TYPE_CHECKING:
24 from synapse.app.homeserver import HomeServer
2025
2126 logger = logging.getLogger(__name__)
2227
2328
2429 class ActionGenerator:
25 def __init__(self, hs):
26 self.hs = hs
30 def __init__(self, hs: "HomeServer"):
2731 self.clock = hs.get_clock()
28 self.store = hs.get_datastore()
2932 self.bulk_evaluator = BulkPushRuleEvaluator(hs)
3033 # really we want to get all user ids and all profile tags too,
3134 # since we want the actions for each profile tag for every user and
3437 # event stream, so we just run the rules for a client with no profile
3538 # tag (ie. we just need all the users).
3639
37 async def handle_push_actions_for_event(self, event, context):
40 async def handle_push_actions_for_event(
41 self, event: EventBase, context: EventContext
42 ) -> None:
3843 with Measure(self.clock, "action_for_event_by_user"):
3944 await self.bulk_evaluator.action_for_event_by_user(event, context)
1414 # limitations under the License.
1515
1616 import copy
17 from typing import Any, Dict, List
1718
1819 from synapse.push.rulekinds import PRIORITY_CLASS_INVERSE_MAP, PRIORITY_CLASS_MAP
1920
2021
21 def list_with_base_rules(rawrules, use_new_defaults=False):
22 def list_with_base_rules(
23 rawrules: List[Dict[str, Any]], use_new_defaults: bool = False
24 ) -> List[Dict[str, Any]]:
2225 """Combine the list of rules set by the user with the default push rules
2326
2427 Args:
25 rawrules(list): The rules the user has modified or set.
26 use_new_defaults(bool): Whether to use the new experimental default rules when
28 rawrules: The rules the user has modified or set.
29 use_new_defaults: Whether to use the new experimental default rules when
2730 appending or prepending default rules.
2831
2932 Returns:
9396 return ruleslist
9497
9598
96 def make_base_append_rules(kind, modified_base_rules, use_new_defaults=False):
99 def make_base_append_rules(
100 kind: str,
101 modified_base_rules: Dict[str, Dict[str, Any]],
102 use_new_defaults: bool = False,
103 ) -> List[Dict[str, Any]]:
97104 rules = []
98105
99106 if kind == "override":
115122 rules = copy.deepcopy(rules)
116123 for r in rules:
117124 # Only modify the actions, keep the conditions the same.
125 assert isinstance(r["rule_id"], str)
118126 modified = modified_base_rules.get(r["rule_id"])
119127 if modified:
120128 r["actions"] = modified["actions"]
122130 return rules
123131
124132
125 def make_base_prepend_rules(kind, modified_base_rules, use_new_defaults=False):
133 def make_base_prepend_rules(
134 kind: str,
135 modified_base_rules: Dict[str, Dict[str, Any]],
136 use_new_defaults: bool = False,
137 ) -> List[Dict[str, Any]]:
126138 rules = []
127139
128140 if kind == "override":
132144 rules = copy.deepcopy(rules)
133145 for r in rules:
134146 # Only modify the actions, keep the conditions the same.
147 assert isinstance(r["rule_id"], str)
135148 modified = modified_base_rules.get(r["rule_id"])
136149 if modified:
137150 r["actions"] = modified["actions"]
1414 # limitations under the License.
1515
1616 import logging
17 from typing import TYPE_CHECKING, Any, Dict, List, Optional, Set, Tuple, Union
1718
1819 import attr
1920 from prometheus_client import Counter
2425 from synapse.events.snapshot import EventContext
2526 from synapse.state import POWER_KEY
2627 from synapse.util.async_helpers import Linearizer
27 from synapse.util.caches import register_cache
28 from synapse.util.caches import CacheMetric, register_cache
2829 from synapse.util.caches.descriptors import lru_cache
2930 from synapse.util.caches.lrucache import LruCache
3031
3132 from .push_rule_evaluator import PushRuleEvaluatorForEvent
3233
34 if TYPE_CHECKING:
35 from synapse.app.homeserver import HomeServer
36
3337 logger = logging.getLogger(__name__)
34
35
36 rules_by_room = {}
3738
3839
3940 push_rules_invalidation_counter = Counter(
100101 room at once.
101102 """
102103
103 def __init__(self, hs):
104 def __init__(self, hs: "HomeServer"):
104105 self.hs = hs
105106 self.store = hs.get_datastore()
106107 self.auth = hs.get_auth()
112113 resizable=False,
113114 )
114115
115 async def _get_rules_for_event(self, event, context):
116 async def _get_rules_for_event(
117 self, event: EventBase, context: EventContext
118 ) -> Dict[str, List[Dict[str, Any]]]:
116119 """This gets the rules for all users in the room at the time of the event,
117120 as well as the push rules for the invitee if the event is an invite.
118121
139142 return rules_by_user
140143
141144 @lru_cache()
142 def _get_rules_for_room(self, room_id):
145 def _get_rules_for_room(self, room_id: str) -> "RulesForRoom":
143146 """Get the current RulesForRoom object for the given room id
144
145 Returns:
146 RulesForRoom
147147 """
148148 # It's important that RulesForRoom gets added to self._get_rules_for_room.cache
149149 # before any lookup methods get called on it as otherwise there may be
155155 self.room_push_rule_cache_metrics,
156156 )
157157
158 async def _get_power_levels_and_sender_level(self, event, context):
158 async def _get_power_levels_and_sender_level(
159 self, event: EventBase, context: EventContext
160 ) -> Tuple[dict, int]:
159161 prev_state_ids = await context.get_prev_state_ids()
160162 pl_event_id = prev_state_ids.get(POWER_KEY)
161163 if pl_event_id:
162164 # fastpath: if there's a power level event, that's all we need, and
163165 # not having a power level event is an extreme edge case
164 pl_event = await self.store.get_event(pl_event_id)
165 auth_events = {POWER_KEY: pl_event}
166 auth_events = {POWER_KEY: await self.store.get_event(pl_event_id)}
166167 else:
167168 auth_events_ids = self.auth.compute_auth_events(
168169 event, prev_state_ids, for_verification=False
169170 )
170 auth_events = await self.store.get_events(auth_events_ids)
171 auth_events = {(e.type, e.state_key): e for e in auth_events.values()}
171 auth_events_dict = await self.store.get_events(auth_events_ids)
172 auth_events = {(e.type, e.state_key): e for e in auth_events_dict.values()}
172173
173174 sender_level = get_user_power_level(event.sender, auth_events)
174175
176177
177178 return pl_event.content if pl_event else {}, sender_level
178179
179 async def action_for_event_by_user(self, event, context) -> None:
180 async def action_for_event_by_user(
181 self, event: EventBase, context: EventContext
182 ) -> None:
180183 """Given an event and context, evaluate the push rules, check if the message
181184 should increment the unread count, and insert the results into the
182185 event_push_actions_staging table.
184187 count_as_unread = _should_count_as_unread(event, context)
185188
186189 rules_by_user = await self._get_rules_for_event(event, context)
187 actions_by_user = {}
190 actions_by_user = {} # type: Dict[str, List[Union[dict, str]]]
188191
189192 room_members = await self.store.get_joined_users_from_context(event, context)
190193
197200 event, len(room_members), sender_power_level, power_levels
198201 )
199202
200 condition_cache = {}
203 condition_cache = {} # type: Dict[str, bool]
201204
202205 for uid, rules in rules_by_user.items():
203206 if event.sender == uid:
248251 )
249252
250253
251 def _condition_checker(evaluator, conditions, uid, display_name, cache):
254 def _condition_checker(
255 evaluator: PushRuleEvaluatorForEvent,
256 conditions: List[dict],
257 uid: str,
258 display_name: str,
259 cache: Dict[str, bool],
260 ) -> bool:
252261 for cond in conditions:
253262 _id = cond.get("_id", None)
254263 if _id:
276285 """
277286
278287 def __init__(
279 self, hs, room_id, rules_for_room_cache: LruCache, room_push_rule_cache_metrics
288 self,
289 hs: "HomeServer",
290 room_id: str,
291 rules_for_room_cache: LruCache,
292 room_push_rule_cache_metrics: CacheMetric,
280293 ):
281294 """
282295 Args:
283 hs (HomeServer)
284 room_id (str)
296 hs: The HomeServer object.
297 room_id: The room ID.
285298 rules_for_room_cache: The cache object that caches these
286299 RoomsForUser objects.
287 room_push_rule_cache_metrics (CacheMetric)
300 room_push_rule_cache_metrics: The metrics object
288301 """
289302 self.room_id = room_id
290303 self.is_mine_id = hs.is_mine_id
293306
294307 self.linearizer = Linearizer(name="rules_for_room")
295308
296 self.member_map = {} # event_id -> (user_id, state)
297 self.rules_by_user = {} # user_id -> rules
309 # event_id -> (user_id, state)
310 self.member_map = {} # type: Dict[str, Tuple[str, str]]
311 # user_id -> rules
312 self.rules_by_user = {} # type: Dict[str, List[Dict[str, dict]]]
298313
299314 # The last state group we updated the caches for. If the state_group of
300315 # a new event comes along, we know that we can just return the cached
314329 # calculate push for)
315330 # These never need to be invalidated as we will never set up push for
316331 # them.
317 self.uninteresting_user_set = set()
332 self.uninteresting_user_set = set() # type: Set[str]
318333
319334 # We need to be clever on the invalidating caches callbacks, as
320335 # otherwise the invalidation callback holds a reference to the object,
324339 # to self around in the callback.
325340 self.invalidate_all_cb = _Invalidation(rules_for_room_cache, room_id)
326341
327 async def get_rules(self, event, context):
342 async def get_rules(
343 self, event: EventBase, context: EventContext
344 ) -> Dict[str, List[Dict[str, dict]]]:
328345 """Given an event context return the rules for all users who are
329346 currently in the room.
330347 """
355372 else:
356373 current_state_ids = await context.get_current_state_ids()
357374 push_rules_delta_state_cache_metric.inc_misses()
375 # Ensure the state IDs exist.
376 assert current_state_ids is not None
358377
359378 push_rules_state_size_counter.inc(len(current_state_ids))
360379
419438 return ret_rules_by_user
420439
421440 async def _update_rules_with_member_event_ids(
422 self, ret_rules_by_user, member_event_ids, state_group, event
423 ):
441 self,
442 ret_rules_by_user: Dict[str, list],
443 member_event_ids: Dict[str, str],
444 state_group: Optional[int],
445 event: EventBase,
446 ) -> None:
424447 """Update the partially filled rules_by_user dict by fetching rules for
425448 any newly joined users in the `member_event_ids` list.
426449
427450 Args:
428 ret_rules_by_user (dict): Partiallly filled dict of push rules. Gets
451 ret_rules_by_user: Partially filled dict of push rules. Gets
429452 updated with any new rules.
430 member_event_ids (dict): Dict of user id to event id for membership events
453 member_event_ids: Dict of user id to event id for membership events
431454 that have happened since the last time we filled rules_by_user
432455 state_group: The state group we are currently computing push rules
433456 for. Used when updating the cache.
457 event: The event we are currently computing push rules for.
434458 """
435459 sequence = self.sequence
436460
448472 if logger.isEnabledFor(logging.DEBUG):
449473 logger.debug("Found members %r: %r", self.room_id, members.values())
450474
451 user_ids = {
475 joined_user_ids = {
452476 user_id
453477 for user_id, membership in members.values()
454478 if membership == Membership.JOIN
455479 }
456480
457 logger.debug("Joined: %r", user_ids)
481 logger.debug("Joined: %r", joined_user_ids)
458482
459483 # Previously we only considered users with pushers or read receipts in that
460484 # room. We can't do this anymore because we use push actions to calculate unread
461485 # counts, which don't rely on the user having pushers or sent a read receipt into
462486 # the room. Therefore we just need to filter for local users here.
463 user_ids = list(filter(self.is_mine_id, user_ids))
487 user_ids = list(filter(self.is_mine_id, joined_user_ids))
464488
465489 rules_by_user = await self.store.bulk_get_push_rules(
466490 user_ids, on_invalidate=self.invalidate_all_cb
472496
473497 self.update_cache(sequence, members, ret_rules_by_user, state_group)
474498
475 def invalidate_all(self):
499 def invalidate_all(self) -> None:
476500 # Note: Don't hand this function directly to an invalidation callback
477501 # as it keeps a reference to self and will stop this instance from being
478502 # GC'd if it gets dropped from the rules_to_user cache. Instead use
484508 self.rules_by_user = {}
485509 push_rules_invalidation_counter.inc()
486510
487 def update_cache(self, sequence, members, rules_by_user, state_group):
511 def update_cache(self, sequence, members, rules_by_user, state_group) -> None:
488512 if sequence == self.sequence:
489513 self.member_map.update(members)
490514 self.rules_by_user = rules_by_user
505529 cache = attr.ib(type=LruCache)
506530 room_id = attr.ib(type=str)
507531
508 def __call__(self):
532 def __call__(self) -> None:
509533 rules = self.cache.get(self.room_id, None, update_metrics=False)
510534 if rules:
511535 rules.invalidate_all()
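The `_condition_checker` hunk above memoizes per-event condition results keyed by the server-generated `_id`, so that identical conditions are not re-evaluated for every user in the room. A minimal sketch of that caching pattern (the `matches` callable stands in for `PushRuleEvaluatorForEvent.matches` and is an illustrative simplification):

```python
from typing import Callable, Dict, List


def check_conditions(
    matches: Callable[[dict], bool],
    conditions: List[dict],
    cache: Dict[str, bool],
) -> bool:
    """Return True only if every condition matches, caching results by `_id`.

    Conditions carrying a server-assigned `_id` are looked up in (and written
    back to) `cache`, so repeated rules for other users in the same room can
    skip re-evaluation.
    """
    for cond in conditions:
        _id = cond.get("_id")
        if _id is not None and _id in cache:
            res = cache[_id]
        else:
            res = matches(cond)
            if _id is not None:
                cache[_id] = res
        if not res:
            return False
    return True
```

The cache dict is scoped to a single event (as `condition_cache` is in `action_for_event_by_user`), so stale results never leak across events.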
1313 # limitations under the License.
1414
1515 import copy
16 from typing import Any, Dict, List, Optional
1617
1718 from synapse.push.rulekinds import PRIORITY_CLASS_INVERSE_MAP, PRIORITY_CLASS_MAP
19 from synapse.types import UserID
1820
1921
20 def format_push_rules_for_user(user, ruleslist):
22 def format_push_rules_for_user(user: UserID, ruleslist) -> Dict[str, Dict[str, list]]:
2123 """Converts a list of rawrules and an enabled map into nested dictionaries
2224 to match the Matrix client-server format for push rules"""
2325
2426 # We're going to be mutating this a lot, so do a deep copy
2527 ruleslist = copy.deepcopy(ruleslist)
2628
27 rules = {"global": {}, "device": {}}
29 rules = {
30 "global": {},
31 "device": {},
32 } # type: Dict[str, Dict[str, List[Dict[str, Any]]]]
2833
2934 rules["global"] = _add_empty_priority_class_arrays(rules["global"])
3035
3136 for r in ruleslist:
32 rulearray = None
33
3437 template_name = _priority_class_to_template_name(r["priority_class"])
3538
3639 # Remove internal stuff.
5659 return rules
5760
5861
59 def _add_empty_priority_class_arrays(d):
62 def _add_empty_priority_class_arrays(d: Dict[str, list]) -> Dict[str, list]:
6063 for pc in PRIORITY_CLASS_MAP.keys():
6164 d[pc] = []
6265 return d
6366
6467
65 def _rule_to_template(rule):
68 def _rule_to_template(rule: Dict[str, Any]) -> Optional[Dict[str, Any]]:
6669 unscoped_rule_id = None
6770 if "rule_id" in rule:
6871 unscoped_rule_id = _rule_id_from_namespaced(rule["rule_id"])
8184 return None
8285 templaterule = {"actions": rule["actions"]}
8386 templaterule["pattern"] = thecond["pattern"]
87 else:
88 # This should not be reached unless this function is not kept in sync
89 # with PRIORITY_CLASS_INVERSE_MAP.
90 raise ValueError("Unexpected template_name: %s" % (template_name,))
8491
8592 if unscoped_rule_id:
8693 templaterule["rule_id"] = unscoped_rule_id
8996 return templaterule
9097
9198
92 def _rule_id_from_namespaced(in_rule_id):
99 def _rule_id_from_namespaced(in_rule_id: str) -> str:
93100 return in_rule_id.split("/")[-1]
94101
95102
96 def _priority_class_to_template_name(pc):
103 def _priority_class_to_template_name(pc: int) -> str:
97104 return PRIORITY_CLASS_INVERSE_MAP[pc]
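`_rule_id_from_namespaced` above unscopes a fully-qualified rule ID by keeping only the final `/`-separated segment, which is what the client-server API expects. Standalone, the transformation looks like this:

```python
def rule_id_from_namespaced(in_rule_id: str) -> str:
    """Keep only the final '/'-separated segment of a namespaced rule ID."""
    return in_rule_id.split("/")[-1]


# A scoped ID like "global/underride/.m.rule.message" becomes ".m.rule.message".
unscoped = rule_id_from_namespaced("global/underride/.m.rule.message")
```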
1313 # limitations under the License.
1414
1515 import logging
16
16 from typing import TYPE_CHECKING, Dict, List, Optional
17
18 from twisted.internet.base import DelayedCall
1719 from twisted.internet.error import AlreadyCalled, AlreadyCancelled
1820
1921 from synapse.metrics.background_process_metrics import run_as_background_process
20 from synapse.types import RoomStreamToken
22 from synapse.push import Pusher, PusherConfig, ThrottleParams
23 from synapse.push.mailer import Mailer
24
25 if TYPE_CHECKING:
26 from synapse.app.homeserver import HomeServer
2127
2228 logger = logging.getLogger(__name__)
2329
4551 INCLUDE_ALL_UNREAD_NOTIFS = False
4652
4753
48 class EmailPusher:
54 class EmailPusher(Pusher):
4955 """
5056 A pusher that sends email notifications about events (approximately)
5157 when they happen.
5359 factor out the common parts
5460 """
5561
56 def __init__(self, hs, pusherdict, mailer):
57 self.hs = hs
62 def __init__(self, hs: "HomeServer", pusher_config: PusherConfig, mailer: Mailer):
63 super().__init__(hs, pusher_config)
5864 self.mailer = mailer
5965
6066 self.store = self.hs.get_datastore()
61 self.clock = self.hs.get_clock()
62 self.pusher_id = pusherdict["id"]
63 self.user_id = pusherdict["user_name"]
64 self.app_id = pusherdict["app_id"]
65 self.email = pusherdict["pushkey"]
66 self.last_stream_ordering = pusherdict["last_stream_ordering"]
67 self.timed_call = None
68 self.throttle_params = None
69
70 # See httppusher
71 self.max_stream_ordering = None
67 self.email = pusher_config.pushkey
68 self.timed_call = None # type: Optional[DelayedCall]
69 self.throttle_params = {} # type: Dict[str, ThrottleParams]
70 self._inited = False
7271
7372 self._is_processing = False
7473
75 def on_started(self, should_check_for_notifs):
74 def on_started(self, should_check_for_notifs: bool) -> None:
7675 """Called when this pusher has been started.
7776
7877 Args:
79 should_check_for_notifs (bool): Whether we should immediately
78 should_check_for_notifs: Whether we should immediately
8079 check for push to send. Set to False only if it's known there
8180 is nothing to send
8281 """
8382 if should_check_for_notifs and self.mailer is not None:
8483 self._start_processing()
8584
86 def on_stop(self):
85 def on_stop(self) -> None:
8786 if self.timed_call:
8887 try:
8988 self.timed_call.cancel()
9190 pass
9291 self.timed_call = None
9392
94 def on_new_notifications(self, max_token: RoomStreamToken):
95 # We just use the minimum stream ordering and ignore the vector clock
96 # component. This is safe to do as long as we *always* ignore the vector
97 # clock components.
98 max_stream_ordering = max_token.stream
99
100 if self.max_stream_ordering:
101 self.max_stream_ordering = max(
102 max_stream_ordering, self.max_stream_ordering
103 )
104 else:
105 self.max_stream_ordering = max_stream_ordering
106 self._start_processing()
107
108 def on_new_receipts(self, min_stream_id, max_stream_id):
93 def on_new_receipts(self, min_stream_id: int, max_stream_id: int) -> None:
10994 # We could wake up and cancel the timer but there tend to be quite a
11095 # lot of read receipts so it's probably less work to just let the
11196 # timer fire
11297 pass
11398
114 def on_timer(self):
99 def on_timer(self) -> None:
115100 self.timed_call = None
116101 self._start_processing()
117102
118 def _start_processing(self):
103 def _start_processing(self) -> None:
119104 if self._is_processing:
120105 return
121106
122107 run_as_background_process("emailpush.process", self._process)
123108
124 def _pause_processing(self):
109 def _pause_processing(self) -> None:
125110 """Used by tests to temporarily pause processing of events.
126111
127112 Asserts that it's not currently processing.
129114 assert not self._is_processing
130115 self._is_processing = True
131116
132 def _resume_processing(self):
117 def _resume_processing(self) -> None:
133118 """Used by tests to resume processing of events after pausing.
134119 """
135120 assert self._is_processing
136121 self._is_processing = False
137122 self._start_processing()
138123
139 async def _process(self):
124 async def _process(self) -> None:
140125 # we should never get here if we are already processing
141126 assert not self._is_processing
142127
143128 try:
144129 self._is_processing = True
145130
146 if self.throttle_params is None:
131 if not self._inited:
147132 # this is our first loop: load up the throttle params
133 assert self.pusher_id is not None
148134 self.throttle_params = await self.store.get_throttle_params_by_room(
149135 self.pusher_id
150136 )
137 self._inited = True
151138
152139 # if the max ordering changes while we're running _unsafe_process,
153140 # call it again, and so on until we've caught up.
162149 finally:
163150 self._is_processing = False
164151
165 async def _unsafe_process(self):
152 async def _unsafe_process(self) -> None:
166153 """
167154 Main logic of the push loop without the wrapper function that sets
168155 up logging, measures and guards against multiple instances of it
169156 being run.
170157 """
171158 start = 0 if INCLUDE_ALL_UNREAD_NOTIFS else self.last_stream_ordering
172 fn = self.store.get_unread_push_actions_for_user_in_range_for_email
173 unprocessed = await fn(self.user_id, start, self.max_stream_ordering)
174
175 soonest_due_at = None
159 unprocessed = await self.store.get_unread_push_actions_for_user_in_range_for_email(
160 self.user_id, start, self.max_stream_ordering
161 )
162
163 soonest_due_at = None # type: Optional[int]
176164
177165 if not unprocessed:
178166 await self.save_last_stream_ordering_and_success(self.max_stream_ordering)
229217 self.seconds_until(soonest_due_at), self.on_timer
230218 )
231219
232 async def save_last_stream_ordering_and_success(self, last_stream_ordering):
233 if last_stream_ordering is None:
234 # This happens if we haven't yet processed anything
235 return
236
220 async def save_last_stream_ordering_and_success(
221 self, last_stream_ordering: int
222 ) -> None:
237223 self.last_stream_ordering = last_stream_ordering
238224 pusher_still_exists = await self.store.update_pusher_last_stream_ordering_and_success(
239225 self.app_id,
247233 # lets just stop and return.
248234 self.on_stop()
249235
250 def seconds_until(self, ts_msec):
236 def seconds_until(self, ts_msec: int) -> float:
251237 secs = (ts_msec - self.clock.time_msec()) / 1000
252238 return max(secs, 0)
253239
254 def get_room_throttle_ms(self, room_id):
240 def get_room_throttle_ms(self, room_id: str) -> int:
255241 if room_id in self.throttle_params:
256 return self.throttle_params[room_id]["throttle_ms"]
242 return self.throttle_params[room_id].throttle_ms
257243 else:
258244 return 0
259245
260 def get_room_last_sent_ts(self, room_id):
246 def get_room_last_sent_ts(self, room_id: str) -> int:
261247 if room_id in self.throttle_params:
262 return self.throttle_params[room_id]["last_sent_ts"]
248 return self.throttle_params[room_id].last_sent_ts
263249 else:
264250 return 0
265251
266 def room_ready_to_notify_at(self, room_id):
252 def room_ready_to_notify_at(self, room_id: str) -> int:
267253 """
268254 Determines whether throttling should prevent us from sending an email
269255 for the given room
270 Returns: The timestamp when we are next allowed to send an email notif
271 for this room
256
257 Returns:
258 The timestamp when we are next allowed to send an email notif
259 for this room
272260 """
273261 last_sent_ts = self.get_room_last_sent_ts(room_id)
274262 throttle_ms = self.get_room_throttle_ms(room_id)
276264 may_send_at = last_sent_ts + throttle_ms
277265 return may_send_at
278266
279 async def sent_notif_update_throttle(self, room_id, notified_push_action):
267 async def sent_notif_update_throttle(
268 self, room_id: str, notified_push_action: dict
269 ) -> None:
280270 # We have sent a notification, so update the throttle accordingly.
281271 # If the event that triggered the notif happened more than
282272 # THROTTLE_RESET_AFTER_MS after the previous one that triggered a
306296 new_throttle_ms = min(
307297 current_throttle_ms * THROTTLE_MULTIPLIER, THROTTLE_MAX_MS
308298 )
309 self.throttle_params[room_id] = {
310 "last_sent_ts": self.clock.time_msec(),
311 "throttle_ms": new_throttle_ms,
312 }
299 self.throttle_params[room_id] = ThrottleParams(
300 self.clock.time_msec(), new_throttle_ms,
301 )
302 assert self.pusher_id is not None
313303 await self.store.set_throttle_params(
314304 self.pusher_id, room_id, self.throttle_params[room_id]
315305 )
316306
317 async def send_notification(self, push_actions, reason):
307 async def send_notification(self, push_actions: List[dict], reason: dict) -> None:
318308 logger.info("Sending notif email for user %r", self.user_id)
319309
320310 await self.mailer.send_notification_mail(
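The `sent_notif_update_throttle` hunk above grows the per-room email delay multiplicatively and caps it. A minimal sketch of that update rule; note the constant values here are illustrative assumptions, since `THROTTLE_START_MS`, `THROTTLE_MULTIPLIER` and `THROTTLE_MAX_MS` are defined near the top of emailpusher.py, outside this diff:

```python
# Illustrative values only; the real constants live in emailpusher.py.
THROTTLE_START_MS = 10 * 60 * 1000     # first delay after a notification
THROTTLE_MULTIPLIER = 6                # growth factor per notification
THROTTLE_MAX_MS = 24 * 60 * 60 * 1000  # never wait longer than a day


def next_throttle_ms(current_throttle_ms: int) -> int:
    """Grow the per-room email delay multiplicatively, capped at the maximum."""
    if current_throttle_ms == 0:
        return THROTTLE_START_MS
    return min(current_throttle_ms * THROTTLE_MULTIPLIER, THROTTLE_MAX_MS)
```

The cap is what makes `room_ready_to_notify_at` (`last_sent_ts + throttle_ms`) bounded: a busy room can never push the next email more than one maximum interval into the future.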
1313 # See the License for the specific language governing permissions and
1414 # limitations under the License.
1515 import logging
16 import urllib.parse
17 from typing import TYPE_CHECKING, Any, Dict, Iterable, Union
1618
1719 from prometheus_client import Counter
1820
1921 from twisted.internet.error import AlreadyCalled, AlreadyCancelled
2022
2123 from synapse.api.constants import EventTypes
24 from synapse.events import EventBase
2225 from synapse.logging import opentracing
2326 from synapse.metrics.background_process_metrics import run_as_background_process
24 from synapse.push import PusherConfigException
25 from synapse.types import RoomStreamToken
27 from synapse.push import Pusher, PusherConfig, PusherConfigException
2628
2729 from . import push_rule_evaluator, push_tools
30
31 if TYPE_CHECKING:
32 from synapse.app.homeserver import HomeServer
2833
2934 logger = logging.getLogger(__name__)
3035
4954 )
5055
5156
52 class HttpPusher:
57 class HttpPusher(Pusher):
5358 INITIAL_BACKOFF_SEC = 1 # in seconds because that's what Twisted takes
5459 MAX_BACKOFF_SEC = 60 * 60
5560
5661 # This one's in ms because we compare it against the clock
5762 GIVE_UP_AFTER_MS = 24 * 60 * 60 * 1000
5863
59 def __init__(self, hs, pusherdict):
60 self.hs = hs
61 self.store = self.hs.get_datastore()
64 def __init__(self, hs: "HomeServer", pusher_config: PusherConfig):
65 super().__init__(hs, pusher_config)
6266 self.storage = self.hs.get_storage()
63 self.clock = self.hs.get_clock()
64 self.state_handler = self.hs.get_state_handler()
65 self.user_id = pusherdict["user_name"]
66 self.app_id = pusherdict["app_id"]
67 self.app_display_name = pusherdict["app_display_name"]
68 self.device_display_name = pusherdict["device_display_name"]
69 self.pushkey = pusherdict["pushkey"]
70 self.pushkey_ts = pusherdict["ts"]
71 self.data = pusherdict["data"]
72 self.last_stream_ordering = pusherdict["last_stream_ordering"]
67 self.app_display_name = pusher_config.app_display_name
68 self.device_display_name = pusher_config.device_display_name
69 self.pushkey_ts = pusher_config.ts
70 self.data = pusher_config.data
7371 self.backoff_delay = HttpPusher.INITIAL_BACKOFF_SEC
74 self.failing_since = pusherdict["failing_since"]
72 self.failing_since = pusher_config.failing_since
7573 self.timed_call = None
7674 self._is_processing = False
7775 self._group_unread_count_by_room = hs.config.push_group_unread_count_by_room
7876
79 # This is the highest stream ordering we know it's safe to process.
80 # When new events arrive, we'll be given a window of new events: we
81 # should honour this rather than just looking for anything higher
82 # because of potential out-of-order event serialisation. This starts
83 # off as None though as we don't know any better.
84 self.max_stream_ordering = None
85
86 if "data" not in pusherdict:
87 raise PusherConfigException("No 'data' key for HTTP pusher")
88 self.data = pusherdict["data"]
77 self.data = pusher_config.data
78 if self.data is None:
79 raise PusherConfigException("'data' key can not be null for HTTP pusher")
8980
9081 self.name = "%s/%s/%s" % (
91 pusherdict["user_name"],
92 pusherdict["app_id"],
93 pusherdict["pushkey"],
82 pusher_config.user_name,
83 pusher_config.app_id,
84 pusher_config.pushkey,
9485 )
9586
96 if self.data is None:
97 raise PusherConfigException("data can not be null for HTTP pusher")
98
87 # Validate that there's a URL and it is of the proper form.
9988 if "url" not in self.data:
10089 raise PusherConfigException("'url' required in data for HTTP pusher")
101 self.url = self.data["url"]
102 self.http_client = hs.get_proxied_http_client()
90
91 url = self.data["url"]
92 if not isinstance(url, str):
93 raise PusherConfigException("'url' must be a string")
94 url_parts = urllib.parse.urlparse(url)
95 # Note that the specification also says the scheme must be HTTPS, but
96 # it isn't up to the homeserver to verify that.
97 if url_parts.path != "/_matrix/push/v1/notify":
98 raise PusherConfigException(
99 "'url' must have a path of '/_matrix/push/v1/notify'"
100 )
101
102 self.url = url
103 self.http_client = hs.get_proxied_blacklisted_http_client()
103104 self.data_minus_url = {}
104105 self.data_minus_url.update(self.data)
105106 del self.data_minus_url["url"]
106107
107 def on_started(self, should_check_for_notifs):
108 def on_started(self, should_check_for_notifs: bool) -> None:
108109 """Called when this pusher has been started.
109110
110111 Args:
111 should_check_for_notifs (bool): Whether we should immediately
112 should_check_for_notifs: Whether we should immediately
112113 check for push to send. Set to False only if it's known there
113114 is nothing to send
114115 """
115116 if should_check_for_notifs:
116117 self._start_processing()
117118
118 def on_new_notifications(self, max_token: RoomStreamToken):
119 # We just use the minimum stream ordering and ignore the vector clock
120 # component. This is safe to do as long as we *always* ignore the vector
121 # clock components.
122 max_stream_ordering = max_token.stream
123
124 self.max_stream_ordering = max(
125 max_stream_ordering, self.max_stream_ordering or 0
126 )
127 self._start_processing()
128
129 def on_new_receipts(self, min_stream_id, max_stream_id):
119 def on_new_receipts(self, min_stream_id: int, max_stream_id: int) -> None:
130120 # Note that the min here shouldn't be relied upon to be accurate.
131121
132122 # We could check the receipts are actually m.read receipts here,
133123 # but currently that's the only type of receipt anyway...
134124 run_as_background_process("http_pusher.on_new_receipts", self._update_badge)
135125
136 async def _update_badge(self):
126 async def _update_badge(self) -> None:
137127 # XXX as per https://github.com/matrix-org/matrix-doc/issues/2627, this seems
138128 # to be largely redundant. perhaps we can remove it.
139129 badge = await push_tools.get_badge_count(
143133 )
144134 await self._send_badge(badge)
145135
146 def on_timer(self):
136 def on_timer(self) -> None:
147137 self._start_processing()
148138
149 def on_stop(self):
139 def on_stop(self) -> None:
150140 if self.timed_call:
151141 try:
152142 self.timed_call.cancel()
154144 pass
155145 self.timed_call = None
156146
157 def _start_processing(self):
147 def _start_processing(self) -> None:
158148 if self._is_processing:
159149 return
160150
161151 run_as_background_process("httppush.process", self._process)
162152
163 async def _process(self):
153 async def _process(self) -> None:
164154 # we should never get here if we are already processing
165155 assert not self._is_processing
166156
179169 finally:
180170 self._is_processing = False
181171
182 async def _unsafe_process(self):
172 async def _unsafe_process(self) -> None:
183173 """
184174 Looks for unset notifications and dispatch them, in order
185175 Never call this directly: use _process which will only allow this to
186176 run once per pusher.
187177 """
188
189 fn = self.store.get_unread_push_actions_for_user_in_range_for_http
190 unprocessed = await fn(
178 unprocessed = await self.store.get_unread_push_actions_for_user_in_range_for_http(
191179 self.user_id, self.last_stream_ordering, self.max_stream_ordering
192180 )
193181
256244 )
257245 self.backoff_delay = HttpPusher.INITIAL_BACKOFF_SEC
258246 self.last_stream_ordering = push_action["stream_ordering"]
259 pusher_still_exists = await self.store.update_pusher_last_stream_ordering(
247 await self.store.update_pusher_last_stream_ordering(
260248 self.app_id,
261249 self.pushkey,
262250 self.user_id,
263251 self.last_stream_ordering,
264252 )
265 if not pusher_still_exists:
266 # The pusher has been deleted while we were processing, so
267 # lets just stop and return.
268 self.on_stop()
269 return
270253
271254 self.failing_since = None
272255 await self.store.update_pusher_failing_since(
282265 )
283266 break
284267
285 async def _process_one(self, push_action):
268 async def _process_one(self, push_action: dict) -> bool:
286269 if "notify" not in push_action["actions"]:
287270 return True
288271
313296 await self.hs.remove_pusher(self.app_id, pk, self.user_id)
314297 return True
315298
316 async def _build_notification_dict(self, event, tweaks, badge):
299 async def _build_notification_dict(
300 self, event: EventBase, tweaks: Dict[str, bool], badge: int
301 ) -> Dict[str, Any]:
317302 priority = "low"
318303 if (
319304 event.type == EventTypes.Encrypted
324309 # or may do so (i.e. is encrypted so has unknown effects).
325310 priority = "high"
326311
312 # This was checked in the __init__, but mypy doesn't seem to know that.
313 assert self.data is not None
327314 if self.data.get("format") == "event_id_only":
328315 d = {
329316 "notification": {
343330 }
344331 return d
345332
346 ctx = await push_tools.get_context_for_event(
347 self.storage, self.state_handler, event, self.user_id
348 )
333 ctx = await push_tools.get_context_for_event(self.storage, event, self.user_id)
349334
350335 d = {
351336 "notification": {
385370
386371 return d
387372
388 async def dispatch_push(self, event, tweaks, badge):
373 async def dispatch_push(
374 self, event: EventBase, tweaks: Dict[str, bool], badge: int
375 ) -> Union[bool, Iterable[str]]:
389376 notification_dict = await self._build_notification_dict(event, tweaks, badge)
390377 if not notification_dict:
391378 return []
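The new validation in `HttpPusher.__init__` above rejects pusher configs whose `url` is not a string or does not use the path mandated by the push gateway API. That check can be exercised in isolation; a sketch using the same `urllib.parse` test as the hunk, with the exception type simplified to `ValueError` in place of `PusherConfigException`:

```python
import urllib.parse

NOTIFY_PATH = "/_matrix/push/v1/notify"


def validate_push_url(url: object) -> str:
    """Mirror the pusher-config checks: string type and the spec'd path.

    As the hunk notes, the spec also says the scheme must be HTTPS, but
    verifying that is not the homeserver's job.
    """
    if not isinstance(url, str):
        raise ValueError("'url' must be a string")
    if urllib.parse.urlparse(url).path != NOTIFY_PATH:
        raise ValueError("'url' must have a path of %r" % NOTIFY_PATH)
    return url
```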
1818 import urllib.parse
1919 from email.mime.multipart import MIMEMultipart
2020 from email.mime.text import MIMEText
21 from typing import Iterable, List, TypeVar
21 from typing import TYPE_CHECKING, Any, Dict, Iterable, List, Optional, TypeVar
2222
2323 import bleach
2424 import jinja2
2626 from synapse.api.constants import EventTypes, Membership
2727 from synapse.api.errors import StoreError
2828 from synapse.config.emailconfig import EmailSubjectConfig
29 from synapse.events import EventBase
2930 from synapse.logging.context import make_deferred_yieldable
3031 from synapse.push.presentable_names import (
3132 calculate_room_name,
3233 descriptor_from_member_events,
3334 name_from_member_event,
3435 )
35 from synapse.types import UserID
36 from synapse.types import StateMap, UserID
3637 from synapse.util.async_helpers import concurrently_execute
3738 from synapse.visibility import filter_events_for_client
39
40 if TYPE_CHECKING:
41 from synapse.app.homeserver import HomeServer
3842
3943 logger = logging.getLogger(__name__)
4044
9296
9397
9498 class Mailer:
95 def __init__(self, hs, app_name, template_html, template_text):
99 def __init__(
100 self,
101 hs: "HomeServer",
102 app_name: str,
103 template_html: jinja2.Template,
104 template_text: jinja2.Template,
105 ):
96106 self.hs = hs
97107 self.template_html = template_html
98108 self.template_text = template_text
107117
108118 logger.info("Created Mailer for app_name %s" % app_name)
109119
110 async def send_password_reset_mail(self, email_address, token, client_secret, sid):
120 async def send_password_reset_mail(
121 self, email_address: str, token: str, client_secret: str, sid: str
122 ) -> None:
111123 """Send an email with a password reset link to a user
112124
113125 Args:
114 email_address (str): Email address we're sending the password
126 email_address: Email address we're sending the password
115127 reset to
116 token (str): Unique token generated by the server to verify
128 token: Unique token generated by the server to verify
117129 the email was received
118 client_secret (str): Unique token generated by the client to
130 client_secret: Unique token generated by the client to
119131 group together multiple email sending attempts
120 sid (str): The generated session ID
132 sid: The generated session ID
121133 """
122134 params = {"token": token, "client_secret": client_secret, "sid": sid}
123135 link = (
135147 template_vars,
136148 )
137149
138 async def send_registration_mail(self, email_address, token, client_secret, sid):
150 async def send_registration_mail(
151 self, email_address: str, token: str, client_secret: str, sid: str
152 ) -> None:
139153 """Send an email with a registration confirmation link to a user
140154
141155 Args:
142 email_address (str): Email address we're sending the registration
156 email_address: Email address we're sending the registration
143157 link to
144 token (str): Unique token generated by the server to verify
158 token: Unique token generated by the server to verify
145159 the email was received
146 client_secret (str): Unique token generated by the client to
160 client_secret: Unique token generated by the client to
147161 group together multiple email sending attempts
148 sid (str): The generated session ID
162 sid: The generated session ID
149163 """
150164 params = {"token": token, "client_secret": client_secret, "sid": sid}
151165 link = (
163177 template_vars,
164178 )
165179
166 async def send_add_threepid_mail(self, email_address, token, client_secret, sid):
180 async def send_add_threepid_mail(
181 self, email_address: str, token: str, client_secret: str, sid: str
182 ) -> None:
167183 """Send an email with a validation link to a user for adding a 3pid to their account
168184
169185 Args:
170 email_address (str): Email address we're sending the validation link to
171
172 token (str): Unique token generated by the server to verify the email was received
173
174 client_secret (str): Unique token generated by the client to group together
186 email_address: Email address we're sending the validation link to
187
188 token: Unique token generated by the server to verify the email was received
189
190 client_secret: Unique token generated by the client to group together
175191 multiple email sending attempts
176192
177 sid (str): The generated session ID
193 sid: The generated session ID
178194 """
179195 params = {"token": token, "client_secret": client_secret, "sid": sid}
180196 link = (
193209 )
194210
195211 async def send_notification_mail(
196 self, app_id, user_id, email_address, push_actions, reason
197 ):
212 self,
213 app_id: str,
214 user_id: str,
215 email_address: str,
216 push_actions: Iterable[Dict[str, Any]],
217 reason: Dict[str, Any],
218 ) -> None:
198219 """Send email regarding a user's room notifications"""
199220 rooms_in_order = deduped_ordered_list([pa["room_id"] for pa in push_actions])
200221
202223 [pa["event_id"] for pa in push_actions]
203224 )
204225
205 notifs_by_room = {}
226 notifs_by_room = {} # type: Dict[str, List[Dict[str, Any]]]
206227 for pa in push_actions:
207228 notifs_by_room.setdefault(pa["room_id"], []).append(pa)
208229
261282
262283 await self.send_email(email_address, summary_text, template_vars)
263284
264 async def send_email(self, email_address, subject, extra_template_vars):
285 async def send_email(
286 self, email_address: str, subject: str, extra_template_vars: Dict[str, Any]
287 ) -> None:
265288 """Send an email with the given information and template text"""
266289 try:
267290 from_string = self.hs.config.email_notif_from % {"app": self.app_name}
314337 )
315338
316339 async def get_room_vars(
317 self, room_id, user_id, notifs, notif_events, room_state_ids
318 ):
340 self,
341 room_id: str,
342 user_id: str,
343 notifs: Iterable[Dict[str, Any]],
344 notif_events: Dict[str, EventBase],
345 room_state_ids: StateMap[str],
346 ) -> Dict[str, Any]:
319347 # Check if one of the notifs is an invite event for the user.
320348 is_invite = False
321349 for n in notifs:
333361 "notifs": [],
334362 "invite": is_invite,
335363 "link": self.make_room_link(room_id),
336 }
364 } # type: Dict[str, Any]
337365
338366 if not is_invite:
339367 for n in notifs:
364392
365393 return room_vars
366394
367 async def get_notif_vars(self, notif, user_id, notif_event, room_state_ids):
395 async def get_notif_vars(
396 self,
397 notif: Dict[str, Any],
398 user_id: str,
399 notif_event: EventBase,
400 room_state_ids: StateMap[str],
401 ) -> Dict[str, Any]:
368402 results = await self.store.get_events_around(
369403 notif["room_id"],
370404 notif["event_id"],
390424
391425 return ret
392426
393 async def get_message_vars(self, notif, event, room_state_ids):
427 async def get_message_vars(
428 self, notif: Dict[str, Any], event: EventBase, room_state_ids: StateMap[str]
429 ) -> Optional[Dict[str, Any]]:
394430 if event.type != EventTypes.Message and event.type != EventTypes.Encrypted:
395431 return None
396432
431467
432468 return ret
433469
434 def add_text_message_vars(self, messagevars, event):
470 def add_text_message_vars(
471 self, messagevars: Dict[str, Any], event: EventBase
472 ) -> None:
435473 msgformat = event.content.get("format")
436474
437475 messagevars["format"] = msgformat
444482 elif body:
445483 messagevars["body_text_html"] = safe_text(body)
446484
447 return messagevars
448
449 def add_image_message_vars(self, messagevars, event):
450 messagevars["image_url"] = event.content["url"]
451
452 return messagevars
485 def add_image_message_vars(
486 self, messagevars: Dict[str, Any], event: EventBase
487 ) -> None:
488 """
489 Potentially add an image URL to the message variables.
490 """
491 if "url" in event.content:
492 messagevars["image_url"] = event.content["url"]
453493
454494 async def make_summary_text(
455 self, notifs_by_room, room_state_ids, notif_events, user_id, reason
495 self,
496 notifs_by_room: Dict[str, List[Dict[str, Any]]],
497 room_state_ids: Dict[str, StateMap[str]],
498 notif_events: Dict[str, EventBase],
499 user_id: str,
500 reason: Dict[str, Any],
456501 ):
457502 if len(notifs_by_room) == 1:
458503 # Only one room has new stuff
579624 "app": self.app_name,
580625 }
581626
582 def make_room_link(self, room_id):
627 def make_room_link(self, room_id: str) -> str:
583628 if self.hs.config.email_riot_base_url:
584629 base_url = "%s/#/room" % (self.hs.config.email_riot_base_url)
585630 elif self.app_name == "Vector":
589634 base_url = "https://matrix.to/#"
590635 return "%s/%s" % (base_url, room_id)
591636
592 def make_notif_link(self, notif):
637 def make_notif_link(self, notif: Dict[str, str]) -> str:
593638 if self.hs.config.email_riot_base_url:
594639 return "%s/#/room/%s/%s" % (
595640 self.hs.config.email_riot_base_url,
605650 else:
606651 return "https://matrix.to/#/%s/%s" % (notif["room_id"], notif["event_id"])
607652
608 def make_unsubscribe_link(self, user_id, app_id, email_address):
653 def make_unsubscribe_link(
654 self, user_id: str, app_id: str, email_address: str
655 ) -> str:
609656 params = {
610657 "access_token": self.macaroon_gen.generate_delete_pusher_token(user_id),
611658 "app_id": app_id,
619666 )
620667
621668
622 def safe_markup(raw_html):
669 def safe_markup(raw_html: str) -> jinja2.Markup:
623670 return jinja2.Markup(
624671 bleach.linkify(
625672 bleach.clean(
634681 )
635682
636683
637 def safe_text(raw_text):
684 def safe_text(raw_text: str) -> jinja2.Markup:
638685 """
639686 Process text: treat it as HTML but escape any tags (ie. just escape the
640687 HTML) then linkify it.
654701 return ret
655702
656703
657 def string_ordinal_total(s):
704 def string_ordinal_total(s: str) -> int:
658705 tot = 0
659706 for c in s:
660707 tot += ord(c)
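The newly annotated `string_ordinal_total` is just a deterministic sum of Unicode code points. A standalone sketch of the function shown in this hunk:

```python
def string_ordinal_total(s: str) -> int:
    # Sum the Unicode code point of every character in the string.
    tot = 0
    for c in s:
        tot += ord(c)
    return tot

string_ordinal_total("abc")  # 97 + 98 + 99 = 294
```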
1414
1515 import logging
1616 import re
17 from typing import TYPE_CHECKING, Dict, Iterable, Optional
1718
1819 from synapse.api.constants import EventTypes
20 from synapse.events import EventBase
21 from synapse.types import StateMap
22
23 if TYPE_CHECKING:
24 from synapse.storage.databases.main import DataStore
1925
2026 logger = logging.getLogger(__name__)
2127
2733
2834
2935 async def calculate_room_name(
30 store,
31 room_state_ids,
32 user_id,
33 fallback_to_members=True,
34 fallback_to_single_member=True,
35 ):
36 store: "DataStore",
37 room_state_ids: StateMap[str],
38 user_id: str,
39 fallback_to_members: bool = True,
40 fallback_to_single_member: bool = True,
41 ) -> Optional[str]:
3642 """
3743 Works out a user-facing name for the given room as per Matrix
3844 spec recommendations.
3945 Does not yet support internationalisation.
4046 Args:
41 room_state: Dictionary of the room's state
47 store: The data store to query.
48 room_state_ids: Dictionary of the room's state IDs.
4249 user_id: The ID of the user to whom the room name is being presented
4350 fallback_to_members: If False, return None instead of generating a name
4451 based on the room's members if the room has no
4552 title or aliases.
53 fallback_to_single_member: If False, return None instead of generating a
54 name based on the user who invited this user to the room if the room
55 has no title or aliases.
4656
4757 Returns:
48 (string or None) A human readable name for the room.
58 A human readable name for the room, if possible.
4959 """
5060 # does it have a name?
5161 if (EventTypes.Name, "") in room_state_ids:
96106 name_from_member_event(inviter_member_event),
97107 )
98108 else:
99 return
109 return None
100110 else:
101111 return "Room Invite"
102112
149159 else:
150160 return ALL_ALONE
151161 elif len(other_members) == 1 and not fallback_to_single_member:
152 return
153 else:
154 return descriptor_from_member_events(other_members)
155
156
157 def descriptor_from_member_events(member_events):
162 return None
163
164 return descriptor_from_member_events(other_members)
165
166
167 def descriptor_from_member_events(member_events: Iterable[EventBase]) -> str:
158168 """Get a description of the room based on the member events.
159169
160170 Args:
161 member_events (Iterable[FrozenEvent])
171 member_events: The events of a room.
162172
163173 Returns:
164 str
174 The room description
165175 """
166176
167177 member_events = list(member_events)
182192 )
183193
184194
185 def name_from_member_event(member_event):
195 def name_from_member_event(member_event: EventBase) -> str:
186196 if (
187197 member_event.content
188198 and "displayname" in member_event.content
192202 return member_event.state_key
193203
194204
195 def _state_as_two_level_dict(state):
196 ret = {}
205 def _state_as_two_level_dict(state: StateMap[str]) -> Dict[str, Dict[str, str]]:
206 ret = {} # type: Dict[str, Dict[str, str]]
197207 for k, v in state.items():
198208 ret.setdefault(k[0], {})[k[1]] = v
199209 return ret
200210
201211
202 def _looks_like_an_alias(string):
212 def _looks_like_an_alias(string: str) -> bool:
203213 return ALIAS_RE.match(string) is not None
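`_state_as_two_level_dict` converts a flat `StateMap` keyed by `(event_type, state_key)` tuples into a nested dict so callers can look up all state of one type at once. A minimal standalone sketch of the same transformation:

```python
from typing import Dict, Tuple

def state_as_two_level_dict(
    state: Dict[Tuple[str, str], str]
) -> Dict[str, Dict[str, str]]:
    # ("m.room.member", "@alice:hs") -> event_id becomes
    # {"m.room.member": {"@alice:hs": event_id}}
    ret = {}  # type: Dict[str, Dict[str, str]]
    for (etype, state_key), v in state.items():
        ret.setdefault(etype, {})[state_key] = v
    return ret
```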
2929 INEQUALITY_EXPR = re.compile("^([=<>]*)([0-9]*)$")
3030
3131
32 def _room_member_count(ev, condition, room_member_count):
32 def _room_member_count(
33 ev: EventBase, condition: Dict[str, Any], room_member_count: int
34 ) -> bool:
3335 return _test_ineq_condition(condition, room_member_count)
3436
3537
36 def _sender_notification_permission(ev, condition, sender_power_level, power_levels):
38 def _sender_notification_permission(
39 ev: EventBase,
40 condition: Dict[str, Any],
41 sender_power_level: int,
42 power_levels: Dict[str, Union[int, Dict[str, int]]],
43 ) -> bool:
3744 notif_level_key = condition.get("key")
3845 if notif_level_key is None:
3946 return False
4047
4148 notif_levels = power_levels.get("notifications", {})
49 assert isinstance(notif_levels, dict)
4250 room_notif_level = notif_levels.get(notif_level_key, 50)
4351
4452 return sender_power_level >= room_notif_level
4553
4654
47 def _test_ineq_condition(condition, number):
55 def _test_ineq_condition(condition: Dict[str, Any], number: int) -> bool:
4856 if "is" not in condition:
4957 return False
5058 m = INEQUALITY_EXPR.match(condition["is"])
109117 event: EventBase,
110118 room_member_count: int,
111119 sender_power_level: int,
112 power_levels: dict,
120 power_levels: Dict[str, Union[int, Dict[str, int]]],
113121 ):
114122 self._event = event
115123 self._room_member_count = room_member_count
119127 # Maps strings of e.g. 'content.body' -> event["content"]["body"]
120128 self._value_cache = _flatten_dict(event)
121129
122 def matches(self, condition: dict, user_id: str, display_name: str) -> bool:
130 def matches(
131 self, condition: Dict[str, Any], user_id: str, display_name: str
132 ) -> bool:
123133 if condition["kind"] == "event_match":
124134 return self._event_match(condition, user_id)
125135 elif condition["kind"] == "contains_display_name":
260270 return r"(^|\W)%s(\W|$)" % (r,)
261271
262272
263 def _flatten_dict(d, prefix=[], result=None):
273 def _flatten_dict(
274 d: Union[EventBase, dict],
275 prefix: Optional[List[str]] = None,
276 result: Optional[Dict[str, str]] = None,
277 ) -> Dict[str, str]:
278 if prefix is None:
279 prefix = []
264280 if result is None:
265281 result = {}
266282 for key, value in d.items():
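`_flatten_dict` maps nested event content to dotted keys so a push-rule condition key like `content.body` can be looked up directly in the value cache. A simplified sketch; the lower-casing of values reflects that push-rule matching is case-insensitive, which is an assumption here rather than something shown in this hunk:

```python
from typing import Dict, List, Optional

def flatten_dict(
    d: dict,
    prefix: Optional[List[str]] = None,
    result: Optional[Dict[str, str]] = None,
) -> Dict[str, str]:
    # Avoid the mutable-default pitfall the diff fixes: fresh containers per call.
    if prefix is None:
        prefix = []
    if result is None:
        result = {}
    for key, value in d.items():
        if isinstance(value, str):
            result[".".join(prefix + [key])] = value.lower()
        elif isinstance(value, dict):
            flatten_dict(value, prefix + [key], result)
    return result
```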
1111 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
1212 # See the License for the specific language governing permissions and
1313 # limitations under the License.
14 from typing import Dict
15
16 from synapse.events import EventBase
1417 from synapse.push.presentable_names import calculate_room_name, name_from_member_event
1518 from synapse.storage import Storage
1619 from synapse.storage.databases.main import DataStore
4548 return badge
4649
4750
48 async def get_context_for_event(storage: Storage, state_handler, ev, user_id):
51 async def get_context_for_event(
52 storage: Storage, ev: EventBase, user_id: str
53 ) -> Dict[str, str]:
4954 ctx = {}
5055
5156 room_state_ids = await storage.state.get_state_ids_for_event(ev.event_id)
1313 # limitations under the License.
1414
1515 import logging
16 from typing import TYPE_CHECKING, Callable, Dict, Optional
1617
18 from synapse.push import Pusher, PusherConfig
1719 from synapse.push.emailpusher import EmailPusher
20 from synapse.push.httppusher import HttpPusher
1821 from synapse.push.mailer import Mailer
1922
20 from .httppusher import HttpPusher
23 if TYPE_CHECKING:
24 from synapse.app.homeserver import HomeServer
2125
2226 logger = logging.getLogger(__name__)
2327
2428
2529 class PusherFactory:
26 def __init__(self, hs):
30 def __init__(self, hs: "HomeServer"):
2731 self.hs = hs
2832 self.config = hs.config
2933
30 self.pusher_types = {"http": HttpPusher}
34 self.pusher_types = {
35 "http": HttpPusher
36 } # type: Dict[str, Callable[[HomeServer, PusherConfig], Pusher]]
3137
3238 logger.info("email enable notifs: %r", hs.config.email_enable_notifs)
3339 if hs.config.email_enable_notifs:
34 self.mailers = {} # app_name -> Mailer
40 self.mailers = {} # type: Dict[str, Mailer]
3541
3642 self._notif_template_html = hs.config.email_notif_template_html
3743 self._notif_template_text = hs.config.email_notif_template_text
4046
4147 logger.info("defined email pusher type")
4248
43 def create_pusher(self, pusherdict):
44 kind = pusherdict["kind"]
49 def create_pusher(self, pusher_config: PusherConfig) -> Optional[Pusher]:
50 kind = pusher_config.kind
4551 f = self.pusher_types.get(kind, None)
4652 if not f:
4753 return None
48 logger.debug("creating %s pusher for %r", kind, pusherdict)
49 return f(self.hs, pusherdict)
54 logger.debug("creating %s pusher for %r", kind, pusher_config)
55 return f(self.hs, pusher_config)
5056
51 def _create_email_pusher(self, _hs, pusherdict):
52 app_name = self._app_name_from_pusherdict(pusherdict)
57 def _create_email_pusher(
58 self, _hs: "HomeServer", pusher_config: PusherConfig
59 ) -> EmailPusher:
60 app_name = self._app_name_from_pusherdict(pusher_config)
5361 mailer = self.mailers.get(app_name)
5462 if not mailer:
5563 mailer = Mailer(
5967 template_text=self._notif_template_text,
6068 )
6169 self.mailers[app_name] = mailer
62 return EmailPusher(self.hs, pusherdict, mailer)
70 return EmailPusher(self.hs, pusher_config, mailer)
6371
64 def _app_name_from_pusherdict(self, pusherdict):
65 data = pusherdict["data"]
72 def _app_name_from_pusherdict(self, pusher_config: PusherConfig) -> str:
73 data = pusher_config.data
6674
6775 if isinstance(data, dict):
6876 brand = data.get("brand")
1414 # limitations under the License.
1515
1616 import logging
17 from typing import TYPE_CHECKING, Dict, Union
17 from typing import TYPE_CHECKING, Dict, Iterable, Optional
1818
1919 from prometheus_client import Gauge
2020
2222 run_as_background_process,
2323 wrap_as_background_process,
2424 )
25 from synapse.push import PusherConfigException
26 from synapse.push.emailpusher import EmailPusher
27 from synapse.push.httppusher import HttpPusher
25 from synapse.push import Pusher, PusherConfig, PusherConfigException
2826 from synapse.push.pusher import PusherFactory
29 from synapse.types import RoomStreamToken
27 from synapse.types import JsonDict, RoomStreamToken
3028 from synapse.util.async_helpers import concurrently_execute
3129
3230 if TYPE_CHECKING:
7674 self._last_room_stream_id_seen = self.store.get_room_max_stream_ordering()
7775
7876 # map from user id to app_id:pushkey to pusher
79 self.pushers = {} # type: Dict[str, Dict[str, Union[HttpPusher, EmailPusher]]]
80
81 def start(self):
77 self.pushers = {} # type: Dict[str, Dict[str, Pusher]]
78
79 def start(self) -> None:
8280 """Starts the pushers off in a background process.
8381 """
8482 if not self._should_start_pushers:
8886
8987 async def add_pusher(
9088 self,
91 user_id,
92 access_token,
93 kind,
94 app_id,
95 app_display_name,
96 device_display_name,
97 pushkey,
98 lang,
99 data,
100 profile_tag="",
101 ):
89 user_id: str,
90 access_token: Optional[int],
91 kind: str,
92 app_id: str,
93 app_display_name: str,
94 device_display_name: str,
95 pushkey: str,
96 lang: Optional[str],
97 data: JsonDict,
98 profile_tag: str = "",
99 ) -> Optional[Pusher]:
102100 """Creates a new pusher and adds it to the pool
103101
104102 Returns:
105 EmailPusher|HttpPusher
103 The newly created pusher.
106104 """
107105
108106 time_now_msec = self.clock.time_msec()
107
108 # create the pusher setting last_stream_ordering to the current maximum
109 # stream ordering, so it will process pushes from this point onwards.
110 last_stream_ordering = self.store.get_room_max_stream_ordering()
109111
110112 # we try to create the pusher just to validate the config: it
111113 # will then get pulled out of the database,
112114 # recreated, added and started: this means we have only one
113115 # code path adding pushers.
114116 self.pusher_factory.create_pusher(
115 {
116 "id": None,
117 "user_name": user_id,
118 "kind": kind,
119 "app_id": app_id,
120 "app_display_name": app_display_name,
121 "device_display_name": device_display_name,
122 "pushkey": pushkey,
123 "ts": time_now_msec,
124 "lang": lang,
125 "data": data,
126 "last_stream_ordering": None,
127 "last_success": None,
128 "failing_since": None,
129 }
117 PusherConfig(
118 id=None,
119 user_name=user_id,
120 access_token=access_token,
121 profile_tag=profile_tag,
122 kind=kind,
123 app_id=app_id,
124 app_display_name=app_display_name,
125 device_display_name=device_display_name,
126 pushkey=pushkey,
127 ts=time_now_msec,
128 lang=lang,
129 data=data,
130 last_stream_ordering=last_stream_ordering,
131 last_success=None,
132 failing_since=None,
133 )
130134 )
131
132 # create the pusher setting last_stream_ordering to the current maximum
133 # stream ordering in event_push_actions, so it will process
134 # pushes from this point onwards.
135 last_stream_ordering = await self.store.get_latest_push_action_stream_ordering()
136135
137136 await self.store.add_pusher(
138137 user_id=user_id,
153152 return pusher
154153
155154 async def remove_pushers_by_app_id_and_pushkey_not_user(
156 self, app_id, pushkey, not_user_id
157 ):
155 self, app_id: str, pushkey: str, not_user_id: str
156 ) -> None:
158157 to_remove = await self.store.get_pushers_by_app_id_and_pushkey(app_id, pushkey)
159158 for p in to_remove:
160 if p["user_name"] != not_user_id:
159 if p.user_name != not_user_id:
161160 logger.info(
162161 "Removing pusher for app id %s, pushkey %s, user %s",
163162 app_id,
164163 pushkey,
165 p["user_name"],
164 p.user_name,
166165 )
167 await self.remove_pusher(p["app_id"], p["pushkey"], p["user_name"])
168
169 async def remove_pushers_by_access_token(self, user_id, access_tokens):
166 await self.remove_pusher(p.app_id, p.pushkey, p.user_name)
167
168 async def remove_pushers_by_access_token(
169 self, user_id: str, access_tokens: Iterable[int]
170 ) -> None:
170171 """Remove the pushers for a given user corresponding to a set of
171172 access_tokens.
172173
173174 Args:
174 user_id (str): user to remove pushers for
175 access_tokens (Iterable[int]): access token *ids* to remove pushers
176 for
175 user_id: user to remove pushers for
176 access_tokens: access token *ids* to remove pushers for
177177 """
178178 if not self._pusher_shard_config.should_handle(self._instance_name, user_id):
179179 return
180180
181181 tokens = set(access_tokens)
182182 for p in await self.store.get_pushers_by_user_id(user_id):
183 if p["access_token"] in tokens:
183 if p.access_token in tokens:
184184 logger.info(
185185 "Removing pusher for app id %s, pushkey %s, user %s",
186 p["app_id"],
187 p["pushkey"],
188 p["user_name"],
186 p.app_id,
187 p.pushkey,
188 p.user_name,
189189 )
190 await self.remove_pusher(p["app_id"], p["pushkey"], p["user_name"])
191
192 def on_new_notifications(self, max_token: RoomStreamToken):
190 await self.remove_pusher(p.app_id, p.pushkey, p.user_name)
191
192 def on_new_notifications(self, max_token: RoomStreamToken) -> None:
193193 if not self.pushers:
194194 # nothing to do here.
195195 return
208208 self._on_new_notifications(max_token)
209209
210210 @wrap_as_background_process("on_new_notifications")
211 async def _on_new_notifications(self, max_token: RoomStreamToken):
211 async def _on_new_notifications(self, max_token: RoomStreamToken) -> None:
212212 # We just use the minimum stream ordering and ignore the vector clock
213213 # component. This is safe to do as long as we *always* ignore the vector
214214 # clock components.
238238 except Exception:
239239 logger.exception("Exception in pusher on_new_notifications")
240240
241 async def on_new_receipts(self, min_stream_id, max_stream_id, affected_room_ids):
241 async def on_new_receipts(
242 self, min_stream_id: int, max_stream_id: int, affected_room_ids: Iterable[str]
243 ) -> None:
242244 if not self.pushers:
243245 # nothing to do here.
244246 return
266268 except Exception:
267269 logger.exception("Exception in pusher on_new_receipts")
268270
269 async def start_pusher_by_id(self, app_id, pushkey, user_id):
271 async def start_pusher_by_id(
272 self, app_id: str, pushkey: str, user_id: str
273 ) -> Optional[Pusher]:
270274 """Look up the details for the given pusher, and start it
271275
272276 Returns:
273 EmailPusher|HttpPusher|None: The pusher started, if any
277 The pusher started, if any
274278 """
275279 if not self._should_start_pushers:
276 return
280 return None
277281
278282 if not self._pusher_shard_config.should_handle(self._instance_name, user_id):
279 return
283 return None
280284
281285 resultlist = await self.store.get_pushers_by_app_id_and_pushkey(app_id, pushkey)
282286
283 pusher_dict = None
287 pusher_config = None
284288 for r in resultlist:
285 if r["user_name"] == user_id:
286 pusher_dict = r
289 if r.user_name == user_id:
290 pusher_config = r
287291
288292 pusher = None
289 if pusher_dict:
290 pusher = await self._start_pusher(pusher_dict)
293 if pusher_config:
294 pusher = await self._start_pusher(pusher_config)
291295
292296 return pusher
293297
302306
303307 logger.info("Started pushers")
304308
305 async def _start_pusher(self, pusherdict):
309 async def _start_pusher(self, pusher_config: PusherConfig) -> Optional[Pusher]:
306310 """Start the given pusher
307311
308312 Args:
309 pusherdict (dict): dict with the values pulled from the db table
313 pusher_config: The pusher configuration with the values pulled from the db table
310314
311315 Returns:
312 EmailPusher|HttpPusher
316 The newly created pusher or None.
313317 """
314318 if not self._pusher_shard_config.should_handle(
315 self._instance_name, pusherdict["user_name"]
319 self._instance_name, pusher_config.user_name
316320 ):
317 return
321 return None
318322
319323 try:
320 p = self.pusher_factory.create_pusher(pusherdict)
324 p = self.pusher_factory.create_pusher(pusher_config)
321325 except PusherConfigException as e:
322326 logger.warning(
323327 "Pusher incorrectly configured id=%i, user=%s, appid=%s, pushkey=%s: %s",
324 pusherdict["id"],
325 pusherdict.get("user_name"),
326 pusherdict.get("app_id"),
327 pusherdict.get("pushkey"),
328 pusher_config.id,
329 pusher_config.user_name,
330 pusher_config.app_id,
331 pusher_config.pushkey,
328332 e,
329333 )
330 return
334 return None
331335 except Exception:
332336 logger.exception(
333 "Couldn't start pusher id %i: caught Exception", pusherdict["id"],
334 )
335 return
337 "Couldn't start pusher id %i: caught Exception", pusher_config.id,
338 )
339 return None
336340
337341 if not p:
338 return
339
340 appid_pushkey = "%s:%s" % (pusherdict["app_id"], pusherdict["pushkey"])
341
342 byuser = self.pushers.setdefault(pusherdict["user_name"], {})
342 return None
343
344 appid_pushkey = "%s:%s" % (pusher_config.app_id, pusher_config.pushkey)
345
346 byuser = self.pushers.setdefault(pusher_config.user_name, {})
343347 if appid_pushkey in byuser:
344348 byuser[appid_pushkey].on_stop()
345349 byuser[appid_pushkey] = p
349353 # Check if there *may* be push to process. We do this as this check is a
350354 # lot cheaper to do than actually fetching the exact rows we need to
351355 # push.
352 user_id = pusherdict["user_name"]
353 last_stream_ordering = pusherdict["last_stream_ordering"]
356 user_id = pusher_config.user_name
357 last_stream_ordering = pusher_config.last_stream_ordering
354358 if last_stream_ordering:
355359 have_notifs = await self.store.get_if_maybe_push_in_range_for_user(
356360 user_id, last_stream_ordering
364368
365369 return p
366370
367 async def remove_pusher(self, app_id, pushkey, user_id):
371 async def remove_pusher(self, app_id: str, pushkey: str, user_id: str) -> None:
368372 appid_pushkey = "%s:%s" % (app_id, pushkey)
369373
370374 byuser = self.pushers.get(user_id, {})
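The hunks above replace dict-style pusher rows (`p["app_id"]`) with attribute access on a typed `PusherConfig` object. Synapse's actual class is an attrs class with many more fields; this stdlib-dataclass version with a reduced, hypothetical field set just illustrates the shape of the change:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PusherConfig:
    # Reduced field set for illustration only.
    id: Optional[int]
    user_name: str
    app_id: str
    pushkey: str

p = PusherConfig(id=None, user_name="@alice:hs", app_id="m.http", pushkey="key1")
# Attribute access (p.app_id) replaces dict lookups (p["app_id"]),
# so typos become AttributeErrors instead of silent KeyErrors at runtime.
appid_pushkey = "%s:%s" % (p.app_id, p.pushkey)
```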
105105
106106 assert self.METHOD in ("PUT", "POST", "GET")
107107
108 self._replication_secret = None
109 if hs.config.worker.worker_replication_secret:
110 self._replication_secret = hs.config.worker.worker_replication_secret
111
112 def _check_auth(self, request) -> None:
113 # Get the authorization header.
114 auth_headers = request.requestHeaders.getRawHeaders(b"Authorization")
115
116 if len(auth_headers) > 1:
117 raise RuntimeError("Too many Authorization headers.")
118 parts = auth_headers[0].split(b" ")
119 if parts[0] == b"Bearer" and len(parts) == 2:
120 received_secret = parts[1].decode("ascii")
121 if self._replication_secret == received_secret:
122 # Success!
123 return
124
125 raise RuntimeError("Invalid Authorization header.")
126
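The new `_check_auth` guards replication endpoints with a shared-secret bearer token, and the client side adds the matching header when a secret is configured. A standalone sketch of both halves (the plain-dict header shape is a simplification of Twisted's `requestHeaders`):

```python
from typing import Optional

def build_replication_headers(secret: Optional[str]) -> dict:
    # Client side: add the Authorization header only when a secret is set.
    headers = {}
    if secret:
        headers[b"Authorization"] = [b"Bearer " + secret.encode("ascii")]
    return headers

def check_auth(headers: dict, expected_secret: str) -> None:
    # Server side: mirrors _check_auth above.
    auth_headers = headers.get(b"Authorization", [])
    if len(auth_headers) > 1:
        raise RuntimeError("Too many Authorization headers.")
    parts = auth_headers[0].split(b" ")
    if parts[0] == b"Bearer" and len(parts) == 2:
        if parts[1].decode("ascii") == expected_secret:
            return  # success
    raise RuntimeError("Invalid Authorization header.")
```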
108127 @abc.abstractmethod
109128 async def _serialize_payload(**kwargs):
110129 """Static method that is called when creating a request.
148167 instance_map = hs.config.worker.instance_map
149168
150169 outgoing_gauge = _pending_outgoing_requests.labels(cls.NAME)
170
171 replication_secret = None
172 if hs.config.worker.worker_replication_secret:
173 replication_secret = hs.config.worker.worker_replication_secret.encode(
174 "ascii"
175 )
151176
152177 @trace(opname="outgoing_replication_request")
153178 @outgoing_gauge.track_inprogress()
201226 # the master, and so whether we should clean up or not.
202227 while True:
203228 headers = {} # type: Dict[bytes, List[bytes]]
229 # Add an authorization header, if configured.
230 if replication_secret:
231 headers[b"Authorization"] = [b"Bearer " + replication_secret]
204232 inject_active_span_byte_dict(headers, None, check_destination=False)
205233 try:
206234 result = await request_func(uri, data, headers=headers)
235263 """
236264
237265 url_args = list(self.PATH_ARGS)
238 handler = self._handle_request
239266 method = self.METHOD
240267
241268 if self.CACHE:
242 handler = self._cached_handler # type: ignore
243269 url_args.append("txn_id")
244270
245271 args = "/".join("(?P<%s>[^/]+)" % (arg,) for arg in url_args)
246272 pattern = re.compile("^/_synapse/replication/%s/%s$" % (self.NAME, args))
247273
248274 http_server.register_paths(
249 method, [pattern], handler, self.__class__.__name__,
275 method, [pattern], self._check_auth_and_handle, self.__class__.__name__,
250276 )
251277
252 def _cached_handler(self, request, txn_id, **kwargs):
278 def _check_auth_and_handle(self, request, **kwargs):
253279 """Called on each new incoming request. Checks the replication
254280 secret (if one is configured), then, if caching is enabled, returns a
255281 cached response where available, otherwise calls `_handle_request` and caches its response.
257283 # We just use the txn_id here, but we probably also want to use the
258284 # other PATH_ARGS as well.
259285
260 assert self.CACHE
261
262 return self.response_cache.wrap(txn_id, self._handle_request, request, **kwargs)
286 # Check the authorization headers before handling the request.
287 if self._replication_secret:
288 self._check_auth(request)
289
290 if self.CACHE:
291 txn_id = kwargs.pop("txn_id")
292
293 return self.response_cache.wrap(
294 txn_id, self._handle_request, request, **kwargs
295 )
296
297 return self._handle_request(request, **kwargs)
3535 self.registration_handler = hs.get_registration_handler()
3636
3737 @staticmethod
38 async def _serialize_payload(user_id, device_id, initial_display_name, is_guest):
38 async def _serialize_payload(
39 user_id, device_id, initial_display_name, is_guest, is_appservice_ghost
40 ):
3941 """
4042 Args:
4143 device_id (str|None): Device ID to use, if None a new one is
4749 "device_id": device_id,
4850 "initial_display_name": initial_display_name,
4951 "is_guest": is_guest,
52 "is_appservice_ghost": is_appservice_ghost,
5053 }
5154
5255 async def _handle_request(self, request, user_id):
5558 device_id = content["device_id"]
5659 initial_display_name = content["initial_display_name"]
5760 is_guest = content["is_guest"]
61 is_appservice_ghost = content["is_appservice_ghost"]
5862
5963 device_id, access_token = await self.registration_handler.register_device(
60 user_id, device_id, initial_display_name, is_guest
64 user_id,
65 device_id,
66 initial_display_name,
67 is_guest,
68 is_appservice_ghost=is_appservice_ghost,
6169 )
6270
6371 return 200, {"device_id": device_id, "access_token": access_token}
1111 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
1212 # See the License for the specific language governing permissions and
1313 # limitations under the License.
14 from typing import List, Optional, Tuple
1415
16 from synapse.storage.types import Connection
1517 from synapse.storage.util.id_generators import _load_current_id
1618
1719
1820 class SlavedIdTracker:
19 def __init__(self, db_conn, table, column, extra_tables=[], step=1):
21 def __init__(
22 self,
23 db_conn: Connection,
24 table: str,
25 column: str,
26 extra_tables: Optional[List[Tuple[str, str]]] = None,
27 step: int = 1,
28 ):
2029 self.step = step
2130 self._current = _load_current_id(db_conn, table, column, step)
22 for table, column in extra_tables:
23 self.advance(None, _load_current_id(db_conn, table, column))
31 if extra_tables:
32 for table, column in extra_tables:
33 self.advance(None, _load_current_id(db_conn, table, column))
2434
25 def advance(self, instance_name, new_id):
35 def advance(self, instance_name: Optional[str], new_id: int):
2636 self._current = (max if self.step > 0 else min)(self._current, new_id)
2737
28 def get_current_token(self):
38 def get_current_token(self) -> int:
2939 """
3040
3141 Returns:
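`SlavedIdTracker.advance` keeps the furthest-advanced stream position: `max` for forward streams, `min` for backwards streams (negative `step`). A minimal sketch of that behaviour:

```python
class SlavedIdTracker:
    def __init__(self, current: int, step: int = 1):
        self.step = step
        self._current = current

    def advance(self, new_id: int) -> None:
        # Forwards streams keep the largest id seen; backwards streams
        # (step < 0) keep the smallest.
        self._current = (max if self.step > 0 else min)(self._current, new_id)

    def get_current_token(self) -> int:
        return self._current

t = SlavedIdTracker(5)
t.advance(3)  # behind the current position, so ignored
t.advance(9)
t.get_current_token()  # 9
```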
1212 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
1313 # See the License for the specific language governing permissions and
1414 # limitations under the License.
15 from typing import TYPE_CHECKING
1516
1617 from synapse.replication.tcp.streams import PushersStream
1718 from synapse.storage.database import DatabasePool
1819 from synapse.storage.databases.main.pusher import PusherWorkerStore
20 from synapse.storage.types import Connection
1921
2022 from ._base import BaseSlavedStore
2123 from ._slaved_id_tracker import SlavedIdTracker
2224
25 if TYPE_CHECKING:
26 from synapse.app.homeserver import HomeServer
27
2328
2429 class SlavedPusherStore(PusherWorkerStore, BaseSlavedStore):
25 def __init__(self, database: DatabasePool, db_conn, hs):
30 def __init__(self, database: DatabasePool, db_conn: Connection, hs: "HomeServer"):
2631 super().__init__(database, db_conn, hs)
27 self._pushers_id_gen = SlavedIdTracker(
32 self._pushers_id_gen = SlavedIdTracker( # type: ignore
2833 db_conn, "pushers", "id", extra_tables=[("deleted_pushers", "stream_id")]
2934 )
3035
31 def get_pushers_stream_token(self):
36 def get_pushers_stream_token(self) -> int:
3237 return self._pushers_id_gen.get_current_token()
3338
34 def process_replication_rows(self, stream_name, instance_name, token, rows):
39 def process_replication_rows(
40 self, stream_name: str, instance_name: str, token, rows
41 ) -> None:
3542 if stream_name == PushersStream.NAME:
36 self._pushers_id_gen.advance(instance_name, token)
43 self._pushers_id_gen.advance(instance_name, token) # type: ignore
3744 return super().process_replication_rows(stream_name, instance_name, token, rows)
171171 # a logcontext which we use for processing incoming commands. We declare it as a
172172 # background process so that the CPU stats get reported to prometheus.
173173 ctx_name = "replication-conn-%s" % self.conn_id
174 self._logging_context = BackgroundProcessLoggingContext(ctx_name)
175 self._logging_context.request = ctx_name
174 self._logging_context = BackgroundProcessLoggingContext(ctx_name, ctx_name)
176175
177176 def connectionMade(self):
178177 logger.info("[%s] Connection established", self.id())
2828 {{ message.body_text_html }}
2929 {%- elif message.msgtype == "m.notice" %}
3030 {{ message.body_text_html }}
31 {%- elif message.msgtype == "m.image" %}
31 {%- elif message.msgtype == "m.image" and message.image_url %}
3232 <img src="{{ message.image_url|mxc_to_http(640, 480, scale) }}" />
3333 {%- elif message.msgtype == "m.file" %}
3434 <span class="filename">{{ message.body_text_plain }}</span>
0 <!DOCTYPE html>
1 <html lang="en">
2 <head>
3 <title>Synapse Login</title>
4 <link rel="stylesheet" href="style.css" type="text/css" />
5 </head>
6 <body>
7 <div class="card">
8 <form method="post" class="form__input" id="form" action="submit">
9 <label for="field-username">Please pick your username:</label>
10 <input type="text" name="username" id="field-username" autofocus="">
11 <input type="submit" class="button button--full-width" id="button-submit" value="Submit">
12 </form>
13 <!-- this is used for feedback -->
14 <div role="alert" class="tooltip hidden" id="message"></div>
15 <script src="script.js"></script>
16 </div>
17 </body>
18 </html>
0 let inputField = document.getElementById("field-username");
1 let inputForm = document.getElementById("form");
2 let submitButton = document.getElementById("button-submit");
3 let message = document.getElementById("message");
4
5 // Submit username and receive response
6 function showMessage(messageText) {
7 // Unhide the message text
8 message.classList.remove("hidden");
9
10 message.textContent = messageText;
11 };
12
13 function doSubmit() {
14 showMessage("Success. Please wait a moment for your browser to redirect.");
15
16 // remove the event handler before re-submitting the form.
17 delete inputForm.onsubmit;
18 inputForm.submit();
19 }
20
21 function onResponse(response) {
22 // Display message
23 showMessage(response);
24
25 // Enable submit button and input field
26 submitButton.classList.remove('button--disabled');
27 submitButton.value = "Submit";
28 };
29
30 let allowedUsernameCharacters = RegExp("[^a-z0-9\\.\\_\\=\\-\\/]");
31 function usernameIsValid(username) {
32 return !allowedUsernameCharacters.test(username);
33 }
34 let allowedCharactersString = "lowercase letters, digits, ., _, -, /, =";
35
36 function buildQueryString(params) {
37 return Object.keys(params)
38 .map(k => encodeURIComponent(k) + '=' + encodeURIComponent(params[k]))
39 .join('&');
40 }
41
42 function submitUsername(username) {
43 if(username.length == 0) {
44 onResponse("Please enter a username.");
45 return;
46 }
47 if(!usernameIsValid(username)) {
48 onResponse("Invalid username. Only the following characters are allowed: " + allowedCharactersString);
49 return;
50 }
51
52 // if this browser doesn't support fetch, skip the availability check.
53 if(!window.fetch) {
54 doSubmit();
55 return;
56 }
57
58 let check_uri = 'check?' + buildQueryString({"username": username});
59 fetch(check_uri, {
60 // include the cookie
61 "credentials": "same-origin",
62 }).then((response) => {
63 if(!response.ok) {
64 // for non-200 responses, raise the body of the response as an exception
65 return response.text().then((text) => { throw text; });
66 } else {
67 return response.json();
68 }
69 }).then((json) => {
70 if(json.error) {
71 throw json.error;
72 } else if(json.available) {
73 doSubmit();
74 } else {
75 onResponse("This username is not available, please choose another.");
76 }
77 }).catch((err) => {
78 onResponse("Error checking username availability: " + err);
79 });
80 }
81
82 function clickSubmit() {
83 event.preventDefault();
84 if(submitButton.classList.contains('button--disabled')) { return; }
85
86 // Disable submit button and input field
87 submitButton.classList.add('button--disabled');
88
89 // Submit username
90 submitButton.value = "Checking...";
91 submitUsername(inputField.value);
92 };
93
94 inputForm.onsubmit = clickSubmit;
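The `allowedUsernameCharacters` regex in script.js above actually matches *disallowed* characters (note the leading `^` in the class), so a username is valid when nothing matches. A minimal Python sketch of the same check (the helper name is ours, not Synapse's):

```python
import re

# Matches any character NOT in the allowed set, mirroring the JS regex
# RegExp("[^a-z0-9\\.\\_\\=\\-\\/]") from script.js.
DISALLOWED = re.compile(r"[^a-z0-9._=\-/]")


def username_is_valid(username: str) -> bool:
    """Return True if the username contains only lowercase letters,
    digits, '.', '_', '=', '-' and '/'."""
    return DISALLOWED.search(username) is None
```

As with the JS version, an empty string passes this character check, which is why the script rejects empty input separately before calling it.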
0 input[type="text"] {
1 font-size: 100%;
2 background-color: #ededf0;
3 border: 1px solid #fff;
4 border-radius: .2em;
5 padding: .5em .9em;
6 display: block;
7 width: 26em;
8 }
9
10 .button--disabled {
11 border-color: #fff;
12 background-color: transparent;
13 color: #000;
14 text-transform: none;
15 }
16
17 .hidden {
18 display: none;
19 }
20
21 .tooltip {
22 background-color: #f9f9fa;
23 padding: 1em;
24 margin: 1em 0;
25 }
26
3737 DeleteRoomRestServlet,
3838 JoinRoomAliasServlet,
3939 ListRoomRestServlet,
40 MakeRoomAdminRestServlet,
4041 RoomMembersRestServlet,
4142 RoomRestServlet,
4243 ShutdownRoomRestServlet,
227228 EventReportDetailRestServlet(hs).register(http_server)
228229 EventReportsRestServlet(hs).register(http_server)
229230 PushersRestServlet(hs).register(http_server)
231 MakeRoomAdminRestServlet(hs).register(http_server)
230232
231233
232234 def register_servlets_for_client_rest_resource(hs, http_server):
1313 # limitations under the License.
1414 import logging
1515 from http import HTTPStatus
16 from typing import List, Optional
17
18 from synapse.api.constants import EventTypes, JoinRules
19 from synapse.api.errors import Codes, NotFoundError, SynapseError
16 from typing import TYPE_CHECKING, List, Optional, Tuple
17
18 from synapse.api.constants import EventTypes, JoinRules, Membership
19 from synapse.api.errors import AuthError, Codes, NotFoundError, SynapseError
2020 from synapse.http.servlet import (
2121 RestServlet,
2222 assert_params_in_dict,
2424 parse_json_object_from_request,
2525 parse_string,
2626 )
27 from synapse.http.site import SynapseRequest
2728 from synapse.rest.admin._base import (
2829 admin_patterns,
2930 assert_requester_is_admin,
3031 assert_user_is_admin,
3132 )
3233 from synapse.storage.databases.main.room import RoomSortOrder
33 from synapse.types import RoomAlias, RoomID, UserID, create_requester
34 from synapse.types import JsonDict, RoomAlias, RoomID, UserID, create_requester
35
36 if TYPE_CHECKING:
37 from synapse.server import HomeServer
38
3439
3540 logger = logging.getLogger(__name__)
3641
4449
4550 PATTERNS = admin_patterns("/shutdown_room/(?P<room_id>[^/]+)")
4651
47 def __init__(self, hs):
52 def __init__(self, hs: "HomeServer"):
4853 self.hs = hs
4954 self.auth = hs.get_auth()
5055 self.room_shutdown_handler = hs.get_room_shutdown_handler()
5156
52 async def on_POST(self, request, room_id):
57 async def on_POST(
58 self, request: SynapseRequest, room_id: str
59 ) -> Tuple[int, JsonDict]:
5360 requester = await self.auth.get_user_by_req(request)
5461 await assert_user_is_admin(self.auth, requester.user)
5562
8592
8693 PATTERNS = admin_patterns("/rooms/(?P<room_id>[^/]+)/delete$")
8794
88 def __init__(self, hs):
95 def __init__(self, hs: "HomeServer"):
8996 self.hs = hs
9097 self.auth = hs.get_auth()
9198 self.room_shutdown_handler = hs.get_room_shutdown_handler()
9299 self.pagination_handler = hs.get_pagination_handler()
93100
94 async def on_POST(self, request, room_id):
101 async def on_POST(
102 self, request: SynapseRequest, room_id: str
103 ) -> Tuple[int, JsonDict]:
95104 requester = await self.auth.get_user_by_req(request)
96105 await assert_user_is_admin(self.auth, requester.user)
97106
145154
146155 PATTERNS = admin_patterns("/rooms$")
147156
148 def __init__(self, hs):
157 def __init__(self, hs: "HomeServer"):
149158 self.store = hs.get_datastore()
150159 self.auth = hs.get_auth()
151160 self.admin_handler = hs.get_admin_handler()
152161
153 async def on_GET(self, request):
162 async def on_GET(self, request: SynapseRequest) -> Tuple[int, JsonDict]:
154163 requester = await self.auth.get_user_by_req(request)
155164 await assert_user_is_admin(self.auth, requester.user)
156165
235244
236245 PATTERNS = admin_patterns("/rooms/(?P<room_id>[^/]+)$")
237246
238 def __init__(self, hs):
247 def __init__(self, hs: "HomeServer"):
239248 self.hs = hs
240249 self.auth = hs.get_auth()
241250 self.store = hs.get_datastore()
242251
243 async def on_GET(self, request, room_id):
252 async def on_GET(
253 self, request: SynapseRequest, room_id: str
254 ) -> Tuple[int, JsonDict]:
244255 await assert_requester_is_admin(self.auth, request)
245256
246257 ret = await self.store.get_room_with_stats(room_id)
247258 if not ret:
248259 raise NotFoundError("Room not found")
249260
250 return 200, ret
261 members = await self.store.get_users_in_room(room_id)
262 ret["joined_local_devices"] = await self.store.count_devices_by_users(members)
263
264 return (200, ret)
251265
252266
253267 class RoomMembersRestServlet(RestServlet):
257271
258272 PATTERNS = admin_patterns("/rooms/(?P<room_id>[^/]+)/members")
259273
260 def __init__(self, hs):
274 def __init__(self, hs: "HomeServer"):
261275 self.hs = hs
262276 self.auth = hs.get_auth()
263277 self.store = hs.get_datastore()
264278
265 async def on_GET(self, request, room_id):
279 async def on_GET(
280 self, request: SynapseRequest, room_id: str
281 ) -> Tuple[int, JsonDict]:
266282 await assert_requester_is_admin(self.auth, request)
267283
268284 ret = await self.store.get_room(room_id)
279295
280296 PATTERNS = admin_patterns("/join/(?P<room_identifier>[^/]*)")
281297
282 def __init__(self, hs):
298 def __init__(self, hs: "HomeServer"):
283299 self.hs = hs
284300 self.auth = hs.get_auth()
285301 self.room_member_handler = hs.get_room_member_handler()
286302 self.admin_handler = hs.get_admin_handler()
287303 self.state_handler = hs.get_state_handler()
288304
289 async def on_POST(self, request, room_identifier):
305 async def on_POST(
306 self, request: SynapseRequest, room_identifier: str
307 ) -> Tuple[int, JsonDict]:
290308 requester = await self.auth.get_user_by_req(request)
291309 await assert_user_is_admin(self.auth, requester.user)
292310
313331 handler = self.room_member_handler
314332 room_alias = RoomAlias.from_string(room_identifier)
315333 room_id, remote_room_hosts = await handler.lookup_room_alias(room_alias)
316 room_id = room_id.to_string()
317334 else:
318335 raise SynapseError(
319336 400, "%s was not a legal room ID or room alias" % (room_identifier,)
350367 )
351368
352369 return 200, {"room_id": room_id}
370
371
372 class MakeRoomAdminRestServlet(RestServlet):
373 """Allows a server admin to gain power in a room, provided a local user
374 already has power in it. Will also invite the target user if they're not
375 in the room and it's a private room. Another user (rather than the admin
376 user) may be specified to be granted power, e.g.:
377
378 POST /_synapse/admin/v1/rooms/<room_id_or_alias>/make_room_admin
379 {
380 "user_id": "@foo:example.com"
381 }
382 """
383
384 PATTERNS = admin_patterns("/rooms/(?P<room_identifier>[^/]*)/make_room_admin")
385
386 def __init__(self, hs: "HomeServer"):
387 self.hs = hs
388 self.auth = hs.get_auth()
389 self.room_member_handler = hs.get_room_member_handler()
390 self.event_creation_handler = hs.get_event_creation_handler()
391 self.state_handler = hs.get_state_handler()
392 self.is_mine_id = hs.is_mine_id
393
394 async def on_POST(self, request, room_identifier):
395 requester = await self.auth.get_user_by_req(request)
396 await assert_user_is_admin(self.auth, requester.user)
397 content = parse_json_object_from_request(request, allow_empty_body=True)
398
399 # Resolve to a room ID, if necessary.
400 if RoomID.is_valid(room_identifier):
401 room_id = room_identifier
402 elif RoomAlias.is_valid(room_identifier):
403 room_alias = RoomAlias.from_string(room_identifier)
404 room_id, _ = await self.room_member_handler.lookup_room_alias(room_alias)
405 room_id = room_id.to_string()
406 else:
407 raise SynapseError(
408 400, "%s was not a legal room ID or room alias" % (room_identifier,)
409 )
410
411 # Which user to grant room admin rights to.
412 user_to_add = content.get("user_id", requester.user.to_string())
413
414 # Figure out which local users currently have power in the room, if any.
415 room_state = await self.state_handler.get_current_state(room_id)
416 if not room_state:
417 raise SynapseError(400, "Server not in room")
418
419 create_event = room_state[(EventTypes.Create, "")]
420 power_levels = room_state.get((EventTypes.PowerLevels, ""))
421
422 if power_levels is not None:
423 # We pick the local user with the highest power.
424 user_power = power_levels.content.get("users", {})
425 admin_users = [
426 user_id for user_id in user_power if self.is_mine_id(user_id)
427 ]
428 admin_users.sort(key=lambda user: user_power[user])
429
430 if not admin_users:
431 raise SynapseError(400, "No local admin user in room")
432
433 admin_user_id = admin_users[-1]
434
435 pl_content = power_levels.content
436 else:
437 # If there are no power level events, then the creator has rights.
438 pl_content = {}
439 admin_user_id = create_event.sender
440 if not self.is_mine_id(admin_user_id):
441 raise SynapseError(
442 400, "No local admin user in room",
443 )
444
445 # Grant the user power equal to the room admin's by attempting to send an
446 # updated power level event.
447 new_pl_content = dict(pl_content)
448 new_pl_content["users"] = dict(pl_content.get("users", {}))
449 new_pl_content["users"][user_to_add] = new_pl_content["users"][admin_user_id]
450
451 fake_requester = create_requester(
452 admin_user_id, authenticated_entity=requester.authenticated_entity,
453 )
454
455 try:
456 await self.event_creation_handler.create_and_send_nonmember_event(
457 fake_requester,
458 event_dict={
459 "content": new_pl_content,
460 "sender": admin_user_id,
461 "type": EventTypes.PowerLevels,
462 "state_key": "",
463 "room_id": room_id,
464 },
465 )
466 except AuthError:
467 # The admin user we found turned out not to have enough power.
468 raise SynapseError(
469 400, "No local admin user in room with power to update power levels."
470 )
471
472 # Now we check if the user we're granting admin rights to is already in
473 # the room. If not, and it's not a public room, we invite them.
474 member_event = room_state.get((EventTypes.Member, user_to_add))
475 is_joined = False
476 if member_event:
477 is_joined = member_event.content["membership"] in (
478 Membership.JOIN,
479 Membership.INVITE,
480 )
481
482 if is_joined:
483 return 200, {}
484
485 join_rules = room_state.get((EventTypes.JoinRules, ""))
486 is_public = False
487 if join_rules:
488 is_public = join_rules.content.get("join_rule") == JoinRules.PUBLIC
489
490 if is_public:
491 return 200, {}
492
493 await self.room_member_handler.update_membership(
494 fake_requester,
495 target=UserID.from_string(user_to_add),
496 room_id=room_id,
497 action=Membership.INVITE,
498 )
499
500 return 200, {}
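The core of the `make_room_admin` handler above is picking which local user to act as: among local users named in the power-level event, it takes the one with the highest power and copies that level to the target. A minimal standalone sketch of that selection (plain dicts stand in for Synapse's event objects; the function name is ours):

```python
from typing import Callable, Dict, Tuple


def pick_admin_and_grant(
    user_power: Dict[str, int],
    target: str,
    is_mine: Callable[[str], bool],
) -> Tuple[str, Dict[str, int]]:
    """Pick the most powerful local user and grant `target` the same power."""
    admin_users = [u for u in user_power if is_mine(u)]
    if not admin_users:
        raise ValueError("No local admin user in room")

    # Sort ascending by power; the last entry is the most powerful local user.
    admin_users.sort(key=lambda u: user_power[u])
    admin = admin_users[-1]

    new_users = dict(user_power)
    new_users[target] = user_power[admin]
    return admin, new_users
```

In the real servlet, the updated `users` map is then sent as a new `m.room.power_levels` event on behalf of the chosen admin, and an `AuthError` from that send is translated into the 400 response shown above.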
4040 from synapse.server import HomeServer
4141
4242 logger = logging.getLogger(__name__)
43
44 _GET_PUSHERS_ALLOWED_KEYS = {
45 "app_display_name",
46 "app_id",
47 "data",
48 "device_display_name",
49 "kind",
50 "lang",
51 "profile_tag",
52 "pushkey",
53 }
5443
5544
5645 class UsersRestServlet(RestServlet):
319308 data={},
320309 )
321310
322 if "avatar_url" in body and type(body["avatar_url"]) == str:
311 if "avatar_url" in body and isinstance(body["avatar_url"], str):
323312 await self.profile_handler.set_avatar_url(
324 user_id, requester, body["avatar_url"], True
313 target_user, requester, body["avatar_url"], True
325314 )
326315
327316 ret = await self.admin_handler.get_user(target_user)
418407
419408 if user_type is not None and user_type not in UserTypes.ALL_USER_TYPES:
420409 raise SynapseError(400, "Invalid user type")
410
411 if "mac" not in body:
412 raise SynapseError(400, "mac must be specified", errcode=Codes.BAD_JSON)
421413
422414 got_mac = body["mac"]
423415
766758
767759 pushers = await self.store.get_pushers_by_user_id(user_id)
768760
769 filtered_pushers = [
770 {k: v for k, v in p.items() if k in _GET_PUSHERS_ALLOWED_KEYS}
771 for p in pushers
772 ]
761 filtered_pushers = [p.as_dict() for p in pushers]
773762
774763 return 200, {"pushers": filtered_pushers, "total": len(filtered_pushers)}
775764
1313 # limitations under the License.
1414
1515 import logging
16 from typing import Awaitable, Callable, Dict, Optional
16 from typing import TYPE_CHECKING, Awaitable, Callable, Dict, Optional
1717
1818 from synapse.api.errors import Codes, LoginError, SynapseError
1919 from synapse.api.ratelimiting import Ratelimiter
2929 from synapse.rest.well_known import WellKnownBuilder
3030 from synapse.types import JsonDict, UserID
3131
32 if TYPE_CHECKING:
33 from synapse.server import HomeServer
34
3235 logger = logging.getLogger(__name__)
3336
3437
4144 JWT_TYPE_DEPRECATED = "m.login.jwt"
4245 APPSERVICE_TYPE = "uk.half-shot.msc2778.login.application_service"
4346
44 def __init__(self, hs):
47 def __init__(self, hs: "HomeServer"):
4548 super().__init__()
4649 self.hs = hs
4750
104107 return 200, {"flows": flows}
105108
106109 async def on_POST(self, request: SynapseRequest):
107 self._address_ratelimiter.ratelimit(request.getClientIP())
108
109110 login_submission = parse_json_object_from_request(request)
110111
111112 try:
112113 if login_submission["type"] == LoginRestServlet.APPSERVICE_TYPE:
113114 appservice = self.auth.get_appservice_by_req(request)
115
116 if appservice.is_rate_limited():
117 self._address_ratelimiter.ratelimit(request.getClientIP())
118
114119 result = await self._do_appservice_login(login_submission, appservice)
115120 elif self.jwt_enabled and (
116121 login_submission["type"] == LoginRestServlet.JWT_TYPE
117122 or login_submission["type"] == LoginRestServlet.JWT_TYPE_DEPRECATED
118123 ):
124 self._address_ratelimiter.ratelimit(request.getClientIP())
119125 result = await self._do_jwt_login(login_submission)
120126 elif login_submission["type"] == LoginRestServlet.TOKEN_TYPE:
127 self._address_ratelimiter.ratelimit(request.getClientIP())
121128 result = await self._do_token_login(login_submission)
122129 else:
130 self._address_ratelimiter.ratelimit(request.getClientIP())
123131 result = await self._do_other_login(login_submission)
124132 except KeyError:
125133 raise SynapseError(400, "Missing JSON keys.")
158166 if not appservice.is_interested_in_user(qualified_user_id):
159167 raise LoginError(403, "Invalid access_token", errcode=Codes.FORBIDDEN)
160168
161 return await self._complete_login(qualified_user_id, login_submission)
169 return await self._complete_login(
170 qualified_user_id, login_submission, ratelimit=appservice.is_rate_limited()
171 )
162172
163173 async def _do_other_login(self, login_submission: JsonDict) -> Dict[str, str]:
164174 """Handle non-token/saml/jwt logins
193203 login_submission: JsonDict,
194204 callback: Optional[Callable[[Dict[str, str]], Awaitable[None]]] = None,
195205 create_non_existent_users: bool = False,
206 ratelimit: bool = True,
196207 ) -> Dict[str, str]:
197208 """Called when we've successfully authed the user and now need to
198209 actually login them in (e.g. create devices). This gets called on
207218 callback: Callback function to run after login.
208219 create_non_existent_users: Whether to create the user if they don't
209220 exist. Defaults to False.
221 ratelimit: Whether to ratelimit the login request.
210222
211223 Returns:
212224 result: Dictionary of account information after successful login.
215227 # Before we actually log them in we check if they've already logged in
216228 # too often. This happens here rather than before as we don't
217229 # necessarily know the user before now.
218 self._account_ratelimiter.ratelimit(user_id.lower())
230 if ratelimit:
231 self._account_ratelimiter.ratelimit(user_id.lower())
219232
220233 if create_non_existent_users:
221234 canonical_uid = await self.auth_handler.check_user_exists(user_id)
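The login changes above apply the per-IP ratelimit for every login type except appservice logins whose appservice is exempt from rate limiting. A minimal sketch of that decision (the function name and boolean-returning shape are ours; Synapse's `Ratelimiter.ratelimit` raises instead):

```python
APPSERVICE_TYPE = "uk.half-shot.msc2778.login.application_service"


def should_ratelimit_address(login_type: str, appservice_rate_limited: bool) -> bool:
    """Decide whether a login request should count against the per-IP limit."""
    if login_type == APPSERVICE_TYPE:
        # Appservice logins are only limited if the appservice itself
        # is subject to rate limiting.
        return appservice_rate_limited
    # password / token / JWT logins are always limited.
    return True
```

The same flag is threaded through to `_complete_login` as `ratelimit=appservice.is_rate_limited()`, so the per-account limiter is skipped for exempt appservices too.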
2727
2828 logger = logging.getLogger(__name__)
2929
30 ALLOWED_KEYS = {
31 "app_display_name",
32 "app_id",
33 "data",
34 "device_display_name",
35 "kind",
36 "lang",
37 "profile_tag",
38 "pushkey",
39 }
40
4130
4231 class PushersRestServlet(RestServlet):
4332 PATTERNS = client_patterns("/pushers$", v1=True)
5342
5443 pushers = await self.hs.get_datastore().get_pushers_by_user_id(user.to_string())
5544
56 filtered_pushers = [
57 {k: v for k, v in p.items() if k in ALLOWED_KEYS} for p in pushers
58 ]
45 filtered_pushers = [p.as_dict() for p in pushers]
5946
6047 return 200, {"pushers": filtered_pushers}
6148
962962 )
963963
964964
965 def register_servlets(hs, http_server):
965 def register_servlets(hs, http_server, is_worker=False):
966966 RoomStateEventRestServlet(hs).register(http_server)
967 RoomCreateRestServlet(hs).register(http_server)
968967 RoomMemberListRestServlet(hs).register(http_server)
969968 JoinedRoomMemberListRestServlet(hs).register(http_server)
970969 RoomMessageListRestServlet(hs).register(http_server)
971970 JoinRoomAliasServlet(hs).register(http_server)
972 RoomForgetRestServlet(hs).register(http_server)
973971 RoomMembershipRestServlet(hs).register(http_server)
974972 RoomSendEventRestServlet(hs).register(http_server)
975973 PublicRoomListRestServlet(hs).register(http_server)
976974 RoomStateRestServlet(hs).register(http_server)
977975 RoomRedactEventRestServlet(hs).register(http_server)
978976 RoomTypingRestServlet(hs).register(http_server)
979 SearchRestServlet(hs).register(http_server)
980 JoinedRoomsRestServlet(hs).register(http_server)
981 RoomEventServlet(hs).register(http_server)
982977 RoomEventContextServlet(hs).register(http_server)
983 RoomAliasListServlet(hs).register(http_server)
978
979 # Some servlets only get registered for the main process.
980 if not is_worker:
981 RoomCreateRestServlet(hs).register(http_server)
982 RoomForgetRestServlet(hs).register(http_server)
983 SearchRestServlet(hs).register(http_server)
984 JoinedRoomsRestServlet(hs).register(http_server)
985 RoomEventServlet(hs).register(http_server)
986 RoomAliasListServlet(hs).register(http_server)
984987
985988
986989 def register_deprecated_servlets(hs, http_server):
253253 logger.error("Auth succeeded but no known type! %r", result.keys())
254254 raise SynapseError(500, "", Codes.UNKNOWN)
255255
256 # If we have a password in this request, prefer it. Otherwise, there
257 # must be a password hash from an earlier request.
256 # If we have a password in this request, prefer it. Otherwise, use the
257 # password hash from an earlier request.
258258 if new_password:
259259 password_hash = await self.auth_handler.hash(new_password)
260 else:
260 elif session_id is not None:
261261 password_hash = await self.auth_handler.get_session_data(
262262 session_id, "password_hash", None
263263 )
264 else:
265 # UI validation was skipped, but the request did not include a new
266 # password.
267 password_hash = None
264268 if not password_hash:
265269 raise SynapseError(400, "Missing params: password", Codes.MISSING_PARAM)
266270
1414 # limitations under the License.
1515
1616 import logging
17 from functools import wraps
1718
1819 from synapse.api.errors import SynapseError
1920 from synapse.http.servlet import RestServlet, parse_json_object_from_request
2425 logger = logging.getLogger(__name__)
2526
2627
28 def _validate_group_id(f):
29 """Wrapper to validate the form of the group ID.
30
31 Can be applied to any on_FOO method that accepts a group ID as a URL parameter.
32 """
33
34 @wraps(f)
35 def wrapper(self, request, group_id, *args, **kwargs):
36 if not GroupID.is_valid(group_id):
37 raise SynapseError(400, "%s is not a legal group ID" % (group_id,))
38
39 return f(self, request, group_id, *args, **kwargs)
40
41 return wrapper
42
43
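The `_validate_group_id` decorator added above is a standard `functools.wraps` wrapper that rejects a malformed group ID before the handler body runs. A minimal standalone sketch of the same pattern (the validator is a stand-in for `GroupID.is_valid`, and `ValueError` stands in for `SynapseError`):

```python
from functools import wraps


def validate_group_id(is_valid):
    """Build a decorator that checks the group_id URL parameter up front."""

    def decorator(f):
        @wraps(f)
        def wrapper(self, request, group_id, *args, **kwargs):
            if not is_valid(group_id):
                raise ValueError("%s is not a legal group ID" % (group_id,))
            return f(self, request, group_id, *args, **kwargs)

        return wrapper

    return decorator


class Handler:
    # Stand-in validity rule: Matrix group IDs start with '+'.
    @validate_group_id(lambda gid: gid.startswith("+"))
    def on_GET(self, request, group_id):
        return 200, {"group_id": group_id}
```

Centralising the check this way is what lets the diff delete the one inline `GroupID.is_valid` check that previously existed and apply the same validation to every `on_GET`/`on_PUT`/`on_DELETE` handler.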
2744 class GroupServlet(RestServlet):
2845 """Get the group profile
2946 """
3653 self.clock = hs.get_clock()
3754 self.groups_handler = hs.get_groups_local_handler()
3855
56 @_validate_group_id
3957 async def on_GET(self, request, group_id):
4058 requester = await self.auth.get_user_by_req(request, allow_guest=True)
4159 requester_user_id = requester.user.to_string()
4664
4765 return 200, group_description
4866
67 @_validate_group_id
4968 async def on_POST(self, request, group_id):
5069 requester = await self.auth.get_user_by_req(request)
5170 requester_user_id = requester.user.to_string()
7089 self.clock = hs.get_clock()
7190 self.groups_handler = hs.get_groups_local_handler()
7291
92 @_validate_group_id
7393 async def on_GET(self, request, group_id):
7494 requester = await self.auth.get_user_by_req(request, allow_guest=True)
7595 requester_user_id = requester.user.to_string()
101121 self.clock = hs.get_clock()
102122 self.groups_handler = hs.get_groups_local_handler()
103123
124 @_validate_group_id
104125 async def on_PUT(self, request, group_id, category_id, room_id):
105126 requester = await self.auth.get_user_by_req(request)
106127 requester_user_id = requester.user.to_string()
116137
117138 return 200, resp
118139
140 @_validate_group_id
119141 async def on_DELETE(self, request, group_id, category_id, room_id):
120142 requester = await self.auth.get_user_by_req(request)
121143 requester_user_id = requester.user.to_string()
141163 self.clock = hs.get_clock()
142164 self.groups_handler = hs.get_groups_local_handler()
143165
166 @_validate_group_id
144167 async def on_GET(self, request, group_id, category_id):
145168 requester = await self.auth.get_user_by_req(request, allow_guest=True)
146169 requester_user_id = requester.user.to_string()
151174
152175 return 200, category
153176
177 @_validate_group_id
154178 async def on_PUT(self, request, group_id, category_id):
155179 requester = await self.auth.get_user_by_req(request)
156180 requester_user_id = requester.user.to_string()
162186
163187 return 200, resp
164188
189 @_validate_group_id
165190 async def on_DELETE(self, request, group_id, category_id):
166191 requester = await self.auth.get_user_by_req(request)
167192 requester_user_id = requester.user.to_string()
185210 self.clock = hs.get_clock()
186211 self.groups_handler = hs.get_groups_local_handler()
187212
213 @_validate_group_id
188214 async def on_GET(self, request, group_id):
189215 requester = await self.auth.get_user_by_req(request, allow_guest=True)
190216 requester_user_id = requester.user.to_string()
208234 self.clock = hs.get_clock()
209235 self.groups_handler = hs.get_groups_local_handler()
210236
237 @_validate_group_id
211238 async def on_GET(self, request, group_id, role_id):
212239 requester = await self.auth.get_user_by_req(request, allow_guest=True)
213240 requester_user_id = requester.user.to_string()
218245
219246 return 200, category
220247
248 @_validate_group_id
221249 async def on_PUT(self, request, group_id, role_id):
222250 requester = await self.auth.get_user_by_req(request)
223251 requester_user_id = requester.user.to_string()
229257
230258 return 200, resp
231259
260 @_validate_group_id
232261 async def on_DELETE(self, request, group_id, role_id):
233262 requester = await self.auth.get_user_by_req(request)
234263 requester_user_id = requester.user.to_string()
252281 self.clock = hs.get_clock()
253282 self.groups_handler = hs.get_groups_local_handler()
254283
284 @_validate_group_id
255285 async def on_GET(self, request, group_id):
256286 requester = await self.auth.get_user_by_req(request, allow_guest=True)
257287 requester_user_id = requester.user.to_string()
283313 self.clock = hs.get_clock()
284314 self.groups_handler = hs.get_groups_local_handler()
285315
316 @_validate_group_id
286317 async def on_PUT(self, request, group_id, role_id, user_id):
287318 requester = await self.auth.get_user_by_req(request)
288319 requester_user_id = requester.user.to_string()
298329
299330 return 200, resp
300331
332 @_validate_group_id
301333 async def on_DELETE(self, request, group_id, role_id, user_id):
302334 requester = await self.auth.get_user_by_req(request)
303335 requester_user_id = requester.user.to_string()
321353 self.clock = hs.get_clock()
322354 self.groups_handler = hs.get_groups_local_handler()
323355
356 @_validate_group_id
324357 async def on_GET(self, request, group_id):
325358 requester = await self.auth.get_user_by_req(request, allow_guest=True)
326359 requester_user_id = requester.user.to_string()
327
328 if not GroupID.is_valid(group_id):
329 raise SynapseError(400, "%s was not legal group ID" % (group_id,))
330360
331361 result = await self.groups_handler.get_rooms_in_group(
332362 group_id, requester_user_id
347377 self.clock = hs.get_clock()
348378 self.groups_handler = hs.get_groups_local_handler()
349379
380 @_validate_group_id
350381 async def on_GET(self, request, group_id):
351382 requester = await self.auth.get_user_by_req(request, allow_guest=True)
352383 requester_user_id = requester.user.to_string()
370401 self.clock = hs.get_clock()
371402 self.groups_handler = hs.get_groups_local_handler()
372403
404 @_validate_group_id
373405 async def on_GET(self, request, group_id):
374406 requester = await self.auth.get_user_by_req(request)
375407 requester_user_id = requester.user.to_string()
392424 self.auth = hs.get_auth()
393425 self.groups_handler = hs.get_groups_local_handler()
394426
427 @_validate_group_id
395428 async def on_PUT(self, request, group_id):
396429 requester = await self.auth.get_user_by_req(request)
397430 requester_user_id = requester.user.to_string()
448481 self.clock = hs.get_clock()
449482 self.groups_handler = hs.get_groups_local_handler()
450483
484 @_validate_group_id
451485 async def on_PUT(self, request, group_id, room_id):
452486 requester = await self.auth.get_user_by_req(request)
453487 requester_user_id = requester.user.to_string()
459493
460494 return 200, result
461495
496 @_validate_group_id
462497 async def on_DELETE(self, request, group_id, room_id):
463498 requester = await self.auth.get_user_by_req(request)
464499 requester_user_id = requester.user.to_string()
485520 self.clock = hs.get_clock()
486521 self.groups_handler = hs.get_groups_local_handler()
487522
523 @_validate_group_id
488524 async def on_PUT(self, request, group_id, room_id, config_key):
489525 requester = await self.auth.get_user_by_req(request)
490526 requester_user_id = requester.user.to_string()
513549 self.store = hs.get_datastore()
514550 self.is_mine_id = hs.is_mine_id
515551
552 @_validate_group_id
516553 async def on_PUT(self, request, group_id, user_id):
517554 requester = await self.auth.get_user_by_req(request)
518555 requester_user_id = requester.user.to_string()
540577 self.clock = hs.get_clock()
541578 self.groups_handler = hs.get_groups_local_handler()
542579
580 @_validate_group_id
543581 async def on_PUT(self, request, group_id, user_id):
544582 requester = await self.auth.get_user_by_req(request)
545583 requester_user_id = requester.user.to_string()
564602 self.clock = hs.get_clock()
565603 self.groups_handler = hs.get_groups_local_handler()
566604
605 @_validate_group_id
567606 async def on_PUT(self, request, group_id):
568607 requester = await self.auth.get_user_by_req(request)
569608 requester_user_id = requester.user.to_string()
588627 self.clock = hs.get_clock()
589628 self.groups_handler = hs.get_groups_local_handler()
590629
630 @_validate_group_id
591631 async def on_PUT(self, request, group_id):
592632 requester = await self.auth.get_user_by_req(request)
593633 requester_user_id = requester.user.to_string()
612652 self.clock = hs.get_clock()
613653 self.groups_handler = hs.get_groups_local_handler()
614654
655 @_validate_group_id
615656 async def on_PUT(self, request, group_id):
616657 requester = await self.auth.get_user_by_req(request)
617658 requester_user_id = requester.user.to_string()
636677 self.clock = hs.get_clock()
637678 self.store = hs.get_datastore()
638679
680 @_validate_group_id
639681 async def on_PUT(self, request, group_id):
640682 requester = await self.auth.get_user_by_req(request)
641683 requester_user_id = requester.user.to_string()
450450
451451 # == Normal User Registration == (everyone else)
452452 if not self._registration_enabled:
453 raise SynapseError(403, "Registration has been disabled")
453 raise SynapseError(403, "Registration has been disabled", Codes.FORBIDDEN)
454454
455455 # For regular registration, convert the provided username to lowercase
456456 # before attempting to register it. This should mean that people who try
654654 user_id = await self.registration_handler.appservice_register(
655655 username, as_token
656656 )
657 return await self._create_registration_details(user_id, body)
658
659 async def _create_registration_details(self, user_id, params):
657 return await self._create_registration_details(
658 user_id, body, is_appservice_ghost=True,
659 )
660
661 async def _create_registration_details(
662 self, user_id, params, is_appservice_ghost=False
663 ):
660664 """Complete registration of newly-registered user
661665
662666 Allocates device_id if one was not given; also creates access_token.
673677 device_id = params.get("device_id")
674678 initial_display_name = params.get("initial_device_display_name")
675679 device_id, access_token = await self.registration_handler.register_device(
676 user_id, device_id, initial_display_name, is_guest=False
680 user_id,
681 device_id,
682 initial_display_name,
683 is_guest=False,
684 is_appservice_ghost=is_appservice_ghost,
677685 )
678686
679687 result.update({"access_token": access_token, "device_id": device_id})
1616 from typing import Tuple
1717
1818 from synapse.http import servlet
19 from synapse.http.servlet import parse_json_object_from_request
19 from synapse.http.servlet import assert_params_in_dict, parse_json_object_from_request
2020 from synapse.logging.opentracing import set_tag, trace
2121 from synapse.rest.client.transactions import HttpTransactionCache
2222
5353 requester = await self.auth.get_user_by_req(request, allow_guest=True)
5454
5555 content = parse_json_object_from_request(request)
56 assert_params_in_dict(content, ("messages",))
5657
5758 sender_user_id = requester.user.to_string()
5859
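The hunk above adds an `assert_params_in_dict` check so a missing `messages` key fails cleanly before the body is used. A minimal sketch of what such a helper does (an illustration, not Synapse's actual implementation):

```python
# Illustration only: simplified stand-ins for Synapse's SynapseError and
# assert_params_in_dict, showing the shape of the up-front check.

class SynapseError(Exception):
    def __init__(self, code, msg, errcode="M_UNKNOWN"):
        super().__init__(msg)
        self.code = code
        self.msg = msg
        self.errcode = errcode

def assert_params_in_dict(body, required):
    # Collect every required key missing from the parsed JSON body and
    # fail with a single 400 naming them all.
    absent = [key for key in required if key not in body]
    if absent:
        raise SynapseError(400, "Missing params: %r" % (absent,), "M_MISSING_PARAM")
```

Validating up front means the servlet returns a clean 400 instead of a 500 from a later `KeyError`.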
1212 # limitations under the License.
1313
1414 import logging
15 from typing import Dict, Set
15 from typing import Dict
1616
1717 from signedjson.sign import sign_json
1818
141141
142142 time_now_ms = self.clock.time_msec()
143143
144 cache_misses = {} # type: Dict[str, Set[str]]
144 # Note that the value is unused.
145 cache_misses = {} # type: Dict[str, Dict[str, int]]
145146 for (server_name, key_id, from_server), results in cached.items():
146147 results = [(result["ts_added_ms"], result) for result in results]
147148
148149 if not results and key_id is not None:
149 cache_misses.setdefault(server_name, set()).add(key_id)
150 cache_misses.setdefault(server_name, {})[key_id] = 0
150151 continue
151152
152153 if key_id is not None:
200201 )
201202
202203 if miss:
203 cache_misses.setdefault(server_name, set()).add(key_id)
204 cache_misses.setdefault(server_name, {})[key_id] = 0
204205 # Cast to bytes since postgresql returns a memoryview.
205206 json_results.add(bytes(most_recent_result["key_json"]))
206207 else:
153153 # clients are smart enough to be happy with Cache-Control
154154 request.setHeader(b"Cache-Control", b"public,max-age=86400,s-maxage=86400")
155155 request.setHeader(b"Content-Length", b"%d" % (file_size,))
156
157 # Tell web crawlers to not index, archive, or follow links in media. This
158 # should help to prevent things in the media repo from showing up in web
159 # search results.
160 request.setHeader(b"X-Robots-Tag", b"noindex, nofollow, noarchive, noimageindex")
156161
157162
158163 # separators as defined in RFC2616. SP and HT are handled separately.
6565 def __init__(self, hs):
6666 self.hs = hs
6767 self.auth = hs.get_auth()
68 self.client = hs.get_http_client()
68 self.client = hs.get_federation_http_client()
6969 self.clock = hs.get_clock()
7070 self.server_name = hs.hostname
7171 self.store = hs.get_datastore()
675675 logger.debug("No media removed from url cache")
676676
677677
678 def decode_and_calc_og(body, media_uri, request_encoding=None):
678 def decode_and_calc_og(body, media_uri, request_encoding=None) -> Dict[str, str]:
679 # If there's no body, nothing useful is going to be found.
680 if not body:
681 return {}
682
679683 from lxml import etree
680684
681685 try:
1212 # See the License for the specific language governing permissions and
1313 # limitations under the License.
1414
15 import inspect
1615 import logging
1716 import os
1817 import shutil
2019
2120 from synapse.config._base import Config
2221 from synapse.logging.context import defer_to_thread, run_in_background
22 from synapse.util.async_helpers import maybe_awaitable
2323
2424 from ._base import FileInfo, Responder
2525 from .media_storage import FileResponder
9090 if self.store_synchronous:
9191 # store_file is supposed to return an Awaitable, but guard
9292 # against improper implementations.
93 result = self.backend.store_file(path, file_info)
94 if inspect.isawaitable(result):
95 return await result
93 return await maybe_awaitable(self.backend.store_file(path, file_info))
9694 else:
9795 # TODO: Handle errors.
9896 async def store():
9997 try:
100 result = self.backend.store_file(path, file_info)
101 if inspect.isawaitable(result):
102 return await result
98 return await maybe_awaitable(
99 self.backend.store_file(path, file_info)
100 )
103101 except Exception:
104102 logger.exception("Error storing file")
105103
109107 async def fetch(self, path, file_info):
110108 # store_file is supposed to return an Awaitable, but guard
111109 # against improper implementations.
112 result = self.backend.fetch(path, file_info)
113 if inspect.isawaitable(result):
114 return await result
110 return await maybe_awaitable(self.backend.fetch(path, file_info))
115111
116112
117113 class FileStorageProviderBackend(StorageProvider):
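The hunks above replace the repeated `inspect.isawaitable` guard with `synapse.util.async_helpers.maybe_awaitable`. A self-contained sketch of that pattern (the helper here is a re-implementation for illustration, not the real one):

```python
import asyncio
import inspect

async def maybe_awaitable(value):
    # Await the value if the storage provider returned a coroutine or
    # other awaitable; otherwise pass the plain result straight through.
    if inspect.isawaitable(value):
        return await value
    return value

def sync_store_file(path, file_info):
    return "stored synchronously"

async def async_store_file(path, file_info):
    return "stored asynchronously"

async def demo():
    # Both well-behaved (async) and improper (sync) implementations of
    # the storage-provider interface can be awaited the same way.
    a = await maybe_awaitable(sync_store_file("/tmp/x", None))
    b = await maybe_awaitable(async_store_file("/tmp/x", None))
    return a, b

results = asyncio.run(demo())
```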
4343 requester = await self.auth.get_user_by_req(request)
4444 # TODO: The checks here are a bit late. The content will have
4545 # already been uploaded to a tmp file at this point
46 content_length = request.getHeader(b"Content-Length").decode("ascii")
46 content_length = request.getHeader("Content-Length")
4747 if content_length is None:
4848 raise SynapseError(msg="Request must specify a Content-Length", code=400)
4949 if int(content_length) > self.max_upload_size:
0 # -*- coding: utf-8 -*-
1 # Copyright 2020 The Matrix.org Foundation C.I.C.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14 from typing import TYPE_CHECKING
15
16 import pkg_resources
17
18 from twisted.web.http import Request
19 from twisted.web.resource import Resource
20 from twisted.web.static import File
21
22 from synapse.api.errors import SynapseError
23 from synapse.handlers.sso import USERNAME_MAPPING_SESSION_COOKIE_NAME
24 from synapse.http.server import DirectServeHtmlResource, DirectServeJsonResource
25 from synapse.http.servlet import parse_string
26 from synapse.http.site import SynapseRequest
27
28 if TYPE_CHECKING:
29 from synapse.server import HomeServer
30
31
32 def pick_username_resource(hs: "HomeServer") -> Resource:
33 """Factory method to generate the username picker resource.
34
35 This resource gets mounted under /_synapse/client/pick_username. The top-level
36 resource is just a File resource which serves up the static files in the resources
37 "res" directory, but it has a couple of children:
38
39 * "submit", which does the mechanics of registering the new user, and redirects the
40 browser back to the client URL
41
42 * "check": checks if a userid is free.
43 """
44
45 # XXX should we make this path customisable so that admins can restyle it?
46 base_path = pkg_resources.resource_filename("synapse", "res/username_picker")
47
48 res = File(base_path)
49 res.putChild(b"submit", SubmitResource(hs))
50 res.putChild(b"check", AvailabilityCheckResource(hs))
51
52 return res
53
54
55 class AvailabilityCheckResource(DirectServeJsonResource):
56 def __init__(self, hs: "HomeServer"):
57 super().__init__()
58 self._sso_handler = hs.get_sso_handler()
59
60 async def _async_render_GET(self, request: Request):
61 localpart = parse_string(request, "username", required=True)
62
63 session_id = request.getCookie(USERNAME_MAPPING_SESSION_COOKIE_NAME)
64 if not session_id:
65 raise SynapseError(code=400, msg="missing session_id")
66
67 is_available = await self._sso_handler.check_username_availability(
68 localpart, session_id.decode("ascii", errors="replace")
69 )
70 return 200, {"available": is_available}
71
72
73 class SubmitResource(DirectServeHtmlResource):
74 def __init__(self, hs: "HomeServer"):
75 super().__init__()
76 self._sso_handler = hs.get_sso_handler()
77
78 async def _async_render_POST(self, request: SynapseRequest):
79 localpart = parse_string(request, "username", required=True)
80
81 session_id = request.getCookie(USERNAME_MAPPING_SESSION_COOKIE_NAME)
82 if not session_id:
83 raise SynapseError(code=400, msg="missing session_id")
84
85 await self._sso_handler.handle_submit_username_request(
86 request, localpart, session_id.decode("ascii", errors="replace")
87 )
349349
350350 @cache_in_self
351351 def get_simple_http_client(self) -> SimpleHttpClient:
352 """
353 An HTTP client with no special configuration.
354 """
352355 return SimpleHttpClient(self)
353356
354357 @cache_in_self
355358 def get_proxied_http_client(self) -> SimpleHttpClient:
359 """
360 An HTTP client that uses configured HTTP(S) proxies.
361 """
356362 return SimpleHttpClient(
357363 self,
358364 http_proxy=os.getenvb(b"http_proxy"),
360366 )
361367
362368 @cache_in_self
369 def get_proxied_blacklisted_http_client(self) -> SimpleHttpClient:
370 """
371 An HTTP client that uses configured HTTP(S) proxies and blacklists IPs
372 based on the IP range blacklist/whitelist.
373 """
374 return SimpleHttpClient(
375 self,
376 ip_whitelist=self.config.ip_range_whitelist,
377 ip_blacklist=self.config.ip_range_blacklist,
378 http_proxy=os.getenvb(b"http_proxy"),
379 https_proxy=os.getenvb(b"HTTPS_PROXY"),
380 )
381
382 @cache_in_self
383 def get_federation_http_client(self) -> MatrixFederationHttpClient:
384 """
385 An HTTP client for federation.
386 """
387 tls_client_options_factory = context_factory.FederationPolicyForHTTPS(
388 self.config
389 )
390 return MatrixFederationHttpClient(self, tls_client_options_factory)
391
392 @cache_in_self
363393 def get_room_creation_handler(self) -> RoomCreationHandler:
364394 return RoomCreationHandler(self)
365395
512542 @cache_in_self
513543 def get_pusherpool(self) -> PusherPool:
514544 return PusherPool(self)
515
516 @cache_in_self
517 def get_http_client(self) -> MatrixFederationHttpClient:
518 tls_client_options_factory = context_factory.FederationPolicyForHTTPS(
519 self.config
520 )
521 return MatrixFederationHttpClient(self, tls_client_options_factory)
522545
523546 @cache_in_self
524547 def get_media_repository_resource(self) -> MediaRepositoryResource:
594617 return StatsHandler(self)
595618
596619 @cache_in_self
597 def get_spam_checker(self):
620 def get_spam_checker(self) -> SpamChecker:
598621 return SpamChecker(self)
599622
600623 @cache_in_self
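The new HTTP client getters above all use the `@cache_in_self` decorator so each client is built once per homeserver and reused. A sketch of how such a memoizing decorator can work (an assumption for illustration; the real decorator may differ):

```python
import functools

def cache_in_self(builder):
    # Memoize a zero-argument factory method on the instance: the first
    # call stores the built object in an attribute, later calls reuse it.
    attr_name = "_cached_" + builder.__name__

    @functools.wraps(builder)
    def wrapper(self):
        if not hasattr(self, attr_name):
            setattr(self, attr_name, builder(self))
        return getattr(self, attr_name)

    return wrapper

class HomeServerSketch:
    @cache_in_self
    def get_simple_http_client(self):
        return object()  # stands in for SimpleHttpClient(self)
```

Each getter therefore behaves like a lazily-initialised singleton scoped to the homeserver object.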
782782 )
783783
784784 def get_auth_chain_difference(
785 self, state_sets: List[Set[str]]
785 self, room_id: str, state_sets: List[Set[str]]
786786 ) -> Awaitable[Set[str]]:
787787 """Given sets of state events figure out the auth chain difference (as
788788 per state res v2 algorithm).
795795 An awaitable that resolves to a set of event IDs.
796796 """
797797
798 return self.store.get_auth_chain_difference(state_sets)
798 return self.store.get_auth_chain_difference(room_id, state_sets)
3737 from synapse.api.errors import AuthError
3838 from synapse.api.room_versions import KNOWN_ROOM_VERSIONS
3939 from synapse.events import EventBase
40 from synapse.types import MutableStateMap, StateMap
40 from synapse.types import Collection, MutableStateMap, StateMap
4141 from synapse.util import Clock
4242
4343 logger = logging.getLogger(__name__)
9696
9797 # Also fetch all auth events that appear in only some of the state sets'
9898 # auth chains.
99 auth_diff = await _get_auth_chain_difference(state_sets, event_map, state_res_store)
99 auth_diff = await _get_auth_chain_difference(
100 room_id, state_sets, event_map, state_res_store
101 )
100102
101103 full_conflicted_set = set(
102104 itertools.chain(
235237
236238
237239 async def _get_auth_chain_difference(
240 room_id: str,
238241 state_sets: Sequence[StateMap[str]],
239242 event_map: Dict[str, EventBase],
240243 state_res_store: "synapse.state.StateResolutionStore",
251254 Set of event IDs
252255 """
253256
257 # The `StateResolutionStore.get_auth_chain_difference` function assumes that
258 # all events passed to it (and their auth chains) have been persisted
259 # previously. This is not the case for any events in the `event_map`, and so
260 # we need to manually handle those events.
261 #
262 # We do this by:
263 # 1. calculating the auth chain difference for the state sets based on the
264 # events in `event_map` alone
265 # 2. replacing any events in the state_sets that are also in `event_map`
266 # with their auth events (recursively), and then calling
267 # `store.get_auth_chain_difference` as normal
268 # 3. adding the results of 1 and 2 together.
269
270 # Map from each event ID in `event_map` to its auth chain, expanded as
271 # far as `event_map` allows: the event itself, its auth event IDs, and
272 # (recursively) the auth event IDs of any of those events that are
273 # themselves in `event_map`.
274 events_to_auth_chain = {} # type: Dict[str, Set[str]]
275 for event in event_map.values():
276 chain = {event.event_id}
277 events_to_auth_chain[event.event_id] = chain
278
279 to_search = [event]
280 while to_search:
281 for auth_id in to_search.pop().auth_event_ids():
282 chain.add(auth_id)
283 auth_event = event_map.get(auth_id)
284 if auth_event:
285 to_search.append(auth_event)
286
287 # We now a) calculate the auth chain difference for the unpersisted events
288 # and b) work out the state sets to pass to the store.
289 #
290 # Note: If the `event_map` is empty (which is the common case), we can do a
291 # much simpler calculation.
292 if event_map:
293 # The list of state sets to pass to the store, where each state set is a set
294 # of the event ids making up the state. This is similar to `state_sets`,
295 # except that (a) we only have event ids, not the complete
296 # ((type, state_key)->event_id) mappings; and (b) we have stripped out
297 # unpersisted events and replaced them with the persisted events in
298 # their auth chain.
299 state_sets_ids = [] # type: List[Set[str]]
300
301 # For each state set, the unpersisted event IDs reachable (by their auth
302 # chain) from the events in that set.
303 unpersisted_set_ids = [] # type: List[Set[str]]
304
305 for state_set in state_sets:
306 set_ids = set() # type: Set[str]
307 state_sets_ids.append(set_ids)
308
309 unpersisted_ids = set() # type: Set[str]
310 unpersisted_set_ids.append(unpersisted_ids)
311
312 for event_id in state_set.values():
313 event_chain = events_to_auth_chain.get(event_id)
314 if event_chain is not None:
315 # We have an event in `event_map`. We add all the auth
316 # events that it references (that aren't also in `event_map`).
317 set_ids.update(e for e in event_chain if e not in event_map)
318
319 # We also add the full chain of unpersisted event IDs
320 # referenced by this state set, so that we can work out the
321 # auth chain difference of the unpersisted events.
322 unpersisted_ids.update(e for e in event_chain if e in event_map)
323 else:
324 set_ids.add(event_id)
325
326 # The auth chain difference of the unpersisted events of the state sets
327 # is calculated by taking the difference between the union and
328 # intersections.
329 union = unpersisted_set_ids[0].union(*unpersisted_set_ids[1:])
330 intersection = unpersisted_set_ids[0].intersection(*unpersisted_set_ids[1:])
331
332 difference_from_event_map = union - intersection # type: Collection[str]
333 else:
334 difference_from_event_map = ()
335 state_sets_ids = [set(state_set.values()) for state_set in state_sets]
336
254337 difference = await state_res_store.get_auth_chain_difference(
255 [set(state_set.values()) for state_set in state_sets]
256 )
338 room_id, state_sets_ids
339 )
340 difference.update(difference_from_event_map)
257341
258342 return difference
259343
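Step 1 of the comment block above computes the auth chain difference of the unpersisted events as the union minus the intersection of the per-state-set reachable sets, exactly as the `unpersisted_set_ids` arithmetic does. A toy version of that set calculation:

```python
def unpersisted_auth_difference(unpersisted_set_ids):
    # Events reachable from some, but not all, of the state sets:
    # union of the per-set ID sets minus their intersection.
    union = unpersisted_set_ids[0].union(*unpersisted_set_ids[1:])
    intersection = unpersisted_set_ids[0].intersection(*unpersisted_set_ids[1:])
    return union - intersection
```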
573657 # We do an iterative search, replacing `event` with the power level in its
574658 # auth events (if any)
575659 while tmp_event:
576 depth = mainline_map.get(event.event_id)
660 depth = mainline_map.get(tmp_event.event_id)
577661 if depth is not None:
578662 return depth
579663
2626 data stores associated with them (e.g. the schema version tables), which are
2727 stored in `synapse.storage.schema`.
2828 """
29 from typing import TYPE_CHECKING
2930
3031 from synapse.storage.databases import Databases
3132 from synapse.storage.databases.main import DataStore
3334 from synapse.storage.purge_events import PurgeEventsStorage
3435 from synapse.storage.state import StateGroupStorage
3536
36 __all__ = ["DataStores", "DataStore"]
37 if TYPE_CHECKING:
38 from synapse.app.homeserver import HomeServer
39
40
41 __all__ = ["Databases", "DataStore"]
3742
3843
3944 class Storage:
4045 """The high level interfaces for talking to various storage layers.
4146 """
4247
43 def __init__(self, hs, stores: Databases):
48 def __init__(self, hs: "HomeServer", stores: Databases):
4449 # We include the main data store here mainly so that we don't have to
4550 # rewrite all the existing code to split it into high vs low level
4651 # interfaces.
1616 import logging
1717 import random
1818 from abc import ABCMeta
19 from typing import Any, Optional
19 from typing import TYPE_CHECKING, Any, Iterable, Optional, Union
2020
2121 from synapse.storage.database import LoggingTransaction # noqa: F401
2222 from synapse.storage.database import make_in_list_sql_clause # noqa: F401
2323 from synapse.storage.database import DatabasePool
24 from synapse.types import Collection, get_domain_from_id
24 from synapse.storage.types import Connection
25 from synapse.types import Collection, StreamToken, get_domain_from_id
2526 from synapse.util import json_decoder
27
28 if TYPE_CHECKING:
29 from synapse.app.homeserver import HomeServer
2630
2731 logger = logging.getLogger(__name__)
2832
3539 per data store (and not one per physical database).
3640 """
3741
38 def __init__(self, database: DatabasePool, db_conn, hs):
42 def __init__(self, database: DatabasePool, db_conn: Connection, hs: "HomeServer"):
3943 self.hs = hs
4044 self._clock = hs.get_clock()
4145 self.database_engine = database.engine
4246 self.db_pool = database
4347 self.rand = random.SystemRandom()
4448
45 def process_replication_rows(self, stream_name, instance_name, token, rows):
49 def process_replication_rows(
50 self,
51 stream_name: str,
52 instance_name: str,
53 token: StreamToken,
54 rows: Iterable[Any],
55 ) -> None:
4656 pass
4757
48 def _invalidate_state_caches(self, room_id, members_changed):
58 def _invalidate_state_caches(
59 self, room_id: str, members_changed: Iterable[str]
60 ) -> None:
4961 """Invalidates caches that are based on the current state, but does
5062 not stream invalidations down replication.
5163
5264 Args:
53 room_id (str): Room where state changed
54 members_changed (iterable[str]): The user_ids of members that have
55 changed
65 room_id: Room where state changed
66 members_changed: The user_ids of members that have changed
5667 """
5768 for host in {get_domain_from_id(u) for u in members_changed}:
5869 self._attempt_to_invalidate_cache("is_host_joined", (room_id, host))
6374
6475 def _attempt_to_invalidate_cache(
6576 self, cache_name: str, key: Optional[Collection[Any]]
66 ):
77 ) -> None:
6778 """Attempts to invalidate the cache of the given name, ignoring if the
6879 cache doesn't exist. Mainly used for invalidating caches on workers,
6980 where they may not have the cache.
8798 cache.invalidate(tuple(key))
8899
89100
90 def db_to_json(db_content):
101 def db_to_json(db_content: Union[memoryview, bytes, bytearray, str]) -> Any:
91102 """
92103 Take some data from a database row and return a JSON-decoded object.
93104
94105 Args:
95 db_content (memoryview|buffer|bytes|bytearray|unicode)
106 db_content: The JSON-encoded contents from the database.
107
108 Returns:
109 The object decoded from JSON.
96110 """
97111 # psycopg2 on Python 3 returns memoryview objects, which we need to
98112 # cast to bytes to decode
1111 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
1212 # See the License for the specific language governing permissions and
1313 # limitations under the License.
14
1514 import logging
16 from typing import Optional
15 from typing import TYPE_CHECKING, Awaitable, Callable, Dict, Iterable, Optional
1716
1817 from synapse.metrics.background_process_metrics import run_as_background_process
18 from synapse.storage.types import Connection
19 from synapse.types import JsonDict
1920 from synapse.util import json_encoder
2021
2122 from . import engines
23
24 if TYPE_CHECKING:
25 from synapse.app.homeserver import HomeServer
26 from synapse.storage.database import DatabasePool, LoggingTransaction
2227
2328 logger = logging.getLogger(__name__)
2429
2631 class BackgroundUpdatePerformance:
2732 """Tracks how long a background update is taking to update its items"""
2833
29 def __init__(self, name):
34 def __init__(self, name: str):
3035 self.name = name
3136 self.total_item_count = 0
32 self.total_duration_ms = 0
33 self.avg_item_count = 0
34 self.avg_duration_ms = 0
35
36 def update(self, item_count, duration_ms):
37 self.total_duration_ms = 0.0
38 self.avg_item_count = 0.0
39 self.avg_duration_ms = 0.0
40
41 def update(self, item_count: int, duration_ms: float) -> None:
3742 """Update the stats after doing an update"""
3843 self.total_item_count += item_count
3944 self.total_duration_ms += duration_ms
4348 self.avg_item_count += 0.1 * (item_count - self.avg_item_count)
4449 self.avg_duration_ms += 0.1 * (duration_ms - self.avg_duration_ms)
4550
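The `avg_item_count` and `avg_duration_ms` updates above are exponential moving averages with a smoothing factor of 0.1: each batch pulls the average 10% of the way towards the new sample, so recent batches dominate and old spikes decay geometrically. A standalone sketch:

```python
def ema(samples, alpha=0.1):
    # Same update rule as above: move the running average a fraction
    # `alpha` of the way towards each new sample.
    avg = 0.0
    for sample in samples:
        avg += alpha * (sample - avg)
    return avg
```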
46 def average_items_per_ms(self):
51 def average_items_per_ms(self) -> Optional[float]:
4752 """An estimate of the number of items processed per millisecond.
4853 Returns:
4954 A rate in items per ms as a float
5762 # changes in how long the update process takes.
5863 return float(self.avg_item_count) / float(self.avg_duration_ms)
5964
60 def total_items_per_ms(self):
65 def total_items_per_ms(self) -> Optional[float]:
6166 """An estimate of the total number of items processed per millisecond.
6267 Returns:
6368 A rate in items per ms as a float
8287 BACKGROUND_UPDATE_INTERVAL_MS = 1000
8388 BACKGROUND_UPDATE_DURATION_MS = 100
8489
85 def __init__(self, hs, database):
90 def __init__(self, hs: "HomeServer", database: "DatabasePool"):
8691 self._clock = hs.get_clock()
8792 self.db_pool = database
8893
8994 # if a background update is currently running, its name.
9095 self._current_background_update = None # type: Optional[str]
9196
92 self._background_update_performance = {}
93 self._background_update_handlers = {}
97 self._background_update_performance = (
98 {}
99 ) # type: Dict[str, BackgroundUpdatePerformance]
100 self._background_update_handlers = (
101 {}
102 ) # type: Dict[str, Callable[[JsonDict, int], Awaitable[int]]]
94103 self._all_done = False
95104
96 def start_doing_background_updates(self):
105 def start_doing_background_updates(self) -> None:
97106 run_as_background_process("background_updates", self.run_background_updates)
98107
99 async def run_background_updates(self, sleep=True):
108 async def run_background_updates(self, sleep: bool = True) -> None:
100109 logger.info("Starting background schema updates")
101110 while True:
102111 if sleep:
147156
148157 return False
149158
150 async def has_completed_background_update(self, update_name) -> bool:
159 async def has_completed_background_update(self, update_name: str) -> bool:
151160 """Check if the given background update has finished running.
152161 """
153162 if self._all_done:
172181 Returns once some amount of work is done.
173182
174183 Args:
175 desired_duration_ms(float): How long we want to spend
176 updating.
184 desired_duration_ms: How long we want to spend updating.
177185 Returns:
178186 True if we have finished running all the background updates, otherwise False
179187 """
219227 return False
220228
221229 async def _do_background_update(self, desired_duration_ms: float) -> int:
230 assert self._current_background_update is not None
222231 update_name = self._current_background_update
223232 logger.info("Starting update batch on background update '%s'", update_name)
224233
272281
273282 return len(self._background_update_performance)
274283
275 def register_background_update_handler(self, update_name, update_handler):
284 def register_background_update_handler(
285 self,
286 update_name: str,
287 update_handler: Callable[[JsonDict, int], Awaitable[int]],
288 ):
276289 """Register a handler for doing a background update.
277290
278291 The handler should take two arguments:
286299 The handler is responsible for updating the progress of the update.
287300
288301 Args:
289 update_name(str): The name of the update that this code handles.
290 update_handler(function): The function that does the update.
302 update_name: The name of the update that this code handles.
303 update_handler: The function that does the update.
291304 """
292305 self._background_update_handlers[update_name] = update_handler
293306
294 def register_noop_background_update(self, update_name):
307 def register_noop_background_update(self, update_name: str) -> None:
295308 """Register a noop handler for a background update.
296309
297310 This is useful when we previously did a background update, but no
301314 also be called to clear the update.
302315
303316 Args:
304 update_name (str): Name of update
305 """
306
307 async def noop_update(progress, batch_size):
317 update_name: Name of update
318 """
319
320 async def noop_update(progress: JsonDict, batch_size: int) -> int:
308321 await self._end_background_update(update_name)
309322 return 1
310323
312325
313326 def register_background_index_update(
314327 self,
315 update_name,
316 index_name,
317 table,
318 columns,
319 where_clause=None,
320 unique=False,
321 psql_only=False,
322 ):
328 update_name: str,
329 index_name: str,
330 table: str,
331 columns: Iterable[str],
332 where_clause: Optional[str] = None,
333 unique: bool = False,
334 psql_only: bool = False,
335 ) -> None:
323336 """Helper for store classes to do a background index addition
324337
325338 To use:
331344 2. In the Store constructor, call this method
332345
333346 Args:
334 update_name (str): update_name to register for
335 index_name (str): name of index to add
336 table (str): table to add index to
337 columns (list[str]): columns/expressions to include in index
338 unique (bool): true to make a UNIQUE index
347 update_name: update_name to register for
348 index_name: name of index to add
349 table: table to add index to
350 columns: columns/expressions to include in index
351 unique: true to make a UNIQUE index
339352 psql_only: true to only create this index on psql databases (useful
340353 for virtual sqlite tables)
341354 """
342355
343 def create_index_psql(conn):
356 def create_index_psql(conn: Connection) -> None:
344357 conn.rollback()
345358 # postgres insists on autocommit for the index
346 conn.set_session(autocommit=True)
359 conn.set_session(autocommit=True) # type: ignore
347360
348361 try:
349362 c = conn.cursor()
370383 logger.debug("[SQL] %s", sql)
371384 c.execute(sql)
372385 finally:
373 conn.set_session(autocommit=False)
374
375 def create_index_sqlite(conn):
386 conn.set_session(autocommit=False) # type: ignore
387
388 def create_index_sqlite(conn: Connection) -> None:
376389 # Sqlite doesn't support concurrent creation of indexes.
377390 #
378391 # We don't use partial indices on SQLite as it wasn't introduced
398411 c.execute(sql)
399412
400413 if isinstance(self.db_pool.engine, engines.PostgresEngine):
401 runner = create_index_psql
414 runner = create_index_psql # type: Optional[Callable[[Connection], None]]
402415 elif psql_only:
403416 runner = None
404417 else:
432445 "background_updates", keyvalues={"update_name": update_name}
433446 )
434447
435 async def _background_update_progress(self, update_name: str, progress: dict):
448 async def _background_update_progress(
449 self, update_name: str, progress: dict
450 ) -> None:
436451 """Update the progress of a background update
437452
438453 Args:
440455 progress: The progress of the update.
441456 """
442457
443 return await self.db_pool.runInteraction(
458 await self.db_pool.runInteraction(
444459 "background_update_progress",
445460 self._background_update_progress_txn,
446461 update_name,
447462 progress,
448463 )
449464
450 def _background_update_progress_txn(self, txn, update_name, progress):
465 def _background_update_progress_txn(
466 self, txn: "LoggingTransaction", update_name: str, progress: JsonDict
467 ) -> None:
451468 """Update the progress of a background update
452469
453470 Args:
454 txn(cursor): The transaction.
455 update_name(str): The name of the background update task
456 progress(dict): The progress of the update.
471 txn: The transaction.
472 update_name: The name of the background update task
473 progress: The progress of the update.
457474 """
458475
459476 progress_json = json_encoder.encode(progress)
148148 self._event_reports_id_gen = IdGenerator(db_conn, "event_reports", "id")
149149 self._push_rule_id_gen = IdGenerator(db_conn, "push_rules", "id")
150150 self._push_rules_enable_id_gen = IdGenerator(db_conn, "push_rules_enable", "id")
151 self._pushers_id_gen = StreamIdGenerator(
152 db_conn, "pushers", "id", extra_tables=[("deleted_pushers", "stream_id")]
153 )
154151 self._group_updates_id_gen = StreamIdGenerator(
155152 db_conn, "local_group_updates", "stream_id"
156153 )
341338 filters = []
342339 args = [self.hs.config.server_name]
343340
341 # `name` is in database already in lower case
344342 if name:
345 filters.append("(name LIKE ? OR displayname LIKE ?)")
346 args.extend(["@%" + name + "%:%", "%" + name + "%"])
343 filters.append("(name LIKE ? OR LOWER(displayname) LIKE ?)")
344 args.extend(["@%" + name.lower() + "%:%", "%" + name.lower() + "%"])
347345 elif user_id:
348346 filters.append("name LIKE ?")
349 args.extend(["%" + user_id + "%"])
347 args.extend(["%" + user_id.lower() + "%"])
350348
351349 if not guests:
352350 filters.append("is_guest = 0")
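The hunk above makes the admin user search case-insensitive by lower-casing the search term and wrapping `displayname` in `LOWER()` (to match `name`, which is already stored lower-case). A sqlite3 illustration of the pattern (table and data are made up; note that SQLite's `LIKE` is already ASCII case-insensitive, so the `LOWER()` matters mainly on PostgreSQL):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, displayname TEXT)")
conn.execute(
    "INSERT INTO users VALUES (?, ?)", ("@alice:example.com", "Alice Lastname")
)

term = "ALICE".lower()  # lower-case the search term, as the hunk does
rows = conn.execute(
    # `name` is stored lower-case already; LOWER() normalises displayname
    "SELECT name FROM users WHERE name LIKE ? OR LOWER(displayname) LIKE ?",
    ("@%" + term + "%:%", "%" + term + "%"),
).fetchall()
```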
1313 # limitations under the License.
1414
1515 import logging
16 from typing import Dict, Optional, Tuple
16 from typing import Dict, List, Optional, Tuple, Union
1717
1818 from synapse.metrics.background_process_metrics import wrap_as_background_process
1919 from synapse.storage._base import SQLBaseStore
2020 from synapse.storage.database import DatabasePool, make_tuple_comparison_clause
21 from synapse.types import UserID
2122 from synapse.util.caches.lrucache import LruCache
2223
2324 logger = logging.getLogger(__name__)
545546 }
546547 return ret
547548
548 async def get_user_ip_and_agents(self, user):
549 async def get_user_ip_and_agents(
550 self, user: UserID
551 ) -> List[Dict[str, Union[str, int]]]:
549552 user_id = user.to_string()
550553 results = {}
551554
5555 self._clock.looping_call(
5656 self._prune_old_outbound_device_pokes, 60 * 60 * 1000
5757 )
58
59 async def count_devices_by_users(self, user_ids: Optional[List[str]] = None) -> int:
60 """Retrieve the number of devices belonging to the given users.
61 Only counts devices that are not marked as hidden.
62
63 Args:
64 user_ids: The IDs of the users who own the devices
65 Returns:
66 The number of devices belonging to these users.
67 """
68
69 def count_devices_by_users_txn(txn, user_ids):
70 sql = """
71 SELECT count(*)
72 FROM devices
73 WHERE
74 hidden = '0' AND
75 """
76
77 clause, args = make_in_list_sql_clause(
78 txn.database_engine, "user_id", user_ids
79 )
80
81 txn.execute(sql + clause, args)
82 return txn.fetchone()[0]
83
84 if not user_ids:
85 return 0
86
87 return await self.db_pool.runInteraction(
88 "count_devices_by_users", count_devices_by_users_txn, user_ids
89 )
5890
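`count_devices_by_users` builds its `WHERE` clause with `make_in_list_sql_clause`. A rough stand-in for that helper (the real one also special-cases database engines with native array support, which is omitted here):

```python
# Sketch of make_in_list_sql_clause: expand a column name and a list of
# values into a parameterised `column IN (?, ?, ...)` clause plus its args.
def make_in_list_sql_clause_sketch(column, values):
    placeholders = ", ".join("?" for _ in values)
    return "%s IN (%s)" % (column, placeholders), list(values)

clause, args = make_in_list_sql_clause_sketch("user_id", ["@a:hs", "@b:hs"])
print(clause)  # user_id IN (?, ?)
print(args)    # ['@a:hs', '@b:hs']
```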
5991 async def get_device(self, user_id: str, device_id: str) -> Dict[str, Any]:
6092 """Retrieve a device. Only returns devices that are not marked as
136136
137137 return list(results)
138138
139 async def get_auth_chain_difference(self, state_sets: List[Set[str]]) -> Set[str]:
139 async def get_auth_chain_difference(
140 self, room_id: str, state_sets: List[Set[str]]
141 ) -> Set[str]:
140142 """Given sets of state events figure out the auth chain difference (as
141143 per state res v2 algorithm).
142144
893893 pa["actions"] = _deserialize_action(pa["actions"], pa["highlight"])
894894 return push_actions
895895
896 async def get_latest_push_action_stream_ordering(self):
897 def f(txn):
898 txn.execute("SELECT MAX(stream_ordering) FROM event_push_actions")
899 return txn.fetchone()
900
901 result = await self.db_pool.runInteraction(
902 "get_latest_push_action_stream_ordering", f
903 )
904 return result[0] or 0
905
906896 def _remove_old_push_actions_before_txn(
907897 self, txn, room_id, user_id, stream_ordering
908898 ):
2121
2222 from synapse.storage._base import SQLBaseStore
2323 from synapse.storage.keys import FetchKeyResult
24 from synapse.storage.types import Cursor
2425 from synapse.util.caches.descriptors import cached, cachedList
2526 from synapse.util.iterutils import batch_iter
2627
4344 )
4445 async def get_server_verify_keys(
4546 self, server_name_and_key_ids: Iterable[Tuple[str, str]]
46 ) -> Dict[Tuple[str, str], Optional[FetchKeyResult]]:
47 ) -> Dict[Tuple[str, str], FetchKeyResult]:
4748 """
4849 Args:
4950 server_name_and_key_ids:
5556 """
5657 keys = {}
5758
58 def _get_keys(txn, batch):
 59 def _get_keys(txn: Cursor, batch: Tuple[Tuple[str, str], ...]) -> None:
5960 """Processes a batch of keys to fetch, and adds the result to `keys`."""
6061
6162 # batch_iter always returns tuples so it's safe to do len(batch)
7677 # `ts_valid_until_ms`.
7778 ts_valid_until_ms = 0
7879
79 res = FetchKeyResult(
80 keys[(server_name, key_id)] = FetchKeyResult(
8081 verify_key=decode_verify_key_bytes(key_id, bytes(key_bytes)),
8182 valid_until_ts=ts_valid_until_ms,
8283 )
83 keys[(server_name, key_id)] = res
84
85 def _txn(txn):
84
85 def _txn(txn: Cursor) -> Dict[Tuple[str, str], FetchKeyResult]:
8686 for batch in batch_iter(server_name_and_key_ids, 50):
8787 _get_keys(txn, batch)
8888 return keys
1414 # limitations under the License.
1515
1616 import logging
17 from typing import Iterable, Iterator, List, Tuple
17 from typing import TYPE_CHECKING, Any, Dict, Iterable, Iterator, List, Optional, Tuple
1818
1919 from canonicaljson import encode_canonical_json
2020
21 from synapse.push import PusherConfig, ThrottleParams
2122 from synapse.storage._base import SQLBaseStore, db_to_json
23 from synapse.storage.database import DatabasePool
24 from synapse.storage.types import Connection
25 from synapse.storage.util.id_generators import StreamIdGenerator
26 from synapse.types import JsonDict
2227 from synapse.util.caches.descriptors import cached, cachedList
2328
29 if TYPE_CHECKING:
30 from synapse.app.homeserver import HomeServer
31
2432 logger = logging.getLogger(__name__)
2533
2634
2735 class PusherWorkerStore(SQLBaseStore):
28 def _decode_pushers_rows(self, rows: Iterable[dict]) -> Iterator[dict]:
36 def __init__(self, database: DatabasePool, db_conn: Connection, hs: "HomeServer"):
37 super().__init__(database, db_conn, hs)
38 self._pushers_id_gen = StreamIdGenerator(
39 db_conn, "pushers", "id", extra_tables=[("deleted_pushers", "stream_id")]
40 )
41
42 def _decode_pushers_rows(self, rows: Iterable[dict]) -> Iterator[PusherConfig]:
2943 """JSON-decode the data in the rows returned from the `pushers` table
3044
3145 Drops any rows whose data cannot be decoded
4357 )
4458 continue
4559
46 yield r
47
48 async def user_has_pusher(self, user_id):
60 yield PusherConfig(**r)
61
62 async def user_has_pusher(self, user_id: str) -> bool:
4963 ret = await self.db_pool.simple_select_one_onecol(
5064 "pushers", {"user_name": user_id}, "id", allow_none=True
5165 )
5266 return ret is not None
5367
54 def get_pushers_by_app_id_and_pushkey(self, app_id, pushkey):
55 return self.get_pushers_by({"app_id": app_id, "pushkey": pushkey})
56
57 def get_pushers_by_user_id(self, user_id):
58 return self.get_pushers_by({"user_name": user_id})
59
60 async def get_pushers_by(self, keyvalues):
68 async def get_pushers_by_app_id_and_pushkey(
69 self, app_id: str, pushkey: str
70 ) -> Iterator[PusherConfig]:
71 return await self.get_pushers_by({"app_id": app_id, "pushkey": pushkey})
72
73 async def get_pushers_by_user_id(self, user_id: str) -> Iterator[PusherConfig]:
74 return await self.get_pushers_by({"user_name": user_id})
75
76 async def get_pushers_by(self, keyvalues: Dict[str, Any]) -> Iterator[PusherConfig]:
6177 ret = await self.db_pool.simple_select_list(
6278 "pushers",
6379 keyvalues,
8298 )
8399 return self._decode_pushers_rows(ret)
84100
85 async def get_all_pushers(self):
101 async def get_all_pushers(self) -> Iterator[PusherConfig]:
86102 def get_pushers(txn):
87103 txn.execute("SELECT * FROM pushers")
88104 rows = self.db_pool.cursor_to_dict(txn)
158174 )
159175
160176 @cached(num_args=1, max_entries=15000)
161 async def get_if_user_has_pusher(self, user_id):
177 async def get_if_user_has_pusher(self, user_id: str):
162178 # This only exists for the cachedList decorator
163179 raise NotImplementedError()
164180
165181 @cachedList(
166182 cached_method_name="get_if_user_has_pusher", list_name="user_ids", num_args=1,
167183 )
168 async def get_if_users_have_pushers(self, user_ids):
184 async def get_if_users_have_pushers(
185 self, user_ids: Iterable[str]
186 ) -> Dict[str, bool]:
169187 rows = await self.db_pool.simple_select_many_batch(
170188 table="pushers",
171189 column="user_name",
223241 return bool(updated)
224242
225243 async def update_pusher_failing_since(
226 self, app_id, pushkey, user_id, failing_since
244 self, app_id: str, pushkey: str, user_id: str, failing_since: Optional[int]
227245 ) -> None:
228246 await self.db_pool.simple_update(
229247 table="pushers",
232250 desc="update_pusher_failing_since",
233251 )
234252
235 async def get_throttle_params_by_room(self, pusher_id):
253 async def get_throttle_params_by_room(
254 self, pusher_id: str
255 ) -> Dict[str, ThrottleParams]:
236256 res = await self.db_pool.simple_select_list(
237257 "pusher_throttle",
238258 {"pusher": pusher_id},
242262
243263 params_by_room = {}
244264 for row in res:
245 params_by_room[row["room_id"]] = {
246 "last_sent_ts": row["last_sent_ts"],
247 "throttle_ms": row["throttle_ms"],
248 }
265 params_by_room[row["room_id"]] = ThrottleParams(
266 row["last_sent_ts"], row["throttle_ms"],
267 )
249268
250269 return params_by_room
251270
252 async def set_throttle_params(self, pusher_id, room_id, params) -> None:
271 async def set_throttle_params(
272 self, pusher_id: str, room_id: str, params: ThrottleParams
273 ) -> None:
253274 # no need to lock because `pusher_throttle` has a primary key on
254275 # (pusher, room_id) so simple_upsert will retry
255276 await self.db_pool.simple_upsert(
256277 "pusher_throttle",
257278 {"pusher": pusher_id, "room_id": room_id},
258 params,
279 {"last_sent_ts": params.last_sent_ts, "throttle_ms": params.throttle_ms},
259280 desc="set_throttle_params",
260281 lock=False,
261282 )
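The throttle hunks above replace ad-hoc `{"last_sent_ts": …, "throttle_ms": …}` dicts with a typed `ThrottleParams` container. A sketch of the assumed shape of `synapse.push.ThrottleParams` (modelled here with a stdlib dataclass rather than the attrs class Synapse uses):

```python
from dataclasses import dataclass

# Assumed shape of synapse.push.ThrottleParams: a typed container for the
# two throttle settings previously passed around as a plain dict.
@dataclass
class ThrottleParams:
    last_sent_ts: int
    throttle_ms: int

# Building one from a pusher_throttle row, as get_throttle_params_by_room does.
row = {"room_id": "!room:hs", "last_sent_ts": 1000, "throttle_ms": 60000}
params = ThrottleParams(row["last_sent_ts"], row["throttle_ms"])
print(params.throttle_ms)  # 60000
```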
262283
263284
264285 class PusherStore(PusherWorkerStore):
265 def get_pushers_stream_token(self):
286 def get_pushers_stream_token(self) -> int:
266287 return self._pushers_id_gen.get_current_token()
267288
268289 async def add_pusher(
269290 self,
270 user_id,
271 access_token,
272 kind,
273 app_id,
274 app_display_name,
275 device_display_name,
276 pushkey,
277 pushkey_ts,
278 lang,
279 data,
280 last_stream_ordering,
281 profile_tag="",
291 user_id: str,
292 access_token: Optional[int],
293 kind: str,
294 app_id: str,
295 app_display_name: str,
296 device_display_name: str,
297 pushkey: str,
298 pushkey_ts: int,
299 lang: Optional[str],
300 data: Optional[JsonDict],
301 last_stream_ordering: int,
302 profile_tag: str = "",
282303 ) -> None:
283304 async with self._pushers_id_gen.get_next() as stream_id:
284305 # no need to lock because `pushers` has a unique key on
310331 # invalidate, since the user might not have had a pusher before
311332 await self.db_pool.runInteraction(
312333 "add_pusher",
313 self._invalidate_cache_and_stream,
334 self._invalidate_cache_and_stream, # type: ignore
314335 self.get_if_user_has_pusher,
315336 (user_id,),
316337 )
317338
318339 async def delete_pusher_by_app_id_pushkey_user_id(
319 self, app_id, pushkey, user_id
340 self, app_id: str, pushkey: str, user_id: str
320341 ) -> None:
321342 def delete_pusher_txn(txn, stream_id):
322 self._invalidate_cache_and_stream(
343 self._invalidate_cache_and_stream( # type: ignore
323344 txn, self.get_if_user_has_pusher, (user_id,)
324345 )
325346
462462 desc="get_user_by_external_id",
463463 )
464464
465 async def get_external_ids_by_user(self, mxid: str) -> List[Tuple[str, str]]:
466 """Look up external ids for the given user
467
468 Args:
469 mxid: the MXID to be looked up
470
471 Returns:
472 Tuples of (auth_provider, external_id)
473 """
474 res = await self.db_pool.simple_select_list(
475 table="user_external_ids",
476 keyvalues={"user_id": mxid},
477 retcols=("auth_provider", "external_id"),
478 desc="get_external_ids_by_user",
479 )
480 return [(r["auth_provider"], r["external_id"]) for r in res]
481
465482 async def count_all_users(self):
466483 """Counts all users registered on the homeserver."""
467484
925942 desc="del_user_pending_deactivation",
926943 )
927944
945 async def get_access_token_last_validated(self, token_id: int) -> int:
946 """Retrieves the time (in milliseconds) of the last validation of an access token.
947
948 Args:
 949 token_id: The ID of the access token to look up.
950 Raises:
951 StoreError if the access token was not found.
952
953 Returns:
954 The last validation time.
955 """
956 result = await self.db_pool.simple_select_one_onecol(
957 "access_tokens", {"id": token_id}, "last_validated"
958 )
959
960 # If this token has not been validated (since starting to track this),
961 # return 0 instead of None.
962 return result or 0
963
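The `result or 0` fallback above pairs with the schema delta further down: adding `last_validated` without a default leaves pre-existing rows as NULL, which the read path maps to 0. A runnable sketch of that pattern with SQLite standing in for the real database:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE access_tokens (id INTEGER PRIMARY KEY)")
conn.execute("INSERT INTO access_tokens (id) VALUES (1)")

# Adding the column without a default leaves existing rows NULL...
conn.execute("ALTER TABLE access_tokens ADD COLUMN last_validated BIGINT")

(value,) = conn.execute(
    "SELECT last_validated FROM access_tokens WHERE id = 1"
).fetchone()
# ...which the read path coerces to 0 for never-validated tokens.
print(value or 0)  # 0
```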
964 async def update_access_token_last_validated(self, token_id: int) -> None:
965 """Updates the last time an access token was validated.
966
967 Args:
968 token_id: The ID of the access token to update.
969 Raises:
970 StoreError if there was a problem updating this.
971 """
972 now = self._clock.time_msec()
973
974 await self.db_pool.simple_update_one(
975 "access_tokens",
976 {"id": token_id},
977 {"last_validated": now},
978 desc="update_access_token_last_validated",
979 )
980
928981
929982 class RegistrationBackgroundUpdateStore(RegistrationWorkerStore):
930983 def __init__(self, database: DatabasePool, db_conn: Connection, hs: "HomeServer"):
9601013
9611014 self.db_pool.updates.register_background_update_handler(
9621015 "users_set_deactivated_flag", self._background_update_set_deactivated_flag
1016 )
1017
1018 self.db_pool.updates.register_background_index_update(
1019 "user_external_ids_user_id_idx",
1020 index_name="user_external_ids_user_id_idx",
1021 table="user_external_ids",
1022 columns=["user_id"],
1023 unique=False,
9631024 )
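The background update registered above ultimately builds an ordinary non-unique index on `user_external_ids(user_id)`; Synapse runs it incrementally so index creation doesn't block startup. A sketch of the end result, using SQLite in place of the real engine:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE user_external_ids "
    "(auth_provider TEXT, external_id TEXT, user_id TEXT)"
)
# The equivalent of what the background update eventually creates.
conn.execute(
    "CREATE INDEX user_external_ids_user_id_idx ON user_external_ids (user_id)"
)
names = [row[1] for row in conn.execute("PRAGMA index_list('user_external_ids')")]
print(names)  # ['user_external_ids_user_id_idx']
```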
9641025
9651026 async def _background_update_set_deactivated_flag(self, progress, batch_size):
11241185 The token ID
11251186 """
11261187 next_id = self._access_tokens_id_gen.get_next()
1188 now = self._clock.time_msec()
11271189
11281190 await self.db_pool.simple_insert(
11291191 "access_tokens",
11341196 "device_id": device_id,
11351197 "valid_until_ms": valid_until_ms,
11361198 "puppets_user_id": puppets_user_id,
1199 "last_validated": now,
11371200 },
11381201 desc="add_access_token_to_user",
11391202 )
378378 # Filter room names by a string
379379 where_statement = ""
380380 if search_term:
381 where_statement = "WHERE state.name LIKE ?"
381 where_statement = "WHERE LOWER(state.name) LIKE ?"
382382
383383 # Our postgres db driver converts ? -> %s in SQL strings as that's the
384384 # placeholder for postgres.
385385 # HOWEVER, if you put a % into your SQL then everything goes wibbly.
386386 # To get around this, we're going to surround search_term with %'s
387387 # before giving it to the database in python instead
388 search_term = "%" + search_term + "%"
388 search_term = "%" + search_term.lower() + "%"
389389
390390 # Set ordering
391391 if RoomSortOrder(order_by) == RoomSortOrder.SIZE:
0 /* Copyright 2020 The Matrix.org Foundation C.I.C
1 *
2 * Licensed under the Apache License, Version 2.0 (the "License");
3 * you may not use this file except in compliance with the License.
4 * You may obtain a copy of the License at
5 *
6 * http://www.apache.org/licenses/LICENSE-2.0
7 *
8 * Unless required by applicable law or agreed to in writing, software
9 * distributed under the License is distributed on an "AS IS" BASIS,
10 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
11 * See the License for the specific language governing permissions and
12 * limitations under the License.
13 */
14
15 INSERT INTO background_updates (ordering, update_name, progress_json) VALUES
16 (5825, 'user_external_ids_user_id_idx', '{}');
0 /* Copyright 2020 The Matrix.org Foundation C.I.C
1 *
2 * Licensed under the Apache License, Version 2.0 (the "License");
3 * you may not use this file except in compliance with the License.
4 * You may obtain a copy of the License at
5 *
6 * http://www.apache.org/licenses/LICENSE-2.0
7 *
8 * Unless required by applicable law or agreed to in writing, software
9 * distributed under the License is distributed on an "AS IS" BASIS,
10 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
11 * See the License for the specific language governing permissions and
12 * limitations under the License.
13 */
14
15 -- The last time this access token was "validated" (i.e. logged in or succeeded
16 -- at user-interactive authentication).
17 ALTER TABLE access_tokens ADD COLUMN last_validated BIGINT;
0 /*
1 * Copyright 2020 The Matrix.org Foundation C.I.C.
2 *
3 * Licensed under the Apache License, Version 2.0 (the "License");
4 * you may not use this file except in compliance with the License.
5 * You may obtain a copy of the License at
6 *
7 * http://www.apache.org/licenses/LICENSE-2.0
8 *
9 * Unless required by applicable law or agreed to in writing, software
10 * distributed under the License is distributed on an "AS IS" BASIS,
11 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 * See the License for the specific language governing permissions and
13 * limitations under the License.
14 */
15
16 -- This is unused since Synapse v1.17.0.
17 DROP TABLE local_invites;
1616 import re
1717 from typing import Any, Dict, Iterable, Optional, Set, Tuple
1818
19 from synapse.api.constants import EventTypes, JoinRules
19 from synapse.api.constants import EventTypes, HistoryVisibility, JoinRules
2020 from synapse.storage.database import DatabasePool
2121 from synapse.storage.databases.main.state import StateFilter
2222 from synapse.storage.databases.main.state_deltas import StateDeltasStore
359359 if hist_vis_id:
360360 hist_vis_ev = await self.get_event(hist_vis_id, allow_none=True)
361361 if hist_vis_ev:
362 if hist_vis_ev.content.get("history_visibility") == "world_readable":
362 if (
363 hist_vis_ev.content.get("history_visibility")
364 == HistoryVisibility.WORLD_READABLE
365 ):
363366 return True
364367
365368 return False
392395 sql = """
393396 INSERT INTO user_directory_search(user_id, vector)
394397 VALUES (?,
395 setweight(to_tsvector('english', ?), 'A')
396 || setweight(to_tsvector('english', ?), 'D')
397 || setweight(to_tsvector('english', COALESCE(?, '')), 'B')
398 setweight(to_tsvector('simple', ?), 'A')
399 || setweight(to_tsvector('simple', ?), 'D')
400 || setweight(to_tsvector('simple', COALESCE(?, '')), 'B')
398401 ) ON CONFLICT (user_id) DO UPDATE SET vector=EXCLUDED.vector
399402 """
400403 txn.execute(
414417 sql = """
415418 INSERT INTO user_directory_search(user_id, vector)
416419 VALUES (?,
417 setweight(to_tsvector('english', ?), 'A')
418 || setweight(to_tsvector('english', ?), 'D')
419 || setweight(to_tsvector('english', COALESCE(?, '')), 'B')
420 setweight(to_tsvector('simple', ?), 'A')
421 || setweight(to_tsvector('simple', ?), 'D')
422 || setweight(to_tsvector('simple', COALESCE(?, '')), 'B')
420423 )
421424 """
422425 txn.execute(
431434 elif new_entry is False:
432435 sql = """
433436 UPDATE user_directory_search
434 SET vector = setweight(to_tsvector('english', ?), 'A')
435 || setweight(to_tsvector('english', ?), 'D')
436 || setweight(to_tsvector('english', COALESCE(?, '')), 'B')
437 SET vector = setweight(to_tsvector('simple', ?), 'A')
438 || setweight(to_tsvector('simple', ?), 'D')
439 || setweight(to_tsvector('simple', COALESCE(?, '')), 'B')
437440 WHERE user_id = ?
438441 """
439442 txn.execute(
760763 INNER JOIN user_directory AS d USING (user_id)
761764 WHERE
762765 %s
763 AND vector @@ to_tsquery('english', ?)
766 AND vector @@ to_tsquery('simple', ?)
764767 ORDER BY
765768 (CASE WHEN d.user_id IS NOT NULL THEN 4.0 ELSE 1.0 END)
766769 * (CASE WHEN display_name IS NOT NULL THEN 1.2 ELSE 1.0 END)
769772 3 * ts_rank_cd(
770773 '{0.1, 0.1, 0.9, 1.0}',
771774 vector,
772 to_tsquery('english', ?),
775 to_tsquery('simple', ?),
773776 8
774777 )
775778 + ts_rank_cd(
776779 '{0.1, 0.1, 0.9, 1.0}',
777780 vector,
778 to_tsquery('english', ?),
781 to_tsquery('simple', ?),
779782 8
780783 )
781784 )
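Switching the text-search configuration from `'english'` to `'simple'` matters because `'english'` stems words and removes stop words, so display names made of common English words can become unsearchable. The real behaviour lives in Postgres; the snippet below is an illustration only, with a tiny assumed stop-word subset:

```python
# Illustrative approximation of the two Postgres text-search configs.
ENGLISH_STOP_WORDS = {"the", "a", "of", "who"}  # assumed tiny subset

def lexemes(text, config):
    words = text.lower().split()
    if config == "english":
        # 'english' drops stop words (and, in Postgres, also stems).
        return [w for w in words if w not in ENGLISH_STOP_WORDS]
    # 'simple' keeps every token unstemmed.
    return words

print(lexemes("The Who", "english"))  # []
print(lexemes("The Who", "simple"))   # ['the', 'who']
```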
1616 import logging
1717
1818 import attr
19 from signedjson.types import VerifyKey
1920
2021 logger = logging.getLogger(__name__)
2122
2223
2324 @attr.s(slots=True, frozen=True)
2425 class FetchKeyResult:
25 verify_key = attr.ib() # VerifyKey: the key itself
26 valid_until_ts = attr.ib() # int: how long we can use this key for
26 verify_key = attr.ib(type=VerifyKey) # the key itself
 27 valid_until_ts = attr.ib(type=int) # the timestamp until which we can use this key
3030 from synapse.metrics.background_process_metrics import run_as_background_process
3131 from synapse.storage.databases import Databases
3232 from synapse.storage.databases.main.events import DeltaState
33 from synapse.types import Collection, PersistedEventPosition, RoomStreamToken, StateMap
33 from synapse.storage.databases.main.events_worker import EventRedactBehaviour
34 from synapse.types import (
35 Collection,
36 PersistedEventPosition,
37 RoomStreamToken,
38 StateMap,
39 get_domain_from_id,
40 )
3441 from synapse.util.async_helpers import ObservableDeferred
3542 from synapse.util.metrics import Measure
3643
6572 "synapse_storage_events_stale_forward_extremities_persisted",
6673 "Number of unchanged forward extremities for each new event",
6774 buckets=(0, 1, 2, 3, 5, 7, 10, 15, 20, 50, 100, 200, 500, "+Inf"),
75 )
76
77 state_resolutions_during_persistence = Counter(
78 "synapse_storage_events_state_resolutions_during_persistence",
79 "Number of times we had to do state res to calculate new current state",
80 )
81
82 potential_times_prune_extremities = Counter(
83 "synapse_storage_events_potential_times_prune_extremities",
84 "Number of times we might be able to prune extremities",
85 )
86
87 times_pruned_extremities = Counter(
88 "synapse_storage_events_times_pruned_extremities",
 89 "Number of times we were actually able to prune extremities",
6890 )
6991
7092
453475 latest_event_ids,
454476 new_latest_event_ids,
455477 )
456 current_state, delta_ids = res
478 current_state, delta_ids, new_latest_event_ids = res
479
480 # there should always be at least one forward extremity.
481 # (except during the initial persistence of the send_join
482 # results, in which case there will be no existing
483 # extremities, so we'll `continue` above and skip this bit.)
484 assert new_latest_event_ids, "No forward extremities left!"
485
486 new_forward_extremeties[room_id] = new_latest_event_ids
457487
458488 # If either are not None then there has been a change,
459489 # and we need to work out the delta (or use that
572602 self,
573603 room_id: str,
574604 events_context: List[Tuple[EventBase, EventContext]],
575 old_latest_event_ids: Iterable[str],
576 new_latest_event_ids: Iterable[str],
577 ) -> Tuple[Optional[StateMap[str]], Optional[StateMap[str]]]:
605 old_latest_event_ids: Set[str],
606 new_latest_event_ids: Set[str],
607 ) -> Tuple[Optional[StateMap[str]], Optional[StateMap[str]], Set[str]]:
578608 """Calculate the current state dict after adding some new events to
579609 a room
580610
581611 Args:
582 room_id (str):
612 room_id:
583613 room to which the events are being added. Used for logging etc
584614
585 events_context (list[(EventBase, EventContext)]):
615 events_context:
586616 events and contexts which are being added to the room
587617
588 old_latest_event_ids (iterable[str]):
618 old_latest_event_ids:
589619 the old forward extremities for the room.
590620
591 new_latest_event_ids (iterable[str]):
 621 new_latest_event_ids:
592622 the new forward extremities for the room.
593623
594624 Returns:
595 Returns a tuple of two state maps, the first being the full new current
596 state and the second being the delta to the existing current state.
597 If both are None then there has been no change.
625 Returns a tuple of two state maps and a set of new forward
626 extremities.
627
628 The first state map is the full new current state and the second
629 is the delta to the existing current state. If both are None then
630 there has been no change.
631
632 The function may prune some old entries from the set of new
633 forward extremities if it's safe to do so.
598634
599635 If there has been a change then we only return the delta if its
600636 already been calculated. Conversely if we do know the delta then
671707 # If the old and new groups are the same then we don't need to do
672708 # anything.
673709 if old_state_groups == new_state_groups:
674 return None, None
710 return None, None, new_latest_event_ids
675711
676712 if len(new_state_groups) == 1 and len(old_state_groups) == 1:
677713 # If we're going from one state group to another, lets check if
688724 # the current state in memory then lets also return that,
689725 # but it doesn't matter if we don't.
690726 new_state = state_groups_map.get(new_state_group)
691 return new_state, delta_ids
727 return new_state, delta_ids, new_latest_event_ids
692728
693729 # Now that we have calculated new_state_groups we need to get
694730 # their state IDs so we can resolve to a single state set.
700736 if len(new_state_groups) == 1:
701737 # If there is only one state group, then we know what the current
702738 # state is.
703 return state_groups_map[new_state_groups.pop()], None
739 return state_groups_map[new_state_groups.pop()], None, new_latest_event_ids
704740
705741 # Ok, we need to defer to the state handler to resolve our state sets.
706742
733769 state_res_store=StateResolutionStore(self.main_store),
734770 )
735771
736 return res.state, None
772 state_resolutions_during_persistence.inc()
773
774 # If the returned state matches the state group of one of the new
775 # forward extremities then we check if we are able to prune some state
776 # extremities.
777 if res.state_group and res.state_group in new_state_groups:
778 new_latest_event_ids = await self._prune_extremities(
779 room_id,
780 new_latest_event_ids,
781 res.state_group,
782 event_id_to_state_group,
783 events_context,
784 )
785
786 return res.state, None, new_latest_event_ids
787
788 async def _prune_extremities(
789 self,
790 room_id: str,
791 new_latest_event_ids: Set[str],
792 resolved_state_group: int,
793 event_id_to_state_group: Dict[str, int],
794 events_context: List[Tuple[EventBase, EventContext]],
795 ) -> Set[str]:
796 """See if we can prune any of the extremities after calculating the
797 resolved state.
798 """
799 potential_times_prune_extremities.inc()
800
801 # We keep all the extremities that have the same state group, and
802 # see if we can drop the others.
803 new_new_extrems = {
804 e
805 for e in new_latest_event_ids
806 if event_id_to_state_group[e] == resolved_state_group
807 }
808
809 dropped_extrems = set(new_latest_event_ids) - new_new_extrems
810
811 logger.debug("Might drop extremities: %s", dropped_extrems)
812
813 # We only drop events from the extremities list if:
814 # 1. we're not currently persisting them;
815 # 2. they're not our own events (or are dummy events); and
816 # 3. they're either:
817 # 1. over N hours old and more than N events ago (we use depth to
818 # calculate); or
819 # 2. we are persisting an event from the same domain and more than
820 # M events ago.
821 #
822 # The idea is that we don't want to drop events that are "legitimate"
823 # extremities (that we would want to include as prev events), only
824 # "stuck" extremities that are e.g. due to a gap in the graph.
825 #
826 # Note that we either drop all of them or none of them. If we only drop
827 # some of the events we don't know if state res would come to the same
828 # conclusion.
829
830 for ev, _ in events_context:
831 if ev.event_id in dropped_extrems:
832 logger.debug(
833 "Not dropping extremities: %s is being persisted", ev.event_id
834 )
835 return new_latest_event_ids
836
837 dropped_events = await self.main_store.get_events(
838 dropped_extrems,
839 allow_rejected=True,
840 redact_behaviour=EventRedactBehaviour.AS_IS,
841 )
842
843 new_senders = {get_domain_from_id(e.sender) for e, _ in events_context}
844
845 one_day_ago = self._clock.time_msec() - 24 * 60 * 60 * 1000
846 current_depth = max(e.depth for e, _ in events_context)
847 for event in dropped_events.values():
848 # If the event is a local dummy event then we should check it
849 # doesn't reference any local events, as we want to reference those
850 # if we send any new events.
851 #
852 # Note we do this recursively to handle the case where a dummy event
853 # references a dummy event that only references remote events.
854 #
855 # Ideally we'd figure out a way of still being able to drop old
856 # dummy events that reference local events, but this is good enough
857 # as a first cut.
858 events_to_check = [event]
859 while events_to_check:
860 new_events = set()
861 for event_to_check in events_to_check:
862 if self.is_mine_id(event_to_check.sender):
863 if event_to_check.type != EventTypes.Dummy:
864 logger.debug("Not dropping own event")
865 return new_latest_event_ids
866 new_events.update(event_to_check.prev_event_ids())
867
868 prev_events = await self.main_store.get_events(
869 new_events,
870 allow_rejected=True,
871 redact_behaviour=EventRedactBehaviour.AS_IS,
872 )
873 events_to_check = prev_events.values()
874
875 if (
876 event.origin_server_ts < one_day_ago
877 and event.depth < current_depth - 100
878 ):
879 continue
880
881 # We can be less conservative about dropping extremities from the
882 # same domain, though we do want to wait a little bit (otherwise
883 # we'll immediately remove all extremities from a given server).
884 if (
885 get_domain_from_id(event.sender) in new_senders
886 and event.depth < current_depth - 20
887 ):
888 continue
889
890 logger.debug(
891 "Not dropping as too new and not in new_senders: %s", new_senders,
892 )
893
894 return new_latest_event_ids
895
896 times_pruned_extremities.inc()
897
898 logger.info(
899 "Pruning forward extremities in room %s: from %s -> %s",
900 room_id,
901 new_latest_event_ids,
902 new_new_extrems,
903 )
904 return new_new_extrems
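The first drop criterion described in the comments above can be sketched in isolation: an extremity is a pruning candidate when it is over a day old *and* more than 100 events behind the deepest event being persisted. (The same-domain rule and the recursive dummy-event walk are omitted here; the thresholds mirror the constants in the hunk.)

```python
# Sketch of the age/depth test used when pruning stuck extremities.
def old_and_far_behind(event_ts_ms, event_depth, now_ms, current_depth):
    one_day_ago = now_ms - 24 * 60 * 60 * 1000
    return event_ts_ms < one_day_ago and event_depth < current_depth - 100

two_days_ms = 2 * 24 * 60 * 60 * 1000
print(old_and_far_behind(0, 10, two_days_ms, 500))   # True: old and far behind
print(old_and_far_behind(0, 450, two_days_ms, 500))  # False: too close in depth
```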
737905
738906 async def _calculate_state_delta(
739907 self, room_id: str, current_state: StateMap[str]
1717 import os
1818 import re
1919 from collections import Counter
20 from typing import Optional, TextIO
20 from typing import Generator, Iterable, List, Optional, TextIO, Tuple
2121
2222 import attr
23 from typing_extensions import Counter as CounterType
2324
2425 from synapse.config.homeserver import HomeServerConfig
2526 from synapse.storage.database import LoggingDatabaseConnection
6970 db_conn: LoggingDatabaseConnection,
7071 database_engine: BaseDatabaseEngine,
7172 config: Optional[HomeServerConfig],
72 databases: Collection[str] = ["main", "state"],
73 databases: Collection[str] = ("main", "state"),
7374 ):
7475 """Prepares a physical database for usage. Will either create all necessary tables
7576 or upgrade from an older schema version.
154155 raise
155156
156157
157 def _setup_new_database(cur, database_engine, databases):
158 def _setup_new_database(
159 cur: Cursor, database_engine: BaseDatabaseEngine, databases: Collection[str]
160 ) -> None:
158161 """Sets up the physical database by finding a base set of "full schemas" and
159162 then applying any necessary deltas, including schemas from the given data
160163 stores.
187190 folder as well those in the data stores specified.
188191
189192 Args:
190 cur (Cursor): a database cursor
191 database_engine (DatabaseEngine)
192 databases (list[str]): The names of the databases to instantiate
193 on the given physical database.
193 cur: a database cursor
194 database_engine
195 databases: The names of the databases to instantiate on the given physical database.
194196 """
195197
196198 # We're about to set up a brand new database so we check that its
198200 database_engine.check_new_database(cur)
199201
200202 current_dir = os.path.join(dir_path, "schema", "full_schemas")
201 directory_entries = os.listdir(current_dir)
202203
203204 # First we find the highest full schema version we have
204205 valid_versions = []
205206
206 for filename in directory_entries:
207 for filename in os.listdir(current_dir):
207208 try:
208209 ver = int(filename)
209210 except ValueError:
236237 for database in databases
237238 )
238239
239 directory_entries = []
240 directory_entries = [] # type: List[_DirectoryListing]
240241 for directory in directories:
241242 directory_entries.extend(
242243 _DirectoryListing(file_name, os.path.join(directory, file_name))
274275
275276
276277 def _upgrade_existing_database(
277 cur,
278 current_version,
279 applied_delta_files,
280 upgraded,
281 database_engine,
282 config,
283 databases,
284 is_empty=False,
285 ):
278 cur: Cursor,
279 current_version: int,
280 applied_delta_files: List[str],
281 upgraded: bool,
282 database_engine: BaseDatabaseEngine,
283 config: Optional[HomeServerConfig],
284 databases: Collection[str],
285 is_empty: bool = False,
286 ) -> None:
286287 """Upgrades an existing physical database.
287288
288289 Delta files can either be SQL stored in *.sql files, or python modules
322323 for a version before applying those in the next version.
323324
324325 Args:
325 cur (Cursor)
326 current_version (int): The current version of the schema.
327 applied_delta_files (list): A list of deltas that have already been
328 applied.
329 upgraded (bool): Whether the current version was generated by having
326 cur
327 current_version: The current version of the schema.
328 applied_delta_files: A list of deltas that have already been applied.
329 upgraded: Whether the current version was generated by having
330330 applied deltas or from full schema file. If `True` the function
331331 will never apply delta files for the given `current_version`, since
332332 the current_version wasn't generated by applying those delta files.
333 database_engine (DatabaseEngine)
334 config (synapse.config.homeserver.HomeServerConfig|None):
333 database_engine
334 config:
335335 None if we are initialising a blank database, otherwise the application
336336 config
337 databases (list[str]): The names of the databases to instantiate
337 databases: The names of the databases to instantiate
338338 on the given physical database.
339 is_empty (bool): Is this a blank database? I.e. do we need to run the
339 is_empty: Is this a blank database? I.e. do we need to run the
340340 upgrade portions of the delta scripts.
341341 """
342342 if is_empty:
357357 if not is_empty and "main" in databases:
358358 from synapse.storage.databases.main import check_database_before_upgrade
359359
360 assert config is not None
360361 check_database_before_upgrade(cur, database_engine, config)
361362
362363 start_ver = current_version
387388 )
388389
389390 # Used to check if we have any duplicate file names
390 file_name_counter = Counter()
391 file_name_counter = Counter() # type: CounterType[str]
391392
392393 # Now find which directories have anything of interest.
393 directory_entries = []
394 directory_entries = [] # type: List[_DirectoryListing]
394395 for directory in directories:
395396 logger.debug("Looking for schema deltas in %s", directory)
396397 try:
444445
445446 module_name = "synapse.storage.v%d_%s" % (v, root_name)
446447 with open(absolute_path) as python_file:
447 module = imp.load_source(module_name, absolute_path, python_file)
448 module = imp.load_source(module_name, absolute_path, python_file) # type: ignore
448449 logger.info("Running script %s", relative_path)
449 module.run_create(cur, database_engine)
450 module.run_create(cur, database_engine) # type: ignore
450451 if not is_empty:
451 module.run_upgrade(cur, database_engine, config=config)
452 module.run_upgrade(cur, database_engine, config=config) # type: ignore
452453 elif ext == ".pyc" or file_name == "__pycache__":
453454 # Sometimes .pyc files turn up anyway even though we've
454455 # disabled their generation; e.g. from distribution package
496497 logger.info("Schema now up to date")
497498
498499
499 def _apply_module_schemas(txn, database_engine, config):
500 def _apply_module_schemas(
501 txn: Cursor, database_engine: BaseDatabaseEngine, config: HomeServerConfig
502 ) -> None:
500503 """Apply the module schemas for the dynamic modules, if any
501504
502505 Args:
503506 cur: database cursor
504 database_engine: synapse database engine class
505 config (synapse.config.homeserver.HomeServerConfig):
506 application config
507 database_engine:
508 config: application config
507509 """
508510 for (mod, _config) in config.password_providers:
509511 if not hasattr(mod, "get_db_schema_files"):
514516 )
515517
516518
517 def _apply_module_schema_files(cur, database_engine, modname, names_and_streams):
519 def _apply_module_schema_files(
520 cur: Cursor,
521 database_engine: BaseDatabaseEngine,
522 modname: str,
523 names_and_streams: Iterable[Tuple[str, TextIO]],
524 ) -> None:
518525 """Apply the module schemas for a single module
519526
520527 Args:
521528 cur: database cursor
522529 database_engine: synapse database engine class
523 modname (str): fully qualified name of the module
524 names_and_streams (Iterable[(str, file)]): the names and streams of
525 schemas to be applied
530 modname: fully qualified name of the module
531 names_and_streams: the names and streams of schemas to be applied
526532 """
527533 cur.execute(
528534 "SELECT file FROM applied_module_schemas WHERE module_name = ?", (modname,),
548554 )
549555
550556
551 def get_statements(f):
557 def get_statements(f: Iterable[str]) -> Generator[str, None, None]:
552558 statement_buffer = ""
553559 in_comment = False # If we're in a /* ... */ style comment
554560
593599 statement_buffer = statements[-1].strip()
594600
595601
596 def executescript(txn, schema_path):
602 def executescript(txn: Cursor, schema_path: str) -> None:
597603 with open(schema_path, "r") as f:
598604 execute_statements_from_stream(txn, f)
599605
600606
601 def execute_statements_from_stream(cur: Cursor, f: TextIO):
607 def execute_statements_from_stream(cur: Cursor, f: TextIO) -> None:
602608 for statement in get_statements(f):
603609 cur.execute(statement)
604610
605611
606 def _get_or_create_schema_state(txn, database_engine):
612 def _get_or_create_schema_state(
613 txn: Cursor, database_engine: BaseDatabaseEngine
614 ) -> Optional[Tuple[int, List[str], bool]]:
607615 # Bluntly try creating the schema_version tables.
608616 schema_path = os.path.join(dir_path, "schema", "schema_version.sql")
609617 executescript(txn, schema_path)
611619 txn.execute("SELECT version, upgraded FROM schema_version")
612620 row = txn.fetchone()
613621 current_version = int(row[0]) if row else None
614 upgraded = bool(row[1]) if row else None
615622
616623 if current_version:
617624 txn.execute(
619626 (current_version,),
620627 )
621628 applied_deltas = [d for d, in txn]
629 upgraded = bool(row[1])
622630 return current_version, applied_deltas, upgraded
623631
624632 return None
633641 `file_name` attr is kept first.
634642 """
635643
636 file_name = attr.ib()
637 absolute_path = attr.ib()
644 file_name = attr.ib(type=str)
645 absolute_path = attr.ib(type=str)
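The upgrade path above now types its `Counter` (`CounterType[str]`), which it uses to detect duplicate delta file names across schema directories. A minimal sketch of that duplicate check, with hypothetical file names:

```python
from collections import Counter

# Hypothetical delta files gathered from several schema directories;
# the same name appearing twice would abort the upgrade.
file_names = ["01_init.sql", "02_add_index.sql", "01_init.sql"]

file_name_counter = Counter(file_names)
duplicates = [name for name, count in file_name_counter.items() if count > 1]
print(duplicates)
```

`Counter.items()` preserves first-insertion order, so the duplicates list is deterministic and suitable for an error message.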
1414
1515 import itertools
1616 import logging
17 from typing import Set
17 from typing import TYPE_CHECKING, Set
18
19 from synapse.storage.databases import Databases
20
21 if TYPE_CHECKING:
22 from synapse.app.homeserver import HomeServer
1823
1924 logger = logging.getLogger(__name__)
2025
2328 """High level interface for purging rooms and event history.
2429 """
2530
26 def __init__(self, hs, stores):
31 def __init__(self, hs: "HomeServer", stores: Databases):
2732 self.stores = stores
2833
29 async def purge_room(self, room_id: str):
34 async def purge_room(self, room_id: str) -> None:
3035 """Deletes all record of a room
3136 """
3237
1313 # limitations under the License.
1414
1515 import logging
16 from typing import Any, Dict, List, Optional, Tuple
1617
1718 import attr
1819
1920 from synapse.api.errors import SynapseError
21 from synapse.types import JsonDict
2022
2123 logger = logging.getLogger(__name__)
2224
2628 """Returned by relation pagination APIs.
2729
2830 Attributes:
29 chunk (list): The rows returned by pagination
30 next_batch (Any|None): Token to fetch next set of results with, if
31 chunk: The rows returned by pagination
32 next_batch: Token to fetch next set of results with, if
3133 None then there are no more results.
32 prev_batch (Any|None): Token to fetch previous set of results with, if
34 prev_batch: Token to fetch previous set of results with, if
3335 None then there are no previous results.
3436 """
3537
36 chunk = attr.ib()
37 next_batch = attr.ib(default=None)
38 prev_batch = attr.ib(default=None)
38 chunk = attr.ib(type=List[JsonDict])
39 next_batch = attr.ib(type=Optional[Any], default=None)
40 prev_batch = attr.ib(type=Optional[Any], default=None)
3941
40 def to_dict(self):
42 def to_dict(self) -> Dict[str, Any]:
4143 d = {"chunk": self.chunk}
4244
4345 if self.next_batch:
5860 boundaries of the chunk as pagination tokens.
5961
6062 Attributes:
61 topological (int): The topological ordering of the boundary event
62 stream (int): The stream ordering of the boundary event.
63 topological: The topological ordering of the boundary event
64 stream: The stream ordering of the boundary event.
6365 """
6466
65 topological = attr.ib()
66 stream = attr.ib()
67 topological = attr.ib(type=int)
68 stream = attr.ib(type=int)
6769
6870 @staticmethod
69 def from_string(string):
71 def from_string(string: str) -> "RelationPaginationToken":
7072 try:
7173 t, s = string.split("-")
7274 return RelationPaginationToken(int(t), int(s))
7375 except ValueError:
7476 raise SynapseError(400, "Invalid token")
7577
76 def to_string(self):
78 def to_string(self) -> str:
7779 return "%d-%d" % (self.topological, self.stream)
7880
79 def as_tuple(self):
81 def as_tuple(self) -> Tuple[Any, ...]:
8082 return attr.astuple(self)
8183
8284
8890 aggregation groups, we can just use them as our pagination token.
8991
9092 Attributes:
91 count (int): The count of relations in the boundar group.
92 stream (int): The MAX stream ordering in the boundary group.
93 count: The count of relations in the boundary group.
94 stream: The MAX stream ordering in the boundary group.
9395 """
9496
95 count = attr.ib()
96 stream = attr.ib()
97 count = attr.ib(type=int)
98 stream = attr.ib(type=int)
9799
98100 @staticmethod
99 def from_string(string):
101 def from_string(string: str) -> "AggregationPaginationToken":
100102 try:
101103 c, s = string.split("-")
102104 return AggregationPaginationToken(int(c), int(s))
103105 except ValueError:
104106 raise SynapseError(400, "Invalid token")
105107
106 def to_string(self):
108 def to_string(self) -> str:
107109 return "%d-%d" % (self.count, self.stream)
108110
109 def as_tuple(self):
111 def as_tuple(self) -> Tuple[Any, ...]:
110112 return attr.astuple(self)
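Both pagination token classes above serialize two integers into a `"%d-%d"` string and parse it back with `split("-")`. A minimal standalone sketch of that round-trip (class name hypothetical):

```python
class PaginationToken:
    # Sketch of the "<a>-<b>" token format shared by the relation and
    # aggregation pagination tokens in the diff. Note that negative
    # components would confuse split("-"); the real tokens are
    # non-negative orderings.
    def __init__(self, first, second):
        self.first = first
        self.second = second

    @staticmethod
    def from_string(string):
        try:
            a, b = string.split("-")
            return PaginationToken(int(a), int(b))
        except ValueError:
            raise ValueError("Invalid token")

    def to_string(self):
        return "%d-%d" % (self.first, self.second)

tok = PaginationToken.from_string("12-34")
print(tok.to_string())
```

A malformed token (wrong number of parts or non-numeric pieces) raises `ValueError`, which the real handlers translate into a 400 `SynapseError`.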
1111 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
1212 # See the License for the specific language governing permissions and
1313 # limitations under the License.
14
1514 import logging
16 from typing import Awaitable, Dict, Iterable, List, Optional, Set, Tuple, TypeVar
15 from typing import (
16 TYPE_CHECKING,
17 Awaitable,
18 Dict,
19 Iterable,
20 List,
21 Optional,
22 Set,
23 Tuple,
24 TypeVar,
25 )
1726
1827 import attr
1928
2029 from synapse.api.constants import EventTypes
2130 from synapse.events import EventBase
2231 from synapse.types import MutableStateMap, StateMap
32
33 if TYPE_CHECKING:
34 from synapse.app.homeserver import HomeServer
35 from synapse.storage.databases import Databases
2336
2437 logger = logging.getLogger(__name__)
2538
329342 """High level interface to fetching state for event.
330343 """
331344
332 def __init__(self, hs, stores):
345 def __init__(self, hs: "HomeServer", stores: "Databases"):
333346 self.stores = stores
334347
335 async def get_state_group_delta(self, state_group: int):
348 async def get_state_group_delta(
349 self, state_group: int
350 ) -> Tuple[Optional[int], Optional[StateMap[str]]]:
336351 """Given a state group try to return a previous group and a delta between
337352 the old and the new.
338353
340355 state_group: The state group used to retrieve state deltas.
341356
342357 Returns:
343 Tuple[Optional[int], Optional[StateMap[str]]]:
344 (prev_group, delta_ids)
358 A tuple of the previous group and a state map of the event IDs which
359 make up the delta between the old and new state groups.
345360 """
346361
347362 return await self.stores.state.get_state_group_delta(state_group)
435450
436451 async def get_state_for_events(
437452 self, event_ids: List[str], state_filter: StateFilter = StateFilter.all()
438 ):
453 ) -> Dict[str, StateMap[EventBase]]:
439454 """Given a list of event_ids and type tuples, return a list of state
440455 dicts for each event.
441456
471486
472487 async def get_state_ids_for_events(
473488 self, event_ids: List[str], state_filter: StateFilter = StateFilter.all()
474 ):
489 ) -> Dict[str, StateMap[str]]:
475490 """
476491 Get the state dicts corresponding to a list of events, containing the event_ids
477492 of the state events (as opposed to the events themselves)
499514
500515 async def get_state_for_event(
501516 self, event_id: str, state_filter: StateFilter = StateFilter.all()
502 ):
517 ) -> StateMap[EventBase]:
503518 """
504519 Get the state dict corresponding to a particular event
505520
515530
516531 async def get_state_ids_for_event(
517532 self, event_id: str, state_filter: StateFilter = StateFilter.all()
518 ):
533 ) -> StateMap[str]:
519534 """
520535 Get the state dict corresponding to a particular event
521536
152152
153153 return _AsyncCtxManagerWrapper(manager())
154154
155 def get_current_token(self):
155 def get_current_token(self) -> int:
156156 """Returns the maximum stream id such that all stream ids less than or
157157 equal to it have been successfully persisted.
158158
159159 Returns:
160 int
160 The maximum stream id.
161161 """
162162 with self._lock:
163163 if self._unfinished_ids:
348348 )
349349
350350
351 def map_username_to_mxid_localpart(username, case_sensitive=False):
351 def map_username_to_mxid_localpart(
352 username: Union[str, bytes], case_sensitive: bool = False
353 ) -> str:
352354 """Map a username onto a string suitable for a MXID
353355
354356 This follows the algorithm laid out at
355357 https://matrix.org/docs/spec/appendices.html#mapping-from-other-character-sets.
356358
357359 Args:
358 username (unicode|bytes): username to be mapped
359 case_sensitive (bool): true if TEST and test should be mapped
360 username: username to be mapped
361 case_sensitive: true if TEST and test should be mapped
360362 onto different mxids
361363
362364 Returns:
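The newly typed `map_username_to_mxid_localpart` follows the character-set mapping in the Matrix spec appendices. A hedged sketch of the core idea, lowercasing the input and hex-escaping disallowed bytes as `=xx` (the escaping matches the `=c3=b6` output asserted in the CAS handler test elsewhere in this diff; the full algorithm in Synapse also handles the case-sensitive path differently):

```python
def map_to_localpart(username):
    # Sketch: lowercase, then escape any byte outside the allowed
    # localpart characters as "=xx" lowercase hex. "=" itself is not
    # in the allowed set, so it is escaped too.
    data = username.lower().encode("utf-8")
    allowed = b"abcdefghijklmnopqrstuvwxyz0123456789._-"
    return "".join(
        chr(b) if b in allowed else "=%02x" % b for b in data
    )

print(map_to_localpart("föö"))
```

Each `ö` is two UTF-8 bytes, so `"föö"` maps to `f=c3=b6=c3=b6`, the localpart seen in `test_map_cas_user_to_invalid_localpart`.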
1414 # limitations under the License.
1515
1616 import collections
17 import inspect
1718 import logging
1819 from contextlib import contextmanager
1920 from typing import (
2021 Any,
22 Awaitable,
2123 Callable,
2224 Dict,
2325 Hashable,
541543 raise StopIteration(self.value)
542544
543545
544 def maybe_awaitable(value):
546 def maybe_awaitable(value: Union[Awaitable[R], R]) -> Awaitable[R]:
545547 """Convert a value to an awaitable if not already an awaitable.
546548 """
547
548 if hasattr(value, "__await__"):
549 if inspect.isawaitable(value):
550 assert isinstance(value, Awaitable)
549551 return value
550552
551553 return DoneAwaitable(value)
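The diff replaces the `hasattr(value, "__await__")` check with `inspect.isawaitable`, and wraps plain values in `DoneAwaitable` (whose `__next__` raises `StopIteration(self.value)`, as shown in the context above). A runnable sketch of the whole pattern:

```python
import asyncio
import inspect

class DoneAwaitable:
    # An awaitable that resolves immediately to a plain value,
    # mirroring the helper in the context above.
    def __init__(self, value):
        self.value = value

    def __await__(self):
        return self

    def __iter__(self):
        return self

    def __next__(self):
        raise StopIteration(self.value)

def maybe_awaitable(value):
    # As in the diff: inspect.isawaitable instead of a __await__ check.
    if inspect.isawaitable(value):
        return value
    return DoneAwaitable(value)

async def main():
    async def coro():
        return 1
    a = await maybe_awaitable(coro())  # genuine awaitable, passed through
    b = await maybe_awaitable(2)       # plain value, wrapped
    return a + b

print(asyncio.run(main()))
```

This lets callers such as `Distributor.fire` (also changed in this diff) `await` observer results uniformly, whether the observer is sync or async.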
1111 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
1212 # See the License for the specific language governing permissions and
1313 # limitations under the License.
14 import inspect
1514 import logging
1615
1716 from twisted.internet import defer
1817
1918 from synapse.logging.context import make_deferred_yieldable, run_in_background
2019 from synapse.metrics.background_process_metrics import run_as_background_process
20 from synapse.util.async_helpers import maybe_awaitable
2121
2222 logger = logging.getLogger(__name__)
2323
104104
105105 async def do(observer):
106106 try:
107 result = observer(*args, **kwargs)
108 if inspect.isawaitable(result):
109 result = await result
110 return result
107 return await maybe_awaitable(observer(*args, **kwargs))
111108 except Exception as e:
112109 logger.warning(
113110 "%s signal observer %s failed: %r", self.name, observer, e,
1414
1515 import importlib
1616 import importlib.util
17 import itertools
18 from typing import Any, Iterable, Tuple, Type
19
20 import jsonschema
1721
1822 from synapse.config._base import ConfigError
23 from synapse.config._util import json_error_to_config_error
1924
2025
21 def load_module(provider):
26 def load_module(provider: dict, config_path: Iterable[str]) -> Tuple[Type, Any]:
2227 """ Loads a synapse module with its config
23 Take a dict with keys 'module' (the module name) and 'config'
24 (the config dict).
28
29 Args:
30 provider: a dict with keys 'module' (the module name) and 'config'
31 (the config dict).
32 config_path: the path within the config file. This will be used as a basis
33 for any error message.
2534
2635 Returns
2736 Tuple of (provider class, parsed config object)
2837 """
38
39 modulename = provider.get("module")
40 if not isinstance(modulename, str):
41 raise ConfigError(
42 "expected a string", path=itertools.chain(config_path, ("module",))
43 )
44
2945 # We need to import the module, and then pick the class out of
3046 # that, so we split based on the last dot.
31 module, clz = provider["module"].rsplit(".", 1)
47 module, clz = modulename.rsplit(".", 1)
3248 module = importlib.import_module(module)
3349 provider_class = getattr(module, clz)
3450
51 module_config = provider.get("config")
3552 try:
36 provider_config = provider_class.parse_config(provider.get("config"))
53 provider_config = provider_class.parse_config(module_config)
54 except jsonschema.ValidationError as e:
55 raise json_error_to_config_error(e, itertools.chain(config_path, ("config",)))
56 except ConfigError as e:
57 raise _wrap_config_error(
58 "Failed to parse config for module %r" % (modulename,),
59 prefix=itertools.chain(config_path, ("config",)),
60 e=e,
61 )
3762 except Exception as e:
38 raise ConfigError("Failed to parse config for %r: %s" % (provider["module"], e))
63 raise ConfigError(
64 "Failed to parse config for module %r" % (modulename,),
65 path=itertools.chain(config_path, ("config",)),
66 ) from e
3967
4068 return provider_class, provider_config
4169
5583 mod = importlib.util.module_from_spec(spec)
5684 spec.loader.exec_module(mod) # type: ignore
5785 return mod
86
87
88 def _wrap_config_error(
89 msg: str, prefix: Iterable[str], e: ConfigError
90 ) -> "ConfigError":
91 """Wrap a relative ConfigError with a new path
92
93 This is useful when we have a ConfigError with a relative path due to a problem
94 parsing part of the config, and we now need to set it in context.
95 """
96 path = prefix
97 if e.path:
98 path = itertools.chain(prefix, e.path)
99
100 e1 = ConfigError(msg, path)
101
102 # ideally we would set the 'cause' of the new exception to the original exception;
103 # however now that we have merged the path into our own, the stringification of
104 # e will be incorrect, so instead we create a new exception with just the "msg"
105 # part.
106
107 e1.__cause__ = Exception(e.msg)
108 e1.__cause__.__cause__ = e.__cause__
109 return e1
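`_wrap_config_error` above re-roots a nested `ConfigError`'s relative path under the outer config path, so error messages point at the exact offending key. A standalone sketch with a minimal stand-in for Synapse's `ConfigError`:

```python
import itertools

class ConfigError(Exception):
    # Minimal stand-in: a message plus the path of config keys
    # leading to the bad value.
    def __init__(self, msg, path=None):
        super().__init__(msg)
        self.msg = msg
        self.path = list(path) if path is not None else None

def wrap_config_error(msg, prefix, e):
    # Merge the outer prefix with the inner error's relative path,
    # as the diff's _wrap_config_error does.
    path = prefix
    if e.path:
        path = itertools.chain(prefix, e.path)
    return ConfigError(msg, path)

inner = ConfigError("expected a string", path=["module"])
outer = wrap_config_error(
    "Failed to parse config for module 'x'", ["password_providers", "0"], inner
)
print(outer.path)
```

Using `itertools.chain` keeps the prefix lazy until the error is materialized, which is why `load_module` can pass `config_path` through several layers without copying.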
1111 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
1212 # See the License for the specific language governing permissions and
1313 # limitations under the License.
14
1514 import logging
1615 import operator
1716
18 from synapse.api.constants import AccountDataTypes, EventTypes, Membership
17 from synapse.api.constants import (
18 AccountDataTypes,
19 EventTypes,
20 HistoryVisibility,
21 Membership,
22 )
1923 from synapse.events.utils import prune_event
2024 from synapse.storage import Storage
2125 from synapse.storage.state import StateFilter
2428 logger = logging.getLogger(__name__)
2529
2630
27 VISIBILITY_PRIORITY = ("world_readable", "shared", "invited", "joined")
31 VISIBILITY_PRIORITY = (
32 HistoryVisibility.WORLD_READABLE,
33 HistoryVisibility.SHARED,
34 HistoryVisibility.INVITED,
35 HistoryVisibility.JOINED,
36 )
2837
2938
3039 MEMBERSHIP_PRIORITY = (
115124 # see events in the room at that point in the DAG, and that shouldn't be decided
116125 # on those checks.
117126 if filter_send_to_client:
118 if event.type == "org.matrix.dummy_event":
127 if event.type == EventTypes.Dummy:
119128 return None
120129
121130 if not event.is_state() and event.sender in ignore_list:
149158 # get the room_visibility at the time of the event.
150159 visibility_event = state.get((EventTypes.RoomHistoryVisibility, ""), None)
151160 if visibility_event:
152 visibility = visibility_event.content.get("history_visibility", "shared")
161 visibility = visibility_event.content.get(
162 "history_visibility", HistoryVisibility.SHARED
163 )
153164 else:
154 visibility = "shared"
165 visibility = HistoryVisibility.SHARED
155166
156167 if visibility not in VISIBILITY_PRIORITY:
157 visibility = "shared"
168 visibility = HistoryVisibility.SHARED
158169
159170 # Always allow history visibility events on boundaries. This is done
160171 # by setting the effective visibility to the least restrictive
164175 prev_visibility = prev_content.get("history_visibility", None)
165176
166177 if prev_visibility not in VISIBILITY_PRIORITY:
167 prev_visibility = "shared"
178 prev_visibility = HistoryVisibility.SHARED
168179
169180 new_priority = VISIBILITY_PRIORITY.index(visibility)
170181 old_priority = VISIBILITY_PRIORITY.index(prev_visibility)
209220
210221 # otherwise, it depends on the room visibility.
211222
212 if visibility == "joined":
223 if visibility == HistoryVisibility.JOINED:
213224 # we weren't a member at the time of the event, so we can't
214225 # see this event.
215226 return None
216227
217 elif visibility == "invited":
228 elif visibility == HistoryVisibility.INVITED:
218229 # user can also see the event if they were *invited* at the time
219230 # of the event.
220231 return event if membership == Membership.INVITE else None
221232
222 elif visibility == "shared" and is_peeking:
233 elif visibility == HistoryVisibility.SHARED and is_peeking:
223234 # if the visibility is shared, users cannot see the event unless
224235 # they have *subequently* joined the room (or were members at the
225236 # time, of course)
283294 def check_event_is_visible(event, state):
284295 history = state.get((EventTypes.RoomHistoryVisibility, ""), None)
285296 if history:
286 visibility = history.content.get("history_visibility", "shared")
287 if visibility in ["invited", "joined"]:
297 visibility = history.content.get(
298 "history_visibility", HistoryVisibility.SHARED
299 )
300 if visibility in [HistoryVisibility.INVITED, HistoryVisibility.JOINED]:
288301 # We now loop through all state events looking for
289302 # membership states for the requesting server to determine
290303 # if the server is either in the room or has been invited
304317 if memtype == Membership.JOIN:
305318 return True
306319 elif memtype == Membership.INVITE:
307 if visibility == "invited":
320 if visibility == HistoryVisibility.INVITED:
308321 return True
309322 else:
310323 # server has no users in the room: redact
335348 else:
336349 event_map = await storage.main.get_events(visibility_ids)
337350 all_open = all(
338 e.content.get("history_visibility") in (None, "shared", "world_readable")
351 e.content.get("history_visibility")
352 in (None, HistoryVisibility.SHARED, HistoryVisibility.WORLD_READABLE)
339353 for e in event_map.values()
340354 )
341355
1515 # See the License for the specific language governing permissions and
1616 # limitations under the License.
1717
18 from mock import Mock
19
2018 import jsonschema
2119
2220 from twisted.internet import defer
2725 from synapse.events import make_event_from_dict
2826
2927 from tests import unittest
30 from tests.utils import DeferredMockCallable, MockHttpResource, setup_test_homeserver
28 from tests.utils import setup_test_homeserver
3129
3230 user_localpart = "test_user"
3331
4139
4240
4341 class FilteringTestCase(unittest.TestCase):
44 @defer.inlineCallbacks
4542 def setUp(self):
46 self.mock_federation_resource = MockHttpResource()
47
48 self.mock_http_client = Mock(spec=[])
49 self.mock_http_client.put_json = DeferredMockCallable()
50
51 hs = yield setup_test_homeserver(
52 self.addCleanup, http_client=self.mock_http_client, keyring=Mock(),
53 )
54
43 hs = setup_test_homeserver(self.addCleanup)
5544 self.filtering = hs.get_filtering()
56
5745 self.datastore = hs.get_datastore()
5846
5947 def test_errors_on_invalid_filters(self):
2222 def make_homeserver(self, reactor, clock):
2323
2424 hs = self.setup_test_homeserver(
25 http_client=None, homeserver_to_use=GenericWorkerServer
25 federation_http_client=None, homeserver_to_use=GenericWorkerServer
2626 )
2727
2828 return hs
5656 self.assertEqual(len(self.reactor.tcpServers), 1)
5757 site = self.reactor.tcpServers[0][1]
5858
59 _, channel = make_request(self.reactor, site, "PUT", "presence/a/status")
59 channel = make_request(self.reactor, site, "PUT", "presence/a/status")
6060
6161 # 400 + unrecognised, because nothing is registered
6262 self.assertEqual(channel.code, 400)
7676 self.assertEqual(len(self.reactor.tcpServers), 1)
7777 site = self.reactor.tcpServers[0][1]
7878
79 _, channel = make_request(self.reactor, site, "PUT", "presence/a/status")
79 channel = make_request(self.reactor, site, "PUT", "presence/a/status")
8080
8181 # 401, because the stub servlet still checks authentication
8282 self.assertEqual(channel.code, 401)
2626 class FederationReaderOpenIDListenerTests(HomeserverTestCase):
2727 def make_homeserver(self, reactor, clock):
2828 hs = self.setup_test_homeserver(
29 http_client=None, homeserver_to_use=GenericWorkerServer
29 federation_http_client=None, homeserver_to_use=GenericWorkerServer
3030 )
3131 return hs
3232
7272 return
7373 raise
7474
75 _, channel = make_request(
75 channel = make_request(
7676 self.reactor, site, "GET", "/_matrix/federation/v1/openid/userinfo"
7777 )
7878
8383 class SynapseHomeserverOpenIDListenerTests(HomeserverTestCase):
8484 def make_homeserver(self, reactor, clock):
8585 hs = self.setup_test_homeserver(
86 http_client=None, homeserver_to_use=SynapseHomeServer
86 federation_http_client=None, homeserver_to_use=SynapseHomeServer
8787 )
8888 return hs
8989
120120 return
121121 raise
122122
123 _, channel = make_request(
123 channel = make_request(
124124 self.reactor, site, "GET", "/_matrix/federation/v1/openid/userinfo"
125125 )
126126
7474 return val
7575
7676 def test_verify_json_objects_for_server_awaits_previous_requests(self):
77 mock_fetcher = keyring.KeyFetcher()
77 mock_fetcher = Mock()
7878 mock_fetcher.get_keys = Mock()
7979 kr = keyring.Keyring(self.hs, key_fetchers=(mock_fetcher,))
8080
194194 """Tests that we correctly handle key requests for keys we've stored
195195 with a null `ts_valid_until_ms`
196196 """
197 mock_fetcher = keyring.KeyFetcher()
197 mock_fetcher = Mock()
198198 mock_fetcher.get_keys = Mock(return_value=make_awaitable({}))
199199
200200 kr = keyring.Keyring(
248248 }
249249 }
250250
251 mock_fetcher = keyring.KeyFetcher()
251 mock_fetcher = Mock()
252252 mock_fetcher.get_keys = Mock(side_effect=get_keys)
253253 kr = keyring.Keyring(self.hs, key_fetchers=(mock_fetcher,))
254254
287287 }
288288 }
289289
290 mock_fetcher1 = keyring.KeyFetcher()
290 mock_fetcher1 = Mock()
291291 mock_fetcher1.get_keys = Mock(side_effect=get_keys1)
292 mock_fetcher2 = keyring.KeyFetcher()
292 mock_fetcher2 = Mock()
293293 mock_fetcher2.get_keys = Mock(side_effect=get_keys2)
294294 kr = keyring.Keyring(self.hs, key_fetchers=(mock_fetcher1, mock_fetcher2))
295295
314314 class ServerKeyFetcherTestCase(unittest.HomeserverTestCase):
315315 def make_homeserver(self, reactor, clock):
316316 self.http_client = Mock()
317 hs = self.setup_test_homeserver(http_client=self.http_client)
317 hs = self.setup_test_homeserver(federation_http_client=self.http_client)
318318 return hs
319319
320320 def test_get_keys_from_server(self):
394394 }
395395 ]
396396
397 return self.setup_test_homeserver(http_client=self.http_client, config=config)
397 return self.setup_test_homeserver(
398 federation_http_client=self.http_client, config=config
399 )
398400
399401 def build_perspectives_response(
400402 self, server_name: str, signing_key: SigningKey, valid_until_ts: int,
4747 )
4848
4949 # Get the room complexity
50 request, channel = self.make_request(
50 channel = self.make_request(
5151 "GET", "/_matrix/federation/unstable/rooms/%s/complexity" % (room_1,)
5252 )
5353 self.assertEquals(200, channel.code)
5959 store.get_current_state_event_counts = lambda x: make_awaitable(500 * 1.23)
6060
6161 # Get the room complexity again -- make sure it's our artificial value
62 request, channel = self.make_request(
62 channel = self.make_request(
6363 "GET", "/_matrix/federation/unstable/rooms/%s/complexity" % (room_1,)
6464 )
6565 self.assertEquals(200, channel.code)
4545
4646 "/get_missing_events/(?P<room_id>[^/]*)/?"
4747
48 request, channel = self.make_request(
48 channel = self.make_request(
4949 "POST",
5050 "/_matrix/federation/v1/get_missing_events/%s" % (room_1,),
5151 query_content,
9494 room_1 = self.helper.create_room_as(u1, tok=u1_token)
9595 self.inject_room_member(room_1, "@user:other.example.com", "join")
9696
97 request, channel = self.make_request(
97 channel = self.make_request(
9898 "GET", "/_matrix/federation/v1/state/%s" % (room_1,)
9999 )
100100 self.assertEquals(200, channel.code, channel.result)
126126
127127 room_1 = self.helper.create_room_as(u1, tok=u1_token)
128128
129 request, channel = self.make_request(
129 channel = self.make_request(
130130 "GET", "/_matrix/federation/v1/state/%s" % (room_1,)
131131 )
132132 self.assertEquals(403, channel.code, channel.result)
00 # -*- coding: utf-8 -*-
1 # Copyright 2019 The Matrix.org Foundation C.I.C.
1 # Copyright 2020 The Matrix.org Foundation C.I.C.
22 #
33 # Licensed under the Apache License, Version 2.0 (the "License");
44 # you may not use this file except in compliance with the License.
1212 # See the License for the specific language governing permissions and
1313 # limitations under the License.
1414
15
16 from twisted.internet import defer
17
18 from synapse.config.ratelimiting import FederationRateLimitConfig
19 from synapse.federation.transport import server
20 from synapse.util.ratelimitutils import FederationRateLimiter
21
2215 from tests import unittest
2316 from tests.unittest import override_config
2417
2518
26 class RoomDirectoryFederationTests(unittest.HomeserverTestCase):
27 def prepare(self, reactor, clock, homeserver):
28 class Authenticator:
29 def authenticate_request(self, request, content):
30 return defer.succeed("otherserver.nottld")
31
32 ratelimiter = FederationRateLimiter(clock, FederationRateLimitConfig())
33 server.register_servlets(
34 homeserver, self.resource, Authenticator(), ratelimiter
35 )
36
19 class RoomDirectoryFederationTests(unittest.FederatingHomeserverTestCase):
3720 @override_config({"allow_public_rooms_over_federation": False})
3821 def test_blocked_public_room_list_over_federation(self):
39 request, channel = self.make_request(
40 "GET", "/_matrix/federation/v1/publicRooms"
22 """Test that unauthenticated requests to the public rooms directory 403 when
23 allow_public_rooms_over_federation is False.
24 """
25 channel = self.make_request(
26 "GET",
27 "/_matrix/federation/v1/publicRooms",
28 federation_auth_origin=b"example.com",
4129 )
4230 self.assertEquals(403, channel.code)
4331
4432 @override_config({"allow_public_rooms_over_federation": True})
4533 def test_open_public_room_list_over_federation(self):
46 request, channel = self.make_request(
47 "GET", "/_matrix/federation/v1/publicRooms"
34 """Test that unauthenticated requests to the public rooms directory 200 when
35 allow_public_rooms_over_federation is True.
36 """
37 channel = self.make_request(
38 "GET",
39 "/_matrix/federation/v1/publicRooms",
40 federation_auth_origin=b"example.com",
4841 )
4942 self.assertEquals(200, channel.code)
0 # Copyright 2020 The Matrix.org Foundation C.I.C.
1 #
2 # Licensed under the Apache License, Version 2.0 (the "License");
3 # you may not use this file except in compliance with the License.
4 # You may obtain a copy of the License at
5 #
6 # http://www.apache.org/licenses/LICENSE-2.0
7 #
8 # Unless required by applicable law or agreed to in writing, software
9 # distributed under the License is distributed on an "AS IS" BASIS,
10 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
11 # See the License for the specific language governing permissions and
12 # limitations under the License.
13 from mock import Mock
14
15 from synapse.handlers.cas_handler import CasResponse
16
17 from tests.test_utils import simple_async_mock
18 from tests.unittest import HomeserverTestCase
19
20 # These are a few constants that are used as config parameters in the tests.
21 BASE_URL = "https://synapse/"
22 SERVER_URL = "https://issuer/"
23
24
25 class CasHandlerTestCase(HomeserverTestCase):
26 def default_config(self):
27 config = super().default_config()
28 config["public_baseurl"] = BASE_URL
29 cas_config = {
30 "enabled": True,
31 "server_url": SERVER_URL,
32 "service_url": BASE_URL,
33 }
34 config["cas_config"] = cas_config
35
36 return config
37
38 def make_homeserver(self, reactor, clock):
39 hs = self.setup_test_homeserver()
40
41 self.handler = hs.get_cas_handler()
42
43 # Reduce the number of attempts when generating MXIDs.
44 sso_handler = hs.get_sso_handler()
45 sso_handler._MAP_USERNAME_RETRIES = 3
46
47 return hs
48
49 def test_map_cas_user_to_user(self):
50 """Ensure that mapping the CAS user returned from a provider to an MXID works properly."""
51
52 # stub out the auth handler
53 auth_handler = self.hs.get_auth_handler()
54 auth_handler.complete_sso_login = simple_async_mock()
55
56 cas_response = CasResponse("test_user", {})
57 request = _mock_request()
58 self.get_success(
59 self.handler._handle_cas_response(request, cas_response, "redirect_uri", "")
60 )
61
62 # check that the auth handler got called as expected
63 auth_handler.complete_sso_login.assert_called_once_with(
64 "@test_user:test", request, "redirect_uri", None
65 )
66
67 def test_map_cas_user_to_existing_user(self):
6868 """Existing users can log in with a CAS account."""
69 store = self.hs.get_datastore()
70 self.get_success(
71 store.register_user(user_id="@test_user:test", password_hash=None)
72 )
73
74 # stub out the auth handler
75 auth_handler = self.hs.get_auth_handler()
76 auth_handler.complete_sso_login = simple_async_mock()
77
78 # Map a user via SSO.
79 cas_response = CasResponse("test_user", {})
80 request = _mock_request()
81 self.get_success(
82 self.handler._handle_cas_response(request, cas_response, "redirect_uri", "")
83 )
84
85 # check that the auth handler got called as expected
86 auth_handler.complete_sso_login.assert_called_once_with(
87 "@test_user:test", request, "redirect_uri", None
88 )
89
90 # Subsequent calls should map to the same mxid.
91 auth_handler.complete_sso_login.reset_mock()
92 self.get_success(
93 self.handler._handle_cas_response(request, cas_response, "redirect_uri", "")
94 )
95 auth_handler.complete_sso_login.assert_called_once_with(
96 "@test_user:test", request, "redirect_uri", None
97 )
98
99 def test_map_cas_user_to_invalid_localpart(self):
100 """CAS automaps invalid characters to an =xx hex encoding."""
101
102 # stub out the auth handler
103 auth_handler = self.hs.get_auth_handler()
104 auth_handler.complete_sso_login = simple_async_mock()
105
106 cas_response = CasResponse("föö", {})
107 request = _mock_request()
108 self.get_success(
109 self.handler._handle_cas_response(request, cas_response, "redirect_uri", "")
110 )
111
112 # check that the auth handler got called as expected
113 auth_handler.complete_sso_login.assert_called_once_with(
114 "@f=c3=b6=c3=b6:test", request, "redirect_uri", None
115 )
116
117
118 def _mock_request():
119 """Returns a mock which will stand in as a SynapseRequest"""
120 return Mock(spec=["getClientIP", "get_user_agent"])
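The `simple_async_mock` helper imported from `tests.test_utils` above was, until this release, defined locally in `test_oidc.py` (its old definition appears as removed lines further down this diff). A minimal sketch of its behaviour, based on that removed definition:

```python
import asyncio
from unittest.mock import Mock


def simple_async_mock(return_value=None, raises=None):
    # Mimics the part of AsyncMock these tests need: a Mock whose calls
    # are awaitable and either return a value or raise an exception.
    # (AsyncMock itself only arrived in Python 3.8's unittest.mock.)
    async def cb(*args, **kwargs):
        if raises:
            raise raises
        return return_value

    return Mock(side_effect=cb)


# The stub can be awaited like a coroutine and asserted on like a Mock:
stub = simple_async_mock(return_value="@test_user:test")
result = asyncio.run(stub("@test_user:test", "request"))
```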
2626
2727 class DeviceTestCase(unittest.HomeserverTestCase):
2828 def make_homeserver(self, reactor, clock):
29 hs = self.setup_test_homeserver("server", http_client=None)
29 hs = self.setup_test_homeserver("server", federation_http_client=None)
3030 self.handler = hs.get_device_handler()
3131 self.store = hs.get_datastore()
3232 return hs
228228
229229 class DehydrationTestCase(unittest.HomeserverTestCase):
230230 def make_homeserver(self, reactor, clock):
231 hs = self.setup_test_homeserver("server", http_client=None)
231 hs = self.setup_test_homeserver("server", federation_http_client=None)
232232 self.handler = hs.get_device_handler()
233233 self.registration = hs.get_registration_handler()
234234 self.auth = hs.get_auth()
4141 self.mock_registry.register_query_handler = register_query_handler
4242
4343 hs = self.setup_test_homeserver(
44 http_client=None,
45 resource_for_federation=Mock(),
4644 federation_client=self.mock_federation,
4745 federation_registry=self.mock_registry,
4846 )
406404 def test_denied(self):
407405 room_id = self.helper.create_room_as(self.user_id)
408406
409 request, channel = self.make_request(
407 channel = self.make_request(
410408 "PUT",
411409 b"directory/room/%23test%3Atest",
412410 ('{"room_id":"%s"}' % (room_id,)).encode("ascii"),
416414 def test_allowed(self):
417415 room_id = self.helper.create_room_as(self.user_id)
418416
419 request, channel = self.make_request(
417 channel = self.make_request(
420418 "PUT",
421419 b"directory/room/%23unofficial_test%3Atest",
422420 ('{"room_id":"%s"}' % (room_id,)).encode("ascii"),
432430 def prepare(self, reactor, clock, hs):
433431 room_id = self.helper.create_room_as(self.user_id)
434432
435 request, channel = self.make_request(
433 channel = self.make_request(
436434 "PUT", b"directory/list/room/%s" % (room_id.encode("ascii"),), b"{}"
437435 )
438436 self.assertEquals(200, channel.code, channel.result)
447445 self.directory_handler.enable_room_list_search = True
448446
449447 # Room list is enabled so we should get some results
450 request, channel = self.make_request("GET", b"publicRooms")
448 channel = self.make_request("GET", b"publicRooms")
451449 self.assertEquals(200, channel.code, channel.result)
452450 self.assertTrue(len(channel.json_body["chunk"]) > 0)
453451
455453 self.directory_handler.enable_room_list_search = False
456454
457455 # Room list disabled so we should get no results
458 request, channel = self.make_request("GET", b"publicRooms")
456 channel = self.make_request("GET", b"publicRooms")
459457 self.assertEquals(200, channel.code, channel.result)
460458 self.assertTrue(len(channel.json_body["chunk"]) == 0)
461459
462460 # Room list disabled so we shouldn't be allowed to publish rooms
463461 room_id = self.helper.create_room_as(self.user_id)
464 request, channel = self.make_request(
462 channel = self.make_request(
465463 "PUT", b"directory/list/room/%s" % (room_id.encode("ascii"),), b"{}"
466464 )
467465 self.assertEquals(403, channel.code, channel.result)
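The mechanical `request, channel = ...` → `channel = ...` edits above all stem from one test-infrastructure change in this release: `HomeserverTestCase.make_request` now returns just the `FakeChannel`, since call sites almost always discarded the request object. A toy sketch of the calling-convention change (the names here are illustrative stand-ins, not Synapse's real classes):

```python
from collections import namedtuple

FakeChannel = namedtuple("FakeChannel", ["code", "json_body"])


def make_request_old(method, path):
    # Pre-1.25 convention: return a (request, channel) tuple, forcing
    # call sites to unpack and usually throw away the request.
    request = {"method": method, "path": path}
    return request, FakeChannel(code=200, json_body={"chunk": []})


def make_request_new(method, path):
    # 1.25 convention: return only the channel.
    _request, channel = make_request_old(method, path)
    return channel


# Old call sites:  request, channel = self.make_request("GET", b"publicRooms")
# New call sites:  channel = self.make_request("GET", b"publicRooms")
channel = make_request_new("GET", b"publicRooms")
```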
3636 ]
3737
3838 def make_homeserver(self, reactor, clock):
39 hs = self.setup_test_homeserver(http_client=None)
39 hs = self.setup_test_homeserver(federation_http_client=None)
4040 self.handler = hs.get_federation_handler()
4141 self.store = hs.get_datastore()
4242 return hs
125125 room_version,
126126 )
127127
128 with LoggingContext(request="send_rejected"):
128 with LoggingContext("send_rejected"):
129129 d = run_in_background(self.handler.on_receive_pdu, OTHER_SERVER, ev)
130130 self.get_success(d)
131131
177177 room_version,
178178 )
179179
180 with LoggingContext(request="send_rejected"):
180 with LoggingContext("send_rejected"):
181181 d = run_in_background(self.handler.on_receive_pdu, OTHER_SERVER, ev)
182182 self.get_success(d)
183183
197197 # the auth code requires that a signature exists, but doesn't check that
198198 # signature... go figure.
199199 join_event.signatures[other_server] = {"x": "y"}
200 with LoggingContext(request="send_join"):
200 with LoggingContext("send_join"):
201201 d = run_in_background(
202202 self.handler.on_send_join_request, other_server, join_event
203203 )
205205
206206 # Redaction of event should fail.
207207 path = "/_matrix/client/r0/rooms/%s/redact/%s" % (self.room_id, event_id)
208 request, channel = self.make_request(
208 channel = self.make_request(
209209 "POST", path, content={}, access_token=self.access_token
210210 )
211211 self.assertEqual(int(channel.result["code"]), 403)
1212 # See the License for the specific language governing permissions and
1313 # limitations under the License.
1414 import json
15 from urllib.parse import parse_qs, urlparse
16
17 from mock import Mock, patch
18
19 import attr
15 import re
16 from typing import Dict
17 from urllib.parse import parse_qs, urlencode, urlparse
18
19 from mock import ANY, Mock, patch
20
2021 import pymacaroons
2122
22 from twisted.python.failure import Failure
23 from twisted.web._newclient import ResponseDone
24
25 from synapse.handlers.oidc_handler import OidcError, OidcMappingProvider
23 from twisted.web.resource import Resource
24
25 from synapse.api.errors import RedirectException
26 from synapse.handlers.oidc_handler import OidcError
2627 from synapse.handlers.sso import MappingException
28 from synapse.rest.client.v1 import login
29 from synapse.rest.synapse.client.pick_username import pick_username_resource
30 from synapse.server import HomeServer
2731 from synapse.types import UserID
2832
33 from tests.test_utils import FakeResponse, simple_async_mock
2934 from tests.unittest import HomeserverTestCase, override_config
30
31
32 @attr.s
33 class FakeResponse:
34 code = attr.ib()
35 body = attr.ib()
36 phrase = attr.ib()
37
38 def deliverBody(self, protocol):
39 protocol.dataReceived(self.body)
40 protocol.connectionLost(Failure(ResponseDone()))
41
4235
4336 # These are a few constants that are used as config parameters in the tests.
4437 ISSUER = "https://issuer/"
6962 COOKIE_PATH = "/_synapse/oidc"
7063
7164
72 class TestMappingProvider(OidcMappingProvider):
65 class TestMappingProvider:
7366 @staticmethod
7467 def parse_config(config):
7568 return
69
70 def __init__(self, config):
71 pass
7672
7773 def get_remote_user_id(self, userinfo):
7874 return userinfo["sub"]
9490 "localpart": userinfo["username"] + (str(failures) if failures else ""),
9591 "display_name": None,
9692 }
97
98
99 def simple_async_mock(return_value=None, raises=None):
100 # AsyncMock is not available in python3.5, this mimics part of its behaviour
101 async def cb(*args, **kwargs):
102 if raises:
103 raise raises
104 return return_value
105
106 return Mock(side_effect=cb)
10793
10894
10995 async def get_json(url):
174160 self.assertEqual(args[2], error_description)
175161 # Reset the render_error mock
176162 self.render_error.reset_mock()
163 return args
177164
178165 def test_config(self):
179166 """Basic config correctly sets up the callback URL and client auth correctly."""
383370 - when the userinfo fetching fails
384371 - when the code exchange fails
385372 """
373
374 # ensure that we are correctly testing the fallback when "get_extra_attributes"
375 # is not implemented.
376 mapping_provider = self.handler._user_mapping_provider
377 with self.assertRaises(AttributeError):
378 _ = mapping_provider.get_extra_attributes
379
386380 token = {
387381 "type": "bearer",
388382 "id_token": "id_token",
389383 "access_token": "access_token",
390384 }
385 username = "bar"
391386 userinfo = {
392387 "sub": "foo",
393 "preferred_username": "bar",
394 }
395 user_id = "@foo:domain.org"
388 "username": username,
389 }
390 expected_user_id = "@%s:%s" % (username, self.hs.hostname)
396391 self.handler._exchange_code = simple_async_mock(return_value=token)
397392 self.handler._parse_id_token = simple_async_mock(return_value=userinfo)
398393 self.handler._fetch_userinfo = simple_async_mock(return_value=userinfo)
399 self.handler._map_userinfo_to_user = simple_async_mock(return_value=user_id)
400 self.handler._auth_handler.complete_sso_login = simple_async_mock()
401 request = Mock(
402 spec=[
403 "args",
404 "getCookie",
405 "addCookie",
406 "requestHeaders",
407 "getClientIP",
408 "get_user_agent",
409 ]
410 )
394 auth_handler = self.hs.get_auth_handler()
395 auth_handler.complete_sso_login = simple_async_mock()
411396
412397 code = "code"
413398 state = "state"
415400 client_redirect_url = "http://client/redirect"
416401 user_agent = "Browser"
417402 ip_address = "10.0.0.1"
418 request.getCookie.return_value = self.handler._generate_oidc_session_token(
403 session = self.handler._generate_oidc_session_token(
419404 state=state,
420405 nonce=nonce,
421406 client_redirect_url=client_redirect_url,
422407 ui_auth_session_id=None,
423408 )
424
425 request.args = {}
426 request.args[b"code"] = [code.encode("utf-8")]
427 request.args[b"state"] = [state.encode("utf-8")]
428
429 request.getClientIP.return_value = ip_address
430 request.get_user_agent.return_value = user_agent
431
432 self.get_success(self.handler.handle_oidc_callback(request))
433
434 self.handler._auth_handler.complete_sso_login.assert_called_once_with(
435 user_id, request, client_redirect_url, {},
409 request = _build_callback_request(
410 code, state, session, user_agent=user_agent, ip_address=ip_address
411 )
412
413 self.get_success(self.handler.handle_oidc_callback(request))
414
415 auth_handler.complete_sso_login.assert_called_once_with(
416 expected_user_id, request, client_redirect_url, None,
436417 )
437418 self.handler._exchange_code.assert_called_once_with(code)
438419 self.handler._parse_id_token.assert_called_once_with(token, nonce=nonce)
439 self.handler._map_userinfo_to_user.assert_called_once_with(
440 userinfo, token, user_agent, ip_address
441 )
442420 self.handler._fetch_userinfo.assert_not_called()
443421 self.render_error.assert_not_called()
444422
445423 # Handle mapping errors
446 self.handler._map_userinfo_to_user = simple_async_mock(
447 raises=MappingException()
448 )
449 self.get_success(self.handler.handle_oidc_callback(request))
450 self.assertRenderedError("mapping_error")
451 self.handler._map_userinfo_to_user = simple_async_mock(return_value=user_id)
424 with patch.object(
425 self.handler,
426 "_remote_id_from_userinfo",
427 new=Mock(side_effect=MappingException()),
428 ):
429 self.get_success(self.handler.handle_oidc_callback(request))
430 self.assertRenderedError("mapping_error")
452431
453432 # Handle ID token errors
454433 self.handler._parse_id_token = simple_async_mock(raises=Exception())
455434 self.get_success(self.handler.handle_oidc_callback(request))
456435 self.assertRenderedError("invalid_token")
457436
458 self.handler._auth_handler.complete_sso_login.reset_mock()
437 auth_handler.complete_sso_login.reset_mock()
459438 self.handler._exchange_code.reset_mock()
460439 self.handler._parse_id_token.reset_mock()
461 self.handler._map_userinfo_to_user.reset_mock()
462440 self.handler._fetch_userinfo.reset_mock()
463441
464442 # With userinfo fetching
465443 self.handler._scopes = [] # do not ask the "openid" scope
466444 self.get_success(self.handler.handle_oidc_callback(request))
467445
468 self.handler._auth_handler.complete_sso_login.assert_called_once_with(
469 user_id, request, client_redirect_url, {},
446 auth_handler.complete_sso_login.assert_called_once_with(
447 expected_user_id, request, client_redirect_url, None,
470448 )
471449 self.handler._exchange_code.assert_called_once_with(code)
472450 self.handler._parse_id_token.assert_not_called()
473 self.handler._map_userinfo_to_user.assert_called_once_with(
474 userinfo, token, user_agent, ip_address
475 )
476451 self.handler._fetch_userinfo.assert_called_once_with(token)
477452 self.render_error.assert_not_called()
478453
623598 }
624599 userinfo = {
625600 "sub": "foo",
601 "username": "foo",
626602 "phone": "1234567",
627603 }
628 user_id = "@foo:domain.org"
629604 self.handler._exchange_code = simple_async_mock(return_value=token)
630605 self.handler._parse_id_token = simple_async_mock(return_value=userinfo)
631 self.handler._map_userinfo_to_user = simple_async_mock(return_value=user_id)
632 self.handler._auth_handler.complete_sso_login = simple_async_mock()
633 request = Mock(
634 spec=[
635 "args",
636 "getCookie",
637 "addCookie",
638 "requestHeaders",
639 "getClientIP",
640 "get_user_agent",
641 ]
642 )
606 auth_handler = self.hs.get_auth_handler()
607 auth_handler.complete_sso_login = simple_async_mock()
643608
644609 state = "state"
645610 client_redirect_url = "http://client/redirect"
646 request.getCookie.return_value = self.handler._generate_oidc_session_token(
611 session = self.handler._generate_oidc_session_token(
647612 state=state,
648613 nonce="nonce",
649614 client_redirect_url=client_redirect_url,
650615 ui_auth_session_id=None,
651616 )
652
653 request.args = {}
654 request.args[b"code"] = [b"code"]
655 request.args[b"state"] = [state.encode("utf-8")]
656
657 request.getClientIP.return_value = "10.0.0.1"
658 request.get_user_agent.return_value = "Browser"
659
660 self.get_success(self.handler.handle_oidc_callback(request))
661
662 self.handler._auth_handler.complete_sso_login.assert_called_once_with(
663 user_id, request, client_redirect_url, {"phone": "1234567"},
617 request = _build_callback_request("code", state, session)
618
619 self.get_success(self.handler.handle_oidc_callback(request))
620
621 auth_handler.complete_sso_login.assert_called_once_with(
622 "@foo:test", request, client_redirect_url, {"phone": "1234567"},
664623 )
665624
666625 def test_map_userinfo_to_user(self):
667626 """Ensure that mapping the userinfo returned from a provider to an MXID works properly."""
627 auth_handler = self.hs.get_auth_handler()
628 auth_handler.complete_sso_login = simple_async_mock()
629
668630 userinfo = {
669631 "sub": "test_user",
670632 "username": "test_user",
671633 }
672 # The token doesn't matter with the default user mapping provider.
673 token = {}
674 mxid = self.get_success(
675 self.handler._map_userinfo_to_user(
676 userinfo, token, "user-agent", "10.10.10.10"
677 )
678 )
679 self.assertEqual(mxid, "@test_user:test")
634 self.get_success(_make_callback_with_userinfo(self.hs, userinfo))
635 auth_handler.complete_sso_login.assert_called_once_with(
636 "@test_user:test", ANY, ANY, None,
637 )
638 auth_handler.complete_sso_login.reset_mock()
680639
681640 # Some providers return an integer ID.
682641 userinfo = {
683642 "sub": 1234,
684643 "username": "test_user_2",
685644 }
686 mxid = self.get_success(
687 self.handler._map_userinfo_to_user(
688 userinfo, token, "user-agent", "10.10.10.10"
689 )
690 )
691 self.assertEqual(mxid, "@test_user_2:test")
645 self.get_success(_make_callback_with_userinfo(self.hs, userinfo))
646 auth_handler.complete_sso_login.assert_called_once_with(
647 "@test_user_2:test", ANY, ANY, None,
648 )
649 auth_handler.complete_sso_login.reset_mock()
692650
693651 # Test if the mxid is already taken
694652 store = self.hs.get_datastore()
697655 store.register_user(user_id=user3.to_string(), password_hash=None)
698656 )
699657 userinfo = {"sub": "test3", "username": "test_user_3"}
700 e = self.get_failure(
701 self.handler._map_userinfo_to_user(
702 userinfo, token, "user-agent", "10.10.10.10"
703 ),
704 MappingException,
705 )
706 self.assertEqual(
707 str(e.value), "Mapping provider does not support de-duplicating Matrix IDs",
658 self.get_success(_make_callback_with_userinfo(self.hs, userinfo))
659 auth_handler.complete_sso_login.assert_not_called()
660 self.assertRenderedError(
661 "mapping_error",
662 "Mapping provider does not support de-duplicating Matrix IDs",
708663 )
709664
710665 @override_config({"oidc_config": {"allow_existing_users": True}})
716671 store.register_user(user_id=user.to_string(), password_hash=None)
717672 )
718673
674 auth_handler = self.hs.get_auth_handler()
675 auth_handler.complete_sso_login = simple_async_mock()
676
719677 # Map a user via SSO.
720678 userinfo = {
721679 "sub": "test",
722680 "username": "test_user",
723681 }
724 token = {}
725 mxid = self.get_success(
726 self.handler._map_userinfo_to_user(
727 userinfo, token, "user-agent", "10.10.10.10"
728 )
729 )
730 self.assertEqual(mxid, "@test_user:test")
682 self.get_success(_make_callback_with_userinfo(self.hs, userinfo))
683 auth_handler.complete_sso_login.assert_called_once_with(
684 user.to_string(), ANY, ANY, None,
685 )
686 auth_handler.complete_sso_login.reset_mock()
731687
732688 # Subsequent calls should map to the same mxid.
733 mxid = self.get_success(
734 self.handler._map_userinfo_to_user(
735 userinfo, token, "user-agent", "10.10.10.10"
736 )
737 )
738 self.assertEqual(mxid, "@test_user:test")
689 self.get_success(_make_callback_with_userinfo(self.hs, userinfo))
690 auth_handler.complete_sso_login.assert_called_once_with(
691 user.to_string(), ANY, ANY, None,
692 )
693 auth_handler.complete_sso_login.reset_mock()
739694
740695 # Note that a second SSO user can be mapped to the same Matrix ID. (This
741696 # requires a unique sub, but something that maps to the same matrix ID,
746701 "sub": "test1",
747702 "username": "test_user",
748703 }
749 token = {}
750 mxid = self.get_success(
751 self.handler._map_userinfo_to_user(
752 userinfo, token, "user-agent", "10.10.10.10"
753 )
754 )
755 self.assertEqual(mxid, "@test_user:test")
704 self.get_success(_make_callback_with_userinfo(self.hs, userinfo))
705 auth_handler.complete_sso_login.assert_called_once_with(
706 user.to_string(), ANY, ANY, None,
707 )
708 auth_handler.complete_sso_login.reset_mock()
756709
757710 # Register some non-exact matching cases.
758711 user2 = UserID.from_string("@TEST_user_2:test")
769722 "sub": "test2",
770723 "username": "TEST_USER_2",
771724 }
772 e = self.get_failure(
773 self.handler._map_userinfo_to_user(
774 userinfo, token, "user-agent", "10.10.10.10"
775 ),
776 MappingException,
777 )
725 self.get_success(_make_callback_with_userinfo(self.hs, userinfo))
726 auth_handler.complete_sso_login.assert_not_called()
727 args = self.assertRenderedError("mapping_error")
778728 self.assertTrue(
779 str(e.value).startswith(
729 args[2].startswith(
780730 "Attempted to login as '@TEST_USER_2:test' but it matches more than one user inexactly:"
781731 )
782732 )
787737 store.register_user(user_id=user2.to_string(), password_hash=None)
788738 )
789739
790 mxid = self.get_success(
791 self.handler._map_userinfo_to_user(
792 userinfo, token, "user-agent", "10.10.10.10"
793 )
794 )
795 self.assertEqual(mxid, "@TEST_USER_2:test")
740 self.get_success(_make_callback_with_userinfo(self.hs, userinfo))
741 auth_handler.complete_sso_login.assert_called_once_with(
742 "@TEST_USER_2:test", ANY, ANY, None,
743 )
796744
797745 def test_map_userinfo_to_invalid_localpart(self):
798746 """If the mapping provider generates an invalid localpart it should be rejected."""
799 userinfo = {
800 "sub": "test2",
801 "username": "föö",
802 }
803 token = {}
804
805 e = self.get_failure(
806 self.handler._map_userinfo_to_user(
807 userinfo, token, "user-agent", "10.10.10.10"
808 ),
809 MappingException,
810 )
811 self.assertEqual(str(e.value), "localpart is invalid: föö")
747 self.get_success(
748 _make_callback_with_userinfo(self.hs, {"sub": "test2", "username": "föö"})
749 )
750 self.assertRenderedError("mapping_error", "localpart is invalid: föö")
812751
813752 @override_config(
814753 {
821760 )
822761 def test_map_userinfo_to_user_retries(self):
823762 """The mapping provider can retry generating an MXID if the MXID is already in use."""
763 auth_handler = self.hs.get_auth_handler()
764 auth_handler.complete_sso_login = simple_async_mock()
765
824766 store = self.hs.get_datastore()
825767 self.get_success(
826768 store.register_user(user_id="@test_user:test", password_hash=None)
829771 "sub": "test",
830772 "username": "test_user",
831773 }
832 token = {}
833 mxid = self.get_success(
834 self.handler._map_userinfo_to_user(
835 userinfo, token, "user-agent", "10.10.10.10"
836 )
837 )
774 self.get_success(_make_callback_with_userinfo(self.hs, userinfo))
775
838776 # test_user is already taken, so test_user1 gets registered instead.
839 self.assertEqual(mxid, "@test_user1:test")
777 auth_handler.complete_sso_login.assert_called_once_with(
778 "@test_user1:test", ANY, ANY, None,
779 )
780 auth_handler.complete_sso_login.reset_mock()
840781
841782 # Register all of the potential mxids for a particular OIDC username.
842783 self.get_success(
852793 "sub": "tester",
853794 "username": "tester",
854795 }
855 e = self.get_failure(
856 self.handler._map_userinfo_to_user(
857 userinfo, token, "user-agent", "10.10.10.10"
796 self.get_success(_make_callback_with_userinfo(self.hs, userinfo))
797 auth_handler.complete_sso_login.assert_not_called()
798 self.assertRenderedError(
799 "mapping_error", "Unable to generate a Matrix ID from the SSO response"
800 )
801
802 def test_empty_localpart(self):
803 """Attempts to map onto an empty localpart should be rejected."""
804 userinfo = {
805 "sub": "tester",
806 "username": "",
807 }
808 self.get_success(_make_callback_with_userinfo(self.hs, userinfo))
809 self.assertRenderedError("mapping_error", "localpart is invalid: ")
810
811 @override_config(
812 {
813 "oidc_config": {
814 "user_mapping_provider": {
815 "config": {"localpart_template": "{{ user.username }}"}
816 }
817 }
818 }
819 )
820 def test_null_localpart(self):
821 """Mapping onto a null localpart via an empty OIDC attribute should be rejected"""
822 userinfo = {
823 "sub": "tester",
824 "username": None,
825 }
826 self.get_success(_make_callback_with_userinfo(self.hs, userinfo))
827 self.assertRenderedError("mapping_error", "localpart is invalid: ")
828
829
830 class UsernamePickerTestCase(HomeserverTestCase):
831 servlets = [login.register_servlets]
832
833 def default_config(self):
834 config = super().default_config()
835 config["public_baseurl"] = BASE_URL
836 oidc_config = {
837 "enabled": True,
838 "client_id": CLIENT_ID,
839 "client_secret": CLIENT_SECRET,
840 "issuer": ISSUER,
841 "scopes": SCOPES,
842 "user_mapping_provider": {
843 "config": {"display_name_template": "{{ user.displayname }}"}
844 },
845 }
846
847 # Update this config with what's in the default config so that
848 # override_config works as expected.
849 oidc_config.update(config.get("oidc_config", {}))
850 config["oidc_config"] = oidc_config
851
852 # whitelist this client URI so we redirect straight to it rather than
853 # serving a confirmation page
854 config["sso"] = {"client_whitelist": ["https://whitelisted.client"]}
855 return config
856
857 def create_resource_dict(self) -> Dict[str, Resource]:
858 d = super().create_resource_dict()
859 d["/_synapse/client/pick_username"] = pick_username_resource(self.hs)
860 return d
861
862 def test_username_picker(self):
863 """Test the happy path of a username picker flow."""
864 client_redirect_url = "https://whitelisted.client"
865
866 # first of all, mock up an OIDC callback to the OidcHandler, which should
867 # raise a RedirectException
868 userinfo = {"sub": "tester", "displayname": "Jonny"}
869 f = self.get_failure(
870 _make_callback_with_userinfo(
871 self.hs, userinfo, client_redirect_url=client_redirect_url
858872 ),
859 MappingException,
860 )
873 RedirectException,
874 )
875
876 # check the Location and cookies returned by the RedirectException
877 self.assertEqual(f.value.location, b"/_synapse/client/pick_username")
878 cookieheader = f.value.cookies[0]
879 regex = re.compile(b"^username_mapping_session=([a-zA-Z]+);")
880 m = regex.search(cookieheader)
881 if not m:
882 self.fail("cookie header %s does not match %s" % (cookieheader, regex))
883
884 # introspect the sso handler a bit to check that the username mapping session
885 # looks ok.
886 session_id = m.group(1).decode("ascii")
887 username_mapping_sessions = self.hs.get_sso_handler()._username_mapping_sessions
888 self.assertIn(
889 session_id, username_mapping_sessions, "session id not found in map"
890 )
891 session = username_mapping_sessions[session_id]
892 self.assertEqual(session.remote_user_id, "tester")
893 self.assertEqual(session.display_name, "Jonny")
894 self.assertEqual(session.client_redirect_url, client_redirect_url)
895
896 # the expiry time should be about 15 minutes away
897 expected_expiry = self.clock.time_msec() + (15 * 60 * 1000)
898 self.assertApproximates(session.expiry_time_ms, expected_expiry, tolerance=1000)
899
900 # Now, submit a username to the username picker, which should serve a redirect
901 # back to the client
902 submit_path = f.value.location + b"/submit"
903 content = urlencode({b"username": b"bobby"}).encode("utf8")
904 chan = self.make_request(
905 "POST",
906 path=submit_path,
907 content=content,
908 content_is_form=True,
909 custom_headers=[
910 ("Cookie", cookieheader),
911 # old versions of twisted don't do form-parsing without a valid
912 # content-length header.
913 ("Content-Length", str(len(content))),
914 ],
915 )
916 self.assertEqual(chan.code, 302, chan.result)
917 location_headers = chan.headers.getRawHeaders("Location")
918 # ensure that the returned location starts with the requested redirect URL
861919 self.assertEqual(
862 str(e.value), "Unable to generate a Matrix ID from the SSO response"
863 )
920 location_headers[0][: len(client_redirect_url)], client_redirect_url
921 )
922
923 # fish the login token out of the returned redirect uri
924 parts = urlparse(location_headers[0])
925 query = parse_qs(parts.query)
926 login_token = query["loginToken"][0]
927
928 # finally, submit the matrix login token to the login API, which gives us our
929 # matrix access token, mxid, and device id.
930 chan = self.make_request(
931 "POST", "/login", content={"type": "m.login.token", "token": login_token},
932 )
933 self.assertEqual(chan.code, 200, chan.result)
934 self.assertEqual(chan.json_body["user_id"], "@bobby:test")
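The cookie handling in `test_username_picker` above can be exercised in isolation; a small sketch of how the session id is fished out of a `Set-Cookie` header using the same regex (the header value here is made up for illustration):

```python
import re

# Hypothetical Set-Cookie value of the shape the username picker issues.
cookieheader = (
    b"username_mapping_session=AbCdEfGh; "
    b"path=/_synapse/client/pick_username; HttpOnly"
)

# Same pattern as the test: the session id is the leading letter run.
regex = re.compile(b"^username_mapping_session=([a-zA-Z]+);")
m = regex.search(cookieheader)
assert m, "cookie header does not match"
session_id = m.group(1).decode("ascii")
```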
935
936
937 async def _make_callback_with_userinfo(
938 hs: HomeServer, userinfo: dict, client_redirect_url: str = "http://client/redirect"
939 ) -> None:
940 """Mock up an OIDC callback with the given userinfo dict
941
942 We'll pull out the OIDC handler from the homeserver, stub out a couple of methods,
943 and poke in the userinfo dict as if it were the response to an OIDC userinfo call.
944
945 Args:
946 hs: the HomeServer impl to send the callback to.
947 userinfo: the OIDC userinfo dict
948 client_redirect_url: the URL to redirect to on success.
949 """
950 handler = hs.get_oidc_handler()
951 handler._exchange_code = simple_async_mock(return_value={})
952 handler._parse_id_token = simple_async_mock(return_value=userinfo)
953 handler._fetch_userinfo = simple_async_mock(return_value=userinfo)
954
955 state = "state"
956 session = handler._generate_oidc_session_token(
957 state=state,
958 nonce="nonce",
959 client_redirect_url=client_redirect_url,
960 ui_auth_session_id=None,
961 )
962 request = _build_callback_request("code", state, session)
963
964 await handler.handle_oidc_callback(request)
965
966
967 def _build_callback_request(
968 code: str,
969 state: str,
970 session: str,
971 user_agent: str = "Browser",
972 ip_address: str = "10.0.0.1",
973 ):
974 """Builds a fake SynapseRequest to mock the browser callback
975
976 Returns a Mock object which looks like the SynapseRequest we get from a browser
977 after SSO (before we return to the client)
978
979 Args:
980 code: the authorization code which would have been returned by the OIDC
981 provider
982 state: the "state" param which would have been passed around in the
983 query param. Should be the same as was embedded in the session in
984 _build_oidc_session.
985 session: the "session" which would have been passed around in the cookie.
986 user_agent: the user-agent to present
987 ip_address: the IP address to pretend the request came from
988 """
989 request = Mock(
990 spec=[
991 "args",
992 "getCookie",
993 "addCookie",
994 "requestHeaders",
995 "getClientIP",
996 "get_user_agent",
997 ]
998 )
999
1000 request.getCookie.return_value = session
1001 request.args = {}
1002 request.args[b"code"] = [code.encode("utf-8")]
1003 request.args[b"state"] = [state.encode("utf-8")]
1004 request.getClientIP.return_value = ip_address
1005 request.get_user_agent.return_value = user_agent
1006 return request
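Because the mock above is built with a `spec=` list, it only answers for the attributes a real `SynapseRequest` would expose to the OIDC callback; accessing anything else raises `AttributeError`. A self-contained sketch mirroring the helper's body:

```python
from unittest.mock import Mock


def build_callback_request(code, state, session,
                           user_agent="Browser", ip_address="10.0.0.1"):
    # Constrain the Mock to the SynapseRequest surface the OIDC
    # callback actually touches; anything else is an AttributeError.
    request = Mock(
        spec=[
            "args",
            "getCookie",
            "addCookie",
            "requestHeaders",
            "getClientIP",
            "get_user_agent",
        ]
    )
    request.getCookie.return_value = session
    request.args = {
        b"code": [code.encode("utf-8")],
        b"state": [state.encode("utf-8")],
    }
    request.getClientIP.return_value = ip_address
    request.get_user_agent.return_value = user_agent
    return request


req = build_callback_request("code", "state", "session-token")
```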
431431
432432 @override_config(
433433 {
434 **providers_config(CustomAuthProvider),
435 "password_config": {"enabled": False, "localdb_enabled": False},
436 }
437 )
438 def test_custom_auth_password_disabled_localdb_enabled(self):
439 """Check the localdb_enabled == enabled == False
440
441 Regression test for https://github.com/matrix-org/synapse/issues/8914: check
442 that setting *both* `localdb_enabled` *and* `password: enabled` to False doesn't
443 cause an exception.
444 """
445 self.register_user("localuser", "localpass")
446
447 flows = self._get_login_flows()
448 self.assertEqual(flows, [{"type": "test.login_type"}] + ADDITIONAL_LOGIN_FLOWS)
449
450 # login shouldn't work and should be rejected with a 400 ("unknown login type")
451 channel = self._send_password_login("localuser", "localpass")
452 self.assertEqual(channel.code, 400, channel.result)
453 mock_password_provider.check_auth.assert_not_called()
454
455 @override_config(
456 {
434457 **providers_config(PasswordCustomAuthProvider),
435458 "password_config": {"enabled": False},
436459 }
527550 self.assertEqual(channel.code, 400, channel.result)
528551
529552 def _get_login_flows(self) -> JsonDict:
530 _, channel = self.make_request("GET", "/_matrix/client/r0/login")
553 channel = self.make_request("GET", "/_matrix/client/r0/login")
531554 self.assertEqual(channel.code, 200, channel.result)
532555 return channel.json_body["flows"]
533556
536559
537560 def _send_login(self, type, user, **params) -> FakeChannel:
538561 params.update({"identifier": {"type": "m.id.user", "user": user}, "type": type})
539 _, channel = self.make_request("POST", "/_matrix/client/r0/login", params)
562 channel = self.make_request("POST", "/_matrix/client/r0/login", params)
540563 return channel
541564
542565 def _start_delete_device_session(self, access_token, device_id) -> str:
573596 self, access_token: str, device: str, body: Union[JsonDict, bytes] = b"",
574597 ) -> FakeChannel:
575598 """Delete an individual device."""
576 _, channel = self.make_request(
599 channel = self.make_request(
577600 "DELETE", "devices/" + device, body, access_token=access_token
578601 )
579602 return channel
462462
463463 def make_homeserver(self, reactor, clock):
464464 hs = self.setup_test_homeserver(
465 "server", http_client=None, federation_sender=Mock()
465 "server", federation_http_client=None, federation_sender=Mock()
466466 )
467467 return hs
468468
4343
4444 hs = yield setup_test_homeserver(
4545 self.addCleanup,
46 http_client=None,
47 resource_for_federation=Mock(),
4846 federation_client=self.mock_federation,
4947 federation_server=Mock(),
5048 federation_registry=self.mock_registry,
1111 # See the License for the specific language governing permissions and
1212 # limitations under the License.
1313
14 from typing import Optional
15
16 from mock import Mock
17
1418 import attr
1519
1620 from synapse.api.errors import RedirectException
17 from synapse.handlers.sso import MappingException
18
21
22 from tests.test_utils import simple_async_mock
1923 from tests.unittest import HomeserverTestCase, override_config
24
25 # Check if we have the dependencies to run the tests.
26 try:
27 import saml2.config
28 from saml2.sigver import SigverError
29
30 has_saml2 = True
31
32 # pysaml2 can be installed and imported, but might not be able to find xmlsec1.
33 config = saml2.config.SPConfig()
34 try:
35 config.load({"metadata": {}})
36 has_xmlsec1 = True
37 except SigverError:
38 has_xmlsec1 = False
39 except ImportError:
40 has_saml2 = False
41 has_xmlsec1 = False
2042
2143 # These are a few constants that are used as config parameters in the tests.
2244 BASE_URL = "https://synapse/"
2547 @attr.s
2648 class FakeAuthnResponse:
2749 ava = attr.ib(type=dict)
50 assertions = attr.ib(type=list, factory=list)
51 in_response_to = attr.ib(type=Optional[str], default=None)
2852
2953
3054 class TestMappingProvider:
85109
86110 return hs
87111
112 if not has_saml2:
113 skip = "Requires pysaml2"
114 elif not has_xmlsec1:
115 skip = "Requires xmlsec1"
116
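The dependency-detection block above feeds trial's class-level `skip` attribute. A stdlib-only sketch of the same pattern using `unittest.skipUnless` (the test class and method names here are hypothetical; `saml2` is pysaml2's import name, so whether the test runs or is skipped depends on the environment):

```python
import unittest

# Probe for the optional dependency at import time, as the tests above do.
try:
    import saml2  # noqa: F401
    HAS_SAML2 = True
except ImportError:
    HAS_SAML2 = False

class SamlDependentTest(unittest.TestCase):
    # stdlib analogue of trial's class-level `skip` attribute used above
    @unittest.skipUnless(HAS_SAML2, "Requires pysaml2")
    def test_something(self):
        self.assertTrue(True)

# Run the case programmatically: it either passes or is recorded as skipped,
# and the run succeeds either way.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(SamlDependentTest)
result = unittest.TestResult()
suite.run(result)
```

Skipped tests still count towards `testsRun` and never mark the run as failed, which is why a missing pysaml2 leaves the suite green.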
88117 def test_map_saml_response_to_user(self):
89118 """Ensure that mapping the SAML response returned from a provider to an MXID works properly."""
119
120 # stub out the auth handler
121 auth_handler = self.hs.get_auth_handler()
122 auth_handler.complete_sso_login = simple_async_mock()
123
124 # send a mocked-up SAML response to the callback
90125 saml_response = FakeAuthnResponse({"uid": "test_user", "username": "test_user"})
91 # The redirect_url doesn't matter with the default user mapping provider.
92 redirect_url = ""
93 mxid = self.get_success(
94 self.handler._map_saml_response_to_user(
95 saml_response, redirect_url, "user-agent", "10.10.10.10"
96 )
97 )
98 self.assertEqual(mxid, "@test_user:test")
126 request = _mock_request()
127 self.get_success(
128 self.handler._handle_authn_response(request, saml_response, "redirect_uri")
129 )
130
131 # check that the auth handler got called as expected
132 auth_handler.complete_sso_login.assert_called_once_with(
133 "@test_user:test", request, "redirect_uri", None
134 )
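These tests stub out `complete_sso_login` with the `simple_async_mock()` helper. On Python 3.8+ the stdlib `unittest.mock.AsyncMock` provides the same awaitable-stub behaviour; a minimal sketch (the argument values are illustrative, not what the handler actually passes):

```python
import asyncio
from unittest.mock import AsyncMock

# Hypothetical stand-in for auth_handler.complete_sso_login.
complete_sso_login = AsyncMock(return_value=None)

async def callback_under_test():
    # Awaiting the mock works because AsyncMock calls return coroutines.
    await complete_sso_login("@test_user:test", "request", "", None)

asyncio.run(callback_under_test())
complete_sso_login.assert_called_once_with("@test_user:test", "request", "", None)
```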
99135
100136 @override_config({"saml2_config": {"grandfathered_mxid_source_attribute": "mxid"}})
101137 def test_map_saml_response_to_existing_user(self):
105141 store.register_user(user_id="@test_user:test", password_hash=None)
106142 )
107143
144 # stub out the auth handler
145 auth_handler = self.hs.get_auth_handler()
146 auth_handler.complete_sso_login = simple_async_mock()
147
108148 # Map a user via SSO.
109149 saml_response = FakeAuthnResponse(
110150 {"uid": "tester", "mxid": ["test_user"], "username": "test_user"}
111151 )
112 redirect_url = ""
113 mxid = self.get_success(
114 self.handler._map_saml_response_to_user(
115 saml_response, redirect_url, "user-agent", "10.10.10.10"
116 )
117 )
118 self.assertEqual(mxid, "@test_user:test")
152 request = _mock_request()
153 self.get_success(
154 self.handler._handle_authn_response(request, saml_response, "")
155 )
156
157 # check that the auth handler got called as expected
158 auth_handler.complete_sso_login.assert_called_once_with(
159 "@test_user:test", request, "", None
160 )
119161
120162 # Subsequent calls should map to the same mxid.
121 mxid = self.get_success(
122 self.handler._map_saml_response_to_user(
123 saml_response, redirect_url, "user-agent", "10.10.10.10"
124 )
125 )
126 self.assertEqual(mxid, "@test_user:test")
163 auth_handler.complete_sso_login.reset_mock()
164 self.get_success(
165 self.handler._handle_authn_response(request, saml_response, "")
166 )
167 auth_handler.complete_sso_login.assert_called_once_with(
168 "@test_user:test", request, "", None
169 )
127170
128171 def test_map_saml_response_to_invalid_localpart(self):
129172 """If the mapping provider generates an invalid localpart it should be rejected."""
173
174 # stub out the auth handler
175 auth_handler = self.hs.get_auth_handler()
176 auth_handler.complete_sso_login = simple_async_mock()
177
178 # mock out the error renderer too
179 sso_handler = self.hs.get_sso_handler()
180 sso_handler.render_error = Mock(return_value=None)
181
130182 saml_response = FakeAuthnResponse({"uid": "test", "username": "föö"})
131 redirect_url = ""
132 e = self.get_failure(
133 self.handler._map_saml_response_to_user(
134 saml_response, redirect_url, "user-agent", "10.10.10.10"
135 ),
136 MappingException,
137 )
138 self.assertEqual(str(e.value), "localpart is invalid: föö")
183 request = _mock_request()
184 self.get_success(
185 self.handler._handle_authn_response(request, saml_response, ""),
186 )
187 sso_handler.render_error.assert_called_once_with(
188 request, "mapping_error", "localpart is invalid: föö"
189 )
190 auth_handler.complete_sso_login.assert_not_called()
139191
140192 def test_map_saml_response_to_user_retries(self):
141193 """The mapping provider can retry generating an MXID if the MXID is already in use."""
194
195 # stub out the auth handler and error renderer
196 auth_handler = self.hs.get_auth_handler()
197 auth_handler.complete_sso_login = simple_async_mock()
198 sso_handler = self.hs.get_sso_handler()
199 sso_handler.render_error = Mock(return_value=None)
200
201 # register a user to occupy the first-choice MXID
142202 store = self.hs.get_datastore()
143203 self.get_success(
144204 store.register_user(user_id="@test_user:test", password_hash=None)
145205 )
206
207 # send the fake SAML response
146208 saml_response = FakeAuthnResponse({"uid": "test", "username": "test_user"})
147 redirect_url = ""
148 mxid = self.get_success(
149 self.handler._map_saml_response_to_user(
150 saml_response, redirect_url, "user-agent", "10.10.10.10"
151 )
152 )
209 request = _mock_request()
210 self.get_success(
211 self.handler._handle_authn_response(request, saml_response, ""),
212 )
213
153214 # test_user is already taken, so test_user1 gets registered instead.
154 self.assertEqual(mxid, "@test_user1:test")
215 auth_handler.complete_sso_login.assert_called_once_with(
216 "@test_user1:test", request, "", None
217 )
218 auth_handler.complete_sso_login.reset_mock()
155219
156220 # Register all of the potential mxids for a particular SAML username.
157221 self.get_success(
164228
165229 # Now attempt to map to a username, this will fail since all potential usernames are taken.
166230 saml_response = FakeAuthnResponse({"uid": "tester", "username": "tester"})
167 e = self.get_failure(
168 self.handler._map_saml_response_to_user(
169 saml_response, redirect_url, "user-agent", "10.10.10.10"
170 ),
171 MappingException,
172 )
173 self.assertEqual(
174 str(e.value), "Unable to generate a Matrix ID from the SSO response"
175 )
231 self.get_success(
232 self.handler._handle_authn_response(request, saml_response, ""),
233 )
234 sso_handler.render_error.assert_called_once_with(
235 request,
236 "mapping_error",
237 "Unable to generate a Matrix ID from the SSO response",
238 )
239 auth_handler.complete_sso_login.assert_not_called()
176240
177241 @override_config(
178242 {
184248 }
185249 )
186250 def test_map_saml_response_redirect(self):
251 """Test a mapping provider that raises a RedirectException"""
252
187253 saml_response = FakeAuthnResponse({"uid": "test", "username": "test_user"})
188 redirect_url = ""
254 request = _mock_request()
189255 e = self.get_failure(
190 self.handler._map_saml_response_to_user(
191 saml_response, redirect_url, "user-agent", "10.10.10.10"
192 ),
256 self.handler._handle_authn_response(request, saml_response, ""),
193257 RedirectException,
194258 )
195259 self.assertEqual(e.value.location, b"https://custom-saml-redirect/")
260
261
262 def _mock_request():
263 """Returns a mock which will stand in as a SynapseRequest"""
264 return Mock(spec=["getClientIP", "get_user_agent"])
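The `_mock_request()` helper relies on `Mock(spec=[...])` to build a strict request double. A minimal sketch of how the spec list constrains the mock (the return values here are illustrative):

```python
from unittest.mock import Mock

# Only the two attributes the SSO code touches exist on this double,
# mirroring _mock_request() above.
request = Mock(spec=["getClientIP", "get_user_agent"])
request.getClientIP.return_value = "10.10.10.10"
request.get_user_agent.return_value = "test-agent"

# Any attribute outside the spec raises AttributeError, catching typos early.
try:
    request.requestHeaders
    blocked = False
except AttributeError:
    blocked = True
```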
1414
1515
1616 import json
17 from typing import Dict
1718
1819 from mock import ANY, Mock, call
1920
2021 from twisted.internet import defer
22 from twisted.web.resource import Resource
2123
2224 from synapse.api.errors import AuthError
25 from synapse.federation.transport.server import TransportLayerServer
2326 from synapse.types import UserID, create_requester
2427
2528 from tests import unittest
2629 from tests.test_utils import make_awaitable
2730 from tests.unittest import override_config
28 from tests.utils import register_federation_servlets
2931
3032 # Some local users to test with
3133 U_APPLE = UserID.from_string("@apple:test")
5254
5355
5456 class TypingNotificationsTestCase(unittest.HomeserverTestCase):
55 servlets = [register_federation_servlets]
56
5757 def make_homeserver(self, reactor, clock):
5858 # we mock out the keyring so as to skip the authentication check on the
5959 # federation API call.
6969
7070 hs = self.setup_test_homeserver(
7171 notifier=Mock(),
72 http_client=mock_federation_client,
72 federation_http_client=mock_federation_client,
7373 keyring=mock_keyring,
7474 replication_streams={},
7575 )
7676
7777 return hs
78
79 def create_resource_dict(self) -> Dict[str, Resource]:
80 d = super().create_resource_dict()
81 d["/_matrix/federation"] = TransportLayerServer(self.hs)
82 return d
7883
7984 def prepare(self, reactor, clock, hs):
8085 mock_notifier = hs.get_notifier()
191196 )
192197 )
193198
194 put_json = self.hs.get_http_client().put_json
199 put_json = self.hs.get_federation_http_client().put_json
195200 put_json.assert_called_once_with(
196201 "farm",
197202 path="/_matrix/federation/v1/send/1000000",
214219
215220 self.assertEquals(self.event_source.get_current_key(), 0)
216221
217 (request, channel) = self.make_request(
222 channel = self.make_request(
218223 "PUT",
219224 "/_matrix/federation/v1/send/1000000",
220225 _make_edu_transaction_json(
269274
270275 self.on_new_event.assert_has_calls([call("typing_key", 1, rooms=[ROOM_ID])])
271276
272 put_json = self.hs.get_http_client().put_json
277 put_json = self.hs.get_federation_http_client().put_json
273278 put_json.assert_called_once_with(
274279 "farm",
275280 path="/_matrix/federation/v1/send/1000000",
5353 user_id=support_user_id, password_hash=None, user_type=UserTypes.SUPPORT
5454 )
5555 )
56 regular_user_id = "@regular:test"
57 self.get_success(
58 self.store.register_user(user_id=regular_user_id, password_hash=None)
59 )
5660
5761 self.get_success(
5862 self.handler.handle_local_profile_change(support_user_id, None)
6266 display_name = "display_name"
6367
6468 profile_info = ProfileInfo(avatar_url="avatar_url", display_name=display_name)
65 regular_user_id = "@regular:test"
6669 self.get_success(
6770 self.handler.handle_local_profile_change(regular_user_id, profile_info)
6871 )
6972 profile = self.get_success(self.store.get_user_in_directory(regular_user_id))
7073 self.assertTrue(profile["display_name"] == display_name)
74
75 def test_handle_local_profile_change_with_deactivated_user(self):
76 # create user
77 r_user_id = "@regular:test"
78 self.get_success(
79 self.store.register_user(user_id=r_user_id, password_hash=None)
80 )
81
82 # update profile
83 display_name = "Regular User"
84 profile_info = ProfileInfo(avatar_url="avatar_url", display_name=display_name)
85 self.get_success(
86 self.handler.handle_local_profile_change(r_user_id, profile_info)
87 )
88
89 # profile is in directory
90 profile = self.get_success(self.store.get_user_in_directory(r_user_id))
91 self.assertTrue(profile["display_name"] == display_name)
92
93 # deactivate user
94 self.get_success(self.store.set_user_deactivated_status(r_user_id, True))
95 self.get_success(self.handler.handle_user_deactivated(r_user_id))
96
97 # profile is not in directory
98 profile = self.get_success(self.store.get_user_in_directory(r_user_id))
99 self.assertTrue(profile is None)
100
101 # update profile after deactivation
102 self.get_success(
103 self.handler.handle_local_profile_change(r_user_id, profile_info)
104 )
105
106 # profile is still not in directory
107 profile = self.get_success(self.store.get_user_in_directory(r_user_id))
108 self.assertTrue(profile is None)
71109
72110 def test_handle_user_deactivated_support_user(self):
73111 s_user_id = "@support:test"
269307 spam_checker = self.hs.get_spam_checker()
270308
271309 class AllowAll:
272 def check_username_for_spam(self, user_profile):
310 async def check_username_for_spam(self, user_profile):
273311 # Allow all users.
274312 return False
275313
282320
283321 # Configure a spam checker that filters all users.
284322 class BlockAll:
285 def check_username_for_spam(self, user_profile):
323 async def check_username_for_spam(self, user_profile):
286324 # All users are spammy.
287325 return True
288326
533571 self.helper.join(room, user=u2)
534572
535573 # Assert user directory is not empty
536 request, channel = self.make_request(
574 channel = self.make_request(
537575 "POST", b"user_directory/search", b'{"search_term":"user2"}'
538576 )
539577 self.assertEquals(200, channel.code, channel.result)
541579
542580 # Disable user directory and check search returns nothing
543581 self.config.user_directory_search_enabled = False
544 request, channel = self.make_request(
582 channel = self.make_request(
545583 "POST", b"user_directory/search", b'{"search_term":"user2"}'
546584 )
547585 self.assertEquals(200, channel.code, channel.result)
1616 from mock import Mock
1717
1818 import treq
19 from netaddr import IPSet
1920 from service_identity import VerificationError
2021 from zope.interface import implementer
2122
3435 from synapse.http.federation.matrix_federation_agent import MatrixFederationAgent
3536 from synapse.http.federation.srv_resolver import Server
3637 from synapse.http.federation.well_known_resolver import (
38 WELL_KNOWN_MAX_SIZE,
3739 WellKnownResolver,
3840 _cache_period_from_headers,
3941 )
102104 reactor=self.reactor,
103105 tls_client_options_factory=self.tls_factory,
104106 user_agent="test-agent", # Note that this is unused since _well_known_resolver is provided.
107 ip_blacklist=IPSet(),
105108 _srv_resolver=self.mock_resolver,
106109 _well_known_resolver=self.well_known_resolver,
107110 )
735738 reactor=self.reactor,
736739 tls_client_options_factory=tls_factory,
737740 user_agent=b"test-agent", # This is unused since _well_known_resolver is passed below.
741 ip_blacklist=IPSet(),
738742 _srv_resolver=self.mock_resolver,
739743 _well_known_resolver=WellKnownResolver(
740744 self.reactor,
11021106
11031107 r = self.successResultOf(fetch_d)
11041108 self.assertEqual(r.delegated_server, None)
1109
1110 def test_well_known_too_large(self):
1111 """A well-known query that returns a result which is too large should be rejected."""
1112 self.reactor.lookups["testserv"] = "1.2.3.4"
1113
1114 fetch_d = defer.ensureDeferred(
1115 self.well_known_resolver.get_well_known(b"testserv")
1116 )
1117
1118 # there should be an attempt to connect on port 443 for the .well-known
1119 clients = self.reactor.tcpClients
1120 self.assertEqual(len(clients), 1)
1121 (host, port, client_factory, _timeout, _bindAddress) = clients.pop(0)
1122 self.assertEqual(host, "1.2.3.4")
1123 self.assertEqual(port, 443)
1124
1125 self._handle_well_known_connection(
1126 client_factory,
1127 expected_sni=b"testserv",
1128 response_headers={b"Cache-Control": b"max-age=1000"},
1129 content=b'{ "m.server": "' + (b"a" * WELL_KNOWN_MAX_SIZE) + b'" }',
1130 )
1131
1132 # The result is successful, but disables delegation.
1133 r = self.successResultOf(fetch_d)
1134 self.assertIsNone(r.delegated_server)
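The new test checks that an oversized `.well-known` response disables delegation rather than erroring the whole lookup. A stdlib-only sketch of that policy (the `MAX_SIZE` value and helper name are illustrative; the real constant is `WELL_KNOWN_MAX_SIZE`, whose value is not shown in this diff):

```python
import json

MAX_SIZE = 50 * 1024  # illustrative cap, not Synapse's actual value

def delegated_server_from_well_known(body: bytes):
    """Oversized or unparseable .well-known bodies yield None (no delegation)
    instead of raising, mirroring the behaviour the test above asserts."""
    if len(body) > MAX_SIZE:
        return None
    try:
        parsed = json.loads(body)
    except ValueError:
        return None
    server = parsed.get("m.server") if isinstance(parsed, dict) else None
    return server if isinstance(server, str) else None
```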
11051135
11061136 def test_srv_fallbacks(self):
11071137 """Test that other SRV results are tried if the first one fails.
4545 handler = _AsyncTestCustomEndpoint({}, None).handle_request
4646 resource = AdditionalResource(self.hs, handler)
4747
48 request, channel = make_request(self.reactor, FakeSite(resource), "GET", "/")
48 channel = make_request(self.reactor, FakeSite(resource), "GET", "/")
4949
50 self.assertEqual(request.code, 200)
50 self.assertEqual(channel.code, 200)
5151 self.assertEqual(channel.json_body, {"some_key": "some_value_async"})
5252
5353 def test_sync(self):
5454 handler = _SyncTestCustomEndpoint({}, None).handle_request
5555 resource = AdditionalResource(self.hs, handler)
5656
57 request, channel = make_request(self.reactor, FakeSite(resource), "GET", "/")
57 channel = make_request(self.reactor, FakeSite(resource), "GET", "/")
5858
59 self.assertEqual(request.code, 200)
59 self.assertEqual(channel.code, 200)
6060 self.assertEqual(channel.json_body, {"some_key": "some_value_sync"})
1414 import logging
1515
1616 import treq
17 from netaddr import IPSet
1718
1819 from twisted.internet import interfaces # noqa: F401
1920 from twisted.internet.protocol import Factory
2021 from twisted.protocols.tls import TLSMemoryBIOFactory
2122 from twisted.web.http import HTTPChannel
2223
24 from synapse.http.client import BlacklistingReactorWrapper
2325 from synapse.http.proxyagent import ProxyAgent
2426
2527 from tests.http import TestServerTLSConnectionFactory, get_test_https_policy
291293 body = self.successResultOf(treq.content(resp))
292294 self.assertEqual(body, b"result")
293295
296 def test_http_request_via_proxy_with_blacklist(self):
297 # The blacklist includes the configured proxy IP.
298 agent = ProxyAgent(
299 BlacklistingReactorWrapper(
300 self.reactor, ip_whitelist=None, ip_blacklist=IPSet(["1.0.0.0/8"])
301 ),
302 self.reactor,
303 http_proxy=b"proxy.com:8888",
304 )
305
306 self.reactor.lookups["proxy.com"] = "1.2.3.5"
307 d = agent.request(b"GET", b"http://test.com")
308
309 # there should be a pending TCP connection
310 clients = self.reactor.tcpClients
311 self.assertEqual(len(clients), 1)
312 (host, port, client_factory, _timeout, _bindAddress) = clients[0]
313 self.assertEqual(host, "1.2.3.5")
314 self.assertEqual(port, 8888)
315
316 # make a test server, and wire up the client
317 http_server = self._make_connection(
318 client_factory, _get_test_protocol_factory()
319 )
320
321 # the FakeTransport is async, so we need to pump the reactor
322 self.reactor.advance(0)
323
324 # now there should be a pending request
325 self.assertEqual(len(http_server.requests), 1)
326
327 request = http_server.requests[0]
328 self.assertEqual(request.method, b"GET")
329 self.assertEqual(request.path, b"http://test.com")
330 self.assertEqual(request.requestHeaders.getRawHeaders(b"host"), [b"test.com"])
331 request.write(b"result")
332 request.finish()
333
334 self.reactor.advance(0)
335
336 resp = self.successResultOf(d)
337 body = self.successResultOf(treq.content(resp))
338 self.assertEqual(body, b"result")
339
340 def test_https_request_via_proxy_with_blacklist(self):
341 # The blacklist includes the configured proxy IP.
342 agent = ProxyAgent(
343 BlacklistingReactorWrapper(
344 self.reactor, ip_whitelist=None, ip_blacklist=IPSet(["1.0.0.0/8"])
345 ),
346 self.reactor,
347 contextFactory=get_test_https_policy(),
348 https_proxy=b"proxy.com",
349 )
350
351 self.reactor.lookups["proxy.com"] = "1.2.3.5"
352 d = agent.request(b"GET", b"https://test.com/abc")
353
354 # there should be a pending TCP connection
355 clients = self.reactor.tcpClients
356 self.assertEqual(len(clients), 1)
357 (host, port, client_factory, _timeout, _bindAddress) = clients[0]
358 self.assertEqual(host, "1.2.3.5")
359 self.assertEqual(port, 1080)
360
361 # make a test HTTP server, and wire up the client
362 proxy_server = self._make_connection(
363 client_factory, _get_test_protocol_factory()
364 )
365
366 # fish the transports back out so that we can do the old switcheroo
367 s2c_transport = proxy_server.transport
368 client_protocol = s2c_transport.other
369 c2s_transport = client_protocol.transport
370
371 # the FakeTransport is async, so we need to pump the reactor
372 self.reactor.advance(0)
373
374 # now there should be a pending CONNECT request
375 self.assertEqual(len(proxy_server.requests), 1)
376
377 request = proxy_server.requests[0]
378 self.assertEqual(request.method, b"CONNECT")
379 self.assertEqual(request.path, b"test.com:443")
380
381 # tell the proxy server not to close the connection
382 proxy_server.persistent = True
383
384 # this just stops the http Request trying to do a chunked response
385 # request.setHeader(b"Content-Length", b"0")
386 request.finish()
387
388 # now we can replace the proxy channel with a new, SSL-wrapped HTTP channel
389 ssl_factory = _wrap_server_factory_for_tls(_get_test_protocol_factory())
390 ssl_protocol = ssl_factory.buildProtocol(None)
391 http_server = ssl_protocol.wrappedProtocol
392
393 ssl_protocol.makeConnection(
394 FakeTransport(client_protocol, self.reactor, ssl_protocol)
395 )
396 c2s_transport.other = ssl_protocol
397
398 self.reactor.advance(0)
399
400 server_name = ssl_protocol._tlsConnection.get_servername()
401 expected_sni = b"test.com"
402 self.assertEqual(
403 server_name,
404 expected_sni,
405 "Expected SNI %s but got %s" % (expected_sni, server_name),
406 )
407
408 # now there should be a pending request
409 self.assertEqual(len(http_server.requests), 1)
410
411 request = http_server.requests[0]
412 self.assertEqual(request.method, b"GET")
413 self.assertEqual(request.path, b"/abc")
414 self.assertEqual(request.requestHeaders.getRawHeaders(b"host"), [b"test.com"])
415 request.write(b"result")
416 request.finish()
417
418 self.reactor.advance(0)
419
420 resp = self.successResultOf(d)
421 body = self.successResultOf(treq.content(resp))
422 self.assertEqual(body, b"result")
423
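Both blacklist tests configure `IPSet(["1.0.0.0/8"])` so that the proxy's resolved IP (1.2.3.5) falls inside the blocked range. The membership check can be sketched with the stdlib `ipaddress` module in place of netaddr (helper name is illustrative):

```python
import ipaddress

# Same range as the tests' IPSet(["1.0.0.0/8"]).
BLACKLIST = [ipaddress.ip_network("1.0.0.0/8")]

def is_blacklisted(ip: str) -> bool:
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in BLACKLIST)
```

Note the tests assert the request still succeeds: `ProxyAgent` connections to an explicitly configured proxy are exempt from the blacklist, which is the behaviour being pinned down.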
294424
295425 def _wrap_server_factory_for_tls(factory, sanlist=None):
296426 """Wrap an existing Protocol Factory with a test TLSMemoryBIOFactory
1717 from io import StringIO
1818
1919 from synapse.logging._terse_json import JsonFormatter, TerseJsonFormatter
20 from synapse.logging.context import LoggingContext, LoggingContextFilter
2021
2122 from tests.logging import LoggerCleanupMixin
2223 from tests.unittest import TestCase
2324
2425
2526 class TerseJsonTestCase(LoggerCleanupMixin, TestCase):
27 def setUp(self):
28 self.output = StringIO()
29
30 def get_log_line(self):
31 # One log message, with a single trailing newline.
32 data = self.output.getvalue()
33 logs = data.splitlines()
34 self.assertEqual(len(logs), 1)
35 self.assertEqual(data.count("\n"), 1)
36 return json.loads(logs[0])
37
2638 def test_terse_json_output(self):
2739 """
2840 The Terse JSON formatter converts log messages to JSON.
2941 """
30 output = StringIO()
31
32 handler = logging.StreamHandler(output)
42 handler = logging.StreamHandler(self.output)
3343 handler.setFormatter(TerseJsonFormatter())
3444 logger = self.get_logger(handler)
3545
3646 logger.info("Hello there, %s!", "wally")
3747
38 # One log message, with a single trailing newline.
39 data = output.getvalue()
40 logs = data.splitlines()
41 self.assertEqual(len(logs), 1)
42 self.assertEqual(data.count("\n"), 1)
43 log = json.loads(logs[0])
48 log = self.get_log_line()
4449
4550 # The terse logger should give us these keys.
4651 expected_log_keys = [
5661 """
5762 Additional information can be included in the structured logging.
5863 """
59 output = StringIO()
60
61 handler = logging.StreamHandler(output)
64 handler = logging.StreamHandler(self.output)
6265 handler.setFormatter(TerseJsonFormatter())
6366 logger = self.get_logger(handler)
6467
6669 "Hello there, %s!", "wally", extra={"foo": "bar", "int": 3, "bool": True}
6770 )
6871
69 # One log message, with a single trailing newline.
70 data = output.getvalue()
71 logs = data.splitlines()
72 self.assertEqual(len(logs), 1)
73 self.assertEqual(data.count("\n"), 1)
74 log = json.loads(logs[0])
72 log = self.get_log_line()
7573
7674 # The terse logger should give us these keys.
7775 expected_log_keys = [
9593 """
9694 The Terse JSON formatter converts log messages to JSON.
9795 """
98 output = StringIO()
99
100 handler = logging.StreamHandler(output)
96 handler = logging.StreamHandler(self.output)
10197 handler.setFormatter(JsonFormatter())
10298 logger = self.get_logger(handler)
10399
104100 logger.info("Hello there, %s!", "wally")
105101
106 # One log message, with a single trailing newline.
107 data = output.getvalue()
108 logs = data.splitlines()
109 self.assertEqual(len(logs), 1)
110 self.assertEqual(data.count("\n"), 1)
111 log = json.loads(logs[0])
102 log = self.get_log_line()
112103
113104 # The terse logger should give us these keys.
114105 expected_log_keys = [
118109 ]
119110 self.assertCountEqual(log.keys(), expected_log_keys)
120111 self.assertEqual(log["log"], "Hello there, wally!")
112
113 def test_with_context(self):
114 """
115 The logging context should be added to the JSON response.
116 """
117 handler = logging.StreamHandler(self.output)
118 handler.setFormatter(JsonFormatter())
119 handler.addFilter(LoggingContextFilter())
120 logger = self.get_logger(handler)
121
122 with LoggingContext(request="test"):
123 logger.info("Hello there, %s!", "wally")
124
125 log = self.get_log_line()
126
127 # The terse logger should give us these keys.
128 expected_log_keys = [
129 "log",
130 "level",
131 "namespace",
132 "request",
133 ]
134 self.assertCountEqual(log.keys(), expected_log_keys)
135 self.assertEqual(log["log"], "Hello there, wally!")
136 self.assertEqual(log["request"], "test")
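The refactored `get_log_line` helper parses exactly one JSON log line per message. A self-contained sketch of the underlying idea with a minimal JSON formatter (`JsonishFormatter` is a hypothetical stand-in, not Synapse's `TerseJsonFormatter` or `JsonFormatter`):

```python
import io
import json
import logging

class JsonishFormatter(logging.Formatter):
    """Emit each record as one JSON object per line (minimal sketch)."""
    def format(self, record):
        return json.dumps({
            "log": record.getMessage(),
            "level": record.levelname,
            "namespace": record.name,
        })

output = io.StringIO()
handler = logging.StreamHandler(output)
handler.setFormatter(JsonishFormatter())
logger = logging.getLogger("test.jsonish")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info("Hello there, %s!", "wally")

# One message -> one line of JSON, as get_log_line() above asserts.
line = json.loads(output.getvalue().splitlines()[0])
```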
208208 )
209209 pushers = list(pushers)
210210 self.assertEqual(len(pushers), 1)
211 last_stream_ordering = pushers[0]["last_stream_ordering"]
211 last_stream_ordering = pushers[0].last_stream_ordering
212212
213213 # Advance time a bit, so the pusher will register something has happened
214214 self.pump(10)
219219 )
220220 pushers = list(pushers)
221221 self.assertEqual(len(pushers), 1)
222 self.assertEqual(last_stream_ordering, pushers[0]["last_stream_ordering"])
222 self.assertEqual(last_stream_ordering, pushers[0].last_stream_ordering)
223223
224224 # One email was attempted to be sent
225225 self.assertEqual(len(self.email_attempts), 1)
237237 )
238238 pushers = list(pushers)
239239 self.assertEqual(len(pushers), 1)
240 self.assertTrue(pushers[0]["last_stream_ordering"] > last_stream_ordering)
240 self.assertTrue(pushers[0].last_stream_ordering > last_stream_ordering)
1717
1818 import synapse.rest.admin
1919 from synapse.logging.context import make_deferred_yieldable
20 from synapse.push import PusherConfigException
2021 from synapse.rest.client.v1 import login, room
2122 from synapse.rest.client.v2_alpha import receipts
2223
3334 user_id = True
3435 hijack_auth = False
3536
37 def default_config(self):
38 config = super().default_config()
39 config["start_pushers"] = True
40 return config
41
3642 def make_homeserver(self, reactor, clock):
3743 self.push_attempts = []
3844
4551
4652 m.post_json_get_json = post_json_get_json
4753
48 config = self.default_config()
49 config["start_pushers"] = True
50
51 hs = self.setup_test_homeserver(config=config, proxied_http_client=m)
54 hs = self.setup_test_homeserver(proxied_blacklisted_http_client=m)
5255
5356 return hs
57
58 def test_invalid_configuration(self):
59 """Invalid push configurations should be rejected."""
60 # Register the user who gets notified
61 user_id = self.register_user("user", "pass")
62 access_token = self.login("user", "pass")
63
64 # Register the pusher
65 user_tuple = self.get_success(
66 self.hs.get_datastore().get_user_by_access_token(access_token)
67 )
68 token_id = user_tuple.token_id
69
70 def test_data(data):
71 self.get_failure(
72 self.hs.get_pusherpool().add_pusher(
73 user_id=user_id,
74 access_token=token_id,
75 kind="http",
76 app_id="m.http",
77 app_display_name="HTTP Push Notifications",
78 device_display_name="pushy push",
79 pushkey="a@example.com",
80 lang=None,
81 data=data,
82 ),
83 PusherConfigException,
84 )
85
86 # Data must be provided with a URL.
87 test_data(None)
88 test_data({})
89 test_data({"url": 1})
90 # A bare domain name isn't accepted.
91 test_data({"url": "example.com"})
92 # A URL without a path isn't accepted.
93 test_data({"url": "http://example.com"})
94 # A url with an incorrect path isn't accepted.
95 test_data({"url": "http://example.com/foo"})
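The rejected inputs above boil down to: `data` must be a dict whose `url` is an absolute URL pointing at `/_matrix/push/v1/notify`. A hypothetical validator sketching those rules (Synapse's real check lives in the HTTP pusher and raises `PusherConfigException`, not `ValueError`):

```python
from urllib.parse import urlparse

def validate_push_data(data):
    """Hypothetical sketch of the validation test_invalid_configuration exercises."""
    if not isinstance(data, dict) or not isinstance(data.get("url"), str):
        raise ValueError("'data' must be a dict with a string 'url'")
    parsed = urlparse(data["url"])
    if not parsed.scheme or not parsed.netloc:
        raise ValueError("'url' must be an absolute URL")
    if parsed.path != "/_matrix/push/v1/notify":
        raise ValueError("'url' must point at /_matrix/push/v1/notify")

# The accepted form used throughout the updated tests:
validate_push_data({"url": "http://example.com/_matrix/push/v1/notify"})

# Every malformed input from the test is rejected:
rejected = []
for bad in (None, {}, {"url": 1}, {"url": "example.com"},
            {"url": "http://example.com"}, {"url": "http://example.com/foo"}):
    try:
        validate_push_data(bad)
    except ValueError:
        rejected.append(bad)
```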
5496
5597 def test_sends_http(self):
5698 """
81123 device_display_name="pushy push",
82124 pushkey="a@example.com",
83125 lang=None,
84 data={"url": "example.com"},
126 data={"url": "http://example.com/_matrix/push/v1/notify"},
85127 )
86128 )
87129
101143 )
102144 pushers = list(pushers)
103145 self.assertEqual(len(pushers), 1)
104 last_stream_ordering = pushers[0]["last_stream_ordering"]
146 last_stream_ordering = pushers[0].last_stream_ordering
105147
106148 # Advance time a bit, so the pusher will register something has happened
107149 self.pump()
112154 )
113155 pushers = list(pushers)
114156 self.assertEqual(len(pushers), 1)
115 self.assertEqual(last_stream_ordering, pushers[0]["last_stream_ordering"])
157 self.assertEqual(last_stream_ordering, pushers[0].last_stream_ordering)
116158
117159 # One push was attempted to be sent -- it'll be the first message
118160 self.assertEqual(len(self.push_attempts), 1)
119 self.assertEqual(self.push_attempts[0][1], "example.com")
161 self.assertEqual(
162 self.push_attempts[0][1], "http://example.com/_matrix/push/v1/notify"
163 )
120164 self.assertEqual(
121165 self.push_attempts[0][2]["notification"]["content"]["body"], "Hi!"
122166 )
131175 )
132176 pushers = list(pushers)
133177 self.assertEqual(len(pushers), 1)
134 self.assertTrue(pushers[0]["last_stream_ordering"] > last_stream_ordering)
135 last_stream_ordering = pushers[0]["last_stream_ordering"]
178 self.assertTrue(pushers[0].last_stream_ordering > last_stream_ordering)
179 last_stream_ordering = pushers[0].last_stream_ordering
136180
137181 # Now it'll try and send the second push message, which will be the second one
138182 self.assertEqual(len(self.push_attempts), 2)
139 self.assertEqual(self.push_attempts[1][1], "example.com")
183 self.assertEqual(
184 self.push_attempts[1][1], "http://example.com/_matrix/push/v1/notify"
185 )
140186 self.assertEqual(
141187 self.push_attempts[1][2]["notification"]["content"]["body"], "There!"
142188 )
151197 )
152198 pushers = list(pushers)
153199 self.assertEqual(len(pushers), 1)
154 self.assertTrue(pushers[0]["last_stream_ordering"] > last_stream_ordering)
200 self.assertTrue(pushers[0].last_stream_ordering > last_stream_ordering)
155201
156202 def test_sends_high_priority_for_encrypted(self):
157203 """
193239 device_display_name="pushy push",
194240 pushkey="a@example.com",
195241 lang=None,
196 data={"url": "example.com"},
242 data={"url": "http://example.com/_matrix/push/v1/notify"},
197243 )
198244 )
199245
229275
230276 # Check our push made it with high priority
231277 self.assertEqual(len(self.push_attempts), 1)
232 self.assertEqual(self.push_attempts[0][1], "example.com")
278 self.assertEqual(
279 self.push_attempts[0][1], "http://example.com/_matrix/push/v1/notify"
280 )
233281 self.assertEqual(self.push_attempts[0][2]["notification"]["prio"], "high")
234282
235283 # Add yet another person — we want to make this room not a 1:1
267315 # Advance time a bit, so the pusher will register something has happened
268316 self.pump()
269317 self.assertEqual(len(self.push_attempts), 2)
270 self.assertEqual(self.push_attempts[1][1], "example.com")
318 self.assertEqual(
319 self.push_attempts[1][1], "http://example.com/_matrix/push/v1/notify"
320 )
271321 self.assertEqual(self.push_attempts[1][2]["notification"]["prio"], "high")
272322
273323 def test_sends_high_priority_for_one_to_one_only(self):
309359 device_display_name="pushy push",
310360 pushkey="a@example.com",
311361 lang=None,
312 data={"url": "example.com"},
362 data={"url": "http://example.com/_matrix/push/v1/notify"},
313363 )
314364 )
315365
325375
326376 # Check our push made it with high priority — this is a one-to-one room
327377 self.assertEqual(len(self.push_attempts), 1)
328 self.assertEqual(self.push_attempts[0][1], "example.com")
378 self.assertEqual(
379 self.push_attempts[0][1], "http://example.com/_matrix/push/v1/notify"
380 )
329381 self.assertEqual(self.push_attempts[0][2]["notification"]["prio"], "high")
330382
331383 # Yet another user joins
344396 # Advance time a bit, so the pusher will register something has happened
345397 self.pump()
346398 self.assertEqual(len(self.push_attempts), 2)
347 self.assertEqual(self.push_attempts[1][1], "example.com")
399 self.assertEqual(
400 self.push_attempts[1][1], "http://example.com/_matrix/push/v1/notify"
401 )
348402
349403 # check that this is low-priority
350404 self.assertEqual(self.push_attempts[1][2]["notification"]["prio"], "low")
391445 device_display_name="pushy push",
392446 pushkey="a@example.com",
393447 lang=None,
394 data={"url": "example.com"},
448 data={"url": "http://example.com/_matrix/push/v1/notify"},
395449 )
396450 )
397451
407461
408462 # Check our push made it with high priority
409463 self.assertEqual(len(self.push_attempts), 1)
410 self.assertEqual(self.push_attempts[0][1], "example.com")
464 self.assertEqual(
465 self.push_attempts[0][1], "http://example.com/_matrix/push/v1/notify"
466 )
411467 self.assertEqual(self.push_attempts[0][2]["notification"]["prio"], "high")
412468
413469 # Send another event, this time with no mention
416472 # Advance time a bit, so the pusher will register something has happened
417473 self.pump()
418474 self.assertEqual(len(self.push_attempts), 2)
419 self.assertEqual(self.push_attempts[1][1], "example.com")
475 self.assertEqual(
476 self.push_attempts[1][1], "http://example.com/_matrix/push/v1/notify"
477 )
420478
421479 # check that this is low-priority
422480 self.assertEqual(self.push_attempts[1][2]["notification"]["prio"], "low")
464522 device_display_name="pushy push",
465523 pushkey="a@example.com",
466524 lang=None,
467 data={"url": "example.com"},
525 data={"url": "http://example.com/_matrix/push/v1/notify"},
468526 )
469527 )
470528
484542
485543 # Check our push made it with high priority
486544 self.assertEqual(len(self.push_attempts), 1)
487 self.assertEqual(self.push_attempts[0][1], "example.com")
545 self.assertEqual(
546 self.push_attempts[0][1], "http://example.com/_matrix/push/v1/notify"
547 )
488548 self.assertEqual(self.push_attempts[0][2]["notification"]["prio"], "high")
489549
490550 # Send another event, this time as someone without the power of @room
495555 # Advance time a bit, so the pusher will register something has happened
496556 self.pump()
497557 self.assertEqual(len(self.push_attempts), 2)
498 self.assertEqual(self.push_attempts[1][1], "example.com")
558 self.assertEqual(
559 self.push_attempts[1][1], "http://example.com/_matrix/push/v1/notify"
560 )
499561
500562 # check that this is low-priority
501563 self.assertEqual(self.push_attempts[1][2]["notification"]["prio"], "low")
569631 device_display_name="pushy push",
570632 pushkey="a@example.com",
571633 lang=None,
572 data={"url": "example.com"},
634 data={"url": "http://example.com/_matrix/push/v1/notify"},
573635 )
574636 )
575637
588650
589651 # Check our push made it
590652 self.assertEqual(len(self.push_attempts), 1)
591 self.assertEqual(self.push_attempts[0][1], "example.com")
653 self.assertEqual(
654 self.push_attempts[0][1], "http://example.com/_matrix/push/v1/notify"
655 )
592656
593657 # Check that the unread count for the room is 0
594658 #
602666 # This will actually trigger a new notification to be sent out so that
603667 # even if the user does not receive another message, their unread
604668 # count goes down
605 request, channel = self.make_request(
669 channel = self.make_request(
606670 "POST",
607671 "/rooms/%s/receipt/m.read/%s" % (room_id, first_message_event_id),
608672 {},
1212 # See the License for the specific language governing permissions and
1313 # limitations under the License.
1414 import logging
15 from typing import Any, Callable, List, Optional, Tuple
15 from typing import Any, Callable, Dict, List, Optional, Tuple
1616
1717 import attr
1818
2020 from twisted.internet.protocol import Protocol
2121 from twisted.internet.task import LoopingCall
2222 from twisted.web.http import HTTPChannel
23 from twisted.web.resource import Resource
2324
2425 from synapse.app.generic_worker import (
2526 GenericWorkerReplicationHandler,
2728 )
2829 from synapse.http.server import JsonResource
2930 from synapse.http.site import SynapseRequest, SynapseSite
30 from synapse.replication.http import ReplicationRestResource, streams
31 from synapse.replication.http import ReplicationRestResource
3132 from synapse.replication.tcp.handler import ReplicationCommandHandler
3233 from synapse.replication.tcp.protocol import ClientReplicationStreamProtocol
3334 from synapse.replication.tcp.resource import ReplicationStreamProtocolFactory
5354 if not hiredis:
5455 skip = "Requires hiredis"
5556
56 servlets = [
57 streams.register_servlets,
58 ]
59
6057 def prepare(self, reactor, clock, hs):
6158 # build a replication server
6259 server_factory = ReplicationStreamProtocolFactory(hs)
6663 # Make a new HomeServer object for the worker
6764 self.reactor.lookups["testserv"] = "1.2.3.4"
6865 self.worker_hs = self.setup_test_homeserver(
69 http_client=None,
66 federation_http_client=None,
7067 homeserver_to_use=GenericWorkerServer,
7168 config=self._get_worker_hs_config(),
7269 reactor=self.reactor,
8683
8784 self._client_transport = None
8885 self._server_transport = None
86
87 def create_resource_dict(self) -> Dict[str, Resource]:
88 d = super().create_resource_dict()
89 d["/_synapse/replication"] = ReplicationRestResource(self.hs)
90 return d
8991
9092 def _get_worker_hs_config(self) -> dict:
9193 config = self.default_config()
263265 worker_app: Type of worker, e.g. `synapse.app.federation_sender`.
264266 extra_config: Any extra config to use for this instance.
265267 **kwargs: Options that get passed to `self.setup_test_homeserver`,
266 useful to e.g. pass some mocks for things like `http_client`
268 useful to e.g. pass some mocks for things like `federation_http_client`
267269
268270 Returns:
269271 The new worker HomeServer instance.
0 # -*- coding: utf-8 -*-
1 # Copyright 2020 The Matrix.org Foundation C.I.C.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14 import logging
15
16 from synapse.rest.client.v2_alpha import register
17
18 from tests.replication._base import BaseMultiWorkerStreamTestCase
19 from tests.server import FakeChannel, make_request
20 from tests.unittest import override_config
21
22 logger = logging.getLogger(__name__)
23
24
25 class WorkerAuthenticationTestCase(BaseMultiWorkerStreamTestCase):
26 """Test the authentication of HTTP calls between workers."""
27
28 servlets = [register.register_servlets]
29
30 def make_homeserver(self, reactor, clock):
31 config = self.default_config()
32 # This isn't a real configuration option but is used to give the main
33 # homeserver and worker homeserver different options.
34 main_replication_secret = config.pop("main_replication_secret", None)
35 if main_replication_secret:
36 config["worker_replication_secret"] = main_replication_secret
37 return self.setup_test_homeserver(config=config)
38
39 def _get_worker_hs_config(self) -> dict:
40 config = self.default_config()
41 config["worker_app"] = "synapse.app.client_reader"
42 config["worker_replication_host"] = "testserv"
43 config["worker_replication_http_port"] = "8765"
44
45 return config
46
47 def _test_register(self) -> FakeChannel:
48 """Run the actual test:
49
50 1. Create a worker homeserver.
51 2. Start registration by providing a user/password.
52 3. Complete registration by providing dummy auth (this hits the main synapse).
53 4. Return the final request.
54
55 """
56 worker_hs = self.make_worker_hs("synapse.app.client_reader")
57 site = self._hs_to_site[worker_hs]
58
59 channel_1 = make_request(
60 self.reactor,
61 site,
62 "POST",
63 "register",
64 {"username": "user", "type": "m.login.password", "password": "bar"},
65 )
66 self.assertEqual(channel_1.code, 401)
67
68 # Grab the session
69 session = channel_1.json_body["session"]
70
71 # also complete the dummy auth
72 return make_request(
73 self.reactor,
74 site,
75 "POST",
76 "register",
77 {"auth": {"session": session, "type": "m.login.dummy"}},
78 )
79
80 def test_no_auth(self):
81 """With no authentication the request should finish.
82 """
83 channel = self._test_register()
84 self.assertEqual(channel.code, 200)
85
86 # We're given a registered user.
87 self.assertEqual(channel.json_body["user_id"], "@user:test")
88
89 @override_config({"main_replication_secret": "my-secret"})
90 def test_missing_auth(self):
91 """If the main process expects a secret that is not provided, an error results.
92 """
93 channel = self._test_register()
94 self.assertEqual(channel.code, 500)
95
96 @override_config(
97 {
98 "main_replication_secret": "my-secret",
99 "worker_replication_secret": "wrong-secret",
100 }
101 )
102 def test_unauthorized(self):
103 """If the main process receives the wrong secret, an error results.
104 """
105 channel = self._test_register()
106 self.assertEqual(channel.code, 500)
107
108 @override_config({"worker_replication_secret": "my-secret"})
109 def test_authorized(self):
110 """The request should finish when the worker provides the authentication header.
111 """
112 channel = self._test_register()
113 self.assertEqual(channel.code, 200)
114
115 # We're given a registered user.
116 self.assertEqual(channel.json_body["user_id"], "@user:test")
1313 # limitations under the License.
1414 import logging
1515
16 from synapse.api.constants import LoginType
17 from synapse.http.site import SynapseRequest
1816 from synapse.rest.client.v2_alpha import register
1917
2018 from tests.replication._base import BaseMultiWorkerStreamTestCase
21 from tests.rest.client.v2_alpha.test_auth import DummyRecaptchaChecker
22 from tests.server import FakeChannel, make_request
19 from tests.server import make_request
2320
2421 logger = logging.getLogger(__name__)
2522
2623
2724 class ClientReaderTestCase(BaseMultiWorkerStreamTestCase):
28 """Base class for tests of the replication streams"""
25 """Test using one or more client readers for registration."""
2926
3027 servlets = [register.register_servlets]
31
32 def prepare(self, reactor, clock, hs):
33 self.recaptcha_checker = DummyRecaptchaChecker(hs)
34 auth_handler = hs.get_auth_handler()
35 auth_handler.checkers[LoginType.RECAPTCHA] = self.recaptcha_checker
3628
3729 def _get_worker_hs_config(self) -> dict:
3830 config = self.default_config()
4739 worker_hs = self.make_worker_hs("synapse.app.client_reader")
4840 site = self._hs_to_site[worker_hs]
4941
50 request_1, channel_1 = make_request(
42 channel_1 = make_request(
5143 self.reactor,
5244 site,
5345 "POST",
5446 "register",
5547 {"username": "user", "type": "m.login.password", "password": "bar"},
56 ) # type: SynapseRequest, FakeChannel
57 self.assertEqual(request_1.code, 401)
48 )
49 self.assertEqual(channel_1.code, 401)
5850
5951 # Grab the session
6052 session = channel_1.json_body["session"]
6153
6254 # also complete the dummy auth
63 request_2, channel_2 = make_request(
55 channel_2 = make_request(
6456 self.reactor,
6557 site,
6658 "POST",
6759 "register",
6860 {"auth": {"session": session, "type": "m.login.dummy"}},
69 ) # type: SynapseRequest, FakeChannel
70 self.assertEqual(request_2.code, 200)
61 )
62 self.assertEqual(channel_2.code, 200)
7163
7264 # We're given a registered user.
7365 self.assertEqual(channel_2.json_body["user_id"], "@user:test")
7971 worker_hs_2 = self.make_worker_hs("synapse.app.client_reader")
8072
8173 site_1 = self._hs_to_site[worker_hs_1]
82 request_1, channel_1 = make_request(
74 channel_1 = make_request(
8375 self.reactor,
8476 site_1,
8577 "POST",
8678 "register",
8779 {"username": "user", "type": "m.login.password", "password": "bar"},
88 ) # type: SynapseRequest, FakeChannel
89 self.assertEqual(request_1.code, 401)
80 )
81 self.assertEqual(channel_1.code, 401)
9082
9183 # Grab the session
9284 session = channel_1.json_body["session"]
9385
9486 # also complete the dummy auth
9587 site_2 = self._hs_to_site[worker_hs_2]
96 request_2, channel_2 = make_request(
88 channel_2 = make_request(
9789 self.reactor,
9890 site_2,
9991 "POST",
10092 "register",
10193 {"auth": {"session": session, "type": "m.login.dummy"}},
102 ) # type: SynapseRequest, FakeChannel
103 self.assertEqual(request_2.code, 200)
94 )
95 self.assertEqual(channel_2.code, 200)
10496
10597 # We're given a registered user.
10698 self.assertEqual(channel_2.json_body["user_id"], "@user:test")
4949 self.make_worker_hs(
5050 "synapse.app.federation_sender",
5151 {"send_federation": True},
52 http_client=mock_client,
52 federation_http_client=mock_client,
5353 )
5454
5555 user = self.register_user("user", "pass")
8080 "worker_name": "sender1",
8181 "federation_sender_instances": ["sender1", "sender2"],
8282 },
83 http_client=mock_client1,
83 federation_http_client=mock_client1,
8484 )
8585
8686 mock_client2 = Mock(spec=["put_json"])
9292 "worker_name": "sender2",
9393 "federation_sender_instances": ["sender1", "sender2"],
9494 },
95 http_client=mock_client2,
95 federation_http_client=mock_client2,
9696 )
9797
9898 user = self.register_user("user2", "pass")
143143 "worker_name": "sender1",
144144 "federation_sender_instances": ["sender1", "sender2"],
145145 },
146 http_client=mock_client1,
146 federation_http_client=mock_client1,
147147 )
148148
149149 mock_client2 = Mock(spec=["put_json"])
155155 "worker_name": "sender2",
156156 "federation_sender_instances": ["sender1", "sender2"],
157157 },
158 http_client=mock_client2,
158 federation_http_client=mock_client2,
159159 )
160160
161161 user = self.register_user("user3", "pass")
4747 self.user_id = self.register_user("user", "pass")
4848 self.access_token = self.login("user", "pass")
4949
50 self.reactor.lookups["example.com"] = "127.0.0.2"
50 self.reactor.lookups["example.com"] = "1.2.3.4"
5151
5252 def default_config(self):
5353 conf = super().default_config()
6767 the media which the caller should respond to.
6868 """
6969 resource = hs.get_media_repository_resource().children[b"download"]
70 _, channel = make_request(
70 channel = make_request(
7171 self.reactor,
7272 FakeSite(resource),
7373 "GET",
6666 device_display_name="pushy push",
6767 pushkey="a@example.com",
6868 lang=None,
69 data={"url": "https://push.example.com/push"},
69 data={"url": "https://push.example.com/_matrix/push/v1/notify"},
7070 )
7171 )
7272
9797 self.make_worker_hs(
9898 "synapse.app.pusher",
9999 {"start_pushers": True},
100 proxied_http_client=http_client_mock,
100 proxied_blacklisted_http_client=http_client_mock,
101101 )
102102
103103 event_id = self._create_pusher_and_send_msg("user")
108108 http_client_mock.post_json_get_json.assert_called_once()
109109 self.assertEqual(
110110 http_client_mock.post_json_get_json.call_args[0][0],
111 "https://push.example.com/push",
111 "https://push.example.com/_matrix/push/v1/notify",
112112 )
113113 self.assertEqual(
114114 event_id,
132132 "worker_name": "pusher1",
133133 "pusher_instances": ["pusher1", "pusher2"],
134134 },
135 proxied_http_client=http_client_mock1,
135 proxied_blacklisted_http_client=http_client_mock1,
136136 )
137137
138138 http_client_mock2 = Mock(spec_set=["post_json_get_json"])
147147 "worker_name": "pusher2",
148148 "pusher_instances": ["pusher1", "pusher2"],
149149 },
150 proxied_http_client=http_client_mock2,
150 proxied_blacklisted_http_client=http_client_mock2,
151151 )
152152
153153 # We choose a user name that we know should go to pusher1.
160160 http_client_mock2.post_json_get_json.assert_not_called()
161161 self.assertEqual(
162162 http_client_mock1.post_json_get_json.call_args[0][0],
163 "https://push.example.com/push",
163 "https://push.example.com/_matrix/push/v1/notify",
164164 )
165165 self.assertEqual(
166166 event_id,
182182 http_client_mock2.post_json_get_json.assert_called_once()
183183 self.assertEqual(
184184 http_client_mock2.post_json_get_json.call_args[0][0],
185 "https://push.example.com/push",
185 "https://push.example.com/_matrix/push/v1/notify",
186186 )
187187 self.assertEqual(
188188 event_id,
179179 )
180180
181181 # Do an initial sync so that we're up to date.
182 request, channel = make_request(
182 channel = make_request(
183183 self.reactor, sync_hs_site, "GET", "/sync", access_token=access_token
184184 )
185185 next_batch = channel.json_body["next_batch"]
205205
206206 # Check that syncing still gets the new event, despite the gap in the
207207 # stream IDs.
208 request, channel = make_request(
208 channel = make_request(
209209 self.reactor,
210210 sync_hs_site,
211211 "GET",
235235 response = self.helper.send(room_id2, body="Hi!", tok=self.other_access_token)
236236 first_event_in_room2 = response["event_id"]
237237
238 request, channel = make_request(
238 channel = make_request(
239239 self.reactor,
240240 sync_hs_site,
241241 "GET",
260260 self.helper.send(room_id1, body="Hi again!", tok=self.other_access_token)
261261 self.helper.send(room_id2, body="Hi again!", tok=self.other_access_token)
262262
263 request, channel = make_request(
263 channel = make_request(
264264 self.reactor,
265265 sync_hs_site,
266266 "GET",
278278 # Paginating back in the first room should not produce any results, as
279279 # no events have happened in it. This tests that we are correctly
280280 # filtering results based on the vector clock portion.
281 request, channel = make_request(
281 channel = make_request(
282282 self.reactor,
283283 sync_hs_site,
284284 "GET",
291291
292292 # Paginating back on the second room should produce the first event
293293 # again. This tests that pagination isn't completely broken.
294 request, channel = make_request(
294 channel = make_request(
295295 self.reactor,
296296 sync_hs_site,
297297 "GET",
306306 )
307307
308308 # Paginating forwards should give the same results
309 request, channel = make_request(
309 channel = make_request(
310310 self.reactor,
311311 sync_hs_site,
312312 "GET",
317317 )
318318 self.assertListEqual([], channel.json_body["chunk"])
319319
320 request, channel = make_request(
320 channel = make_request(
321321 self.reactor,
322322 sync_hs_site,
323323 "GET",
4141 return resource
4242
4343 def test_version_string(self):
44 request, channel = self.make_request("GET", self.url, shorthand=False)
44 channel = self.make_request("GET", self.url, shorthand=False)
4545
4646 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
4747 self.assertEqual(
6767
6868 def test_delete_group(self):
6969 # Create a new group
70 request, channel = self.make_request(
70 channel = self.make_request(
7171 "POST",
7272 "/create_group".encode("ascii"),
7373 access_token=self.admin_user_tok,
8383 # Invite/join another user
8484
8585 url = "/groups/%s/admin/users/invite/%s" % (group_id, self.other_user)
86 request, channel = self.make_request(
86 channel = self.make_request(
8787 "PUT", url.encode("ascii"), access_token=self.admin_user_tok, content={}
8888 )
8989 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
9090
9191 url = "/groups/%s/self/accept_invite" % (group_id,)
92 request, channel = self.make_request(
92 channel = self.make_request(
9393 "PUT", url.encode("ascii"), access_token=self.other_user_token, content={}
9494 )
9595 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
100100
101101 # Now delete the group
102102 url = "/_synapse/admin/v1/delete_group/" + group_id
103 request, channel = self.make_request(
103 channel = self.make_request(
104104 "POST",
105105 url.encode("ascii"),
106106 access_token=self.admin_user_tok,
122122 """
123123
124124 url = "/groups/%s/profile" % (group_id,)
125 request, channel = self.make_request(
125 channel = self.make_request(
126126 "GET", url.encode("ascii"), access_token=self.admin_user_tok
127127 )
128128
133133 def _get_groups_user_is_in(self, access_token):
134134 """Returns the list of groups the user is in (given their access token)
135135 """
136 request, channel = self.make_request(
136 channel = self.make_request(
137137 "GET", "/joined_groups".encode("ascii"), access_token=access_token
138138 )
139139
209209 }
210210 config["media_storage_providers"] = [provider_config]
211211
212 hs = self.setup_test_homeserver(config=config, http_client=client)
212 hs = self.setup_test_homeserver(config=config, federation_http_client=client)
213213
214214 return hs
215215
216216 def _ensure_quarantined(self, admin_user_tok, server_and_media_id):
217217 """Ensure a piece of media is quarantined when trying to access it."""
218 request, channel = make_request(
218 channel = make_request(
219219 self.reactor,
220220 FakeSite(self.download_resource),
221221 "GET",
240240
241241 # Attempt quarantine media APIs as non-admin
242242 url = "/_synapse/admin/v1/media/quarantine/example.org/abcde12345"
243 request, channel = self.make_request(
243 channel = self.make_request(
244244 "POST", url.encode("ascii"), access_token=non_admin_user_tok,
245245 )
246246
253253
254254 # And the roomID/userID endpoint
255255 url = "/_synapse/admin/v1/room/!room%3Aexample.com/media/quarantine"
256 request, channel = self.make_request(
256 channel = self.make_request(
257257 "POST", url.encode("ascii"), access_token=non_admin_user_tok,
258258 )
259259
281281 server_name, media_id = server_name_and_media_id.split("/")
282282
283283 # Attempt to access the media
284 request, channel = make_request(
284 channel = make_request(
285285 self.reactor,
286286 FakeSite(self.download_resource),
287287 "GET",
298298 urllib.parse.quote(server_name),
299299 urllib.parse.quote(media_id),
300300 )
301 request, channel = self.make_request("POST", url, access_token=admin_user_tok,)
301 channel = self.make_request("POST", url, access_token=admin_user_tok,)
302302 self.pump(1.0)
303303 self.assertEqual(200, int(channel.code), msg=channel.result["body"])
304304
350350 url = "/_synapse/admin/v1/room/%s/media/quarantine" % urllib.parse.quote(
351351 room_id
352352 )
353 request, channel = self.make_request("POST", url, access_token=admin_user_tok,)
353 channel = self.make_request("POST", url, access_token=admin_user_tok,)
354354 self.pump(1.0)
355355 self.assertEqual(200, int(channel.code), msg=channel.result["body"])
356356 self.assertEqual(
394394 url = "/_synapse/admin/v1/user/%s/media/quarantine" % urllib.parse.quote(
395395 non_admin_user
396396 )
397 request, channel = self.make_request(
397 channel = self.make_request(
398398 "POST", url.encode("ascii"), access_token=admin_user_tok,
399399 )
400400 self.pump(1.0)
436436 url = "/_synapse/admin/v1/user/%s/media/quarantine" % urllib.parse.quote(
437437 non_admin_user
438438 )
439 request, channel = self.make_request(
439 channel = self.make_request(
440440 "POST", url.encode("ascii"), access_token=admin_user_tok,
441441 )
442442 self.pump(1.0)
452452 self._ensure_quarantined(admin_user_tok, server_and_media_id_1)
453453
454454 # Attempt to access each piece of media
455 request, channel = make_request(
455 channel = make_request(
456456 self.reactor,
457457 FakeSite(self.download_resource),
458458 "GET",
4949 """
5050 Try to get a device of a user without authentication.
5151 """
52 request, channel = self.make_request("GET", self.url, b"{}")
52 channel = self.make_request("GET", self.url, b"{}")
5353
5454 self.assertEqual(401, int(channel.result["code"]), msg=channel.result["body"])
5555 self.assertEqual(Codes.MISSING_TOKEN, channel.json_body["errcode"])
5656
57 request, channel = self.make_request("PUT", self.url, b"{}")
57 channel = self.make_request("PUT", self.url, b"{}")
5858
5959 self.assertEqual(401, int(channel.result["code"]), msg=channel.result["body"])
6060 self.assertEqual(Codes.MISSING_TOKEN, channel.json_body["errcode"])
6161
62 request, channel = self.make_request("DELETE", self.url, b"{}")
62 channel = self.make_request("DELETE", self.url, b"{}")
6363
6464 self.assertEqual(401, int(channel.result["code"]), msg=channel.result["body"])
6565 self.assertEqual(Codes.MISSING_TOKEN, channel.json_body["errcode"])
6868 """
6969 If the user is not a server admin, an error is returned.
7070 """
71 request, channel = self.make_request(
71 channel = self.make_request(
7272 "GET", self.url, access_token=self.other_user_token,
7373 )
7474
7575 self.assertEqual(403, int(channel.result["code"]), msg=channel.result["body"])
7676 self.assertEqual(Codes.FORBIDDEN, channel.json_body["errcode"])
7777
78 request, channel = self.make_request(
78 channel = self.make_request(
7979 "PUT", self.url, access_token=self.other_user_token,
8080 )
8181
8282 self.assertEqual(403, int(channel.result["code"]), msg=channel.result["body"])
8383 self.assertEqual(Codes.FORBIDDEN, channel.json_body["errcode"])
8484
85 request, channel = self.make_request(
85 channel = self.make_request(
8686 "DELETE", self.url, access_token=self.other_user_token,
8787 )
8888
9898 % self.other_user_device_id
9999 )
100100
101 request, channel = self.make_request(
102 "GET", url, access_token=self.admin_user_tok,
103 )
104
105 self.assertEqual(404, channel.code, msg=channel.json_body)
106 self.assertEqual(Codes.NOT_FOUND, channel.json_body["errcode"])
107
108 request, channel = self.make_request(
109 "PUT", url, access_token=self.admin_user_tok,
110 )
111
112 self.assertEqual(404, channel.code, msg=channel.json_body)
113 self.assertEqual(Codes.NOT_FOUND, channel.json_body["errcode"])
114
115 request, channel = self.make_request(
116 "DELETE", url, access_token=self.admin_user_tok,
117 )
101 channel = self.make_request("GET", url, access_token=self.admin_user_tok,)
102
103 self.assertEqual(404, channel.code, msg=channel.json_body)
104 self.assertEqual(Codes.NOT_FOUND, channel.json_body["errcode"])
105
106 channel = self.make_request("PUT", url, access_token=self.admin_user_tok,)
107
108 self.assertEqual(404, channel.code, msg=channel.json_body)
109 self.assertEqual(Codes.NOT_FOUND, channel.json_body["errcode"])
110
111 channel = self.make_request("DELETE", url, access_token=self.admin_user_tok,)
118112
119113 self.assertEqual(404, channel.code, msg=channel.json_body)
120114 self.assertEqual(Codes.NOT_FOUND, channel.json_body["errcode"])
128122 % self.other_user_device_id
129123 )
130124
131 request, channel = self.make_request(
132 "GET", url, access_token=self.admin_user_tok,
133 )
125 channel = self.make_request("GET", url, access_token=self.admin_user_tok,)
134126
135127 self.assertEqual(400, channel.code, msg=channel.json_body)
136128 self.assertEqual("Can only lookup local users", channel.json_body["error"])
137129
138 request, channel = self.make_request(
139 "PUT", url, access_token=self.admin_user_tok,
140 )
130 channel = self.make_request("PUT", url, access_token=self.admin_user_tok,)
141131
142132 self.assertEqual(400, channel.code, msg=channel.json_body)
143133 self.assertEqual("Can only lookup local users", channel.json_body["error"])
144134
145 request, channel = self.make_request(
146 "DELETE", url, access_token=self.admin_user_tok,
147 )
135 channel = self.make_request("DELETE", url, access_token=self.admin_user_tok,)
148136
149137 self.assertEqual(400, channel.code, msg=channel.json_body)
150138 self.assertEqual("Can only lookup local users", channel.json_body["error"])
157145 self.other_user
158146 )
159147
160 request, channel = self.make_request(
161 "GET", url, access_token=self.admin_user_tok,
162 )
163
164 self.assertEqual(404, channel.code, msg=channel.json_body)
165 self.assertEqual(Codes.NOT_FOUND, channel.json_body["errcode"])
166
167 request, channel = self.make_request(
168 "PUT", url, access_token=self.admin_user_tok,
169 )
170
171 self.assertEqual(200, channel.code, msg=channel.json_body)
172
173 request, channel = self.make_request(
174 "DELETE", url, access_token=self.admin_user_tok,
175 )
148 channel = self.make_request("GET", url, access_token=self.admin_user_tok,)
149
150 self.assertEqual(404, channel.code, msg=channel.json_body)
151 self.assertEqual(Codes.NOT_FOUND, channel.json_body["errcode"])
152
153 channel = self.make_request("PUT", url, access_token=self.admin_user_tok,)
154
155 self.assertEqual(200, channel.code, msg=channel.json_body)
156
157 channel = self.make_request("DELETE", url, access_token=self.admin_user_tok,)
176158
177159 # Delete unknown device returns status 200
178160 self.assertEqual(200, channel.code, msg=channel.json_body)
196178 }
197179
198180 body = json.dumps(update)
199 request, channel = self.make_request(
181 channel = self.make_request(
200182 "PUT",
201183 self.url,
202184 access_token=self.admin_user_tok,
207189 self.assertEqual(Codes.TOO_LARGE, channel.json_body["errcode"])
208190
209191 # Ensure the display name was not updated.
210 request, channel = self.make_request(
211 "GET", self.url, access_token=self.admin_user_tok,
212 )
192 channel = self.make_request("GET", self.url, access_token=self.admin_user_tok,)
213193
214194 self.assertEqual(200, channel.code, msg=channel.json_body)
215195 self.assertEqual("new display", channel.json_body["display_name"])
226206 )
227207 )
228208
229 request, channel = self.make_request(
230 "PUT", self.url, access_token=self.admin_user_tok,
231 )
209 channel = self.make_request("PUT", self.url, access_token=self.admin_user_tok,)
232210
233211 self.assertEqual(200, channel.code, msg=channel.json_body)
234212
235213 # Ensure the display name was not updated.
236 request, channel = self.make_request(
237 "GET", self.url, access_token=self.admin_user_tok,
238 )
214 channel = self.make_request("GET", self.url, access_token=self.admin_user_tok,)
239215
240216 self.assertEqual(200, channel.code, msg=channel.json_body)
241217 self.assertEqual("new display", channel.json_body["display_name"])
246222 """
247223 # Set new display_name
248224 body = json.dumps({"display_name": "new displayname"})
249 request, channel = self.make_request(
225 channel = self.make_request(
250226 "PUT",
251227 self.url,
252228 access_token=self.admin_user_tok,
256232 self.assertEqual(200, channel.code, msg=channel.json_body)
257233
258234 # Check new display_name
259 request, channel = self.make_request(
260 "GET", self.url, access_token=self.admin_user_tok,
261 )
235 channel = self.make_request("GET", self.url, access_token=self.admin_user_tok,)
262236
263237 self.assertEqual(200, channel.code, msg=channel.json_body)
264238 self.assertEqual("new displayname", channel.json_body["display_name"])
267241 """
268242 Tests that a normal lookup for a device is successful
269243 """
270 request, channel = self.make_request(
271 "GET", self.url, access_token=self.admin_user_tok,
272 )
244 channel = self.make_request("GET", self.url, access_token=self.admin_user_tok,)
273245
274246 self.assertEqual(200, channel.code, msg=channel.json_body)
275247 self.assertEqual(self.other_user, channel.json_body["user_id"])
290262 self.assertEqual(1, number_devices)
291263
292264 # Delete device
293 request, channel = self.make_request(
265 channel = self.make_request(
294266 "DELETE", self.url, access_token=self.admin_user_tok,
295267 )
296268
322294 """
324296 Try to list devices of a user without authentication.
324296 """
325 request, channel = self.make_request("GET", self.url, b"{}")
297 channel = self.make_request("GET", self.url, b"{}")
326298
327299 self.assertEqual(401, int(channel.result["code"]), msg=channel.result["body"])
328300 self.assertEqual(Codes.MISSING_TOKEN, channel.json_body["errcode"])
333305 """
334306 other_user_token = self.login("user", "pass")
335307
336 request, channel = self.make_request(
337 "GET", self.url, access_token=other_user_token,
338 )
308 channel = self.make_request("GET", self.url, access_token=other_user_token,)
339309
340310 self.assertEqual(403, int(channel.result["code"]), msg=channel.result["body"])
341311 self.assertEqual(Codes.FORBIDDEN, channel.json_body["errcode"])
345315 Tests that a lookup for a user that does not exist returns a 404
346316 """
347317 url = "/_synapse/admin/v2/users/@unknown_person:test/devices"
348 request, channel = self.make_request(
349 "GET", url, access_token=self.admin_user_tok,
350 )
318 channel = self.make_request("GET", url, access_token=self.admin_user_tok,)
351319
352320 self.assertEqual(404, channel.code, msg=channel.json_body)
353321 self.assertEqual(Codes.NOT_FOUND, channel.json_body["errcode"])
358326 """
359327 url = "/_synapse/admin/v2/users/@unknown_person:unknown_domain/devices"
360328
361 request, channel = self.make_request(
362 "GET", url, access_token=self.admin_user_tok,
363 )
329 channel = self.make_request("GET", url, access_token=self.admin_user_tok,)
364330
365331 self.assertEqual(400, channel.code, msg=channel.json_body)
366332 self.assertEqual("Can only lookup local users", channel.json_body["error"])
372338 """
373339
374340 # Get devices
375 request, channel = self.make_request(
376 "GET", self.url, access_token=self.admin_user_tok,
377 )
341 channel = self.make_request("GET", self.url, access_token=self.admin_user_tok,)
378342
379343 self.assertEqual(200, channel.code, msg=channel.json_body)
380344 self.assertEqual(0, channel.json_body["total"])
390354 self.login("user", "pass")
391355
392356 # Get devices
393 request, channel = self.make_request(
394 "GET", self.url, access_token=self.admin_user_tok,
395 )
357 channel = self.make_request("GET", self.url, access_token=self.admin_user_tok,)
396358
397359 self.assertEqual(200, channel.code, msg=channel.json_body)
398360 self.assertEqual(number_devices, channel.json_body["total"])
430392 """
432394 Try to delete devices of a user without authentication.
432394 """
433 request, channel = self.make_request("POST", self.url, b"{}")
395 channel = self.make_request("POST", self.url, b"{}")
434396
435397 self.assertEqual(401, int(channel.result["code"]), msg=channel.result["body"])
436398 self.assertEqual(Codes.MISSING_TOKEN, channel.json_body["errcode"])
441403 """
442404 other_user_token = self.login("user", "pass")
443405
444 request, channel = self.make_request(
445 "POST", self.url, access_token=other_user_token,
446 )
406 channel = self.make_request("POST", self.url, access_token=other_user_token,)
447407
448408 self.assertEqual(403, int(channel.result["code"]), msg=channel.result["body"])
449409 self.assertEqual(Codes.FORBIDDEN, channel.json_body["errcode"])
453413 Tests that a lookup for a user that does not exist returns a 404
454414 """
455415 url = "/_synapse/admin/v2/users/@unknown_person:test/delete_devices"
456 request, channel = self.make_request(
457 "POST", url, access_token=self.admin_user_tok,
458 )
416 channel = self.make_request("POST", url, access_token=self.admin_user_tok,)
459417
460418 self.assertEqual(404, channel.code, msg=channel.json_body)
461419 self.assertEqual(Codes.NOT_FOUND, channel.json_body["errcode"])
466424 """
467425 url = "/_synapse/admin/v2/users/@unknown_person:unknown_domain/delete_devices"
468426
469 request, channel = self.make_request(
470 "POST", url, access_token=self.admin_user_tok,
471 )
427 channel = self.make_request("POST", url, access_token=self.admin_user_tok,)
472428
473429 self.assertEqual(400, channel.code, msg=channel.json_body)
474430 self.assertEqual("Can only lookup local users", channel.json_body["error"])
478434 Tests that removing a device that does not exist returns 200.
479435 """
480436 body = json.dumps({"devices": ["unknown_device1", "unknown_device2"]})
481 request, channel = self.make_request(
437 channel = self.make_request(
482438 "POST",
483439 self.url,
484440 access_token=self.admin_user_tok,
509465
510466 # Delete devices
511467 body = json.dumps({"devices": device_ids})
512 request, channel = self.make_request(
468 channel = self.make_request(
513469 "POST",
514470 self.url,
515471 access_token=self.admin_user_tok,
7373 """
7474 Try to get an event report without authentication.
7575 """
76 request, channel = self.make_request("GET", self.url, b"{}")
76 channel = self.make_request("GET", self.url, b"{}")
7777
7878 self.assertEqual(401, int(channel.result["code"]), msg=channel.result["body"])
7979 self.assertEqual(Codes.MISSING_TOKEN, channel.json_body["errcode"])
8383 If the user is not a server admin, an error 403 is returned.
8484 """
8585
86 request, channel = self.make_request(
87 "GET", self.url, access_token=self.other_user_tok,
88 )
86 channel = self.make_request("GET", self.url, access_token=self.other_user_tok,)
8987
9088 self.assertEqual(403, int(channel.result["code"]), msg=channel.result["body"])
9189 self.assertEqual(Codes.FORBIDDEN, channel.json_body["errcode"])
9593 Testing list of reported events
9694 """
9795
98 request, channel = self.make_request(
99 "GET", self.url, access_token=self.admin_user_tok,
100 )
96 channel = self.make_request("GET", self.url, access_token=self.admin_user_tok,)
10197
10298 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
10399 self.assertEqual(channel.json_body["total"], 20)
110106 Testing list of reported events with limit
111107 """
112108
113 request, channel = self.make_request(
109 channel = self.make_request(
114110 "GET", self.url + "?limit=5", access_token=self.admin_user_tok,
115111 )
116112
125121 Testing list of reported events with a defined starting point (from)
126122 """
127123
128 request, channel = self.make_request(
124 channel = self.make_request(
129125 "GET", self.url + "?from=5", access_token=self.admin_user_tok,
130126 )
131127
140136 Testing list of reported events with a defined starting point and limit
141137 """
142138
143 request, channel = self.make_request(
139 channel = self.make_request(
144140 "GET", self.url + "?from=5&limit=10", access_token=self.admin_user_tok,
145141 )
146142
155151 Testing list of reported events with a filter of room
156152 """
157153
158 request, channel = self.make_request(
154 channel = self.make_request(
159155 "GET",
160156 self.url + "?room_id=%s" % self.room_id1,
161157 access_token=self.admin_user_tok,
175171 Testing list of reported events with a filter of user
176172 """
177173
178 request, channel = self.make_request(
174 channel = self.make_request(
179175 "GET",
180176 self.url + "?user_id=%s" % self.other_user,
181177 access_token=self.admin_user_tok,
195191 Testing list of reported events with a filter of user and room
196192 """
197193
198 request, channel = self.make_request(
194 channel = self.make_request(
199195 "GET",
200196 self.url + "?user_id=%s&room_id=%s" % (self.other_user, self.room_id1),
201197 access_token=self.admin_user_tok,
217213 """
218214
219215 # fetch the most recent first, largest timestamp
220 request, channel = self.make_request(
216 channel = self.make_request(
221217 "GET", self.url + "?dir=b", access_token=self.admin_user_tok,
222218 )
223219
233229 report += 1
234230
235231 # fetch the oldest first, smallest timestamp
236 request, channel = self.make_request(
232 channel = self.make_request(
237233 "GET", self.url + "?dir=f", access_token=self.admin_user_tok,
238234 )
239235
253249 Testing that an invalid search order returns a 400
254250 """
255251
256 request, channel = self.make_request(
252 channel = self.make_request(
257253 "GET", self.url + "?dir=bar", access_token=self.admin_user_tok,
258254 )
259255
266262 Testing that a negative limit parameter returns a 400
267263 """
268264
269 request, channel = self.make_request(
265 channel = self.make_request(
270266 "GET", self.url + "?limit=-5", access_token=self.admin_user_tok,
271267 )
272268
278274 Testing that a negative from parameter returns a 400
279275 """
280276
281 request, channel = self.make_request(
277 channel = self.make_request(
282278 "GET", self.url + "?from=-5", access_token=self.admin_user_tok,
283279 )
284280
292288
293289 # `next_token` does not appear
294290 # Number of results is the number of entries
295 request, channel = self.make_request(
291 channel = self.make_request(
296292 "GET", self.url + "?limit=20", access_token=self.admin_user_tok,
297293 )
298294
303299
304300 # `next_token` does not appear
305301 # Number of max results is larger than the number of entries
306 request, channel = self.make_request(
302 channel = self.make_request(
307303 "GET", self.url + "?limit=21", access_token=self.admin_user_tok,
308304 )
309305
314310
315311 # `next_token` does appear
316312 # Number of max results is smaller than the number of entries
317 request, channel = self.make_request(
313 channel = self.make_request(
318314 "GET", self.url + "?limit=19", access_token=self.admin_user_tok,
319315 )
320316
326322 # Check
327323 # Set `from` to the value of `next_token` to request the remaining entries
328324 # `next_token` does not appear
329 request, channel = self.make_request(
325 channel = self.make_request(
330326 "GET", self.url + "?from=19", access_token=self.admin_user_tok,
331327 )
332328
341337 resp = self.helper.send(room_id, tok=user_tok)
342338 event_id = resp["event_id"]
343339
344 request, channel = self.make_request(
340 channel = self.make_request(
345341 "POST",
346342 "rooms/%s/report/%s" % (room_id, event_id),
347343 json.dumps({"score": -100, "reason": "this makes me sad"}),
398394 """
399395 Try to get event report without authentication.
400396 """
401 request, channel = self.make_request("GET", self.url, b"{}")
397 channel = self.make_request("GET", self.url, b"{}")
402398
403399 self.assertEqual(401, int(channel.result["code"]), msg=channel.result["body"])
404400 self.assertEqual(Codes.MISSING_TOKEN, channel.json_body["errcode"])
408404 If the user is not a server admin, an error 403 is returned.
409405 """
410406
411 request, channel = self.make_request(
412 "GET", self.url, access_token=self.other_user_tok,
413 )
407 channel = self.make_request("GET", self.url, access_token=self.other_user_tok,)
414408
415409 self.assertEqual(403, int(channel.result["code"]), msg=channel.result["body"])
416410 self.assertEqual(Codes.FORBIDDEN, channel.json_body["errcode"])
420414 Testing get a reported event
421415 """
422416
423 request, channel = self.make_request(
424 "GET", self.url, access_token=self.admin_user_tok,
425 )
417 channel = self.make_request("GET", self.url, access_token=self.admin_user_tok,)
426418
427419 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
428420 self._check_fields(channel.json_body)
433425 """
434426
435427 # `report_id` is negative
436 request, channel = self.make_request(
428 channel = self.make_request(
437429 "GET",
438430 "/_synapse/admin/v1/event_reports/-123",
439431 access_token=self.admin_user_tok,
447439 )
448440
449441 # `report_id` is a non-numerical string
450 request, channel = self.make_request(
442 channel = self.make_request(
451443 "GET",
452444 "/_synapse/admin/v1/event_reports/abcdef",
453445 access_token=self.admin_user_tok,
461453 )
462454
463455 # `report_id` is undefined
464 request, channel = self.make_request(
456 channel = self.make_request(
465457 "GET",
466458 "/_synapse/admin/v1/event_reports/",
467459 access_token=self.admin_user_tok,
479471 Testing that a nonexistent `report_id` returns a 404.
480472 """
481473
482 request, channel = self.make_request(
474 channel = self.make_request(
483475 "GET",
484476 "/_synapse/admin/v1/event_reports/123",
485477 access_token=self.admin_user_tok,
495487 resp = self.helper.send(room_id, tok=user_tok)
496488 event_id = resp["event_id"]
497489
498 request, channel = self.make_request(
490 channel = self.make_request(
499491 "POST",
500492 "rooms/%s/report/%s" % (room_id, event_id),
501493 json.dumps({"score": -100, "reason": "this makes me sad"}),
4949 """
5050 url = "/_synapse/admin/v1/media/%s/%s" % (self.server_name, "12345")
5151
52 request, channel = self.make_request("DELETE", url, b"{}")
52 channel = self.make_request("DELETE", url, b"{}")
5353
5454 self.assertEqual(401, int(channel.result["code"]), msg=channel.result["body"])
5555 self.assertEqual(Codes.MISSING_TOKEN, channel.json_body["errcode"])
6363
6464 url = "/_synapse/admin/v1/media/%s/%s" % (self.server_name, "12345")
6565
66 request, channel = self.make_request(
67 "DELETE", url, access_token=self.other_user_token,
68 )
66 channel = self.make_request("DELETE", url, access_token=self.other_user_token,)
6967
7068 self.assertEqual(403, int(channel.result["code"]), msg=channel.result["body"])
7169 self.assertEqual(Codes.FORBIDDEN, channel.json_body["errcode"])
7674 """
7775 url = "/_synapse/admin/v1/media/%s/%s" % (self.server_name, "12345")
7876
79 request, channel = self.make_request(
80 "DELETE", url, access_token=self.admin_user_tok,
81 )
77 channel = self.make_request("DELETE", url, access_token=self.admin_user_tok,)
8278
8379 self.assertEqual(404, channel.code, msg=channel.json_body)
8480 self.assertEqual(Codes.NOT_FOUND, channel.json_body["errcode"])
8985 """
9086 url = "/_synapse/admin/v1/media/%s/%s" % ("unknown_domain", "12345")
9187
92 request, channel = self.make_request(
93 "DELETE", url, access_token=self.admin_user_tok,
94 )
88 channel = self.make_request("DELETE", url, access_token=self.admin_user_tok,)
9589
9690 self.assertEqual(400, channel.code, msg=channel.json_body)
9791 self.assertEqual("Can only delete local media", channel.json_body["error"])
120114 self.assertEqual(server_name, self.server_name)
121115
122116 # Attempt to access media
123 request, channel = make_request(
117 channel = make_request(
124118 self.reactor,
125119 FakeSite(download_resource),
126120 "GET",
145139 url = "/_synapse/admin/v1/media/%s/%s" % (self.server_name, media_id)
146140
147141 # Delete media
148 request, channel = self.make_request(
149 "DELETE", url, access_token=self.admin_user_tok,
150 )
142 channel = self.make_request("DELETE", url, access_token=self.admin_user_tok,)
151143
152144 self.assertEqual(200, channel.code, msg=channel.json_body)
153145 self.assertEqual(1, channel.json_body["total"])
156148 )
157149
158150 # Attempt to access media
159 request, channel = make_request(
151 channel = make_request(
160152 self.reactor,
161153 FakeSite(download_resource),
162154 "GET",
203195 Try to delete media without authentication.
204196 """
205197
206 request, channel = self.make_request("POST", self.url, b"{}")
198 channel = self.make_request("POST", self.url, b"{}")
207199
208200 self.assertEqual(401, int(channel.result["code"]), msg=channel.result["body"])
209201 self.assertEqual(Codes.MISSING_TOKEN, channel.json_body["errcode"])
215207 self.other_user = self.register_user("user", "pass")
216208 self.other_user_token = self.login("user", "pass")
217209
218 request, channel = self.make_request(
210 channel = self.make_request(
219211 "POST", self.url, access_token=self.other_user_token,
220212 )
221213
228220 """
229221 url = "/_synapse/admin/v1/media/%s/delete" % "unknown_domain"
230222
231 request, channel = self.make_request(
223 channel = self.make_request(
232224 "POST", url + "?before_ts=1234", access_token=self.admin_user_tok,
233225 )
234226
239231 """
240232 If the parameter `before_ts` is missing, an error is returned.
241233 """
242 request, channel = self.make_request(
243 "POST", self.url, access_token=self.admin_user_tok,
244 )
234 channel = self.make_request("POST", self.url, access_token=self.admin_user_tok,)
245235
246236 self.assertEqual(400, int(channel.result["code"]), msg=channel.result["body"])
247237 self.assertEqual(Codes.MISSING_PARAM, channel.json_body["errcode"])
253243 """
254244 If parameters are invalid, an error is returned.
255245 """
256 request, channel = self.make_request(
246 channel = self.make_request(
257247 "POST", self.url + "?before_ts=-1234", access_token=self.admin_user_tok,
258248 )
259249
264254 channel.json_body["error"],
265255 )
266256
267 request, channel = self.make_request(
257 channel = self.make_request(
268258 "POST",
269259 self.url + "?before_ts=1234&size_gt=-1234",
270260 access_token=self.admin_user_tok,
277267 channel.json_body["error"],
278268 )
279269
280 request, channel = self.make_request(
270 channel = self.make_request(
281271 "POST",
282272 self.url + "?before_ts=1234&keep_profiles=not_bool",
283273 access_token=self.admin_user_tok,
307297
308298 # timestamp after upload/create
309299 now_ms = self.clock.time_msec()
310 request, channel = self.make_request(
300 channel = self.make_request(
311301 "POST",
312302 self.url + "?before_ts=" + str(now_ms),
313303 access_token=self.admin_user_tok,
331321
332322 self._access_media(server_and_media_id)
333323
334 request, channel = self.make_request(
324 channel = self.make_request(
335325 "POST",
336326 self.url + "?before_ts=" + str(now_ms),
337327 access_token=self.admin_user_tok,
343333
344334 # timestamp after upload
345335 now_ms = self.clock.time_msec()
346 request, channel = self.make_request(
336 channel = self.make_request(
347337 "POST",
348338 self.url + "?before_ts=" + str(now_ms),
349339 access_token=self.admin_user_tok,
366356 self._access_media(server_and_media_id)
367357
368358 now_ms = self.clock.time_msec()
369 request, channel = self.make_request(
359 channel = self.make_request(
370360 "POST",
371361 self.url + "?before_ts=" + str(now_ms) + "&size_gt=67",
372362 access_token=self.admin_user_tok,
377367 self._access_media(server_and_media_id)
378368
379369 now_ms = self.clock.time_msec()
380 request, channel = self.make_request(
370 channel = self.make_request(
381371 "POST",
382372 self.url + "?before_ts=" + str(now_ms) + "&size_gt=66",
383373 access_token=self.admin_user_tok,
400390 self._access_media(server_and_media_id)
401391
402392 # set media as avatar
403 request, channel = self.make_request(
393 channel = self.make_request(
404394 "PUT",
405395 "/profile/%s/avatar_url" % (self.admin_user,),
406396 content=json.dumps({"avatar_url": "mxc://%s" % (server_and_media_id,)}),
409399 self.assertEqual(200, channel.code, msg=channel.json_body)
410400
411401 now_ms = self.clock.time_msec()
412 request, channel = self.make_request(
402 channel = self.make_request(
413403 "POST",
414404 self.url + "?before_ts=" + str(now_ms) + "&keep_profiles=true",
415405 access_token=self.admin_user_tok,
420410 self._access_media(server_and_media_id)
421411
422412 now_ms = self.clock.time_msec()
423 request, channel = self.make_request(
413 channel = self.make_request(
424414 "POST",
425415 self.url + "?before_ts=" + str(now_ms) + "&keep_profiles=false",
426416 access_token=self.admin_user_tok,
444434
445435 # set media as room avatar
446436 room_id = self.helper.create_room_as(self.admin_user, tok=self.admin_user_tok)
447 request, channel = self.make_request(
437 channel = self.make_request(
448438 "PUT",
449439 "/rooms/%s/state/m.room.avatar" % (room_id,),
450440 content=json.dumps({"url": "mxc://%s" % (server_and_media_id,)}),
453443 self.assertEqual(200, channel.code, msg=channel.json_body)
454444
455445 now_ms = self.clock.time_msec()
456 request, channel = self.make_request(
446 channel = self.make_request(
457447 "POST",
458448 self.url + "?before_ts=" + str(now_ms) + "&keep_profiles=true",
459449 access_token=self.admin_user_tok,
464454 self._access_media(server_and_media_id)
465455
466456 now_ms = self.clock.time_msec()
467 request, channel = self.make_request(
457 channel = self.make_request(
468458 "POST",
469459 self.url + "?before_ts=" + str(now_ms) + "&keep_profiles=false",
470460 access_token=self.admin_user_tok,
511501 media_id = server_and_media_id.split("/")[1]
512502 local_path = self.filepaths.local_media_filepath(media_id)
513503
514 request, channel = make_request(
504 channel = make_request(
515505 self.reactor,
516506 FakeSite(download_resource),
517507 "GET",
1919 from mock import Mock
2020
2121 import synapse.rest.admin
22 from synapse.api.constants import EventTypes, Membership
2223 from synapse.api.errors import Codes
2324 from synapse.rest.client.v1 import directory, events, login, room
2425
7879
7980 # Test that the admin can still send shutdown
8081 url = "/_synapse/admin/v1/shutdown_room/" + room_id
81 request, channel = self.make_request(
82 channel = self.make_request(
8283 "POST",
8384 url.encode("ascii"),
8485 json.dumps({"new_room_user_id": self.admin_user}),
102103
103104 # Enable world readable
104105 url = "rooms/%s/state/m.room.history_visibility" % (room_id,)
105 request, channel = self.make_request(
106 channel = self.make_request(
106107 "PUT",
107108 url.encode("ascii"),
108109 json.dumps({"history_visibility": "world_readable"}),
112113
113114 # Test that the admin can still send shutdown
114115 url = "/_synapse/admin/v1/shutdown_room/" + room_id
115 request, channel = self.make_request(
116 channel = self.make_request(
116117 "POST",
117118 url.encode("ascii"),
118119 json.dumps({"new_room_user_id": self.admin_user}),
129130 """
130131
131132 url = "rooms/%s/initialSync" % (room_id,)
132 request, channel = self.make_request(
133 channel = self.make_request(
133134 "GET", url.encode("ascii"), access_token=self.admin_user_tok
134135 )
135136 self.assertEqual(
137138 )
138139
139140 url = "events?timeout=0&room_id=" + room_id
140 request, channel = self.make_request(
141 channel = self.make_request(
141142 "GET", url.encode("ascii"), access_token=self.admin_user_tok
142143 )
143144 self.assertEqual(
183184 If the user is not a server admin, an error 403 is returned.
184185 """
185186
186 request, channel = self.make_request(
187 channel = self.make_request(
187188 "POST", self.url, json.dumps({}), access_token=self.other_user_tok,
188189 )
189190
196197 """
197198 url = "/_synapse/admin/v1/rooms/!unknown:test/delete"
198199
199 request, channel = self.make_request(
200 channel = self.make_request(
200201 "POST", url, json.dumps({}), access_token=self.admin_user_tok,
201202 )
202203
209210 """
210211 url = "/_synapse/admin/v1/rooms/invalidroom/delete"
211212
212 request, channel = self.make_request(
213 channel = self.make_request(
213214 "POST", url, json.dumps({}), access_token=self.admin_user_tok,
214215 )
215216
224225 """
225226 body = json.dumps({"new_room_user_id": "@unknown:test"})
226227
227 request, channel = self.make_request(
228 channel = self.make_request(
228229 "POST",
229230 self.url,
230231 content=body.encode(encoding="utf_8"),
243244 """
244245 body = json.dumps({"new_room_user_id": "@not:exist.bla"})
245246
246 request, channel = self.make_request(
247 channel = self.make_request(
247248 "POST",
248249 self.url,
249250 content=body.encode(encoding="utf_8"),
261262 """
262263 body = json.dumps({"block": "NotBool"})
263264
264 request, channel = self.make_request(
265 channel = self.make_request(
265266 "POST",
266267 self.url,
267268 content=body.encode(encoding="utf_8"),
277278 """
278279 body = json.dumps({"purge": "NotBool"})
279280
280 request, channel = self.make_request(
281 channel = self.make_request(
281282 "POST",
282283 self.url,
283284 content=body.encode(encoding="utf_8"),
303304
304305 body = json.dumps({"block": True, "purge": True})
305306
306 request, channel = self.make_request(
307 channel = self.make_request(
307308 "POST",
308309 self.url.encode("ascii"),
309310 content=body.encode(encoding="utf_8"),
336337
337338 body = json.dumps({"block": False, "purge": True})
338339
339 request, channel = self.make_request(
340 channel = self.make_request(
340341 "POST",
341342 self.url.encode("ascii"),
342343 content=body.encode(encoding="utf_8"),
370371
371372 body = json.dumps({"block": False, "purge": False})
372373
373 request, channel = self.make_request(
374 channel = self.make_request(
374375 "POST",
375376 self.url.encode("ascii"),
376377 content=body.encode(encoding="utf_8"),
417418
418419 # Test that the admin can still send shutdown
419420 url = "/_synapse/admin/v1/rooms/%s/delete" % self.room_id
420 request, channel = self.make_request(
421 channel = self.make_request(
421422 "POST",
422423 url.encode("ascii"),
423424 json.dumps({"new_room_user_id": self.admin_user}),
447448
448449 # Enable world readable
449450 url = "rooms/%s/state/m.room.history_visibility" % (self.room_id,)
450 request, channel = self.make_request(
451 channel = self.make_request(
451452 "PUT",
452453 url.encode("ascii"),
453454 json.dumps({"history_visibility": "world_readable"}),
464465
465466 # Test that the admin can still send shutdown
466467 url = "/_synapse/admin/v1/rooms/%s/delete" % self.room_id
467 request, channel = self.make_request(
468 channel = self.make_request(
468469 "POST",
469470 url.encode("ascii"),
470471 json.dumps({"new_room_user_id": self.admin_user}),
529530 """
530531
531532 url = "rooms/%s/initialSync" % (room_id,)
532 request, channel = self.make_request(
533 channel = self.make_request(
533534 "GET", url.encode("ascii"), access_token=self.admin_user_tok
534535 )
535536 self.assertEqual(
537538 )
538539
539540 url = "events?timeout=0&room_id=" + room_id
540 request, channel = self.make_request(
541 channel = self.make_request(
541542 "GET", url.encode("ascii"), access_token=self.admin_user_tok
542543 )
543544 self.assertEqual(
568569 self.helper.leave(room_id, user=self.admin_user, tok=self.admin_user_tok)
569570
570571 url = "/_synapse/admin/v1/purge_room"
571 request, channel = self.make_request(
572 channel = self.make_request(
572573 "POST",
573574 url.encode("ascii"),
574575 {"room_id": room_id},
622623
623624 # Request the list of rooms
624625 url = "/_synapse/admin/v1/rooms"
625 request, channel = self.make_request(
626 channel = self.make_request(
626627 "GET", url.encode("ascii"), access_token=self.admin_user_tok,
627628 )
628629
703704 limit,
704705 "name",
705706 )
706 request, channel = self.make_request(
707 channel = self.make_request(
707708 "GET", url.encode("ascii"), access_token=self.admin_user_tok,
708709 )
709710 self.assertEqual(
743744 self.assertEqual(room_ids, returned_room_ids)
744745
745746 url = "/_synapse/admin/v1/rooms?from=%d&limit=%d" % (start, limit)
746 request, channel = self.make_request(
747 channel = self.make_request(
747748 "GET", url.encode("ascii"), access_token=self.admin_user_tok,
748749 )
749750 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
763764
764765 # Create a new alias to this room
765766 url = "/_matrix/client/r0/directory/room/%s" % (urllib.parse.quote(test_alias),)
766 request, channel = self.make_request(
767 channel = self.make_request(
767768 "PUT",
768769 url.encode("ascii"),
769770 {"room_id": room_id},
793794
794795 # Request the list of rooms
795796 url = "/_synapse/admin/v1/rooms"
796 request, channel = self.make_request(
797 channel = self.make_request(
797798 "GET", url.encode("ascii"), access_token=self.admin_user_tok,
798799 )
799800 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
834835 url = "/_matrix/client/r0/directory/room/%s" % (
835836 urllib.parse.quote(test_alias),
836837 )
837 request, channel = self.make_request(
838 channel = self.make_request(
838839 "PUT",
839840 url.encode("ascii"),
840841 {"room_id": room_id},
874875 url = "/_synapse/admin/v1/rooms?order_by=%s" % (order_type,)
875876 if reverse:
876877 url += "&dir=b"
877 request, channel = self.make_request(
878 channel = self.make_request(
878879 "GET", url.encode("ascii"), access_token=self.admin_user_tok,
879880 )
880881 self.assertEqual(200, channel.code, msg=channel.json_body)
10101011 expected_http_code: The expected http code for the request
10111012 """
10121013 url = "/_synapse/admin/v1/rooms?search_term=%s" % (search_term,)
1013 request, channel = self.make_request(
1014 channel = self.make_request(
10141015 "GET", url.encode("ascii"), access_token=self.admin_user_tok,
10151016 )
10161017 self.assertEqual(expected_http_code, channel.code, msg=channel.json_body)
10481049
10491050 _search_test(room_id_2, "else")
10501051 _search_test(room_id_2, "se")
1052
1053 # Test case insensitive
1054 _search_test(room_id_1, "SOMETHING")
1055 _search_test(room_id_1, "THING")
1056
1057 _search_test(room_id_2, "ELSE")
1058 _search_test(room_id_2, "SE")
10511059
10521060 _search_test(None, "foo")
10531061 _search_test(None, "bar")
10711079 )
10721080
10731081 url = "/_synapse/admin/v1/rooms/%s" % (room_id_1,)
1074 request, channel = self.make_request(
1082 channel = self.make_request(
10751083 "GET", url.encode("ascii"), access_token=self.admin_user_tok,
10761084 )
10771085 self.assertEqual(200, channel.code, msg=channel.json_body)
10831091 self.assertIn("canonical_alias", channel.json_body)
10841092 self.assertIn("joined_members", channel.json_body)
10851093 self.assertIn("joined_local_members", channel.json_body)
1094 self.assertIn("joined_local_devices", channel.json_body)
10861095 self.assertIn("version", channel.json_body)
10871096 self.assertIn("creator", channel.json_body)
10881097 self.assertIn("encryption", channel.json_body)
10951104
10961105 self.assertEqual(room_id_1, channel.json_body["room_id"])
10971106
1107 def test_single_room_devices(self):
1108 """Test that `joined_local_devices` can be requested correctly"""
1109 room_id_1 = self.helper.create_room_as(self.admin_user, tok=self.admin_user_tok)
1110
1111 url = "/_synapse/admin/v1/rooms/%s" % (room_id_1,)
1112 channel = self.make_request(
1113 "GET", url.encode("ascii"), access_token=self.admin_user_tok,
1114 )
1115 self.assertEqual(200, channel.code, msg=channel.json_body)
1116 self.assertEqual(1, channel.json_body["joined_local_devices"])
1117
1118 # Have another user join the room
1119 user_1 = self.register_user("foo", "pass")
1120 user_tok_1 = self.login("foo", "pass")
1121 self.helper.join(room_id_1, user_1, tok=user_tok_1)
1122
1123 url = "/_synapse/admin/v1/rooms/%s" % (room_id_1,)
1124 channel = self.make_request(
1125 "GET", url.encode("ascii"), access_token=self.admin_user_tok,
1126 )
1127 self.assertEqual(200, channel.code, msg=channel.json_body)
1128 self.assertEqual(2, channel.json_body["joined_local_devices"])
1129
1130 # leave room
1131 self.helper.leave(room_id_1, self.admin_user, tok=self.admin_user_tok)
1132 self.helper.leave(room_id_1, user_1, tok=user_tok_1)
1133 url = "/_synapse/admin/v1/rooms/%s" % (room_id_1,)
1134 channel = self.make_request(
1135 "GET", url.encode("ascii"), access_token=self.admin_user_tok,
1136 )
1137 self.assertEqual(200, channel.code, msg=channel.json_body)
1138 self.assertEqual(0, channel.json_body["joined_local_devices"])
1139
10981140 def test_room_members(self):
10991141 """Test that room members can be requested correctly"""
11001142 # Create two test rooms
11181160 self.helper.join(room_id_2, user_3, tok=user_tok_3)
11191161
11201162 url = "/_synapse/admin/v1/rooms/%s/members" % (room_id_1,)
1121 request, channel = self.make_request(
1163 channel = self.make_request(
11221164 "GET", url.encode("ascii"), access_token=self.admin_user_tok,
11231165 )
11241166 self.assertEqual(200, channel.code, msg=channel.json_body)
11291171 self.assertEqual(channel.json_body["total"], 3)
11301172
11311173 url = "/_synapse/admin/v1/rooms/%s/members" % (room_id_2,)
1132 request, channel = self.make_request(
1174 channel = self.make_request(
11331175 "GET", url.encode("ascii"), access_token=self.admin_user_tok,
11341176 )
11351177 self.assertEqual(200, channel.code, msg=channel.json_body)
11691211 """
11701212 body = json.dumps({"user_id": self.second_user_id})
11711213
1172 request, channel = self.make_request(
1214 channel = self.make_request(
11731215 "POST",
11741216 self.url,
11751217 content=body.encode(encoding="utf_8"),
11851227 """
11861228 body = json.dumps({"unknown_parameter": "@unknown:test"})
11871229
1188 request, channel = self.make_request(
1230 channel = self.make_request(
11891231 "POST",
11901232 self.url,
11911233 content=body.encode(encoding="utf_8"),
12011243 """
12021244 body = json.dumps({"user_id": "@unknown:test"})
12031245
1204 request, channel = self.make_request(
1246 channel = self.make_request(
12051247 "POST",
12061248 self.url,
12071249 content=body.encode(encoding="utf_8"),
12171259 """
12181260 body = json.dumps({"user_id": "@not:exist.bla"})
12191261
1220 request, channel = self.make_request(
1262 channel = self.make_request(
12211263 "POST",
12221264 self.url,
12231265 content=body.encode(encoding="utf_8"),
12371279 body = json.dumps({"user_id": self.second_user_id})
12381280 url = "/_synapse/admin/v1/join/!unknown:test"
12391281
1240 request, channel = self.make_request(
1282 channel = self.make_request(
12411283 "POST",
12421284 url,
12431285 content=body.encode(encoding="utf_8"),
12541296 body = json.dumps({"user_id": self.second_user_id})
12551297 url = "/_synapse/admin/v1/join/invalidroom"
12561298
1257 request, channel = self.make_request(
1299 channel = self.make_request(
12581300 "POST",
12591301 url,
12601302 content=body.encode(encoding="utf_8"),
12731315 """
12741316 body = json.dumps({"user_id": self.second_user_id})
12751317
1276 request, channel = self.make_request(
1318 channel = self.make_request(
12771319 "POST",
12781320 self.url,
12791321 content=body.encode(encoding="utf_8"),
12851327
12861328 # Validate if user is a member of the room
12871329
1288 request, channel = self.make_request(
1330 channel = self.make_request(
12891331 "GET", "/_matrix/client/r0/joined_rooms", access_token=self.second_tok,
12901332 )
12911333 self.assertEquals(200, int(channel.result["code"]), msg=channel.result["body"])
13021344 url = "/_synapse/admin/v1/join/{}".format(private_room_id)
13031345 body = json.dumps({"user_id": self.second_user_id})
13041346
1305 request, channel = self.make_request(
1347 channel = self.make_request(
13061348 "POST",
13071349 url,
13081350 content=body.encode(encoding="utf_8"),
13321374
13331375 # Validate if server admin is a member of the room
13341376
1335 request, channel = self.make_request(
1377 channel = self.make_request(
13361378 "GET", "/_matrix/client/r0/joined_rooms", access_token=self.admin_user_tok,
13371379 )
13381380 self.assertEquals(200, int(channel.result["code"]), msg=channel.result["body"])
13431385 url = "/_synapse/admin/v1/join/{}".format(private_room_id)
13441386 body = json.dumps({"user_id": self.second_user_id})
13451387
1346 request, channel = self.make_request(
1388 channel = self.make_request(
13471389 "POST",
13481390 url,
13491391 content=body.encode(encoding="utf_8"),
13541396
13551397 # Validate if user is a member of the room
13561398
1357 request, channel = self.make_request(
1399 channel = self.make_request(
13581400 "GET", "/_matrix/client/r0/joined_rooms", access_token=self.second_tok,
13591401 )
13601402 self.assertEquals(200, int(channel.result["code"]), msg=channel.result["body"])
13711413 url = "/_synapse/admin/v1/join/{}".format(private_room_id)
13721414 body = json.dumps({"user_id": self.second_user_id})
13731415
1374 request, channel = self.make_request(
1416 channel = self.make_request(
13751417 "POST",
13761418 url,
13771419 content=body.encode(encoding="utf_8"),
13831425
13841426 # Validate if user is a member of the room
13851427
1386 request, channel = self.make_request(
1428 channel = self.make_request(
13871429 "GET", "/_matrix/client/r0/joined_rooms", access_token=self.second_tok,
13881430 )
13891431 self.assertEquals(200, int(channel.result["code"]), msg=channel.result["body"])
13901432 self.assertEqual(private_room_id, channel.json_body["joined_rooms"][0])
1433
1434
1435 class MakeRoomAdminTestCase(unittest.HomeserverTestCase):
1436 servlets = [
1437 synapse.rest.admin.register_servlets,
1438 room.register_servlets,
1439 login.register_servlets,
1440 ]
1441
1442 def prepare(self, reactor, clock, homeserver):
1443 self.admin_user = self.register_user("admin", "pass", admin=True)
1444 self.admin_user_tok = self.login("admin", "pass")
1445
1446 self.creator = self.register_user("creator", "test")
1447 self.creator_tok = self.login("creator", "test")
1448
1449 self.second_user_id = self.register_user("second", "test")
1450 self.second_tok = self.login("second", "test")
1451
1452 self.public_room_id = self.helper.create_room_as(
1453 self.creator, tok=self.creator_tok, is_public=True
1454 )
1455 self.url = "/_synapse/admin/v1/rooms/{}/make_room_admin".format(
1456 self.public_room_id
1457 )
1458
1459 def test_public_room(self):
1460 """Test that getting admin in a public room works.
1461 """
1462 room_id = self.helper.create_room_as(
1463 self.creator, tok=self.creator_tok, is_public=True
1464 )
1465
1466 channel = self.make_request(
1467 "POST",
1468 "/_synapse/admin/v1/rooms/{}/make_room_admin".format(room_id),
1469 content={},
1470 access_token=self.admin_user_tok,
1471 )
1472
1473 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
1474
1475 # Now we test that we can join the room and ban a user.
1476 self.helper.join(room_id, self.admin_user, tok=self.admin_user_tok)
1477 self.helper.change_membership(
1478 room_id,
1479 self.admin_user,
1480 "@test:test",
1481 Membership.BAN,
1482 tok=self.admin_user_tok,
1483 )
1484
1485 def test_private_room(self):
1486 """Test that getting admin in a private room works and we get invited.
1487 """
1488 room_id = self.helper.create_room_as(
1489 self.creator, tok=self.creator_tok, is_public=False,
1490 )
1491
1492 channel = self.make_request(
1493 "POST",
1494 "/_synapse/admin/v1/rooms/{}/make_room_admin".format(room_id),
1495 content={},
1496 access_token=self.admin_user_tok,
1497 )
1498
1499 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
1500
1501 # Now we test that we can join the room (we should have received an
1502 # invite) and can ban a user.
1503 self.helper.join(room_id, self.admin_user, tok=self.admin_user_tok)
1504 self.helper.change_membership(
1505 room_id,
1506 self.admin_user,
1507 "@test:test",
1508 Membership.BAN,
1509 tok=self.admin_user_tok,
1510 )
1511
1512 def test_other_user(self):
1513 """Test that granting admin in a public room to a non-admin user works.
1514 """
1515 room_id = self.helper.create_room_as(
1516 self.creator, tok=self.creator_tok, is_public=True
1517 )
1518
1519 channel = self.make_request(
1520 "POST",
1521 "/_synapse/admin/v1/rooms/{}/make_room_admin".format(room_id),
1522 content={"user_id": self.second_user_id},
1523 access_token=self.admin_user_tok,
1524 )
1525
1526 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
1527
1528 # Now we test that we can join the room and ban a user.
1529 self.helper.join(room_id, self.second_user_id, tok=self.second_tok)
1530 self.helper.change_membership(
1531 room_id,
1532 self.second_user_id,
1533 "@test:test",
1534 Membership.BAN,
1535 tok=self.second_tok,
1536 )
1537
1538 def test_not_enough_power(self):
1539 """Test that we get a sensible error if there are no local room admins.
1540 """
1541 room_id = self.helper.create_room_as(
1542 self.creator, tok=self.creator_tok, is_public=True
1543 )
1544
1545 # The creator drops admin rights in the room.
1546 pl = self.helper.get_state(
1547 room_id, EventTypes.PowerLevels, tok=self.creator_tok
1548 )
1549 pl["users"][self.creator] = 0
1550 self.helper.send_state(
1551 room_id, EventTypes.PowerLevels, body=pl, tok=self.creator_tok
1552 )
1553
1554 channel = self.make_request(
1555 "POST",
1556 "/_synapse/admin/v1/rooms/{}/make_room_admin".format(room_id),
1557 content={},
1558 access_token=self.admin_user_tok,
1559 )
1560
1561 # We expect this to fail with a 400 as there are no room admins.
1562 #
1563 # (Note we assert the error message to ensure that it's not denied for
1564 # some other reason)
1565 self.assertEqual(400, int(channel.result["code"]), msg=channel.result["body"])
1566 self.assertEqual(
1567 channel.json_body["error"],
1568 "No local admin user in room with power to update power levels.",
1569 )
13911570
13921571
13931572 PURGE_TABLES = [
14181597 "event_push_summary",
14191598 "pusher_throttle",
14201599 "group_summary_rooms",
1421 "local_invites",
14221600 "room_account_data",
14231601 "room_tags",
14241602 # "state_groups", # Current impl leaves orphaned state groups around.
4545 """
4646 Try to list users without authentication.
4747 """
48 request, channel = self.make_request("GET", self.url, b"{}")
48 channel = self.make_request("GET", self.url, b"{}")
4949
5050 self.assertEqual(401, int(channel.result["code"]), msg=channel.result["body"])
5151 self.assertEqual(Codes.MISSING_TOKEN, channel.json_body["errcode"])
5454 """
5555 If the user is not a server admin, an error 403 is returned.
5656 """
57 request, channel = self.make_request(
57 channel = self.make_request(
5858 "GET", self.url, json.dumps({}), access_token=self.other_user_tok,
5959 )
6060
6666 If parameters are invalid, an error is returned.
6767 """
6868 # unknown order_by
69 request, channel = self.make_request(
69 channel = self.make_request(
7070 "GET", self.url + "?order_by=bar", access_token=self.admin_user_tok,
7171 )
7272
7474 self.assertEqual(Codes.INVALID_PARAM, channel.json_body["errcode"])
7575
7676 # negative from
77 request, channel = self.make_request(
77 channel = self.make_request(
7878 "GET", self.url + "?from=-5", access_token=self.admin_user_tok,
7979 )
8080
8282 self.assertEqual(Codes.INVALID_PARAM, channel.json_body["errcode"])
8383
8484 # negative limit
85 request, channel = self.make_request(
85 channel = self.make_request(
8686 "GET", self.url + "?limit=-5", access_token=self.admin_user_tok,
8787 )
8888
9090 self.assertEqual(Codes.INVALID_PARAM, channel.json_body["errcode"])
9191
9292 # negative from_ts
93 request, channel = self.make_request(
93 channel = self.make_request(
9494 "GET", self.url + "?from_ts=-1234", access_token=self.admin_user_tok,
9595 )
9696
9898 self.assertEqual(Codes.INVALID_PARAM, channel.json_body["errcode"])
9999
100100 # negative until_ts
101 request, channel = self.make_request(
101 channel = self.make_request(
102102 "GET", self.url + "?until_ts=-1234", access_token=self.admin_user_tok,
103103 )
104104
106106 self.assertEqual(Codes.INVALID_PARAM, channel.json_body["errcode"])
107107
108108 # until_ts smaller than from_ts
109 request, channel = self.make_request(
109 channel = self.make_request(
110110 "GET",
111111 self.url + "?from_ts=10&until_ts=5",
112112 access_token=self.admin_user_tok,
116116 self.assertEqual(Codes.INVALID_PARAM, channel.json_body["errcode"])
117117
118118 # empty search term
119 request, channel = self.make_request(
119 channel = self.make_request(
120120 "GET", self.url + "?search_term=", access_token=self.admin_user_tok,
121121 )
122122
124124 self.assertEqual(Codes.INVALID_PARAM, channel.json_body["errcode"])
125125
126126 # invalid search order
127 request, channel = self.make_request(
127 channel = self.make_request(
128128 "GET", self.url + "?dir=bar", access_token=self.admin_user_tok,
129129 )
130130
137137 """
138138 self._create_users_with_media(10, 2)
139139
140 request, channel = self.make_request(
140 channel = self.make_request(
141141 "GET", self.url + "?limit=5", access_token=self.admin_user_tok,
142142 )
143143
153153 """
154154 self._create_users_with_media(20, 2)
155155
156 request, channel = self.make_request(
156 channel = self.make_request(
157157 "GET", self.url + "?from=5", access_token=self.admin_user_tok,
158158 )
159159
169169 """
170170 self._create_users_with_media(20, 2)
171171
172 request, channel = self.make_request(
172 channel = self.make_request(
173173 "GET", self.url + "?from=5&limit=10", access_token=self.admin_user_tok,
174174 )
175175
189189
190190 # `next_token` does not appear
191191 # Number of results is the number of entries
192 request, channel = self.make_request(
192 channel = self.make_request(
193193 "GET", self.url + "?limit=20", access_token=self.admin_user_tok,
194194 )
195195
200200
201201 # `next_token` does not appear
202202 # Number of max results is larger than the number of entries
203 request, channel = self.make_request(
203 channel = self.make_request(
204204 "GET", self.url + "?limit=21", access_token=self.admin_user_tok,
205205 )
206206
211211
212212 # `next_token` does appear
213213 # Number of max results is smaller than the number of entries
214 request, channel = self.make_request(
214 channel = self.make_request(
215215 "GET", self.url + "?limit=19", access_token=self.admin_user_tok,
216216 )
217217
222222
223223 # Set `from` to the value of `next_token` to request the remaining entries
224224 # Check `next_token` does not appear
225 request, channel = self.make_request(
225 channel = self.make_request(
226226 "GET", self.url + "?from=19", access_token=self.admin_user_tok,
227227 )
228228
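The `limit`/`from`/`next_token` sequence exercised above follows the usual cursor pattern for Synapse's paginated admin list endpoints: keep requesting with `from` set to the previous response's `next_token` until the token stops appearing. A minimal sketch under that assumption (`fetch_page` is a hypothetical stand-in for the tests' `make_request`, returning the parsed JSON body):

```python
def collect_all_users(fetch_page, page_size=5):
    # Walk a paginated admin list endpoint. `fetch_page(from_, limit)`
    # returns a dict shaped like the bodies asserted in the tests:
    # {"users": [...], "total": N}, plus "next_token" while more remain.
    users = []
    from_ = 0
    while True:
        body = fetch_page(from_, page_size)
        users.extend(body["users"])
        if "next_token" not in body:
            return users
        from_ = int(body["next_token"])
```

The tests above check each boundary of this loop individually: `next_token` absent when `limit` covers all entries, present when it does not, and absent again on the final page.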
237237 if users have no media created
238238 """
239239
240 request, channel = self.make_request(
241 "GET", self.url, access_token=self.admin_user_tok,
242 )
240 channel = self.make_request("GET", self.url, access_token=self.admin_user_tok,)
243241
244242 self.assertEqual(200, channel.code, msg=channel.json_body)
245243 self.assertEqual(0, channel.json_body["total"])
315313 ts1 = self.clock.time_msec()
316314
317315 # list all media when filter is not set
318 request, channel = self.make_request(
319 "GET", self.url, access_token=self.admin_user_tok,
320 )
316 channel = self.make_request("GET", self.url, access_token=self.admin_user_tok,)
321317 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
322318 self.assertEqual(channel.json_body["users"][0]["media_count"], 3)
323319
324320 # filter media starting at `ts1` after creating first media
325321 # result is 0
326 request, channel = self.make_request(
322 channel = self.make_request(
327323 "GET", self.url + "?from_ts=%s" % (ts1,), access_token=self.admin_user_tok,
328324 )
329325 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
336332 self._create_media(self.other_user_tok, 3)
337333
338334 # filter media between `ts1` and `ts2`
339 request, channel = self.make_request(
335 channel = self.make_request(
340336 "GET",
341337 self.url + "?from_ts=%s&until_ts=%s" % (ts1, ts2),
342338 access_token=self.admin_user_tok,
345341 self.assertEqual(channel.json_body["users"][0]["media_count"], 3)
346342
347343 # filter media until `ts2` and earlier
348 request, channel = self.make_request(
344 channel = self.make_request(
349345 "GET", self.url + "?until_ts=%s" % (ts2,), access_token=self.admin_user_tok,
350346 )
351347 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
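The `from_ts`/`until_ts` filters exercised above amount to a timestamp-range check on each item's creation time. A minimal sketch of that semantics, inferred from the assertions (the helper name and flat list-of-timestamps shape are ours; the window is treated as inclusive, which matches the "from_ts=ts1 gives 0 results" check since all prior media predate ts1):

```python
def media_count_in_range(created_ts_list, from_ts=None, until_ts=None):
    # Count media whose creation timestamp (ms) falls inside the
    # optional [from_ts, until_ts] window, mirroring the query params.
    return sum(
        1
        for ts in created_ts_list
        if (from_ts is None or ts >= from_ts)
        and (until_ts is None or ts <= until_ts)
    )
```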
355351 self._create_users_with_media(20, 1)
356352
357353 # check without filter get all users
358 request, channel = self.make_request(
359 "GET", self.url, access_token=self.admin_user_tok,
360 )
354 channel = self.make_request("GET", self.url, access_token=self.admin_user_tok,)
361355 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
362356 self.assertEqual(channel.json_body["total"], 20)
363357
364358 # filter user 1 and 10-19 by `user_id`
365 request, channel = self.make_request(
359 channel = self.make_request(
366360 "GET",
367361 self.url + "?search_term=foo_user_1",
368362 access_token=self.admin_user_tok,
371365 self.assertEqual(channel.json_body["total"], 11)
372366
373367 # filter on this user in `displayname`
374 request, channel = self.make_request(
368 channel = self.make_request(
375369 "GET",
376370 self.url + "?search_term=bar_user_10",
377371 access_token=self.admin_user_tok,
381375 self.assertEqual(channel.json_body["total"], 1)
382376
383377 # filter and get empty result
384 request, channel = self.make_request(
378 channel = self.make_request(
385379 "GET", self.url + "?search_term=foobar", access_token=self.admin_user_tok,
386380 )
387381 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
446440 url = self.url + "?order_by=%s" % (order_type,)
447441 if dir is not None and dir in ("b", "f"):
448442 url += "&dir=%s" % (dir,)
449 request, channel = self.make_request(
443 channel = self.make_request(
450444 "GET", url.encode("ascii"), access_token=self.admin_user_tok,
451445 )
452446 self.assertEqual(200, channel.code, msg=channel.json_body)
1717 import json
1818 import urllib.parse
1919 from binascii import unhexlify
20 from typing import Optional
2021
2122 from mock import Mock
2223
6970 """
7071 self.hs.config.registration_shared_secret = None
7172
72 request, channel = self.make_request("POST", self.url, b"{}")
73 channel = self.make_request("POST", self.url, b"{}")
7374
7475 self.assertEqual(400, int(channel.result["code"]), msg=channel.result["body"])
7576 self.assertEqual(
8687
8788 self.hs.get_secrets = Mock(return_value=secrets)
8889
89 request, channel = self.make_request("GET", self.url)
90 channel = self.make_request("GET", self.url)
9091
9192 self.assertEqual(channel.json_body, {"nonce": "abcd"})
9293
9596 Calling GET on the endpoint will return a randomised nonce, which will
9697 only last for SALT_TIMEOUT (60s).
9798 """
98 request, channel = self.make_request("GET", self.url)
99 channel = self.make_request("GET", self.url)
99100 nonce = channel.json_body["nonce"]
100101
101102 # 59 seconds
102103 self.reactor.advance(59)
103104
104105 body = json.dumps({"nonce": nonce})
105 request, channel = self.make_request("POST", self.url, body.encode("utf8"))
106 channel = self.make_request("POST", self.url, body.encode("utf8"))
106107
107108 self.assertEqual(400, int(channel.result["code"]), msg=channel.result["body"])
108109 self.assertEqual("username must be specified", channel.json_body["error"])
110111 # 61 seconds
111112 self.reactor.advance(2)
112113
113 request, channel = self.make_request("POST", self.url, body.encode("utf8"))
114 channel = self.make_request("POST", self.url, body.encode("utf8"))
114115
115116 self.assertEqual(400, int(channel.result["code"]), msg=channel.result["body"])
116117 self.assertEqual("unrecognised nonce", channel.json_body["error"])
119120 """
120121 Only the provided nonce can be used, as it's checked in the MAC.
121122 """
122 request, channel = self.make_request("GET", self.url)
123 channel = self.make_request("GET", self.url)
123124 nonce = channel.json_body["nonce"]
124125
125126 want_mac = hmac.new(key=b"shared", digestmod=hashlib.sha1)
135136 "mac": want_mac,
136137 }
137138 )
138 request, channel = self.make_request("POST", self.url, body.encode("utf8"))
139 channel = self.make_request("POST", self.url, body.encode("utf8"))
139140
140141 self.assertEqual(403, int(channel.result["code"]), msg=channel.result["body"])
141142 self.assertEqual("HMAC incorrect", channel.json_body["error"])
145146 When the correct nonce is provided, and the right key is provided, the
146147 user is registered.
147148 """
148 request, channel = self.make_request("GET", self.url)
149 channel = self.make_request("GET", self.url)
149150 nonce = channel.json_body["nonce"]
150151
151152 want_mac = hmac.new(key=b"shared", digestmod=hashlib.sha1)
164165 "mac": want_mac,
165166 }
166167 )
167 request, channel = self.make_request("POST", self.url, body.encode("utf8"))
168 channel = self.make_request("POST", self.url, body.encode("utf8"))
168169
169170 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
170171 self.assertEqual("@bob:test", channel.json_body["user_id"])
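The `want_mac` construction whose `update` calls are elided in the hunks above follows Synapse's shared-secret registration scheme: an HMAC-SHA1 over the NUL-joined nonce, username, password, and admin flag. A sketch of that digest as a standalone helper (the function name is ours; `b"shared"` matches the secret these tests configure):

```python
import hashlib
import hmac


def registration_mac(shared_secret: bytes, nonce: str, username: str,
                     password: str, admin: bool = False) -> str:
    # HMAC-SHA1 over the NUL-separated fields, as the
    # /_synapse/admin/v1/register endpoint expects.
    mac = hmac.new(key=shared_secret, digestmod=hashlib.sha1)
    mac.update(nonce.encode("utf8"))
    mac.update(b"\x00")
    mac.update(username.encode("utf8"))
    mac.update(b"\x00")
    mac.update(password.encode("utf8"))
    mac.update(b"\x00")
    mac.update(b"admin" if admin else b"notadmin")
    return mac.hexdigest()
```

The tests build the same digest inline before POSTing the registration body; a stale or reused nonce fails before the MAC is even checked, as the nonce tests above show.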
173174 """
174175 A valid unrecognised nonce.
175176 """
176 request, channel = self.make_request("GET", self.url)
177 channel = self.make_request("GET", self.url)
177178 nonce = channel.json_body["nonce"]
178179
179180 want_mac = hmac.new(key=b"shared", digestmod=hashlib.sha1)
189190 "mac": want_mac,
190191 }
191192 )
192 request, channel = self.make_request("POST", self.url, body.encode("utf8"))
193 channel = self.make_request("POST", self.url, body.encode("utf8"))
193194
194195 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
195196 self.assertEqual("@bob:test", channel.json_body["user_id"])
196197
197198 # Now, try and reuse it
198 request, channel = self.make_request("POST", self.url, body.encode("utf8"))
199 channel = self.make_request("POST", self.url, body.encode("utf8"))
199200
200201 self.assertEqual(400, int(channel.result["code"]), msg=channel.result["body"])
201202 self.assertEqual("unrecognised nonce", channel.json_body["error"])
208209 """
209210
210211 def nonce():
211 request, channel = self.make_request("GET", self.url)
212 channel = self.make_request("GET", self.url)
212213 return channel.json_body["nonce"]
213214
214215 #
217218
218219 # Must be present
219220 body = json.dumps({})
220 request, channel = self.make_request("POST", self.url, body.encode("utf8"))
221 channel = self.make_request("POST", self.url, body.encode("utf8"))
221222
222223 self.assertEqual(400, int(channel.result["code"]), msg=channel.result["body"])
223224 self.assertEqual("nonce must be specified", channel.json_body["error"])
228229
229230 # Must be present
230231 body = json.dumps({"nonce": nonce()})
231 request, channel = self.make_request("POST", self.url, body.encode("utf8"))
232 channel = self.make_request("POST", self.url, body.encode("utf8"))
232233
233234 self.assertEqual(400, int(channel.result["code"]), msg=channel.result["body"])
234235 self.assertEqual("username must be specified", channel.json_body["error"])
235236
236237 # Must be a string
237238 body = json.dumps({"nonce": nonce(), "username": 1234})
238 request, channel = self.make_request("POST", self.url, body.encode("utf8"))
239 channel = self.make_request("POST", self.url, body.encode("utf8"))
239240
240241 self.assertEqual(400, int(channel.result["code"]), msg=channel.result["body"])
241242 self.assertEqual("Invalid username", channel.json_body["error"])
242243
243244 # Must not have null bytes
244245 body = json.dumps({"nonce": nonce(), "username": "abcd\u0000"})
245 request, channel = self.make_request("POST", self.url, body.encode("utf8"))
246 channel = self.make_request("POST", self.url, body.encode("utf8"))
246247
247248 self.assertEqual(400, int(channel.result["code"]), msg=channel.result["body"])
248249 self.assertEqual("Invalid username", channel.json_body["error"])
249250
250251 # Must not have null bytes
251252 body = json.dumps({"nonce": nonce(), "username": "a" * 1000})
252 request, channel = self.make_request("POST", self.url, body.encode("utf8"))
253 channel = self.make_request("POST", self.url, body.encode("utf8"))
253254
254255 self.assertEqual(400, int(channel.result["code"]), msg=channel.result["body"])
255256 self.assertEqual("Invalid username", channel.json_body["error"])
260261
261262 # Must be present
262263 body = json.dumps({"nonce": nonce(), "username": "a"})
263 request, channel = self.make_request("POST", self.url, body.encode("utf8"))
264 channel = self.make_request("POST", self.url, body.encode("utf8"))
264265
265266 self.assertEqual(400, int(channel.result["code"]), msg=channel.result["body"])
266267 self.assertEqual("password must be specified", channel.json_body["error"])
267268
268269 # Must be a string
269270 body = json.dumps({"nonce": nonce(), "username": "a", "password": 1234})
270 request, channel = self.make_request("POST", self.url, body.encode("utf8"))
271 channel = self.make_request("POST", self.url, body.encode("utf8"))
271272
272273 self.assertEqual(400, int(channel.result["code"]), msg=channel.result["body"])
273274 self.assertEqual("Invalid password", channel.json_body["error"])
274275
275276 # Must not have null bytes
276277 body = json.dumps({"nonce": nonce(), "username": "a", "password": "abcd\u0000"})
277 request, channel = self.make_request("POST", self.url, body.encode("utf8"))
278 channel = self.make_request("POST", self.url, body.encode("utf8"))
278279
279280 self.assertEqual(400, int(channel.result["code"]), msg=channel.result["body"])
280281 self.assertEqual("Invalid password", channel.json_body["error"])
281282
282283 # Super long
283284 body = json.dumps({"nonce": nonce(), "username": "a", "password": "A" * 1000})
284 request, channel = self.make_request("POST", self.url, body.encode("utf8"))
285 channel = self.make_request("POST", self.url, body.encode("utf8"))
285286
286287 self.assertEqual(400, int(channel.result["code"]), msg=channel.result["body"])
287288 self.assertEqual("Invalid password", channel.json_body["error"])
299300 "user_type": "invalid",
300301 }
301302 )
302 request, channel = self.make_request("POST", self.url, body.encode("utf8"))
303 channel = self.make_request("POST", self.url, body.encode("utf8"))
303304
304305 self.assertEqual(400, int(channel.result["code"]), msg=channel.result["body"])
305306 self.assertEqual("Invalid user type", channel.json_body["error"])
310311 """
311312
312313 # set no displayname
313 request, channel = self.make_request("GET", self.url)
314 channel = self.make_request("GET", self.url)
314315 nonce = channel.json_body["nonce"]
315316
316317 want_mac = hmac.new(key=b"shared", digestmod=hashlib.sha1)
320321 body = json.dumps(
321322 {"nonce": nonce, "username": "bob1", "password": "abc123", "mac": want_mac}
322323 )
323 request, channel = self.make_request("POST", self.url, body.encode("utf8"))
324 channel = self.make_request("POST", self.url, body.encode("utf8"))
324325
325326 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
326327 self.assertEqual("@bob1:test", channel.json_body["user_id"])
327328
328 request, channel = self.make_request("GET", "/profile/@bob1:test/displayname")
329 channel = self.make_request("GET", "/profile/@bob1:test/displayname")
329330 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
330331 self.assertEqual("bob1", channel.json_body["displayname"])
331332
332333 # displayname is None
333 request, channel = self.make_request("GET", self.url)
334 channel = self.make_request("GET", self.url)
334335 nonce = channel.json_body["nonce"]
335336
336337 want_mac = hmac.new(key=b"shared", digestmod=hashlib.sha1)
346347 "mac": want_mac,
347348 }
348349 )
349 request, channel = self.make_request("POST", self.url, body.encode("utf8"))
350 channel = self.make_request("POST", self.url, body.encode("utf8"))
350351
351352 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
352353 self.assertEqual("@bob2:test", channel.json_body["user_id"])
353354
354 request, channel = self.make_request("GET", "/profile/@bob2:test/displayname")
355 channel = self.make_request("GET", "/profile/@bob2:test/displayname")
355356 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
356357 self.assertEqual("bob2", channel.json_body["displayname"])
357358
358359 # displayname is empty
359 request, channel = self.make_request("GET", self.url)
360 channel = self.make_request("GET", self.url)
360361 nonce = channel.json_body["nonce"]
361362
362363 want_mac = hmac.new(key=b"shared", digestmod=hashlib.sha1)
372373 "mac": want_mac,
373374 }
374375 )
375 request, channel = self.make_request("POST", self.url, body.encode("utf8"))
376 channel = self.make_request("POST", self.url, body.encode("utf8"))
376377
377378 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
378379 self.assertEqual("@bob3:test", channel.json_body["user_id"])
379380
380 request, channel = self.make_request("GET", "/profile/@bob3:test/displayname")
381 channel = self.make_request("GET", "/profile/@bob3:test/displayname")
381382 self.assertEqual(404, int(channel.result["code"]), msg=channel.result["body"])
382383
383384 # set displayname
384 request, channel = self.make_request("GET", self.url)
385 channel = self.make_request("GET", self.url)
385386 nonce = channel.json_body["nonce"]
386387
387388 want_mac = hmac.new(key=b"shared", digestmod=hashlib.sha1)
397398 "mac": want_mac,
398399 }
399400 )
400 request, channel = self.make_request("POST", self.url, body.encode("utf8"))
401 channel = self.make_request("POST", self.url, body.encode("utf8"))
401402
402403 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
403404 self.assertEqual("@bob4:test", channel.json_body["user_id"])
404405
405 request, channel = self.make_request("GET", "/profile/@bob4:test/displayname")
406 channel = self.make_request("GET", "/profile/@bob4:test/displayname")
406407 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
407408 self.assertEqual("Bob's Name", channel.json_body["displayname"])
408409
428429 )
429430
430431 # Register new user with admin API
431 request, channel = self.make_request("GET", self.url)
432 channel = self.make_request("GET", self.url)
432433 nonce = channel.json_body["nonce"]
433434
434435 want_mac = hmac.new(key=b"shared", digestmod=hashlib.sha1)
447448 "mac": want_mac,
448449 }
449450 )
450 request, channel = self.make_request("POST", self.url, body.encode("utf8"))
451 channel = self.make_request("POST", self.url, body.encode("utf8"))
451452
452453 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
453454 self.assertEqual("@bob:test", channel.json_body["user_id"])
465466 self.admin_user = self.register_user("admin", "pass", admin=True)
466467 self.admin_user_tok = self.login("admin", "pass")
467468
468 self.register_user("user1", "pass1", admin=False)
469 self.register_user("user2", "pass2", admin=False)
469 self.user1 = self.register_user(
470 "user1", "pass1", admin=False, displayname="Name 1"
471 )
472 self.user2 = self.register_user(
473 "user2", "pass2", admin=False, displayname="Name 2"
474 )
470475
471476 def test_no_auth(self):
472477 """
473478 Try to list users without authentication.
474479 """
475 request, channel = self.make_request("GET", self.url, b"{}")
480 channel = self.make_request("GET", self.url, b"{}")
476481
477482 self.assertEqual(401, int(channel.result["code"]), msg=channel.result["body"])
478 self.assertEqual("M_MISSING_TOKEN", channel.json_body["errcode"])
483 self.assertEqual(Codes.MISSING_TOKEN, channel.json_body["errcode"])
484
485 def test_requester_is_no_admin(self):
486 """
487 If the user is not a server admin, an error is returned.
488 """
489 other_user_token = self.login("user1", "pass1")
490
491 channel = self.make_request("GET", self.url, access_token=other_user_token)
492
493 self.assertEqual(403, int(channel.result["code"]), msg=channel.result["body"])
494 self.assertEqual(Codes.FORBIDDEN, channel.json_body["errcode"])
479495
480496 def test_all_users(self):
481497 """
482498 List all users, including deactivated users.
483499 """
484 request, channel = self.make_request(
500 channel = self.make_request(
485501 "GET",
486502 self.url + "?deactivated=true",
487503 b"{}",
491507 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
492508 self.assertEqual(3, len(channel.json_body["users"]))
493509 self.assertEqual(3, channel.json_body["total"])
510
511 # Check that all fields are available
512 for u in channel.json_body["users"]:
513 self.assertIn("name", u)
514 self.assertIn("is_guest", u)
515 self.assertIn("admin", u)
516 self.assertIn("user_type", u)
517 self.assertIn("deactivated", u)
518 self.assertIn("displayname", u)
519 self.assertIn("avatar_url", u)
520
521 def test_search_term(self):
522 """Test that searching for users works correctly"""
523
524 def _search_test(
525 expected_user_id: Optional[str],
526 search_term: str,
527 search_field: Optional[str] = "name",
528 expected_http_code: Optional[int] = 200,
529 ):
530 """Search for a user and check that the returned user's id is a match
531
532 Args:
533 expected_user_id: The user_id expected to be returned by the API. Set
534 to None to expect zero results for the search
535 search_term: The term to search for user names with
536 search_field: Field to search in: `name` or `user_id`
537 expected_http_code: The expected http code for the request
538 """
539 url = self.url + "?%s=%s" % (search_field, search_term,)
540 channel = self.make_request(
541 "GET", url.encode("ascii"), access_token=self.admin_user_tok,
542 )
543 self.assertEqual(expected_http_code, channel.code, msg=channel.json_body)
544
545 if expected_http_code != 200:
546 return
547
548 # Check that users were returned
549 self.assertTrue("users" in channel.json_body)
550 users = channel.json_body["users"]
551
552 # Check that the expected number of users were returned
553 expected_user_count = 1 if expected_user_id else 0
554 self.assertEqual(len(users), expected_user_count)
555 self.assertEqual(channel.json_body["total"], expected_user_count)
556
557 if expected_user_id:
558 # Check that the first returned user id is correct
559 u = users[0]
560 self.assertEqual(expected_user_id, u["name"])
561
562 # Perform search tests
563 _search_test(self.user1, "er1")
564 _search_test(self.user1, "me 1")
565
566 _search_test(self.user2, "er2")
567 _search_test(self.user2, "me 2")
568
569 _search_test(self.user1, "er1", "user_id")
570 _search_test(self.user2, "er2", "user_id")
571
572 # Test case insensitive
573 _search_test(self.user1, "ER1")
574 _search_test(self.user1, "NAME 1")
575
576 _search_test(self.user2, "ER2")
577 _search_test(self.user2, "NAME 2")
578
579 _search_test(self.user1, "ER1", "user_id")
580 _search_test(self.user2, "ER2", "user_id")
581
582 _search_test(None, "foo")
583 _search_test(None, "bar")
584
585 _search_test(None, "foo", "user_id")
586 _search_test(None, "bar", "user_id")
494587
495588
496589 class UserRestTestCase(unittest.HomeserverTestCase):
507600 self.admin_user = self.register_user("admin", "pass", admin=True)
508601 self.admin_user_tok = self.login("admin", "pass")
509602
510 self.other_user = self.register_user("user", "pass")
603 self.other_user = self.register_user("user", "pass", displayname="User")
511604 self.other_user_token = self.login("user", "pass")
512605 self.url_other_user = "/_synapse/admin/v2/users/%s" % urllib.parse.quote(
513606 self.other_user
519612 """
520613 url = "/_synapse/admin/v2/users/@bob:test"
521614
522 request, channel = self.make_request(
523 "GET", url, access_token=self.other_user_token,
524 )
615 channel = self.make_request("GET", url, access_token=self.other_user_token,)
525616
526617 self.assertEqual(403, int(channel.result["code"]), msg=channel.result["body"])
527618 self.assertEqual("You are not a server admin", channel.json_body["error"])
528619
529 request, channel = self.make_request(
620 channel = self.make_request(
530621 "PUT", url, access_token=self.other_user_token, content=b"{}",
531622 )
532623
538629 Tests that a lookup for a user that does not exist returns a 404
539630 """
540631
541 request, channel = self.make_request(
632 channel = self.make_request(
542633 "GET",
543634 "/_synapse/admin/v2/users/@unknown_person:test",
544635 access_token=self.admin_user_tok,
560651 "admin": True,
561652 "displayname": "Bob's name",
562653 "threepids": [{"medium": "email", "address": "bob@bob.bob"}],
563 "avatar_url": None,
654 "avatar_url": "mxc://fibble/wibble",
564655 }
565656 )
566657
567 request, channel = self.make_request(
658 channel = self.make_request(
568659 "PUT",
569660 url,
570661 access_token=self.admin_user_tok,
577668 self.assertEqual("email", channel.json_body["threepids"][0]["medium"])
578669 self.assertEqual("bob@bob.bob", channel.json_body["threepids"][0]["address"])
579670 self.assertEqual(True, channel.json_body["admin"])
671 self.assertEqual("mxc://fibble/wibble", channel.json_body["avatar_url"])
580672
581673 # Get user
582 request, channel = self.make_request(
583 "GET", url, access_token=self.admin_user_tok,
584 )
674 channel = self.make_request("GET", url, access_token=self.admin_user_tok,)
585675
586676 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
587677 self.assertEqual("@bob:test", channel.json_body["name"])
591681 self.assertEqual(True, channel.json_body["admin"])
592682 self.assertEqual(False, channel.json_body["is_guest"])
593683 self.assertEqual(False, channel.json_body["deactivated"])
684 self.assertEqual("mxc://fibble/wibble", channel.json_body["avatar_url"])
594685
595686 def test_create_user(self):
596687 """
605696 "admin": False,
606697 "displayname": "Bob's name",
607698 "threepids": [{"medium": "email", "address": "bob@bob.bob"}],
699 "avatar_url": "mxc://fibble/wibble",
608700 }
609701 )
610702
611 request, channel = self.make_request(
703 channel = self.make_request(
612704 "PUT",
613705 url,
614706 access_token=self.admin_user_tok,
621713 self.assertEqual("email", channel.json_body["threepids"][0]["medium"])
622714 self.assertEqual("bob@bob.bob", channel.json_body["threepids"][0]["address"])
623715 self.assertEqual(False, channel.json_body["admin"])
716 self.assertEqual("mxc://fibble/wibble", channel.json_body["avatar_url"])
624717
625718 # Get user
626 request, channel = self.make_request(
627 "GET", url, access_token=self.admin_user_tok,
628 )
719 channel = self.make_request("GET", url, access_token=self.admin_user_tok,)
629720
630721 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
631722 self.assertEqual("@bob:test", channel.json_body["name"])
635726 self.assertEqual(False, channel.json_body["admin"])
636727 self.assertEqual(False, channel.json_body["is_guest"])
637728 self.assertEqual(False, channel.json_body["deactivated"])
729 self.assertEqual("mxc://fibble/wibble", channel.json_body["avatar_url"])
638730
639731 @override_config(
640732 {"limit_usage_by_mau": True, "max_mau_value": 2, "mau_trial_days": 0}
650742
651743 # Sync to set admin user to active
652744 # before limit of monthly active users is reached
653 request, channel = self.make_request(
654 "GET", "/sync", access_token=self.admin_user_tok
655 )
745 channel = self.make_request("GET", "/sync", access_token=self.admin_user_tok)
656746
657747 if channel.code != 200:
658748 raise HttpResponseException(
675765 # Create user
676766 body = json.dumps({"password": "abc123", "admin": False})
677767
678 request, channel = self.make_request(
768 channel = self.make_request(
679769 "PUT",
680770 url,
681771 access_token=self.admin_user_tok,
714804 # Create user
715805 body = json.dumps({"password": "abc123", "admin": False})
716806
717 request, channel = self.make_request(
807 channel = self.make_request(
718808 "PUT",
719809 url,
720810 access_token=self.admin_user_tok,
751841 }
752842 )
753843
754 request, channel = self.make_request(
844 channel = self.make_request(
755845 "PUT",
756846 url,
757847 access_token=self.admin_user_tok,
768858 )
769859 pushers = list(pushers)
770860 self.assertEqual(len(pushers), 1)
771 self.assertEqual("@bob:test", pushers[0]["user_name"])
861 self.assertEqual("@bob:test", pushers[0].user_name)
772862
773863 @override_config(
774864 {
795885 }
796886 )
797887
798 request, channel = self.make_request(
888 channel = self.make_request(
799889 "PUT",
800890 url,
801891 access_token=self.admin_user_tok,
821911 # Change password
822912 body = json.dumps({"password": "hahaha"})
823913
824 request, channel = self.make_request(
914 channel = self.make_request(
825915 "PUT",
826916 self.url_other_user,
827917 access_token=self.admin_user_tok,
838928 # Modify user
839929 body = json.dumps({"displayname": "foobar"})
840930
841 request, channel = self.make_request(
931 channel = self.make_request(
842932 "PUT",
843933 self.url_other_user,
844934 access_token=self.admin_user_tok,
850940 self.assertEqual("foobar", channel.json_body["displayname"])
851941
852942 # Get user
853 request, channel = self.make_request(
943 channel = self.make_request(
854944 "GET", self.url_other_user, access_token=self.admin_user_tok,
855945 )
856946
868958 {"threepids": [{"medium": "email", "address": "bob3@bob.bob"}]}
869959 )
870960
871 request, channel = self.make_request(
961 channel = self.make_request(
872962 "PUT",
873963 self.url_other_user,
874964 access_token=self.admin_user_tok,
881971 self.assertEqual("bob3@bob.bob", channel.json_body["threepids"][0]["address"])
882972
883973 # Get user
884 request, channel = self.make_request(
974 channel = self.make_request(
885975 "GET", self.url_other_user, access_token=self.admin_user_tok,
886976 )
887977
898988 # Deactivate user
899989 body = json.dumps({"deactivated": True})
900990
901 request, channel = self.make_request(
991 channel = self.make_request(
902992 "PUT",
903993 self.url_other_user,
904994 access_token=self.admin_user_tok,
9111001 # the user is deactivated, the threepid will be deleted
9121002
9131003 # Get user
914 request, channel = self.make_request(
1004 channel = self.make_request(
9151005 "GET", self.url_other_user, access_token=self.admin_user_tok,
9161006 )
9171007
9191009 self.assertEqual("@user:test", channel.json_body["name"])
9201010 self.assertEqual(True, channel.json_body["deactivated"])
9211011
1012 @override_config({"user_directory": {"enabled": True, "search_all_users": True}})
1013 def test_change_name_deactivate_user_user_directory(self):
1014 """
1015 Test change profile information of a deactivated user and
1016 check that it does not appear in user directory
1017 """
1018
1019 # is in user directory
1020 profile = self.get_success(self.store.get_user_in_directory(self.other_user))
1021 self.assertTrue(profile["display_name"] == "User")
1022
1023 # Deactivate user
1024 body = json.dumps({"deactivated": True})
1025
1026 channel = self.make_request(
1027 "PUT",
1028 self.url_other_user,
1029 access_token=self.admin_user_tok,
1030 content=body.encode(encoding="utf_8"),
1031 )
1032
1033 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
1034 self.assertEqual("@user:test", channel.json_body["name"])
1035 self.assertEqual(True, channel.json_body["deactivated"])
1036
1037 # is not in user directory
1038 profile = self.get_success(self.store.get_user_in_directory(self.other_user))
1039 self.assertTrue(profile is None)
1040
1041 # Set a new displayname for the user
1042 body = json.dumps({"displayname": "Foobar"})
1043
1044 channel = self.make_request(
1045 "PUT",
1046 self.url_other_user,
1047 access_token=self.admin_user_tok,
1048 content=body.encode(encoding="utf_8"),
1049 )
1050
1051 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
1052 self.assertEqual("@user:test", channel.json_body["name"])
1053 self.assertEqual(True, channel.json_body["deactivated"])
1054 self.assertEqual("Foobar", channel.json_body["displayname"])
1055
1056 # is not in user directory
1057 profile = self.get_success(self.store.get_user_in_directory(self.other_user))
1058 self.assertTrue(profile is None)
1059
9221060 def test_reactivate_user(self):
9231061 """
9241062 Test reactivating another user.
9251063 """
9261064
9271065 # Deactivate the user.
928 request, channel = self.make_request(
1066 channel = self.make_request(
9291067 "PUT",
9301068 self.url_other_user,
9311069 access_token=self.admin_user_tok,
9381076 self._is_erased("@user:test", True)
9391077
9401078 # Attempt to reactivate the user (without a password).
941 request, channel = self.make_request(
1079 channel = self.make_request(
9421080 "PUT",
9431081 self.url_other_user,
9441082 access_token=self.admin_user_tok,
9471085 self.assertEqual(400, int(channel.result["code"]), msg=channel.result["body"])
9481086
9491087 # Reactivate the user.
950 request, channel = self.make_request(
1088 channel = self.make_request(
9511089 "PUT",
9521090 self.url_other_user,
9531091 access_token=self.admin_user_tok,
9581096 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
9591097
9601098 # Get user
961 request, channel = self.make_request(
1099 channel = self.make_request(
9621100 "GET", self.url_other_user, access_token=self.admin_user_tok,
9631101 )
9641102
9751113 # Set a user as an admin
9761114 body = json.dumps({"admin": True})
9771115
978 request, channel = self.make_request(
1116 channel = self.make_request(
9791117 "PUT",
9801118 self.url_other_user,
9811119 access_token=self.admin_user_tok,
9871125 self.assertEqual(True, channel.json_body["admin"])
9881126
9891127 # Get user
990 request, channel = self.make_request(
1128 channel = self.make_request(
9911129 "GET", self.url_other_user, access_token=self.admin_user_tok,
9921130 )
9931131
10051143 # Create user
10061144 body = json.dumps({"password": "abc123"})
10071145
1008 request, channel = self.make_request(
1146 channel = self.make_request(
10091147 "PUT",
10101148 url,
10111149 access_token=self.admin_user_tok,
10171155 self.assertEqual("bob", channel.json_body["displayname"])
10181156
10191157 # Get user
1020 request, channel = self.make_request(
1021 "GET", url, access_token=self.admin_user_tok,
1022 )
1158 channel = self.make_request("GET", url, access_token=self.admin_user_tok,)
10231159
10241160 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
10251161 self.assertEqual("@bob:test", channel.json_body["name"])
10291165 # Change password (and use a str for deactivate instead of a bool)
10301166 body = json.dumps({"password": "abc123", "deactivated": "false"}) # oops!
10311167
1032 request, channel = self.make_request(
1168 channel = self.make_request(
10331169 "PUT",
10341170 url,
10351171 access_token=self.admin_user_tok,
10391175 self.assertEqual(400, int(channel.result["code"]), msg=channel.result["body"])
10401176
10411177 # Check user is not deactivated
1042 request, channel = self.make_request(
1043 "GET", url, access_token=self.admin_user_tok,
1044 )
1178 channel = self.make_request("GET", url, access_token=self.admin_user_tok,)
10451179
10461180 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
10471181 self.assertEqual("@bob:test", channel.json_body["name"])
10831217 """
10841218 Try to list rooms of an user without authentication.
10851219 """
1086 request, channel = self.make_request("GET", self.url, b"{}")
1220 channel = self.make_request("GET", self.url, b"{}")
10871221
10881222 self.assertEqual(401, int(channel.result["code"]), msg=channel.result["body"])
10891223 self.assertEqual(Codes.MISSING_TOKEN, channel.json_body["errcode"])
10941228 """
10951229 other_user_token = self.login("user", "pass")
10961230
1097 request, channel = self.make_request(
1098 "GET", self.url, access_token=other_user_token,
1099 )
1231 channel = self.make_request("GET", self.url, access_token=other_user_token,)
11001232
11011233 self.assertEqual(403, int(channel.result["code"]), msg=channel.result["body"])
11021234 self.assertEqual(Codes.FORBIDDEN, channel.json_body["errcode"])
11061238 Tests that a lookup for a user that does not exist returns a 404
11071239 """
11081240 url = "/_synapse/admin/v1/users/@unknown_person:test/joined_rooms"
1109 request, channel = self.make_request(
1110 "GET", url, access_token=self.admin_user_tok,
1111 )
1241 channel = self.make_request("GET", url, access_token=self.admin_user_tok,)
11121242
11131243 self.assertEqual(404, channel.code, msg=channel.json_body)
11141244 self.assertEqual(Codes.NOT_FOUND, channel.json_body["errcode"])
11191249 """
11201250 url = "/_synapse/admin/v1/users/@unknown_person:unknown_domain/joined_rooms"
11211251
1122 request, channel = self.make_request(
1123 "GET", url, access_token=self.admin_user_tok,
1124 )
1252 channel = self.make_request("GET", url, access_token=self.admin_user_tok,)
11251253
11261254 self.assertEqual(400, channel.code, msg=channel.json_body)
11271255 self.assertEqual("Can only lookup local users", channel.json_body["error"])
11321260 if user has no memberships
11331261 """
11341262 # Get rooms
1135 request, channel = self.make_request(
1136 "GET", self.url, access_token=self.admin_user_tok,
1137 )
1263 channel = self.make_request("GET", self.url, access_token=self.admin_user_tok,)
11381264
11391265 self.assertEqual(200, channel.code, msg=channel.json_body)
11401266 self.assertEqual(0, channel.json_body["total"])
11511277 self.helper.create_room_as(self.other_user, tok=other_user_tok)
11521278
11531279 # Get rooms
1154 request, channel = self.make_request(
1155 "GET", self.url, access_token=self.admin_user_tok,
1156 )
1280 channel = self.make_request("GET", self.url, access_token=self.admin_user_tok,)
11571281
11581282 self.assertEqual(200, channel.code, msg=channel.json_body)
11591283 self.assertEqual(number_rooms, channel.json_body["total"])
11821306 """
11831307 Try to list pushers of a user without authentication.
11841308 """
1185 request, channel = self.make_request("GET", self.url, b"{}")
1309 channel = self.make_request("GET", self.url, b"{}")
11861310
11871311 self.assertEqual(401, int(channel.result["code"]), msg=channel.result["body"])
11881312 self.assertEqual(Codes.MISSING_TOKEN, channel.json_body["errcode"])
11931317 """
11941318 other_user_token = self.login("user", "pass")
11951319
1196 request, channel = self.make_request(
1197 "GET", self.url, access_token=other_user_token,
1198 )
1320 channel = self.make_request("GET", self.url, access_token=other_user_token,)
11991321
12001322 self.assertEqual(403, int(channel.result["code"]), msg=channel.result["body"])
12011323 self.assertEqual(Codes.FORBIDDEN, channel.json_body["errcode"])
12051327 Tests that a lookup for a user that does not exist returns a 404
12061328 """
12071329 url = "/_synapse/admin/v1/users/@unknown_person:test/pushers"
1208 request, channel = self.make_request(
1209 "GET", url, access_token=self.admin_user_tok,
1210 )
1330 channel = self.make_request("GET", url, access_token=self.admin_user_tok,)
12111331
12121332 self.assertEqual(404, channel.code, msg=channel.json_body)
12131333 self.assertEqual(Codes.NOT_FOUND, channel.json_body["errcode"])
12181338 """
12191339 url = "/_synapse/admin/v1/users/@unknown_person:unknown_domain/pushers"
12201340
1221 request, channel = self.make_request(
1222 "GET", url, access_token=self.admin_user_tok,
1223 )
1341 channel = self.make_request("GET", url, access_token=self.admin_user_tok,)
12241342
12251343 self.assertEqual(400, channel.code, msg=channel.json_body)
12261344 self.assertEqual("Can only lookup local users", channel.json_body["error"])
12311349 """
12321350
12331351 # Get pushers
1234 request, channel = self.make_request(
1235 "GET", self.url, access_token=self.admin_user_tok,
1236 )
1352 channel = self.make_request("GET", self.url, access_token=self.admin_user_tok,)
12371353
12381354 self.assertEqual(200, channel.code, msg=channel.json_body)
12391355 self.assertEqual(0, channel.json_body["total"])
12551371 device_display_name="pushy push",
12561372 pushkey="a@example.com",
12571373 lang=None,
1258 data={"url": "example.com"},
1374 data={"url": "https://example.com/_matrix/push/v1/notify"},
12591375 )
12601376 )
12611377
12621378 # Get pushers
1263 request, channel = self.make_request(
1264 "GET", self.url, access_token=self.admin_user_tok,
1265 )
1379 channel = self.make_request("GET", self.url, access_token=self.admin_user_tok,)
12661380
12671381 self.assertEqual(200, channel.code, msg=channel.json_body)
12681382 self.assertEqual(1, channel.json_body["total"])
13011415 """
13021416 Try to list media of a user without authentication.
13031417 """
1304 request, channel = self.make_request("GET", self.url, b"{}")
1418 channel = self.make_request("GET", self.url, b"{}")
13051419
13061420 self.assertEqual(401, int(channel.result["code"]), msg=channel.result["body"])
13071421 self.assertEqual(Codes.MISSING_TOKEN, channel.json_body["errcode"])
13121426 """
13131427 other_user_token = self.login("user", "pass")
13141428
1315 request, channel = self.make_request(
1316 "GET", self.url, access_token=other_user_token,
1317 )
1429 channel = self.make_request("GET", self.url, access_token=other_user_token,)
13181430
13191431 self.assertEqual(403, int(channel.result["code"]), msg=channel.result["body"])
13201432 self.assertEqual(Codes.FORBIDDEN, channel.json_body["errcode"])
13241436 Tests that a lookup for a user that does not exist returns a 404
13251437 """
13261438 url = "/_synapse/admin/v1/users/@unknown_person:test/media"
1327 request, channel = self.make_request(
1328 "GET", url, access_token=self.admin_user_tok,
1329 )
1439 channel = self.make_request("GET", url, access_token=self.admin_user_tok,)
13301440
13311441 self.assertEqual(404, channel.code, msg=channel.json_body)
13321442 self.assertEqual(Codes.NOT_FOUND, channel.json_body["errcode"])
13371447 """
13381448 url = "/_synapse/admin/v1/users/@unknown_person:unknown_domain/media"
13391449
1340 request, channel = self.make_request(
1341 "GET", url, access_token=self.admin_user_tok,
1342 )
1450 channel = self.make_request("GET", url, access_token=self.admin_user_tok,)
13431451
13441452 self.assertEqual(400, channel.code, msg=channel.json_body)
13451453 self.assertEqual("Can only lookup local users", channel.json_body["error"])
13531461 other_user_tok = self.login("user", "pass")
13541462 self._create_media(other_user_tok, number_media)
13551463
1356 request, channel = self.make_request(
1464 channel = self.make_request(
13571465 "GET", self.url + "?limit=5", access_token=self.admin_user_tok,
13581466 )
13591467
13721480 other_user_tok = self.login("user", "pass")
13731481 self._create_media(other_user_tok, number_media)
13741482
1375 request, channel = self.make_request(
1483 channel = self.make_request(
13761484 "GET", self.url + "?from=5", access_token=self.admin_user_tok,
13771485 )
13781486
13911499 other_user_tok = self.login("user", "pass")
13921500 self._create_media(other_user_tok, number_media)
13931501
1394 request, channel = self.make_request(
1502 channel = self.make_request(
13951503 "GET", self.url + "?from=5&limit=10", access_token=self.admin_user_tok,
13961504 )
13971505
14061514 Testing that a negative limit parameter returns a 400
14071515 """
14081516
1409 request, channel = self.make_request(
1517 channel = self.make_request(
14101518 "GET", self.url + "?limit=-5", access_token=self.admin_user_tok,
14111519 )
14121520
14181526 Testing that a negative from parameter returns a 400
14191527 """
14201528
1421 request, channel = self.make_request(
1529 channel = self.make_request(
14221530 "GET", self.url + "?from=-5", access_token=self.admin_user_tok,
14231531 )
14241532
14361544
14371545 # `next_token` does not appear
14381546 # Number of results is the number of entries
1439 request, channel = self.make_request(
1547 channel = self.make_request(
14401548 "GET", self.url + "?limit=20", access_token=self.admin_user_tok,
14411549 )
14421550
14471555
14481556 # `next_token` does not appear
14491557 # Number of max results is larger than the number of entries
1450 request, channel = self.make_request(
1558 channel = self.make_request(
14511559 "GET", self.url + "?limit=21", access_token=self.admin_user_tok,
14521560 )
14531561
14581566
14591567 # `next_token` does appear
14601568 # Number of max results is smaller than the number of entries
1461 request, channel = self.make_request(
1569 channel = self.make_request(
14621570 "GET", self.url + "?limit=19", access_token=self.admin_user_tok,
14631571 )
14641572
14701578 # Check
14711579 # Set `from` to value of `next_token` for request remaining entries
14721580 # `next_token` does not appear
1473 request, channel = self.make_request(
1581 channel = self.make_request(
14741582 "GET", self.url + "?from=19", access_token=self.admin_user_tok,
14751583 )
14761584
14851593 if user has no media created
14861594 """
14871595
1488 request, channel = self.make_request(
1489 "GET", self.url, access_token=self.admin_user_tok,
1490 )
1596 channel = self.make_request("GET", self.url, access_token=self.admin_user_tok,)
14911597
14921598 self.assertEqual(200, channel.code, msg=channel.json_body)
14931599 self.assertEqual(0, channel.json_body["total"])
15021608 other_user_tok = self.login("user", "pass")
15031609 self._create_media(other_user_tok, number_media)
15041610
1505 request, channel = self.make_request(
1506 "GET", self.url, access_token=self.admin_user_tok,
1507 )
1611 channel = self.make_request("GET", self.url, access_token=self.admin_user_tok,)
15081612
15091613 self.assertEqual(200, channel.code, msg=channel.json_body)
15101614 self.assertEqual(number_media, channel.json_body["total"])
15701674 )
15711675
15721676 def _get_token(self) -> str:
1573 request, channel = self.make_request(
1677 channel = self.make_request(
15741678 "POST", self.url, b"{}", access_token=self.admin_user_tok
15751679 )
15761680 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
15791683 def test_no_auth(self):
15801684 """Try to login as a user without authentication.
15811685 """
1582 request, channel = self.make_request("POST", self.url, b"{}")
1686 channel = self.make_request("POST", self.url, b"{}")
15831687
15841688 self.assertEqual(401, int(channel.result["code"]), msg=channel.result["body"])
15851689 self.assertEqual(Codes.MISSING_TOKEN, channel.json_body["errcode"])
15871691 def test_not_admin(self):
15881692 """Try to login as a user as a non-admin user.
15891693 """
1590 request, channel = self.make_request(
1694 channel = self.make_request(
15911695 "POST", self.url, b"{}", access_token=self.other_user_tok
15921696 )
15931697
16151719 self._get_token()
16161720
16171721 # Check that we don't see a new device in our devices list
1618 request, channel = self.make_request(
1722 channel = self.make_request(
16191723 "GET", "devices", b"{}", access_token=self.other_user_tok
16201724 )
16211725 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
16301734 puppet_token = self._get_token()
16311735
16321736 # Test that we can successfully make a request
1633 request, channel = self.make_request(
1634 "GET", "devices", b"{}", access_token=puppet_token
1635 )
1737 channel = self.make_request("GET", "devices", b"{}", access_token=puppet_token)
16361738 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
16371739
16381740 # Logout with the puppet token
1639 request, channel = self.make_request(
1640 "POST", "logout", b"{}", access_token=puppet_token
1641 )
1741 channel = self.make_request("POST", "logout", b"{}", access_token=puppet_token)
16421742 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
16431743
16441744 # The puppet token should no longer work
1645 request, channel = self.make_request(
1646 "GET", "devices", b"{}", access_token=puppet_token
1647 )
1745 channel = self.make_request("GET", "devices", b"{}", access_token=puppet_token)
16481746 self.assertEqual(401, int(channel.result["code"]), msg=channel.result["body"])
16491747
16501748 # .. but the real user's tokens should still work
1651 request, channel = self.make_request(
1749 channel = self.make_request(
16521750 "GET", "devices", b"{}", access_token=self.other_user_tok
16531751 )
16541752 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
16611759 puppet_token = self._get_token()
16621760
16631761 # Test that we can successfully make a request
1664 request, channel = self.make_request(
1665 "GET", "devices", b"{}", access_token=puppet_token
1666 )
1762 channel = self.make_request("GET", "devices", b"{}", access_token=puppet_token)
16671763 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
16681764
16691765 # Logout all with the real user token
1670 request, channel = self.make_request(
1766 channel = self.make_request(
16711767 "POST", "logout/all", b"{}", access_token=self.other_user_tok
16721768 )
16731769 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
16741770
16751771 # The puppet token should still work
1676 request, channel = self.make_request(
1677 "GET", "devices", b"{}", access_token=puppet_token
1678 )
1772 channel = self.make_request("GET", "devices", b"{}", access_token=puppet_token)
16791773 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
16801774
16811775 # .. but the real user's tokens shouldn't
1682 request, channel = self.make_request(
1776 channel = self.make_request(
16831777 "GET", "devices", b"{}", access_token=self.other_user_tok
16841778 )
16851779 self.assertEqual(401, int(channel.result["code"]), msg=channel.result["body"])
16921786 puppet_token = self._get_token()
16931787
16941788 # Test that we can successfully make a request
1695 request, channel = self.make_request(
1696 "GET", "devices", b"{}", access_token=puppet_token
1697 )
1789 channel = self.make_request("GET", "devices", b"{}", access_token=puppet_token)
16981790 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
16991791
17001792 # Logout all with the admin user token
1701 request, channel = self.make_request(
1793 channel = self.make_request(
17021794 "POST", "logout/all", b"{}", access_token=self.admin_user_tok
17031795 )
17041796 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
17051797
17061798 # The puppet token should no longer work
1707 request, channel = self.make_request(
1708 "GET", "devices", b"{}", access_token=puppet_token
1709 )
1799 channel = self.make_request("GET", "devices", b"{}", access_token=puppet_token)
17101800 self.assertEqual(401, int(channel.result["code"]), msg=channel.result["body"])
17111801
17121802 # .. but the real user's tokens should still work
1713 request, channel = self.make_request(
1803 channel = self.make_request(
17141804 "GET", "devices", b"{}", access_token=self.other_user_tok
17151805 )
17161806 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
17921882 """
17931883 Try to get information about a user without authentication.
17941884 """
1795 request, channel = self.make_request("GET", self.url1, b"{}")
1885 channel = self.make_request("GET", self.url1, b"{}")
17961886 self.assertEqual(401, int(channel.result["code"]), msg=channel.result["body"])
17971887 self.assertEqual(Codes.MISSING_TOKEN, channel.json_body["errcode"])
17981888
1799 request, channel = self.make_request("GET", self.url2, b"{}")
1889 channel = self.make_request("GET", self.url2, b"{}")
18001890 self.assertEqual(401, int(channel.result["code"]), msg=channel.result["body"])
18011891 self.assertEqual(Codes.MISSING_TOKEN, channel.json_body["errcode"])
18021892
18071897 self.register_user("user2", "pass")
18081898 other_user2_token = self.login("user2", "pass")
18091899
1810 request, channel = self.make_request(
1811 "GET", self.url1, access_token=other_user2_token,
1812 )
1900 channel = self.make_request("GET", self.url1, access_token=other_user2_token,)
18131901 self.assertEqual(403, int(channel.result["code"]), msg=channel.result["body"])
18141902 self.assertEqual(Codes.FORBIDDEN, channel.json_body["errcode"])
18151903
1816 request, channel = self.make_request(
1817 "GET", self.url2, access_token=other_user2_token,
1818 )
1904 channel = self.make_request("GET", self.url2, access_token=other_user2_token,)
18191905 self.assertEqual(403, int(channel.result["code"]), msg=channel.result["body"])
18201906 self.assertEqual(Codes.FORBIDDEN, channel.json_body["errcode"])
18211907
18261912 url1 = "/_synapse/admin/v1/whois/@unknown_person:unknown_domain"
18271913 url2 = "/_matrix/client/r0/admin/whois/@unknown_person:unknown_domain"
18281914
1829 request, channel = self.make_request(
1830 "GET", url1, access_token=self.admin_user_tok,
1831 )
1915 channel = self.make_request("GET", url1, access_token=self.admin_user_tok,)
18321916 self.assertEqual(400, channel.code, msg=channel.json_body)
18331917 self.assertEqual("Can only whois a local user", channel.json_body["error"])
18341918
1835 request, channel = self.make_request(
1836 "GET", url2, access_token=self.admin_user_tok,
1837 )
1919 channel = self.make_request("GET", url2, access_token=self.admin_user_tok,)
18381920 self.assertEqual(400, channel.code, msg=channel.json_body)
18391921 self.assertEqual("Can only whois a local user", channel.json_body["error"])
18401922
18421924 """
18431925 The lookup should succeed for an admin.
18441926 """
1845 request, channel = self.make_request(
1846 "GET", self.url1, access_token=self.admin_user_tok,
1847 )
1927 channel = self.make_request("GET", self.url1, access_token=self.admin_user_tok,)
18481928 self.assertEqual(200, channel.code, msg=channel.json_body)
18491929 self.assertEqual(self.other_user, channel.json_body["user_id"])
18501930 self.assertIn("devices", channel.json_body)
18511931
1852 request, channel = self.make_request(
1853 "GET", self.url2, access_token=self.admin_user_tok,
1854 )
1932 channel = self.make_request("GET", self.url2, access_token=self.admin_user_tok,)
18551933 self.assertEqual(200, channel.code, msg=channel.json_body)
18561934 self.assertEqual(self.other_user, channel.json_body["user_id"])
18571935 self.assertIn("devices", channel.json_body)
18621940 """
18631941 other_user_token = self.login("user", "pass")
18641942
1865 request, channel = self.make_request(
1866 "GET", self.url1, access_token=other_user_token,
1867 )
1943 channel = self.make_request("GET", self.url1, access_token=other_user_token,)
18681944 self.assertEqual(200, channel.code, msg=channel.json_body)
18691945 self.assertEqual(self.other_user, channel.json_body["user_id"])
18701946 self.assertIn("devices", channel.json_body)
18711947
1872 request, channel = self.make_request(
1873 "GET", self.url2, access_token=other_user_token,
1874 )
1948 channel = self.make_request("GET", self.url2, access_token=other_user_token,)
18751949 self.assertEqual(200, channel.code, msg=channel.json_body)
18761950 self.assertEqual(self.other_user, channel.json_body["user_id"])
18771951 self.assertIn("devices", channel.json_body)
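The pattern repeated throughout these hunks is the Synapse 1.25.0 test-harness change: `make_request` now returns only the channel, since the `request` half of the old `(request, channel)` tuple was almost never used by tests. A minimal sketch of the before/after calling convention — the `FakeChannel` and `make_request` stand-ins below are illustrative, not the real harness from Synapse's `tests.server` module:

```python
# Illustrative stand-ins; the real FakeChannel and make_request live in
# Synapse's tests.server module and drive a fake Twisted reactor.
class FakeChannel:
    def __init__(self, code: int, json_body: dict):
        self.code = code
        self.json_body = json_body


def make_request(method: str, path: str, content: bytes = b"{}") -> FakeChannel:
    # Post-1.25.0 convention: return only the channel, not (request, channel).
    return FakeChannel(200, {"user_id": "@other_user:test"})


# Old style:  request, channel = make_request(...)   # request discarded
# New style:
channel = make_request("GET", "/_synapse/admin/v1/whois/@other_user:test")
assert channel.code == 200
assert channel.json_body["user_id"] == "@other_user:test"
```

Dropping the unused half of the tuple is what lets so many of these call sites collapse from three lines back onto one.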
6060 def test_render_public_consent(self):
6161 """You can observe the terms form without specifying a user"""
6262 resource = consent_resource.ConsentResource(self.hs)
63 request, channel = make_request(
63 channel = make_request(
6464 self.reactor, FakeSite(resource), "GET", "/consent?v=1", shorthand=False
6565 )
6666 self.assertEqual(channel.code, 200)
8181 uri_builder.build_user_consent_uri(user_id).replace("_matrix/", "")
8282 + "&u=user"
8383 )
84 request, channel = make_request(
84 channel = make_request(
8585 self.reactor,
8686 FakeSite(resource),
8787 "GET",
9696 self.assertEqual(consented, "False")
9797
9898 # POST to the consent page, saying we've agreed
99 request, channel = make_request(
99 channel = make_request(
100100 self.reactor,
101101 FakeSite(resource),
102102 "POST",
108108
109109 # Fetch the consent page, to get the consent version -- it should have
110110 # changed
111 request, channel = make_request(
111 channel = make_request(
112112 self.reactor,
113113 FakeSite(resource),
114114 "GET",
9292 def get_event(self, room_id, event_id, expected_code=200):
9393 url = "/_matrix/client/r0/rooms/%s/event/%s" % (room_id, event_id)
9494
95 request, channel = self.make_request("GET", url)
95 channel = self.make_request("GET", url)
9696
9797 self.assertEqual(channel.code, expected_code, channel.result)
9898
4242 self.register_user("kermit", "monkey")
4343 tok = self.login("kermit", "monkey")
4444
45 request, channel = self.make_request(
46 b"POST", "/createRoom", b"{}", access_token=tok
47 )
45 channel = self.make_request(b"POST", "/createRoom", b"{}", access_token=tok)
4846 self.assertEquals(channel.result["code"], b"200", channel.result)
4947 room_id = channel.json_body["room_id"]
5048
5553 }
5654 request_data = json.dumps(params)
5755 request_url = ("/rooms/%s/invite" % (room_id)).encode("ascii")
58 request, channel = self.make_request(
56 channel = self.make_request(
5957 b"POST", request_url, request_data, access_token=tok
6058 )
6159 self.assertEquals(channel.result["code"], b"403", channel.result)
6868 """
6969 path = "/_matrix/client/r0/rooms/%s/redact/%s" % (room_id, event_id)
7070
71 request, channel = self.make_request(
72 "POST", path, content={}, access_token=access_token
73 )
71 channel = self.make_request("POST", path, content={}, access_token=access_token)
7472 self.assertEqual(int(channel.result["code"]), expect_code)
7573 return channel.json_body
7674
7775 def _sync_room_timeline(self, access_token, room_id):
78 request, channel = self.make_request(
79 "GET", "sync", access_token=self.mod_access_token
80 )
76 channel = self.make_request("GET", "sync", access_token=self.mod_access_token)
8177 self.assertEqual(channel.result["code"], b"200")
8278 room_sync = channel.json_body["rooms"]["join"][room_id]
8379 return room_sync["timeline"]["events"]
324324 def get_event(self, room_id, event_id, expected_code=200):
325325 url = "/_matrix/client/r0/rooms/%s/event/%s" % (room_id, event_id)
326326
327 request, channel = self.make_request("GET", url, access_token=self.token)
327 channel = self.make_request("GET", url, access_token=self.token)
328328
329329 self.assertEqual(channel.code, expected_code, channel.result)
330330
8888 )
8989
9090 # Inviting the user completes successfully.
91 request, channel = self.make_request(
91 channel = self.make_request(
9292 "POST",
9393 "/rooms/%s/invite" % (room_id,),
9494 {"id_server": "test", "medium": "email", "address": "test@test.test"},
102102 def test_create_room(self):
103103 """Invitations during a room creation should be discarded, but the room still gets created."""
104104 # The room creation is successful.
105 request, channel = self.make_request(
105 channel = self.make_request(
106106 "POST",
107107 "/_matrix/client/r0/createRoom",
108108 {"visibility": "public", "invite": [self.other_user_id]},
157157 self.banned_user_id, tok=self.banned_access_token
158158 )
159159
160 request, channel = self.make_request(
160 channel = self.make_request(
161161 "POST",
162162 "/_matrix/client/r0/rooms/%s/upgrade" % (room_id,),
163163 {"new_version": "6"},
182182 self.banned_user_id, tok=self.banned_access_token
183183 )
184184
185 request, channel = self.make_request(
185 channel = self.make_request(
186186 "PUT",
187187 "/rooms/%s/typing/%s" % (room_id, self.banned_user_id),
188188 {"typing": True, "timeout": 30000},
197197 # The other user can join and send typing events.
198198 self.helper.join(room_id, self.other_user_id, tok=self.other_access_token)
199199
200 request, channel = self.make_request(
200 channel = self.make_request(
201201 "PUT",
202202 "/rooms/%s/typing/%s" % (room_id, self.other_user_id),
203203 {"typing": True, "timeout": 30000},
243243 )
244244
245245 # The update should succeed.
246 request, channel = self.make_request(
246 channel = self.make_request(
247247 "PUT",
248248 "/_matrix/client/r0/profile/%s/displayname" % (self.banned_user_id,),
249249 {"displayname": new_display_name},
253253 self.assertEqual(channel.json_body, {})
254254
255255 # The user's display name should be updated.
256 request, channel = self.make_request(
256 channel = self.make_request(
257257 "GET", "/profile/%s/displayname" % (self.banned_user_id,)
258258 )
259259 self.assertEqual(channel.code, 200, channel.result)
281281 )
282282
283283 # The update should succeed.
284 request, channel = self.make_request(
284 channel = self.make_request(
285285 "PUT",
286286 "/_matrix/client/r0/rooms/%s/state/m.room.member/%s"
287287 % (room_id, self.banned_user_id),
8585 callback = Mock(spec=[], side_effect=check)
8686 current_rules_module().check_event_allowed = callback
8787
88 request, channel = self.make_request(
88 channel = self.make_request(
8989 "PUT",
9090 "/_matrix/client/r0/rooms/%s/send/foo.bar.allowed/1" % self.room_id,
9191 {},
103103 self.assertEqual(ev.type, k[0])
104104 self.assertEqual(ev.state_key, k[1])
105105
106 request, channel = self.make_request(
106 channel = self.make_request(
107107 "PUT",
108108 "/_matrix/client/r0/rooms/%s/send/foo.bar.forbidden/2" % self.room_id,
109109 {},
122122 current_rules_module().check_event_allowed = check
123123
124124 # now send the event
125 request, channel = self.make_request(
125 channel = self.make_request(
126126 "PUT",
127127 "/_matrix/client/r0/rooms/%s/send/modifyme/1" % self.room_id,
128128 {"x": "x"},
141141 current_rules_module().check_event_allowed = check
142142
143143 # now send the event
144 request, channel = self.make_request(
144 channel = self.make_request(
145145 "PUT",
146146 "/_matrix/client/r0/rooms/%s/send/modifyme/1" % self.room_id,
147147 {"x": "x"},
151151 event_id = channel.json_body["event_id"]
152152
153153 # ... and check that it got modified
154 request, channel = self.make_request(
154 channel = self.make_request(
155155 "GET",
156156 "/_matrix/client/r0/rooms/%s/event/%s" % (self.room_id, event_id),
157157 access_token=self.tok,
9090 # that we can make sure that the check is done on the whole alias.
9191 data = {"room_alias_name": random_string(256 - len(self.hs.hostname))}
9292 request_data = json.dumps(data)
93 request, channel = self.make_request(
93 channel = self.make_request(
9494 "POST", url, request_data, access_token=self.user_tok
9595 )
9696 self.assertEqual(channel.code, 400, channel.result)
103103 # as cautious as possible here.
104104 data = {"room_alias_name": random_string(5)}
105105 request_data = json.dumps(data)
106 request, channel = self.make_request(
106 channel = self.make_request(
107107 "POST", url, request_data, access_token=self.user_tok
108108 )
109109 self.assertEqual(channel.code, 200, channel.result)
117117 data = {"aliases": [self.random_alias(alias_length)]}
118118 request_data = json.dumps(data)
119119
120 request, channel = self.make_request(
120 channel = self.make_request(
121121 "PUT", url, request_data, access_token=self.user_tok
122122 )
123123 self.assertEqual(channel.code, expected_code, channel.result)
127127 data = {"room_id": self.room_id}
128128 request_data = json.dumps(data)
129129
130 request, channel = self.make_request(
130 channel = self.make_request(
131131 "PUT", url, request_data, access_token=self.user_tok
132132 )
133133 self.assertEqual(channel.code, expected_code, channel.result)
6262 # implementation is now part of the r0 implementation, the newer
6363 # behaviour is used instead to be consistent with the r0 spec.
6464 # see issue #2602
65 request, channel = self.make_request(
65 channel = self.make_request(
6666 "GET", "/events?access_token=%s" % ("invalid" + self.token,)
6767 )
6868 self.assertEquals(channel.code, 401, msg=channel.result)
6969
7070 # valid token, expect content
71 request, channel = self.make_request(
71 channel = self.make_request(
7272 "GET", "/events?access_token=%s&timeout=0" % (self.token,)
7373 )
7474 self.assertEquals(channel.code, 200, msg=channel.result)
8686 )
8787
8888 # valid token, expect content
89 request, channel = self.make_request(
89 channel = self.make_request(
9090 "GET", "/events?access_token=%s&timeout=0" % (self.token,)
9191 )
9292 self.assertEquals(channel.code, 200, msg=channel.result)
148148 resp = self.helper.send(self.room_id, tok=self.token)
149149 event_id = resp["event_id"]
150150
151 request, channel = self.make_request(
151 channel = self.make_request(
152152 "GET", "/events/" + event_id, access_token=self.token,
153153 )
154154 self.assertEquals(channel.code, 200, msg=channel.result)
6262 "identifier": {"type": "m.id.user", "user": "kermit" + str(i)},
6363 "password": "monkey",
6464 }
65 request, channel = self.make_request(b"POST", LOGIN_URL, params)
65 channel = self.make_request(b"POST", LOGIN_URL, params)
6666
6767 if i == 5:
6868 self.assertEquals(channel.result["code"], b"429", channel.result)
8181 "identifier": {"type": "m.id.user", "user": "kermit" + str(i)},
8282 "password": "monkey",
8383 }
84 request, channel = self.make_request(b"POST", LOGIN_URL, params)
84 channel = self.make_request(b"POST", LOGIN_URL, params)
8585
8686 self.assertEquals(channel.result["code"], b"200", channel.result)
8787
107107 "identifier": {"type": "m.id.user", "user": "kermit"},
108108 "password": "monkey",
109109 }
110 request, channel = self.make_request(b"POST", LOGIN_URL, params)
110 channel = self.make_request(b"POST", LOGIN_URL, params)
111111
112112 if i == 5:
113113 self.assertEquals(channel.result["code"], b"429", channel.result)
126126 "identifier": {"type": "m.id.user", "user": "kermit"},
127127 "password": "monkey",
128128 }
129 request, channel = self.make_request(b"POST", LOGIN_URL, params)
129 channel = self.make_request(b"POST", LOGIN_URL, params)
130130
131131 self.assertEquals(channel.result["code"], b"200", channel.result)
132132
152152 "identifier": {"type": "m.id.user", "user": "kermit"},
153153 "password": "notamonkey",
154154 }
155 request, channel = self.make_request(b"POST", LOGIN_URL, params)
155 channel = self.make_request(b"POST", LOGIN_URL, params)
156156
157157 if i == 5:
158158 self.assertEquals(channel.result["code"], b"429", channel.result)
171171 "identifier": {"type": "m.id.user", "user": "kermit"},
172172 "password": "notamonkey",
173173 }
174 request, channel = self.make_request(b"POST", LOGIN_URL, params)
174 channel = self.make_request(b"POST", LOGIN_URL, params)
175175
176176 self.assertEquals(channel.result["code"], b"403", channel.result)
177177
180180 self.register_user("kermit", "monkey")
181181
182182 # we shouldn't be able to make requests without an access token
183 request, channel = self.make_request(b"GET", TEST_URL)
183 channel = self.make_request(b"GET", TEST_URL)
184184 self.assertEquals(channel.result["code"], b"401", channel.result)
185185 self.assertEquals(channel.json_body["errcode"], "M_MISSING_TOKEN")
186186
190190 "identifier": {"type": "m.id.user", "user": "kermit"},
191191 "password": "monkey",
192192 }
193 request, channel = self.make_request(b"POST", LOGIN_URL, params)
193 channel = self.make_request(b"POST", LOGIN_URL, params)
194194
195195 self.assertEquals(channel.code, 200, channel.result)
196196 access_token = channel.json_body["access_token"]
197197 device_id = channel.json_body["device_id"]
198198
199199 # we should now be able to make requests with the access token
200 request, channel = self.make_request(
201 b"GET", TEST_URL, access_token=access_token
202 )
200 channel = self.make_request(b"GET", TEST_URL, access_token=access_token)
203201 self.assertEquals(channel.code, 200, channel.result)
204202
205203 # time passes
206204 self.reactor.advance(24 * 3600)
207205
208206 # ... and we should be soft-logouted
209 request, channel = self.make_request(
210 b"GET", TEST_URL, access_token=access_token
211 )
207 channel = self.make_request(b"GET", TEST_URL, access_token=access_token)
212208 self.assertEquals(channel.code, 401, channel.result)
213209 self.assertEquals(channel.json_body["errcode"], "M_UNKNOWN_TOKEN")
214210 self.assertEquals(channel.json_body["soft_logout"], True)
222218
223219 # more requests with the expired token should still return a soft-logout
224220 self.reactor.advance(3600)
225 request, channel = self.make_request(
226 b"GET", TEST_URL, access_token=access_token
227 )
221 channel = self.make_request(b"GET", TEST_URL, access_token=access_token)
228222 self.assertEquals(channel.code, 401, channel.result)
229223 self.assertEquals(channel.json_body["errcode"], "M_UNKNOWN_TOKEN")
230224 self.assertEquals(channel.json_body["soft_logout"], True)
232226 # ... but if we delete that device, it will be a proper logout
233227 self._delete_device(access_token_2, "kermit", "monkey", device_id)
234228
235 request, channel = self.make_request(
236 b"GET", TEST_URL, access_token=access_token
237 )
229 channel = self.make_request(b"GET", TEST_URL, access_token=access_token)
238230 self.assertEquals(channel.code, 401, channel.result)
239231 self.assertEquals(channel.json_body["errcode"], "M_UNKNOWN_TOKEN")
240232 self.assertEquals(channel.json_body["soft_logout"], False)
241233
242234 def _delete_device(self, access_token, user_id, password, device_id):
243235 """Perform the UI-Auth to delete a device"""
244 request, channel = self.make_request(
236 channel = self.make_request(
245237 b"DELETE", "devices/" + device_id, access_token=access_token
246238 )
247239 self.assertEquals(channel.code, 401, channel.result)
261253 "session": channel.json_body["session"],
262254 }
263255
264 request, channel = self.make_request(
256 channel = self.make_request(
265257 b"DELETE",
266258 "devices/" + device_id,
267259 access_token=access_token,
277269 access_token = self.login("kermit", "monkey")
278270
279271 # we should now be able to make requests with the access token
280 request, channel = self.make_request(
281 b"GET", TEST_URL, access_token=access_token
282 )
272 channel = self.make_request(b"GET", TEST_URL, access_token=access_token)
283273 self.assertEquals(channel.code, 200, channel.result)
284274
285275 # time passes
286276 self.reactor.advance(24 * 3600)
287277
288278 # ... and we should be soft-logouted
289 request, channel = self.make_request(
290 b"GET", TEST_URL, access_token=access_token
291 )
279 channel = self.make_request(b"GET", TEST_URL, access_token=access_token)
292280 self.assertEquals(channel.code, 401, channel.result)
293281 self.assertEquals(channel.json_body["errcode"], "M_UNKNOWN_TOKEN")
294282 self.assertEquals(channel.json_body["soft_logout"], True)
295283
296284 # Now try to hard logout this session
297 request, channel = self.make_request(
298 b"POST", "/logout", access_token=access_token
299 )
285 channel = self.make_request(b"POST", "/logout", access_token=access_token)
300286 self.assertEquals(channel.result["code"], b"200", channel.result)
301287
302288 @override_config({"session_lifetime": "24h"})
307293 access_token = self.login("kermit", "monkey")
308294
309295 # we should now be able to make requests with the access token
310 request, channel = self.make_request(
311 b"GET", TEST_URL, access_token=access_token
312 )
296 channel = self.make_request(b"GET", TEST_URL, access_token=access_token)
313297 self.assertEquals(channel.code, 200, channel.result)
314298
315299 # time passes
316300 self.reactor.advance(24 * 3600)
317301
318302 # ... and we should be soft-logouted
319 request, channel = self.make_request(
320 b"GET", TEST_URL, access_token=access_token
321 )
303 channel = self.make_request(b"GET", TEST_URL, access_token=access_token)
322304 self.assertEquals(channel.code, 401, channel.result)
323305 self.assertEquals(channel.json_body["errcode"], "M_UNKNOWN_TOKEN")
324306 self.assertEquals(channel.json_body["soft_logout"], True)
325307
326308 # Now try to hard log out all of the user's sessions
327 request, channel = self.make_request(
328 b"POST", "/logout/all", access_token=access_token
329 )
309 channel = self.make_request(b"POST", "/logout/all", access_token=access_token)
330310 self.assertEquals(channel.result["code"], b"200", channel.result)
331311
332312
401381 cas_ticket_url = urllib.parse.urlunparse(url_parts)
402382
403383 # Get Synapse to call the fake CAS and serve the template.
404 request, channel = self.make_request("GET", cas_ticket_url)
384 channel = self.make_request("GET", cas_ticket_url)
405385
406386 # Test that the response is HTML.
407387 self.assertEqual(channel.code, 200)
445425 )
446426
447427 # Get Synapse to call the fake CAS and serve the template.
448 request, channel = self.make_request("GET", cas_ticket_url)
428 channel = self.make_request("GET", cas_ticket_url)
449429
450430 self.assertEqual(channel.code, 302)
451431 location_headers = channel.headers.getRawHeaders("Location")
471451 )
472452
473453 # Get Synapse to call the fake CAS and serve the template.
474 request, channel = self.make_request("GET", cas_ticket_url)
454 channel = self.make_request("GET", cas_ticket_url)
475455
476456 # Because the user is deactivated they are served an error template.
477457 self.assertEqual(channel.code, 403)
494474 self.hs.config.jwt_algorithm = self.jwt_algorithm
495475 return self.hs
496476
497 def jwt_encode(self, token, secret=jwt_secret):
498 return jwt.encode(token, secret, self.jwt_algorithm).decode("ascii")
477 def jwt_encode(self, token: str, secret: str = jwt_secret) -> str:
478 # PyJWT 2.0.0 changed the return type of jwt.encode from bytes to str.
479 result = jwt.encode(token, secret, self.jwt_algorithm)
480 if isinstance(result, bytes):
481 return result.decode("ascii")
482 return result
499483
500484 def jwt_login(self, *args):
501485 params = json.dumps(
502486 {"type": "org.matrix.login.jwt", "token": self.jwt_encode(*args)}
503487 )
504 request, channel = self.make_request(b"POST", LOGIN_URL, params)
488 channel = self.make_request(b"POST", LOGIN_URL, params)
505489 return channel
506490
507491 def test_login_jwt_valid_registered(self):
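The `jwt_encode` change above guards against PyJWT 2.0.0, which changed `jwt.encode` to return `str` where 1.x returned `bytes`. The normalization step can be sketched in isolation, with no real PyJWT dependency (`normalize_jwt_result` is an illustrative name, not a Synapse helper):

```python
from typing import Union


def normalize_jwt_result(result: Union[str, bytes]) -> str:
    # PyJWT 1.x returns bytes from jwt.encode; 2.x returns str.
    # Decoding only when needed keeps the caller working on both versions.
    if isinstance(result, bytes):
        return result.decode("ascii")
    return result


assert normalize_jwt_result(b"header.payload.sig") == "header.payload.sig"
assert normalize_jwt_result("header.payload.sig") == "header.payload.sig"
```

The `isinstance` check is cheap and version-agnostic, which is why the diff prefers it over pinning a PyJWT version in the test requirements.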
633617
634618 def test_login_no_token(self):
635619 params = json.dumps({"type": "org.matrix.login.jwt"})
636 request, channel = self.make_request(b"POST", LOGIN_URL, params)
620 channel = self.make_request(b"POST", LOGIN_URL, params)
637621 self.assertEqual(channel.result["code"], b"403", channel.result)
638622 self.assertEqual(channel.json_body["errcode"], "M_FORBIDDEN")
639623 self.assertEqual(channel.json_body["error"], "Token field for JWT is missing")
699683 self.hs.config.jwt_algorithm = "RS256"
700684 return self.hs
701685
702 def jwt_encode(self, token, secret=jwt_privatekey):
703 return jwt.encode(token, secret, "RS256").decode("ascii")
686 def jwt_encode(self, token: str, secret: str = jwt_privatekey) -> str:
687 # PyJWT 2.0.0 changed the return type of jwt.encode from bytes to str.
688 result = jwt.encode(token, secret, "RS256")
689 if isinstance(result, bytes):
690 return result.decode("ascii")
691 return result
704692
705693 def jwt_login(self, *args):
706694 params = json.dumps(
707695 {"type": "org.matrix.login.jwt", "token": self.jwt_encode(*args)}
708696 )
709 request, channel = self.make_request(b"POST", LOGIN_URL, params)
697 channel = self.make_request(b"POST", LOGIN_URL, params)
710698 return channel
711699
712700 def test_login_jwt_valid(self):
734722 ]
735723
736724 def register_as_user(self, username):
737 request, channel = self.make_request(
725 self.make_request(
738726 b"POST",
739727 "/_matrix/client/r0/register?access_token=%s" % (self.service.token,),
740728 {"username": username},
783771 "type": login.LoginRestServlet.APPSERVICE_TYPE,
784772 "identifier": {"type": "m.id.user", "user": AS_USER},
785773 }
786 request, channel = self.make_request(
774 channel = self.make_request(
787775 b"POST", LOGIN_URL, params, access_token=self.service.token
788776 )
789777
798786 "type": login.LoginRestServlet.APPSERVICE_TYPE,
799787 "identifier": {"type": "m.id.user", "user": self.service.sender},
800788 }
801 request, channel = self.make_request(
789 channel = self.make_request(
802790 b"POST", LOGIN_URL, params, access_token=self.service.token
803791 )
804792
813801 "type": login.LoginRestServlet.APPSERVICE_TYPE,
814802 "identifier": {"type": "m.id.user", "user": "fibble_wibble"},
815803 }
816 request, channel = self.make_request(
804 channel = self.make_request(
817805 b"POST", LOGIN_URL, params, access_token=self.service.token
818806 )
819807
828816 "type": login.LoginRestServlet.APPSERVICE_TYPE,
829817 "identifier": {"type": "m.id.user", "user": AS_USER},
830818 }
831 request, channel = self.make_request(
819 channel = self.make_request(
832820 b"POST", LOGIN_URL, params, access_token=self.another_service.token
833821 )
834822
844832 "type": login.LoginRestServlet.APPSERVICE_TYPE,
845833 "identifier": {"type": "m.id.user", "user": AS_USER},
846834 }
847 request, channel = self.make_request(b"POST", LOGIN_URL, params)
835 channel = self.make_request(b"POST", LOGIN_URL, params)
848836
849837 self.assertEquals(channel.result["code"], b"401", channel.result)
3737
3838 hs = self.setup_test_homeserver(
3939 "red",
40 http_client=None,
40 federation_http_client=None,
4141 federation_client=Mock(),
4242 presence_handler=presence_handler,
4343 )
5252 self.hs.config.use_presence = True
5353
5454 body = {"presence": "here", "status_msg": "beep boop"}
55 request, channel = self.make_request(
55 channel = self.make_request(
5656 "PUT", "/presence/%s/status" % (self.user_id,), body
5757 )
5858
6767 self.hs.config.use_presence = False
6868
6969 body = {"presence": "here", "status_msg": "beep boop"}
70 request, channel = self.make_request(
70 channel = self.make_request(
7171 "PUT", "/presence/%s/status" % (self.user_id,), body
7272 )
7373
6262 hs = yield setup_test_homeserver(
6363 self.addCleanup,
6464 "test",
65 http_client=None,
65 federation_http_client=None,
6666 resource_for_client=self.mock_resource,
6767 federation=Mock(),
6868 federation_client=Mock(),
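These hunks track a keyword rename in the homeserver test helper: the federation client argument is now `federation_http_client` rather than the ambiguous `http_client`. As a general technique (this is a simplified stand-in, not Synapse's real `setup_test_homeserver`, and Synapse itself renamed the kwarg outright), such a rename can be staged with a deprecation shim:

```python
import warnings


def setup_test_homeserver(name, federation_http_client=None, **kwargs):
    # Simplified stand-in: accept the old kwarg but steer callers to the new one.
    if "http_client" in kwargs:
        warnings.warn(
            "http_client is deprecated; use federation_http_client",
            DeprecationWarning,
        )
        federation_http_client = kwargs.pop("http_client")
    return {"name": name, "federation_http_client": federation_http_client}


hs = setup_test_homeserver("test", federation_http_client=None)
assert hs["federation_http_client"] is None
```

For an internal test helper a hard rename (as in the diff) is simpler; the shim style matters more when external callers depend on the old name.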
188188 self.owner_tok = self.login("owner", "pass")
189189
190190 def test_set_displayname(self):
191 request, channel = self.make_request(
191 channel = self.make_request(
192192 "PUT",
193193 "/profile/%s/displayname" % (self.owner,),
194194 content=json.dumps({"displayname": "test"}),
201201
202202 def test_set_displayname_too_long(self):
203203 """Attempts to set a stupid displayname should get a 400"""
204 request, channel = self.make_request(
204 channel = self.make_request(
205205 "PUT",
206206 "/profile/%s/displayname" % (self.owner,),
207207 content=json.dumps({"displayname": "test" * 100}),
213213 self.assertEqual(res, "owner")
214214
215215 def get_displayname(self):
216 request, channel = self.make_request(
217 "GET", "/profile/%s/displayname" % (self.owner,)
218 )
216 channel = self.make_request("GET", "/profile/%s/displayname" % (self.owner,))
219217 self.assertEqual(channel.code, 200, channel.result)
220218 return channel.json_body["displayname"]
221219
277275 )
278276
279277 def request_profile(self, expected_code, url_suffix="", access_token=None):
280 request, channel = self.make_request(
278 channel = self.make_request(
281279 "GET", self.profile_url + url_suffix, access_token=access_token
282280 )
283281 self.assertEqual(channel.code, expected_code, channel.result)
319317 """Tests that a user can lookup their own profile without having to be in a room
320318 if 'require_auth_for_profile_requests' is set to true in the server's config.
321319 """
322 request, channel = self.make_request(
320 channel = self.make_request(
323321 "GET", "/profile/" + self.requester, access_token=self.requester_tok
324322 )
325323 self.assertEqual(channel.code, 200, channel.result)
326324
327 request, channel = self.make_request(
325 channel = self.make_request(
328326 "GET",
329327 "/profile/" + self.requester + "/displayname",
330328 access_token=self.requester_tok,
331329 )
332330 self.assertEqual(channel.code, 200, channel.result)
333331
334 request, channel = self.make_request(
332 channel = self.make_request(
335333 "GET",
336334 "/profile/" + self.requester + "/avatar_url",
337335 access_token=self.requester_tok,
4444 }
4545
4646 # PUT a new rule
47 request, channel = self.make_request(
47 channel = self.make_request(
4848 "PUT", "/pushrules/global/override/best.friend", body, access_token=token
4949 )
5050 self.assertEqual(channel.code, 200)
5151
5252 # GET enabled for that new rule
53 request, channel = self.make_request(
53 channel = self.make_request(
5454 "GET", "/pushrules/global/override/best.friend/enabled", access_token=token
5555 )
5656 self.assertEqual(channel.code, 200)
7373 }
7474
7575 # PUT a new rule
76 request, channel = self.make_request(
76 channel = self.make_request(
7777 "PUT", "/pushrules/global/override/best.friend", body, access_token=token
7878 )
7979 self.assertEqual(channel.code, 200)
8080
8181 # disable the rule
82 request, channel = self.make_request(
82 channel = self.make_request(
8383 "PUT",
8484 "/pushrules/global/override/best.friend/enabled",
8585 {"enabled": False},
8888 self.assertEqual(channel.code, 200)
8989
9090 # check rule disabled
91 request, channel = self.make_request(
91 channel = self.make_request(
9292 "GET", "/pushrules/global/override/best.friend/enabled", access_token=token
9393 )
9494 self.assertEqual(channel.code, 200)
9595 self.assertEqual(channel.json_body["enabled"], False)
9696
9797 # DELETE the rule
98 request, channel = self.make_request(
98 channel = self.make_request(
9999 "DELETE", "/pushrules/global/override/best.friend", access_token=token
100100 )
101101 self.assertEqual(channel.code, 200)
102102
103103 # PUT a new rule
104 request, channel = self.make_request(
104 channel = self.make_request(
105105 "PUT", "/pushrules/global/override/best.friend", body, access_token=token
106106 )
107107 self.assertEqual(channel.code, 200)
108108
109109 # GET enabled for that new rule
110 request, channel = self.make_request(
110 channel = self.make_request(
111111 "GET", "/pushrules/global/override/best.friend/enabled", access_token=token
112112 )
113113 self.assertEqual(channel.code, 200)
129129 }
130130
131131 # PUT a new rule
132 request, channel = self.make_request(
132 channel = self.make_request(
133133 "PUT", "/pushrules/global/override/best.friend", body, access_token=token
134134 )
135135 self.assertEqual(channel.code, 200)
136136
137137 # disable the rule
138 request, channel = self.make_request(
138 channel = self.make_request(
139139 "PUT",
140140 "/pushrules/global/override/best.friend/enabled",
141141 {"enabled": False},
144144 self.assertEqual(channel.code, 200)
145145
146146 # check rule disabled
147 request, channel = self.make_request(
147 channel = self.make_request(
148148 "GET", "/pushrules/global/override/best.friend/enabled", access_token=token
149149 )
150150 self.assertEqual(channel.code, 200)
151151 self.assertEqual(channel.json_body["enabled"], False)
152152
153153 # re-enable the rule
154 request, channel = self.make_request(
154 channel = self.make_request(
155155 "PUT",
156156 "/pushrules/global/override/best.friend/enabled",
157157 {"enabled": True},
160160 self.assertEqual(channel.code, 200)
161161
162162 # check rule enabled
163 request, channel = self.make_request(
163 channel = self.make_request(
164164 "GET", "/pushrules/global/override/best.friend/enabled", access_token=token
165165 )
166166 self.assertEqual(channel.code, 200)
181181 }
182182
183183 # check 404 for never-heard-of rule
184 request, channel = self.make_request(
185 "GET", "/pushrules/global/override/best.friend/enabled", access_token=token
186 )
187 self.assertEqual(channel.code, 404)
188 self.assertEqual(channel.json_body["errcode"], Codes.NOT_FOUND)
189
190 # PUT a new rule
191 request, channel = self.make_request(
184 channel = self.make_request(
185 "GET", "/pushrules/global/override/best.friend/enabled", access_token=token
186 )
187 self.assertEqual(channel.code, 404)
188 self.assertEqual(channel.json_body["errcode"], Codes.NOT_FOUND)
189
190 # PUT a new rule
191 channel = self.make_request(
192192 "PUT", "/pushrules/global/override/best.friend", body, access_token=token
193193 )
194194 self.assertEqual(channel.code, 200)
195195
196196 # GET enabled for that new rule
197 request, channel = self.make_request(
197 channel = self.make_request(
198198 "GET", "/pushrules/global/override/best.friend/enabled", access_token=token
199199 )
200200 self.assertEqual(channel.code, 200)
201201
202202 # DELETE the rule
203 request, channel = self.make_request(
203 channel = self.make_request(
204204 "DELETE", "/pushrules/global/override/best.friend", access_token=token
205205 )
206206 self.assertEqual(channel.code, 200)
207207
208208 # check 404 for deleted rule
209 request, channel = self.make_request(
209 channel = self.make_request(
210210 "GET", "/pushrules/global/override/best.friend/enabled", access_token=token
211211 )
212212 self.assertEqual(channel.code, 404)
220220 token = self.login("user", "pass")
221221
222222 # check 404 for never-heard-of rule
223 request, channel = self.make_request(
223 channel = self.make_request(
224224 "GET", "/pushrules/global/override/.m.muahahaha/enabled", access_token=token
225225 )
226226 self.assertEqual(channel.code, 404)
234234 token = self.login("user", "pass")
235235
236236 # enable & check 404 for never-heard-of rule
237 request, channel = self.make_request(
237 channel = self.make_request(
238238 "PUT",
239239 "/pushrules/global/override/best.friend/enabled",
240240 {"enabled": True},
251251 token = self.login("user", "pass")
252252
253253 # enable & check 404 for never-heard-of rule
254 request, channel = self.make_request(
254 channel = self.make_request(
255255 "PUT",
256256 "/pushrules/global/override/.m.muahahah/enabled",
257257 {"enabled": True},
275275 }
276276
277277 # PUT a new rule
278 request, channel = self.make_request(
278 channel = self.make_request(
279279 "PUT", "/pushrules/global/override/best.friend", body, access_token=token
280280 )
281281 self.assertEqual(channel.code, 200)
282282
283283 # GET actions for that new rule
284 request, channel = self.make_request(
284 channel = self.make_request(
285285 "GET", "/pushrules/global/override/best.friend/actions", access_token=token
286286 )
287287 self.assertEqual(channel.code, 200)
304304 }
305305
306306 # PUT a new rule
307 request, channel = self.make_request(
307 channel = self.make_request(
308308 "PUT", "/pushrules/global/override/best.friend", body, access_token=token
309309 )
310310 self.assertEqual(channel.code, 200)
311311
312312 # change the rule actions
313 request, channel = self.make_request(
313 channel = self.make_request(
314314 "PUT",
315315 "/pushrules/global/override/best.friend/actions",
316316 {"actions": ["dont_notify"]},
319319 self.assertEqual(channel.code, 200)
320320
321321 # GET actions for that new rule
322 request, channel = self.make_request(
322 channel = self.make_request(
323323 "GET", "/pushrules/global/override/best.friend/actions", access_token=token
324324 )
325325 self.assertEqual(channel.code, 200)
340340 }
341341
342342 # check 404 for never-heard-of rule
343 request, channel = self.make_request(
344 "GET", "/pushrules/global/override/best.friend/enabled", access_token=token
345 )
346 self.assertEqual(channel.code, 404)
347 self.assertEqual(channel.json_body["errcode"], Codes.NOT_FOUND)
348
349 # PUT a new rule
350 request, channel = self.make_request(
343 channel = self.make_request(
344 "GET", "/pushrules/global/override/best.friend/enabled", access_token=token
345 )
346 self.assertEqual(channel.code, 404)
347 self.assertEqual(channel.json_body["errcode"], Codes.NOT_FOUND)
348
349 # PUT a new rule
350 channel = self.make_request(
351351 "PUT", "/pushrules/global/override/best.friend", body, access_token=token
352352 )
353353 self.assertEqual(channel.code, 200)
354354
355355 # DELETE the rule
356 request, channel = self.make_request(
356 channel = self.make_request(
357357 "DELETE", "/pushrules/global/override/best.friend", access_token=token
358358 )
359359 self.assertEqual(channel.code, 200)
360360
361361 # check 404 for deleted rule
362 request, channel = self.make_request(
362 channel = self.make_request(
363363 "GET", "/pushrules/global/override/best.friend/enabled", access_token=token
364364 )
365365 self.assertEqual(channel.code, 404)
373373 token = self.login("user", "pass")
374374
375375 # check 404 for never-heard-of rule
376 request, channel = self.make_request(
376 channel = self.make_request(
377377 "GET", "/pushrules/global/override/.m.muahahaha/actions", access_token=token
378378 )
379379 self.assertEqual(channel.code, 404)
387387 token = self.login("user", "pass")
388388
389389 # enable & check 404 for never-heard-of rule
390 request, channel = self.make_request(
390 channel = self.make_request(
391391 "PUT",
392392 "/pushrules/global/override/best.friend/actions",
393393 {"actions": ["dont_notify"]},
404404 token = self.login("user", "pass")
405405
406406 # enable & check 404 for never-heard-of rule
407 request, channel = self.make_request(
407 channel = self.make_request(
408408 "PUT",
409409 "/pushrules/global/override/.m.muahahah/actions",
410410 {"actions": ["dont_notify"]},
2525 import synapse.rest.admin
2626 from synapse.api.constants import EventContentFields, EventTypes, Membership
2727 from synapse.handlers.pagination import PurgeStatus
28 from synapse.rest import admin
2829 from synapse.rest.client.v1 import directory, login, profile, room
2930 from synapse.rest.client.v2_alpha import account
3031 from synapse.types import JsonDict, RoomAlias, UserID
4445 def make_homeserver(self, reactor, clock):
4546
4647 self.hs = self.setup_test_homeserver(
47 "red", http_client=None, federation_client=Mock(),
48 "red", federation_http_client=None, federation_client=Mock(),
4849 )
4950
5051 self.hs.get_federation_handler = Mock()
8283 self.created_rmid_msg_path = (
8384 "rooms/%s/send/m.room.message/a1" % (self.created_rmid)
8485 ).encode("ascii")
85 request, channel = self.make_request(
86 channel = self.make_request(
8687 "PUT", self.created_rmid_msg_path, b'{"msgtype":"m.text","body":"test msg"}'
8788 )
8889 self.assertEquals(200, channel.code, channel.result)
8990
9091 # set topic for public room
91 request, channel = self.make_request(
92 channel = self.make_request(
9293 "PUT",
9394 ("rooms/%s/state/m.room.topic" % self.created_public_rmid).encode("ascii"),
9495 b'{"topic":"Public Room Topic"}',
110111 )
111112
112113 # send message in uncreated room, expect 403
113 request, channel = self.make_request(
114 channel = self.make_request(
114115 "PUT",
115116 "/rooms/%s/send/m.room.message/mid2" % (self.uncreated_rmid,),
116117 msg_content,
118119 self.assertEquals(403, channel.code, msg=channel.result["body"])
119120
120121 # send message in created room not joined (no state), expect 403
121 request, channel = self.make_request("PUT", send_msg_path(), msg_content)
122 channel = self.make_request("PUT", send_msg_path(), msg_content)
122123 self.assertEquals(403, channel.code, msg=channel.result["body"])
123124
124125 # send message in created room and invited, expect 403
125126 self.helper.invite(
126127 room=self.created_rmid, src=self.rmcreator_id, targ=self.user_id
127128 )
128 request, channel = self.make_request("PUT", send_msg_path(), msg_content)
129 channel = self.make_request("PUT", send_msg_path(), msg_content)
129130 self.assertEquals(403, channel.code, msg=channel.result["body"])
130131
131132 # send message in created room and joined, expect 200
132133 self.helper.join(room=self.created_rmid, user=self.user_id)
133 request, channel = self.make_request("PUT", send_msg_path(), msg_content)
134 channel = self.make_request("PUT", send_msg_path(), msg_content)
134135 self.assertEquals(200, channel.code, msg=channel.result["body"])
135136
136137 # send message in created room and left, expect 403
137138 self.helper.leave(room=self.created_rmid, user=self.user_id)
138 request, channel = self.make_request("PUT", send_msg_path(), msg_content)
139 channel = self.make_request("PUT", send_msg_path(), msg_content)
139140 self.assertEquals(403, channel.code, msg=channel.result["body"])
140141
141142 def test_topic_perms(self):
143144 topic_path = "/rooms/%s/state/m.room.topic" % self.created_rmid
144145
145146 # set/get topic in uncreated room, expect 403
146 request, channel = self.make_request(
147 channel = self.make_request(
147148 "PUT", "/rooms/%s/state/m.room.topic" % self.uncreated_rmid, topic_content
148149 )
149150 self.assertEquals(403, channel.code, msg=channel.result["body"])
150 request, channel = self.make_request(
151 channel = self.make_request(
151152 "GET", "/rooms/%s/state/m.room.topic" % self.uncreated_rmid
152153 )
153154 self.assertEquals(403, channel.code, msg=channel.result["body"])
154155
155156 # set/get topic in created PRIVATE room not joined, expect 403
156 request, channel = self.make_request("PUT", topic_path, topic_content)
157 channel = self.make_request("PUT", topic_path, topic_content)
157158 self.assertEquals(403, channel.code, msg=channel.result["body"])
158 request, channel = self.make_request("GET", topic_path)
159 channel = self.make_request("GET", topic_path)
159160 self.assertEquals(403, channel.code, msg=channel.result["body"])
160161
161162 # set topic in created PRIVATE room and invited, expect 403
162163 self.helper.invite(
163164 room=self.created_rmid, src=self.rmcreator_id, targ=self.user_id
164165 )
165 request, channel = self.make_request("PUT", topic_path, topic_content)
166 channel = self.make_request("PUT", topic_path, topic_content)
166167 self.assertEquals(403, channel.code, msg=channel.result["body"])
167168
168169 # get topic in created PRIVATE room and invited, expect 403
169 request, channel = self.make_request("GET", topic_path)
170 channel = self.make_request("GET", topic_path)
170171 self.assertEquals(403, channel.code, msg=channel.result["body"])
171172
172173 # set/get topic in created PRIVATE room and joined, expect 200
174175
175176 # Only room ops can set topic by default
176177 self.helper.auth_user_id = self.rmcreator_id
177 request, channel = self.make_request("PUT", topic_path, topic_content)
178 channel = self.make_request("PUT", topic_path, topic_content)
178179 self.assertEquals(200, channel.code, msg=channel.result["body"])
179180 self.helper.auth_user_id = self.user_id
180181
181 request, channel = self.make_request("GET", topic_path)
182 channel = self.make_request("GET", topic_path)
182183 self.assertEquals(200, channel.code, msg=channel.result["body"])
183184 self.assert_dict(json.loads(topic_content.decode("utf8")), channel.json_body)
184185
185186 # set/get topic in created PRIVATE room and left, expect 403
186187 self.helper.leave(room=self.created_rmid, user=self.user_id)
187 request, channel = self.make_request("PUT", topic_path, topic_content)
188 channel = self.make_request("PUT", topic_path, topic_content)
188189 self.assertEquals(403, channel.code, msg=channel.result["body"])
189 request, channel = self.make_request("GET", topic_path)
190 channel = self.make_request("GET", topic_path)
190191 self.assertEquals(200, channel.code, msg=channel.result["body"])
191192
192193 # get topic in PUBLIC room, not joined, expect 403
193 request, channel = self.make_request(
194 channel = self.make_request(
194195 "GET", "/rooms/%s/state/m.room.topic" % self.created_public_rmid
195196 )
196197 self.assertEquals(403, channel.code, msg=channel.result["body"])
197198
198199 # set topic in PUBLIC room, not joined, expect 403
199 request, channel = self.make_request(
200 channel = self.make_request(
200201 "PUT",
201202 "/rooms/%s/state/m.room.topic" % self.created_public_rmid,
202203 topic_content,
206207 def _test_get_membership(self, room=None, members=[], expect_code=None):
207208 for member in members:
208209 path = "/rooms/%s/state/m.room.member/%s" % (room, member)
209 request, channel = self.make_request("GET", path)
210 channel = self.make_request("GET", path)
210211 self.assertEquals(expect_code, channel.code)
211212
212213 def test_membership_basic_room_perms(self):
378379
379380 def test_get_member_list(self):
380381 room_id = self.helper.create_room_as(self.user_id)
381 request, channel = self.make_request("GET", "/rooms/%s/members" % room_id)
382 channel = self.make_request("GET", "/rooms/%s/members" % room_id)
382383 self.assertEquals(200, channel.code, msg=channel.result["body"])
383384
384385 def test_get_member_list_no_room(self):
385 request, channel = self.make_request("GET", "/rooms/roomdoesnotexist/members")
386 channel = self.make_request("GET", "/rooms/roomdoesnotexist/members")
386387 self.assertEquals(403, channel.code, msg=channel.result["body"])
387388
388389 def test_get_member_list_no_permission(self):
389390 room_id = self.helper.create_room_as("@some_other_guy:red")
390 request, channel = self.make_request("GET", "/rooms/%s/members" % room_id)
391 channel = self.make_request("GET", "/rooms/%s/members" % room_id)
391392 self.assertEquals(403, channel.code, msg=channel.result["body"])
392393
393394 def test_get_member_list_mixed_memberships(self):
396397 room_path = "/rooms/%s/members" % room_id
397398 self.helper.invite(room=room_id, src=room_creator, targ=self.user_id)
398399 # can't see list if you're just invited.
399 request, channel = self.make_request("GET", room_path)
400 channel = self.make_request("GET", room_path)
400401 self.assertEquals(403, channel.code, msg=channel.result["body"])
401402
402403 self.helper.join(room=room_id, user=self.user_id)
403404 # can see list now joined
404 request, channel = self.make_request("GET", room_path)
405 channel = self.make_request("GET", room_path)
405406 self.assertEquals(200, channel.code, msg=channel.result["body"])
406407
407408 self.helper.leave(room=room_id, user=self.user_id)
408409 # can see old list once left
409 request, channel = self.make_request("GET", room_path)
410 channel = self.make_request("GET", room_path)
410411 self.assertEquals(200, channel.code, msg=channel.result["body"])
411412
412413
417418
418419 def test_post_room_no_keys(self):
419420 # POST with no config keys, expect new room id
420 request, channel = self.make_request("POST", "/createRoom", "{}")
421 channel = self.make_request("POST", "/createRoom", "{}")
421422
422423 self.assertEquals(200, channel.code, channel.result)
423424 self.assertTrue("room_id" in channel.json_body)
424425
425426 def test_post_room_visibility_key(self):
426427 # POST with visibility config key, expect new room id
427 request, channel = self.make_request(
428 "POST", "/createRoom", b'{"visibility":"private"}'
429 )
428 channel = self.make_request("POST", "/createRoom", b'{"visibility":"private"}')
430429 self.assertEquals(200, channel.code)
431430 self.assertTrue("room_id" in channel.json_body)
432431
433432 def test_post_room_custom_key(self):
434433 # POST with custom config keys, expect new room id
435 request, channel = self.make_request(
436 "POST", "/createRoom", b'{"custom":"stuff"}'
437 )
434 channel = self.make_request("POST", "/createRoom", b'{"custom":"stuff"}')
438435 self.assertEquals(200, channel.code)
439436 self.assertTrue("room_id" in channel.json_body)
440437
441438 def test_post_room_known_and_unknown_keys(self):
442439 # POST with custom + known config keys, expect new room id
443 request, channel = self.make_request(
440 channel = self.make_request(
444441 "POST", "/createRoom", b'{"visibility":"private","custom":"things"}'
445442 )
446443 self.assertEquals(200, channel.code)
448445
449446 def test_post_room_invalid_content(self):
450447 # POST with invalid content / paths, expect 400
451 request, channel = self.make_request("POST", "/createRoom", b'{"visibili')
448 channel = self.make_request("POST", "/createRoom", b'{"visibili')
452449 self.assertEquals(400, channel.code)
453450
454 request, channel = self.make_request("POST", "/createRoom", b'["hello"]')
451 channel = self.make_request("POST", "/createRoom", b'["hello"]')
455452 self.assertEquals(400, channel.code)
456453
457454 def test_post_room_invitees_invalid_mxid(self):
458455 # POST with invalid invitee, see https://github.com/matrix-org/synapse/issues/4088
459456 # Note the trailing space in the MXID here!
460 request, channel = self.make_request(
457 channel = self.make_request(
461458 "POST", "/createRoom", b'{"invite":["@alice:example.com "]}'
462459 )
463460 self.assertEquals(400, channel.code)
475472
476473 def test_invalid_puts(self):
477474 # missing keys or invalid json
478 request, channel = self.make_request("PUT", self.path, "{}")
479 self.assertEquals(400, channel.code, msg=channel.result["body"])
480
481 request, channel = self.make_request("PUT", self.path, '{"_name":"bo"}')
482 self.assertEquals(400, channel.code, msg=channel.result["body"])
483
484 request, channel = self.make_request("PUT", self.path, '{"nao')
485 self.assertEquals(400, channel.code, msg=channel.result["body"])
486
487 request, channel = self.make_request(
475 channel = self.make_request("PUT", self.path, "{}")
476 self.assertEquals(400, channel.code, msg=channel.result["body"])
477
478 channel = self.make_request("PUT", self.path, '{"_name":"bo"}')
479 self.assertEquals(400, channel.code, msg=channel.result["body"])
480
481 channel = self.make_request("PUT", self.path, '{"nao')
482 self.assertEquals(400, channel.code, msg=channel.result["body"])
483
484 channel = self.make_request(
488485 "PUT", self.path, '[{"_name":"bo"},{"_name":"jill"}]'
489486 )
490487 self.assertEquals(400, channel.code, msg=channel.result["body"])
491488
492 request, channel = self.make_request("PUT", self.path, "text only")
493 self.assertEquals(400, channel.code, msg=channel.result["body"])
494
495 request, channel = self.make_request("PUT", self.path, "")
489 channel = self.make_request("PUT", self.path, "text only")
490 self.assertEquals(400, channel.code, msg=channel.result["body"])
491
492 channel = self.make_request("PUT", self.path, "")
496493 self.assertEquals(400, channel.code, msg=channel.result["body"])
497494
498495 # valid key, wrong type
499496 content = '{"topic":["Topic name"]}'
500 request, channel = self.make_request("PUT", self.path, content)
497 channel = self.make_request("PUT", self.path, content)
501498 self.assertEquals(400, channel.code, msg=channel.result["body"])
502499
503500 def test_rooms_topic(self):
504501 # nothing should be there
505 request, channel = self.make_request("GET", self.path)
502 channel = self.make_request("GET", self.path)
506503 self.assertEquals(404, channel.code, msg=channel.result["body"])
507504
508505 # valid put
509506 content = '{"topic":"Topic name"}'
510 request, channel = self.make_request("PUT", self.path, content)
507 channel = self.make_request("PUT", self.path, content)
511508 self.assertEquals(200, channel.code, msg=channel.result["body"])
512509
513510 # valid get
514 request, channel = self.make_request("GET", self.path)
511 channel = self.make_request("GET", self.path)
515512 self.assertEquals(200, channel.code, msg=channel.result["body"])
516513 self.assert_dict(json.loads(content), channel.json_body)
517514
518515 def test_rooms_topic_with_extra_keys(self):
519516 # valid put with extra keys
520517 content = '{"topic":"Seasons","subtopic":"Summer"}'
521 request, channel = self.make_request("PUT", self.path, content)
518 channel = self.make_request("PUT", self.path, content)
522519 self.assertEquals(200, channel.code, msg=channel.result["body"])
523520
524521 # valid get
525 request, channel = self.make_request("GET", self.path)
522 channel = self.make_request("GET", self.path)
526523 self.assertEquals(200, channel.code, msg=channel.result["body"])
527524 self.assert_dict(json.loads(content), channel.json_body)
528525
538535 def test_invalid_puts(self):
539536 path = "/rooms/%s/state/m.room.member/%s" % (self.room_id, self.user_id)
540537 # missing keys or invalid json
541 request, channel = self.make_request("PUT", path, "{}")
542 self.assertEquals(400, channel.code, msg=channel.result["body"])
543
544 request, channel = self.make_request("PUT", path, '{"_name":"bo"}')
545 self.assertEquals(400, channel.code, msg=channel.result["body"])
546
547 request, channel = self.make_request("PUT", path, '{"nao')
548 self.assertEquals(400, channel.code, msg=channel.result["body"])
549
550 request, channel = self.make_request(
551 "PUT", path, b'[{"_name":"bo"},{"_name":"jill"}]'
552 )
553 self.assertEquals(400, channel.code, msg=channel.result["body"])
554
555 request, channel = self.make_request("PUT", path, "text only")
556 self.assertEquals(400, channel.code, msg=channel.result["body"])
557
558 request, channel = self.make_request("PUT", path, "")
538 channel = self.make_request("PUT", path, "{}")
539 self.assertEquals(400, channel.code, msg=channel.result["body"])
540
541 channel = self.make_request("PUT", path, '{"_name":"bo"}')
542 self.assertEquals(400, channel.code, msg=channel.result["body"])
543
544 channel = self.make_request("PUT", path, '{"nao')
545 self.assertEquals(400, channel.code, msg=channel.result["body"])
546
547 channel = self.make_request("PUT", path, b'[{"_name":"bo"},{"_name":"jill"}]')
548 self.assertEquals(400, channel.code, msg=channel.result["body"])
549
550 channel = self.make_request("PUT", path, "text only")
551 self.assertEquals(400, channel.code, msg=channel.result["body"])
552
553 channel = self.make_request("PUT", path, "")
559554 self.assertEquals(400, channel.code, msg=channel.result["body"])
560555
561556 # valid keys, wrong types
564559 Membership.JOIN,
565560 Membership.LEAVE,
566561 )
567 request, channel = self.make_request("PUT", path, content.encode("ascii"))
562 channel = self.make_request("PUT", path, content.encode("ascii"))
568563 self.assertEquals(400, channel.code, msg=channel.result["body"])
569564
570565 def test_rooms_members_self(self):
575570
576571 # valid join message (NOOP since we made the room)
577572 content = '{"membership":"%s"}' % Membership.JOIN
578 request, channel = self.make_request("PUT", path, content.encode("ascii"))
573 channel = self.make_request("PUT", path, content.encode("ascii"))
579574 self.assertEquals(200, channel.code, msg=channel.result["body"])
580575
581 request, channel = self.make_request("GET", path, None)
576 channel = self.make_request("GET", path, None)
582577 self.assertEquals(200, channel.code, msg=channel.result["body"])
583578
584579 expected_response = {"membership": Membership.JOIN}
593588
594589 # valid invite message
595590 content = '{"membership":"%s"}' % Membership.INVITE
596 request, channel = self.make_request("PUT", path, content)
591 channel = self.make_request("PUT", path, content)
597592 self.assertEquals(200, channel.code, msg=channel.result["body"])
598593
599 request, channel = self.make_request("GET", path, None)
594 channel = self.make_request("GET", path, None)
600595 self.assertEquals(200, channel.code, msg=channel.result["body"])
601596 self.assertEquals(json.loads(content), channel.json_body)
602597
612607 Membership.INVITE,
613608 "Join us!",
614609 )
615 request, channel = self.make_request("PUT", path, content)
610 channel = self.make_request("PUT", path, content)
616611 self.assertEquals(200, channel.code, msg=channel.result["body"])
617612
618 request, channel = self.make_request("GET", path, None)
613 channel = self.make_request("GET", path, None)
619614 self.assertEquals(200, channel.code, msg=channel.result["body"])
620615 self.assertEquals(json.loads(content), channel.json_body)
621616
624619 user_id = "@sid1:red"
625620
626621 servlets = [
622 admin.register_servlets,
627623 profile.register_servlets,
628624 room.register_servlets,
629625 ]
665661
666662 # Update the display name for the user.
667663 path = "/_matrix/client/r0/profile/%s/displayname" % self.user_id
668 request, channel = self.make_request("PUT", path, {"displayname": "John Doe"})
664 channel = self.make_request("PUT", path, {"displayname": "John Doe"})
669665 self.assertEquals(channel.code, 200, channel.json_body)
670666
671667 # Check that all the rooms have been sent a profile update into.
675671 self.user_id,
676672 )
677673
678 request, channel = self.make_request("GET", path)
674 channel = self.make_request("GET", path)
679675 self.assertEquals(channel.code, 200)
680676
681677 self.assertIn("displayname", channel.json_body)
699695 # Make sure we send more requests than the rate-limiting config would allow
700696 # if all of these requests ended up joining the user to a room.
701697 for i in range(4):
702 request, channel = self.make_request("POST", path % room_id, {})
698 channel = self.make_request("POST", path % room_id, {})
703699 self.assertEquals(channel.code, 200)
700
701 @unittest.override_config(
702 {
703 "rc_joins": {"local": {"per_second": 0.5, "burst_count": 3}},
704 "auto_join_rooms": ["#room:red", "#room2:red", "#room3:red", "#room4:red"],
705 "autocreate_auto_join_rooms": True,
706 },
707 )
708 def test_autojoin_rooms(self):
709 user_id = self.register_user("testuser", "password")
710
711 # Check that the new user successfully joined the four rooms
712 rooms = self.get_success(self.hs.get_datastore().get_rooms_for_user(user_id))
713 self.assertEqual(len(rooms), 4)
704714
705715
706716 class RoomMessagesTestCase(RoomBase):
714724 def test_invalid_puts(self):
715725 path = "/rooms/%s/send/m.room.message/mid1" % (urlparse.quote(self.room_id))
716726 # missing keys or invalid json
717 request, channel = self.make_request("PUT", path, b"{}")
718 self.assertEquals(400, channel.code, msg=channel.result["body"])
719
720 request, channel = self.make_request("PUT", path, b'{"_name":"bo"}')
721 self.assertEquals(400, channel.code, msg=channel.result["body"])
722
723 request, channel = self.make_request("PUT", path, b'{"nao')
724 self.assertEquals(400, channel.code, msg=channel.result["body"])
725
726 request, channel = self.make_request(
727 "PUT", path, b'[{"_name":"bo"},{"_name":"jill"}]'
728 )
729 self.assertEquals(400, channel.code, msg=channel.result["body"])
730
731 request, channel = self.make_request("PUT", path, b"text only")
732 self.assertEquals(400, channel.code, msg=channel.result["body"])
733
734 request, channel = self.make_request("PUT", path, b"")
727 channel = self.make_request("PUT", path, b"{}")
728 self.assertEquals(400, channel.code, msg=channel.result["body"])
729
730 channel = self.make_request("PUT", path, b'{"_name":"bo"}')
731 self.assertEquals(400, channel.code, msg=channel.result["body"])
732
733 channel = self.make_request("PUT", path, b'{"nao')
734 self.assertEquals(400, channel.code, msg=channel.result["body"])
735
736 channel = self.make_request("PUT", path, b'[{"_name":"bo"},{"_name":"jill"}]')
737 self.assertEquals(400, channel.code, msg=channel.result["body"])
738
739 channel = self.make_request("PUT", path, b"text only")
740 self.assertEquals(400, channel.code, msg=channel.result["body"])
741
742 channel = self.make_request("PUT", path, b"")
735743 self.assertEquals(400, channel.code, msg=channel.result["body"])
736744
737745 def test_rooms_messages_sent(self):
738746 path = "/rooms/%s/send/m.room.message/mid1" % (urlparse.quote(self.room_id))
739747
740748 content = b'{"body":"test","msgtype":{"type":"a"}}'
741 request, channel = self.make_request("PUT", path, content)
749 channel = self.make_request("PUT", path, content)
742750 self.assertEquals(400, channel.code, msg=channel.result["body"])
743751
744752 # custom message types
745753 content = b'{"body":"test","msgtype":"test.custom.text"}'
746 request, channel = self.make_request("PUT", path, content)
754 channel = self.make_request("PUT", path, content)
747755 self.assertEquals(200, channel.code, msg=channel.result["body"])
748756
749757 # m.text message type
750758 path = "/rooms/%s/send/m.room.message/mid2" % (urlparse.quote(self.room_id))
751759 content = b'{"body":"test2","msgtype":"m.text"}'
752 request, channel = self.make_request("PUT", path, content)
760 channel = self.make_request("PUT", path, content)
753761 self.assertEquals(200, channel.code, msg=channel.result["body"])
754762
755763
763771 self.room_id = self.helper.create_room_as(self.user_id)
764772
765773 def test_initial_sync(self):
766 request, channel = self.make_request(
767 "GET", "/rooms/%s/initialSync" % self.room_id
768 )
774 channel = self.make_request("GET", "/rooms/%s/initialSync" % self.room_id)
769775 self.assertEquals(200, channel.code)
770776
771777 self.assertEquals(self.room_id, channel.json_body["room_id"])
806812
807813 def test_topo_token_is_accepted(self):
808814 token = "t1-0_0_0_0_0_0_0_0_0"
809 request, channel = self.make_request(
815 channel = self.make_request(
810816 "GET", "/rooms/%s/messages?access_token=x&from=%s" % (self.room_id, token)
811817 )
812818 self.assertEquals(200, channel.code)
817823
818824 def test_stream_token_is_accepted_for_fwd_pagianation(self):
819825 token = "s0_0_0_0_0_0_0_0_0"
820 request, channel = self.make_request(
826 channel = self.make_request(
821827 "GET", "/rooms/%s/messages?access_token=x&from=%s" % (self.room_id, token)
822828 )
823829 self.assertEquals(200, channel.code)
850856 self.helper.send(self.room_id, "message 3")
851857
852858 # Check that we get the first and second message when querying /messages.
853 request, channel = self.make_request(
859 channel = self.make_request(
854860 "GET",
855861 "/rooms/%s/messages?access_token=x&from=%s&dir=b&filter=%s"
856862 % (
878884
879885 # Check that we only get the second message through /message now that the first
880886 # has been purged.
881 request, channel = self.make_request(
887 channel = self.make_request(
882888 "GET",
883889 "/rooms/%s/messages?access_token=x&from=%s&dir=b&filter=%s"
884890 % (
895901 # Check that we get no event, but also no error, when querying /messages with
896902 # the token that was pointing at the first event, because we don't have it
897903 # anymore.
898 request, channel = self.make_request(
904 channel = self.make_request(
899905 "GET",
900906 "/rooms/%s/messages?access_token=x&from=%s&dir=b&filter=%s"
901907 % (
954960 self.helper.send(self.room, body="Hi!", tok=self.other_access_token)
955961 self.helper.send(self.room, body="There!", tok=self.other_access_token)
956962
957 request, channel = self.make_request(
963 channel = self.make_request(
958964 "POST",
959965 "/search?access_token=%s" % (self.access_token,),
960966 {
983989 self.helper.send(self.room, body="Hi!", tok=self.other_access_token)
984990 self.helper.send(self.room, body="There!", tok=self.other_access_token)
985991
986 request, channel = self.make_request(
992 channel = self.make_request(
987993 "POST",
988994 "/search?access_token=%s" % (self.access_token,),
989995 {
10311037 return self.hs
10321038
10331039 def test_restricted_no_auth(self):
1034 request, channel = self.make_request("GET", self.url)
1040 channel = self.make_request("GET", self.url)
10351041 self.assertEqual(channel.code, 401, channel.result)
10361042
10371043 def test_restricted_auth(self):
10381044 self.register_user("user", "pass")
10391045 tok = self.login("user", "pass")
10401046
1041 request, channel = self.make_request("GET", self.url, access_token=tok)
1047 channel = self.make_request("GET", self.url, access_token=tok)
10421048 self.assertEqual(channel.code, 200, channel.result)
10431049
10441050
10661072 self.displayname = "test user"
10671073 data = {"displayname": self.displayname}
10681074 request_data = json.dumps(data)
1069 request, channel = self.make_request(
1075 channel = self.make_request(
10701076 "PUT",
10711077 "/_matrix/client/r0/profile/%s/displayname" % (self.user_id,),
10721078 request_data,
10791085 def test_per_room_profile_forbidden(self):
10801086 data = {"membership": "join", "displayname": "other test user"}
10811087 request_data = json.dumps(data)
1082 request, channel = self.make_request(
1088 channel = self.make_request(
10831089 "PUT",
10841090 "/_matrix/client/r0/rooms/%s/state/m.room.member/%s"
10851091 % (self.room_id, self.user_id),
10891095 self.assertEqual(channel.code, 200, channel.result)
10901096 event_id = channel.json_body["event_id"]
10911097
1092 request, channel = self.make_request(
1098 channel = self.make_request(
10931099 "GET",
10941100 "/_matrix/client/r0/rooms/%s/event/%s" % (self.room_id, event_id),
10951101 access_token=self.tok,
11221128
11231129 def test_join_reason(self):
11241130 reason = "hello"
1125 request, channel = self.make_request(
1131 channel = self.make_request(
11261132 "POST",
11271133 "/_matrix/client/r0/rooms/{}/join".format(self.room_id),
11281134 content={"reason": reason},
11361142 self.helper.join(self.room_id, user=self.second_user_id, tok=self.second_tok)
11371143
11381144 reason = "hello"
1139 request, channel = self.make_request(
1145 channel = self.make_request(
11401146 "POST",
11411147 "/_matrix/client/r0/rooms/{}/leave".format(self.room_id),
11421148 content={"reason": reason},
11501156 self.helper.join(self.room_id, user=self.second_user_id, tok=self.second_tok)
11511157
11521158 reason = "hello"
1153 request, channel = self.make_request(
1159 channel = self.make_request(
11541160 "POST",
11551161 "/_matrix/client/r0/rooms/{}/kick".format(self.room_id),
11561162 content={"reason": reason, "user_id": self.second_user_id},
11641170 self.helper.join(self.room_id, user=self.second_user_id, tok=self.second_tok)
11651171
11661172 reason = "hello"
1167 request, channel = self.make_request(
1173 channel = self.make_request(
11681174 "POST",
11691175 "/_matrix/client/r0/rooms/{}/ban".format(self.room_id),
11701176 content={"reason": reason, "user_id": self.second_user_id},
11761182
11771183 def test_unban_reason(self):
11781184 reason = "hello"
1179 request, channel = self.make_request(
1185 channel = self.make_request(
11801186 "POST",
11811187 "/_matrix/client/r0/rooms/{}/unban".format(self.room_id),
11821188 content={"reason": reason, "user_id": self.second_user_id},
11881194
11891195 def test_invite_reason(self):
11901196 reason = "hello"
1191 request, channel = self.make_request(
1197 channel = self.make_request(
11921198 "POST",
11931199 "/_matrix/client/r0/rooms/{}/invite".format(self.room_id),
11941200 content={"reason": reason, "user_id": self.second_user_id},
12071213 )
12081214
12091215 reason = "hello"
1210 request, channel = self.make_request(
1216 channel = self.make_request(
12111217 "POST",
12121218 "/_matrix/client/r0/rooms/{}/leave".format(self.room_id),
12131219 content={"reason": reason},
12181224 self._check_for_reason(reason)
12191225
12201226 def _check_for_reason(self, reason):
1221 request, channel = self.make_request(
1227 channel = self.make_request(
12221228 "GET",
12231229 "/_matrix/client/r0/rooms/{}/state/m.room.member/{}".format(
12241230 self.room_id, self.second_user_id
12671273 """Test that we can filter by a label on a /context request."""
12681274 event_id = self._send_labelled_messages_in_room()
12691275
1270 request, channel = self.make_request(
1276 channel = self.make_request(
12711277 "GET",
12721278 "/rooms/%s/context/%s?filter=%s"
12731279 % (self.room_id, event_id, json.dumps(self.FILTER_LABELS)),
12971303 """Test that we can filter by the absence of a label on a /context request."""
12981304 event_id = self._send_labelled_messages_in_room()
12991305
1300 request, channel = self.make_request(
1306 channel = self.make_request(
13011307 "GET",
13021308 "/rooms/%s/context/%s?filter=%s"
13031309 % (self.room_id, event_id, json.dumps(self.FILTER_NOT_LABELS)),
13321338 """
13331339 event_id = self._send_labelled_messages_in_room()
13341340
1335 request, channel = self.make_request(
1341 channel = self.make_request(
13361342 "GET",
13371343 "/rooms/%s/context/%s?filter=%s"
13381344 % (self.room_id, event_id, json.dumps(self.FILTER_LABELS_NOT_LABELS)),
13601366 self._send_labelled_messages_in_room()
13611367
13621368 token = "s0_0_0_0_0_0_0_0_0"
1363 request, channel = self.make_request(
1369 channel = self.make_request(
13641370 "GET",
13651371 "/rooms/%s/messages?access_token=%s&from=%s&filter=%s"
13661372 % (self.room_id, self.tok, token, json.dumps(self.FILTER_LABELS)),
13771383 self._send_labelled_messages_in_room()
13781384
13791385 token = "s0_0_0_0_0_0_0_0_0"
1380 request, channel = self.make_request(
1386 channel = self.make_request(
13811387 "GET",
13821388 "/rooms/%s/messages?access_token=%s&from=%s&filter=%s"
13831389 % (self.room_id, self.tok, token, json.dumps(self.FILTER_NOT_LABELS)),
14001406 self._send_labelled_messages_in_room()
14011407
14021408 token = "s0_0_0_0_0_0_0_0_0"
1403 request, channel = self.make_request(
1409 channel = self.make_request(
14041410 "GET",
14051411 "/rooms/%s/messages?access_token=%s&from=%s&filter=%s"
14061412 % (
14311437
14321438 self._send_labelled_messages_in_room()
14331439
1434 request, channel = self.make_request(
1440 channel = self.make_request(
14351441 "POST", "/search?access_token=%s" % self.tok, request_data
14361442 )
14371443
14661472
14671473 self._send_labelled_messages_in_room()
14681474
1469 request, channel = self.make_request(
1475 channel = self.make_request(
14701476 "POST", "/search?access_token=%s" % self.tok, request_data
14711477 )
14721478
15131519
15141520 self._send_labelled_messages_in_room()
15151521
1516 request, channel = self.make_request(
1522 channel = self.make_request(
15171523 "POST", "/search?access_token=%s" % self.tok, request_data
15181524 )
15191525
16341640
16351641 # Check that we can still see the messages before the erasure request.
16361642
1637 request, channel = self.make_request(
1643 channel = self.make_request(
16381644 "GET",
16391645 '/rooms/%s/context/%s?filter={"types":["m.room.message"]}'
16401646 % (self.room_id, event_id),
16981704 # Check that a user that joined the room after the erasure request can't see
16991705 # the messages anymore.
17001706
1701 request, channel = self.make_request(
1707 channel = self.make_request(
17021708 "GET",
17031709 '/rooms/%s/context/%s?filter={"types":["m.room.message"]}'
17041710 % (self.room_id, event_id),
17881794
17891795 def _get_aliases(self, access_token: str, expected_code: int = 200) -> JsonDict:
17901796 """Calls the endpoint under test. returns the json response object."""
1791 request, channel = self.make_request(
1797 channel = self.make_request(
17921798 "GET",
17931799 "/_matrix/client/unstable/org.matrix.msc2432/rooms/%s/aliases"
17941800 % (self.room_id,),
18091815 data = {"room_id": self.room_id}
18101816 request_data = json.dumps(data)
18111817
1812 request, channel = self.make_request(
1818 channel = self.make_request(
18131819 "PUT", url, request_data, access_token=self.room_owner_tok
18141820 )
18151821 self.assertEqual(channel.code, expected_code, channel.result)
18391845 data = {"room_id": self.room_id}
18401846 request_data = json.dumps(data)
18411847
1842 request, channel = self.make_request(
1848 channel = self.make_request(
18431849 "PUT", url, request_data, access_token=self.room_owner_tok
18441850 )
18451851 self.assertEqual(channel.code, expected_code, channel.result)
18461852
18471853 def _get_canonical_alias(self, expected_code: int = 200) -> JsonDict:
18481854 """Calls the endpoint under test. returns the json response object."""
1849 request, channel = self.make_request(
1855 channel = self.make_request(
18501856 "GET",
18511857 "rooms/%s/state/m.room.canonical_alias" % (self.room_id,),
18521858 access_token=self.room_owner_tok,
18581864
18591865 def _set_canonical_alias(self, content: str, expected_code: int = 200) -> JsonDict:
18601866 """Calls the endpoint under test. returns the json response object."""
1861 request, channel = self.make_request(
1867 channel = self.make_request(
18621868 "PUT",
18631869 "rooms/%s/state/m.room.canonical_alias" % (self.room_id,),
18641870 json.dumps(content),
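The recurring change across these hunks is that `make_request` (both the `tests.server` helper and the `HomeserverTestCase` method) now returns just the `FakeChannel` instead of a `(request, channel)` tuple, with the response code and parsed body read off the channel. A minimal sketch of the new calling convention, using an invented stand-in channel and helper rather than Synapse's real test harness:

```python
# Illustrative sketch only: this FakeChannel and make_request are invented
# stand-ins for Synapse's tests.server equivalents, to show the calling
# convention after this change (channel only, no request object).
from dataclasses import dataclass, field


@dataclass
class FakeChannel:
    code: int = 200
    json_body: dict = field(default_factory=dict)


def make_request(method: str, path: str, content: bytes = b"") -> FakeChannel:
    # The real helper drives the request through a test Site; this stub
    # just returns a canned successful channel.
    return FakeChannel(code=200, json_body={"ok": True})


# New-style call: no tuple unpacking; assertions go against the channel.
channel = make_request("GET", "/_matrix/client/r0/account/whoami")
assert channel.code == 200
assert channel.json_body == {"ok": True}
```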
3838 def make_homeserver(self, reactor, clock):
3939
4040 hs = self.setup_test_homeserver(
41 "red", http_client=None, federation_client=Mock(),
41 "red", federation_http_client=None, federation_client=Mock(),
4242 )
4343
4444 self.event_source = hs.get_event_sources().sources["typing"]
9393 self.helper.join(self.room_id, user="@jim:red")
9494
9595 def test_set_typing(self):
96 request, channel = self.make_request(
96 channel = self.make_request(
9797 "PUT",
9898 "/rooms/%s/typing/%s" % (self.room_id, self.user_id),
9999 b'{"typing": true, "timeout": 30000}',
116116 )
117117
118118 def test_set_not_typing(self):
119 request, channel = self.make_request(
119 channel = self.make_request(
120120 "PUT",
121121 "/rooms/%s/typing/%s" % (self.room_id, self.user_id),
122122 b'{"typing": false}',
124124 self.assertEquals(200, channel.code)
125125
126126 def test_typing_timeout(self):
127 request, channel = self.make_request(
127 channel = self.make_request(
128128 "PUT",
129129 "/rooms/%s/typing/%s" % (self.room_id, self.user_id),
130130 b'{"typing": true, "timeout": 30000}',
137137
138138 self.assertEquals(self.event_source.get_current_key(), 2)
139139
140 request, channel = self.make_request(
140 channel = self.make_request(
141141 "PUT",
142142 "/rooms/%s/typing/%s" % (self.room_id, self.user_id),
143143 b'{"typing": true, "timeout": 30000}',
11 # Copyright 2014-2016 OpenMarket Ltd
22 # Copyright 2017 Vector Creations Ltd
33 # Copyright 2018-2019 New Vector Ltd
4 # Copyright 2019 The Matrix.org Foundation C.I.C.
4 # Copyright 2019-2020 The Matrix.org Foundation C.I.C.
55 #
66 # Licensed under the Apache License, Version 2.0 (the "License");
77 # you may not use this file except in compliance with the License.
1616 # limitations under the License.
1717
1818 import json
19 import re
1920 import time
21 import urllib.parse
2022 from typing import Any, Dict, Optional
23
24 from mock import patch
2125
2226 import attr
2327
2529 from twisted.web.server import Site
2630
2731 from synapse.api.constants import Membership
32 from synapse.types import JsonDict
2833
2934 from tests.server import FakeSite, make_request
35 from tests.test_utils import FakeResponse
3036
3137
3238 @attr.s
7480 if tok:
7581 path = path + "?access_token=%s" % tok
7682
77 _, channel = make_request(
83 channel = make_request(
7884 self.hs.get_reactor(),
7985 self.site,
8086 "POST",
150156 data = {"membership": membership}
151157 data.update(extra_data)
152158
153 _, channel = make_request(
159 channel = make_request(
154160 self.hs.get_reactor(),
155161 self.site,
156162 "PUT",
185191 if tok:
186192 path = path + "?access_token=%s" % tok
187193
188 _, channel = make_request(
194 channel = make_request(
189195 self.hs.get_reactor(),
190196 self.site,
191197 "PUT",
241247 if body is not None:
242248 content = json.dumps(body).encode("utf8")
243249
244 _, channel = make_request(
245 self.hs.get_reactor(), self.site, method, path, content
246 )
250 channel = make_request(self.hs.get_reactor(), self.site, method, path, content)
247251
248252 assert int(channel.result["code"]) == expect_code, (
249253 "Expected: %d, got: %d, resp: %r"
326330 """
327331 image_length = len(image_data)
328332 path = "/_matrix/media/r0/upload?filename=%s" % (filename,)
329 _, channel = make_request(
333 channel = make_request(
330334 self.hs.get_reactor(),
331335 FakeSite(resource),
332336 "POST",
343347 )
344348
345349 return channel.json_body
350
351 def login_via_oidc(self, remote_user_id: str) -> JsonDict:
352 """Log in (as a new user) via OIDC
353
354 Returns the result of the final token login.
355
356 Requires that "oidc_config" in the homeserver config be set appropriately
357 (TEST_OIDC_CONFIG is a suitable example) - and by implication, needs a
358 "public_base_url".
359
360 Also requires the login servlet and the OIDC callback resource to be mounted at
361 the normal places.
362 """
363 client_redirect_url = "https://x"
364
365 # first hit the redirect url (which will issue a cookie and state)
366 channel = make_request(
367 self.hs.get_reactor(),
368 self.site,
369 "GET",
370 "/login/sso/redirect?redirectUrl=" + client_redirect_url,
371 )
372 # that will redirect to the OIDC IdP, but we skip that and go straight
373 # back to synapse's OIDC callback resource. However, we do need the "state"
374 # param that synapse passes to the IdP via query params, and the cookie that
375 # synapse passes to the client.
376 assert channel.code == 302
377 oauth_uri = channel.headers.getRawHeaders("Location")[0]
378 params = urllib.parse.parse_qs(urllib.parse.urlparse(oauth_uri).query)
379 redirect_uri = "%s?%s" % (
380 urllib.parse.urlparse(params["redirect_uri"][0]).path,
381 urllib.parse.urlencode({"state": params["state"][0], "code": "TEST_CODE"}),
382 )
383 cookies = {}
384 for h in channel.headers.getRawHeaders("Set-Cookie"):
385 parts = h.split(";")
386 k, v = parts[0].split("=", maxsplit=1)
387 cookies[k] = v
388
389 # before we hit the callback uri, stub out some methods in the http client so
390 # that we don't have to handle full HTTPS requests.
391
392 # (expected url, json response) pairs, in the order we expect them.
393 expected_requests = [
394 # first we get a hit to the token endpoint, which we tell to return
395 # a dummy OIDC access token
396 ("https://issuer.test/token", {"access_token": "TEST"}),
397 # and then one to the user_info endpoint, which returns our remote user id.
398 ("https://issuer.test/userinfo", {"sub": remote_user_id}),
399 ]
400
401 async def mock_req(method: str, uri: str, data=None, headers=None):
402 (expected_uri, resp_obj) = expected_requests.pop(0)
403 assert uri == expected_uri
404 resp = FakeResponse(
405 code=200, phrase=b"OK", body=json.dumps(resp_obj).encode("utf-8"),
406 )
407 return resp
408
409 with patch.object(self.hs.get_proxied_http_client(), "request", mock_req):
410 # now hit the callback URI with the right params and a made-up code
411 channel = make_request(
412 self.hs.get_reactor(),
413 self.site,
414 "GET",
415 redirect_uri,
416 custom_headers=[
417 ("Cookie", "%s=%s" % (k, v)) for (k, v) in cookies.items()
418 ],
419 )
420
421 # expect a confirmation page
422 assert channel.code == 200
423
424 # fish the matrix login token out of the body of the confirmation page
425 m = re.search(
426 'a href="%s.*loginToken=([^"]*)"' % (client_redirect_url,),
427 channel.result["body"].decode("utf-8"),
428 )
429 assert m
430 login_token = m.group(1)
431
432 # finally, submit the matrix login token to the login API, which gives us our
433 # matrix access token and device id.
434 channel = make_request(
435 self.hs.get_reactor(),
436 self.site,
437 "POST",
438 "/login",
439 content={"type": "m.login.token", "token": login_token},
440 )
441 assert channel.code == 200
442 return channel.json_body
443
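The `login_via_oidc` helper above leans on stdlib `urllib.parse` to pull the `state` parameter out of the IdP redirect and rebuild the callback URI, and on manual `Set-Cookie` splitting. A self-contained sketch of just that plumbing, with an invented example URL and cookie header:

```python
# Sketch of the query-string and cookie handling in login_via_oidc above.
# The oauth_uri and Set-Cookie values here are invented for illustration.
import urllib.parse

oauth_uri = (
    "https://issuer.test/auth?response_type=code&state=abc123"
    "&redirect_uri=https%3A%2F%2Fsynapse.test%2F_synapse%2Foidc%2Fcallback"
)
params = urllib.parse.parse_qs(urllib.parse.urlparse(oauth_uri).query)

# Keep only the path of the registered redirect_uri, then append the state
# plus a made-up authorisation code, mirroring the helper.
redirect_uri = "%s?%s" % (
    urllib.parse.urlparse(params["redirect_uri"][0]).path,
    urllib.parse.urlencode({"state": params["state"][0], "code": "TEST_CODE"}),
)

# Set-Cookie headers are reduced to bare name=value pairs the same way the
# helper does: drop attributes after the first ";", split on the first "=".
set_cookie_headers = ["oidc_session=sess1; Path=/_synapse/oidc; HttpOnly"]
cookies = {}
for h in set_cookie_headers:
    k, v = h.split(";")[0].split("=", maxsplit=1)
    cookies[k] = v
```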
444
445 # an 'oidc_config' suitable for login_via_oidc.
446 TEST_OIDC_CONFIG = {
447 "enabled": True,
448 "discover": False,
449 "issuer": "https://issuer.test",
450 "client_id": "test-client-id",
451 "client_secret": "test-client-secret",
452 "scopes": ["profile"],
453 "authorization_endpoint": "https://z",
454 "token_endpoint": "https://issuer.test/token",
455 "userinfo_endpoint": "https://issuer.test/userinfo",
456 "user_mapping_provider": {"config": {"localpart_template": "{{ user.sub }}"}},
457 }
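`TEST_OIDC_CONFIG` is meant to be copied into a test's homeserver config and extended, which is what the UI-auth tests in this diff do with `allow_existing_users` and `public_baseurl`. A sketch of that pattern with plain dicts (the values are the ones above; the copy keeps the shared dict unmutated):

```python
# Sketch of extending TEST_OIDC_CONFIG in a test's default_config; the dict
# literal is the one defined above, reproduced so this runs standalone.
TEST_OIDC_CONFIG = {
    "enabled": True,
    "discover": False,
    "issuer": "https://issuer.test",
    "client_id": "test-client-id",
    "client_secret": "test-client-secret",
    "scopes": ["profile"],
    "authorization_endpoint": "https://z",
    "token_endpoint": "https://issuer.test/token",
    "userinfo_endpoint": "https://issuer.test/userinfo",
    "user_mapping_provider": {"config": {"localpart_template": "{{ user.sub }}"}},
}

config = {}
oidc_config = dict(TEST_OIDC_CONFIG)  # copy, so the shared dict stays pristine
oidc_config["allow_existing_users"] = True
config["oidc_config"] = oidc_config
config["public_baseurl"] = "https://synapse.test"
```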
1818 import re
1919 from email.parser import Parser
2020 from typing import Optional
21 from urllib.parse import urlencode
2221
2322 import pkg_resources
2423
240239 self.assertIsNotNone(session_id)
241240
242241 def _request_token(self, email, client_secret):
243 request, channel = self.make_request(
242 channel = self.make_request(
244243 "POST",
245244 b"account/password/email/requestToken",
246245 {"client_secret": client_secret, "email": email, "send_attempt": 1},
254253 path = link.replace("https://example.com", "")
255254
256255 # Load the password reset confirmation page
257 request, channel = make_request(
256 channel = make_request(
258257 self.reactor,
259258 FakeSite(self.submit_token_resource),
260259 "GET",
267266 # Now POST to the same endpoint, mimicking the same behaviour as clicking the
268267 # password reset confirm button
269268
270 # Send arguments as url-encoded form data, matching the template's behaviour
271 form_args = []
272 for key, value_list in request.args.items():
273 for value in value_list:
274 arg = (key, value)
275 form_args.append(arg)
276
277269 # Confirm the password reset
278 request, channel = make_request(
270 channel = make_request(
279271 self.reactor,
280272 FakeSite(self.submit_token_resource),
281273 "POST",
282274 path,
283 content=urlencode(form_args).encode("utf8"),
275 content=b"",
284276 shorthand=False,
285277 content_is_form=True,
286278 )
309301 def _reset_password(
310302 self, new_password, session_id, client_secret, expected_code=200
311303 ):
312 request, channel = self.make_request(
304 channel = self.make_request(
313305 "POST",
314306 b"account/password",
315307 {
351343 self.assertTrue(self.get_success(store.get_user_deactivated_status(user_id)))
352344
353345 # Check that this access token has been invalidated.
354 request, channel = self.make_request("GET", "account/whoami")
355 self.assertEqual(request.code, 401)
346 channel = self.make_request("GET", "account/whoami")
347 self.assertEqual(channel.code, 401)
356348
357349 def test_pending_invites(self):
358350 """Tests that deactivating a user rejects every pending invite for them."""
406398 "erase": False,
407399 }
408400 )
409 request, channel = self.make_request(
401 channel = self.make_request(
410402 "POST", "account/deactivate", request_data, access_token=tok
411403 )
412 self.assertEqual(request.code, 200)
404 self.assertEqual(channel.code, 200)
413405
414406
415407 class ThreepidEmailRestTestCase(unittest.HomeserverTestCase):
529521
530522 self._validate_token(link)
531523
532 request, channel = self.make_request(
524 channel = self.make_request(
533525 "POST",
534526 b"/_matrix/client/unstable/account/3pid/add",
535527 {
547539 self.assertEqual(Codes.FORBIDDEN, channel.json_body["errcode"])
548540
549541 # Get user
550 request, channel = self.make_request(
542 channel = self.make_request(
551543 "GET", self.url_3pid, access_token=self.user_id_tok,
552544 )
553545
568560 )
569561 )
570562
571 request, channel = self.make_request(
563 channel = self.make_request(
572564 "POST",
573565 b"account/3pid/delete",
574566 {"medium": "email", "address": self.email},
577569 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
578570
579571 # Get user
580 request, channel = self.make_request(
572 channel = self.make_request(
581573 "GET", self.url_3pid, access_token=self.user_id_tok,
582574 )
583575
600592 )
601593 )
602594
603 request, channel = self.make_request(
595 channel = self.make_request(
604596 "POST",
605597 b"account/3pid/delete",
606598 {"medium": "email", "address": self.email},
611603 self.assertEqual(Codes.FORBIDDEN, channel.json_body["errcode"])
612604
613605 # Get user
614 request, channel = self.make_request(
606 channel = self.make_request(
615607 "GET", self.url_3pid, access_token=self.user_id_tok,
616608 )
617609
628620 self.assertEquals(len(self.email_attempts), 1)
629621
630622 # Attempt to add email without clicking the link
631 request, channel = self.make_request(
623 channel = self.make_request(
632624 "POST",
633625 b"/_matrix/client/unstable/account/3pid/add",
634626 {
646638 self.assertEqual(Codes.THREEPID_AUTH_FAILED, channel.json_body["errcode"])
647639
648640 # Get user
649 request, channel = self.make_request(
641 channel = self.make_request(
650642 "GET", self.url_3pid, access_token=self.user_id_tok,
651643 )
652644
661653 session_id = "weasle"
662654
663655 # Attempt to add email without even requesting an email
664 request, channel = self.make_request(
656 channel = self.make_request(
665657 "POST",
666658 b"/_matrix/client/unstable/account/3pid/add",
667659 {
679671 self.assertEqual(Codes.THREEPID_AUTH_FAILED, channel.json_body["errcode"])
680672
681673 # Get user
682 request, channel = self.make_request(
674 channel = self.make_request(
683675 "GET", self.url_3pid, access_token=self.user_id_tok,
684676 )
685677
783775 if next_link:
784776 body["next_link"] = next_link
785777
786 request, channel = self.make_request(
787 "POST", b"account/3pid/email/requestToken", body,
788 )
778 channel = self.make_request("POST", b"account/3pid/email/requestToken", body,)
789779 self.assertEquals(expect_code, channel.code, channel.result)
790780
791781 return channel.json_body.get("sid")
793783 def _request_token_invalid_email(
794784 self, email, expected_errcode, expected_error, client_secret="foobar",
795785 ):
796 request, channel = self.make_request(
786 channel = self.make_request(
797787 "POST",
798788 b"account/3pid/email/requestToken",
799789 {"client_secret": client_secret, "email": email, "send_attempt": 1},
806796 # Remove the host
807797 path = link.replace("https://example.com", "")
808798
809 request, channel = self.make_request("GET", path, shorthand=False)
799 channel = self.make_request("GET", path, shorthand=False)
810800 self.assertEquals(200, channel.code, channel.result)
811801
812802 def _get_link_from_email(self):
840830
841831 self._validate_token(link)
842832
843 request, channel = self.make_request(
833 channel = self.make_request(
844834 "POST",
845835 b"/_matrix/client/unstable/account/3pid/add",
846836 {
858848 self.assertEqual(200, int(channel.result["code"]), msg=channel.result["body"])
859849
860850 # Get user
861 request, channel = self.make_request(
851 channel = self.make_request(
862852 "GET", self.url_3pid, access_token=self.user_id_tok,
863853 )
864854
1111 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
1212 # See the License for the specific language governing permissions and
1313 # limitations under the License.
14 from typing import List, Union
14
15 from typing import Union
1516
1617 from twisted.internet.defer import succeed
1718
1819 import synapse.rest.admin
1920 from synapse.api.constants import LoginType
2021 from synapse.handlers.ui_auth.checkers import UserInteractiveAuthChecker
21 from synapse.http.site import SynapseRequest
2222 from synapse.rest.client.v1 import login
2323 from synapse.rest.client.v2_alpha import auth, devices, register
24 from synapse.types import JsonDict
24 from synapse.rest.oidc import OIDCResource
25 from synapse.types import JsonDict, UserID
2526
2627 from tests import unittest
28 from tests.rest.client.v1.utils import TEST_OIDC_CONFIG
2729 from tests.server import FakeChannel
2830
2931
6365
6466 def register(self, expected_response: int, body: JsonDict) -> FakeChannel:
6567 """Make a register request."""
66 request, channel = self.make_request(
67 "POST", "register", body
68 ) # type: SynapseRequest, FakeChannel
69
70 self.assertEqual(request.code, expected_response)
68 channel = self.make_request("POST", "register", body)
69
70 self.assertEqual(channel.code, expected_response)
7171 return channel
7272
7373 def recaptcha(
7777 if post_session is None:
7878 post_session = session
7979
80 request, channel = self.make_request(
80 channel = self.make_request(
8181 "GET", "auth/m.login.recaptcha/fallback/web?session=" + session
82 ) # type: SynapseRequest, FakeChannel
83 self.assertEqual(request.code, 200)
84
85 request, channel = self.make_request(
82 )
83 self.assertEqual(channel.code, 200)
84
85 channel = self.make_request(
8686 "POST",
8787 "auth/m.login.recaptcha/fallback/web?session="
8888 + post_session
8989 + "&g-recaptcha-response=a",
9090 )
91 self.assertEqual(request.code, expected_post_response)
91 self.assertEqual(channel.code, expected_post_response)
9292
9393 # The recaptcha handler is called with the response given
9494 attempts = self.recaptcha_checker.recaptcha_attempts
155155 register.register_servlets,
156156 ]
157157
158 def default_config(self):
159 config = super().default_config()
160
161 # we enable OIDC as a way of testing SSO flows
162 oidc_config = {}
163 oidc_config.update(TEST_OIDC_CONFIG)
164 oidc_config["allow_existing_users"] = True
165
166 config["oidc_config"] = oidc_config
167 config["public_baseurl"] = "https://synapse.test"
168 return config
169
170 def create_resource_dict(self):
171 resource_dict = super().create_resource_dict()
172 # mount the OIDC resource at /_synapse/oidc
173 resource_dict["/_synapse/oidc"] = OIDCResource(self.hs)
174 return resource_dict
175
158176 def prepare(self, reactor, clock, hs):
159177 self.user_pass = "pass"
160178 self.user = self.register_user("test", self.user_pass)
161 self.user_tok = self.login("test", self.user_pass)
162
163 def get_device_ids(self) -> List[str]:
164 # Get the list of devices so one can be deleted.
165 request, channel = self.make_request(
166 "GET", "devices", access_token=self.user_tok,
167 ) # type: SynapseRequest, FakeChannel
168
169 # Get the ID of the device.
170 self.assertEqual(request.code, 200)
171 return [d["device_id"] for d in channel.json_body["devices"]]
179 self.device_id = "dev1"
180 self.user_tok = self.login("test", self.user_pass, self.device_id)
172181
173182 def delete_device(
174 self, device: str, expected_response: int, body: Union[bytes, JsonDict] = b""
183 self,
184 access_token: str,
185 device: str,
186 expected_response: int,
187 body: Union[bytes, JsonDict] = b"",
175188 ) -> FakeChannel:
176189 """Delete an individual device."""
177 request, channel = self.make_request(
178 "DELETE", "devices/" + device, body, access_token=self.user_tok
179 ) # type: SynapseRequest, FakeChannel
190 channel = self.make_request(
191 "DELETE", "devices/" + device, body, access_token=access_token,
192 )
180193
181194 # Ensure the response is sane.
182 self.assertEqual(request.code, expected_response)
195 self.assertEqual(channel.code, expected_response)
183196
184197 return channel
185198
187200 """Delete 1 or more devices."""
188201 # Note that this uses the delete_devices endpoint so that we can modify
189202 # the payload half-way through some tests.
190 request, channel = self.make_request(
203 channel = self.make_request(
191204 "POST", "delete_devices", body, access_token=self.user_tok,
192 ) # type: SynapseRequest, FakeChannel
205 )
193206
194207 # Ensure the response is sane.
195 self.assertEqual(request.code, expected_response)
208 self.assertEqual(channel.code, expected_response)
196209
197210 return channel
198211
200213 """
201214 Test user interactive authentication outside of registration.
202215 """
203 device_id = self.get_device_ids()[0]
204
205216 # Attempt to delete this device.
206217 # Returns a 401 as per the spec
207 channel = self.delete_device(device_id, 401)
218 channel = self.delete_device(self.user_tok, self.device_id, 401)
208219
209220 # Grab the session
210221 session = channel.json_body["session"]
213224
214225 # Make another request providing the UI auth flow.
215226 self.delete_device(
216 device_id,
227 self.user_tok,
228 self.device_id,
217229 200,
218230 {
219231 "auth": {
232244 UIA - check that still works.
233245 """
234246
235 device_id = self.get_device_ids()[0]
236 channel = self.delete_device(device_id, 401)
247 channel = self.delete_device(self.user_tok, self.device_id, 401)
237248 session = channel.json_body["session"]
238249
239250 # Make another request providing the UI auth flow.
240251 self.delete_device(
241 device_id,
252 self.user_tok,
253 self.device_id,
242254 200,
243255 {
244256 "auth": {
261273 session ID should be rejected.
262274 """
263275 # Create a second login.
264 self.login("test", self.user_pass)
265
266 device_ids = self.get_device_ids()
267 self.assertEqual(len(device_ids), 2)
276 self.login("test", self.user_pass, "dev2")
268277
269278 # Attempt to delete the first device.
270279 # Returns a 401 as per the spec
271 channel = self.delete_devices(401, {"devices": [device_ids[0]]})
280 channel = self.delete_devices(401, {"devices": [self.device_id]})
272281
273282 # Grab the session
274283 session = channel.json_body["session"]
280289 self.delete_devices(
281290 200,
282291 {
283 "devices": [device_ids[1]],
292 "devices": ["dev2"],
284293 "auth": {
285294 "type": "m.login.password",
286295 "identifier": {"type": "m.id.user", "user": self.user},
295304 The initial requested URI cannot be modified during the user interactive authentication session.
296305 """
297306 # Create a second login.
298 self.login("test", self.user_pass)
299
300 device_ids = self.get_device_ids()
301 self.assertEqual(len(device_ids), 2)
307 self.login("test", self.user_pass, "dev2")
302308
303309 # Attempt to delete the first device.
304310 # Returns a 401 as per the spec
305 channel = self.delete_device(device_ids[0], 401)
311 channel = self.delete_device(self.user_tok, self.device_id, 401)
306312
307313 # Grab the session
308314 session = channel.json_body["session"]
311317
312318 # Make another request providing the UI auth flow, but try to delete the
313319 # second device. This results in an error.
320 #
321 # This makes use of the fact that the device ID is embedded into the URL.
314322 self.delete_device(
315 device_ids[1],
323 self.user_tok,
324 "dev2",
316325 403,
317326 {
318327 "auth": {
323332 },
324333 },
325334 )
335
336 @unittest.override_config({"ui_auth": {"session_timeout": 5 * 1000}})
337 def test_can_reuse_session(self):
338 """
339 The session can be reused if configured.
340
341 Compare to test_cannot_change_uri.
342 """
343 # Create a second and third login.
344 self.login("test", self.user_pass, "dev2")
345 self.login("test", self.user_pass, "dev3")
346
347 # Attempt to delete a device. This works since the user just logged in.
348 self.delete_device(self.user_tok, "dev2", 200)
349
350 # Move the clock forward past the validation timeout.
351 self.reactor.advance(6)
352
353 # Deleting another device throws the user into UI auth.
354 channel = self.delete_device(self.user_tok, "dev3", 401)
355
356 # Grab the session
357 session = channel.json_body["session"]
358 # Ensure that flows are what is expected.
359 self.assertIn({"stages": ["m.login.password"]}, channel.json_body["flows"])
360
361 # Make another request providing the UI auth flow.
362 self.delete_device(
363 self.user_tok,
364 "dev3",
365 200,
366 {
367 "auth": {
368 "type": "m.login.password",
369 "identifier": {"type": "m.id.user", "user": self.user},
370 "password": self.user_pass,
371 "session": session,
372 },
373 },
374 )
375
376 # Make another request, but try to delete the first device. This works
377 # due to re-using the previous session.
378 #
379 # Note that *no auth* information is provided, not even a session ID!
380 self.delete_device(self.user_tok, self.device_id, 200)
381
382 def test_does_not_offer_password_for_sso_user(self):
383 login_resp = self.helper.login_via_oidc("username")
384 user_tok = login_resp["access_token"]
385 device_id = login_resp["device_id"]
386
387 # now call the device deletion API: we should get the option to auth with SSO
388 # and not password.
389 channel = self.delete_device(user_tok, device_id, 401)
390
391 flows = channel.json_body["flows"]
392 self.assertEqual(flows, [{"stages": ["m.login.sso"]}])
393
394 def test_does_not_offer_sso_for_password_user(self):
395 # now call the device deletion API: we should get the option to auth with
396 # password and not SSO.
397 channel = self.delete_device(self.user_tok, self.device_id, 401)
398
399 flows = channel.json_body["flows"]
400 self.assertEqual(flows, [{"stages": ["m.login.password"]}])
401
402 def test_offers_both_flows_for_upgraded_user(self):
403 """A user that had a password and then logged in with SSO should get both flows
404 """
405 login_resp = self.helper.login_via_oidc(UserID.from_string(self.user).localpart)
406 self.assertEqual(login_resp["user_id"], self.user)
407
408 channel = self.delete_device(self.user_tok, self.device_id, 401)
409
410 flows = channel.json_body["flows"]
411 # we have no particular expectations of ordering here
412 self.assertIn({"stages": ["m.login.password"]}, flows)
413 self.assertIn({"stages": ["m.login.sso"]}, flows)
414 self.assertEqual(len(flows), 2)
3535 return hs
3636
3737 def test_check_auth_required(self):
38 request, channel = self.make_request("GET", self.url)
38 channel = self.make_request("GET", self.url)
3939
4040 self.assertEqual(channel.code, 401)
4141
4343 self.register_user("user", "pass")
4444 access_token = self.login("user", "pass")
4545
46 request, channel = self.make_request("GET", self.url, access_token=access_token)
46 channel = self.make_request("GET", self.url, access_token=access_token)
4747 capabilities = channel.json_body["capabilities"]
4848
4949 self.assertEqual(channel.code, 200)
6161 user = self.register_user(localpart, password)
6262 access_token = self.login(user, password)
6363
64 request, channel = self.make_request("GET", self.url, access_token=access_token)
64 channel = self.make_request("GET", self.url, access_token=access_token)
6565 capabilities = channel.json_body["capabilities"]
6666
6767 self.assertEqual(channel.code, 200)
6969 # Test case where password is handled outside of Synapse
7070 self.assertTrue(capabilities["m.change_password"]["enabled"])
7171 self.get_success(self.store.user_set_password_hash(user, None))
72 request, channel = self.make_request("GET", self.url, access_token=access_token)
72 channel = self.make_request("GET", self.url, access_token=access_token)
7373 capabilities = channel.json_body["capabilities"]
7474
7575 self.assertEqual(channel.code, 200)
3535 self.store = hs.get_datastore()
3636
3737 def test_add_filter(self):
38 request, channel = self.make_request(
38 channel = self.make_request(
3939 "POST",
4040 "/_matrix/client/r0/user/%s/filter" % (self.user_id),
4141 self.EXAMPLE_FILTER_JSON,
4848 self.assertEquals(filter.result, self.EXAMPLE_FILTER)
4949
5050 def test_add_filter_for_other_user(self):
51 request, channel = self.make_request(
51 channel = self.make_request(
5252 "POST",
5353 "/_matrix/client/r0/user/%s/filter" % ("@watermelon:test"),
5454 self.EXAMPLE_FILTER_JSON,
6060 def test_add_filter_non_local_user(self):
6161 _is_mine = self.hs.is_mine
6262 self.hs.is_mine = lambda target_user: False
63 request, channel = self.make_request(
63 channel = self.make_request(
6464 "POST",
6565 "/_matrix/client/r0/user/%s/filter" % (self.user_id),
6666 self.EXAMPLE_FILTER_JSON,
7878 )
7979 self.reactor.advance(1)
8080 filter_id = filter_id.result
81 request, channel = self.make_request(
81 channel = self.make_request(
8282 "GET", "/_matrix/client/r0/user/%s/filter/%s" % (self.user_id, filter_id)
8383 )
8484
8686 self.assertEquals(channel.json_body, self.EXAMPLE_FILTER)
8787
8888 def test_get_filter_non_existant(self):
89 request, channel = self.make_request(
89 channel = self.make_request(
9090 "GET", "/_matrix/client/r0/user/%s/filter/12382148321" % (self.user_id)
9191 )
9292
9696 # Currently invalid params do not have an appropriate errcode
9797 # in errors.py
9898 def test_get_filter_invalid_id(self):
99 request, channel = self.make_request(
99 channel = self.make_request(
100100 "GET", "/_matrix/client/r0/user/%s/filter/foobar" % (self.user_id)
101101 )
102102
104104
105105 # No ID also returns an invalid_id error
106106 def test_get_filter_no_id(self):
107 request, channel = self.make_request(
107 channel = self.make_request(
108108 "GET", "/_matrix/client/r0/user/%s/filter/" % (self.user_id)
109109 )
110110
6969 def test_get_policy(self):
7070 """Tests if the /password_policy endpoint returns the configured policy."""
7171
72 request, channel = self.make_request(
73 "GET", "/_matrix/client/r0/password_policy"
74 )
72 channel = self.make_request("GET", "/_matrix/client/r0/password_policy")
7573
7674 self.assertEqual(channel.code, 200, channel.result)
7775 self.assertEqual(
8886
8987 def test_password_too_short(self):
9088 request_data = json.dumps({"username": "kermit", "password": "shorty"})
91 request, channel = self.make_request("POST", self.register_url, request_data)
89 channel = self.make_request("POST", self.register_url, request_data)
9290
9391 self.assertEqual(channel.code, 400, channel.result)
9492 self.assertEqual(
9795
9896 def test_password_no_digit(self):
9997 request_data = json.dumps({"username": "kermit", "password": "longerpassword"})
100 request, channel = self.make_request("POST", self.register_url, request_data)
98 channel = self.make_request("POST", self.register_url, request_data)
10199
102100 self.assertEqual(channel.code, 400, channel.result)
103101 self.assertEqual(
106104
107105 def test_password_no_symbol(self):
108106 request_data = json.dumps({"username": "kermit", "password": "l0ngerpassword"})
109 request, channel = self.make_request("POST", self.register_url, request_data)
107 channel = self.make_request("POST", self.register_url, request_data)
110108
111109 self.assertEqual(channel.code, 400, channel.result)
112110 self.assertEqual(
115113
116114 def test_password_no_uppercase(self):
117115 request_data = json.dumps({"username": "kermit", "password": "l0ngerpassword!"})
118 request, channel = self.make_request("POST", self.register_url, request_data)
116 channel = self.make_request("POST", self.register_url, request_data)
119117
120118 self.assertEqual(channel.code, 400, channel.result)
121119 self.assertEqual(
124122
125123 def test_password_no_lowercase(self):
126124 request_data = json.dumps({"username": "kermit", "password": "L0NGERPASSWORD!"})
127 request, channel = self.make_request("POST", self.register_url, request_data)
125 channel = self.make_request("POST", self.register_url, request_data)
128126
129127 self.assertEqual(channel.code, 400, channel.result)
130128 self.assertEqual(
133131
134132 def test_password_compliant(self):
135133 request_data = json.dumps({"username": "kermit", "password": "L0ngerpassword!"})
136 request, channel = self.make_request("POST", self.register_url, request_data)
134 channel = self.make_request("POST", self.register_url, request_data)
137135
138136 # Getting a 401 here means the password has passed validation and the server has
139137 # responded with a list of registration flows.
159157 },
160158 }
161159 )
162 request, channel = self.make_request(
160 channel = self.make_request(
163161 "POST",
164162 "/_matrix/client/r0/account/password",
165163 request_data,
6060 self.hs.get_datastore().services_cache.append(appservice)
6161 request_data = json.dumps({"username": "as_user_kermit"})
6262
63 request, channel = self.make_request(
63 channel = self.make_request(
6464 b"POST", self.url + b"?access_token=i_am_an_app_service", request_data
6565 )
6666
7171 def test_POST_appservice_registration_invalid(self):
7272 self.appservice = None # no application service exists
7373 request_data = json.dumps({"username": "kermit"})
74 request, channel = self.make_request(
74 channel = self.make_request(
7575 b"POST", self.url + b"?access_token=i_am_an_app_service", request_data
7676 )
7777
7979
8080 def test_POST_bad_password(self):
8181 request_data = json.dumps({"username": "kermit", "password": 666})
82 request, channel = self.make_request(b"POST", self.url, request_data)
82 channel = self.make_request(b"POST", self.url, request_data)
8383
8484 self.assertEquals(channel.result["code"], b"400", channel.result)
8585 self.assertEquals(channel.json_body["error"], "Invalid password")
8686
8787 def test_POST_bad_username(self):
8888 request_data = json.dumps({"username": 777, "password": "monkey"})
89 request, channel = self.make_request(b"POST", self.url, request_data)
89 channel = self.make_request(b"POST", self.url, request_data)
9090
9191 self.assertEquals(channel.result["code"], b"400", channel.result)
9292 self.assertEquals(channel.json_body["error"], "Invalid username")
101101 "auth": {"type": LoginType.DUMMY},
102102 }
103103 request_data = json.dumps(params)
104 request, channel = self.make_request(b"POST", self.url, request_data)
104 channel = self.make_request(b"POST", self.url, request_data)
105105
106106 det_data = {
107107 "user_id": user_id,
116116 request_data = json.dumps({"username": "kermit", "password": "monkey"})
117117 self.auth_result = (None, {"username": "kermit", "password": "monkey"}, None)
118118
119 request, channel = self.make_request(b"POST", self.url, request_data)
119 channel = self.make_request(b"POST", self.url, request_data)
120120
121121 self.assertEquals(channel.result["code"], b"403", channel.result)
122122 self.assertEquals(channel.json_body["error"], "Registration has been disabled")
123 self.assertEquals(channel.json_body["errcode"], "M_FORBIDDEN")
123124
124125 def test_POST_guest_registration(self):
125126 self.hs.config.macaroon_secret_key = "test"
126127 self.hs.config.allow_guest_access = True
127128
128 request, channel = self.make_request(b"POST", self.url + b"?kind=guest", b"{}")
129 channel = self.make_request(b"POST", self.url + b"?kind=guest", b"{}")
129130
130131 det_data = {"home_server": self.hs.hostname, "device_id": "guest_device"}
131132 self.assertEquals(channel.result["code"], b"200", channel.result)
134135 def test_POST_disabled_guest_registration(self):
135136 self.hs.config.allow_guest_access = False
136137
137 request, channel = self.make_request(b"POST", self.url + b"?kind=guest", b"{}")
138 channel = self.make_request(b"POST", self.url + b"?kind=guest", b"{}")
138139
139140 self.assertEquals(channel.result["code"], b"403", channel.result)
140141 self.assertEquals(channel.json_body["error"], "Guest access is disabled")
143144 def test_POST_ratelimiting_guest(self):
144145 for i in range(0, 6):
145146 url = self.url + b"?kind=guest"
146 request, channel = self.make_request(b"POST", url, b"{}")
147 channel = self.make_request(b"POST", url, b"{}")
147148
148149 if i == 5:
149150 self.assertEquals(channel.result["code"], b"429", channel.result)
153154
154155 self.reactor.advance(retry_after_ms / 1000.0 + 1.0)
155156
156 request, channel = self.make_request(b"POST", self.url + b"?kind=guest", b"{}")
157 channel = self.make_request(b"POST", self.url + b"?kind=guest", b"{}")
157158
158159 self.assertEquals(channel.result["code"], b"200", channel.result)
159160
167168 "auth": {"type": LoginType.DUMMY},
168169 }
169170 request_data = json.dumps(params)
170 request, channel = self.make_request(b"POST", self.url, request_data)
171 channel = self.make_request(b"POST", self.url, request_data)
171172
172173 if i == 5:
173174 self.assertEquals(channel.result["code"], b"429", channel.result)
177178
178179 self.reactor.advance(retry_after_ms / 1000.0 + 1.0)
179180
180 request, channel = self.make_request(b"POST", self.url + b"?kind=guest", b"{}")
181 channel = self.make_request(b"POST", self.url + b"?kind=guest", b"{}")
181182
182183 self.assertEquals(channel.result["code"], b"200", channel.result)
183184
184185 def test_advertised_flows(self):
185 request, channel = self.make_request(b"POST", self.url, b"{}")
186 channel = self.make_request(b"POST", self.url, b"{}")
186187 self.assertEquals(channel.result["code"], b"401", channel.result)
187188 flows = channel.json_body["flows"]
188189
205206 }
206207 )
207208 def test_advertised_flows_captcha_and_terms_and_3pids(self):
208 request, channel = self.make_request(b"POST", self.url, b"{}")
209 channel = self.make_request(b"POST", self.url, b"{}")
209210 self.assertEquals(channel.result["code"], b"401", channel.result)
210211 flows = channel.json_body["flows"]
211212
237238 }
238239 )
239240 def test_advertised_flows_no_msisdn_email_required(self):
240 request, channel = self.make_request(b"POST", self.url, b"{}")
241 channel = self.make_request(b"POST", self.url, b"{}")
241242 self.assertEquals(channel.result["code"], b"401", channel.result)
242243 flows = channel.json_body["flows"]
243244
277278 )
278279 )
279280
280 request, channel = self.make_request(
281 channel = self.make_request(
281282 "POST",
282283 b"register/email/requestToken",
283284 {"client_secret": "foobar", "email": email, "send_attempt": 1},
316317
317318 # The specific endpoint doesn't matter, all we need is an authenticated
318319 # endpoint.
319 request, channel = self.make_request(b"GET", "/sync", access_token=tok)
320 channel = self.make_request(b"GET", "/sync", access_token=tok)
320321
321322 self.assertEquals(channel.result["code"], b"200", channel.result)
322323
323324 self.reactor.advance(datetime.timedelta(weeks=1).total_seconds())
324325
325 request, channel = self.make_request(b"GET", "/sync", access_token=tok)
326 channel = self.make_request(b"GET", "/sync", access_token=tok)
326327
327328 self.assertEquals(channel.result["code"], b"403", channel.result)
328329 self.assertEquals(
344345 url = "/_synapse/admin/v1/account_validity/validity"
345346 params = {"user_id": user_id}
346347 request_data = json.dumps(params)
347 request, channel = self.make_request(
348 b"POST", url, request_data, access_token=admin_tok
349 )
348 channel = self.make_request(b"POST", url, request_data, access_token=admin_tok)
350349 self.assertEquals(channel.result["code"], b"200", channel.result)
351350
352351 # The specific endpoint doesn't matter, all we need is an authenticated
353352 # endpoint.
354 request, channel = self.make_request(b"GET", "/sync", access_token=tok)
353 channel = self.make_request(b"GET", "/sync", access_token=tok)
355354 self.assertEquals(channel.result["code"], b"200", channel.result)
356355
357356 def test_manual_expire(self):
368367 "enable_renewal_emails": False,
369368 }
370369 request_data = json.dumps(params)
371 request, channel = self.make_request(
372 b"POST", url, request_data, access_token=admin_tok
373 )
370 channel = self.make_request(b"POST", url, request_data, access_token=admin_tok)
374371 self.assertEquals(channel.result["code"], b"200", channel.result)
375372
376373 # The specific endpoint doesn't matter, all we need is an authenticated
377374 # endpoint.
378 request, channel = self.make_request(b"GET", "/sync", access_token=tok)
375 channel = self.make_request(b"GET", "/sync", access_token=tok)
379376 self.assertEquals(channel.result["code"], b"403", channel.result)
380377 self.assertEquals(
381378 channel.json_body["errcode"], Codes.EXPIRED_ACCOUNT, channel.result
395392 "enable_renewal_emails": False,
396393 }
397394 request_data = json.dumps(params)
398 request, channel = self.make_request(
399 b"POST", url, request_data, access_token=admin_tok
400 )
395 channel = self.make_request(b"POST", url, request_data, access_token=admin_tok)
401396 self.assertEquals(channel.result["code"], b"200", channel.result)
402397
403398 # Try to log the user out
404 request, channel = self.make_request(b"POST", "/logout", access_token=tok)
399 channel = self.make_request(b"POST", "/logout", access_token=tok)
405400 self.assertEquals(channel.result["code"], b"200", channel.result)
406401
407402 # Log the user in again (allowed for expired accounts)
408403 tok = self.login("kermit", "monkey")
409404
410405 # Try to log out all of the user's sessions
411 request, channel = self.make_request(b"POST", "/logout/all", access_token=tok)
406 channel = self.make_request(b"POST", "/logout/all", access_token=tok)
412407 self.assertEquals(channel.result["code"], b"200", channel.result)
413408
414409
482477 # retrieve the token from the DB.
483478 renewal_token = self.get_success(self.store.get_renewal_token_for_user(user_id))
484479 url = "/_matrix/client/unstable/account_validity/renew?token=%s" % renewal_token
485 request, channel = self.make_request(b"GET", url)
480 channel = self.make_request(b"GET", url)
486481 self.assertEquals(channel.result["code"], b"200", channel.result)
487482
488483 # Check that we're getting HTML back.
502497 # our access token should be denied from now, otherwise they should
503498 # succeed.
504499 self.reactor.advance(datetime.timedelta(days=3).total_seconds())
505 request, channel = self.make_request(b"GET", "/sync", access_token=tok)
500 channel = self.make_request(b"GET", "/sync", access_token=tok)
506501 self.assertEquals(channel.result["code"], b"200", channel.result)
507502
508503 def test_renewal_invalid_token(self):
509504 # Hit the renewal endpoint with an invalid token and check that it behaves as
510505 # expected, i.e. that it responds with 404 Not Found and the correct HTML.
511506 url = "/_matrix/client/unstable/account_validity/renew?token=123"
512 request, channel = self.make_request(b"GET", url)
507 channel = self.make_request(b"GET", url)
513508 self.assertEquals(channel.result["code"], b"404", channel.result)
514509
515510 # Check that we're getting HTML back.
530525 self.email_attempts = []
531526
532527 (user_id, tok) = self.create_user()
533 request, channel = self.make_request(
528 channel = self.make_request(
534529 b"POST",
535530 "/_matrix/client/unstable/account_validity/send_mail",
536531 access_token=tok,
554549 "erase": False,
555550 }
556551 )
557 request, channel = self.make_request(
552 channel = self.make_request(
558553 "POST", "account/deactivate", request_data, access_token=tok
559554 )
560 self.assertEqual(request.code, 200)
555 self.assertEqual(channel.code, 200)
561556
562557 self.reactor.advance(datetime.timedelta(days=8).total_seconds())
563558
605600 self.email_attempts = []
606601
607602 # Test that we're still able to manually trigger a mail to be sent.
608 request, channel = self.make_request(
603 channel = self.make_request(
609604 b"POST",
610605 "/_matrix/client/unstable/account_validity/send_mail",
611606 access_token=tok,
5959
6060 event_id = channel.json_body["event_id"]
6161
62 request, channel = self.make_request(
62 channel = self.make_request(
6363 "GET",
6464 "/rooms/%s/event/%s" % (self.room, event_id),
6565 access_token=self.user_token,
106106 self.assertEquals(200, channel.code, channel.json_body)
107107 annotation_id = channel.json_body["event_id"]
108108
109 request, channel = self.make_request(
109 channel = self.make_request(
110110 "GET",
111111 "/_matrix/client/unstable/rooms/%s/relations/%s?limit=1"
112112 % (self.room, self.parent_id),
151151 if prev_token:
152152 from_token = "&from=" + prev_token
153153
154 request, channel = self.make_request(
154 channel = self.make_request(
155155 "GET",
156156 "/_matrix/client/unstable/rooms/%s/relations/%s?limit=1%s"
157157 % (self.room, self.parent_id, from_token),
209209 if prev_token:
210210 from_token = "&from=" + prev_token
211211
212 request, channel = self.make_request(
212 channel = self.make_request(
213213 "GET",
214214 "/_matrix/client/unstable/rooms/%s/aggregations/%s?limit=1%s"
215215 % (self.room, self.parent_id, from_token),
278278 if prev_token:
279279 from_token = "&from=" + prev_token
280280
281 request, channel = self.make_request(
281 channel = self.make_request(
282282 "GET",
283283 "/_matrix/client/unstable/rooms/%s"
284284 "/aggregations/%s/%s/m.reaction/%s?limit=1%s"
324324 channel = self._send_relation(RelationTypes.ANNOTATION, "m.reaction", "b")
325325 self.assertEquals(200, channel.code, channel.json_body)
326326
327 request, channel = self.make_request(
327 channel = self.make_request(
328328 "GET",
329329 "/_matrix/client/unstable/rooms/%s/aggregations/%s"
330330 % (self.room, self.parent_id),
356356 self.assertEquals(200, channel.code, channel.json_body)
357357
358358 # Now lets redact one of the 'a' reactions
359 request, channel = self.make_request(
359 channel = self.make_request(
360360 "POST",
361361 "/_matrix/client/r0/rooms/%s/redact/%s" % (self.room, to_redact_event_id),
362362 access_token=self.user_token,
364364 )
365365 self.assertEquals(200, channel.code, channel.json_body)
366366
367 request, channel = self.make_request(
367 channel = self.make_request(
368368 "GET",
369369 "/_matrix/client/unstable/rooms/%s/aggregations/%s"
370370 % (self.room, self.parent_id),
381381 """Test that aggregations must be annotations.
382382 """
383383
384 request, channel = self.make_request(
384 channel = self.make_request(
385385 "GET",
386386 "/_matrix/client/unstable/rooms/%s/aggregations/%s/%s?limit=1"
387387 % (self.room, self.parent_id, RelationTypes.REPLACE),
413413 self.assertEquals(200, channel.code, channel.json_body)
414414 reply_2 = channel.json_body["event_id"]
415415
416 request, channel = self.make_request(
416 channel = self.make_request(
417417 "GET",
418418 "/rooms/%s/event/%s" % (self.room, self.parent_id),
419419 access_token=self.user_token,
449449
450450 edit_event_id = channel.json_body["event_id"]
451451
452 request, channel = self.make_request(
452 channel = self.make_request(
453453 "GET",
454454 "/rooms/%s/event/%s" % (self.room, self.parent_id),
455455 access_token=self.user_token,
506506 )
507507 self.assertEquals(200, channel.code, channel.json_body)
508508
509 request, channel = self.make_request(
509 channel = self.make_request(
510510 "GET",
511511 "/rooms/%s/event/%s" % (self.room, self.parent_id),
512512 access_token=self.user_token,
548548 self.assertEquals(200, channel.code, channel.json_body)
549549
550550 # Check the relation is returned
551 request, channel = self.make_request(
551 channel = self.make_request(
552552 "GET",
553553 "/_matrix/client/unstable/rooms/%s/relations/%s/m.replace/m.room.message"
554554 % (self.room, original_event_id),
560560 self.assertEquals(len(channel.json_body["chunk"]), 1)
561561
562562 # Redact the original event
563 request, channel = self.make_request(
563 channel = self.make_request(
564564 "PUT",
565565 "/rooms/%s/redact/%s/%s"
566566 % (self.room, original_event_id, "test_relations_redaction_redacts_edits"),
570570 self.assertEquals(200, channel.code, channel.json_body)
571571
572572 # Try to check for remaining m.replace relations
573 request, channel = self.make_request(
573 channel = self.make_request(
574574 "GET",
575575 "/_matrix/client/unstable/rooms/%s/relations/%s/m.replace/m.room.message"
576576 % (self.room, original_event_id),
597597 self.assertEquals(200, channel.code, channel.json_body)
598598
599599 # Redact the original
600 request, channel = self.make_request(
600 channel = self.make_request(
601601 "PUT",
602602 "/rooms/%s/redact/%s/%s"
603603 % (
611611 self.assertEquals(200, channel.code, channel.json_body)
612612
613613 # Check that aggregations returns zero
614 request, channel = self.make_request(
614 channel = self.make_request(
615615 "GET",
616616 "/_matrix/client/unstable/rooms/%s/aggregations/%s/m.annotation/m.reaction"
617617 % (self.room, original_event_id),
655655
656656 original_id = parent_id if parent_id else self.parent_id
657657
658 request, channel = self.make_request(
658 channel = self.make_request(
659659 "POST",
660660 "/_matrix/client/unstable/rooms/%s/send_relation/%s/%s/%s%s"
661661 % (self.room, original_id, relation_type, event_type, query),
1616 from synapse.rest.client.v2_alpha import shared_rooms
1717
1818 from tests import unittest
19 from tests.server import FakeChannel
1920
2021
2122 class UserSharedRoomsTest(unittest.HomeserverTestCase):
3940 self.store = hs.get_datastore()
4041 self.handler = hs.get_user_directory_handler()
4142
42 def _get_shared_rooms(self, token, other_user):
43 request, channel = self.make_request(
43 def _get_shared_rooms(self, token, other_user) -> FakeChannel:
44 return self.make_request(
4445 "GET",
4546 "/_matrix/client/unstable/uk.half-shot.msc2666/user/shared_rooms/%s"
4647 % other_user,
4748 access_token=token,
4849 )
49 return request, channel
5050
5151 def test_shared_room_list_public(self):
5252 """
6262 self.helper.invite(room, src=u1, targ=u2, tok=u1_token)
6363 self.helper.join(room, user=u2, tok=u2_token)
6464
65 request, channel = self._get_shared_rooms(u1_token, u2)
65 channel = self._get_shared_rooms(u1_token, u2)
6666 self.assertEquals(200, channel.code, channel.result)
6767 self.assertEquals(len(channel.json_body["joined"]), 1)
6868 self.assertEquals(channel.json_body["joined"][0], room)
8181 self.helper.invite(room, src=u1, targ=u2, tok=u1_token)
8282 self.helper.join(room, user=u2, tok=u2_token)
8383
84 request, channel = self._get_shared_rooms(u1_token, u2)
84 channel = self._get_shared_rooms(u1_token, u2)
8585 self.assertEquals(200, channel.code, channel.result)
8686 self.assertEquals(len(channel.json_body["joined"]), 1)
8787 self.assertEquals(channel.json_body["joined"][0], room)
103103 self.helper.join(room_public, user=u2, tok=u2_token)
104104 self.helper.join(room_private, user=u1, tok=u1_token)
105105
106 request, channel = self._get_shared_rooms(u1_token, u2)
106 channel = self._get_shared_rooms(u1_token, u2)
107107 self.assertEquals(200, channel.code, channel.result)
108108 self.assertEquals(len(channel.json_body["joined"]), 2)
109109 self.assertTrue(room_public in channel.json_body["joined"])
124124 self.helper.join(room, user=u2, tok=u2_token)
125125
126126 # Assert user directory is not empty
127 request, channel = self._get_shared_rooms(u1_token, u2)
127 channel = self._get_shared_rooms(u1_token, u2)
128128 self.assertEquals(200, channel.code, channel.result)
129129 self.assertEquals(len(channel.json_body["joined"]), 1)
130130 self.assertEquals(channel.json_body["joined"][0], room)
131131
132132 self.helper.leave(room, user=u1, tok=u1_token)
133133
134 request, channel = self._get_shared_rooms(u2_token, u1)
134 channel = self._get_shared_rooms(u2_token, u1)
135135 self.assertEquals(200, channel.code, channel.result)
136136 self.assertEquals(len(channel.json_body["joined"]), 0)
3434 ]
3535
3636 def test_sync_argless(self):
37 request, channel = self.make_request("GET", "/sync")
37 channel = self.make_request("GET", "/sync")
3838
3939 self.assertEqual(channel.code, 200)
4040 self.assertTrue(
5454 """
5555 self.hs.config.use_presence = False
5656
57 request, channel = self.make_request("GET", "/sync")
57 channel = self.make_request("GET", "/sync")
5858
5959 self.assertEqual(channel.code, 200)
6060 self.assertTrue(
193193 tok=tok,
194194 )
195195
196 request, channel = self.make_request(
196 channel = self.make_request(
197197 "GET", "/sync?filter=%s" % sync_filter, access_token=tok
198198 )
199199 self.assertEqual(channel.code, 200, channel.result)
244244 self.helper.send(room, body="There!", tok=other_access_token)
245245
246246 # Start typing.
247 request, channel = self.make_request(
247 channel = self.make_request(
248248 "PUT",
249249 typing_url % (room, other_user_id, other_access_token),
250250 b'{"typing": true, "timeout": 30000}',
251251 )
252252 self.assertEquals(200, channel.code)
253253
254 request, channel = self.make_request(
255 "GET", "/sync?access_token=%s" % (access_token,)
256 )
254 channel = self.make_request("GET", "/sync?access_token=%s" % (access_token,))
257255 self.assertEquals(200, channel.code)
258256 next_batch = channel.json_body["next_batch"]
259257
260258 # Stop typing.
261 request, channel = self.make_request(
259 channel = self.make_request(
262260 "PUT",
263261 typing_url % (room, other_user_id, other_access_token),
264262 b'{"typing": false}',
266264 self.assertEquals(200, channel.code)
267265
268266 # Start typing.
269 request, channel = self.make_request(
267 channel = self.make_request(
270268 "PUT",
271269 typing_url % (room, other_user_id, other_access_token),
272270 b'{"typing": true, "timeout": 30000}',
274272 self.assertEquals(200, channel.code)
275273
276274 # Should return immediately
277 request, channel = self.make_request(
278 "GET", sync_url % (access_token, next_batch)
279 )
275 channel = self.make_request("GET", sync_url % (access_token, next_batch))
280276 self.assertEquals(200, channel.code)
281277 next_batch = channel.json_body["next_batch"]
282278
288284 # invalidate the stream token.
289285 self.helper.send(room, body="There!", tok=other_access_token)
290286
291 request, channel = self.make_request(
292 "GET", sync_url % (access_token, next_batch)
293 )
287 channel = self.make_request("GET", sync_url % (access_token, next_batch))
294288 self.assertEquals(200, channel.code)
295289 next_batch = channel.json_body["next_batch"]
296290
298292 # ahead, and therefore it's saying the typing (that we've actually
299293 # already seen) is new, since it's got a token above our new, now-reset
300294 # stream token.
301 request, channel = self.make_request(
302 "GET", sync_url % (access_token, next_batch)
303 )
295 channel = self.make_request("GET", sync_url % (access_token, next_batch))
304296 self.assertEquals(200, channel.code)
305297 next_batch = channel.json_body["next_batch"]
306298
382374
383375 # Send a read receipt to tell the server we've read the latest event.
384376 body = json.dumps({"m.read": res["event_id"]}).encode("utf8")
385 request, channel = self.make_request(
377 channel = self.make_request(
386378 "POST",
387379 "/rooms/%s/read_markers" % self.room_id,
388380 body,
449441 def _check_unread_count(self, expected_count: True):
450442 """Syncs and compares the unread count with the expected value."""
451443
452 request, channel = self.make_request(
444 channel = self.make_request(
453445 "GET", self.url % self.next_batch, access_token=self.tok,
454446 )
455447
3838 class BaseRemoteKeyResourceTestCase(unittest.HomeserverTestCase):
3939 def make_homeserver(self, reactor, clock):
4040 self.http_client = Mock()
41 return self.setup_test_homeserver(http_client=self.http_client)
41 return self.setup_test_homeserver(federation_http_client=self.http_client)
4242
4343 def create_test_resource(self):
4444 return create_resource_tree(
171171 }
172172 ]
173173 self.hs2 = self.setup_test_homeserver(
174 http_client=self.http_client2, config=config
174 federation_http_client=self.http_client2, config=config
175175 )
176176
177177 # wire up outbound POST /key/v2/query requests from hs2 so that they
213213 }
214214 config["media_storage_providers"] = [provider_config]
215215
216 hs = self.setup_test_homeserver(config=config, http_client=client)
216 hs = self.setup_test_homeserver(config=config, federation_http_client=client)
217217
218218 return hs
219219
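The hunks above and below apply the diff's second recurring rename: the keyword argument passed to `setup_test_homeserver` changes from `http_client` to `federation_http_client`, making explicit that the mock replaces the outbound federation client. A sketch of the call-site change, with a trivial stub standing in for the real helper:

```python
# Stub standing in for the Synapse test helper; it just records kwargs
# so the rename at the call site is visible.
def setup_test_homeserver(**kwargs):
    return kwargs


# Before:  hs = setup_test_homeserver(http_client=mock_client)
# After:
mock_client = object()
hs = setup_test_homeserver(federation_http_client=mock_client)
assert "federation_http_client" in hs
```

Only the keyword name changes; the mock object wired in is the same as before.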
227227
228228 def _req(self, content_disposition):
229229
230 request, channel = make_request(
230 channel = make_request(
231231 self.reactor,
232232 FakeSite(self.download_resource),
233233 "GET",
323323
324324 def _test_thumbnail(self, method, expected_body, expected_found):
325325 params = "?width=32&height=32&method=" + method
326 request, channel = make_request(
326 channel = make_request(
327327 self.reactor,
328328 FakeSite(self.thumbnail_resource),
329329 "GET",
361361 "error": "Not found [b'example.com', b'12345']",
362362 },
363363 )
364
365 def test_x_robots_tag_header(self):
366 """
367 Tests that the `X-Robots-Tag` header is present, which informs web crawlers
368 to not index, archive, or follow links in media.
369 """
370 channel = self._req(b"inline; filename=out" + self.test_image.extension)
371
372 headers = channel.headers
373 self.assertEqual(
374 headers.getRawHeaders(b"X-Robots-Tag"),
375 [b"noindex, nofollow, noarchive, noimageindex"],
376 )
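The newly added test above checks that media downloads carry an `X-Robots-Tag` header telling crawlers not to index or archive user-uploaded media. A small sketch of what that header value asserts, using the exact directive string from the diff (the parsing below is illustrative, not part of Synapse):

```python
# Header value asserted by the new test; each comma-separated directive
# restricts a different crawler behavior (indexing, link-following,
# caching/archiving, image indexing).
ROBOTS_TAG = b"noindex, nofollow, noarchive, noimageindex"
directives = {d.strip() for d in ROBOTS_TAG.split(b",")}
assert b"noindex" in directives and b"noarchive" in directives
```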
1717
1818 from mock import patch
1919
20 import attr
21
2220 from twisted.internet._resolver import HostResolution
2321 from twisted.internet.address import IPv4Address, IPv6Address
2422 from twisted.internet.error import DNSLookupError
25 from twisted.python.failure import Failure
2623 from twisted.test.proto_helpers import AccumulatingProtocol
27 from twisted.web._newclient import ResponseDone
2824
2925 from tests import unittest
3026 from tests.server import FakeTransport
31
32
33 @attr.s
34 class FakeResponse:
35 version = attr.ib()
36 code = attr.ib()
37 phrase = attr.ib()
38 headers = attr.ib()
39 body = attr.ib()
40 absoluteURI = attr.ib()
41
42 @property
43 def request(self):
44 @attr.s
45 class FakeTransport:
46 absoluteURI = self.absoluteURI
47
48 return FakeTransport()
49
50 def deliverBody(self, protocol):
51 protocol.dataReceived(self.body)
52 protocol.connectionLost(Failure(ResponseDone()))
5327
5428
5529 class URLPreviewTests(unittest.HomeserverTestCase):
138112 def test_cache_returns_correct_type(self):
139113 self.lookups["matrix.org"] = [(IPv4Address, "10.1.2.3")]
140114
141 request, channel = self.make_request(
115 channel = self.make_request(
142116 "GET",
143117 "preview_url?url=http://matrix.org",
144118 shorthand=False,
163137 )
164138
165139 # Check the cache returns the correct response
166 request, channel = self.make_request(
140 channel = self.make_request(
167141 "GET", "preview_url?url=http://matrix.org", shorthand=False
168142 )
169143
179153 self.assertNotIn("http://matrix.org", self.preview_url._cache)
180154
181155 # Check the database cache returns the correct response
182 request, channel = self.make_request(
156 channel = self.make_request(
183157 "GET", "preview_url?url=http://matrix.org", shorthand=False
184158 )
185159
200174 b"</head></html>"
201175 )
202176
203 request, channel = self.make_request(
177 channel = self.make_request(
204178 "GET",
205179 "preview_url?url=http://matrix.org",
206180 shorthand=False,
235209 b"</head></html>"
236210 )
237211
238 request, channel = self.make_request(
212 channel = self.make_request(
239213 "GET",
240214 "preview_url?url=http://matrix.org",
241215 shorthand=False,
270244 b"</head></html>"
271245 )
272246
273 request, channel = self.make_request(
247 channel = self.make_request(
274248 "GET",
275249 "preview_url?url=http://matrix.org",
276250 shorthand=False,
303277 """
304278 self.lookups["example.com"] = [(IPv4Address, "10.1.2.3")]
305279
306 request, channel = self.make_request(
280 channel = self.make_request(
307281 "GET",
308282 "preview_url?url=http://example.com",
309283 shorthand=False,
333307 """
334308 self.lookups["example.com"] = [(IPv4Address, "192.168.1.1")]
335309
336 request, channel = self.make_request(
310 channel = self.make_request(
337311 "GET", "preview_url?url=http://example.com", shorthand=False
338312 )
339313
354328 """
355329 self.lookups["example.com"] = [(IPv4Address, "1.1.1.2")]
356330
357 request, channel = self.make_request(
331 channel = self.make_request(
358332 "GET", "preview_url?url=http://example.com", shorthand=False
359333 )
360334
371345 """
372346 Blacklisted IP addresses, accessed directly, are not spidered.
373347 """
374 request, channel = self.make_request(
348 channel = self.make_request(
375349 "GET", "preview_url?url=http://192.168.1.1", shorthand=False
376350 )
377351
390364 """
391365 Blacklisted IP ranges, accessed directly, are not spidered.
392366 """
393 request, channel = self.make_request(
367 channel = self.make_request(
394368 "GET", "preview_url?url=http://1.1.1.2", shorthand=False
395369 )
396370
410384 """
411385 self.lookups["example.com"] = [(IPv4Address, "1.1.1.1")]
412386
413 request, channel = self.make_request(
387 channel = self.make_request(
414388 "GET",
415389 "preview_url?url=http://example.com",
416390 shorthand=False,
447421 (IPv4Address, "10.1.2.3"),
448422 ]
449423
450 request, channel = self.make_request(
424 channel = self.make_request(
451425 "GET", "preview_url?url=http://example.com", shorthand=False
452426 )
453427 self.assertEqual(channel.code, 502)
467441 (IPv6Address, "3fff:ffff:ffff:ffff:ffff:ffff:ffff:ffff")
468442 ]
469443
470 request, channel = self.make_request(
444 channel = self.make_request(
471445 "GET", "preview_url?url=http://example.com", shorthand=False
472446 )
473447
488462 """
489463 self.lookups["example.com"] = [(IPv6Address, "2001:800::1")]
490464
491 request, channel = self.make_request(
465 channel = self.make_request(
492466 "GET", "preview_url?url=http://example.com", shorthand=False
493467 )
494468
505479 """
506480 OPTIONS returns the OPTIONS.
507481 """
508 request, channel = self.make_request(
482 channel = self.make_request(
509483 "OPTIONS", "preview_url?url=http://example.com", shorthand=False
510484 )
511485 self.assertEqual(channel.code, 200)
518492 self.lookups["example.com"] = [(IPv4Address, "10.1.2.3")]
519493
520494 # Build and make a request to the server
521 request, channel = self.make_request(
495 channel = self.make_request(
522496 "GET",
523497 "preview_url?url=http://example.com",
524498 shorthand=False,
592566 b"</head></html>"
593567 )
594568
595 request, channel = self.make_request(
569 channel = self.make_request(
596570 "GET",
597571 "preview_url?url=http://twitter.com/matrixdotorg/status/12345",
598572 shorthand=False,
657631 }
658632 end_content = json.dumps(result).encode("utf-8")
659633
660 request, channel = self.make_request(
634 channel = self.make_request(
661635 "GET",
662636 "preview_url?url=http://twitter.com/matrixdotorg/status/12345",
663637 shorthand=False,
2424 return HealthResource()
2525
2626 def test_health(self):
27 request, channel = self.make_request("GET", "/health", shorthand=False)
27 channel = self.make_request("GET", "/health", shorthand=False)
2828
29 self.assertEqual(request.code, 200)
29 self.assertEqual(channel.code, 200)
3030 self.assertEqual(channel.result["body"], b"OK")
2727 self.hs.config.public_baseurl = "https://tesths"
2828 self.hs.config.default_identity_server = "https://testis"
2929
30 request, channel = self.make_request(
30 channel = self.make_request(
3131 "GET", "/.well-known/matrix/client", shorthand=False
3232 )
3333
34 self.assertEqual(request.code, 200)
34 self.assertEqual(channel.code, 200)
3535 self.assertEqual(
3636 channel.json_body,
3737 {
4343 def test_well_known_no_public_baseurl(self):
4444 self.hs.config.public_baseurl = None
4545
46 request, channel = self.make_request(
46 channel = self.make_request(
4747 "GET", "/.well-known/matrix/client", shorthand=False
4848 )
4949
50 self.assertEqual(request.code, 404)
50 self.assertEqual(channel.code, 404)
173173 custom_headers: Optional[
174174 Iterable[Tuple[Union[bytes, str], Union[bytes, str]]]
175175 ] = None,
176 ):
176 ) -> FakeChannel:
177177 """
178178 Make a web request using the given method, path and content, and render it
179179
180 Returns the Request and the Channel underneath.
180 Returns the fake Channel object which records the response to the request.
181181
182182 Args:
183183 site: The twisted Site to use to render the request
201201 is finished.
202202
203203 Returns:
204 Tuple[synapse.http.site.SynapseRequest, channel]
204 channel
205205 """
206206 if not isinstance(method, bytes):
207207 method = method.encode("ascii")
215215 and not path.startswith(b"/_matrix")
216216 and not path.startswith(b"/_synapse")
217217 ):
218 if path.startswith(b"/"):
219 path = path[1:]
218220 path = b"/_matrix/client/r0/" + path
219 path = path.replace(b"//", b"/")
220221
221222 if not path.startswith(b"/"):
222223 path = b"/" + path
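The hunk above replaces the old global `path.replace(b"//", b"/")` cleanup with stripping a single leading slash before prefixing. A rough standalone sketch of the new behaviour (the function name and the simplified condition are ours, not Synapse's): the old `replace` would also have collapsed a double slash appearing anywhere later in the path, which the new code leaves intact.

```python
def normalize_client_path(path: bytes) -> bytes:
    # Mirror of the prefixing logic in the hunk above: paths outside
    # /_matrix and /_synapse get the client API prefix, and a single
    # leading slash is stripped first so we never emit "prefix//rest".
    if not path.startswith(b"/_matrix") and not path.startswith(b"/_synapse"):
        if path.startswith(b"/"):
            path = path[1:]
        path = b"/_matrix/client/r0/" + path
    if not path.startswith(b"/"):
        path = b"/" + path
    return path
```

Both `normalize_client_path(b"/sync")` and `normalize_client_path(b"sync")` yield `b"/_matrix/client/r0/sync"`, while a `//` embedded further inside the path survives untouched.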
257258 for k, v in custom_headers:
258259 req.requestHeaders.addRawHeader(k, v)
259260
261 req.parseCookies()
260262 req.requestReceived(method, path, b"1.1")
261263
262264 if await_result:
263265 channel.await_result()
264266
265 return req, channel
267 return channel
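Most of this diff is mechanical fallout from the signature change above: `make_request` now returns only the channel, so call sites stop unpacking a `(request, channel)` tuple and read `channel.code` instead of `request.code`. A stand-in sketch of the before/after pattern (the `FakeChannel` stub here is our own, not Synapse's real class):

```python
class FakeChannel:
    """Minimal stand-in that records a response, like the test channel."""

    def __init__(self, code: int, body: bytes):
        self.code = code
        self.result = {"body": body}


def make_request(method: str, path: str) -> FakeChannel:
    # The old signature returned (request, channel); the new one returns
    # the channel alone, since everything the tests assert on lives there.
    return FakeChannel(200, b"OK")


channel = make_request("GET", "/health")   # was: request, channel = ...
assert channel.code == 200                 # was: request.code
assert channel.result["body"] == b"OK"
```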
266268
267269
268270 @implementer(IReactorPluggableNameResolver)
6969 the notice URL + an authentication code.
7070 """
7171 # Initial sync, to get the user consent room invite
72 request, channel = self.make_request(
72 channel = self.make_request(
7373 "GET", "/_matrix/client/r0/sync", access_token=self.access_token
7474 )
7575 self.assertEqual(channel.code, 200)
7878 room_id = list(channel.json_body["rooms"]["invite"].keys())[0]
7979
8080 # Join the room
81 request, channel = self.make_request(
81 channel = self.make_request(
8282 "POST",
8383 "/_matrix/client/r0/rooms/" + room_id + "/join",
8484 access_token=self.access_token,
8686 self.assertEqual(channel.code, 200)
8787
8888 # Sync again, to get the message in the room
89 request, channel = self.make_request(
89 channel = self.make_request(
9090 "GET", "/_matrix/client/r0/sync", access_token=self.access_token
9191 )
9292 self.assertEqual(channel.code, 200)
304304 self.register_user("user", "password")
305305 tok = self.login("user", "password")
306306
307 request, channel = self.make_request("GET", "/sync?timeout=0", access_token=tok)
307 channel = self.make_request("GET", "/sync?timeout=0", access_token=tok)
308308
309309 invites = channel.json_body["rooms"]["invite"]
310310 self.assertEqual(len(invites), 0, invites)
317317
318318 # Sync again to retrieve the events in the room, so we can check whether this
319319 # room has a notice in it.
320 request, channel = self.make_request("GET", "/sync?timeout=0", access_token=tok)
320 channel = self.make_request("GET", "/sync?timeout=0", access_token=tok)
321321
322322 # Scan the events in the room to search for a message from the server notices
323323 # user.
352352 tok = self.login(localpart, "password")
353353
354354 # Sync with the user's token to mark the user as active.
355 request, channel = self.make_request(
356 "GET", "/sync?timeout=0", access_token=tok,
357 )
355 channel = self.make_request("GET", "/sync?timeout=0", access_token=tok,)
355 channel = self.make_request("GET", "/sync?timeout=0", access_token=tok)
358356
359357 # Also retrieves the list of invites for this user. We don't care about that
360358 # one except if we're processing the last user, which should have received an
2323 from synapse.api.room_versions import RoomVersions
2424 from synapse.event_auth import auth_types_for_event
2525 from synapse.events import make_event_from_dict
26 from synapse.state.v2 import lexicographical_topological_sort, resolve_events_with_store
26 from synapse.state.v2 import (
27 _get_auth_chain_difference,
28 lexicographical_topological_sort,
29 resolve_events_with_store,
30 )
2731 from synapse.types import EventID
2832
2933 from tests import unittest
8387 event_dict = {
8488 "auth_events": [(a, {}) for a in auth_events],
8589 "prev_events": [(p, {}) for p in prev_events],
86 "event_id": self.node_id,
90 "event_id": self.event_id,
8791 "sender": self.sender,
8892 "type": self.type,
8993 "content": self.content,
376380
377381 self.do_check(events, edges, expected_state_ids)
378382
383 def test_mainline_sort(self):
384 """Tests that the mainline ordering works correctly.
385 """
386
387 events = [
388 FakeEvent(
389 id="T1", sender=ALICE, type=EventTypes.Topic, state_key="", content={}
390 ),
391 FakeEvent(
392 id="PA1",
393 sender=ALICE,
394 type=EventTypes.PowerLevels,
395 state_key="",
396 content={"users": {ALICE: 100, BOB: 50}},
397 ),
398 FakeEvent(
399 id="T2", sender=ALICE, type=EventTypes.Topic, state_key="", content={}
400 ),
401 FakeEvent(
402 id="PA2",
403 sender=ALICE,
404 type=EventTypes.PowerLevels,
405 state_key="",
406 content={
407 "users": {ALICE: 100, BOB: 50},
408 "events": {EventTypes.PowerLevels: 100},
409 },
410 ),
411 FakeEvent(
412 id="PB",
413 sender=BOB,
414 type=EventTypes.PowerLevels,
415 state_key="",
416 content={"users": {ALICE: 100, BOB: 50}},
417 ),
418 FakeEvent(
419 id="T3", sender=BOB, type=EventTypes.Topic, state_key="", content={}
420 ),
421 FakeEvent(
422 id="T4", sender=ALICE, type=EventTypes.Topic, state_key="", content={}
423 ),
424 ]
425
426 edges = [
427 ["END", "T3", "PA2", "T2", "PA1", "T1", "START"],
428 ["END", "T4", "PB", "PA1"],
429 ]
430
431 # We expect T3 to be picked as the other topics are pointing at older
432 # power levels. Note that without mainline ordering we'd pick T4 due to
433 # it being sent *after* T3.
434 expected_state_ids = ["T3", "PA2"]
435
436 self.do_check(events, edges, expected_state_ids)
437
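The comment in `test_mainline_sort` is worth unpacking: mainline ordering sorts events by the position of their closest power-level ancestor on the resolved power-event chain, before falling back to timestamp and event ID. A toy sketch of why T3 beats T4 (our own simplification, not Synapse's implementation):

```python
def mainline_sort(events, mainline_pos, origin_ts):
    # Sort ascending by (mainline position of the closest power-level
    # ancestor, timestamp, event_id); state resolution applies events in
    # this order, so the *last* event wins for a given (type, state_key).
    return sorted(events, key=lambda e: (mainline_pos[e], origin_ts[e], e))


# T3's power ancestor is PA2, which is further along the mainline than
# PA1 (which T4 reaches via PB), so T3 sorts last and its topic wins
# even though T4 was sent later.
order = mainline_sort(["T3", "T4"], {"T3": 2, "T4": 1}, {"T3": 3, "T4": 4})
assert order[-1] == "T3"
```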
379438 def do_check(self, events, edges, expected_state_ids):
380439 """Take a list of events and edges and calculate the state of the
381440 graph at END, and asserts it matches `expected_state_ids`
586645 self.assert_dict(self.expected_combined_state, state)
587646
588647
648 class AuthChainDifferenceTestCase(unittest.TestCase):
649 """We test that `_get_auth_chain_difference` correctly handles unpersisted
650 events.
651 """
652
653 def test_simple(self):
654 # Test getting the auth difference for a simple chain with a single
655 # unpersisted event:
656 #
657 # Unpersisted | Persisted
658 # |
659 # C -|-> B -> A
660
661 a = FakeEvent(
662 id="A", sender=ALICE, type=EventTypes.Member, state_key="", content={},
663 ).to_event([], [])
664
665 b = FakeEvent(
666 id="B", sender=ALICE, type=EventTypes.Member, state_key="", content={},
667 ).to_event([a.event_id], [])
668
669 c = FakeEvent(
670 id="C", sender=ALICE, type=EventTypes.Member, state_key="", content={},
671 ).to_event([b.event_id], [])
672
673 persisted_events = {a.event_id: a, b.event_id: b}
674 unpersisted_events = {c.event_id: c}
675
676 state_sets = [{"a": a.event_id, "b": b.event_id}, {"c": c.event_id}]
677
678 store = TestStateResolutionStore(persisted_events)
679
680 diff_d = _get_auth_chain_difference(
681 ROOM_ID, state_sets, unpersisted_events, store
682 )
683 difference = self.successResultOf(defer.ensureDeferred(diff_d))
684
685 self.assertEqual(difference, {c.event_id})
686
687 def test_multiple_unpersisted_chain(self):
688 # Test getting the auth difference for a simple chain with multiple
689 # unpersisted events:
690 #
691 # Unpersisted | Persisted
692 # |
693 # D -> C -|-> B -> A
694
695 a = FakeEvent(
696 id="A", sender=ALICE, type=EventTypes.Member, state_key="", content={},
697 ).to_event([], [])
698
699 b = FakeEvent(
700 id="B", sender=ALICE, type=EventTypes.Member, state_key="", content={},
701 ).to_event([a.event_id], [])
702
703 c = FakeEvent(
704 id="C", sender=ALICE, type=EventTypes.Member, state_key="", content={},
705 ).to_event([b.event_id], [])
706
707 d = FakeEvent(
708 id="D", sender=ALICE, type=EventTypes.Member, state_key="", content={},
709 ).to_event([c.event_id], [])
710
711 persisted_events = {a.event_id: a, b.event_id: b}
712 unpersisted_events = {c.event_id: c, d.event_id: d}
713
714 state_sets = [
715 {"a": a.event_id, "b": b.event_id},
716 {"c": c.event_id, "d": d.event_id},
717 ]
718
719 store = TestStateResolutionStore(persisted_events)
720
721 diff_d = _get_auth_chain_difference(
722 ROOM_ID, state_sets, unpersisted_events, store
723 )
724 difference = self.successResultOf(defer.ensureDeferred(diff_d))
725
726 self.assertEqual(difference, {d.event_id, c.event_id})
727
728 def test_unpersisted_events_different_sets(self):
729 # Test getting the auth difference with multiple unpersisted events
730 # in different branches:
731 #
732 # Unpersisted | Persisted
733 # |
734 # D --> C -|-> B -> A
735 # E ----^ -|---^
736 # |
737
738 a = FakeEvent(
739 id="A", sender=ALICE, type=EventTypes.Member, state_key="", content={},
740 ).to_event([], [])
741
742 b = FakeEvent(
743 id="B", sender=ALICE, type=EventTypes.Member, state_key="", content={},
744 ).to_event([a.event_id], [])
745
746 c = FakeEvent(
747 id="C", sender=ALICE, type=EventTypes.Member, state_key="", content={},
748 ).to_event([b.event_id], [])
749
750 d = FakeEvent(
751 id="D", sender=ALICE, type=EventTypes.Member, state_key="", content={},
752 ).to_event([c.event_id], [])
753
754 e = FakeEvent(
755 id="E", sender=ALICE, type=EventTypes.Member, state_key="", content={},
756 ).to_event([c.event_id, b.event_id], [])
757
758 persisted_events = {a.event_id: a, b.event_id: b}
759 unpersisted_events = {c.event_id: c, d.event_id: d, e.event_id: e}
760
761 state_sets = [
762 {"a": a.event_id, "b": b.event_id, "e": e.event_id},
763 {"c": c.event_id, "d": d.event_id},
764 ]
765
766 store = TestStateResolutionStore(persisted_events)
767
768 diff_d = _get_auth_chain_difference(
769 ROOM_ID, state_sets, unpersisted_events, store
770 )
771 difference = self.successResultOf(defer.ensureDeferred(diff_d))
772
773 self.assertEqual(difference, {d.event_id, e.event_id})
774
775
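The new tests above, and `TestStateResolutionStore.get_auth_chain_difference` further down, both rely on the same set identity: the auth chain difference is the union of the per-state-set auth chains minus their intersection. A standalone illustration (our own helper, not the Synapse API):

```python
def auth_chain_difference(chains):
    # chains: one set of event IDs (the full auth chain) per state set.
    sets = [frozenset(c) for c in chains]
    common = frozenset.intersection(*sets)
    return frozenset.union(*sets) - common


# Mirrors the simple C -> B -> A case: chains {A, B} and {A, B, C}
# agree on everything except C, so the difference is just {C}.
assert auth_chain_difference([{"A", "B"}, {"A", "B", "C"}]) == {"C"}
```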
589776 def pairwise(iterable):
590777 "s -> (s0,s1), (s1,s2), (s2, s3), ..."
591778 a, b = itertools.tee(iterable)
646833
647834 return list(result)
648835
649 def get_auth_chain_difference(self, auth_sets):
836 def get_auth_chain_difference(self, room_id, auth_sets):
650837 chains = [frozenset(self._get_auth_chain(a)) for a in auth_sets]
651838
652839 common = set(chains[0]).intersection(*chains[1:])
7979 )
8080
8181 @defer.inlineCallbacks
82 def test_count_devices_by_users(self):
83 yield defer.ensureDeferred(
84 self.store.store_device("user_id", "device1", "display_name 1")
85 )
86 yield defer.ensureDeferred(
87 self.store.store_device("user_id", "device2", "display_name 2")
88 )
89 yield defer.ensureDeferred(
90 self.store.store_device("user_id2", "device3", "display_name 3")
91 )
92
93 res = yield defer.ensureDeferred(self.store.count_devices_by_users())
94 self.assertEqual(0, res)
95
96 res = yield defer.ensureDeferred(self.store.count_devices_by_users(["unknown"]))
97 self.assertEqual(0, res)
98
99 res = yield defer.ensureDeferred(self.store.count_devices_by_users(["user_id"]))
100 self.assertEqual(2, res)
101
102 res = yield defer.ensureDeferred(
103 self.store.count_devices_by_users(["user_id", "user_id2"])
104 )
105 self.assertEqual(3, res)
106
107 @defer.inlineCallbacks
82108 def test_get_device_updates_by_remote(self):
83109 device_ids = ["device_id1", "device_id2"]
84110
2525
2626 class E2eRoomKeysHandlerTestCase(unittest.HomeserverTestCase):
2727 def make_homeserver(self, reactor, clock):
28 hs = self.setup_test_homeserver("server", http_client=None)
28 hs = self.setup_test_homeserver("server", federation_http_client=None)
2929 self.store = hs.get_datastore()
3030 return hs
3131
201201 # Now actually test that various combinations give the right result:
202202
203203 difference = self.get_success(
204 self.store.get_auth_chain_difference([{"a"}, {"b"}])
204 self.store.get_auth_chain_difference(room_id, [{"a"}, {"b"}])
205205 )
206206 self.assertSetEqual(difference, {"a", "b"})
207207
208208 difference = self.get_success(
209 self.store.get_auth_chain_difference([{"a"}, {"b"}, {"c"}])
209 self.store.get_auth_chain_difference(room_id, [{"a"}, {"b"}, {"c"}])
210210 )
211211 self.assertSetEqual(difference, {"a", "b", "c", "e", "f"})
212212
213213 difference = self.get_success(
214 self.store.get_auth_chain_difference([{"a", "c"}, {"b"}])
214 self.store.get_auth_chain_difference(room_id, [{"a", "c"}, {"b"}])
215215 )
216216 self.assertSetEqual(difference, {"a", "b", "c"})
217217
218218 difference = self.get_success(
219 self.store.get_auth_chain_difference([{"a"}, {"b"}, {"d"}])
219 self.store.get_auth_chain_difference(room_id, [{"a", "c"}, {"b", "c"}])
220 )
221 self.assertSetEqual(difference, {"a", "b"})
222
223 difference = self.get_success(
224 self.store.get_auth_chain_difference(room_id, [{"a"}, {"b"}, {"d"}])
220225 )
221226 self.assertSetEqual(difference, {"a", "b", "d", "e"})
222227
223228 difference = self.get_success(
224 self.store.get_auth_chain_difference([{"a"}, {"b"}, {"c"}, {"d"}])
229 self.store.get_auth_chain_difference(room_id, [{"a"}, {"b"}, {"c"}, {"d"}])
225230 )
226231 self.assertSetEqual(difference, {"a", "b", "c", "d", "e", "f"})
227232
228233 difference = self.get_success(
229 self.store.get_auth_chain_difference([{"a"}, {"b"}, {"e"}])
234 self.store.get_auth_chain_difference(room_id, [{"a"}, {"b"}, {"e"}])
230235 )
231236 self.assertSetEqual(difference, {"a", "b"})
232237
233 difference = self.get_success(self.store.get_auth_chain_difference([{"a"}]))
238 difference = self.get_success(
239 self.store.get_auth_chain_difference(room_id, [{"a"}])
240 )
234241 self.assertSetEqual(difference, set())
0 # -*- coding: utf-8 -*-
1 # Copyright 2020 The Matrix.org Foundation C.I.C.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15
16 from synapse.api.constants import EventTypes, Membership
17 from synapse.api.room_versions import RoomVersions
18 from synapse.federation.federation_base import event_from_pdu_json
19 from synapse.rest import admin
20 from synapse.rest.client.v1 import login, room
21
22 from tests.unittest import HomeserverTestCase
23
24
25 class ExtremPruneTestCase(HomeserverTestCase):
26 servlets = [
27 admin.register_servlets,
28 room.register_servlets,
29 login.register_servlets,
30 ]
31
32 def prepare(self, reactor, clock, homeserver):
33 self.state = self.hs.get_state_handler()
34 self.persistence = self.hs.get_storage().persistence
35 self.store = self.hs.get_datastore()
36
37 self.register_user("user", "pass")
38 self.token = self.login("user", "pass")
39
40 self.room_id = self.helper.create_room_as(
41 "user", room_version=RoomVersions.V6.identifier, tok=self.token
42 )
43
44 body = self.helper.send(self.room_id, body="Test", tok=self.token)
45 local_message_event_id = body["event_id"]
46
47 # Fudge a remote event and persist it. This will be the extremity before
48 # the gap.
49 self.remote_event_1 = event_from_pdu_json(
50 {
51 "type": EventTypes.Message,
52 "state_key": "@user:other",
53 "content": {},
54 "room_id": self.room_id,
55 "sender": "@user:other",
56 "depth": 5,
57 "prev_events": [local_message_event_id],
58 "auth_events": [],
59 "origin_server_ts": self.clock.time_msec(),
60 },
61 RoomVersions.V6,
62 )
63
64 self.persist_event(self.remote_event_1)
65
66 # Check that the current extremities is the remote event.
67 self.assert_extremities([self.remote_event_1.event_id])
68
69 def persist_event(self, event, state=None):
70 """Persist the event, with optional state
71 """
72 context = self.get_success(
73 self.state.compute_event_context(event, old_state=state)
74 )
75 self.get_success(self.persistence.persist_event(event, context))
76
77 def assert_extremities(self, expected_extremities):
78 """Assert the current extremities for the room
79 """
80 extremities = self.get_success(
81 self.store.get_prev_events_for_room(self.room_id)
82 )
83 self.assertCountEqual(extremities, expected_extremities)
84
85 def test_prune_gap(self):
86 """Test that we drop extremities after a gap when we see an event from
87 the same domain.
88 """
89
90 # Fudge a second event which points to an event we don't have. This is a
91 # state event so that the state changes (otherwise we won't prune the
92 # extremity as they'll have the same state group).
93 remote_event_2 = event_from_pdu_json(
94 {
95 "type": EventTypes.Member,
96 "state_key": "@user:other",
97 "content": {"membership": Membership.JOIN},
98 "room_id": self.room_id,
99 "sender": "@user:other",
100 "depth": 50,
101 "prev_events": ["$some_unknown_message"],
102 "auth_events": [],
103 "origin_server_ts": self.clock.time_msec(),
104 },
105 RoomVersions.V6,
106 )
107
108 state_before_gap = self.get_success(self.state.get_current_state(self.room_id))
109
110 self.persist_event(remote_event_2, state=state_before_gap.values())
111
112 # Check the new extremity is just the new remote event.
113 self.assert_extremities([remote_event_2.event_id])
114
115 def test_do_not_prune_gap_if_state_different(self):
116 """Test that we don't prune extremities after a gap if the resolved
117 state is different.
118 """
119
120 # Fudge a second event which points to an event we don't have.
121 remote_event_2 = event_from_pdu_json(
122 {
123 "type": EventTypes.Message,
124 "state_key": "@user:other",
125 "content": {},
126 "room_id": self.room_id,
127 "sender": "@user:other",
128 "depth": 10,
129 "prev_events": ["$some_unknown_message"],
130 "auth_events": [],
131 "origin_server_ts": self.clock.time_msec(),
132 },
133 RoomVersions.V6,
134 )
135
136 # Now we persist it with state with a dropped history visibility
137 # setting. The state resolution across the old and new event will then
138 # include it, and so the resolved state won't match the new state.
139 state_before_gap = dict(
140 self.get_success(self.state.get_current_state(self.room_id))
141 )
142 state_before_gap.pop(("m.room.history_visibility", ""))
143
144 context = self.get_success(
145 self.state.compute_event_context(
146 remote_event_2, old_state=state_before_gap.values()
147 )
148 )
149
150 self.get_success(self.persistence.persist_event(remote_event_2, context))
151
152 # Check that we haven't dropped the old extremity.
153 self.assert_extremities([self.remote_event_1.event_id, remote_event_2.event_id])
154
155 def test_prune_gap_if_old(self):
156 """Test that we drop extremities after a gap when the previous extremity
157 is "old"
158 """
159
160 # Advance the clock for many days to make the old extremity "old". We
161 # also set the depth to "lots".
162 self.reactor.advance(7 * 24 * 60 * 60)
163
164 # Fudge a second event which points to an event we don't have. This is a
165 # state event so that the state changes (otherwise we won't prune the
166 # extremity as they'll have the same state group).
167 remote_event_2 = event_from_pdu_json(
168 {
169 "type": EventTypes.Member,
170 "state_key": "@user:other2",
171 "content": {"membership": Membership.JOIN},
172 "room_id": self.room_id,
173 "sender": "@user:other2",
174 "depth": 10000,
175 "prev_events": ["$some_unknown_message"],
176 "auth_events": [],
177 "origin_server_ts": self.clock.time_msec(),
178 },
179 RoomVersions.V6,
180 )
181
182 state_before_gap = self.get_success(self.state.get_current_state(self.room_id))
183
184 self.persist_event(remote_event_2, state=state_before_gap.values())
185
186 # Check the new extremity is just the new remote event.
187 self.assert_extremities([remote_event_2.event_id])
188
189 def test_do_not_prune_gap_if_other_server(self):
190 """Test that we do not drop extremities after a gap when we see an event
191 from a different domain.
192 """
193
194 # Fudge a second event which points to an event we don't have. This is a
195 # state event so that the state changes (otherwise we won't prune the
196 # extremity as they'll have the same state group).
197 remote_event_2 = event_from_pdu_json(
198 {
199 "type": EventTypes.Member,
200 "state_key": "@user:other2",
201 "content": {"membership": Membership.JOIN},
202 "room_id": self.room_id,
203 "sender": "@user:other2",
204 "depth": 10,
205 "prev_events": ["$some_unknown_message"],
206 "auth_events": [],
207 "origin_server_ts": self.clock.time_msec(),
208 },
209 RoomVersions.V6,
210 )
211
212 state_before_gap = self.get_success(self.state.get_current_state(self.room_id))
213
214 self.persist_event(remote_event_2, state=state_before_gap.values())
215
216 # Check that we haven't dropped the old extremity; both events remain.
217 self.assert_extremities([self.remote_event_1.event_id, remote_event_2.event_id])
218
219 def test_prune_gap_if_dummy_remote(self):
220 """Test that we drop extremities after a gap when the previous extremity
221 is a local dummy event and only points to remote events.
222 """
223
224 body = self.helper.send_event(
225 self.room_id, type=EventTypes.Dummy, content={}, tok=self.token
226 )
227 local_message_event_id = body["event_id"]
228 self.assert_extremities([local_message_event_id])
229
230 # Advance the clock for many days to make the old extremity "old". We
231 # also set the depth to "lots".
232 self.reactor.advance(7 * 24 * 60 * 60)
233
234 # Fudge a second event which points to an event we don't have. This is a
235 # state event so that the state changes (otherwise we won't prune the
236 # extremity as they'll have the same state group).
237 remote_event_2 = event_from_pdu_json(
238 {
239 "type": EventTypes.Member,
240 "state_key": "@user:other2",
241 "content": {"membership": Membership.JOIN},
242 "room_id": self.room_id,
243 "sender": "@user:other2",
244 "depth": 10000,
245 "prev_events": ["$some_unknown_message"],
246 "auth_events": [],
247 "origin_server_ts": self.clock.time_msec(),
248 },
249 RoomVersions.V6,
250 )
251
252 state_before_gap = self.get_success(self.state.get_current_state(self.room_id))
253
254 self.persist_event(remote_event_2, state=state_before_gap.values())
255
256 # Check the new extremity is just the new remote event.
257 self.assert_extremities([remote_event_2.event_id])
258
259 def test_prune_gap_if_dummy_local(self):
260 """Test that we don't drop extremities after a gap when the previous
261 extremity is a local dummy event and points to local events.
262 """
263
264 body = self.helper.send(self.room_id, body="Test", tok=self.token)
265
266 body = self.helper.send_event(
267 self.room_id, type=EventTypes.Dummy, content={}, tok=self.token
268 )
269 local_message_event_id = body["event_id"]
270 self.assert_extremities([local_message_event_id])
271
272 # Advance the clock for many days to make the old extremity "old". We
273 # also set the depth to "lots".
274 self.reactor.advance(7 * 24 * 60 * 60)
275
276 # Fudge a second event which points to an event we don't have. This is a
277 # state event so that the state changes (otherwise we won't prune the
278 # extremity as they'll have the same state group).
279 remote_event_2 = event_from_pdu_json(
280 {
281 "type": EventTypes.Member,
282 "state_key": "@user:other2",
283 "content": {"membership": Membership.JOIN},
284 "room_id": self.room_id,
285 "sender": "@user:other2",
286 "depth": 10000,
287 "prev_events": ["$some_unknown_message"],
288 "auth_events": [],
289 "origin_server_ts": self.clock.time_msec(),
290 },
291 RoomVersions.V6,
292 )
293
294 state_before_gap = self.get_success(self.state.get_current_state(self.room_id))
295
296 self.persist_event(remote_event_2, state=state_before_gap.values())
297
298 # Check that the local message event is kept alongside the new remote event.
299 self.assert_extremities([remote_event_2.event_id, local_message_event_id])
300
301 def test_do_not_prune_gap_if_not_dummy(self):
302 """Test that we do not drop extremities after a gap when the previous extremity
303 is not a dummy event.
304 """
305
306 body = self.helper.send(self.room_id, body="test", tok=self.token)
307 local_message_event_id = body["event_id"]
308 self.assert_extremities([local_message_event_id])
309
310 # Fudge a second event which points to an event we don't have. This is a
311 # state event so that the state changes (otherwise we won't prune the
312 # extremity as they'll have the same state group).
313 remote_event_2 = event_from_pdu_json(
314 {
315 "type": EventTypes.Member,
316 "state_key": "@user:other2",
317 "content": {"membership": Membership.JOIN},
318 "room_id": self.room_id,
319 "sender": "@user:other2",
320 "depth": 10000,
321 "prev_events": ["$some_unknown_message"],
322 "auth_events": [],
323 "origin_server_ts": self.clock.time_msec(),
324 },
325 RoomVersions.V6,
326 )
327
328 state_before_gap = self.get_success(self.state.get_current_state(self.room_id))
329
330 self.persist_event(remote_event_2, state=state_before_gap.values())
331
332 # Check that the local extremity was not pruned; both events remain.
333 self.assert_extremities([local_message_event_id, remote_event_2.event_id])
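Taken together, these tests pin down a pruning heuristic for forward extremities after a gap. A condensed sketch of the decision they exercise (our own paraphrase with invented field names, not Synapse's code; the real logic also requires the resolved state across the gap to match, as `test_do_not_prune_gap_if_state_different` shows, which is omitted here):

```python
from collections import namedtuple

# Invented record type for illustration only.
Extremity = namedtuple(
    "Extremity", "domain received_ts is_remote is_dummy remote_only_prevs"
)

SEVEN_DAYS_MS = 7 * 24 * 3600 * 1000


def should_prune(ext, new_event_domain, now_ms, max_age_ms=SEVEN_DAYS_MS):
    # Prune when the new event comes from the same domain as the old
    # extremity, or when the extremity is stale and is either a remote
    # event or a local dummy event pointing only at remote events.
    if ext.domain == new_event_domain:
        return True
    stale = now_ms - ext.received_ts > max_age_ms
    if stale and (ext.is_remote or (ext.is_dummy and ext.remote_only_prevs)):
        return True
    return False
```

Under this reading, a fresh extremity from a different server is kept, and a stale local dummy that points at local events is kept too, matching the assertions above.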
4747
4848 self.assertEquals(1, total)
4949 self.assertEquals(self.displayname, users.pop()["displayname"])
50
51 users, total = yield defer.ensureDeferred(
52 self.store.get_users_paginate(0, 10, name="BC", guests=False)
53 )
54
55 self.assertEquals(1, total)
56 self.assertEquals(self.displayname, users.pop()["displayname"])
2626 servlets = [room.register_servlets]
2727
2828 def make_homeserver(self, reactor, clock):
29 hs = self.setup_test_homeserver("server", http_client=None)
29 hs = self.setup_test_homeserver("server", federation_http_client=None)
3030 return hs
3131
3232 def prepare(self, reactor, clock, hs):
1313 # See the License for the specific language governing permissions and
1414 # limitations under the License.
1515
16
17 from mock import Mock
18
1916 from canonicaljson import json
2017
2118 from twisted.internet import defer
2926
3027
3128 class RedactionTestCase(unittest.HomeserverTestCase):
32 def make_homeserver(self, reactor, clock):
33 config = self.default_config()
29 def default_config(self):
30 config = super().default_config()
3431 config["redaction_retention_period"] = "30d"
35 return self.setup_test_homeserver(
36 resource_for_federation=Mock(), http_client=None, config=config
37 )
32 return config
3833
3934 def prepare(self, reactor, clock, hs):
4035 self.store = hs.get_datastore()
1313 # See the License for the specific language governing permissions and
1414 # limitations under the License.
1515
16 from unittest.mock import Mock
17
1816 from synapse.api.constants import Membership
1917 from synapse.rest.admin import register_servlets_for_client_rest_resource
2018 from synapse.rest.client.v1 import login, room
3230 register_servlets_for_client_rest_resource,
3331 room.register_servlets,
3432 ]
35
36 def make_homeserver(self, reactor, clock):
37 hs = self.setup_test_homeserver(
38 resource_for_federation=Mock(), http_client=None
39 )
40 return hs
4133
4234 def prepare(self, reactor, clock, hs: TestHomeServer):
4335
2020 ALICE = "@alice:a"
2121 BOB = "@bob:b"
2222 BOBBY = "@bobby:a"
23 # The localpart isn't 'Bela' on purpose so we can test looking up display names.
24 BELA = "@somenickname:a"
2325
2426
2527 class UserDirectoryStoreTestCase(unittest.TestCase):
3840 )
3941 yield defer.ensureDeferred(
4042 self.store.update_profile_in_user_dir(BOBBY, "bobby", None)
43 )
44 yield defer.ensureDeferred(
45 self.store.update_profile_in_user_dir(BELA, "Bela", None)
4146 )
4247 yield defer.ensureDeferred(
4348 self.store.add_users_in_public_rooms("!room:id", (ALICE, BOB))
7176 )
7277 finally:
7378 self.hs.config.user_directory_search_all_users = False
79
80 @defer.inlineCallbacks
81 def test_search_user_dir_stop_words(self):
82 """Tests that a user can look up another user by searching for the start of its
83 display name even if that name happens to be a common English word that would
84 usually be ignored in full text searches.
85 """
86 self.hs.config.user_directory_search_all_users = True
87 try:
88 r = yield defer.ensureDeferred(self.store.search_user_dir(ALICE, "be", 10))
89 self.assertFalse(r["limited"])
90 self.assertEqual(1, len(r["results"]))
91 self.assertDictEqual(
92 r["results"][0],
93 {"user_id": BELA, "display_name": "Bela", "avatar_url": None},
94 )
95 finally:
96 self.hs.config.user_directory_search_all_users = False
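The test above relies on prefix matching succeeding where full-text search would discard the query. A toy illustration (not Synapse's actual Postgres/SQLite search implementation) of why stop words matter here: a naive full-text tokenizer that drops common English words would never match the query "be", while a plain prefix match still finds "Bela".

```python
# Hypothetical sketch: STOP_WORDS and both matchers are illustrative only.
STOP_WORDS = {"a", "an", "and", "be", "the"}

def full_text_match(query: str, display_name: str) -> bool:
    # tokenize the query and drop stop words, as a naive FTS engine might
    tokens = [t for t in query.lower().split() if t not in STOP_WORDS]
    return any(t in display_name.lower() for t in tokens)

def prefix_match(query: str, display_name: str) -> bool:
    # match on the start of the display name, ignoring case
    return display_name.lower().startswith(query.lower())

assert not full_text_match("be", "Bela")  # "be" is discarded as a stop word
assert prefix_match("be", "Bela")         # prefix search still succeeds
```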
3636 self.hs_clock = Clock(self.reactor)
3737 self.homeserver = setup_test_homeserver(
3838 self.addCleanup,
39 http_client=self.http_client,
39 federation_http_client=self.http_client,
4040 clock=self.hs_clock,
4141 reactor=self.reactor,
4242 )
133133 }
134134 )
135135
136 with LoggingContext(request="lying_event"):
136 with LoggingContext():
137137 failure = self.get_failure(
138138 self.handler.on_receive_pdu(
139139 "test.serv", lying_event, sent_to_us_directly=True
1818
1919 from synapse.api.constants import LoginType
2020 from synapse.api.errors import Codes, HttpResponseException, SynapseError
21 from synapse.appservice import ApplicationService
2122 from synapse.rest.client.v2_alpha import register, sync
2223
2324 from tests import unittest
7475 self.assertEqual(e.code, 403)
7576 self.assertEqual(e.errcode, Codes.RESOURCE_LIMIT_EXCEEDED)
7677
78 def test_as_ignores_mau(self):
79 """Test that application services can still create users when the MAU
80 limit has been reached. This only works when application service
81 user IP tracking is disabled.
82 """
83
84 # Create and sync so that the MAU counts get updated
85 token1 = self.create_user("kermit1")
86 self.do_sync_for_user(token1)
87 token2 = self.create_user("kermit2")
88 self.do_sync_for_user(token2)
89
90 # check we're testing what we think we are: there should be two active users
91 self.assertEqual(self.get_success(self.store.get_monthly_active_count()), 2)
92
93 # We've created and activated two users, we shouldn't be able to
94 # register new users
95 with self.assertRaises(SynapseError) as cm:
96 self.create_user("kermit3")
97
98 e = cm.exception
99 self.assertEqual(e.code, 403)
100 self.assertEqual(e.errcode, Codes.RESOURCE_LIMIT_EXCEEDED)
101
102 # Cheekily add an application service that we use to register a new user
103 # with.
104 as_token = "foobartoken"
105 self.store.services_cache.append(
106 ApplicationService(
107 token=as_token,
108 hostname=self.hs.hostname,
109 id="SomeASID",
110 sender="@as_sender:test",
111 namespaces={"users": [{"regex": "@as_*", "exclusive": True}]},
112 )
113 )
114
115 self.create_user("as_kermit4", token=as_token)
116
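The new test registers an application service whose `users` namespace is the regex `@as_*`, and then registers a user inside that namespace. A minimal sketch (not Synapse's actual `ApplicationService` implementation) of how such a namespace regex decides whether the AS may act for a given user ID — note that the `*` applies to the underscore, so this regex effectively matches IDs beginning with `@as`:

```python
import re

# Hypothetical helper name; mirrors the namespace regex used in the test.
namespace_regex = re.compile("@as_*")

def as_is_interested_in_user(user_id: str) -> bool:
    # re.match anchors at the start of the string, so this is a prefix check
    return namespace_regex.match(user_id) is not None

assert as_is_interested_in_user("@as_kermit4:test")
assert not as_is_interested_in_user("@kermit3:test")
```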
77117 def test_allowed_after_a_month_mau(self):
78118 # Create and sync so that the MAU counts get updated
79119 token1 = self.create_user("kermit1")
191231 self.reactor.advance(100)
192232 self.assertEqual(2, self.successResultOf(count))
193233
194 def create_user(self, localpart):
234 def create_user(self, localpart, token=None):
195235 request_data = json.dumps(
196236 {
197237 "username": localpart,
200240 }
201241 )
202242
203 request, channel = self.make_request("POST", "/register", request_data)
243 channel = self.make_request(
244 "POST", "/register", request_data, access_token=token,
245 )
204246
205247 if channel.code != 200:
206248 raise HttpResponseException(
212254 return access_token
213255
214256 def do_sync_for_user(self, token):
215 request, channel = self.make_request("GET", "/sync", access_token=token)
257 channel = self.make_request("GET", "/sync", access_token=token)
216258
217259 if channel.code != 200:
218260 raise HttpResponseException(
5555
5656 desc = summarize_paragraphs(example_paras, min_size=200, max_size=500)
5757
58 self.assertEquals(
58 self.assertEqual(
5959 desc,
6060 "Tromsø (Norwegian pronunciation: [ˈtrʊmsœ] ( listen); Northern Sami:"
6161 " Romsa; Finnish: Tromssa[2] Kven: Tromssa) is a city and municipality in"
6868
6969 desc = summarize_paragraphs(example_paras[1:], min_size=200, max_size=500)
7070
71 self.assertEquals(
71 self.assertEqual(
7272 desc,
7373 "Tromsø lies in Northern Norway. The municipality has a population of"
7474 " (2015) 72,066, but with an annual influx of students it has over 75,000"
9595
9696 desc = summarize_paragraphs(example_paras, min_size=200, max_size=500)
9797
98 self.assertEquals(
98 self.assertEqual(
9999 desc,
100100 "Tromsø (Norwegian pronunciation: [ˈtrʊmsœ] ( listen); Northern Sami:"
101101 " Romsa; Finnish: Tromssa[2] Kven: Tromssa) is a city and municipality in"
121121 ]
122122
123123 desc = summarize_paragraphs(example_paras, min_size=200, max_size=500)
124 self.assertEquals(
124 self.assertEqual(
125125 desc,
126126 "Tromsø (Norwegian pronunciation: [ˈtrʊmsœ] ( listen); Northern Sami:"
127127 " Romsa; Finnish: Tromssa[2] Kven: Tromssa) is a city and municipality in"
148148
149149 og = decode_and_calc_og(html, "http://example.com/test.html")
150150
151 self.assertEquals(og, {"og:title": "Foo", "og:description": "Some text."})
151 self.assertEqual(og, {"og:title": "Foo", "og:description": "Some text."})
152152
153153 def test_comment(self):
154154 html = """
163163
164164 og = decode_and_calc_og(html, "http://example.com/test.html")
165165
166 self.assertEquals(og, {"og:title": "Foo", "og:description": "Some text."})
166 self.assertEqual(og, {"og:title": "Foo", "og:description": "Some text."})
167167
168168 def test_comment2(self):
169169 html = """
181181
182182 og = decode_and_calc_og(html, "http://example.com/test.html")
183183
184 self.assertEquals(
184 self.assertEqual(
185185 og,
186186 {
187187 "og:title": "Foo",
202202
203203 og = decode_and_calc_og(html, "http://example.com/test.html")
204204
205 self.assertEquals(og, {"og:title": "Foo", "og:description": "Some text."})
205 self.assertEqual(og, {"og:title": "Foo", "og:description": "Some text."})
206206
207207 def test_missing_title(self):
208208 html = """
215215
216216 og = decode_and_calc_og(html, "http://example.com/test.html")
217217
218 self.assertEquals(og, {"og:title": None, "og:description": "Some text."})
218 self.assertEqual(og, {"og:title": None, "og:description": "Some text."})
219219
220220 def test_h1_as_title(self):
221221 html = """
229229
230230 og = decode_and_calc_og(html, "http://example.com/test.html")
231231
232 self.assertEquals(og, {"og:title": "Title", "og:description": "Some text."})
232 self.assertEqual(og, {"og:title": "Title", "og:description": "Some text."})
233233
234234 def test_missing_title_and_broken_h1(self):
235235 html = """
243243
244244 og = decode_and_calc_og(html, "http://example.com/test.html")
245245
246 self.assertEquals(og, {"og:title": None, "og:description": "Some text."})
246 self.assertEqual(og, {"og:title": None, "og:description": "Some text."})
247
248 def test_empty(self):
249 html = ""
250 og = decode_and_calc_og(html, "http://example.com/test.html")
251 self.assertEqual(og, {})
3737 self.reactor = ThreadedMemoryReactorClock()
3838 self.hs_clock = Clock(self.reactor)
3939 self.homeserver = setup_test_homeserver(
40 self.addCleanup, http_client=None, clock=self.hs_clock, reactor=self.reactor
40 self.addCleanup,
41 federation_http_client=None,
42 clock=self.hs_clock,
43 reactor=self.reactor,
4144 )
4245
4346 def test_handler_for_request(self):
6063 "test_servlet",
6164 )
6265
63 request, channel = make_request(
66 make_request(
6467 self.reactor, FakeSite(res), b"GET", b"/_matrix/foo/%E2%98%83?a=%E2%98%83"
6568 )
6669
67 self.assertEqual(request.args, {b"a": ["\N{SNOWMAN}".encode("utf8")]})
6870 self.assertEqual(got_kwargs, {"room_id": "\N{SNOWMAN}"})
6971
7072 def test_callback_direct_exception(self):
8183 "GET", [re.compile("^/_matrix/foo$")], _callback, "test_servlet"
8284 )
8385
84 _, channel = make_request(self.reactor, FakeSite(res), b"GET", b"/_matrix/foo")
86 channel = make_request(self.reactor, FakeSite(res), b"GET", b"/_matrix/foo")
8587
8688 self.assertEqual(channel.result["code"], b"500")
8789
105107 "GET", [re.compile("^/_matrix/foo$")], _callback, "test_servlet"
106108 )
107109
108 _, channel = make_request(self.reactor, FakeSite(res), b"GET", b"/_matrix/foo")
110 channel = make_request(self.reactor, FakeSite(res), b"GET", b"/_matrix/foo")
109111
110112 self.assertEqual(channel.result["code"], b"500")
111113
123125 "GET", [re.compile("^/_matrix/foo$")], _callback, "test_servlet"
124126 )
125127
126 _, channel = make_request(self.reactor, FakeSite(res), b"GET", b"/_matrix/foo")
128 channel = make_request(self.reactor, FakeSite(res), b"GET", b"/_matrix/foo")
127129
128130 self.assertEqual(channel.result["code"], b"403")
129131 self.assertEqual(channel.json_body["error"], "Forbidden!!one!")
145147 "GET", [re.compile("^/_matrix/foo$")], _callback, "test_servlet"
146148 )
147149
148 _, channel = make_request(
149 self.reactor, FakeSite(res), b"GET", b"/_matrix/foobar"
150 )
150 channel = make_request(self.reactor, FakeSite(res), b"GET", b"/_matrix/foobar")
151151
152152 self.assertEqual(channel.result["code"], b"400")
153153 self.assertEqual(channel.json_body["error"], "Unrecognized request")
169169 )
170170
171171 # The path was registered as GET, but this is a HEAD request.
172 _, channel = make_request(self.reactor, FakeSite(res), b"HEAD", b"/_matrix/foo")
172 channel = make_request(self.reactor, FakeSite(res), b"HEAD", b"/_matrix/foo")
173173
174174 self.assertEqual(channel.result["code"], b"200")
175175 self.assertNotIn("body", channel.result)
201201 )
202202
203203 # render the request and return the channel
204 _, channel = make_request(self.reactor, site, method, path, shorthand=False)
204 channel = make_request(self.reactor, site, method, path, shorthand=False)
205205 return channel
206206
207207 def test_unknown_options_request(self):
274274 res = WrapHtmlRequestHandlerTests.TestResource()
275275 res.callback = callback
276276
277 _, channel = make_request(self.reactor, FakeSite(res), b"GET", b"/path")
277 channel = make_request(self.reactor, FakeSite(res), b"GET", b"/path")
278278
279279 self.assertEqual(channel.result["code"], b"200")
280280 body = channel.result["body"]
292292 res = WrapHtmlRequestHandlerTests.TestResource()
293293 res.callback = callback
294294
295 _, channel = make_request(self.reactor, FakeSite(res), b"GET", b"/path")
295 channel = make_request(self.reactor, FakeSite(res), b"GET", b"/path")
296296
297297 self.assertEqual(channel.result["code"], b"301")
298298 headers = channel.result["headers"]
313313 res = WrapHtmlRequestHandlerTests.TestResource()
314314 res.callback = callback
315315
316 _, channel = make_request(self.reactor, FakeSite(res), b"GET", b"/path")
316 channel = make_request(self.reactor, FakeSite(res), b"GET", b"/path")
317317
318318 self.assertEqual(channel.result["code"], b"304")
319319 headers = channel.result["headers"]
332332 res = WrapHtmlRequestHandlerTests.TestResource()
333333 res.callback = callback
334334
335 _, channel = make_request(self.reactor, FakeSite(res), b"HEAD", b"/path")
335 channel = make_request(self.reactor, FakeSite(res), b"HEAD", b"/path")
336336
337337 self.assertEqual(channel.result["code"], b"200")
338338 self.assertNotIn("body", channel.result)
5252 def test_ui_auth(self):
5353 # Do a UI auth request
5454 request_data = json.dumps({"username": "kermit", "password": "monkey"})
55 request, channel = self.make_request(b"POST", self.url, request_data)
55 channel = self.make_request(b"POST", self.url, request_data)
5656
5757 self.assertEquals(channel.result["code"], b"401", channel.result)
5858
9595
9696 self.registration_handler.check_username = Mock(return_value=True)
9797
98 request, channel = self.make_request(b"POST", self.url, request_data)
98 channel = self.make_request(b"POST", self.url, request_data)
9999
100100 # We don't bother checking that the response is correct - we'll leave that to
101101 # other tests. We just want to make sure we're on the right path.
112112 },
113113 }
114114 )
115 request, channel = self.make_request(b"POST", self.url, request_data)
115 channel = self.make_request(b"POST", self.url, request_data)
116116
117117 # We're interested in getting a response that looks like a successful
118118 # registration, not so much that the details are exactly what we want.
2020 import warnings
2121 from asyncio import Future
2222 from typing import Any, Awaitable, Callable, TypeVar
23
24 from mock import Mock
25
26 import attr
27
28 from twisted.python.failure import Failure
29 from twisted.web.client import ResponseDone
2330
2431 TV = TypeVar("TV")
2532
7986 sys.unraisablehook = unraisablehook # type: ignore
8087
8188 return cleanup
89
90
91 def simple_async_mock(return_value=None, raises=None) -> Mock:
92 # AsyncMock is not available in Python 3.5; this mimics part of its behaviour
93 async def cb(*args, **kwargs):
94 if raises:
95 raise raises
96 return return_value
97
98 return Mock(side_effect=cb)
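A brief usage sketch of the `simple_async_mock` helper added above: because the `side_effect` is an async function, calling the mock returns a coroutine that can be awaited, while the mock still records calls as usual. (The helper is reproduced here so the example is self-contained; the awaiting code around it is illustrative.)

```python
import asyncio
from unittest.mock import Mock

def simple_async_mock(return_value=None, raises=None) -> Mock:
    # AsyncMock is not available in Python 3.5; this mimics part of its behaviour
    async def cb(*args, **kwargs):
        if raises:
            raise raises
        return return_value

    return Mock(side_effect=cb)

mock = simple_async_mock(return_value=42)
# calling the mock invokes cb(), which returns a coroutine we can await
result = asyncio.run(mock("arg"))
assert result == 42
mock.assert_called_once_with("arg")
```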
99
100
101 @attr.s
102 class FakeResponse:
103 """A fake twisted.web.IResponse object
104
105 there is a similar class at treq.test.test_response, but it lacks a `phrase`
106 attribute and did not support deliverBody until recently.
107 """
108
109 # HTTP response code
110 code = attr.ib(type=int)
111
112 # HTTP response phrase (eg b'OK' for a 200)
113 phrase = attr.ib(type=bytes)
114
115 # body of the response
116 body = attr.ib(type=bytes)
117
118 def deliverBody(self, protocol):
119 protocol.dataReceived(self.body)
120 protocol.connectionLost(Failure(ResponseDone()))
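The `deliverBody` contract that `FakeResponse` implements can be shown with a tiny body protocol: the response pushes its bytes via `dataReceived`, then signals completion via `connectionLost`. This is a self-contained sketch — the real class wraps the completion reason in a Twisted `Failure(ResponseDone())`, whereas here a plain sentinel is passed, and both class names are illustrative.

```python
class BodyCollector:
    """A minimal stand-in for a twisted body-consuming protocol."""

    def __init__(self):
        self.buffer = b""
        self.finished = False

    def dataReceived(self, data: bytes) -> None:
        self.buffer += data

    def connectionLost(self, reason) -> None:
        self.finished = True


class FakeResponseSketch:
    """Mimics FakeResponse.deliverBody without the attrs/twisted dependencies."""

    def __init__(self, code: int, phrase: bytes, body: bytes):
        self.code = code
        self.phrase = phrase
        self.body = body

    def deliverBody(self, protocol) -> None:
        # deliver the whole body in one chunk, then signal completion
        protocol.dataReceived(self.body)
        protocol.connectionLost("done")


collector = BodyCollector()
FakeResponseSketch(200, b"OK", b'{"result": true}').deliverBody(collector)
assert collector.buffer == b'{"result": true}'
assert collector.finished
```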
4747 handler = ToTwistedHandler()
4848 formatter = logging.Formatter(log_format)
4949 handler.setFormatter(formatter)
50 handler.addFilter(LoggingContextFilter(request=""))
50 handler.addFilter(LoggingContextFilter())
5151 root_logger.addHandler(handler)
5252
5353 log_level = os.environ.get("SYNAPSE_TEST_LOG_LEVEL", "ERROR")
1919 import inspect
2020 import logging
2121 import time
22 from typing import Optional, Tuple, Type, TypeVar, Union, overload
22 from typing import Dict, Iterable, Optional, Tuple, Type, TypeVar, Union
2323
2424 from mock import Mock, patch
2525
4545 )
4646 from synapse.server import HomeServer
4747 from synapse.types import UserID, create_requester
48 from synapse.util.httpresourcetree import create_resource_tree
4849 from synapse.util.ratelimitutils import FederationRateLimiter
4950
5051 from tests.server import FakeChannel, get_clock, make_request, setup_test_homeserver
319320 """
320321 Create the root resource for the test server.
321322
323 The default calls `self.create_resource_dict` and builds the resultant dict
324 into a tree.
325 """
326 root_resource = Resource()
327 create_resource_tree(self.create_resource_dict(), root_resource)
328 return root_resource
329
330 def create_resource_dict(self) -> Dict[str, Resource]:
331 """Create a resource tree for the test server
332
333 A resource tree is a mapping from path to twisted.web.resource.
334
322335 The default implementation creates a JsonResource and calls each function in
323 `servlets` to register servletes against it
324 """
325 resource = JsonResource(self.hs)
326
336 `servlets` to register servlets against it.
337 """
338 servlet_resource = JsonResource(self.hs)
327339 for servlet in self.servlets:
328 servlet(self.hs, resource)
329
330 return resource
340 servlet(self.hs, servlet_resource)
341 return {
342 "/_matrix/client": servlet_resource,
343 "/_synapse/admin": servlet_resource,
344 }
331345
332346 def default_config(self):
333347 """
357371 Function to optionally be overridden in subclasses.
358372 """
359373
360 # Annoyingly mypy doesn't seem to pick up the fact that T is SynapseRequest
361 # when the `request` arg isn't given, so we define an explicit override to
362 # cover that case.
363 @overload
364 def make_request(
365 self,
366 method: Union[bytes, str],
367 path: Union[bytes, str],
368 content: Union[bytes, dict] = b"",
369 access_token: Optional[str] = None,
370 shorthand: bool = True,
371 federation_auth_origin: str = None,
372 content_is_form: bool = False,
373 await_result: bool = True,
374 ) -> Tuple[SynapseRequest, FakeChannel]:
375 ...
376
377 @overload
378374 def make_request(
379375 self,
380376 method: Union[bytes, str],
386382 federation_auth_origin: str = None,
387383 content_is_form: bool = False,
388384 await_result: bool = True,
389 ) -> Tuple[T, FakeChannel]:
390 ...
391
392 def make_request(
393 self,
394 method: Union[bytes, str],
395 path: Union[bytes, str],
396 content: Union[bytes, dict] = b"",
397 access_token: Optional[str] = None,
398 request: Type[T] = SynapseRequest,
399 shorthand: bool = True,
400 federation_auth_origin: str = None,
401 content_is_form: bool = False,
402 await_result: bool = True,
403 ) -> Tuple[T, FakeChannel]:
385 custom_headers: Optional[
386 Iterable[Tuple[Union[bytes, str], Union[bytes, str]]]
387 ] = None,
388 ) -> FakeChannel:
404389 """
405390 Create a SynapseRequest at the path using the method and containing the
406391 given content.
422407 true (the default), will pump the test reactor until the renderer
423408 tells the channel the request is finished.
424409
410 custom_headers: (name, value) pairs to add as request headers
411
425412 Returns:
426 Tuple[synapse.http.site.SynapseRequest, channel]
413 The FakeChannel object which stores the result of the request.
427414 """
428415 return make_request(
429416 self.reactor,
437424 federation_auth_origin,
438425 content_is_form,
439426 await_result,
427 custom_headers,
440428 )
441429
442430 def setup_test_homeserver(self, *args, **kwargs):
553541 self.hs.config.registration_shared_secret = "shared"
554542
555543 # Create the user
556 request, channel = self.make_request("GET", "/_synapse/admin/v1/register")
544 channel = self.make_request("GET", "/_synapse/admin/v1/register")
557545 self.assertEqual(channel.code, 200, msg=channel.result)
558546 nonce = channel.json_body["nonce"]
559547
578566 "inhibit_login": True,
579567 }
580568 )
581 request, channel = self.make_request(
569 channel = self.make_request(
582570 "POST", "/_synapse/admin/v1/register", body.encode("utf8")
583571 )
584572 self.assertEqual(channel.code, 200, channel.json_body)
596584 if device_id:
597585 body["device_id"] = device_id
598586
599 request, channel = self.make_request(
587 channel = self.make_request(
600588 "POST", "/_matrix/client/r0/login", json.dumps(body).encode("utf8")
601589 )
602590 self.assertEqual(channel.code, 200, channel.result)
664652 """
665653 body = {"type": "m.login.password", "user": username, "password": password}
666654
667 request, channel = self.make_request(
655 channel = self.make_request(
668656 "POST", "/_matrix/client/r0/login", json.dumps(body).encode("utf8")
669657 )
670658 self.assertEqual(channel.code, 403, channel.result)
690678 A federating homeserver that authenticates incoming requests as `other.example.com`.
691679 """
692680
693 def prepare(self, reactor, clock, homeserver):
681 def create_resource_dict(self) -> Dict[str, Resource]:
682 d = super().create_resource_dict()
683 d["/_matrix/federation"] = TestTransportLayerServer(self.hs)
684 return d
685
686
687 class TestTransportLayerServer(JsonResource):
688 """A test implementation of TransportLayerServer
689
690 authenticates incoming requests as `other.example.com`.
691 """
692
693 def __init__(self, hs):
694 super().__init__(hs)
695
694696 class Authenticator:
695697 def authenticate_request(self, request, content):
696698 return succeed("other.example.com")
697699
700 authenticator = Authenticator()
701
698702 ratelimiter = FederationRateLimiter(
699 clock,
703 hs.get_clock(),
700704 FederationRateLimitConfig(
701705 window_size=1,
702706 sleep_limit=1,
705709 concurrent_requests=1000,
706710 ),
707711 )
708 federation_server.register_servlets(
709 homeserver, self.resource, Authenticator(), ratelimiter
710 )
711
712 return super().prepare(reactor, clock, homeserver)
712
713 federation_server.register_servlets(hs, self, authenticator, ratelimiter)
713714
714715
715716 def override_config(extra_config):
1919 import time
2020 import uuid
2121 import warnings
22 from inspect import getcallargs
2322 from typing import Type
2423 from urllib import parse as urlparse
2524
2625 from mock import Mock, patch
2726
28 from twisted.internet import defer, reactor
27 from twisted.internet import defer
2928
3029 from synapse.api.constants import EventTypes
3130 from synapse.api.errors import CodeMessageException, cs_error
3332 from synapse.config.database import DatabaseConnectionConfig
3433 from synapse.config.homeserver import HomeServerConfig
3534 from synapse.config.server import DEFAULT_ROOM_VERSION
36 from synapse.federation.transport import server as federation_server
3735 from synapse.http.server import HttpServer
3836 from synapse.logging.context import current_context, set_current_context
3937 from synapse.server import HomeServer
4139 from synapse.storage.database import LoggingDatabaseConnection
4240 from synapse.storage.engines import PostgresEngine, create_engine
4341 from synapse.storage.prepare_database import prepare_database
44 from synapse.util.ratelimitutils import FederationRateLimiter
4542
4643 # set this to True to run the tests against postgres instead of sqlite.
4744 #
341338
342339 hs.get_auth_handler().validate_hash = validate_hash
343340
344 fed = kwargs.get("resource_for_federation", None)
345 if fed:
346 register_federation_servlets(hs, fed)
347
348341 return hs
349
350
351 def register_federation_servlets(hs, resource):
352 federation_server.register_servlets(
353 hs,
354 resource=resource,
355 authenticator=federation_server.Authenticator(hs),
356 ratelimiter=FederationRateLimiter(
357 hs.get_clock(), config=hs.config.rc_federation
358 ),
359 )
360
361
362 def get_mock_call_args(pattern_func, mock_func):
363 """ Return the arguments the mock function was called with interpreted
364 by the pattern functions argument list.
365 """
366 invoked_args, invoked_kargs = mock_func.call_args
367 return getcallargs(pattern_func, *invoked_args, **invoked_kargs)
368342
369343
370344 def mock_getRawHeaders(headers=None):
552526 return d
553527
554528
555 def _format_call(args, kwargs):
556 return ", ".join(
557 ["%r" % (a) for a in args] + ["%s=%r" % (k, v) for k, v in kwargs.items()]
558 )
559
560
561 class DeferredMockCallable:
562 """A callable instance that stores a set of pending call expectations and
563 return values for them. It allows a unit test to assert that the given set
564 of function calls are eventually made, by awaiting on them to be called.
565 """
566
567 def __init__(self):
568 self.expectations = []
569 self.calls = []
570
571 def __call__(self, *args, **kwargs):
572 self.calls.append((args, kwargs))
573
574 if not self.expectations:
575 raise ValueError(
576 "%r has no pending calls to handle call(%s)"
577 % (self, _format_call(args, kwargs))
578 )
579
580 for (call, result, d) in self.expectations:
581 if args == call[1] and kwargs == call[2]:
582 d.callback(None)
583 return result
584
585 failure = AssertionError(
586 "Was not expecting call(%s)" % (_format_call(args, kwargs))
587 )
588
589 for _, _, d in self.expectations:
590 try:
591 d.errback(failure)
592 except Exception:
593 pass
594
595 raise failure
596
597 def expect_call_and_return(self, call, result):
598 self.expectations.append((call, result, defer.Deferred()))
599
600 @defer.inlineCallbacks
601 def await_calls(self, timeout=1000):
602 deferred = defer.DeferredList(
603 [d for _, _, d in self.expectations], fireOnOneErrback=True
604 )
605
606 timer = reactor.callLater(
607 timeout / 1000,
608 deferred.errback,
609 AssertionError(
610 "%d pending calls left: %s"
611 % (
612 len([e for e in self.expectations if not e[2].called]),
613 [e for e in self.expectations if not e[2].called],
614 )
615 ),
616 )
617
618 yield deferred
619
620 timer.cancel()
621
622 self.calls = []
623
624 def assert_had_no_calls(self):
625 if self.calls:
626 calls = self.calls
627 self.calls = []
628
629 raise AssertionError(
630 "Expected not to received any calls, got:\n"
631 + "\n".join(["call(%s)" % _format_call(c[0], c[1]) for c in calls])
632 )
633
634
635529 async def create_room(hs, room_id: str, creator_id: str):
636530 """Creates and persist a creation event for the given room
637531 """
66 python-subunit
77 junitxml
88 coverage
9 coverage-enable-subprocess
9
10 # this is pinned since it's a bit of an obscure package.
11 coverage-enable-subprocess==1.0
1012
1113 # cryptography 2.2 requires setuptools >= 18.5
1214 #
2224 # install the "enum34" dependency of cryptography.
2325 pip>=10
2426
25 setenv =
26 PYTHONDONTWRITEBYTECODE = no_byte_code
27 COVERAGE_PROCESS_START = {toxinidir}/.coveragerc
28
2927 [testenv]
3028 deps =
3129 {[base]deps}
3230 extras = all, test
3331
34 whitelist_externals =
35 sh
32 setenv =
33 # use a postgres db for tox environments with "-postgres" in the name
34 # (see https://tox.readthedocs.io/en/3.20.1/config.html#factors-and-factor-conditional-settings)
35 postgres: SYNAPSE_POSTGRES = 1
3636
37 setenv =
38 {[base]setenv}
39 postgres: SYNAPSE_POSTGRES = 1
37 # this is used by .coveragerc to refer to the top of our tree.
4038 TOP={toxinidir}
4139
4240 passenv = *
4341
4442 commands =
45 /usr/bin/find "{toxinidir}" -name '*.pyc' -delete
46 # Add this so that coverage will run on subprocesses
47 {envbindir}/coverage run "{envbindir}/trial" {env:TRIAL_FLAGS:} {posargs:tests} {env:TOXSUFFIX:}
43 # the "env" invocation enables coverage checking for sub-processes. This is
44 # particularly important when running trial with `-j`, since that will make
45 # it run tests in a subprocess, whose coverage would otherwise not be
46 # tracked. (It also makes an explicit `coverage run` command redundant.)
47 #
48 # (See https://coverage.readthedocs.io/en/coverage-5.3/subprocess.html.
49 # Note that the `coverage.process_startup()` call is done by
50 # `coverage-enable-subprocess`.)
51 #
52 # we use "env" rather than putting a value in `setenv` so that it is not
53 # inherited by other tox environments.
54 #
55 # keep this in sync with the copy in `testenv:py35-old`.
56 #
57 /usr/bin/env COVERAGE_PROCESS_START={toxinidir}/.coveragerc "{envbindir}/trial" {env:TRIAL_FLAGS:} {posargs:tests} {env:TOXSUFFIX:}
4858
4959 # As of twisted 16.4, trial tries to import the tests as a package (previously
5060 # it loaded the files explicitly), which means they need to be on the
7989
8090 lxml
8191 coverage
82 coverage-enable-subprocess
92 coverage-enable-subprocess==1.0
8393
8494 commands =
85 /usr/bin/find "{toxinidir}" -name '*.pyc' -delete
8695 # Make all greater-thans equals so we test the oldest version of our direct
8796 # dependencies, but make the pyopenssl 17.0, which can work against an
8897 # OpenSSL 1.1 compiled cryptography (as older ones don't compile on Travis).
91100 # Install Synapse itself. This won't update any libraries.
92101 pip install -e ".[test]"
93102
94 {envbindir}/coverage run "{envbindir}/trial" {env:TRIAL_FLAGS:} {posargs:tests} {env:TOXSUFFIX:}
103 # we have to duplicate the command from `testenv` rather than refer to it
104 # as `{[testenv]commands}`, because we run on ubuntu xenial, which has
105 # tox 2.3.1 and is affected by https://github.com/tox-dev/tox/issues/208.
106 #
107 /usr/bin/env COVERAGE_PROCESS_START={toxinidir}/.coveragerc "{envbindir}/trial" {env:TRIAL_FLAGS:} {posargs:tests} {env:TOXSUFFIX:}
95108
96109 [testenv:benchmark]
97110 deps =
156169 {[base]deps}
157170 extras = all,mypy
158171 commands = mypy
159
160 # To find all folders that pass mypy you run:
161 #
162 # find synapse/* -type d -not -name __pycache__ -exec bash -c "mypy '{}' > /dev/null" \; -print