Merge tag 'v2.3.0' into upstream
Sébastien Delafond
7 years ago
19 | 19 | |
20 | 20 | install: |
21 | 21 | - ".travis/install.sh" |
22 | before_script: "flake8 --max-complexity 10 hpack test" | |
22 | before_script: "flake8 hpack test" | |
23 | 23 | script: |
24 | 24 | - > |
25 | 25 | if [[ $TRAVIS_PYTHON_VERSION == pypy ]]; then |
0 | 0 | Release History |
1 | 1 | =============== |
2 | ||
3 | 2.3.0 (2016-08-04) | |
4 | ------------------ | |
5 | ||
6 | **Security Fixes** | |
7 | ||
8 | - CVE-2016-6581: HPACK Bomb. This release now enforces a maximum decompressed | |
9 | size for the header list. This avoids the so-called "HPACK | |
10 | Bomb" vulnerability, which occurs when a malicious peer sends a compressed | |
11 | HPACK body that decompresses to a gigantic header list. | |
12 | ||
13 | This also adds an ``OversizedHeaderListError``, which is thrown by the | |
14 | ``decode`` method if the maximum header list size is violated. This | |
15 | places the HPACK decoder into a broken state: it must not be used after this | |
16 | exception is thrown. | |
17 | ||
18 | This also adds a ``max_header_list_size`` parameter to the ``Decoder`` | |
19 | object. This controls the maximum allowable decompressed size of the | |
20 | header list. By default this is set to 64kB. | |
2 | 21 | |
3 | 22 | 2.2.0 (2016-04-20) |
4 | 23 | ------------------ |
3 | 3 | |
4 | 4 | .. image:: https://raw.github.com/Lukasa/hyper/development/docs/source/images/hyper.png |
5 | 5 | |
6 | .. image:: https://travis-ci.org/Lukasa/hpack.png?branch=master | |
7 | :target: https://travis-ci.org/Lukasa/hpack | |
6 | .. image:: https://travis-ci.org/python-hyper/hpack.png?branch=master | |
7 | :target: https://travis-ci.org/python-hyper/hpack | |
8 | 8 | |
9 | 9 | This module contains pure-Python HTTP/2 header encoding (HPACK) logic for use
10 | 10 | in Python programs that implement HTTP/2. It also contains a compatibility |
19 | 19 | .. autoclass:: hpack.HPACKDecodingError |
20 | 20 | |
21 | 21 | .. autoclass:: hpack.InvalidTableIndex |
22 | ||
23 | .. autoclass:: hpack.OversizedHeaderListError |
53 | 53 | # built documents. |
54 | 54 | # |
55 | 55 | # The short X.Y version. |
56 | version = '2.2.0' | |
56 | version = '2.3.0' | |
57 | 57 | # The full version, including alpha/beta/rc tags. |
58 | release = '2.2.0' | |
58 | release = '2.3.0' | |
59 | 59 | |
60 | 60 | # The language for content autogenerated by Sphinx. Refer to documentation |
61 | 61 | # for a list of supported languages. |
29 | 29 | |
30 | 30 | installation |
31 | 31 | api |
32 | security/index | |
32 | 33 | |
33 | 34 | |
34 | 35 | .. _HPACK: https://tools.ietf.org/html/rfc7541 |
0 | :orphan: | |
1 | ||
2 | HPACK Bomb | |
3 | ========== | |
4 | ||
5 | Hyper Project security advisory, August 4th 2016. | |
6 | ||
7 | Vulnerability | |
8 | ------------- | |
9 | ||
10 | An HTTP/2 implementation built using the hpack library could be targeted for | |
11 | a denial of service attack based on HPACK, specifically a so-called "HPACK | |
12 | Bomb" attack. | |
13 | ||
14 | This attack occurs when an attacker inserts a header field that is exactly the | |
15 | size of the HPACK dynamic header table into the dynamic header table. The | |
16 | attacker can then send a header block that is simply repeated requests to | |
17 | expand that field in the dynamic table. This can lead to a gigantic compression | |
18 | ratio of 4,096 or better, meaning that 16kB of data can decompress to 64MB of | |
19 | data on the target machine. | |
20 | ||
21 | It only takes a few such header blocks before the attacker has forced the | |
22 | target to allocate gigabytes of memory, which will take the process down. This | |
23 | requires relatively few resources on the part of the attacker. | |
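The amplification figures above can be checked with a little arithmetic. This is a sketch, assuming the HTTP/2 default dynamic table size of 4,096 octets from RFC 7541:

```python
# Default HPACK dynamic table size: one malicious entry can fill it.
table_size = 4096

# Each one-byte indexed-field reference in a later header block
# expands to roughly the full entry on the receiving side, so the
# compression ratio approaches the table size itself.
compressed = 16 * 1024                     # 16kB of index bytes on the wire
decompressed = compressed * table_size

assert decompressed == 64 * 1024 * 1024    # 64MB, a 4,096x ratio
```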
24 | ||
25 | While we are not aware of any attacker actively exploiting this vulnerability, | |
26 | it has been publicly disclosed in `this report`_, and so users should assume | |
27 | that they are likely to be targeted by such an attack. | |
28 | ||
29 | Info | |
30 | ---- | |
31 | ||
32 | This issue has been given the name CVE-2016-6581. | |
33 | ||
34 | Affected Versions | |
35 | ----------------- | |
36 | ||
37 | This issue affects all versions of the HPACK library prior to 2.3.0. | |
38 | ||
39 | The Solution | |
40 | ------------ | |
41 | ||
42 | In version 2.3.0, the HPACK library limits the maximum decompressed size of the | |
43 | header block. It does so by essentially adding support for the HTTP/2 setting | |
44 | ``SETTINGS_MAX_HEADER_LIST_SIZE``. This value defaults to 64kB, but is | |
45 | user-configurable. | |
46 | ||
47 | If it is necessary to backport a patch, the patch can be found in | |
48 | `this GitHub pull request`_. | |
49 | ||
50 | Recommendations | |
51 | --------------- | |
52 | ||
53 | We suggest you take the following actions immediately, in order of preference: | |
54 | ||
55 | 1. Update HPACK to 2.3.0 immediately. | |
56 | 2. Backport the patch made available on GitHub. | |
57 | 3. Substantially decrease the maximum size of the compressed header block your | |
58 | application will accept, or alternatively ensure that each decompressed | |
59 | header block is freed before your application processes the next one. | |
60 | ||
61 | Timeline | |
62 | -------- | |
63 | ||
64 | This class of vulnerability was publicly reported in `this report`_ on the | |
65 | 3rd of August. We requested a CVE ID from Mitre the same day. | |
66 | ||
67 | HPACK 2.3.0 was released on the 4th of August, at the same time as the | |
68 | publication of this advisory. | |
69 | ||
70 | ||
71 | .. _this report: http://www.imperva.com/docs/Imperva_HII_HTTP2.pdf | |
72 | .. _this GitHub pull request: https://github.com/python-hyper/hpack/pull/56 |
0 | Vulnerability Notifications | |
1 | =========================== | |
2 | ||
3 | This section of the page contains all known vulnerabilities in the HPACK | |
4 | library. These vulnerabilities have all been reported to us via our | |
5 | `vulnerability disclosure policy`_. | |
6 | ||
7 | Known Vulnerabilities | |
8 | --------------------- | |
9 | ||
10 | +----+---------------------------+----------------+---------------+--------------+---------------+ | |
11 | | \# | Vulnerability | Date Announced | First Version | Last Version | CVE | | |
12 | +====+===========================+================+===============+==============+===============+ | |
13 | | 1 | :doc:`HPACK Bomb | 2016-08-04 | 1.0.0 | 2.2.0 | CVE-2016-6581 | | |
14 | | | <CVE-2016-6581>` | | | | | | |
15 | +----+---------------------------+----------------+---------------+--------------+---------------+ | |
16 | ||
17 | .. _vulnerability disclosure policy: http://python-hyper.org/en/latest/security.html#vulnerability-disclosure |
6 | 6 | """ |
7 | 7 | from .hpack import Encoder, Decoder |
8 | 8 | from .struct import HeaderTuple, NeverIndexedHeaderTuple |
9 | from .exceptions import HPACKError, HPACKDecodingError, InvalidTableIndex | |
9 | from .exceptions import ( | |
10 | HPACKError, HPACKDecodingError, InvalidTableIndex, OversizedHeaderListError | |
11 | ) | |
10 | 12 | |
11 | 13 | __all__ = [ |
12 | 14 | 'Encoder', 'Decoder', 'HPACKError', 'HPACKDecodingError', |
13 | 'InvalidTableIndex', 'HeaderTuple', 'NeverIndexedHeaderTuple' | |
15 | 'InvalidTableIndex', 'HeaderTuple', 'NeverIndexedHeaderTuple', | |
16 | 'OversizedHeaderListError' | |
14 | 17 | ] |
15 | 18 | |
16 | __version__ = '2.2.0' | |
19 | __version__ = '2.3.0' |
25 | 25 | An invalid table index was received. |
26 | 26 | """ |
27 | 27 | pass |
28 | ||
29 | ||
30 | class OversizedHeaderListError(HPACKDecodingError): | |
31 | """ | |
32 | A header list that was larger than we allow has been received. This may be | |
33 | a DoS attack. | |
34 | ||
35 | .. versionadded:: 2.3.0 | |
36 | """ | |
37 | pass |
6 | 6 | """ |
7 | 7 | import logging |
8 | 8 | |
9 | from .table import HeaderTable | |
9 | from .table import HeaderTable, table_entry_size | |
10 | 10 | from .compat import to_byte, to_bytes |
11 | from .exceptions import HPACKDecodingError | |
11 | from .exceptions import HPACKDecodingError, OversizedHeaderListError | |
12 | 12 | from .huffman import HuffmanEncoder |
13 | 13 | from .huffman_constants import ( |
14 | 14 | REQUEST_CODES, REQUEST_CODES_LENGTH |
26 | 26 | basestring = basestring |
27 | 27 | except NameError: # pragma: no cover |
28 | 28 | basestring = (str, bytes) |
29 | ||
30 | ||
31 | # We default the maximum header list we're willing to accept to 64kB. That's a | |
32 | # lot of headers, but if applications want to raise it they can do so. | |
33 | DEFAULT_MAX_HEADER_LIST_SIZE = 2**16 | |
29 | 34 | |
30 | 35 | |
31 | 36 | def _unicode_if_needed(header, raw): |
348 | 353 | class Decoder(object): |
349 | 354 | """ |
350 | 355 | An HPACK decoder object. |
351 | """ | |
352 | ||
353 | def __init__(self): | |
356 | ||
357 | .. versionchanged:: 2.3.0 | |
358 | Added ``max_header_list_size`` argument. | |
359 | ||
360 | :param max_header_list_size: The maximum decompressed size we will allow | |
361 | for any single header block. This is a protection against DoS attacks | |
362 | that attempt to force the application to expand a relatively small | |
363 | amount of data into a really large header list, allowing enormous | |
364 | amounts of memory to be allocated. | |
365 | ||
366 | If this amount of data is exceeded, an `OversizedHeaderListError | |
367 | <hpack.OversizedHeaderListError>` exception will be raised. At this | |
368 | point the connection should be shut down, as the HPACK state will no | |
369 | longer be usable. | |
370 | ||
371 | Defaults to 64kB. | |
372 | :type max_header_list_size: ``int`` | |
373 | """ | |
374 | def __init__(self, max_header_list_size=DEFAULT_MAX_HEADER_LIST_SIZE): | |
354 | 375 | self.header_table = HeaderTable() |
376 | ||
377 | #: The maximum decompressed size we will allow for any single header | |
378 | #: block. This is a protection against DoS attacks that attempt to | |
379 | #: force the application to expand a relatively small amount of data | |
380 | #: into a really large header list, allowing enormous amounts of memory | |
381 | #: to be allocated. | |
382 | #: | |
383 | #: If this amount of data is exceeded, an `OversizedHeaderListError | |
384 | #: <hpack.OversizedHeaderListError>` exception will be raised. At this | |
385 | #: point the connection should be shut down, as the HPACK state will no | |
386 | #: longer be usable. | |
387 | #: | |
388 | #: Defaults to 64kB. | |
389 | #: | |
390 | #: .. versionadded:: 2.3.0 | |
391 | self.max_header_list_size = max_header_list_size | |
355 | 392 | |
356 | 393 | @property |
357 | 394 | def header_table_size(self): |
384 | 421 | data_mem = memoryview(data) |
385 | 422 | headers = [] |
386 | 423 | data_len = len(data) |
424 | inflated_size = 0 | |
387 | 425 | current_index = 0 |
388 | 426 | |
389 | 427 | while current_index < data_len: |
421 | 459 | |
422 | 460 | if header: |
423 | 461 | headers.append(header) |
462 | inflated_size += table_entry_size(*header) | |
463 | ||
464 | if inflated_size > self.max_header_list_size: | |
465 | raise OversizedHeaderListError( | |
466 | "A header list larger than %d has been received" % | |
467 | self.max_header_list_size | |
468 | ) | |
424 | 469 | |
425 | 470 | current_index += consumed |
426 | 471 |
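The ``table_entry_size`` helper imported into this module applies the entry-size rule from RFC 7541 section 4.1, which is what ``inflated_size`` accumulates above. A minimal sketch of that rule:

```python
def table_entry_size(name, value):
    # RFC 7541 section 4.1: an entry's size is the octet length of its
    # name plus the octet length of its value plus a fixed 32-octet
    # overhead (an estimate of per-entry bookkeeping cost).
    return 32 + len(name) + len(value)

# The header decoded in the new test case below, (':path',
# '/sample/path'), weighs 5 + 12 + 32 = 49 octets, which is why a
# 44-octet max_header_list_size rejects it.
assert table_entry_size(b':path', b'/sample/path') == 49
```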
284 | 284 | 20, 24, 20, 21, 22, 21, 21, 23, 22, 22, 25, 25, 24, 24, 26, 23, |
285 | 285 | 26, 27, 26, 26, 27, 27, 27, 27, 27, 28, 27, 27, 27, 27, 27, 26, |
286 | 286 | 30, |
287 | ] # noqa | |
287 | ] |
0 | 0 | [wheel] |
1 | 1 | universal = 1 |
2 | ||
3 | [flake8] | |
4 | max-complexity = 10 | |
5 | exclude = | |
6 | hpack/huffman_constants.py |
2 | 2 | Encoder, Decoder, encode_integer, decode_integer, _dict_to_iterable, |
3 | 3 | _to_bytes |
4 | 4 | ) |
5 | from hpack.exceptions import HPACKDecodingError, InvalidTableIndex | |
5 | from hpack.exceptions import ( | |
6 | HPACKDecodingError, InvalidTableIndex, OversizedHeaderListError | |
7 | ) | |
6 | 8 | from hpack.struct import HeaderTuple, NeverIndexedHeaderTuple |
7 | 9 | import itertools |
8 | 10 | import os |
627 | 629 | header = headers[0] |
628 | 630 | assert isinstance(header, NeverIndexedHeaderTuple) |
629 | 631 | |
632 | def test_max_header_list_size(self): | |
633 | """ | |
634 | If the header block is larger than the max_header_list_size, the HPACK | |
635 | decoder throws an OversizedHeaderListError. | |
636 | """ | |
637 | d = Decoder(max_header_list_size=44) | |
638 | data = b'\x14\x0c/sample/path' | |
639 | ||
640 | with pytest.raises(OversizedHeaderListError): | |
641 | d.decode(data) | |
642 | ||
630 | 643 | |
631 | 644 | class TestIntegerEncoding(object): |
632 | 645 | # These tests are stolen from the HPACK spec. |