Import upstream version 1.13.0+git20210519.1.8609d38
Debian Janitor
2 years ago
0 | 0 | Metadata-Version: 2.1 |
1 | 1 | Name: eliot |
2 | Version: 1.11.0 | |
2 | Version: 0+unknown | |
3 | 3 | Summary: Logging library that tells you why it happened |
4 | 4 | Home-page: https://github.com/itamarst/eliot/ |
5 | 5 | Maintainer: Itamar Turner-Trauring |
36 | 36 | |
37 | 37 | Eliot is only used to generate your logs; you will likely need tools like Logstash and ElasticSearch to aggregate and store logs if you are using multiple processes across multiple machines. |
38 | 38 | |
39 | Eliot supports Python 3.5, 3.6, 3.7, and 3.8, as well as PyPy3. | |
39 | Eliot supports Python 3.6, 3.7, 3.8, and 3.9, as well as PyPy3. | |
40 | 40 | It is maintained by Itamar Turner-Trauring, and released under the Apache 2.0 License. |
41 | 41 | |
42 | 42 | Python 2.7 is in legacy support mode, with the last release supported being 1.7; see `here <https://eliot.readthedocs.io/en/stable/python2.html>`_ for details. |
63 | 63 | Classifier: Operating System :: OS Independent |
64 | 64 | Classifier: Programming Language :: Python |
65 | 65 | Classifier: Programming Language :: Python :: 3 |
66 | Classifier: Programming Language :: Python :: 3.5 | |
67 | 66 | Classifier: Programming Language :: Python :: 3.6 |
68 | 67 | Classifier: Programming Language :: Python :: 3.7 |
69 | 68 | Classifier: Programming Language :: Python :: 3.8 |
69 | Classifier: Programming Language :: Python :: 3.9 | |
70 | 70 | Classifier: Programming Language :: Python :: Implementation :: CPython |
71 | 71 | Classifier: Programming Language :: Python :: Implementation :: PyPy |
72 | 72 | Classifier: Topic :: System :: Logging |
73 | Requires-Python: >=3.5.3 | |
73 | Requires-Python: >=3.6.0 | |
74 | Provides-Extra: dev | |
75 | Provides-Extra: journald | |
74 | 76 | Provides-Extra: test |
75 | Provides-Extra: journald | |
76 | Provides-Extra: dev |
28 | 28 | |
29 | 29 | Eliot is only used to generate your logs; you will likely need tools like Logstash and ElasticSearch to aggregate and store logs if you are using multiple processes across multiple machines. |
30 | 30 | |
31 | Eliot supports Python 3.5, 3.6, 3.7, and 3.8, as well as PyPy3. | |
31 | Eliot supports Python 3.6, 3.7, 3.8, and 3.9, as well as PyPy3. | |
32 | 32 | It is maintained by Itamar Turner-Trauring, and released under the Apache 2.0 License. |
33 | 33 | |
34 | 34 | Python 2.7 is in legacy support mode, with the last release supported being 1.7; see `here <https://eliot.readthedocs.io/en/stable/python2.html>`_ for details. |
232 | 232 | self.assertEqual(servers, [msg.message["server"] for msg in messages]) |
233 | 233 | |
234 | 234 | |
235 | Custom JSON encoding | |
236 | -------------------- | |
237 | ||
238 | Just as a ``FileDestination`` can use a custom JSON encoder, so can your tests, letting you validate your messages with that same encoder: | |
239 | ||
240 | .. code-block:: python | |
241 | ||
242 | from unittest import TestCase | |
243 | from eliot.json import EliotJSONEncoder | |
244 | from eliot.testing import capture_logging | |
245 | ||
246 | class MyClass: | |
247 | def __init__(self, x): | |
248 | self.x = x | |
249 | ||
250 | class MyEncoder(EliotJSONEncoder): | |
251 | def default(self, obj): | |
252 | if isinstance(obj, MyClass): | |
253 | return {"x": obj.x} | |
254 | return EliotJSONEncoder.default(self, obj) | |
255 | ||
256 | class LoggingTests(TestCase): | |
257 | @capture_logging(None, encoder_=MyEncoder) | |
258 | def test_logging(self, logger): | |
259 | # Logged messages will be validated using MyEncoder.... | |
260 | ... | |
261 | ||
262 | Notice that the trailing underscore in ``encoder_`` is deliberate: by default, keyword arguments are passed to the assertion function (the first argument to ``@capture_logging``), so it's named this way to indicate it's part of Eliot's API. | |
263 | ||
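The ``default()`` hook used in the example above is standard ``json.JSONEncoder`` behavior, independent of Eliot. A minimal stdlib sketch (``Point`` and ``PointEncoder`` are illustrative names, not part of Eliot's API):

```python
import json


class Point:
    """A custom object the stock JSON encoder can't serialize."""

    def __init__(self, x, y):
        self.x = x
        self.y = y


class PointEncoder(json.JSONEncoder):
    """Encoder that knows how to turn a Point into JSON."""

    def default(self, obj):
        if isinstance(obj, Point):
            return {"x": obj.x, "y": obj.y}
        # Fall back to the base class, which raises TypeError:
        return json.JSONEncoder.default(self, obj)


encoded = json.dumps({"location": Point(1, 2)}, cls=PointEncoder)
print(encoded)  # {"location": {"x": 1, "y": 2}}
```

Passing the encoder class via ``cls=`` is exactly what ``FileDestination(encoder=...)`` and ``@capture_logging(..., encoder_=...)`` arrange for you internally.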
235 | 264 | Custom testing setup |
236 | 265 | -------------------- |
237 | 266 |
27 | 27 | * **Start here:** :doc:`Quickstart documentation <quickstart>` |
28 | 28 | * Need help or have any questions? `File an issue <https://github.com/itamarst/eliot/issues/new>`_. |
29 | 29 | * Eliot is licensed under the `Apache 2.0 license <https://github.com/itamarst/eliot/blob/master/LICENSE>`_, and the source code is `available on GitHub <https://github.com/itamarst/eliot>`_. |
30 | * Eliot supports Python 3.8, 3.7, 3.6, and 3.5, as well as PyPy3. | |
30 | * Eliot supports Python 3.9, 3.8, 3.7, 3.6, and PyPy3. | |
31 | 31 | Python 2.7 is in legacy support mode (see :ref:`python2` for details). |
32 | 32 | * **Commercial support** is available from `Python⇒Speed <https://pythonspeed.com/services/#eliot>`_. |
33 | 33 | * Read on for the full documentation. |
0 | 0 | What's New |
1 | 1 | ========== |
2 | ||
3 | 1.13.0 | |
4 | ^^^^^^ | |
5 | ||
6 | Features: | |
7 | ||
8 | * ``@capture_logging`` and ``MemoryLogger`` now support specifying a custom JSON encoder. By default they now use Eliot's encoder. This means tests can now match the encoding used by a ``FileDestination``. | |
9 | * Added support for Python 3.9. | |
10 | ||
11 | Deprecation: | |
12 | ||
13 | * Python 3.5 is no longer supported. | |
14 | ||
15 | 1.12.0 | |
16 | ^^^^^^ | |
17 | ||
18 | Features: | |
19 | ||
20 | * Dask support now includes support for tracing logging of ``dask.persist()``, via wrapper API ``eliot.dask.persist_with_trace()``. | |
21 | ||
22 | Bug fixes: | |
23 | ||
24 | * Dask edge cases that previously weren't handled correctly should work better. | |
2 | 25 | |
3 | 26 | 1.11.0 |
4 | 27 | ^^^^^^ |
17 | 17 | |
18 | 18 | Run ``eliot-prettyprint --help`` to see the various formatting options; you can for example use a more compact one-message-per-line format. |
19 | 19 | |
20 | Additionally, the **highly recommended third-party `eliot-tree`_ tool** renders JSON-formatted Eliot messages into a tree visualizing the tasks' actions. | |
20 | Additionally, the **highly recommended** third-party `eliot-tree`_ tool renders JSON-formatted Eliot messages into a tree visualizing the tasks' actions. | |
21 | 21 | |
22 | 22 | |
23 | 23 | Filtering logs |
43 | 43 | * Ensure all worker processes write the Eliot logs to disk (if you're using the ``multiprocessing`` or ``distributed`` backends). |
44 | 44 | * If you're using multiple worker machines, aggregate all log files into a single place, so you can more easily analyze them with e.g. `eliot-tree <https://github.com/jonathanj/eliottree>`_. |
45 | 45 | * Replace ``dask.compute()`` with ``eliot.dask.compute_with_trace()``. |
46 | * Replace ``dask.persist()`` with ``eliot.dask.persist_with_trace()``. | |
46 | 47 | |
47 | In the following example, you can see how this works for a Dask run using ``distributed``, the recommended Dask scheduler. | |
48 | In the following example, you can see how this works for a Dask run using ``distributed``, the recommended Dask scheduler for more sophisticated use cases. | |
48 | 49 | We'll be using multiple worker processes, but only use a single machine: |
49 | 50 | |
50 | 51 | .. literalinclude:: ../../examples/dask_eliot.py |
261 | 261 | not mutate this list. |
262 | 262 | """ |
263 | 263 | |
264 | def __init__(self): | |
264 | def __init__(self, encoder=EliotJSONEncoder): | |
265 | """ | |
266 | @param encoder: A JSONEncoder subclass to use when encoding JSON. | |
267 | """ | |
265 | 268 | self._lock = Lock() |
269 | self._encoder = encoder | |
266 | 270 | self.reset() |
267 | 271 | |
268 | 272 | @exclusively |
343 | 347 | serializer.serialize(dictionary) |
344 | 348 | |
345 | 349 | try: |
346 | bytesjson.dumps(dictionary) | |
347 | pyjson.dumps(dictionary) | |
350 | pyjson.dumps(dictionary, cls=self._encoder) | |
348 | 351 | except Exception as e: |
349 | 352 | raise TypeError("Message %s doesn't encode to JSON: %s" % (dictionary, e)) |
350 | 353 | |
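The hunk above changes ``MemoryLogger.validate`` to run a message through the configured encoder and turn any failure into a ``TypeError``. A stdlib-only sketch of that validation-by-encoding step (``check_encodes`` is an illustrative stand-in, not Eliot's actual function):

```python
import json


def check_encodes(message, encoder=json.JSONEncoder):
    """Mimic the validation step: fail fast if a logged message
    would not survive JSON encoding with the given encoder."""
    try:
        json.dumps(message, cls=encoder)
    except Exception as e:
        raise TypeError("Message %s doesn't encode to JSON: %s" % (message, e))


check_encodes({"message_type": "type", "x": 12})  # encodes fine
try:
    check_encodes({"message_type": "type", "obj": object()})
except TypeError as e:
    print("caught:", e)
```

Catching the failure at test time, rather than when a real destination tries to serialize the message in production, is the point of doing this inside ``validate()``.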
461 | 464 | Add a destination that writes a JSON message per line to the given file. |
462 | 465 | |
463 | 466 | @param output_file: A file-like object. |
467 | ||
468 | @param encoder: A JSONEncoder subclass to use when encoding JSON. | |
464 | 469 | """ |
465 | 470 | Logger._destinations.add(FileDestination(file=output_file, encoder=encoder)) |
466 | 471 |
387 | 387 | this action's start message. |
388 | 388 | |
389 | 389 | @ivar successFields: A C{list} of L{Field} instances which can appear in |
390 | this action's succesful finish message. | |
390 | this action's successful finish message. | |
391 | 391 | |
392 | 392 | @ivar failureFields: A C{list} of L{Field} instances which can appear in |
393 | 393 | this action's failed finish message (in addition to the built-in |
7 | 7 | |
8 | 8 | version_json = ''' |
9 | 9 | { |
10 | "date": "2019-12-07T14:22:41-0500", | |
11 | "dirty": false, | |
12 | "error": null, | |
13 | "full-revisionid": "4ca0fa7519321aceec860e982123a5c448a9debd", | |
14 | "version": "1.11.0" | |
10 | "date": null, | |
11 | "dirty": null, | |
12 | "error": "unable to compute version", | |
13 | "full-revisionid": null, | |
14 | "version": "0+unknown" | |
15 | 15 | } |
16 | 16 | ''' # END VERSION_JSON |
17 | 17 |
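The ``version_json`` change above is versioneer's embedded blob: when the git metadata needed to compute a version is unavailable (as in this snapshot import), it falls back to placeholder values, which a plain ``json.loads`` recovers:

```python
import json

# The fallback blob versioneer writes when it cannot compute a
# version (values reproduced from the diff above):
version_json = '''
{
 "date": null,
 "dirty": null,
 "error": "unable to compute version",
 "full-revisionid": null,
 "version": "0+unknown"
}
'''

info = json.loads(version_json)
print(info["version"])  # 0+unknown
```

This is why the package metadata elsewhere in this diff reports ``Version: 0+unknown`` instead of a real release number.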
1 | 1 | |
2 | 2 | from pyrsistent import PClass, field |
3 | 3 | |
4 | from dask import compute, optimize | |
5 | from dask.core import toposort, get_dependencies | |
4 | from dask import compute, optimize, persist | |
5 | ||
6 | try: | |
7 | from dask.distributed import Future | |
8 | except: | |
9 | ||
10 | class Future(object): | |
11 | pass | |
12 | ||
13 | ||
14 | from dask.core import toposort, get_dependencies, ishashable | |
6 | 15 | from . import start_action, current_action, Action |
7 | 16 | |
8 | 17 | |
74 | 83 | return compute(*optimized, optimize_graph=False) |
75 | 84 | |
76 | 85 | |
86 | def persist_with_trace(*args): | |
87 | """Do Dask persist(), but with added Eliot tracing. | |
88 | ||
89 | Known issues: | |
90 | ||
91 | 1. Retries will confuse Eliot. Probably need different | |
92 | distributed-tree mechanism within Eliot to solve that. | |
93 | """ | |
94 | # 1. Create top-level Eliot Action: | |
95 | with start_action(action_type="dask:persist"): | |
96 | # In order to reduce logging verbosity, add logging to the already | |
97 | # optimized graph: | |
98 | optimized = optimize(*args, optimizations=[_add_logging]) | |
99 | return persist(*optimized, optimize_graph=False) | |
100 | ||
101 | ||
77 | 102 | def _add_logging(dsk, ignore=None): |
78 | 103 | """ |
79 | 104 | Add logging to a Dask graph. |
100 | 125 | key_names = {} |
101 | 126 | for key in keys: |
102 | 127 | value = dsk[key] |
103 | if not callable(value) and value in keys: | |
128 | if not callable(value) and ishashable(value) and value in keys: | |
104 | 129 | # It's an alias for another key: |
105 | 130 | key_names[key] = key_names[value] |
106 | 131 | else: |
107 | 132 | key_names[key] = simplify(key) |
108 | 133 | |
109 | # 2. Create Eliot child Actions for each key, in topological order: | |
110 | key_to_action_id = {key: str(ctx.serialize_task_id(), "utf-8") for key in keys} | |
134 | # Values in the graph can be either: | |
135 | # | |
136 | # 1. A list of other values. | |
137 | # 2. A tuple, where the first value might be a callable, aka a task. | |
138 | # 3. A literal of some sort. | |
139 | def maybe_wrap(key, value): | |
140 | if isinstance(value, list): | |
141 | return [maybe_wrap(key, v) for v in value] | |
142 | elif isinstance(value, tuple): | |
143 | func = value[0] | |
144 | args = value[1:] | |
145 | if not callable(func): | |
146 | # Not a callable, so nothing to wrap. | |
147 | return value | |
148 | wrapped_func = _RunWithEliotContext( | |
149 | task_id=str(ctx.serialize_task_id(), "utf-8"), | |
150 | func=func, | |
151 | key=key_names[key], | |
152 | dependencies=[key_names[k] for k in get_dependencies(dsk, key)], | |
153 | ) | |
154 | return (wrapped_func,) + args | |
155 | else: | |
156 | return value | |
111 | 157 | |
112 | # 3. Replace function with wrapper that logs appropriate Action: | |
158 | # Replace function with wrapper that logs appropriate Action; iterate in | |
159 | # topological order so action task levels are in reasonable order. | |
113 | 160 | for key in keys: |
114 | func = dsk[key][0] | |
115 | args = dsk[key][1:] | |
116 | if not callable(func): | |
117 | # This key is just an alias for another key, no need to add | |
118 | # logging: | |
119 | result[key] = dsk[key] | |
120 | continue | |
121 | wrapped_func = _RunWithEliotContext( | |
122 | task_id=key_to_action_id[key], | |
123 | func=func, | |
124 | key=key_names[key], | |
125 | dependencies=[key_names[k] for k in get_dependencies(dsk, key)], | |
126 | ) | |
127 | result[key] = (wrapped_func,) + tuple(args) | |
161 | result[key] = maybe_wrap(key, dsk[key]) | |
128 | 162 | |
129 | 163 | assert set(result.keys()) == set(dsk.keys()) |
130 | 164 | return result |
131 | 165 | |
132 | 166 | |
133 | __all__ = ["compute_with_trace"] | |
167 | __all__ = ["compute_with_trace", "persist_with_trace"] |
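The core transformation in ``_add_logging()`` above is: for each graph key, wrap the task's callable so that executing it records context, while leaving literals and aliases untouched. A stdlib-only sketch of that rule, with a closure and a plain list standing in for ``_RunWithEliotContext`` and Eliot's actions (all names here are illustrative):

```python
# A Dask-style graph maps keys to literals, aliases, or task tuples
# of (callable, *args). Wrap only the callables.
log = []


def wrap(key, func):
    def wrapper(*args):
        log.append(key)  # stand-in for the eliot:remote_task action
        return func(*args)
    return wrapper


def add_logging(dsk):
    result = {}
    for key, value in dsk.items():
        if isinstance(value, tuple) and callable(value[0]):
            result[key] = (wrap(key, value[0]),) + value[1:]
        else:
            # Literal or alias: nothing to wrap.
            result[key] = value
    return result


def mult(x):
    return x * 4


graph = {"a": 1, "b": (mult, 2)}
wrapped = add_logging(graph)
func, arg = wrapped["b"]
print(func(arg), log)  # 8 ['b']
```

The real implementation additionally recurses into lists (case 1 in the comments above) and serializes an Eliot task ID per key so the remote worker can reconstruct the action tree; this sketch only shows the wrapping rule itself.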
19 | 19 | from ._message import MESSAGE_TYPE_FIELD, TASK_LEVEL_FIELD, TASK_UUID_FIELD |
20 | 20 | from ._output import MemoryLogger |
21 | 21 | from . import _output |
22 | from .json import EliotJSONEncoder | |
22 | 23 | |
23 | 24 | COMPLETED_STATUSES = (FAILED_STATUS, SUCCEEDED_STATUS) |
24 | 25 | |
297 | 298 | return previous_logger |
298 | 299 | |
299 | 300 | |
300 | def validateLogging(assertion, *assertionArgs, **assertionKwargs): | |
301 | def validateLogging( | |
302 | assertion, *assertionArgs, encoder_=EliotJSONEncoder, **assertionKwargs | |
303 | ): | |
301 | 304 | """ |
302 | 305 | Decorator factory for L{unittest.TestCase} methods to add logging |
303 | 306 | validation. |
329 | 332 | |
330 | 333 | @param assertionKwargs: Additional keyword arguments to pass to |
331 | 334 | C{assertion}. |
335 | ||
336 | @param encoder_: C{json.JSONEncoder} subclass to use when validating JSON. | |
332 | 337 | """ |
333 | 338 | |
334 | 339 | def decorator(function): |
336 | 341 | def wrapper(self, *args, **kwargs): |
337 | 342 | skipped = False |
338 | 343 | |
339 | kwargs["logger"] = logger = MemoryLogger() | |
344 | kwargs["logger"] = logger = MemoryLogger(encoder=encoder_) | |
340 | 345 | self.addCleanup(check_for_errors, logger) |
341 | 346 | # TestCase runs cleanups in reverse order, and we want this to |
342 | 347 | # run *before* tracebacks are checked: |
360 | 365 | validate_logging = validateLogging |
361 | 366 | |
362 | 367 | |
363 | def capture_logging(assertion, *assertionArgs, **assertionKwargs): | |
368 | def capture_logging( | |
369 | assertion, *assertionArgs, encoder_=EliotJSONEncoder, **assertionKwargs | |
370 | ): | |
364 | 371 | """ |
365 | 372 | Capture and validate all logging that doesn't specify a L{Logger}. |
366 | 373 | |
368 | 375 | """ |
369 | 376 | |
370 | 377 | def decorator(function): |
371 | @validate_logging(assertion, *assertionArgs, **assertionKwargs) | |
378 | @validate_logging( | |
379 | assertion, *assertionArgs, encoder_=encoder_, **assertionKwargs | |
380 | ) | |
372 | 381 | @wraps(function) |
373 | 382 | def wrapper(self, *args, **kwargs): |
374 | 383 | logger = kwargs["logger"] |
2 | 2 | """ |
3 | 3 | |
4 | 4 | from io import BytesIO |
5 | from json import JSONEncoder | |
6 | ||
7 | ||
8 | class CustomObject(object): | |
9 | """Gets encoded to JSON.""" | |
10 | ||
11 | ||
12 | class CustomJSONEncoder(JSONEncoder): | |
13 | """JSONEncoder that knows about L{CustomObject}.""" | |
14 | ||
15 | def default(self, o): | |
16 | if isinstance(o, CustomObject): | |
17 | return "CUSTOM!" | |
18 | return JSONEncoder.default(self, o) | |
5 | 19 | |
6 | 20 | |
7 | 21 | class FakeSys(object): |
2 | 2 | from unittest import TestCase, skipUnless |
3 | 3 | |
4 | 4 | from ..testing import capture_logging, LoggedAction, LoggedMessage |
5 | from .. import start_action, Message | |
5 | from .. import start_action, log_message | |
6 | 6 | |
7 | 7 | try: |
8 | 8 | import dask |
9 | 9 | from dask.bag import from_sequence |
10 | from dask.distributed import Client | |
11 | import dask.dataframe as dd | |
12 | import pandas as pd | |
10 | 13 | except ImportError: |
11 | 14 | dask = None |
12 | 15 | else: |
13 | from ..dask import compute_with_trace, _RunWithEliotContext, _add_logging | |
16 | from ..dask import ( | |
17 | compute_with_trace, | |
18 | _RunWithEliotContext, | |
19 | _add_logging, | |
20 | persist_with_trace, | |
21 | ) | |
14 | 22 | |
15 | 23 | |
16 | 24 | @skipUnless(dask, "Dask not available.") |
27 | 35 | bag = bag.fold(lambda x, y: x + y) |
28 | 36 | self.assertEqual(dask.compute(bag), compute_with_trace(bag)) |
29 | 37 | |
38 | def test_future(self): | |
39 | """compute_with_trace() can handle Futures.""" | |
40 | client = Client(processes=False) | |
41 | self.addCleanup(client.shutdown) | |
42 | [bag] = dask.persist(from_sequence([1, 2, 3])) | |
43 | bag = bag.map(lambda x: x * 5) | |
44 | result = dask.compute(bag) | |
45 | self.assertEqual(result, ([5, 10, 15],)) | |
46 | self.assertEqual(result, compute_with_trace(bag)) | |
47 | ||
48 | def test_persist_result(self): | |
49 | """persist_with_trace() runs the same logic as dask.persist().""" | |
50 | client = Client(processes=False) | |
51 | self.addCleanup(client.shutdown) | |
52 | bag = from_sequence([1, 2, 3]) | |
53 | bag = bag.map(lambda x: x * 7) | |
54 | self.assertEqual( | |
55 | [b.compute() for b in dask.persist(bag)], | |
56 | [b.compute() for b in persist_with_trace(bag)], | |
57 | ) | |
58 | ||
59 | def test_persist_pandas(self): | |
60 | """persist_with_trace() with a Pandas dataframe. | |
61 | ||
62 | This ensures we don't blow up, as used to happen. | |
63 | """ | |
64 | df = pd.DataFrame() | |
65 | df = dd.from_pandas(df, npartitions=1) | |
66 | persist_with_trace(df) | |
67 | ||
30 | 68 | @capture_logging(None) |
31 | def test_logging(self, logger): | |
69 | def test_persist_logging(self, logger): | |
70 | """persist_with_trace() preserves Eliot context.""" | |
71 | ||
72 | def persister(bag): | |
73 | [bag] = persist_with_trace(bag) | |
74 | return dask.compute(bag) | |
75 | ||
76 | self.assert_logging(logger, persister, "dask:persist") | |
77 | ||
78 | @capture_logging(None) | |
79 | def test_compute_logging(self, logger): | |
32 | 80 | """compute_with_trace() preserves Eliot context.""" |
81 | self.assert_logging(logger, compute_with_trace, "dask:compute") | |
82 | ||
83 | def assert_logging(self, logger, run_with_trace, top_action_name): | |
84 | """Utility function for _with_trace() logging tests.""" | |
33 | 85 | |
34 | 86 | def mult(x): |
35 | Message.log(message_type="mult") | |
87 | log_message(message_type="mult") | |
36 | 88 | return x * 4 |
37 | 89 | |
38 | 90 | def summer(x, y): |
39 | Message.log(message_type="finally") | |
91 | log_message(message_type="finally") | |
40 | 92 | return x + y |
41 | 93 | |
42 | 94 | bag = from_sequence([1, 2]) |
43 | 95 | bag = bag.map(mult).fold(summer) |
44 | 96 | with start_action(action_type="act1"): |
45 | compute_with_trace(bag) | |
97 | run_with_trace(bag) | |
46 | 98 | |
47 | 99 | [logged_action] = LoggedAction.ofType(logger.messages, "act1") |
48 | 100 | self.assertEqual( |
50 | 102 | { |
51 | 103 | "act1": [ |
52 | 104 | { |
53 | "dask:compute": [ | |
105 | top_action_name: [ | |
54 | 106 | {"eliot:remote_task": ["dask:task", "mult"]}, |
55 | 107 | {"eliot:remote_task": ["dask:task", "mult"]}, |
56 | 108 | {"eliot:remote_task": ["dask:task", "finally"]}, |
61 | 113 | ) |
62 | 114 | |
63 | 115 | # Make sure dependencies are tracked: |
64 | mult1_msg, mult2_msg, final_msg = LoggedMessage.ofType( | |
65 | logger.messages, "dask:task" | |
66 | ) | |
116 | ( | |
117 | mult1_msg, | |
118 | mult2_msg, | |
119 | final_msg, | |
120 | ) = LoggedMessage.ofType(logger.messages, "dask:task") | |
67 | 121 | self.assertEqual( |
68 | 122 | sorted(final_msg.message["dependencies"]), |
69 | 123 | sorted([mult1_msg.message["key"], mult2_msg.message["key"]]), |
81 | 135 | @skipUnless(dask, "Dask not available.") |
82 | 136 | class AddLoggingTests(TestCase): |
83 | 137 | """Tests for _add_logging().""" |
138 | ||
139 | maxDiff = None | |
84 | 140 | |
85 | 141 | def test_add_logging_to_full_graph(self): |
86 | 142 | """_add_logging() recreates Dask graph with wrappers.""" |
103 | 159 | logging_removed[key] = value |
104 | 160 | |
105 | 161 | self.assertEqual(logging_removed, graph) |
162 | ||
163 | def test_add_logging_explicit(self): | |
164 | """_add_logging() on more edge cases of the graph.""" | |
165 | ||
166 | def add(s): | |
167 | return s + "s" | |
168 | ||
169 | def add2(s): | |
170 | return s + "s" | |
171 | ||
172 | # b runs first, then d, then a and c. | |
173 | graph = { | |
174 | "a": "d", | |
175 | "d": [1, 2, (add, "b")], | |
176 | ("b", 0): 1, | |
177 | "c": (add2, "d"), | |
178 | } | |
179 | ||
180 | with start_action(action_type="bleh") as action: | |
181 | task_id = action.task_uuid | |
182 | self.assertEqual( | |
183 | _add_logging(graph), | |
184 | { | |
185 | "d": [ | |
186 | 1, | |
187 | 2, | |
188 | ( | |
189 | _RunWithEliotContext( | |
190 | task_id=task_id + "@/2", | |
191 | func=add, | |
192 | key="d", | |
193 | dependencies=["b"], | |
194 | ), | |
195 | "b", | |
196 | ), | |
197 | ], | |
198 | "a": "d", | |
199 | ("b", 0): 1, | |
200 | "c": ( | |
201 | _RunWithEliotContext( | |
202 | task_id=task_id + "@/3", | |
203 | func=add2, | |
204 | key="c", | |
205 | dependencies=["d"], | |
206 | ), | |
207 | "d", | |
208 | ), | |
209 | }, | |
210 | ) |
31 | 31 | from .._validation import ValidationError, Field, _MessageSerializer |
32 | 32 | from .._traceback import write_traceback |
33 | 33 | from ..testing import assertContainsFields |
34 | from .common import CustomObject, CustomJSONEncoder | |
34 | 35 | |
35 | 36 | |
36 | 37 | class MemoryLoggerTests(TestCase): |
120 | 121 | {"message_type": "type", "foo": "will become object()"}, serializer |
121 | 122 | ) |
122 | 123 | self.assertRaises(TypeError, logger.validate) |
124 | ||
125 | @skipUnless(np, "NumPy is not installed.") | |
126 | def test_EliotJSONEncoder(self): | |
127 | """ | |
128 | L{MemoryLogger.validate} uses the EliotJSONEncoder by default to do | |
129 | encoding testing. | |
130 | """ | |
131 | logger = MemoryLogger() | |
132 | logger.write({"message_type": "type", "foo": np.uint64(12)}, None) | |
133 | logger.validate() | |
134 | ||
135 | def test_JSON_custom_encoder(self): | |
136 | """ | |
137 | L{MemoryLogger.validate} will use a custom JSON encoder if one was given. | |
138 | """ | |
139 | logger = MemoryLogger(encoder=CustomJSONEncoder) | |
140 | logger.write( | |
141 | {"message_type": "type", "custom": CustomObject()}, | |
142 | None, | |
143 | ) | |
144 | logger.validate() | |
123 | 145 | |
124 | 146 | def test_serialize(self): |
125 | 147 | """ |
3 | 3 | |
4 | 4 | from __future__ import unicode_literals |
5 | 5 | |
6 | from unittest import SkipTest, TestResult, TestCase | |
6 | from unittest import SkipTest, TestResult, TestCase, skipUnless | |
7 | ||
8 | try: | |
9 | import numpy as np | |
10 | except ImportError: | |
11 | np = None | |
7 | 12 | |
8 | 13 | from ..testing import ( |
9 | 14 | issuperset, |
24 | 29 | from .._message import Message |
25 | 30 | from .._validation import ActionType, MessageType, ValidationError, Field |
26 | 31 | from .._traceback import write_traceback |
27 | from .. import add_destination, remove_destination, _output | |
32 | from .. import add_destination, remove_destination, _output, log_message | |
33 | from .common import CustomObject, CustomJSONEncoder | |
28 | 34 | |
29 | 35 | |
30 | 36 | class IsSuperSetTests(TestCase): |
739 | 745 | ) |
740 | 746 | |
741 | 747 | |
748 | class JSONEncodingTests(TestCase): | |
749 | """Tests for L{capture_logging} JSON encoder support.""" | |
750 | ||
751 | @skipUnless(np, "NumPy is not installed.") | |
752 | @capture_logging(None) | |
753 | def test_default_JSON_encoder(self, logger): | |
754 | """ | |
755 | L{capture_logging} validates using L{EliotJSONEncoder} by default. | |
756 | """ | |
757 | # Default JSON encoder can't handle NumPy: | |
758 | log_message(message_type="hello", number=np.uint32(12)) | |
759 | ||
760 | @capture_logging(None, encoder_=CustomJSONEncoder) | |
761 | def test_custom_JSON_encoder(self, logger): | |
762 | """ | |
763 | L{capture_logging} can be called with a custom JSON encoder, which is then | |
764 | used for validation. | |
765 | """ | |
766 | # Default JSON encoder can't handle this custom object: | |
767 | log_message(message_type="hello", object=CustomObject()) | |
768 | ||
769 | ||
742 | 770 | MESSAGE1 = MessageType( |
743 | 771 | "message1", [Field.forTypes("x", [int], "A number")], "A message for testing." |
744 | 772 | ) |
0 | 0 | Metadata-Version: 2.1 |
1 | 1 | Name: eliot |
2 | Version: 1.11.0 | |
2 | Version: 0+unknown | |
3 | 3 | Summary: Logging library that tells you why it happened |
4 | 4 | Home-page: https://github.com/itamarst/eliot/ |
5 | 5 | Maintainer: Itamar Turner-Trauring |
36 | 36 | |
37 | 37 | Eliot is only used to generate your logs; you will likely need tools like Logstash and ElasticSearch to aggregate and store logs if you are using multiple processes across multiple machines. |
38 | 38 | |
39 | Eliot supports Python 3.5, 3.6, 3.7, and 3.8, as well as PyPy3. | |
39 | Eliot supports Python 3.6, 3.7, 3.8, and 3.9, as well as PyPy3. | |
40 | 40 | It is maintained by Itamar Turner-Trauring, and released under the Apache 2.0 License. |
41 | 41 | |
42 | 42 | Python 2.7 is in legacy support mode, with the last release supported being 1.7; see `here <https://eliot.readthedocs.io/en/stable/python2.html>`_ for details. |
63 | 63 | Classifier: Operating System :: OS Independent |
64 | 64 | Classifier: Programming Language :: Python |
65 | 65 | Classifier: Programming Language :: Python :: 3 |
66 | Classifier: Programming Language :: Python :: 3.5 | |
67 | 66 | Classifier: Programming Language :: Python :: 3.6 |
68 | 67 | Classifier: Programming Language :: Python :: 3.7 |
69 | 68 | Classifier: Programming Language :: Python :: 3.8 |
69 | Classifier: Programming Language :: Python :: 3.9 | |
70 | 70 | Classifier: Programming Language :: Python :: Implementation :: CPython |
71 | 71 | Classifier: Programming Language :: Python :: Implementation :: PyPy |
72 | 72 | Classifier: Topic :: System :: Logging |
73 | Requires-Python: >=3.5.3 | |
73 | Requires-Python: >=3.6.0 | |
74 | Provides-Extra: dev | |
75 | Provides-Extra: journald | |
74 | 76 | Provides-Extra: test |
75 | Provides-Extra: journald | |
76 | Provides-Extra: dev |
0 | boltons>=19.0.1 | |
1 | pyrsistent>=0.11.8 | |
0 | 2 | six |
1 | 3 | zope.interface |
2 | pyrsistent>=0.11.8 | |
3 | boltons>=19.0.1 | |
4 | 4 | |
5 | 5 | [:python_version < "3.7" and python_version > "2.7"] |
6 | 6 | aiocontextvars |
7 | 7 | |
8 | 8 | [dev] |
9 | black | |
10 | coverage | |
11 | flake8 | |
9 | 12 | setuptools>=40 |
10 | twine>=1.12.1 | |
11 | coverage | |
12 | 13 | sphinx |
13 | 14 | sphinx_rtd_theme |
14 | flake8 | |
15 | black | |
15 | twine>=1.12.1 | |
16 | 16 | |
17 | 17 | [journald] |
18 | 18 | cffi>=1.1.2 |
19 | 19 | |
20 | 20 | [test] |
21 | 21 | hypothesis>=1.14.0 |
22 | pytest | |
23 | pytest-xdist | |
22 | 24 | testtools |
23 | pytest |
17 | 17 | "Operating System :: OS Independent", |
18 | 18 | "Programming Language :: Python", |
19 | 19 | "Programming Language :: Python :: 3", |
20 | "Programming Language :: Python :: 3.5", | |
21 | 20 | "Programming Language :: Python :: 3.6", |
22 | 21 | "Programming Language :: Python :: 3.7", |
23 | 22 | "Programming Language :: Python :: 3.8", |
23 | "Programming Language :: Python :: 3.9", | |
24 | 24 | "Programming Language :: Python :: Implementation :: CPython", |
25 | 25 | "Programming Language :: Python :: Implementation :: PyPy", |
26 | 26 | "Topic :: System :: Logging", |
29 | 29 | version=versioneer.get_version(), |
30 | 30 | cmdclass=versioneer.get_cmdclass(), |
31 | 31 | description="Logging library that tells you why it happened", |
32 | python_requires=">=3.5.3", | |
32 | python_requires=">=3.6.0", | |
33 | 33 | install_requires=[ |
34 | 34 | # Python 3 compatibility: |
35 | 35 | "six", |
53 | 53 | # Tasteful testing for Python: |
54 | 54 | "testtools", |
55 | 55 | "pytest", |
56 | "pytest-xdist", | |
56 | 57 | ], |
57 | 58 | "dev": [ |
58 | 59 | # Ensure we can do python_requires correctly: |