python-dogpile.cache / 9909d9a
Merge tag '0.5.3' into debian/unstable Thomas Goirand 10 years ago
25 changed file(s) with 786 addition(s) and 149 deletion(s).
00 *.pyc
1 build/
1 /build
22 dist/
33 docs/build/output/
44 *.orig
55 alembic.ini
66 tox.ini
77 .venv
8 .egg-info
8 *.egg
9 *.egg-info
910 .coverage
0 Copyright (c) 2011-2013 Mike Bayer
0 Copyright (c) 2011-2014 Mike Bayer
11
22 All rights reserved.
33
1313 =============
1414
1515 See the section :ref:`creating_backends` for details on how to
16 register new backends or :ref:`changing_backend_behavior` for details on
17 how to alter the behavior of existing backends.
16 register new backends or :ref:`changing_backend_behavior` for details on
17 how to alter the behavior of existing backends.
1818
1919 .. automodule:: dogpile.cache.api
2020 :members:
21
21
2222
2323 Backends
2424 ==========
3434
3535 .. automodule:: dogpile.cache.backends.file
3636 :members:
37
37
3838 .. automodule:: dogpile.cache.proxy
3939 :members:
40
41
40
41
4242 Plugins
4343 ========
4444
00 ==============
11 Changelog
22 ==============
3 .. changelog::
4 :version: 0.5.3
5 :released: Wed Jan 8 2014
6
7 .. change::
8 :tags: bug
9 :pullreq: 10
10
11	Fixed bug where the key_mangler would get in the way of the
12	async_creation_runner feature within the :meth:`.Region.get_or_create`
13	method, by sending in the mangled key instead of the original key. The
14	"mangled" key is only supposed to be exposed within the backend storage,
15	not to the creation function, which sends the key back into
16	:meth:`.Region.set`; that method does the mangling itself. Pull request courtesy Ryan Kolak.
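    As a quick sketch of the corrected behavior (the backend and mangler here
    are illustrative, not part of the fix itself), the runner now receives
    the original key, while the mangled form stays internal to the backend::

        from dogpile.cache import make_region

        def runner(region, key, creator, mutex):
            # ``key`` is the original key, not the mangled form;
            # set() applies the key_mangler itself
            try:
                region.set(key, creator())
            finally:
                mutex.release()

        region = make_region(
            key_mangler=lambda key: "mangled:" + key,
            async_creation_runner=runner,
        ).configure("dogpile.cache.memory")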
17
18 .. change::
19 :tags: bug, py3k
20
21 Fixed bug where the :meth:`.Region.get_multi` method wasn't calling
22 the backend correctly in Py3K (e.g. was passing a destructive ``map()``
23 object) which would cause this method to fail on the memcached backend.
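    For background, ``map()`` under Python 3 returns a one-shot iterator
    rather than a list, so anything that iterates it twice sees nothing
    the second time::

        >>> keys = map(str.upper, ["a", "b"])
        >>> list(keys)
        ['A', 'B']
        >>> list(keys)  # the iterator is already exhausted
        []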
24
25 .. change::
26 :tags: feature
27 :tickets: 55
28
29 Added a ``get()`` method to complement the ``set()``, ``invalidate()``
30 and ``refresh()`` methods established on functions decorated by
31 :meth:`.CacheRegion.cache_on_arguments` and
32 :meth:`.CacheRegion.cache_multi_on_arguments`. Pullreq courtesy
33 Eric Hanchrow.
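    A minimal sketch of the new method on a decorated function, assuming
    an already-configured region::

        @region.cache_on_arguments()
        def generate_something(x, y):
            return x + y

        generate_something(5, 6)      # computes and caches 11
        generate_something.get(5, 6)  # 11, straight from the cache
        generate_something.get(7, 8)  # the NO_VALUE token if nothing is cached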
34
35 .. change::
36 :tags: feature
37 :tickets: 51
38 :pullreq: 11
39
40 Added a new variant on :class:`.MemoryBackend`, :class:`.MemoryPickleBackend`.
41 This backend applies ``pickle.dumps()`` and ``pickle.loads()`` to cached
42	values upon set and get, so that copy-on-cache behavior similar to that
43	of other backends is employed, guarding cached values against subsequent
44 in-memory state changes. Pullreq courtesy Jonathan Vanasco.
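    The copy-on-cache behavior can be seen directly; a short sketch,
    assuming the new backend is configured::

        region = make_region().configure("dogpile.cache.memory_pickle")

        data = {"count": 1}
        region.set("somekey", data)
        data["count"] = 2        # mutate the original object
        region.get("somekey")    # {'count': 1} - the cached copy is unaffected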
45
46 .. change::
47 :tags: bug
48 :pullreq: 9
49
50	Fixed a format call in the Redis backend which would otherwise fail
51 on Python 2.6; courtesy Jeff Dairiki.
52
53 .. changelog::
54 :version: 0.5.2
55 :released: Fri Nov 15 2013
56
57 .. change::
58 :tags: bug
59
60	Fixes to routines on Windows, including ensuring that the default
61	unit tests pass, and an adjustment to the "soft expiration" feature
62	so that expiration works given the coarse resolution of ``time.time()`` on Windows.
63
64 .. change::
65 :tags: bug
66
67	Added py2.6 compatibility for region.py's use of
68	``timedelta.total_seconds()``, which is unavailable in that version.
69
70 .. change::
71 :tags: feature
72 :tickets: 44
73
74 Added a new argument ``lock_factory`` to the :class:`.DBMBackend`
75 implementation. This allows for drop-in replacement of the default
76	:class:`.FileLock` backend, which builds on ``fcntl.flock()`` and only
77 supports Unix platforms. A new abstract base :class:`.AbstractFileLock`
78 has been added to provide a common base for custom lock implementations.
79 The documentation points to an example thread-based rw lock which is
80 now tested on Windows.
81
82 .. changelog::
83 :version: 0.5.1
84 :released: Thu Oct 10 2013
85
86 .. change::
87 :tags: feature
88 :tickets: 38
89
90 The :meth:`.CacheRegion.invalidate` method now supports an option
91 ``hard=True|False``. A "hard" invalidation, equivalent to the
92 existing functionality of :meth:`.CacheRegion.invalidate`, means
93 :meth:`.CacheRegion.get_or_create` will not return the "old" value at
94 all, forcing all getters to regenerate or wait for a regeneration.
95 "soft" invalidation means that getters can continue to return the
96 old value until a new one is generated.
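    In API terms the distinction is just the new flag; note that the
    "soft" form requires the region (or method) to have a non-None
    expiration time::

        region.invalidate(hard=True)   # getters regenerate or wait; no old value served
        region.invalidate(hard=False)  # getters may serve the old value until regenerated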
97
98 .. change::
99 :tags: feature
100 :tickets: 40
101
102 New dogpile-specific exception classes have been added, so that
103	issues like "region already configured" and "region unconfigured"
104 raise dogpile-specific exceptions. Other exception classes have
105 been made more specific. Also added new accessor
106 :attr:`.CacheRegion.is_configured`. Pullreq courtesy Morgan Fainberg.
107
108 .. change::
109 :tags: bug
110
111	The Redis backend now uses ``pickle.HIGHEST_PROTOCOL`` for the
112	``set_multi()`` method as well when producing pickles; this was
113	erroneously missed when the same change was made for ``set()``
114	in 0.5.0. Courtesy Ɓukasz Fidosz.
115
116 .. change::
117 :tags: bug, redis, py3k
118 :tickets: 39
119
120	Fixed an errant ``u''`` causing incompatibility in Python 3.2
121 in the Redis backend, courtesy Jimmey Mabey.
122
123 .. change::
124 :tags: bug
125
126	The :func:`.util.coerce_string_conf` function now correctly coerces
127 negative integers and those with a leading + sign. This previously
128 prevented configuring a :class:`.CacheRegion` with an ``expiration_time``
129 of ``'-1'``. Courtesy David Beitey.
130
131 .. change::
132 :tags: bug
133
134 The ``refresh()`` method on :meth:`.CacheRegion.cache_multi_on_arguments`
135 now supports the ``asdict`` flag.
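    A brief sketch of the flag in use (function and keys here are
    illustrative); with ``asdict=True`` the decorated function returns a
    dictionary keyed by its arguments, and ``refresh()`` now does the same::

        @region.cache_multi_on_arguments(asdict=True)
        def load_values(*keys):
            return dict((key, "value " + key) for key in keys)

        load_values("k1", "k2")          # {'k1': 'value k1', 'k2': 'value k2'}
        load_values.refresh("k1", "k2")  # regenerates, re-caches, returns the dict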
136
3137 .. changelog::
4138 :version: 0.5.0
5139 :released: Fri Jun 21 2013
5050
5151 # General information about the project.
5252 project = u'dogpile.cache'
53 copyright = u'2011-2013 Mike Bayer'
53 copyright = u'2011-2014 Mike Bayer'
5454
5555 # The version info for the project you're documenting, acts as replacement for
5656 # |version| and |release|, also used in various other places throughout the
172172
173173 from dogpile.cache import make_region
174174
175 region = make_region("dictionary")
175 region = make_region("myregion")
176
177 region.configure("dictionary")
176178
177179 data = region.set("somekey", "somevalue")
178180
0 __version__ = '0.5.0'
0 __version__ = '0.5.3'
11
22 from .region import CacheRegion, register_backend, make_region
44 register_backend("dogpile.cache.bmemcached", "dogpile.cache.backends.memcached", "BMemcachedBackend")
55 register_backend("dogpile.cache.memcached", "dogpile.cache.backends.memcached", "MemcachedBackend")
66 register_backend("dogpile.cache.memory", "dogpile.cache.backends.memory", "MemoryBackend")
7 register_backend("dogpile.cache.memory_pickle", "dogpile.cache.backends.memory", "MemoryPickleBackend")
78 register_backend("dogpile.cache.redis", "dogpile.cache.backends.redis", "RedisBackend")
1111 from dogpile.cache import compat
1212 from dogpile.cache import util
1313 import os
14 import fcntl
15
16 __all__ = 'DBMBackend', 'FileLock'
14
15 __all__ = 'DBMBackend', 'FileLock', 'AbstractFileLock'
1716
1817 class DBMBackend(CacheBackend):
1918 """A file-backend using a dbm file to store keys.
4746 concurrent writes, the other is to coordinate
4847 value creation (i.e. the dogpile lock). By default,
4948 these lockfiles use the ``flock()`` system call
50 for locking; this is only available on Unix
51 platforms.
49 for locking; this is **only available on Unix
50 platforms**. An alternative lock implementation, such as one
51 which is based on threads or uses a third-party system
52 such as `portalocker <https://pypi.python.org/pypi/portalocker>`_,
53 can be dropped in using the ``lock_factory`` argument
54 in conjunction with the :class:`.AbstractFileLock` base class.
5255
5356 Currently, the dogpile lock is against the entire
5457 DBM file, not per key. This means there can
7982 suffix ".dogpile.lock" to the DBM filename. If
8083 False, then dogpile.cache uses the default dogpile
8184 lock, a plain thread-based mutex.
85 :param lock_factory: a function or class which provides
86 for a read/write lock. Defaults to :class:`.FileLock`.
87 Custom implementations need to implement context-manager
88 based ``read()`` and ``write()`` functions - the
89 :class:`.AbstractFileLock` class is provided as a base class
90 which provides these methods based on individual read/write lock
91 functions. E.g. to replace the lock with the dogpile.core
92 :class:`.ReadWriteMutex`::
93
94 from dogpile.core.readwrite_lock import ReadWriteMutex
95 from dogpile.cache.backends.file import AbstractFileLock
96
97 class MutexLock(AbstractFileLock):
98 def __init__(self, filename):
99 self.mutex = ReadWriteMutex()
100
101 def acquire_read_lock(self, wait):
102 ret = self.mutex.acquire_read_lock(wait)
103 return wait or ret
104
105 def acquire_write_lock(self, wait):
106 ret = self.mutex.acquire_write_lock(wait)
107 return wait or ret
108
109 def release_read_lock(self):
110 return self.mutex.release_read_lock()
111
112 def release_write_lock(self):
113 return self.mutex.release_write_lock()
114
115 from dogpile.cache import make_region
116
117 region = make_region().configure(
118 "dogpile.cache.dbm",
119 expiration_time=300,
120 arguments={
121 "filename": "file.dbm",
122 "lock_factory": MutexLock
123 }
124 )
125
126	While the included :class:`.FileLock` uses ``fcntl.flock()``, a
127 windows-compatible implementation can be built using a library
128 such as `portalocker <https://pypi.python.org/pypi/portalocker>`_.
129
130 .. versionadded:: 0.5.2
131
82132
83133
84134 """
88138 )
89139 dir_, filename = os.path.split(self.filename)
90140
141 self.lock_factory = arguments.get("lock_factory", FileLock)
91142 self._rw_lock = self._init_lock(
92143 arguments.get('rw_lockfile'),
93144 ".rw.lock", dir_, filename)
107158
108159 def _init_lock(self, argument, suffix, basedir, basefile, wrapper=None):
109160 if argument is None:
110 lock = FileLock(os.path.join(basedir, basefile + suffix))
161 lock = self.lock_factory(os.path.join(basedir, basefile + suffix))
111162 elif argument is not False:
112 lock = FileLock(
163 lock = self.lock_factory(
113164 os.path.abspath(
114165 os.path.normpath(argument)
115166 ))
203254 except KeyError:
204255 pass
205256
206 class FileLock(object):
207 """Use lockfiles to coordinate read/write access to a file.
208
209 Only works on Unix systems, using
210 `fcntl.flock() <http://docs.python.org/library/fcntl.html>`_.
257 class AbstractFileLock(object):
258 """Coordinate read/write access to a file.
259
260	Typically this is a file-based lock, but it doesn't have to be.
261
262 The default implementation here is :class:`.FileLock`.
263
264 Implementations should provide the following methods::
265
266 * __init__()
267 * acquire_read_lock()
268 * acquire_write_lock()
269 * release_read_lock()
270 * release_write_lock()
271
272	The ``__init__()`` method accepts a single argument "filename", which
273	may be used as the "lock file" for those implementations that
274	use one.
275
276	Note that a lock used in a multithreaded environment must itself be
277	thread-safe. The recommended approach for file-descriptor-based
278 locks is to use a Python ``threading.local()`` so that a unique file descriptor
279 is held per thread. See the source code of :class:`.FileLock` for an
280 implementation example.
281
211282
212283 """
213284
214285 def __init__(self, filename):
215 self._filedescriptor = compat.threading.local()
216 self.filename = filename
286 """Constructor, is given the filename of a potential lockfile.
287
288 The usage of this filename is optional and no file is
289 created by default.
290
291 Raises ``NotImplementedError`` by default, must be
292 implemented by subclasses.
293 """
294 raise NotImplementedError()
217295
218296 def acquire(self, wait=True):
297 """Acquire the "write" lock.
298
299 This is a direct call to :meth:`.AbstractFileLock.acquire_write_lock`.
300
301 """
219302 return self.acquire_write_lock(wait)
220303
221304 def release(self):
305 """Release the "write" lock.
306
307 This is a direct call to :meth:`.AbstractFileLock.release_write_lock`.
308
309 """
222310 self.release_write_lock()
223
224 @property
225 def is_open(self):
226 return hasattr(self._filedescriptor, 'fileno')
227311
228312 @contextmanager
229313 def read(self):
314 """Provide a context manager for the "read" lock.
315
316 This method makes use of :meth:`.AbstractFileLock.acquire_read_lock`
317	and :meth:`.AbstractFileLock.release_read_lock`.
318
319 """
320
230321 self.acquire_read_lock(True)
231322 try:
232323 yield
235326
236327 @contextmanager
237328 def write(self):
329 """Provide a context manager for the "write" lock.
330
331 This method makes use of :meth:`.AbstractFileLock.acquire_write_lock`
332	and :meth:`.AbstractFileLock.release_write_lock`.
333
334 """
335
238336 self.acquire_write_lock(True)
239337 try:
240338 yield
241339 finally:
242340 self.release_write_lock()
243341
342 @property
343 def is_open(self):
344 """optional method."""
345 raise NotImplementedError()
346
244347 def acquire_read_lock(self, wait):
245 return self._acquire(wait, os.O_RDONLY, fcntl.LOCK_SH)
348 """Acquire a 'reader' lock.
349
350 Raises ``NotImplementedError`` by default, must be
351 implemented by subclasses.
352 """
353 raise NotImplementedError()
246354
247355 def acquire_write_lock(self, wait):
248 return self._acquire(wait, os.O_WRONLY, fcntl.LOCK_EX)
356 """Acquire a 'write' lock.
357
358 Raises ``NotImplementedError`` by default, must be
359 implemented by subclasses.
360 """
361 raise NotImplementedError()
362
363 def release_read_lock(self):
364 """Release a 'reader' lock.
365
366 Raises ``NotImplementedError`` by default, must be
367 implemented by subclasses.
368 """
369 raise NotImplementedError()
370
371 def release_write_lock(self):
372 """Release a 'writer' lock.
373
374 Raises ``NotImplementedError`` by default, must be
375 implemented by subclasses.
376 """
377 raise NotImplementedError()
378
379 class FileLock(AbstractFileLock):
380 """Use lockfiles to coordinate read/write access to a file.
381
382 Only works on Unix systems, using
383 `fcntl.flock() <http://docs.python.org/library/fcntl.html>`_.
384
385 """
386
387 def __init__(self, filename):
388 self._filedescriptor = compat.threading.local()
389 self.filename = filename
390
391 @util.memoized_property
392 def _module(self):
393 import fcntl
394 return fcntl
395
396 @property
397 def is_open(self):
398 return hasattr(self._filedescriptor, 'fileno')
399
400 def acquire_read_lock(self, wait):
401 return self._acquire(wait, os.O_RDONLY, self._module.LOCK_SH)
402
403 def acquire_write_lock(self, wait):
404 return self._acquire(wait, os.O_WRONLY, self._module.LOCK_EX)
249405
250406 def release_read_lock(self):
251407 self._release()
258414 fileno = os.open(self.filename, wrflag)
259415 try:
260416 if not wait:
261 lockflag |= fcntl.LOCK_NB
262 fcntl.flock(fileno, lockflag)
417 lockflag |= self._module.LOCK_NB
418 self._module.flock(fileno, lockflag)
263419 except IOError:
264420 os.close(fileno)
265421 if not wait:
279435 except AttributeError:
280436 return
281437 else:
282 fcntl.flock(fileno, fcntl.LOCK_UN)
438 self._module.flock(fileno, self._module.LOCK_UN)
283439 os.close(fileno)
284440 del self._filedescriptor.fileno
217217 ``pylibmc.Client``.
218218 :param behaviors: a dictionary which will be passed to
219219 ``pylibmc.Client`` as the ``behaviors`` parameter.
220 :param min_compres_len: Integer, will be passed as the
220 :param min_compress_len: Integer, will be passed as the
221221 ``min_compress_len`` parameter to the ``pylibmc.Client.set``
222222 method.
223223
00 """
1 Memory Backend
2 --------------
1 Memory Backends
2 ---------------
33
4 Provides a simple dictionary-based backend.
4 Provides simple dictionary-based backends.
5
6 The two backends are :class:`.MemoryBackend` and :class:`.MemoryPickleBackend`;
7 the latter applies a serialization step to cached values while the former
8 places the value as given into the dictionary.
59
610 """
711
812 from dogpile.cache.api import CacheBackend, NO_VALUE
13 from dogpile.cache.compat import pickle
914
1015 class MemoryBackend(CacheBackend):
1116 """A backend that uses a plain dictionary.
4045
4146
4247 """
48 pickle_values = False
49
4350 def __init__(self, arguments):
4451 self._cache = arguments.pop("cache_dict", {})
4552
4653 def get(self, key):
47 return self._cache.get(key, NO_VALUE)
54 value = self._cache.get(key, NO_VALUE)
55 if value is not NO_VALUE and self.pickle_values:
56 value = pickle.loads(value)
57 return value
4858
4959 def get_multi(self, keys):
50 return [
51 self._cache.get(key, NO_VALUE)
52 for key in keys
53 ]
60 ret = [self._cache.get(key, NO_VALUE)
61 for key in keys]
62 if self.pickle_values:
63 ret = [
64 pickle.loads(value)
65 if value is not NO_VALUE else value
66 for value in ret
67 ]
68 return ret
5469
5570 def set(self, key, value):
71 if self.pickle_values:
72 value = pickle.dumps(value, pickle.HIGHEST_PROTOCOL)
5673 self._cache[key] = value
5774
5875 def set_multi(self, mapping):
59 for key,value in mapping.items():
76 pickle_values = self.pickle_values
77 for key, value in mapping.items():
78 if pickle_values:
79 value = pickle.dumps(value, pickle.HIGHEST_PROTOCOL)
6080 self._cache[key] = value
6181
6282 def delete(self, key):
6585 def delete_multi(self, keys):
6686 for key in keys:
6787 self._cache.pop(key, None)
88
89
90 class MemoryPickleBackend(MemoryBackend):
91 """A backend that uses a plain dictionary, but serializes objects on
:meth:`.MemoryBackend.set` and deserializes on :meth:`.MemoryBackend.get`.
93
94 E.g.::
95
96 from dogpile.cache import make_region
97
98 region = make_region().configure(
99 'dogpile.cache.memory_pickle'
100 )
101
102	The use of pickle to serialize cached values means the object
103	placed in the cache is a copy of the original given object, so
104	that any subsequent changes to the given object aren't reflected
105	in the cached value; the backend thus behaves the same way
106	as other backends which make use of serialization.
107
108	The serialization is performed via pickle and incurs the same
109	performance hit as it does in other backends; the performance of
110	:class:`.MemoryPickleBackend` is therefore somewhere in between
111	that of the pure :class:`.MemoryBackend` and that of remote-server-oriented
112	backends such as Memcached or Redis.
113
114 Pickle behavior here is the same as that of the Redis backend, using
115 either ``cPickle`` or ``pickle`` and specifying ``HIGHEST_PROTOCOL``
116 upon serialize.
117
118 .. versionadded:: 0.5.3
119
120 """
121 pickle_values = True
77
88 from __future__ import absolute_import
99 from dogpile.cache.api import CacheBackend, NO_VALUE
10 from dogpile.cache.compat import pickle
10 from dogpile.cache.compat import pickle, u
1111
1212 redis = None
1313
104104
105105 def get_mutex(self, key):
106106 if self.distributed_lock:
107 return self.client.lock(u"_lock{}".format(key), self.lock_timeout,
108 self.lock_sleep)
107 return self.client.lock(u('_lock{0}').format(key),
108 self.lock_timeout, self.lock_sleep)
109109 else:
110110 return None
111111
128128 self.client.set(key, pickle.dumps(value, pickle.HIGHEST_PROTOCOL))
129129
130130 def set_multi(self, mapping):
131 mapping = dict((k, pickle.dumps(v)) for k, v in mapping.items())
131 mapping = dict(
132 (k, pickle.dumps(v, pickle.HIGHEST_PROTOCOL))
133 for k, v in mapping.items()
134 )
132135
133136 if not self.redis_expiration_time:
134137 self.client.mset(mapping)
33 py2k = sys.version_info < (3, 0)
44 py3k = sys.version_info >= (3, 0)
55 py32 = sys.version_info >= (3, 2)
6 py27 = sys.version_info >= (2, 7)
67 jython = sys.platform.startswith('java')
7
8 win32 = sys.platform.startswith('win')
89
910 try:
1011 import threading
5455 if py3k or jython:
5556 import pickle
5657 else:
57 import cPickle as pickle⏎
58 import cPickle as pickle
59
60
61 def timedelta_total_seconds(td):
62 if py27:
63 return td.total_seconds()
64 else:
65 return (td.microseconds + (td.seconds + td.days * 24 * 3600) * 1e6) / 1e6
66
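# Both branches of timedelta_total_seconds() agree, e.g.:
#
#     from datetime import timedelta
#     timedelta_total_seconds(timedelta(minutes=5))  # 300.0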
67
0 """Exception classes for dogpile.cache."""
1
2
3 class DogpileCacheException(Exception):
4 """Base Exception for dogpile.cache exceptions to inherit from."""
5
6
7 class RegionAlreadyConfigured(DogpileCacheException):
8 """CacheRegion instance is already configured."""
9
10
11 class RegionNotConfigured(DogpileCacheException):
12 """CacheRegion instance has not been configured."""
13
14
15 class ValidationError(DogpileCacheException):
16 """Error validating a value or option."""
00 from __future__ import with_statement
11 from dogpile.core import Lock, NeedRegenerationException
22 from dogpile.core.nameregistry import NameRegistry
3 from . import exception
34 from .util import function_key_generator, PluginLoader, \
45 memoized_property, coerce_string_conf, function_multi_key_generator
56 from .api import NO_VALUE, CachedValue
169170 self.key_mangler = key_mangler
170171 else:
171172 self.key_mangler = None
172 self._invalidated = None
173 self._hard_invalidated = None
174 self._soft_invalidated = None
173175 self.async_creation_runner = async_creation_runner
174176
175177 def configure(self, backend,
220222 """
221223
222224 if "backend" in self.__dict__:
223 raise Exception(
225 raise exception.RegionAlreadyConfigured(
224226 "This region is already "
225227 "configured with backend: %s"
226228 % self.backend)
236238 if not expiration_time or isinstance(expiration_time, Number):
237239 self.expiration_time = expiration_time
238240 elif isinstance(expiration_time, datetime.timedelta):
239 self.expiration_time = int(expiration_time.total_seconds())
241 self.expiration_time = int(compat.timedelta_total_seconds(expiration_time))
240242 else:
241 raise Exception('expiration_time is not a number or timedelta.')
243 raise exception.ValidationError(
244 'expiration_time is not a number or timedelta.')
242245
243246 if self.key_mangler is None:
244247 self.key_mangler = self.backend.key_mangler
261264 proxy = proxy()
262265
263266 if not issubclass(type(proxy), ProxyBackend):
264 raise Exception("Type %s is not a valid ProxyBackend"
265 % type(proxy))
267 raise TypeError("Type %s is not a valid ProxyBackend"
268 % type(proxy))
266269
267270 self.backend = proxy.wrap(self.backend)
268271
288291 else:
289292 return self._LockWrapper()
290293
291 def invalidate(self):
294 def invalidate(self, hard=True):
292295 """Invalidate this :class:`.CacheRegion`.
293296
294297 Invalidation works by setting a current timestamp
301304 local to this instance of :class:`.CacheRegion`.
302305
303306 Once set, the invalidation time is honored by
304 the :meth:`.CacheRegion.get_or_create` and
307 the :meth:`.CacheRegion.get_or_create`,
308 :meth:`.CacheRegion.get_or_create_multi` and
305309 :meth:`.CacheRegion.get` methods.
306310
311 The method
312 supports both "hard" and "soft" invalidation options. With "hard"
313 invalidation, :meth:`.CacheRegion.get_or_create` will force an immediate
314 regeneration of the value which all getters will wait for. With
315 "soft" invalidation, subsequent getters will return the "old" value until
316 the new one is available.
317
318 Usage of "soft" invalidation requires that the region or the method
319 is given a non-None expiration time.
320
307321 .. versionadded:: 0.3.0
308322
323 :param hard: if True, cache values will all require immediate
324 regeneration; dogpile logic won't be used. If False, the
325 creation time of existing values will be pushed back before
326 the expiration time so that a return+regen will be invoked.
327
328 .. versionadded:: 0.5.1
329
309330 """
310 self._invalidated = time.time()
331 if hard:
332 self._hard_invalidated = time.time()
333 self._soft_invalidated = None
334 else:
335 self._hard_invalidated = None
336 self._soft_invalidated = time.time()
311337
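    # A sketch of the soft path (backend and expiration illustrative):
    # soft invalidation requires a non-None expiration time, and lets
    # getters serve the old value while a new one is generated:
    #
    #     region = make_region().configure(
    #         "dogpile.cache.memory", expiration_time=60)
    #     region.get_or_create("somekey", creator)  # generates "v1"
    #     region.invalidate(hard=False)
    #     region.get_or_create("somekey", creator)  # first getter regenerates;
    #                                               # others still see "v1" meanwhile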
312338 def configure_from_config(self, config_dict, prefix):
313339 """Configure from a configuration dictionary
345371
346372 @memoized_property
347373 def backend(self):
348 raise Exception("No backend is configured on this region.")
374 raise exception.RegionNotConfigured(
375 "No backend is configured on this region.")
376
377 @property
378 def is_configured(self):
379 """Return True if the backend has been configured via the
380 :meth:`.CacheRegion.configure` method already.
381
382 .. versionadded:: 0.5.1
383
384 """
385 return 'backend' in self.__dict__
349386
350387 def get(self, key, expiration_time=None, ignore_expiration=False):
351388 """Return a value from the cache, based on the given key.
418455
419456 current_time = time.time()
420457
458 invalidated = self._hard_invalidated or self._soft_invalidated
421459 def value_fn(value):
422460 if value is NO_VALUE:
423461 return value
424462 elif expiration_time is not None and \
425463 current_time - value.metadata["ct"] > expiration_time:
426464 return NO_VALUE
427 elif self._invalidated and \
428 value.metadata["ct"] < self._invalidated:
465 elif invalidated and \
466 value.metadata["ct"] < invalidated:
429467 return NO_VALUE
430468 else:
431469 return value
465503
466504 """
467505 if self.key_mangler:
468 keys = map(lambda key: self.key_mangler(key), keys)
506 keys = list(map(lambda key: self.key_mangler(key), keys))
469507
470508 backend_values = self.backend.get_multi(keys)
471509
548586 :meth:`.CacheRegion.get_or_create_multi` - multiple key/value version
549587
550588 """
589 orig_key = key
551590 if self.key_mangler:
552591 key = self.key_mangler(key)
553592
555594 value = self.backend.get(key)
556595 if value is NO_VALUE or \
557596 value.metadata['v'] != value_version or \
558 (self._invalidated and
559 value.metadata["ct"] < self._invalidated):
597 (self._hard_invalidated and
598 value.metadata["ct"] < self._hard_invalidated):
560599 raise NeedRegenerationException()
561 return value.payload, value.metadata["ct"]
600 ct = value.metadata["ct"]
601 if self._soft_invalidated:
602 if ct < self._soft_invalidated:
603 ct = time.time() - expiration_time - .0001
604
605 return value.payload, ct
562606
563607 def gen_value():
564608 created_value = creator()
573617 if expiration_time is None:
574618 expiration_time = self.expiration_time
575619
620 if expiration_time is None and self._soft_invalidated:
621 raise exception.DogpileCacheException(
622 "Non-None expiration time required "
623 "for soft invalidation")
624
576625 if self.async_creation_runner:
577626 def async_creator(mutex):
578 return self.async_creation_runner(self, key, creator, mutex)
627 return self.async_creation_runner(self, orig_key, creator, mutex)
579628 else:
580629 async_creator = None
581630
629678
630679 def get_value(key):
631680 value = values.get(key, NO_VALUE)
681
632682 if value is NO_VALUE or \
633683 value.metadata['v'] != value_version or \
634 (self._invalidated and
635 value.metadata["ct"] < self._invalidated):
684 (self._hard_invalidated and
685 value.metadata["ct"] < self._hard_invalidated):
686 # dogpile.core understands a 0 here as
687 # "the value is not available", e.g.
688 # _has_value() will return False.
636689 return value.payload, 0
637690 else:
638 return value.payload, value.metadata["ct"]
691 ct = value.metadata["ct"]
692 if self._soft_invalidated:
693 if ct < self._soft_invalidated:
694 ct = time.time() - expiration_time - .0001
695
696 return value.payload, ct
639697
640698 def gen_value():
641699 raise NotImplementedError()
645703
646704 if expiration_time is None:
647705 expiration_time = self.expiration_time
706
707 if expiration_time is None and self._soft_invalidated:
708 raise exception.DogpileCacheException(
709 "Non-None expiration time required "
710 "for soft invalidation")
648711
649712 mutexes = {}
650713
750813 """
751814
752815 if self.key_mangler:
753 keys = map(lambda key: self.key_mangler(key), keys)
816 keys = list(map(lambda key: self.key_mangler(key), keys))
754817
755818 self.backend.delete_multi(keys)
756819
810873 newvalue = generate_something.refresh(5, 6)
811874
812875 .. versionadded:: 0.5.0 Added ``refresh()`` method to decorated
876 function.
877
878 Lastly, the ``get()`` method returns either the value cached
879 for the given key, or the token ``NO_VALUE`` if no such key
880 exists::
881
882 value = generate_something.get(5, 6)
883
884 .. versionadded:: 0.5.3 Added ``get()`` method to decorated
813885 function.
814886
815887 The default key generation will use the name
9421014 key = key_generator(*arg, **kw)
9431015 self.set(key, value)
9441016
1017 def get(*arg, **kw):
1018 key = key_generator(*arg, **kw)
1019 return self.get(key)
1020
9451021 def refresh(*arg, **kw):
9461022 key = key_generator(*arg, **kw)
9471023 value = fn(*arg, **kw)
9511027 decorate.set = set_
9521028 decorate.invalidate = invalidate
9531029 decorate.refresh = refresh
1030 decorate.get = get
9541031
9551032 return decorate
9561033 return decorator
10061083 generate_something.set({"k1": "value1",
10071084 "k2": "value2", "k3": "value3"})
10081085
1009 an ``invalidate()`` method, which has the effect of deleting
1086 ...an ``invalidate()`` method, which has the effect of deleting
10101087 the given sequence of keys using the same mechanism as that of
10111088 :meth:`.CacheRegion.delete_multi`::
10121089
10131090 generate_something.invalidate("k1", "k2", "k3")
10141091
1015 and finally a ``refresh()`` method, which will call the creation
1092 ...a ``refresh()`` method, which will call the creation
10161093 function, cache the new values, and return them::
10171094
10181095 values = generate_something.refresh("k1", "k2", "k3")
1096
1097 ...and a ``get()`` method, which will return values
1098 based on the given arguments::
1099
1100 values = generate_something.get("k1", "k2", "k3")
1101
1102 .. versionadded:: 0.5.3 Added ``get()`` method to decorated
1103 function.
10191104
10201105 Parameters passed to :meth:`.CacheRegion.cache_multi_on_arguments`
10211106 have the same meaning as those passed to
11101195 in zip(gen_keys, keys))
11111196 )
11121197
1198 def get(*arg):
1199 keys = key_generator(*arg)
1200 return self.get_multi(keys)
1201
11131202 def refresh(*arg):
11141203 keys = key_generator(*arg)
11151204 values = fn(*arg)
1116 self.set_multi(
1117 dict(zip(keys, values))
1118 )
1119 return values
1205 if asdict:
1206 self.set_multi(
1207 dict(zip(keys, [values[a] for a in arg]))
1208 )
1209 return values
1210 else:
1211 self.set_multi(
1212 dict(zip(keys, values))
1213 )
1214 return values
11201215
11211216 decorate.set = set_
11221217 decorate.invalidate = invalidate
11231218 decorate.refresh = refresh
1219 decorate.get = get
11241220
11251221 return decorate
11261222 return decorator
11361232
11371233 """
11381234 return CacheRegion(*arg, **kw)
1139
00 from hashlib import sha1
11 import inspect
2 import sys
32 import re
43 import collections
54 from . import compat
1312 continue
1413
1514 v = v.strip()
16 if re.match(r'^\d+$', v):
15 if re.match(r'^[-+]?\d+$', v):
1716 result[k] = int(v)
1817 elif v.lower() in ('false', 'true'):
1918 result[k] = v.lower() == 'true'
3029
3130 def load(self, name):
3231 if name in self.impls:
33 return self.impls[name]()
34 else: #pragma NO COVERAGE
35 # TODO: if someone has ideas on how to
36 # unit test entrypoint stuff, let me know.
32 return self.impls[name]()
33 else: # pragma NO COVERAGE
3734 import pkg_resources
3835 for impl in pkg_resources.iter_entry_points(
3936 self.group,
11
22 [upload_docs]
33 upload-dir = docs/build/output/html
4
5 [wheel]
6 universal = 1
47
58 [upload]
69 sign = 1
1114 with-coverage = 1
1215 cover-erase = 1
1316 nologcapture = 1
17 where = tests
1414 description="A caching front-end based on the Dogpile lock.",
1515 long_description=open(readme).read(),
1616 classifiers=[
17 'Development Status :: 3 - Alpha',
17 'Development Status :: 4 - Beta',
1818 'Intended Audience :: Developers',
1919 'License :: OSI Approved :: BSD License',
2020 'Programming Language :: Python',
3535 install_requires=['dogpile.core>=0.4.1'],
3636 test_suite='nose.collector',
3737 tests_require=['nose', 'mock'],
38 )⏎
38 )
11 from nose import SkipTest
22 from functools import wraps
33 from dogpile.cache import compat
4 import time
5
46
57 def eq_(a, b, msg=None):
68 """Assert a == b, with repr messaging on failure."""
2325
2426 from dogpile.cache.compat import configparser, io
2527
28 def winsleep():
29	# sleep for an amount of time
30	# sufficient for windows time.time()
31	# to change
32 if compat.win32:
33 time.sleep(.001)
2634
2735 def requires_py3k(fn):
2836 @wraps(fn)
44 import time
55 import os
66 from nose import SkipTest
7 from dogpile.core.readwrite_lock import ReadWriteMutex
8 from dogpile.cache.backends.file import AbstractFileLock
79
810 try:
911 import fcntl
12 has_fcntl = True
1013 except ImportError:
11 raise SkipTest("fcntl not available")
14 has_fcntl = False
1215
13 class DBMBackendTest(_GenericBackendTest):
16 class MutexLock(AbstractFileLock):
17 def __init__(self, filename):
18 self.mutex = ReadWriteMutex()
19
20 def acquire_read_lock(self, wait):
21 ret = self.mutex.acquire_read_lock(wait)
22 return wait or ret
23
24 def acquire_write_lock(self, wait):
25 ret = self.mutex.acquire_write_lock(wait)
26 return wait or ret
27
28 def release_read_lock(self):
29 return self.mutex.release_read_lock()
30
31 def release_write_lock(self):
32 return self.mutex.release_write_lock()
33
34 if has_fcntl:
35 class DBMBackendTest(_GenericBackendTest):
36 backend = "dogpile.cache.dbm"
37
38 config_args = {
39 "arguments": {
40 "filename": "test.dbm"
41 }
42 }
43
44 class DBMBackendConditionTest(_GenericBackendTest):
1445 backend = "dogpile.cache.dbm"
1546
1647 config_args = {
17 "arguments":{
18 "filename":"test.dbm"
48 "arguments": {
49 "filename": "test.dbm",
50 "lock_factory": MutexLock
1951 }
2052 }
53
2154
2255 class DBMBackendNoLockTest(_GenericBackendTest):
2356 backend = "dogpile.cache.dbm"
2457
2558 config_args = {
26 "arguments":{
27 "filename":"test.dbm",
28 "rw_lockfile":False,
29 "dogpile_lockfile":False,
59 "arguments": {
60 "filename": "test.dbm",
61 "rw_lockfile": False,
62 "dogpile_lockfile": False,
3063 }
3164 }
3265
3366
34 class DBMMutexTest(_GenericMutexTest):
67 class _DBMMutexTest(_GenericMutexTest):
3568 backend = "dogpile.cache.dbm"
36
37 config_args = {
38 "arguments":{
39 "filename":"test.dbm"
40 }
41 }
4269
4370 def test_release_assertion_thread(self):
4471 backend = self._backend()
6491 finally:
6592 m1.release()
6693
94 if has_fcntl:
95 class DBMMutexFileTest(_DBMMutexTest):
96 config_args = {
97 "arguments": {
98 "filename": "test.dbm"
99 }
100 }
101
102
103 class DBMMutexConditionTest(_DBMMutexTest):
104 config_args = {
105 "arguments": {
106 "filename": "test.dbm",
107 "lock_factory": MutexLock
108 }
109 }
110
67111
68112 def teardown():
69113 for fname in os.listdir(os.curdir):
00 #! coding: utf-8
11
22 from ._fixtures import _GenericBackendFixture
3 from . import eq_, requires_py3k
3 from . import eq_, requires_py3k, winsleep
44 from unittest import TestCase
55 import time
66 from dogpile.cache import util, compat
7474 def test_decorator_expire_callable_zero(self):
7575 go = self._fixture(expiration_time=lambda: 0)
7676 eq_(go(1, 2), (1, 1, 2))
77 winsleep()
7778 eq_(go(1, 2), (2, 1, 2))
79 winsleep()
7880 eq_(go(1, 2), (3, 1, 2))
7981
8082 def test_explicit_expire(self):
9597 eq_(go(1, 2), (3, 1, 2))
9698 go.set(0, 1, 3)
9799 eq_(go(1, 3), 0)
100
101 def test_explicit_get(self):
102 go = self._fixture(expiration_time=1)
103 eq_(go(1, 2), (1, 1, 2))
104 eq_(go.get(1, 2), (1, 1, 2))
105 eq_(go.get(2, 1), NO_VALUE)
106 eq_(go(2, 1), (2, 2, 1))
107 eq_(go.get(2, 1), (2, 2, 1))
108
109 def test_explicit_get_multi(self):
110 go = self._multi_fixture(expiration_time=1)
111 eq_(go(1, 2), ['1 1', '1 2'])
112 eq_(go.get(1, 2), ['1 1', '1 2'])
113 eq_(go.get(3, 1), [NO_VALUE, '1 1'])
114 eq_(go(3, 1), ['2 3', '1 1'])
115 eq_(go.get(3, 1), ['2 3', '1 1'])
98116
99117 def test_explicit_set_multi(self):
100118 go = self._multi_fixture(expiration_time=1)
300318
301319 generate.set({7: 18, 10: 15})
302320 eq_(generate(2, 7, 10), {2: '2 5', 7: 18, 10: 15})
321
322 eq_(
323 generate.refresh(2, 7),
324 {2: '2 7', 7: '7 8'}
325 )
326 eq_(generate(2, 7, 10), {2: '2 7', 10: 15, 7: '7 8'})
327
303328
304329 def test_multi_asdict_keys_missing(self):
305330 reg = self._region()
375400
376401 generate.set({7: 18, 10: 15})
377402 eq_(generate(2, 7, 10), ['2 5', 18, 15])
378
379
00 from ._fixtures import _GenericBackendTest, _GenericMutexTest
1 from . import eq_
1 from . import eq_, winsleep
22 from unittest import TestCase
33 from threading import Thread
44 import time
55 from nose import SkipTest
6 from dogpile.cache import compat
7
68
79 class _TestMemcachedConn(object):
810 @classmethod
1820
1921 class _NonDistributedMemcachedTest(_TestMemcachedConn, _GenericBackendTest):
2022 region_args = {
21 "key_mangler":lambda x: x.replace(" ", "_")
23 "key_mangler": lambda x: x.replace(" ", "_")
2224 }
2325 config_args = {
24 "arguments":{
25 "url":"127.0.0.1:11211"
26 "arguments": {
27 "url": "127.0.0.1:11211"
2628 }
2729 }
2830
2931 class _DistributedMemcachedTest(_TestMemcachedConn, _GenericBackendTest):
3032 region_args = {
31 "key_mangler":lambda x: x.replace(" ", "_")
33 "key_mangler": lambda x: x.replace(" ", "_")
3234 }
3335 config_args = {
34 "arguments":{
35 "url":"127.0.0.1:11211",
36 "distributed_lock":True
36 "arguments": {
37 "url": "127.0.0.1:11211",
38 "distributed_lock": True
3739 }
3840 }
3941
4042 class _DistributedMemcachedMutexTest(_TestMemcachedConn, _GenericMutexTest):
4143 config_args = {
42 "arguments":{
43 "url":"127.0.0.1:11211",
44 "distributed_lock":True
44 "arguments": {
45 "url": "127.0.0.1:11211",
46 "distributed_lock": True
4547 }
4648 }
4749
123125
124126 class PylibmcArgsTest(TestCase):
125127 def test_binary_flag(self):
126 backend = MockPylibmcBackend(arguments={'url':'foo','binary':True})
128	backend = MockPylibmcBackend(arguments={'url': 'foo', 'binary': True})
127129 eq_(backend._create_client().kw["binary"], True)
128130
129131 def test_url_list(self):
130 backend = MockPylibmcBackend(arguments={'url':["a", "b", "c"]})
132 backend = MockPylibmcBackend(arguments={'url': ["a", "b", "c"]})
131133 eq_(backend._create_client().arg[0], ["a", "b", "c"])
132134
133135 def test_url_scalar(self):
134 backend = MockPylibmcBackend(arguments={'url':"foo"})
136 backend = MockPylibmcBackend(arguments={'url': "foo"})
135137 eq_(backend._create_client().arg[0], ["foo"])
136138
137139 def test_behaviors(self):
138 backend = MockPylibmcBackend(arguments={'url':"foo",
139 "behaviors":{"q":"p"}})
140 backend = MockPylibmcBackend(arguments={'url': "foo",
141 "behaviors": {"q": "p"}})
140142 eq_(backend._create_client().kw["behaviors"], {"q": "p"})
141143
142144 def test_set_time(self):
143 backend = MockPylibmcBackend(arguments={'url':"foo",
144 "memcached_expire_time":20})
145 backend.set("foo", "bar")
146 eq_(backend._clients.memcached.canary, [{"time":20}])
145 backend = MockPylibmcBackend(arguments={'url': "foo",
146 "memcached_expire_time": 20})
147 backend.set("foo", "bar")
148 eq_(backend._clients.memcached.canary, [{"time": 20}])
147149
148150 def test_set_min_compress_len(self):
149 backend = MockPylibmcBackend(arguments={'url':"foo",
150 "min_compress_len":20})
151 backend.set("foo", "bar")
152 eq_(backend._clients.memcached.canary, [{"min_compress_len":20}])
151 backend = MockPylibmcBackend(arguments={'url': "foo",
152 "min_compress_len": 20})
153 backend.set("foo", "bar")
154 eq_(backend._clients.memcached.canary, [{"min_compress_len": 20}])
153155
154156 def test_no_set_args(self):
155 backend = MockPylibmcBackend(arguments={'url':"foo"})
157 backend = MockPylibmcBackend(arguments={'url': "foo"})
156158 backend.set("foo", "bar")
157159 eq_(backend._clients.memcached.canary, [{}])
158160
159161 class MemcachedArgstest(TestCase):
160162 def test_set_time(self):
161 backend = MockMemcacheBackend(arguments={'url':"foo",
162 "memcached_expire_time":20})
163 backend.set("foo", "bar")
164 eq_(backend._clients.memcached.canary, [{"time":20}])
163 backend = MockMemcacheBackend(arguments={'url': "foo",
164 "memcached_expire_time": 20})
165 backend.set("foo", "bar")
166 eq_(backend._clients.memcached.canary, [{"time": 20}])
165167
166168 def test_set_min_compress_len(self):
167 backend = MockMemcacheBackend(arguments={'url':"foo",
168 "min_compress_len":20})
169 backend.set("foo", "bar")
170 eq_(backend._clients.memcached.canary, [{"min_compress_len":20}])
169 backend = MockMemcacheBackend(arguments={'url': "foo",
170 "min_compress_len": 20})
171 backend.set("foo", "bar")
172 eq_(backend._clients.memcached.canary, [{"min_compress_len": 20}])
171173
172174
173175 class LocalThreadTest(TestCase):
200202 for t in threads:
201203 t.join()
202204 eq_(canary, [i + 1 for i in range(count)])
203 eq_(MockClient.number_of_clients, 0)
204
205
205
206 if compat.py27:
207 eq_(MockClient.number_of_clients, 0)
208 else:
209 eq_(MockClient.number_of_clients, 1)
210
211
22 class MemoryBackendTest(_GenericBackendTest):
33 backend = "dogpile.cache.memory"
44
5
6 class MemoryPickleBackendTest(_GenericBackendTest):
7 backend = "dogpile.cache.memory_pickle"
00 import pprint
11 from unittest import TestCase
22 from dogpile.cache.api import CacheBackend, CachedValue, NO_VALUE
3 from dogpile.cache import exception
34 from dogpile.cache import make_region, register_backend, CacheRegion, util
45 from dogpile.cache.proxy import ProxyBackend
5 from . import eq_, is_, assert_raises_message, io, configparser
6 from . import eq_, is_, assert_raises_message, io, configparser, winsleep
67 import time, datetime
78 import itertools
89 from collections import defaultdict
6768 my_region = make_region()
6869
6970 assert_raises_message(
70 Exception,
71 exception.ValidationError,
7172 "expiration_time is not a number or timedelta.",
7273 my_region.configure, 'mock', 'one hour'
7374 )
9899 reg = CacheRegion()
99100 reg.configure("mock")
100101 assert_raises_message(
101 Exception,
102 exception.RegionAlreadyConfigured,
102103 "This region is already configured",
103104 reg.configure, "mock"
104105 )
106 eq_(reg.is_configured, True)
105107
106108 def test_no_config(self):
107109 reg = CacheRegion()
108110 assert_raises_message(
109 Exception,
111 exception.RegionNotConfigured,
110112 "No backend is configured on this region.",
111113 getattr, reg, "backend"
112114 )
115 eq_(reg.is_configured, False)
113116
114117 def test_set_get_value(self):
115118 reg = self._region()
205208 eq_(reg.get("some key"), "some value 2")
206209
207210
208 def test_invalidate_get(self):
209 reg = self._region()
210 reg.set("some key", "some value")
211 def test_hard_invalidate_get(self):
212 reg = self._region()
213 reg.set("some key", "some value")
214 time.sleep(.1)
211215 reg.invalidate()
212216 is_(reg.get("some key"), NO_VALUE)
213217
214 def test_invalidate_get_or_create(self):
218 def test_hard_invalidate_get_or_create(self):
215219 reg = self._region()
216220 counter = itertools.count(1)
217221 def creator():
219223 eq_(reg.get_or_create("some key", creator),
220224 "some value 1")
221225
226 time.sleep(.1)
222227 reg.invalidate()
223228 eq_(reg.get_or_create("some key", creator),
224229 "some value 2")
230
231 def test_soft_invalidate_get(self):
232 reg = self._region(config_args={"expiration_time": 1})
233 reg.set("some key", "some value")
234 time.sleep(.1)
235 reg.invalidate(hard=False)
236 is_(reg.get("some key"), NO_VALUE)
237
238 def test_soft_invalidate_get_or_create(self):
239 reg = self._region(config_args={"expiration_time": 1})
240 counter = itertools.count(1)
241 def creator():
242 return "some value %d" % next(counter)
243 eq_(reg.get_or_create("some key", creator),
244 "some value 1")
245
246 time.sleep(.1)
247 reg.invalidate(hard=False)
248 eq_(reg.get_or_create("some key", creator),
249 "some value 2")
250
251 def test_soft_invalidate_get_or_create_multi(self):
252 reg = self._region(config_args={"expiration_time": 5})
253 values = [1, 2, 3]
254 def creator(*keys):
255 v = values.pop(0)
256 return [v for k in keys]
257 ret = reg.get_or_create_multi(
258 [1, 2], creator)
259 eq_(ret, [1, 1])
260 time.sleep(.1)
261 reg.invalidate(hard=False)
262 ret = reg.get_or_create_multi(
263 [1, 2], creator)
264 eq_(ret, [2, 2])
265
266 def test_soft_invalidate_requires_expire_time_get(self):
267 reg = self._region()
268 reg.invalidate(hard=False)
269 assert_raises_message(
270 exception.DogpileCacheException,
271 "Non-None expiration time required for soft invalidation",
272 reg.get_or_create, "some key", lambda: "x"
273 )
274
275 def test_soft_invalidate_requires_expire_time_get_multi(self):
276 reg = self._region()
277 reg.invalidate(hard=False)
278 assert_raises_message(
279 exception.DogpileCacheException,
280 "Non-None expiration time required for soft invalidation",
281 reg.get_or_create_multi, ["k1", "k2"], lambda k: "x"
282 )
225283
226284 def test_should_cache_fn(self):
227285 reg = self._region()
234292 should_cache_fn=should_cache_fn)
235293 eq_(ret, 1)
236294 eq_(reg.backend._cache['some key'][0], 1)
295 time.sleep(.1)
237296 reg.invalidate()
238297 ret = reg.get_or_create(
239298 "some key", creator,
246305 should_cache_fn=should_cache_fn)
247306 eq_(ret, 3)
248307 eq_(reg.backend._cache['some key'][0], 3)
308
249309
250310 def test_should_cache_fn_multi(self):
251311 reg = self._region()
259319 should_cache_fn=should_cache_fn)
260320 eq_(ret, [1, 1])
261321 eq_(reg.backend._cache[1][0], 1)
322 time.sleep(.1)
262323 reg.invalidate()
263324 ret = reg.get_or_create_multi(
264325 [1, 2], creator,
265326 should_cache_fn=should_cache_fn)
266327 eq_(ret, [2, 2])
267328 eq_(reg.backend._cache[1][0], 1)
329 time.sleep(.1)
268330 reg.invalidate()
269331 ret = reg.get_or_create_multi(
270332 [1, 2], creator,
0 from unittest import TestCase
1
2 from dogpile.cache import util
3
4
5 class UtilsTest(TestCase):
6 """ Test the relevant utils functionality.
7 """
8
9 def test_coerce_string_conf(self):
10 settings = {'expiration_time': '-1'}
11 coerced = util.coerce_string_conf(settings)
12 self.assertEqual(coerced['expiration_time'], -1)
13
14 settings = {'expiration_time': '+1'}
15 coerced = util.coerce_string_conf(settings)
16 self.assertEqual(coerced['expiration_time'], 1)