pyppin.base.cache¶
Caching decorators for functions and methods
This file provides two decorators, @cache (for functions) and @cachemethod (for methods), which can be used to memoize the return values of a function. For example:

    class MyClass(object):
        @cachemethod(key=lambda self, val1, val2: val1)
        def mymethod(self, val1: str, val2: int) -> bool:
            ...

will cause mymethod to automatically cache its results in a dedicated dict, keyed only by val1 – that is, calls with different values of val2 will be assumed to always yield the same result.
The decorators share most of their arguments and behavior:
cache: The cache to use for this method. Valid values are:
* A class (such as dict, weakref.WeakValueDictionary, or any of the cache classes from cachetools); create a separate cache of this type for just this method. Any **kwargs passed to the decorator will be forwarded on to the cache’s constructor.
* An explicit object to use as a cache. (Careful! @cachemethod(cache={}) will create a per-class dict and use it for every instance, which is rarely what you want!)
* None, to simply not cache.
* (@cachemethod only) The (string) name of an instance variable of the class whose method is being decorated; this variable should usually be set up in the object’s __init__ method.
The default is dict, i.e. to use a per-method unbounded cache.
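The per-class pitfall of passing an explicit dict can be demonstrated with a minimal stand-in decorator. This is an illustrative sketch (cachemethod_sketch is a made-up name, not pyppin's implementation), but it shows why cache={} is evaluated once and then shared:

```python
import functools

def cachemethod_sketch(cache):
    # 'cache' is evaluated once, at class-definition time, so an explicit
    # dict literal here becomes per-class state, not per-instance state.
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(self, arg):
            if arg not in cache:
                cache[arg] = fn(self, arg)
            return cache[arg]
        return wrapper
    return decorator

calls = []

class Squarer:
    @cachemethod_sketch(cache={})  # one dict for ALL Squarer instances
    def square(self, n):
        calls.append(n)
        return n * n

a, b = Squarer(), Squarer()
assert a.square(3) == 9
assert b.square(3) == 9  # cache hit even though this is a different instance
assert calls == [3]      # the underlying method ran only once
```

If you want per-instance caching, this is exactly what the instance-variable-name form of the cache argument is for.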
lock: The mutex to use to guard the cache. Valid values are:
* A class (such as threading.Lock) to use for the lock.
* An explicit lock object (typically a threading.Lock).
* True (equivalent to the class threading.Lock, the most common value to pass).
* False (to not lock the cache).
* (@cachemethod only) The (string) name of an instance variable containing the lock.
The default is False, i.e. to not lock the cache.
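What lock=True buys you can be sketched with a hand-rolled guard. The class below is illustrative; the exact locking discipline inside pyppin may differ, but the idea is that every read and write of the cache mapping happens while holding the mutex:

```python
import threading

class LockedCache:
    """Sketch of a mutex-guarded cache, as lock=True would arrange."""

    def __init__(self):
        self._cache = {}
        self._lock = threading.Lock()  # what lock=True would construct

    def get_or_compute(self, key, compute):
        with self._lock:
            if key in self._cache:
                return self._cache[key]
        value = compute()  # the (possibly slow) call runs outside the lock
        with self._lock:
            # setdefault: if another thread stored a value first, keep theirs
            return self._cache.setdefault(key, value)

lc = LockedCache()
assert lc.get_or_compute("k", lambda: 42) == 42
assert lc.get_or_compute("k", lambda: 0) == 42  # cached; compute not consulted
```

Note that without a lock, two threads can both miss and both evaluate the function; locking narrows that race but does not deduplicate in-flight computation.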
key: A function that maps the decorated function’s arguments to the cache key. This function should take the same arguments (including self, if appropriate) as the decorated function or method, and return the cache key to use for those values. The default is to infer a key function based on all function arguments and the id of self. Note that this default only works if all the arguments are hashable; calls will raise ExplicitKeyNeeded if this is not true. You will very often want to override this default!
cache_exceptions: Whether exceptions raised by the function should also be cached. If True and the function raises an exception, we will cache the exception, so that future calls to the function will get a cache hit, and the response to that hit will be to re-raise the exception. The default is not to do this.
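A custom key function is how you cache past unhashable arguments. The decorator below (cache_sketch) is a simplified stand-in for the real @cache, written only to show the key-function contract: it receives the same arguments as the wrapped function and returns the cache key:

```python
import functools

def cache_sketch(key):
    """Stand-in for @cache(key=...): 'key' gets the call's arguments."""
    def decorator(fn):
        store = {}
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            k = key(*args, **kwargs)
            if k not in store:
                store[k] = fn(*args, **kwargs)
            return store[k]
        return wrapper
    return decorator

lookups = []

# 'options' is an unhashable dict, so the default key would fail;
# the explicit key ignores it and keys only on 'name'.
@cache_sketch(key=lambda name, options: name)
def fetch(name, options):
    lookups.append(name)
    return name.upper()

assert fetch("a", {"retries": 3}) == "A"
assert fetch("a", {"retries": 5}) == "A"  # hit: options ignored by the key
assert lookups == ["a"]
```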
Skipping the Cache¶
The resulting function will be a wrapped version of the original, but will have an additional keyword argument _skip: Union[bool, str, CacheFlags, None]=None, which can be used to control caching behavior when invoking it. Some particularly useful arguments:
* wrappedfn(..., _skip=True)
will completely skip the cache.
* wrappedfn(..., _skip='r')
(“skip the cache read”) will always re-evaluate the underlying function, and update the cache, so this can be used to forcibly refresh the cache entry for these arguments.
* wrappedfn(..., _skip='w')
(“skip the cache write”) will check the cache and use its value on a
hit, and on a cache miss will re-evaluate the function but not update the cache. There are two big
cases where this is helpful: if it’s a value that would be expensive to store in the cache, or if
you’re doing an “unusual operation” which you know isn’t going to get cache hits in the future, and
writing it could pollute the cache. (For example, if there are two paths that hit a function, one of
which is performance-critical and the other not, and the second would have a very different
distribution of keys, then skipping the cache write for the non-critical operation guarantees that
it won’t mess up the cache for the critical one.)
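The read/write split behind _skip can be modeled directly. The helper names below (parse_skip, cached_call) are invented for illustration; this is a sketch of the semantics described above, not pyppin's internals:

```python
def parse_skip(arg):
    """Map a _skip-style argument to (read, write) flags."""
    if arg is None or arg is False:
        return (True, True)    # normal caching: read and write
    if arg is True:
        return (False, False)  # skip the cache entirely
    return ('r' not in arg, 'w' not in arg)

def cached_call(cache, key, fn, skip=None):
    read, write = parse_skip(skip)
    if read and key in cache:
        return cache[key]
    value = fn()
    if write:
        cache[key] = value
    return value

c = {}
cached_call(c, "x", lambda: 1)                        # miss: computes, stores
assert cached_call(c, "x", lambda: 2) == 1            # hit: stored value wins
assert cached_call(c, "x", lambda: 2, skip='r') == 2  # forced refresh
assert c["x"] == 2                                    # ...and the cache updated
assert cached_call(c, "y", lambda: 3, skip='w') == 3  # computed, not stored
assert "y" not in c                                   # no cache pollution
```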
Checking cache presence¶
In addition, the wrapped function has an incache() method of its own: wrappedfn.incache(...) has the same signature as the function itself, but returns a bool: True if the given arguments would lead to a cache hit, False otherwise.
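The behavior of such an incache() probe can be sketched in a few lines; make_cached below is a hypothetical minimal wrapper, not pyppin's, shown only to illustrate the attached-method pattern:

```python
def make_cached(fn):
    cache = {}
    def wrapped(x):
        if x not in cache:
            cache[x] = fn(x)
        return cache[x]
    # Attach a probe with the same signature that only checks membership,
    # mirroring wrappedfn.incache(...).
    wrapped.incache = lambda x: x in cache
    return wrapped

double = make_cached(lambda x: 2 * x)
assert not double.incache(5)  # nothing cached yet
double(5)
assert double.incache(5)      # now it would be a hit
```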
Functions
cache: A decorator to memoize (cache) the results of a function call.
cachemethod: A decorator to memoize (cache) the results of a class or instance method.
Classes
CacheFlags: These flags indicate how the cache should be consulted on a given call.
Exceptions
ExplicitKeyNeeded: Error: You used the default key argument in a place where it doesn’t work.
- class pyppin.base.cache.CacheFlags(read: bool, write: bool)[source]¶
Bases:
NamedTuple
These flags indicate how the cache should be consulted on a given call.
- read: bool[source]¶
If read is true, we should see if the value is in the cache before the function call, and if so, return the cached value. Setting it to false means we ignore the cached value.
- write: bool[source]¶
If write is true, then if we didn’t get the value from the cache (either because read=False or because of a cache miss), we should update the cache with the new value.
- classmethod from_skip_arg(arg: Optional[Union[bool, str, CacheFlags]]) CacheFlags [source]¶
Parse a “skip=<foo>” argument into CacheFlags.
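The parsing rules can be restated as a small NamedTuple of our own. The class below re-derives from_skip_arg from the _skip documentation earlier on this page; it is an inference about the semantics, not a copy of pyppin's source:

```python
from typing import NamedTuple, Optional, Union

class CacheFlagsSketch(NamedTuple):
    """Inferred restatement of CacheFlags and its from_skip_arg parsing."""
    read: bool
    write: bool

    @classmethod
    def from_skip_arg(
        cls, arg: Optional[Union[bool, str, "CacheFlagsSketch"]]
    ) -> "CacheFlagsSketch":
        if arg is None or arg is False:
            return cls(read=True, write=True)    # no skipping: normal caching
        if arg is True:
            return cls(read=False, write=False)  # skip the cache entirely
        if isinstance(arg, cls):
            return arg                           # already parsed
        # A string like 'r', 'w', or 'rw': each letter names a skipped phase.
        return cls(read='r' not in arg, write='w' not in arg)

assert CacheFlagsSketch.from_skip_arg(True) == (False, False)
assert CacheFlagsSketch.from_skip_arg('r') == (False, True)
assert CacheFlagsSketch.from_skip_arg('w') == (True, False)
```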
- exception pyppin.base.cache.ExplicitKeyNeeded[source]¶
Bases:
Exception
Error: You used the default key argument in a place where it doesn’t work.
- pyppin.base.cache.cachemethod(function: Callable[..., ValueType], *, cache: Optional[Union[MutableMapping[KeyType, ValueType], Type[MutableMapping[KeyType, ValueType]], str]] = dict, lock: Union[AbstractContextManager, Type[AbstractContextManager], bool, str] = False, key: Optional[Callable[..., KeyType]] = None, cache_exceptions: bool = False, **kwargs: Any) → _WrappedDescriptor [source]¶
A decorator to memoize (cache) the results of a class or instance method.
See the module documentation for an explanation of its arguments.
- pyppin.base.cache.cache(function: Callable[..., ValueType], *, cache: Optional[Union[MutableMapping[KeyType, ValueType], Type[MutableMapping[KeyType, ValueType]]]] = dict, lock: Union[AbstractContextManager, Type[AbstractContextManager], bool] = False, key: Optional[Callable[..., KeyType]] = None, cache_exceptions: bool = False, **kwargs: Any) → _WrappedFunction [source]¶
A decorator to memoize (cache) the results of a function call.
See the module documentation for an explanation of its arguments.