Backends
Several cache backends are included. The default is SQLite, since it’s generally the simplest to use and requires no extra dependencies or configuration. See requests_cache.backends for usage details on specific backends.
Note
In the rare case that SQLite is not available (for example, on Heroku), a non-persistent in-memory cache is used by default.
Backend Dependencies

Most of the other backends require some extra dependencies, listed below.

| Backend | Class | Alias | Dependencies |
|---|---|---|---|
| SQLite | SQLiteCache | 'sqlite' | |
| Redis | RedisCache | 'redis' | redis-py |
| MongoDB | MongoCache | 'mongodb' | pymongo |
| GridFS | GridFSCache | 'gridfs' | pymongo |
| DynamoDB | DynamoDbCache | 'dynamodb' | boto3 |
| Filesystem | FileCache | 'filesystem' | |
| Memory | BaseCache | 'memory' | |
Specifying a Backend

You can specify which backend to use with the backend parameter for either CachedSession or install_cache(). You can specify one by name, using the aliases listed above:
>>> session = CachedSession('my_cache', backend='redis')
Or by instance:
>>> backend = RedisCache(host='192.168.1.63', port=6379)
>>> session = CachedSession('my_cache', backend=backend)
Backend Options

The cache_name parameter has a different use depending on the backend:
| Backend | Cache name used as |
|---|---|
| SQLite | Database path |
| Redis | Hash namespace |
| MongoDB, GridFS | Database name |
| DynamoDB | Table name |
| Filesystem | Cache directory |
Each backend class also accepts optional parameters for the underlying connection. For example, SQLiteCache accepts parameters for sqlite3.connect():
>>> session = CachedSession('my_cache', backend='sqlite', timeout=30)
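To make the pass-through concrete, here is a stdlib-only sketch of the idea: the cache name becomes a database path, and extra keyword arguments (like timeout) are forwarded to sqlite3.connect(). The table name and schema below are hypothetical illustrations, not requests-cache internals.

```python
import sqlite3
import tempfile
from pathlib import Path

# Conceptual sketch (not requests-cache internals): the cache name is used
# as the database path, and extra kwargs are forwarded to sqlite3.connect().
cache_name = Path(tempfile.mkdtemp()) / "my_cache.sqlite"
conn = sqlite3.connect(cache_name, timeout=30)  # 'timeout' goes straight to sqlite3
conn.execute("CREATE TABLE IF NOT EXISTS responses (key TEXT PRIMARY KEY, value BLOB)")
conn.commit()
conn.close()

print(cache_name.exists())  # True: the database file was created at that path
```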
Testing Backends

If you just want to quickly try out all of the available backends for comparison, a docker-compose config is included for all supported services. First, install Docker if you haven’t already. Then, run:

On Linux/macOS:
pip install -U requests-cache[all] docker-compose
curl -O https://raw.githubusercontent.com/requests-cache/requests-cache/main/docker-compose.yml
docker-compose up -d

On Windows (PowerShell):
pip install -U requests-cache[all] docker-compose
Invoke-WebRequest -Uri https://raw.githubusercontent.com/requests-cache/requests-cache/main/docker-compose.yml -OutFile docker-compose.yml
docker-compose up -d
Exporting To A Different Backend

If you have cached data that you want to copy or migrate to a different backend, you can do this with CachedSession.cache.update(). For example, if you want to dump the contents of a Redis cache to JSON files:
>>> src_session = CachedSession('my_cache', backend='redis')
>>> dest_session = CachedSession('~/workspace/cache_dump', backend='filesystem', serializer='json')
>>> dest_session.cache.update(src_session.cache)
>>> # List the exported files
>>> print(dest_session.cache.paths())
'/home/user/workspace/cache_dump/9e7a71a3ff2e.json'
'/home/user/workspace/cache_dump/8a922ff3c53f.json'
Or, using backend classes directly:
>>> src_cache = RedisCache()
>>> dest_cache = FileCache('~/workspace/cache_dump', serializer='json')
>>> dest_cache.update(src_cache)
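Since cache backends expose a dict-like interface, update() copies entries much like dict.update(). A minimal illustration with plain dicts standing in for the two caches (the keys are just sample cache keys):

```python
# Plain dicts standing in for cache backends (illustration only)
src_cache = {"9e7a71a3ff2e": "cached response A", "8a922ff3c53f": "cached response B"}
dest_cache = {"existing": "already in the destination"}

# Copy everything from the source; existing destination entries are kept
dest_cache.update(src_cache)

print(len(dest_cache))  # 3
```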
Custom Backends

If the built-in backends don’t suit your needs, you can create your own by making subclasses of BaseCache and BaseStorage:
>>> from requests_cache import CachedSession
>>> from requests_cache.backends import BaseCache, BaseStorage
>>> class CustomCache(BaseCache):
... """Wrapper for higher-level cache operations. In most cases, the only thing you need
... to specify here is which storage class(es) to use.
... """
... def __init__(self, **kwargs):
... super().__init__(**kwargs)
... self.redirects = CustomStorage(**kwargs)
... self.responses = CustomStorage(**kwargs)
>>> class CustomStorage(BaseStorage):
... """Dict-like interface for lower-level backend storage operations"""
... def __init__(self, **kwargs):
... super().__init__(**kwargs)
...
... def __getitem__(self, key):
... pass
...
... def __setitem__(self, key, value):
... pass
...
... def __delitem__(self, key):
... pass
...
... def __iter__(self):
... pass
...
... def __len__(self):
... pass
...
... def clear(self):
... pass
You can then use your custom backend in a CachedSession with the backend parameter:
>>> session = CachedSession(backend=CustomCache())
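If it helps to see the dict-like storage contract in isolation, here is a self-contained sketch using only the standard library. collections.abc.MutableMapping requires the same five dunder methods as the skeleton above; this is an illustrative stand-in, not requests-cache code.

```python
from collections.abc import MutableMapping

class DictStorage(MutableMapping):
    """In-memory stand-in showing the dict-like contract a storage class
    must fulfill: get, set, delete, iterate, and report its length."""
    def __init__(self):
        self._data = {}

    def __getitem__(self, key):
        return self._data[key]

    def __setitem__(self, key, value):
        self._data[key] = value

    def __delitem__(self, key):
        del self._data[key]

    def __iter__(self):
        return iter(self._data)

    def __len__(self):
        return len(self._data)

storage = DictStorage()
storage["https://example.com"] = "cached response"
print(len(storage))   # 1
print(list(storage))  # ['https://example.com']
del storage["https://example.com"]
print(len(storage))   # 0
```

Implementing these five methods also gives you clear(), pop(), and the other MutableMapping conveniences for free.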