This is an extension aiming at making server-side cache access simple. By configuring the backend and other options at startup of the FastAPI app instance, they remain class constants that are available through FastAPI's intuitive Dependency Injection system.
The design has built-in limitations, such as a fixed codec and backend once the app has been launched, and encourages developers to design their applications with this in mind.
Most of the backend implementation is directly lifted from fastapi-cache by @long2ice, excluding the MongoDB backend option.
The following are the currently available configuration keys that can be set on this FastAPI extension at startup, either by using a method which returns a list of tuples or by using a Pydantic BaseSettings object (see the examples below or in the examples/ folder).
- `backend` -- optional; must be one of `["inmemory", "memcached", "mongodb", "pickle", "redis", "valkey"]`; defaults to the "inmemory" option, which requires no extra package dependencies. To use the other listed options, see the installation guide in the README.md at the [Repository Page](https://github.com/aekasitt/cachette).
- `codec` -- optional; the serialization and deserialization format used to store cache values in the chosen backend as strings of the selected encoding. Once fetched, values are returned decoded in the same format. Must be one of `["feather", "msgpack", "parquet", "pickle"]`; if none is defined, a vanilla codec of basic string conversion will be used.
- `database_name` -- required when backend is set to "mongodb"; the database name, automatically created on the MongoDB instance if it does not exist, that stores the cache table; defaults to "cachette-db".
- `memcached_host` -- required when backend is set to "memcached"; the host endpoint of the Memcached distributed memory caching system.
- `mongodb_url` -- required when backend is set to "mongodb"; the URL of the MongoDB database instance, with or without authentication, in the formats "mongodb://user:password@host:port" and "mongodb://host:port" respectively.
- `pickle_path` -- required when backend is set to "pickle"; the file-system path at which to create a local store using Python pickling.
- `redis_url` -- required when backend is set to "redis"; the URL of the redis-server instance, with or without authentication, in the formats "redis://user:password@host:port" and "redis://host:port" respectively.
- `table_name` -- required when backend is set to "mongodb"; the name of the cache collection in which key-value pairs are stored; defaults to "cachette".
- `ttl` -- optional; the time-to-live, or amount of time before a cache item expires; defaults to 60 (seconds) and must be between 1 second and 1 hour (3600 seconds).
- `valkey_url` -- required when backend is set to "valkey"; the URL of the valkey-server instance, with or without authentication, in the formats "valkey://user:password@host:port" and "valkey://host:port" respectively.
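As noted above, the same configuration keys can alternatively be grouped in a Pydantic BaseSettings object returned from the decorated method. A minimal sketch, assuming the "redis" backend; the `CachetteSettings` class name and the field defaults here are illustrative:

```py
from cachette import Cachette
from pydantic import BaseSettings  # on Pydantic v2, BaseSettings lives in the pydantic-settings package

class CachetteSettings(BaseSettings):
    backend: str = 'redis'
    redis_url: str = 'redis://localhost:6379'
    ttl: int = 60

@Cachette.load_config
def get_cachette_config():
    # Cachette reads the configuration keys from the returned settings object
    return CachetteSettings()
```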
The following shows an example of setting up FastAPI Cachette in its default configuration, which is an in-memory cache implementation.
```py
from cachette import Cachette
from fastapi import FastAPI, Depends
from fastapi.responses import PlainTextResponse
from pydantic import BaseModel

app = FastAPI()

### Routing ###

class Payload(BaseModel):
    key: str
    value: str

@app.post('/', response_class=PlainTextResponse)
async def setter(payload: Payload, cachette: Cachette = Depends()):
    await cachette.put(payload.key, payload.value)
    return 'OK'

@app.get('/{key}', response_class=PlainTextResponse, status_code=200)
async def getter(key: str, cachette: Cachette = Depends()):
    value: str = await cachette.fetch(key)
    return value
```
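Assuming the snippet above is saved as `main.py` and `uvicorn` is installed, the two endpoints can be tried out from a terminal; the key and value below are hypothetical:

```sh
uvicorn main:app --port 8000 &
# store a value under a key via the POST endpoint
curl -X POST http://localhost:8000/ \
  -H 'Content-Type: application/json' \
  -d '{"key": "foo", "value": "bar"}'
# fetch it back within the default 60-second TTL
curl http://localhost:8000/foo
```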
And this is how you set up FastAPI Cachette with Redis support enabled.
```py
from cachette import Cachette
from fastapi import FastAPI, Depends
from fastapi.responses import PlainTextResponse
from pydantic import BaseModel

app = FastAPI()

@Cachette.load_config
def get_cachette_config():
    return [('backend', 'redis'), ('redis_url', 'redis://localhost:6379')]

class Payload(BaseModel):
    key: str
    value: str

@app.post('/', response_class=PlainTextResponse)
async def setter(payload: Payload, cachette: Cachette = Depends()):
    await cachette.put(payload.key, payload.value)
    return 'OK'

@app.get('/{key}', response_class=PlainTextResponse, status_code=200)
async def getter(key: str, cachette: Cachette = Depends()):
    value: str = await cachette.fetch(key)
    return value
```
- Implement `flush` and `flush_expired` methods on individual backends (not needed for Redis & Memcached backends)
- Memcached authentication (no SASL support); change library?
- Add behaviors responding to "Cache-Control" request header
- More character validations for URLs and database/table/collection names in configuration options
The easiest way to start working with this extension is to install it with pip:

```sh
pip install cachette
# or
poetry add cachette
```
Once you are familiar with the basic structure of how to dependency-inject Cachette within your
endpoints, please experiment with using external backends via extras installations, like so:

```sh
# Install FastAPI Cachette's extra requirements for Redis support
pip install cachette --install-option "--extras-require=redis"
# or install FastAPI Cachette's support for Memcached
poetry add cachette[memcached]
# or the special JSON codec written in Rust at lightning speed
poetry add cachette[orjson]
# or include the PyArrow package, making DataFrame serialization much easier
pip install cachette --install-option "--extras-require=dataframe"
```
This FastAPI extension utilizes "Dependency Injection" (To be continued)
Configuration of this FastAPI extension must be done at startup using "@Cachette.load_config" decorator (To be continued)
These are all available options with explanations and validation requirements (To be continued)
The following examples show you how to integrate this extension to a FastAPI App (To be continued)
See the "examples/" folder.
To run the examples, you must first install the extra dependencies.

Do it all in one go with this command...

```sh
pip install aiomcache motor uvicorn redis
# or
poetry install --extras examples
```

Or install for an individual example with this command...

```sh
pip install redis
# or
poetry install --extras redis
```
See features and write tests I guess.
This project utilizes multiple external backend services, namely AWS DynamoDB, Memcached, MongoDB
and Redis, as backend options, as well as an internal option called InMemoryBackend. In order to
test viability, we must have instances of these set up in the background of our testing
environment. Utilize the orchestration file attached to the repository and the `docker-compose`
command to set up testing instances of the backend services, using the following command...

```sh
docker-compose up --detach
```

When you are finished, you can stop and remove the background-running backend instances with the
following command...

```sh
docker-compose down
```
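For reference, a minimal sketch of what such an orchestration file might contain, covering only the Redis service; the image tag and port mapping here are illustrative, and the file attached to the repository covers all backends:

```yml
services:
  redis:
    image: redis:alpine
    ports:
      - '6379:6379'
```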
Now that you have backend instances running in the background, you can proceed with the tests
using the `pytest` command, as such...

```sh
pytest
```

Or you can configure the command to run specific tests, as such...

```sh
pytest -k test_load_invalid_configs
# or
pytest -k test_set_then_clear
```

All test suites must be placed under the `tests/` folder or its subfolders.
This project is licensed under the terms of the MIT license.