Caching¶
Vayu's cache module provides a TTL-aware @cached decorator that works on both sync and async functions, plus two storage backends and pluggable serializers.
The building blocks¶
- Cache: abstract base class. Subclasses implement _read/_write/_delete (sync and async).
- FileCache(path, serializer=None): persistent on-disk cache. One file per key, atomic writes via tmp → replace. Default serializer is Pickler.
- MemoryCache(): in-process dict-backed cache. No serialization cost.
- Serializer: Pickler (default) or Jsoner (uses orjson when installed, else stdlib json).
- mem_cache / mem_cached: module-level MemoryCache instance and its .cached shortcut.
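The tmp → replace strategy FileCache uses is the standard trick for atomic file writes: readers only ever see a complete file. A minimal sketch of the idea (a standalone illustration, not vayu's actual internals; the helper name is hypothetical):

```python
import os
import tempfile


def atomic_write(path: str, data: bytes) -> None:
    """Write data to path so readers never observe a partial file."""
    # Write to a temporary file in the same directory (same filesystem),
    # then atomically swap it into place with os.replace().
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path) or ".")
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(data)
        os.replace(tmp, path)  # atomic on both POSIX and Windows
    except BaseException:
        os.unlink(tmp)  # clean up the temp file on any failure
        raise
```

os.replace is atomic on the same filesystem, which is why the temp file is created next to the destination rather than in the system temp directory.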
@cached basics¶
```python
from datetime import timedelta

from vayu.cache import FileCache

cache = FileCache("/var/cache/myapp")

@cache.cached(ttl=timedelta(minutes=10))
async def fetch(url: str) -> bytes:
    ...
```
ttl accepts a timedelta or a number of seconds. It must be positive — None or <= 0 raises ValueError.
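That validation rule can be sketched as a small helper (the function name and exact error message are illustrative, not vayu's API):

```python
from datetime import timedelta


def normalize_ttl(ttl) -> float:
    """Coerce a timedelta or a number of seconds into positive seconds."""
    seconds = ttl.total_seconds() if isinstance(ttl, timedelta) else ttl
    if seconds is None or seconds <= 0:
        raise ValueError("ttl must be a positive duration")
    return float(seconds)
```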
How keys are built¶
By default, the key is the function's dotted path plus a SHA-256 digest of the call arguments: <module>.<qualname>:<sha256(args, kwargs)>.
Override with prefix= and/or key=:
```python
@cache.cached(ttl=60, prefix="users")
def get_user(user_id: int): ...
# key: users:<sha256(args)>

@cache.cached(ttl=60, key="singleton")
def config(): ...
# key: <module>.config:singleton

@cache.cached(ttl=60, key=lambda user_id, **_: f"u:{user_id}")
def get_user(user_id: int, *, include_deleted=False): ...
# key: <module>.get_user:u:42
```
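A sketch of how keys with these shapes can be assembled (the exact argument encoding vayu hashes is an assumption here, and build_key is a hypothetical helper):

```python
import hashlib


def build_key(func, args, kwargs, *, prefix=None, key=None) -> str:
    """Mimic the key shapes shown above: <base>:<suffix>."""
    base = prefix or f"{func.__module__}.{func.__qualname__}"
    if callable(key):
        suffix = key(*args, **kwargs)
    elif key is not None:
        suffix = key
    else:
        # Hash the call arguments; repr() is one possible encoding.
        blob = repr((args, sorted(kwargs.items()))).encode()
        suffix = hashlib.sha256(blob).hexdigest()
    return f"{base}:{suffix}"
```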
Key construction time is measured; if it exceeds 1 ms, Vayu logs a warning (turn off with log=False).
Eviction¶
Each decorated function gets a .evict(*args, **kwargs) attribute that deletes the entry for those arguments:
```python
@cache.cached(ttl=60)
async def fetch(url): ...

await fetch("https://example.com")
await fetch.evict("https://example.com")  # next call re-fetches
```
Picking a serializer¶
```python
from vayu.cache import FileCache, Jsoner, Pickler

json_cache = FileCache("/tmp/json_cache", Jsoner())
pickle_cache = FileCache("/tmp/pickle_cache", Pickler())

# Per-call override
@pickle_cache.cached(ttl=60, serializer=Jsoner())
def get_dict() -> dict: ...
```
Use Jsoner when you want human-readable cache files or need to share cached data across languages. Use Pickler (the default) when you cache arbitrary Python objects.
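The tradeoff can be demonstrated with the stdlib equivalents (these classes are a sketch of the serializer shape, not vayu's actual Jsoner/Pickler implementations):

```python
import json
import pickle


class JsonSerializer:
    """Text output: human-readable and portable across languages."""

    def dumps(self, obj) -> bytes:
        return json.dumps(obj).encode()

    def loads(self, data: bytes):
        return json.loads(data)


class PickleSerializer:
    """Binary output: handles arbitrary Python objects, Python-only."""

    def dumps(self, obj) -> bytes:
        return pickle.dumps(obj)

    def loads(self, data: bytes):
        return pickle.loads(data)
```

JSON round-trips only basic types (dicts, lists, strings, numbers, booleans, None); pickle can round-trip a set or a custom class instance, but the resulting files are opaque bytes.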
Sync vs. async¶
The decorator inspects the wrapped function once:
- async def → async wrapper using asyncio.Lock
- def → sync wrapper using threading.Lock
Locking is per-key and prevents a thundering herd on cache miss.
Concurrency semantics¶
On a miss, the first caller for a given key acquires the lock, computes, and writes. Subsequent callers block, re-read, and return the freshly written value. A second miss check happens inside the lock to handle the case where the value was populated while waiting.
Direct read/write¶
Skip the decorator when you want explicit control:
```python
from datetime import timedelta

from vayu.cache import FileCache

cache = FileCache("/tmp/c")
await cache.write("greeting", "hello", ttl=timedelta(minutes=5))
value = await cache.read("greeting")
```
read raises CacheMissError on miss or expiry — handle it explicitly.
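The miss/expiry semantics can be sketched with a minimal in-memory store (TinyCache is a toy stand-in, and CacheMissError here is a local class matching only the name of the exception vayu raises):

```python
import time


class CacheMissError(LookupError):
    pass


class TinyCache:
    """Store (deadline, value) pairs; absent or expired reads raise."""

    def __init__(self):
        self._data = {}

    def write(self, key, value, ttl: float) -> None:
        self._data[key] = (time.monotonic() + ttl, value)

    def read(self, key):
        try:
            deadline, value = self._data[key]
        except KeyError:
            raise CacheMissError(key) from None
        if time.monotonic() >= deadline:
            del self._data[key]  # lazy eviction on expiry
            raise CacheMissError(key)
        return value
```

Callers wrap reads in try/except CacheMissError and recompute on miss, mirroring the explicit handling recommended above.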