LLM caching proxy for AI agents, with an exact and a semantic cache. Free health checks.

Hosted
Streamable HTTP · official · infra

Deployment

Hosted
streamable-http: https://cache.api.ainode.tech/mcp
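As a rough sketch of how a client would talk to this hosted endpoint over the streamable-http transport: the client POSTs a JSON-RPC 2.0 `initialize` request to the URL above. The endpoint URL comes from the listing; the client name, version, and protocol version string below are illustrative assumptions, and no request is actually sent.

```python
import json

# Endpoint taken from the deployment listing above.
MCP_ENDPOINT = "https://cache.api.ainode.tech/mcp"

def build_initialize_request(protocol_version: str = "2025-03-26") -> dict:
    """Build the JSON-RPC 2.0 'initialize' request an MCP client sends first.

    The protocol version and clientInfo values are placeholder assumptions.
    """
    return {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "initialize",
        "params": {
            "protocolVersion": protocol_version,
            "capabilities": {},
            # Hypothetical client identity for illustration only.
            "clientInfo": {"name": "example-client", "version": "0.1.0"},
        },
    }

# Streamable HTTP clients POST JSON and advertise that they can accept
# either a plain JSON response or a server-sent event stream.
headers = {
    "Content-Type": "application/json",
    "Accept": "application/json, text/event-stream",
}
body = json.dumps(build_initialize_request())
```

An actual client would POST `body` with `headers` to `MCP_ENDPOINT` and then continue the MCP handshake from the server's response.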