io.github.Ainode-tech
cache-proxy
LLM caching proxy for AI agents - exact + semantic cache. Free health.
Hosted
Transport: streamable-http (Streamable HTTP)
Tags: official, infra, deployment
Endpoint: https://cache.api.ainode.tech/mcp
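Since this is a hosted server exposed over the streamable-http transport, an MCP client can point at the endpoint above directly. A minimal sketch of a client configuration, assuming a `.mcp.json`-style schema with `mcpServers`, `type`, and `url` keys (exact key names vary by MCP client, so check your client's documentation):

```json
{
  "mcpServers": {
    "cache-proxy": {
      "type": "http",
      "url": "https://cache.api.ainode.tech/mcp"
    }
  }
}
```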