io.github.Lykhoyda
ask-ollama
Bridge Claude with local Ollama LLMs for private AI-to-AI collaboration — no API keys, fully local
Tags: stdio · community · application
Package Details
Package: ask-ollama-mcp
Transport: stdio
Environment Variables
OLLAMA_HOST (str)
Default: http://localhost:11434
Ollama server address
GMCPT_TIMEOUT_MS (str)
Default: 300000
Timeout for Ollama execution, in milliseconds (300000 ms = 5 minutes)
GMCPT_LOG_LEVEL (str)
Default: warn
Log verbosity: debug, info, warn, error
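The three variables above are all optional strings with the defaults listed. A minimal sketch of how a server like this might resolve them (hypothetical code, not taken from the ask-ollama-mcp source; variable names mirror the table above):

```python
import os

# Resolve configuration from the environment, falling back to the
# documented defaults. This is an illustrative sketch; the actual
# ask-ollama-mcp implementation may differ.
OLLAMA_HOST = os.environ.get("OLLAMA_HOST", "http://localhost:11434")
TIMEOUT_MS = int(os.environ.get("GMCPT_TIMEOUT_MS", "300000"))
LOG_LEVEL = os.environ.get("GMCPT_LOG_LEVEL", "warn")

print(OLLAMA_HOST, TIMEOUT_MS, LOG_LEVEL)
```

Note that GMCPT_TIMEOUT_MS is declared as a string, so a consumer would need to parse it to a number before using it as a timeout, as shown.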