Local research agent that verifies its own answers. Runs on Gemma 3 4B + Ollama, $0/query.

stdio · community · application

Package Details

Transport: stdio
Runtime: agentic-research-mcp

Environment Variables

OPENAI_BASE_URL

Any OpenAI-compatible endpoint. Default: OpenAI cloud. Use http://localhost:11434/v1 for Ollama.

OPENAI_API_KEY
Secret

API key for the endpoint above. Use 'ollama' as a sentinel value when running locally against Ollama.
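These two variables are all that is needed to point the agent at a local Ollama server instead of the OpenAI cloud. A minimal sketch, assuming a default Ollama install listening on port 11434:

```shell
# Point the OpenAI-compatible client at the local Ollama server.
export OPENAI_BASE_URL="http://localhost:11434/v1"
# Ollama does not check the key, but the client requires one;
# 'ollama' is the sentinel value this package expects.
export OPENAI_API_KEY="ollama"
```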

MODEL_SYNTHESIZER

Model identifier used for the synthesize node. Defaults to 'gpt-5-mini'; set to 'gemma3:4b' for a local Ollama run.

MODEL_PLANNER

Model for the plan / classify / critic / compress / verify nodes. Defaults to 'gpt-5-nano'.

EMBED_MODEL

Embedding model identifier (for retrieval + memory). Default 'text-embedding-3-small'; use 'nomic-embed-text' on Ollama.
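For the fully local setup from the headline (Gemma 3 4B on Ollama), the three model variables can be set together. A sketch, assuming the models have already been pulled with `ollama pull`; reusing 'gemma3:4b' for the planner nodes is an assumption, since the docs above only state its cloud default:

```shell
# Synthesis runs on the local Gemma model, as in the headline setup.
export MODEL_SYNTHESIZER="gemma3:4b"
# Assumption: reuse the same model for plan/classify/critic/compress/verify;
# a smaller local model could be substituted if available.
export MODEL_PLANNER="gemma3:4b"
# Local embedding model for retrieval and memory.
export EMBED_MODEL="nomic-embed-text"
```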

SEARXNG_URL

Base URL of the SearXNG meta-search instance. Default http://localhost:8888.

LOCAL_CORPUS_PATH

Path to an index directory built via scripts/index_corpus.py. When set, hits from the local corpus are merged with web search results.

ENABLE_RERANK

Set to '1' to enable the BAAI/bge-reranker-v2-m3 cross-encoder rerank stage. First run downloads ~560MB.

ENABLE_FETCH

Set to '0' to skip the trafilatura full-page fetch stage. Default '1'.
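Putting the variables above together, a complete local-only environment might look like the following. This is a sketch: the corpus path is a placeholder, and the planner model choice is an assumption as noted:

```shell
# Endpoint: local Ollama with the sentinel key.
export OPENAI_BASE_URL="http://localhost:11434/v1"
export OPENAI_API_KEY="ollama"

# Models (all local; planner model choice is an assumption).
export MODEL_SYNTHESIZER="gemma3:4b"
export MODEL_PLANNER="gemma3:4b"
export EMBED_MODEL="nomic-embed-text"

# Search and optional stages.
export SEARXNG_URL="http://localhost:8888"
export ENABLE_RERANK="1"   # first run downloads ~560MB of reranker weights
export ENABLE_FETCH="1"    # full-page fetch via trafilatura (the default)

# Optional: local corpus built via scripts/index_corpus.py (placeholder path).
# export LOCAL_CORPUS_PATH="/path/to/index"
```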