agentic-research
A local research agent that verifies its own answers. Runs on Gemma 3 4B via Ollama at $0 per query.
Package Details
agentic-research-engine
Environment Variables
- Any OpenAI-compatible endpoint base URL. Default: OpenAI cloud; use http://localhost:11434/v1 for Ollama.
- API key for the endpoint above. Use 'ollama' as a sentinel value when running locally against Ollama.
- Model identifier used for the synthesize node. Default: 'gpt-5-mini'; set to 'gemma3:4b' for Mac-local Ollama.
- Model for the plan / classify / critic / compress / verify nodes. Default: 'gpt-5-nano'.
- Embedding model identifier for retrieval and memory. Default: 'text-embedding-3-small'; use 'nomic-embed-text' on Ollama.
- Base URL of the SearXNG meta-search instance. Default: http://localhost:8888.
- Path to an index directory built via scripts/index_corpus.py. When set, local corpus hits augment web search results.
- Set to '1' to enable the BAAI/bge-reranker-v2-m3 cross-encoder reranking stage. The first run downloads ~560 MB of model weights.
- Set to '0' to skip the trafilatura full-page fetch stage. Default: '1'.
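For a fully local run, the configuration described above might look like the shell snippet below. Note that this listing omits the actual environment-variable names, so every name here is a hypothetical placeholder; check the project's source or example env file for the real ones.

```shell
# All variable names below are HYPOTHETICAL placeholders -- consult the
# project source for the real names. Values match the Ollama setup described above.
export LLM_BASE_URL="http://localhost:11434/v1"   # Ollama's OpenAI-compatible endpoint
export LLM_API_KEY="ollama"                       # sentinel value for local Ollama
export SYNTHESIS_MODEL="gemma3:4b"                # model for the synthesize node
export UTILITY_MODEL="gemma3:4b"                  # plan/classify/critic/compress/verify nodes
export EMBEDDING_MODEL="nomic-embed-text"         # retrieval + memory embeddings
export SEARXNG_BASE_URL="http://localhost:8888"   # local SearXNG meta-search instance
export ENABLE_RERANK="1"                          # bge-reranker-v2-m3 stage (~560 MB first run)
export FETCH_FULL_PAGES="1"                       # trafilatura full-page fetch (the default)
```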
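As a sketch of how the agent could talk to the SearXNG instance configured above: SearXNG exposes a JSON search API at `/search?q=...&format=json`, provided the `json` format is enabled in the instance's settings.yml. The helper name below is illustrative, not part of this package.

```python
from urllib.parse import urlencode


def searxng_search_url(base_url: str, query: str) -> str:
    """Build a SearXNG JSON search URL.

    Requires the 'json' format to be enabled in the instance's settings.yml;
    the function name is a hypothetical helper, not part of this package.
    """
    return f"{base_url.rstrip('/')}/search?" + urlencode({"q": query, "format": "json"})


url = searxng_search_url("http://localhost:8888", "retrieval augmented generation")
print(url)
# Against a running instance, fetching the results would look like:
# import json, urllib.request
# results = json.load(urllib.request.urlopen(url))["results"]
```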