Local-first, autonomous agent operating system for personal AI assistance.
Cognition + Thor — Intelligence with Power
Ollama (local), OpenAI, Anthropic, Gemini, Groq, DeepSeek, Mistral, Together, OpenRouter, xAI, Cerebras, GitHub Models, AWS Bedrock, Hugging Face, Moonshot. Auto-detects from API key.
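As a rough sketch of how key-based auto-detection can work, the snippet below maps well-known API-key environment variables to provider names. The variable names and detection logic here are illustrative assumptions, not Cognithor's actual implementation.

```python
import os

# Hypothetical mapping of API-key environment variables to providers.
# The real project may use different variable names or detection rules.
PROVIDER_ENV_KEYS = {
    "OPENAI_API_KEY": "openai",
    "ANTHROPIC_API_KEY": "anthropic",
    "GEMINI_API_KEY": "gemini",
    "GROQ_API_KEY": "groq",
    "DEEPSEEK_API_KEY": "deepseek",
}

def detect_providers(env=None):
    """Return providers whose API keys are present in the environment."""
    env = os.environ if env is None else env
    return [name for var, name in PROVIDER_ENV_KEYS.items() if env.get(var)]

print(detect_providers({"OPENAI_API_KEY": "sk-...", "GROQ_API_KEY": "gsk-..."}))
# → ['openai', 'groq']
```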
CLI, Web UI, REST API, Telegram, Discord, Slack, WhatsApp, Signal, iMessage, Teams, Matrix, Google Chat, Mattermost, Feishu, IRC, Twitch, Voice (STT/TTS). Auto-detects from tokens.
Core identity, episodic logs, semantic knowledge graph, procedural skills, working memory. 3-channel hybrid search: BM25 + vector + graph.
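One common way to combine ranked results from several retrieval channels is reciprocal rank fusion (RRF). The sketch below fuses three hypothetical ranked lists (BM25, vector, graph); it illustrates the general technique, not Cognithor's actual fusion strategy, and the document IDs are made up.

```python
# Reciprocal rank fusion: each channel contributes 1 / (k + rank + 1)
# to a document's fused score; documents ranked well by several
# channels rise to the top.
def rrf_fuse(rankings, k=60):
    """rankings: list of ranked doc-id lists; returns ids by fused score."""
    scores = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank + 1)
    return sorted(scores, key=scores.get, reverse=True)

bm25   = ["ep_12", "note_3", "skill_7"]   # keyword channel
vector = ["note_3", "ep_12", "fact_9"]    # embedding channel
graph  = ["fact_9", "note_3"]             # knowledge-graph channel
print(rrf_fuse([bm25, vector, graph]))
# → ['note_3', 'ep_12', 'fact_9', 'skill_7']
```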
Deterministic Gatekeeper (4 risk levels), multi-level sandbox, SHA-256 audit chain, encrypted credential vault, EU AI Act compliance, red-teaming suite.
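A deterministic gatekeeper with four risk levels can be as simple as a static lookup table checked before every tool call. The level names, tool names, and rules below are assumptions for illustration; Cognithor's actual policy may differ.

```python
from enum import IntEnum

# Hypothetical four-level risk scale (names are illustrative).
class Risk(IntEnum):
    SAFE = 0        # read-only, no side effects
    LOW = 1         # writes inside the agent workspace
    HIGH = 2        # network or shell access
    CRITICAL = 3    # credentials, destructive operations

# Static, deterministic policy table: no LLM involved in this decision.
POLICY = {
    "memory.search": Risk.SAFE,
    "fs.write": Risk.LOW,
    "shell.exec": Risk.HIGH,
    "vault.read": Risk.CRITICAL,
}

def gate(tool, max_allowed=Risk.HIGH):
    """Allow the call only if its risk level is within the configured cap."""
    risk = POLICY.get(tool, Risk.CRITICAL)  # unknown tools default to CRITICAL
    return risk <= max_allowed

print(gate("fs.write"), gate("vault.read"))  # → True False
```

Defaulting unknown tools to the highest risk level is the fail-closed choice: anything not explicitly classified is blocked.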
Full web dashboard (React 19 + Vite 7) with backend launcher, live config editing, agent management, prompt editing, cron jobs, MCP servers, A2A settings.
13+ MCP tool servers (filesystem, shell, memory, web, browser, media). Agent-to-Agent protocol (Linux Foundation RC v1.0) for inter-agent communication.
Planner (LLM) → Gatekeeper (deterministic policy) → Executor (sandboxed). No hallucinated tool calls — every action is policy-checked before execution.
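The three-stage loop above can be sketched in a few lines: the planner proposes a tool call, the deterministic gatekeeper checks it against policy, and only approved actions reach the executor. Function names and the allow-list here are illustrative assumptions.

```python
# Minimal sketch of the Planner → Gatekeeper → Executor loop.
ALLOWED_TOOLS = {"memory.search", "fs.read"}  # hypothetical allow-list

def plan(user_request):
    # Stand-in for the LLM planner: returns a proposed tool call.
    return {"tool": "fs.read", "args": {"path": "notes.md"}}

def gatekeeper(action):
    # Deterministic policy check: only explicitly allowed tools pass,
    # so a hallucinated tool name can never execute.
    return action["tool"] in ALLOWED_TOOLS

def execute(action):
    # Stand-in for the sandboxed executor.
    return f"executed {action['tool']}"

action = plan("summarize my notes")
result = execute(action) if gatekeeper(action) else "blocked by policy"
print(result)  # → executed fs.read
```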
All data stays on your machine. No cloud required — Ollama runs fully local. Optional cloud LLM providers available. Full GDPR compliance.
Reflector auto-synthesizes reusable skills from successful sessions. Learned procedures are suggested for future similar requests.
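A learned procedure can be pictured as a small record pairing a trigger pattern with an ordered list of tool calls. The field names and matching rule below are assumptions sketching the idea, not Cognithor's actual procedural-memory schema.

```python
from dataclasses import dataclass, field

# Hypothetical shape of a skill record the Reflector might synthesize
# from a successful session.
@dataclass
class Skill:
    name: str
    trigger: str                                # request pattern it applies to
    steps: list = field(default_factory=list)   # ordered tool calls to replay
    success_count: int = 1                      # how often it has worked

def matches(skill, request):
    """Naive trigger match; a real system would use semantic similarity."""
    return skill.trigger in request.lower()

skill = Skill("weekly_report", "weekly report",
              steps=["memory.search", "fs.write"])
print(matches(skill, "Create my weekly report"))  # → True
```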
┌──────────────────────────────────────────────────────────────┐
│            Control Center UI (React 19 + Vite 7)             │
├──────────────────────────────────────────────────────────────┤
│         REST API (FastAPI, 20+ endpoints, port 8741)         │
├──────────────────────────────────────────────────────────────┤
│                        Channels (17)                         │
│  CLI · Web · Telegram · Discord · Slack · WhatsApp · Signal  │
├──────────────────────────────────────────────────────────────┤
│                        Gateway Layer                         │
│               Session Management · Agent Loop                │
├────────────┬──────────────┬──────────────────────────────────┤
│  Planner   │  Gatekeeper  │             Executor             │
│   (LLM)    │   (Policy)   │            (Sandbox)             │
├────────────┴──────────────┴──────────────────────────────────┤
│             MCP Tool Layer (13+) + A2A Protocol              │
├──────────────────────────────────────────────────────────────┤
│                 Multi-LLM Backend Layer (15)                 │
├──────────────────────────────────────────────────────────────┤
│                   5-Tier Cognitive Memory                    │
│      Core · Episodic · Semantic · Procedural · Working       │
└──────────────────────────────────────────────────────────────┘
# Clone and install
git clone https://github.com/Alex8791-cyber/cognithor.git
cd cognithor
pip install -e ".[all,dev]"
# Pull Ollama models
ollama pull qwen3:32b
ollama pull qwen3:8b
ollama pull nomic-embed-text
# Start CLI
cognithor
# Or start the Control Center UI
cd ui && npm install && npm run dev