
Quickstart

Get a running Galaxy in under five minutes.

Prerequisites

| Requirement | Version | Notes |
|---|---|---|
| Docker + Docker Compose | v2+ | Required |
| Python | 3.11+ | Only for CLI usage outside Docker |
| Node.js | 18+ | Only for frontend development |

Start the stack

```shell
git clone https://github.com/aw537/orion && cd orion
cp .env.example .env
docker compose up
```

This starts 6 services:

| Service | Port | Role |
|---|---|---|
| orion-api | 8000 | FastAPI — REST API + MCP server (single process) |
| orion-frontend | 3000 | React dashboard + Galaxy visualization |
| orion-postgres | 5432 | PostgreSQL 16 — structural data, knowledge graph |
| orion-redis | 6379 | Redis 7 — hot cache with per-region TTLs |
| orion-chroma | | ChromaDB — vector embeddings (7 collections per galaxy) |
| orion-ollama | 11434 | Ollama — local embeddings + LLM inference |

First boot takes 2–3 minutes. Ollama pulls nomic-embed-text and llama3 (~4 GB). Subsequent starts are fast.

Verify:

```shell
curl http://localhost:8000/health
# → {"status": "ok", "service": "orion-api", "version": "0.1.0", "degraded": []}
```

Open http://localhost:3000 for the dashboard.
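The `degraded` field lists any backing services the API considers unhealthy while it stays up. A minimal sketch of checking the response programmatically, using the sample payload above (the exact semantics of `degraded` are an assumption here, not confirmed by the API docs):

```python
import json

# Sample payload from the /health endpoint above.
raw = '{"status": "ok", "service": "orion-api", "version": "0.1.0", "degraded": []}'

health = json.loads(raw)

# A non-empty "degraded" list would suggest some backing services
# (e.g. Redis or ChromaDB) are unavailable even though the API responds.
if health["status"] == "ok" and not health["degraded"]:
    print("all services healthy")
else:
    print("degraded:", health["degraded"])
```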

Connect your AI tool

Orion exposes 16 MCP tools at http://localhost:8000/mcp.

Claude Code:

```shell
claude mcp add orion --transport http http://localhost:8000/mcp
```

Cursor / Windsurf — add to your MCP config:

```json
{
  "orion": {
    "url": "http://localhost:8000/mcp"
  }
}
```

Once connected, the agent can call `brain.orient`, `brain.think`, `brain.recall`, and 13 other tools. See the MCP Tools Reference for the full list.

Create your Galaxy

Three paths to the same result:

Web UI — navigate to http://localhost:3000/onboarding. The 6-step wizard creates your Galaxy, first Planet, first Biome, and Sun configuration. Every answer creates real data — the Galaxy is immediately usable.

CLI:

```shell
pip install -e ./backend
orion init
```

API:

```shell
curl -X POST http://localhost:8000/api/v1/onboarding/start \
  -H "Content-Type: application/json" \
  -d '{
    "role": "Software Engineer",
    "first_biome_name": "My Project",
    "name": "Andy",
    "goal": "Build a production-ready API",
    "tools": ["FastAPI", "PostgreSQL", "Redis"]
  }'
```
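The same request can be built with Python's standard library. This is a sketch only; the actual network call is commented out so nothing is sent until the stack is running:

```python
import json
import urllib.request

# Same onboarding payload as the curl example above.
payload = {
    "role": "Software Engineer",
    "first_biome_name": "My Project",
    "name": "Andy",
    "goal": "Build a production-ready API",
    "tools": ["FastAPI", "PostgreSQL", "Redis"],
}

req = urllib.request.Request(
    "http://localhost:8000/api/v1/onboarding/start",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# Uncomment once the stack is up:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read()))
```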

Import existing knowledge

```shell
orion memory import ~/Documents/notes/
orion memory import ~/Documents/Obsidian\ Vault/
```

Orion auto-detects the format:

| Format | Detection | Behavior |
|---|---|---|
| Obsidian | `.obsidian/` directory | Parses `[[wikilinks]]` as entity relationships |
| CLAUDE.md | `CLAUDE.md` at root | Each rule → GALAXY-gravity stardust (confidence 0.95) |
| GBrain | Frontmatter has `cognitive_mode` | Maps `cognitive_mode`, `confidence`, `gravity` to metadata |
| Plain markdown | Default | YAML frontmatter + paragraph chunking |

All formats use semantic Planet routing — files are assigned to Planets based on content, not folder structure. Records the engine can't confidently route go to the Inbox for manual review.
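The detection rules in the table above can be sketched as a simple cascade. This mirrors the documented heuristics only; the real importer's logic, return values, and frontmatter parsing are assumptions:

```python
from pathlib import Path


def detect_format(root: Path) -> str:
    """Sketch of the detection table: Obsidian, CLAUDE.md, GBrain, else plain."""
    if (root / ".obsidian").is_dir():
        return "obsidian"
    if (root / "CLAUDE.md").is_file():
        return "claude-md"
    # GBrain: any markdown file whose YAML frontmatter declares cognitive_mode.
    for md in root.rglob("*.md"):
        parts = md.read_text(encoding="utf-8", errors="ignore").split("---")
        if len(parts) >= 3 and "cognitive_mode" in parts[1]:
            return "gbrain"
    return "plain-markdown"
```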

Next steps