
Local-first is a principle, not a feature

Orion Team
2026-04-15 · 5 min

We get asked constantly: "When are you launching a cloud version?"

Never. And here's why that's a feature, not a limitation.

Your knowledge graph is your cognitive fingerprint

Orion's knowledge graph contains your decisions, your reasoning patterns, the relationships between your ideas, and the contradictions you've resolved. Over time, it becomes a detailed map of how you think — what you prioritize, where your expertise concentrates, how your understanding evolves.

This is qualitatively different from storing files or chat logs. It's structured cognition. We don't think that data should ever leave your device, and we don't want to be in the business of hosting it.

The architecture makes local-first practical

Orion runs Postgres, Redis, ChromaDB, Ollama, the API, and the dashboard in Docker on your machine. The resource footprint:

Resource      Requirement
Disk          ~2 GB (including Ollama models)
RAM (idle)    ~500 MB
RAM (active)  ~1.2 GB
GPU           Not required
Network       Not required (works fully offline)

brain.recall hits the Redis cache in under 1 ms. brain.think completes in ~200 ms, including entity extraction and graph linking. There's no round trip to a remote server, no cold starts, no egress costs, and no rate limits.

For most individual and small-team use cases, a laptop has more than enough compute. The bottleneck in AI memory isn't processing power — it's the quality of the retrieval pipeline.

What local-first means in practice

  • No accounts. No sign-up, no email, no password. Run docker compose up and you have a Galaxy.
  • No telemetry. Orion sends nothing to any server. There's no analytics, no crash reporting, no usage tracking.
  • Full offline support. When you use Ollama for embeddings and the LLM, every feature works without an internet connection.
  • You own the database. It's Postgres. Run pg_dump for backups. Query it directly with psql if you want.
  • Portable. Copy the Docker volumes to another machine and you have an exact replica of your Galaxy.
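
The backup bullet above can be sketched as a small script. Everything here is an assumption to be checked against your own docker-compose.yml and .env — the compose service name ("postgres") and the role and database names ("orion") are illustrative, not values from Orion's actual configuration.

```python
# Hedged backup sketch: run pg_dump inside the Postgres container via
# Docker Compose. Service, role, and database names are assumptions.
import shutil
import subprocess
from datetime import date

backup = f"galaxy-{date.today().isoformat()}.sql"

if shutil.which("docker"):  # skip gracefully where Docker isn't installed
    with open(backup, "w") as out:
        subprocess.run(
            ["docker", "compose", "exec", "-T", "postgres",
             "pg_dump", "-U", "orion", "orion"],
            stdout=out,
            check=False,  # don't raise if the stack isn't running
        )
```

Because the database is plain Postgres, restoring on another machine is just feeding that file back through psql — no export format, no vendor tooling.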

The business model question

"But how will you make money?"

Not from hosting your cognitive fingerprint. Orion is source-available under the Functional Source License (FSL). You can read, modify, and self-host the code. After two years, each release converts to Apache 2.0. If we build commercial products, they'll be tools that run on top of your local Galaxy — visualization, team coordination, advanced analytics — never a replacement for local ownership.

We believe the right architecture for AI memory is the same architecture that made Git successful: local-first, with optional collaboration layers on top.

Get started

git clone https://github.com/aw537/orion && cd orion
cp .env.example .env
docker compose up

Your data stays on your machine. Read the docs →