Mneme.
"Stop grepping. Start recalling."
Local-only memory layer for AI coding tools. Index your project once, then ask for any symbol, dependency, or call graph in seconds — across sessions, restarts, and compaction.
One line. Any platform.
Paste into your terminal. Mneme installs locally, never phones home, and registers itself with your AI client. The source stays with us — only the binary leaves the build.
v0.4.0 "Genesis" · proprietary · binaries hosted by Stryx Labs
Six pieces of the brain.
Every part of Mneme has a job. Together they make AI memory feel like a real system layer, not a prompt trick.
Symbol Resolver
Three real resolvers — Rust, TypeScript, Python — turn syntactic names like spawn, super::spawn, and crate::manager::spawn into one canonical string per logical symbol.
Soft-redirect Hooks
When the AI calls Grep on something resolver-shaped, the PreToolUse hook injects a hint pointing at find_references. Never blocks. Always fail-open.
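The hint-not-block behavior can be sketched as one small fail-open function: a symbol-shaped pattern gets a hint attached, everything else passes through untouched, and any error inside the hook degrades to a no-op. The payload keys (`allow`, `hint`) are illustrative assumptions, not the real hook contract:

```python
import re

# Patterns that look like a path-qualified symbol, e.g. WorkerPool::spawn.
SYMBOLISH = re.compile(r"[A-Za-z_]\w*(::[A-Za-z_]\w*)*")

def pre_tool_use(tool: str, args: dict) -> dict:
    """Fail-open hook sketch: hint, never block."""
    try:
        pattern = args.get("pattern", "")
        if tool == "Grep" and SYMBOLISH.fullmatch(pattern):
            return {
                "allow": True,  # a hint rides along; the call still runs
                "hint": f"'{pattern}' looks like a symbol; "
                        "find_references will catch aliases grep misses.",
            }
    except Exception:
        pass                    # any hook failure degrades to a no-op
    return {"allow": True}      # always fail-open
```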
Vision SPA
14 graph views — call graph, dependency mesh, force-directed galaxy, time-travel, treemap, sunburst, sankey — paint locally in under 500ms.
50 MCP Tools
Every query the AI needs: recall_concept, find_references, blast_radius, call_graph, audit_*. Local. Deterministic. Audited.
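On the wire these are ordinary MCP calls: JSON-RPC 2.0 with the standard `tools/call` method naming the tool and its arguments. A sketch of the envelope a host would send; the argument keys for find_references are an assumption for illustration:

```python
import json

# The standard MCP tools/call envelope (JSON-RPC 2.0). Only the
# "arguments" shape is guessed here; the method name is per the MCP spec.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "find_references",
        "arguments": {"symbol": "crate::manager::spawn"},
    },
}
wire = json.dumps(request)  # what actually crosses the local socket
```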
Local-only by Design
Daemon binds to 127.0.0.1. Embeddings via ONNX Runtime, the optional LLM via llama.cpp, the graph in plain SQLite under ~/.mneme/. No telemetry. No cloud sync.
Hot Rebuild on Edit
The daemon watches your tree. Save a file and the affected slice of the graph rebuilds in under a second. Symbol-anchored embeddings keep the same canonical name across edits.
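The batching trick that makes this cheap is a debounce: rapid saves accumulate, and only once the edits settle does one incremental re-index run over the touched files. A minimal sketch under assumed names (none of this is Mneme's internal API):

```python
import time

class HotRebuilder:
    """Illustrative debounce: batch rapid saves, re-index only touched files."""

    def __init__(self, reindex, debounce_s: float = 0.2):
        self.reindex = reindex           # callback: set of paths -> None
        self.debounce_s = debounce_s
        self.pending: set[str] = set()
        self.last_event = 0.0

    def on_save(self, path: str) -> None:
        self.pending.add(path)           # fed by the file watcher
        self.last_event = time.monotonic()

    def tick(self) -> None:
        # Called from the daemon loop; flushes once edits settle, so a
        # burst of saves triggers one incremental rebuild, not many.
        if self.pending and time.monotonic() - self.last_event >= self.debounce_s:
            batch, self.pending = self.pending, set()
            self.reindex(batch)          # rebuild only the affected slice
```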
Ask. Recall.
Mneme's CLI is what your AI host sees through MCP, but it also works directly at the terminal.
When grep isn't enough.
The problem.
When you ask an AI "where does WorkerPool::spawn get called?", the cheap answer is regex over text. The slightly less-cheap answer is grep. Both miss super::spawn, crate::manager::spawn, use crate::manager; spawn(), and aliased re-exports.
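The miss rate is easy to demonstrate: a literal pattern finds only the spelling you typed, and every other spelling of the same logical call sails past it. A toy example with the call forms listed above:

```python
import re

# Four syntactic spellings of the same logical call site.
call_sites = [
    "WorkerPool::spawn(job)",
    "super::spawn(job)",
    "crate::manager::spawn(job)",
    "use crate::manager::spawn; spawn(job)",
]
hits = [s for s in call_sites if re.search(r"WorkerPool::spawn", s)]
assert len(hits) == 1    # three of four real call sites go unreported
```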
The answer.
Mneme answers with structural certainty — parser-built call graphs, symbol resolver, BGE embeddings anchored on canonical names — all in a daemon the AI talks to via MCP.
Measured against the golden benchmark.
| System | Score (pre-Genesis) | Score (Genesis) | Notes |
|---|---|---|---|
| Mneme | 2 / 10 | ~6 / 10 parity | Closed by symbol resolver |
| CRG (reference) | 6 / 10 | 6 / 10 | Baseline |
Root cause of the pre-Genesis score: no symbol resolver. Genesis adds one and closes the gap to rough parity; the remaining token-reduction gap is attributed to resolver implementation details.
Three places to start.
Mneme has depth. Here's the path that makes the rest of it click.
Architecture
What's running on your machine, what data goes where, what doesn't go on the network.
Symbol resolver
The keystone of Genesis. The reason your AI gives a sharper answer than yesterday.
MCP tools
The 50 tools your AI can call. Every one local, deterministic, and audited.
Common questions.
Does Mneme send any data off my machine?
The default install never opens an outbound connection. The daemon binds to 127.0.0.1. Embeddings run locally via ONNX Runtime, the optional LLM via llama.cpp, and the graph lives in plain SQLite under ~/.mneme/. The only opt-in network feature is federated_similar, which exchanges blake3-hashed signatures only between machines you control — and only if you turn it on.
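The shape of that exchange is simple: only a fixed-size digest of local content ever crosses the wire, never the content itself. A sketch of the idea; Mneme uses blake3, and since blake3 is not in the Python standard library, `hashlib.blake2b` stands in here:

```python
import hashlib

def signature(content: str) -> str:
    """Fixed-size digest of local content (blake2b standing in for blake3)."""
    return hashlib.blake2b(content.encode(), digest_size=32).hexdigest()

# Two machines you control compute signatures independently; comparing
# digests reveals similarity on a match and nothing at all on a miss.
local = signature("fn spawn(&self, job: Job) -> Handle { /* ... */ }")
remote = signature("fn spawn(&self, job: Job) -> Handle { /* ... */ }")
assert local == remote
```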
How is this different from grep, ripgrep, or my IDE's "Find References"?
Grep is text matching. Mneme is a real symbol resolver — it knows that super::spawn, crate::manager::spawn, and a re-exported spawn() are the same logical symbol, and it indexes their callers, dependencies, and tests in a graph database. Your IDE's "Find References" is scoped to a live editor session and resets with it; Mneme's graph covers the whole project, persists across sessions, and is exposed through 50 MCP tools your AI host can call.
Which AI hosts work with Mneme?
Anything that speaks the Model Context Protocol (MCP) — Claude Code, Claude Desktop, Continue, Cursor, and the growing list of MCP-aware IDEs. Mneme registers an MCP server that exposes 50 tools and stays running in the background.
How do I update? Will my graph survive?
Run mneme update. Genesis ships an apply-with-rollback updater: it downloads the new binary, runs a post-swap health check, and restores the old binary automatically if the check fails. Your graph survives the update — the schema is forward-only, never drops columns, never renames. On the keystone release specifically, run mneme rebuild once after upgrade to pick up the symbol-anchored embeddings.
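The apply-with-rollback pattern reduces to: keep the old binary, swap in the new one, probe it, and restore automatically if the probe fails. A sketch under assumed names (the function shape and the `healthy` callback are illustrative, not Mneme's internals):

```python
import os
import shutil

def apply_with_rollback(binary: str, new_binary: str, healthy) -> bool:
    """Swap in a new binary, health-check it, roll back on failure."""
    backup = binary + ".bak"
    shutil.copy2(binary, backup)          # keep a rollback copy
    shutil.copy2(new_binary, binary)      # swap in the new build
    ok = False
    try:
        ok = healthy(binary)              # post-swap health check
    finally:
        if not ok:
            shutil.copy2(backup, binary)  # restore the old binary
        os.remove(backup)
    return ok
```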
What's the license?
Personal-Use License. Free for personal use. Free for internal commercial use within your own organization. Redistribution and derivative works are prohibited. Source is not public. For commercial inquiries: hello@stryxlabs.com.
What platforms are supported?
Native binaries for macOS (Apple Silicon arm64 + Intel x86_64), Linux (x86_64), and Windows (x86_64). Also distributed as a Python wrapper via pipx and on Windows via winget.
How big is the install? How much disk does the graph take?
The Mneme binary is roughly 50-65 MB depending on platform. The graph size scales with your codebase — a medium-sized Rust workspace (50k LOC, 2k symbols) lands at around 60-120 MB across all 27 sharded SQLite stores.
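Sharding across a fixed number of stores is typically just a stable hash of the key. A sketch of how a canonical symbol could be routed to one of 27 stores; the hash choice and filename pattern here are assumptions, not Mneme's actual layout:

```python
import hashlib

N_SHARDS = 27

def shard_for(symbol: str) -> int:
    """Deterministically map a canonical symbol to one of N_SHARDS stores."""
    digest = hashlib.blake2b(symbol.encode(), digest_size=8).digest()
    return int.from_bytes(digest, "big") % N_SHARDS

shard = shard_for("crate::manager::spawn")
store = f"~/.mneme/graph/shard_{shard:02d}.db"   # hypothetical path layout
```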
Install once.
Run forever.
The graph stays fresh as you edit. Free for personal use, no signup, no telemetry.