Persistent project knowledge graph for coding agents. MCP server with semantic search, in-process embeddings, and web explorer.
{
  "mcpServers": {
    "megamemory": {
      "command": "node",
      "args": ["/path/to/MegaMemory/dist/index.js"]
    }
  }
}
Add this to ~/Library/Application Support/Claude/claude_desktop_config.json (Mac) or %APPDATA%\Claude\claude_desktop_config.json (Windows). Replace <placeholder> values with your API keys or paths.
What people ask about MegaMemory
What is 0xK3vin/MegaMemory?
0xK3vin/MegaMemory is an MCP server for the Claude AI ecosystem: a persistent project knowledge graph for coding agents with semantic search, in-process embeddings, and a web explorer. It has 166 GitHub stars and was last updated 9d ago.
How do I install MegaMemory?
You can install MegaMemory by cloning the repository (https://github.com/0xK3vin/MegaMemory) or following the README instructions on GitHub. ClaudeWave also provides quick install blocks on this page.
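As a rough sketch, a from-source install typically looks like the steps below. The exact script names (e.g. whether the build step is `npm run build`) are assumptions — defer to the repository's README.

```shell
# Clone and build MegaMemory, then point Claude Desktop at the built entry file.
# "npm run build" and the dist/index.js path are assumptions based on the
# config snippet above -- check the repo's README for the actual commands.
git clone https://github.com/0xK3vin/MegaMemory.git
cd MegaMemory
npm install
npm run build        # expected to produce dist/index.js
node dist/index.js   # optional smoke test: start the MCP server directly
```

After building, use the absolute path to dist/index.js in the claude_desktop_config.json snippet shown earlier.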
Is 0xK3vin/MegaMemory safe to use?
0xK3vin/MegaMemory has not been audited yet by our security agent. Review the original repository on GitHub before using it in production.
Who maintains 0xK3vin/MegaMemory?
0xK3vin/MegaMemory is maintained by 0xK3vin. The last recorded GitHub activity is from 9d ago, with 5 open issues.
Are there alternatives to MegaMemory?
Yes. On ClaudeWave you can browse similar MCP servers at /categories/mcp, sorted by popularity or recent activity.
Deploy MegaMemory to your cloud
Ship this repo to production in minutes. Each platform spins up its own environment with editable env vars.
Maintain this repo? Add a badge to your README
Drop the badge into your GitHub README to show it's tracked on ClaudeWave. Each badge links back to this page and reflects the live Trust Score.
<a href="https://claudewave.com/repo/0xk3vin-megamemory"><img src="https://claudewave.com/api/badge/0xk3vin-megamemory" alt="Featured on ClaudeWave: 0xK3vin/MegaMemory" width="320" height="64" /></a>
More MCP Servers
Fair-code workflow automation platform with native AI capabilities. Combine visual building with custom code, self-host or cloud, 400+ integrations.
User-friendly AI Interface (Supports Ollama, OpenAI API, ...)
An open-source AI agent that brings the power of Gemini directly into your terminal.
A collection of MCP servers.
The fastest path to AI-powered full stack observability, even for lean teams.
The all-in-one AI productivity accelerator. On device and privacy first with no annoying setup or configuration.