Persistent Intelligence Infrastructure for AI Agents
- ✓ Open-source license (MIT)
- ✓ Healthy fork ratio
- ✓ Clear description
- ✓ Topics declared
Add the server to your Claude Desktop configuration at `~/Library/Application Support/Claude/claude_desktop_config.json` (Mac) or `%APPDATA%\Claude\claude_desktop_config.json` (Windows), replacing `<placeholder>` values with your own API keys or paths:

```json
{
  "mcpServers": {
    "in-memoria": {
      "command": "node",
      "args": ["/path/to/In-Memoria/dist/index.js"]
    }
  }
}
```
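Before restarting Claude Desktop, it can help to confirm the config file parses as valid JSON, since a stray trailing comma will make the app silently ignore the server entry. A minimal sketch (the local file path is illustrative, and `python3` on your PATH is an assumption):

```shell
# Write a sample config matching the snippet above, then validate it.
# In practice, point CONFIG at your real claude_desktop_config.json.
CONFIG=./claude_desktop_config.json
cat > "$CONFIG" <<'EOF'
{
  "mcpServers": {
    "in-memoria": {
      "command": "node",
      "args": ["/path/to/In-Memoria/dist/index.js"]
    }
  }
}
EOF

# json.tool exits non-zero on malformed JSON (trailing commas,
# missing quotes), so the "config OK" line only prints on success.
python3 -m json.tool "$CONFIG" > /dev/null && echo "config OK"
```

The same check works on Windows by substituting the `%APPDATA%\Claude\claude_desktop_config.json` path.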
What people ask about In-Memoria
What is pi22by7/In-Memoria?
pi22by7/In-Memoria is an MCP server for the Claude AI ecosystem that provides persistent intelligence infrastructure for AI agents. It has 166 GitHub stars and was last updated 4 months ago.
How do I install In-Memoria?
You can install In-Memoria by cloning the repository (https://github.com/pi22by7/In-Memoria) or by following the README instructions on GitHub. ClaudeWave also provides quick-install blocks on this page.
Is pi22by7/In-Memoria safe to use?
Our security agent has analyzed pi22by7/In-Memoria and assigned a Trust Score of 87/100 (tier: Trusted). See the full breakdown of passed checks and flags on this page.
Who maintains pi22by7/In-Memoria?
pi22by7/In-Memoria is maintained by pi22by7. The last recorded GitHub activity is from 4mo ago, with 2 open issues.
Are there alternatives to In-Memoria?
Yes. On ClaudeWave you can browse similar MCP servers at /categories/mcp, sorted by popularity or recent activity.
Deploy In-Memoria to your cloud
Ship this repo to production in minutes. Each platform spins up its own environment with editable env vars.
Maintain this repo? Add a badge to your README
Drop the badge into your GitHub README to show it's tracked on ClaudeWave. Each badge links back to this page and reflects the live Trust Score.
```html
<a href="https://claudewave.com/repo/pi22by7-in-memoria"><img src="https://claudewave.com/api/badge/pi22by7-in-memoria" alt="Featured on ClaudeWave — pi22by7/In-Memoria" width="320" height="64" /></a>
```

More MCP Servers
- Fair-code workflow automation platform with native AI capabilities. Combine visual building with custom code, self-host or cloud, 400+ integrations.
- User-friendly AI Interface (Supports Ollama, OpenAI API, ...)
- An open-source AI agent that brings the power of Gemini directly into your terminal.
- A collection of MCP servers.
- The fastest path to AI-powered full stack observability, even for lean teams.
- The all-in-one AI productivity accelerator. On device and privacy first with no annoying setup or configuration.