MemOS (Preview) | Intelligence Begins with Memory
Profile-Based Long-Term Memory for AI Applications. Memobase handles user profiles, memory events, and evolving context — perfect for chatbots, companions, tutors, customer service bots, and all chat-based agents.
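The profile-plus-events split can be pictured with a minimal sketch (generic data structures, not Memobase's actual schema): a slowly evolving profile summarizes the user, while memory events log individual interactions.

```python
# Generic illustration of profile-based memory (not Memobase's actual schema):
# a per-user profile that evolves slowly, plus an append-only log of memory events.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class MemoryEvent:
    timestamp: datetime
    content: str                    # e.g. "user mentioned moving to Berlin"

@dataclass
class UserProfile:
    user_id: str
    traits: dict[str, str] = field(default_factory=dict)   # e.g. {"city": "Berlin"}
    events: list[MemoryEvent] = field(default_factory=list)

    def record(self, content: str, **trait_updates: str) -> None:
        """Log an event and fold any extracted traits into the evolving profile."""
        self.events.append(MemoryEvent(datetime.now(), content))
        self.traits.update(trait_updates)

profile = UserProfile("alice")
profile.record("User said they moved to Berlin", city="Berlin")
```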
Mirix is a multi-agent personal assistant designed to track on-screen activities and answer user questions intelligently. By capturing real-time visual data and consolidating it into structured memories, Mirix transforms raw inputs into a rich knowledge base that adapts to your digital experiences.
✨ mem0 MCP Server: A memory system using mem0 for AI applications with Model Context Protocol (MCP) integration. Enables long-term memory for AI agents as a drop-in MCP server.
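A minimal sketch of such a drop-in server, assuming the `mem0` package and the `FastMCP` helper from the official MCP Python SDK (the tool names below are illustrative, not this repo's actual interface):

```python
# Hypothetical sketch of an MCP server exposing mem0-backed memory tools.
# Assumes the `mem0` and `mcp` packages; tool names are illustrative only.
from mcp.server.fastmcp import FastMCP
from mem0 import Memory

mcp = FastMCP("memory")   # MCP server instance
memory = Memory()         # mem0 store with its default backend/config

@mcp.tool()
def save_memory(text: str, user_id: str = "default") -> str:
    """Persist a piece of information for later recall."""
    memory.add(text, user_id=user_id)
    return "saved"

@mcp.tool()
def search_memory(query: str, user_id: str = "default") -> list[str]:
    """Return stored memories relevant to the query."""
    results = memory.search(query, user_id=user_id)
    # mem0 returns a list or a {"results": [...]} dict depending on version
    hits = results["results"] if isinstance(results, dict) else results
    return [hit["memory"] for hit in hits]

if __name__ == "__main__":
    mcp.run(transport="stdio")   # serve over stdio so MCP clients can attach
```

Any MCP-capable agent can then call `save_memory` and `search_memory` over stdio.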
Know Me, Respond to Me: Benchmarking LLMs for Dynamic User Profiling and Personalized Responses at Scale
Reproducible Structured Memory for LLMs
Reliable and Efficient Semantic Prompt Caching with vCache
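vCache's own API is not shown here; the following is a generic sketch of the semantic-caching idea it builds on: embed each prompt and, when a new prompt is close enough to a previously answered one, return the cached response instead of calling the model.

```python
# Generic sketch of semantic prompt caching (not vCache's actual API):
# reuse a cached response when a new prompt is semantically close to an old one.
import numpy as np
from sentence_transformers import SentenceTransformer

class SemanticCache:
    def __init__(self, threshold: float = 0.9):
        self.model = SentenceTransformer("all-MiniLM-L6-v2")  # example embedder
        self.threshold = threshold          # naive fixed similarity cutoff
        self.embeddings: list[np.ndarray] = []
        self.responses: list[str] = []

    def _embed(self, text: str) -> np.ndarray:
        vec = self.model.encode(text)
        return vec / np.linalg.norm(vec)    # normalize so dot product = cosine

    def get(self, prompt: str) -> str | None:
        """Return a cached response if any stored prompt is similar enough."""
        if not self.embeddings:
            return None
        query = self._embed(prompt)
        sims = [float(query @ emb) for emb in self.embeddings]
        best = max(range(len(sims)), key=sims.__getitem__)
        return self.responses[best] if sims[best] >= self.threshold else None

    def put(self, prompt: str, response: str) -> None:
        self.embeddings.append(self._embed(prompt))
        self.responses.append(response)
```

On a cache hit the LLM call is skipped entirely; the fixed global threshold is the fragile part of this naive version, and making that hit-or-miss decision reliable is where a system like vCache aims to differ.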
An MCP (Model Context Protocol) server providing long-term memory for LLMs
Official Python SDK for SwastikAI
A simple MCP server that stores and retrieves memories from multiple LLMs
Knowledge Graph Memory, Part of the Vital A.I. Agent Ecosystem
The CMR (Contextual Memory Reweaving) system is a memory-enhanced language model architecture that reweaves stored context back into generation to improve performance on long-context tasks.
Carry your brain and consciousness across AI agents like ChatGPT and Claude. The first AI persona sync engine.
Functional memory for GPT-3.5-turbo-instruct with embeddings based on the MemoryBank whitepaper.
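A minimal sketch of the embedding-based retrieval idea, assuming the OpenAI Python client (helper names are hypothetical, not this repo's code; MemoryBank's additional forgetting mechanism is omitted):

```python
# Hypothetical sketch of embedding-based memory for gpt-3.5-turbo-instruct:
# embed past exchanges, retrieve the most similar ones, prepend them to the prompt.
import numpy as np
from openai import OpenAI

client = OpenAI()
memories: list[tuple[np.ndarray, str]] = []   # (normalized embedding, text)

def embed(text: str) -> np.ndarray:
    resp = client.embeddings.create(model="text-embedding-3-small", input=text)
    vec = np.array(resp.data[0].embedding)
    return vec / np.linalg.norm(vec)

def remember(text: str) -> None:
    memories.append((embed(text), text))

def recall(query: str, k: int = 3) -> list[str]:
    q = embed(query)
    ranked = sorted(memories, key=lambda m: float(q @ m[0]), reverse=True)
    return [text for _, text in ranked[:k]]

def ask(user_input: str) -> str:
    context = "\n".join(recall(user_input))
    prompt = f"Relevant memories:\n{context}\n\nUser: {user_input}\nAssistant:"
    out = client.completions.create(model="gpt-3.5-turbo-instruct",
                                    prompt=prompt, max_tokens=256)
    answer = out.choices[0].text.strip()
    remember(f"User: {user_input}\nAssistant: {answer}")
    return answer
```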
The Memory of your Agent
An MCP (Model Context Protocol) server providing long-term memory for LLMs
Simple AI-agent Conversation Orchestrator
A collection of AI agents built with Ollama, LangChain, and local LLMs, including chatbots, voice assistants, web scrapers, document readers, and custom tools. No cloud APIs or keys required; runs fully offline.
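A minimal sketch of the fully offline pattern, assuming the `ollama` Python package and a locally running Ollama daemon with a pulled model (the model name is an example):

```python
# Minimal offline chatbot loop against a local Ollama model.
# Assumes the `ollama` Python package and a locally pulled model,
# e.g. `ollama pull llama3.1` (model name is an example).
import ollama

history = []  # running chat history so the local model keeps context

while True:
    user = input("you> ")
    if user.strip().lower() in {"exit", "quit"}:
        break
    history.append({"role": "user", "content": user})
    reply = ollama.chat(model="llama3.1", messages=history)
    answer = reply["message"]["content"]
    history.append({"role": "assistant", "content": answer})
    print("bot>", answer)
```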