# Memory API

API reference for the memory system that manages agent conversation history.

## See Also

For memory management patterns, see the Memory Concepts guide.

## ConversationMemory

Stores conversation history as a list of `Message` objects.

### Import

```python
from marsys.agents.memory import ConversationMemory
```

### Constructor

```python
ConversationMemory(description: Optional[str] = None)
```
### Methods

| Method | Returns | Description |
|---|---|---|
| `add(message=None, *, role=None, content=None, ...)` | `str` | Add message, returns `message_id` |
| `update(message_id, *, role=None, content=None, ...)` | `None` | Update existing message by ID |
| `retrieve_all()` | `List[Dict]` | Get all messages as dicts |
| `retrieve_recent(n: int = 1)` | `List[Dict]` | Get last `n` messages |
| `get_messages()` | `List[Dict]` | Get messages for LLM |
| `retrieve_by_role(role: str, n: int = None)` | `List[Dict]` | Filter by role |
| `reset_memory()` | `None` | Clear all (keeps system prompt) |
### Example

```python
memory = ConversationMemory(description="You are helpful")

# Add messages
msg_id = memory.add(role="user", content="Hello")
memory.add(role="assistant", content="Hi there!")

# Retrieve messages
all_msgs = memory.retrieve_all()   # List[Dict]
recent = memory.retrieve_recent(5)

# Clear (keeps the system prompt)
memory.reset_memory()
```
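The list semantics behind these methods can be pictured with a short pure-Python sketch. This is a conceptual model only, not the library's implementation; the internals (a plain list of dicts, `uuid`-based IDs) are assumptions:

```python
import uuid
from typing import Dict, List, Optional

class SketchMemory:
    """Conceptual model of ConversationMemory's list semantics (not the real class)."""

    def __init__(self, description: Optional[str] = None):
        self.messages: List[Dict] = []
        if description:
            # The description becomes the system prompt.
            self.messages.append({"role": "system", "content": description})

    def add(self, role: str, content: str) -> str:
        # add() appends a message and returns its generated ID.
        message_id = uuid.uuid4().hex
        self.messages.append({"message_id": message_id, "role": role, "content": content})
        return message_id

    def retrieve_recent(self, n: int = 1) -> List[Dict]:
        # The last n messages in insertion order.
        return self.messages[-n:]

    def retrieve_by_role(self, role: str, n: Optional[int] = None) -> List[Dict]:
        matches = [m for m in self.messages if m["role"] == role]
        return matches if n is None else matches[-n:]

    def reset_memory(self) -> None:
        # Clear everything except the system prompt.
        self.messages = [m for m in self.messages if m["role"] == "system"]
```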
## ManagedConversationMemory

Conversation memory with automatic token management.

### Import

```python
from marsys.agents.memory import ManagedConversationMemory, ManagedMemoryConfig
```
### ManagedMemoryConfig

```python
from dataclasses import dataclass, field
from typing import Callable, List, Optional

@dataclass
class ManagedMemoryConfig:
    max_total_tokens_trigger: int = 150_000
    target_total_tokens: int = 100_000
    image_token_estimate: int = 800
    min_retrieval_gap_steps: int = 2
    min_retrieval_gap_tokens: int = 5000
    # Mutable list defaults require field(default_factory=...) in a dataclass.
    trigger_events: List[str] = field(default_factory=lambda: ["add", "get_messages"])
    cache_invalidation_events: List[str] = field(
        default_factory=lambda: ["add", "remove_by_id", "delete_memory"]
    )
    token_counter: Optional[Callable] = None
    enable_headroom_percent: float = 0.1
    processing_strategy: str = "none"
```
### Example

```python
# Standard memory with token management
memory = ManagedConversationMemory(
    config=ManagedMemoryConfig(
        max_total_tokens_trigger=100_000,
        target_total_tokens=75_000,
    ),
    description="You are a helpful assistant",
)

# Use same as ConversationMemory
memory.add(role="user", content="Hello")
msgs = memory.get_messages()  # Auto-manages token count
```
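The `token_counter` field accepts any callable. When an exact tokenizer is unavailable, a rough character-based heuristic like the one below can stand in; the ~4 characters-per-token ratio is an assumption, and the flat per-image cost mirrors the `image_token_estimate` default of 800:

```python
from typing import Dict

def rough_token_count(message: Dict, image_token_estimate: int = 800) -> int:
    """Heuristic token count: ~4 characters per text token, flat cost per image."""
    content = message.get("content") or ""
    text_tokens = len(str(content)) // 4
    image_tokens = image_token_estimate * len(message.get("images") or [])
    return text_tokens + image_tokens

# Plug it into the config (sketch):
# config = ManagedMemoryConfig(token_counter=rough_token_count)
```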
## KGMemory

Knowledge graph memory storing facts as (Subject, Predicate, Object) triplets.

### Import

```python
from marsys.agents.memory import KGMemory
```

### Constructor

```python
KGMemory(
    model: Union[BaseVLM, BaseLLM, BaseAPIModel],
    description: Optional[str] = None,
)
```
### Additional Methods

| Method | Description |
|---|---|
| `add_fact(role, subject, predicate, obj)` | Add fact directly |
| `extract_and_update_from_text(input_text, role)` | Extract facts from text using model |
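Conceptually, each fact is a (Subject, Predicate, Object) triplet tagged with the role that produced it. A minimal pure-Python sketch of that data model (the `Fact` and `TripletStore` names are illustrative, not the library's internals):

```python
from typing import List, NamedTuple

class Fact(NamedTuple):
    """One (Subject, Predicate, Object) triplet, tagged with its source role."""
    role: str
    subject: str
    predicate: str
    obj: str

class TripletStore:
    """Toy store mirroring KGMemory's triplet data model."""

    def __init__(self):
        self.facts: List[Fact] = []

    def add_fact(self, role: str, subject: str, predicate: str, obj: str) -> None:
        self.facts.append(Fact(role, subject, predicate, obj))

    def query(self, subject: str) -> List[Fact]:
        # All facts known about a given subject.
        return [f for f in self.facts if f.subject == subject]
```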
## MemoryManager

Factory that creates the appropriate memory type.

### Import

```python
from marsys.agents.memory import MemoryManager
```

### Constructor

```python
MemoryManager(
    memory_type: str = "conversation_history",
    description: Optional[str] = None,
    model: Optional[Union[BaseLLM, BaseVLM]] = None,
    memory_config: Optional[ManagedMemoryConfig] = None,
    token_counter: Optional[Callable] = None,
)
```
### Memory Types

- `"conversation_history"`: Standard conversation memory
- `"managed_conversation"`: Token-managed memory
- `"kg"`: Knowledge graph memory (requires `model`)
### Example

```python
# Standard memory
manager = MemoryManager(
    memory_type="conversation_history",
    description="System prompt",
)

# With token management
manager = MemoryManager(
    memory_type="managed_conversation",
    memory_config=ManagedMemoryConfig(max_total_tokens_trigger=100_000),
)

# Knowledge graph
manager = MemoryManager(memory_type="kg", model=your_model)

# Use it
manager.add(role="user", content="Hello")
msgs = manager.get_messages()
manager.save_to_file("memory.json")
```
## Message

Single message in conversation.

### Import

```python
from marsys.agents.memory import Message
```

### Constructor

```python
Message(
    role: str,
    content: Optional[Union[str, Dict[str, Any], List[Dict[str, Any]]]] = None,
    message_id: str = auto_generated,
    name: Optional[str] = None,
    tool_calls: Optional[List[ToolCallMsg]] = None,
    agent_calls: Optional[List[AgentCallMsg]] = None,
    structured_data: Optional[Dict[str, Any]] = None,
    images: Optional[List[str]] = None,
    tool_call_id: Optional[str] = None,
)
```
### Role Values

- `"system"`: System instructions
- `"user"`: User input
- `"assistant"`: Agent response
- `"tool"`: Tool response
### Example

```python
# Simple message
msg = Message(role="user", content="Hello")

# With tool calls
msg = Message(
    role="assistant",
    content=None,
    tool_calls=[
        ToolCallMsg(
            id="call_123",
            call_id="call_123",
            type="function",
            name="search",
            arguments='{"query": "AI"}',
        )
    ],
)

# Tool response
msg = Message(
    role="tool",
    content='{"result": "found"}',
    tool_call_id="call_123",
    name="search",
)

# With images
msg = Message(
    role="user",
    content="Describe this",
    images=["path/to/image.jpg"],
)
```
## ToolCallMsg

Tool call request in a message.

```python
from marsys.agents.memory import ToolCallMsg

tool_call = ToolCallMsg(
    id="call_123",
    call_id="call_123",
    type="function",
    name="search",
    arguments='{"query": "AI trends"}',
)
```
## AgentCallMsg

Agent invocation request.

```python
from marsys.agents.memory import AgentCallMsg

agent_call = AgentCallMsg(
    agent_name="DataProcessor",
    request="Process the sales data",
)
```
## Pro Tip

Use `ManagedConversationMemory` for long conversations to automatically manage token limits and prevent context window overflow.
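One plausible picture of the management policy: once the total token count exceeds `max_total_tokens_trigger`, drop the oldest non-system messages until the total falls to `target_total_tokens`. The sketch below illustrates that policy under this assumption; the library may instead summarize or compress rather than drop:

```python
from typing import Dict, List

def trim_to_target(messages: List[Dict], counts: List[int],
                   trigger: int, target: int) -> List[Dict]:
    """Drop oldest non-system messages until the token total is at or under target."""
    total = sum(counts)
    if total <= trigger:
        return list(messages)  # Under the trigger: nothing to do.
    kept = []
    # Walk oldest-first, dropping non-system messages while still over target.
    for msg, cost in zip(messages, counts):
        if msg["role"] != "system" and total > target:
            total -= cost  # dropped
            continue
        kept.append(msg)
    return kept
```

Keeping the system prompt exempt from trimming mirrors `reset_memory()`'s behavior of always preserving the system message.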