Memory API

API reference for the memory system that manages agent conversation history.

See Also

For memory management patterns, see the Memory Concepts guide.

ConversationMemory

Stores conversation history as a list of Message objects.

Import

from marsys.agents.memory import ConversationMemory

Constructor

ConversationMemory(description: Optional[str] = None)

Methods

Method                                                Returns      Description
add(message=None, *, role=None, content=None, ...)    str          Add message; returns message_id
update(message_id, *, role=None, content=None, ...)   None         Update existing message by ID
retrieve_all()                                        List[Dict]   Get all messages as dicts
retrieve_recent(n: int = 1)                           List[Dict]   Get last n messages
get_messages()                                        List[Dict]   Get messages formatted for the LLM
retrieve_by_role(role: str, n: int = None)            List[Dict]   Filter by role
reset_memory()                                        None         Clear all messages (keeps system prompt)

Example

memory = ConversationMemory(description="You are helpful")
# Add messages
msg_id = memory.add(role="user", content="Hello")
memory.add(role="assistant", content="Hi there!")
# Retrieve messages
all_msgs = memory.retrieve_all() # List[Dict]
recent = memory.retrieve_recent(5)
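by_role = memory.retrieve_by_role("user")  # messages from a single role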
# Clear
memory.reset_memory() # Keeps system prompt

ManagedConversationMemory

Conversation memory with automatic token management.

Import

from marsys.agents.memory import ManagedConversationMemory, ManagedMemoryConfig

ManagedMemoryConfig

from dataclasses import dataclass, field
from typing import Callable, List, Optional

@dataclass
class ManagedMemoryConfig:
    max_total_tokens_trigger: int = 150_000   # token total that triggers trimming
    target_total_tokens: int = 100_000        # token total to trim down to
    image_token_estimate: int = 800           # tokens assumed per image
    min_retrieval_gap_steps: int = 2
    min_retrieval_gap_tokens: int = 5000
    # Mutable defaults use default_factory, as dataclasses require
    trigger_events: List[str] = field(default_factory=lambda: ["add", "get_messages"])
    cache_invalidation_events: List[str] = field(
        default_factory=lambda: ["add", "remove_by_id", "delete_memory"]
    )
    token_counter: Optional[Callable] = None  # custom token counter (see below)
    enable_headroom_percent: float = 0.1
    processing_strategy: str = "none"
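
The token_counter field accepts a callable used to count tokens when deciding whether to trim. Its exact signature is not documented here; the sketch below assumes it takes a string and returns an integer count.

def approx_tokens(text: str) -> int:
    # Assumed signature: str -> int; rough heuristic of ~4 characters per token
    return max(1, len(text) // 4)

config = ManagedMemoryConfig(
    max_total_tokens_trigger=120_000,
    target_total_tokens=80_000,
    token_counter=approx_tokens
)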

Example

# Standard memory with token management
memory = ManagedConversationMemory(
    config=ManagedMemoryConfig(
        max_total_tokens_trigger=100_000,
        target_total_tokens=75_000
    ),
    description="You are a helpful assistant"
)

# Use the same API as ConversationMemory
memory.add(role="user", content="Hello")
msgs = memory.get_messages()  # auto-manages token count

KGMemory

Knowledge graph memory storing facts as (Subject, Predicate, Object) triplets.

Import

from marsys.agents.memory import KGMemory

Constructor

KGMemory(
    model: Union[BaseVLM, BaseLLM, BaseAPIModel],
    description: Optional[str] = None
)

Additional Methods

Method                                           Description
add_fact(role, subject, predicate, obj)          Add fact directly
extract_and_update_from_text(input_text, role)   Extract facts from text using the model
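
Example

A minimal sketch; your_model stands in for any configured model instance, and extraction quality depends on that model.

memory = KGMemory(model=your_model, description="Track user facts")

# Add a fact directly as a (Subject, Predicate, Object) triplet
memory.add_fact(role="user", subject="Alice", predicate="works_at", obj="Acme")

# Let the model extract triplets from free text
memory.extract_and_update_from_text(
    input_text="Alice moved to Berlin last year.",
    role="user"
)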

MemoryManager

Factory that creates the appropriate memory type and exposes a unified interface to it.

Import

from marsys.agents.memory import MemoryManager

Constructor

MemoryManager(
    memory_type: str = "conversation_history",
    description: Optional[str] = None,
    model: Optional[Union[BaseLLM, BaseVLM]] = None,
    memory_config: Optional[ManagedMemoryConfig] = None,
    token_counter: Optional[Callable] = None
)

Memory Types

  • "conversation_history" - Standard conversation memory
  • "managed_conversation" - Token-managed memory
  • "kg" - Knowledge graph memory (requires model)

Example

# Standard memory
manager = MemoryManager(
    memory_type="conversation_history",
    description="System prompt"
)

# With token management
manager = MemoryManager(
    memory_type="managed_conversation",
    memory_config=ManagedMemoryConfig(
        max_total_tokens_trigger=100_000
    )
)

# Knowledge graph
manager = MemoryManager(
    memory_type="kg",
    model=your_model
)

# Use it like any memory
manager.add(role="user", content="Hello")
msgs = manager.get_messages()
manager.save_to_file("memory.json")

Message

Single message in conversation.

Import

from marsys.agents.memory import Message

Constructor

Message(
    role: str,
    content: Optional[Union[str, Dict[str, Any], List[Dict[str, Any]]]] = None,
    message_id: str = auto_generated,  # generated automatically when omitted
    name: Optional[str] = None,
    tool_calls: Optional[List[ToolCallMsg]] = None,
    agent_calls: Optional[List[AgentCallMsg]] = None,
    structured_data: Optional[Dict[str, Any]] = None,
    images: Optional[List[str]] = None,
    tool_call_id: Optional[str] = None
)

Role Values

  • "system" - System instructions
  • "user" - User input
  • "assistant" - Agent response
  • "tool" - Tool response

Example

# Simple message
msg = Message(role="user", content="Hello")

# With tool calls
msg = Message(
    role="assistant",
    content=None,
    tool_calls=[
        ToolCallMsg(
            id="call_123",
            call_id="call_123",
            type="function",
            name="search",
            arguments='{"query": "AI"}'
        )
    ]
)

# Tool response
msg = Message(
    role="tool",
    content='{"result": "found"}',
    tool_call_id="call_123",
    name="search"
)

# With images
msg = Message(
    role="user",
    content="Describe this",
    images=["path/to/image.jpg"]
)

ToolCallMsg

Tool call request in a message.

from marsys.agents.memory import ToolCallMsg

tool_call = ToolCallMsg(
    id="call_123",
    call_id="call_123",
    type="function",
    name="search",
    arguments='{"query": "AI trends"}'
)

AgentCallMsg

Agent invocation request.

from marsys.agents.memory import AgentCallMsg

agent_call = AgentCallMsg(
    agent_name="DataProcessor",
    request="Process the sales data"
)
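
Agent calls attach to a message through the agent_calls field of Message, just as tool calls use tool_calls:

msg = Message(
    role="assistant",
    content=None,
    agent_calls=[agent_call]
)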

Pro Tip

Use ManagedConversationMemory for long conversations to automatically manage token limits and prevent context window overflow.