Introduction

Long-Term Memory in Flink-Agents

Flink-Agents introduced Long-Term Memory support in version 0.2, providing the ability to store and retrieve information persistently.
Base Interface — Long-Term Memory exposes the following operations:
Memory Set level — a memory set is a named collection of memory items:
- create_memory_set
- get_memory_set
- delete_memory_set

Memory Item level — a memory item holds the stored data and its associated metadata:
- add
- get
- delete
- search
In version 0.2, we shipped a vector-store-based Long-Term Memory backend and an automatic compaction mechanism for managing memory sets.
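For orientation, the interface surface above can be sketched as a minimal in-memory stand-in. This is illustrative only: the method names mirror the operations listed, but the exact signatures (especially at the item level) are assumptions, and the real backend is vector-store-based rather than a plain dict.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class MemorySet:
    """A named collection of memory items (illustrative stand-in)."""

    name: str
    items: Dict[str, str] = field(default_factory=dict)


class InMemoryLongTermMemory:
    """Toy in-memory stand-in for the Long-Term Memory surface."""

    def __init__(self) -> None:
        self._sets: Dict[str, MemorySet] = {}

    # Memory-set level operations
    def create_memory_set(self, name: str) -> MemorySet:
        return self._sets.setdefault(name, MemorySet(name))

    def get_memory_set(self, name: str) -> MemorySet:
        return self._sets[name]

    def delete_memory_set(self, name: str) -> None:
        self._sets.pop(name, None)

    # Memory-item level operations
    def add(self, set_name: str, item_id: str, value: str) -> None:
        self._sets[set_name].items[item_id] = value

    def get(self, set_name: str, item_id: str) -> str:
        return self._sets[set_name].items[item_id]

    def delete(self, set_name: str, item_id: str) -> None:
        self._sets[set_name].items.pop(item_id, None)

    def search(self, set_name: str, query: str) -> List[str]:
        # Real backends use vector search; substring match keeps the sketch small.
        return [v for v in self._sets[set_name].items.values() if query in v]
```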
Mem0
Mem0 is an intelligent memory layer specifically designed for AI agents. It provides built-in mechanisms for data compression and retrieval, supporting storage, automatic compaction, and semantic search.
Write path:
1. Information extraction — Mem0 sends the messages through an LLM that pulls out key facts, decisions, or preferences to remember.
2. Conflict resolution — Existing memories are checked for duplicates or contradictions so the latest truth wins.
3. Storage — The resulting memories land in managed vector storage (and optional graph storage) so future searches return them quickly.
Read path:
1. Query processing — Mem0 cleans and enriches your natural-language query so the downstream embedding search is accurate.
2. Vector search — Embeddings locate the closest memories using cosine similarity across your scoped dataset.
3. Results delivery — Formatted memories (with metadata and timestamps) return to your agent or calling service.
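The vector-search step can be illustrated with a toy cosine-similarity ranking over hand-written embeddings. The vectors and the brute-force scan here are ours; real Mem0 backends use model-produced embeddings and an indexed vector store.

```python
import math
from typing import Dict, List


def cosine(a: List[float], b: List[float]) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)


# Toy 3-d "embeddings"; a real system produces these with an embedding model.
memories: Dict[str, List[float]] = {
    "prefers dark mode": [0.9, 0.1, 0.0],
    "works at Acme": [0.0, 0.8, 0.6],
}


def search(query_vec: List[float], limit: int = 1) -> List[str]:
    """Return the `limit` memories closest to the query vector."""
    ranked = sorted(
        memories, key=lambda m: cosine(query_vec, memories[m]), reverse=True
    )
    return ranked[:limit]
```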
Why Mem0 as the Long-Term Memory Backend
Building a production-grade Long-Term Memory system requires significant engineering effort. Since this capability is not a differentiating feature of Flink-Agents relative to other agent frameworks, integrating a mature and widely adopted third-party solution is preferable to reinventing the wheel.
After evaluating several popular memory frameworks, we chose to integrate Mem0 first, for the following reasons:
Popularity — Mem0 has over 46.6k GitHub stars and is already used as the Long-Term Memory backend in other agent frameworks such as AgentScope.
API alignment — Mem0's API is intuitive and closely maps to the Long-Term Memory interface in Flink-Agents, making integration straightforward.
User Interface
Configuration
Users configure Long-Term Memory parameters through the Configuration.

Vector store configuration will be delivered in two PRs due to implementation complexity:
- PR 1 — Users declare the vector store inline in the Long-Term Memory configuration.
- PR 2 — Users can reference a named vector store declared in the Agent class, consistent with how chat models and embedding models are referenced.

NOTE: Both PRs will be included in version 0.3. PR 2 involves a refactor of the vector store interface — including updates to three Java implementations and one Python implementation — which is independent of the Long-Term Memory logic itself. See Use Flink-Agents ChatModel/EmbeddingModel/VectorStore in Mem0 for details.
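Purely for illustration, a configuration in the named-reference style might look like the following. Every key and value here is a hypothetical placeholder; the concrete option names are defined by LongTermMemoryOptions, not by this sketch.

```python
# All option names below are hypothetical placeholders for illustration;
# consult LongTermMemoryOptions for the real configuration keys.
long_term_memory_config = {
    "chat_model": "my_chat_model",      # hypothetical: model used for fact extraction
    "embedding_model": "my_embedding",  # hypothetical: model used for vector search
    "vector_store": "my_vector_store",  # hypothetical: named store reference (PR 2 style)
}
```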
Breaking Changes
- The EXTERNAL_VECTOR_STORE backend is deprecated and will be removed.
- LongTermMemoryOptions.Backend is no longer configurable; the Mem0 backend is now the default.
- The BaseLongTermMemory interface signature will be slightly updated to align with Mem0's interface, while preserving equivalent capabilities.
- LongTermMemory will only support reading and writing str values. Users are responsible for encoding/decoding other types before writing and after reading.
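Since only str values are supported, a typical pattern is to JSON-encode structured data before writing and decode it after reading. In this sketch a plain dict stands in for the real memory backend.

```python
import json

# A dict stands in for the str-only Long-Term Memory backend.
memory: dict = {}

profile = {"name": "Alice", "theme": "dark"}

# Encode before writing: the backend only accepts str values.
memory["profile"] = json.dumps(profile)

# Decode after reading to recover the structured value.
restored = json.loads(memory["profile"])
```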
BaseLongTermMemory Interface Changes
Memory Set Operations
The following parameters are removed from get_or_create_memory_set:
- capacity — Mem0 performs eager compaction on every insert, making explicit capacity management unnecessary.
- CompactionConfig — Mem0 uses a global compaction policy.
- item_type — Mem0 only supports str. While automatic serialization/deserialization could be preserved, it would require storing item_type as collection metadata — which introduces complexity and potential inconsistency across vector stores (e.g., S3Vector, Elasticsearch, OpenSearch do not natively support collection-level metadata). Other agent frameworks such as AgentScope and LangChain also restrict Long-Term Memory to str only. Therefore, we adopt the same approach and leave serialization to the user.
Memory Item Operations

Mem0 always generates a random ID on insert; specifying a custom ID is not supported. Users can still access specific items using the ID returned by Mem0.
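That contract can be sketched as follows: the stand-in generates a server-side ID, as Mem0 does, and the caller keeps the returned ID for later lookups. The function names here are illustrative, not the real API.

```python
import uuid
from typing import Dict

# Stand-in mirroring Mem0's behavior: insert always generates a random ID;
# the caller cannot choose it and must keep the returned value.
items: Dict[str, str] = {}


def add(value: str) -> str:
    item_id = uuid.uuid4().hex  # ID is generated, never caller-supplied
    items[item_id] = value
    return item_id


def get(item_id: str) -> str:
    return items[item_id]
```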
Implementation

This section covers the key implementation considerations for integrating Mem0 into Flink-Agents.
Visibility Isolation
Mem0 supports data scoping via user_id, agent_id, and run_id filters. We map these to job_id, key, and memory_set.name respectively, ensuring that Long-Term Memory data is not visible across different keys within the same Flink job.
```python
@override
def search(
    self, memory_set: MemorySet, query: str, limit: int, **kwargs: Any
) -> List[MemorySetItem]:
    """Search for memories related to the query."""
    result = self.mem0.search(
        query=query,
        user_id=self.job_id,
        agent_id=self.key,
        run_id=memory_set.name,
        limit=limit,
        **kwargs,
    )
```
Use Flink-Agents ChatModel / EmbeddingModel / VectorStore in Mem0
Mem0 requires a chat model, an embedding model, and a vector store at initialization. Although Mem0 ships its own implementations, we want Flink-Agents' implementations to be used inside Mem0.
Chat Model & Embedding Model
We wrap Flink-Agents' ChatModel and EmbeddingModel by implementing Mem0's corresponding interfaces. Using ChatModel as an example:
Extend Mem0's LLMBase and implement generate_response, delegating to the Flink-Agents chat model:
```python
class FlinkAgentsLLM(LLMBase):
    """Wrapper for the Flink-Agents chat model.

    This class wraps the Flink-Agents LLM to generate responses within Mem0
    using Flink-Agents' chat model implementation.
    """

    model: BaseChatModelSetup

    def __init__(self, config: BaseLlmConfig | None = None):
        """Initialize the Mem0 LLM wrapper.

        Args:
            config (`BaseLlmConfig | None`, optional): Configuration object
                for the LLM. Defaults to None.
        """
        super().__init__(config)
        self.model = self.config.model

    def generate_response(
        self,
        messages: List[Dict[str, str]],
        response_format: Any | None = None,
        tools: List[Dict] | None = None,
        tool_choice: str = "auto",
    ) -> str | dict:
        """Generate a response using the Flink-Agents chat model."""
```
The EmbeddingModel wrapper follows the same pattern — only the embed method needs to be implemented.
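A sketch of that pattern, with Mem0's embedding base class and the Flink-Agents embedding model both stubbed so the example is self-contained; the real wrapper would extend Mem0's base class and delegate to the configured Flink-Agents embedding model.

```python
from abc import ABC, abstractmethod
from typing import List


# Stand-in for Mem0's embedding base class (stubbed here; the real wrapper
# extends the class shipped by the mem0 package).
class EmbeddingBase(ABC):
    @abstractmethod
    def embed(self, text: str) -> List[float]: ...


class FlinkAgentsEmbedding(EmbeddingBase):
    """Delegates Mem0 embedding calls to a Flink-Agents embedding model."""

    def __init__(self, model) -> None:
        # `model` is a Flink-Agents embedding model handle (illustrative).
        self.model = model

    def embed(self, text: str) -> List[float]:
        # Only `embed` needs to be implemented; delegate to the wrapped model.
        return self.model.embed(text)


class _DummyModel:
    """Stands in for a Flink-Agents embedding model in this sketch."""

    def embed(self, text: str) -> List[float]:
        return [0.0, 1.0]


wrapper = FlinkAgentsEmbedding(_DummyModel())
```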
VectorStore
Wrapping the VectorStore follows the same approach: extend Mem0's VectorStoreBase and implement the required methods.
Mem0's VectorStoreBase:
```python
class VectorStoreBase(ABC):
    @abstractmethod
    def create_col(self, name, vector_size, distance):
        """Create a new collection."""

    @abstractmethod
    def insert(self, vectors, payloads=None, ids=None):
        """Insert vectors into a collection."""

    @abstractmethod
    def search(self, query, vectors, limit=5, filters=None):
        """Search for similar vectors."""

    @abstractmethod
    def delete(self, vector_id):
        """Delete a vector by ID."""

    @abstractmethod
    def update(self, vector_id, vector=None, payload=None):
        """Update a vector and its payload."""

    @abstractmethod
    def get(self, vector_id):
        """Retrieve a vector by ID."""

    @abstractmethod
    def list_cols(self):
        """List all collections."""

    @abstractmethod
    def delete_col(self):
        """Delete a collection."""

    @abstractmethod
    def col_info(self):
        """Get information about a collection."""

    @abstractmethod
    def list(self, filters=None, limit=None):
        """List all memories."""

    @abstractmethod
    def reset(self):
        """Reset by deleting and recreating the collection."""
```
Flink-Agents' BaseVectorStore and CollectionManageableVectorStore:
```python
class BaseVectorStore(Resource, ABC):
    """Base abstract class for vector store."""

    def add(
        self,
        documents: Document | List[Document],
        collection_name: str | None = None,
        **kwargs: Any,
    ) -> List[str]:
        """Add documents to the vector store."""

    def query(self, query: VectorStoreQuery) -> VectorStoreQueryResult:
        """Perform vector search using a structured query object."""

    @abstractmethod
    def size(self, collection_name: str | None = None) -> int:
        """Return the number of items in the collection."""

    @abstractmethod
    def get(
        self,
        ids: str | List[str] | None = None,
        collection_name: str | None = None,
        **kwargs: Any,
    ) -> List[Document]:
        """Retrieve documents by ID."""

    @abstractmethod
    def delete(
        self,
        ids: str | List[str] | None = None,
        collection_name: str | None = None,
        **kwargs: Any,
    ) -> None:
        """Delete documents by ID."""


class CollectionManageableVectorStore(BaseVectorStore, ABC):
    """Base abstract class for vector stores that support collection management."""

    @abstractmethod
    def get_or_create_collection(
        self, name: str, metadata: Dict[str, Any] | None = None
    ) -> Collection:
        """Get a collection, creating it if it does not exist."""

    @abstractmethod
    def get_collection(self, name: str) -> Collection:
        """Get a collection; raise an exception if it does not exist."""

    @abstractmethod
    def delete_collection(self, name: str) -> Collection:
        """Delete a collection."""
```
To bridge the two interfaces, Flink-Agents' VectorStore needs to add support for:
- Search filters
- update
- reset
- col_info
Extending the supported vector store ecosystem:
Currently, Flink-Agents supports only a limited set of vector stores:
- Python: Chroma
- Java: Elasticsearch, OpenSearch, S3Vector
To broaden this, we plan to expose Mem0's vector store ecosystem within Flink-Agents via a Mem0VectorStore adapter. This approach:
- Adds support for Milvus, pgvector, Redis, Qdrant, and more.
- Gives users who use Mem0VectorStore in Long-Term Memory a consistent experience aligned with native Mem0 usage.
```python
from mem0.vector_stores.base import VectorStoreBase
from flink_agents.api.vector_stores.vector_store import CollectionManageableVectorStore


class Mem0VectorStore(CollectionManageableVectorStore):
    """Flink-Agents vector store backed by a Mem0 vector store implementation."""

    store: VectorStoreBase
```
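To make the delegation direction concrete, here is a self-contained sketch in which a stub Mem0-style store backs Flink-Agents-style add/get calls. Both classes are illustrative stand-ins, not the real adapter; the method names follow the interface snippets above.

```python
from typing import Any, Dict, List


class StubMem0Store:
    """Minimal stand-in for a Mem0 VectorStoreBase implementation."""

    def __init__(self) -> None:
        self.rows: Dict[str, Dict[str, Any]] = {}

    def insert(self, vectors, payloads=None, ids=None):
        """Mem0-style insert: parallel lists of vectors, payloads, and IDs."""
        for i, vec in enumerate(vectors):
            self.rows[ids[i]] = {"vector": vec, "payload": payloads[i]}

    def get(self, vector_id):
        """Mem0-style lookup by vector ID."""
        return self.rows[vector_id]


class Mem0VectorStoreSketch:
    """Adapter exposing Flink-Agents-style add/get over a Mem0-style store."""

    def __init__(self, store: StubMem0Store) -> None:
        self.store = store

    def add(self, documents: List[Dict[str, Any]]) -> List[str]:
        # Translate a Flink-Agents-style document list into Mem0's
        # parallel-list insert call.
        ids = [d["id"] for d in documents]
        self.store.insert(
            vectors=[d["vector"] for d in documents],
            payloads=[d.get("payload") for d in documents],
            ids=ids,
        )
        return ids

    def get(self, doc_id: str) -> Dict[str, Any]:
        return self.store.get(doc_id)
```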
Java Implementation
The Mem0 SDK only provides a Python implementation, so the Mem0-based Long-Term Memory backend will be implemented in Python only. Java-side usage of Long-Term Memory will delegate to the Python implementation via Pemja.
There are two cross-language call sites to handle:
The Java Long-Term Memory calling the corresponding Python Long-Term Memory methods.
Flink-Agents already supports cross-language resource invocation, so neither site presents a fundamental technical blocker. However, the call paths differ from existing cross-language resource calls and will require targeted handling.