Overview

Mem0MemoryService provides long-term memory capabilities for conversational agents by integrating with Mem0’s API. It automatically stores conversation history and retrieves relevant past context based on the current conversation, enhancing LLM responses with persistent memory across sessions.

Installation

To use Mem0 memory services, install the required dependencies:
pip install "pipecat-ai[mem0]"

Prerequisites

Mem0 Account Setup

Before using Mem0 memory services, you need:
  1. Mem0 Account: Sign up at Mem0 Platform
  2. API Key: Generate an API key from your account dashboard
  3. Host Configuration: Optionally, set your Mem0 host endpoint (only needed when using a self-hosted Mem0 server)
  4. User Management: Configure user IDs for memory association

Required Environment Variables

  • MEM0_API_KEY: Your Mem0 API key for authentication
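
For example, you can read the key from the environment and fail fast if it is missing (a minimal sketch; the variable name mem0_api_key is only illustrative):

import os

# Stop early with a clear error if the API key is not configured.
mem0_api_key = os.getenv("MEM0_API_KEY")
if not mem0_api_key:
    raise RuntimeError("MEM0_API_KEY is not set")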

Configuration Options

  • User ID: Unique identifier for associating memories with specific users
  • Agent ID: Identifier for the agent using the memory service
  • Run ID: Identifier for specific conversation sessions
  • Memory Retrieval: Configure how past context is retrieved and used

Key Features

  • Persistent Memory: Long-term conversation history across sessions
  • Context Retrieval: Automatic retrieval of relevant past conversations
  • User Association: Memory tied to specific users for personalization
  • Session Management: Track conversations across different runs and agents

Configuration

  • api_key (str, default: None): The API key for accessing Mem0’s cloud API.
  • local_config (dict, default: None): Local configuration for the Mem0 client, as an alternative to the cloud API.
  • user_id (str, default: None): The user ID to associate with memories in Mem0. At least one of user_id, agent_id, or run_id must be provided.
  • agent_id (str, default: None): The agent ID to associate with memories in Mem0.
  • run_id (str, default: None): The run ID to associate with memories in Mem0.
  • params (InputParams, default: None): Configuration parameters for memory retrieval and storage. See InputParams below.
  • host (str, default: None): The host of the Mem0 server.

InputParams

  • search_limit (int, default: 10): Maximum number of memories to retrieve per query (minimum: 1).
  • search_threshold (float, default: 0.1): Minimum similarity threshold for memory retrieval (0.0-1.0).
  • api_version (str, default: "v2"): API version to use for Mem0 client operations.
  • system_prompt (str, default: "Based on previous conversations, I recall: \n\n"): Prefix text for memory context messages.
  • add_as_system_message (bool, default: True): Whether to add memories as system messages. When False, memories are added as user messages.
  • position (int, default: 1): Position at which memory messages are inserted into the context.
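
To make these parameters concrete, the sketch below shows roughly how retrieved memories could be turned into a single context message and inserted at position. This is only an illustration, not the service's actual implementation; build_memory_message and inject_memories are hypothetical helpers, and the sketch assumes each retrieved memory is a dict with a "memory" text field.

def build_memory_message(memories, params):
    # Join the retrieved memory texts under the configured prefix.
    content = params.system_prompt + "\n".join(m["memory"] for m in memories)
    role = "system" if params.add_as_system_message else "user"
    return {"role": role, "content": content}

def inject_memories(messages, memories, params):
    # Insert the memory message at the configured position in the context.
    enhanced = list(messages)
    enhanced.insert(params.position, build_memory_message(memories, params))
    return enhanced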

Usage

Cloud API Setup

import os

from pipecat.services.mem0 import Mem0MemoryService

memory = Mem0MemoryService(
    api_key=os.getenv("MEM0_API_KEY"),
    user_id="user-123",
)

With Custom Parameters

memory = Mem0MemoryService(
    api_key=os.getenv("MEM0_API_KEY"),
    user_id="user-123",
    agent_id="assistant-1",
    params=Mem0MemoryService.InputParams(
        search_limit=5,
        search_threshold=0.3,
        add_as_system_message=True,
    ),
)

Local Configuration

memory = Mem0MemoryService(
    local_config={
        "vector_store": {
            "provider": "chroma",
            "config": {"collection_name": "memories"},
        },
    },
    user_id="user-123",
)

Notes

  • At least one ID required: You must provide at least one of user_id, agent_id, or run_id. A ValueError is raised if none are provided.
  • Cloud vs local: Use api_key for Mem0’s cloud API, or local_config for a self-hosted Mem0 instance using Memory.from_config().
  • Pipeline placement: Place the Mem0MemoryService before your LLM service in the pipeline. It intercepts LLMContextFrame, OpenAILLMContextFrame, and LLMMessagesFrame to enhance context with relevant memories before passing them downstream.
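
As an illustration, a typical pipeline ordering might look like the sketch below. It assumes the transport, stt, context_aggregator, llm, and tts processors have already been created, as in a standard Pipecat voice bot, and that memory is the Mem0MemoryService instance from the examples above.

from pipecat.pipeline.pipeline import Pipeline

pipeline = Pipeline(
    [
        transport.input(),               # receive user audio
        stt,                             # speech-to-text
        context_aggregator.user(),       # add user messages to the LLM context
        memory,                          # Mem0MemoryService: enrich context with memories
        llm,                             # LLM inference (sees the enhanced context)
        tts,                             # text-to-speech
        transport.output(),              # send assistant audio
        context_aggregator.assistant(),  # add assistant responses to the context
    ]
)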