Pipecat Flows represents a conversation as a graph where each step is a node. Nodes are of type NodeConfig and may contain the following properties:
  • name: The name of the node; used as a reference to transition to the node.
  • role_message: A str defining the bot’s role/personality. Sent as the LLM’s system instruction and persists across transitions until changed. Typically set once in the initial node.
  • task_messages: A list of message dicts defining the current node’s objectives.
  • functions: A list of function call definitions and their corresponding handlers.
  • pre_actions: Actions to execute before LLM inference. Actions run once upon transitioning to a node.
  • post_actions: Actions to execute after LLM inference. Actions run once after the node’s initial LLM inference.
  • context_strategy: Strategy for updating context during transitions. The default behavior is to append messages to the context.
  • respond_immediately: Whether to run LLM inference as soon as the node is set. The default is True.
The only required field is task_messages, as your bot always needs a prompt to advance the conversation.
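The properties above can be sketched as a plain dict. This is an illustrative sketch, not a full Pipecat Flows example: in the real library you would construct a NodeConfig, but the field names and shapes match the list above, and only task_messages is required.

```python
# Sketch of a node's shape as a plain dict (hypothetical values).
# Only "task_messages" is required; the rest are optional.
initial_node = {
    "name": "initial",  # referenced when transitioning to this node
    "role_message": "You are a friendly reservation assistant.",
    "task_messages": [
        {"role": "system", "content": "Greet the user and ask for their name."}
    ],
    "functions": [],              # function-call definitions and their handlers
    "respond_immediately": True,  # run LLM inference on node entry (the default)
}

assert "task_messages" in initial_node  # the only required field
```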

Messages

Messages define what your bot should do and how it should behave at each node in your conversation flow.

Message Types

There are two types of messages you can configure:
  • Role Message (Optional): Defines your bot’s personality, tone, and overall behavior as a plain string. This is sent as the LLM’s system instruction and persists across node transitions until a new node explicitly sets it again. Typically set once in the initial node.
  • Task Messages (Required): Define the specific objective your bot should accomplish in the current node. These messages focus the LLM on the immediate task at hand, such as asking a specific question or processing particular information.

Message Format

The role message is a plain string, while task messages use OpenAI format as a list of dicts:
"role_message": "You are an inquisitive child. Use very simple language. Ask simple questions. You must ALWAYS use one of the available functions to progress the conversation. Your responses will be converted to audio. Avoid outputting special characters and emojis.",
"task_messages": [
    {
        "role": "system",
        "content": "Say 'Hello world' and ask what is the user's favorite color.",
    }
],

Cross-Provider Compatibility

Task messages use Pipecat’s default OpenAI message format and are automatically translated to work with your chosen LLM provider. The role_message is sent as the LLM’s system instruction via LLMUpdateSettingsFrame, which is handled by each provider’s implementation.

Respond Immediately

For each node in the conversation, you can decide whether the LLM should respond immediately upon entering the node (the default behavior) or wait for the user to speak first. You control this with the respond_immediately field.
respond_immediately=False may be particularly useful in the very first node, especially in outbound-calling cases where the user has to first answer the phone to trigger the conversation.
NodeConfig(
    task_messages=[
        {
            "role": "system",
            "content": "Warmly greet the customer and ask how many people are in their party. This is your only job for now; if the customer asks for something else, politely remind them you can't do it.",
        }
    ],
    respond_immediately=False,
    # ... other fields
)
Keep in mind that if you specify respond_immediately=False, the user may not be aware of the conversational task at hand when entering the node (the bot hasn’t told them yet). It’s always important to have guardrails in your node messages to keep the conversation on topic, and letting the user speak first makes those guardrails even more important.