
NodeConfig

Configuration for a single node in a conversation flow. task_messages is the only required field.
task_messages
List[dict]
required
List of message dicts defining the current node’s objectives. These tell the LLM what to do in this conversation state.
"task_messages": [
    {"role": "system", "content": "Ask the user for their name and email address."}
]
name
str
Identifier for the node. Useful for debug logging. If not provided, a UUID is generated automatically.
role_messages
List[Dict[str, Any]]
List of message dicts defining the bot’s role and personality. These persist across nodes when using the APPEND context strategy and are placed before task messages in the context.
"role_messages": [
    {"role": "system", "content": "You are a friendly customer service agent."}
]
functions
List[Dict | FlowsFunctionSchema | FlowsDirectFunction]
List of function definitions available in this node. Accepts provider-specific dict format, FlowsFunctionSchema objects, or direct functions. See Function Types.
pre_actions
List[ActionConfig]
Actions to execute before LLM inference when transitioning to this node. See ActionConfig.
post_actions
List[ActionConfig]
Actions to execute after LLM inference when transitioning to this node. If respond_immediately is False, post-actions are deferred until after the first LLM response in this node. See ActionConfig.
context_strategy
ContextStrategyConfig
Strategy for managing conversation context when transitioning to this node. Overrides the default strategy set on FlowManager. See ContextStrategyConfig.
respond_immediately
bool
default:"True"
Whether to trigger LLM inference immediately upon entering the node. Set to False when you want to wait for user input before the LLM responds (e.g., after a tts_say pre-action that asks a question).
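To illustrate the pattern described above, here is a minimal sketch of a node that asks a question via a `tts_say` pre-action and then waits for the user. The message contents are assumptions for illustration; `NodeConfig` accepts this dict shape directly.

```python
# Sketch: greet via a pre-action, then wait for user input before the
# LLM responds. Field values here are illustrative placeholders.
waiting_node = {
    "task_messages": [
        {"role": "system", "content": "Answer the user's question about their account."}
    ],
    "pre_actions": [
        {"type": "tts_say", "text": "What can I help you with today?"},
    ],
    # Defer inference: the pre-action already asked a question, so wait
    # for the user's reply instead of generating a response immediately.
    "respond_immediately": False,
}
```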

FlowsFunctionSchema

Dataclass for defining function call schemas with Flows-specific properties. Provides a uniform way to define functions that works across all LLM providers.
name
str
required
Name of the function. This is used to identify the function in LLM tool calls.
description
str
required
Description of what the function does. The LLM uses this to decide when to call the function.
properties
Dict[str, Any]
required
Dictionary defining the function’s parameters using JSON Schema format.
"properties": {
    "city": {
        "type": "string",
        "description": "The city to get weather for"
    },
    "units": {
        "type": "string",
        "enum": ["celsius", "fahrenheit"],
        "description": "Temperature units"
    }
}
required
List[str]
required
List of required parameter names from properties.
handler
FunctionHandler
default:"None"
Function handler to process the function call. Can be a legacy handler (args) or modern handler (args, flow_manager). The handler should return a FlowResult or a ConsolidatedFunctionResult tuple.
cancel_on_interruption
bool
default:"True"
Whether to cancel this function call when the user interrupts. Set to False for long-running operations that should complete even if the user speaks.

Methods

to_function_schema

schema.to_function_schema() -> FunctionSchema
Convert to a standard FunctionSchema for use with LLMs. Strips Flows-specific fields (handler, cancel_on_interruption).
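Conceptually, the conversion keeps the LLM-facing fields and drops the Flows-specific ones. The sketch below models that with plain dicts rather than the library's dataclasses, purely to show which fields survive:

```python
# Conceptual sketch (plain dicts, not the actual library types): a
# Flows schema minus its Flows-specific fields yields the standard
# function schema the LLM sees.
flows_schema = {
    "name": "get_weather",
    "description": "Get current weather for a city",
    "properties": {"city": {"type": "string", "description": "City name"}},
    "required": ["city"],
    "handler": "handle_weather",        # Flows-specific, stripped
    "cancel_on_interruption": True,     # Flows-specific, stripped
}

FLOWS_ONLY_FIELDS = {"handler", "cancel_on_interruption"}
standard_schema = {k: v for k, v in flows_schema.items() if k not in FLOWS_ONLY_FIELDS}
```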

Example

from pipecat_flows import FlowsFunctionSchema, FlowResult

async def handle_weather(args, flow_manager):
    city = args["city"]
    weather = await get_weather(city)
    return {"status": "success", "weather": weather}

weather_function = FlowsFunctionSchema(
    name="get_weather",
    description="Get current weather for a city",
    properties={
        "city": {"type": "string", "description": "City name"}
    },
    required=["city"],
    handler=handle_weather,
)

ActionConfig

TypedDict for configuring actions that execute during node transitions.
type
str
required
Action type identifier. Must match a registered action handler. Built-in types are "tts_say", "end_conversation", and "function".
handler
Callable
default:"None"
Action handler function. Required for custom action types if not previously registered via FlowManager.register_action(). Can be a legacy handler (action) or modern handler (action, flow_manager).

text
str
default:"None"
Text content used by tts_say and optionally by end_conversation (as a goodbye message).
Additional fields are allowed and passed through to the handler. For example, a "notify_slack" action could include "channel" and "text" fields.
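A sketch of the `"notify_slack"` idea mentioned above: fields beyond `type` and `handler` pass through to the handler unchanged. The channel value and the handler body are placeholders, not part of the library; a real handler would call a Slack client.

```python
import asyncio

# Hypothetical custom action: extra fields ("channel", "text") are
# passed through to the handler as part of the action dict.
async def notify_slack(action):
    # A real implementation would post to Slack here; this sketch just
    # returns the pass-through fields to show what the handler receives.
    return {"channel": action["channel"], "text": action["text"]}

action_config = {
    "type": "notify_slack",
    "handler": notify_slack,
    "channel": "#support",
    "text": "A customer conversation just ended.",
}

result = asyncio.run(action_config["handler"](action_config))
```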

Built-in Action Types

Type             | Description                                                  | Required Fields
tts_say          | Speak text using the pipeline’s TTS service                  | text
end_conversation | End the conversation, optionally speaking a goodbye message  | text (optional)
function         | Execute a function inline in the pipeline                    | handler

Example

node_config: NodeConfig = {
    "task_messages": [{"role": "system", "content": "Help the user."}],
    "pre_actions": [
        {"type": "tts_say", "text": "Welcome! Let me help you with that."},
    ],
    "post_actions": [
        {"type": "end_conversation", "text": "Goodbye!"},
    ],
}

ContextStrategy

Enum defining strategies for managing conversation context during node transitions.
Value              | Description
APPEND             | Append new messages to existing context. This is the default behavior.
RESET              | Reset context with new messages only. Previous conversation history is discarded.
RESET_WITH_SUMMARY | Reset context but include an LLM-generated summary of the previous conversation. Requires summary_prompt in ContextStrategyConfig.
from pipecat_flows import ContextStrategy

strategy = ContextStrategy.APPEND
strategy = ContextStrategy.RESET
strategy = ContextStrategy.RESET_WITH_SUMMARY

ContextStrategyConfig

Dataclass for configuring context management behavior.
strategy
ContextStrategy
required
The context management strategy to use. See ContextStrategy.
summary_prompt
str
default:"None"
Prompt text for generating a conversation summary. Required when using RESET_WITH_SUMMARY. The LLM uses this prompt to summarize the conversation before resetting context.
Raises: ValueError if summary_prompt is not provided when using RESET_WITH_SUMMARY.

Example

from pipecat_flows import ContextStrategy, ContextStrategyConfig

# Append (default)
config = ContextStrategyConfig(strategy=ContextStrategy.APPEND)

# Reset
config = ContextStrategyConfig(strategy=ContextStrategy.RESET)

# Reset with summary
config = ContextStrategyConfig(
    strategy=ContextStrategy.RESET_WITH_SUMMARY,
    summary_prompt="Summarize the key information collected so far.",
)

flows_direct_function Decorator

Decorator that attaches metadata to a Pipecat direct function for use in Flows.
@flows_direct_function(*, cancel_on_interruption: bool = True)
Parameter              | Type | Default | Description
cancel_on_interruption | bool | True    | Whether to cancel the function call when the user interrupts.
Direct functions have their schema automatically extracted from the function signature and docstring. The first parameter must be flow_manager: FlowManager, and all other parameters become the function’s properties. The docstring provides the function description and parameter descriptions (Google-style). Direct functions must return a ConsolidatedFunctionResult tuple.

Example

from pipecat_flows import FlowManager, flows_direct_function, ConsolidatedFunctionResult

@flows_direct_function(cancel_on_interruption=False)
async def lookup_order(
    flow_manager: FlowManager, order_id: str
) -> ConsolidatedFunctionResult:
    """Look up an order by its ID.

    Args:
        order_id: The order ID to look up.
    """
    order = await db.get_order(order_id)
    flow_manager.state["order"] = order
    result = {"status": "success", "order": order}
    next_node = create_order_details_node(order)
    return result, next_node
The function can then be passed directly in a node’s functions list:
node_config: NodeConfig = {
    "task_messages": [{"role": "system", "content": "Ask for the order ID."}],
    "functions": [lookup_order],
}

FlowsDirectFunction

Protocol defining the interface for direct functions. Any async callable matching this signature can be used as a direct function in node configurations.
class FlowsDirectFunction(Protocol):
    def __call__(
        self, flow_manager: FlowManager, **kwargs: Any
    ) -> Awaitable[ConsolidatedFunctionResult]: ...

Type Aliases

FlowResult

class FlowResult(TypedDict, total=False):
    status: str
    error: str
Base return type for function results. The status field indicates the outcome. The optional error field contains an error message if execution failed. Additional fields are allowed and passed through to the LLM.
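Because `FlowResult` is a non-total `TypedDict`, both fields are optional and extra keys flow through to the LLM. The dicts below sketch both outcomes with an illustrative `balance` field:

```python
# FlowResult-shaped dicts (plain dicts; extra keys such as "balance"
# are allowed and passed through to the LLM).
success_result = {"status": "success", "balance": 42.50}
error_result = {"status": "error", "error": "Account not found"}
```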

FlowArgs

FlowArgs = Dict[str, Any]
Type alias for function handler arguments. Contains the parameters extracted from the LLM’s function call.

ConsolidatedFunctionResult

ConsolidatedFunctionResult = Tuple[Optional[FlowResult], Optional[NodeConfig]]
Return type for consolidated function handlers that both do work and specify the next node:
  • First element: The function result (or None for transition-only functions)
  • Second element: The next node as a NodeConfig, or None for node functions (functions that do work without transitioning)
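The two elements combine into three common shapes, sketched below with plain dicts (the node dict mimics a minimal NodeConfig; its contents are illustrative):

```python
# Common ConsolidatedFunctionResult shapes, as plain tuples.
next_node = {"task_messages": [{"role": "system", "content": "Confirm the order."}]}

work_and_transition = ({"status": "success"}, next_node)  # do work, then move on
transition_only = (None, next_node)                       # no result, just transition
node_function = ({"status": "success"}, None)             # do work, stay in node
```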

FlowFunctionHandler

FlowFunctionHandler = Callable[
    [FlowArgs, FlowManager], Awaitable[FlowResult | ConsolidatedFunctionResult]
]
Type for modern function handlers that receive both arguments and the FlowManager instance.

LegacyFunctionHandler

LegacyFunctionHandler = Callable[
    [FlowArgs], Awaitable[FlowResult | ConsolidatedFunctionResult]
]
Type for legacy function handlers that only receive arguments. Both legacy and modern handlers are supported; the flow manager detects the signature automatically.
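The two signatures can be sketched side by side. The handlers below are illustrative placeholders; in practice the framework supplies the real FlowManager instance (here replaced by None purely to make the sketch runnable):

```python
import asyncio

# Modern handler: receives args and the FlowManager instance.
async def modern_handler(args, flow_manager):
    return {"status": "success", "echo": args["value"]}

# Legacy handler: receives only args.
async def legacy_handler(args):
    return {"status": "success", "echo": args["value"]}

modern = asyncio.run(modern_handler({"value": 1}, None))
legacy = asyncio.run(legacy_handler({"value": 1}))
```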