Build your first conversation flow with Pipecat Flows.
This guide walks through the Hello World example — a two-node conversation flow where the bot asks for a favorite color, records the answer, and says goodbye.
A flow is a graph of nodes. Each node gives the LLM a task and the functions it needs. This example has two nodes: one to ask a question and one to end the conversation.
The initial node sets the bot’s personality via role_message, gives it a task via task_messages, and provides a function the LLM will call when the user answers:
```python
from pipecat_flows import FlowArgs, FlowManager, FlowsFunctionSchema, NodeConfig


def create_initial_node() -> NodeConfig:
    record_favorite_color_func = FlowsFunctionSchema(
        name="record_favorite_color_func",
        description="Record the color the user said is their favorite.",
        required=["color"],
        handler=record_favorite_color_and_set_next_node,
        properties={"color": {"type": "string"}},
    )
    return {
        "name": "initial",
        "role_message": (
            "You are an inquisitive child. Use very simple language. "
            "Ask simple questions. You must ALWAYS use one of the available "
            "functions to progress the conversation. Your responses will be "
            "converted to audio. Avoid outputting special characters and emojis."
        ),
        "task_messages": [
            {
                "role": "developer",
                "content": "Say 'Hello world' and ask what is the user's favorite color.",
            }
        ],
        "functions": [record_favorite_color_func],
    }
```
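If you are curious what the LLM actually sees, a `FlowsFunctionSchema` conceptually maps onto a standard OpenAI-style tool definition. The exact serialization is handled internally by Pipecat Flows, so the layout below is an illustrative assumption based on the common tool format, not code from the library:

```python
# Hypothetical illustration: the OpenAI-style tool definition that a schema
# like record_favorite_color_func conceptually corresponds to. The real
# serialization is done internally by Pipecat Flows.
tool_definition = {
    "type": "function",
    "function": {
        "name": "record_favorite_color_func",
        "description": "Record the color the user said is their favorite.",
        "parameters": {
            "type": "object",
            "properties": {"color": {"type": "string"}},
            "required": ["color"],
        },
    },
}

print(tool_definition["function"]["name"])  # record_favorite_color_func
```

This is why the schema asks for `name`, `description`, `properties`, and `required`: they are exactly the fields a tool definition needs.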
The end node thanks the user and ends the conversation via the end_conversation post-action:
```python
def create_end_node() -> NodeConfig:
    return NodeConfig(
        name="end",
        task_messages=[
            {
                "role": "developer",
                "content": "Thank the user for answering and end the conversation.",
            }
        ],
        post_actions=[{"type": "end_conversation"}],
    )
```
Nodes can be defined as plain dicts or as NodeConfig objects; both work
identically. This example uses both styles to show the options.
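The two styles are interchangeable because, in current Pipecat Flows releases, `NodeConfig` is a `TypedDict`, and calling a `TypedDict` constructor just builds a plain dict. A self-contained sketch with a stand-in `TypedDict` (the real `NodeConfig` has more fields):

```python
from typing import TypedDict


class NodeConfig(TypedDict, total=False):
    """Stand-in for pipecat_flows.NodeConfig, just to illustrate the point."""

    name: str
    task_messages: list
    post_actions: list


as_dict = {"name": "end", "post_actions": [{"type": "end_conversation"}]}
as_config = NodeConfig(name="end", post_actions=[{"type": "end_conversation"}])

print(as_dict == as_config)  # True: a TypedDict call returns a plain dict
```

The dict style is terser; the `NodeConfig(...)` style gives you editor autocompletion and type checking on field names.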
When the LLM calls the function, the handler processes the result and returns the next node:
```python
async def record_favorite_color_and_set_next_node(
    args: FlowArgs, flow_manager: FlowManager
) -> tuple[str, NodeConfig]:
    print(f"Your favorite color is: {args['color']}")
    return args["color"], create_end_node()
```
The handler returns a tuple of (result, next_node). The result is appended to the LLM's context, and the conversation transitions to the next node.
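To see how the pieces fit together without running a full pipeline, here is a hypothetical, stripped-down driver loop. It is not Pipecat's actual FlowManager; it only mimics how the framework conceptually consumes a (result, next_node) tuple:

```python
import asyncio


def create_end_node():
    # Minimal stand-in for the end node from the example above.
    return {"name": "end", "post_actions": [{"type": "end_conversation"}]}


async def record_favorite_color_and_set_next_node(args, flow_manager=None):
    # Same shape as the real handler: return (result, next_node).
    return args["color"], create_end_node()


async def drive_flow():
    # Pretend the LLM called the function with this payload.
    result, next_node = await record_favorite_color_and_set_next_node({"color": "blue"})
    # The result goes back into the LLM context; next_node becomes current.
    return result, next_node["name"]


print(asyncio.run(drive_flow()))  # ('blue', 'end')
```

In the real framework, the transition to the end node then triggers its `end_conversation` post-action, which closes out the session.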