Hello World Example

This guide walks through the Hello World example — a two-node conversation flow where the bot asks for a favorite color, records the answer, and says goodbye.

View the full source code on GitHub

Prerequisites

Install Pipecat Flows and Pipecat with the services used in this example:
pip install pipecat-ai-flows
pip install "pipecat-ai[daily,google,cartesia,silero]"
You’ll need API keys for Cartesia (STT + TTS) and Google (LLM) set as environment variables:
export CARTESIA_API_KEY=...
export GOOGLE_API_KEY=...
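To fail fast with a clear message instead of a mid-call authentication error, you can check the keys before starting the bot. This is an optional sketch (not part of the example); the variable names match the exports above:

```python
import os

# Names of the keys exported above.
REQUIRED_KEYS = ["CARTESIA_API_KEY", "GOOGLE_API_KEY"]

def check_env_keys(keys=REQUIRED_KEYS) -> list[str]:
    """Return the names of any required environment variables that are unset."""
    return [key for key in keys if not os.getenv(key)]

missing = check_env_keys()
if missing:
    print(f"Missing environment variables: {', '.join(missing)}")
```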

Define the Nodes

A flow is a graph of nodes. Each node gives the LLM a task and the functions it needs. This example has two nodes: one to ask a question and one to end the conversation.

Initial Node

The initial node sets the bot’s personality via role_message, gives it a task via task_messages, and provides a function the LLM will call when the user answers:
from pipecat_flows import FlowArgs, FlowManager, FlowsFunctionSchema, NodeConfig

def create_initial_node() -> NodeConfig:
    record_favorite_color_func = FlowsFunctionSchema(
        name="record_favorite_color_func",
        description="Record the color the user said is their favorite.",
        required=["color"],
        handler=record_favorite_color_and_set_next_node,
        properties={"color": {"type": "string"}},
    )

    return {
        "name": "initial",
        "role_message": "You are an inquisitive child. Use very simple language. Ask simple questions. You must ALWAYS use one of the available functions to progress the conversation. Your responses will be converted to audio. Avoid outputting special characters and emojis.",
        "task_messages": [
            {
                "role": "developer",
                "content": "Say 'Hello world' and ask what is the user's favorite color.",
            }
        ],
        "functions": [record_favorite_color_func],
    }
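For intuition, the FlowsFunctionSchema above corresponds to a standard JSON-Schema-style tool definition. The sketch below is an approximation of what the LLM receives; the exact wire format is provider-specific and handled by Pipecat Flows:

```python
# Approximate tool definition derived from the FlowsFunctionSchema fields.
# Illustrative only: the real serialization depends on the LLM provider.
record_favorite_color_tool = {
    "name": "record_favorite_color_func",
    "description": "Record the color the user said is their favorite.",
    "parameters": {
        "type": "object",
        "properties": {"color": {"type": "string"}},
        "required": ["color"],
    },
}
```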

End Node

The end node thanks the user and ends the conversation via the end_conversation post-action:
def create_end_node() -> NodeConfig:
    return NodeConfig(
        name="end",
        task_messages=[
            {
                "role": "developer",
                "content": "Thank the user for answering and end the conversation",
            }
        ],
        post_actions=[{"type": "end_conversation"}],
    )
Nodes can be defined as plain dicts or as NodeConfig objects — both work identically. This example uses both styles to show you the options.
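The interchangeability suggests NodeConfig behaves like a TypedDict, where keyword construction produces an ordinary dict at runtime. A standalone sketch with a stand-in TypedDict (NodeConfigSketch is hypothetical, not the real pipecat_flows type):

```python
from typing import TypedDict

# Stand-in for pipecat_flows.NodeConfig, assuming it is a TypedDict;
# under that assumption the two construction styles are interchangeable.
class NodeConfigSketch(TypedDict, total=False):
    name: str
    task_messages: list
    post_actions: list

as_dict = {"name": "end", "post_actions": [{"type": "end_conversation"}]}
as_typed = NodeConfigSketch(name="end", post_actions=[{"type": "end_conversation"}])

print(as_dict == as_typed)  # a TypedDict instance is a plain dict at runtime
```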

Write the Function Handler

When the LLM calls the function, the handler processes the result and returns the next node:
async def record_favorite_color_and_set_next_node(
    args: FlowArgs, flow_manager: FlowManager
) -> tuple[str, NodeConfig]:
    print(f"Your favorite color is: {args['color']}")
    return args["color"], create_end_node()
The handler returns a tuple of (result, next_node). The result is provided to the LLM as context, and the next node is where the conversation transitions to.
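The (result, next_node) contract can be exercised without a running pipeline. Below is a sketch that calls a handler-shaped coroutine directly, with plain dicts and a stub node standing in for the real FlowArgs/NodeConfig types; the normalization step is an illustrative addition, not part of the example:

```python
import asyncio

# Stub node standing in for create_end_node() from the example.
def create_end_node_stub() -> dict:
    return {"name": "end", "post_actions": [{"type": "end_conversation"}]}

async def record_color_handler(args: dict, flow_manager=None) -> tuple:
    # Normalize the LLM-provided argument before returning it as the result.
    color = args["color"].strip().lower()
    return color, create_end_node_stub()

result, node = asyncio.run(record_color_handler({"color": "  Blue "}))
print(result, node["name"])  # blue end
```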

Build the Pipeline and FlowManager

Set up a standard Pipecat pipeline, then create a FlowManager and initialize it when a client connects:
async def run_bot(transport: BaseTransport, runner_args: RunnerArguments):
    stt = CartesiaSTTService(api_key=os.getenv("CARTESIA_API_KEY"))
    tts = CartesiaTTSService(
        api_key=os.getenv("CARTESIA_API_KEY"),
        voice_id="32b3f3c5-7171-46aa-abe7-b598964aa793",
    )
    llm = GoogleLLMService(api_key=os.getenv("GOOGLE_API_KEY"))

    context = LLMContext()
    context_aggregator = LLMContextAggregatorPair(
        context,
        user_params=LLMUserAggregatorParams(
            vad_analyzer=SileroVADAnalyzer(),
        ),
    )

    pipeline = Pipeline(
        [
            transport.input(),
            stt,
            context_aggregator.user(),
            llm,
            tts,
            transport.output(),
            context_aggregator.assistant(),
        ]
    )

    task = PipelineTask(pipeline, params=PipelineParams(allow_interruptions=True))

    # Initialize flow manager
    flow_manager = FlowManager(
        task=task,
        llm=llm,
        context_aggregator=context_aggregator,
        transport=transport,
    )

    @transport.event_handler("on_client_connected")
    async def on_client_connected(transport, client):
        await flow_manager.initialize(create_initial_node())

    runner = PipelineRunner(handle_sigint=runner_args.handle_sigint)
    await runner.run(task)
That’s it! When a user connects, the bot greets them, asks for their favorite color, records the answer, thanks them, and ends the conversation.
See the full source code for the complete runnable example.

Next Steps

Nodes & Messages

Learn about node configuration and message types

Functions

Understand node functions, edge functions, and direct functions

Examples

Explore more complex examples

API Reference

Complete technical reference