Function calling (also known as tool calling) allows LLMs to request information from external services and APIs. This enables your bot to access real-time data and perform actions beyond the scope of its training data.
For example, you could give your bot the ability to:
- Check current weather conditions
- Look up stock prices
- Query a database
- Control smart home devices
- Schedule appointments
Here’s how it works:
1. You define functions the LLM can use and register them with the LLM service used in your pipeline
2. When needed, the LLM requests a function call
3. Your application executes the corresponding function and returns the result
Pipecat provides a standardized FunctionSchema that works across all supported LLM providers. This makes it easy to define functions once and use them with any provider.
As a shorthand, you can also skip specifying a function configuration entirely and instead use "direct" functions. Under the hood, these are converted to FunctionSchemas.
```python
from pipecat.adapters.schemas.function_schema import FunctionSchema
from pipecat.adapters.schemas.tools_schema import ToolsSchema
from pipecat.processors.aggregators.openai_llm_context import OpenAILLMContext

# Define a function using the standard schema
weather_function = FunctionSchema(
    name="get_current_weather",
    description="Get the current weather in a location",
    properties={
        "location": {
            "type": "string",
            "description": "The city and state, e.g. San Francisco, CA",
        },
        "format": {
            "type": "string",
            "enum": ["celsius", "fahrenheit"],
            "description": "The temperature unit to use.",
        },
    },
    required=["location", "format"],
)

# Create a tools schema with your functions
tools = ToolsSchema(standard_tools=[weather_function])

# Pass this to your LLM context
context = OpenAILLMContext(
    messages=[{"role": "system", "content": "You are a helpful assistant."}],
    tools=tools,
)
```
The ToolsSchema will be automatically converted to the correct format for your LLM provider through adapters.
You can bypass specifying a function configuration (as a FunctionSchema or in a provider-specific format) and instead pass the function directly to your ToolsSchema. Pipecat will auto-configure the function, gathering relevant metadata from its signature and docstring. Metadata includes:

- Function name
- Function description
- Parameter names, types, and descriptions
- Required parameters
Note that the function signature differs slightly for direct functions: the first parameter must be FunctionCallParams, followed by any parameters the function itself needs.
```python
from pipecat.adapters.schemas.tools_schema import ToolsSchema
from pipecat.processors.aggregators.openai_llm_context import OpenAILLMContext
from pipecat.services.llm_service import FunctionCallParams

# Define a direct function
async def get_current_weather(params: FunctionCallParams, location: str, format: str):
    """Get the current weather.

    Args:
        location: The city and state, e.g. "San Francisco, CA".
        format: The temperature unit to use. Must be either "celsius" or "fahrenheit".
    """
    weather_data = {"conditions": "sunny", "temperature": "75"}
    await params.result_callback(weather_data)

# Create a tools schema, passing your function directly to it
tools = ToolsSchema(standard_tools=[get_current_weather])

# Pass this to your LLM context
context = OpenAILLMContext(
    messages=[{"role": "system", "content": "You are a helpful assistant."}],
    tools=tools,
)
```
You can also define functions in the provider-specific format if needed:
```python
from openai.types.chat import ChatCompletionToolParam

# OpenAI native format
tools = [
    ChatCompletionToolParam(
        type="function",
        function={
            "name": "get_current_weather",
            "description": "Get the current weather",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    },
                    "format": {
                        "type": "string",
                        "enum": ["celsius", "fahrenheit"],
                        "description": "The temperature unit to use.",
                    },
                },
                "required": ["location", "format"],
            },
        },
    )
]
```
Some providers support unique tools that don’t fit the standard function schema. For these cases, you can add custom tools:
```python
from pipecat.adapters.schemas.function_schema import FunctionSchema
from pipecat.adapters.schemas.tools_schema import AdapterType, ToolsSchema

# Standard functions
weather_function = FunctionSchema(
    name="get_current_weather",
    description="Get the current weather",
    properties={"location": {"type": "string"}},
    required=["location"],
)

# Custom Gemini search tool
gemini_search_tool = {
    "web_search": {
        "description": "Search the web for information"
    }
}

# Create a tools schema with both standard and custom tools
tools = ToolsSchema(
    standard_tools=[weather_function],
    custom_tools={
        AdapterType.GEMINI: [gemini_search_tool]
    },
)
```
See the provider-specific documentation for details on custom tools and their formats.
Register handlers for your functions using one of these LLM service methods:

- register_function
- register_direct_function

Which one you use depends on whether your function is a "direct" function; examples of both follow below.
```python
from pipecat.services.llm_service import FunctionCallParams

llm = OpenAILLMService(api_key="your-api-key")

# Main function handler - called to execute the function
async def fetch_weather_from_api(params: FunctionCallParams):
    # Fetch weather data from your API
    weather_data = {"conditions": "sunny", "temperature": "75"}
    await params.result_callback(weather_data)

# Register the function
llm.register_function("get_current_weather", fetch_weather_from_api)
```
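For direct functions, you don't pass a function name and handler separately; you register the function itself. A minimal sketch, reusing the get_current_weather direct function defined earlier:

```python
# Register a direct function; its name, description, and parameters
# are gathered from the function's signature and docstring
llm.register_direct_function(get_current_weather)
```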
Include your LLM service in your pipeline with the registered functions:
```python
# Initialize the LLM context with your function schemas
context = OpenAILLMContext(
    messages=[{"role": "system", "content": "You are a helpful assistant."}],
    tools=tools,
)

# Create the context aggregator to collect the user and assistant context
context_aggregator = llm.create_context_aggregator(context)

# Create the pipeline
pipeline = Pipeline(
    [
        transport.input(),               # Input from the transport
        stt,                             # STT processing
        context_aggregator.user(),       # User context aggregation
        llm,                             # LLM processing
        tts,                             # TTS processing
        transport.output(),              # Output to the transport
        context_aggregator.assistant(),  # Assistant context aggregation
    ]
)
```
When returning results from a function handler, you can control how the LLM processes those results using a FunctionCallResultProperties object passed to the result callback.
It can be handy to skip a completion when you have back-to-back function calls. Note that if you skip a completion, you must manually trigger one from the context.
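For example, a handler can return its result without triggering a completion, then trigger one manually once the context has been updated. A minimal sketch, assuming a PipelineTask named task and the context_aggregator created earlier:

```python
from pipecat.frames.frames import FunctionCallResultProperties
from pipecat.services.llm_service import FunctionCallParams

async def fetch_weather_from_api(params: FunctionCallParams):
    weather_data = {"conditions": "sunny", "temperature": "75"}

    async def on_update():
        # Manually trigger a completion from the updated context
        await task.queue_frames([context_aggregator.user().get_context_frame()])

    # Skip the automatic completion for this function call result
    properties = FunctionCallResultProperties(run_llm=False, on_context_updated=on_update)

    await params.result_callback(weather_data, properties=properties)
```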