Function Calling
Enable LLMs to interact with external services and APIs
Understanding Function Calling
Function calling (also known as tool calling) allows LLMs to request information from external services and APIs. This enables your bot to access real-time data and perform actions that aren’t part of its training data.
For example, you could give your bot the ability to:
- Check current weather conditions
- Look up stock prices
- Query a database
- Control smart home devices
- Schedule appointments
Here’s how it works:
- You define functions the LLM can use
- When needed, the LLM requests a function call
- Your application executes the function
- The result is sent back to the LLM
- The LLM uses this information in its response
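As a concrete illustration of steps 2–4, here is what the exchange looks like with an OpenAI-style chat API (the message shapes below follow the OpenAI tool-calling format; other providers use different structures, and the function name and arguments are hypothetical):

```python
import json

# Step 2: the LLM responds with a request to call a function instead of text.
assistant_message = {
    "role": "assistant",
    "content": None,
    "tool_calls": [
        {
            "id": "call_abc123",
            "type": "function",
            "function": {
                "name": "get_weather",
                "arguments": json.dumps({"location": "San Francisco"}),
            },
        }
    ],
}

# Steps 3-4: your application runs the function and sends the result back,
# keyed to the tool call id so the LLM can match request and response.
tool_result_message = {
    "role": "tool",
    "tool_call_id": "call_abc123",
    "content": json.dumps({"conditions": "sunny", "temperature_f": 68}),
}

# Step 5: the LLM reads the tool message and produces a normal text reply.
```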
Implementation
1. Define Functions
Functions are defined differently depending on your LLM provider. Here are examples of a weather function for supported providers:
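As one example of a provider-specific format, an OpenAI-style JSON Schema definition for a hypothetical `get_weather` function might look like this (field names follow the OpenAI tools format; the function name, parameters, and descriptions are illustrative):

```python
# OpenAI-style function definition; other providers use different shapes.
weather_function = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a location.",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "City and state, e.g. 'Austin, TX'",
                },
                "format": {
                    "type": "string",
                    "enum": ["celsius", "fahrenheit"],
                    "description": "Temperature unit to use in the response.",
                },
            },
            "required": ["location"],
        },
    },
}
```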
2. Register Function Handlers
Register handlers for your functions using the LLM service's register_function method:
Alternatively, register a single handler for all function calls:
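The following sketch illustrates the registration pattern with a minimal stand-in class; it is not the real LLM service API, just a model of how per-function and catch-all registration dispatch calls to handlers:

```python
import asyncio

class FunctionRegistry:
    """Minimal stand-in for an LLM service's function registry,
    shown only to illustrate the registration pattern."""

    def __init__(self):
        self._handlers = {}
        self._catch_all = None

    def register_function(self, name, handler):
        # Passing None as the name registers a catch-all handler.
        if name is None:
            self._catch_all = handler
        else:
            self._handlers[name] = handler

    async def dispatch(self, name, args):
        # Prefer a handler registered for this exact name, then the catch-all.
        handler = self._handlers.get(name, self._catch_all)
        if handler is None:
            raise KeyError(f"No handler registered for {name!r}")
        return await handler(name, args)

async def fetch_weather(function_name, args):
    # Illustrative handler; a real one would call a weather API.
    return {"conditions": "sunny", "location": args["location"]}

registry = FunctionRegistry()
registry.register_function("get_weather", fetch_weather)  # one specific function
registry.register_function(None, fetch_weather)           # catch-all fallback

result = asyncio.run(registry.dispatch("get_weather", {"location": "Oslo"}))
```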
3. Create the Pipeline
Include your LLM service in your pipeline with the registered functions:
Function Handler Details
Handler Parameters
- `function_name`: Name of the called function
- `tool_call_id`: Unique identifier for the function call
- `args`: Arguments passed by the LLM
- `llm`: Reference to the LLM service
- `context`: Current conversation context
- `result_callback`: Async function to return results
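Putting these parameters together, a handler might look like the following sketch (the function name and weather lookup are hypothetical; only the parameter list follows the description above, and the stand-in values below take the place of objects the LLM service would normally supply):

```python
import asyncio

async def fetch_weather_handler(
    function_name, tool_call_id, args, llm, context, result_callback
):
    # `args` holds the arguments the LLM supplied for this call.
    location = args.get("location", "unknown")
    # A real handler would query a weather service here.
    weather = {"location": location, "conditions": "sunny", "temperature_f": 68}
    # Results go back to the LLM via the async result callback.
    await result_callback(weather)

# Exercise the handler with stand-ins for the service-provided arguments.
results = []

async def capture(result):
    results.append(result)

asyncio.run(
    fetch_weather_handler(
        function_name="get_weather",
        tool_call_id="call_1",
        args={"location": "Oslo"},
        llm=None,       # stand-in for the LLM service reference
        context=None,   # stand-in for the conversation context
        result_callback=capture,
    )
)
```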
Return Values
- Return data through the result_callback
- Return None to ignore the function call
- Errors should be handled within your function
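One way to handle errors inside the handler is to catch exceptions and report them back through the result callback, so a failed lookup becomes information the LLM can respond to rather than an exception that escapes into the pipeline (a sketch with hypothetical names, exercised here with stand-in arguments):

```python
import asyncio

async def safe_weather_handler(
    function_name, tool_call_id, args, llm, context, result_callback
):
    try:
        location = args["location"]  # raises KeyError if the LLM omitted it
        # A real lookup would go here; we simulate a successful response.
        await result_callback({"location": location, "conditions": "sunny"})
    except Exception as exc:
        # Report the failure to the LLM instead of letting the
        # exception propagate out of the handler.
        await result_callback({"error": str(exc)})

captured = []

async def capture(result):
    captured.append(result)

# Missing "location" triggers the error path.
asyncio.run(safe_weather_handler("get_weather", "call_2", {}, None, None, capture))
```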
Next steps
- Check out the function calling examples to see complete implementations for specific LLM providers.
- Refer to your LLM provider's documentation to learn more about its function calling capabilities.