OpenAI
Large Language Model services using OpenAI’s chat completion API
Overview
OpenAILLMService provides chat completion capabilities using OpenAI’s API, supporting streaming responses, function calling, vision input, and advanced context management for conversational AI applications.
- API Reference - Complete API documentation and method details
- OpenAI Docs - Official OpenAI API documentation
- Example Code - Function calling example with weather API
Installation
To use OpenAI services, install the required dependencies:
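For example, with pip (assuming Pipecat’s openai extra, which pulls in the OpenAI client; check the install docs for the exact extra name):

```shell
pip install "pipecat-ai[openai]"
```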
You’ll also need to set your OpenAI API key in the OPENAI_API_KEY environment variable.
Get your API key from the OpenAI Platform.
Frames
Input
- OpenAILLMContextFrame - OpenAI-specific conversation context
- LLMMessagesFrame - Standard conversation messages
- VisionImageRawFrame - Images for vision model processing
- LLMUpdateSettingsFrame - Runtime model configuration updates
Output
- LLMFullResponseStartFrame / LLMFullResponseEndFrame - Response boundaries
- LLMTextFrame - Streamed completion chunks
- FunctionCallInProgressFrame / FunctionCallResultFrame - Function call lifecycle
- ErrorFrame - API or processing errors
Function Calling
Function Calling Guide
Learn how to implement function calling with standardized schemas, register handlers, manage context properly, and control execution flow in your conversational AI applications.
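As a minimal sketch, here is what a function definition looks like in OpenAI’s chat-completion "tools" format. The weather function name and its parameters are illustrative, not part of Pipecat’s API:

```python
# A function definition in OpenAI's "tools" format. The name and
# parameters below are illustrative; define your own schema the same way.
get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather for a location.",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "City and state, e.g. 'Austin, TX'",
                },
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["location"],
        },
    },
}
```

The schema is plain JSON-serializable data, so it can be built and validated locally before the service sends it with each completion request.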
Context Management
Context Management Guide
Learn how to manage conversation context, handle message history, and integrate context aggregators for consistent conversational experiences.
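Pipecat’s context aggregators handle this for you; as a sketch of the underlying idea, a conversation context is a running list of messages in the OpenAI format, trimmed so the prompt stays bounded (the trimming policy here is illustrative):

```python
# Sketch: keep the system prompt plus the most recent exchanges.
def trim_context(messages, max_messages=10):
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    return system + rest[-max_messages:]

context = [{"role": "system", "content": "You are a helpful assistant."}]
for i in range(12):
    context.append({"role": "user", "content": f"question {i}"})
    context.append({"role": "assistant", "content": f"answer {i}"})

# After 12 exchanges, only the system prompt and the last 10 messages remain.
context = trim_context(context, max_messages=10)
```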
Usage Example
Basic Conversation with Function Calling
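A full Pipecat pipeline is beyond this sketch, but the function-call round trip at its core looks like the following. The handler name and return value are illustrative; in Pipecat you would register the handler on the service (e.g. with register_function) and the service would invoke it when the model emits a tool call:

```python
import json

# Illustrative handler; a real one would call a weather API.
def get_current_weather(location, unit="fahrenheit"):
    return {"location": location, "unit": unit, "temperature": 72}

# Registry mapping tool names to handlers, as a registration sketch.
HANDLERS = {"get_current_weather": get_current_weather}

# A tool call as it appears in a chat completion response, with
# streamed argument chunks already reassembled into one JSON string.
tool_call = {
    "name": "get_current_weather",
    "arguments": '{"location": "Austin, TX"}',
}

# Dispatch: parse the JSON arguments and invoke the registered handler.
result = HANDLERS[tool_call["name"]](**json.loads(tool_call["arguments"]))
```

The handler’s return value is what gets appended to the context as the function result, after which the model is queried again to produce the final spoken response.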
Metrics
The service provides:
- Time to First Byte (TTFB) - Latency from request to first response token
- Processing Duration - Total request processing time
- Token Usage - Prompt tokens, completion tokens, and total usage
Enable with:
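In Pipecat this is typically done through the pipeline task parameters; a configuration sketch, assuming PipelineParams exposes enable_metrics and enable_usage_metrics flags (check the PipelineParams reference for the current names):

```python
from pipecat.pipeline.task import PipelineParams, PipelineTask

# `pipeline` is your assembled Pipeline instance.
task = PipelineTask(
    pipeline,
    params=PipelineParams(
        enable_metrics=True,        # TTFB and processing duration
        enable_usage_metrics=True,  # token usage
    ),
)
```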
Additional Notes
- Streaming Responses: All responses are streamed for low latency
- Context Persistence: Use context aggregators to maintain conversation history
- Error Handling: Automatic retry logic for rate limits and transient errors
- Compatible Services: Works with OpenAI-compatible APIs by setting the base_url parameter
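For example, pointing the service at an OpenAI-compatible endpoint. The URL and model name are placeholders, and the import path may differ by Pipecat version:

```python
from pipecat.services.openai.llm import OpenAILLMService

# Placeholder endpoint and model; any OpenAI-compatible server works.
llm = OpenAILLMService(
    model="my-model",
    base_url="http://localhost:8000/v1",
)
```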