LLM service implementation using OpenRouter’s API with an OpenAI-compatible interface
OpenRouterLLMService provides access to OpenRouter’s language models through an OpenAI-compatible interface. It inherits from OpenAILLMService and supports streaming responses, function calling, and context management.
See also:
- Complete API documentation and method details
- Official OpenRouter API documentation and features
- Working example with function calling
To use OpenRouterLLMService, install the required dependencies:
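The OpenRouter integration ships as an optional extra of the pipecat-ai package; the exact extra name may vary by release, but it is typically installed as:

```bash
pip install "pipecat-ai[openrouter]"
```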
You’ll also need to set up your OpenRouter API key as an environment variable: OPENROUTER_API_KEY.
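For example (replace the placeholder with your own key):

```bash
export OPENROUTER_API_KEY=your_api_key_here
```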
Get your API key from OpenRouter. Free tier includes $1 of credits.
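Once the key is set, constructing the service follows the same pattern as the other OpenAI-compatible services. The import path and model slug below are assumptions that may differ slightly in your Pipecat version; any model slug available on OpenRouter can be passed as model:

```python
import os

# Import path may differ by Pipecat version (e.g. pipecat.services.openrouter).
from pipecat.services.openrouter.llm import OpenRouterLLMService

llm = OpenRouterLLMService(
    api_key=os.getenv("OPENROUTER_API_KEY"),
    model="openai/gpt-4o",  # any OpenRouter model slug
)
```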
Input frames:

- OpenAILLMContextFrame - Conversation context and history
- LLMMessagesFrame - Direct message list
- VisionImageRawFrame - Images for vision processing (select models)
- LLMUpdateSettingsFrame - Runtime parameter updates

Output frames:

- LLMFullResponseStartFrame / LLMFullResponseEndFrame - Response boundaries
- LLMTextFrame - Streamed completion chunks
- FunctionCallInProgressFrame / FunctionCallResultFrame - Function call lifecycle
- ErrorFrame - API or processing errors

Learn how to implement function calling with standardized schemas, register handlers, manage context properly, and control execution flow in your conversational AI applications.
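A minimal function-calling sketch, assuming the standardized FunctionSchema/ToolsSchema helpers and the newer FunctionCallParams-style handler (older Pipecat releases pass the handler arguments individually); import paths and the model slug are assumptions to check against your installed version:

```python
import os

from pipecat.adapters.schemas.function_schema import FunctionSchema
from pipecat.adapters.schemas.tools_schema import ToolsSchema
from pipecat.services.openrouter.llm import OpenRouterLLMService

# Describe the tool with a provider-agnostic schema.
weather_function = FunctionSchema(
    name="get_current_weather",
    description="Get the current weather for a location",
    properties={
        "location": {"type": "string", "description": "City and state, e.g. Austin, TX"},
        "format": {"type": "string", "enum": ["celsius", "fahrenheit"]},
    },
    required=["location"],
)
tools = ToolsSchema(standard_tools=[weather_function])

llm = OpenRouterLLMService(
    api_key=os.getenv("OPENROUTER_API_KEY"),
    model="openai/gpt-4o",
)

# Handler signature follows the newer FunctionCallParams style; older releases
# pass (function_name, tool_call_id, args, llm, context, result_callback).
async def fetch_weather(params):
    # params.arguments holds the parsed tool-call arguments.
    await params.result_callback({"conditions": "sunny", "temperature": 72})

llm.register_function("get_current_weather", fetch_weather)
```

The tools schema is then attached to the LLM context (see the context-management sketch below) so it is sent with each completion request.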
Learn how to manage conversation context, handle message history, and integrate context aggregators for consistent conversational experiences.
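Continuing from the sketch above, a hedged example of context management; the OpenAILLMContext import path and aggregator placement are assumptions based on the OpenAI-compatible pattern this service inherits:

```python
from pipecat.processors.aggregators.openai_llm_context import OpenAILLMContext

# Seed the conversation and attach the tools schema from the previous sketch.
context = OpenAILLMContext(
    messages=[{"role": "system", "content": "You are a helpful assistant."}],
    tools=tools,
)
context_aggregator = llm.create_context_aggregator(context)

# In a pipeline, place context_aggregator.user() before the llm processor and
# context_aggregator.assistant() after it, so both sides of the conversation
# are captured in the shared context.
```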
Inherits all OpenAI metrics capabilities, including time to first byte (TTFB), processing duration, and token usage.
Enable with:
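A typical way to enable metrics is through PipelineParams when creating the pipeline task; the flag names below come from Pipecat’s general metrics support and should be checked against your version:

```python
from pipecat.pipeline.task import PipelineParams, PipelineTask

# `pipeline` is the Pipeline instance assembled elsewhere in your app.
task = PipelineTask(
    pipeline,
    params=PipelineParams(
        enable_metrics=True,        # TTFB and processing-time metrics
        enable_usage_metrics=True,  # token usage metrics
    ),
)
```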