LLM service implementation using Together AI's API with an OpenAI-compatible interface.

TogetherLLMService provides access to Together AI's language models, including Meta's Llama 3.1 and 3.2 models, through an OpenAI-compatible interface. It inherits from OpenAILLMService and supports streaming responses, function calling, and context management.
Related resources:

- Complete API documentation and method details
- Official Together AI API documentation and features
- Working example with function calling
To use TogetherLLMService, install the required dependencies:
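The Together integration ships as an optional extra of the pipecat-ai package (the extra name is assumed from Pipecat's packaging convention for service integrations):

```shell
pip install "pipecat-ai[together]"
```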
You'll also need to set your Together AI API key as an environment variable: TOGETHER_API_KEY. Get your API key from the Together AI Console.
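Because the service is OpenAI-compatible, the underlying request is a standard chat-completions payload sent to Together's endpoint. A minimal standard-library sketch of that payload shape (the model name is illustrative, and Pipecat normally builds this request for you):

```python
import json
import os

# Together AI's OpenAI-compatible base URL.
BASE_URL = "https://api.together.xyz/v1"


def build_chat_request(messages, model="meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo"):
    """Build an OpenAI-style chat-completions request for Together AI."""
    return {
        "url": f"{BASE_URL}/chat/completions",
        "headers": {
            # Reads the key from the TOGETHER_API_KEY environment variable.
            "Authorization": f"Bearer {os.environ.get('TOGETHER_API_KEY', '')}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "model": model,
            "messages": messages,
            "stream": True,  # the service streams completion chunks
        }),
    }


req = build_chat_request([{"role": "user", "content": "Hello"}])
```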
Input frames:

- OpenAILLMContextFrame - Conversation context and history
- LLMMessagesFrame - Direct message list
- VisionImageRawFrame - Images for vision processing (select models)
- LLMUpdateSettingsFrame - Runtime parameter updates

Output frames:

- LLMFullResponseStartFrame / LLMFullResponseEndFrame - Response boundaries
- LLMTextFrame - Streamed completion chunks
- FunctionCallInProgressFrame / FunctionCallResultFrame - Function call lifecycle
- ErrorFrame - API or processing errors

Learn how to implement function calling with standardized schemas, register handlers, manage context properly, and control execution flow in your conversational AI applications.

Learn how to manage conversation context, handle message history, and integrate context aggregators for consistent conversational experiences.
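Function calls travel through the frames above as OpenAI-style tool schemas. A minimal sketch of one such schema and a local dispatch handler (the get_weather function, its parameters, and the canned result are purely illustrative, not part of Pipecat's API):

```python
import json

# OpenAI-style tool schema; Together's OpenAI-compatible API accepts this shape.
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}


def handle_tool_call(name, arguments_json):
    """Dispatch a model-issued tool call to a local handler."""
    args = json.loads(arguments_json)
    if name == "get_weather":
        # Real code would query a weather service; return a canned result here.
        return {"city": args["city"], "conditions": "sunny", "temp_c": 21}
    raise ValueError(f"Unknown tool: {name}")


result = handle_tool_call("get_weather", '{"city": "Austin"}')
```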
Inherits all OpenAI metrics capabilities, such as token usage and time-to-first-byte (TTFB) metrics. Enable with:
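A sketch of the pipeline configuration that turns metrics on, assuming Pipecat's PipelineParams API:

```python
from pipecat.pipeline.task import PipelineParams, PipelineTask

# `pipeline` is your assembled Pipecat pipeline containing TogetherLLMService.
task = PipelineTask(
    pipeline,
    params=PipelineParams(
        enable_metrics=True,        # TTFB and processing metrics
        enable_usage_metrics=True,  # token usage metrics
    ),
)
```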