Large Language Model service implementation using Anthropic’s Claude API
AnthropicLLMService provides integration with Anthropic's Claude models, supporting streaming responses, function calling, and prompt caching, with specialized context handling for Anthropic's message format.
To use Anthropic services, install the required dependency:
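The dependency is typically installed as an optional extra of the `pipecat-ai` distribution (the extra name below is an assumption based on common Pipecat packaging conventions):

```shell
pip install "pipecat-ai[anthropic]"
```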
You'll also need to set your Anthropic API key as an environment variable: ANTHROPIC_API_KEY.
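For example, in your shell (replace the placeholder with your actual key):

```shell
export ANTHROPIC_API_KEY=your-api-key-here
```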
Get your API key from Anthropic Console.
Input frames:

- OpenAILLMContextFrame - Conversation context and history
- LLMMessagesFrame - Direct message list
- VisionImageRawFrame - Images for vision processing
- LLMUpdateSettingsFrame - Runtime parameter updates
- LLMEnablePromptCachingFrame - Toggle prompt caching

Output frames:

- LLMFullResponseStartFrame / LLMFullResponseEndFrame - Response boundaries
- LLMTextFrame - Streamed completion chunks
- FunctionCallInProgressFrame / FunctionCallResultFrame - Function call lifecycle
- ErrorFrame - API or processing errors

Learn how to implement function calling with standardized schemas, register handlers, manage context properly, and control execution flow in your conversational AI applications.
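The streamed-response frame lifecycle can be illustrated with a simplified sketch. The dataclasses below are hypothetical stand-ins for the Pipecat frame types named above, not the library's own classes:

```python
from dataclasses import dataclass

# Stand-ins for the frame types listed above (shapes are assumptions).
@dataclass
class LLMFullResponseStartFrame: ...

@dataclass
class LLMFullResponseEndFrame: ...

@dataclass
class LLMTextFrame:
    text: str

def stream_response(chunks):
    """Yield frames the way a streaming LLM service brackets a completion:
    a start frame, one text frame per streamed chunk, then an end frame."""
    yield LLMFullResponseStartFrame()
    for chunk in chunks:
        yield LLMTextFrame(text=chunk)
    yield LLMFullResponseEndFrame()

frames = list(stream_response(["Hello", ", ", "world"]))
full_text = "".join(f.text for f in frames if isinstance(f, LLMTextFrame))
```

Downstream processors can rely on the start/end boundary frames to know when a complete assistant turn has been assembled from the individual text chunks.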
Learn how to manage conversation context, handle message history, and integrate context aggregators for consistent conversational experiences.
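As a rough illustration of what a context aggregator does, the sketch below accumulates alternating user and assistant messages into the role/content message-list shape that Anthropic-style APIs expect. It is a simplified stand-in, not Pipecat's actual aggregator:

```python
class SimpleContext:
    """Minimal conversation context: a system prompt plus an ordered
    list of role/content messages, appended to as the conversation runs."""

    def __init__(self, system=""):
        self.system = system
        self.messages = []

    def add_user(self, text):
        self.messages.append({"role": "user", "content": text})

    def add_assistant(self, text):
        self.messages.append({"role": "assistant", "content": text})

ctx = SimpleContext(system="You are a helpful assistant.")
ctx.add_user("What's the capital of France?")
ctx.add_assistant("Paris.")
ctx.add_user("And its population?")
```

Keeping the full alternating history in the context is what lets follow-up questions like the last one resolve pronouns ("its") against earlier turns.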
The service provides prompt caching, which can reduce latency and token costs when the same context is reused across turns. Enable or disable it at runtime with the LLMEnablePromptCachingFrame.
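The toggle semantics can be sketched with a stand-in frame class; the field name `enable` and the processing logic below are assumptions for illustration, not the real Pipecat implementation:

```python
from dataclasses import dataclass

@dataclass
class LLMEnablePromptCachingFrame:
    # Assumed field name; the real Pipecat frame may differ.
    enable: bool

class CachingLLMService:
    """Toy service that flips its caching flag when it sees the frame."""

    def __init__(self):
        self.caching_enabled = False

    def process_frame(self, frame):
        if isinstance(frame, LLMEnablePromptCachingFrame):
            self.caching_enabled = frame.enable

svc = CachingLLMService()
svc.process_frame(LLMEnablePromptCachingFrame(enable=True))
```

In a real pipeline you would queue such a frame upstream of the LLM service rather than calling a method on it directly.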