# AnthropicLLMService

Large Language Model service implementation using Anthropic's Claude API.

`AnthropicLLMService` provides integration with Anthropic's Claude models, supporting streaming responses, function calling, and prompt caching, with specialized context handling for Anthropic's message format.

Authentication requires an Anthropic API key, supplied via the `ANTHROPIC_API_KEY` environment variable.
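A minimal sketch of the Anthropic-specific context handling mentioned above: unlike the OpenAI message format, Anthropic's Messages API takes the system prompt as a separate top-level `system` parameter rather than as a `system`-role entry in the message list. The helper below is hypothetical and for illustration only; the real service also handles content blocks, images, and cache-control markers.

```python
def to_anthropic_format(messages):
    """Split OpenAI-style messages into Anthropic's (system, messages) shape.

    Hypothetical illustration of the format conversion; not the service's
    actual implementation.
    """
    system_parts = []
    converted = []
    for m in messages:
        if m["role"] == "system":
            # Anthropic expects the system prompt outside the message list.
            system_parts.append(m["content"])
        else:
            converted.append({"role": m["role"], "content": m["content"]})
    return "\n".join(system_parts), converted
```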
## Input frames

- `OpenAILLMContextFrame` - Conversation context and history
- `LLMMessagesFrame` - Direct message list
- `VisionImageRawFrame` - Images for vision processing
- `LLMUpdateSettingsFrame` - Runtime parameter updates
- `LLMEnablePromptCachingFrame` - Toggle prompt caching

## Output frames

- `LLMFullResponseStartFrame` / `LLMFullResponseEndFrame` - Response boundaries
- `LLMTextFrame` - Streamed completion chunks
- `FunctionCallInProgressFrame` / `FunctionCallResultFrame` - Function call lifecycle
- `ErrorFrame` - API or processing errors
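The response-boundary frames can be sketched as follows. The dataclasses here are simplified stand-ins, not the real frame classes (which carry more fields); the point is that a downstream processor accumulates `LLMTextFrame` chunks between the start and end frames to recover the full completion.

```python
from dataclasses import dataclass


# Stand-in frame types for illustration only.
@dataclass
class LLMFullResponseStartFrame:
    pass


@dataclass
class LLMTextFrame:
    text: str


@dataclass
class LLMFullResponseEndFrame:
    pass


def collect_response(frames):
    """Accumulate streamed text between response-boundary frames."""
    chunks, in_response = [], False
    for f in frames:
        if isinstance(f, LLMFullResponseStartFrame):
            in_response, chunks = True, []
        elif isinstance(f, LLMTextFrame) and in_response:
            chunks.append(f.text)
        elif isinstance(f, LLMFullResponseEndFrame):
            in_response = False
    return "".join(chunks)
```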