DeepSeek
LLM service implementation using DeepSeek’s API with OpenAI-compatible interface
Overview
DeepSeekLLMService provides access to DeepSeek’s language models through an OpenAI-compatible interface. It inherits from OpenAILLMService and supports streaming responses, function calling, and context management.
Installation
To use DeepSeekLLMService, install the required dependencies. You’ll also need to set your DeepSeek API key in the DEEPSEEK_API_KEY environment variable.
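A minimal setup might look like the following; the `deepseek` package extra follows Pipecat’s usual naming pattern and should be treated as an assumption:

```shell
# Install Pipecat with DeepSeek support (extras name assumed)
pip install "pipecat-ai[deepseek]"

# Expose your API key to the service
export DEEPSEEK_API_KEY=your_api_key
```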
Configuration
Constructor Parameters
- Your DeepSeek API key
- DeepSeek API endpoint
- Model identifier
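A constructor call covering these three parameters might look like the sketch below; the import path, parameter names, and default endpoint are assumptions, not verified values:

```python
import os

# Import path assumed; check your Pipecat version for the exact module.
from pipecat.services.deepseek import DeepSeekLLMService

llm = DeepSeekLLMService(
    api_key=os.getenv("DEEPSEEK_API_KEY"),     # your DeepSeek API key
    base_url="https://api.deepseek.com/v1",    # DeepSeek API endpoint (assumed default)
    model="deepseek-chat",                     # model identifier
)
```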
Input Parameters
Inherits all input parameters from OpenAILLMService:
- Additional parameters to pass to the model
- Reduces likelihood of repeating tokens based on their frequency. Range: [-2.0, 2.0]
- Maximum number of tokens to generate. Must be greater than or equal to 1
- Reduces likelihood of repeating any tokens that have appeared. Range: [-2.0, 2.0]
- Controls randomness in the output. Range: [0.0, 2.0]
- Controls diversity via nucleus sampling. Range: [0.0, 1.0]
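These inherited parameters could be supplied roughly as follows; the field names mirror the OpenAI-compatible API, but the exact names on InputParams and the import path are assumptions:

```python
import os

# Import path assumed; check your Pipecat version for the exact module.
from pipecat.services.deepseek import DeepSeekLLMService

llm = DeepSeekLLMService(
    api_key=os.getenv("DEEPSEEK_API_KEY"),
    params=DeepSeekLLMService.InputParams(  # field names assumed
        temperature=0.7,        # randomness, [0.0, 2.0]
        top_p=0.9,              # nucleus sampling, [0.0, 1.0]
        max_tokens=1024,        # must be >= 1
        frequency_penalty=0.3,  # [-2.0, 2.0]
        presence_penalty=0.1,   # [-2.0, 2.0]
    ),
)
```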
Input Frames
- Contains OpenAI-specific conversation context
- Contains conversation messages
- Contains image for vision model processing
- Updates model settings
Output Frames
- Contains generated text chunks
- Indicates start of function call
- Contains function call results
Usage Example
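A typical usage sketch places the service in a pipeline with a context aggregator. The import paths and the Pipeline layout in the comment are assumptions based on Pipecat’s usual structure:

```python
import os

# Import paths assumed; check your Pipecat version for the exact modules.
from pipecat.services.deepseek import DeepSeekLLMService
from pipecat.processors.aggregators.openai_llm_context import OpenAILLMContext

llm = DeepSeekLLMService(
    api_key=os.getenv("DEEPSEEK_API_KEY"),
    model="deepseek-chat",
)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
]
context = OpenAILLMContext(messages)
context_aggregator = llm.create_context_aggregator(context)

# The service then sits between the aggregator's user and assistant
# sides in a pipeline, e.g. (layout assumed):
# pipeline = Pipeline([
#     transport.input(),
#     context_aggregator.user(),
#     llm,
#     context_aggregator.assistant(),
#     transport.output(),
# ])
```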
Methods
See the LLM base class methods for additional functionality.
Function Calling
Supports OpenAI-compatible function calling.
See the Function Calling guide for:
- Detailed implementation instructions
- Provider-specific function definitions
- Handler registration examples
- Control over function call behavior
- Complete usage examples
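The points above can be sketched with a minimal handler. The simplified handler signature and the commented registration call are illustrative assumptions; see the Function Calling guide for the exact interface:

```python
import asyncio

# Sketch of a function-call handler. A real handler would query a
# weather API; this one returns canned data. The single-argument
# signature is a simplification for illustration.
async def fetch_weather(arguments: dict) -> dict:
    location = arguments.get("location", "unknown")
    return {"location": location, "conditions": "sunny", "temperature_f": 72}

# Registration against the service (signature assumed):
# llm.register_function("get_weather", fetch_weather)

print(asyncio.run(fetch_weather({"location": "San Francisco"})))
```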
Available Models
| Model Name | Description |
| --- | --- |
| deepseek-chat | DeepSeek’s chat completion model |

See DeepSeek’s docs for a complete list of supported models.
Frame Flow
Inherits the OpenAI LLM Service frame flow.
Metrics Support
The service collects the same metrics as OpenAILLMService:
- Token usage (prompt and completion)
- Processing duration
- Time to First Byte (TTFB)
- Function call metrics
Notes
- OpenAI-compatible interface
- Supports streaming responses
- Handles function calling
- Manages conversation context
- Includes token usage tracking
- Thread-safe processing
- Automatic error handling
- Inherits OpenAI service features