OpenPipe
LLM service implementation using OpenPipe for LLM request logging and fine-tuning
Overview
`OpenPipeLLMService` extends `BaseOpenAILLMService` to provide integration with OpenPipe, enabling request logging, model fine-tuning, and performance monitoring. It maintains compatibility with OpenAI's API while adding OpenPipe's logging and optimization capabilities.
Installation
To use `OpenPipeLLMService`, install the required dependencies:
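A typical installation, assuming Pipecat's `openpipe` optional dependency group:

```shell
pip install "pipecat-ai[openpipe]"
```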
You’ll need to set up the following environment variables:
- `OPENPIPE_API_KEY` - Your OpenPipe API key
- `OPENAI_API_KEY` - Your OpenAI API key
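For example, in a POSIX shell:

```shell
export OPENPIPE_API_KEY="your-openpipe-api-key"
export OPENAI_API_KEY="your-openai-api-key"
```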
Configuration
Constructor Parameters
- `model` - Model identifier
- `api_key` - OpenAI API key
- `base_url` - OpenAI API endpoint
- `openpipe_api_key` - OpenPipe API key
- `openpipe_base_url` - OpenPipe API endpoint
- `tags` - Custom tags for request logging
Input Parameters
Inherits all input parameters from `BaseOpenAILLMService`:

- `extra` - Additional parameters to pass to the model
- `frequency_penalty` - Reduces likelihood of repeating tokens based on their frequency. Range: [-2.0, 2.0]
- `max_completion_tokens` - Maximum number of tokens in the completion. Must be greater than or equal to 1
- `max_tokens` - Maximum number of tokens to generate. Must be greater than or equal to 1
- `presence_penalty` - Reduces likelihood of repeating any tokens that have appeared. Range: [-2.0, 2.0]
- `seed` - Random seed for deterministic generation. Must be greater than or equal to 0
- `temperature` - Controls randomness in the output. Range: [0.0, 2.0]
- `top_p` - Controls diversity via nucleus sampling. Range: [0.0, 1.0]
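The documented ranges can be enforced before a request is sent. The helper below is an illustrative stdlib sketch, not part of the library:

```python
def validate_params(params: dict) -> dict:
    """Check parameter values against the documented ranges (illustrative)."""
    ranges = {
        "frequency_penalty": (-2.0, 2.0),
        "presence_penalty": (-2.0, 2.0),
        "temperature": (0.0, 2.0),
        "top_p": (0.0, 1.0),
    }
    for name, (lo, hi) in ranges.items():
        if name in params and not lo <= params[name] <= hi:
            raise ValueError(f"{name} must be in [{lo}, {hi}]")
    # Token limits must be at least 1; seed must be non-negative.
    for name in ("max_tokens", "max_completion_tokens"):
        if name in params and params[name] < 1:
            raise ValueError(f"{name} must be >= 1")
    if "seed" in params and params["seed"] < 0:
        raise ValueError("seed must be >= 0")
    return params
```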
Usage Example
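A minimal sketch, assuming the import path `pipecat.services.openpipe` and the constructor parameters described above (verify both against your installed Pipecat version):

```python
import os

from pipecat.services.openpipe import OpenPipeLLMService

# Keys are read from the environment variables set up earlier.
llm = OpenPipeLLMService(
    model="gpt-4o",
    api_key=os.getenv("OPENAI_API_KEY"),
    openpipe_api_key=os.getenv("OPENPIPE_API_KEY"),
    tags={"environment": "production", "feature": "customer-support"},
)
```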
Request Logging
OpenPipe automatically logs requests with configurable tags:
Frame Flow
Inherits the `BaseOpenAILLMService` frame flow with added logging.
Metrics Support
The service collects standard metrics plus OpenPipe-specific data:
- Token usage (prompt and completion)
- Processing duration
- Time to First Byte (TTFB)
- Request logs and metadata
- Performance metrics
- Cost tracking
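The timing metrics above (processing duration and TTFB) can be captured with a monotonic clock; this is an illustrative stdlib sketch, not the service's internal implementation:

```python
import time


class RequestMetrics:
    """Collect processing duration and time-to-first-byte for one request."""

    def __init__(self):
        self.start = time.monotonic()
        self.first_byte = None
        self.end = None

    def mark_first_byte(self):
        # Only the first streamed chunk sets TTFB.
        if self.first_byte is None:
            self.first_byte = time.monotonic()

    def finish(self):
        self.end = time.monotonic()

    @property
    def ttfb(self):
        return None if self.first_byte is None else self.first_byte - self.start

    @property
    def duration(self):
        return None if self.end is None else self.end - self.start
```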
Common Use Cases
- Performance Monitoring
  - Request latency tracking
  - Token usage monitoring
  - Cost analysis
- Model Optimization
  - Data collection for fine-tuning
  - Response quality monitoring
  - Usage pattern analysis
- Development and Testing
  - Request logging for debugging
  - A/B testing
  - Quality assurance
Notes
- Maintains OpenAI API compatibility
- Automatic request logging
- Support for custom tags
- Fine-tuning data collection
- Performance monitoring
- Cost tracking capabilities
- Thread-safe processing