Overview
OpenPipeLLMService extends BaseOpenAILLMService to provide integration with OpenPipe, enabling request logging, model fine-tuning, and performance monitoring. It maintains compatibility with the OpenAI API while adding OpenPipe's logging and optimization capabilities.
Related resources:
- OpenPipe LLM API Reference: Pipecat's API methods for OpenPipe integration
- Example Implementation: Browse examples using OpenPipe logging
- OpenPipe Documentation: Official OpenPipe API documentation and features
- OpenPipe Platform: Access logging and fine-tuning features
Installation
To use OpenPipe services, install the required dependencies:
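A typical install command is shown below; it assumes the package publishes an openpipe extra, so verify the exact extra name against your pipecat version:

```bash
pip install "pipecat-ai[openpipe]"
```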
Prerequisites
OpenPipe Account Setup
Before using OpenPipe LLM services, you need:
- OpenPipe Account: Sign up at OpenPipe
- API Keys: Generate both OpenPipe and OpenAI API keys
- Project Setup: Configure logging and fine-tuning projects
Required Environment Variables
- OPENPIPE_API_KEY: Your OpenPipe API key for logging and fine-tuning
- OPENAI_API_KEY: Your OpenAI API key for underlying model access
Configuration
- model: The model name to use for completions.
- api_key: OpenAI API key for authentication. If not provided, reads from the environment.
- base_url: Custom OpenAI API endpoint URL. Uses the default OpenAI URL if not provided.
- openpipe_api_key: OpenPipe API key for request logging and fine-tuning features. If not provided, reads from the environment.
- openpipe_base_url: OpenPipe API endpoint URL.
- tags: Dictionary of tags to apply to all requests for tracking and filtering in the OpenPipe dashboard.
InputParams
This service uses the same input parameters as OpenAILLMService. See OpenAI LLM for details.
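As an illustration, shared parameters can be passed the same way as with OpenAILLMService; this is only a sketch, and the import paths and field names follow recent pipecat layouts, so treat them as assumptions to verify:

```python
# Sketch only: import paths assume a recent pipecat package layout.
from pipecat.services.openai.llm import OpenAILLMService
from pipecat.services.openpipe.llm import OpenPipeLLMService

llm = OpenPipeLLMService(
    model="gpt-4o",
    # InputParams is inherited from the OpenAI service family.
    params=OpenAILLMService.InputParams(
        temperature=0.7,
        max_completion_tokens=1000,
    ),
)
```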
Usage
Basic Setup
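A minimal construction sketch; the import path and constructor argument names mirror the configuration above and should be checked against your pipecat version:

```python
import os

# Sketch only: import path assumes a recent pipecat package layout.
from pipecat.services.openpipe.llm import OpenPipeLLMService

llm = OpenPipeLLMService(
    model="gpt-4o",
    api_key=os.getenv("OPENAI_API_KEY"),             # OpenAI key for model access
    openpipe_api_key=os.getenv("OPENPIPE_API_KEY"),  # OpenPipe key for request logging
)
```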
With Tags for Tracking
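Tags can be attached at construction time so every request carries them; the example below is a sketch under the same import-path assumption, and the tag keys and values are hypothetical:

```python
import os

from pipecat.services.openpipe.llm import OpenPipeLLMService

llm = OpenPipeLLMService(
    model="gpt-4o",
    api_key=os.getenv("OPENAI_API_KEY"),
    openpipe_api_key=os.getenv("OPENPIPE_API_KEY"),
    tags={
        # Hypothetical tag values; any string key/value pairs can be used
        # for filtering in the OpenPipe dashboard.
        "environment": "production",
        "feature": "customer-support",
    },
)
```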
Notes
- All requests are automatically logged to OpenPipe for monitoring and fine-tuning purposes.
- Tags are included with every request and can be used to filter and organize requests in the OpenPipe dashboard.
- OpenPipe uses its own client (openpipe.AsyncOpenAI) instead of the standard OpenAI client to enable transparent request logging.