Overview
GroqLLMService provides access to Groq’s language models through an OpenAI-compatible interface. It inherits from OpenAILLMService and supports streaming responses, function calling, and context management with ultra-fast inference speeds.
- Groq LLM API Reference: Pipecat's API methods for Groq integration
- Example Implementation: Complete example with function calling
- Groq Documentation: Official Groq API documentation and features
- Groq Console: Access models and manage API keys
Installation
To use Groq services, install the required dependency:

Prerequisites
Groq Account Setup
Before using Groq LLM services, you need:

- Groq Account: Sign up at Groq Console
- API Key: Generate an API key from your console dashboard
- Model Selection: Choose from available models with ultra-fast inference
Required Environment Variables
GROQ_API_KEY: Your Groq API key for authentication
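Putting the prerequisites together, environment setup might look like the following sketch. The `groq` extras name follows Pipecat's usual optional-dependency pattern and is an assumption; check the installation docs for your release.

```shell
# Install Pipecat with the Groq extra (assumed extras name).
pip install "pipecat-ai[groq]"

# Export your Groq API key so the service can authenticate.
export GROQ_API_KEY=your_api_key_here
```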
Configuration
- Groq API key for authentication.
- Base URL for Groq API endpoint.
- Model identifier to use.
InputParams
This service uses the same input parameters as OpenAILLMService. See OpenAI LLM for details.
Usage
Basic Setup
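A minimal setup sketch. The import path and the model id are assumptions here; both vary across Pipecat releases, so verify them against your installed version and the Groq Console.

```python
import os

# Import path is an assumption; it may differ in your Pipecat version.
from pipecat.services.groq import GroqLLMService

# The service targets Groq's OpenAI-compatible endpoint by default,
# so only the API key and a model id are required.
llm = GroqLLMService(
    api_key=os.getenv("GROQ_API_KEY"),
    model="llama-3.3-70b-versatile",  # example model id; check the Groq Console
)
```

The resulting `llm` service can then be placed in a Pipecat pipeline like any other LLM service.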
With Custom Parameters
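Since GroqLLMService inherits from OpenAILLMService, generation parameters can be passed through the inherited InputParams object. The import paths and field names below are assumptions; confirm them against your Pipecat version.

```python
import os

# Import paths are assumptions; consult your Pipecat version's module layout.
from pipecat.services.groq import GroqLLMService
from pipecat.services.openai import OpenAILLMService

llm = GroqLLMService(
    api_key=os.getenv("GROQ_API_KEY"),
    model="llama-3.3-70b-versatile",  # example model id
    params=OpenAILLMService.InputParams(
        temperature=0.7,  # sampling temperature
        max_tokens=1000,  # cap on generated tokens per response
    ),
)
```

Because the parameter set is inherited, any option documented for OpenAILLMService should apply here unchanged.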
Notes
- Groq provides ultra-fast inference using custom LPU (Language Processing Unit) hardware.
- Groq fully supports the OpenAI-compatible parameter set inherited from OpenAILLMService.