Overview

GroqLLMService provides access to Groq’s language models through an OpenAI-compatible interface. It inherits from OpenAILLMService and supports streaming responses, function calling, and context management with ultra-fast inference speeds.

Installation

To use Groq services, install the required dependency:
pip install "pipecat-ai[groq]"

Prerequisites

Groq Account Setup

Before using Groq LLM services, you need:
  1. Groq Account: Sign up at Groq Console
  2. API Key: Generate an API key from your console dashboard
  3. Model Selection: Choose from available models with ultra-fast inference

Required Environment Variables

  • GROQ_API_KEY: Your Groq API key for authentication
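For example, in a POSIX shell you can export the key before starting your app (the value below is a placeholder, not a real key):

```shell
# Make the Groq API key available to the process environment.
export GROQ_API_KEY="your-groq-api-key"
```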

Configuration

api_key
str
required
Groq API key for authentication.
base_url
str
default:"https://api.groq.com/openai/v1"
Base URL for Groq API endpoint.
model
str
default:"None"
deprecated
Model identifier to use. Deprecated in v0.0.105. Use settings=GroqLLMService.Settings(model=...) instead.
settings
GroqLLMService.Settings
default:"None"
Runtime-configurable settings. See Settings below.

Settings

Runtime-configurable settings, passed to the constructor via the settings argument as GroqLLMService.Settings(...). They can be updated mid-conversation with LLMUpdateSettingsFrame; see Service Settings for details. This service uses the same settings as OpenAILLMService, so see OpenAI LLM Settings for the full parameter reference.
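As an illustrative sketch of a mid-conversation update, you might build a plain settings mapping and wrap it in LLMUpdateSettingsFrame. The frame and task usage are commented out here so the snippet stands alone; the exact update mechanics are covered in the Service Settings guide:

```python
# Hypothetical mid-conversation settings update. The mapping below uses
# OpenAI-compatible setting names; values are illustrative placeholders.
new_settings = {
    "temperature": 0.3,
    "max_completion_tokens": 512,
}

# In a running pipeline, you would queue the frame on your PipelineTask:
# from pipecat.frames.frames import LLMUpdateSettingsFrame
# await task.queue_frames([LLMUpdateSettingsFrame(settings=new_settings)])
```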

Usage

Basic Setup

import os
from pipecat.services.groq import GroqLLMService

llm = GroqLLMService(
    api_key=os.getenv("GROQ_API_KEY"),
    settings=GroqLLMService.Settings(model="llama-3.3-70b-versatile"),
)

With Custom Settings

import os
from pipecat.services.groq import GroqLLMService

llm = GroqLLMService(
    api_key=os.getenv("GROQ_API_KEY"),
    settings=GroqLLMService.Settings(
        model="llama-3.3-70b-versatile",
        temperature=0.7,
        top_p=0.9,
        max_completion_tokens=1024,
    ),
)

Notes

  • Groq provides ultra-fast inference using custom LPU (Language Processing Unit) hardware.
  • Groq fully supports the OpenAI-compatible parameter set inherited from OpenAILLMService.
  • The InputParams / params= pattern is deprecated as of v0.0.105. Use Settings / settings= instead. See the Service Settings guide for migration details.
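To make the OpenAI-compatible claim concrete, the underlying request body mirrors the OpenAI chat completions schema. A stdlib-only sketch of such a payload, as it would be POSTed to {base_url}/chat/completions (all field values are illustrative placeholders, not recommendations):

```python
import json

# Sketch of an OpenAI-compatible chat completions request body.
payload = {
    "model": "llama-3.3-70b-versatile",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
    "temperature": 0.7,
    "top_p": 0.9,
    "max_completion_tokens": 1024,
    "stream": True,  # GroqLLMService streams responses
}

body = json.dumps(payload)
```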