Overview
GrokLLMService provides access to Grok’s language models through an OpenAI-compatible interface. It inherits from OpenAILLMService and supports streaming responses, function calling, and context management with Grok’s unique reasoning capabilities.
Related resources:
- Grok LLM API Reference: Pipecat's API methods for Grok integration
- Example Implementation: complete example with function calling
- Grok Documentation: official Grok API documentation and features
- X.AI Platform: access Grok models and manage API keys
Installation
To use Grok services, install the required dependencies.
Prerequisites
Grok Account Setup
Before using Grok LLM services, you need:
- X.AI Account: Sign up at X.AI Console
- API Key: Generate an API key from your console dashboard
- Model Selection: Choose from available Grok models
Required Environment Variables
XAI_API_KEY: Your X.AI API key for authentication
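For local development, the variable is typically exported in the shell before starting the app (the placeholder value below is illustrative):

```shell
# Set the X.AI API key for the current shell session.
export XAI_API_KEY="your-xai-api-key"
```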
Configuration
- api_key: X.AI API key for authentication.
- base_url: Base URL for the Grok API endpoint.
- model: Model identifier to use.
InputParams
This service uses the same input parameters as OpenAILLMService. See OpenAI LLM for details.
Usage
Basic Setup
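The setup snippet appears to be missing here. A minimal sketch of the constructor arguments follows; the base URL is X.AI's documented OpenAI-compatible endpoint, while the model identifier and the commented import path are examples that should be checked against your Pipecat version:

```python
import os

# Constructor arguments for GrokLLMService (sketch). The model id below is
# an example, not a verified default.
grok_kwargs = {
    "api_key": os.getenv("XAI_API_KEY", ""),
    "base_url": "https://api.x.ai/v1",  # Grok's OpenAI-compatible endpoint
    "model": "grok-3",                  # example model identifier
}

# from pipecat.services.grok.llm import GrokLLMService  # import path may vary
# llm = GrokLLMService(**grok_kwargs)
```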
With Custom Parameters
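Since GrokLLMService reuses OpenAILLMService's InputParams, custom sampling settings would be passed the same way. A hedged sketch, with illustrative field names and values drawn from the OpenAI service (verify against your Pipecat version):

```python
import os

# Illustrative sampling overrides; field names mirror OpenAILLMService.InputParams.
params = {"temperature": 0.7, "max_tokens": 1024}

# from pipecat.services.grok.llm import GrokLLMService  # import path may vary
# llm = GrokLLMService(
#     api_key=os.getenv("XAI_API_KEY"),
#     model="grok-3",  # example model identifier
#     params=GrokLLMService.InputParams(**params),
# )
```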
Notes
- Grok uses incremental token reporting. The service accumulates token usage metrics during processing and reports the final totals at the end of each request.
- Grok supports prompt caching and reasoning tokens, which are tracked in usage metrics when available.
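The incremental-reporting behavior in the first note can be sketched as follows. This is a simplified model, not Pipecat's internal implementation: each streamed chunk carries a partial token count, running totals are accumulated, and a single metric is reported when the stream ends.

```python
def accumulate_usage(chunks):
    """Sum per-chunk token counts and return final totals (illustrative)."""
    totals = {"prompt_tokens": 0, "completion_tokens": 0}
    for usage in chunks:
        for key in totals:
            totals[key] += usage.get(key, 0)
    totals["total_tokens"] = totals["prompt_tokens"] + totals["completion_tokens"]
    return totals

# Example stream: prompt tokens reported once, completion tokens incrementally.
stream = [
    {"prompt_tokens": 12, "completion_tokens": 0},
    {"completion_tokens": 5},
    {"completion_tokens": 7},
]
```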