LLM Log Observer
Logging LLM activity in Pipecat
The LLMLogObserver provides detailed logging of Large Language Model (LLM) activity within your Pipecat pipeline. It tracks the entire lifecycle of LLM interactions, from initial prompts to final responses.
Frame Types Monitored
The observer tracks the following frame types, but only frames sent to or emitted by the LLM service (see the sketch after this list):
- LLMFullResponseStartFrame: When the LLM begins generating a response
- LLMFullResponseEndFrame: When the LLM completes its response
- LLMTextFrame: Individual text chunks generated by the LLM
- FunctionCallInProgressFrame: Function/tool calls made by the LLM
- LLMMessagesFrame: Input messages sent to the LLM
- OpenAILLMContextFrame: Context information for OpenAI LLM calls
- FunctionCallResultFrame: Results returned from function calls
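To illustrate how these frame types appear to an observer, here is a hypothetical sketch of a minimal observer that filters for them. The BaseObserver callback signature, the import paths, and the MinimalLLMFrameObserver class are assumptions based on common Pipecat usage, not part of LLMLogObserver's actual implementation, and may differ between versions.

```python
from pipecat.frames.frames import (
    FunctionCallInProgressFrame,
    FunctionCallResultFrame,
    LLMFullResponseEndFrame,
    LLMFullResponseStartFrame,
    LLMMessagesFrame,
    LLMTextFrame,
)
from pipecat.observers.base_observer import BaseObserver

# OpenAILLMContextFrame lives in a separate module whose exact import path
# depends on your Pipecat version, so it is omitted from this sketch.

LLM_FRAME_TYPES = (
    LLMFullResponseStartFrame,
    LLMFullResponseEndFrame,
    LLMTextFrame,
    FunctionCallInProgressFrame,
    FunctionCallResultFrame,
    LLMMessagesFrame,
)


class MinimalLLMFrameObserver(BaseObserver):
    # Assumed callback signature: source and destination processors, the frame
    # itself, its direction, and a timestamp. This signature is an assumption
    # and may differ between Pipecat versions.
    async def on_push_frame(self, src, dst, frame, direction, timestamp):
        # Only act on the LLM-related frame types listed above.
        if isinstance(frame, LLM_FRAME_TYPES):
            print(f"{timestamp}: {type(frame).__name__} ({src} -> {dst})")
```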
Usage
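Below is a minimal sketch of attaching the observer when creating a PipelineTask. The import path and the observers parameter are assumptions based on common Pipecat usage and may vary between versions; in some versions the observer list is passed through PipelineParams instead.

```python
from pipecat.observers.loggers.llm_log_observer import LLMLogObserver
from pipecat.pipeline.pipeline import Pipeline
from pipecat.pipeline.task import PipelineTask

# Build the pipeline as usual; transport, LLM service, TTS, etc. are omitted
# here for brevity.
pipeline = Pipeline(
    [
        # ... your processors, including an LLM service ...
    ]
)

# Pass the observer when creating the task so LLM-related frames flowing
# through the pipeline are logged.
task = PipelineTask(
    pipeline,
    observers=[LLMLogObserver()],
)
```

Once attached, the observer logs the monitored frames as they pass through the pipeline; no further configuration is needed.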
Log Output Format
The observer uses emojis and consistent formatting to make logs easy to scan:
- 🧠 [Source] → LLM START/END RESPONSE
- 🧠 [Source] → LLM GENERATING: [text]
- 🧠 [Source] → LLM FUNCTION CALL: [details]
- 🧠 → [Destination] LLM MESSAGES FRAME: [messages]
- 🧠 → [Destination] LLM CONTEXT FRAME: [context]
All log entries include timestamps for precise timing analysis.