LLMOps
3 Min Read

Setting Up Logfire Observability for LLM

Subhajeet Dey
January 22, 2025

Today, I discovered an exciting observability tool from the Pydantic team called Logfire. It’s an excellent initiative that not only addresses a critical need but also gives teams a smart way to add observability to LLM frameworks.

The Logfire SDK is remarkably versatile, integrating seamlessly with key technologies to support monitoring, debugging, and performance analysis. It supports OpenAI, enabling efficient logging and monitoring of LLM API calls for improved transparency and optimization. Its FastAPI integration provides precise tracing and debugging, offering deeper insight into an application's behavior. The SDK also extends to databases, with performance metrics and interaction analysis to ensure efficient data management. Whether applied to AI, web frameworks, or databases, it equips developers with powerful tools for observability and optimization.

Why Logfire Stands Out

Logfire stands out in three areas: LLM API calls, FastAPI applications, and database observability. The SDK is easy to implement, making it a breeze to integrate with both new and existing projects; with a few lines of code you can trace any LLM call.

Getting Started with Logfire

Let us now see how to use Logfire. First, create an account and set up a project: head over to the Logfire website, sign up, and create a new project within the dashboard.

Next, generate a write token from your project settings and store it securely by setting it as an environment variable in your project:

export LOGFIRE_WRITE_TOKEN=your_write_token_here

Send data to Logfire:

import logfire

logfire.configure()
logfire.info("Hello, {name}!", name="world")

Integrate with FastAPI:

import logfire
from fastapi import FastAPI
from fastapi.responses import StreamingResponse

app = FastAPI()
logfire.instrument_fastapi(app)

...

@app.get("/llm/ask/stream/{chat_id}")
async def llm_stream(chat_id: str) -> StreamingResponse:
    ...

Integrate with Python’s logging library:

import logging
import logfire

logfire.configure()
LOGGER = logging.getLogger(__name__)
LOGGER.setLevel(logging.INFO)
LOGGER.addHandler(logfire.LogfireLoggingHandler())

Logfire also has a 30-day retention period, but data can be kept for longer if needed on their paid plans.

To view the logs, you can open the project's live dashboard or run the following query on the Explore page:

SELECT * FROM records

You can use SQL syntax to further adjust the query and get what you’re looking for in the logs.
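For instance, a query along these lines narrows the results to the most recent records matching a keyword (the column names follow Logfire's `records` schema as I understand it; exact fields may vary by version):

```sql
SELECT start_timestamp, message
FROM records
WHERE message LIKE '%error%'
ORDER BY start_timestamp DESC
LIMIT 50;
```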

Logfire helps with prompt engineering and with optimising agentic workflows by giving you complete visibility into every call.