Instrument your AI application with OpenTelemetry and OpenLIT
This section describes the quickest way to get started with AI Observability: instrumenting your generative AI application and sending telemetry data to Grafana Cloud.
This guide walks you through the process to:
- Install the AI Observability integration
- Configure OpenTelemetry and generate your API token
- Install the OpenLIT Telemetry SDK
- Set up the OTEL endpoint and headers
- Configure your telemetry data destination:
  - OTLP gateway: for a quick local development and testing setup
  - OpenTelemetry Collector: for a robust and scalable production setup
- Observe your AI Stack using a pre-built Grafana dashboard
Install the AI Observability integration
To install the AI Observability integration:
- In your Grafana Cloud stack, click Connections in the left-side menu.
- Search for AI Observability.
- Click the AI Observability card and follow the instructions to instrument your application. Alternatively, follow the instructions in Instrument your AI application.
- Click Install dashboards to install the pre-built GenAI Observability dashboard.
Configure OpenTelemetry and generate an API token
To instrument your AI application using Grafana Cloud, follow these steps:
- Sign in to Grafana Cloud and go to the Grafana Cloud Portal.
If you have access to multiple organizations, select one from the top-left dropdown.
- Click the stack you want to configure from the sidebar or the main stack list.
- Under Manage your stack, click the Configure button in the OpenTelemetry section.
- Scroll down to the Password / API Token section and click Generate now to launch the Create API token window.
- Enter a name for the token and click Create token.
- Click Close. You don’t need to copy the token.
- Scroll down and copy the `OTEL_EXPORTER_OTLP_ENDPOINT` and `OTEL_EXPORTER_OTLP_HEADERS` values from the Environment variables section. Save these for later.
Install the OpenLIT Telemetry SDK
AI Observability uses the OpenLIT SDK to provide auto-instrumentation for many tools in generative AI stacks, such as LLMs, vector databases, and frameworks like LangChain.
To install the Python SDK, run the following command:
pip install openlit
After you’ve installed the Python SDK, update your application code:
```python
import openlit
openlit.init()
# Rest of your application code
```
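By default, `openlit.init()` reads the standard `OTEL_EXPORTER_OTLP_*` environment variables described in the next section. If you prefer to configure the exporter in code, a minimal sketch, assuming `openlit.init` accepts `otlp_endpoint`, `otlp_headers`, `application_name`, and `environment` parameters (check the OpenLIT documentation for your SDK version):

```python
import openlit

# Hypothetical placeholder values; use the endpoint and headers you
# copied from the Grafana Cloud Portal.
openlit.init(
    otlp_endpoint="https://otlp-gateway-<ZONE>.grafana.net/otlp",
    otlp_headers="Authorization=Basic%20<BASE64 ENCODED INSTANCE ID AND API TOKEN>",
    application_name="my-genai-app",  # hypothetical name; shown as the service name
    environment="production",
)
```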
Set up the OTEL endpoint and headers
Run the following commands in your shell to configure the endpoint and headers:
```shell
export OTEL_EXPORTER_OTLP_ENDPOINT="<YOUR_GRAFANA_OTEL_GATEWAY_URL>"
export OTEL_EXPORTER_OTLP_HEADERS="<YOUR_GRAFANA_OTEL_GATEWAY_AUTH>"
```
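The exported variables must be visible to the process that runs your instrumented application. A small sanity check, a sketch using only the Python standard library:

```python
import os

# Names of the OTLP settings the OpenTelemetry exporter reads.
required = ("OTEL_EXPORTER_OTLP_ENDPOINT", "OTEL_EXPORTER_OTLP_HEADERS")

# Collect any variables that are unset or empty in this process.
missing = [name for name in required if not os.environ.get(name)]

if missing:
    print("Missing OTLP settings:", ", ".join(missing))
else:
    print("Exporting telemetry to:", os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"])
```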
For the commands above, replace the placeholders with the values you copied during the API token creation process:

- Replace `<YOUR_GRAFANA_OTEL_GATEWAY_URL>` with the `OTEL_EXPORTER_OTLP_ENDPOINT` value you copied earlier. For example: `https://otlp-gateway-<ZONE>.grafana.net/otlp`
- Replace `<YOUR_GRAFANA_OTEL_GATEWAY_AUTH>` with the `OTEL_EXPORTER_OTLP_HEADERS` value you copied earlier. For example: `Authorization=Basic%20<BASE64 ENCODED INSTANCE ID AND API TOKEN>`
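The Basic credential in `OTEL_EXPORTER_OTLP_HEADERS` is the base64 encoding of `<instance ID>:<API token>`, with `%20` standing in for the space after `Basic`. The Grafana Cloud Portal generates the real value for you; if you ever need to reconstruct it by hand, a minimal sketch with hypothetical placeholder values:

```python
import base64

instance_id = "123456"           # hypothetical Grafana Cloud instance ID
api_token = "glc_example_token"  # hypothetical API token

# Basic auth credential: base64("<instance id>:<api token>")
credentials = base64.b64encode(f"{instance_id}:{api_token}".encode()).decode()

# %20 is the URL-encoded space expected in OTEL_EXPORTER_OTLP_HEADERS.
header_value = f"Authorization=Basic%20{credentials}"
print(header_value)
```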
Monitor using the pre-built GenAI Observability dashboard
When you run your instrumented AI application, the OpenLIT SDK automatically starts sending OpenTelemetry traces and metrics about your LLM and vector database usage to Grafana Cloud.
Open the GenAI Observability dashboard you previously installed from the integration to visualize the instrumentation data.