
Instrument your AI application with OpenTelemetry and OpenLIT

This section describes the quickest way to get started with AI Observability by instrumenting your generative AI application and sending telemetry data to Grafana Cloud.

This guide walks you through the process to:

  • Install the AI Observability integration
  • Configure OpenTelemetry and generate your API token
  • Install the OpenLIT Telemetry SDK
  • Set up the OTEL endpoint and headers to configure your telemetry data destination
  • Monitor your AI stack using the pre-built GenAI Observability dashboard

Install the AI Observability integration

To install the AI Observability integration:

  1. In your Grafana Cloud stack, click Connections in the left-side menu.

  2. Search for the name AI Observability.

  3. Click the AI Observability card and follow the instructions to instrument your application.
    Alternatively, follow the instructions in Instrument your AI application.

  4. Click Install dashboards to install the pre-built GenAI Observability dashboard.

Configure OpenTelemetry and generate an API token

To instrument your AI application using Grafana Cloud, follow these steps:

  1. Sign in to Grafana Cloud and go to the Grafana Cloud Portal.
    If you have access to multiple organizations, select one from the top-left dropdown.
  2. Click the stack you want to configure from the sidebar or the main stack list.
  3. Under Manage your stack, click the Configure button in the OpenTelemetry section.
  4. Scroll down to the Password / API Token section and click Generate now to launch the Create API token window.
  5. Enter a name for the token and click Create token.
  6. Click Close. You don’t need to copy the token.
  7. Scroll down and copy the OTEL_EXPORTER_OTLP_ENDPOINT and OTEL_EXPORTER_OTLP_HEADERS values from the Environment variables section.
    Save these for later.

Install the OpenLIT Telemetry SDK

AI Observability uses the OpenLIT SDK to provide auto-instrumentation for many tools in generative AI stacks, such as LLMs, vector databases, and frameworks like LangChain.

To install the Python SDK, run the following command:

```shell
pip install openlit
```

After you’ve installed the Python SDK, update your application code:

```python
import openlit

openlit.init()

# Rest of your application code
```
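By default, `openlit.init()` reads the standard OTLP environment variables described in the next section. As a rough sketch, you can also pass the exporter settings and an application name directly; the parameter names below are assumptions based on the OpenLIT SDK and may differ between versions:

```python
import openlit

# Minimal sketch: parameter names are assumptions and may vary by OpenLIT version.
openlit.init(
    otlp_endpoint="https://otlp-gateway-<ZONE>.grafana.net/otlp",          # your OTLP gateway URL
    otlp_headers="Authorization=Basic%20<BASE64_ENCODED_CREDENTIALS>",     # your OTLP auth header
    application_name="my-genai-app",   # hypothetical service name for filtering in Grafana
    environment="production",          # hypothetical deployment environment label
)
```

If you prefer the environment-variable approach shown in the next section, call `openlit.init()` with no arguments as in the first snippet.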

Set up the OTEL endpoint and headers

Run the following commands in your shell to configure the endpoint and headers:

```shell
export OTEL_EXPORTER_OTLP_ENDPOINT="<YOUR_GRAFANA_OTEL_GATEWAY_URL>"
export OTEL_EXPORTER_OTLP_HEADERS="<YOUR_GRAFANA_OTEL_GATEWAY_AUTH>"
```

For the commands above, replace the placeholder values with the ones you copied when you created the API token:

  • Replace <YOUR_GRAFANA_OTEL_GATEWAY_URL> with the OTEL_EXPORTER_OTLP_ENDPOINT value you copied earlier.
    For example: https://otlp-gateway-<ZONE>.grafana.net/otlp
  • Replace <YOUR_GRAFANA_OTEL_GATEWAY_AUTH> with the OTEL_EXPORTER_OTLP_HEADERS value you copied earlier.
    For example: Authorization=Basic%20<BASE64 ENCODED INSTANCE ID AND API TOKEN> (the sketch after this list shows how this value is typically constructed)
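The OTEL_EXPORTER_OTLP_HEADERS value is a Basic auth credential derived from your Grafana Cloud instance ID and the API token, with the space after "Basic" URL-encoded as %20. You normally copy it as-is from the portal, but the following sketch with placeholder values shows how such a value is put together:

```python
import base64

# Placeholder values; use the instance ID and API token from your own Grafana Cloud stack.
instance_id = "123456"
api_token = "glc_example_token"

# Grafana Cloud Basic auth encodes "<instance ID>:<token>" in base64.
credentials = base64.b64encode(f"{instance_id}:{api_token}".encode()).decode()

# The space after "Basic" is URL-encoded as %20 in the environment variable.
print(f"OTEL_EXPORTER_OTLP_HEADERS=Authorization=Basic%20{credentials}")
```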

Monitor using the pre-built GenAI Observability dashboard

When you run your instrumented AI application, the OpenLIT SDK automatically starts sending OpenTelemetry traces and metrics about your LLM and vector database usage to Grafana Cloud.

Open the GenAI Observability dashboard you previously installed from the integration to visualize the instrumentation data.
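
As a quick end-to-end check, you can run a small instrumented application that makes a single LLM call. The sketch below assumes the openai Python package and an OPENAI_API_KEY environment variable, with the OTLP environment variables from the previous section already exported; OpenLIT auto-instruments the call, so the resulting trace and token metrics should appear in the dashboard shortly after the request completes.

```python
import openlit
from openai import OpenAI

# Initialize OpenLIT before creating the client so the call is auto-instrumented.
openlit.init()

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# A single chat completion; OpenLIT exports the trace and token usage via OTLP.
response = client.chat.completions.create(
    model="gpt-4o-mini",  # example model; use any model available to your account
    messages=[{"role": "user", "content": "Say hello to Grafana Cloud."}],
)
print(response.choices[0].message.content)
```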