LLM plugin
Note
The LLM app plugin is currently in public preview. Grafana Labs offers limited support, and breaking changes might occur before the feature is made generally available.
Grafana Cloud offers a range of optional features that leverage Large Language Model (LLM) services. These features are not enabled by default; you can enable them in the Grafana LLM app plugin by approving limited data sharing with the OpenAI API.
The Grafana LLM app centralizes access to LLM services across Grafana, securing and simplifying your LLM interactions.
The Grafana LLM app plugin serves two key functions:
- Acts as a proxy that handles authenticated requests to LLMs, so other Grafana components never need to manage API keys themselves.
- Enables real-time streaming interactions on the Grafana front end by offering live streams of responses from the LLM provider.
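To see why proxying removes the need for other components to hold API keys, consider a minimal sketch of the pattern. This is illustrative only, not the plugin's actual implementation: the key name, upstream URL, and function are hypothetical, but the shape matches any OpenAI-compatible proxy, where the credential is attached server-side and the caller's payload never contains it.

```python
import json
import urllib.request

# Hypothetical placeholders: in the real plugin these come from securely
# stored settings, never from the frontend.
SERVER_SIDE_API_KEY = "sk-example"
UPSTREAM_BASE_URL = "https://api.openai.com"

def build_upstream_request(path: str, payload: dict) -> urllib.request.Request:
    """Turn an unauthenticated client call into an authenticated upstream request."""
    req = urllib.request.Request(
        UPSTREAM_BASE_URL + path,
        data=json.dumps(payload).encode("utf-8"),
        method="POST",
    )
    # The proxy attaches the credential; the client payload contains no key.
    req.add_header("Authorization", f"Bearer {SERVER_SIDE_API_KEY}")
    req.add_header("Content-Type", "application/json")
    return req

# A client asks for a chat completion without ever seeing the key.
req = build_upstream_request(
    "/v1/chat/completions",
    {"model": "gpt-4o-mini", "messages": [{"role": "user", "content": "hi"}]},
)
```

Because only the proxy process can read the key, rotating or revoking it affects one place, and frontend code stays credential-free.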
If you prefer, you can instead configure your own API authentication for supported LLM providers, such as OpenAI, Azure OpenAI, or any OpenAI-compatible API (for example, vLLM, Ollama, LM Studio, or LiteLLM). With this option, the LLM app securely stores your API keys.
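Like other Grafana app plugins, this configuration can also be supplied through Grafana's plugin provisioning. The following is a hedged sketch of what such a provisioning file could look like; the exact `jsonData`/`secureJsonData` field names are assumptions here, so check the plugin's own documentation for the authoritative schema:

```yaml
apiVersion: 1
apps:
  - type: grafana-llm-app
    jsonData:
      openAI:
        # Assumed field name: base URL of an OpenAI-compatible API.
        url: https://api.openai.com
    secureJsonData:
      # Assumed field name: the key is stored encrypted by Grafana.
      openAIKey: $OPENAI_API_KEY
```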
What can it do?
Unlock the potential of the Grafana LLM plugin with features such as:
- AI-powered flamegraph interpretation
- Incident Auto-summary
- Dashboard panel title and description generation
- Explanations of error log lines in Sift