Joseph · AI & Machine Learning · 5 min read

Use Grafana MCP with Gemini and n8n

The Model Context Protocol (MCP) is extremely useful: the AI assistant decides when and how to use the connected tools, so all you need to do is configure them. Since integrating MCP servers for log management into several of my projects, I have saved a significant amount of time.

In this article, I'm going to integrate Grafana with the Gemini CLI and n8n. I will chat with the Gemini CLI and n8n and have them invoke the Grafana MCP server.


Prerequisites

I’ll be using the Gemini CLI, n8n, Docker, and Grafana, so you may need to install them first.

There are many ways to set up the Grafana MCP server. I’m going to connect to it using Docker in the Gemini CLI and using a binary in n8n. I ran Grafana in Docker on my localhost (http://host.docker.internal:9120), but you can point to your own Grafana service.
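For reference, a local Grafana container answering on port 9120 can be started roughly like this (the container name and the host-port mapping are my assumptions; adjust them to your environment):

```shell
# Map host port 9120 to Grafana's default container port 3000.
docker run -d --name grafana -p 9120:3000 grafana/grafana
```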

Grafana Service Account

A service account token allows an MCP client to connect and authenticate with Grafana. Therefore, we need to create a service account and generate a token for it. Let’s go through the steps:

  1. Click the menu on the left.
  2. Click Administration.
  3. Click Service accounts under Users and access.
  4. Create a service account by clicking Add service account.
  5. Generate a token by clicking Add service account token.
  6. Copy your token!


Once you have the token and Grafana URL, you can set them as environment variables for the Grafana MCP server.
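If you want to poke at the MCP server outside the clients below, the two variables can simply be exported in a shell first (values from the setup above; the token is a placeholder):

```shell
# Values from the setup above; replace the token with your own.
export GRAFANA_URL="http://host.docker.internal:9120"
export GRAFANA_API_KEY="YOUR_GRAFANA_SERVICE_ACCOUNT_TOKEN"
```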

Gemini CLI using Grafana MCP

Don’t worry; it’s quite simple if you remember my article about the Figma MCP. I’ll follow that guide and use Docker to connect Gemini and Grafana. Add the following to your Gemini CLI settings file (~/.gemini/settings.json):

{
  "mcpServers": {
    "grafana": {
      "command": "docker",
      "args": [
        "run",
        "--rm",
        "-i",
        "-e",
        "GRAFANA_URL",
        "-e",
        "GRAFANA_API_KEY",
        "mcp/grafana",
        "-t",
        "stdio"
      ],
      "env": {
        "GRAFANA_URL": "http://host.docker.internal:9120",
        "GRAFANA_API_KEY": "YOUR_GRAFANA_SERVICE_ACCOUNT_TOKEN"
      }
    }
  }
}

When I type /mcp in the Gemini CLI, it lists all the available tools. I can then ask Gemini to do things like Summarize grafana board, get the "Docker Prometheus Monitoring" board summary, or query_prometheus rate(node_cpu_seconds_total{mode="system"}[5m]).
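As background, PromQL’s rate() returns the per-second average increase of a counter over the window, so the query above yields each core’s fraction of time spent in system mode. The idea can be illustrated in a few lines of Python (a simplified sketch that ignores counter resets and Prometheus’s window extrapolation):

```python
def simple_rate(samples):
    """Approximate PromQL rate(): per-second increase of a counter
    across the samples in the window.

    samples: list of (timestamp, counter_value) pairs, oldest first.
    """
    if len(samples) < 2:
        return 0.0
    (t0, v0), (t1, v1) = samples[0], samples[-1]
    # Per-second increase between the first and last sample.
    return (v1 - v0) / (t1 - t0)

# Example: a "seconds spent in system mode" counter sampled every 15s.
# It grows by ~0.15s per interval, i.e. ~1% system CPU time.
samples = [(0, 100.00), (15, 100.15), (30, 100.30), (45, 100.45)]
print(simple_rate(samples))  # roughly 0.01
```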

Now, let’s see how to use Grafana with n8n.

n8n using Grafana MCP

I did not install Docker in the n8n container, so I decided to run the mcp-grafana binary file directly.

Download here: https://github.com/grafana/mcp-grafana/releases

Just run these commands inside your container to install it.

$ wget https://github.com/grafana/mcp-grafana/releases/download/v0.6.2/mcp-grafana_Linux_x86_64.tar.gz
$ tar -xvzf mcp-grafana_Linux_x86_64.tar.gz

After that, n8n nodes should be able to run the Grafana MCP server with the /home/node/mcp-grafana command. Before we try the command, we have to add the n8n-nodes-mcp community node to n8n and set up an AI agent workflow. Let’s take a look.

  • AI Agent Node: I set the system prompt to You are a helpful Grafana MCP assistant, and then triggered it with the connected Chat Trigger node.
  • Google Gemini Chat Model: The important thing is that the host for the Google Gemini (PaLM) API credential should be https://generativelanguage.googleapis.com. Then paste your Gemini API key into the API Key field.
  • Simple Memory: The Context Window Length is set to 10 because we need to exchange messages with the assistant multiple times.
  • MCP Client Tool - listTools: Remember to use the MCP Client Tool node from the community package we just installed. For this node, I set the Operation to List Tools.
  • MCP Client Tool - executeTool: For this node, I set Operation to Execute Tool and Tool Name to {{ $fromAI('tool') }}. Pay attention to the green area that says Let the model define this parameter; just click it!
  • Grafana MCP Credential: This credential connects using Command Line (STDIO). Set the command to /home/node/mcp-grafana and set the GRAFANA_URL and GRAFANA_API_KEY environment variables.

Okay! Let’s try some prompts for this MCP:

{
  "tool": "query_prometheus",
  "Tool_Parameters": {
    "datasourceUid": "PBFA97CFB590B2093",
    "expr": "rate(node_cpu_seconds_total{mode=\"system\"}[5m])",
    "queryType": "range",
    "startTime": "now-1h",
    "endTime": "now",
    "stepSeconds": 60
  }
}

The Prometheus query returned time series data for the rate of system CPU usage over the last hour, sampled every 60 seconds, for 11 CPU cores (labeled 0 through 10).

Here is a summary of the data for each CPU core:

CPU 0:
  Latest Value: 0.0148
  Average Value: 0.0155
CPU 1:
  Latest Value: 0.0061
  Average Value: 0.0059
CPU 2:
  Latest Value: 0.0064
  Average Value: 0.0058
CPU 3:
  Latest Value: 0.0037
  Average Value: 0.0053
CPU 4:
  Latest Value: 0.0044
  Average Value: 0.0054
CPU 5:
  Latest Value: 0.0065
  Average Value: 0.0053
CPU 6:
  Latest Value: 0.0050
  Average Value: 0.0060
CPU 7:
  Latest Value: 0.0045
  Average Value: 0.0051
CPU 8:
  Latest Value: 0.0049
  Average Value: 0.0047
CPU 9:
  Latest Value: 0.0068
  Average Value: 0.0045
CPU 10:
  Latest Value: 0.0035
  Average Value: 0.0046
This data indicates the proportion of time each CPU core spent in "system" mode over 5-minute intervals. I can provide a more detailed analysis, create a visualization, or filter the data further if you need.
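The latest/average figures in a reply like this are easy to reproduce from the raw range-query result. A standalone sketch (the input mirrors the shape of Prometheus’s "matrix" JSON, where each series carries [timestamp, "value"] pairs; the sample data is made up):

```python
def summarize_series(result):
    """Compute latest and average value per series from a
    Prometheus range-query ("matrix") result."""
    summary = {}
    for series in result:
        cpu = series["metric"].get("cpu", "?")
        values = [float(v) for _, v in series["values"]]
        summary[cpu] = {
            "latest": values[-1],
            "average": sum(values) / len(values),
        }
    return summary

# Two fake series; values are [timestamp, "value"] pairs as Prometheus returns them.
result = [
    {"metric": {"cpu": "0"}, "values": [[0, "0.010"], [60, "0.020"]]},
    {"metric": {"cpu": "1"}, "values": [[0, "0.004"], [60, "0.006"]]},
]
print(summarize_series(result))
```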


Conclusion

And there you have it! The Model Context Protocol (MCP) is a game-changer, allowing you to connect services like Grafana to multiple AI agents with incredible ease. Forget writing boilerplate code for HTTP requests or wrestling with APIs. With MCP, a few configuration steps are all you need to get your services talking to each other, making powerful integrations more accessible than ever.

If you liked this post, please connect with me on LinkedIn and give me some encouragement. Thanks.
