In our previous article about the Kiali toolset, we explored how Kiali acts as the "eyes" of your service mesh, providing unparalleled visibility into traffic topology, health, and metrics. We showed you how to manually inspect the graph, validate Istio configurations, and troubleshoot mTLS issues. But what if you didn't have to manually hunt for errors? What if you could just ask your cluster what’s wrong?
In this article, we will take a leap forward by installing the Kiali Model Context Protocol (Kubernetes MCP) server and connecting it to Red Hat OpenShift Lightspeed. This integration allows the OpenShift Lightspeed AI assistant to interface directly with Kiali, giving the AI visibility into your service mesh to assist with troubleshooting and configuration.

Before we dive into the installation, make sure you have the prerequisites in place: a cluster running an Istio-based service mesh with Kiali, and the OpenShift Lightspeed operator installed.
Set up OpenShift Lightspeed

For an enterprise-grade experience, you can integrate this toolset directly into OpenShift Lightspeed. This allows any user on the cluster to utilize Kiali's capabilities through the OpenShift Lightspeed chat interface. Since OpenShift Lightspeed runs inside the cluster, we need to deploy the MCP server as a service rather than running it locally on your laptop.
Step 1: Create a ConfigMap for Kiali configuration

Create a mcp-osl-config.toml file in a known location (e.g., ~/mcp-osl-config.toml):

toolsets = ["core", "kiali"]
read_only = true

[toolset_configs.kiali]
url = "https://kiali-istio-system.apps-crc.testing/"
insecure = true

Then, upload your TOML configuration to the cluster:

oc create configmap kubernetes-mcp-config \
  --from-file=mcp-osl-config.toml=./mcp-osl-config.toml \
  -n istio-system
Step 2: Deploy the MCP server

Create a deployment that runs the server, along with a service that exposes the MCP server over HTTP so OpenShift Lightspeed can connect to it:
apiVersion: apps/v1
kind: Deployment
metadata:
  name: kubernetes-mcp-server
  namespace: istio-system
spec:
  replicas: 1
  selector:
    matchLabels:
      app: kubernetes-mcp-server
  template:
    metadata:
      labels:
        app: kubernetes-mcp-server
    spec:
      serviceAccountName: kubernetes-mcp-server
      automountServiceAccountToken: true
      containers:
      - name: mcp-server
        image: quay.io/containers/kubernetes_mcp_server:latest # Check for latest version
        args:
        - "--port=8080"
        - "--config=/etc/mcp/mcp-osl-config.toml"
        ports:
        - containerPort: 8080
        volumeMounts:
        - name: config-vol
          mountPath: /etc/mcp
      volumes:
      - name: config-vol
        configMap:
          name: kubernetes-mcp-config
---
apiVersion: v1
kind: Service
metadata:
  name: kubernetes-mcp-server
  namespace: istio-system
spec:
  selector:
    app: kubernetes-mcp-server
  ports:
  - port: 8080
    targetPort: 8080
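The Service above gives the MCP server a stable in-cluster address following the standard Kubernetes DNS convention for services, which is the host OpenShift Lightspeed will be pointed at (the exact endpoint path, such as /mcp, depends on the MCP server's HTTP transport). A small illustrative sketch of how that address is composed:

```python
def service_url(name: str, namespace: str, port: int, scheme: str = "http") -> str:
    """Compose the cluster-internal DNS name for a Kubernetes Service."""
    return f"{scheme}://{name}.{namespace}.svc.cluster.local:{port}"

# The Service defined above: kubernetes-mcp-server in istio-system on port 8080.
print(service_url("kubernetes-mcp-server", "istio-system", 8080))
# -> http://kubernetes-mcp-server.istio-system.svc.cluster.local:8080
```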
Step 3: Register the tool with OpenShift Lightspeed

Once the MCP server is running, OpenShift Lightspeed needs to know it exists. Depending on your version, you may need to configure the OLSConfig (OpenShift Lightspeed configuration) resource to allowlist the new tool endpoint or enable the service mesh plug-in. You can also modify the YAML directly (Figure 1).

Figure 1: The Kiali toolset OpenShift Lightspeed configuration settings.
Step 4: Chat with your mesh

Now for the magic! Open the OpenShift Lightspeed chat window in the console and try this prompt (Figure 2):

User: "Analyze the traffic flow for the 'payments' service."

OpenShift Lightspeed: "I checked Kiali, and the 'payments' service is receiving traffic from 'checkout' but is experiencing a 15% error rate on the v2 workload..."

Figure 2: OpenShift Lightspeed talking with the MCP server in the chat window.

OpenShift Lightspeed returns an explanation (Figure 3).

Figure 3: OpenShift Lightspeed returns an explanation in this window.
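Under the hood, an exchange like the one above rides on the MCP protocol: after an initialize handshake, the client enumerates the server's tools with a JSON-RPC 2.0 tools/list request and then invokes them with tools/call. OpenShift Lightspeed handles all of this for you; the sketch below only illustrates the shape of the discovery payload:

```python
import json

def tools_list_request(request_id: int = 1) -> dict:
    """Build the JSON-RPC 2.0 'tools/list' request an MCP client sends."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/list",
        "params": {},
    }

# Serialized form as it would appear on the wire.
print(json.dumps(tools_list_request()))
```

The server's response lists each tool's name, description, and input schema, which is what lets the AI assistant decide that a question about the 'payments' service should be answered with Kiali traffic data.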
Final thoughts

By installing the Kiali MCP integration, we've transformed Kiali from a passive dashboard into an active data source for AI-driven operations. This setup reduces the "time-to-insight" for SREs and developers by combining the comprehensive Kiali toolset with the conversational power of OpenShift Lightspeed. You can watch the demo on YouTube.

The post Transform Kiali with OpenShift Lightspeed and Kubernetes MCP appeared first on Red Hat Developer.