DevOps/SRE engineers frequently switch between multiple systems and tools in their daily work, and this fragmentation continuously consumes time and attention. The emergence of MCP (Model Context Protocol) offers a new solution to this problem. This article selects 10 representative MCP servers spanning core scenarios such as Infrastructure as Code (IaC) and cloud resource management, container and orchestration platform operations, software development and CI/CD processes, system observability and fault management, and data layer access and manipulation. By exposing operational capabilities that previously relied on complex command lines or graphical interfaces to AI as structured interfaces, these servers let engineers describe and execute tasks in natural language, building a more intelligent workflow and a unified work experience centered on the Terminal.
Introduction
The daily work of a DevOps engineer often spans the entire software lifecycle, from code writing to system maintenance. Whether it’s coding and building during the development phase, or subsequent release, deployment, and maintenance, constant switching between different systems and tools is required. In practice, this disconnect is particularly pronounced: continuous integration and delivery typically rely on tools like GitHub and Jenkins; infrastructure and resource management are handled on cloud platforms like AWS and Alibaba Cloud; service deployment and orchestration are handled by Docker and Kubernetes; and in the maintenance phase, systems like Grafana and Sentry are needed for monitoring and issue tracking. The tools themselves aren’t complex, but the frequent context switching continuously consumes engineers’ time and attention.
Behind this entire process, the Terminal remains the core and most frequently used entry point for DevOps engineers. How to further simplify workflows and reduce tool switching costs around the Terminal has become a key issue in improving DevOps efficiency.
The emergence of MCP (Model Context Protocol) provides a new approach to this problem. Through MCP Server, AI can access different platforms and tools, unifying operational capabilities previously scattered across various systems into a single context. Leveraging AI’s understanding and execution capabilities, engineers can directly complete a series of operations such as building, deploying, and maintaining systems within the Terminal, achieving a unified workflow centered around the Terminal.
In this direction, Chaterm, an open-source AI Terminal tool, has taken the lead in supporting the MCP protocol, providing a practical example of the "AI + Terminal" DevOps work model.
What is an MCP server?
MCP is an open-source protocol standard designed to provide a unified way for AI applications to connect to external systems. Through MCP, AI applications such as Claude or ChatGPT can access various data sources (such as local files and databases), tools (such as search engines and calculators), and workflows (such as custom prompt chains) in a secure, standardized way, thereby expanding their ability to gather information and execute tasks.
It can be likened to a "USB-C port" in the AI field: just as USB-C provides a universal physical connection standard for electronic devices, MCP defines a universal communication protocol and data exchange specification for the interaction between AI applications and external services. This positioning makes it a key infrastructure for building modular, scalable AI agents.
The above is the core definition of MCP. For more in-depth understanding of the protocol’s technical details, it is recommended to consult its official documentation.
How to choose the best MCP server
Thousands of MCP server implementations exist in community repositories such as "Awesome MCP Servers". To select the most suitable server for your needs from among the many options, it is recommended to evaluate based on the following core criteria:
Scenario Suitability: Assess whether the server is built around the services you use daily or plan to integrate. Can its tools automate the most common or time-consuming tasks in your work? The core value of an MCP server lies in its ability to automate specific business processes.
Core Tools: Carefully review the list of tools provided by the server. Different implementations have different focuses; ensure that its tools cover your key needs.
Implementation Status: Prioritize MCP servers officially released and maintained by the service provider, as this usually means better stability, security, and continuous updates. If there is no official version, examine the popularity (e.g., number of GitHub stars), activity level, and documentation completeness of community implementations.
Communication Protocol: MCP supports two main transports:
- Stdio transport: suited to locally deployed servers; inter-process communication with low latency.
- HTTP transport: suited to remote servers; generally simpler to configure, requires no complex local environment, and consumes no local computing resources.
Choose based on your deployment environment (local or cloud) and network conditions. Generally, if an HTTP implementation is available, HTTP is the preferred method.
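As a concrete illustration, the two transports correspond to two different configuration shapes in most MCP clients. In the sketch below, the server names, package name, and URL are placeholders rather than real services:

```json
{
  "mcpServers": {
    "local-stdio-example": {
      "command": "npx",
      "args": ["-y", "example-mcp-server@latest"]
    },
    "remote-http-example": {
      "url": "https://mcp.example.com/mcp",
      "headers": {
        "Authorization": "Bearer your-token"
      }
    }
  }
}
```

A stdio entry tells the client which local process to spawn, while an HTTP entry only needs an endpoint and credentials, which is why remote servers are usually simpler to set up.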
Top 10 MCP Servers for Improving DevOps/SRE Workflows
To improve the efficiency and focus of DevOps/SRE engineers in their work, this article selects 10 representative MCP servers for analysis. These tools have been validated to a reasonable degree in terms of stability, functional maturity, and real-world adoption.
In terms of capability coverage, they span multiple key aspects of the DevOps/SRE workflow, including Infrastructure as Code (IaC) and cloud resource management, containerization and orchestration platform operation and maintenance, software development and CI/CD processes, system observability and fault management, and data layer access and operation, basically covering the core scenarios of daily work.
In terms of technical path, these MCP servers attempt to expose operational capabilities that originally relied on complex command lines or graphical interfaces to AI in the form of structured interfaces, enabling AI to understand and execute tasks through natural language, thereby building a higher level of intelligent workflow.
The following sections will introduce the core functions, applicable scenarios, and respective advantages and limitations of these ten MCP servers one by one.
It should be noted that all the MCP servers covered in this article can be imported and used in the open-source AI Terminal tool Chaterm. Corresponding configuration examples are attached at the end of the article for readers to try out.
Infrastructure and cloud services
1. AWS platform MCP server
AWS provides a dedicated suite of MCP servers that allow AI assistants to directly access AWS documentation, best practices, and cloud resources. Through these servers, AI applications can perform common cloud infrastructure management tasks, such as manipulating resources via the AWS CLI or Cloud Control API, managing EC2 instances and ECS/EKS container clusters, or querying services like IAM, RDS, and S3. According to AWS, the MCP servers significantly improve the quality and accuracy of model output because the model can access up-to-date documentation and service information within its context. Furthermore, the AWS MCP servers encapsulate common Infrastructure as Code (IaC) workflows (such as CDK and Terraform) into AI-callable tools, increasing automation.
Functions and Application Scenarios: AWS MCP servers include document query services, infrastructure management services, and security scanning services, enabling AI to perform operations such as AWS resource creation, configuration, and auditing using natural language. For example, it can query the latest AWS API references, generate CloudFormation templates, or monitor EKS cluster status.
Advantages: Provides real-time alignment with official AWS documentation, preventing models from responding with outdated information; a unified interface supports multiple AWS services (EC2, S3, Lambda, RDS, etc.), significantly reducing integration complexity; built-in best practices and security checks improve code quality and compliance.
Communication Protocol: Varies across the individual servers in the suite.
Community/Commercial Support: Implemented by AWS (open source)
Official Documentation/Project Address: https://awslabs.github.io/mcp/installation https://github.com/awslabs/mcp
Besides AWS, other cloud service providers also offer corresponding MCP services.
2. HashiCorp Terraform MCP Server
HashiCorp’s official Terraform MCP server introduces MCP support for Terraform configuration management. This server allows AI models to access provider documentation, modules, and policies in the Terraform Registry in real time, generating accurate Terraform configurations instead of relying on outdated training data. HashiCorp documentation states that the Terraform MCP server integrates with the public Registry’s API, supporting the lookup of module inputs and outputs, referencing Sentinel policies, and managing Terraform Cloud (HCP/TFE) organizations and workspaces.
Features and Application Scenarios: AI can query the latest Terraform provider documentation, sample code, and policy rules through the MCP server; it can also automatically create, update, or delete workspaces, variables, and tags in the Terraform Cloud environment. This is particularly useful for writing and reviewing infrastructure code, allowing AI assistants to generate best-practice-compliant configuration snippets or execution plan analyses.
Advantages: Eliminates knowledge gaps caused by Terraform version updates, ensuring that generated IaC content is always synchronized with the latest Registry. Supports access to private Registries and team environments (HCP/TFE), suitable for teams of all sizes.
Communication Protocol: HTTP + STDIO Community/Commercial Support: Officially maintained by HashiCorp (open source) Official Documentation/Project Address: https://developer.hashicorp.com/terraform/mcp-server/deploy https://github.com/hashicorp/terraform-mcp-server
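Since the server supports stdio, one common pattern is running it as a Docker container from the client. A minimal Chaterm-style configuration might look like the sketch below; the image name `hashicorp/terraform-mcp-server` is an assumption, so check the official deployment docs for the current image and flags:

```json
{
  "mcpServers": {
    "terraform": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "hashicorp/terraform-mcp-server"],
      "disabled": false
    }
  }
}
```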
3. Pulumi platform MCP server
Pulumi has launched the MCP server, enabling AI assistants to access resources in the Pulumi Cloud and delegate tasks to Pulumi Neo for automated execution. Pulumi documentation explains that the MCP server allows AI to query stacks and resources within a Pulumi organization, search cloud resources across organizations, and generate and manage infrastructure code using information from the Pulumi Registry.
Features and Application Scenarios: Through the MCP interface, AI can retrieve Pulumi Stack status, resource lists, policy compliance reports, etc.; it can also manage organization members, modify infrastructure configurations, and trigger automated deployments (Pulumi Neo). This makes infrastructure development more conversational and traceable. For example, it can ask, "List all AWS EC2 instances in my organization" or "Generate GCP virtual machine configurations based on my needs."
Advantages: Supports multi-language IaC (TypeScript, Python, etc.); AI can directly generate cross-cloud Pulumi code. Integrates Pulumi best practices and policy checks, improving code quality and avoiding manual deployment errors.
Communication Protocol: HTTP Community/Commercial Support: Implemented and maintained by Pulumi (not open source) Official Documentation/Project Address: https://www.pulumi.com/docs/iac/guides/ai-integration/mcp-server/
4. Kubernetes MCP Server
The Kubernetes MCP server is a community-developed implementation that allows users to manage and monitor Kubernetes environments using natural language commands. It supports core kubectl operations such as creating/deleting Pods, services, and Deployments, and diagnosing cluster health. It also incorporates secure connections and RBAC authentication mechanisms to ensure AI access complies with Kubernetes permission policies.
Features and Application Scenarios: AI can query resource status (such as Pod lists and node metrics), deploy new services or scale existing Deployments, and even perform complex troubleshooting. Example scenarios include "check which Pods in this namespace are abnormal" and "help me roll back a Deployment."
Advantages: Transforms cumbersome Kubernetes command-line operations into intuitive dialogue, lowering the barrier to managing complex clusters. It can monitor cluster health in real time and promptly identify problems.
Communication Protocol: stdio Community/Commercial Support: Developed by the community (open source) Official Documentation/Project Address: https://github.com/Flux159/mcp-server-kubernetes
5. Docker MCP Server
The Docker Hub MCP server exposes Docker Hub's massive image catalog (including the Docker Official Images library) to LLMs via the MCP protocol, helping developers discover, evaluate, and manage container images using natural language. Built on the Docker ecosystem, it is designed specifically for intelligent container-management scenarios.
Features and Applications: It provides one-click installation and configuration. The LLM can query required images using natural language (no need to remember complex tags or repository names) and retrieve image details. It can also perform repository management tasks through an intelligent assistant, such as listing repositories under a personal namespace, viewing image statistics, searching image content, and creating or updating repositories using natural language. It’s ideal for scenarios requiring rapid finding and management of container images in AI-assisted development workflows.
Advantages: The officially released service is integrated into the Docker toolchain, solving the MCP server environment dependency problem. It achieves one-click deployment through containerization, eliminating the need for manual environment configuration by users. The MCP Catalog simplifies the setup process and reduces integration costs.
Communication Protocol: HTTP + stdio Community/Commercial Support: Implemented and maintained by the official Docker team (open source).
Official documentation/project address: https://github.com/docker/hub-mcp/tree/main
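As a configuration sketch: Docker distributes catalog servers such as this one through the MCP Toolkit in Docker Desktop, whose gateway can be wired into an MCP client as a single stdio server. The `docker mcp` subcommand below assumes the MCP Toolkit CLI plugin is installed and the Docker Hub server is enabled in the catalog:

```json
{
  "mcpServers": {
    "docker": {
      "command": "docker",
      "args": ["mcp", "gateway", "run"],
      "disabled": false
    }
  }
}
```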
Code and CI/CD
6. GitHub platform MCP server
GitHub has officially launched the MCP server, directly integrating AI applications into the GitHub platform. This allows AI to read repository files, manage Issues and Pull Requests, analyze code quality, and automate workflows. The server can be hosted on GitHub (remote MCP) or run locally, supporting one-click integration with clients such as VS Code (Copilot Agent), Claude Desktop, and Cursor. GitHub documentation states that through MCP, AI assistants can browse repository structures, search historical commits, perform code reviews, monitor the GitHub Actions pipeline, and obtain CI/CD feedback.
Functions and Applicable Scenarios: Through the MCP server, AI assistants can perform common version control and collaboration tasks, such as creating/updating Issues, merging branches, releasing versions, and reviewing code security warnings. For example, a developer can ask in VS Code, "Which PRs are currently waiting to be merged?", and the MCP server returns the list of PRs and can trigger the merge operation.
Advantages: Reduces context switching between the IDE and GitHub interface, allowing developers to obtain the latest repository status and historical information through natural language. GitHub’s MCP server synchronizes data with the official platform in real time, ensuring the timeliness and accuracy of information.
Communication Protocol: HTTP + stdio Community/Commercial Support: Developed and maintained by GitHub (open source) Official Documentation/Project Address: https://github.com/github/github-mcp-server GitLab also provides a corresponding MCP service, which will not be elaborated upon further.
GitLab: https://docs.gitlab.com/user/gitlab_duo/model_context_protocol/mcp_server/
7. Jenkins platform MCP server
The Jenkins community has released the MCP Server plugin, enabling Jenkins to function as an MCP server. After installation, Jenkins automatically exposes its job, build, and log functionalities as MCP tools to the AI assistant. The Jenkins plugin page states: "The MCP Server plugin implements the MCP protocol server, enabling Jenkins to act as an MCP server, providing context, tools, and functionality to AI clients." This means that AI can query build status, trigger build tasks, or retrieve test results using natural language, with all operations executed and feedback provided by Jenkins.
Functions and Applicable Scenarios: The plugin provides Jenkins’ core functionalities (such as job triggering, build viewing, and log retrieval) to the AI in the form of MCP tools. The AI assistant can ask questions like "What caused the latest build failure?" or "Start a nighttime pipeline," and Jenkins will execute the corresponding actions and return the results.
Advantages: No additional dedicated server deployment is required; simply install the plugin on an existing Jenkins instance. It fully leverages existing Jenkins pipeline configurations and credential management for seamless integration with AI.
Compatibility: This plugin is compatible with Jenkins versions 2.479 and above.
Communication Protocol: HTTP + STDIO Community/Commercial Support: Developed and maintained by the official Jenkins team (open source) Official Documentation/Project Address: https://plugins.jenkins.io/mcp-server/
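Because the plugin serves MCP over HTTP from the Jenkins instance itself, client configuration is essentially just a URL plus credentials. In the sketch below, the endpoint path and authorization value are placeholders; consult the plugin page for the exact endpoint your installation exposes and use a Jenkins API token for authentication:

```json
{
  "mcpServers": {
    "jenkins": {
      "url": "https://jenkins.example.com/mcp-server/mcp",
      "headers": {
        "Authorization": "Basic <base64 of user:api-token>"
      },
      "disabled": false
    }
  }
}
```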
Observability
8. Grafana MCP Server
Grafana MCP Server is an official Grafana service that allows LLMs to access Grafana dashboards and its ecosystem via the MCP protocol. It enables AI to query and manage visualization resources within Grafana using natural language.
Features and Application Scenarios: Supports searching, retrieving, and modifying Grafana dashboards and data sources. For example, it allows searching and retrieving dashboard summaries or details, creating/updating dashboards, listing and retrieving data sources (supporting Prometheus, Loki, etc.), executing Prometheus/Loki queries to retrieve metrics and logs, and managing Grafana alerting rules, events, and Sift log investigations. Suitable for scenarios requiring the integration of monitoring data and visualization resources into intelligent operations or automated analysis processes.
Advantages: Officially maintained by Grafana, it offers broad functionality, covering most common scenarios such as dashboard management, querying, and data source operations, and is licensed under the Apache 2.0 open-source license. The official implementation guarantees stability and continuous updates, allowing full utilization of the existing functionality of the Grafana platform.
Compatibility: Grafana 9.0 and above support all features; some data source operations may be unavailable in versions prior to 9.0. Compatible with all Grafana instances configured with the management API.
Communication Protocol: HTTP + STDIO Community/Commercial Support: Officially implemented by Grafana (open source) Official Documentation/Project Address: https://github.com/grafana/mcp-grafana
9. Sentry MCP Server
Sentry's MCP server gives AI systems access to complete Sentry issue and error context via the Model Context Protocol (MCP). This allows AI assistants and development tools to securely access Sentry data, making it suitable for scenarios that bring Sentry's error monitoring and debugging information into intelligent workflows.
Features and Applicable Scenarios: Supports querying Sentry events via natural language, such as accessing errors and issues in Sentry, searching for errors in specific files, querying project and organization information, listing/creating project DSNs, and performing autofixes and obtaining status. Suitable for scenarios requiring the integration of Sentry error logs and crash report context into AI-assisted development or automated operations processes.
Advantages: Officially hosted remote service, no self-deployment required. Tools and features are primarily geared towards developer workflow and debugging needs (such as error analysis in coding assistants), optimizing the experience for use with code assistance tools (such as Cursor and Claude Code).
Communication Protocol: HTTP + STDIO Community/Commercial Support: Maintained by Sentry (open source) Official Documentation/Project Address: https://github.com/getsentry/sentry-mcp
Database
10. MongoDB MCP Server
The official MongoDB MCP server (public beta) allows connecting MongoDB databases (Atlas, Community, or Enterprise) to AI tools via the MCP protocol. It enables AI to query document data and perform administrative operations using natural language.
Features and applicable scenarios: Supports data exploration (e.g., displaying the schema of a user collection or finding active users), database management (e.g., creating read-only users, listing network access rules), and context-aware query generation (AI describes the required data and automatically generates MongoDB queries and application code). Suitable for scenarios where intelligent assistants perform database queries, document analysis, and database maintenance tasks.
Advantages: Officially released and integrated with the MongoDB ecosystem, supports Atlas and local deployments, and provides natively supported MCP interfaces. Integrated into AI development environments such as Windsurf, allowing developers to access MongoDB data without leaving their IDE.
Communication protocol: HTTP + stdio Community/commercial support: Implemented and maintained by MongoDB (open source) Official documentation/project address: https://www.mongodb.com/company/blog/announcing-mongodb-mcp-server https://github.com/mongodb-js/mongodb-mcp-server Similarly, you can find MCP servers for other databases such as MySQL and Redis.
Best Practices
Managing the Number of Tools
In practical use of MCP, managing the number of enabled tools is often overlooked yet crucial. Typically, we add an MCP service for a specific task and use it; when a new conversation starts after that task is done, it is easy to forget that the previously added MCP tools are still enabled. Even though they go completely unused in the new conversation, they continue to occupy valuable context space and waste resources.
A better approach is to cultivate good habits: before starting a new conversation task, proactively check whether the currently enabled MCP services are truly needed and promptly disable those services unrelated to the new task. This not only frees up context space but also allows the model’s attention to focus more on truly relevant tools, improving overall response quality and efficiency.
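In Chaterm's mcp_setting.json, for example, this maps to the per-server "disabled" flag: flipping it to true parks a server between tasks without deleting its configuration. The entry below is illustrative:

```json
{
  "mcpServers": {
    "kubernetes": {
      "command": "npx",
      "args": ["mcp-server-kubernetes"],
      "disabled": true
    }
  }
}
```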
Furthermore, some more sophisticated applications offer on/off functionality for individual tools, allowing users to selectively disable certain unnecessary tools without shutting down the entire MCP service. It is recommended to use this feature appropriately to make context management more precise and efficient.
Progressive Disclosure
When enabling numerous MCP services within a single context, developers often encounter two thorny issues: (1) excessive context space consumption and (2) tool forgetting in long conversations. These are unavoidable bottlenecks in the "load all tools at once" MCP usage model. While there is currently no unified standardized solution to address these issues, several promising technical paths have emerged: whether it’s Claude’s recently launched Skills or VS Code Copilot’s ToolSets, their core concept is to progressively disclose tool details, rather than loading all tool information at once. With continued community exploration and gradual improvement of standards, we have reason to expect a more efficient and intelligent MCP ecosystem.
Common MCP Platforms
| Platform | Number of Listed Entries (as of September 2025) | Main Features | Usage Threshold | Recommendation / Suitable Users |
| --- | --- | --- | --- | --- |
| mcp.so | 16,436 | World's largest MCP library; keyword search; fine-grained categories; Chinese-language support; copy-paste installation commands; user-submitted custom servers (1,000+ submissions); detailed documentation; comment and discussion features | Medium: requires manual deployment, but the interface is clear and supports Chinese | ⭐⭐⭐⭐⭐ Highly recommended for users who need a large catalog, clear categorization, and a Chinese-friendly interface |
| MCPHub | 26,181 | Keyword search; fine-grained categories; user-submitted custom servers; copy-paste installation commands; detailed documentation; comment and discussion features | Low: roughly 5,000 MCPs already hosted online | ⭐⭐⭐⭐ Developers and beginners who want to try MCPs quickly |
| PulseMCP | 5,966 | Dynamically updated; covers both MCP servers and clients; latest MCP news and detailed test cases; user-submitted custom servers | Medium: intuitive interface with direct links to GitHub repositories | ⭐⭐⭐⭐ Those following the latest MCP ecosystem developments and client/trend reports |
| Smithery | 6,374 | Keyword search; relatively simple categorization but copy-paste installation commands; indicates client support status; automatic installation commands for some clients; basic introductions; clean interface | Low: beginner-friendly, but some services are unstable | ⭐⭐⭐ Beginners and developers who want to try MCPs quickly |
| Awesome MCP Servers | 1,968 | Small but curated selection; clear categorization; focus on quality; basic introductions; user-submitted custom servers | Medium: clean interface, requires some development experience | ⭐⭐⭐ Those who want to find reliable MCPs without information overload |
Using the MCP servers described above in Chaterm
1. Open the "Settings" page in Chaterm.
2. Locate the "Tools & MCP" tab on the left and click "Add Server"; the system will automatically open the mcp_setting.json file.
3. Add the following configuration to the file, adjusting the parameters as needed.
4. After saving, Chaterm will automatically read the file and attempt to connect to the servers.
{
  "mcpServers": {
    "github": {
      "url": "https://api.githubcopilot.com/mcp/",
      "headers": {
        "Authorization": "Bearer your-token"
      },
      "disabled": false
    },
    "awslabs.aws-documentation-mcp-server": {
      "command": "uvx",
      "args": [
        "awslabs.aws-documentation-mcp-server@latest"
      ],
      "env": {
        "FASTMCP_LOG_LEVEL": "ERROR",
        "AWS_DOCUMENTATION_PARTITION": "aws"
      },
      "disabled": false
    },
    "grafana": {
      "command": "docker",
      "args": [
        "run",
        "--rm",
        "-i",
        "-e",
        "GRAFANA_URL",
        "-e",
        "GRAFANA_SERVICE_ACCOUNT_TOKEN",
        "mcp/grafana",
        "-t",
        "stdio"
      ],
      "env": {
        "GRAFANA_URL": "",
        "GRAFANA_SERVICE_ACCOUNT_TOKEN": "",
        "GRAFANA_USERNAME": "",
        "GRAFANA_PASSWORD": "",
        "GRAFANA_ORG_ID": "1"
      },
      "disabled": false
    },
    "sentry": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-remote@latest",
        "https://mcp.sentry.dev/mcp"
      ],
      "disabled": false
    },
    "kubernetes": {
      "command": "npx",
      "args": [
        "mcp-server-kubernetes"
      ],
      "disabled": false
    },
    "MongoDB": {
      "command": "npx",
      "args": [
        "-y",
        "mongodb-mcp-server@latest",
        "--readOnly"
      ],
      "env": {
        "MDB_MCP_CONNECTION_STRING": ""
      },
      "disabled": false
    }
  }
}
Originally published at https://chaterm.ai Website: https://chaterm.ai/ GitHub: https://github.com/chaterm/Chaterm