Today, we’re releasing the first production version of the SAP Datasphere MCP Server.
SAP Datasphere is SAP’s data fabric engine: the place where data is extracted from sources such as SAP S/4HANA, Ariba, or SuccessFactors, curated, and then exposed or loaded into a destination system.
An evolution of SAP Data Warehouse Cloud, Datasphere has been adopted by thousands of customers and is the central component of one of SAP’s key strategic initiatives: Business Data Cloud (BDC).
In other words, Datasphere is where the future of any SAP data practitioner lies.
Yet the tool is cumbersome to use, and talent who can manage it is hard to find. This MCP server can help.

Intro
MCP, in case you are not familiar with it, is like the toolbox a mechanic keeps in the garage: it provides all the tools a generative AI agent (the mechanic) needs to fix the car.
This MCP server was initially built for a customer with a large Datasphere tenant who needed to document the existing configurations and their usage. It has since been cleaned up and expanded into 44 production-ready tools, developed in collaboration with Claude and Kiro (one as developer, the other as tester), and it represents thousands of lines of code and hundreds of hours of AI-assisted writing and validation, so you don’t have to do it yourself.
It’s the first of its kind. When we started building the SAP Datasphere MCP Server, the landscape of comparable data-fabric MCP servers was sparse. Now it is evolving. Today, this MCP server provides 44 enterprise-grade tools through the SAP Datasphere APIs, and it will not stop there.
The current MCP server has extensive “read” and “analyze” capabilities on Datasphere assets, but it lacks “develop” (write) capabilities for now.
Use Case #1: The Monday Morning Health Check
- Persona: Data Operations Manager
- Time Saved: 45 minutes daily
- Tools Used: 8 tools working in concert
- The Problem: Sarah must provide an early status of the system and highlight critical issues.
Tools used: health_check, list_spaces, list_data_flows, get_data_flow_status (for each failed flow), list_connections, get_tenant_info, discover_catalog, and analyze_column_distribution.
Result: Sarah has a complete environment status in 2–4 minutes, formatted as a report she can forward to leadership.
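As a rough illustration of what happens under the hood, here is how an MCP client could chain these tools programmatically using the official MCP Python SDK. This is a minimal sketch: the tool names come from this server, but the console-script name, the argument names, and the response handling are assumptions to check against the actual tool schemas.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def monday_health_check() -> None:
    # Launch the server over stdio; assumes the pip package installs
    # a `sap-datasphere-mcp` console script.
    params = StdioServerParameters(command="sap-datasphere-mcp")
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Overall tenant status first, then spaces and data flows.
            for tool in ("health_check", "list_spaces", "list_data_flows"):
                result = await session.call_tool(tool, {})
                print(f"=== {tool} ===")
                print(result.content[0].text)  # assuming text responses

            # Follow-up per failed flow (sketch; "flow_id" is an
            # assumed argument name, not the documented schema):
            # await session.call_tool("get_data_flow_status", {"flow_id": "..."})

asyncio.run(monday_health_check())
```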

Use Case #2: The Data Lineage Inspection
- Persona: Data Engineer
- Time Saved: 3 hours per investigation
- Tools Used: 6 tools for impact analysis
- The Problem: Marcus, the Data Engineer, must answer this question ASAP:
“We’re deprecating the LEGACY_CUSTOMER_ID field next month. Can you tell me what breaks?”
This question triggers hours of investigation:
- Which tables contain this field?
- Which views reference it?
- Which data flows transform it?
- What’s the downstream impact?

Marcus asks the Agent:
“Find everywhere LEGACY_CUSTOMER_ID is used in our data landscape.”
The Investigation Unfolds: the Agent uses find_assets_by_column, get_view_definition, list_data_flows, get_data_flow_details, …
Marcus finally has a complete picture:
- 7 tables/views directly affected
- 3 data flows that will break
- 1 critical JOIN operation requiring migration logic
- Estimated effort: 40 hours for proper migration
Result: What would have taken 3+ hours of manual investigation took 5–10 minutes. Marcus can now plan the migration properly instead of breaking production.
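For readers who prefer to script the same investigation, here is a minimal sketch of the first step, again via the MCP Python SDK. The column_name and view_name argument names are guesses, not the server’s documented schema.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def trace_column(column: str) -> None:
    params = StdioServerParameters(command="sap-datasphere-mcp")
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Step 1: find every asset exposing the column.
            result = await session.call_tool(
                "find_assets_by_column", {"column_name": column}
            )
            for item in result.content:
                if item.type == "text":
                    print(item.text)  # tables/views exposing the column
            # Step 2 (sketch): drill into each hit, e.g.
            # await session.call_tool("get_view_definition", {"view_name": "..."})

asyncio.run(trace_column("LEGACY_CUSTOMER_ID"))
```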

Use Case #3: The Pre-Analytics Data Quality Audit
- Persona: Data Analyst
- Time Saved: 2 hours per analysis project
- Tools Used: 5 tools for quality assessment
The Scenario: Lisa is building a customer segmentation dashboard. Before she invests time in Power BI development, she needs to know: Is the source data clean enough?

Result: Lisa knows exactly which fields are reliable for segmentation, understands the data quality issues, and even learned the business reason behind missing data. She can build her dashboard with confidence.
Time: 10 minutes vs. 2+ hours of SQL queries and data profiling scripts.
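The same pre-flight check can be sketched in code with the analyze_column_distribution tool mentioned earlier. Everything specific below (the CUSTOMER_MASTER table, the column names, the argument names, the response shape) is hypothetical and should be adapted to your tenant.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Hypothetical table and candidate segmentation columns.
TABLE = "CUSTOMER_MASTER"
COLUMNS = ["COUNTRY", "CUSTOMER_SEGMENT", "SIGNUP_DATE"]

async def profile_columns() -> None:
    params = StdioServerParameters(command="sap-datasphere-mcp")
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            for column in COLUMNS:
                # "table_name"/"column_name" are assumed argument names.
                result = await session.call_tool(
                    "analyze_column_distribution",
                    {"table_name": TABLE, "column_name": column},
                )
                print(f"--- {column} ---")
                print(result.content[0].text)  # assuming a text summary

asyncio.run(profile_columns())
```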
Use Case #4: The Onboarding Speedrun
- Persona: New Data Engineer
- Time Saved: 2 days of exploration
- Tools Used: 12 tools for landscape discovery
It’s Alex’s first week at a new company. He needs to understand:
- What data exists in SAP Datasphere?
- How is it organized?
- What are the critical data flows?
- Where should he focus his learning?
“I’m new here. Give me an overview of our SAP Datasphere environment.”

After 30 minutes of conversation, Alex has:
- ✅ Complete environment inventory
- ✅ Understanding of critical data assets
- ✅ Knowledge of team structure and access
- ✅ List of key data flows to monitor
- ✅ Schema documentation for important tables
- ✅ Context on why data is structured this way
What used to take 2 days of meetings and documentation reading took 30 minutes.
Use Case #5: The Marketplace Shopping Spree
- Persona: Analytics Team Lead
- Time Saved: 1 hour per evaluation
- Tools Used: 3 tools for content discovery
The analytics team needs pre-built industry models for retail analytics. Instead of building from scratch, they want to see what’s available in the SAP Datasphere Marketplace.
“What retail analytics content is available in the marketplace?”

The team lead can now:
1. Compare 15 relevant packages instantly
2. See free vs. paid options with pricing
3. Read reviews and ratings
4. Understand what’s included in each package
5. Make an informed decision without leaving the conversation
Triggering the installation workflow and exploring the models themselves will come in future MCP releases; these actions are currently not possible through the programmatic APIs.
Use Case #6: The Security Audit
- Persona: Data Governance Manager
- Time Saved: 4 hours per audit
- Tools Used: 7 tools for compliance reporting
The Audit Requirement: the quarterly security audit requires documentation of:
- Who has access to sensitive data?
- Where is PII stored?
- Which connections access external systems?
- Are there any security risks?
“I need a security audit report for our Q4 compliance review.
Focus on PII data and user access.”

Result: A comprehensive security audit that would take a compliance analyst 4+ hours to compile manually was generated in 5 minutes.

Use Case #7: The Performance Troubleshooter
- Persona: Data Platform Engineer
- Time Saved: 1.5 hours per incident
- Tools Used: 9 tools for root cause analysis
“SALES_ANALYTICS dashboard is slow. Help me debug.”

Marcus now knows:
1. What: 89M rows loaded instead of 100K
2. Why: Debug filter left in production deployment
3. When: Deployed Dec 12 at 22:45
4. Impact: 312 GB table, causing dashboard slowness
5. Solution: Revert data flow config, truncate bad data
Use Case #8: The Cross-Functional Collaboration
- Persona: Lisa and Marcus (Business Analyst + Data Engineer)
- Time Saved: 3 hours of back-and-forth
- Tools Used: 6 tools bridging business and technical
Business Analyst (Lisa) on Slack:
“I need sales data by country for last quarter. Can you help?”
Lisa shares her Agent conversation with Marcus, the data engineer.
Together, they work toward a common goal.

Marcus can see the entire conversation and validate it; Claude’s conversation serves as documentation, and Lisa has learned which tables exist and how they join.
The goal: AI as a Data Platform Copilot
Across all these use cases, we see common patterns:

1. Conversational Discovery
Instead of: “I need to find where customer data is stored” (opens UI, clicks around)
You ask: “Where is customer data?” and get instant, comprehensive answers.
2. Context Building
The AI doesn’t just answer one question — it:
- Uses multiple tools in sequence
- Connects related information
- Provides context you didn’t know to ask for
3. Iterative Refinement
“Show me sales data.”
→ “Add product categories.”
→ “Filter to Q4.”
→ “Group by country.”
→ “Add monthly trends.”
Each step builds on the previous, like pair programming with your data platform.
4. Explanation + Action
Not just “here’s the data” but:
- Why the data looks this way
- What might be wrong
- What you should check next
- How systems relate to each other
5. Cross-Domain Knowledge
The same conversation can:
- Query data (analyst skills)
- Check data flows (engineer skills)
- Review access controls (governance skills)
- Analyze performance (DBA skills)
The Impact: ROI
- Time Savings
- Quality Improvements
- Fewer Errors
- Better Documentation
- Faster Onboarding
- Democratized Access
- … and others
Future Enhancements on the Roadmap
Multi-Tenant Operations:
- Compare data across dev/staging/production tenants
- Promote configurations between environments
- Cross-tenant lineage tracking
Advanced Analytics:
- Automated anomaly detection
- Predictive data quality scoring
- ML-powered optimization suggestions
Workflow Automation:
- “Create a data flow that does X.”
- “Schedule quality checks for all customer tables.”
- “Alert me when data freshness > 24 hours.”
Collaboration Features:
- Share conversations as runnable playbooks
- Team libraries of common questions
- Automated report generation
Because of the scope of the existing SAP Datasphere APIs, this MCP server focuses on visualization and query (read-only data extraction).
Compare this with, for example, the official Microsoft Fabric MCP server, which isn’t just about the data; it also covers the containers that hold the data. The goal is shared, though: both this Datasphere MCP server and the Microsoft Fabric MCP server focus on the development context.
Both provide the LLM with the full OpenAPI specifications, JSON schemas for items (Lakehouses, Pipelines), and best-practice limits.
The value is significant. In the past, connecting an LLM to your data required building complex “RAG” pipelines or pasting CSVs into a chat window. MCP standardizes this, turning your data mesh stack into a “read/write” environment for AI. Right now it performs like a junior data engineer, but it will get better as the Datasphere APIs are extended, BDC APIs are added, and customers combine it with other SAP MCP servers.
Try It Yourself
Everything in this blog post is real and available today.
No waitlists. No closed beta. No enterprise sales calls.
pip install sap-datasphere-mcp
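And a minimal Python quickstart to verify the server responds. This is a sketch, not official documentation: it assumes the package installs a sap-datasphere-mcp console script and that tenant credentials are configured separately.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Start the server as a subprocess and speak MCP over stdio.
    params = StdioServerParameters(command="sap-datasphere-mcp")
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print(f"{len(tools.tools)} tools available:")
            for tool in tools.tools:
                print(f"- {tool.name}: {tool.description}")

asyncio.run(main())
```

From there, point your MCP-capable client of choice (Claude, Kiro, or others) at the same command and start asking questions.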
Resources:
- PyPI: https://pypi.org/project/sap-datasphere-mcp/
- GitHub: https://github.com/MarioDeFelipe/sap-datasphere-mcp
Ask questions. Get answers. Iterate. Discover.
That’s the SAP Datasphere MCP Server. You can host it in your own environment and modify it to fit your needs.