This is an early dev version, not plug-and-play yet. You'll need to know your way around Docker networks and storage mapping.

# AnythingLLM MCP HTTP Bridge
Lightweight HTTP bridge for AnythingLLM to connect multiple local MCP servers securely via Docker. A clean, modular, and safe way to run multiple MCP servers without Docker-in-Docker or direct process spawning.
## Overview
This bridge allows AnythingLLM to communicate with multiple MCP servers inside isolated Docker containers. It provides a single, secure HTTP entry point while keeping each MCP fully sandboxed.
## Key Features
- Safe & clean: No host or Docker-in-Docker access.
- Modular: Each MCP runs in its own container with a small FastAPI server.
- Bridge-controlled: AnythingLLM only talks to the bridge (`mini-bridge`), never to the MCPs directly.
- Dynamic registry: MCP servers are auto-loaded from `mcp_registry.json`.
- Auto reload: The bridge reloads configuration changes live without restarting containers.
- Example MCPs: Includes a `Dummy-MCP` for testing and a working `MCP-Time` module returning system time.
## Architecture
```
──────────────────────────────────────────────
 Docker Network: test-net
──────────────────────────────────────────────
 │
 ├── anythingllm (Port 3001)
 │     ↳ Main app, talks only to mini-bridge
 │
 ├── mini-bridge (Port 4100)
 │     ↳ Forwards JSON-RPC requests to registered MCPs
 │     ↳ Reloads mcp_registry.json dynamically
 │
 ├── dummy-mcp (Port 4200)
 │     ↳ JSON-RPC test MCP returning handshake info
 │
 └── mcp-time (Port 4210)
       ↳ Returns current UTC time
──────────────────────────────────────────────
```
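The actual bridge lives in `mini_bridge_v2.py`. Purely as a sketch of the forwarding idea (not the real implementation), a single FastAPI route can look up the target MCP in the registry and relay the JSON-RPC body unchanged to its container URL on `test-net`:

```python
# Hypothetical sketch of the bridge's forwarding idea; names and paths are
# assumptions based on the repo layout, not the actual mini_bridge_v2.py code.
import json
from pathlib import Path

import httpx
from fastapi import FastAPI, HTTPException, Request

app = FastAPI()
REGISTRY = Path("config/mcp_registry.json")  # assumed location inside the container


def load_servers() -> dict:
    """Map each enabled MCP id to its container URL from the registry file."""
    data = json.loads(REGISTRY.read_text())
    return {s["id"]: s["url"] for s in data["servers"] if s.get("enabled")}


@app.post("/{mcp_id}")
async def forward(mcp_id: str, request: Request):
    """Relay the incoming JSON-RPC body unchanged to the matching MCP container."""
    servers = load_servers()
    if mcp_id not in servers:
        raise HTTPException(status_code=404, detail=f"Unknown MCP '{mcp_id}'")
    payload = await request.json()
    async with httpx.AsyncClient() as client:
        resp = await client.post(servers[mcp_id], json=payload, timeout=30.0)
    return resp.json()
```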
## Quick Setup
```bash
git clone https://github.com/yourname/anythingllm-mcp-http-bridge.git
cd anythingllm-mcp-http-bridge
docker compose up --build
```
Bridge log output:
```
[Bridge] Registry reloaded: ['dummy', 'time']
[Bridge] Default route → dummy (200)
```
Then open AnythingLLM → MCP Settings → Add new server: `http://mini-bridge:4100`
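If you want to check the bridge before touching AnythingLLM, a quick script-based smoke test against the exposed port works too. This is only an illustrative sketch: it assumes the bridge is published on `localhost:4100` (as in the curl example further down) and that the Dummy-MCP answers `initialize`, as listed in the module table below.

```python
# Hypothetical smoke test against the bridge's dummy route.
import httpx

resp = httpx.post(
    "http://localhost:4100/dummy",
    json={"jsonrpc": "2.0", "id": 1, "method": "initialize", "params": {}},
    timeout=10.0,
)
# Expect a 200, matching the "[Bridge] Default route → dummy (200)" log line.
print(resp.status_code, resp.json())
```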
## MCP Registry Example
`mini_bridge/config/mcp_registry.json`:

```json
{
  "autoReload": true,
  "servers": [
    {
      "id": "dummy",
      "name": "Dummy MCP",
      "url": "http://dummy-mcp:4200",
      "type": "streamable",
      "enabled": true
    },
    {
      "id": "time",
      "name": "Time MCP",
      "url": "http://mcp-time:4210",
      "type": "streamable",
      "enabled": true
    }
  ]
}
```
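The README does not show how the live reload is implemented; one simple way to get that behaviour (purely illustrative, not the actual bridge code) is to cache the parsed registry and re-read the file whenever its modification time changes, so edits take effect without restarting the container:

```python
# Hypothetical auto-reload sketch: re-parse mcp_registry.json on mtime change.
import json
from pathlib import Path

REGISTRY = Path("mini_bridge/config/mcp_registry.json")  # assumed host-side path
_cache = {"mtime": 0.0, "servers": {}}


def registry() -> dict:
    """Return id -> URL for enabled MCPs, re-reading the file when it changes."""
    mtime = REGISTRY.stat().st_mtime
    if mtime != _cache["mtime"]:
        data = json.loads(REGISTRY.read_text())
        _cache["servers"] = {
            s["id"]: s["url"] for s in data["servers"] if s.get("enabled")
        }
        _cache["mtime"] = mtime
        print(f"[Bridge] Registry reloaded: {list(_cache['servers'])}")
    return _cache["servers"]
```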
## Available MCP Modules
| MCP | Description | Example call |
|---|---|---|
| Dummy-MCP | Minimal test server; responds to `initialize`, `ping`, etc. | `POST /dummy` |
| MCP-Time | Returns current UTC time | `tools/call` → `get_time` |
Test manually:
```bash
curl -X POST http://localhost:4100/time \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"get_time"}}'
```
Response:
```json
{"jsonrpc":"2.0","id":1,"result":{"message":"The current time is 18:42:10 UTC"}}
```
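For reference, an MCP module in the spirit of `mcp_time.py` can stay very small. The following is only a hedged sketch; the method and field names mirror the curl example above rather than the real file:

```python
# Hypothetical time module: a tiny FastAPI app answering JSON-RPC tools/call
# requests for "get_time" with the current UTC time.
from datetime import datetime, timezone

from fastapi import FastAPI, Request

app = FastAPI()


@app.post("/")
async def rpc(request: Request):
    """Return a JSON-RPC result carrying the current UTC time as a message."""
    req = await request.json()
    if req.get("method") == "tools/call" and req.get("params", {}).get("name") == "get_time":
        now = datetime.now(timezone.utc).strftime("%H:%M:%S")
        result = {"message": f"The current time is {now} UTC"}
    else:
        result = {"message": f"Unsupported request: {req.get('method')}"}
    return {"jsonrpc": "2.0", "id": req.get("id"), "result": result}
```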
## Folder Structure
```
├── docker-compose.yml
├── mini_bridge/
│   ├── mini_bridge_v2.py
│   ├── Dockerfile
│   └── config/
│       └── mcp_registry.json
├── dummy_MCP/
│   ├── dummy_mcp.py
│   └── Dockerfile
└── mcp_time/
    ├── mcp_time.py
    └── Dockerfile
```
## Security
Each MCP server runs in its own container on a private Docker network. The bridge is the only exposed interface and accepts JSON-RPC over HTTP, nothing else.
- No direct container access from AnythingLLM
- No Docker-in-Docker
- No shell or file system commands
- Safe, stateless JSON-based communication only
## License
MIT License. Feel free to use, fork, and build upon this project. See the LICENSE file for full details.
## Development Note
I'll keep improving this project as my own independent solution. If the AnythingLLM team finds the idea useful, feel free to build on it or integrate parts of it.
As of now, tool output is still returned in a raw format instead of a fully formatted chat reply; once that's refined, the public release will follow.
## Credits
Created by Danny. Built with ❤️ using FastAPI, Docker, and curiosity.