
MCP server

Overview

dltHub Workspace comes with an MCP server you can run locally and integrate with your preferred IDE. It provides a set of tools for interacting with pipelines and datasets:

  • Explore and describe pipeline schemas
  • Access and explore data in destination tables
  • Write SQL queries, models, and transformations
  • Combine all of the above to get efficient help with writing reports, notebook code, and dlt pipelines themselves

🚧 (in development) We are adding a set of tools that help drill down into pipeline run traces to find possible problems and root causes for incidents.

The MCP server can be started in:

  • Workspace context, where it will see all the pipelines in it
  • Pipeline context, where it is attached to a single pipeline

You can start as many MCP servers as necessary. The default configurations and examples below assume that the workspace and pipeline MCP servers run side by side.

Launch MCP server

The default transport is streamable-http (MCP spec 2025-03-26). The MCP server attaches to the workspace context and the pipelines in it. It must be able to start in the same Python environment and see the same workspace as dlt does when running pipelines. The stdio and legacy sse transports are still supported via the --stdio and --sse flags, but streamable-http is recommended.

To launch the server in workspace context:

dlt workspace mcp

INFO: Started server process [24925]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://127.0.0.1:43654 (Press CTRL+C to quit)

The workspace MCP server uses 43654 as the default port and serves on /mcp. Use the --sse flag to switch to the legacy SSE transport instead.
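If a client cannot use streamable-http, the flags mentioned above switch transports. A minimal sketch using the documented --sse and --stdio flags:

# Legacy SSE transport, for clients that don't support streamable-http yet
dlt workspace mcp --sse

# stdio transport, for clients that spawn the server process themselves
dlt workspace mcp --stdio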

To launch the server in pipeline context:

dlt pipeline fruitshop mcp

Starting dlt MCP server
INFO: Started server process [28972]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://127.0.0.1:43656 (Press CTRL+C to quit)

The pipeline MCP server uses 43656 as the default port. The pipeline is already attached when the MCP server starts. Both pipeline and workspace MCP servers can run side by side.
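To check that a server is up, you can send it the standard MCP initialize request. This is a hedged sketch based on the MCP streamable-http spec rather than a documented dlt command; the endpoint and port match the workspace defaults above:

# Hypothetical smoke test: POST the JSON-RPC initialize request to the workspace server
curl -s http://127.0.0.1:43654/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc": "2.0", "id": 1, "method": "initialize", "params": {"protocolVersion": "2025-03-26", "capabilities": {}, "clientInfo": {"name": "curl-smoke-test", "version": "0.0.0"}}}'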

Configure MCP clients

Cursor, Cline, Claude Desktop

{
  "mcpServers": {
    "dlt-workspace": {
      "url": "http://127.0.0.1:43654/mcp"
    },
    "dlt-pipeline-mcp": {
      "url": "http://127.0.0.1:43656/mcp"
    }
  }
}
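Where this JSON goes depends on the client. As one example, Cursor reads a project-level config from .cursor/mcp.json (a Cursor convention, not a dlt one); a minimal sketch:

# Hedged example: register the workspace server in Cursor's project-level MCP config
mkdir -p .cursor
cat > .cursor/mcp.json <<'EOF'
{
  "mcpServers": {
    "dlt-workspace": {
      "url": "http://127.0.0.1:43654/mcp"
    }
  }
}
EOF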

Continue (local)

name: dlt mcps
version: 0.0.1
schema: v1
mcpServers:
  - name: dlt-workspace
    url: "http://localhost:43654/mcp"

Continue Hub

With Continue, you can install the MCP with one click from Continue Hub or use a local config file. Select Agent mode to enable the MCP server.

See the dltHub page and select the dlt or dltHub Assistant. This bundles the MCP with additional Continue-specific features.

Configure MCP server

The server can be started with the stdio or legacy sse transport, and on a different port, using the command line.
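A short sketch of the above; --stdio and --sse are the documented flags, while the --port flag name here is an assumption based on the sentence above:

# Documented flags: switch the pipeline server to stdio or legacy sse transport
dlt pipeline fruitshop mcp --stdio

# Assumed flag name for serving on a non-default port
dlt workspace mcp --port 8899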

🚧 The feature below is in development and not yet available: The plan is to allow full configuration of MCP via the dlt configuration system.

[workspace.mcp]
path="/mcp"
port=888
[pipelines.fruitshop.mcp]
transport="stdio"

Model Context Protocol

The Model Context Protocol (MCP) is a standard initiated by Anthropic to connect large language models (LLMs) to external data and systems.

In the context of MCP, the client is built into the user-facing application. The most common clients are LLM-enabled IDEs or extensions such as Continue, Cursor, Claude Desktop, Cline, etc. The server is a process that handles requests to interact with external data and systems.

Core constructs

  • Resources are data objects that the client can retrieve and add to the context (i.e., the prompt) of the LLM request. Resources are typically selected manually by the user, though some clients retrieve them automatically.

  • Tools provide a way to execute code and return information to the LLM. Tools are called by the LLM; they can't be selected directly by the user or the client.

  • Prompts are strings, or templated strings, that can be injected into the conversation. Prompts are selected by the user. They provide shortcuts for frequent commands or allow asking the LLM to use specific tools.

This demo runs on GitHub Codespaces, a development environment available for free to anyone with a GitHub account. You'll be asked to fork the demo repository, and from there the README guides you through the next steps.
The demo uses the Continue VSCode extension.

Off to Codespaces!
