dlt is the open-source Python library 10,000s of developers use to build data pipelines. dltHub Pro is the agentic platform that deploys, monitors, and scales them. One command. No manual environment setup. No silent failures.
OPEN SOURCE
The AI-native Python library for data movement. Write any pipeline, run anywhere, no backend needed.
pip install dlt

IN PUBLIC PREVIEW
Launch May 2026

Agents build your dlt pipelines from a prompt. Pro allows you and your agent to deploy them to production with scheduling, alerting, and observability - one command, zero manual setup.
uv run dlt ai init --agent <agent> # <agent>: claude | cursor | codex

"What I didn't expect is how much it unblocks the team. A mid-level engineer can spin up a prototype, browse the raw data in dltHub Pro's local DuckDB workspace, validate the SQL schema - all without pulling in a senior. That loop of prototype, inspect, fix, re-run - that's the real unlock."

Marcello Victorino
Staff Data Engineer, Tasman Analytics

THE AGENTIC DATA WORKFLOW
Build a pipeline that loads CRM contacts and deals into my warehouse using dlt
Agentic Workflows
Not autocomplete, not a chatbot on a dashboard. A guided sequence of skills, commands, rules, and MCP - with guardrails agents can't skip. Maintained by dltHub, which also controls the infrastructure your agents and pipelines operate on.
Agentic Workflows in Detail
See how each workflow guides your agent - step by step, from first prompt to production deployment.
Find a dlt source for a given API or data provider. Use when the user asks about a source, wants to find a connector, or asks to implement a pipeline for a specific data source.
Sonnet 4.6 · REST API Pipeline · ~/pipelines
Pair dltHub Pro with a certified dltHub consulting partner or a dltHub Forward Deployed Engineer. We own the move off your legacy stack onto dlt or dltHub Pro - on a fixed scope, with the pipelines, platform, and enablement your team signs off on.
PyPI downloads per month
Companies loading data into databases with dlt in production
Companies loading into Snowflake with dlt in production
The current machine learning revolution has been enabled by the Cambrian explosion of Python open-source tools that have become so accessible that a wide range of practitioners can use them. As a simple-to-use Python library, dlt is the first tool that this new wave of people can use. By leveraging this library, we can extend the machine learning revolution into enterprise data.

Python and machine learning under security constraints are key to our success. We found that our cloud ETL provider could not meet our needs. dlt is a lightweight yet powerful open source tool we can run together with Snowflake. Our event streaming and batch data loading performs at scale and low cost. Now anyone who knows Python can self-serve to fulfil their data needs.

Free, self-paced course. From first prompt to production deployment.
![An image with the command "pip install "dlt[hub]" in the middle, and logos of REST API sources around it](https://cdn.sanity.io/images/nsq559ov/production/0dc1cafc5f0f7b74e7993f903e7c23227d5edd70-1014x972.png?w=5400&auto=format)
dltHub's agentic workflows come with a REST API toolkit that taps directly into dltHub Context - a hub of deeply researched, enriched context on REST APIs across SaaS sources, databases, and destinations. Your agent pulls exactly what it needs to code any dlt pipeline, in minutes.
We already cover more than 10,100 sources, with a clear path to hundreds of thousands. From prompt to pipeline to live reports in a notebook - all in one agentic flow, with outputs tailored to data users.
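Under the hood, the toolkit targets dlt's declarative REST API source. A sketch of what such a config looks like - the base URL, endpoint paths, and pagination details below are placeholders, not values pulled from dltHub Context:

```python
# Illustrative config in the shape accepted by dlt's declarative
# REST API source. All URLs, paths, and params are placeholders;
# in practice the agent fills these in from dltHub Context.
crm_config = {
    "client": {
        "base_url": "https://api.example-crm.com/v3/",
        "auth": {"token": "<api-token>"},  # read from dlt secrets in real use
        "paginator": {"type": "json_link", "next_url_path": "paging.next"},
    },
    "resources": [
        "contacts",  # shorthand: GET /contacts
        {"name": "deals", "endpoint": {"path": "deals", "params": {"limit": 100}}},
    ],
}

# Passed to dlt's REST API source and run like any other dlt source:
#   from dlt.sources.rest_api import rest_api_source
#   pipeline.run(rest_api_source(crm_config))
```

Because the config is plain data, the agent can generate, inspect, and amend it without touching imperative extraction code.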
Tools like Claude skills or Replit are great for writing and running code. But they are not built for data engineering workflows end to end.
dltHub Pro gives your team complete agentic workflows that cover every phase: coding, running, deploying, and debugging pipelines, on infrastructure you control. Not just a skill, not just an editor, but a guided workflow from first line to production.
dlt is the perfect match between standardization and customization. You get the automation that matters - schema inference, incremental state, normalization, and loading - while keeping the full flexibility and portability of plain Python.
And with agentic dltHub workflows, your team can code, run, deploy, and debug pipelines faster, with reliability at every step.
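To see what "normalization" buys you: nested JSON becomes linked relational tables automatically. A toy illustration in plain Python - this is not dlt's actual normalizer, which also handles typing, naming conventions, and state, but it shows the core idea:

```python
# Toy illustration of child-table normalization, the kind dlt automates.
# Nested lists of objects are split into child tables linked to the parent.
def normalize(record, table, parent_id=None, out=None):
    out = out if out is not None else {}
    row = {}
    if parent_id is not None:
        row["_parent_id"] = parent_id
    for key, value in record.items():
        if isinstance(value, list) and value and isinstance(value[0], dict):
            # A nested list of objects gets its own child table
            for child in value:
                normalize(child, f"{table}__{key}", record.get("id"), out)
        else:
            row[key] = value
    out.setdefault(table, []).append(row)
    return out

tables = normalize(
    {"id": 1, "name": "Acme", "deals": [{"amount": 500}, {"amount": 900}]},
    table="companies",
)
print(sorted(tables))                    # ['companies', 'companies__deals']
print(len(tables["companies__deals"]))   # 2
```

When a source adds a field or nests deeper, the same logic absorbs the change - which is why schema evolution in dlt needs no manual migrations.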
dltHub is the managed platform for deploying and operating data pipelines built with dlt. It provides a runtime, observability, data quality checks, and collaboration features so teams can go from development to production with one command.
dlt (data load tool) is an open-source Python library for building data pipelines. It lets you write any connector, run anywhere, and requires no backend. dlt is Apache 2.0 licensed and always free to use.