dltHub Agentic Workflows
20 skills across 5 toolkits. agentic-workflows.md · dlthub.com/cheatsheet
Getting started
Prerequisite: uv (fast Python package manager).
curl -LsSf https://astral.sh/uv/install.sh | sh

# Install dlt with workspace + hub support
uv pip install --upgrade "dlt[workspace]" && uv pip install "dlt[hub]"
# Set up your workspace (auto-detects your coding assistant)
uv run dlt ai init
# Or specify your agent explicitly:
# uv run dlt ai init --agent claude
# uv run dlt ai init --agent cursor
# uv run dlt ai init --agent codex

Skills
Find a dlt source for a given API or data provider. Use when the user asks about a source, wants to find a connector, or asks to implement a pipeline for a specific data source.
Create a dlt REST API pipeline. Use for the rest_api core source, or any generic REST/HTTP API source.
Add a new REST API endpoint/resource to an existing dlt pipeline.
Adjust a working dlt pipeline for production — remove dev limits, verify pagination, configure incremental loading.
Validate schema and data after a successful dlt pipeline load. Check row counts, schema, data types.
Query, explore, or view data loaded by a dlt pipeline. Covers dlt dataset API, ibis, and ReadableRelation.
Debug and inspect a dlt pipeline after running it. Inspect traces, load packages, schema, and diagnose errors.
Verify dlt workspace is ready for dltHub Runtime. Use when deploying for the first time.
Prepare production credentials and destinations. Set up profile-scoped secrets and production destinations.
Deploy dlt pipelines to dltHub Runtime. Assumes workspace is verified and credentials are set.
Debug a failed or misbehaving dltHub Runtime deployment. Check job status and logs.
Explore dlt pipeline data locally. Connect, profile tables, plan charts with ibis + altair, and write an analysis plan.
Assemble a marimo notebook from analysis_plan.md. Reads chart specs, generates Python file, validates, and launches.
Annotate dlt pipeline sources for transformation. Map data sources to canonical business concepts.
Build a business entity graph from annotated sources and taxonomy.
Generate a Canonical Data Model using Kimball dimensional modeling. Star schema from your ontology.
Write @dlt.hub.transformation functions that map source tables to CDM entities using ibis.
Improve existing skills based on the current session. Capture debugging patterns, doc references, workflow improvements.
Safely manage dlt secrets in *.secrets.toml. Uses MCP tools to never expose raw values.
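The skill manages entries in files following dlt's secrets layout; a minimal sketch where the "github" source and Postgres destination are illustrative:

```toml
# .dlt/secrets.toml — never commit this file.
# Section and key names below are illustrative.
[sources.github]
access_token = "..."

[destination.postgres.credentials]
host = "localhost"
username = "loader"
password = "..."
database = "dlt_data"
```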
Route users to the right toolkit and skill. Use when the user asks "what can you do" or "where do I start".