Command line interface
This page is for dlt+, which requires a license. Join our early access program for a trial license.
Command Line Interface Reference
This page contains all commands available in the dlt CLI and is generated automatically from the fully populated Python argparse object of dlt.
Flags and positional commands are inherited from the parent command. Position within the command string is important. For example, if you want to enable debug mode on the pipeline command, you need to add the debug flag to the base dlt command:
dlt --debug pipeline
Adding the flag after the pipeline keyword will not work.
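For illustration, a full invocation with a hypothetical pipeline name:
dlt --debug pipeline chess_pipeline trace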
dlt
Creates, adds, inspects and deploys dlt pipelines. Further help is available at https://dlthub.com/docs/reference/command-line-interface.
Usage
dlt [-h] [--version] [--disable-telemetry] [--enable-telemetry]
[--non-interactive] [--debug]
{transformation,source,project,profile,pipeline,license,destination,dbt,dataset,cache,telemetry,schema,init,render-docs,deploy}
...
Options
-h, --help
- Show this help message and exit
--version
- Show program's version number and exit
--disable-telemetry
- Disables telemetry before the command is executed
--enable-telemetry
- Enables telemetry before the command is executed
--non-interactive
- Non-interactive mode. Default choices are automatically made for confirmations and prompts.
--debug
- Displays full stack traces on exceptions. Useful for debugging if the output is not clear enough.
Available subcommands
transformation
- Run transformations in a dlt+ project. Experimental.
source
- Manage dlt+ project sources
project
- Manage dlt+ projects
profile
- Manage dlt+ project profiles
pipeline
- Operations on pipelines that were run locally
license
- View dlt+ license status
destination
- Manage project destinations
dbt
- dlt+ dbt transformation generator
dataset
- Manage dlt+ project datasets
cache
- Manage dlt+ project local data cache. Experimental.
telemetry
- Shows telemetry status
schema
- Shows, converts and upgrades schemas
init
- Creates a pipeline project in the current folder by adding an existing verified source or creating a new one from a template.
render-docs
- Renders a markdown version of the CLI docs
deploy
- Creates a deployment package for a selected pipeline script
dlt transformation
Run transformations in a dlt+ project. Experimental.
Usage
dlt transformation [-h] [--project PROJECT] [--profile PROFILE] pond_name
{list,info,run,verify-inputs,verify-outputs,populate,flush,transform,populate-state,flush-state,render-t-layer}
...
Description
Commands to run transformations on the local cache in dlt+ projects. This is an experimental feature and will change substantially in the future. Do not use in production.
Inherits arguments from dlt.
Positional arguments
pond_name
- Name of the transformation; use '.' for the first one found in the project.
Options
-h, --help
- Show this help message and exit
--project PROJECT
- Name or path to the dlt package with dlt.yml
--profile PROFILE
- Profile to use from the project configuration file
Available subcommands
list
- List all transformations discovered in this directory
info
- Transformation info: locations, cache status etc.
run
- Sync cache, run transformation and commit the outputs
verify-inputs
- Verify that cache can connect to all defined inputs and that tables declared are available
verify-outputs
- Verify that the output cache dataset contains all tables declared
populate
- Sync data from inputs to input cache dataset
flush
- Flush data from output cache dataset to outputs
transform
- Run transformations on input cache dataset and write to output cache dataset
populate-state
- Populate transformation state from defined output
flush-state
- Flush transformation state to defined output
render-t-layer
- Render a starting point for the t-layer
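For example, to sync the cache, run the first transformation found in the project, and commit the outputs, you can pass '.' as the name:
dlt transformation . run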
dlt transformation list
List all transformations discovered in this directory.
Usage
dlt transformation pond_name list [-h]
Description
List all transformations discovered in this directory.
dlt transformation info
Transformation info: locations, cache status etc.
Usage
dlt transformation pond_name info [-h]
Description
Transformation info: locations, cache status etc.
dlt transformation run
Sync cache, run transformation and commit the outputs.
Usage
dlt transformation pond_name run [-h]
Description
Sync cache, run transformation and commit the outputs.
dlt transformation verify-inputs
Verify that cache can connect to all defined inputs and that tables declared are available.
Usage
dlt transformation pond_name verify-inputs [-h]
Description
Verify that cache can connect to all defined inputs and that tables declared are available.
dlt transformation verify-outputs
Verify that the output cache dataset contains all tables declared.
Usage
dlt transformation pond_name verify-outputs [-h]
Description
Verify that the output cache dataset contains all tables declared.
dlt transformation populate
Sync data from inputs to input cache dataset.
Usage
dlt transformation pond_name populate [-h]
Description
Sync data from inputs to input cache dataset.
dlt transformation flush
Flush data from output cache dataset to outputs.
Usage
dlt transformation pond_name flush [-h]
Description
Flush data from output cache dataset to outputs.
dlt transformation transform
Run transformations on input cache dataset and write to output cache dataset.
Usage
dlt transformation pond_name transform [-h]
Description
Run transformations on input cache dataset and write to output cache dataset.
dlt transformation populate-state
Populate transformation state from defined output.
Usage
dlt transformation pond_name populate-state [-h]
Description
Populate transformation state from defined output.
dlt transformation flush-state
Flush transformation state to defined output.
Usage
dlt transformation pond_name flush-state [-h]
Description
Flush transformation state to defined output.
dlt transformation render-t-layer
Render a starting point for the t-layer.
Usage
dlt transformation pond_name render-t-layer [-h]
Description
Render a starting point for the t-layer.
dlt source
Manage dlt+ project sources.
Usage
dlt source [-h] [--project PROJECT] [--profile PROFILE] [source_name]
{check,list,add} ...
Description
Commands to manage sources for a project. Run without arguments to list all sources in the current project.
Inherits arguments from dlt.
Positional arguments
source_name
- Name of the source to add.
Options
-h, --help
- Show this help message and exit
--project PROJECT
- Name or path to the dlt package with dlt.yml
--profile PROFILE
- Profile to use from the project configuration file
Available subcommands
dlt source check
(Temporary feature) Checks if the source is importable; only works for sources within the sources folder.
Usage
dlt source [source_name] check [-h]
Description
(Temporary feature) Checks if the source is importable; only works for sources within the sources folder.
dlt source list
List all sources in the project.
Usage
dlt source [source_name] list [-h]
Description
List all sources in the project context.
dlt source add
Add a new source to the project.
Usage
dlt source [source_name] add [-h] [source_type]
Description
Add a new source to the project context.
- If source type is not specified, the source type will be the same as the source name.
- If a given source_type is not found, a default source template will be used.
Inherits arguments from dlt source.
Positional arguments
source_type
- Type of the source to add. If not specified, the source type will be the same as the source name.
Options
-h, --help
- Show this help message and exit
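For example, a sketch with a hypothetical source name, using the rest_api source type:
dlt source github_issues add rest_api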
dlt project
Manage dlt+ projects.
Usage
dlt project [-h] [--project PROJECT] [--profile PROFILE]
{config,clean,init,list,info,audit} ...
Description
Commands to manage dlt+ projects. Run without arguments to list all projects in scope.
Inherits arguments from dlt.
Options
-h, --help
- Show this help message and exit
--project PROJECT
- Name or path to the dlt package with dlt.yml
--profile PROFILE
- Profile to use from the project configuration file
Available subcommands
config
- Configuration management commands
clean
- Cleans local data for the selected profile. If tmp_dir is defined in the project file, it gets deleted. Pipeline and transformation working directories are also deleted by default. Data in remote destinations is not affected.
init
- Initialize a new dlt+ project
list
- List all projects that could be found in installed dlt packages
info
- List basic project info of current project.
audit
- Creates and locks a resource and secrets audit for the current profile.
dlt project config
Configuration management commands.
Usage
dlt project config [-h] {validate,show} ...
Description
Configuration management commands.
Inherits arguments from dlt project.
Positional arguments
validate
- Validate configuration file
show
- Show configuration
Options
-h, --help
- Show this help message and exit
dlt project config validate
Validate configuration file.
Usage
dlt project config validate [-h]
Description
Validate configuration file.
dlt project config show
Show configuration.
Usage
dlt project config show [-h] [--format {json,yaml}] [--section SECTION]
Description
Show configuration.
Inherits arguments from dlt project config.
Options
-h, --help
- Show this help message and exit
--format {json,yaml}
- Output format
--section SECTION
- Show specific configuration section (e.g., sources, pipelines)
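For example, to print only the sources section as YAML:
dlt project config show --format yaml --section sources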
dlt project clean
Cleans local data for the selected profile. If tmp_dir is defined in the project file, it gets deleted. Pipeline and transformation working directories are also deleted by default. Data in remote destinations is not affected.
Usage
dlt project clean [-h] [--skip-data-dir]
Description
Cleans local data for the selected profile. If tmp_dir is defined in the project file, it gets deleted. Pipeline and transformation working directories are also deleted by default. Data in remote destinations is not affected.
Inherits arguments from dlt project.
Options
-h, --help
- Show this help message and exit
--skip-data-dir
- Do not delete pipeline and transformation working directories.
dlt project init
Initialize a new dlt+ project.
Usage
dlt project init [-h] [--project-name PROJECT_NAME] [--package] [--force]
[source] [destination]
Description
Initialize a new dlt+ project.
Inherits arguments from dlt project.
Positional arguments
source
- Name of a source for your dlt project
destination
- Name of a destination for your dlt project
Options
-h, --help
- Show this help message and exit
--project-name PROJECT_NAME, -n PROJECT_NAME
- Optional name of your dlt project
--package
- Create a pip package instead of a flat project
--force
- Overwrite the project even if it already exists
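For example, a sketch assuming the rest_api source and duckdb destination with a hypothetical project name:
dlt project init rest_api duckdb --project-name my_project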
dlt project list
List all projects that could be found in installed dlt packages.
Usage
dlt project list [-h]
Description
List all projects that could be found in installed dlt packages.
dlt project info
List basic project info of current project.
Usage
dlt project info [-h]
Description
List basic project info of current project.
dlt project audit
Creates and locks a resource and secrets audit for the current profile.
Usage
dlt project audit [-h]
Description
Creates and locks a resource and secrets audit for the current profile.
dlt profile
Manage dlt+ project profiles.
Usage
dlt profile [-h] [--project PROJECT] [--profile PROFILE] [profile_name]
{info,list,add,pin} ...
Description
Commands to manage profiles for a project. Run without arguments to list all profiles, the default profile, and the pinned profile in the current project.
Inherits arguments from dlt.
Positional arguments
profile_name
- Name of the profile to add
Options
-h, --help
- Show this help message and exit
--project PROJECT
- Name or path to the dlt package with dlt.yml
--profile PROFILE
- Profile to use from the project configuration file
Available subcommands
dlt profile info
Show information about profile settings.
Usage
dlt profile [profile_name] info [-h]
Description
Show information about the current profile.
dlt profile list
Show list of all profiles in the project.
Usage
dlt profile [profile_name] list [-h]
Description
Show list of all profiles in the project.
dlt profile add
Add a new profile to the project.
Usage
dlt profile [profile_name] add [-h]
Description
Add a new profile to the project.
dlt profile pin
Pin a profile to the project.
Usage
dlt profile [profile_name] pin [-h]
Description
Pin a profile to the project; this will be the new default profile while it is pinned.
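For example, with a hypothetical profile name:
dlt profile prod pin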
dlt pipeline
Operations on pipelines that were run locally.
Usage
dlt pipeline [-h] [--project PROJECT] [--profile PROFILE] [--list-pipelines]
[--hot-reload] [--pipelines-dir PIPELINES_DIR] [--verbose] [pipeline_name]
{info,show,failed-jobs,drop-pending-packages,sync,trace,schema,drop,load-package,list,add,run}
...
Description
The dlt pipeline command provides a set of commands to inspect the pipeline working directory, tables, and data in the destination, and to check for problems encountered during data loading. Run without arguments to list all pipelines in the current project.
Inherits arguments from dlt.
Positional arguments
pipeline_name
- Pipeline name
Options
-h, --help
- Show this help message and exit
--project PROJECT
- Name or path to the dlt package with dlt.yml
--profile PROFILE
- Profile to use from the project configuration file
--list-pipelines, -l
- List local pipelines
--hot-reload
- Reload the Streamlit app (for core development)
--pipelines-dir PIPELINES_DIR
- Pipelines working directory
--verbose, -v
- Provides more information for certain commands.
Available subcommands
info
- Displays state of the pipeline, use -v or -vv for more info
show
- Generates and launches a Streamlit app with the loading status and dataset explorer
failed-jobs
- Displays information on all the failed loads in all completed packages, failed jobs and associated error messages
drop-pending-packages
- Deletes all extracted and normalized packages, including those that are partially loaded
sync
- Drops the local state of the pipeline, resets all the schemas, and restores them from the destination. The destination state, data and schemas are left intact.
trace
- Displays last run trace, use -v or -vv for more info
schema
- Displays default schema
drop
- Selectively drop tables and reset state
load-package
- Displays information on load package, use -v or -vv for more info
list
- List all pipelines in the project
add
- Add a new pipeline to the current project
run
- Run a pipeline
dlt pipeline info
Displays state of the pipeline, use -v or -vv for more info.
Usage
dlt pipeline [pipeline_name] info [-h]
Description
Displays the content of the working directory of the pipeline: dataset name, destination, list of schemas, resources in schemas, list of completed and normalized load packages, and optionally a pipeline state set by the resources during the extraction process.
dlt pipeline show
Generates and launches Streamlit app with the loading status and dataset explorer.
Usage
dlt pipeline [pipeline_name] show [-h]
Description
Generates and launches Streamlit (https://streamlit.io/) app with the loading status and dataset explorer.
This is a simple app that you can use to inspect the schemas and data in the destination as well as your pipeline state and loading status/stats. It should be executed from the same folder from which you ran the pipeline script to access destination credentials.
Requires streamlit to be installed in the current environment: pip install streamlit.
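A minimal sketch, assuming a hypothetical pipeline named chess_pipeline was run from this folder:
pip install streamlit
dlt pipeline chess_pipeline show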
dlt pipeline failed-jobs
Displays information on all the failed loads in all completed packages, failed jobs and associated error messages.
Usage
dlt pipeline [pipeline_name] failed-jobs [-h]
Description
This command scans all the load packages looking for failed jobs and then displays information on files that got loaded and the failure message from the destination.
dlt pipeline drop-pending-packages
Deletes all extracted and normalized packages including those that are partially loaded.
Usage
dlt pipeline [pipeline_name] drop-pending-packages [-h]
Description
Removes all extracted and normalized packages in the pipeline's working dir. dlt keeps extracted and normalized load packages in the pipeline working directory. When the run method is called, it will attempt to normalize and load pending packages first. The command above removes such packages. Note that the pipeline state is not reverted to the state at which the deleted packages were created. Using dlt pipeline ... sync is recommended if your destination supports state sync.
dlt pipeline sync
Drops the local state of the pipeline, resets all the schemas, and restores them from the destination. The destination state, data and schemas are left intact.
Usage
dlt pipeline [pipeline_name] sync [-h] [--destination DESTINATION]
[--dataset-name DATASET_NAME]
Description
This command will remove the pipeline working directory with all pending packages, unsynchronized state changes, and schemas, and retrieve the last synchronized data from the destination. If you drop the dataset the pipeline is loading to, this command results in a complete reset of the pipeline state.
In case of a pipeline without a working directory, the command may be used to create one from the destination. In order to do that, you need to pass the dataset name and destination name to the CLI and provide the credentials to connect to the destination (i.e., in .dlt/secrets.toml) placed in the folder where you execute the pipeline sync command.
Inherits arguments from dlt pipeline.
Options
-h, --help
- Show this help message and exit
--destination DESTINATION
- Sync from this destination when local pipeline state is missing.
--dataset-name DATASET_NAME
- Dataset name to sync from when local pipeline state is missing.
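For example, a sketch restoring a hypothetical pipeline from its destination:
dlt pipeline chess_pipeline sync --destination duckdb --dataset-name chess_data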
dlt pipeline trace
Displays last run trace, use -v or -vv for more info.
Usage
dlt pipeline [pipeline_name] trace [-h]
Description
Displays the trace of the last pipeline run, containing the start date of the run, elapsed time, and the same information for all the steps (extract, normalize, and load). If any of the steps failed, you'll see the message of the exceptions that caused that problem. Successful load and run steps will display the load info instead.
dlt pipeline schema
Displays default schema.
Usage
dlt pipeline [pipeline_name] schema [-h] [--format {json,yaml}]
[--remove-defaults]
Description
Displays the default schema for the selected pipeline.
Inherits arguments from dlt pipeline.
Options
-h, --help
- Show this help message and exit
--format {json,yaml}
- Display schema in this format
--remove-defaults
- Does not show default hint values
dlt pipeline drop
Selectively drop tables and reset state.
Usage
dlt pipeline [pipeline_name] drop [-h] [--destination DESTINATION]
[--dataset-name DATASET_NAME] [--drop-all] [--state-paths [STATE_PATHS ...]]
[--schema SCHEMA_NAME] [--state-only] [resources ...]
Description
Selectively drop tables and reset state.
dlt pipeline <pipeline name> drop [resource_1] [resource_2]
Drops tables generated by selected resources and resets the state associated with them. Mainly used to force a full refresh on selected tables. In the example below, we drop all tables generated by the repo_events resource in the GitHub pipeline:
dlt pipeline github_events drop repo_events
dlt will inform you of the names of dropped tables and the resource state slots that will be reset:
About to drop the following data in dataset airflow_events_1 in destination dlt.destinations.duckdb:
Selected schema:: github_repo_events
Selected resource(s):: ['repo_events']
Table(s) to drop:: ['issues_event', 'fork_event', 'pull_request_event', 'pull_request_review_event', 'pull_request_review_comment_event', 'watch_event', 'issue_comment_event', 'push_event__payload__commits', 'push_event']
Resource(s) state to reset:: ['repo_events']
Source state path(s) to reset:: []
Do you want to apply these changes? [y/N]
As a result of the command above, the following will happen:
- All the indicated tables will be dropped in the destination. Note that dlt drops the nested tables as well.
- All the indicated tables will be removed from the indicated schema.
- The state for the resource repo_events was found and will be reset.
- New schema and state will be stored in the destination.
The drop command accepts several advanced settings:
- You can use regexes to select resources. Prepend the re: string to indicate a regex pattern. The example below will select all resources starting with repo:
dlt pipeline github_events drop "re:^repo"
- You can drop all tables in the indicated schema:
dlt pipeline chess drop --drop-all
- You can indicate additional state slots to reset by passing a JsonPath to the source state. In the example below, we reset the archives slot in the source state:
dlt pipeline chess_pipeline drop --state-paths archives
This will select the archives key in the chess source.
{
"sources":{
"chess": {
"archives": [
"https://api.chess.com/pub/player/magnuscarlsen/games/2022/05"
]
}
}
}
This command is still experimental and the interface will most probably change.
Inherits arguments from dlt pipeline.
Positional arguments
resources
- One or more resources to drop. Can be exact resource name(s) or regex pattern(s). Regex patterns must start with re:
Options
-h, --help
- Show this help message and exit
--destination DESTINATION
- Sync from this destination when local pipeline state is missing.
--dataset-name DATASET_NAME
- Dataset name to sync from when local pipeline state is missing.
--drop-all
- Drop all resources found in the schema. Supersedes the [resources] argument.
--state-paths [STATE_PATHS ...]
- State keys or JSON paths to drop
--schema SCHEMA_NAME
- Schema name to drop from (if other than the default schema).
--state-only
- Only wipe state for matching resources without dropping tables.
dlt pipeline load-package
Displays information on load package, use -v or -vv for more info.
Usage
dlt pipeline [pipeline_name] load-package [-h] [load-id]
Description
Shows information on a load package with a given load_id. The load_id parameter defaults to the most recent package. Package information includes its state (COMPLETED/PROCESSED) and a list of all jobs in a package with their statuses, file sizes, types, and, in case of failed jobs, the error messages from the destination. With the verbose flag set (dlt pipeline -v ...), you can also see the list of all tables and columns created at the destination during the loading of that package.
Inherits arguments from dlt pipeline.
Positional arguments
load-id
- Load id of a completed or normalized package. Defaults to the most recent package.
Options
-h, --help
- Show this help message and exit
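For example, to inspect the most recent package of a hypothetical pipeline with verbose output:
dlt pipeline -v chess_pipeline load-package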
dlt pipeline list
List all pipelines in the project.
Usage
dlt pipeline [pipeline_name] list [-h]
Description
List all pipelines in the project.
dlt pipeline add
Add a new pipeline to the current project.
Usage
dlt pipeline [pipeline_name] add [-h] [--dataset-name DATASET_NAME] source_name
destination_name
Description
Adds a new pipeline to the current project. This will not create any sources or destinations; you can reference other entities by name.
Inherits arguments from dlt pipeline.
Positional arguments
source_name
- Name of the source to add
destination_name
- Name of the destination to add
Options
-h, --help
- Show this help message and exit
--dataset-name DATASET_NAME
- Name of the dataset to add
dlt pipeline run
Run a pipeline.
Usage
dlt pipeline [pipeline_name] run [-h] [--limit LIMIT] [--resources RESOURCES]
Description
Run a pipeline.
Inherits arguments from dlt pipeline.
Options
-h, --help
- Show this help message and exit
--limit LIMIT
- Limits the number of extracted pages for all resources. See source.add_limit.
--resources RESOURCES
- Comma-separated list of resource names.
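For example, a limited test run of a hypothetical pipeline with two selected resources:
dlt pipeline my_pipeline run --limit 10 --resources players,games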
dlt license
View dlt+ license status.
Usage
dlt license [-h] {show,scopes} ...
Description
View dlt+ license status.
dlt license show
Show the installed license.
Usage
dlt license show [-h]
Description
Show the installed license.
dlt license scopes
Show available scopes.
Usage
dlt license scopes [-h]
Description
Show available scopes.
dlt destination
Manage project destinations.
Usage
dlt destination [-h] [--project PROJECT] [--profile PROFILE] [destination_name]
{list,list-available,add} ...
Description
Commands to manage destinations for a project. Run without arguments to list all destinations in the current project.
Inherits arguments from dlt.
Positional arguments
destination_name
- Name of the destination
Options
-h, --help
- Show this help message and exit
--project PROJECT
- Name or path to the dlt package with dlt.yml
--profile PROFILE
- Profile to use from the project configuration file
Available subcommands
list
- List all destinations in the project.
list-available
- List all destination types that can be added to the project.
add
- Add a new destination to the project
dlt destination list
List all destinations in the project.
Usage
dlt destination [destination_name] list [-h]
Description
List all destinations in the project.
dlt destination list-available
List all destination types that can be added to the project.
Usage
dlt destination [destination_name] list-available [-h]
Description
List all destination types that can be added to the project.
dlt destination add
Add a new destination to the project.
Usage
dlt destination [destination_name] add [-h] [--dataset-name DATASET_NAME]
[destination_type]
Description
Add a new destination to the project.
Inherits arguments from dlt destination.
Positional arguments
destination_type
- Will default to the destination name if not specified.
Options
-h, --help
- Show this help message and exit
--dataset-name DATASET_NAME
- Name of the dataset to add in the datasets section. Will add no dataset if not specified.
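For example, a sketch with a hypothetical destination name and dataset, using the duckdb destination type:
dlt destination my_warehouse add duckdb --dataset-name analytics_data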
dlt dbt
dlt+ dbt transformation generator.
Usage
dlt dbt [-h] {generate} ...
Description
dlt+ dbt transformation generator.
dlt dbt generate
Generate dbt project.
Usage
dlt dbt generate [-h] [--include_dlt_tables] [--fact [FACT]] [--force]
[--mart_table_prefix [MART_TABLE_PREFIX]] pipeline_name
Description
Generate dbt project.
Inherits arguments from dlt dbt.
Positional arguments
pipeline_name
- The pipeline to create a dbt project for
Options
-h, --help
- Show this help message and exit
--include_dlt_tables
- Do not render _dlt tables
--fact [FACT]
- Create a fact table for a given table
--force
- Force overwrite of existing files
--mart_table_prefix [MART_TABLE_PREFIX]
- Prefix for mart tables
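For example, a sketch with hypothetical pipeline and table names:
dlt dbt generate my_pipeline --fact orders --mart_table_prefix mart_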
dlt dataset
Manage dlt+ project datasets.
Usage
dlt dataset [-h] [--project PROJECT] [--profile PROFILE] [--destination
DESTINATION] [--schema SCHEMA] [dataset-name]
{list,info,drop,show,row-counts,head} ...
Description
Commands to manage datasets for a project. Run without arguments to list all datasets in the current project.
Inherits arguments from dlt.
Positional arguments
dataset-name
- Dataset name
Options
-h, --help
- Show this help message and exit
--project PROJECT
- Name or path to the dlt package with dlt.yml
--profile PROFILE
- Profile to use from the project configuration file
--destination DESTINATION
- Destination name, if many are allowed
--schema SCHEMA
- Limits to the schema with this name for multi-schema datasets
Available subcommands
dlt dataset list
List Datasets.
Usage
dlt dataset [dataset-name] list [-h]
Description
List Datasets.
dlt dataset info
Dataset info.
Usage
dlt dataset [dataset-name] info [-h]
Description
Dataset info.
dlt dataset drop
Drops the dataset and all data in it.
Usage
dlt dataset [dataset-name] drop [-h]
Description
Drops the dataset and all data in it.
dlt dataset show
Shows the content of dataset in Streamlit.
Usage
dlt dataset [dataset-name] show [-h]
Description
Shows the content of dataset in Streamlit.
dlt dataset row-counts
Display the row counts of all tables in the dataset.
Usage
dlt dataset [dataset-name] row-counts [-h]
Description
Display the row counts of all tables in the dataset.
dlt dataset head
Display the first x rows of a table, defaults to 5.
Usage
dlt dataset [dataset-name] head [-h] [--limit LIMIT] table_name
Description
Display the first x rows of a table, defaults to 5.
Inherits arguments from dlt dataset.
Positional arguments
table_name
- Table name
Options
-h, --help
- Show this help message and exit
--limit LIMIT
- Number of rows to display
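For example, a sketch with hypothetical dataset and table names:
dlt dataset my_dataset head my_table --limit 10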
dlt cache
Manage dlt+ project local data cache. Experimental.
Usage
dlt cache [-h] [--project PROJECT] [--profile PROFILE]
{info,show,drop,populate,flush,create-persistent-secrets,clear-persistent-secrets}
...
Description
Commands to manage the local data cache for a dlt+ project. This is an experimental feature and will change substantially in the future. Do not use in production.
Inherits arguments from dlt.
Options
-h, --help
- Show this help message and exit
--project PROJECT
- Name or path to the dlt package with dlt.yml
--profile PROFILE
- Profile to use from the project configuration file
Available subcommands
info
- Shows cache info
show
- Connects to cache engine
drop
- Drop the cache
populate
- Populate the cache from the defined inputs
flush
- Flush the cache to the defined outputs
create-persistent-secrets
- Create persistent secrets on cache for remote access.
clear-persistent-secrets
- Clear persistent secrets from cache for remote access.
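A typical cache round trip, as a sketch: populate from the defined inputs, inspect, then flush to the defined outputs:
dlt cache populate
dlt cache info
dlt cache flush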
dlt cache info
Shows cache info.
Usage
dlt cache info [-h]
Description
Shows cache info.
dlt cache show
Connects to cache engine.
Usage
dlt cache show [-h]
Description
Connects to cache engine.
dlt cache drop
Drop the cache.
Usage
dlt cache drop [-h]
Description
Drop the cache.
dlt cache populate
Populate the cache from the defined inputs.
Usage
dlt cache populate [-h]
Description
Populate the cache from the defined inputs.
dlt cache flush
Flush the cache to the defined outputs.
Usage
dlt cache flush [-h]
Description
Flush the cache to the defined outputs.
dlt cache create-persistent-secrets
Create persistent secrets on cache for remote access.
Usage
dlt cache create-persistent-secrets [-h]
Description
Create persistent secrets on cache for remote access.
dlt cache clear-persistent-secrets
Clear persistent secrets from cache for remote access.
Usage
dlt cache clear-persistent-secrets [-h]
Description
Clear persistent secrets from cache for remote access.
dlt telemetry
Shows telemetry status.
Usage
dlt telemetry [-h]
Description
The dlt telemetry command shows the current status of dlt telemetry. Learn more about telemetry and what we send in our telemetry docs.
dlt schema
Shows, converts and upgrades schemas.
Usage
dlt schema [-h] [--format {json,yaml}] [--remove-defaults] file
Description
The dlt schema command will load, validate and print out a dlt schema: dlt schema path/to/my_schema_file.yaml.
Inherits arguments from dlt.
Positional arguments
file
- Schema file name, in yaml or json format; will autodetect based on extension
Options
-h, --help
- Show this help message and exit
--format {json,yaml}
- Display schema in this format
--remove-defaults
- Does not show default hint values
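For example, to print a schema as JSON without default hint values:
dlt schema --format json --remove-defaults path/to/my_schema_file.yaml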
dlt init
Creates a pipeline project in the current folder by adding an existing verified source or creating a new one from a template.
Usage
dlt init [-h] [--list-sources] [--location LOCATION] [--branch BRANCH] [--eject]
[source] [destination]
Description
The dlt init command creates a new dlt pipeline script that loads data from source to destination. When you run the command, several things happen:
- Creates a basic project structure if the current folder is empty by adding .dlt/config.toml, .dlt/secrets.toml, and .gitignore files.
- Checks if the source argument matches one of our verified sources and, if so, adds it to your project.
- If the source is unknown, uses a generic template to get you started.
- Rewrites the pipeline scripts to use your destination.
- Creates sample config and credentials in secrets.toml and config.toml for the specified source and destination.
- Creates requirements.txt with dependencies required by the source and destination. If one exists, prints instructions on what to add to it.
This command can be used several times in the same folder to add more sources, destinations, and pipelines. It will also update the verified source code to the newest version if run again with an existing source name. You will be warned if files will be overwritten or if the dlt version needs an upgrade to run a particular pipeline.
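For example, to create a pipeline that loads the chess verified source into duckdb:
dlt init chess duckdb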
Inherits arguments from dlt.
Positional arguments
source
- Name of the data source for which to create a pipeline. Adds an existing verified source or creates a new pipeline template if a verified source for your data source is not yet implemented.
destination
- Name of a destination, e.g. bigquery or redshift
Options
-h, --help
- Show this help message and exit
--list-sources, -l
- Shows all available verified sources and their short descriptions. For each source, it checks if your local dlt version requires an update and prints the relevant warning.
--location LOCATION
- Advanced. Uses a specific url or local path to the verified sources repository.
--branch BRANCH
- Advanced. Uses a specific branch of the verified sources repository to fetch the template.
--eject
- Ejects the source code of core sources like sql_database or rest_api so they are editable by you.
dlt render-docs
Renders markdown version of cli docs.
Usage
dlt render-docs [-h] [--compare] file_name
Description
The dlt render-docs command renders a markdown version of the CLI docs by parsing the argparse help output and generating a markdown file. If you are reading this on the docs website, you are looking at the rendered version of the CLI docs generated by this command.
Inherits arguments from dlt.
Positional arguments
file_name
- Output file name
Options
-h, --help
- Show this help message and exit
--compare
- Compare the changes and raise if the output would be updated
dlt deploy
Creates a deployment package for a selected pipeline script.
Usage
dlt deploy [-h] pipeline-script-path
Description
The dlt deploy command prepares your pipeline for deployment and gives you step-by-step instructions on how to accomplish it. To enable this functionality, please first execute pip install "dlt[cli]", which will add additional packages to the current environment.
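A sketch, assuming the github-action deployment method known from open-source dlt and a hypothetical script path:
pip install "dlt[cli]"
dlt deploy chess_pipeline.py github-action --schedule "*/30 * * * *"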
Inherits arguments from dlt.
Positional arguments
pipeline-script-path
- Path to a pipeline script
Options
-h, --help
- Show this help message and exit