
dlt.destinations.decorators

destination

def destination(
func: Optional[AnyFun] = None,
loader_file_format: TLoaderFileFormat = None,
batch_size: int = 10,
name: str = None,
naming_convention: str = "direct",
skip_dlt_columns_and_tables: bool = True,
max_table_nesting: int = 0,
spec: Type[CustomDestinationClientConfiguration] = None,
max_parallel_load_jobs: Optional[int] = None,
loader_parallelism_strategy: Optional[TLoaderParallelismStrategy] = None
) -> Any


A decorator that transforms a function taking two positional arguments, "table" and "items", plus any number of keyword arguments with defaults, into a callable that creates a custom destination. The function should not return anything; the keyword arguments can be configuration and secrets values.

Example Usage with Configuration and Secrets:

Here, all incoming data is sent to the destination function with the items in the requested format, together with the dlt table schema. The config and secret values are resolved from the paths destination.my_destination.api_url and destination.my_destination.api_secret.


@dlt.destination(batch_size=100, loader_file_format="parquet")
def my_destination(items, table, api_url: str = dlt.config.value, api_secret=dlt.secrets.value):
    print(table["name"])
    print(items)

p = dlt.pipeline("chess_pipeline", destination=my_destination)
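
For reference, a minimal sketch of how these values could be provided in .dlt/secrets.toml; the section and key names mirror the resolution paths above, while the values are placeholders:

[destination.my_destination]
api_url = "https://api.example.com"
api_secret = "please set me up!"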

Arguments:

  • func Optional[AnyFun] - A function that takes two positional arguments "table" and "items" and any number of keyword arguments with defaults which will process the incoming data.

  • loader_file_format TLoaderFileFormat - defines the format in which files are stored in the load package before being sent to the destination function; this can be "puae-jsonl" or "parquet".

  • batch_size int - defines how many items per function call are batched together and sent as an array. If you set a batch size of 0, instead of receiving the actual data items, you will get one call per load job with the path of the file as the items argument; you can then open and process that file in any way you like (see the sketch after this list).

  • name str - defines the name of the destination that gets created by the destination decorator; defaults to the name of the decorated function.

  • naming_convention str - defines the naming convention used by the destination created by the decorator. This controls how table and column names are normalized. The default is "direct", which keeps all names unchanged.

  • skip_dlt_columns_and_tables bool - defines whether internal dlt tables and columns are skipped and therefore not fed into the custom destination function. This is set to True by default.

  • max_table_nesting int - defines how deep the normalizer will go to normalize nested fields in your data into subtables. This overrides any setting on your source and defaults to 0, so no nested tables are created.

  • spec Type[CustomDestinationClientConfiguration] - defines a configuration spec that will be used to inject arguments into the decorated function. Arguments not present in the spec will not be injected.

  • max_parallel_load_jobs Optional[int] - the maximum number of load jobs that will run concurrently during the load.

  • loader_parallelism_strategy Optional[TLoaderParallelismStrategy] - can be "sequential", which is equivalent to max_parallel_load_jobs=1, "table-sequential", where each table has at most one load job at any given time, or "parallel". See the combined example after the Returns section.
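
As referenced in the batch_size entry, here is a minimal sketch of a destination that works on whole files, assuming batch_size=0 so the function receives the local path of each load package file as the items argument (the function name and the processing logic are illustrative):

import dlt

@dlt.destination(batch_size=0, loader_file_format="parquet")
def copy_files(items, table):
    # with batch_size=0, `items` is the path to this load job's parquet file
    print(f"received file {items} for table {table['name']}")
    # open, move, or upload the file in any way you like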

Returns:

  • Any - A callable that can be used to create a dlt custom destination instance
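
As referenced above, a hedged sketch combining several of the documented options; the destination name, endpoint config key, and print statement are illustrative only:

import dlt

@dlt.destination(
    batch_size=50,
    loader_file_format="puae-jsonl",
    name="my_http_sink",
    max_table_nesting=1,
    max_parallel_load_jobs=2,
    loader_parallelism_strategy="table-sequential",
)
def http_sink(items, table, api_url: str = dlt.config.value):
    # each call receives up to 50 normalized items belonging to `table`
    print(f"would send {len(items)} rows of {table['name']} to {api_url}")

pipeline = dlt.pipeline("http_sink_pipeline", destination=http_sink)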
