Version: 0.5.4

destinations.job_client_impl

SqlLoadJob Objects

class SqlLoadJob(RunnableLoadJob)

A job that executes a SQL statement, without a followup trait.

SqlJobClientBase Objects

class SqlJobClientBase(JobClientBase, WithStateSync)

INFO_TABLES_QUERY_THRESHOLD

Fall back to querying all tables in the information schema when checking more tables than this threshold.

drop_tables

def drop_tables(*tables: str, delete_schema: bool = True) -> None

Drop tables in the destination database and optionally delete the stored schema as well. Clients that support DDL transactions will perform both operations in a single transaction.

Arguments:

  • tables - Names of tables to drop.
  • delete_schema - If True, also delete all versions of the current schema from storage.
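
A minimal usage sketch, assuming a pipeline against a SQL destination (duckdb here) and that destination_client() returns an instance of a SqlJobClientBase subclass; the pipeline, dataset, and table names are illustrative.

```python
import dlt

# illustrative pipeline pointing at a SQL destination
pipeline = dlt.pipeline(
    pipeline_name="demo", destination="duckdb", dataset_name="demo_data"
)

# for SQL destinations the job client is a SqlJobClientBase subclass,
# so drop_tables() is available on it
with pipeline.destination_client() as client:
    # drop two tables but keep the stored schema versions
    client.drop_tables("my_table", "my_table__child", delete_schema=False)
```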

maybe_ddl_transaction

@contextlib.contextmanager
def maybe_ddl_transaction() -> Iterator[None]

Begins a transaction if the sql client supports it; otherwise works in autocommit mode.
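
A minimal sketch of how a derived client could group schema-changing statements with this context manager; MyJobClient and its helper method are illustrative, and the remaining abstract methods of SqlJobClientBase are omitted.

```python
from typing import Sequence

from dlt.destinations.job_client_impl import SqlJobClientBase


class MyJobClient(SqlJobClientBase):
    # illustrative helper; a real client also implements the abstract methods
    def _apply_ddl(self, statements: Sequence[str]) -> None:
        # all statements run in one transaction when the destination supports
        # transactional DDL, otherwise each statement is auto-committed
        with self.maybe_ddl_transaction():
            for statement in statements:
                self.sql_client.execute_sql(statement)
```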

create_table_chain_completed_followup_jobs

def create_table_chain_completed_followup_jobs(
        table_chain: Sequence[TTableSchema],
        completed_table_chain_jobs: Optional[Sequence[LoadJobInfo]] = None
) -> List[FollowupJobRequest]

Creates a list of followup jobs for the merge write disposition and staging replace strategies.
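
An illustrative sketch of a derived client extending the followup list created by the base class; the subclass is hypothetical and type annotations are omitted for brevity.

```python
from dlt.destinations.job_client_impl import SqlJobClientBase


class MyJobClient(SqlJobClientBase):
    def create_table_chain_completed_followup_jobs(
        self, table_chain, completed_table_chain_jobs=None
    ):
        # start from the merge / staging replace followups of the base class
        jobs = super().create_table_chain_completed_followup_jobs(
            table_chain, completed_table_chain_jobs
        )
        # destination specific followup requests (e.g. an optimize/vacuum job)
        # would be appended to `jobs` here
        return jobs
```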

create_load_job

def create_load_job(table: TTableSchema,
                    file_path: str,
                    load_id: str,
                    restore: bool = False) -> LoadJob

Starts a SqlLoadJob for files ending with .sql, or returns None to let derived classes handle their specific jobs.
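
A hypothetical sketch of how a derived client might build on this behavior: let the base class start SqlLoadJob for *.sql files first, then create destination specific jobs for everything else.

```python
from dlt.destinations.job_client_impl import SqlJobClientBase


class MyJobClient(SqlJobClientBase):
    def create_load_job(self, table, file_path, load_id, restore=False):
        # the base class returns a SqlLoadJob for *.sql files and None otherwise
        job = super().create_load_job(table, file_path, load_id, restore)
        if job is None:
            # create the destination specific job here, e.g. a COPY job
            # for the file formats this destination supports
            ...
        return job
```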

get_storage_tables

def get_storage_tables(
        table_names: Iterable[str]
) -> Iterable[Tuple[str, TTableSchemaColumns]]

Uses INFORMATION_SCHEMA to retrieve table and column information for the tables in the table_names iterable. Table names should be normalized according to the naming convention; they will be further converted to the desired casing in order to (in most cases) create a case-insensitive name suitable for searching the information schema.

The column names are returned as they appear in the information schema. To match them with the columns of an existing table, use the schema.get_new_table_columns method and pass the correct casing. Most casing functions are irreversible, so it is not possible to convert identifiers from INFORMATION SCHEMA back into the case-sensitive dlt schema.
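
A minimal usage sketch, under the same assumptions as the drop_tables example above (illustrative pipeline and table names; an empty column mapping is taken to mean the table was not found in storage).

```python
import dlt

pipeline = dlt.pipeline(
    pipeline_name="demo", destination="duckdb", dataset_name="demo_data"
)

with pipeline.destination_client() as client:
    # look up normalized table names in the destination's INFORMATION_SCHEMA
    for name, columns in client.get_storage_tables(["my_table", "my_table__child"]):
        if not columns:
            print(f"{name}: not found in INFORMATION_SCHEMA")
        else:
            print(f"{name}: {list(columns.keys())}")
```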

get_storage_table

def get_storage_table(table_name: str) -> Tuple[bool, TTableSchemaColumns]

Uses get_storage_tables to get the schema of a single table_name.

Returns (True, ...) if the table exists and (False, {}) if it does not.
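
A short sketch of the single-table variant, reusing the assumed open job client from the example above.

```python
# assuming the same open job client `client` as in the sketch above
exists, columns = client.get_storage_table("my_table")
if not exists:
    print("my_table is not present in the destination")
else:
    print(f"my_table columns: {list(columns.keys())}")
```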
