Version: devel

dlt.destinations.sql_client

TJobQueryTags Objects

class TJobQueryTags(TypedDict)

View source on GitHub

Applied to the sql client when a job using it starts. Used to tag queries.
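
As a sketch, a query-tag dictionary of this kind might look like the following. The field names here are illustrative assumptions, not necessarily the exact keys dlt defines; check the source link above for the real definition.

```python
from typing import TypedDict

# Hypothetical field set for illustration -- the real TJobQueryTags in dlt
# may define different keys.
class JobQueryTags(TypedDict):
    source: str         # schema (source) name
    resource: str       # resource that produced the job
    table: str          # destination table name
    load_id: str        # id of the load package being processed
    pipeline_name: str  # pipeline running the job

tags: JobQueryTags = {
    "source": "github_events",
    "resource": "issues",
    "table": "issues",
    "load_id": "1693826575.123",
    "pipeline_name": "my_pipeline",
}
```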

SqlClientBase Objects

class SqlClientBase(ABC, Generic[TNativeConn])

View source on GitHub

database_name

Database or catalog name, optional

dataset_name

Normalized dataset name

staging_dataset_name

Normalized staging dataset name

capabilities

Instance of adjusted destination capabilities

drop_tables

def drop_tables(*tables: str) -> None

View source on GitHub

Drops a set of tables if they exist

execute_fragments

def execute_fragments(fragments: Sequence[AnyStr], *args: Any,
**kwargs: Any) -> Optional[Sequence[Sequence[Any]]]

View source on GitHub

Executes several SQL fragments as efficiently as possible to prevent data copying. The default implementation joins the fragments into a single string and executes them together.
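
A minimal sketch of the joining strategy, using sqlite3 as a stand-in driver; the helper name is ours, not dlt's:

```python
import sqlite3
from typing import Sequence

def execute_fragments_default(cursor: sqlite3.Cursor,
                              fragments: Sequence[str]) -> None:
    # Default strategy: concatenate the fragments into one string so the
    # driver receives a single statement, avoiding per-fragment round trips.
    cursor.execute("".join(fragments))

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE t (a INTEGER)")
# The INSERT is assembled from three fragments that only make sense joined.
execute_fragments_default(cur, ["INSERT INTO t ", "VALUES ", "(1), (2)"])
conn.commit()
print(cur.execute("SELECT COUNT(*) FROM t").fetchone()[0])
```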

execute_many

def execute_many(statements: Sequence[str], *args: Any,
**kwargs: Any) -> Optional[Sequence[Sequence[Any]]]

View source on GitHub

Executes multiple SQL statements as efficiently as possible. When the client supports multiple statements in a single query, they are executed together in as few database calls as possible.
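
The two strategies can be sketched as follows, again with sqlite3 standing in for the destination driver (the helper and the `supports_multi` flag are illustrative, not part of dlt's API):

```python
import sqlite3
from typing import Sequence

def execute_many_default(conn: sqlite3.Connection,
                         statements: Sequence[str],
                         supports_multi: bool) -> None:
    if supports_multi:
        # Driver accepts several statements per call: send one batch.
        conn.executescript(";".join(statements))
    else:
        # Fall back to one database call per statement.
        for stmt in statements:
            conn.execute(stmt)

conn = sqlite3.connect(":memory:")
execute_many_default(conn, [
    "CREATE TABLE t (a INTEGER)",
    "INSERT INTO t VALUES (1)",
    "INSERT INTO t VALUES (2)",
], supports_multi=True)
print(conn.execute("SELECT COUNT(*) FROM t").fetchone()[0])
```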

make_qualified_table_name_path

def make_qualified_table_name_path(table_name: Optional[str],
escape: bool = True) -> List[str]

View source on GitHub

Returns a list of path components leading from the catalog to table_name. Used to construct fully qualified names. table_name is optional.
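
A standalone sketch of the idea, assuming double-quote escaping (real escaping is destination-specific, and the function name here is ours):

```python
from typing import List, Optional

def make_qualified_path(catalog: Optional[str],
                        dataset: str,
                        table_name: Optional[str],
                        escape: bool = True) -> List[str]:
    # Path runs catalog -> dataset -> table; missing components are skipped,
    # so omitting table_name yields the path to the dataset itself.
    components = [c for c in (catalog, dataset, table_name) if c is not None]
    if escape:
        # Illustrative escaping: double-quote each identifier.
        components = ['"%s"' % c for c in components]
    return components

print(".".join(make_qualified_path("analytics", "events", "pages")))
```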

get_qualified_table_names

def get_qualified_table_names(table_name: str,
escape: bool = True) -> Tuple[str, str]

View source on GitHub

Returns the qualified names for a table and its corresponding staging table as a tuple.
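
Schematically, and assuming the staging dataset is named with a `_staging` suffix (dlt's default convention), this pairs up as:

```python
from typing import Tuple

def get_qualified_table_names(dataset: str, table_name: str) -> Tuple[str, str]:
    # Assumption for illustration: the staging dataset uses a "_staging"
    # suffix; escaping is omitted to keep the sketch short.
    return (f"{dataset}.{table_name}", f"{dataset}_staging.{table_name}")

print(get_qualified_table_names("events", "pages"))
```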

with_alternative_dataset_name

@contextmanager
def with_alternative_dataset_name(
dataset_name: str) -> Iterator["SqlClientBase[TNativeConn]"]

View source on GitHub

Sets dataset_name as the default dataset for the lifetime of the context. Does not modify any search paths in the existing connection.
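
A schematic re-implementation of the swap-and-restore pattern, which also shows how with_staging_dataset (below) can be expressed in terms of it. The `Client` class and the `_staging` suffix are illustrative assumptions:

```python
from contextlib import contextmanager
from typing import Iterator

class Client:
    def __init__(self, dataset_name: str) -> None:
        self.dataset_name = dataset_name

    @contextmanager
    def with_alternative_dataset_name(self, dataset_name: str) -> Iterator["Client"]:
        # Swap the default dataset for the lifetime of the context, then
        # restore it -- no connection state (e.g. search path) is touched.
        current = self.dataset_name
        self.dataset_name = dataset_name
        try:
            yield self
        finally:
            self.dataset_name = current

    def with_staging_dataset(self):
        # Just the alternative-name switch applied to the staging dataset
        # (illustrative "_staging" suffix).
        return self.with_alternative_dataset_name(self.dataset_name + "_staging")

client = Client("events")
with client.with_staging_dataset() as c:
    print(c.dataset_name)   # staging name while inside the context
print(client.dataset_name)  # original name restored afterwards
```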

with_staging_dataset

def with_staging_dataset() -> ContextManager["SqlClientBase[TNativeConn]"]

View source on GitHub

Temporarily switches the sql client to the staging dataset name.

is_staging_dataset_active

@property
def is_staging_dataset_active() -> bool

View source on GitHub

Checks whether the staging dataset is currently active.

set_query_tags

def set_query_tags(tags: TJobQueryTags) -> None

View source on GitHub

Sets the current schema (source), resource, load_id and table name when a job starts.
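
One common way to apply such tags is to prepend them to queries as a SQL comment; real destinations may instead use engine-specific mechanisms such as session settings. A dependency-free sketch (class and field names are ours):

```python
from typing import Optional, TypedDict

class QueryTags(TypedDict):
    source: str
    resource: str
    table: str
    load_id: str

class TaggedClient:
    def __init__(self) -> None:
        self._tags: Optional[QueryTags] = None

    def set_query_tags(self, tags: Optional[QueryTags]) -> None:
        # Stored tags apply to all queries until cleared with tags=None.
        self._tags = tags

    def render(self, sql: str) -> str:
        if not self._tags:
            return sql
        comment = ", ".join(f"{k}:{v}" for k, v in self._tags.items())
        # Tags travel as a leading SQL comment.
        return f"/* {comment} */ {sql}"

client = TaggedClient()
client.set_query_tags({"source": "s", "resource": "r", "table": "t", "load_id": "1"})
print(client.render("SELECT 1"))
```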

DBApiCursorImpl Objects

class DBApiCursorImpl(DBApiCursor)

View source on GitHub

A DBApi cursor wrapper with data frame reading functionality.

df

def df(chunk_size: int = None, **kwargs: Any) -> Optional[DataFrame]

View source on GitHub

Fetches results as a data frame, in full or in chunks of the specified size.

May use a native pandas/arrow reader if available. Depending on the native implementation, the chunk size may vary.
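
The chunking semantics can be illustrated with plain DB-API fetchmany, using lists in place of real data frames to keep the sketch dependency-free (the helper name is ours):

```python
import sqlite3
from typing import Iterator, List, Tuple

def iter_chunks(cursor: sqlite3.Cursor, chunk_size: int) -> Iterator[List[Tuple]]:
    # With a chunk_size, rows come back in batches; a native reader may
    # round the batch size. A real implementation wraps each batch in a
    # DataFrame before yielding it.
    while True:
        rows = cursor.fetchmany(chunk_size)
        if not rows:
            return
        yield rows

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE t (a INTEGER)")
cur.executemany("INSERT INTO t VALUES (?)", [(i,) for i in range(5)])
cur.execute("SELECT a FROM t")
print([len(chunk) for chunk in iter_chunks(cur, 2)])  # batch sizes
```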

arrow

def arrow(chunk_size: int = None, **kwargs: Any) -> Optional[ArrowTable]

View source on GitHub

Fetches results as an Arrow table, in full or in chunks of the specified size.

May use a native pandas/arrow reader if available. Depending on the native implementation, the chunk size may vary.

iter_df

def iter_df(chunk_size: int) -> Generator[DataFrame, None, None]

View source on GitHub

The default implementation converts Arrow tables to data frames.

iter_arrow

def iter_arrow(chunk_size: int) -> Generator[ArrowTable, None, None]

View source on GitHub

The default implementation converts the query result to Arrow tables.
