
dlt.common.libs.ibis

_DltBackend Objects

class _DltBackend(SQLBackend, NoUrl, NoExampleLoader)

View source on GitHub

Ibis backend that delegates execution to dlt's native SQL engine.

To make Ibis expressions executable, they need to be bound to a table on the backend (i.e., data that exists somewhere). By default, Ibis backends work by creating and maintaining a connection to the backend.

This DltBackend is "lazier" and doesn't open a connection to the backend where the data lives ("backend" in Ibis terms, "destination" in dlt terms). Instead, it uses dlt metadata to register which tables exist and to create bound expressions.

For example, if you use Ibis with the Snowflake backend, you will use Ibis' implementation. If you use the DltBackend with a Snowflake destination, you will use the dlt SQL client for Snowflake. The generated SQL query should be the same because dlt internally uses the ibis -> sqlglot compiler, but the actual execution might differ. This is especially likely when calling .execute() or other methods that return data in memory.
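
A minimal sketch of the overall flow, assuming a local duckdb destination; the pipeline name, dataset name, and the "items" table are illustrative and not part of this API:

import dlt
from dlt.common.libs.ibis import _DltBackend

# Hypothetical pipeline with a single "items" table
pipeline = dlt.pipeline("ibis_demo", destination="duckdb", dataset_name="demo_data")
pipeline.run([{"id": 1, "value": 10}, {"id": 2, "value": 20}], table_name="items")

# Bind the backend to dlt metadata instead of opening an Ibis connection
backend = _DltBackend.from_dataset(pipeline.dataset())

items = backend.table("items")           # bound table expression
expr = items.filter(items.value > 10)    # lazily built Ibis expression

# Execution is delegated to dlt's SQL client for the destination
df = expr.execute()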

from_dataset

@classmethod
def from_dataset(cls, dataset: dlt.Dataset) -> _DltBackend

View source on GitHub

Create an Ibis DltBackend from a dlt.Dataset

This enables dlt.Relation.to_ibis() to create bound tables.
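
A short sketch of both entry points, reusing the hypothetical "items" table from the example above: constructing the backend directly from a dataset, and the user-facing path through dlt.Relation.to_ibis(), which relies on this classmethod.

import dlt
from dlt.common.libs.ibis import _DltBackend

pipeline = dlt.pipeline("ibis_demo", destination="duckdb", dataset_name="demo_data")
dataset = pipeline.dataset()

# Direct construction from the dataset
backend = _DltBackend.from_dataset(dataset)

# Equivalent bound table via the relation API
items = dataset["items"].to_ibis()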

raw_sql

def raw_sql(query: Union[str, sg.Expression], **kwargs: Any) -> Any

View source on GitHub

Execute a SQL string or SQLGlot expression using the dlt destination SQL client
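
A hedged example passing each accepted form, continuing with the hypothetical "items" table and the pipeline from the sketches above (sg is the sqlglot alias used in the signature):

import sqlglot as sg
from dlt.common.libs.ibis import _DltBackend

# pipeline as defined in the earlier sketch
backend = _DltBackend.from_dataset(pipeline.dataset())

# Plain SQL string
backend.raw_sql("SELECT COUNT(*) FROM items")

# SQLGlot expression, which dlt can render for the destination dialect
backend.raw_sql(sg.parse_one("SELECT id, value FROM items WHERE value > 10"))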

list_tables

def list_tables(
    *,
    like: Optional[str] = None,
    database: Union[tuple[str, str], str, None] = None) -> list[str]

View source on GitHub

Return the list of table names
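
A short sketch, assuming the backend constructed in the earlier examples; like follows the usual Ibis pattern-matching semantics:

# backend as constructed in the earlier sketches
print(backend.list_tables())              # all tables known to the dataset
print(backend.list_tables(like="items"))  # only names matching the pattern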

get_schema

def get_schema(table_name: str, *args: Any, **kwargs: Any) -> sch.Schema

View source on GitHub

Get the Ibis table schema
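
A sketch of inspecting a table's schema, again with the hypothetical "items" table and the backend from the earlier examples:

# backend as constructed in the earlier sketches
schema = backend.get_schema("items")
print(schema)        # Ibis schema: column names mapped to Ibis dtypes
print(schema.names)  # just the column names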

table

def table(name: str,
          *,
          database: Union[tuple[str, str], str, None] = None) -> ir.Table

View source on GitHub

Construct a table expression
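
A sketch of building and running an expression from a bound table, using the same hypothetical "items" table; nothing is executed until .execute() (or another materializing call) runs:

import ibis

# backend as constructed in the earlier sketches
items = backend.table("items")

top_items = (
    items.filter(items.value > 0)
    .order_by(ibis.desc("value"))
    .limit(10)
)

df = top_items.execute()  # runs through the dlt SQL client for the destination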
