Version: 1.21.0 (latest)

dlt.destinations.impl.fabric.configuration

Configuration for the Fabric Warehouse destination; extends the Synapse configuration with COPY INTO support.

FabricCredentials Objects

@configspec(init=False)
class FabricCredentials(AzureServicePrincipalCredentials)

Credentials for Microsoft Fabric Warehouse with Service Principal authentication.

Fabric Warehouse requires Azure AD Service Principal authentication. Inherits from AzureServicePrincipalCredentials for Service Principal fields and automatic fallback to DefaultAzureCredential.

drivername

SQLAlchemy driver name for SQL Server/Fabric.

host

Fabric Warehouse host (e.g., abc12345-6789-def0-1234-56789abcdef0.datawarehouse.fabric.microsoft.com)

port

Database port (default: 1433)

database

Fabric Warehouse database name

connect_timeout

Connection timeout in seconds (default: 15)

azure_storage_account_name

Not used for Fabric Warehouse credentials (only staging credentials need this)

on_partial

def on_partial() -> None

Enable fallback to DefaultAzureCredential when explicit credentials are not provided.
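The fallback behavior can be pictured with a small sketch. This is a hypothetical helper, not dlt's actual implementation: when the Service Principal fields are incomplete, the credentials are flagged so that DefaultAzureCredential is tried at connection time instead.

```python
# Hypothetical sketch of the on_partial fallback (not dlt's actual code).
# If the explicit Service Principal fields are missing, flag the credentials
# so that DefaultAzureCredential is used at connection time instead.

def resolve_partial(creds: dict) -> dict:
    explicit = all(creds.get(k) for k in ("tenant_id", "client_id", "client_secret"))
    # fall back only when no complete Service Principal was supplied
    creds["use_default_azure_credential"] = not explicit
    return creds

full = resolve_partial({"tenant_id": "t", "client_id": "c", "client_secret": "s"})
partial = resolve_partial({"tenant_id": "t"})
```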

get_odbc_dsn_dict

def get_odbc_dsn_dict() -> Dict[str, Any]

Build ODBC DSN dictionary with Fabric-specific settings.

to_odbc_dsn

def to_odbc_dsn() -> str

Build ODBC connection string for pyodbc.
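A minimal sketch of how such a connection string is assembled from a DSN dictionary. The exact keys dlt emits may differ; `Authentication=ActiveDirectoryServicePrincipal` and `LongAsMax=yes` (mentioned under the collation notes below) are shown as assumed Fabric-specific settings.

```python
# Sketch only: assemble a pyodbc connection string from a DSN dict.
# The exact keys/values dlt emits may differ; this shows the general shape.

def to_odbc_dsn(dsn: dict) -> str:
    # pyodbc accepts "KEY=value" pairs joined by semicolons
    return ";".join(f"{k}={v}" for k, v in dsn.items())

dsn = {
    "DRIVER": "{ODBC Driver 18 for SQL Server}",
    "SERVER": "abc12345-6789-def0-1234-56789abcdef0.datawarehouse.fabric.microsoft.com,1433",
    "DATABASE": "mydb",
    "Authentication": "ActiveDirectoryServicePrincipal",  # assumed auth mode
    "UID": "your-client-id",
    "PWD": "your-client-secret",
    "LongAsMax": "yes",  # Fabric-specific setting noted in this page
}
conn_str = to_odbc_dsn(dsn)
```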

to_native_credentials

def to_native_credentials() -> Optional[Any]

Return credentials in a format suitable for the native driver/library.

FabricClientConfiguration Objects

@configspec
class FabricClientConfiguration(DestinationClientDwhWithStagingConfiguration)

Configuration for Fabric Warehouse destination with staging and collation support.

Uses FabricCredentials for Service Principal authentication. Supports OneLake/Lakehouse or Azure Blob Storage staging with COPY INTO for efficient data loading.

Example usage with OneLake/Lakehouse staging (recommended):

    fabric(
        credentials={
            "host": "abc12345-6789-def0-1234-56789abcdef0.datawarehouse.fabric.microsoft.com",
            "database": "mydb",
            "tenant_id": "your-tenant-id",
            "client_id": "your-client-id",
            "client_secret": "your-client-secret",
        },
        staging=filesystem(
            # IMPORTANT: Must use workspace GUID and lakehouse GUID (not names)
            # Format: abfss://<workspace_guid>@onelake.dfs.fabric.microsoft.com/<lakehouse_guid>/Files
            bucket_url="abfss://12345678-1234-1234-1234-123456789012@onelake.dfs.fabric.microsoft.com/87654321-4321-4321-4321-210987654321/Files",
            # IMPORTANT: Must specify Service Principal credentials (same as warehouse)
            credentials={
                "azure_storage_account_name": "onelake",
                "azure_account_host": "onelake.blob.fabric.microsoft.com",
                "azure_tenant_id": "your-tenant-id",
                "azure_client_id": "your-client-id",
                "azure_client_secret": "your-client-secret",
            },
        ),
        collation="Latin1_General_100_BIN2_UTF8",
    )

Note: The bucket_url must use GUIDs for both workspace and lakehouse, not their display names. You can find these GUIDs in the Fabric portal workspace/lakehouse URLs.
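As a quick sanity check, the GUID requirement can be verified with a short regex. This is an illustrative helper, not part of dlt:

```python
import re

# Illustrative helper (not part of dlt): verify that a OneLake bucket_url
# uses GUIDs for both the workspace and the lakehouse, not display names.
GUID = r"[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{12}"
ONELAKE_URL = re.compile(
    rf"^abfss://({GUID})@onelake\.dfs\.fabric\.microsoft\.com/({GUID})/Files$"
)

def is_valid_onelake_url(url: str) -> bool:
    return ONELAKE_URL.match(url) is not None

ok = is_valid_onelake_url(
    "abfss://12345678-1234-1234-1234-123456789012@onelake.dfs.fabric.microsoft.com/"
    "87654321-4321-4321-4321-210987654321/Files"
)
bad = is_valid_onelake_url(
    "abfss://MyWorkspace@onelake.dfs.fabric.microsoft.com/MyLakehouse/Files"
)
```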

Example usage with Azure Blob Storage staging:

    fabric(
        credentials={
            "host": "abc12345-6789-def0-1234-56789abcdef0.datawarehouse.fabric.microsoft.com",
            "database": "mydb",
            "tenant_id": "your-tenant-id",
            "client_id": "your-client-id",
            "client_secret": "your-client-secret",
        },
        staging=filesystem(
            bucket_url="az://your-container",
            credentials={
                "azure_storage_account_name": "your-account-name",
                "azure_storage_account_key": "your-account-key",
            },
        ),
        collation="Latin1_General_100_BIN2_UTF8",
    )

destination_type

collation

Database collation to use for text columns.

Note: Fabric Warehouse does not support table indexing. Storage is automatically managed by the system.

create_indexes

Whether primary_key and unique column hints are applied.

has_case_sensitive_identifiers

Whether identifiers (table/column names) are case-sensitive. Depends on database collation.

__config_gen_annotations__

Database collation for varchar columns. Fabric supports:

  • Latin1_General_100_BIN2_UTF8 (default, case-sensitive)
  • Latin1_General_100_CI_AS_KS_WS_SC_UTF8 (case-insensitive)

Both have UTF-8 encoding. LongAsMax=yes is automatically configured.
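The case-sensitivity implication of the two collations can be sketched with a simple check. Assumption (standard SQL Server collation naming, not a dlt API): `_BIN2` collations compare binary and are case-sensitive, while a `_CI_` marker means case-insensitive.

```python
# Sketch: derive identifier case-sensitivity from a SQL Server collation name.
# Assumption: "_BIN2" (binary) and "_CS_" collations are case-sensitive,
# while "_CI_" means case-insensitive (standard SQL Server naming).

def is_case_sensitive(collation: str) -> bool:
    if "_CI_" in collation:
        return False
    return "_BIN2" in collation or "_CS_" in collation

sensitive = is_case_sensitive("Latin1_General_100_BIN2_UTF8")
insensitive = is_case_sensitive("Latin1_General_100_CI_AS_KS_WS_SC_UTF8")
```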
