common.schema.normalizers

configured_normalizers

@with_config(spec=SchemaConfiguration, sections=_section_for_schema)
def configured_normalizers(
        naming: TNamingConventionReferenceArg = dlt.config.value,
        json_normalizer: TJSONNormalizer = dlt.config.value,
        allow_identifier_change_on_table_with_data: bool = None,
        use_break_path_on_normalize: Optional[bool] = None,
        schema_name: Optional[str] = None) -> TNormalizersConfig

Gets explicitly configured normalizers without any defaults or capabilities injection. If naming is a module or a type, it is converted into its string form via import.

If schema_name is present, the config section ("sources", schema_name, "schema") is used to inject the configuration.
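
A minimal sketch of calling it with an explicit naming reference instead of relying on injected config. The "snake_case" shorthand and the "names"/"json" keys of the returned TNormalizersConfig are assumptions based on dlt's standard normalizers layout:

from dlt.common.schema.normalizers import configured_normalizers

# Pass the naming convention explicitly; a module or a NamingConvention subclass
# would be converted to its importable string form instead.
normalizers_config = configured_normalizers(naming="snake_case")

# Assumed shape: {"names": "snake_case", "json": None, ...}
print(normalizers_config)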

import_normalizers

@with_config
def import_normalizers(
        explicit_normalizers: TNormalizersConfig,
        default_normalizers: TNormalizersConfig = None
) -> Tuple[TNormalizersConfig, NamingConvention, Type[DataItemNormalizer[Any]]]

Imports the normalizers specified in explicit_normalizers or taken from default_normalizers. Returns the updated config together with the imported naming convention and data item normalizer class.

destination_capabilities are used to get the naming convention, the maximum identifier length, and the maximum nesting level.
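
A sketch of wiring the two functions together, assuming the config returned by configured_normalizers can be passed directly as explicit_normalizers and that no destination capabilities are injected:

from dlt.common.schema.normalizers import configured_normalizers, import_normalizers

# Resolve the explicitly configured normalizers into live objects.
explicit = configured_normalizers(naming="snake_case")
config, naming, item_normalizer_cls = import_normalizers(explicit)

# naming is a NamingConvention instance, item_normalizer_cls a DataItemNormalizer class.
print(naming.normalize_identifier("Column Name"))  # e.g. "column_name" under snake_case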

naming_from_reference

def naming_from_reference(
        names: TNamingConventionReferenceArg,
        max_length: Optional[int] = None) -> NamingConvention

Resolves a naming convention from the reference in names and applies max_length if specified.

The reference may be: (1) a shorthand name resolved within the dlt.common.normalizers.naming namespace, (2) a full module name of a module that contains a NamingConvention attribute, or (3) a class deriving from NamingConvention.
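
The three forms side by side, as a hedged sketch; the 63-character cap is only an illustrative value and the snake_case module path is assumed from the shorthand above:

from dlt.common.schema.normalizers import naming_from_reference
from dlt.common.normalizers.naming import snake_case

# (1) shorthand name resolved inside dlt.common.normalizers.naming
naming = naming_from_reference("snake_case", max_length=63)

# (2) full module name exposing a NamingConvention attribute
naming = naming_from_reference("dlt.common.normalizers.naming.snake_case")

# (3) a class deriving from NamingConvention
naming = naming_from_reference(snake_case.NamingConvention)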

serialize_reference

def serialize_reference(
        naming: Optional[TNamingConventionReferenceArg]) -> Optional[str]

Serializes a generic naming reference to an importable string.
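
A small round-trip sketch; the exact serialized strings are assumptions, the point is that classes and modules collapse to an importable string while None passes through:

from dlt.common.schema.normalizers import naming_from_reference, serialize_reference
from dlt.common.normalizers.naming import snake_case

print(serialize_reference(None))                         # None
print(serialize_reference(snake_case.NamingConvention))  # e.g. "dlt.common.normalizers.naming.snake_case"

# The serialized form can be fed back into naming_from_reference.
naming = naming_from_reference(serialize_reference(snake_case.NamingConvention))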
