Version: 1.25.0 (latest)

dlt.destinations.impl.lance.factory

lance Objects

class lance(Destination[LanceClientConfiguration, "LanceClient"])


__init__

def __init__(catalog_type: LanceCatalogType = None,
             credentials: Union[LanceCredentials, Dict[str, Any]] = None,
             storage: Union[LanceStorageConfiguration, Dict[str, Any]] = None,
             branch_name: Optional[str] = None,
             embeddings: Union[LanceEmbeddingsConfiguration, Dict[str, Any]] = None,
             destination_name: str = None,
             environment: str = None,
             **kwargs: Any) -> None


Configure the Lance destination to use in a pipeline.

All arguments provided here supersede other configuration sources such as environment variables and dlt config files.
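The precedence rule above (explicit factory arguments beat environment variables and config files) can be sketched as a layered lookup. This is an illustrative sketch only, not dlt's actual resolution code; the function name and source shapes are assumptions:

```python
from typing import Any, Dict

def resolve_option(name: str,
                   explicit: Dict[str, Any],
                   env: Dict[str, str],
                   config_file: Dict[str, Any],
                   default: Any = None) -> Any:
    """Return the first value found, checking highest-precedence source first."""
    if explicit.get(name) is not None:   # arguments passed to the factory
        return explicit[name]
    if name.upper() in env:              # environment variables
        return env[name.upper()]
    if name in config_file:              # dlt config/secrets files
        return config_file[name]
    return default

# An explicit argument wins over an environment variable:
value = resolve_option(
    "branch_name",
    explicit={"branch_name": "dev"},
    env={"BRANCH_NAME": "main"},
    config_file={},
)
# value == "dev"
```

With no explicit argument, the lookup falls through to the environment, then to config files, then to the default.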

Arguments:

  • catalog_type LanceCatalogType, optional - Lance catalog backend. Defaults to "dir" (directory namespace).
  • credentials Union[LanceCredentials, Dict[str, Any]], optional - Catalog-scoped credentials. For "dir", this is an optional DirectoryCatalogCredentials overriding the __manifest location; when empty, the catalog is colocated with the storage location.
  • storage Union[LanceStorageConfiguration, Dict[str, Any]], optional - Storage configuration for table data (bucket, credentials, options, namespace subpath).
  • branch_name Optional[str] - Branch used for Lance read and write operations. Defaults to "main" if not set.
  • embeddings Union[LanceEmbeddingsConfiguration, Dict[str, Any]], optional - Embedding provider, model, and credentials. If not provided, no vector column is added.
  • destination_name str, optional - Name of the destination.
  • environment str, optional - Environment of the destination.
  • **kwargs Any - Additional arguments forwarded to the destination config.
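Taken together, these arguments are typically supplied when constructing the destination for a pipeline. A minimal sketch, assuming the factory is re-exported as `dlt.destinations.lance` (as dlt does for other destinations); the storage dict key and local path are placeholder assumptions, not confirmed configuration fields:

```python
import dlt
from dlt.destinations import lance  # assumed re-export of the factory

# Explicit arguments here supersede env vars and dlt config files.
dest = lance(
    catalog_type="dir",                    # directory namespace (the default)
    storage={"bucket_url": "data/lance"},  # placeholder key and local path
    branch_name=None,                      # operate on the "main" branch
)

pipeline = dlt.pipeline(
    pipeline_name="lance_demo",
    destination=dest,
    dataset_name="demo_data",
)
```

Because no `embeddings` configuration is passed, no vector column would be added to the loaded tables.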
