Version: devel

dlt.destinations.impl.lancedb.factory

lancedb Objects

class lancedb(Destination[LanceDBClientConfiguration, "LanceDBClient"])


__init__

def __init__(credentials: Union["DBConnection", LanceDBCredentials,
                                Dict[str, Any]] = None,
             lance_uri: Optional[str] = None,
             embedding_model_provider: TEmbeddingProvider = None,
             embedding_model: str = None,
             vector_field_name: str = None,
             destination_name: str = None,
             environment: str = None,
             **kwargs: Any) -> None


Configure the LanceDB destination to use in a pipeline.

All arguments provided here supersede other configuration sources such as environment variables and dlt config files.
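This precedence can be sketched with a small illustrative resolver. The function and the environment variable name below are hypothetical, for illustration only; they are not part of the dlt API:

```python
import os

def resolve_option(explicit_value, env_var, default):
    # Illustrative only: an explicit argument takes precedence over an
    # environment variable, which takes precedence over the default.
    if explicit_value is not None:
        return explicit_value
    return os.environ.get(env_var, default)

# Hypothetical environment variable, for illustration only.
os.environ["DESTINATION__LANCEDB__EMBEDDING_MODEL"] = "embed-english-light-v3.0"

# An explicit argument wins over the environment variable.
print(resolve_option("embed-english-v3.0",
                     "DESTINATION__LANCEDB__EMBEDDING_MODEL", None))
# With no explicit value, the environment variable is used instead.
print(resolve_option(None, "DESTINATION__LANCEDB__EMBEDDING_MODEL", None))
```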

Arguments:

  • credentials Union["DBConnection", LanceDBCredentials, Dict[str, Any]] - Credentials to connect to the LanceDB database. Can be an instance of LanceDBCredentials, an instance of the native LanceDB client, or a dictionary with the credentials parameters.
  • lance_uri Optional[str] - LanceDB database URI. Defaults to local, on-disk instance. The available schemas are:
    • /path/to/database - local database.
    • db://host:port - remote database (LanceDB cloud).
  • embedding_model_provider TEmbeddingProvider, optional - Embedding provider used for generating embeddings. Default is "cohere". See LanceDB documentation for the full list of available providers.
  • embedding_model str, optional - The model used by the embedding provider for generating embeddings. Default is "embed-english-v3.0". Check with the embedding provider which options are available.
  • vector_field_name str, optional - Name of the special field to store the vector embeddings. Default is "vector".
  • destination_name str, optional - Name of the destination; can be used in the config section to differentiate between multiple destinations of the same type.
  • environment str, optional - Environment of the destination.
  • **kwargs Any, optional - Additional arguments forwarded to the destination config.
