Version: 1.4.0 (latest)

common.utils

uniq_id

def uniq_id(len_: int = 16) -> str

[view_source]

Returns a hex-encoded, crypto-grade string of random bytes of the desired length len_

uniq_id_base64

def uniq_id_base64(len_: int = 16) -> str

[view_source]

Returns a base64-encoded, crypto-grade string of random bytes of the desired length len_

many_uniq_ids_base64

def many_uniq_ids_base64(n_ids: int, len_: int = 16) -> List[str]

[view_source]

Generates n_ids base64-encoded, crypto-grade strings of random bytes of the desired length len_. This is more performant than calling uniq_id_base64 multiple times.
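
A minimal usage sketch, assuming these helpers are importable from dlt.common.utils and that len_ counts random bytes (so the hex form has twice as many characters):

from dlt.common.utils import uniq_id, uniq_id_base64, many_uniq_ids_base64

hex_id = uniq_id(8)                    # 8 random bytes -> 16 hex characters
b64_id = uniq_id_base64(8)             # 8 random bytes, base64-encoded
batch = many_uniq_ids_base64(1000, 8)  # 1000 ids in a single, faster call
print(hex_id, b64_id, len(batch))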

digest128

def digest128(v: str, len_: int = 15) -> str

[view_source]

Returns a base64-encoded shake128 hash of str v with a digest of length len_ (default: 15 bytes, which encodes to a 20-character string)

digest128b

def digest128b(v: bytes, len_: int = 15) -> str

[view_source]

Returns a base64-encoded shake128 hash of bytes v with a digest of length len_ (default: 15 bytes, which encodes to a 20-character string)
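
For example, assuming the dlt.common.utils import path, the digest is deterministic and 20 characters long at the default len_:

from dlt.common.utils import digest128, digest128b

assert digest128("hello") == digest128("hello")  # same input -> same digest
assert len(digest128("hello")) == 20             # 15 bytes -> 20 base64 characters
fingerprint = digest128b(b"\x00\x01\x02")        # bytes variant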

flatten_list_of_str_or_dicts

def flatten_list_of_str_or_dicts(
seq: Sequence[Union[StrAny, str]]) -> DictStrAny

[view_source]

Transforms a list of objects or strings [{K: {...}}, L, ...] -> {K: {...}, L: None, ...}
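
A quick illustration of the transformation described above (dlt.common.utils import path assumed):

from dlt.common.utils import flatten_list_of_str_or_dicts

items = [{"users": {"columns": ["id"]}}, "events"]
print(flatten_list_of_str_or_dicts(items))
# {'users': {'columns': ['id']}, 'events': None}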

concat_strings_with_limit

def concat_strings_with_limit(strings: List[str], separator: str,
limit: int) -> Iterator[str]

[view_source]

Generator function to concatenate strings.

The function takes a list of strings and concatenates them into chunks such that the length of each concatenated string does not exceed the specified limit. It yields each concatenated string as it is created. The strings are separated by the specified separator.

Arguments:

  • strings List[str] - The list of strings to be concatenated.
  • separator str - The separator to use between strings.
  • limit int - The maximum length for each concatenated string.

Yields:

Generator[str, None, None]: A generator that yields each concatenated string.
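
A short sketch of the generator in use, with the dlt.common.utils import path assumed:

from dlt.common.utils import concat_strings_with_limit

names = ["alpha", "beta", "gamma", "delta"]
# each yielded chunk stays within the 12-character limit
for chunk in concat_strings_with_limit(names, ",", 12):
    print(chunk)  # "alpha,beta", then "gamma,delta"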

graph_edges_to_nodes

def graph_edges_to_nodes(edges: Sequence[Tuple[TAny, TAny]],
directed: bool = True) -> Dict[TAny, Set[TAny]]

[view_source]

Converts a directed graph represented as a sequence of edges into a graph represented as a mapping from each node to a set of connected nodes.

Isolated nodes are represented as edges to themselves. If directed is False, each edge is duplicated, with a copy going in the opposite direction.

graph_find_scc_nodes

def graph_find_scc_nodes(undag: Dict[TAny, Set[TAny]]) -> List[Set[TAny]]

[view_source]

Finds and returns a list of sets of nodes in the strongly connected components of undag, which is an undirected graph.

To obtain an undirected graph from edges, use the graph_edges_to_nodes function with the directed argument set to False.
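
Putting the two graph helpers together (import path assumed as in the earlier sketches; the component ordering is illustrative):

from dlt.common.utils import graph_edges_to_nodes, graph_find_scc_nodes

edges = [("a", "b"), ("b", "c"), ("x", "x")]
undag = graph_edges_to_nodes(edges, directed=False)  # undirected adjacency mapping
print(graph_find_scc_nodes(undag))                   # e.g. [{'a', 'b', 'c'}, {'x'}]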

update_dict_with_prune

def update_dict_with_prune(dest: DictStrAny, update: StrAny) -> None

[view_source]

Updates values that are present in both dest and update, and deletes keys from dest whose values are None in update
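
For instance (dlt.common.utils import assumed):

from dlt.common.utils import update_dict_with_prune

dest = {"name": "users", "primary_key": "id", "partition": "day"}
update_dict_with_prune(dest, {"primary_key": "user_id", "partition": None})
print(dest)  # {'name': 'users', 'primary_key': 'user_id'} - the None value pruned 'partition'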

update_dict_nested

def update_dict_nested(dst: TDict,
src: TDict,
copy_src_dicts: bool = False) -> TDict

[view_source]

Merges src into dst key-wise. Does not recurse into lists. Values in src overwrite values in dst if both keys exist. Only dict and its subclasses are updated recursively. With copy_src_dicts, dict key:values will be deep copied; otherwise, both dst and src will keep the same references.
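
A small sketch showing the key-wise merge (same import path assumption):

from dlt.common.utils import update_dict_nested

dst = {"normalize": {"naming": "snake_case", "max_nesting": 2}, "dev_mode": False}
src = {"normalize": {"max_nesting": 0}, "dev_mode": True}
update_dict_nested(dst, src)
print(dst["normalize"])  # {'naming': 'snake_case', 'max_nesting': 0} - nested dicts merged, not replaced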

clone_dict_nested

def clone_dict_nested(src: TDict) -> TDict

[view_source]

Clones the src structure, descending into nested dicts. Does not descend into mappings that are not dicts, i.e. spec instances. Unlike deepcopy, it does not clone any other objects. Uses update_dict_nested internally

map_nested_in_place

def map_nested_in_place(func: AnyFun, _nested: TAny, *args: Any,
**kwargs: Any) -> TAny

[view_source]

Applies func to all elements in _nested recursively, replacing elements in nested dictionaries and lists in place. Additional *args and **kwargs are passed to func.
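
For example, doubling every leaf value in place (import path assumed):

from dlt.common.utils import map_nested_in_place

data = {"counts": [1, 2, 3], "totals": {"rows": 10}}
map_nested_in_place(lambda v: v * 2, data)
print(data)  # {'counts': [2, 4, 6], 'totals': {'rows': 20}}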

is_interactive

def is_interactive() -> bool

[view_source]

Determine if the current environment is interactive.

Returns:

  • bool - True if interactive (e.g., REPL, IPython, Jupyter Notebook), False if running as a script.

custom_environ

@contextmanager
def custom_environ(env: StrStr) -> Iterator[None]

[view_source]

Temporarily set environment variables inside the context manager and fully restore previous environment afterwards
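
A minimal sketch, assuming MY_VAR is not already set and the dlt.common.utils import path:

import os
from dlt.common.utils import custom_environ

with custom_environ({"MY_VAR": "secret"}):
    assert os.environ["MY_VAR"] == "secret"
assert "MY_VAR" not in os.environ  # previous environment restored on exit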

multi_context_manager

@contextmanager
def multi_context_manager(
managers: Sequence[ContextManager[Any]]) -> Iterator[Any]

[view_source]

A context manager holding several other context managers. Enters and exits all of them. Yields from the last one in the list
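
For example, wrapping two standard-library contexts; the yielded value comes from the last manager:

from contextlib import nullcontext
from dlt.common.utils import multi_context_manager

with multi_context_manager([nullcontext("first"), nullcontext("second")]) as value:
    print(value)  # 'second' - yielded from the last manager in the list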

is_inner_callable

def is_inner_callable(f: AnyFun) -> bool

[view_source]

Checks if f is defined within another function

get_module_name

def get_module_name(m: ModuleType) -> str

[view_source]

Gets the module name from the module, with a fallback for the executing __main__ module

derives_from_class_of_name

def derives_from_class_of_name(o: object, name: str) -> bool

[view_source]

Checks if object o has a class of the given name in its derivation tree

compressed_b64encode

def compressed_b64encode(value: bytes) -> str

[view_source]

Compress and b64 encode the given bytestring

compressed_b64decode

def compressed_b64decode(value: str) -> bytes

[view_source]

Decode a bytestring encoded with compressed_b64encode
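
The pair round-trips arbitrary bytes, for example:

from dlt.common.utils import compressed_b64encode, compressed_b64decode

payload = b'{"rows": 100000}' * 100
encoded = compressed_b64encode(payload)          # compressed, then base64-encoded text
assert compressed_b64decode(encoded) == payload  # original bytes recovered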

merge_row_counts

def merge_row_counts(row_counts_1: RowCounts, row_counts_2: RowCounts) -> None

[view_source]

Merges row_counts_2 into row_counts_1

extend_list_deduplicated

def extend_list_deduplicated(
original_list: List[Any],
extending_list: Iterable[Any],
normalize_f: Callable[[str], str] = str.__call__) -> List[Any]

[view_source]

Extends the first list with the second, but does not add duplicates
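
For example (import path assumed as before):

from dlt.common.utils import extend_list_deduplicated

tables = ["users", "events"]
result = extend_list_deduplicated(tables, ["events", "orders"])
print(result)  # ['users', 'events', 'orders'] - 'events' is not added twice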

maybe_context

@contextmanager
def maybe_context(manager: ContextManager[TAny]) -> Iterator[TAny]

[view_source]

Allows manager to be None by creating a dummy context; otherwise manager is used
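
A sketch of handling a possibly-None manager uniformly:

import threading
from dlt.common.utils import maybe_context

lock = threading.Lock()
for manager in (lock, None):
    with maybe_context(manager):  # None becomes a no-op context
        pass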

without_none

def without_none(d: Mapping[TKey, Optional[TValue]]) -> Mapping[TKey, TValue]

[view_source]

Return a new dict with all None values removed
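
For example:

from dlt.common.utils import without_none

print(without_none({"dataset_name": "analytics", "schema": None}))
# {'dataset_name': 'analytics'}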

exclude_keys

def exclude_keys(mapping: Mapping[str, Any],
keys: Iterable[str]) -> Dict[str, Any]

[view_source]

Create a new dictionary from the input mapping, excluding specified keys.

Arguments:

  • mapping Mapping[str, Any] - The input mapping from which keys will be excluded.
  • keys Iterable[str] - The keys to exclude.

Returns:

Dict[str, Any]: A new dictionary containing all key-value pairs from the original mapping except those with keys specified in keys.
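
For example:

from dlt.common.utils import exclude_keys

row = {"id": 1, "name": "ada", "_internal": "x"}
print(exclude_keys(row, ["_internal"]))  # {'id': 1, 'name': 'ada'}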

get_exception_trace

def get_exception_trace(exc: BaseException) -> ExceptionTrace

[view_source]

Get exception trace and additional information for DltException(s)

get_exception_trace_chain

def get_exception_trace_chain(exc: BaseException,
traces: List[ExceptionTrace] = None,
seen: Set[int] = None) -> List[ExceptionTrace]

[view_source]

Get traces for exception chain. The function will recursively visit all cause and context exceptions. The top level exception trace is first on the list
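
A sketch of walking a raise-from chain; the exact fields of ExceptionTrace are not shown here:

from dlt.common.utils import get_exception_trace_chain

try:
    try:
        raise ValueError("bad value")
    except ValueError as inner:
        raise RuntimeError("wrapper") from inner
except RuntimeError as exc:
    traces = get_exception_trace_chain(exc)
    print(len(traces))  # 2 - the top-level RuntimeError first, then its cause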

group_dict_of_lists

def group_dict_of_lists(
input_dict: Dict[str, List[Any]]) -> List[Dict[str, Any]]

[view_source]

Decomposes a dictionary with list values into a list of dictionaries with unique keys.

This function takes an input dictionary where each key maps to a list of objects. It returns a list of dictionaries, each containing at most one object per key. The goal is to ensure that no two objects with the same key appear in the same dictionary.

Arguments:

  • input_dict Dict[str, List[Any]] - A dictionary with string keys and list of objects as values.

Returns:

List[Dict[str, Any]]: A list of dictionaries, each with unique keys and single objects.
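
For example, decomposing two lists that share a key (the exact ordering is illustrative):

from dlt.common.utils import group_dict_of_lists

print(group_dict_of_lists({"a": [1, 2], "b": [3]}))
# e.g. [{'a': 1, 'b': 3}, {'a': 2}] - at most one object per key in each dict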

order_deduped

def order_deduped(lst: List[Any]) -> List[Any]

[view_source]

Returns deduplicated list preserving order of input elements.

Only works for lists with hashable elements.
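
For example:

from dlt.common.utils import order_deduped

print(order_deduped(["b", "a", "b", "c", "a"]))  # ['b', 'a', 'c']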
