Securosys TSB Python API Docs | dltHub
Build a Securosys TSB-to-database pipeline in Python using dlt with AI Workbench support for Claude Code, Cursor, and Codex.
Securosys TSB is a Transaction Security Broker that exposes a REST API for performing cryptographic operations and managing HSM-backed keys and workflows. The REST API base URL is https://{host}/v1, and authentication is provided via API keys, mTLS, or JWTs; CloudHSM deployments use JWT by default.
dlt is an open-source Python library that handles authentication, pagination, and schema evolution automatically. dltHub provides AI context files that enable code assistants to generate production-ready pipelines. Install with `uv pip install "dlt[workspace]"` and start loading Securosys TSB data in under 10 minutes.
What data can I load from Securosys TSB?
Here are some of the endpoints you can load from Securosys TSB:
| Resource | Endpoint | Method | Data selector | Description |
|---|---|---|---|---|
| version_info | v1/versionInfo | GET | | Returns service version information (single object) |
| system_time | v1/systemTime | GET | | Returns current system/HSM time (service info) |
| request_status | v1/requests/{id} | GET | | Returns status and result for a submitted HSM request |
| requests | v1/requests | GET | | List or query submitted requests (if the response contains a list key) |
| keystores | v1/keystores | GET | | List HSM keystores or partitions (if present) |
How do I authenticate with the Securosys TSB API?
The TSB supports API key, mutual TLS (mTLS), and JSON Web Token (JWT) authentication. The required method depends on the deployment (CloudHSM uses JWT; on-premise supports mTLS or API keys). Requests must include the appropriate header (e.g., `Authorization: Bearer <token>` for JWT) or present a client certificate for mTLS.
1. Get your credentials
- Cloud (TSBaaS): log into the Securosys CloudHSM portal and generate a JWT in the "Access Tokens" section.
- On-premise: log into the TSB administration console, navigate to the "API Keys" page, create a new API key, and copy the generated key value.
- mTLS: request a client certificate from your security team, install it in the application's trust store, and configure the server to require client authentication (e.g., set server.ssl.client-auth to "need").
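Whichever method applies, the credential ultimately travels with each request. A rough sketch of the header each token-based method adds (the `Authorization: Bearer` header for JWT comes from the description above; the `X-API-Key` header name is an assumption to verify against your deployment's docs):

```python
# Sketch of the headers each token-based auth method adds to a TSB request.
# The Bearer header for JWT matches the docs above; the "X-API-Key" header
# name is an assumption -- confirm it for your deployment.
def build_auth_headers(method: str, credential: str) -> dict:
    if method == "jwt":
        return {"Authorization": f"Bearer {credential}"}
    if method == "api_key":
        return {"X-API-Key": credential}
    raise ValueError(f"unsupported auth method: {method}")

print(build_auth_headers("jwt", "eyJhbGciOi..."))
```

mTLS has no header equivalent; the client certificate is presented during the TLS handshake instead.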
2. Add them to .dlt/secrets.toml
```toml
[sources.securosys_tsb_source]
api_key = "your_api_key_here"
```
dlt reads this automatically at runtime — never hardcode tokens in your pipeline script. For production environments, see setting up credentials with dlt for environment variable and vault-based options.
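For environment-based setups, the same value can be supplied as an environment variable; dlt maps TOML sections to upper-cased names joined by double underscores:

```shell
# Equivalent to the [sources.securosys_tsb_source] secrets.toml entry:
# dlt also resolves credentials from environment variables, with "__"
# separating the upper-cased section and key names.
export SOURCES__SECUROSYS_TSB_SOURCE__API_KEY="your_api_key_here"
```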
How do I set up and run the pipeline?
Set up a virtual environment and install dlt:
```shell
uv venv && source .venv/bin/activate
uv pip install "dlt[workspace]"
```
1. Install the dlt AI Workbench:
```shell
dlt ai init --agent <your-agent>  # <agent>: claude | cursor | codex
```
This installs project rules, a secrets management skill, appropriate ignore files, and configures the dlt MCP server for your agent. Learn more →
2. Install the rest-api-pipeline toolkit:
```shell
dlt ai toolkit rest-api-pipeline install
```
This loads the skills and context about dlt the agent uses to build the pipeline iteratively, efficiently, and safely. The agent uses MCP tools to inspect credentials — it never needs to read your secrets.toml directly. Learn more →
3. Start LLM-assisted coding:
```
Use /find-source to load data from the Securosys TSB API into DuckDB.
```
The rest-api-pipeline toolkit takes over from here — it reads relevant API documentation, presents you with options for which endpoints to load, and follows a structured workflow to scaffold, debug, and validate the pipeline step by step.
4. Run the pipeline:
```shell
python securosys_tsb_pipeline.py
```
If everything is configured correctly, you'll see output like this:
```
Pipeline securosys_tsb_pipeline load step completed in 0.26 seconds
1 load package(s) were loaded to destination duckdb and into dataset securosys_tsb_data
The duckdb destination used duckdb:/securosys_tsb.duckdb location to store data
Load package 1749667187.541553 is LOADED and contains no failed jobs
```
Inspect your pipeline and data:
```shell
dlt pipeline securosys_tsb_pipeline show
```
This opens the Pipeline Dashboard where you can verify pipeline state, load metrics, schema (tables, columns, types), and query the loaded data directly.
Python pipeline example
This example loads version_info and request_status from the Securosys TSB API into DuckDB, resolving each request id from the requests list endpoint. It mirrors the endpoint configuration from the table above (since the client base_url already ends in /v1, resource paths omit that prefix):

```python
import dlt
from dlt.sources.rest_api import RESTAPIConfig, rest_api_resources


@dlt.source
def securosys_tsb_source(api_key=dlt.secrets.value):
    config: RESTAPIConfig = {
        "client": {
            # base_url already includes /v1, so resource paths omit it
            "base_url": "https://{host}/v1",
            "auth": {
                "type": "api_key",
                "api_key": api_key,
            },
        },
        "resources": [
            {"name": "version_info", "endpoint": {"path": "versionInfo"}},
            {"name": "requests", "endpoint": {"path": "requests"}},
            {
                "name": "request_status",
                "endpoint": {
                    "path": "requests/{id}",
                    "params": {
                        # Resolve {id} from the parent "requests" resource;
                        # the "id" field name is an assumption -- check the
                        # actual response shape.
                        "id": {
                            "type": "resolve",
                            "resource": "requests",
                            "field": "id",
                        },
                    },
                },
            },
        ],
    }
    yield from rest_api_resources(config)


def get_data() -> None:
    pipeline = dlt.pipeline(
        pipeline_name="securosys_tsb_pipeline",
        destination="duckdb",
        dataset_name="securosys_tsb_data",
    )
    load_info = pipeline.run(securosys_tsb_source())
    print(load_info)


if __name__ == "__main__":
    get_data()
```
To add more endpoints, append entries from the resource table to the "resources" list using the same name, path, and data_selector pattern.
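For instance, hypothetical entries for system_time and keystores might look like this (paths assume the client base_url already ends in /v1; the commented data_selector is a guess to verify against a live response):

```python
# Hypothetical additions to the "resources" list in the pipeline above.
# Paths omit the leading "v1/" because the client base_url ends in /v1.
extra_resources = [
    {"name": "system_time", "endpoint": {"path": "systemTime"}},
    {
        "name": "keystores",
        # If the response nests the list under a key, add a data_selector,
        # e.g. {"path": "keystores", "data_selector": "keystores"}.
        "endpoint": {"path": "keystores"},
    },
]

print([r["name"] for r in extra_resources])
```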
How do I query the loaded data?
Once the pipeline runs, dlt creates one table per resource. You can query with Python or SQL.
Python (pandas DataFrame):
```python
import dlt

data = dlt.pipeline("securosys_tsb_pipeline").dataset()
version_info_df = data.version_info.df()
print(version_info_df.head())
```
SQL (DuckDB example):
```sql
SELECT * FROM securosys_tsb_data.version_info LIMIT 10;
```
In a marimo or Jupyter notebook:
```python
import dlt

data = dlt.pipeline("securosys_tsb_pipeline").dataset()
data.version_info.df().head()
```
See how to explore your data in marimo Notebooks and how to query your data in Python with dataset.
What destinations can I load Securosys TSB data to?
dlt supports loading into any of these destinations — only the destination parameter changes:
| Destination | Example value |
|---|---|
| DuckDB (local, default) | "duckdb" |
| PostgreSQL | "postgres" |
| BigQuery | "bigquery" |
| Snowflake | "snowflake" |
| Redshift | "redshift" |
| Databricks | "databricks" |
| Filesystem (S3, GCS, Azure) | "filesystem" |
Change the destination in `dlt.pipeline(destination="snowflake")` and add credentials in `.dlt/secrets.toml`. See the full destinations list.
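For example, switching to Snowflake means changing the destination argument and adding a credentials section to .dlt/secrets.toml. The key names below follow dlt's documented Snowflake credentials layout; all values are placeholders:

```toml
[destination.snowflake.credentials]
database = "your_database"
username = "your_username"
password = "your_password"
host = "your_account_identifier"
warehouse = "your_warehouse"
role = "your_role"
```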
Troubleshooting
Authentication failures
If you receive a 401 or 403 response, confirm you are using the deployment's required auth method (CloudHSM: JWT Bearer; on-premise: API key or mTLS). For mTLS, ensure the client certificate is trusted by the server trust store and that server.ssl.client-auth is configured appropriately.
Master HSM unavailable
If the master HSM is unreachable, some operations will return an HSM error. Example response:

```json
{
  "errorCode": 701,
  "reason": "res.error.in.hsm",
  "message": "HSM error: status: MasterNotReachable; for key ..."
}
```
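A client can branch on this condition programmatically. A minimal sketch using only the fields shown in the example response (the check on errorCode 701 and the MasterNotReachable status string is an assumption based on that one example):

```python
import json

# Minimal sketch: recognize the "master HSM unreachable" error from the
# response body fields shown above (errorCode 701, MasterNotReachable).
def is_hsm_unreachable(body: str) -> bool:
    payload = json.loads(body)
    return (
        payload.get("errorCode") == 701
        and "MasterNotReachable" in payload.get("message", "")
    )

example = (
    '{"errorCode": 701, "reason": "res.error.in.hsm", '
    '"message": "HSM error: status: MasterNotReachable; for key ..."}'
)
print(is_hsm_unreachable(example))  # prints True
```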
Configuration and TLS
For on-premise deployments, configure TLS and mTLS in application-local.yml (the server.ssl trust store and client-auth settings). Use the appropriate Spring profile (local, access-token, standalone) for the desired behavior.
Ensure that the API key is valid to avoid 401 Unauthorized errors. Also, verify endpoint paths and parameters to avoid 404 Not Found errors.
Next steps
Continue your data engineering journey with the other toolkits of the dltHub AI Workbench:
- data-exploration: Build custom notebooks, charts, and dashboards for deeper analysis with marimo notebooks.
- dlthub-runtime: Deploy, schedule, and monitor your pipeline in production.
```shell
dlt ai toolkit data-exploration install
dlt ai toolkit dlthub-runtime install
```