LocationIQ Python API Docs | dltHub

Build a LocationIQ-to-database pipeline in Python using dlt with AI Workbench support for Claude Code, Cursor, and Codex.

LocationIQ is a location data platform offering geocoding, reverse geocoding, autocomplete, maps, routing, timezone, nearby POI, and account APIs. The REST API base URL is https://us1.locationiq.com, and all requests require an API access token passed as the key query parameter.

dlt is an open-source Python library that handles authentication, pagination, and schema evolution automatically. dltHub provides AI context files that enable code assistants to generate production-ready pipelines. Install with uv pip install "dlt[workspace]" and start loading LocationIQ data in under 10 minutes.


What data can I load from LocationIQ?

Here are some of the endpoints you can load from LocationIQ:

| Resource | Endpoint | Method | Data selector | Description |
| --- | --- | --- | --- | --- |
| search | /v1/search | GET | (top-level array) | Forward geocoding (address → places). |
| reverse | /v1/reverse | GET | (single object) | Reverse geocoding (lat/lon → address). |
| autocomplete | /v1/autocomplete | GET | (top-level array) | Predictive search suggestions. |
| nearby | /v1/nearby | GET | (top-level array) | Nearby POI search. |
| timezone | /v1/timezone | GET | (single object) | Timezone lookup for coordinates. |
| balance | /v1/balance | GET | (single object) | Account balance and rate-limit info. |
| directions | /v1/directions/{profile}/{coordinates} | GET | routes | Routing/directions service. |
| staticmap | /v3/staticmap | GET | (binary image) | Returns a static map image. |
| tiles | /v3/{theme}/r/{z}/{x}/{y}.{format} | GET | (image tiles) | Map tile service. |
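As a quick sanity check outside of dlt, a forward-geocoding request can be assembled by hand. This is a minimal sketch: the q, format, and limit parameters follow LocationIQ's search API, and YOUR_ACCESS_TOKEN is a placeholder.

```python
from urllib.parse import urlencode

BASE_URL = "https://us1.locationiq.com"

def build_search_url(token: str, query: str, limit: int = 5) -> str:
    """Construct a forward-geocoding request URL.

    LocationIQ expects the access token in the `key` query parameter
    and returns JSON when format=json is requested.
    """
    params = urlencode({"key": token, "q": query, "format": "json", "limit": limit})
    return f"{BASE_URL}/v1/search?{params}"

url = build_search_url("YOUR_ACCESS_TOKEN", "Berlin, Germany")
print(url)
```

Sending this URL with any HTTP client returns a top-level JSON array of candidate places, matching the data selector listed in the table above.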

How do I authenticate with the LocationIQ API?

Authentication uses an API access token passed as the query parameter key on all requests.

1. Get your credentials

  1. Sign up or log in at https://locationiq.com/.
  2. In the dashboard, go to 'API Keys' or 'Access Tokens' and create or copy your access token.
  3. Use this token as the key query parameter in API requests.

2. Add them to .dlt/secrets.toml

```toml
[sources.location_iq_static_maps_source]
api_key = "your_locationiq_api_key_here"
```

dlt reads this automatically at runtime — never hardcode tokens in your pipeline script. For production environments, see setting up credentials with dlt for environment variable and vault-based options.
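For example, the same secret can be supplied via an environment variable instead of secrets.toml; dlt resolves section and key names into upper-cased, double-underscore-separated variable names:

```shell
# Equivalent to the secrets.toml entry above.
export SOURCES__LOCATION_IQ_STATIC_MAPS_SOURCE__API_KEY="your_locationiq_api_key_here"
```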


How do I set up and run the pipeline?

Set up a virtual environment and install dlt:

```shell
uv venv && source .venv/bin/activate
uv pip install "dlt[workspace]"
```

1. Install the dlt AI Workbench:

```shell
dlt ai init --agent <your-agent>  # <agent>: claude | cursor | codex
```

This installs project rules, a secrets management skill, appropriate ignore files, and configures the dlt MCP server for your agent. Learn more →

2. Install the rest-api-pipeline toolkit:

```shell
dlt ai toolkit rest-api-pipeline install
```

This loads the skills and context about dlt the agent uses to build the pipeline iteratively, efficiently, and safely. The agent uses MCP tools to inspect credentials — it never needs to read your secrets.toml directly. Learn more →

3. Start LLM-assisted coding:

Use /find-source to load data from the LocationIQ API into DuckDB.

The rest-api-pipeline toolkit takes over from here — it reads relevant API documentation, presents you with options for which endpoints to load, and follows a structured workflow to scaffold, debug, and validate the pipeline step by step.

4. Run the pipeline:

```shell
python location_iq_static_maps_pipeline.py
```

If everything is configured correctly, you'll see output like this:

```text
Pipeline location_iq_static_maps_pipeline load step completed in 0.26 seconds
1 load package(s) were loaded to destination duckdb and into dataset location_iq_static_maps_data
The duckdb destination used duckdb:/location_iq_static_maps.duckdb location to store data
Load package 1749667187.541553 is LOADED and contains no failed jobs
```

Inspect your pipeline and data:

```shell
dlt pipeline location_iq_static_maps_pipeline show
```

This opens the Pipeline Dashboard where you can verify pipeline state, load metrics, schema (tables, columns, types), and query the loaded data directly.


Python pipeline example

This example loads search and reverse from the LocationIQ API into DuckDB. It mirrors the endpoint and data selector configuration from the table above:

```python
import dlt
from dlt.sources.rest_api import RESTAPIConfig, rest_api_resources


@dlt.source
def location_iq_static_maps_source(api_key=dlt.secrets.value):
    config: RESTAPIConfig = {
        "client": {
            "base_url": "https://us1.locationiq.com",
            "auth": {
                "type": "api_key",
                # LocationIQ expects the token in the `key` query parameter
                "name": "key",
                "location": "query",
                "api_key": api_key,
            },
        },
        "resources": [
            {"name": "search", "endpoint": {"path": "v1/search"}},
            {"name": "reverse", "endpoint": {"path": "v1/reverse"}},
        ],
    }
    yield from rest_api_resources(config)


def get_data() -> None:
    pipeline = dlt.pipeline(
        pipeline_name="location_iq_static_maps_pipeline",
        destination="duckdb",
        dataset_name="location_iq_static_maps_data",
    )
    load_info = pipeline.run(location_iq_static_maps_source())
    print(load_info)


if __name__ == "__main__":
    get_data()
```

To add more endpoints, append entries from the resource table to the "resources" list using the same name, path, and data_selector pattern.
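For example, entries for the nearby and timezone endpoints could look like the sketch below. The lat, lon, tag, and radius query parameters come from LocationIQ's endpoint docs; the coordinate values here are placeholders:

```python
# Hypothetical additional entries for the "resources" list in the
# source above; static query parameters go under the endpoint's "params" key.
extra_resources = [
    {
        "name": "nearby",
        "endpoint": {
            "path": "v1/nearby",
            "params": {"lat": 52.52, "lon": 13.40, "tag": "restaurant", "radius": 1000},
        },
    },
    {
        "name": "timezone",
        "endpoint": {"path": "v1/timezone", "params": {"lat": 52.52, "lon": 13.40}},
    },
]
```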


How do I query the loaded data?

Once the pipeline runs, dlt creates one table per resource. You can query with Python or SQL.

Python (pandas DataFrame):

```python
import dlt

data = dlt.pipeline("location_iq_static_maps_pipeline").dataset()
search_df = data.search.df()
print(search_df.head())
```

SQL (DuckDB example):

```sql
SELECT * FROM location_iq_static_maps_data.search LIMIT 10;
```

In a marimo or Jupyter notebook:

```python
import dlt

data = dlt.pipeline("location_iq_static_maps_pipeline").dataset()
data.search.df().head()
```

See how to explore your data in marimo Notebooks and how to query your data in Python with dataset.


What destinations can I load LocationIQ data to?

dlt supports loading into any of these destinations — only the destination parameter changes:

| Destination | Example value |
| --- | --- |
| DuckDB (local, default) | "duckdb" |
| PostgreSQL | "postgres" |
| BigQuery | "bigquery" |
| Snowflake | "snowflake" |
| Redshift | "redshift" |
| Databricks | "databricks" |
| Filesystem (S3, GCS, Azure) | "filesystem" |

Change the destination in dlt.pipeline(destination="snowflake") and add credentials in .dlt/secrets.toml. See the full destinations list.
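For example, a Snowflake setup would add a destination section to .dlt/secrets.toml. The values below are placeholders for this sketch; the exact fields are described in dlt's Snowflake destination docs:

```toml
[destination.snowflake.credentials]
database = "your_database"
username = "your_user"
password = "your_password"
host = "your_account_identifier"
warehouse = "your_warehouse"
role = "your_role"
```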


Troubleshooting

Authentication failures

If the API key is missing, invalid, or inactive, the API returns HTTP 401 with a JSON error such as {"error":"Invalid key"} or "Key not active - Please write to support". Ensure you pass ?key=YOUR_ACCESS_TOKEN and that the token is active.

Rate limits and throttling

Requests that exceed per-second, per-minute, or per-day quotas return HTTP 429 with messages such as "Rate Limited Second", "Rate Limited Minute", or "Rate Limited Day". Back off and retry with exponential delay; you can check your limits via the /v1/balance endpoint.
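dlt's built-in HTTP client already retries some transient errors, but if you call the API directly, an exponential backoff schedule can be sketched like this (the delay values are illustrative, not LocationIQ requirements):

```python
def backoff_delays(max_retries: int = 5, base: float = 1.0, cap: float = 60.0) -> list[float]:
    """Seconds to wait between retries after HTTP 429: 1, 2, 4, ... capped at `cap`."""
    return [min(cap, base * 2**attempt) for attempt in range(max_retries)]

print(backoff_delays())  # [1.0, 2.0, 4.0, 8.0, 16.0]
```

Sleeping for each delay in turn before re-issuing the request keeps retry traffic well under the per-second and per-minute quotas.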

Common error responses

LocationIQ returns JSON bodies like {"error":"Error message"} with status codes: 400 (Invalid Request), 401 (Invalid key), 403 (Service not enabled), 404 (Not found), 429 (Rate Limited), 500 (Internal error).

To avoid 401 Unauthorized errors, verify that the API key is valid; to avoid 404 Not Found errors, double-check endpoint paths and parameters.
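The status codes above can be mapped to quick troubleshooting hints, e.g. for logging. The hint texts below are paraphrased summaries, not official API messages:

```python
# Troubleshooting hints keyed by LocationIQ HTTP status codes (paraphrased).
ERROR_HINTS = {
    400: "Invalid Request: check query parameters and their encoding",
    401: "Invalid key: verify the access token is correct and active",
    403: "Service not enabled: endpoint not available for this account",
    404: "Not found: check the endpoint path or query",
    429: "Rate limited: back off and retry with exponential delay",
    500: "Internal error: retry later",
}

def hint_for(status_code: int) -> str:
    """Return a troubleshooting hint for a LocationIQ error response."""
    return ERROR_HINTS.get(status_code, f"Unexpected status {status_code}")
```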


Next steps

Continue your data engineering journey with the other toolkits of the dltHub AI Workbench:

  • data-exploration — Build custom notebooks, charts, and dashboards for deeper analysis with marimo notebooks.
  • dlthub-runtime — Deploy, schedule, and monitor your pipeline in production.
```shell
dlt ai toolkit data-exploration install
dlt ai toolkit dlthub-runtime install
```
