Highlights
- ETL-as-code: Adopted dlt to ensure transparency and stability for high-stakes, business-critical pipelines, giving Vandebron full control over every aspect of its ingestion processes, a level of control that UI-based systems failed to provide.
- Pipeline Performance, Speed, and Simplicity: Pipeline run times dropped from more than an hour to under 20 minutes. In addition, dlt is integral to Vandebron’s strategy of separating compute and storage as it works towards operating its own green infrastructure.
- Leveraging LLMs: Because dlt is a Python framework, it was easy to use LLMs to support development, both for experienced developers and for newcomers to dlt’s codebase.
Data Stack
Data sources: Smart Meter Readings (Kafka and others), Salesforce, Business Central (MS SQL Sync), Marketing Cloud, Postgres, REST APIs, Energy Assets, Google Analytics, External European Energy Sources (Wind, Solar)
Destinations: Snowflake, Internal Layer (with Apache Iceberg and DuckLake), Weaviate (for future AI frameworks)
Orchestration: Dagster
Transformation: dbt
Challenge: Securing and Standardizing Data Flow for High-Stakes Energy Decisions
Vandebron, a green energy supplier based in Amsterdam, is more than just a retailer: it is committed to supporting the energy transition in the Netherlands, and strongly encourages the use of green energy in society.
Vandebron depends on high-frequency data from smart meters, weather models, European energy sources, and more. Achieving its goals means building sophisticated strategies on top of that data: predicting consumption patterns, anticipating grid congestion, and identifying the ‘cleanest moments of the day’ for energy supply (and demand). All of this helps forecast grid loads and supports the integration of renewables, while keeping the business consumer-led: accounting for customer growth and satisfaction, and helping Vandebron customers act in more environmentally friendly ways.
Its ingestion stack, however, was fragmented across Stitch, Airbyte, and custom Python scripts. Pipelines were massive, brittle, and hard to maintain. UI-bound tools couldn’t support Kafka ingestion or Vandebron’s custom authentication requirements for Salesforce.
The decision to change came when Vandebron needed to improve its handling of meter readings, which involved ingesting a huge amount of data via Kafka. The existing ingestion tools failed to meet the challenge:
We tried Airbyte for all this data but the Kafka source was barely maintained. And because of the UI-only implementation of Airbyte, you are bound to what the UI gives you. We also looked at using Stitch, but there you do have that pay-per-go, pay-per-usage model. We didn't like that idea for the amount of data we’d use.
- Lucas Fagliano, Senior Data Engineer @ Vandebron
When the problems became more than just complications, Lucas and his team moved towards an Infrastructure as Code approach.
The team required a transparent, code-based framework that could handle high-volume, time-sensitive data, and adapt to complex enterprise requirements like custom authentication for sources such as Salesforce.
Solution: dlt as the flexible, transparent, code-based "LEGO Manual"
Vandebron discovered dlt by chance, through a viral meme. The realization came quickly that dlt offered both deeper control and greater overall efficiency, particularly for Python users.
That led Lucas and the data platform team to explore dlt further for Vandebron’s needs, and it was put to work immediately, integrated to manage the company's most critical pipelines:
The initial test involved solving the difficult smart meter reading pipeline fed via Kafka. After learning dlt’s scope through documentation and code review, Vandebron had a barebones pipeline ready within a day; within a month, one of its most complex pipelines had been replaced; and within a couple of months, that pipeline was handling terabytes of data.
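As an illustration of what this looks like in dlt, below is a minimal sketch of a Kafka-to-Snowflake ingestion. It wraps the confluent_kafka consumer in a custom resource (dlt also ships a verified Kafka source); the topic name, consumer settings, and message schema are placeholders, not Vandebron’s actual setup.

```python
import json

import dlt
from confluent_kafka import Consumer  # assumed client; any Kafka consumer works here


@dlt.resource(name="smart_meter_readings", write_disposition="append")
def meter_readings(
    bootstrap_servers: str = dlt.secrets.value,  # injected from secrets.toml / env vars
    topic: str = "meter-readings",               # illustrative topic name
):
    """Yield JSON smart-meter messages from a Kafka topic until the backlog is drained."""
    consumer = Consumer({
        "bootstrap.servers": bootstrap_servers,
        "group.id": "dlt-meter-ingest",
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe([topic])
    try:
        while True:
            msg = consumer.poll(timeout=5.0)
            if msg is None:
                break  # nothing left to read in this run
            if msg.error():
                continue  # skip transient broker errors in this sketch
            yield json.loads(msg.value())
    finally:
        consumer.close()


pipeline = dlt.pipeline(
    pipeline_name="smart_meters",
    destination="snowflake",
    dataset_name="meter_readings_raw",
)
print(pipeline.run(meter_readings()))
```

Because the resource is plain Python, batching, offset handling, and schema tweaks stay fully under the team’s control instead of being hidden behind a UI.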
The team views dlt not just as a tool, but as a framework for assembly:
I always try to think that dlt for us is like LEGO. We have the pieces, dlt provides me and us with the manual of how to put those pieces together, and then off we go from there.
- Lucas Fagliano, Senior Data Engineer @ Vandebron
dlt’s core value for Vandebron was providing flexibility and complete transparency over the process, allowing Lucas and his team to maintain data integrity by customizing sources where necessary. This level of control allowed Vandebron to migrate away from proprietary connectors and to start modifying big components such as the Microsoft Business Central SQL sync and Salesforce.
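To make "customizing sources" concrete, here is a hedged sketch of a Salesforce resource built with the simple_salesforce client; the object, fields, and credential handling are illustrative only, and the authentication step can be swapped for whatever flow an enterprise setup demands. A similar pattern, using dlt’s sql_database source, can cover a Business Central SQL sync.

```python
import dlt
from simple_salesforce import Salesforce  # assumed client library for this illustration


@dlt.resource(name="contacts", write_disposition="merge", primary_key="Id")
def salesforce_contacts(
    username: str = dlt.secrets.value,
    password: str = dlt.secrets.value,
    security_token: str = dlt.secrets.value,
):
    # Because the source is plain Python, this login call can be replaced by
    # any custom authentication flow (JWT, connected app, sandbox domain, ...).
    sf = Salesforce(username=username, password=password, security_token=security_token)
    for record in sf.query_all("SELECT Id, Name, Email FROM Contact")["records"]:
        record.pop("attributes", None)  # drop the Salesforce metadata envelope
        yield record


pipeline = dlt.pipeline(
    pipeline_name="salesforce",
    destination="snowflake",
    dataset_name="salesforce_raw",
)
print(pipeline.run(salesforce_contacts()))
```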
Bonus: Leveraging LLMs
Another benefit of adopting a code-centric and Pythonic tool like dlt was how well it plays with Large Language Models (LLMs), boosting development speed.
One of the really good advantages of dlt compared to other tools is that if our team wants an LLM to help out to do something in dlt, the steps that the LLM has to do are pretty simple [as it goes through the OSS libraries]... If you want to ask the same thing through Stitch or for Airbyte, it has to go through the documentation online to figure out the things, and that becomes a problem.
- Lucas Fagliano, Senior Data Engineer @ Vandebron
Results: Efficiency, Clarity, and Operational Transformation
With dlt we could replace something that was extremely daunting by a state of simplicity.
- Lucas Fagliano, Senior Data Engineer @ Vandebron
With Infrastructure as Code thinking and the ambition to have real control over the outcomes of data ingestion, part of Vandebron’s existing setup needed to be overhauled, and the overhaul was no small task.
A key early result was that Vandebron completed a major, complex pipeline migration far faster and more simply than expected.
Once established, dlt provided immediate benefits and measurable improvements in efficiency, maintainability, and cost. In addition, dlt’s ease of use for Python users, the strong dlt community, and the wider dltHub backing were key reasons for embracing it.
After the first pipeline, the team moved the two remaining complex pipelines into production within a week of dev work, going from 3.2k lines of code to 472 in one pipeline, and from a complicated $700/month GUI setup to a $20/month pipeline that is version-controlled and configurable.
On top of that, all other production-dependent pipelines were moved as well. The success of the data platform is also drawing other teams in the company down the same path: a sister team inside Vandebron, focused on ML, has begun switching its Airbyte ingestions over to dlt.

The main reasons for why we ended up changing was reliability, we wanted to know what our pipelines were doing, extreme ease of use for Python users... and the open source community.
- Lucas Fagliano, Senior Data Engineer @ Vandebron
Future: Green Infrastructure and Open-Source Contribution
Vandebron’s next step is to make its infrastructure fully green, moving workflows away from AWS and separating compute and storage. Using dlt for reverse ETL, the team is experimenting with DuckLake and Iceberg to move data from Snowflake to internal layers, and exploring integrations with Weaviate for AI frameworks.
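A hedged sketch of that reverse-ETL step: reading curated tables back out of Snowflake with dlt’s sql_database source and writing them as Iceberg tables to self-managed object storage via the filesystem destination. The database, schema, table names, and bucket URL below are placeholders.

```python
import dlt
from dlt.sources.sql_database import sql_database

# Read curated marts back out of Snowflake (connection string and names are illustrative;
# credentials would normally live in secrets.toml rather than in code).
snowflake_tables = sql_database(
    credentials="snowflake://user:password@account/analytics_db",
    schema="MARTS",
    table_names=["consumption_forecasts", "grid_congestion"],
)

# Land the tables as Iceberg on internal object storage, keeping compute and storage separate.
pipeline = dlt.pipeline(
    pipeline_name="reverse_etl_internal_layer",
    destination=dlt.destinations.filesystem("s3://internal-lakehouse"),
    dataset_name="internal_layer",
)
print(pipeline.run(snowflake_tables, table_format="iceberg"))
```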
"dlt is the way we’ll go full circle," says Lucas. "It lets us take data back out of Snowflake and keep everything in our control."
Vandebron is also considering dlt for AI frameworks that require pushing data to Weaviate, the open-source vector database.
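If that route is taken, loading records into Weaviate is a small step with dlt’s weaviate destination and adapter. A hypothetical sketch with made-up documents; the fields marked for vectorization are illustrative only.

```python
import dlt
from dlt.destinations.adapters import weaviate_adapter

# Hypothetical documents to make semantically searchable.
docs = [
    {"title": "Wind forecast FAQ", "body": "How tomorrow's wind supply is estimated..."},
    {"title": "Cleanest moments", "body": "Why shifting consumption to sunny afternoons helps..."},
]

pipeline = dlt.pipeline(
    pipeline_name="docs_to_weaviate",
    destination="weaviate",
    dataset_name="knowledge_base",
)

# weaviate_adapter marks which columns Weaviate should vectorize for semantic search.
print(pipeline.run(
    weaviate_adapter(docs, vectorize=["title", "body"]),
    table_name="energy_docs",
))
```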
About Vandebron
Vandebron is an Amsterdam-based green energy company that connects Dutch consumers directly with local renewable energy producers — wind farms, solar parks, and bio-energy installations — through an online marketplace model. Rather than producing energy itself, Vandebron gives customers full transparency into where their electricity comes from and who they're supporting. The company has grown to serve over 200,000 households and was acquired by Essent in 2019, while continuing to operate independently. Beyond energy supply, Vandebron has expanded into e-mobility services, smart energy management, and home sustainability solutions, all driven by its mission to reach 100% sustainable energy in the Netherlands as quickly as possible.
