# Python Data Loading from Google Analytics to Google Cloud Storage with dlt
This technical documentation provides guidance on using the open-source Python library dlt to load data from Google Analytics to Google Cloud Storage. Google Analytics collects data from your websites and apps and generates insightful reports for your business. Google Cloud Storage is a versatile filesystem destination on the Google Cloud Platform, well suited for building data lakes, and it accepts data in JSONL, Parquet, or CSV format. dlt streamlines the transfer of data between these two platforms. For more information on Google Analytics, visit https://analytics.google.com.
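To make the format choice concrete before diving into setup: once the pipeline is initialized (steps below), the output format is a single argument on the run call. This is a minimal sketch, assuming the verified source exposes a `google_analytics` function accepting `property_id` and `queries` as in its scaffolded script; the property ID and query contents are illustrative placeholders:

```py
import dlt
from google_analytics import google_analytics  # source scaffolded by `dlt init` later in this guide

pipeline = dlt.pipeline(
    pipeline_name="google_analytics",
    destination="filesystem",  # Google Cloud Storage is addressed through the filesystem destination
    dataset_name="analytics_data",
)

# An illustrative GA4 report query; the dimensions and metrics are placeholders.
queries = [
    {
        "resource_name": "sample_analytics_data",
        "dimensions": ["browser", "city"],
        "metrics": ["totalUsers"],
    }
]

# Pick the output format per run: "jsonl", "parquet", or "csv".
load_info = pipeline.run(
    google_analytics(property_id=123456789, queries=queries),  # placeholder property ID
    loader_file_format="parquet",
)
print(load_info)
```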
## dlt Key Features
- **Google Analytics**: Google Analytics is a dlt verified source that loads data using the Google Analytics API to the destination of your choice. It includes resources for loading basic Analytics info, assembling and presenting report metrics, and compiling and displaying data related to the report's dimensions.
- **Governance Support in dlt Pipelines**: dlt pipelines offer robust governance support through features like pipeline metadata utilization, schema enforcement and curation, and schema change alerts. These features contribute to better data management practices, compliance adherence, and overall data governance. More details can be found here.
- **Google Storage**: dlt supports Google Storage, Azure Blob Storage, and the local file system as destinations. The setup guide for each of these storage options can be found here; a configuration sketch follows this list.
- **Setup Guide for Verified Sources**: dlt provides a comprehensive setup guide to help users get started with their data pipeline. The guide includes instructions on how to grab credentials for various bucket types and how to initialize the verified source. More information can be found here.
- **Google BigQuery**: dlt supports Google BigQuery as a destination. The setup guide provides step-by-step instructions on how to install the necessary dependencies, create a new Google Cloud project, create a service account, grant BigQuery permissions, and update your dlt credentials file with your service account info. More details can be found here.
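The Google Storage destination needs a bucket URL and service-account credentials before a pipeline can write to it. These normally live in `.dlt/secrets.toml`, which `dlt init` creates below; the following is a minimal sketch of the equivalent environment-variable configuration, assuming dlt's double-underscore section naming convention, with every value a placeholder:

```py
import os

# dlt falls back to environment variables when a value is not found in
# .dlt/secrets.toml; double underscores separate configuration sections.
os.environ["DESTINATION__FILESYSTEM__BUCKET_URL"] = "gs://my-analytics-bucket"  # placeholder bucket
os.environ["DESTINATION__FILESYSTEM__CREDENTIALS__PROJECT_ID"] = "my-gcp-project"  # placeholder
os.environ["DESTINATION__FILESYSTEM__CREDENTIALS__CLIENT_EMAIL"] = (
    "loader@my-gcp-project.iam.gserviceaccount.com"  # placeholder service account
)
os.environ["DESTINATION__FILESYSTEM__CREDENTIALS__PRIVATE_KEY"] = (
    "-----BEGIN PRIVATE KEY-----\n..."  # placeholder key material
)
```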
## Getting started with your pipeline locally
### 0. Prerequisites
dlt requires Python 3.8 or higher. Additionally, you need to have the pip package manager installed, and we recommend using a virtual environment to manage your dependencies. You can learn more about preparing your computer for dlt in our installation reference.
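If you are unsure which interpreter your environment resolves to, a quick standard-library check (nothing dlt-specific) can save a confusing install error later:

```py
import sys

# dlt requires Python 3.8 or higher; fail early with a clear message otherwise.
assert sys.version_info >= (3, 8), f"Python 3.8+ required, found {sys.version}"
print(sys.version)
```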
### 1. Install dlt
First you need to install the dlt library with the correct extras for Google Cloud Storage:
```shell
pip install "dlt[filesystem]"
```
The dlt CLI has a useful command to get you started with any combination of source and destination. Run the following commands to create a starting point for loading data from Google Analytics to Google Cloud Storage:
```shell
# create a new directory
mkdir google_analytics_pipeline
cd google_analytics_pipeline
# initialize a new pipeline with your source and destination
dlt init google_analytics filesystem
# install the required dependencies
pip install -r requirements.txt
```
The last command installs the required dependencies for your pipeline, which are listed in requirements.txt:
```text
google-analytics-data
google-api-python-client
google-auth-oauthlib
requests_oauthlib
dlt[filesystem]>=0.3.25
```
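As a quick sanity check that the pinned version resolved correctly, you can print the installed version from Python (dlt exposes it as `dlt.__version__`; `pip show dlt` gives the same information from the shell):

```py
import dlt

# requirements.txt pins dlt[filesystem]>=0.3.25, so this should print 0.3.25 or higher.
print(dlt.__version__)
```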
You now have the following folder structure in your project:
google_analytics_pipeline/
├── .dlt/