tl;dr: Python + GCP + BigQuery + k8s.
* Develop robust ETL/ELT pipelines to extract, transform, and load data from diverse sources into our data warehouse (a rough sketch of this kind of work follows the list).
* Enhance and maintain our cloud-based data storage and processing systems for performance, reliability, and cost-efficiency.
* Implement rigorous data quality checks, monitoring, and security measures across all data assets.
* 5+ years of experience in data engineering, with a strong grasp of data warehousing, ETL/ELT principles, and data modeling.
* Experience with BigQuery (or a comparable warehouse: Redshift, Snowflake, etc.)
* Experience with infrastructure tools (e.g. Terraform, Kubernetes, Docker) is a plus.
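For flavor (not part of the application), here's a minimal sketch of the ELT-plus-quality-check loop the first and third bullets describe, using the google-cloud-bigquery client library. Every project, bucket, and table name below is a hypothetical placeholder, not our actual infrastructure:

```python
"""Minimal ELT sketch with a post-load quality check.

Assumes google-cloud-bigquery and ambient GCP credentials; all
names (example-project, example-bucket, analytics.*) are made up.
"""
from google.cloud import bigquery

client = bigquery.Client()  # picks up credentials from the environment

# Extract + load: ingest CSV files from GCS straight into BigQuery,
# letting schema autodetection do the work for this sketch.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)
load_job = client.load_table_from_uri(
    "gs://example-bucket/events/*.csv",        # hypothetical source
    "example-project.analytics.raw_events",    # hypothetical table
    job_config=job_config,
)
load_job.result()  # block until the load job finishes

# Transform inside the warehouse (the "T" in ELT) with plain SQL.
client.query(
    """
    CREATE OR REPLACE TABLE `example-project.analytics.daily_events` AS
    SELECT DATE(event_ts) AS day, COUNT(*) AS events
    FROM `example-project.analytics.raw_events`
    GROUP BY day
    """
).result()

# A basic data quality check: fail loudly if today's partition is empty,
# so the scheduler can alert instead of silently shipping stale data.
rows = client.query(
    """
    SELECT COUNT(*) AS n
    FROM `example-project.analytics.daily_events`
    WHERE day = CURRENT_DATE()
    """
).result()
if next(iter(rows)).n == 0:
    raise RuntimeError("Data quality check failed: no rows loaded today")
```

In production this would run under a scheduler with the checks wired into monitoring and alerting, per the bullets above.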
* I will read all resumes! We need more resumes!

Job link: https://job-boards.greenhouse.io/epickids/jobs/6669024003
I know the posting asks for 5+ years, but I've been working at scale: I've built ETL pipelines processing TBs daily and have solid experience with data quality monitoring and cost optimization.
Happy to discuss - email in my profile.