Related skills
SQL, Python, GCP, dbt, Airflow

Description
- Design, implement, and evolve scalable data infrastructure for cost and carbon.
- Own end-to-end data pipelines for cloud cost, usage, and emissions.
- Partner with data scientists, engineering, finance, and procurement to design data architectures.
- Set standards for data modeling, orchestration, testing, and observability.
- Build analytics-ready datasets for reporting, forecasting, and optimization.
- Ensure data accuracy and timeliness for cost and emissions reporting.
Requirements
- Senior data engineer with end-to-end production data systems experience.
- Degree in CS/engineering or equivalent proven experience.
- Designs data architectures that scale with large data volumes and system complexity.
- Leads technical discussions and aligns partners on long-term solutions.
- Proficient in Python, SQL, dbt, and orchestration tooling.
- Experience with Spark, Flink, or Dataflow and cloud platforms (GCP).