Related skills
Looker · SQL · Python · Kubernetes · dbt

📋 Description
- Identify and integrate data from multiple sources to build pipelines.
- Develop and optimize code to run pipelines at low cost.
- Build monitoring procedures to ensure the reliability of ETL flows and data quality.
- Design data architectures and support cloud data migrations.
- Collaborate with CRM developers, data scientists, analysts, and product owners.
- Advise analytics teams on data practices and governance.
🎯 Requirements
- Experience in data engineering, big data, BI, or data science.
- Proficiency in Spark, Python, or Scala, with strong SQL skills.
- Experience with cloud data warehouses and pipelines (GCP preferred; AWS/Azure acceptable).
- Experience with Google Dataflow, Looker, dbt, EMR, and Airflow.
- Programming in JavaScript or Python; database management and performance tuning.
- Data governance, data quality, and data modeling with GitHub/dbt workflows.