Related skills: Looker, SQL, Python, dbt, Airflow
📋 Description
- Identify and integrate data from multiple sources for analytics.
- Build high-quality data pipelines and BI data models.
- Develop and optimize code for cost-efficient pipelines.
- Create monitoring procedures to ensure ETL quality.
- Design processes to correct ETL incidents.
- Collaborate with CRM developers, data scientists, analysts, and product owners.
🎯 Requirements
- Experience in data engineering, big data, BI, or data science.
- Spark, Python, Scala, or similar.
- SQL for large-scale data transformation and analysis.
- Cloud data warehousing and ETL tools such as Dataflow, Looker, dbt, EMR, and Airflow.
- Cloud infrastructure experience (GCP preferred; AWS/Azure acceptable) and database management.
- Data governance, data quality practices, and visualization tools.