Related skills
looker, sql, python, kubernetes, dbt

📋 Description
- Identify, collect, and integrate data to build high-quality analytics pipelines.
- Develop and optimize pipeline code for low running cost and easy maintenance.
- Build monitoring procedures and tools to ensure solid ETL and data quality.
- Design processes and tools to correct ETL incidents.
- Consult with analytics teams to ensure best practices in data usage.
- Design data architectures and support cloud data migrations.
🎯 Requirements
- Experience in data engineering, big data, BI, or data science.
- Spark, Python, Scala, or similar.
- Excellent SQL skills enabling large-scale data transformation and analysis.
- Cloud data warehousing, pipelines, and ETL (e.g., Dataflow, Looker, dbt, EMR, Airflow).
- Experience with cloud-based data infrastructures (GCP preferred; AWS or Azure ok).
- Programming in Python or JavaScript.
- Expertise in managing databases, including performance tuning, backup, and recovery.
- Data quality management: profiling, cleansing, and validation.
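To illustrate the data quality responsibilities above (profiling, cleansing, and validation), here is a minimal Python sketch of a batch validation check. The schema, field names, and rules are hypothetical examples, not taken from this posting.

```python
# Minimal data-quality validation sketch (hypothetical schema and rules).
# Checks completeness and key uniqueness on a batch of records before loading.

def validate_batch(records, required_fields, key_field):
    """Return counts of quality issues found in `records`."""
    issues = {"missing_fields": 0, "duplicate_keys": 0}
    seen_keys = set()
    for row in records:
        # Completeness: every required field must be present and non-empty.
        if any(row.get(f) in (None, "") for f in required_fields):
            issues["missing_fields"] += 1
        # Uniqueness: the business key must not repeat within the batch.
        key = row.get(key_field)
        if key in seen_keys:
            issues["duplicate_keys"] += 1
        seen_keys.add(key)
    return issues

batch = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": ""},               # incomplete record
    {"id": 1, "email": "c@example.com"},  # duplicate business key
]
print(validate_batch(batch, ["id", "email"], "id"))
# → {'missing_fields': 1, 'duplicate_keys': 1}
```

In practice these checks would run inside an orchestrator such as Airflow or as dbt tests, failing or quarantining the batch when thresholds are exceeded.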