Related skills
Looker, SQL, Python, DBT, Airflow

📋 Description
- Identify, collect, and integrate data from sources to build BI pipelines.
- Develop and optimize code for cost-effective data pipelines.
- Build monitoring to ensure the reliability of ETL flows and data quality.
- Design processes to correct ETL incidents.
- Collaborate with CRM/data teams to align data with initiatives.
- Design data architectures and assist cloud migrations.
🎯 Requirements
- Experience in data engineering, big data, BI, or data science.
- Proficiency with Spark, Python, Scala, or similar.
- Excellent SQL skills for large-scale data transformations.
- Experience with cloud data warehousing and pipelines (e.g., Dataflow, Looker, DBT, EMR, Airflow).
- Experience with cloud-based infra (GCP preferred; AWS/Azure OK).
- Programming in JavaScript or Python.