Related skills
SQL, Python, Databricks, Airflow, Informatica

📋 Description
- Data Pipeline Development: design and maintain ETL/ELT pipelines.
- Data Warehousing: manage the data warehouse on Google BigQuery.
- Data Processing: process data with Databricks and Python.
- DevOps & Automation: build CI/CD for data workflows (Dataform, GitHub).
- Collaboration: partner with data scientists and analysts.
- Data Modeling & BI (optional): build data models and dashboards.
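To illustrate the pipeline work described above, here is a minimal ETL sketch in plain Python. Everything in it is hypothetical: the function names, fields, and the in-memory list standing in for a warehouse table are illustrative only, not taken from the posting or any specific tool.

```python
# Minimal extract -> transform -> load sketch.
# All names and data here are illustrative assumptions.

def extract(rows):
    """Stand-in for a source read: return raw records as dicts."""
    return [dict(r) for r in rows]

def transform(records):
    """Normalize: drop rows missing an id, uppercase the name field."""
    return [
        {**r, "name": r["name"].upper()}
        for r in records
        if r.get("id") is not None
    ]

def load(records, sink):
    """Append cleaned records to an in-memory sink
    (a stand-in for a warehouse table) and report the row count."""
    sink.extend(records)
    return len(records)

sink = []
raw = [{"id": 1, "name": "alice"}, {"id": None, "name": "bob"}]
loaded = load(transform(extract(raw)), sink)
# loaded == 1: the row without an id is filtered out
```

In a real pipeline each stage would typically be a separate task in an orchestrator such as Airflow, so failures can be retried per stage rather than rerunning the whole flow.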
🎯 Requirements
- Bachelor's degree in Computer Science, Engineering, or a related technical field.
- 5+ years of proven experience in a data engineering role.
- Strong proficiency in SQL and extensive experience with Google BigQuery.
- Hands-on experience with Databricks and Dataproc, as well as ETL/ELT tools such as Informatica.
- Proficiency in orchestrating data pipelines with Airflow.
- Strong programming skills in Python.