Related skills
SQL, Python, Databricks, Airflow, Informatica

Description
- Data Pipeline Development: Build scalable ETL/ELT pipelines (Informatica, Airflow, Dataproc).
- Data Warehousing: Manage the data warehouse on Google BigQuery, ensuring reliable, well-organized data.
- Data Processing: Use Databricks and Python to transform large data volumes.
- DevOps & Automation: CI/CD pipelines for data workflows (Dataform, GitHub).
- Collaboration: Work with data scientists, analysts and stakeholders.
- Data Modeling & BI (Optional): Create data models and dashboards in Tableau.
๐ฏ Requirements
- Bachelor's degree in CS, Engineering, or related field.
- 5+ years' data engineering experience.
- Strong SQL and Google BigQuery proficiency.
- Databricks and Dataproc experience.
- ETL/ELT tools like Informatica.
- Proficient in Airflow for orchestration.
- Strong Python programming skills.
- Working knowledge of DevOps practices, including CI/CD with Dataform and GitHub.
Benefits
- Inclusive culture with a commitment to diversity, equity, inclusion, and belonging (DEIB).
- Reasonable accommodations available on request.