Related skills
SQL, Python, Databricks, Airflow, Informatica

Description
- Data Pipeline Development: design and maintain ETL/ELT pipelines.
- Data Warehousing: manage data warehouse on Google BigQuery.
- Data Processing: use Databricks and Python to process data.
- DevOps & Automation: CI/CD for data workflows (Dataform, GitHub).
- Collaboration: work with data scientists, analysts, stakeholders.
- Data Modeling & BI (Tableau) and Anaplan integration (optional).
Requirements
- Bachelor's degree in Computer Science, Engineering, or related field.
- 3+ years of experience in data engineering.
- Strong SQL and Google BigQuery experience.
- Hands-on Databricks and Dataproc experience.
- Experience with ETL/ELT tools like Informatica.
- Proficiency in orchestrating data pipelines with Airflow.
- Strong Python programming skills.
- DevOps principles with Dataform and GitHub for CI/CD.
Benefits
- DEIB commitment and inclusive culture.
- A winning culture with opportunities for development and rewards.
- Collaborative, cross-functional environment.
- Values-based culture that supports growth and recognition.