Related skills
GitHub, Python, Databricks, Airflow, Informatica

Description
- Data Pipeline Development: Build scalable ETL/ELT pipelines with Informatica, Airflow, and Dataproc.
- Data Warehousing: Build and manage the BigQuery data warehouse, ensuring data accuracy and accessibility.
- Data Processing: Use Databricks and Python to process large data volumes.
- DevOps & Automation: Implement CI/CD for data workflows using Dataform and GitHub.
- Collaboration: Partner with data scientists, analysts, and stakeholders.
- Data Modeling & BI (Optional): Build data models and dashboards in Tableau.
- EPM Integration (Optional): Integrate data with Anaplan for planning.
Requirements
- Bachelor's degree in Computer Science, Engineering, or related field.
- 3+ years of data engineering experience.
- Strong SQL skills and BigQuery experience.
- Hands-on experience with Databricks and Dataproc.
- Experience with ETL/ELT tools like Informatica.
- Proficiency with Airflow to orchestrate pipelines.
- Strong Python programming skills.
- Understanding of DevOps principles, with experience implementing CI/CD using Dataform and GitHub.
Benefits
- Competitive compensation and benefits.
- Opportunities for learning and career growth.
- Inclusive culture with DEIB commitment.
- Work with Fortune 50 customers on a leading platform.