Added: 2 days ago
Type: Full time
Salary: Not provided

Related skills

GitHub, Python, Databricks, Airflow, Informatica

📋 Description

  • Data Pipeline Development: Build scalable ETL/ELT pipelines with Informatica, Airflow, and Dataproc.
  • Data Warehousing: Build and manage BigQuery data warehouse; ensure accuracy and accessibility.
  • Data Processing: Use Databricks and Python to process large data volumes.
  • DevOps & Automation: Implement CI/CD for data workflows using Dataform and GitHub.
  • Collaboration: Partner with data scientists, analysts, and stakeholders.
  • Data Modeling & BI (Optional): Build data models and dashboards in Tableau.
  • EPM Integration (Optional): Integrate data with Anaplan for planning.
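The pipeline work described above follows the classic extract-transform-load pattern. A minimal sketch in plain Python (in practice each step would be an Airflow task writing to BigQuery; the record shape, field names, and in-memory "warehouse" here are hypothetical stand-ins):

```python
# Minimal ETL sketch: extract raw records, transform them, load into a
# target store. The schema and the dict-based warehouse are hypothetical;
# a real pipeline would read from a source system and write to BigQuery.

def extract():
    # Stand-in for reading from a source system (API, database, files).
    return [
        {"order_id": 1, "amount": "19.99", "region": "emea"},
        {"order_id": 2, "amount": "5.00", "region": "amer"},
        {"order_id": 2, "amount": "5.00", "region": "amer"},  # duplicate
    ]

def transform(rows):
    # Cast types, normalize values, and deduplicate on the primary key.
    seen, out = set(), []
    for row in rows:
        if row["order_id"] in seen:
            continue
        seen.add(row["order_id"])
        out.append({
            "order_id": row["order_id"],
            "amount": float(row["amount"]),
            "region": row["region"].upper(),
        })
    return out

def load(rows, warehouse):
    # Stand-in for a BigQuery load job; appends to an in-memory table.
    warehouse.setdefault("orders", []).extend(rows)
    return len(rows)

warehouse = {}
loaded = load(transform(extract()), warehouse)
print(loaded)  # 2 rows survive after deduplication
```

Keeping each step a pure function of its input is what makes the same logic easy to wrap in Airflow operators later, since orchestration then only has to wire outputs to inputs.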

🎯 Requirements

  • Bachelor's degree in Computer Science, Engineering, or related field.
  • 3+ years of data engineering experience.
  • Strong SQL skills and BigQuery experience.
  • Hands-on with Databricks and Dataproc.
  • Experience with ETL/ELT tools like Informatica.
  • Proficiency with Airflow to orchestrate pipelines.
  • Strong Python programming skills.
  • DevOps principles and experience with Dataform and GitHub CI/CD.
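The DevOps requirement above implies automated checks that run in CI (e.g. via GitHub Actions) before a pipeline change merges. A minimal sketch of such a data-quality gate in plain Python (the column names and rules are hypothetical; a real check would run against a staging table):

```python
# Minimal data-quality gate of the kind a CI job might run before
# deploying a pipeline change. Columns and rules are hypothetical.

REQUIRED_COLUMNS = {"order_id", "amount", "region"}

def validate(rows):
    """Return a list of human-readable violations (empty list = pass)."""
    errors = []
    for i, row in enumerate(rows):
        missing = REQUIRED_COLUMNS - row.keys()
        if missing:
            errors.append(f"row {i}: missing columns {sorted(missing)}")
            continue
        if row["amount"] is None or row["amount"] < 0:
            errors.append(f"row {i}: amount must be non-negative")
        if not row["region"]:
            errors.append(f"row {i}: region must be non-empty")
    return errors

sample = [
    {"order_id": 1, "amount": 19.99, "region": "EMEA"},
    {"order_id": 2, "amount": -5.0, "region": "AMER"},
    {"order_id": 3, "amount": 7.5},  # region column missing entirely
]
violations = validate(sample)
print(violations)  # one amount violation, one missing-column violation
```

In CI the job would simply fail the build if `validate` returns a non-empty list, which keeps bad data from reaching the warehouse.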

๐ŸŽ Benefits

  • Competitive compensation and benefits.
  • Opportunities for learning and career growth.
  • Inclusive culture with DEIB commitment.
  • Work with Fortune 50 customers on a leading platform.

