Related skills
Python, Databricks, Airflow, Informatica, Google BigQuery

Description
- Data Pipeline Development: Design, build, and maintain scalable ETL/ELT pipelines (Informatica, Airflow, Dataproc).
- Data Warehousing: Develop and manage a data warehouse on Google BigQuery.
- Data Processing: Use Databricks and Python to process large data volumes.
- DevOps & Automation: CI/CD for data workflows using Dataform and GitHub.
- Collaboration: Work with data scientists, analysts, and stakeholders to deliver high-quality data.
- EPM Integration (Optional): Integrate data with Anaplan for connected planning.
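The pipeline duties above center on the extract-transform-load pattern. As a rough illustration only, here is a minimal, dependency-free Python sketch of that flow; the function names, field names, and in-memory "table" are hypothetical and not tied to Informatica, Airflow, or BigQuery:

```python
# Illustrative ETL sketch: all names below (extract, transform, load,
# user_id, amount_usd) are invented for this example, not from any real stack.
import json


def extract(raw_lines):
    """Parse raw JSON lines into records (extract step)."""
    return [json.loads(line) for line in raw_lines]


def transform(records):
    """Rename fields, cast amounts to float, and drop invalid rows (transform step)."""
    return [
        {"user_id": r["id"], "amount_usd": float(r["amount"])}
        for r in records
        if r.get("amount") is not None
    ]


def load(rows, sink):
    """Append cleaned rows to a destination list standing in for a warehouse table."""
    sink.extend(rows)
    return len(rows)


raw = ['{"id": 1, "amount": "10.50"}', '{"id": 2, "amount": null}']
table = []
loaded = load(transform(extract(raw)), table)
```

In a production pipeline these three steps would typically be separate Airflow tasks writing to BigQuery rather than an in-memory list.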
Requirements
- Bachelor's degree in CS, Engineering, or related field.
- 5+ years data engineering experience.
- SQL and Google BigQuery proficiency.
- Databricks and Dataproc experience.
- Hands-on experience with Informatica ETL/ELT tools.
- Python programming; Airflow for orchestration; Dataform & GitHub CI/CD.
Benefits
- DEIB-focused, inclusive culture with opportunities to work with Fortune 50 clients.
- Collaborative, fast-paced environment with growth and development.
- Exposure to an AI-infused scenario planning and analytics platform.