Related skills
BigQuery, Redshift, AWS, Snowflake, SQL

Description
- Designing, building, and optimizing data pipelines and ETL for large data.
- Developing data platforms and warehousing solutions to support analytics.
- Implementing scalable storage for high availability and performance.
- Boosting data processing performance for speed and scalability.
- Collaborating with data scientists and stakeholders to translate needs.
- Ensuring data quality, governance, and compliance.
๐ฏ Requirements
- 6+ years of commercial data engineering experience.
- Experience delivering in Agile environments.
- Hands-on with Databricks, SQL, Python, and Spark.
- Data warehousing tech: Snowflake, Redshift, BigQuery.
- Cloud data solutions: AWS, Azure, GCP; data lake architectures.
- Automation, CI/CD for data pipelines; IaC.
Benefits
- Annual company bonus shared with team.
- Remote work with flexible hours.
- Paid time off: 28 days + holidays.
- Remote working allowance up to €1250 every 2 years.
- Training allowance up to €1000.
- Health care and pension/insurance plans.