Added: 10 days ago
Type: Full time
Related skills: BigQuery, AWS, SQL, Python, GCP
Description
- Architect, design, and maintain scalable data pipelines for large datasets.
- Build robust ingestion and transformation workflows (ELT/ETL).
- Implement cloud architecture (AWS/GCP) with security and governance.
- Optimize performance and scale data storage/compute.
- Collaborate with Analytics, Product, and Design teams.
- Own data quality and explore new technologies for reliability.
Requirements
- 3+ years of data engineering experience.
- Cloud infrastructure mastery on AWS and/or GCP.
- Advanced Python, SQL, Scala or Java skills.
- Big data and streaming: Spark, Hadoop, or Beam; Kafka, Pub/Sub, or Kinesis.
- Orchestration: Airflow; transformation: dbt.
- DevOps/CI/CD with Docker, Kubernetes.
- Modern warehousing with Snowflake/BigQuery/Redshift.
- Problem solving: troubleshoot distributed systems.
Benefits
- Hybrid work: two days remote, three days in-office.
- Comprehensive benefits and competitive compensation.
- Top-tier hardware and async-friendly culture.