Related skills
Java, PostgreSQL, Python, Scala, Airflow
Description
- Design and build large-scale data storage and processing systems that handle billions of records per day
- Lead the design and build of a scalable new data platform
- Mentor and coach engineers to raise engineering excellence
- Collaborate with stakeholders across engineering, product and business
- Define and monitor SLAs; apply SRE practices for stability
- Build APIs for internal and external data consumers
Requirements
- 8+ years of industry experience
- Strong programming: Python, Scala and/or Java
- Deep experience with data processing frameworks: Spark, Flink, Databricks, Snowflake, BigQuery
- Streaming systems: Kafka, Kinesis
- SQL and RDBMS (PostgreSQL, Citus)
- Orchestration and governance tools: Airflow, Argo, OpenMetadata, Great Expectations, dbt
Benefits
- Inclusive global culture with 750+ Smartlies across 60 nationalities
- Global impact: contribute to customers' success and growth
- Wellbeing-focused benefits: healthcare, mental health, leave
- Comprehensive rewards: equity, competitive pay, professional development
- Flexible workplace: hybrid model with remote options
- Career growth opportunities and ongoing learning