Related skills
bigquery, snowflake, python, databricks, airflow

Description
- Build and maintain platform capabilities for ingestion, storage, and processing of data at scale.
- Contribute to petabyte-scale data lake modernization and format migrations.
- Develop platform features to support AI/ML workflows on the data lake.
- Partner with engineering teams to integrate with the customer data platform.
- Participate in on-call rotations and define operational standards.
Requirements
- 3+ years of software engineering experience building production systems.
- Proficiency in Python, Go, Java, or C#.
- Familiarity with batch/streaming data systems concepts.
- Experience debugging and operating production services using logs and metrics.
- Experience with big data tooling such as Spark, Kafka, Hive, Airflow, or Superset.
Benefits
- Engineering Career Framework provides growth paths.
- Remote-friendly role open to candidates in Canada and select locations.
- Shared on-call rotations and common operational standards across teams.