Related skills
BigQuery, Snowflake, Databricks, Airflow, Kafka

Description
- Build and maintain platform capabilities for reliable ingestion, storage, and processing at scale.
- Contribute to petabyte-scale data lake modernization and storage format migrations.
- Develop platform features to support AI/ML workflows on the data lake.
- Partner with engineering teams to integrate with the customer data platform.
- Participate in an on-call rotation and help define operational standards for platform services.
Requirements
- 3+ years of software engineering experience building production systems.
- Proficiency in a general-purpose programming language (Python, Go, Java, or C#).
- Familiarity with batch/streaming data systems concepts (scheduling, backfills, idempotency).
- Experience debugging and operating production services using logs, metrics, and incident response practices.
- Experience with big data tooling such as Spark/SparkSQL, Kafka, Hive, Airflow, or Superset.
- Experience with Databricks or other big data platforms (Snowflake, Redshift, BigQuery).