Related skills: Snowflake, Python, Kubernetes, Airflow, Spark

📋 Description
- Build, maintain, and optimize data infrastructure.
- Evolve AWS-based infrastructure.
- Work with Snowflake, Iceberg, Trino, Athena, and Glue.
- Utilize Airflow, Spark, Kubernetes, ArgoCD and AWS.
- Provide AI tools to ease data access for customers.
- Integrate external tools for anomaly detection or data ingestion.
🎯 Requirements
- 5+ years of experience as a Data Engineer or Backend Developer.
- Experience with Big Data and cloud-based environments (AWS preferred).
- Experience with Spark and Airflow.
- Experience with Snowflake, Databricks, BigQuery, or Iceberg.
- Strong development experience in Python.
- Knowledge of Scala for Spark is a plus.