Related skills: Java, Snowflake, SQL, Python, Databricks

Description
- Develop end-to-end data solutions in production ensuring performance and security.
- Collaborate on data integration across Snowflake/AWS/Azure/GCP with dbt.
- Build scalable data pipelines and robust architectures across data sources.
- Communicate with clients and stakeholders; present findings and recommendations.
- Drive design, documentation, and deployment across the full SDLC.
- Mentor teammates and contribute to data engineering best practices.
Requirements
- 4+ years of experience as a Software Engineer, Data Engineer, or Data Analyst
- Proficiency in Java, Python, and/or Scala
- SQL: ability to write, debug, and optimize queries
- Experience with Snowflake, AWS, Azure, Databricks, and GCP
- Familiarity with data integration tools: Spark, Kafka, Airflow, NiFi, Fivetran, etc.
- 4-year Bachelor's degree in Computer Science or a related field
Benefits
- Remote-first global company with flexible work
- Autonomy to deliver results in a casual, exciting environment
- Award-winning workplace recognition
- Opportunity to work with leading cloud data platforms and partners