Added: less than a minute ago
Type: Full time
Salary: Not provided

Related skills

Java, Snowflake, SQL, Python, Databricks

πŸ“‹ Description

  • Develop end-to-end data solutions in production ensuring performance and security.
  • Collaborate on data integration across Snowflake/AWS/Azure/GCP with dbt.
  • Build scalable data pipelines and robust architectures across data sources.
  • Communicate with clients and stakeholders; present findings and recommendations.
  • Drive design, documentation, and deployment across the full SDLC.
  • Mentor teammates and contribute to data engineering best practices.

🎯 Requirements

  • 4+ years of experience as a Software Engineer, Data Engineer, or Data Analyst
  • Proficiency in Java, Python, and/or Scala
  • Ability to write, debug, and optimize SQL queries
  • Experience with Snowflake, AWS, Azure, Databricks, and GCP
  • Experience with data integration tools such as Spark, Kafka, Airflow, NiFi, and Fivetran
  • 4-year Bachelor's degree in Computer Science or a related field

🎁 Benefits

  • Remote-first global company with flexible work
  • Autonomy to deliver results in a casual, exciting environment
  • Award-winning workplace recognition
  • Opportunity to work with leading cloud data platforms and partners