Added: 4 hours ago
Type: Full time

Related skills

Snowflake, ETL, Python, Databricks, dbt

📋 Description

  • Design scalable cloud-native data pipelines (batch & streaming) for analytics.
  • Architect enterprise data models across lakes, warehouses, and real-time systems.
  • Define data governance, reliability, performance, and cost optimization standards.
  • Develop Retrieval-Augmented Generation (RAG) systems for AI apps.
  • Integrate LLM APIs into secure, production-ready applications.
  • Collaborate with Product, Engineering, and Ops to translate business requirements into technical solutions.

🎯 Requirements

  • Bachelor’s degree in CS or equivalent practical experience.
  • 8+ years of software or data engineering in production environments.
  • Experience integrating LLM APIs into production applications.
  • Hands-on experience with ETL/ELT and streaming technologies (Kafka, Spark, Snowflake, dbt).
  • Advanced SQL and familiarity with data storage formats.
  • Proficiency in Python and RESTful API development.

🎁 Benefits

  • Medical, Dental, and Vision insurance
  • 401(k) with company match
  • Flexible Spending Accounts (FSA)
  • Company-paid Life and Disability insurance
  • Flexible PTO and company holidays
  • Paid Parental Leave
