Software Engineer III, Data Pipelines

Added: 23 days ago
Type: Full time

Related skills

PostgreSQL, Python, dbt, Flask, Data Lakehouse

πŸ“‹ Description

  • Design, build, and optimize ETL/ELT pipelines using dbt
  • Develop and maintain scalable data infrastructure across PostgreSQL, Cosmos DB, and Azure services
  • Manage Data Lakehouse architecture, including Apache Iceberg
  • Build Python-based APIs and microservices (Flask or similar)
  • Design backend systems for real-time and batch data access
  • Work with distributed query engines (Trino) to analyze large datasets
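The ETL/ELT responsibility above follows a common pattern: load raw records into the warehouse first, then transform them with SQL (which is what a dbt model expresses). A minimal sketch, using an in-memory SQLite database as a stand-in for the posting's PostgreSQL warehouse; the table names and sample data are invented for illustration.

```python
import sqlite3

# Invented sample data standing in for raw ingested records.
raw_orders = [
    {"id": 1, "amount": "19.99", "status": "complete"},
    {"id": 2, "amount": "5.00", "status": "refunded"},
    {"id": 3, "amount": "42.50", "status": "complete"},
]

# "Load" step: land raw data as-is (amounts still text).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount TEXT, status TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (:id, :amount, :status)", raw_orders
)

# "Transform" step: the kind of cleanup a dbt model would express in SQL,
# casting types and filtering to valid rows.
conn.execute(
    """
    CREATE TABLE stg_orders AS
    SELECT id, CAST(amount AS REAL) AS amount_usd
    FROM raw_orders
    WHERE status = 'complete'
    """
)

total = conn.execute("SELECT SUM(amount_usd) FROM stg_orders").fetchone()[0]
print(round(total, 2))
```

In a real dbt project the transform would live in its own `.sql` model file with tests and lineage; the point here is only the load-then-transform ordering that distinguishes ELT from ETL.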

🎯 Requirements

  • 5+ years in data engineering or related roles
  • Strong Python skills with production-grade APIs and microservices
  • Deep expertise in SQL and PostgreSQL (schema design, performance)
  • Hands-on experience with dbt for data transformation and pipeline development
  • Experience with large-scale data systems and data lake environments
  • Familiarity with Azure services (including Cosmos DB) and Trino
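The schema-design and performance expectation above usually comes down to matching indexes to access paths. A small illustrative sketch in portable SQL, run here on SQLite so it is self-contained; the idea of a composite index covering a filter-then-sort query carries over to PostgreSQL (table and column names are made up).

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE events (
        event_id   INTEGER PRIMARY KEY,
        user_id    INTEGER NOT NULL,
        event_type TEXT    NOT NULL,
        created_at TEXT    NOT NULL
    );
    -- Composite index matching the common "filter by user, order by time"
    -- access path, so the query below avoids a full scan and a sort.
    CREATE INDEX idx_events_user_time ON events (user_id, created_at);
    """
)

plan = conn.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT * FROM events WHERE user_id = 7 ORDER BY created_at"
).fetchall()
print(plan)
```

On PostgreSQL the equivalent check is `EXPLAIN`, and column order in the index matters the same way: equality columns first, then the sort column.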

🎁 Benefits

  • Comprehensive medical, vision, and dental benefits
  • Flexible Spending Accounts (FSA)
  • 401(k) with matching contributions
  • Generous sick, vacation, and holiday benefits
  • Gym membership contribution
  • Internal referral program