Added: 20 hours ago
Type: Full time
Salary: Not provided

Related skills

BigQuery, Looker, Terraform, SQL, Python

📋 Description

  • Design, build, and maintain scalable data pipelines (ETL) for reliability
  • Manage data warehouse architecture for performance and cost efficiency
  • Implement data governance, security, and monitoring systems
  • Own data ingestion from multiple sources and integrate it into the analytics platform
  • Build data models and semantic layers for self-service analytics
  • Implement automated data quality checks and alerting systems
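The last bullet above can be made concrete with a minimal sketch of an automated data quality check with an alerting hook. The column names (`order_id`, `amount`) and the thresholds are illustrative assumptions, not from the posting; a production pipeline would more likely use a framework such as dbt tests or Great Expectations.

```python
# Minimal data quality check sketch; rows are assumed to arrive as dicts.
# Column names ("order_id", "amount") are hypothetical examples.

def check_rows(rows, required=("order_id", "amount")):
    """Return a list of human-readable data quality issues found in `rows`."""
    issues = []
    for i, row in enumerate(rows):
        # Completeness: every required column must be present and non-null.
        for col in required:
            if row.get(col) is None:
                issues.append(f"row {i}: missing {col}")
        # Validity: amounts must be non-negative numbers.
        amount = row.get("amount")
        if isinstance(amount, (int, float)) and amount < 0:
            issues.append(f"row {i}: negative amount {amount}")
    return issues

def alert(issues):
    # Placeholder alerting hook; a real system might page on-call or post to Slack.
    for issue in issues:
        print(f"DATA QUALITY ALERT: {issue}")

rows = [
    {"order_id": 1, "amount": 9.99},
    {"order_id": None, "amount": -5.0},
]
alert(check_rows(rows))
```

In practice checks like these run as a pipeline step (e.g. an Airflow task) after each load, failing the run or raising an alert before bad data reaches downstream models.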

🎯 Requirements

  • 3+ years in data engineering; expert SQL; BigQuery preferred
  • Proficiency with data pipeline tools: Airflow, Terraform, Fivetran, dbt
  • Strong Python for data pipelines, API integrations, and ETL/ELT
  • Experience with AI-driven development tools to accelerate pipeline development and documentation
  • Strong data modeling principles and dimensional design
  • Experience with data governance, security, and quality frameworks