Added: less than a minute ago
Type: Full time
Salary: Not provided

Related skills

BigQuery, Looker, Docker, SQL, Python

πŸ“‹ Description

  • Design, build, and maintain scalable batch and real-time ETL/ELT pipelines using GCP tools.
  • Architect data infrastructure for high-volume ingestion and processing.
  • Develop and manage a centralized data warehouse in BigQuery.
  • Design data models, schemas, and tables for performance and maintainability.
  • Write clean SQL and Python to transform data into analysis-ready datasets.
  • Build workflows supporting analytics, reporting, and data science.
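The "clean SQL and Python" transformation work described above can be sketched as follows. This is a minimal, hypothetical example (the event schema, field names, and aggregation are illustrative assumptions, not taken from the posting) of turning raw string-typed event records into an analysis-ready daily aggregate using only the Python standard library:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical raw events, as they might land from an ingestion pipeline:
# string-typed fields that need parsing before analysis.
raw_events = [
    {"user": "a", "ts": "2024-05-01T10:00:00", "amount": "10.5"},
    {"user": "a", "ts": "2024-05-01T12:30:00", "amount": "4.5"},
    {"user": "b", "ts": "2024-05-02T09:15:00", "amount": "7.0"},
]

def to_daily_totals(events):
    """Transform raw events into per-user, per-day spend totals."""
    totals = defaultdict(float)
    for e in events:
        # Parse the ISO timestamp and bucket by calendar day.
        day = datetime.fromisoformat(e["ts"]).date().isoformat()
        totals[(e["user"], day)] += float(e["amount"])
    # Emit analysis-ready rows, sorted for deterministic output.
    return [
        {"user": user, "day": day, "total": total}
        for (user, day), total in sorted(totals.items())
    ]

rows = to_daily_totals(raw_events)
```

In a production pipeline this shaping step would typically run inside BigQuery as SQL or be orchestrated by a workflow tool, but the core pattern is the same: parse, group, aggregate, and emit a tidy table.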

🎯 Requirements

  • 5+ years in data engineering or data platform development.
  • Degree in Computer Science, Engineering, Mathematics, or related STEM.
  • Strong SQL and Python programming skills.
  • Experience building ETL pipelines and working with Databricks.
  • Cloud experience (GCP, AWS, or Azure).
  • Hands-on with Airflow and Hadoop.

🎁 Benefits

  • Fully remote, with a potential transition to hybrid in the future.
  • 100% of health, dental, and vision premiums covered for you and your dependents.
  • Benefits eligibility starts on day one.
  • Growth and learning resources available.
  • Inclusive, equal-opportunity workplace.