Related skills
bigquery, looker, docker, sql, python
Description
- Design, build, and maintain scalable batch and real-time ETL/ELT pipelines using GCP tools.
- Architect data infrastructure for high-volume ingestion and processing.
- Develop and manage a centralized data warehouse in BigQuery.
- Design data models, schemas, and tables for performance and maintainability.
- Write clean SQL and Python to transform data into analysis-ready datasets.
- Build workflows supporting analytics, reporting, and data science.
Requirements
- 5+ years in data engineering or data platform development.
- Degree in Computer Science, Engineering, Mathematics, or related STEM.
- Strong SQL and Python programming skills.
- Experience with ETL pipelines and Databricks.
- Cloud experience (GCP, AWS, or Azure).
- Hands-on with Airflow and Hadoop.
Benefits
- Fully remote, with a potential transition to hybrid in the future.
- 100% of health, dental, and vision premiums covered for you and your dependents.
- Benefits eligibility starts on day one.
- Growth and learning resources available.
- Inclusive, equal-opportunity workplace.