Related skills
bigquery, sql, python, dbt, airflow

Description
- Design, build, and maintain scalable data pipelines for BI.
- Work on GCP BigQuery data warehouse with dbt.
- Develop Python-based services and APIs for data integration.
- Build REST APIs with Flask and document with Swagger.
- Orchestrate data workflows with n8n or Airflow to ensure freshness.
- Follow Git workflows (PRs), CI/CD, and engineering best practices.
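The incremental, freshness-oriented pipeline work described above can be pictured with a minimal sketch. This is a pure-Python stand-in, not the team's actual tooling; the `updated_at` schema and the `incremental_batch` helper are hypothetical:

```python
from datetime import datetime, timezone

def incremental_batch(rows, watermark):
    """Return rows newer than the last processed watermark,
    plus the advanced watermark (hypothetical schema: each row
    is a dict with an 'updated_at' datetime)."""
    fresh = [r for r in rows if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in fresh), default=watermark)
    return fresh, new_watermark

# Example: only the second row is newer than the watermark.
wm = datetime(2024, 1, 1, tzinfo=timezone.utc)
rows = [
    {"id": 1, "updated_at": datetime(2023, 12, 31, tzinfo=timezone.utc)},
    {"id": 2, "updated_at": datetime(2024, 1, 2, tzinfo=timezone.utc)},
]
fresh, wm = incremental_batch(rows, wm)
```

In practice the watermark would live in BigQuery (or in dbt's `is_incremental()` logic) rather than in process memory, but the idempotent select-newer-than-watermark pattern is the same.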
Requirements
- Proven Data Engineer experience building data pipelines.
- Experience with BI tools such as MicroStrategy, Power BI, or Tableau.
- Strong SQL and data modeling fundamentals.
- Python development experience.
- Experience building REST APIs with Flask and flask_restful.
- API documentation with Swagger.
- Hands-on with dbt and data lineage.
- Git with PRs and CI/CD.
- Workflow orchestration with n8n or Airflow.
- Fluent English.
Benefits
- Variable bonus
- Dynamic international teams
- Access to self-learning courses
- Participation in meetups and conferences
- Flexible office with up to 2 days at home