Related skills
bigquery, sql, python, dbt, airflow
📋 Description
- Pipeline and Workflow Orchestration: Design and operate workflows with observability.
- Data Modeling and Transformation: Design scalable transformations with dbt, BigQuery, Spark.
- Data Quality and Platform Standards: Define data quality, contracts, lineage standards.
- Handling Real-World Data Complexity: Manage incomplete, evolving data and schema drift.
- Cloud and Infrastructure Integration: Work in cloud-native environments (GCP/AWS) with IaC.
- Platform Improvement and Simplification: Improve observability and simplify architecture.
🎯 Requirements
- 5+ years building and operating production data systems.
- Strong Python and SQL; experience with data pipelines, orchestration, and warehousing.
- Experience with data quality, observability, failure modes, and long-running workflows.
- Strong communication and collaboration across engineering, product, and business teams.
- Experience with Dagster, Airflow, or Prefect.
- Experience with GCP, dbt, BigQuery, or event-driven architectures is a plus.
🎁 Benefits
- Competitive compensation with equity
- Ownership and impact on Avra’s data platform
- Technical environment centered on real-world data problems
- Lean, high-quality team with experienced engineers
- Flexible culture: remote-first (Brazil) with autonomy
- Flexible time off