Related skills: Snowflake, SQL, Python, dbt, Airflow
Description
- Develop, test, and maintain dimensional data models from diverse sources
- Improve data model performance and accessibility
- Build high-quality data products for internal and external use
- Advise product on analytical requirements for new features
- Analyze feature performance with analytical queries and dashboards
- Document data to ensure team-wide consistency
Requirements
- Expert knowledge of SQL
- Experience building and testing data modelling pipelines
- Understanding of data platform and data warehousing concepts
- Familiarity with dbt (required)
- Interest in improving performance/efficiency of data models
- English communication at C1+ level
- Desire to influence product decisions via data architecture
- Responsible AI tool proficiency
Nice to have
- Bonus: Good Polish language skills
- Drive product decisions and own features
- Familiarity with Fivetran, Airflow, Snowflake, Plotly, StarRocks, Metabase
- Experience building ELT pipelines with SQL and Python
- Experience in fast-paced companies
- Experience fine-tuning and controlling AI tools