Related skills
bigquery, looker, snowflake, sql, python

Shape the Future of AI & Data with Us
At Datatonic, we are Google Cloud's premier partner in AI, driving transformation for world-class businesses. We push the boundaries of technology with expertise in machine learning, data engineering, and analytics on Google Cloud Platform. By partnering with us, clients future-proof their operations, unlock actionable insights, and stay ahead of the curve in a rapidly evolving world.
Your Mission
Data Engineers at Datatonic work across a wide range of projects, helping clients unlock the full potential of the Modern Data Stack. You’ll bring expertise in our technologies of choice (dbt, Looker, Snowflake, BigQuery, Google Cloud, Sigma, Fivetran, Python, Spark, Pub/Sub) and apply them to solve real client challenges.
In this role, you’ll collaborate closely with a Delivery Manager, support project teams, and make hands-on contributions to the codebase where needed. Our Data Engineers combine strong technical skills with a client-focused mindset, ensuring that data solutions are not only well-engineered but also deliver measurable impact.
What You’ll Do
Foundational Support for Analytics and Data Science Teams: Build the infrastructure that enables analytics and data science teams to deliver innovative, impactful solutions for clients.
Google Cloud Migration and Data Warehouse Solutions: Assist clients in migrating their existing business intelligence and data warehouse solutions to Google Cloud.
Build Scalable Data Pipelines: Develop and optimize robust data pipelines, making data easily accessible for visualization and machine learning applications.
Design and Build Data Warehouses and Data Marts: Develop and implement new data warehouse and data mart solutions, including:
Transforming, testing, deploying, and documenting data.
Understanding data modeling techniques.
Optimizing data storage for warehouse technologies.
Manage Cloud Infrastructure: Build, maintain, and troubleshoot cloud-based infrastructure to ensure high availability and performance.
Collaboration with Technology Partners: Work closely with technology partners such as Google Cloud, Snowflake, dbt, and Looker, mastering their technologies and building a network with their engineers.
Agile and Dynamic Team Collaboration: Collaborate in an agile and dynamic environment with a team of data engineers, BI analysts, data scientists, and machine learning experts.
Applying Software Engineering Best Practices: Implement software engineering best practices to analytics processes, such as version control, testing, and continuous integration.
What You’ll Bring
Experience: 2+ years in a data-related role (e.g., Data Engineer, Data Analyst, Analytics Engineer).
Technical Expertise: Hands-on experience with Looker, dbt, modern data warehouses like Snowflake or BigQuery, and Kimball data modeling.
Strong Programming Skills: Expertise in Python and/or Java, with proficiency in SQL.
Experience in Data Engineering: 2+ years developing and building scalable data solutions.
High-Quality Code Standards: Ability to write tested, resilient, and well-documented code.
Cloud Computing Experience: Experience in building and maintaining cloud infrastructure (GCP or AWS is a plus).
Problem-Solving Mindset: Ability to take ownership and support projects from concept to completion.
Project Management: Natural ability to manage multiple initiatives and clients simultaneously.
SQL Proficiency: Skilled in writing analytical SQL, with an understanding of the difference between SQL that works and performant SQL.
Business Translation: Experience in translating business requirements into technical solutions.
Communication Skills: Ability to communicate complex ideas simply to a wide range of audiences.
Cultural Alignment: Complete alignment with our culture of transparency, empathy, accountability, and performance.
Bonus points if you have
dbt Developer certification
Google Cloud Professional Data Engineer certification
Snowflake SnowPro certification
Experience with Scrum methodology
Client-Facing Role: Prior experience in a client-facing or consulting position
API Development Experience: Experience building scalable REST APIs using Python or similar technologies.
What’s in It for You?
We believe in empowering our team to thrive, with benefits including:
20 days of paid vacation per calendar year
Public Holidays for your Province of Residence
5 Wellness days (sickness, personal time, mental health)
5 Lifestyle days (religious events, volunteer day, sick day)
Matching Group Retirement Savings Plan after 3 months
Competitive Group Insurance plan on Day 1 - individual premium paid 100%!
Virtual Medicine and Family Assistance Program - 100% employer-paid!
Home office budget - We are 100% remote!
CAD $70/month for internet/phone expenses
CAD $1,500 every 3 years for tech accessories and office equipment (monitor, keyboard, mouse, desk, etc.) starting on Day 1
Company-supplied MacBook Pro or Air
CAD $400/year for books, relevant app subscriptions, or an e-reader
Opportunities for paid certifications
Opportunities for professional and personal learning through Google and other training programs
Regular company off-sites and meetups
Why Datatonic?
Datatonic is a UK-based company with an Americas division located in Canada. The Canadian team operates remotely, with members distributed across North and South America. This role is open to candidates located anywhere in Canada.
Join us to work alongside AI enthusiasts and data experts who are shaping tomorrow. At Datatonic, innovation isn’t just encouraged - it’s embedded in everything we do. If you’re ready to inspire change and deliver value at the forefront of data and AI, we’d love to hear from you!
Salary: CAD $80,000 - $110,000
Are you ready to make an impact?
Apply now and take your career to the next level.