Lead Data Engineer
We are:
Wizeline, a global AI-native technology solutions provider, develops cutting-edge, AI-powered digital products and platforms. We partner with clients to leverage data and AI, accelerating market entry and driving business transformation. As a global community of innovators, we foster a culture of growth, collaboration, and impact.
With the right people and the right ideas, there’s no limit to what we can achieve.
Are you a fit?
Sounds awesome, right? Now, let’s make sure you’re a good fit for the role:
We are seeking a highly experienced and technically adept Lead Data Engineer to play a key role in a critical data platform migration initiative. In this hands-on leadership role, you will guide a team of data engineers to migrate our data ecosystem from PySpark, Athena, and BigQuery to a modern stack built on Snowflake, dbt, and Airflow. You will define frameworks, architect scalable solutions, ensure data quality and governance, and mentor the team while shaping the future of our data platform.
📍 Location & Work Model
- Location: Bogotá or Medellín, Colombia
- Work model: Hybrid — 3 days onsite / 2 days remote
Key Responsibilities
Migration Strategy & Execution
- Define and lead the end-to-end migration strategy from BigQuery, Athena, and PySpark to Snowflake.
- Design, develop, and implement scalable ELT pipelines using Airflow for orchestration and dbt for transformations (a minimal sketch follows this list).
- Build robust data validation and reconciliation processes to ensure accuracy and integrity throughout the migration.
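For illustration only (not a requirement of the role), here is a minimal sketch of what an Airflow-orchestrated dbt run with a downstream reconciliation step might look like on this stack. All names (the dag_id, paths, and the reconcile_row_counts.py script) are hypothetical placeholders, and it assumes Airflow 2.4+ plus the dbt CLI configured with a Snowflake target in profiles.yml.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical DAG: run dbt transformations in Snowflake, test them, then
# reconcile the migrated tables against the legacy source.
with DAG(
    dag_id="snowflake_migration_elt",   # placeholder name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt --target prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt --target prod",
    )
    reconcile = BashOperator(
        task_id="reconcile_row_counts",
        bash_command="python /opt/scripts/reconcile_row_counts.py",  # hypothetical script
    )

    dbt_run >> dbt_test >> reconcile
```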
Architectural Leadership
- Design and optimize scalable data models and schemas in Snowflake, ensuring performance, cost efficiency, and maintainability.
- Establish and enforce best practices, coding standards, and CI/CD pipelines for the dbt/Snowflake environment.
- Partner with data architects, product owners, and stakeholders to translate business needs into technical solutions.
Team Leadership & Mentorship
- Lead, mentor, and coach a team of data engineers, fostering a culture of technical excellence and continuous learning.
- Conduct code reviews and guide adoption of new tools and methodologies.
- Oversee planning, resource allocation, and delivery of data engineering initiatives.
Data Governance & Quality
- Ensure compliance with data security and privacy regulations.
- Proactively identify and resolve data quality issues, performance bottlenecks, and reliability challenges.
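As a rough illustration of the kind of validation work this involves, the sketch below compares row counts between a BigQuery source table and its Snowflake target. It assumes the google-cloud-bigquery and snowflake-connector-python packages; all table names and connection parameters are hypothetical placeholders.

```python
"""Hypothetical row-count reconciliation between a BigQuery source and a Snowflake target."""
from google.cloud import bigquery
import snowflake.connector


def bigquery_count(table: str) -> int:
    # Assumes application-default GCP credentials are available.
    client = bigquery.Client()
    rows = client.query(f"SELECT COUNT(*) AS n FROM `{table}`").result()
    return next(iter(rows)).n


def snowflake_count(table: str, **conn_kwargs) -> int:
    # conn_kwargs: account, user, password, warehouse, database, schema (placeholders).
    conn = snowflake.connector.connect(**conn_kwargs)
    try:
        cur = conn.cursor()
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        return cur.fetchone()[0]
    finally:
        conn.close()


if __name__ == "__main__":
    src = bigquery_count("my-project.analytics.orders")   # hypothetical source table
    tgt = snowflake_count(
        "ANALYTICS.PUBLIC.ORDERS",                         # hypothetical target table
        account="xy12345", user="LOADER", password="***",
        warehouse="WH_XS", database="ANALYTICS", schema="PUBLIC",
    )
    if src != tgt:
        raise SystemExit(f"Row-count mismatch: BigQuery={src:,} vs Snowflake={tgt:,}")
    print(f"Row counts match: {src:,}")
```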
Stakeholder Communication
- Clearly communicate plans, progress, and risks to technical and non-technical stakeholders.
- Collaborate closely with analytics, data science, and operations teams.
Must-have Skills
- Bachelor’s degree in Computer Science, Engineering, or a related quantitative field (Master’s preferred).
- 5+ years of experience in data engineering, with at least 3 years in a lead or leadership role.
- Proven experience leading data migration projects between cloud data warehouses.
- Extensive hands-on experience with Snowflake, including performance tuning and cost optimization.
- Mastery of dbt for data modeling, testing, and documentation.
- Solid experience with Apache Airflow for orchestration and pipeline management.
- Expert-level SQL skills, including query performance tuning.
- Proficiency in Python for data processing, automation, and scripting.
- Strong knowledge of data warehousing concepts, dimensional modeling, and ELT/ETL processes.
- Familiarity with PySpark, Athena, and BigQuery as source systems.
- Experience with Git/GitHub and CI/CD practices.
- Excellent problem-solving, analytical, and communication skills.
- Ability to thrive in a fast-paced, agile environment.
- English level: Fluent (spoken and written).
Nice-to-have:
- AI tooling proficiency: the ability to leverage one or more AI tools to optimize and augment day-to-day work, including drafting, analysis, research, or process automation, and to recommend effective AI use and identify opportunities to streamline workflows.
- Experience with data governance frameworks and metadata management.
- Exposure to cloud cost management and FinOps practices.
- Background working with cross-functional global teams.
What we offer:
- A High-Impact Environment
- Commitment to Professional Development
- Flexible and Collaborative Culture
- Global Opportunities
- Vibrant Community
- Total Rewards
*Specific benefits are determined by the employment type and location.
Find out more about our culture here.