This job is no longer available

The job listing you are looking for has expired.

Data Engineer (Airflow)

Added: 3 days ago
Type: Full time
Salary: Not specified


Related skills

ETL, SQL, Python, Airflow, data pipelines

We are: Wizeline, a global AI-native technology solutions provider that develops cutting-edge, AI-powered digital products and platforms. We partner with clients to leverage data and AI, accelerating market entry and driving business transformation. As a global community of innovators, we foster a culture of growth, collaboration, and impact. With the right people and the right ideas, there's no limit to what we can achieve.

Are you a fit? Sounds awesome, right? Now, let's make sure you're a good fit for the role:

Key Responsibilities

  • Data Migration and Pipeline Development
  • Data Modeling and Transformation
  • Troubleshooting and Optimization
  • Collaboration and Documentation

Must-have Skills:

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related quantitative field.
  • 3+ years of experience in data engineering, with a focus on building and maintaining scalable data pipelines.
  • Solid experience with data migration projects and working with large datasets.
  • Strong hands-on experience with Snowflake, including data loading, querying, and performance optimization.
  • Proficiency in dbt (data build tool) for data transformation and modeling.
  • Proven experience with Apache Airflow for scheduling and orchestrating data workflows.
  • Expert-level SQL skills, including complex joins, window functions, and performance tuning.
  • Proficiency in Python for data manipulation, scripting, and automation, including edge-case handling.
  • Familiarity with PySpark, AWS Athena, and Google BigQuery (source systems).
  • Understanding of data warehousing concepts, dimensional modeling, and ELT principles.
  • Knowledge of building CI/CD pipelines for code deployment.
  • Experience with version control systems (e.g., GitHub).
  • Excellent problem-solving, analytical, and communication skills.
  • Ability to work independently and as part of a collaborative team in an agile environment.
  • Fluent spoken and written English; effective communicator.
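To give a concrete sense of the SQL skills listed above, here is a minimal window-function sketch. The table, column names, and data are hypothetical, and Python's built-in sqlite3 stands in for Snowflake; the same `SUM(...) OVER` / `ROW_NUMBER()` pattern carries over to warehouse SQL.

```python
import sqlite3

# In-memory database standing in for a warehouse table (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [
        ("acme", "2024-01-01", 100.0),
        ("acme", "2024-01-05", 250.0),
        ("acme", "2024-01-09", 50.0),
        ("globex", "2024-01-02", 300.0),
    ],
)

# Window functions: per-customer running total and order sequence number,
# partitioned by customer and ordered by date.
rows = conn.execute(
    """
    SELECT customer,
           order_date,
           amount,
           SUM(amount) OVER (PARTITION BY customer ORDER BY order_date) AS running_total,
           ROW_NUMBER() OVER (PARTITION BY customer ORDER BY order_date) AS order_seq
    FROM orders
    ORDER BY customer, order_date
    """
).fetchall()

for row in rows:
    print(row)
```

In a production pipeline the same query shape would typically live in a dbt model, with Airflow orchestrating when it runs.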

Nice-to-have:

  • AI Tooling Proficiency: Leverage one or more AI tools to optimize and augment day-to-day work, including drafting, analysis, research, or process automation. Provide recommendations on effective AI use and identify opportunities to streamline workflows.

What we offer:

  • A High-Impact Environment
  • Commitment to Professional Development
  • Flexible and Collaborative Culture
  • Global Opportunities
  • Vibrant Community
  • Total Rewards

*Specific benefits are determined by the employment type and location.

Find out more about our culture here.

