Added: 14 days ago
Type: Full time

Related skills

BigQuery, Snowflake, SQL, Python, dbt

πŸ“‹ Description

  • Architect high-scale data infrastructure powering Lotlinx's platform.
  • Design, build, and scale robust data pipelines from diverse sources.
  • Own ELT/ETL workflows into data lakes and cloud warehouses.
  • Collaborate with Analytics, Product, and Design to maximize data value.
  • Ensure pipelines are reliable, secure, and async-friendly.

🎯 Requirements

  • 3+ years of professional data engineering experience.
  • Cloud platforms: AWS and/or GCP.
  • Proficient in Python, Scala, or Java; strong SQL.
  • Experience with Spark, Hadoop, or Beam.
  • Streaming with Kafka, Pub/Sub, or Kinesis.
  • Orchestration experience with Airflow or Dataflow; familiarity with dbt.
  • Experience with Snowflake, BigQuery, or Redshift.

🎁 Benefits

  • Hybrid work: 2 days remote, 3 days in office (Winnipeg, Oakville, or Vancouver).
  • Competitive compensation and benefits package.
  • Flexible time off and career development opportunities.
  • Dynamic, team-oriented environment with growth opportunities.