Added: 2 hours ago
Type: Full time

Related skills

AWS, Snowflake, SQL, Python, Tableau

πŸ“‹ Description

  • Design, build, and own scalable data platforms powering analytics, ML, reporting, and operations.
  • Develop robust ELT workflows, data models, and distributed processing jobs.
  • Use Python, SQL, dbt, Airflow, Snowflake, Tableau, Power BI, AWS data services, and containers/cloud tools.
  • Ensure data quality, integrity, lineage, governance, and reliability across pipelines and analytics apps.
  • Design and implement observability: establish SLAs, monitoring, and alerting across pipelines.
  • Identify and resolve performance bottlenecks for scalable, cost-efficient processing.
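To give candidates a feel for the duties above, here is a minimal, purely illustrative sketch of an ELT workflow with a data-quality check. It uses an in-memory SQLite database as a stand-in for a warehouse like Snowflake, and the table and function names are hypothetical, not part of the employer's stack:

```python
import sqlite3

def run_elt(rows):
    """Minimal ELT sketch: load raw rows first, then transform in SQL."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL, region TEXT)")
    # Load: land the raw data untouched (the "L" before the "T" in ELT).
    con.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)", rows)
    # Transform: model the data in SQL, as a tool like dbt would.
    con.execute("""
        CREATE TABLE orders_by_region AS
        SELECT region, SUM(amount) AS total
        FROM raw_orders
        WHERE amount IS NOT NULL      -- basic quality filter
        GROUP BY region
    """)
    # Quality check: fail loudly if the model produced no rows.
    n = con.execute("SELECT COUNT(*) FROM orders_by_region").fetchone()[0]
    assert n > 0, "data quality check failed: empty model"
    return dict(con.execute("SELECT region, total FROM orders_by_region"))

print(run_elt([(1, 10.0, "east"), (2, 5.0, "east"), (3, 7.5, "west")]))
```

In production, an orchestrator such as Airflow would schedule each of these steps as separate tasks with monitoring and alerting around them.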

🎯 Requirements

  • Bachelor's or Master's degree in CS, Engineering, Data Science, or related field.
  • 2+ years of software and data engineering with distributed data processing.
  • 1+ years implementing and maintaining reporting/analytics tools (Tableau, Power BI, Domo).
  • Experience in Python, SQL, data pipelines, data modeling, and privacy/security best practices.
  • Experience with cloud platforms (AWS, Azure, or GCP) and modern data warehousing (Snowflake, BigQuery, Redshift).
  • Excellent communication and collaboration skills.

🎁 Benefits

  • Competitive health plans
  • Paid time-off and company holidays
  • 401(k) retirement program with company match
  • Other company sponsored programs
