Related skills
AWS, Snowflake, SQL, Python, Tableau

Description
- Design, build, and own scalable data platforms powering analytics, ML, reporting, and operations.
- Develop robust ELT workflows, data models, and distributed processing jobs.
- Use Python, SQL, dbt, Airflow, Snowflake, Tableau, Power BI, AWS data services, and containers/cloud tools.
- Ensure data quality, integrity, lineage, governance, and reliability across pipelines and analytics apps.
- Establish SLAs, monitoring, and alerting; design and implement observability.
- Identify and resolve performance bottlenecks for scalable, cost-efficient processing.
Requirements
- Bachelor's or Master's degree in CS, Engineering, Data Science, or related field.
- 2+ years of software and data engineering experience, including distributed data processing.
- 1+ years implementing and maintaining reporting/analytics tools (Tableau, Power BI, DOMO).
- Experience in Python, SQL, data pipelines, data modeling, and privacy/security best practices.
- Experience with cloud platforms (AWS, Azure, or GCP) and modern data warehousing (Snowflake, BigQuery, Redshift).
- Excellent communication and collaboration skills.
Benefits
- Competitive health plans
- Paid time off and company holidays
- 401(k) retirement program with company match
- Other company sponsored programs