Related skills
aws, snowflake, etl, sql, python
Description
- Design, build, and maintain scalable, cloud-based data pipelines and platforms.
- Develop and optimize batch and event-driven data workflows primarily in AWS.
- Build and manage data warehousing solutions using Snowflake.
- Ensure data quality, reliability, security, and performance across pipelines and platforms.
- Partner with analytics, product, and client stakeholders to translate business needs into data solutions.
- Design and implement data models that support analytics, reporting, and downstream applications.
Requirements
- 5+ years of experience as a Data Engineer or in a similar role.
- Strong proficiency in SQL and data modeling concepts.
- Hands-on experience with Snowflake or similar cloud data warehouses.
- Experience building and operating data pipelines in AWS (e.g., S3, Lambda, Glue, Redshift, MSK).
- Proficiency with Python for data engineering tasks.
- Experience with ETL/ELT frameworks and orchestration tools (e.g., Airflow, dbt).
Benefits
- Free Health Insurance Option for all (Single, 2-Party, and Family)
- 401k Safe Harbor Plan
- Profit Sharing Program
- Generous PTO
- Maternity / Paternity Leave
- Side Hustle Opportunities
- Certification Reimbursement and Bounty Program