Added: less than a minute ago
Type: Full time
Related skills
AWS, Snowflake, Python, GraphQL, DBT
Description
- Design and maintain data pipelines for ingestion and transformation using Python.
- Build robust Snowflake data warehouses for storage and retrieval.
- Model and transform data with DBT in Snowflake.
- Implement ETL workflows using Python, Fivetran, and Amazon DMS.
- Build RESTful or GraphQL APIs for data exchange.
- Ingest streaming data from Kafka for real-time processing (see the sketch after this list).
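As a rough illustration of the pipeline and streaming work described above, here is a minimal Python sketch that reads JSON events from a Kafka topic and loads them into a Snowflake table in small batches. It assumes the kafka-python and snowflake-connector-python packages; the topic, table, warehouse, and connection settings are placeholders, not details from this posting.

```python
# Minimal Kafka -> Snowflake ingestion sketch.
# Assumptions: kafka-python and snowflake-connector-python are installed;
# the topic "events", the EVENTS table, and all connection parameters are
# placeholders rather than details from the posting.
import json
import os

import snowflake.connector
from kafka import KafkaConsumer


def main() -> None:
    consumer = KafkaConsumer(
        "events",                                    # placeholder topic name
        bootstrap_servers=os.environ["KAFKA_BROKERS"],
        value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
        auto_offset_reset="earliest",
        enable_auto_commit=True,
    )
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="INGEST_WH",                       # placeholder warehouse
        database="RAW",
        schema="PUBLIC",
    )
    cursor = conn.cursor()
    try:
        # Buffer events and flush them in small batches to limit round trips.
        batch = []
        for message in consumer:
            event = message.value
            batch.append((event.get("id", ""), json.dumps(event)))
            if len(batch) >= 100:
                cursor.executemany(
                    "INSERT INTO EVENTS (EVENT_ID, PAYLOAD) VALUES (%s, %s)",
                    batch,
                )
                batch.clear()
    finally:
        cursor.close()
        conn.close()


if __name__ == "__main__":
    main()
```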
Qualifications
- Bachelor's or Master's in CS/Engineering or equivalent.
- 3+ years in data engineering or software development.
- Proficient in Python.
- Experience with Snowflake, Fivetran, DBT, and Amazon DMS.
- Experience building RESTful/GraphQL APIs and working with Kafka.
- Familiarity with Apache Airflow, Docker, and Kubernetes (see the orchestration sketch after this list).
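For the orchestration side of the stack, here is a minimal sketch of an Apache Airflow DAG (assuming a recent Airflow 2.x) that runs dbt models in Snowflake via the dbt CLI and then tests them. The DAG id, schedule, and project path are placeholders.

```python
# Minimal Airflow orchestration sketch: run dbt models daily, then test them.
# Assumptions: Airflow 2.x with the Bash operator available, dbt installed on
# the worker, and a dbt project at /opt/dbt_project (placeholder path).
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_run",              # placeholder DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt_project && dbt run --profiles-dir .",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt_project && dbt test --profiles-dir .",
    )
    # Build the models first, then run the tests against them.
    dbt_run >> dbt_test
```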
Benefits
- Equal-opportunity employer with a diverse team.
- Hybrid work option in New York.
- Work with a modern data stack (Snowflake, AWS, Kafka).
- Collaborative, impact-driven environment.