Related skills: Java, Snowflake, SQL, Python, Hadoop

Description
- Pipeline migration: Migrate extraction logic and scheduling to Lakehouse.
- Data transfer: Migrate datasets while preserving data integrity.
- Stakeholder engagement: Liaise with data owners for handoffs.
- Consumption pattern migration: Translate SQL/Spark to Snowflake/Iceberg.
- Usage analysis: Analyze usage to deliver data products.
- Data reconciliation & quality: Validate migrated data is equivalent.
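The reconciliation duty above can be sketched in Python. This is a minimal illustration, not the team's actual tooling: it compares a source table and its migrated copy by row count and by order-independent per-column checksums. The table rows, column names, and `reconcile` helper are all hypothetical.

```python
import hashlib

def column_checksums(rows, columns):
    """Return an order-independent checksum per column."""
    sums = {}
    for col in columns:
        digest = 0
        for row in rows:
            h = hashlib.sha256(repr(row[col]).encode()).hexdigest()
            digest ^= int(h, 16)  # XOR makes the result order-independent
        sums[col] = digest
    return sums

def reconcile(source_rows, target_rows, columns):
    """Flag count or checksum mismatches between source and target."""
    issues = []
    if len(source_rows) != len(target_rows):
        issues.append(f"row count: {len(source_rows)} vs {len(target_rows)}")
    src = column_checksums(source_rows, columns)
    tgt = column_checksums(target_rows, columns)
    issues += [f"checksum mismatch in {c}" for c in columns if src[c] != tgt[c]]
    return issues

source = [{"id": 1, "amount": 10.5}, {"id": 2, "amount": 7.0}]
target = [{"id": 2, "amount": 7.0}, {"id": 1, "amount": 10.5}]  # reordered, same data
print(reconcile(source, target, ["id", "amount"]))  # → []
```

Because the checksum is order-independent, a migration that reshuffles rows (common when re-partitioning into Parquet/Iceberg) still reconciles cleanly; only genuine value differences are flagged.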
Requirements
- Education: Bachelor's or Master's in CS/Math/Engineering.
- Experience: 5+ years coding in a team; SQL troubleshooting.
- Languages: Python or Java.
- SDLC/CI/CD & Kubernetes: Strong grasp of SDLC practices, CI/CD pipelines, and Kubernetes.
- Data modeling: Temporal data modeling (SCD Type 2).
- Tech stack: Kafka, ANSI SQL, Apache Spark; JSON/Parquet.
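The SCD Type 2 requirement above can be illustrated with a small Python sketch. All names here (`customer_id`, `city`, the date columns) are hypothetical, and a real implementation would typically be a SQL `MERGE` in Snowflake; this just shows the close-out-and-append pattern that Type 2 implies.

```python
from datetime import date

def apply_scd2(dimension, change, today):
    """Close the current version of a changed row and append the new one."""
    key = change["customer_id"]
    for row in dimension:
        if row["customer_id"] == key and row["is_current"]:
            if row["city"] == change["city"]:
                return dimension          # no attribute change, nothing to do
            row["end_date"] = today       # close out the old version
            row["is_current"] = False
    # Append the new version as the current record
    dimension.append({**change, "start_date": today,
                      "end_date": None, "is_current": True})
    return dimension

dim = [{"customer_id": 1, "city": "Austin",
        "start_date": date(2023, 1, 1), "end_date": None, "is_current": True}]
apply_scd2(dim, {"customer_id": 1, "city": "Dallas"}, date(2024, 6, 1))
# dim now holds two versions: Austin (closed) and Dallas (current)
```

The point of Type 2 is that history is preserved: the old row is retained with an end date rather than overwritten, so point-in-time queries remain answerable.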
Benefits
- Hybrid workplace in Dallas.
- Global clients across Americas, APAC, EMEA.
- Equal opportunity employer.
- Visa sponsorship for eligible candidates.