Related skills
ETL · Python · Databricks · Data Pipelines · Performance Tuning

📋 Description
- Design, build, and maintain scalable ETL pipelines for large-scale data processing.
- Implement data transformations and workflows using PySpark at an intermediate to advanced level (see the illustrative sketch after this list).
- Work extensively with Databricks to develop, manage, and optimize data pipelines.
- Optimize pipelines for performance, scalability, and cost efficiency across environments.
- Troubleshoot, debug, and resolve data processing and pipeline issues.
- Collaborate with cross-functional teams to gather requirements and deliver high-quality data solutions.
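For illustration only, here is a minimal PySpark sketch of the kind of transform-and-load step described above. The table names (`bronze.orders`, `bronze.customers`, `gold.daily_revenue`), column names, and the pre-existing `spark` session are assumptions for the example, not details from this posting.

```python
from pyspark.sql import functions as F

# Assumes a Databricks/Spark session is already available as `spark`
# and that the source tables below exist (hypothetical names).
orders = spark.read.table("bronze.orders")
customers = spark.read.table("bronze.customers")

# Aggregate completed orders into daily revenue per country.
daily_revenue = (
    orders
    .filter(F.col("status") == "completed")
    .join(customers, on="customer_id", how="left")
    .groupBy("order_date", "country")
    .agg(
        F.sum("amount").alias("revenue"),
        F.countDistinct("order_id").alias("order_count"),
    )
)

# Write a partitioned Delta table; partitioning by date is one common way
# to keep downstream reads and incremental reprocessing cheap.
(
    daily_revenue.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .saveAsTable("gold.daily_revenue")
)
```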
🎯 Requirements
- 5–7 years of professional experience in data engineering.
- Strong hands-on proficiency with PySpark (intermediate to advanced level).
- Solid experience working with Databricks, including Auto Loader, Python-based workflows, and platform best practices (see the Auto Loader sketch after this list).
- Proven experience optimizing data pipelines for performance and cost efficiency.
- Strong understanding of ETL processes and large-scale data transformations.
- Excellent problem-solving skills with the ability to diagnose and resolve complex data issues.
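Again for illustration, a minimal sketch of a Databricks Auto Loader ingestion step of the sort the requirements mention. The landing path, checkpoint locations, file format, business key, and target table are hypothetical, `spark` is assumed to be the notebook's session, and the `cloudFiles` source requires a Databricks runtime.

```python
from pyspark.sql import functions as F

# Incrementally ingest new files from a landing zone with Auto Loader.
raw = (
    spark.readStream
    .format("cloudFiles")                                   # Auto Loader source
    .option("cloudFiles.format", "json")                    # assumed file format
    .option("cloudFiles.schemaLocation", "/mnt/chk/orders_schema")
    .load("/mnt/landing/orders/")                           # hypothetical landing path
)

# Light cleanup: stamp ingestion time and drop duplicate records.
deduped = (
    raw
    .withColumn("ingested_at", F.current_timestamp())
    .dropDuplicates(["order_id"])                           # hypothetical business key
)

# Write to a bronze Delta table as an incremental, batch-style run.
query = (
    deduped.writeStream
    .option("checkpointLocation", "/mnt/chk/orders_bronze")
    .trigger(availableNow=True)                             # process available files, then stop
    .toTable("bronze.orders")                               # hypothetical Delta target
)
```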
🎁 Benefits
- 100% Remote Work: Enjoy the freedom to work from the location that helps you thrive. All it takes is a laptop and a reliable internet connection.
- Paid Time Off: We value your well-being. Our paid time off policies ensure you have the chance to unwind and recharge when needed.
- Work with Autonomy: Enjoy the freedom to manage your time as long as the work gets done. Focus on results, not the clock.
- Work with Top American Companies: Grow your expertise working on innovative, high-impact projects with industry-leading U.S. companies.
- A Culture That Values You: We prioritize well-being and work-life balance, offering engagement activities and fostering dynamic teams to ensure you thrive both personally and professionally.
- Diverse, Global Network: Connect with over 600 professionals in 25+ countries, expand your network, and collaborate with a multicultural team from Latin America.