
Senior DataOps Engineer

Added: 14 days ago
Type: Full time
Salary: Not Specified

ABOUT THE TEAM

The Data Engineering team at Mural builds scalable, high-performance systems that transform complex data into actionable insights. We power internal analytics and user-facing features, supporting critical products powered by AI/Machine Learning, such as Customer Insights, Personalization, Reporting APIs, and Audit Logs. Our work integrates diverse data sources, optimizes core data models, and ensures seamless access across the organization. Success is measured not only by system reliability and scalability, but also by the business impact we deliver—enabling users to make informed decisions and driving overall customer success. 

YOUR MISSION

As a Senior Data Engineer, you will be instrumental in shaping the future of our data platform. You will work with a global, remote-first team to build, scale, and optimize our data infrastructure using cutting-edge tools like Databricks, Spark, Terraform, and cloud platforms (AWS/Azure). You will collaborate closely with cross-functional teams to ensure the platform is secure, efficient, and continuously evolving. Your work will empower teams to access the data they need, drive insights, and optimize business outcomes. 

WHAT YOU'LL DO

  • Build, scale, and optimize the data platform using Databricks, Spark, and various data integration tools to ensure high performance, reliability, and cost efficiency.

  • Design and maintain robust data pipelines with Airflow, Fivetran, DBT, Segment, and automate infrastructure using Terraform to enhance operational readiness, security, and governance.

  • Develop and maintain the infrastructure and automation needed to support MLOps practices, including model deployment pipelines, monitoring frameworks, and retraining automation.

  • Use AI tools as a fundamental part of your workflows.

  • Prototype, implement, and maintain team projects and features, serving as a technical expert, mentor, and leader.

  • Build flexible and maintainable solutions while being accountable for quality, performance, and reliability.

  • Elevate the team’s skills and knowledge by participating in technical designs and talks, and by reviewing and helping improve your own and your colleagues’ code.

  • Contribute to continuously improving the team’s processes and best practices.

WHAT YOU'LL BRING

  • 4+ years of hands-on experience in Data Engineering, with a strong focus on building and optimizing data and cloud platforms.

  • Expertise in designing and building distributed systems and data pipelines, focusing on data integration, transformation, and processing with tools like Astronomer, Fivetran, DBT, and Segment, across both structured and unstructured data.

  • Proficiency in Python, Scala, SQL, and Spark to develop scalable data pipelines and automate data workflows.

  • Solid understanding of cloud infrastructure (AWS/Azure) and data platforms (Databricks, Snowflake, Redshift), with hands-on experience using Terraform to automate cloud resource management (e.g., VPCs, subnets).

  • Passion for performance tuning, cost optimization, and ensuring efficient resource usage across the data platform.

  • An outcome-oriented, highly experimental interest in AI-driven development practices.

  • Experience with data governance, security practices, and compliance (e.g., SOC 2) to maintain data integrity and privacy.

  • Experience learning new technologies, platforms, and stacks, and getting up to speed quickly on large codebases.

  • Emotional intelligence, with collaboration and listening skills that encourage innovative solutions and diverse perspectives.

  • Experience working in a rapid-growth or startup environment.

  • Excellent communication skills and the ability to collaborate with distributed teams across time zones, ensuring alignment and success in a fast-paced environment.

EQUAL OPPORTUNITY

We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
