SENIOR BIG DATA INFRA ENGINEER

Added: 21 days ago
Location: Remote
Type: Full time
Salary: Not specified

About the Role:

We are seeking a highly skilled and experienced Senior Big Data Infra Engineer to join our dynamic team. The ideal candidate will have a strong background in developing and scaling both stream and batch processing systems, and a solid understanding of public cloud technologies. This role involves working in a remote environment, requiring excellent communication skills and the ability to solve complex problems independently and creatively.

What you will be doing:

Implementing automation and DevOps best practices for CI/CD, IaC, containerization, etc., to build reusable infrastructure for stream and batch processing systems at scale.

Creating automation, whether that is building DevOps pipelines, scripting, or writing Infrastructure as Code in Terraform (a minimal scripting sketch follows this list).

Participating in work sessions with clients

Completing technical documentation
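
For illustration only, and not part of the original listing: a minimal sketch of the kind of scripted automation described above, using boto3 to create an Amazon Kinesis stream if it does not already exist. The region and stream name are hypothetical placeholders.

```python
"""Minimal automation sketch: ensure a Kinesis stream exists (hypothetical names)."""
import boto3
from botocore.exceptions import ClientError

kinesis = boto3.client("kinesis", region_name="us-east-1")  # hypothetical region


def ensure_stream(name: str, shards: int = 2) -> None:
    """Create the stream only if it does not already exist, then wait until it is active."""
    try:
        kinesis.describe_stream_summary(StreamName=name)
        print(f"Stream {name} already exists")
    except ClientError as err:
        if err.response["Error"]["Code"] != "ResourceNotFoundException":
            raise
        kinesis.create_stream(StreamName=name, ShardCount=shards)
        kinesis.get_waiter("stream_exists").wait(StreamName=name)
        print(f"Created stream {name} with {shards} shards")


if __name__ == "__main__":
    ensure_stream("clickstream-events")  # hypothetical stream name
```

Since the listing also names Terraform for Infrastructure as Code, the same resource could instead be declared in HCL (for example as an `aws_kinesis_stream` resource) rather than scripted.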

Requirements:

Experience in developing and scaling data processing systems, including technologies like Pub/Sub, Kafka, Kinesis, Dataflow, Flink, Hadoop, Pig, Hive, and Spark (a minimal pipeline sketch follows these requirements).

Expertise in public cloud services: Azure, AWS, or GCP.

Experience with cloud-managed services and an understanding of cloud-based messaging and stream-processing systems are critical.

Experience applying infrastructure and DevOps principles in daily work, using tools for continuous integration and continuous deployment (CI/CD) and Infrastructure as Code (IaC) such as Terraform to automate and improve development and release processes.

Knowledge of containerization technologies such as Docker and Kubernetes to enhance the scalability and efficiency of applications.

Experience working effectively in a remote setting, with strong written and verbal communication skills, collaborating with team members and stakeholders to ensure a clear understanding of technical requirements and project goals.

Proven experience in engineering stream/batch processing systems at scale.

Strong programming abilities in Java and Python.

Hands-on experience in public cloud platforms. Additional experience with other cloud technologies is advantageous.
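
For illustration only, and not part of the original listing: a minimal Apache Beam batch pipeline in Python, the kind of stream/batch processing code that runs on Dataflow. The bucket paths are hypothetical placeholders; the same pipeline runs locally with the default runner or on Dataflow by passing the appropriate pipeline options.

```python
"""Minimal word-count sketch with Apache Beam (hypothetical bucket paths)."""
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run() -> None:
    # Pass e.g. --runner=DataflowRunner --project=... --region=... to run on Dataflow.
    options = PipelineOptions()
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/input/*.txt")
            | "Split" >> beam.FlatMap(lambda line: line.split())
            | "Pair" >> beam.Map(lambda word: (word, 1))
            | "Count" >> beam.CombinePerKey(sum)
            | "Format" >> beam.MapTuple(lambda word, count: f"{word}\t{count}")
            | "Write" >> beam.io.WriteToText("gs://example-bucket/output/counts")
        )


if __name__ == "__main__":
    run()
```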

Must Have:

Cloud Engineer certification or another professional-level cloud certification

4+ years of experience in customer-facing software/technology or consulting

4+ years of experience with “on-premises to cloud” migrations or IT transformations

4+ years of experience building and operating solutions on GCP, AWS, or Azure

Technical degree in Computer Science, Software Engineering, or a related field
