Type: Full time
Related skills
DynamoDB, Redshift, Docker, AWS, PostgreSQL

Description
- Identify, design, and implement internal process improvements for scalable infra.
- Develop data pipelines for end-to-end solutions.
- Maintain artifacts such as schemas, data dictionaries, and ETL transforms.
- Integrate data pipelines with AWS to derive insights.
- Collaborate with stakeholders to support data infrastructure needs.
- Coordinate data flows and capabilities; occasional off-hours deployments.
Requirements
- Top Secret/SCI security clearance.
- Bachelor's degree or equivalent practical experience.
- Expertise in distributed computing frameworks for large-scale data processing.
- Familiarity with AWS or other cloud environments.
- Experience with datastores: PostgreSQL, S3, Redshift, MongoDB, DynamoDB, Redis, SQL.
- Proficiency in Python with pandas; experience with PySpark, NiFi, Airflow, or Lambda.
- Working knowledge of Docker, Kubernetes, JMS/SQS, SNS, and Kafka.
- Experience working in Linux/Unix server environments.
Benefits
- M9 Benefits: https://m9solutions.com/why-join-m9/#our-benefits
- Inclusive culture and commitment to diversity.
- Work on federal programs with modern technologies.