Related skills
Java, Docker, AWS, PostgreSQL, Python

Description
- Deploy and manage security services across the infrastructure.
- Maintain internal crawler apps for crawler-related businesses.
- Perform DevOps tasks involving Kubernetes and AWS.
- Integrate security into the full software development lifecycle.
- Automate processes and implement CI/CD solutions.
- Develop and maintain crawler scripts.
Requirements
- Python for web scraping (requests, BeautifulSoup, Selenium, Scrapy)
- Java + Spring Boot backend development
- SQL syntax (MySQL/PostgreSQL)
- Git, REST APIs, modular design basics
- Strong problem-solving skills and a sense of responsibility
- Web crawling/backend projects (GitHub/portfolio)
- Pandas, JSON, and regex proficiency
- Docker and AWS (or comparable cloud-native tools)
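To illustrate the kind of crawler work the requirements describe, here is a minimal sketch of the parsing side of a web crawler. The posting names requests, BeautifulSoup, Selenium, and Scrapy; this offline example uses only the standard library's `html.parser` so it runs without a network, and the sample page content is hypothetical.

```python
# Minimal link extractor, a common building block of crawler scripts.
# Uses the standard library's html.parser instead of BeautifulSoup so
# the example is self-contained and runs offline.
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    """Collect href values from <a> tags as the parser walks the page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


# Hypothetical page content a crawler might have fetched.
page = '<html><body><a href="/jobs/1">Job 1</a><a href="/jobs/2">Job 2</a></body></html>'
parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # ['/jobs/1', '/jobs/2']
```

In a real crawler the `page` string would come from an HTTP client such as requests, and libraries like BeautifulSoup or Scrapy wrap this parsing step in a higher-level API.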
Benefits
- Work-from-home arrangement
- Equal opportunity and diverse workforce
- Competitive salary and benefits
- Career growth and learning opportunities
- Flat organizational structure