Job Location: Chennai
Experience: 2 to 4 Years
Primary Skill: Hadoop
Job Role: Hadoop Developer
Keywords: Spark, Scala, SQL, Elasticsearch, and AWS
Are you a talented Software Engineer looking for that next step? Do you want to develop software using cutting-edge technology in a high-impact environment? Do you want to work alongside a highly talented team of engineers to solve real-world problems? Then we want you.
What You’ll Be Doing:
- Create complex data processing pipelines as part of a dynamic and fast-paced team
- Develop with Apache Spark and Scala in a continuous-delivery environment
- Work hands-on with AWS services, including EC2, ELB, S3, CloudFormation, and Lambda
- Work on Hadoop platform with Amazon EMR, data ingestions, and full-stack development
- Create high-quality software through test-driven development and peer review
- Contribute to continuous integration and continuous delivery practices
- Work closely with product management and other stakeholders in an agile environment
- Leverage the latest technologies to solve complex problems facing the health care industry
Must Haves:
- Education: Bachelor’s degree in a technical field, or equivalent work experience
- 3+ years of relevant professional work experience
- 2+ years of experience coding in one or more JVM-based languages (e.g., Java, Scala)
- 1+ years of experience with test-driven development using JUnit
- 1+ years of experience using SQL
- Demonstrated experience with agile development for a commercial software product or solution
Even Better:
- Demonstrated experience with big data solutions on the Hadoop platform, specifically Amazon EMR with Spark
- Experience implementing continuous integration, specifically with Gradle and Bamboo
- Prior experience in the health care domain
- Flexibility to work across a variety of software-related disciplines: solution design, unit testing, refactoring, and build/deployment automation