- Minimum 4 years of work experience in data warehousing technologies and minimum 2 years in a Big Data or Hadoop platform environment.
- Good experience in Level 2/3 support, performing hotfixes and resolving platform issues in production environments.
- Good knowledge of ETL architecture and Hadoop platform architecture.
- Good Unix skills and sound knowledge of Linux networking.
- Experience in tool integration, automation, and configuration management with Git and Jira.
- Should be proficient in writing shell scripts, automating batch jobs, and designing scheduler processes.
- Should be able to read Python/Scala/Java programs and debug issues.
- Good understanding of the Hadoop ecosystem, HDFS, and Big Data concepts.
- Good understanding of the Software Development Life Cycle (SDLC) and agile methodology.
- Excellent oral and written communication, presentation, analytical, and problem-solving skills.
- Self-driven; able to work independently and as part of a team.
Skills Required: Unix, Hadoop, Shell, ETL Architecture
Department: All Departments
Years of Experience: Above 4 years
Location: Chennai, Tamil Nadu, India
Open Positions: 10