Job Profile
- Develop and write code in Java or other object-oriented languages, following the team's coding standards and development process.
- Develop ETL processes to move data from various data sources into Hive and Elasticsearch.
- Use SQL tools to visualize the data stored in Hive.
- Use Kibana to create dashboards.
- Participate in the integration of smart grid software systems.
- Understand functional specifications and translate them into technical artifacts.
- Provide reliable, high-quality designs for a variety of problems and requirements using sound problem-solving techniques.
- Share your knowledge with colleagues.
- Implement and deliver projects with minimal supervision.
- Be a team player and individual contributor as business demands.
- Perform application builds and deployments using CI servers.
Qualifications and Experience
- BE/B.Tech in any stream, M.Sc. (Computer Science/IT), or M.C.A.
- 4-6 years of software development experience.
- Solid command of object-oriented programming.
- Experience with Hadoop storage technologies such as HDFS and HBase.
- Hands-on experience with Hive.
- Familiarity with ETL processes and tools.
- Experience with Java 8.
- Proficient with IDEs (e.g. Eclipse, IntelliJ IDEA).
- Ability to work with version control software (Git preferred).
- Expertise in writing unit tests (e.g. JUnit).
- Experience with the Agile software development process and the full product/software development lifecycle.
- Familiarity with RDBMS (Oracle preferred) and NoSQL databases.
- Experience in processing XML and JSON messages.
- Proficient with Linux.
- Experience creating technical documentation (high- and low-level design documents, etc.).
- Strong debugging, problem-solving, and investigative skills.
- Ability to collaborate effectively across disciplines, roles, and time zones.
- Excellent communication and interpersonal skills.
- Ability to multitask and react to changing priorities.
- Ability to adapt to new technologies.
Good to Have:
- Experience with Elasticsearch and Kibana.
- Knowledge of other big data technologies such as Pig and NoSQL databases.
- Experience parallelizing computing tasks in big data architectures.
- Experience with languages commonly used in big data (e.g. R, Python, Scala).