Job Location: Chennai
Experience: 2+ Years
Primary Skill: Knowledge of Hadoop
1. Must have a minimum of 3 years' experience working on big data technologies such as Hadoop, HDFS, real-time event streaming, batch event streaming, Kafka, etc.
2. Must have a minimum of 3 years' hands-on experience working on a big data platform (such as Cloudera, Hortonworks, etc.). Please specify.
This role is expected to perform technical design and development activities and implement them, aligned to business requirements, within agreed timelines. The role also coordinates with other IT teams as necessary for implementation and provides support in investigating and resolving issues.
1. Understand the business requirements of different Big Data projects
2. Develop and implement Big Data pipelines
3. Design, develop and implement Extract, Transform and Load (ETL) flows
4. Coordinate with different IT teams to deliver projects on time and with the right quality
5. Create, manage and support the flydubai Big Data Warehouse
6. Monitor Extract, Transform and Load (ETL) performance and carry out necessary tuning changes
7. Debug and resolve reported production issues
B.Tech / BE / MCA or similar streams