Job Description
- Design, develop, validate, and deploy Talend ETL processes
- Must have hands-on experience with Hadoop (Pig, Hive, Sqoop) on the MapR distribution
- Responsible for the documentation of all Extract, Transform and Load (ETL) processes
- Maintain and enhance ETL code; work with the QA and DBA teams to fix performance issues
- Collaborate with the Data Warehouse team to design and develop required ETL processes, and performance-tune ETL programs/scripts
- Work with business partners to define business rules and their execution
- Perform process improvement and re-engineering with an understanding of technical problems and solutions as they relate to the current and future business environment
- Design and develop innovative solutions for demanding business situations
- Analyze complex distributed production deployments and make recommendations to optimize performance
Role: Hadoop Developer
Experience: 2.5 to 4 years
Location: Chennai
Primary Skill: Hadoop (Spark)