Citi Hiring Hadoop Developer

Java/Spark Developer

  • Primary Location: India, Tamil Nadu, Chennai
  • Education: Bachelor’s Degree
  • Job Function: Technology
  • Schedule: Full-time
  • Shift: Day Job
  • Employee Status: Regular
  • Travel Time: No
  • Job ID: 17079470

Description

Overview:

Enterprise Technology Solutions (ETS) is responsible for developing, managing, and maintaining all applications within the areas of Credit Risk, Market Risk, Operational Risk, Basel, CCAR, Compliance, and IT Risk. Risk is a cross-product, cross-domain, top-of-the-house subject. This gives an opportunity to build a big data platform around this huge data set, as well as next-generation analytics on top of it: cross-domain risk, secondary effects of risk events, behavior of portfolios under various scenarios across all stripes of risk, capital optimization, balancing risk and P/L, and more.

ETS currently has a 320+ strong development team in Pune and is building a small team in Big Data development and related analytics. The existing team works on application development using technologies such as JEE, AngularJS, Ab Initio, C++, and grid computing.

Job Purpose:
The purpose of this job is to design, develop, and enhance enterprise applications in the Risk Technology area using Big Data technologies such as Spark.
Key Responsibilities:

  • Interact with business analysts to understand the requirements behind BRDs/FRDs/SRs
  • Develop a complete understanding of the application code through code compilation, code walkthroughs, execution flow, and overall design
  • Perform local compilation, deployment, and behavior/unit testing
  • Identify the areas where code needs to change to meet the required functionality, and maintain traceability
  • Participate in design review/code review meetings, both local and global
  • Perform unit testing and integration testing, and support UAT/SIT
  • Manage code check-in, check-out, merges, and builds as needed
  • Report to the program manager on project/task progress as needed; identify risks and issues
  • Participate in all project planning, progress, and development meetings with the team and global managers

Qualifications

Knowledge/Experience:
  • At least 5 to 7 years of application development experience through the full lifecycle
  • Experience with Red Hat Linux and Bash shell scripting
  • Knowledge of Core Java and OOP is required
  • Thorough knowledge of and hands-on experience with Hadoop, the MapReduce framework, Hive, Sqoop, Pig, Hue, Unix, Java, and Impala
  • Cloudera certification (CCDH) is an added advantage
  • Strong experience with any ETL and BI tools
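As a rough illustration of the stack above, here is a minimal sketch of the kind of Spark job in Java such a role involves: aggregating risk exposures by counterparty from a CSV file on HDFS using Spark's DataFrame API. The class name, input path, and column names are illustrative assumptions, not part of any Citi system.

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import static org.apache.spark.sql.functions.col;
import static org.apache.spark.sql.functions.sum;

public class ExposureAggregation {
    public static void main(String[] args) {
        // Entry point for a Spark application; appName appears in the Spark/YARN UI.
        SparkSession spark = SparkSession.builder()
                .appName("ExposureAggregation")
                .getOrCreate();

        // Hypothetical input: a headered CSV with columns counterparty,exposure.
        Dataset<Row> exposures = spark.read()
                .option("header", "true")
                .option("inferSchema", "true")
                .csv("hdfs:///data/risk/exposures.csv");  // illustrative path

        // Total exposure per counterparty, largest first.
        Dataset<Row> totals = exposures
                .groupBy(col("counterparty"))
                .agg(sum(col("exposure")).alias("total_exposure"))
                .orderBy(col("total_exposure").desc());

        totals.show();
        spark.stop();
    }
}

A job like this would typically be packaged as a JAR and launched with spark-submit on a YARN cluster, which is where the Hadoop, Red Hat Linux, and Bash scripting experience listed above comes into play.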


Skills:
  • Conceptual understanding of data structures
  • Passion for technology; a self-starter
  • Orientation towards disciplined development processes
  • Strong software development lifecycle (SDLC) management experience
Qualifications:
  • B.Tech from a top engineering college or university, preferably in Computer Science. Other preferred branches are EE and ECE. Candidates from other disciplines with a passion for coding and systems development may also apply.
  • Work experience in a product company is an added advantage.
Competencies:
  • Good coding discipline
  • Teamwork
  • Ability to mentor junior team members
  • Strong communication skills
  • Ability to adapt to complex project situations and streamline processes across them
  • Ability to work in a global model, influence stakeholders, and increase ownership at the local level
