IBM - Hiring for Hadoop Development

Job Description

About IBM:

IBM India’s solutions and services span all major industries including financial services, healthcare, government, automotive, telecommunications and education, among others. As a trusted partner with wide-ranging service capabilities, IBM helps clients transform and succeed in challenging circumstances. The diversity and breadth of the entire IBM portfolio of research, consulting, solutions, services, systems and software uniquely distinguishes IBM India from other companies in the industry.

Business Unit Introduction:

IBM Global Business Services (GBS) is a vibrant group of business, strategy and technology professionals, designed to be the source of market-leading industry consulting and application and business process management, supported by the industry’s most sophisticated outcome-based delivery model, and to become the Digital Reinvention partner for leading clients across the world. IBM GBS provides value-led, asset-powered, end-to-end solutions with a global footprint in over 170 countries, empowering clients to build upon their tremendous heritage in Application Innovation and to transform for a Cloud, Cognitive and Social centric world.

Who you are:

  • You will team with some of the best minds in the industry to create innovative high quality solutions focused on clients’ business needs.
  • You will do this using your systems knowledge and expertise to design and model applications, develop application solutions, and integrate them with custom solutions, commercial-off-the-shelf (COTS) software or packaged applications.

What you’ll do:

  • You will pull data from various database systems, as well as unstructured text from the web, social media sites and other domain-specific file formats, using tools such as IBM BigInsights or other Hadoop offerings.
  • You will participate in the full lifecycle implementation of the technical components of solutions, including tool selection, data architecture, database administration, ETL (extract, transform and load), reporting and integration.
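The ETL step mentioned above can be sketched in miniature. The following is an illustrative plain-Python example, not IBM BigInsights or any specific Hadoop tooling; the table name, column names and sample data are hypothetical.

```python
# A minimal extract-transform-load (ETL) sketch: parse raw CSV text,
# clean it, and load it into a relational store (in-memory SQLite here).
import csv
import io
import sqlite3

def extract(raw_csv):
    """Extract: parse raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows):
    """Transform: normalize names and cast amounts to float."""
    return [
        {"name": r["name"].strip().title(), "amount": float(r["amount"])}
        for r in rows
    ]

def load(rows, conn):
    """Load: write the cleaned rows into a relational table."""
    conn.execute("CREATE TABLE IF NOT EXISTS sales (name TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO sales (name, amount) VALUES (:name, :amount)", rows
    )
    conn.commit()

raw = "name,amount\n alice ,10.5\nBOB,3\n"
conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # 13.5
```

In a Hadoop setting the same three phases would typically be distributed, with tools like Sqoop handling the extract from relational databases and Hive or Pig handling the transform.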

How we’ll help you grow:

  • You’ll have access to all the technical and management training courses you need to become the expert you want to be.
  • You’ll learn directly from expert developers in the field; our team leads love to mentor.
  • You have the opportunity to work in many different areas to figure out what really excites you.

Required Technical and Professional Expertise

  • At least 4 years of hands-on development experience on the Hadoop platform, with at least one project implemented using BigInsights.
  • Demonstrated work experience with the MapReduce programming model and HDFS (Hadoop Distributed File System).
  • Experience developing software in high-level languages such as Java, C or C++, including familiarity with J2EE, Applets, Servlets, JSP, JSON, Python, Perl, shell scripting, REST and AJAX.
  • Proven foundational knowledge and experience with a range of big data components such as HDFS, HBase, Oozie, Pig, Hive, Avro, ZooKeeper, Sqoop and Flume.
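The MapReduce programming model listed above can be sketched without a Hadoop cluster. The following is a minimal word-count example in plain Python; the phase names mirror what Hadoop's Mapper and Reducer classes do, but none of these functions are Hadoop APIs.

```python
# A word-count sketch of the MapReduce model: map emits key-value pairs,
# shuffle groups values by key, reduce aggregates each group.
from collections import defaultdict

def map_phase(document):
    """Map: emit a (word, 1) pair for every word in the input split."""
    for word in document.lower().split():
        yield word, 1

def shuffle_phase(pairs):
    """Shuffle: group all values by key, as the framework does between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    """Reduce: sum the counts for one key."""
    return key, sum(values)

documents = ["the quick brown fox", "the lazy dog", "the fox"]
mapped = [pair for doc in documents for pair in map_phase(doc)]
counts = dict(reduce_phase(k, v) for k, v in shuffle_phase(mapped).items())
print(counts["the"], counts["fox"])  # 3 2
```

In real Hadoop, the map and reduce functions run in parallel across HDFS blocks on many nodes, and the shuffle is handled by the framework rather than in memory.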

Preferred Tech and Prof Experience

  • Hands-on experience with big data tools such as BigInsights, Cloudera, Hortonworks or MapR would be an added advantage.
  • Familiarity with IBM Big Data architecture and data integration.

EO Statement
IBM is committed to creating a diverse environment and is proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.

Preferred Education: Master’s Degree
Commissionable: No
