Capgemini Is Hiring: Hadoop Developer

Qualifications: Bachelor's/Master's degree

Job Responsibilities

Technology enthusiasts with object-oriented programming skills who can deliver engineering capabilities for the Web Customer channel, which serves millions of customers in the US.


  • Minimum of 5 years’ experience with a degree in Computer Science or Applied Computer Science
  • Technology Enthusiast
  • Good understanding of, and exposure to, data structures and algorithms
  • Strong experience with the Big Data / Hadoop technology stack: Hadoop, Hive, Scala, Spark Streaming, Kafka, MongoDB, HBase, Cassandra, Python
  • Experience with SQL-like databases is an added advantage.
  • Clear understanding of ETL methodology; hands-on ETL experience would be an added advantage.
  • UNIX experience, including shell scripting, cluster management, and Kerberos for security
  • Ability to evaluate code for performance, understand key code metrics, and make design decisions.
  • Success stories in product engineering, coaching, persuading, and analysing code, not just writing it.
  • Up to date with modern and emerging technologies/frameworks, with specific examples to demonstrate it

Good to have:

  • Proven track record for agile, test-driven development, continuous integration, and automated testing
  • Active contributor to open source including project successes
  • GitHub account with valuable open source contributions

Key Accountabilities

  • Take ownership of applications assigned for development and support activity
  • Ensure all SLAs are met for assigned tasks.
  • Perform on-call support and pager duties as assigned by the team lead.
  • Apply strong Big Data / Hadoop skills.
  • Strong data warehousing and OLTP system knowledge from a database/ETL development perspective.
  • Exposure to Agile Scrum Methodology.
  • Ability to collaborate across teams to deliver complex systems and components.
  • Evaluate applications to be crafted/refactored towards continuous delivery (toggle management, etc.).