Capgemini Hiring Hadoop with AWS Developer

Role: Hadoop with AWS Developer

Exp: 4 to 6 Years

Location: Hyderabad

Job Description:

  • Experience with Snowflake, DataStage, and AWS Glue, and an understanding of various source data formats (XML, JSON, Avro)
  • 3 to 4 years of hands-on development experience with Apache Spark, MapReduce, and HDFS
  • Good understanding of Big Data/Hadoop technologies, including management of a Hadoop cluster with all included services
  • Experience working on the Amazon EMR managed cluster platform
  • Good knowledge of Big Data querying tools such as Pig, Hive, and Impala
  • Should be proficient in developing Unix Shell, PL/SQL, and Scala frameworks; expertise in Java/J2EE and big data technologies such as Hadoop, Apache Spark, and Hive is required
  • Industry experience is preferred
  • Monitoring and evaluating performance, and advising on any necessary infrastructure changes, including changes to the cloud platform
  • Proficient understanding of distributed computing principles
  • Excellent verbal and written communication skills
  • Experience with the onshore/offshore delivery model
  • Agile and DevOps experience