Verizon Hiring Big Data Developer

Job Location: Chennai

Experience: 3-5 Years

Primary Skill: Hadoop

What you’ll be doing…

  • Deploy and maintain Hadoop clusters, add and remove nodes using cluster-monitoring tools such as Cloudera Manager, configure NameNode high availability, and keep track of all running Hadoop jobs.
  • Implement, manage, and administer the overall Hadoop infrastructure.
  • Take care of the day-to-day running of Hadoop clusters.
  • Work closely with the database, network, BI, and application teams to ensure that all big data applications are highly available and performing as expected.
  • Own capacity planning: estimate the requirements for increasing or decreasing the capacity of the Hadoop cluster (a minimal monitoring sketch follows this list).
  • Decide the size of the Hadoop cluster based on the data to be stored in HDFS.
  • Ensure the Hadoop cluster is up and running at all times.
  • Monitor cluster connectivity and performance.
  • Manage and review Hadoop log files.
  • Perform backup and recovery tasks.
  • Manage resources and security.
  • Troubleshoot application errors and ensure they do not recur.
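
To give a concrete flavour of the monitoring and capacity-tracking duties above, here is a minimal, illustrative Java sketch (not part of Verizon's posting) that queries overall HDFS capacity and usage through Hadoop's public FileSystem API. The NameNode URI is an assumed placeholder; on a real cluster it would come from fs.defaultFS.

```java
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.FsStatus;

public class HdfsCapacityCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Assumed NameNode address; replace with the cluster's fs.defaultFS value.
        try (FileSystem fs = FileSystem.get(URI.create("hdfs://namenode:8020"), conf)) {
            FsStatus status = fs.getStatus();
            long gb = 1024L * 1024 * 1024;
            long capacity = status.getCapacity();
            long used = status.getUsed();
            long remaining = status.getRemaining();

            // Report overall HDFS capacity, usage, and headroom.
            System.out.printf("Capacity : %d GB%n", capacity / gb);
            System.out.printf("Used     : %d GB (%.1f%%)%n", used / gb, 100.0 * used / capacity);
            System.out.printf("Remaining: %d GB%n", remaining / gb);
        }
    }
}
```

A check like this (or its CLI equivalent) is typically scheduled and fed into alerting, so capacity trends are visible well before the cluster runs short of space.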

What we’re looking for…

You’ll need to have

  • Bachelor’s Degree or four or more years of experience.
  • Four or more years of relevant experience.
  • Experience in designing, implementing, and administering highly available Hadoop clusters secured with Kerberos, preferably using the Cloudera Hadoop distribution (see the sketch after this list).
  • In-depth understanding of the Hadoop ecosystem (e.g. HDFS, MapReduce, HBase, Pig, Sqoop, Spark, Hive).
  • Willingness to perform 24x7 operational support and on-call duties in rotation.
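
For reference, the sketch below shows what programmatic access to a Kerberos-secured HDFS cluster can look like, as alluded to in the requirements above. The principal, keytab path, and configuration value are assumed placeholders, not details from the posting; in practice the security settings usually come from core-site.xml.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class KerberosHdfsAccess {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Assumed setting; normally provided by the cluster's core-site.xml.
        conf.set("hadoop.security.authentication", "kerberos");

        UserGroupInformation.setConfiguration(conf);
        // Illustrative principal and keytab path; replace with real values.
        UserGroupInformation.loginUserFromKeytab(
                "hdfs-service@EXAMPLE.COM", "/etc/security/keytabs/hdfs.keytab");

        try (FileSystem fs = FileSystem.get(conf)) {
            // Simple smoke test: list the root directory of the secured cluster.
            for (FileStatus status : fs.listStatus(new Path("/"))) {
                System.out.println(status.getPath());
            }
        }
    }
}
```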