Job Description:
- Knowledge and proven experience with Hadoop ecosystem components, including HDFS, YARN, Hive, Spark, HBase, Kafka, and NiFi
- Hands-on experience implementing and managing Hortonworks HDP and HDF clusters
- Knowledge and proven experience managing Hadoop security (Active Directory, Knox, Kerberos, Ranger)
- Knowledge of Linux administration, Java, and virtual environments
- Knowledge of scripting languages such as Python and Unix shell
- Experience with DevOps automation tools (e.g., Git, Bitbucket, Jenkins) and CI/CD pipelines
- Experience managing Hadoop deployments in AWS using Cloudbreak
Responsibilities:
- Day-to-day monitoring and support of HDP and HDF clusters
- Perform cluster maintenance, including patching, upgrades, backup/recovery, and automation of routine tasks
- Maintain and evolve Hadoop component configurations based on usage and project requirements
- Implement and support multi-tenancy and high availability
- Handle user provisioning and access management; configure and maintain security policies
- Troubleshoot and debug Hadoop ecosystem runtime issues
- Support production workloads, data ingestion, and streaming applications
- Support and resolve issues in day-to-day development work
- Perform capacity planning and performance tuning on demand
Education and Experience Required:
- Typically a technical Bachelor’s degree or equivalent experience and a minimum of 6 years of related experience, or a Master’s degree and a minimum of 4 years of experience