Experience: 2+ Years
Primary Skill: Hadoop
What you’ll be doing…
Technology drives our business and you will be part of the action through operating and administering over fifty applications in the Hadoop cluster. With responsibility for overseeing the platform that houses our largest and most critical Network data sets, you will ensure that the data is reliable and can be used to provide solid analytics and reporting. You will collaborate with stakeholders to efficiently administer and maintain the Hortonworks Hadoop Data Platform, while seeking opportunities to identify new business applications for platform deployment.
- Automating the receipt and loading of data feeds into the Hadoop cluster.
- Creating and removing nodes; monitoring and troubleshooting; managing the file system; and forecasting and planning data and node capacity.
- Monitoring and supporting streaming technologies to enable real-time data loading into Hadoop.
- Supporting Big Data tools.
- Developing strategies to automate deployment and application management in Hadoop.
- Setting up new monitoring and support automation for various Hadoop jobs and processes.
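To give a flavor of the capacity forecasting and planning work above: a common approach is fitting a linear trend to daily HDFS usage readings and projecting when the cluster fills up. The sketch below is purely illustrative; the function name and all figures are hypothetical, not part of the role description.

```python
def forecast_days_until_full(used_tb_samples, total_capacity_tb):
    """Estimate days until capacity is exhausted from daily usage samples.

    used_tb_samples: daily used-capacity readings in TB, oldest first.
    Returns None if usage is flat or shrinking (nothing to forecast).
    """
    n = len(used_tb_samples)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(used_tb_samples) / n
    # Least-squares slope = average daily growth in TB/day.
    denom = sum((x - mean_x) ** 2 for x in xs)
    slope = sum((x - mean_x) * (y - mean_y)
                for x, y in zip(xs, used_tb_samples)) / denom
    if slope <= 0:
        return None
    remaining_tb = total_capacity_tb - used_tb_samples[-1]
    return remaining_tb / slope

# Hypothetical example: ~1 TB/day growth on a 500 TB cluster, 400 TB used.
samples = [390 + d for d in range(11)]  # 390..400 TB over 11 days
print(forecast_days_until_full(samples, 500))  # ~100 days of headroom left
```

In practice the usage samples would come from cluster metrics (for example, the output of `hdfs dfsadmin -report`), and the projection would feed node-capacity planning.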
What we’re looking for…
You work well independently, but also enjoy collaborating with cross-functional teams to create solutions that have a big organizational impact. Your attention to detail enables you to quickly diagnose and correct issues. You are a gifted communicator, which allows you to manage stakeholder expectations well. When people run into issues, they look to you for solutions.
You’ll need to have:
- Bachelor’s degree or four or more years of work experience.
- Four or more years of relevant work experience.
- Hadoop Big Data Platform Operations and Administration experience.
- Scripting language experience.
- Experience with all phases of the Software Development Lifecycle, including system analysis, design, coding, testing, debugging and documentation.
Even better if you have:
- A degree.
- Database administration experience on UNIX and/or Linux.
- Hortonworks Hadoop Data Platform experience.
- Stream processing technology experience including Storm, Kafka, or Spark.
- HDPCA certification or any other Hadoop, Linux Certification.
- MapReduce, HDFS, Hive, Sqoop, Flume, Pig, or Oozie experience.
- Tomcat, Apache, WebLogic, or OBIEE software experience.
- Java programming experience.
- PL/SQL programming experience.
- TCP/IP networking, web services, and HTML experience.
- Ability to self-train through training material available online and within the department.