Atos Hiring SCB Hadoop Engineer

Job Description

  • Support the Hadoop platform; knowledge of Oozie, Apache NiFi, Java, and Unix is a must
  • Well-versed in networking, storage, and API creation and debugging
  • Experience supporting applications on the Big Data platform and debugging YARN
  • Write and debug Python code for the designed frameworks
  • Control the Hadoop clusters with proper access controls and apply security with Ranger and Kerberos
  • Coordinate with product vendors to set up and configure infrastructure, network, and nodes
  • Write Python automation scripts for alerts and monitoring dashboards
  • Lead teams on the different platforms that run 24x7
  • Maintain and troubleshoot the environment with Ambari
  • Integrate Hadoop with reporting tools such as Tableau, MicroStrategy, SAS, and HDF
  • Knowledge of Pure Storage, Isilon, and S3 file systems is an added advantage
  • Design and evaluate different disaster recovery approaches for Big Data
  • Tune and scale up clusters, proposing improvements
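
The alerting and monitoring bullet above can be sketched in Python. This is a minimal, hypothetical example of a threshold check that could feed an alerting dashboard; the 85% threshold and the use of local disk usage as a stand-in for a datanode volume are assumptions for illustration, not requirements from the posting.

```python
import shutil


def usage_ratio(used_bytes: int, total_bytes: int) -> float:
    """Fraction of capacity consumed; guards against empty volumes."""
    if total_bytes <= 0:
        raise ValueError("total_bytes must be positive")
    return used_bytes / total_bytes


def should_alert(used_bytes: int, total_bytes: int, threshold: float = 0.85) -> bool:
    """True when usage crosses the (assumed) 85% alerting threshold."""
    return usage_ratio(used_bytes, total_bytes) >= threshold


if __name__ == "__main__":
    # Local volume used here as a stand-in for a datanode disk.
    disk = shutil.disk_usage("/")
    status = "ALERT" if should_alert(disk.used, disk.total) else "OK"
    print(f"{status}: disk usage at {usage_ratio(disk.used, disk.total):.0%}")
```

A real deployment would typically pull metrics from Ambari or the cluster's metrics service instead of the local disk, and push the result to whatever alerting channel the team uses.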


Requirements

  • Hortonworks Hadoop administration or L2/L3 support knowledge
  • Hive/HBase SQL tuning for MapReduce jobs
  • Shell/Python Scripting
  • Certification on Hadoop Administration
  • 6 to 10 years of overall experience, with 4+ years in Big Data engineering and production support
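
L2/L3 support of a 24x7 platform usually starts with identifying failed jobs. A minimal Python sketch, assuming the standard JSON layout returned by the YARN ResourceManager's Cluster Applications REST endpoint (`/ws/v1/cluster/apps`); the sample response below is invented for illustration:

```python
def failed_app_ids(rm_response: dict) -> list:
    """Return IDs of YARN applications whose finalStatus is FAILED.

    Assumes the ResourceManager Cluster Applications API layout:
    {"apps": {"app": [{"id": ..., "finalStatus": ...}, ...]}}
    """
    apps = (rm_response.get("apps") or {}).get("app") or []
    return [a["id"] for a in apps if a.get("finalStatus") == "FAILED"]


# Invented sample response for illustration only.
sample = {
    "apps": {
        "app": [
            {"id": "application_1700000000000_0001", "finalStatus": "SUCCEEDED"},
            {"id": "application_1700000000000_0002", "finalStatus": "FAILED"},
        ]
    }
}
print(failed_app_ids(sample))  # the ID of the one failed application
```

In practice the response would come from an HTTP GET against the ResourceManager, and the failed IDs would feed the `yarn logs -applicationId` workflow for debugging.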