PayPal Hiring Big Data Manager

Job Description

Fueled by a fundamental belief that having access to financial services creates opportunity, PayPal (NASDAQ: PYPL) is committed to democratizing financial services and empowering people and businesses to join and thrive in the global economy. Our open digital payments platform gives PayPal’s 237 million active account holders the confidence to connect and transact in new and powerful ways, whether they are online, on a mobile device, in an app, or in person. Through a combination of technological innovation and strategic partnerships, PayPal creates better ways to manage and move money, and offers choice and flexibility when sending payments, paying, or getting paid. Available in more than 200 markets around the world, the PayPal platform, including Braintree, Venmo, and Xoom, enables consumers and merchants to receive money in more than 100 currencies, withdraw funds in 56 currencies, and hold balances in their PayPal accounts in 25 currencies.


Responsibilities:

• Define the infrastructure vision, strategy, and execution for the Hadoop Enterprise Portfolio at PayPal.

• Determine the key areas where Hadoop R&D should invest to maximize future profit potential, meet emerging customer requirements, and maintain our competitive edge.

• Develop technical requirements for new/existing products within the portfolio as well as feature introductions or improvements.

• Assist in managing the requirements phase of each functional enhancement (including functional/technical specifications, upgrades, and deployments).

• Perform technical duties including evaluating, prototyping, and recommending emerging Hadoop features.

• Support product management/architectural activities related to PayPal Hadoop product offerings.

• Manage the execution of the product roadmap for PayPal’s Enterprise Hadoop Portfolio.

• Build strong relationships with key product group leaders and develop a framework for translating and communicating roadmap and lifecycle requirements to all relevant parties.

• Reach out to key customers and conduct requirements, demo, and iteration walkthroughs with end users.

• Support our partner and developer communities in integrating Hadoop into their systems (ETL, Teradata, etc.).

• Meet all KPIs and ATBs. Provide world-class operational support and uptime. Ensure cluster stability and 24×7 availability.

Basic Qualifications:

  • Results-oriented technology leader with significant experience managing complex operations and systems infrastructure.
  • Expert in Hadoop operations and large-scale cluster deployments.
  • Experience managing globally distributed, high-performance DevOps teams, with a passion for and understanding of how to build a DevOps organization.
  • Experience managing Agile/Scrum and DevOps team methodologies and processes.
  • Proficient in managing a department including budget, contracts, and vendor relationships.
  • Technical understanding of Hadoop, the Hadoop ecosystem and emerging Big Data technologies.
  • Capable of producing all types of documents, including statements of direction, functional specifications, PowerPoint presentations, etc.
  • Ability to work on multiple projects at once, set priorities, work independently, problem solve, improvise, and function as part of a team that performs well under pressure.
  • Experience collaborating across virtual cross-functional teams that are not under direct control including product management, vendors and customers.
  • Superior customer and communications skills, both written and verbal. Strong presentation skills.
  • Thrives in a fast-paced environment within an established data management company.
  • 5+ years of Hadoop management, technical, or engineering experience; enterprise software experience preferred.
  • Hands-on experience with Hadoop and Big Data management technologies (Apache Hadoop, Hortonworks, Cloudera, etc.) would be ideal.