Hadoop Administrator

at Apex Systems Inc

Location: Cincinnati, Ohio

Updated on Aug 11, 2017

Contract Position: 6 months


Pay Rate: $53.20 per hour

Experience: 7–12 years

Eligibility: H-1B visa, Green Card, US citizen


Industry: Information Technology Services



Working Remotely Allowed



Face-to-face interview required.

Position Description:
Hadoop Administrator on a team that supports a Cloudera-based Hadoop cluster.

Position/Project Specific Requirements/Technical Responsibilities:
•           Design, care for, and maintain Big Data environments built on technologies in the Hadoop ecosystem
•           Day-to-day troubleshooting of problems and performance issues in our clusters
•           Investigate and characterize non-trivial performance issues in various environments
•           Work with Systems and Network Engineers to evaluate new and different types of hardware to improve performance or capacity
•           Work with developers to evaluate their Hadoop use cases, provide feedback and design guidance
•           Responsible for ongoing administration of Hadoop infrastructure.
•           Work with data delivery teams to set up new Hadoop users. This includes setting up Linux users, creating Kerberos principals, and testing HDFS, Hive, Pig, and MapReduce access for the new users.
•           Performance tuning of Hadoop clusters and Hadoop MapReduce routines
•           Manage and monitor Hadoop connectivity, performance, security, log files
•           HDFS support and maintenance, capacity planning
•           Collaborate with application teams to install operating system and Hadoop updates, patches, and version upgrades when required.
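The user-onboarding responsibility above can be sketched as a shell script. This is a hedged illustration only: the username, Kerberos realm, group, and keytab paths below are assumptions for the example, not values from this posting, and the exact commands vary by cluster configuration.

```shell
#!/usr/bin/env bash
# Hypothetical sketch: onboarding a new user on a Kerberized Cloudera cluster.
# NEW_USER, REALM, group names, and paths are assumed examples.
set -euo pipefail

NEW_USER="jdoe"          # assumed username
REALM="EXAMPLE.COM"      # assumed Kerberos realm

# 1. Create the Linux account (repeat on all relevant cluster nodes).
sudo useradd -m -G hadoop "$NEW_USER"

# 2. Create a Kerberos principal and export a keytab for the user.
sudo kadmin.local -q "addprinc -randkey ${NEW_USER}@${REALM}"
sudo kadmin.local -q "ktadd -k /etc/security/keytabs/${NEW_USER}.keytab ${NEW_USER}@${REALM}"

# 3. Provision an HDFS home directory owned by the new user.
sudo -u hdfs hdfs dfs -mkdir -p "/user/${NEW_USER}"
sudo -u hdfs hdfs dfs -chown "${NEW_USER}:hadoop" "/user/${NEW_USER}"

# 4. Smoke-test access as the new user.
kinit -kt "/etc/security/keytabs/${NEW_USER}.keytab" "${NEW_USER}@${REALM}"
hdfs dfs -put /etc/hosts "/user/${NEW_USER}/smoke_test.txt"
hive -e "SHOW DATABASES;"   # verify Hive connectivity
```

The same pattern extends to Pig and MapReduce checks (e.g. submitting a sample job as the new user) before handing the account over.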

Required Skills:
•           Experience with Cloudera
•           Experience with Hadoop system administration, including HDFS, YARN, MapReduce, Hive, Pig, and HBase
•           In-depth knowledge of configuration, capacity planning, and tuning of services such as MapReduce, Hive, YARN, and HBase
•           In-depth knowledge of HDFS file formats such as ORC, Avro, and Parquet
•           Experience with data ingestion and streaming frameworks such as Sqoop, Flume, Kafka
•           General operational expertise and excellent troubleshooting skills; understanding of system capacity, bottlenecks, and the basics of memory, CPU, OS, storage, and networks
•           Experience with Kerberos (security)
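As a hedged illustration of the ingestion tooling listed above, a Sqoop import from a relational database into a Hive table might look like the following. The JDBC URL, database, table, and credentials are placeholder assumptions, not details from this posting.

```shell
# Hypothetical Sqoop import: pull a table from MySQL into Hive as Parquet.
# Connection string, table names, and credentials are assumed examples.
sqoop import \
  --connect jdbc:mysql://db.example.com:3306/sales \
  --username etl_user \
  --password-file /user/etl_user/.db_password \
  --table orders \
  --hive-import \
  --hive-table sales.orders \
  --num-mappers 4 \
  --as-parquetfile   # Parquet, one of the HDFS file formats listed above
```

For continuous rather than batch ingestion, the equivalent role of this command would be played by a Flume agent or a Kafka producer/consumer pipeline.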

Position/Project Specific Preferences/Desires/Pluses:
•           SQL and HiveQL knowledge
•           Experience with the Oozie workflow scheduler system
•           Experience with Python
•           Experience with Apache Spark/Scala
