Full-time
Ejada
Salary: Confidential

Hadoop Big Data Administrator

The Hadoop Big Data Administrator manages large-scale Hadoop environments, in particular Cloudera CDH and CDP, and handles builds, including design, capacity planning, cluster setup, performance tuning, and ongoing monitoring.

Responsibilities

  • Works with business and application development teams to provide effective technical support.
  • Assists / Supports technical team members for automation, development, and security.
  • Identifies the best solutions and conducts proofs of concept leveraging Big Data & Advanced Analytics to meet functional and technical requirements.
  • Interfaces with other groups such as security, network, compliance, and storage.
  • Manages and reviews Hadoop log files for audit & retention.
  • Arranges and manages maintenance windows to minimize the impact of outages on end users.
  • Moves services (redistribution) from one node to another to help secure the cluster and ensure high availability.
  • Assists in reviewing and updating all configuration & documentation of Hadoop clusters as part of continuous improvement processes.
  • Evaluates and recommends systems software and hardware for the enterprise system, including capacity modeling.
  • Architects the Hadoop infrastructure to meet changing requirements for scaling, reliability, performance, and manageability.

Qualifications

  • Hands-on experience with production deployments of Hadoop applications.
  • Strong understanding of best practices and standards for Hadoop application design and implementation.
  • Configuring, monitoring, and administering large Hadoop clusters.
  • Proven experience with automation.
  • Experience in independently performing root cause analysis and making recommendations.
  • Expertise with one or more Big Data distributions such as Cloudera CDH and CDP.
  • Capacity planning, configuration management, monitoring, debugging, and performance tuning.
  • Able to analyze JVMs and identify their resource limitations.
  • Monitors and maintains cluster connectivity and performance.
  • Identifies faulty nodes and programmatically isolates them to avoid process/job failures.
  • Knowledge of backup and restore of RDBMS databases.
  • Strong experience with the following big data tools and technologies:

    ◦ Hive
    ◦ Impala SQL
    ◦ Kudu
    ◦ Spark
    ◦ Spark Streaming
    ◦ Solr


Cluster Security:

• Hands-on expertise in implementing and troubleshooting overall big data cluster security.


Additionally:

• Experience with Kafka and HBase (a plus, not mandatory).

• Solid understanding of automation tools such as Puppet, Chef, or Ansible (a plus, not mandatory).

• Experience with at least one of the following languages: Java, Python, Perl, Ruby, or Bash shell scripting (a plus, not mandatory).

• Good collaboration & communication skills and the ability to work in an interdisciplinary team.

• Good written communications and documentation experience.

• Knowledge of best practices related to security, performance, and disaster recovery.

• A can-do attitude and the self-motivation to learn new components and share knowledge within the team.


Preferred Qualifications


  • Bachelor's or Master's degree with an emphasis in Computer Science or Engineering
  • Minimum 7 years of work experience
  • Minimum 2 years of experience in Hadoop (Cloudera) administration
  • Ability to communicate clearly to support operations teams
  • Strong experience with Linux

Full-time Position

Riyadh Region - Saudi Arabia / Hybrid

Specialist (2-5 yrs)

English - Arabic

ASAP

Any Time Zone

About the Company
Ejada
55471 Riyadh, 11534, Saudi Arabia
'Asir Region, 55471
Saudi Arabia
