Career Opportunity
Category: Analyst Job Type: Contract
Job ID Number: 10578 City/State: Richmond, VA
Job Title: Systems Analyst (Big Data, Hadoop) - Long Term
Job Description: Location: Richmond, VA / 3+ year contract

The candidate must be a citizen or permanent resident of the USA and will have an expert understanding of Linux, the Hadoop ecosystem, and the associated infrastructure. Knowledge of setting up and configuring Kerberos, Spark, RStudio, Kafka, Flume, Shiny, Ranger, Oozie, NiFi, etc. is a must. The candidate should have a solid understanding of system capacity and bottlenecks, and the basics of memory, CPU, OS, storage, and networking; should be able to deploy a Hadoop cluster, add and remove nodes, keep track of jobs, monitor critical parts of the cluster, configure NameNode high availability, schedule and configure jobs, and take backups; and should have a solid understanding of on-premise and cloud network architectures.

Local candidates strongly preferred.
Green Card holders and US citizens only; no C2C.

  • Will work fairly independently, performing complex development and support services with the IT Enterprise infrastructure teams, ensuring operability, capacity, and reliability for the Big Data system.

  • Will assist in planning, design, support, implementation and troubleshooting activities.

  • Will work with developers and architects to support an optimal and reliable Big Data infrastructure.

  • Will be on call and may need to work evenings/weekends as required.

  • Will be responsible for the implementation and ongoing administration of the Hadoop infrastructure, and will align with the architect to propose and deploy new hardware and software environments required for Hadoop and to expand existing environments. Will set up new users in Linux.

  • This job includes setting up Linux users, setting up Kerberos principals, and testing HDFS, Hive, HBase, and YARN access for the new users, as well as cluster maintenance, including the creation and removal of nodes using appropriate administrative tools.

  • Performance tuning of Hadoop clusters and Spark processes; screening the Hadoop cluster for job performance and capacity planning; monitoring Hadoop cluster connectivity and security; setting up and monitoring users of the system; managing and reviewing Hadoop log files.

  • File system management and monitoring; teaming diligently with developers to guarantee high data quality and availability; collaborating with application teams and users to perform Hadoop updates, patches, and version upgrades when required; working with vendor support teams on support tasks and troubleshooting system issues.

For expedited response, please contact Karan at 804-673-5100 x 35
Copyright (c) 2003, Leading Edge, Inc. All rights reserved.
Leading Edge logotype is a registered trademark. All other logos are the property of their owners.