
IT CONSULTING JOBS
This blog lists the IT jobs we have with our direct clients, vendors, and customers. Please check back regularly, as it is updated with new job postings. Respond to chandra.atholi@outlook.com for client submissions, or call 936 591 2990 for a rapid response.

Friday, October 2, 2015

Job Title: Hadoop Admin    

Location: Reston, VA

Duration: 6+ Months

Emp. Type: W2 / 1099


Hadoop Admin Responsibilities: 
Perform Hadoop cluster installations, implementation, and ongoing administration of the Hadoop infrastructure
Implement strict security controls using Kerberos principals, ACLs, and data encryption to protect the entire Hadoop cluster(s); a brief sketch follows this list
Install, configure, upgrade, test, troubleshoot, and administer Hadoop systems
Manage and maintain the Hadoop ecosystem (MapReduce, HDFS, YARN, Hive, HBase, Sqoop, Pig, etc.)
Ensure system availability, security, integrity, and optimum performance of both Hadoop enterprise solutions and clustered platforms
Design and deploy high-availability, robust, resilient, and supportable solutions
Assist developers in determining efficient ways to load new data sources into the big data environment
Evaluate new tools and technologies and acquire the skills needed to stay current with the fast pace of change in the big data space
Work with the Hortonworks distribution to resolve Hadoop issues
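
As an illustration of the security work referenced above, here is a minimal Python sketch that wraps the standard Kerberos and HDFS command-line tools to authenticate from a keytab and apply an HDFS ACL. The keytab path, principal, service user, and data path are hypothetical placeholders, not details from this posting.

import subprocess

# Hypothetical keytab, principal, and HDFS path -- adjust for the actual cluster.
KEYTAB = "/etc/security/keytabs/hdfs.headless.keytab"
PRINCIPAL = "hdfs@EXAMPLE.COM"
SECURE_DIR = "/data/secure"

def run(cmd):
    # Run a command and fail loudly on a non-zero exit code.
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Obtain a Kerberos ticket from the keytab (kinit is part of the Kerberos client tools).
run(["kinit", "-kt", KEYTAB, PRINCIPAL])

# Grant a read/execute ACL entry to a hypothetical service user, then verify it.
run(["hdfs", "dfs", "-setfacl", "-m", "user:etl_svc:r-x", SECURE_DIR])
run(["hdfs", "dfs", "-getfacl", SECURE_DIR])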

Specialized Knowledge and Skills:
Excellent working experience with Hadoop cluster installation, adding/removing nodes, applying patches, and upgrading/integrating Hadoop ecosystems
Excellent working experience with Hadoop architecture, administration, support, and capacity planning
Highly proficient in HDFS, MapReduce, ZooKeeper, HBase, Pig, Hive, Python, Ambari, and shell scripting
Prior Linux systems administration and troubleshooting experience is strongly preferred
Prior working experience with AWS (any or all of EC2, S3, EBS, ELB, RDS, EMR)
Monitor Hadoop cluster job performance with strong analysis and troubleshooting (see the monitoring sketch after this list)
Expertise with IPsec, VPN, VPC, VPG, load balancing, iperf, MTR, routing protocols, SSH, and network monitoring/troubleshooting tools
Good knowledge of distributed computing environments
Very strong experience implementing security concepts and best practices
Very good understanding of ETL principles and how to apply them within Hadoop
Apply advanced troubleshooting techniques to provide unique solutions to our customers' individual needs
Participate in 24x7 on-call support for the big data environment
Excellent oral and written communication skills
Strong multi-tasking skills
Write tutorials, how-to guides, and other technical articles for the customer community
Prior experience maintaining databases (Oracle, Netezza, MySQL, etc.) is preferred
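
To give a flavor of the day-to-day monitoring mentioned above, the short Python sketch below polls the Ambari REST API for the state of each cluster service. The Ambari host, cluster name, and credentials are hypothetical placeholders and would differ on the actual cluster.

import requests

# Hypothetical Ambari host, cluster name, and credentials -- placeholders only.
AMBARI_URL = "http://ambari.example.com:8080"
CLUSTER = "prod_hadoop"
AUTH = ("admin", "admin")

# Ask Ambari for the current state of every service in the cluster.
resp = requests.get(
    f"{AMBARI_URL}/api/v1/clusters/{CLUSTER}/services",
    params={"fields": "ServiceInfo/state"},
    auth=AUTH,
    headers={"X-Requested-By": "ambari"},
    timeout=30,
)
resp.raise_for_status()

# Flag anything that is not in the STARTED state.
for item in resp.json().get("items", []):
    info = item["ServiceInfo"]
    state = info.get("state", "UNKNOWN")
    marker = "" if state == "STARTED" else "  <-- needs attention"
    print(f"{info['service_name']:15s} {state}{marker}")

An equivalent check could also be scripted around hdfs dfsadmin -report or the YARN ResourceManager REST API, depending on what the cluster exposes.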

To apply, please click here: APPLY NOW
