Position: Hadoop Admin
Location: San Ramon, CA
Duration: 12+ months
Experience: 10+ years
Job Description:
Minimum of 10 years of experience administering Linux/UNIX systems, databases, storage, and other back-end services in bare-metal or virtualized environments
Desired Skills:
3+ years of experience with managing and monitoring large Hadoop clusters
3+ years of experience writing software/shell scripts in scripting languages such as Perl, Python, or Ruby
3+ years of experience with Hadoop Distributed File System (HDFS)
Hands-on experience with Apache Hadoop ecosystem components such as Pig, Hive, Sqoop, Flume, and MapReduce/YARN
Good understanding of core database concepts
Experience with distributed, scalable databases such as Cassandra, MongoDB, and HBase
Hands-on experience with one or more ETL/ELT tools
Established experience with automated, elastic scaling of cloud services, automated deployment, and remediation
High level of ownership and accountability
Any Pivotal Hadoop/Greenplum/GemFire experience is a plus
--
Thanks & Regards,
PRABAL PANDEY
Work: 510-790-2000 Ext. 1007 | Fax: 510-578-7788
Email: prabal@radiansys.com
39510 Paseo Padre Pkwy # 110
Fremont, CA 94538