Mode of Interview: Skype
Requirements: Must successfully complete a background check and drug screen if selected for hire.

Hadoop Design, Implementation and Support · Contract-to-Perm position

We develop cutting-edge software solutions that are helping to revolutionize the informatics industry. We are seeking technical and business professionals with advanced leadership skills to join our tight-knit team at our headquarters in Maryland. This is an opportunity to work with fellow best-in-class IT professionals to deploy new business solutions utilizing the latest Big Data technologies, including a wide array of open source tools. This position requires extensive experience on the Hadoop platform using Sqoop, Pig, Hive, and Flume to design, build, and support highly scalable data processing pipelines.

Hadoop Administrator Responsibilities:
· Work with Data Architects to plan and deploy new Hadoop environments and expand existing Hadoop clusters.
· Design Big Data solutions capable of supporting and processing large sets of structured, semi-structured, and unstructured data.
· Provide administration, management, and support for large-scale Big Data platforms on the Hadoop ecosystem.
· Provide Hadoop cluster capacity planning, maintenance, performance tuning, and troubleshooting.
· Install, configure, support, and manage Hadoop clusters using Apache, Cloudera (CDH3, CDH4), and YARN distributions.
· Install and configure Hadoop ecosystem components such as Sqoop, Pig, Hive, Flume, and HBase, along with the Hadoop daemons on the cluster.
· Monitor and follow proper backup and recovery strategies for high availability.
· Configure property files such as core-site.xml, hdfs-site.xml, and mapred-site.xml.
· Monitor multiple Hadoop cluster environments using Ganglia and Nagios, and monitor workload, job performance, and capacity using Cloudera Manager.
· Define and schedule all Hadoop/Hive/Sqoop/HBase jobs.
· Import and export data from web servers into HDFS using various tools.

Required Skills:
· Extensive experience in Business Intelligence, data warehousing, analytics, and Big Data.
· Experience with hardware architectural guidance, planning and estimating cluster capacity, and creating roadmaps for Hadoop cluster deployment.
· Expertise in the design, installation, configuration, and administration of Hadoop ecosystem components such as Sqoop, Pig, Hive, Flume, and HBase, along with the Hadoop daemons on the cluster.
· Working knowledge of capacity planning, performance tuning, and optimizing the Hadoop environment.
· Experience in HDFS data storage and support for running MapReduce jobs.
· Experience in commissioning, decommissioning, balancing, and managing nodes on Hadoop clusters (see the node-management sketch below).
· Experience with Hadoop cluster capacity planning, maintenance, performance tuning, and troubleshooting.
· Good understanding of partitioning concepts and the different file formats supported in Hive and Pig (see the Hive partitioning sketch below).
· Experience importing and exporting data between HDFS and relational database systems/mainframes using Sqoop (see the Sqoop sketch below).
· Hands-on experience with data analytics tools such as Splunk, Cognos, and Tableau.
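The node-management experience called for above typically looks like the following. This is a minimal sketch, assuming a recent HDFS command line; the hostname and exclude-file path are placeholders, and the exclude file is whatever dfs.hosts.exclude points to in hdfs-site.xml.

# Mark a DataNode for decommissioning, then tell the NameNode to re-read its host lists.
echo "datanode07.example.com" >> /etc/hadoop/conf/dfs.exclude
hdfs dfsadmin -refreshNodes

# Watch decommissioning progress and overall cluster health.
hdfs dfsadmin -report

# After adding or removing nodes, rebalance so each DataNode's utilization
# sits within 10% of the cluster average.
hdfs balancer -threshold 10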
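As an illustration of the Hive partitioning and file-format knowledge mentioned above, here is a hedged sketch that is not part of the original posting: the table, column, and staging-table names are hypothetical, and ORC is just one of the file formats Hive supports (alongside text, SequenceFile, RCFile, Avro, and Parquet, depending on version).

hive -e "
CREATE TABLE IF NOT EXISTS clickstream (
  user_id STRING,
  url     STRING,
  ts      TIMESTAMP
)
PARTITIONED BY (dt STRING)
STORED AS ORC;

SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;

-- Load from a hypothetical staging table, creating one partition per day.
INSERT OVERWRITE TABLE clickstream PARTITION (dt)
SELECT user_id, url, ts, to_date(ts) AS dt
FROM clickstream_staging;
"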
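Finally, a minimal sketch of the Sqoop import/export work described in the skills list. The JDBC URL, credentials file, table names, and HDFS paths are hypothetical placeholders and would come from the actual environment.

# Pull a relational table into HDFS with four parallel map tasks.
sqoop import \
  --connect jdbc:mysql://dbhost.example.com/sales \
  --username etl_user \
  --password-file /user/etl/.db_password \
  --table orders \
  --target-dir /data/raw/orders \
  --num-mappers 4

# Push aggregated results from HDFS back into the relational database.
sqoop export \
  --connect jdbc:mysql://dbhost.example.com/sales \
  --username etl_user \
  --password-file /user/etl/.db_password \
  --table orders_summary \
  --export-dir /data/curated/orders_summary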