JD is below.

Basic:
• Bachelor's degree or foreign equivalent required. Will also consider one year of relevant work experience in lieu of every year of education.
• At least 4 years of experience in DW/BI, including a minimum of 2 years in Hadoop administration on Cloudera and 1 year working with Hadoop MapReduce, Pig, Hive, Sqoop, HBase, Spark, Scala, and Impala.
• At least 1 year with monitoring and configuration management tools such as Nagios, Ganglia, Chef, and Puppet.
• Should have experience with Cloudera Manager, Ambari, or Pivotal Command Center.
• Should be able to work independently.
• Should have experience in troubleshooting and user support.
• Must have patching and upgrade experience.
• Requires experience with capacity planning and performance tuning in a large-scale Hadoop data environment.
• Should have experience with Hadoop cluster security implementations such as Kerberos, Knox, and Sentry.
• Requires experience with cluster design, configuration, installation, patching, upgrading, and high-availability support.
• Should have experience monitoring Hadoop cluster connectivity and security.
• Should have experience in HDFS support and maintenance.
• Should have extensive OS-level knowledge.
• Must have performance tuning experience.

Preferred:
• Knowledge of new technologies in the Hadoop ecosystem.
• Strong analytical skills.
• Data modelling, design, and implementation based on recognized standards.
• Software installation and configuration.
• Database backup and recovery.
• Database connectivity and security.