Hello,
Greetings from Diverse Lynx, LLC.
We have an urgent need for a Hadoop Developer for one of our clients in Denver, CO. Please review the requirement below; if you or your consultants are open to projects and interested, please respond with your latest resume and details ASAP.
Title: Hadoop Developer
Location: Denver, CO
Duration: 12+ Months
Req Details:
Core Responsibilities:
· Develop Big Data solutions on a Hadoop platform, leveraging current ecosystem tools
· Develop solutions for real-time and batch-mode event/log collection from various data sources
· Analyze massive amounts of data and help drive prototype ideas for new tools and products
· Develop enterprise-grade integration solutions, leveraging 3rd-party and custom integration frameworks
· Build and support APIs and services that are exposed to other internal teams
· Actively participate in team Agile planning and sprint execution
Requirements:
· Bachelor's or Master's degree in Computer Science or equivalent
· Proven track record of delivering backend systems that participate in a complex ecosystem.
· 8+ years designing and developing Enterprise-level data, integration, and reporting solutions
· 3+ years' experience developing applications on Hadoop, utilizing Pig, Hive, Sqoop, or Spark (must)
· Experience with Hadoop 2.0 and YARN applications
· Proven experience with data modeling, complex data structures, data processing, data quality, and data lifecycle
· Current knowledge of Unix/Linux scripting, as well as solid experience in code optimization and high-performance computing.
· Strong communicator, able to analyze complex issues and technologies and articulate them clearly and engagingly.
· Strong design and problem-solving skills, with a bias for architecting at scale.
· Good understanding of any of the following: advanced mathematics, statistics, or probability.
· Adaptable, proactive and willing to take ownership.
· Keen attention to detail and high level of commitment.
· Experience in messaging and collection frameworks like Kafka, Flume, or Storm.
· 3+ years of distributed database experience (HBase, Accumulo, Cassandra, or equivalent).
· Knowledge of Big Data-related technologies and open source frameworks preferred
· Experience in software development for large-scale distributed environments
· Experience with integration tools such as Pentaho or Informatica Big Data Edition
· Experience using Enterprise scheduling tools such as UC4, Tidal, or Autosys
Experience: 9-12 years.
Thanks & Regards
Ankur Gulati
Diverse Lynx | www.diverselynx.com |
300 Alexander Park, Suite 200, Princeton, NJ 08540
P: 732-452-1006 Ext 238 | ankur.gulati@diverselynx.com