Title: ETL Hadoop Developer
Location: Cleveland, OH
Duration: Long Term
Essential Functions:
- Build distributed systems that scale well
- Build complex algorithms that implement state-of-the-art analytics on Hadoop with MapReduce
- Participate in Research and Development (R&D) activities at the project level
- Cross train team members on areas of technical expertise
- Technical ownership of specific facets of the technology stack
- Keen focus on your own technical skill development
- Own the full life cycle of big feature development, from definition and design through implementation and testing
- Advocate for development of best practices in the organization and bring in knowledge of new technologies to the team
- Effective in managing own projects
- Oversee project and code quality
- Oversight is mostly limited to the scope of own projects, but may include oversight of other software engineers assigned to those projects
- Mentor Software Engineers on team
- Influence across the engineering organization
- Regularly contribute to ongoing improvements in engineering process and product development ecosystem
- Foster an environment of continuous learning and improvement
- Contribute to ongoing education initiatives
- Assist with on-boarding of new engineers
Minimum Qualifications:
- Basic knowledge of data structures, algorithms, and complexity analysis
- Basic knowledge of Hadoop and MapReduce
- Basic knowledge of the Hadoop ecosystem, including Hive, Pig, and HBase
- Basic knowledge of ETL patterns for Hadoop, HBase, Hive, and Sqoop
Title: BO Admin
Location: Wilmington, DE
Duration: Long Term
Job Description
- 5-6 years of BO Admin experience, or 5-7 years of combined BO Admin and development experience with hands-on administration work in previous assignments
- Crystal Reports admin experience (on-demand and scheduled)
- Strong understanding of BO Architecture and performance tuning
- Strong experience with installation and configuration of the entire BO suite in an AIX and WebSphere environment
- Must be able to effectively scale and size a technical environment and configure BO settings for optimal performance
- Adept at report optimization, tuning, caching, specifying query governing thresholds and VLDB properties, and job monitoring
- Experience upgrading between various versions of BO in PROD, DEV, SIT and UAT
- Develop and maintain Unix/AIX scripts for automation, capacity planning and monitoring
- Troubleshoot user-caused performance issues and give recommendations for more efficient prompt answers, or make report modifications to better guide the user toward efficient runs
- Troubleshoot report SQL and locate inefficient or incorrect joins that may cause performance issues in Oracle
- Troubleshoot report performance issues and make suggestions for new or tuned aggregations
- Ability to demonstrate excellent problem solving, analytical and interpersonal skills
Title: Cognos Admin
Location: Cleveland, OH
Duration: Long Term
Job Description
- Excellent analytical skills
- Excellent oral and written communication skills, including active listening, asking appropriate questions, clarifying information, and writing clear, concise documents
- Administration experience using Cognos 10.x or higher
- Modeling experience using Cognos Framework Manager
- Design and build cubes using Cognos Transformer 7 or higher, preferably Transformer 8 or higher
- Data warehouse experience
- Ability to troubleshoot issues within Cognos 10.x
- Experience with Cognos 11.x
- 5+ years of relevant experience in a business intelligence role, including data warehousing and business intelligence tools, techniques, and technology
- 5+ years of experience diving deep into data analysis or technical issues to develop effective solutions
- IBM Certified Administrator or IBM Certified Solution Expert preferred
- Experience working with health data and the corresponding privacy and security requirements a plus
- Experience in data mining (SQL, ETL, data warehouse, etc.) and using databases in a business environment with large-scale, complex datasets
- Proven ability to look at solutions in unconventional ways and to innovate
- Experience with Cognos Framework Manager, developing and maintaining cubes, queries, and reports
- Plan and maintain upgrades, patches, and hot fixes for Cognos
- Experience with IBM InfoSphere DataStage and SPSS a plus
- Experience with Cognos customizations
- Experience with DB2 and Vertica a plus
Title: Big Data Security Specialist
Location: Cleveland, OH
Duration: Long Term
Job Description
- 2+ years of experience implementing security on one of the following infrastructures:
  - IBM Open Platform (IOP)
  - Cloudera
  - Hortonworks Data Platform (HDP)
  - MapR
- 2+ years of expertise with Hadoop management and security tools such as Kerberos, Apache Ranger, Apache Knox, Apache Sentry, and Apache Atlas
- 2+ years supporting security technologies such as LDAP and SSO
- Expertise in use of Ambari for deploying security into Apache Hadoop
- Proficiency in Linux and Python
- 5+ years of IT solution building experience
- 3+ years with the Apache Hadoop ecosystem
Thanks & Regards,
Baidyanath Kamti
Nityo Infotech Corp
Desk: 609-853-0818 X 2106
Email: baidyanath.kamti@nityo.com
LinkedIn: www.linkedin.com/in/baidhy
Gtalk/Yahoo: baidya.recruiter