Dear Associate,

I trust this email finds you well.

I am Ashish from IDC Technologies Inc. We currently have several urgent, long-term job opportunities across different skill sets, all at the Minnetonka, MN location. Please let me know if you have suitable candidates available for any of these openings.

Interviews are today and will be conducted via Skype only.
1. Solution Architects: 2 roles

We are looking for a dynamic individual to join our team of Solution Architects in Optum Data Management in Minnetonka, MN. The Solution Architect will play a critical role in defining the future-state data management capability (Data Fabric) being built on Big Data technologies such as Hadoop and NoSQL databases, as well as by leveraging traditional DW technologies where applicable.
Core responsibilities:
The candidate filling this role is expected to take the lead in gaining technical understanding of Big Data technologies, and in particular to gain working technical knowledge of managing massive volumes of data using Hadoop, Spark, and ElasticSearch. This role will work hands-on to develop proofs of concept (POCs) that identify viable options for implementing a particular data management need, and will pass on what is learned to peers for knowledge sharing.
Other responsibilities:
· Architect solutions for key business initiatives, ensuring alignment with the future-state analytics architecture vision
· Work closely with project teams as outlined in the Agile methodology, providing guidance in implementing solutions at various stages of projects
· Roll up your sleeves and work with project teams to support project objectives through the application of sound architectural principles
· Develop and validate that the proposed solution architecture supports the stated and implied business requirements of the project
· Review technical team deliverables for compliance with architecture standards and guidelines
· Adopt innovative architectural approaches to leverage in-house data integration capabilities, consistent with the architectural goals of the enterprise
· Create architectural designs for different stakeholders that provide a conceptual definition of the information processing needs of the delivery project
Required Qualifications:
· Minimum 10 years in the IT profession
· Minimum 3 years of hands-on experience with data management using Big Data technologies
· Minimum 5 years of experience creating DM/DW architectures
· Experience with various data intake, staging, and integration methods and tools
· Experience delivering against long-term, strategic technology roadmaps
· Success leading multiple projects using the Agile methodology
· High accountability with a demonstrated ability to deliver
· Demonstrated ability to communicate ideas clearly and concisely, both verbally and in writing, to executives and technologists
· Relevant Bachelor's degree or equivalent experience
Preferred Qualifications:
· Hands-on experience with Spark and ElasticSearch
· Experience in a lead role in Java development
· Experience with metadata tools and techniques
· Experience with ELT tools such as Talend
2. Technical Architect: 4 roles

Required Skills: Senior architecture skills, including communication, balancing and understanding multiple projects, and working with the business to outline a strategy/roadmap and drive toward it. Previous database, data virtualization, and Big Data experience preferred.
Must-have experience:
· 2+ years of Hadoop/ETL experience (data processing and scripting)
· Expertise in Big Data technologies
· Strong background in Talend administration and capacity planning in a Big Data environment
· Ability to install and support Talend for Big Data
· Deep experience dealing with performance issues across the entire critical path and with performance management; able to diagnose and resolve issues
· Strong background in UNIX and SQL
· Strong hands-on experience with Hive, HBase, Pig, and Spark
· Very good understanding of Hadoop/MapR
· Willing to be on the on-call support rotation (requires working on weekends)
· Willing to participate in and drive war rooms to address and resolve critical issues as they arise
Nice to have:
· Sqoop
· 2+ years with Spark streaming technologies (Kafka)

Required Qualifications:
· Minimum of 5 years of IT experience in architecture and administration work
· Minimum 2 years of Talend admin experience
· Strong technical and platform knowledge, including some or all of: Talend, Talend Administration, ETL tools (DataStage/Ab Initio/Informatica), databases (DB2/Oracle/Teradata), the UNIX platform (Shell/Perl scripting), and scheduling tools (TWS/Autosys/Control-M). Knowledge of Big Data platform tools such as Hive, HBase, and Pig.
· 1 to 2+ years of development experience, including full life-cycle activities
· Experience with the Agile methodology
· 5+ years working on Linux or UNIX platforms
· Proficiency in MS Office (Word, Excel, Outlook, PowerPoint, Access)
· Excellent communication and presentation skills
3. DBA - Development - Sr IT Architecture Analyst: 2 roles
This position requires the ability to install and support Talend for Big Data.
Required Qualifications:
· Undergraduate degree or a minimum of 5 years of IT experience
· 5 to 7 years' overall experience
· Minimum 1 to 2 years of Talend admin experience
· Strong technical and platform knowledge, including some or all of: Talend, Talend Administration, ETL tools (DataStage/Ab Initio/Informatica), databases (DB2/Oracle/Teradata), the UNIX platform (Shell/Perl scripting), and scheduling tools (TWS/Autosys/Control-M). Knowledge of Big Data platform tools such as Hive, HBase, and Pig.
· 5+ years of development experience, including full life-cycle activities
· Experience with the Agile methodology
· 5+ years working on Linux or UNIX platforms
· Proficiency in MS Office (Word, Excel, Outlook, PowerPoint, Access)
· Excellent verbal and written communication and presentation skills; willing to provide on-call support
4. Technical Lead: 1 role
· Proven MarkLogic development and solution design experience
· Extensive database experience
· A history of delivering production-quality software for enterprise environments
· Intimate familiarity with XML, XSD, and JSON
· Programming experience with XQuery and JavaScript
· Experience designing and implementing RESTful APIs
· Understanding of distributed systems, building web applications, ETL processes, and data modeling
· Good knowledge of MarkLogic configuration and infrastructure
· MarkLogic Security, Triples, and Semantics experience is a plus
· Healthcare experience is a plus
5. Developer: 7 roles
BDApps Application Developer
As a Big Data application developer, you will be accountable for building new, data-intensive software products focused on bringing healthcare data together. These products are mostly analytical in nature, and we use advanced Big Data technologies to provide cutting-edge solutions to business opportunities. Core responsibilities of this role will include:
· Performing all phases of software engineering, including requirements analysis, application design, and code development and testing
· Designing and implementing product features in collaboration with business and IT stakeholders
· Designing, developing, and testing reusable frameworks, libraries, and Java components
· Working very closely with the Architecture group and driving technical solutions
· Being innovative in solution design and development to meet the needs of the business
· Addressing performance issues and designing solutions to meet low-latency needs
· Using Big Data technologies to enrich and transform data and to perform data analytics and machine learning
· Leading technology decisions and driving product implementation and delivery
· Supporting the implementation and driving it to a stable state in production
· Providing alternate design solutions along with project estimates
· Reviewing code and providing feedback on best practices, performance improvements, etc.
· Troubleshooting production support issues post-deployment and developing solutions as required
· Demonstrating substantial depth of knowledge and experience in a specific area of Big Data and Java development
· Driving the team and collaborating to meet project timelines
Requirements:
· 8+ years in the software engineering profession
· 5+ years of hands-on expertise with UNIX, Java, and SQL
· 3 or more years of hands-on expertise with Big Data technologies
· Hands-on expertise with Java applications (including technologies such as AngularJS)
· Demonstrated expertise with Java frameworks in an Agile/Scrum methodology
· Bachelor's degree or equivalent experience in a related field
· Clear and concise communication; able to articulate ideas clearly
Preferred skills:
· Healthcare experience, Data Science background
6. Big Data Technical Lead: 2 roles
Big Data Application Development Lead
The responsibilities and requirements for this role are the same as for the Developer roles in item 5 above, with these additions:
· Minimum 10+ years of experience
· Prior experience leading a team
· Experience handling large initiatives
7. Technical Lead: 5 roles
Data Lake / Big Data Development Technical Lead
As a Big Data Development Technical Lead, you will be accountable for development, testing, and implementation of data sources into the Enterprise Data Lake. You will review and vet technical solutions, lead technical discussions, provide guidance to developers, and assist in troubleshooting. Your primary function will also include project estimation, working with the engagement team, business stakeholders, source-system SMEs, solution architects, technical leads, and others. Core responsibilities of this role will include:
· Performing all phases of software engineering, including requirements analysis, application design, and code development and testing
· Designing and implementing product features in collaboration with business and IT stakeholders
· Designing reusable Java components, frameworks, and libraries
· Working closely with the Architecture group and driving solutions
· Implementing the data management framework for the Data Lake
· Supporting the implementation and driving it to a stable state in production
· Providing alternative design solutions and project estimates
· Reviewing code and providing feedback on best practices, performance improvements, etc.
· Troubleshooting production support issues post-deployment and developing solutions as required
· Demonstrating substantial depth of knowledge and experience in a specific area of Big Data and Java development
Minimum Requirements:
· 10+ years in the software engineering profession
· 5+ years of hands-on expertise with UNIX shell scripts and commands
· 5+ years of hands-on expertise with SQL (writing queries) and RDBMSs
· 2+ years of hands-on expertise with Big Data technologies (HBase, Hive, Sqoop, Pig)
· 2+ years with the Agile/Scrum methodology
· 2+ years of experience leading teams and/or managing workloads for team members
· Demonstrated expertise in Java programming and frameworks
· Bachelor's degree or equivalent experience in a related field
Preferred Requirements:
· Splunk and Talend
8. Dev/QA: 1 role
As a Dev/QA, you will be accountable for development, testing, and implementation of data sources into the Enterprise Data Lake. We are building a brand-new, data-intensive software product focused on bringing healthcare data together. The product is expected to save its customers millions of dollars each year. Core responsibilities of this role will include:
• Performing all phases of software engineering, including requirements analysis, application design, and code development and testing
• Designing and implementing product features in collaboration with business and IT stakeholders
• Designing reusable Java components, frameworks, and libraries
• Working very closely with the Architecture group and driving solutions
• Designing and developing solutions to meet the needs of loading the Data Lake
• Implementing the data management framework for building the Data Lake for Optum
• Supporting the implementation and driving it to a stable state in production
• Providing alternate design solutions along with project estimates
• Reviewing code and providing feedback on best practices, performance improvements, etc.
• Troubleshooting production support issues post-deployment and developing solutions as required
• Demonstrating substantial depth of knowledge and experience in a specific area of Big Data and Java development
Requirements:
• 5-10 years in the software engineering profession
• Hands-on expertise with UNIX
• Hands-on expertise with SQL
• Hands-on expertise with Big Data technologies (HBase, Hive, Sqoop, MapR, Pig, Splunk, and Talend)
• Demonstrated expertise with Java frameworks in an Agile/Scrum methodology
• Bachelor's degree or equivalent experience in a related field
Thanks & Regards,
Ashish Verma
IDC Technologies, Inc.
1851 McCarthy Boulevard, Suite 116, Milpitas, CA 95035, USA
Email: ashish.verma@idctechnologies.com
Phone: 408-459-5637
Hangout: itjobprofessionals@gmail.com
You received this message because you are subscribed to the Google Groups "US Jobs: Requirements, Clients and Consultants" group.
To unsubscribe from this group and stop receiving emails from it, send an email to recruiters-r-us+unsubscribe@googlegroups.com.
To post to this group, send email to recruiters-r-us@googlegroups.com.
Visit this group at https://groups.google.com/group/recruiters-r-us.
For more options, visit https://groups.google.com/d/optout.