Job Title: Hadoop Developer - Backfill (USC or GC only)
Location: Dayton, OH
Duration: Approx. 3 months
Job Description:

Top Required Skillsets: Pig, Hive, Scala

Requirements:
- Bachelor's degree or equivalent work experience is required
- Minimum of six (6) years of experience in Informatica or IBM DataStage development is required
- ETL development
- Hadoop and shell scripting experience
- Informatica experience and knowledge
- SSIS and SQL Server knowledge
- Java or Python experience
- Metadata solutions or tool experience
- Communication skills both written and verbal
- Knowledge of managed care preferred
- Knowledge of healthcare data coding such as CPT-4, HCPCS, ICD-9, DRG, and Revenue Codes is a plus
- Ability to work independently and within a team environment
- Critical listening and thinking skills
- Decision making/problem solving skills
- Strategic management skills
- Training/teaching skills
- Leadership experience and skills

Education / Experience:
- Healthcare experience is preferred
- Data management experience is preferred
Role and Responsibility:
- Participate in and lead the development and implementation of data migration across the enterprise, applying best practices through the data migration tool
- Work within the data migration strategy and roadmap to increase the usability, completeness, and accuracy of enterprise data throughout the company systems and applications
- Ensure the data migration development adheres to standards and practices for development and the SDLC methodology
- Provide input on the architecture and integration of multiple database and reporting tool products within an Enterprise Data Warehouse
- Ensure HIPAA and regulatory compliance rules are addressed for all data movement
- Utilize ETL standards and practices toward establishing and following a centralized metadata repository
- Work with business areas and IT to ensure integrity and proper integration for all sources of enterprise data
- Coordinate with business areas and IT to support a business-rule repository and its processes, storing the rules in a metadata repository
- Work closely with areas directly connected to the Enterprise Data Warehouse to ensure that reporting, business intelligence, and analytic data needs are met
- Work within an iterative approach methodology
- Performs any other job-related instructions as requested

Working Conditions: General office environment; may be required to sit/stand for long periods of time

Licensure / Certification: None