Job Description
We are seeking a hardworking Data Quality Analyst to join our growing team in our insurance practice at Cognizant! Our strength is built on our ability to work together. Our diverse set of backgrounds offers a wide range of perspectives and new ways of thinking. It encourages lively discussions, inspires thought leadership, and helps us build better solutions for our people and clients. We're looking for someone who thrives in this setting and is inspired to craft meaningful solutions through true partnership.
Roles/Responsibilities:
- Work closely with DBA, data management, and testing teams to identify test data needs and set up data that fulfills release and project needs, providing the right data sets for test cycles and iterations in the identified development and test environments.
- Coordinate with clients, data users, and key stakeholders to develop and achieve long-term objectives for the TDM data architecture.
- Perform data modeling, database analysis, and data dependency analysis for the data masking process.
- Create reusable and robust test data for testers and developers.
- Perform sensitive data analysis using the LM security category sheet.
- Handle structured and unstructured data of various types, including distributed relational databases (SQL Server, Oracle, DB2, MySQL).
- Gather requirements to define data definitions, transformation logic, logical and physical data model designs, data flows, and processes in Informatica PowerCenter.
- Design, develop, and maintain ETL processes using Informatica PowerCenter.
- Create data mappings and workflows using Informatica PowerCenter to extract, transform, and load data into the target reporting environment.
- Design, test, support, and debug new and existing ETL and reporting processes in Informatica PowerCenter and IICS.
- Provide development support, including walkthrough approvals, to other ETL application resources to ensure that standards are followed and optimized workflows are implemented.
- Design, develop and maintain Informatica Cloud data integration processes in IICS using the appropriate data load technique
- Create ETL mappings, mapplets, workflows, and worklets using IICS.
- Develop source-to-target mappings and create IICS mapping designs and pipeline ETL.
- Use IICS for data integration in an AWS ecosystem, loading and retrieving data from AWS RDS and SQL Server databases.
- Build data integration processes by constructing mappings, tasks, task flows, schedules, and parameter files.
- Use IICS Data Integration/Application Integration to handle large data volumes.
- Use IICS methodologies for data extraction, transformation, and loading, applying transformations such as Expression, Router, Filter, Lookup, Update Strategy, Union, and Aggregator.
- Handle data profiling, masking, segregation, and data creation using the Delphix masking engine.
- Create JDBC connections for source and target AWS RDS databases.
- Create data profiling rules, masking algorithms, and test data in the Delphix engine.
- Create and run profiling and masking jobs in the Delphix engine.
- Write SQL queries using unions, multi-table joins, and views for ETL testing scripts (a sketch of such a check follows this list).
- Automate, schedule, and monitor ETL workflows using the ESP job scheduler.
- Work with CI/CD tools, GitHub code repositories, and Bamboo deployments.
- Handle multiple AWS data storage services (EBS, S3, RDS, Redshift) for the ETL process.
- Use the AWS Glue service for AWS-native ETL.
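To make the SQL-based ETL testing responsibility above concrete: such scripts typically reconcile a source table against its loaded target copy. The sketch below is a minimal, hypothetical illustration in Python (one of the listed languages), assuming pyodbc is available and using placeholder DSN, table, and column names that are not part of this posting.

```python
"""Minimal ETL reconciliation sketch: compare row counts and a simple
column checksum between a source table and its loaded target copy.

Hypothetical assumptions (not from the posting): pyodbc is installed,
the ODBC DSNs below exist, and the table/column names are placeholders.
"""
import pyodbc

SOURCE_DSN = "DSN=source_sqlserver"  # hypothetical source ODBC DSN
TARGET_DSN = "DSN=target_aws_rds"    # hypothetical target ODBC DSN


def row_count(conn, table):
    """Return SELECT COUNT(*) for the given table."""
    return conn.cursor().execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]


def column_checksum(conn, table, column):
    """Return SUM(column) as a cheap content checksum for the table."""
    row = conn.cursor().execute(
        f"SELECT COALESCE(SUM({column}), 0) FROM {table}"
    ).fetchone()
    return float(row[0])


def reconcile(table, column):
    """Fail loudly if source and target disagree on counts or checksum."""
    src = pyodbc.connect(SOURCE_DSN)
    tgt = pyodbc.connect(TARGET_DSN)
    try:
        src_n, tgt_n = row_count(src, table), row_count(tgt, table)
        src_sum = column_checksum(src, table, column)
        tgt_sum = column_checksum(tgt, table, column)
        assert src_n == tgt_n, f"{table}: row counts differ ({src_n} vs {tgt_n})"
        assert abs(src_sum - tgt_sum) < 1e-6, (
            f"{table}.{column}: checksums differ ({src_sum} vs {tgt_sum})")
        print(f"{table}: {src_n} rows reconciled")
    finally:
        src.close()
        tgt.close()


if __name__ == "__main__":
    reconcile("policy_claims", "claim_amount")  # hypothetical table/column
```

In practice the same pattern extends to hash-based comparisons or sampled row-level diffs; the assertions here are deliberately minimal.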
Requirements:
- 7+ years of data QA experience
- Languages – Core Java, Python
- Big Data – Hadoop, Spark
- Database – SQL Server, Oracle, DB2, AWS RDS
- ETL Technologies – Informatica PowerCenter, IICS, Delphix, AWS Glue
- Cloud – AWS
- Test Data Management (TDM)
Job ID: 117773