P. Kumaraswamy Mail: kumar.maruru@gmail.com
Software Test Engineer Mobile: +91-8050826449
Professional Summary:
* Over 6 years of experience in Software Testing, primarily in Functional, ETL,
Data Migration and Report Testing.
* Hands-on testing experience in the Financial, Core Banking, Healthcare and Insurance
Bond domains.
* Good at Functional, Sanity, Regression, Ad-hoc, Compatibility, SIT and UAT Testing.
* Good Experience in Data Migration Testing.
* Good experience in ETL & Report (Informatica & Cognos) Migration Testing.
* Good experience in Cube building and Cubes Report Testing.
* Good experience in ETL tools such as Informatica and SSIS.
* Good experience in reporting tools such as Cognos and SCA.
* Good experience in generating cube reports using pivot tables.
* Good experience in writing SQL queries and UNIX commands.
* Good exposure on SQL Server Integration Services 2012.
* Good exposure on PL/SQL.
* Good exposure on SQL DBA.
* ISTQB-ISEB Certified Tester Foundation Level.
Professional Experience:
* Working as a Senior Consultant at Capgemini, Bangalore, from March 2015 to
date.
* Worked as a Software Test Engineer at L&T Infotech, Bangalore, from June 2010 to
March 2015.
Education:
• M.Sc (Electronics) From Sri Venkateswara University, Tirupathi, A.P.
Expertise:
Operating Systems : Windows, UNIX
ETL Tools : Informatica, SSIS, Ab Initio
Reporting Tools : Cognos, CPMS, SCA
Tools : ESP Job scheduler, Fitnesse, FileZilla
Test Management Tools : QC, HP-ALM
Languages : HTML, C, C++, JSP, JavaScript
Databases : Teradata, Oracle 11g, Netezza, SQL Server 2012
Project #1:
Project Title : Totality (Johnson & Johnson)
Client : Johnson & Johnson
Role : Senior Consultant
Tools : Informatica 9.1, HP-ALM
Environment : Oracle 11g, UNIX.
Testing : ETL Testing.
Project Synopsis:
Totality is a healthcare compliance initiative that helps fulfil mandated reporting
requirements for transactions involving health care professionals and/or clinical
investigators. Totality creates an integrated capability across the J&J Pharmaceutical Sector
that allows it to meet its Health Care Compliance requirements. Totality PO/CR
Update is the process that updates the Totality transaction status when a PO/CR approval or
update is received by webMethods from Ariba. All PO/CR approvals and updates published by
Ariba for a Totality transaction should be retrieved and stored in the Totality DB.
Responsibilities:
1. Understanding the supplier dataflow as per the changes in the HCC supplier data.
2. Understanding the source-to-target mappings.
3. Preparing the test plan.
4. Preparing the test scenarios and test cases.
5. Invoking the wf_m_Load_Suplr_Hist workflow by executing the .ksh script.
6. Checking that the SUPPLR data is loaded into the SUPPLR_HIST table.
7. Inserting new records and updating existing records in the SUPPLR table.
8. Invoking the wf_m_Suplr_To_Suplr_Hist_SCD2 workflow by executing the .ksh script.
9. Checking that all new inserts and updated records are properly reflected/loaded into
the SUPPLR_HIST table.
10. Checking that the current month's SUPPLR file has been processed after a successful run of
the SCD2 workflow wf_m_Suplr_To_Suplr_Hist_SCD2.
11. Checking that the file count matches the SUPPLR_HIST table count.
12. Executing the test cases and raising defects in HP-ALM.
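The history-load checks described above can be sketched as a minimal source-vs-target validation. This is an illustrative sketch only: the column names and the in-memory SQLite database (standing in for Oracle) are assumptions, not the project's actual schema.

```python
import sqlite3

# In-memory SQLite stands in for the Oracle source/target (illustrative only;
# table names SUPPLR/SUPPLR_HIST are from the project, columns are assumed).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE SUPPLR (suplr_id INTEGER, name TEXT)")
cur.execute("CREATE TABLE SUPPLR_HIST (suplr_id INTEGER, name TEXT, curr_flag TEXT)")

# Pretend the .ksh-invoked workflow has already loaded these rows.
cur.executemany("INSERT INTO SUPPLR VALUES (?, ?)", [(1, "Acme"), (2, "Globex")])
cur.executemany("INSERT INTO SUPPLR_HIST VALUES (?, ?, ?)",
                [(1, "Acme", "Y"), (2, "Globex", "Y")])

# Check 1: source count matches the current-flagged history count.
src = cur.execute("SELECT COUNT(*) FROM SUPPLR").fetchone()[0]
tgt = cur.execute("SELECT COUNT(*) FROM SUPPLR_HIST WHERE curr_flag = 'Y'").fetchone()[0]
assert src == tgt, f"count mismatch: {src} vs {tgt}"

# Check 2: no source row is missing from history (a MINUS-style query).
missing = cur.execute("""
    SELECT suplr_id, name FROM SUPPLR
    EXCEPT
    SELECT suplr_id, name FROM SUPPLR_HIST WHERE curr_flag = 'Y'
""").fetchall()
assert missing == [], f"rows missing in SUPPLR_HIST: {missing}"
```

In practice the same count and MINUS queries would be run directly against the Oracle schema after each workflow run.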
Project #2:
Project Title : Totality-Data Migration
Client : Johnson & Johnson
Role : Senior Consultant
Tools : Informatica 9.1, Cognos 10.2, HP-ALM
Environment : Teradata, Oracle 11g, UNIX.
Testing : Data Migration Testing (Informatica & Cognos).
Project Synopsis:
Migration is a one-time activity; as part of the migration, all scripts are migrated from the
Reference server to the Remediation server.
Responsibilities:
1. Identifying the migrated workflows and Cognos reports.
2. Preparing the test cases.
3. Running the Informatica workflows/sessions.
4. Capturing the run statistics and comparing the performance of the pre- and post-migration
workflows.
5. Generating the reports by passing the same parameters pre- and post-migration.
6. Capturing the run statistics and comparing report performance pre- and post-migration.
7. Comparing the report record count with the View table.
8. Validating the report data against the DWH Concur View table.
9. Checking whether the report data is displayed as per the filter conditions applied to the View table.
10. Executing the test cases and raising defects in HP-ALM.
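The pre- vs post-migration report comparison reduces to checking that both environments return the same rows. A minimal sketch, assuming the two report runs have been exported as lists of rows (the values below are invented placeholders):

```python
# Stand-ins for the pre- and post-migration report extracts (illustrative only;
# real runs would export the Cognos report output to CSV first).
pre  = [("2014-01", 120), ("2014-02", 95)]
post = [("2014-02", 95), ("2014-01", 120)]

# Row order may differ between environments, so compare sorted row lists.
assert sorted(pre) == sorted(post), "pre/post migration report data differs"
```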
Project #3:
Project Title : CitiKYC
Client : Citi
Role : Senior Software Test Engineer
Tools : Ab Initio
Environment : Oracle11g, UNIX.
Testing : ETL Testing.
Project Synopsis:
The CitiKYC platform enables a single shared KYC record for a customer's accounts, so the
bank can maintain a comprehensive understanding of clients and accounts across its lines of
business, with a global view of clients through a common KYC record based on standards set
by client type. Initially, data integrity of the incoming batch feed is checked at the source
on-boarding system and the feed files are generated in a UNIX landing directory; after a
successful run of the ETL scripts, the data is migrated into the Staging and Regional
Data Mart tables. Using the CitiKYC application, we can check the records stored in the
regional Data Mart.
Responsibilities:
1. Understanding the requirement specifications.
2. Understanding the Table relationship from the Data Model document.
3. Understanding the File to Table Mapping from the Mapping documents.
4. Identify the mandatory and optional fields for each input component file from the specification
document.
5. Preparing the Test Plan document.
6. Preparing the Test Scenarios and Test cases.
7. Validating the structure of the Source and Target attributes with respect to mapping document.
8. Validating the feed from the file.
9. Query the data from Staging database and validate data with respect to source feed after
successful run of ETL Migration scripts.
10. Checking the source data load in staging and Marts database.
11. Checking the Migrated KYC data exist in the CitiKYC application.
12. Preparing daily Status and attending status call.
13. Run the test cases in QC.
14. Defects were raised and tracked using QC.
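The mandatory/optional field check on an input feed file can be sketched as below. The field names and the pipe-delimited layout are hypothetical; the real layout comes from the File-to-Table mapping document.

```python
import csv
import io

# Hypothetical pipe-delimited KYC feed; "region" is treated as optional,
# the other two fields as mandatory (assumed, for illustration only).
MANDATORY = {"client_id", "client_type"}

feed = io.StringIO("client_id|client_type|region\nC001|CORP|EMEA\nC002||APAC\n")
reader = csv.DictReader(feed, delimiter="|")

errors = []
for lineno, row in enumerate(reader, start=2):  # header is line 1
    for field in MANDATORY:
        if not row.get(field):  # empty or missing mandatory field
            errors.append((lineno, field))

# Row C002 on line 3 is missing its mandatory client_type.
assert errors == [(3, "client_type")]
```

A check like this would be run against each feed file in the landing directory before validating the staging load.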
Project #4:
Project Title : Data Migration-Travelers Insurance Bond
Client : Travelers
Role : Senior Software Test Engineer
Tools : Fitnesse
Environment : Abinitio, SQL Server 2012, UNIX.
Testing : Data Migration Testing.
Project Synopsis:
Data Migration is a one-time activity to migrate data from the source to the target database. All
data-migration-specific objects need to be created in the database before the migration starts. CLDW
holds transaction data made in the CAMS and Express Bond applications.
Responsibilities:
1. Understanding the requirement specifications.
2. Understanding the Mapping documents.
3. Identify the relationship between the tables using the Physical Data Model document.
4. Preparing the Test Plan document.
5. Preparing the Test Scenarios and Test cases.
6. Preparing the test data based on given conditions in the mapping document.
7. Run the migrated UNIX scripts using the Fitnesse tool.
8. Check the migrated source data in the target database.
9. Check that the data conditions in the target table are as expected.
10. Preparing Status tracker and attending daily status call.
11. Run the test cases in QC.
12. Defects were tracked using QC.
Project #5:
Project Title : TAA – IMS Health
Client : IMS Health
Role : Senior Software Test Engineer
Tools : SSIS, ESP (Job scheduler).
Environment : SQL Server 2008, Netezza, UNIX.
Testing : DWH Testing.
Project Synopsis:
The TAA system directly connects to the CPMS system to access the client-specific
territory structure data and cohort data. The DDMS Panelref file is supplied by the PTR system.
TAA loads the Panelref file by reusing the CHAMP packages. Using the CPMS and Panelref
data, the shop-to-region and cohort-to-shop mappings are identified. Using the territory structure
hierarchy and region-to-shop data, a data file and an XML file are generated for each client-specific
territory structure sourced from CPMS. Using the client-cohort data and cohort-to-shop
data, a data file and an XML file are generated for each client sourced from CPMS.
TAA accesses these tables in CPMS and populates the data in the TAA tables. The Tactical
Custom Grouping Loader loads territory alignments into TDW.
Responsibilities:
1 Understanding the data system functionality from the requirement specifications.
2 Understanding the interface file mapping to TDW mapping.
3 Preparing Query Log.
4 Preparing the test plan document.
5 Preparing Test Scenarios and Test cases
6 Run the TAA event using the ESP job scheduler.
7 Check that the data and XML files were placed on the file server in the specified GLA path.
8 Check the record count in each generated data file.
9 Validate the generated XML files.
10 Load the generated file data into TDW by triggering the CGL event using the ESP job scheduler.
11 Check the log information in the log files for every event trigger.
12 Check the data in the load tables in TDW.
13 Preparing Status tracker and attending weekly status calls.
14 Run the test cases in QC.
15 Defects were tracked using QC.
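The XML file validation step can be sketched with the standard-library parser: a malformed file fails parsing, and expected elements can then be checked. The element and attribute names below are invented for illustration; the real schema comes from the specification.

```python
import xml.etree.ElementTree as ET

# Hypothetical territory-alignment XML; the actual structure is defined
# by the project specification, not by this sketch.
xml_text = "<alignment><territory id='T1'><shop>S1</shop></territory></alignment>"

root = ET.fromstring(xml_text)  # raises ParseError if the file is malformed
shops = [s.text for s in root.iter("shop")]

assert root.tag == "alignment"
assert shops == ["S1"]  # every expected shop appears under its territory
```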
Project #6:
Project Title : TDW - Data Projection.
Client : IMS Health
Role : Senior Software Test Engineer
Tools : SSIS, ESP (Job scheduler).
Environment : SQL Server 2008, Netezza Database, UNIX.
Testing : DWH Testing.
Project Synopsis:
Data Projection is an interface between the DP system and TDW where the data from the DP
system of the acquisition tier is extracted and loaded into GLA. The source for DP is
PTR. The DP system is extracted to provide data projections to TDW on a periodic basis
(weekly/monthly). Upon recalculation of data projections in the source system, the change is
provided to the DP system and to TDW. The recalculated projections for the entire period are
loaded into TDW.
Responsibilities:
1 Understanding the requirement specifications.
2 Understanding the STTM mapping.
3 Preparing test plan document.
4 Preparing Query Log
5 Preparing Test Scenarios and Test cases
6 Load and check the weekly and monthly transaction data loaded into TDW.
7 Run the DP event using the ESP job scheduler.
8 Check that the PJF file and XML file are generated in the specified path.
9 Run the TDW Loader job and check that the projection data is loaded into the Projection Load, Stage
and Active tables.
10 Check the SIT data against Production data.
11 Preparing Status tracker and attending weekly status calls.
12 Run the test cases in QC.
13 Defects were tracked using QC.
Project #7:
Project Title : APAC Australia Cube
Client : IMS Health
Role : Senior Software Test Engineer.
Tools : ESP (Job scheduler), SCA.
Environment : SQL Server 2008, SSAS, UNIX.
Testing : Cube Report Testing
Project Synopsis:
The Australia Cube is a module of the data warehouse testing. A cube contains a
summary of the principal data contained within the OLAP database. Once we specify the
data source, database and OLAP cube, the analyzer applies those measures to the dimension
levels within the database. Therefore, every possible combination of measure and dimension is
already part of the cube for generating the report.
Responsibilities:
1 Understanding the requirement specifications.
2 Preparing test plan document.
3 Preparing Query Log.
4 Preparing Test Scenarios and Test cases.
5 Building and hosting the cubes.
6 Creating cube report using SCA tool.
7 Testing the cube structure and validating the cube data.
8 Defects were tracked using MQC.
Project #8:
Project Title : Tactical Data Adaptor-DC&DSD
Client : IMS Health
Role : Senior Software Test Engineer
Tools : SSIS, ESP (Job scheduler).
Environment : SQL Server 2008, UNIX.
Testing : DWH Testing.
Project Synopsis:
The Tactical Data Adaptor is a data collection tool. Its purpose is to collect the market
selections and data customizations from the strategic components Data Warehouse Market Selection
(DWMS) and Data Customisation (DC), respectively.
Responsibilities:
1 Understanding the requirement specifications.
2 Understanding the component functionality
3 Preparing the query log.
4 Preparing test scenarios and test cases.
5 Create the request and run the DSD market definition in the PRS system.
6 Test that the market data has been loaded into the DSD customization table in TDW after the
TDA-DSD event completes.
7 Test that the customization data has been loaded into the DC customization table in TDW after
the TDA-DC event completes.
8 Executing the test cases.
9 Defects were tracked using MQC.
Project #9:
Project Title : LBU 593-LRx Belgium
Client : IMS Health
Role : Senior Software Test Engineer
Tools : ESP (Job scheduler), LRx UK Production tool.
Environment : ASP.NET, SQL Server 2008, UNIX.
Testing : DWH Testing.
Project Synopsis:
CPMS, the Client Parameter Management System, is a Windows-based application that enables the
creation of customized reports and databases for clients. The market definition for the extract process
is created within CPMS; using this, we can specify exactly which data should be included in reports
and databases, and how the data is to be structured and displayed. For the created market, we define
the report with different therapy levels by assigning the facts, dimensions and the period of data.
Once the report is complete, it is run by the LRx Production Tool.
Responsibilities:
1 Understanding the requirement specifications.
2 Preparing test plan document.
3 Preparing test scenarios and test cases.
4 Create the reports based on new Offering RXD2.
5 Create the reports based on new Offering QFF2.
6 Defining the jobs for running the reports using LRx Production Tool.
7 Check the job run details and log information.
8 Check the generated file in the GLA path on the UNIX server.
9 Create the dataview report and check that the assigned packs are displayed.
10 Preparing Status tracker and attending weekly status calls.
11 Defects were tracked using MQC