
Customer Centricity Initiative

Loyalty Program

Master Test Plan


v. 1.01
Contents
Document Purpose............................................................................................ 4

1 Scope........................................................................................................... 4
1.1 Requirements to Be Tested............................................................................................................. 4
1.2 Requirements Not to Be Tested...................................................................................................... 5

2 Testing Levels............................................................................................... 5
2.1 Unit Testing...................................................................................................................................... 5
2.2 System Testing................................................................................................................................. 5
2.3 Acceptance Testing.......................................................................................................................... 5
2.4 Regression Testing........................................................................................................................... 5

3 Approach...................................................................................................... 5
3.1 Guidelines for Iterative Testing........................................................................................................ 6
3.2 Guidelines for Test Factors.............................................................................................................. 6
3.3 Application Interfaces......................................................................................................................... 6

5 Environments and Data................................................................................7


5.1 Inventory of Environment Resources............................................................................................... 9
5.2 Test Data....................................................................................................................................... 12

6 Enterprise Test Tools...................................................................................13

7 Test Deliverables – Project Teams.................................................................13

8 Staffing, Responsibilities, and Training.........................................................14

9 Cross-Business Unit/Third Party Participation by Application.........................14

10 Test Communication Plan........................................................................15


10.1 Meeting Schedules and Purpose................................................................................................... 15
10.2 Metrics and Measurements.......................................................................................................... 15
10.3 Link to Bug Tracker...................................................................................................................... 15

11 Testing Schedules...................................................................................15
11.1 High-Level Testing Milestones.................................................................................................. 15
11.2 Link to Test Scheduler.............................................................................................................. 16

12 Risks and Contingencies.........................................................................16

13 Assumptions, Constraints, and Dependencies..........................................16


13.1 Assumptions............................................................................................................................. 16

Page 2 of 14
13.2 Constraints............................................................................................................................... 16
13.3 Dependencies.............................................................................................................................. 16

14 Document References................................................................................17

15 Glossary of Terms.....................................................................................17

16 Document History.....................................................................................17

17 Reviewers and Approvers..........................................................................18

18 Appendix.................................................................................................. 18

19 Process Asset History................................................................................................. 19

Document Purpose
The purpose of this Master Test Plan is to describe the testing effort for the Client's tender-neutral
loyalty program, also known as the Customer Centricity Initiative (CCI). The areas covered include the
scope, approach, resources, and scheduling of testing activities across all testing phases.

1 Scope
Client is creating a tender-neutral loyalty program, which will allow customers to enroll in the program
in-store, online, or through customer service.
The scope of this Program Level Master Test Plan is to outline, at a high level, the testing information
that supports the implementation of the tender-neutral loyalty program. The implementation of this program
will require both Mainframe and Windows application-testing teams to ensure that quality testing is
performed throughout the system development lifecycle of this project.
Testing Advocates on projects classified as a Work Type 1 (WT1) will include their testing activities in the
Project SDLC Document and will provide a link to the Loyalty Program Master Test Plan. Testing
Advocates on projects classified as a Work Type 2 (WT2) will be required to create an application detail
test plan with reference to the Loyalty Program Master Test Plan.
Testing Advocates should use the document asset templates from the intranet site.

1.1 Requirements to Be Tested


The requirements to be tested fall into three categories: Business Requirements, System
Requirements, and Technical Requirements.
Business Requirements: The Business Requirements to be tested are those presented in the Client’s
Loyalty Program Overview Document. The link to this document is . Application teams will provide
traceability evidence from their System Requirements to their applicable Business Requirements.
System Requirements: The System Requirements to be tested are those presented in each
application’s WT1 SDLC Document or Requirements Definition Document (RDD). These documents can
be found on the MST Loyalty Team Site by selecting the desired application tab. Testing Advocates from
each application team will perform testing using System Requirements outlined in either the WT1 SDLC
Document or the RDD and any other supported documentation relevant to their testing efforts.
Technical Requirements: The Technical Requirements to be tested are those presented in the Technical
Definition Document (TDD). Testing Advocates should contact their Project Manager or Development
Lead for their project’s location of the TDD. Application teams will provide traceability evidence from
their Technical Requirements to their System Requirements, to their applicable Business Requirements.
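As an illustration of the traceability evidence described above, the chain from Technical Requirements to System Requirements to Business Requirements can be sketched as a simple lookup check. All requirement IDs and the data layout below are hypothetical, invented for this example, not taken from the project's actual RDD or TDD.

```python
# Hypothetical sketch of a traceability check: every Technical Requirement
# must trace to an existing System Requirement, and every System Requirement
# to an existing Business Requirement. All IDs are invented for illustration.

business_reqs = {"BR-1", "BR-2"}
system_to_business = {"SR-101": "BR-1", "SR-102": "BR-1", "SR-103": "BR-2"}
technical_to_system = {"TR-201": "SR-101", "TR-202": "SR-103"}

def untraced(mapping, known_parents):
    """Return child requirement IDs whose declared parent does not exist."""
    return sorted(child for child, parent in mapping.items()
                  if parent not in known_parents)

# A fully traced chain yields no gaps at either level.
gaps = (untraced(technical_to_system, system_to_business.keys())
        + untraced(system_to_business, business_reqs))
```

A Testing Advocate could run a check like this against an exported requirements list to spot orphaned requirements before test execution begins.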
Program Change Requests (PCRs): For WT1 projects, Testing Advocates should list all approved
PCRs relating to their projects in the Projects SDLC Document. For WT2 projects, Testing Advocates
should list all approved PCRs relating to their projects in their application detail test plan.
MST will not be testing any requirements that relate to the testing activities of the third-party vendor
Acxiom, or any requirements that are out of scope for the tender-neutral loyalty program Pilot.

2 Testing Levels

All application groups participating in the tender-neutral program should incorporate the levels/phases of
testing as needed for the project type.

2.1 Unit Testing


Unit testing is performed during coding and integration testing, with planning done during detailed
design. Since these are developer activities, they are included in the unit test plan (use the Unit Test
Workbook Template on Q4M Streamline (Beta)). Testing Advocates should follow up with the Development Lead
for verification.
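As a hedged sketch of the kind of unit-level check a developer might record in the Unit Test Workbook, the following tests a hypothetical enrollment-validation function. The function, its rules, and its name are invented for illustration; they are not part of the actual loyalty system.

```python
import unittest

def is_valid_enrollment(channel, email):
    """Hypothetical enrollment check: the channel must be one of the three
    supported enrollment channels and the email must look minimally valid."""
    return channel in {"in-store", "online", "customer-service"} and "@" in email

class EnrollmentUnitTest(unittest.TestCase):
    def test_supported_channels(self):
        # Each of the three enrollment channels named in the Scope section
        # should be accepted when paired with a plausible email address.
        for channel in ("in-store", "online", "customer-service"):
            self.assertTrue(is_valid_enrollment(channel, "a@b.com"))

    def test_rejects_unknown_channel_or_bad_email(self):
        self.assertFalse(is_valid_enrollment("fax", "a@b.com"))
        self.assertFalse(is_valid_enrollment("online", "not-an-email"))
```

Recording tests at this granularity gives the Development Lead something concrete to verify during the unit-test review.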

2.2 System Testing


System testing can include many types or phases/levels of testing (Smoke Test, Functional/Component
Testing, Regression Testing, Integration/Cross-FOB Testing, Beta Testing, etc.). Often these are
performed concurrently and in various combinations. Testing Advocates should define and describe the
test types that will be performed for their application testing effort.

2.3 Acceptance Testing


Acceptance testing will be performed as End-to-End Test of all functionality across all platforms and
applications. A Program Level Acceptance Test Plan will be created prior to the End-to-End Test. Testing
Advocates can also create a separate Acceptance Test Plan for their projects, if applicable.
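One way to picture an End-to-End acceptance scenario is as a single scripted flow across enrollment, transaction, and reward verification. The sketch below is purely illustrative: the `LoyaltyPilot` class and its one-point-per-whole-dollar earning rule are assumptions for this example, not the program's actual design, and the real flow spans many applications and platforms.

```python
# Hypothetical sketch of one end-to-end acceptance scenario: a customer
# enrolls, makes a purchase, and earns reward points.

class LoyaltyPilot:
    def __init__(self):
        self.members = {}  # member_id -> accumulated points

    def enroll(self, member_id):
        self.members[member_id] = 0

    def purchase(self, member_id, amount):
        # Assumed earning rule for the sketch: one point per whole dollar.
        self.members[member_id] += int(amount)

    def points(self, member_id):
        return self.members[member_id]

def end_to_end_scenario():
    system = LoyaltyPilot()
    system.enroll("cust-1")           # enrollment step
    system.purchase("cust-1", 25.50)  # transaction step
    return system.points("cust-1")    # reward verification step
```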

2.4 Regression Testing


Regression testing should be performed by every application Testing Advocate to ensure that existing
functionality has not been broken as new code is introduced.

3 Approach

The testing effort for the Loyalty Program will be based on both Waterfall and Iterative testing
methodologies. Waterfall Testing is an approach in which a number of phases are completed in a
strictly ordered sequence: requirements analysis, design, implementation/integration, and then testing.
Iterative Testing is an approach in which an evaluation or assessment of quality is conducted
continuously, on smaller, more manageable units of work.

The Loyalty System testing activity will focus on the FIT Model (Focused Iterative Testing). Focused
Iterative Testing means testing iteratively against a focused area of a software product/system. A focused
area usually refers to a group of specific features in the software product (for example, Offer/Reward).
All other systems can elect to perform Iterative Testing or continue with their current Waterfall method of
testing until End-to-End Testing is ready to be performed. There will be a separate End-to-End
Acceptance Test Plan where all application-test teams will participate together at one time.

3.1 Guidelines for Iterative Testing


 Start with Iteration 1, Build 1.
   o There may be several iterations within a Build.
   o For each iteration performed, the next iteration should build on the previous iteration.
 Record and keep track of the number of Iterations performed.
 Record and keep track of the number of Builds completed.
 For each Build, request that the Development Lead produce Release Notes so that you know what is in
each Build. This will be beneficial as the product continues to develop and will provide traceability.
 Each Build produced should build on the work of the previous Build.

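The bookkeeping recommended in the guidelines above (counting iterations, counting builds, and tying each build to its Release Notes) could be kept in a simple log such as this sketch. The record layout is an assumption for illustration, not a mandated template.

```python
# Minimal sketch of the iteration/build bookkeeping recommended above.

class IterationLog:
    def __init__(self):
        self.entries = []  # one entry per iteration performed

    def record(self, build, iteration, release_notes):
        """Record one iteration of one build, with its Release Notes reference."""
        self.entries.append({"build": build, "iteration": iteration,
                             "release_notes": release_notes})

    def builds_completed(self):
        return len({e["build"] for e in self.entries})

    def iterations_performed(self):
        return len(self.entries)

log = IterationLog()
log.record(build=1, iteration=1, release_notes="notes-b1.txt")
log.record(build=1, iteration=2, release_notes="notes-b1.txt")
log.record(build=2, iteration=1, release_notes="notes-b2.txt")
```

Keeping counts and Release Notes references together gives the traceability the guidelines call for without any extra tooling.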
3.2 Guidelines for Test Factors
Many applications interface for the Loyalty Program. To support a successful testing effort, the
following test factors may be used where applicable. Test factors enable the test process to be
logically constructed.

 Correctness – Assurance that the data entered, processed, and output by the application system is
accurate and complete.
 File Integrity – Assurance that the data entered into the application system will be returned
unaltered.
 Audit Trail – The capability to substantiate the processing that has occurred.
 Continuity of Processing – The ability to sustain processing in the event problems occur.
 Reliability – Assurance that the application will perform its intended function with the required
precision over an extended period of time.
 Ease of Use – The extent of effort required to learn, operate, prepare input for, and interpret
output from the system.
 Maintainability – The effort required to locate and fix an error, a missing requirement, or a
misinterpreted requirement in the application system.
 Portability – The effort required to transfer data from one application to another.
 Web Service Communication – Assurance that messaging services are communicating between
applications without error.
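Several of these factors map directly onto automated checks. For example, the File Integrity factor (data returned unaltered) is commonly verified with a round-trip test. The in-memory store below is a stand-in for the real persistence layer, invented for this sketch, not part of the actual application.

```python
# Hypothetical round-trip check for the File Integrity factor: data written
# to a store must come back exactly as it was entered.

class FakeStore:
    """In-memory stand-in for the real persistence layer."""
    def __init__(self):
        self._data = {}

    def save(self, key, value):
        self._data[key] = value

    def load(self, key):
        return self._data[key]

def file_integrity_roundtrip(store, key, value):
    """Write, read back, and report whether the value survived unaltered."""
    store.save(key, value)
    return store.load(key) == value

store = FakeStore()
intact = file_integrity_roundtrip(store, "member-42", {"points": 120, "tier": "gold"})
```

The same write/read/compare pattern applies whether the store is a mainframe file, a database table, or a web service.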

3.3 Application Interfaces


Integrated Application Context Diagram link

5 Environments and Data


The following environment information is high-level, continues to be a work in progress, and is
subject to change. If changes are made, this document will be updated to reflect them. Note:
Figures 1, 2, and Table 1 were retrieved from the Loyalty Program Level Environment Plan.

Figure 1: Environment Readiness Timeline

Figure 2: Test Environment Resources

5.1 Inventory of Environment Resources


This section lists the resources that are used by each system in each environment. The reference
environment names are those used by the new Loyalty system and their mappings to other
nomenclatures are indicated. New resources are emphasized.
Table 1: Environment Resource Inventory

5.2 Test Data


The test data required to support the testing scenarios listed in the Approach section of this document will
be outlined in the individual application's SDLC or RDD documents. Contact the Quality Control
Organization (QCO) at MST QCO Support@Clients.com if help is needed to load application data.
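Where a team wants to stage simple enrollment test data before involving the QCO, a generator along these lines could help. The columns, the ID format, and the CSV output are hypothetical choices for this sketch, not a mandated load format.

```python
import csv
import io

# Hypothetical test-data generator for enrollment scenarios, cycling through
# the three enrollment channels named in the Scope section.

def make_enrollment_rows(count):
    channels = ["in-store", "online", "customer-service"]
    return [{"member_id": f"M{i:04d}", "channel": channels[i % 3]}
            for i in range(count)]

def to_csv(rows):
    """Serialize the generated rows to CSV text for an environment load."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["member_id", "channel"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

rows = make_enrollment_rows(6)
```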

6 Enterprise Test Tools


Update the following chart to reflect the testing tools used on a project.

Tool Purpose Link to the Tool


Testing Scheduler (TSC) Testing resource scheduling
Bug Tracker Defect management
QA Director & Changepoint Testing script repository and
execution tool. Defect
management.

7 Test Deliverables – Project Teams


The Testing Advocates are responsible for the following work products.

Deliverable – Work Product Purpose


Test Plan Describes the scope, approach, resources, features to be tested,
testing tasks, and risk contingency planning.
For WT1 projects: Enter the link to the Loyalty Master Test Plan in
Section 6.1 of the SDLC Document.
For WT2 projects:
Test Cases Specifies test inputs, execution conditions, test outputs, expected
results, actual results, and test pass/fail results.
For WT1 projects: Enter the link to your test cases in Section 6.4 of
the SDLC Document.
For WT2 projects:
Test Summary Report Test Summary Report summarizes the application testing results
for this project.
For WT1 projects: Enter the summary information in Section 6.3 of
the SDLC Document.
For WT2 projects:
Test Scheduler (TSC) This is a testing resource scheduling application. Testing
Advocates are responsible for creating their own application TSC
entries for this project.

8 Staffing, Responsibilities, and Training

Team Member Role Responsibilities Contact

9 Cross-Business Unit/Third Party Participation by
Application
Business Unit/ Team Name/ Contact Person, Comments
Company Name Applications Phone, E-mail

10 Test Communication Plan

10.1 Meeting Schedules and Purpose


The Loyalty Program Testing Advocates began meeting and will continue to meet on Mondays from 3:15
p.m. to 4:15 p.m. to discuss all items related to the testing efforts for this project. During
End-to-End Testing, the application test teams will meet daily.

10.2 Metrics and Measurements


Defect Tracking Metrics: Reports from Bug Tracker will be produced weekly and distributed to each
application’s Project Manager, Development Lead, and Management.
The Testing Advocate will produce and submit a weekly defect summary report from QA Director to the
Program Level Testing Advocate, who will consolidate it with the Bug Tracker reports and distribute the
combined report to all applicable parties.
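The weekly defect summary described above might aggregate something like the following. The record fields here are assumptions for illustration, not the actual Bug Tracker or QA Director schema.

```python
from collections import Counter

# Hypothetical weekly defect summary: open-defect counts per application.

defects = [
    {"app": "POS", "status": "open"},
    {"app": "POS", "status": "closed"},
    {"app": "RDS", "status": "open"},
    {"app": "RDS", "status": "open"},
]

def weekly_summary(records):
    """Count open defects per application for the weekly report."""
    return Counter(r["app"] for r in records if r["status"] == "open")

summary = weekly_summary(defects)
```

A per-application open-defect count is a simple starting metric; severity and age breakdowns could be layered on the same records.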

10.3 Link to Bug Tracker


All test teams will use Bug Tracker for defect management with the exception of RDS, which is using QA
Director. There is one project level entry in Bug Tracker for the Loyalty Program.

11 Testing Schedules

The testing schedule for the Loyalty Program will be based on the overall project schedule, and the
details of the project schedule are still a work in progress. The project schedule section for the
Loyalty Pilot – Development & System Test Phase currently starts in the project plan at ID 05099 –
60007 (this is subject to change).
Note: Testing Advocates should start testing as soon as Developers have completed Unit Testing and
there are available product areas ready to test.

11.1 High-Level Testing Milestones


Task/Milestone Testing Range Date
Pre-System Testing
System Testing
Integration Testing
Regression Testing
Acceptance Testing (E2E) Development & QA
UAT (Business Partners)

11.2 Link to Test Scheduler


Each Testing Advocate will set up a separate Test Scheduler entry for their application.

12 Risks and Contingencies


Risks and contingencies are being tracked at the project level. If a project risk affects the overall testing,
notify Ginene VanLierop to escalate the risk.

13 Assumptions, Constraints, and Dependencies

13.1 Assumptions
13.1.1 A Testing Advocate, Project Manager, Tech Lead, Development Lead, and Operations Advocate
shall be assigned for each application area.
13.1.2 All projects shall record defects in Bug Tracker, Project Name: Client’s Loyalty Program, Release
Pilot with the exception of RDS, which will be using Changepoint.
13.1.3 Development Leads and Operations Advocates shall contact the QCO with any data or
environment issues.
13.1.4 Testing Advocates shall verify their teams have access to all testing environments. Access shall
be verified prior to the Ready To Test (RTT) checkpoint.
13.1.5 Testing environments shall be ready for each testing phase/level.
13.1.6 All application testing teams shall have their application’s data loaded into the various testing
environments prior to testing.

13.2 Constraints

The resources available to perform testing may impact an application team's ability to produce quality
results.
Un-configured environments may impact an application team's development and testing activity.

13.3 Dependencies
The Loyalty Program application teams will identify testing dependencies.
Examples of testing dependencies are:
1. A requirement for another application team to provide (load or create) test data.
2. A requirement for another application to be functional in the test environment during
testing.
3. A requirement for another team to process or validate output from testing.

14 Document References

Document Path/Location
Q4M Streamline (Beta) Document Assets
for WT1
Q4M Streamline (Beta) Document Assets
for WT2

15 Glossary of Terms
MST Glossary

16 Document History

Version Date Change Author

17 Reviewers and Approvers

Version Name / BU (R)eviewer, (A)pprover, or (B)oth Disposition Date

18 Appendix
N/A

19 Process Asset History

