Day 1
- Module 1: Introduction: 1.1 Course Introduction (0:30), 1.2 Ice Breaker (0:15)
- Module 2: Testing in Accenture: 2.1 ADM Overview (0:30), ADM Activity (0:15), 2.2 V-Model (1:00)
- Break
- Module 2 (continued): V-Model Activity (0:45), 2.3 Quality Management (0:30)
- Lunch
- Module 2 (continued): 2.4 Test Management (1:00), 2.5 ADT Tools (0:15), ADT Tools Activity (0:30)
- Break
- Module 2 (continued): 2.6 Checkpoint (0:30)
- Module 3: Assembly Test: 3.1 Overview (0:30), 3.2 Plan the Test (0:45)
- Day 1 Summary

Day 2
- Module 3: Assembly Test (continued): Activity Part 1, Create the Test Plan (1:45)
- Break
- Module 3 (continued): 3.3 Prepare the Test (0:30), Activity Part 2, Create the Test Script (1:00), 3.4 Execute the Test (0:15)
- Lunch
- Module 4: Product Test: 4.1 Overview (0:30), 4.2 Plan the Test (0:15), 4.3 Prepare the Test (0:15), 4.4 Execute the Test (0:15)
- Break
- Module 4 (continued): Activity Part 3, Execute the Test Script (1:45), 4.5 Checkpoint (0:30)
- Presentations
Module 1: Introduction
Copyright 2007 Accenture All Rights Reserved. Accenture, its logo, and Accenture High Performance Delivered are trademarks of Accenture.
Welcome!
Welcome to the Application Testing School!
Introductions
Module 1: Introduction
Course Positioning
This course is part of the required Application Delivery curriculum for Solutions and falls within the scope of the Accenture Solutions Delivery Academy.
https://deliveryacademy.accenture.com
Module 1: Introduction
Icebreaker Activity
Locate the note card at your seat. Each person's note card has a word that is one half of a matching pair. Find the person whose note card completes your pair. Once you've found your pair, introduce yourselves and find out the following: name, role, reason for being here, and an interesting fact about yourself. Then introduce your partner to the rest of the class.
Module 1: Introduction
Course Goals
What are your goals for this school? What do you hope to learn while you are here?
Course Goals
Module 1: Introduction
School Agenda
Module 1: Introduction Module 2: Testing in Accenture Module 3: Assembly Test Module 4: Product Test Module 5: School Closing
School Expectations
For a successful class, please
- Arrive on time
- Assist your colleagues; show respect to all individuals regardless of their skill and knowledge level
- Wear business casual attire
- Do not use class time to surf the net, check e-mail, or use instant messaging
- Turn all cell phones off
- Adhere to the attendance policy as directed by your local training coordinator
Module 1: Introduction
Questions/Comments
What questions or comments do you have?
Learning Objectives
- Articulate ADM's high-level organization and structure
- Describe the Testing Framework
- Summarize how the V-Model is used to effectively test an application
- Identify and differentiate between the types of testing involved in the development lifecycle and explain their purpose and benefits, including:
  - Component Test
  - Assembly Test
  - Performance Test
  - Product Test
  - User Acceptance Test
  - Operational Readiness Test
- Map the phases of the software development lifecycle to the different test tasks related to those phases
- Define entry and exit criteria and summarize the importance of stage containment
- Summarize the importance of quality management in the testing process
- Describe the tester's role in quality management
- Describe the various Quality Programs in Accenture, including QPI, SEPG, SQA, and CMMI
- Describe and differentiate between the various roles and responsibilities involved in a typical testing project
- Identify the critical aspects of managing the test process
- Describe the ADT Tools and explain their purpose
- Map the ADT tools to the phases of the V-Model
Module 2: Agenda
ADM Overview The V-Model Quality Management Test Management ADT Tools for Testing Checkpoint
ADM Overview
What is Accenture Delivery Methods (ADM)?
A collection of methods that cover:
- Management
- Strategy and planning
- Development
- Operations
[ADM structure diagram: workstreams, stages, activities, and tasks]
Testing Tasks
3. Prepare the Test
4. Establish the Test Environment
5. Execute the Test

Execute Test deliverables: documented actual results and fixed defects, a fully executed Test Plan, a sign-off Test Closure Memo, and a sign-off sheet.
Scenario Activity
Using the ADM website:
- You will work in teams of two
- With your partner, answer the questions on the worksheet provided
- Use the ADM website to assist you in finding the answers to the questions
- At the end of the activity, be prepared to share your answers with the class
https://kx.accenture.com/Offerings/Pages/ADM.aspx
Module 2: Agenda
ADM Overview The V-Model Quality Management Test Management ADT Tools for Testing Checkpoint
[Accenture V-Model diagram: the Plan, Analyze, Design, Build, Test, and Deploy phases map to the test stages. Verification runs down the left side of the V (Requirements, Functional and Technical Design, Build) and validation up the right side (Component Test, Assembly Test, Performance Test, Product Test), across the application and application integration layers.]
Repeatability
Present when the same test can be reused both within and across releases.
Benefits:
- Ensures standardized, measurable processes
- Saves time through test reuse
- Allows for test automation, reducing overall test execution time and the potential for human error
Approaches:
- Provide detailed, step-by-step instructions and expected results in test scripts
- Separate test data from test scripts
- Update the test model to reflect changes made to the application over time
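One way to apply the "separate test data from test scripts" approach is to keep the script as reusable logic and load the data from an external file, so the data can change between releases without touching the script. A minimal sketch; the `login` function and the CSV columns are hypothetical stand-ins, with an in-memory CSV in place of a real data file:

```python
import csv
import io

# Hypothetical system under test: a login check.
def login(username, password):
    return username == "admin" and password == "s3cret"

# Test data kept separate from the test logic; a real project would
# load this from an external CSV file maintained per release.
TEST_DATA = (
    "username,password,expected\n"
    "admin,s3cret,True\n"
    "admin,wrong,False\n"
)

def run_data_driven_test(data_file):
    """Run the reusable script once per data row; return pass/fail flags."""
    results = []
    for row in csv.DictReader(data_file):
        actual = login(row["username"], row["password"])
        results.append(actual == (row["expected"] == "True"))
    return results

print(run_data_driven_test(io.StringIO(TEST_DATA)))  # [True, True]
```

Because the script never hard-codes the data, adding a regression case for a new release means editing only the data file.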
Accenture V-Model

[Diagram: the Plan, Analyze, Design, Build, Test, and Deploy phases mapped to Requirements and the Operational Readiness Test, Functional and Technical Design, the application and application integration layers, and the Component, Assembly, and Product Tests.]
- Plan Product Test
- Plan Performance Test
- Plan User Acceptance Test
- Prepare and Execute Product Test
- Prepare and Execute Performance Test
Stage Containment
With Stage Containment
- More errors than defects
- Cost and effort for fixing problems is minimized
[Matrix: defect origin (Analyze, Design, Detailed Design / Build, Component Test, Assembly Test, Product Test, Deploy) versus stage discovered. With stage containment, most problems are found in the stage where they originate.]
Stage Containment
Without Stage Containment
- More defects than errors
- Fixes become more expensive and difficult
[Matrix: defects originating in Analyze through Assembly Test are discovered one or more stages later. POTENTIAL CHAOS!]
Stage Containment

Without Stage Containment

[Matrix: defects introduced as early as Analyze and Design are not discovered until Product Test or Deploy. WORST CASE!]
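Stage containment can be tracked with a simple metric: the fraction of problems discovered in the same stage where they originated. A hedged sketch; the defect log below is invented for illustration, and the stage names follow the slides.

```python
# V-Model stages, in the order used on the preceding slides.
STAGES = ["Analyze", "Design", "Detailed Design / Build",
          "Component Test", "Assembly Test", "Product Test", "Deploy"]

def containment_rate(defect_log):
    """Fraction of problems caught in their stage of origin.

    defect_log holds (origin, discovered) stage-name pairs. In the
    slides' terms, an in-stage catch is an 'error' and an escape to a
    later stage is a 'defect'.
    """
    contained = sum(1 for origin, found in defect_log if origin == found)
    return contained / len(defect_log)

# Hypothetical log: three problems contained, one escaped defect.
log = [("Design", "Design"),
       ("Component Test", "Component Test"),
       ("Analyze", "Analyze"),
       ("Design", "Product Test")]  # escaped: costly to fix late
print(containment_rate(log))  # 0.75
```

A falling containment rate is an early signal that fixes are drifting toward the expensive right-hand side of the matrix.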
Development Manager
Test Lead
Tester
Quality Architects
Users
Module 2: Agenda
ADM Overview The V-Model Quality Management Test Management ADT Tools for Testing Checkpoint
Quiz Question # 1
What is Quality Management?
Quiz Question # 2
What are we doing to mitigate quality issues?
A. Establishing a structured quality process program (QPI) that will directly impact project execution and delivery
B. Benchmark our quality levels against CMMI Level 3
C. A & B
D. Accept the current losses as normal and hope the problem doesn't get worse
Quiz Question # 3
What will this mean for me?
A. I will receive training, coaching, and support in best practice process areas, which will increase my professional development
B. My project could be evaluated on a monthly basis on adherence to the defined best practices and metrics
C. I will be expected to follow Accenture's corporate methodology (ADM) from planning through to implementation while tailoring it to the needs of the client and the size of the project
D. All of the above
- Augments Accenture's firm-wide methodology
- Provides monthly reviews against best practices
- Provides increased visibility into project execution
CMMI at Accenture
CMMI is used at Accenture in the following ways:
- Is used for continuous improvement of our development practices
- Applies to all systems development and/or maintenance projects
- Is used with the Accenture Delivery Suite to reinforce systems industrialization and high performance technology
CMMI Levels
Level 5: Optimizing. Emphasis on continuous improvement.
Level 4: Quantitatively Managed.
Level 3: Defined. Project-specific processes are based on a tailored version of an organization-wide process set.
Level 2: Managed. Processes are planned, documented, performed, monitored, and controlled (at the project level).
Level 1: Initial.
Benefits of CMMI
CMMI benefits fall into four areas: cycle time, productivity, predictability, and quality.
- Deliver faster than competitors
- Accomplish more with less
- Improve effectiveness of multi-site development
- Improve accuracy of cost and time estimates
- Improve ability to meet delivery commitments consistently and reliably
- Reduce the number of defects
- Improve alignment of IT solutions to the needs of the business
- Ensure investments target the right priorities
- Optimize use of resources across priorities
- Establish an environment for continuous improvement through appropriate measurement
- Provide accurate data on a regular basis to help the Test Lead generate accurate and effective metrics
- Know the importance of accurate data
- Provide accurate data to the Test Lead or Manager during various testing activities
Module 2: Agenda
ADM Overview The V-Model Quality Management Test Management ADT Tools for Testing Checkpoint
Team members must know the roles and responsibilities of each team member in order to understand how the teams work together, what to expect, and how to interact with each other. The roles include the Development Manager, Test Manager, Test Lead, and Technical Support, as well as Testers, Application Designers, Application Developers, Quality Architects, Technology Designers, Technology Developers, and Users.
Document your results on a flipchart and be prepared to share your thoughts with the class
Testing Techniques
- Black Box Testing
- White Box Testing
- Positive Testing
- Negative Testing
- Static Testing
- Dynamic Testing
[Black box testing diagram: Input → Executable Program → Output]
Disadvantages
- Only a small number of possible inputs can actually be tested; testing every possible input would take nearly forever
- Without clear and concise specifications, test cases are hard to design
- There may be unnecessary repetition of test inputs if the tester is not informed of test cases the programmer has already tried
- Many program paths may be left untested
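A small illustration of the black box approach: cases are derived from the specification alone and check outputs only, without reading the implementation. The leap-year checker is a hypothetical system under test.

```python
# Hypothetical system under test; a black-box tester never sees this body.
def is_leap_year(year):
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# Input/expected-output pairs derived purely from the specification.
# Note how few of the possible inputs are covered: the first
# disadvantage listed above.
spec_cases = {2000: True, 1900: False, 2024: True, 2023: False}

for year, expected in spec_cases.items():
    assert is_leap_year(year) == expected, f"failed for {year}"
print("all black-box cases passed")
```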
Disadvantages
- A skilled tester is needed to carry out this type of testing, which increases the cost
- It is nearly impossible to look into every bit of code to find hidden errors, which may create problems and result in the failure of the application
Positive Testing
Aimed at showing that the software works. Testing the system by providing valid data. Also known as "test to pass".
Negative Testing
Testing aimed at showing software does not work. Testing the system by providing invalid data. Also known as "test to fail".
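The two styles can be shown against one hypothetical input validator: positive cases supply valid data and expect success; negative cases supply invalid data and expect a controlled rejection rather than a crash or silent acceptance.

```python
# Hypothetical validator for the system under test.
def parse_age(text):
    value = int(text)              # raises ValueError on non-numeric input
    if not 0 <= value <= 130:
        raise ValueError("age out of range")
    return value

# Positive testing ("test to pass"): valid data should succeed.
assert parse_age("42") == 42

# Negative testing ("test to fail"): invalid data should be rejected.
for bad in ["-1", "200", "abc"]:
    try:
        parse_age(bad)
    except ValueError:
        pass                       # the expected, graceful rejection
    else:
        raise AssertionError(f"invalid input {bad!r} was accepted")
print("positive and negative cases behaved as expected")
```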
Static Testing
Static Testing (also known as "Dry Run Testing") is a form of testing in which the software is not actually executed. Syntax checking and manually reading the code to find errors are methods of static testing, and it is mostly used by developers. Static testing is generally not considered detailed testing; it mainly checks the sanity of the code or algorithm and assesses whether the program is ready for more detailed testing. Methods such as code review, inspection, and walkthrough are used. Static testing can be done before compilation.
Dynamic Testing
Dynamic Testing involves working with the software: giving it input values and checking whether the output is as expected. Dynamic testing can take place only after compilation.
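The contrast can be sketched with Python's standard library: parsing source code checks its syntax without executing it (static), while running the code and checking its output against an expectation is dynamic. This example is illustrative only, not part of the course material.

```python
import ast

source = "def double(x):\n    return x * 2\n"

# Static testing: inspect the code without running it. Parsing checks
# the syntax, and this can happen before any execution.
tree = ast.parse(source)
print("static check passed:", isinstance(tree, ast.Module))

# Dynamic testing: execute the code, supply an input value, and check
# that the output is as expected.
namespace = {}
exec(compile(source, "<example>", "exec"), namespace)
print("dynamic check passed:", namespace["double"](21) == 42)
```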
[Test management process flow: Develop Approach → Plan Test → Prepare Test → Execute Test]
Test metrics measure the test process and thereby facilitate understanding and controlling it. They determine:
- The earned testing progress/value
- The amount of testing remaining
- When testing will be complete
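As a sketch, those three quantities fall out of a few counters kept during execution. All the numbers below are hypothetical, and the straight-line projection is the simplest possible model:

```python
# Hypothetical snapshot of one test cycle.
total_scripts = 120   # scripts planned for the cycle
executed = 90         # scripts run so far
passed = 81           # scripts that passed
days_elapsed = 6      # working days spent so far

progress = executed / total_scripts        # earned testing progress
pass_rate = passed / executed              # quality of what was run
daily_rate = executed / days_elapsed       # execution velocity
days_remaining = (total_scripts - executed) / daily_rate

print(f"progress: {progress:.0%}")                          # progress: 75%
print(f"pass rate: {pass_rate:.0%}")                        # pass rate: 90%
print(f"estimated days to complete: {days_remaining:.1f}")  # 2.0
```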
Module 2: Agenda
ADM Overview The V-Model Quality Management Test Management ADT Tools for Testing Checkpoint
Use
- Understand requirements and track their changes
- Model user interactions in order to validate the requirements
- Define test cases for the requirements
Use
Update the relevant UML models with changed or new requirements
Use
Create or update code, based on the model; the model and code are synchronized to reflect the change
Use
Create and maintain easy to follow manual tests with support for automated data entry and data verification to reduce human error
Use
- Create and maintain automated scripts for regression testing of your applications
- ScriptAssure technology accommodates changing user interfaces, and script maintenance is available in standard development languages
Use
Create or update code, based on the model The model and code are synchronized to reflect the change
Use
- Define and group test cases
- Create the documents or automated scripts for executing the test cases; run test cases written with Rational Functional Tester or Rational Performance Tester
- Record the test results and store them in the backend database
- Analyze testing, from planning through execution, with built-in or custom-created queries or graphs
Spend 5 minutes discussing the answers to the following questions for your tool:
- In which phase would your tool be used?
- How do you think your tool would be used?
- What do you think the benefits are for using your tool?
[V-Model diagram mapping ADT tools to phases: RequisitePro and Software Modeler support Requirements; the test stages from Operational Readiness Test through Product Test cover the functional and technical design, application, and application integration layers through Deploy.]
Customizable
Developer-strength editing
https://kx.accenture.com/Communities/Pages/ADTCoP.aspx
Module 2: Agenda
ADM Overview The V-Model Quality Management Test Management ADT Tools for Testing Checkpoint
Checkpoint
Checkpoint Quiz Round 1!
Checkpoint
Copyright 2007 Accenture All Rights Reserved. Accenture, its logo, and Accenture High Performance Delivered are trademarks of Accenture.
Develop an understanding of and fluency with the Test Plan/Process deliverables, including:
- Test Conditions and Expected Results
- Test Scripts
- Test Cycle Control Sheets
- Test Data

- Create concise and effective test conditions and expected results for an assembly test that reference all required specifications
- Create concise, effective, and reusable test scripts for an assembly test
- Articulate the Developer's perspective on the testing tasks for an Assembly test
Module 3: Agenda
Assembly Test Overview
Plan the Test Activity 1 Prepare the Test Activity 2 Execute the Test
Test Stages
1. Component Test 2. Assembly Test 3. Product Test 4. Performance Test 5. User Acceptance Test 6. Operational Readiness Test
Notice that the Test Stages are based upon the V-Model.
[V-Model diagram excerpt: Design, Component Test, Application and Application Integration layers, Product Test]
The same testing tasks apply to each and every test stage.
until progressive assembly testing verifies the low-level functionality across the system.
Module 3: Agenda
Assembly Test Overview Plan the Test
Activity 1 Prepare the Test Activity 2 Execute the Test
- Test Scenarios and Test Conditions and Expected Results (TCER)
- Test Cycle Control Sheet: defines when and by whom test cycles are executed
- Test Script: details the exact steps that a tester must follow to complete testing (i.e., to test all the test scenarios and conditions); test scripts also include the data that is used for testing
Deliverables
1. Test Scenarios (Usually created by Lead) 2. TCER - Test Conditions and Expected Results (This is the one we will focus on creating.)
The tester may or may not use all of the above input documents.
Use the use case model as an analysis tool to drive out and fully understand the application's functional requirements.
Module 3: Agenda
Assembly Test Overview Plan the Test
Activity 1
Prepare the Test Activity 2 Execute the Test
Timing
1 hour 45 minutes
Activity 1 Part 1
Activity Overview and Instructions

For Activity 1, Part 1, you will create the Test Conditions and Expected Results based on the Business Requirements.
1. Review the following inputs:
   a) Test Scenarios
   b) Requirements document
2. Create test conditions and expected results and fill out the TCER template
3. Peer Review:
   a) Find a partner and review each other's TCERs against the given criteria
   b) Share your feedback with your partner
Activity 1 Part 2
Activity Overview and Instructions

For Activity 1, Part 2, you will create the Test Conditions and Expected Results based on the UI Design.
1. Review the following inputs:
   a) Test Scenarios
   b) UI Design document
2. Create test conditions and expected results and fill out the TCER template
3. Peer Review:
   a) Find a partner and review each other's TCERs against the given criteria
   b) Share your feedback with your partner
Activity 1 Part 3
Activity Overview and Instructions

For Activity 1, Part 3, you will create the Test Conditions and Expected Results based on the Use Case documents.
1. Review the following inputs:
   a) Test Scenarios
   b) End User Use Case
   c) Launch WMA Use Case
2. Create test conditions and expected results and fill out the TCER template
3. Peer Review:
   a) Find a partner and review each other's TCERs against the given criteria
   b) Share your feedback with your partner
Module 3: Agenda
Assembly Test Overview Plan the Test Activity 1 Prepare the Test Activity 2 Execute the Test
[Test process flow: Develop Approach → Plan Test → Prepare Test → Execute Test]

Deliverables produced by this point include the Test Plan, Test Approach, Test Scenarios, Test Conditions and Expected Results, Test Cycle Control Sheet, Test Script, and Test Data.
Deliverables
1. TCCS Test Cycle Control Sheet (Usually created by Test Lead) 2. Test script (This is the deliverable we will focus on creating)
- Test Scenarios and Test Conditions and Expected Results (TCER)
- Test Cycle Control Sheet: defines when and by whom test cycles are executed
- Test Script: details the exact steps that a tester must follow to complete testing (i.e., to test all the test scenarios and conditions); test scripts also include the data that is used for testing
Release Identifier - The specific product release number for which this deliverable was produced.
Product - The name of the product for which this deliverable was produced.
Platform - The type of application (operating system and hardware) on which the product was designed to run.
Configuration - A description of how the product is configured; for example, a specific application setting or a tool that was used to manage a group of application settings.
Test Model Status - An indicator of whether or not the deliverable has been approved by a project manager.
Test Model Version - The version number of the deliverable, indicating how many times it has been revised.
Prepared by - The initials of the person who originally prepared this deliverable, and the date on which the deliverable was prepared.
Approved by - The initials of the person who approved the deliverable (if applicable), and the date on which it was approved.
Document Status - An indicator of whether or not the deliverable has been approved by a project manager.
Cycle Number - The sequence in which the test cycle will be run within the specified test level.
Scheduled Start/End Date - The dates on which the specified test cycle is scheduled to begin and end, respectively.
Actual Start/End Date - The dates on which the specified test cycle was actually started and completed, respectively.
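The scheduled and actual date fields above lend themselves to a small record type. This sketch (with invented dates) shows how schedule slippage for a cycle could be derived from a TCCS row; the class and method names are hypothetical.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class TestCycle:
    """One row of a Test Cycle Control Sheet (fields from the slides)."""
    cycle_number: int
    scheduled_start: date
    scheduled_end: date
    actual_start: Optional[date] = None
    actual_end: Optional[date] = None

    def slippage_days(self):
        """Days the cycle finished late; negative means early, None if open."""
        if self.actual_end is None:
            return None
        return (self.actual_end - self.scheduled_end).days

# Hypothetical cycle that overran its schedule by two days.
cycle = TestCycle(1, date(2007, 5, 7), date(2007, 5, 11),
                  date(2007, 5, 7), date(2007, 5, 13))
print(cycle.slippage_days())  # 2
```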
Example 2

[Table: a functional cycle grouping pairing the business functions Check Out Book, Reserve Book, Check In Book, and Cancel Reservation with the conditions Get Patron Info, Get Book Info, Process Loan, Process Return, Process Reservation, and Process Cancellation.]
Step Number | Action / Description | Input Data | Expected Result
1.001 | Enter User Name; Create User Password | First Name: XXX; Last Name: XXX; Middle Initial: X; Password: XXX; Retype Password: XXX | Page displays the entered information; Password is displayed as a series of asterisks (***)
1.002 | | |
The Step Number field identifies each step to be executed. When a defect is identified, the corresponding Step Number should be documented in the SIR.
The Action/Description field provides detailed directions on HOW to execute each step, and should be based on the existing Test Conditions. Each of these steps should begin with an action verb and should describe only one action at a time.
The Input Data field lists each field for which the tester should input data to complete the Action/Description. Note that this field only lists the type of data to be entered (hence the XXX format); the data itself is housed in a separate document.
The Expected Result field should be closely aligned with the existing Test Conditions and Expected Results. It should explicitly describe the result testers should see upon completing the action for that step.
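Taken together, the four fields form a record that a simple runner could iterate over, comparing actual results to expected ones and logging a status per step. A sketch only; `execute` is a placeholder for driving the real application, and the step data echoes the example above.

```python
# Each step of a test script carries the four fields described above.
steps = [
    {"step": "1.001",
     "action": "Enter user name",
     "input_data": {"First Name": "XXX", "Last Name": "XXX"},
     "expected": "Page displays the entered information"},
    {"step": "1.002",
     "action": "Create user password",
     "input_data": {"Password": "XXX", "Retype Password": "XXX"},
     "expected": "Password is displayed as a series of asterisks"},
]

def execute(step):
    # Placeholder: a real runner would perform the action against the
    # application and capture the observed result here.
    return step["expected"]

for step in steps:
    actual = execute(step)
    status = "PASS" if actual == step["expected"] else "FAIL"
    print(step["step"], status)
```

A failing step's number is exactly what would be recorded in the SIR, as described above.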
Example 2

[Table: the functional cycle grouping maps Check Out Book, Reserve Book, Check In Book, and Cancel Reservation to Scripts 1 through 5.]
Refine and order the test scripts.
Given these benefits of common test data, we need to plan effectively for our test data.
Module 3: Agenda
Assembly Test Overview Plan the Test Activity 1 Prepare the Test Activity 2 Execute the Test
Timing
Module 3: Agenda
Assembly Test Overview Plan the Test Activity 1 Prepare the Test Activity 2 Execute the Test
Module 4: Agenda
Product Test Overview
Plan the Test Prepare the Test Execute the Test Activity 3
Checkpoint
Test Stages
1. Component Test 2. Assembly Test 3. Product Test 4. Performance Test 5. User Acceptance Test 6. Operational Readiness Test
Notice that the Test Stages are based upon the V-Model.
Product Test
First point where the solution's functional and business requirements are tested together. Tests that all functional and business requirements have been met by the system.
Ensures that the interactions between the components function correctly before building the overall solution. Ensures that users already have a working understanding of the system when they design and execute the User Acceptance Test (UAT). Sometimes called system test.
A test of the business requirements met by each individual application. An end-to-end test of the business requirements across all applications and platforms.
The tester's role is to execute complete end-to-end business scenarios that map to the defined requirements of the system.
Module 4: Agenda
Product Test Overview Plan the Test
Prepare the Test Execute the Test Activity 3
Checkpoint
Tasks include:
- Define Product Test Approach
- Define Product Test Conditions and Expected Results
- Define Product Test Cycles
- Perform Peer Reviews
18
19
Deliverables
The tester may or may not use all of the above input documents.
Module 4: Agenda
Product Test Overview Plan the Test Prepare the Test Execute the Test Activity 3 Checkpoint
Deliverables
Module 4: Agenda
Product Test Overview Plan the Test Prepare the Test Execute the Test
Activity 3
Checkpoint
The deliverables are documented actual results, fixed defects, and a fully executed Test Plan.
- Provides an estimated priority level
- Clearly describes the expected outcome
- Includes complete documentation of the resolution and object impacts (upon resolution)
- Includes accurate information
- Is updated on a timely basis
Module 4: Agenda
Product Test Overview Plan the Test Prepare the Test Execute the Test Activity 3 Checkpoint
Inputs
Timing
Module 4: Agenda
Product Test Overview Plan the Test Prepare the Test Execute the Test
Activity 3
Checkpoint
Checkpoint
Checkpoint Quiz Round 2!
Checkpoint
Module 1: Introduction
Looking Back
After completing this course, you should now be able to:
- Identify and describe the major testing tasks within ADM
- Describe the role and responsibilities of a tester
- Articulate how a developer contributes to a successful test execution
- Understand and apply the V-Model in application testing
- Read, understand, and correctly interpret functional specifications, requirements, and other test planning documentation, and state the requirements for the creation of each deliverable
- Generate effective TCERs and Test Scripts using the appropriate functional specifications, requirements, and test planning documentation
- Successfully execute Assembly and Product tests based on the appropriate test planning documentation
- Properly report defects found during test execution
Module 1: Introduction
School Recap
Through presentations, discussions, and activities, this course has covered the following topics:
- Accenture Delivery Methods
- V-Model
- Quality and Test Management
- ADT Tools
- The Assembly Test
- The Product Test
- Creating Test Conditions and Expected Results
- Creating and Executing Test Scripts
Reflections
What new insights, strategies, and skills have you gained from this school? How do you think you will use these new skills and insights on your current or next project?
Module 1: Introduction
Additional Resources
Seek assistance when you need it:
- Use https://experts.accenture.com/
- Join communities of practice
- Use the methodology! Visit https://kx.accenture.com/Offerings/Pages/ADM.aspx
Module 1: Introduction
Activities
Overview
In this activity you will use your knowledge of Accenture Methods to answer scenario-based questions.
Instructions
With your partner, answer the following questions. You may use the ADM website as a resource to assist you in answering them. Scenario 1: You have been assigned to a project and have been asked to prepare a product test for the client application. Even though this is your first time working with a product test, your Team Lead is overloaded with work and doesn't have time to train you. So, you go to ADM to find the information you need. 1. Where in ADM can you find out about preparing a product test?
2. What are all the things you need to do to completely prepare for a product test?
3. What are the inputs that need to be complete in order for you to begin to prepare for a product test?
4. One part of this task involves creating the test script for a product test. You've never created a product test script before. What do you do?
Mod2_ADMActivity_ParticipantInstructions.doc
Scenario 2: You just got back from a planned extended vacation. On your return, you are assigned to a new project to fill in on a short-term basis. The resource you are filling in for left unexpectedly due to a serious illness; as this was an unplanned leave, the resource was not able to hand off his/her assignments appropriately. You've been given a Test Approach and a Requirements document, and have been told that you need to complete the Test Plan for the assembly test for the client application. 1. What do you do first?
2. Where do you go to find out what you need to do to complete the test plan?
3. What other documents do you need to complete in order to complete the test plan?
Mod2_ADMActivity_ParticipantInstructions.doc
Overview
In this activity you will research one of the types of testing and present your findings to the class.
Instructions
1. Faculty will break the class up into six groups
2. Each group will be assigned one of the following tests:
   a. Component
   b. Assembly
   c. Performance
   d. Product
   e. User Acceptance
   f. Operational Readiness
3. For your assigned test, research the following information:
   a. What does your test do, and what is its purpose?
   b. Who typically performs your test?
   c. When does your test typically occur?
   d. What are the major inputs and deliverables for your test?
   e. What are some best practices?
4. Document your answers on a flipchart, and prepare to teach the class about your test
Mod2_VModelActivity_ParticipantInstructions.doc
Overview
In this activity you will answer questions about one of the ADT Tools.
Instructions
1. You will be separated into 5 groups
2. Each group will be assigned one of the tools below:
   a. Rational RequisitePro
   b. Rational Manual Tester
   c. Rational Functional Tester
   d. Rational Performance Tester
   e. ClearQuest Test Management
3. As a group, spend 5 minutes discussing the answers to the following questions for your tool:
   a. In which phase would your tool be used?
   b. How do you think your tool would be used?
   c. What do you think the benefits are for using your tool?
4. Each group will share their responses with the class
Mod2_ADTToolsActivity_ParticipantInstructions.doc
Activity 1 Part 1
1. Review the Test Scenarios document and the Requirements document.
2. Reflect on each business requirement and consider the following:
   a. What is the business requirement for, and what does it accomplish?
   b. What would be the criteria used to validate the business requirement?
3. Using the TCER template, create a test condition and an expected result for each business requirement. Fill out each column of the TCER.
4. Trace each Test Condition back to a specific requirement.
5. Break into groups and perform a peer review of your work. During your peer review, look for and discuss the following:
   i. What Functional Requirement is directly related to this Test Condition and Expected Result?
   ii. Are there other Test Conditions related to this Functional Requirement? If so, how does the Test Scenario differ?
   iii. Is the level of detail consistent across all Test Conditions and Expected Results?
   iv. Is a high degree of detail provided, both in the condition and the expected result?
   v. Does each Test Condition and Expected Result closely match a corresponding Test Scenario and Business Requirement?
6. Prepare to share your thoughts with the class.
Tip: Key criteria when writing TCERs: a high-level TCER does not include specific functional or technical specifications (it is usually derived from a high-level business requirements document).
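The trace from each Test Condition back to a specific requirement (step 4 above) can be sketched as a small data structure. This is an illustrative model only; the field names below are not the actual columns of the TCER template.

```python
from dataclasses import dataclass

@dataclass
class TCER:
    """One row of a Test Condition / Expected Result worksheet (illustrative fields)."""
    condition_id: str
    requirement_id: str   # traces the condition back to a business requirement
    condition: str
    expected_result: str

# Traceability check: every TCER must map to a known requirement.
requirements = {"R0": "Ability to log into the WMA application welcome screen"}

tcers = [
    TCER("0-1", "R0",
         "User logs in as End User",
         "End User welcome screen is correctly displayed"),
]

untraced = [t.condition_id for t in tcers if t.requirement_id not in requirements]
assert not untraced, f"TCERs missing a requirement trace: {untraced}"
```

A check like this is the mechanical version of peer-review question (i) above: every Test Condition must point at exactly one known Functional Requirement.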
Activity1_ParticipantInstructions.doc
Activity 1 Part 2
1. Review the Test Scenarios document and the UI Design document.
2. Reflect on each screen within the UI Design document and consider the following:
   a. What is the purpose of the screen? What are you supposed to be able to accomplish?
   b. What would be the criteria used to validate that the screen functions properly?
3. Using the TCER template, create a test condition and an expected result for each business requirement. Fill out each column of the TCER.
4. Trace each Test Condition back to a specific requirement.
5. Break into groups and perform a peer review of your work.
   a. During your peer review, you should look for and discuss the following:
      i. What Functional Requirement is directly related to this Test Condition and Expected Result?
      ii. Are there other Test Conditions related to this Functional Requirement? If so, how does the Test Scenario differ?
      iii. Is the level of detail consistent across all Test Conditions and Expected Results?
      iv. Is a high degree of detail provided, both in the condition and the expected result?
      v. Does each Test Condition and Expected Result closely match a corresponding Test Scenario and Business Requirement?
6. Prepare to share your thoughts with the class.
Activity1_ParticipantInstructions.doc
Activity 1 Part 3
1. Review the Test Scenarios document and the Use Case documents.
2. Reflect on each use case and consider the following:
   a. What is the use case for, and what does it accomplish?
   b. What would be the criteria used to validate the use case?
3. Using the TCER template, create a test condition and an expected result for each business requirement. Fill out each column of the TCER.
4. Trace each Test Condition back to a specific requirement.
5. Break into groups and perform a peer review of your work.
   a. During your peer review, you should look for and discuss the following:
      i. What Functional Requirement is directly related to this Test Condition and Expected Result?
      ii. Are there other Test Conditions related to this Functional Requirement? If so, how does the Test Scenario differ?
      iii. Is the level of detail consistent across all Test Conditions and Expected Results?
      iv. Is a high degree of detail provided, both in the condition and the expected result?
      v. Does each Test Condition and Expected Result closely match a corresponding Test Scenario and Business Requirement?
6. Prepare to share your thoughts with the class.
Activity1_ParticipantInstructions.doc
Test Condition (TCER template excerpt): User successfully logs in, and the user's welcome screen is correctly displayed. Bus. Requirement ID: R0.
Activity1_TCERTemplate.xls
[Spreadsheet excerpt: test scenarios worksheet. Columns: Scenario ID, Sub-Scenario ID, Description, Expected Result, Bus. Requirement ID, Pass 1 Status, Pass 2 Status, Pass 3 Status. Recoverable scenario descriptions (requirement IDs R0, R1, R2, and R4 are referenced): Test login to WMA to launch the Welcome page; Test online help; Test Create a work item from a template feature; From End User Welcome page, open Outstanding / Assigned work queue items; Complete a work item with Freeform comments; Change Status of Work Item; Navigate away from WMA whilst working on a work item; Test work items for Team Lead/Manager; Test Assign Work item; Test Maintain Schedule; Test Team Calendar Page under Maintain Schedule.]
Activity1_Input_Scenarios.xls
Application Testing
1 Business Requirements
1.1 Features and Functionalities approved for development
WMA Release 2 of the Workflow Management Application Project will include the analysis, design, development, testing and deployment of the following new features/functionalities:
R0. Ability to log into the WMA application welcome screen
R1. Ability for end user to create new work items using templates
R2. Ability for users to conduct a keyword search to find specific work items
R3. Ability for the end users to pull work out of a work queue based on skill set
R4. Ability for user to open Outstanding / Assigned work queue items on the welcome screen
R5. Ability to indicate to an end user that a work item in a queue is being worked by someone else
R6. Ability to search pending work and open in view only mode
R7. Ability to report statistics for Key Operating Indicators
R8. Ability for administrator/manager to setup security levels
R9. Ability to allow administrator/manager to update multiple work items concurrently
R10. Ability for administrator/manager to view work queue statistics in real time
R11. Ability for manager to sort work in queue to show work item prioritization
R12. Ability for administrator/manager to add/update user skill sets
R13. Ability for administrator/manager to add/maintain users and user profile information
Activity1.1_Input_Requirements.doc
[Wireframe: WMA End User Welcome page. Header: Logo, Get Context Sensitive Help, Log off. Body: "Welcome to Workflow Management Application, John Doe! Your last visit to WMA was 15-Aug-2004 8:05 am". Footer: August 20, 2004; (c) 2004 Accenture Business Services for Utilities. All Rights Reserved. Confidential and Proprietary Information.]
[Wireframe: WMA Work Items List page (breadcrumb: Welcome > Work Items List; status filter: All Status). Same header (Logo, Get Context Sensitive Help, Log off) and footer as the Welcome page wireframe.]
Module 3: Assembly Test Activity 1.3 End User Use Case Input
1. Name
End User Experience
2. Description
The steps an end user experiences from logging into WMA until completing a work item in WMA.
3. Related Requirements
The following is a list of the requirement items and their IDs (from the Requirements document) that this use case is associated with:
R3.1: Ability to display levels of urgency for trouble areas of work.
R3.6: Ability to record an audit trail.
R5.3: Ability to timestamp/userstamp a work item when it is first addressed by a user.
R5.4: Ability to timestamp/userstamp a work item each time the status of the work item is updated by the assigned user.
R5.5: Ability to timestamp/userstamp all work items at completion.
R5.15: Ability to track a user's time in the day adequately, even when not actively working in the Workflow tool.
R6.9: Ability to track non-business-function time in the tool.
R10.2: Ability for user to add user comments on records, if applicable.
R10.4: Ability for user to view assigned work queue on login.
R10.5: Ability for user to change status of work item.
R10.8: Ability for status of work item to change automatically when the work item is first opened by an end user (Assigned -> Being Worked).
R10.20: Ability to copy & paste data between tool and legacy applications.
R12.1: Ability to require certain fields in the user interface.
R12.2: Ability to automatically validate population of required fields.
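The timestamp/userstamp requirements above (R3.6, R5.3 through R5.5) amount to an append-only audit record. Below is a minimal sketch of that idea; the function name `stamp` and all field names are illustrative assumptions, not the WMA implementation.

```python
from datetime import datetime, timezone

audit_trail = []  # append-only record, in the spirit of requirement R3.6

def stamp(work_item_id, user, event):
    """Timestamp/userstamp a work item event (first addressed, status updated,
    completed), per R5.3-R5.5. All names here are illustrative."""
    entry = {
        "work_item_id": work_item_id,
        "user": user,
        "event": event,
        "at": datetime.now(timezone.utc).isoformat(),
    }
    audit_trail.append(entry)
    return entry

stamp("WI-1042", "enduser01", "first addressed")         # R5.3
stamp("WI-1042", "enduser01", "status -> Being Worked")  # R5.4
stamp("WI-1042", "enduser01", "completed")               # R5.5

assert len(audit_trail) == 3 and all(e["at"] for e in audit_trail)
```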
4. Actors
The following actors interact with this use case: Actor End User WMA Tool Legacy System Actor Description End User refers to the Clerks and CSRs who perform backoffice (nonphone) work. WMA is the Workflow Management Application being built to distribute, assign, track, and report backoffice work. Legacy system refers to the systems that produce backoffice exceptions and/or are used to complete backoffice work. Examples of legacy systems include CIS, ICSS, Energy, OWMS, and CDF.
Activity1.3_Input_EndUserUseCase.doc
5. Base Flow
The following base/main flow represents the use case's default or normal behavior when actors interact with this use case:
1. WMA Tool Opens: The WMA tool opens to the Welcome Page, where the end user has several options to select/click. (Output: WMA Tool opened)
2. Select Assigned Work Queue: The end user clicks the hyperlink to view his/her assigned work queue. (Output: Assigned Work Queue selected)
3. Display Assigned Work Queue: WMA displays the work items in the assigned work queue ordered by color (yellow, red, then green), due date (the ones due soonest on top), and due time. (See the Urgency Assessment topic in ABS_WMA Online Process Helpv7.0SL_NC-M.doc for details on when work items turn yellow.) (Output: Assigned work queue is displayed and automatically ordered)
4. The end user views the level of urgency by color, work item ID (randomly generated number), business function, status, due date, and due time.
5. The end user clicks/selects the Work Item ID hyperlink of the first work item in the assigned work queue. (Output: Work item selected)
6. WMA records an entry in the audit trail with time and user information. (See Items to Add to TrainingNov22.xls, rows #22 and #26, and the WMA_pagespecification document for details on the audit trail; see Status Sequencing by Role.vsd for details on status changes for every possible scenario.) (Output: Record in audit trail with user name, date/time, business function, etc.)
7. When the work item is selected for the first time by an end user, WMA changes the status of the work item to Being Worked (the end user is currently working on the work item in Workflow) and records an audit-trail entry with the time of the status change and user information. (Outputs: Status changed to Being Worked; record inserted into audit trail)
8. WMA displays the details of the work item on the Work Item Details page, such as account #, customer
name, error reason, etc. (different info for each work item; see the Work Item Details wireframes and page specifications). The title of the Work Item Details page is Work Item Detail (Work Item ID#) Status. If the status is Pending Investigation, Pending Investigation is in the title and also defaults as the status in the Status dropdown box. If the status is Assigned, the work item status changes to Being Worked when the end user opens the work item, so Being Worked is in the title, but the Status dropdown menu defaults to blank in this case.
9. The end user reviews the detailed information in the work item.
10. The end user highlights the account #, customer name, or address within the work item and selects copy. (Output: Info copied)
11. The end user selects the legacy application icon on the workstation to open the application.
12. The legacy system opens to its main retrieval window.
13. The end user clicks on a field within the retrieval window of the legacy application and pastes the account #, customer name, or address that was copied from WMA. (Output: Info pasted. Note: the left navigation menu is disabled; the user must Save and Close the work item to exit the page.)
14. The legacy system retrieves the relevant data based on the retrieval info that was either pasted or keyed in by the end user. (Output: Information retrieved)
15. The end user completes work for the work item in the legacy application.
16. The end user toggles from the legacy application back to WMA. If needed, the end user adds comments to the work item in WMA; the Comments field is optional in this scenario. (See the WMA_pagespecifications document for details on freeform comment field specs.)
17. After the work item is completed in the legacy system, the end user toggles back to WMA and
changes the status of the work item to Complete (no further action is required; the work item is completed). (See Status Dropdown Revised.xls for the list of statuses available to end users in the dropdown menu, and Online Help Field-Level Content Nov12.xls row #93 for status definitions.)
18. WMA records an entry in the audit trail with the time of the status change and user information. (Output: Record inserted into audit trail)
19. The end user clicks the Save and Close button to close the work item once they have changed the status to Complete. Workflow validates that all required fields are populated; if a required field is missing data, text appears advising the user to populate the required fields. (Output: Work item closed in WMA tool)
20. WMA reverts back to the assigned work queue once the end user closes the work item; the completed work item disappears from the assigned work queue.
6. Alternative Flows
The following flows represent optional or exceptional behaviors of the use case:
Alternative: Not first time work item opened by end user / Work item can't be completed immediately
The end user views the assigned work queue and decides which work item to open. They may select a work item that they have never opened before, which is the scenario described above: WMA automatically changes the status of the work item from Assigned to Being Worked. However, the end user may select a work item that has been opened before (in this case the work item is in Pending Investigation status); WMA will not perform an automatic status change if the work item has been opened before.

The scenario above describes a situation where the end user is able to complete the work item in the legacy system. The end user may look at the info in the legacy system and determine more work is needed at a later time before the work item can be closed. Instead of changing the status of the work item to Completed, the end user would select Pending Investigation from the Status dropdown menu on the Work Item Details page. The Standard Comments field is enabled and required; select the desired standard comment from the dropdown menu (see Online Help Field-Level Content Nov12.xls row #96 for the list of standard comments in the dropdown menu). After selecting a Standard Comment, a text box appears instructing the end user to enter specific information into the Freeform Comments field. The Freeform Comments field is required; enter details about why the work item is still pending rather than completed (see Online Help Field-Level Content Nov12.xls row #98 for what to enter in Freeform comments for each standard comment). Click the Save and Close button. An audit trail is recorded to reflect the date/time and user information with the status
Activity1.3_Input_EndUserUseCase.doc
Module 3: Assembly Test Activity 1.3 End User Use Case Input
Work Item requires completion by another team that does not have access to Workflow
change. After clicking the Save and Close button, Workflow reverts back to the assigned work queue. The work item remains in the end user's assigned work queue in Pending Investigation status.

The scenario above describes a situation where the end user is able to complete the work item in the legacy system. The end user may look at the info in the legacy system and determine the work item needs to be reviewed by a team lead (it needs to be Escalated). Instead of changing the status of the work item to Completed, the end user would change the status of the work item to Escalated. (See Online Help Field-Level Content Nov12.xls row #93 for status definitions.) The Freeform Comments field is required so the end user can explain why they believe the work requires escalation. (See Online Help Field-Level Content Nov12.xls row #98 for what to enter in Freeform comments for each standard comment.) Click the Save and Close button. An audit trail is recorded to reflect the date/time and user information with the status change. After clicking the Save and Close button, Workflow changes the work item status to Escalated for the team lead/team manager to address, and Workflow reverts back to the assigned work queue. NOTE: please do not describe work items as being re-routed to the team lead/team manager queue, as they can see all work items in all statuses. The work item status is just changed, which indicates to the team lead/team manager that his/her attention is required.

The scenario above describes a situation where the end user views the details of the work item and then goes to the legacy application to investigate and complete the work item. When the end user views the work item in WMA, they may determine the work item was incorrectly routed; in other words, the end user does not believe he/she has the appropriate skill-set to complete the work item. The end user changes the status of the work item to Mis-Routed.
The Freeform Comments field is required so the end user can explain why he/she believes he/she doesn't have the appropriate skill-set for the work item. (See Online Help Field-Level Content Nov12.xls row #98 for what to enter in Freeform comments for each standard comment.) Click the Save and Close button. An audit trail is recorded to reflect the date/time and user information with the status change. After clicking the Save and Close button, Workflow changes the work item status to Mis-Routed for the team lead/team manager to address, and Workflow reverts back to the assigned work queue. The work item disappears from the assigned work queue. NOTE: please do not describe work items as being re-routed to the team lead/team manager queue, as they can see all work items in all statuses. The work item status is just changed, which indicates to the team lead/team manager that his/her attention is required.

The scenario above describes a situation where the end user views the details of the work item and then goes to the legacy application to investigate and complete the work item. When the end user views the work item in WMA, they may determine the work item must be completed by another team. The end user selects Completion-Other Team from the Status dropdown menu. The Team field becomes enabled for the end user to select which team will complete the work item. (See the WMA_pagespecifications document for the list of teams in the dropdown menu.) The Freeform Comments field is required so the end user can explain the work done, the work remaining, and the reason for sending the work to another team. After clicking the Save and Close button, an audit trail is recorded to reflect the date/time and user information with the status change. The end user must manually give the work to the appropriate team outside of Workflow to ensure the work gets completed. After clicking the Save and Close button, the work item disappears from the assigned work queue.
The end user views his/her assigned work queue. There may be times when the end user has to do other work (exception work) such as meetings, training, special projects, inbound phone calls, etc. From the assigned work queue, the end user clicks the desired Other Work hyperlink (see the Other Work Exceptions_Maintain Schedule EventsNov16.xls spreadsheet for the list of Other Work types). WMA then displays the Other Work page. The end user clicks the Start button; at this time WMA records an audit trail with time and user information. When the end user is done with the other work, the
Alternative: Illegally exiting from Workflow / 2 people accessing the same work item
end user enters required comments (see Online Help Field-Level Content Nov12.xls spreadsheet (row #104: Comments box) for specific comments required for each type of Other Work). The end user clicks Stop button in WMA. WMA again records an audit trail with time and user information. WMA then automatically reverts to the assigned work queue. If the user clicks Stop button before entering comments, a validation message will appear, advising the user to enter comments (see WMA_pagespecifications for details on validation message). NOTE: If end user illegally exits from the Other Work screen, audit trail records time of exit, but no comments. This indicates to Team Lead/Mgr that page was illegally exited by the end user. Power Outage: User must re-log into Windows in order to re-access Workflow. See WMA Wireframes Iteration 1 document Warnings tab for details on effects of illegally exiting Workflow pages. See Item #37 in the Items to Add to Training document.
NOTES:
- See the Online Help Field-Level Content Nov12.xls spreadsheet and page specifications for specific field definitions.
- See the Business FunctionsNov23.xls spreadsheet for the list of business functions for Release 1.
- See the ABS_WMA Online Process Helpv7.0SL_NC-M.doc document for specific detailed steps.
- See WMA wireframes Iteration 1 - Work Item Details - 19-Nov-2004 Update.vsd and WMA_pagespecification- Work Item Detailsupdatednov19.doc for details on the system-generated data displayed at the top of each Work Item Details page (it is different for each business function).
7. Additional Information
The following section outlines additional information relevant to the use case.
If the workstation were to freeze at any point while in Workflow and the end user were forced to reboot, the end user would log back into Workflow and go back to where he/she had left off.
Module 3: Assembly Test Activity 1.3 Launch WMA Use Case Input
1. Name
Launch of WMA
2. Description
The steps taken when opening the WMA tool.
3. Related Requirements
The following is a list of the requirement items and their IDs (from the Requirements document) that this use case is associated with:
R4.1: Ability to setup security levels.
R0: Ability to login to the application with a user id.
R8: Ability for user to view assigned work queue on login.
4. Actors
The following actors interact with this use case:
End User: End User refers to the Clerks and CSRs who perform backoffice (non-phone) work.
Team Lead/Mgr: Team Lead/Mgr refers to the Team Leads and Team Managers who are responsible for the workload and the end users who perform backoffice (non-phone) work.
WMA Tool: WMA is the Workflow Management Application being built to distribute, assign, track, and report backoffice work.
5. Base Flow
The following base/main flow represents the use case's default or normal behavior when actors interact with this use case:
1. The End User double-clicks the WMA icon on the desktop. (Can also right-click the WMA icon on the desktop and click Open.)
2. The WMA tool accesses workstation login information to recognize/authenticate the end user. (Input: user id/security level)
3. The WMA tool opens.
4. The WMA tool opens to the Welcome Page, displaying a personalized welcome statement with the end user's name and the date/time he/she last visited Workflow.

Activity1.3_Input_LaunchWMAUseCase.doc
6. Alternative Flows
The following flows represent optional or exceptional behaviors of the use case:
Alternative: Team Lead/Mgr opens WMA / WMA tool accesses workstation login but doesn't recognize user
Alternative: Accessing WMA with multiple users / End User relaunch of WMA after previous session was exited illegally
When the Team Lead/Mgr opens WMA (using one of the methods indicated in step 1 above), WMA accesses workstation login information to recognize/authenticate the Team Lead/Mgr. Once recognized/authenticated, the WMA tool opens to the Welcome Page, displaying a personalized welcome statement with the Team Lead's/Mgr's name and the date/time he/she last visited Workflow.

A. When the user (end user, team lead, or manager) opens WMA, WMA accesses workstation login information to recognize/authenticate the user. If the user isn't recognized/authenticated, a login window appears prompting the user to enter his/her user id and password. The user enters his/her user id and password. When WMA recognizes/authenticates the entered user id and password, WMA opens and the personalized Welcome pages are displayed with the appropriate left navigation menu based on security level.

B. When the user (end user, team lead, or manager) opens WMA, WMA accesses workstation login information to recognize/authenticate the user. If the user isn't recognized/authenticated, a login window appears prompting the user to enter his/her user id and password. The user enters his/her user id and password. If WMA doesn't recognize/authenticate the entered user id and password, WMA displays a Login Failed message. The user has the option of re-entering the user id and password or communicating the issue to the WMA Support team.

If user A (end user, team lead, or manager) is logged into the WMA system on a workstation and another user B attempts to log into WMA on the same workstation, a window will be displayed asking if user B would like to navigate away from the page to re-login to the WMA system as the new user.

If an End User illegally exited (closed the browser on) a work item with Being Worked status to end their last WMA session, the next time the End User opens WMA it will launch the Welcome Page.
If an End User illegally exited (closed browser) a work item with a status other than Being Worked to end their last WMA session, the next time the End User opens the WMA it will launch the Welcome Page.
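As a rough sketch (not project code), the authentication behavior described in the base and alternative flows can be modeled as a small function. The names `workstation_user`, `credentials_valid`, `prompt_user`, and the returned page strings are illustrative assumptions.

```python
def launch_wma(workstation_user, credentials_valid=None, prompt_user=None):
    """Model of the WMA launch/login flows described above (illustrative only).

    workstation_user: user id recognized from the workstation login, or None.
    prompt_user: called to collect a user id if workstation login is not
                 recognized; returns a user id or None.
    credentials_valid: callback that validates a prompted user id/password.
    """
    # Base flow: WMA recognizes/authenticates via workstation login info.
    if workstation_user is not None:
        return f"Welcome page for {workstation_user}"

    # Alternative flow A: not recognized -> login window prompts for credentials.
    entered = prompt_user() if prompt_user else None
    if entered and credentials_valid and credentials_valid(entered):
        return f"Welcome page for {entered}"

    # Alternative flow B: credentials rejected -> Login Failed message.
    return "Login Failed"

print(launch_wma("End User 01"))                   # workstation login recognized
print(launch_wma(None, prompt_user=lambda: None))  # no valid credentials
```

The personalized welcome statement and security-level navigation from the flows above are collapsed into the returned page string here purely for illustration.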
Instructions
NOTE: You will need to use the case application to complete this activity.
1. Refine the test conditions and expected results to make sure they provide a high level of detail.
   NOTE: There are many different ways to logically order scripts; for example, they might be sequenced to represent one set path of functionality through a system. Refining and ordering TCERs can be an iterative process.
2. Order the test conditions and expected results into a logical sequence such that a test pass can be executed in a natural manner. This order is typically documented in the Test Cycles.
3. Populate the test script template with the test scripts.
4. Review the created test scripts and make adjustments as necessary.
5. Peer review your work following the instructions below.
Peer Review
1. You will be broken up into small groups. Trade your test scripts with someone in your group.
2. Look for the following:
   - Were the test scripts written at the right level of detail?
   - Did the test scripts test each Test Scenario?
   - Could you execute another participant's script? How accurately do you think you could execute the other participant's test script? Would you have to vary from the written instructions at all?
   - Is the Requirements Traceability Matrix updated correctly, detailing each test condition that is being tested in this test script?
   - Does the test script have any vague or insufficient details that make it unusable?
   - Does each run consider where you are in the application, so that it flows correctly into subsequent runs/cycles?
   - Are data requirements complete and correct?
   - Does the test script cover all entry and exit points per the UI design document(s) relating to the purpose of the test script?
Activity2_ParticipantInstructions.doc
   - Are minimum and maximum characters tested in the test script for data entry fields?
   - Are all error conditions tested in the test script, including verification of the error message format and the presence and location of error indicator(s)?
   - Does the test script contain any redundant, unnecessary testing?
3. Discuss your findings individually with your group.
4. Be prepared to participate in a class debrief discussion.
The Step Number field identifies each step to be executed. When a defect is identified, the corresponding Step Number should be documented in the SIR.

The Action/Description field provides detailed directions on HOW to execute each step, and should be based on the existing Test Conditions. Each of these steps should begin with an action verb and should describe only one action at a time.

The Input Data field lists each field for which the tester should input data to complete the Action/Description. Note that this field only lists the type of data to be entered (hence the XX format); the data itself is housed in a separate document.

The Expected Result field should be closely aligned with the existing Test Conditions and Expected Results. It should explicitly describe the result testers should see upon completing the action for that step.
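The four fields above can be sketched as a simple record. This is an illustrative model only, not the actual test script template.

```python
from dataclasses import dataclass

@dataclass
class ScriptStep:
    """One row of a test script (illustrative; mirrors the four fields above)."""
    step_number: int      # cited in the SIR when a defect is found at this step
    action: str           # HOW to execute the step; starts with an action verb
    input_data: str       # type of data only (the XX format); values live elsewhere
    expected_result: str  # what the tester should see after the action

steps = [
    ScriptStep(1, "Double click the End User 01 icon on the desktop",
               "N/A",
               "Browser opens and redirects to the End User 01 Welcome screen"),
]

# Quick sanity check in the spirit of the guidance: every step has an action
# and an explicit expected result.
for s in steps:
    assert s.action and s.expected_result, "incomplete step"
```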
Keeping in mind what good test scripts look like will help you when you set out to create them. Good test scripts:
- Provide explicit, step-by-step instructions
- Are highly detailed
- Are reusable
- Cover all Test Conditions and Expected Results
- Reference necessary input data
- Follow a standard format
Test Script ID: tsTEO0-1
Requirement ID: R0
Condition ID: 0-1
Purpose: Verify that the user can successfully log into the WMA application system with the user type of End User
Input: User Name/Password (simulated with the End User 01 icon on the desktop)
Expected Output: End User 01 welcome screen is correctly displayed
Environmental Need / Dependency: N/A
Procedure steps:
1. Ensure the Virtual Image is correctly loaded and the Windows server desktop is correctly displayed.
2. Ensure that the desktop displays at least the following WMA user icons: End User 01, End User 02, End User 03, End User 04, Team Manager 01.
3. On the desktop, double-click the End User 01 icon to log into the WMA system as End User 01.
4. Ensure that the Web browser opens and redirects the user to the End User 01 Welcome screen.

Test Script Execution: for Pass 1 and Pass 2, record Passed/Failed: if passed, P; if failed, F and the bug ID from the defect management tool for identification.
The remaining scripts in the template (tsTEO0-2, tsTEO0-3, and tsTEO0-4) repeat the same structure: Test Script ID, Requirement ID, Condition ID, Purpose, Input, Expected Output, Environmental Need, Dependency, Procedure, and Test Script Execution, with the same Pass 1 / Pass 2 recording convention (if passed, P; if failed, F and the bug ID from the defect management tool for identification). In this excerpt their remaining fields are blank.
User Login
Condition 0-1 (R0): Login with End User. User successfully logs in as End User 1, and the End User welcome screen is correctly displayed.
Condition 0-2 (R0): Login with Team Manager. User successfully logs in as Team Manager, and the Team Manager welcome screen is correctly displayed.
Activity2_Input_TCER.xls
Overview
A test script details the exact steps that a tester must follow to complete testing (i.e., to test all the conditions), and each usually describes a test cycle. Test scripts also include the data that is used for testing. Each step of a test script has an associated test condition that can be traced back to a business requirement. In this activity, you will execute existing test scripts. You will find a variety of real-life situations in executing these scripts and have an opportunity to discuss and debrief the experience.
Instructions
Step One: Execute the Test Script
1. Execute the first test script, tsTEO01TC01, by following the script in the application.
2. As you execute the test script, identify potential defects.
3. Compare the actual result to the expected result. If there is a discrepancy:
   - Ensure you have executed the step correctly.
   - Ensure you have entered the correct data.
   - Ensure the data is correct, if possible.
   - Review relevant business rules (as appropriate).
4. Document confirmed discrepancies as SIRs. Include all relevant test data and business requirements information. Ideally, SIRs should include a clear and detailed description of the defect, a priority level (e.g., High, Medium, or Low), and the exact steps to replicate the issue.
5. Ensure each defect is logged and tracked individually. Documentation of defects during testing provides a place to identify downstream impacts of the defect on other teams.
6. Make the description of the issues/SIRs specific enough that the fixing team does not need real-time interaction with the testing team to understand the content and the context of the issues/SIRs.
7. Repeat steps 1-6 for the next test script: tsTEO01TC02.
8. Stop after executing the second test script and wait for the class debrief.
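The pass/fail recording rule used throughout this activity (record P on a match, F plus the bug ID from the defect management tool on a mismatch) can be sketched as a small helper. This is a minimal illustration of the rule, not a real defect-management integration; the function name is ours.

```python
from typing import Optional

def record_result(expected: str, actual: str, bug_id: Optional[str] = None) -> str:
    """Apply the execution-recording rule: P if actual matches expected,
    otherwise F plus the bug ID from the defect management tool."""
    if actual == expected:
        return "P"
    if bug_id is None:
        # A failed step must reference a logged defect so it can be tracked.
        raise ValueError("A failed step needs a bug ID from the defect management tool")
    return f"F ({bug_id})"

# A matching result passes:
print(record_result("Work Item successfully created",
                    "Work Item successfully created"))          # P
# A discrepancy fails and must cite the logged SIR:
print(record_result("Welcome to the Work Flow Application System, UU_EU02, UU_EU02",
                    "Welcome to Workflow Management Application, UU_EU02, UU_EU02",
                    bug_id="TE003-01"))                          # F (TE003-01)
```

Note that the rule forces every failure to be tied to an individually logged defect, which is exactly what steps 4 and 5 above require.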
Activity3_ ParticipantInstructions.doc
Step Two: Class Debrief
1. Be prepared to participate in a class discussion about the first two test scripts in this activity.

Step Three: Execute the Remaining Test Scripts
1. Go back to Step One: Execute the Test Script and repeat steps 1-6 for the remaining test scripts, executing one script at a time, in the following order:
   a) Execute Test tsTEO01TC03
   b) Execute Test tsTEO01TC04
   c) Execute Test tsTEO01TC05
   d) Execute Test tsTEO01TC06
   e) Execute Test tsTEO01TC07

Step Four: Peer Review
Break into teams of two. Work individually to review your partner's logged bugs against the review criteria, then discuss your findings with your partner.

Review Criteria for the Activity
1. Were the scripts executed in the correct sequence?
2. Did the scripts run successfully?
3. Compare actual results to expected results.
4. Analyze the test results against the exit criteria to determine whether you are ready to close the testing stage.

Review Criteria for SIR Logs
1. Identify and document the defects found during test execution.
2. Ideally, SIRs should include a clear and detailed description of the defect, a priority level (e.g., High, Medium, or Low), and the exact steps to replicate the issue.
3. Ensure each defect is logged and tracked individually.
4. Documentation of defects during testing provides a place to identify downstream impacts of the defect on other teams.
5. Ensure there is a project standard for documenting issues and defects clearly. Make the description of the issues/SIRs specific enough that the fixing team does not need real-time interaction with the testing team to understand the content and the context of the issues/SIRs.
Z16327 2007 Accenture. All Rights Reserved.
Step Five: Class Debrief 1. Be prepared to participate in a class discussion about this activity.
3. Document confirmed discrepancies as SIRs. Include all relevant test data and business requirements information.

How to Create a SIR
A well-written SIR:
- Provides a clear, detailed description of the identified defect, including:
  - A description of when the problem occurred
  - A description of the actions the tester took to encounter the defect
  - Any additional information necessary to pinpoint the problem
- Provides an estimated priority level
- Clearly describes the expected outcome
- Includes complete documentation of the resolution and object impacts (upon resolution)
- Includes accurate information
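The SIR fields listed above can be sketched as a record with a completeness check. This is a hypothetical model (a real project would capture SIRs in its defect management tool, not in code); the class and method names are ours.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class SIR:
    issue_id: str
    summary: str
    description: str                 # when the problem occurred and what the tester did
    steps_to_replicate: list[str]    # exact steps to reproduce the defect
    priority: str                    # estimated level, e.g. High, Medium, or Low
    expected_outcome: str
    status: str = "Open"
    resolution_details: Optional[str] = None  # completed upon resolution

    def is_complete(self) -> bool:
        """A complete SIR lets the fixing team work without real-time
        interaction with the testing team."""
        return bool(self.summary and self.description
                    and self.steps_to_replicate and self.expected_outcome)

# Example modelled on the SIR report later in this activity:
sir = SIR(
    issue_id="TE003-01",
    summary="Welcome message is not correctly displayed in the WMA application",
    description="After login, the welcome greeting message is displayed incorrectly.",
    steps_to_replicate=["Login by launching the End User 02 icon on the desktop"],
    priority="Low",
    expected_outcome="Welcome to the Work Flow Application System, UU_EU02, UU_EU02",
)
print(sir.is_complete())  # True
```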
SIR Levels
- Critical: Defect results in Fail-Stop; must be fixed before the next Pass
- High: Defect results in Fail-Stop or Fail-Continue; must be fixed before entering the next Stage
- Medium: Defect results in Fail-Continue; must be fixed before entering the User Acceptance Test stage
- Low: Defect results in Fail-Continue; must be fixed before go-live or approved for deferral
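The level definitions can be expressed as a lookup table. This sketch assumes each "must be fixed before..." clause pairs with the levels in the order listed (Critical through Low); that pairing is our reading of the flattened source, and the names are course wording, not a formal API.

```python
# Assumed pairing of SIR level -> (failure type, gate by which it must be fixed).
SIR_LEVELS = {
    "Critical": {"result": "Fail-Stop",
                 "fix_before": "next Pass"},
    "High":     {"result": "Fail-Stop or Fail-Continue",
                 "fix_before": "entering the next Stage"},
    "Medium":   {"result": "Fail-Continue",
                 "fix_before": "entering the User Acceptance Test stage"},
    "Low":      {"result": "Fail-Continue",
                 "fix_before": "go-live, or approved for deferral"},
}

def fix_gate(level: str) -> str:
    """Return the latest point by which a defect of this level must be fixed."""
    return SIR_LEVELS[level]["fix_before"]

print(fix_gate("Critical"))  # next Pass
```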
Activity3_Input_TestScripts.doc
Script # 1
BF Code: TEO01
Test Script ID: tsTEO01TC01
Purpose: Verify that the End User is able to create a work item (Telephone Complaint)
Input: Customer Number = 8877554422
Expected Output: Work Item successfully created by End User
Environmental Need / Dependency: N/A
Procedure:
1. Execute the procedure Login by launching the End User 01 icon on the desktop.
2. Create a work item.
3. Fill in the mandatory fields.
4. Save and Create.
Test Script Execution (Pass 1 / Pass 2): record Passed/Failed for each pass. If passed, record P; if failed, record F and the bug ID from the defect management tool for identification.
Script # 2
BF Code: TEO02
Test Script ID: tsTEO02TC02
Purpose: Verify that End User 04 is able to successfully complete a work item
Input: Customer Number = 1122334455
Expected Output: Work Item successfully completed by End User 04
Environmental Need / Dependency: N/A
Procedure:
1. Execute the procedure Login by launching the End User 04 icon on the desktop.
2. Access the Scheduled Work Queue.
3. Locate the work item and complete the task.
Test Script Execution (Pass 1 / Pass 2): record Passed/Failed for each pass. If passed, record P; if failed, record F and the bug ID from the defect management tool for identification.
Script # 3
BF Code: TEO03
Test Script ID: tsTEO02TC03
Purpose: Verify that End User 02 is able to log in successfully and view the correct welcome page greeting message
Input: End User 02 login
Expected Output: Welcome page greeting message = "Welcome to the Work Flow Application System, _EU02, UU_EU02"
Environmental Need / Dependency: N/A
Procedure:
1. Execute the procedure Login by launching the End User 02 icon on the desktop.
2. Expected Result: End User is able to log in successfully and view the welcome page greeting message "Welcome to the Work Flow Application System, _EU02, UU_EU02".
Test Script Execution (Pass 1 / Pass 2): record Passed/Failed for each pass. If passed, record P; if failed, record F and the bug ID from the defect management tool for identification.
Script # 4
BF Code: TEO04
Test Script ID: tsTEO02TC04
Purpose: Verify that End User 01 is able to complete the work item
Input: Assigned to = UU_EU01, UU_EU01
Expected Output: Work Item successfully completed
Environmental Need / Dependency: N/A
Procedure:
1. Execute the procedure Login by launching the End User 01 icon on the desktop.
2. Access the Scheduled Work Queue from the left navigation menu.
3. Select work item 1213 from the Queue and change the status to Completed.
4. Select Save and Close.
5. Expected Result: User should have successfully saved the work item without any errors.
Test Script Execution (Pass 1 / Pass 2): record Passed/Failed for each pass. If passed, record P; if failed, record F and the bug ID from the defect management tool for identification.
Script # 5
BF Code: TEO05
Test Script ID: tsTEO05TC02
Purpose: Verify that End User 01 is able to change the status of the work item to Pending Investigation
Input: Assigned to = UU_EU01, UU_EU01
Expected Output: Work Item successfully assigned to End User
Environmental Need / Dependency: N/A
Procedure:
1. Execute the procedure Login by launching the End User 01 icon on the desktop.
2. Access the Scheduled Work Queue from the left navigation menu.
3. Select a work item from the Queue and change the status to Pending Investigation.
4. Close the browser prior to saving the page; when the system prompts "Are you sure you want to navigate away from this page?", click OK.
5. Re-launch the End User 01 login from the desktop.
6. Expected Result: End User should view the welcome page.
Test Script Execution (Pass 1 / Pass 2): record Passed/Failed for each pass. If passed, record P; if failed, record F and the bug ID from the defect management tool for identification.
Script # 6
BF Code: TEO06
Test Script ID: tsTEO06TC01
Purpose: Verify that the widget is correctly created in the database
Input:
  Customer Account No. = 9876543211
  Property Type = Measured
  Contact Name = John Newton
  Customer FullName = John Newton
  Customer Address L1 = 55 Bently Drive
  Post Code = OX2 5TR
  Telephone Number = 18654445555 (numerics only); select Work from the dropdown list
  Payment Amount = 44.99
  Date Payment made = select 20-Jun-2007 from the calendar function
  Payment Method = Credit or Debit Card
  Where Payment made = Bank
  Is this part of the bulk payment = No
  Action Required = enter the comment "Client enquiring about payment"
Expected Output: Widget successfully created
Environmental Need / Dependency: N/A
Procedure:
1. Execute the procedure Login by launching the End User 01 icon on the desktop.
2. Click the Create Work Item expandable list in the left navigation menu.
3. Click Customer Driven Payment Enquiry (under Create Work Item) in the left navigation menu.
4. Fill in the Customer Account No. field in the form using the input data.
5. Select Commercial from the dropdown list under Customer Type.
6. Select Measured from the dropdown list under Property Type.
7. Fill in the following fields using the input data: Customer FullName, Contact Name, Customer Address L1, Post Code, Payment Amount.
8. Select Date Payment made as 25-Jun-2007 from the calendar function.
9. Select Credit or Debit Card from the dropdown list under Payment Method.
10. Select BANK from the dropdown list under Where Payment made.
11. Select No for the question "Is this part of a bulk payment?".
12. Enter the free-format text "Client enquiring about payment" under Action Required.
13. Fill in the Telephone Number field in the form using the input data and select Work from the dropdown list.
14. Select Save and Create to complete the process.
Test Script Execution (Pass 1 / Pass 2): record Passed/Failed for each pass. If passed, record P; if failed, record F and the bug ID from the defect management tool for identification.
Script # 7
BF Code: TEO07
Test Script ID: tsTEO07TC02
Purpose: Verify that the Team Manager is able to successfully assign a work item to End User 01
Input: Assigned to = UU_EU01, UU_EU01
Expected Output: Work Item successfully assigned to End User
Environmental Need / Dependency: N/A
Procedure:
1. Execute the procedure Login by launching the Team Manager 01 icon on the desktop.
2. Click the Scheduled Work Queue expandable list in the left navigation menu.
3. Find a work item in the list whose status shows Unassigned and select it by double-clicking the corresponding ID #.
4. Scroll down to the bottom of the page.
5. Select Assigned from the dropdown list under Status.
6. Select UU_EU01, UU_EU01 from the dropdown list under Assigned to.
7. Enter the free text "Re-assigned to EU01 as it required urgent action" under Comments.
8. Click Save and Close.
Test Script Execution (Pass 1 / Pass 2): record Passed/Failed for each pass. If passed, record P; if failed, record F and the bug ID from the defect management tool for identification.
SIR Report

Issue ID: TE003-01
Status: Open
Priority: Low
Severity: Low
Test script number:
Pages / Components:
Summary: Welcome message text is not correctly displayed when the user logs into the WMA application
Description: After the user has successfully logged into the WMA system, the welcome greeting message is displayed incorrectly.
Defect Recreation Steps:
1. Execute the procedure Login by launching the End User 02 icon on the desktop.
Expected Result: End User is able to log in successfully and view the welcome page greeting message "Welcome to the Work Flow Application System, UU_EU02, UU_EU02"
Actual Result: End User is able to log in successfully but views the welcome page greeting message "Welcome to Workflow Management Application, UU_EU02, UU_EU02"
Build number: R2
Date resolved:
Resolution details:
Activity3_SIR Report_Template.xls
SIR Report template fields: Issue ID, Status, Priority, Severity, Pages / Components, Summary, Description, Build number, Date resolved, Resolution details