Software Testing at a Glance – or two

The purpose of test:

Provide information to assure the quality of
- the product (finding faults)
- decisions (quantifying risks)
- the process (finding root causes)

Test is comparing what is to what should be.

From mistake to correction: a mistake (error) made in a process leads to a
defect (fault) in the product; when the defective product is executed, the
defect causes a failure; the failure is documented in an incident report,
which drives fault correction and, through root cause analysis, process
improvement.
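To make "comparing what is to what should be" concrete, here is a minimal
sketch (the discount function and its specification are invented for
illustration) showing how a defect in the code surfaces as a failure when a
test compares actual to expected behaviour:

    # A seeded defect: the (assumed) specification says orders of 100 EUR or
    # more get 10% off, but the code uses ">" instead of ">=" -- a fault.
    def discount(order_total: float) -> float:
        if order_total > 100:          # defect: should be ">= 100"
            return order_total * 0.90
        return order_total

    def test_discount_at_boundary():
        expected = 90.0                 # what should be (from the specification)
        actual = discount(100.0)        # what is (from the implementation)
        assert actual == expected, f"failure: expected {expected}, got {actual}"

    if __name__ == "__main__":
        try:
            test_discount_at_boundary()
            print("PASS")
        except AssertionError as exc:
            print(exc)                  # the observed failure -> incident report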
The life cycle: from the start, business goals and business requirements
(vision / ideas / needs / expectations) drive development; the product is
deployed, kept alive through maintenance (maintenance testing, continuous
analysis of usage, tool support for monitoring), and finally disposed of.

Organizational management sets the test policy:
- definition of test
- test process to use
- test evaluation
- quality level
- approach to process improvement

Project management: be prepared, with a project plan aimed at customer
satisfaction.
Test management:

An overall test strategy and an overall test plan govern the testing in a
project; each test phase has a phase-specific test strategy and its own test
plan. One test process is used for all test phases, supported throughout by
configuration management and quality assurance. Environment requirements
(platform, hardware, standards) are specified alongside.

A plan must be SMART: Specific, Measurable, Agreed, Relevant, and Time
specific.

Test strategy contents:
1. Entry & exit criteria
2. Test approach
3. Techniques to use
4. Completion criteria
5. Independence
6. Standards to use
7. Environment
8. Test automation
9. Degree of reuse
10. Retest and regression testing
11. Test process
12. Measures
13. Incident management

Test plan contents:
1. Introduction
2. Test item(s)
3. Features to be tested
4. Features not to be tested
5. Approach
6. Item pass/fail criteria
7. Suspension criteria
8. Test deliverables
9. Testing tasks
10. Environmental needs
11. Responsibilities
12. Staffing and training
13. Schedule
14. Risks

The requirements tower: business requirements are refined into user
requirements, then into product quality requirements (functional
requirements, and non-functional requirements such as performance, usability,
availability, and maintainability requirements), down to component
requirements. Project requirements (cost, resources, time, tester skills,
tool support) constrain the work.

Test phases, each testing against its own basis:
- user requirements: acceptance testing, in the forms user, operational
  (OAT), contractual (FAT), and alpha & beta testing
- product requirements: system testing, functional and non-functional
- system design: system integration testing (hardware/software integration
  and interfaces)
- architectural design: component integration testing
- detailed design: component testing

Each phase produces its own test plan, test specification, and phase test
report.

The test process, the same for all test phases: planning, specification,
execution, recording (test logs), and checking for test completion, with
monitoring and control throughout; failing tests lead to correction of the
test object and renewed execution.

The price of a fault correction depends on when the fault is found: roughly
1 EUR at requirements time, 10 at design, 100 at coding and test, and 1000 in
production (source: Grove Consultants, UK).

Test coverage is the percentage of the coverage items touched by a test.
Coverage items can be:
- statements, decisions, conditions
- interfaces
- user requirements, use cases
- business processes, business goals

Tracing links each requirement (for example Req. 146, Req. 147, Req. 148) to
the test cases that cover it, and back.
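A sketch of how the coverage figure is computed; the same formula applies
whether the items are statements, requirements, or business processes (the
item set below is illustrative, extending the requirement tags above with an
invented Req. 149):

    # Coverage = items touched by the tests / total coverage items, in percent.
    total_items = {"Req. 146", "Req. 147", "Req. 148", "Req. 149"}
    touched = {"Req. 146", "Req. 148"}   # items exercised by the executed tests

    coverage = 100.0 * len(touched & total_items) / len(total_items)
    print(f"coverage: {coverage:.0f}%")  # -> coverage: 50%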
From test basis to test cases: from the base document, derive the test
conditions (test requirements, organized in test groups); create test cases,
using test case design techniques; organize the test cases into test
procedures / test scripts. Each test case needs prepared test input data and
a test oracle, a source of the expected results to compare the actual results
against.

Test case design techniques:

Specification based:
- equivalence class partitioning
- boundary value analysis
- decision tables
- state transitions
- use cases
Using the techniques prevents omissions, prevents gold-plating, and shows the
rationale for the test cases.

Experience based:
- error guessing
- check lists
- exploratory

Structure based (white-box), defined on the control flow of the code (basic
blocks, branch points, branches in and out):
- statements
- decisions / branches and decision outcomes
- conditions
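As a sketch of the first two specification based techniques, consider a
hypothetical input field that accepts integers from 1 to 99 (the range is an
assumption, not from the poster): one representative value per equivalence
class, plus values on and just outside each boundary.

    VALID_RANGE = (1, 99)

    def equivalence_classes(lo: int, hi: int) -> dict:
        """One representative test value per equivalence class."""
        return {
            "below (invalid)": lo - 10,
            "inside (valid)": (lo + hi) // 2,
            "above (invalid)": hi + 10,
        }

    def boundary_values(lo: int, hi: int) -> list:
        """Values on each boundary and immediately outside it."""
        return [lo - 1, lo, hi, hi + 1]

    if __name__ == "__main__":
        lo, hi = VALID_RANGE
        print(equivalence_classes(lo, hi))   # {'below (invalid)': -9, ...}
        print(boundary_values(lo, hi))       # [0, 1, 99, 100]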
Test specification contents:
1. Test Design
   1.1 Features to be tested
   1.2 Approach refinement
   1.3 Associated test cases
   1.4 Feature pass/fail criteria
2. Test Cases
   2.1 Test Case 1
       2.1.1 Test items (rationale)
       2.1.2 Prerequisites
       2.1.3 Inputs
       2.1.4 Expected outcomes
       2.1.5 Inter-case dependencies
   ...
   2.N Test Case n

Test procedure contents:
1. Purpose
2. Prerequisites
3. Procedure steps
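The same structure carries over directly into an automated test script. The
sketch below mirrors purpose, prerequisites, and procedure steps; the Cart
class is an invented stand-in for a real test object.

    # Purpose: verify that adding items updates the cart total (illustrative).
    class Cart:
        def __init__(self):
            self.total = 0.0
        def add(self, price: float):
            self.total += price

    def test_add_item_updates_total():
        # Prerequisites: an empty cart.
        cart = Cart()
        assert cart.total == 0.0
        # Procedure step 1: add an item, check the expected outcome.
        cart.add(25.0)
        assert cart.total == 25.0
        # Procedure step 2: add a second item; the total accumulates.
        cart.add(10.0)
        assert cart.total == 35.0

    if __name__ == "__main__":
        test_add_item_updates_total()
        print("PASS")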
Test roles: test leader (manager), test analyst / designer, test executer,
reviewer / inspector, test environment responsible, (test)tool responsible,
specialist.

The best (test) team is based on knowledge of the Belbin types:
- cerebral: Plant, Monitor/Evaluator, Specialist
- action oriented: Shaper, Implementer, Completer/Finisher
- people oriented: Coordinator, Team-worker, Resource investigator

Relevant skills: test skills, soft skills, IT skills, domain knowledge, and
development knowledge, in different mixes for the different roles. Tool
support helps, but a fool with a tool is still a fool.

Quality assurance for all work products: definition of quality criteria,
validation, and verification. Review types: informal review, walkthrough,
technical review, management review, inspection, audit. Static analysis
(including requirements testing) is done by tools; dynamic analysis by tools;
dynamic test manually or by tools.

Triangle of test quality: good code and a good test give known quality; poor
code and a poor test leave the quality unknown.

Incident management runs in four steps, each with reporting, classification,
impact assessment, and supporting data:
1. Recognition
2. Investigation
3. Action
4. Disposition
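A sketch of how the four incident states and their supporting data could be
recorded; the record fields and the sample values are assumptions, not from
the poster.

    from dataclasses import dataclass, field
    from enum import Enum

    class IncidentState(Enum):
        RECOGNITION = 1     # failure observed and reported
        INVESTIGATION = 2   # classification and impact analysed
        ACTION = 3          # fault corrected, test rerun
        DISPOSITION = 4     # incident closed and archived

    @dataclass
    class Incident:
        title: str
        classification: str                          # e.g. severity category
        impact: str
        state: IncidentState = IncidentState.RECOGNITION
        history: list = field(default_factory=list)  # supporting data per step

        def advance(self, note: str):
            """Move to the next state, keeping a note for reporting."""
            self.history.append((self.state.name, note))
            if self.state is not IncidentState.DISPOSITION:
                self.state = IncidentState(self.state.value + 1)

    incident = Incident("total not updated", "severity: major", "blocks checkout")
    incident.advance("reproduced on build 42")
    print(incident.state.name)   # INVESTIGATION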
Configuration management of selected work products, product components, and
products: identification, storage, and change management.

Responsibility matrix (tasks versus people): for each task, who is
Responsible, Performing, and Consulted; a cooperation matrix makes the
collaboration explicit.

Communication is hard: "this is what I mean" is filtered into "this is what I
say", "this is what I hear", and "this is what I think is meant".

Test environment: hardware, software, network, other systems, tools, data,
room, peripherals, stationery, chairs, phones, food, drink, candy.
Development models and test: in sequential models (waterfall and V-model) the
phases conceive, design, implement, and test follow each other, and the test
is planned at project start; in iterative models, planning continues during
development; in RAD, RUP or Agile development, test and correction run inside
every cycle, with planning again at test start. Whatever the model, every
cycle ends in delivery.

Management (of everything): time, resources, quality, risk management, and
communication. Monitoring and control covers product risk analysis,
estimation, and scheduling. Change one aspect of the time / resources /
quality triangle and at least two will change.

Test of a new product addresses functional, non-functional, and
structure/architecture characteristics; change related test adds regression
testing and retesting, also during maintenance.

Tool support: for test planning and monitoring; for test execution and
logging (test running, test script generation, test harness and drivers,
coverage measurement, comparators, simulators); for non-functional test
(performance, load, stress, security testing).

Test report contents:
1. Summary
2. Variances
3. Comprehensiveness
4. Summary of results
5. Evaluation
6. Summary of activities

Test could be ruled by product risks and by the willingness to run risks.
Risks hit in many places: the project, the product, the development, and the
test. Risk exposure = probability x effect.
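The formula translates directly into a prioritisation sketch; the risk items
and figures below are invented for illustration.

    # Risk exposure = probability x effect; higher exposure -> test first.
    risks = [
        ("payment calculation wrong", 0.3, 100_000),  # (risk, probability, effect)
        ("report layout off", 0.8, 1_000),
        ("data loss on crash", 0.1, 500_000),
    ]

    by_exposure = sorted(risks, key=lambda r: r[1] * r[2], reverse=True)
    for name, p, effect in by_exposure:
        print(f"{name}: exposure = {p * effect:,.0f}")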

Test process improvement: process improvement gives higher productivity and
quality and lower risk, and touches techniques, methods, procedures, and
tools. Improvement runs as a cycle: initiating, diagnosing (assessment),
establishing (plan and write), acting (implementing), and learning. Why do we
work? Motives and incentives shape the actions, and the actions are adjusted
as the results come in.

The models at a glance, all scaled over maturity levels 1 to 5: CMM gives
levels, CMMI process areas, ISO 15504 a result profile, TMM levels for test,
and TPI® a matrix.

General process improvement models:

CMM levels: 1 Initial, 2 Repeatable, 3 Described, 4 Controlled, 5 Optimising.

CMMI process areas (grouped into process management, project management,
engineering, and support), by maturity level:
- Level 2: Requirements Management; Project Planning; Project Monitoring and
  Control; Supplier Agreement Management; Measurement and Analysis; Process
  and Product Quality Assurance; Configuration Management
- Level 3: Requirements Development; Technical Solution; Product Integration;
  Verification; Validation; Organizational Process Focus; Organizational
  Process Definition; Organizational Training; Integrated Project Management
  for IPPD; Risk Management; Integrated Teaming; Integrated Supplier
  Management; Decision Analysis and Resolution; Organizational Environment
  for Integration
- Level 4: Organizational Process Performance; Quantitative Project
  Management
- Level 5: Organizational Innovation and Deployment; Causal Analysis and
  Resolution

ISO 15504 result profile: each process attribute (PA 1.1 up to PA 5.2) is
rated N (not), P (partially), L (largely), or F (fully) achieved, giving a
capability profile per process.

Test process improvement models:

TMM levels:
- Level 1 (Initial)
- Level 2 (Phase Definition): institutionalise basic testing techniques and
  methods; initiate a test planning process; develop testing and debugging
  goals
- Level 3 (Integration): control and monitor the test process; integrate
  testing into the software lifecycle; establish a technical training
  program; establish a software test organisation
- Level 4 (Management and Measurement): software quality evaluation;
  establish a test measurement program; establish an organisation-wide
  review program
- Level 5 (Optimization, Defect Prevention and Quality Control): test process
  optimisation; quality control; application of process data for defect
  prevention

TPI® matrix, with the maturity levels defined per key area:
- Test strategy: A B C D
- Life-cycle model: A B
- Moment of involvement: A B C D
- Estimating and planning: A B
- Test specification techniques: A B
- Static test techniques: A B
- Metrics: A B C D
- Test tools: A B C
- Test environment: A B C
- Office environment: A
- Commitment and motivation: A B C
- Test functions and training: A B C
- Scope of methodology: A B C
- Communication: A B C
- Reporting: A B C D
- Defect management: A B C
- Testware management: A B C D
- Test process management: A B C
- Evaluating: A B
- Low-level testing: A B C
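As a sketch of how such a matrix can be used, the profile below records an
achieved level per key area (the key areas and level ranges follow the matrix
above; the achieved values are invented):

    # A TPI-style profile: each key area has an achieved level, or None if the
    # lowest level has not yet been reached.
    profile = {
        "Test strategy": "A",           # of A-D
        "Life-cycle model": "B",        # of A-B
        "Moment of involvement": None,  # not yet at level A
        "Reporting": "C",               # of A-D
    }

    # The weakest key areas point at the next improvement steps.
    weakest = [area for area, level in profile.items()
               if level is None or level == "A"]
    print("improve first:", ", ".join(weakest))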

A process is described by entry criteria, a procedure (activity 1, activity
2, ..., activity n, each described by a template), and exit criteria, turning
input into output.

Degree of independence (producer vs. tester):
1. Producer tests own product
2. Tests are designed by other team members
3. Tests are designed by independent testers
4. Tests are designed by external testers

Test approaches: analytical (risk), model-based (statistical), structured,
consultative (domain experts), methodical (check-lists), heuristic
(exploratory), standard compliant, regression-averse (reuse and automation).

Standards:
- quality assurance (you shall test): ISO 9000:2001
- industry specific (you shall test this much): aviation, medical devices,
  military equipment
- testing standards (you shall test like this): IEEE 1044, BS-7925, IEEE 829,
  IEEE 1028
- other related standards: ISO 9126, IEEE/ISO 12207, IEEE/ISO 15288

"If you don't measure you're left with only one reason to believe you're in
control: hysterical optimism." (Tom DeMarco)

The golden rule of testing: always test so that whenever you have to stop you
have done the best possible test.

© DELTA, WWW.DELTA.DK, Anne Mette Jonassen Hass, AMJ@DELTA.DK. Based on the ISEB Software Testing Practitioner Syllabus; BS-7925-1:1998; IEEE Std. 610-1990; IEEE Std. 829-1998; ISTQB testing terms 2005; and presentations at EuroSTAR conferences by Lee Copeland, Stuart Reid, Martin Pol, Paul Gerrard, Grove Consultants, and many others. Graphic advice: EBB-CONSULT, Denmark. Screen Beans® Clip Art, USA. V. T-UK-o-0603
