Experience-Based and Exploratory Testing
Ohjelmistojen testaus (Software Testing course)
15.11.2010
Juha Itkonen
SoberIT
Agenda
Intelligent Manual Testing
Experience-based testing
Exploratory testing
Ways of Exploring
Session-Based Test Management
Touring testing
Manual Testing
Testing that is performed by human testers
1. Individual differences in testing are high
2. Test case design techniques alone do not explain the results
Stereotype of manual testing
Executing detailed pre-designed test cases
Mechanically following the step-by-step instructions
Treated as work that anybody can do

In practice, good manual testing builds on:
Domain experience
Testing experience
Knowledge of testing methods and techniques
Testing skills grown in practice
Software testing is creative and exploratory work
It requires skills and knowledge of:
the application domain
users, processes and objectives
some level of technical details and history of the application under test
Tester's Attitude
The tester's goal is to "break" the software

Tester's Goal
Reveal all relevant defects
Find out any problems real users would experience in practice
4. Parallel learning of the system under test, test design, and test execution
5. Experience and skills of an individual tester strongly affect effectiveness and results
A continuum from scripted to exploratory testing:
Pure scripted (automated) testing
Manual scripts
High-level test cases
Chartered exploratory testing
Freestyle exploratory bug hunting
In scripted testing, tests are first designed and recorded. Then they may be executed at some later time or by a different tester.
(Diagram: Tests -> Product)
James Bach, Rapid Software Testing, 2002
Lateral thinking
Allowed to be distracted
Find side paths and explore interesting areas
Periodically check your status against your mission
Customer-owned
Comprehensive
Repeatable
Automatic
Timely
Public
Exploratory Testing
Utilizes professional testers' skills and experience
Optimized to find bugs
Minimizing time spent on documentation
Continually adjusting plans, refocusing on the most promising risk areas
Following hunches
Freedom, flexibility and fun for testers
Focus on manual validation, making testing activities agile
Agenda
Intelligent Manual Testing
Experience-based testing
Exploratory testing
Ways of Exploring
Session-Based Test Management
Touring testing
Freestyle exploratory testing (unmanaged ET)
Functional testing of individual features
Exploring high-level test cases
Exploratory regression testing by verifying fixes or changes
Session-based exploratory testing
Exploring like a tourist
Outsourced exploratory testing
Session-based test management builds on four elements:
Charter
Time Box
Reviewable Result
Debriefing
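The four elements above can be sketched as a small data structure; this is a minimal illustration, not part of the original material, and the class and field names are my own:

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import List, Optional

@dataclass
class TestSession:
    """One time-boxed session in session-based test management."""
    charter: str                                    # mission guiding the session
    tester: str
    time_box: timedelta = timedelta(minutes=90)     # uninterrupted, fixed length
    started: Optional[datetime] = None
    notes: List[str] = field(default_factory=list)  # the reviewable result
    bugs: List[str] = field(default_factory=list)

    def start(self) -> None:
        self.started = datetime.now()

    def time_left(self) -> timedelta:
        """How much of the time box remains (zero once used up)."""
        elapsed = datetime.now() - self.started
        return max(self.time_box - elapsed, timedelta(0))

    def debrief_summary(self) -> str:
        """Material for the debriefing with the test lead."""
        return (f"Charter: {self.charter} | tester: {self.tester} | "
                f"bugs: {len(self.bugs)} | notes: {len(self.notes)}")
```

A tester would start a session against a charter, log notes and bugs as they explore, and bring the summary to the debriefing.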
Business district: Guidebook tour, Money tour, Landmark tour, Intellectual tour, FedEx tour, After-hours tour, Garbage collector's tour
Tourist district: Collector's tour, Lonely businessman tour, Supermodel tour, TOGOF tour, Scottish pub tour
Hotel district: Rained-out tour, Couch potato tour
Historical district: Bad-neighborhood tour, Museum tour, Prior version tour
Seedy district: Saboteur tour, Antisocial tour, Obsessive-compulsive tour
Entertainment district: Supporting actor tour, Back alley tour, All-nighter tour
Choosing a goal and then visiting each item by the shortest path.
Screen-by-screen, dialog-by-dialog, feature-by-feature: test every corner of the software, but not very deeply in the details.
The All-Nighter tour
Agenda
Intelligent Manual Testing
Experience-based testing
Exploratory testing
Ways of Exploring
Session-Based Test Management
Touring testing
Overall strategies
Structuring testing work
Guiding a tester through features

Detailed techniques
Low-level test design
Defect hypotheses
Checking the test results
Overall strategies
Exploring weak areas
Aspect-oriented testing
User interface exploring
Top-down functional exploring
Simulating a real usage scenario
Smoke testing by intuition and experience
Data as test cases
Documentation based
Detailed techniques
Input:
Testing input alternatives
Testing boundaries and restrictions
Testing alternative ways
Covering input combinations
Comparison:
Exploring against old functionality
Comparing with another application or version
Comparing within the software
Other:
Simulating abnormal and extreme situations
Persistence testing
Feature interaction testing
Defect-based exploring
End-to-end data check
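Some of the input techniques above lend themselves to lightweight helpers a tester can run during a session. A minimal sketch, with illustrative function names of my own, for "testing boundaries and restrictions" and "covering input combinations":

```python
from itertools import product

def boundary_values(lo: int, hi: int) -> list:
    """Values around the edges of an inclusive [lo, hi] restriction:
    just below, on, and just above each boundary."""
    return [lo - 1, lo, lo + 1, hi - 1, hi, hi + 1]

def input_combinations(*alternatives):
    """Cross-product of input alternatives, for covering input
    combinations; grows multiplicatively, so use on small sets."""
    return list(product(*alternatives))
```

For a field restricted to 1..100, `boundary_values(1, 100)` yields [0, 1, 2, 99, 100, 101]; the tester still explores how the application reacts to each value rather than following a script.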
<exploratory strategy>
Exploring weak areas, identified as:
complicated
coded in a hurry
lots of changes
coders' opinion
testers' opinion
based on who implemented
a hunch...
<exploratory strategy>
Goal: to get first a high-level understanding of the function and then deeper confidence in its quality step by step.
Description: a pre-defined test data set includes all relevant cases and combinations of different data and situations. Covering all cases in the pre-defined test data set provides the required coverage. Testing is exploratory, but the pre-defined data set is used to achieve systematic coverage.
Suitable for situations where data is complex but operations are simple, or when creating the data requires much effort.
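The combination of free exploration with systematic coverage of a pre-defined data set can be sketched as a small coverage tracker. This is an illustrative helper of my own, not from the slides; the tester marks each data case as it is exercised, and the pending set shows what remains:

```python
class DataSetCoverage:
    """Track which cases of a pre-defined test data set have already
    been exercised during exploratory testing."""

    def __init__(self, cases):
        self.pending = set(cases)   # cases still to be covered
        self.covered = set()

    def mark(self, case):
        """Record that a data case was exercised in the session."""
        if case in self.pending:
            self.pending.remove(case)
            self.covered.add(case)

    def coverage(self):
        """Fraction of the pre-defined data set exercised so far."""
        total = len(self.pending) + len(self.covered)
        return len(self.covered) / total if total else 1.0
```

The point of the strategy survives: the exploration itself stays free-form, but testing is not done until `pending` is empty.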
<comparison technique>
<input technique>
Training Testers
Testing practices are good, experience-based knowledge for intelligent testers
Named and documented
Give common terminology and names that can be used to discuss how the testing should be done

Test patterns
Testing practices could be further developed
A testing pattern will provide a set of good testing practices
For a certain testing problem and motivation
With a certain testing goal
Describing also the applicability (context) of the pattern
Agenda
Intelligent Manual Testing
Experience-based testing
Exploratory testing
Ways of Exploring
Session-Based Test Management
Touring testing
Testers know how the software is used and for what purpose
Testers know what functionality and features are critical
Testers know what problems are relevant
Testers know how the software was built
Risks, tacit knowledge
Effectiveness
Reveals a large number of relevant defects
Efficiency
Low documentation overhead
Fast feedback
Quality of testing
How to assure the quality of testers' work?
Detailed test cases can be reviewed, at least
Research problem
Do testers performing manual functional testing with pre-designed test cases find more or different defects compared to testers working without pre-designed test cases?
Research questions
How does using pre-designed test cases affect
1. the number of detected defects?
2. the type of detected defects?
3. the number of false defect reports?
False reports are: incomprehensible, duplicate, or reporting a non-existent defect.
No difference in the total number of detected defects

Testing approach   Feature set   Number of defects   Found defects per subject: mean (std. dev.)
ET                 FS A          44                  6.28 (2.172)
ET                 FS B          41                  7.82 (2.522)
ET                 Total         85                  7.04 (2.462)
TCT                FS A          43                  5.36 (2.288)
TCT                FS B          39                  7.35 (2.225)
TCT                Total         82                  6.37 (2.456)
TCT produces more false defect reports

Testing approach   Feature set   False defect reports per subject: mean (std. dev.)
ET                 FS A          1.00 (1.396)
ET                 FS B          1.05 (1.191)
ET                 Total         1.03 (1.291)
TCT                FS A          1.64 (1.564)
TCT                FS B          2.50 (1.867)
TCT                Total         2.08 (1.767)
Conclusions
1. The data showed no benefits from using pre-designed test cases in comparison to a freestyle exploratory testing approach
2. Defect type distributions indicate certain defect types might be better detected by ET
Contact information
Juha Itkonen
juha.itkonen@tkk.fi
+358 50 577 1688
http://www.soberit.hut.fi/jitkonen
References (primary)
Bach, J., 2000. Session-Based Test Management. Software Testing and Quality Engineering, 2(6). Available at: http://www.satisfice.com/articles/sbtm.pdf.
Bach, J., 2004. Exploratory Testing. In E. van Veenendaal, ed. The Testing Practitioner. Den Bosch: UTN Publishers, pp. 253-265. http://www.satisfice.com/articles/et-article.pdf.
Itkonen, J. & Rautiainen, K., 2005. Exploratory testing: a multiple case study. In Proceedings of the International Symposium on Empirical Software Engineering. pp. 84-93.
Itkonen, J., Mäntylä, M.V. & Lassenius, C., 2007. Defect Detection Efficiency: Test Case Based vs. Exploratory Testing. In Proceedings of the International Symposium on Empirical Software Engineering and Measurement. pp. 61-70.
Itkonen, J., Mäntylä, M.V. & Lassenius, C., 2009. How do testers do it? An exploratory study on manual testing practices. In Proceedings of the 3rd International Symposium on Empirical Software Engineering and Measurement (ESEM 2009). pp. 494-497.
Lyndsay, J. & van Eeden, N., 2003. Adventures in Session-Based Testing. Available at: http://www.workroom-productions.com/papers/AiSBTv1.2.pdf.
Martin, D. et al., 2007. 'Good' Organisational Reasons for 'Bad' Software Testing: An Ethnographic Study of Testing in a Small Software Company. In Proceedings of the International Conference on Software Engineering. pp. 602-611.
Whittaker, J.A., 2009. Exploratory Software Testing: Tips, Tricks, Tours, and Techniques to Guide Test Design. Addison-Wesley Professional.
References (secondary)
Agruss, C. & Johnson, B., 2005. Ad Hoc Software Testing.
Naseer, A. & Zulfiqar, M., 2010. Investigating Exploratory Testing in Industrial Practice. Master's Thesis. Ronneby, Sweden: Blekinge Institute of Technology. Available at: http://www.bth.se/fou/cuppsats.nsf/all/8147b5e26911adb2c125778f003d6320/$file/MSE-2010-15.pdf.
Armour, P.G., 2005. The unconscious art of software testing. Communications of the ACM, 48(1), 15-18.
Beer, A. & Ramler, R., 2008. The Role of Experience in Software Testing Practice. In Proceedings of the Euromicro Conference on Software Engineering and Advanced Applications. pp. 258-265.
Houdek, F., Schwinn, T. & Ernst, D., 2002. Defect Detection for Executable Specifications: An Experiment. International Journal of Software Engineering & Knowledge Engineering, 12(6), 637.
Kaner, C., Bach, J. & Pettichord, B., 2002. Lessons Learned in Software Testing. New York: John Wiley & Sons, Inc.
Martin, D. et al., 2007. 'Good' Organisational Reasons for 'Bad' Software Testing: An Ethnographic Study of Testing in a Small Software Company. In Proceedings of the International Conference on Software Engineering. pp. 602-611.
Tinkham, A. & Kaner, C., 2003. Learning Styles and Exploratory Testing. In Pacific Northwest Software Quality Conference (PNSQC).
Wood, B. & James, D., 2003. Applying Session-Based Testing to Medical Software. Medical Device & Diagnostic Industry, 90.
Vaga, J. & Amland, S., 2002. Managing High-Speed Web Testing. In D. Meyerhoff et al., eds. Software Quality and Software Testing in Internet Times. Berlin: Springer-Verlag, pp. 23-30.