Evaluation Criteria
Once the test engineer has narrowed the search for a particular type of test tool to
two or three lead candidates, the Evaluation Scorecard depicted in Table 1 can be
used to determine which of the tools best fits the requirements at hand.
Table 1. Example of Tool Evaluation Criteria
Each criterion is followed by three figures: the criterion's weight, the candidate's
score, and the resulting value (weight × score).

Tool Customization
  Can the tool be customized (can fields in the tool be added or deleted)?  7  4  28
  Does the tool support the required test procedure naming convention?  8  4  32

Platform Support
  Can it be moved and run on several platforms at once, across a network (that is, cross-Windows support: Win95 and WinNT)?  8  4  32

Tool Functionality
  Test scripting language: does the tool use a flexible yet robust scripting language? What is the complexity of the scripting language: is it a 4GL? Does it allow for modular script development?  9  5  45
  Complexity of scripting language  9  5  45
  Scripting language allows for variable declaration and use; allows passing of parameters between functions  9  5  45
  Does the tool use a test script compiler or an interpreter?  9  5  45
  Interactive test debugging: does the scripting language allow the user to view variable values, step through the code, integrate test procedures, or jump to other external procedures?  8  4  32
  Does the tool allow recording at the widget level (object recognition level)?  10  5  50
  Does the tool allow for interfacing with external .dll and .exe files?  9  5  45
  Published APIs: language interface capabilities  10  4  40
  ODBC support: does the tool support any ODBC-compliant database?  10  4  40
  Is the tool intrusive (that is, does source code need to be expanded by inserting additional statements)?  9  4  36
  Communication protocols: can the tool be adapted to various communication protocols (such as TCP/IP, IPX)?  9  3  27
  Custom control support: does the tool allow you to map to additional custom controls, so the tool is still compatible and usable?  10  3  30
  Ability to kick off scripts at a specified time; scripts can run unattended  9  5  45
  Allows for adding timers  10  5  50
  Allows for adding comments during recording  7  5  35

Reporting Capability
  Ability to provide graphical results (charts and graphs)  8  5  40
  Ability to provide reports  8  5  40
  What report writer does the tool use?  8  5  40
  Can predefined reports be modified and/or can new reports be created?  8  5  40

Version Control
  Does the tool come with integrated version control capability?  10  4  40
  Can the tool be integrated with other version control tools?  8  3  24

Pricing
  Is the price within the estimated price range?  10  4  40
  What type of licensing is being used (floating, fixed)?  7  3  21
  Is the price competitive?  9  4  36

Vendor Qualifications
  Maturity of product  8  4  32
  Market share of product  8  4  32
  Vendor qualifications, such as financial stability and length of existence. What is the vendor's track record?  8  4  32
  Are software patches provided, if deemed necessary?  8  4  32
  Are upgrades provided on a regular basis?  8  5  40
  Customer support  10  3  30
  Training is available  9  4  36
  Is a tool Help feature available? Is the tool well documented?  9  5  45
  Availability and access to tool user groups  8  4  32

Total Value  2,638
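The scorecard arithmetic above can be sketched in a few lines: each criterion's value is its weight multiplied by the candidate's score, and the candidate's total is the sum of those values. The criteria listed below are a small hypothetical subset of Table 1, used only for illustration.

```python
# Minimal sketch of the Evaluation Scorecard calculation.
# value = weight (importance of the criterion) x score (how well the
# candidate meets it); the candidate's total is the sum of all values.
# The rows below are a small subset of Table 1, not the full criteria list.
criteria = [
    ("Tool can be customized (fields added or deleted)", 7, 4),
    ("Supports required test procedure naming convention", 8, 4),
    ("Allows recording at the widget level", 10, 5),
    ("Allows for adding timers", 10, 5),
]

def scorecard_total(rows):
    """Sum weight x score over all (criterion, weight, score) rows."""
    return sum(weight * score for _, weight, score in rows)

for name, weight, score in criteria:
    print(f"{name}: {weight} x {score} = {weight * score}")
print("Total value:", scorecard_total(criteria))
```

The same routine would be run once per candidate tool, and the resulting totals compared, as the text describes for the three candidates.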
Example of Tool Evaluation Criteria 5
Because the weighted values for the test tool characteristics will vary with each type
of test tool, the test team may wish to develop an evaluation scorecard form for each
type of test tool required. In Table 1, an automated GUI test tool (capture/playback)
candidate is evaluated against the desired test tool characteristics. The total value of
2,638 for this candidate must then be compared with the total values derived for the
other two candidates. As noted in the sample scorecard summary, Candidate 3
achieved a rating of 75.3% for its coverage of the desired test tool characteristics.
A Preferred Scorecard Summary is provided in Table 2. Note that under this
different scoring model, test tool Candidate 2 achieves a higher rating than
Candidate 3, which had posted the highest rating under the Evaluation Scorecard
method. Candidate 2 achieved a rating of 90.0% for its coverage of the
highest-priority test tool characteristics.
Table 2. Preferred Scorecard Summary

  Candidate    Score  Rating
  Candidate 1   97    74.6%
  Candidate 2  117    90.0%
  Candidate 3  103    79.2%
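The candidate ratings above can be reproduced by dividing each candidate's point total by the maximum attainable points. The maximum of 130 points used below is an assumption inferred from the published percentages (for example, 117 / 130 = 90.0%); it is not stated in the text.

```python
# Hypothetical reconstruction of the Preferred Scorecard ratings: each
# candidate's rating is its point total as a percentage of the maximum
# attainable points. MAX_POINTS = 130 is an assumption inferred from the
# published percentages, not a figure given in the text.
MAX_POINTS = 130

def rating(points, max_points=MAX_POINTS):
    """Return the candidate's coverage rating as a percentage (1 decimal)."""
    return round(100 * points / max_points, 1)

totals = {"Candidate 1": 97, "Candidate 2": 117, "Candidate 3": 103}
for candidate, points in totals.items():
    print(f"{candidate}: {points} points -> {rating(points)}%")
```

Under this assumption the computed ratings match the three percentages shown in the summary.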
Remember that the evaluation differs for each kind of test tool being considered for
an organization or project. Each type of test tool has its own desired characteristics
and its own weighting scheme for those characteristics. The guidelines for what to
look for, and how to weigh it, when evaluating a GUI test tool will differ from the
guidelines for evaluating a network monitoring tool.