Documenti di Didattica
Documenti di Professioni
Documenti di Cultura
3 3
NON-IT? IT
Is there any difference when you work on the same assignment?
4 4
Software/Application??
A self-contained program that performs a well-defined set of tasks under user control.
5 5
Marketing
Survey is done by marketing people for various products, benchmarking them against competitors. Creates the MRS (Marketing Requirement Survey).
Requirement analysis
Feasibility study (social, economic etc.) investigates the need for possible software automation in the given system. A domain expert creates the URS (User Requirement Specification).
Design
The software's overall structure is defined: software architecture, interdependence of modules, interfaces, database etc. A system analyst creates the SRS (high-level design, low-level design etc.).
6 6
Code Generation
The design must be translated into a machine-readable form, taking the SRS as input. Done by a team of developers; reviews after every 500 lines of code:
Code Inspection, Code Walkthrough
Testing
Each new/patched build is tested by test engineers for the stability of the application.
Maintenance
Software is maintained due to changes (unexpected values entering the system).
7 7
Requirement Analysis
Review/Test
Operational Testing
Integration Testing
Detailed Specifications
Unit Testing
Coding
8 8
System An inter-related set of components, with identifiable boundaries, working together for some purpose.
Input
Process
Output
9 9
Analysis
10 10
Design
11 11
Phases or stages of a project from inception through completion and delivery of the final product and maintenance too!
Software Testing Concepts
12 12
3. Maintenance
13 13
Definition Phase
Focuses on WHAT
What information to be processed? What functions and performances are desired?
14 14
Development Phase
Focuses on HOW
How should the database be designed? How should the software architecture be designed?
15 15
Maintenance Phase
Maintainability is defined as the ease with which software can be understood, corrected, adapted and enhanced. The maintenance phase focuses on the CHANGE associated with the software.
16 16
SDLC Phases
Identify Problems/Objectives
S D L C
17 17
SDLC Phases : Requirement Identification & Analysis Phase Request for Proposal Proposal Negotiation Contract User Requirement Specification Software Requirement Specification
18 18
Software Requirement Specifications IEEE 830: a Software Requirement Specification is a means of translating the ideas in the minds of the clients (the input) into a set of formal documents (the output of the requirement phase).
The Role
Bridge the communication gap between the client, the user and the developer.
19 19
20 20
Database tables, with all elements, including their type and size
All interface details All dependency issues Error MSG listing Complete input and output format of a module
21 21
SDLC Phases
Code Generation
The design must be translated into a machine-readable form, taking the SRS as input. Done by a team of developers; reviews after every 500 lines of code:
Code Inspection, Code Walkthrough
Testing
Each new/patched build is tested by test engineers for the stability of the application.
Maintenance
Software is maintained due to changes (unexpected values entering the system).
22 22
Testing Defined
Is Product Successful
Test factors
23 23
What is Testing?
A process used to help identify the correctness, completeness and quality of developed computer software.
24 24
25 25
To catch bugs/defects/errors. To check the program against specifications. The cost of debugging is higher after release.
Verifying Documentation.
26 26
To get adequate trust and confidence on the product. To meet organizational goals
Like meeting requirements, satisfied customers, improved market share, Zero Defects etc
27 27
Testing defined !!
Def-1 Process of establishing confidence that a program or system does what it is supposed to.
Def-2 Process of exercising or evaluating a system or system component by manual or automated means to verify that it satisfies specified requirement (IEEE 83a)
Def-3
Testing is a process of executing a program with the intent of finding errors (Myers)
Def-4 Testing is any activity aimed at evaluating an attribute or capability of a program or system and determining that it meets its required results.
28 28
The business perceives that the system satisfactorily addresses the true business goals.
End user feels that look, feel, and navigation are easy. Team is prepared to support and maintain the delivered product.
29 29
Functionality Usability
Likeability
Configurability
Maintainability
Interoperability
30 30
Testability
Operability Controllability Observability
Understandability
Suitability Stability Accessibility Navigability Editorial Continuity Scalability
Context Sensitivity
Structural Continuity
31 31
Test Factors
Functionality (exterior quality): Correctness, Reliability, Usability, Integrity
Engineering (interior quality): Efficiency, Testability, Documentation, Structure
Adaptability (future quality): Flexibility, Reusability, Maintainability
33 33
Spec
Design
Build
* Here testing was happening only towards the end of the life cycle
34 34
35 35
Requirement Analysis
Review/Test
Operational Testing
Integration Testing
Detailed Specifications
Unit Testing
Coding
36 36
STLC-V Model
Requirement Req. Review
37 37
Approach
Test Design
Acceptance criteria
Schedule Prioritization Test references Sign off req
Risk Plan
Project Overview Quality Objectives Configuration
Plan
38 38
Implement Stubs
39 39
40 40
The test approach will be based on the objectives set for testing and will detail the way the testing is to be carried out: the types of testing to be done, viz. unit, integration and system testing; the general resources required; the method of testing, viz. black-box, white-box etc.; and details of any automated testing to be done.
41 41
Software Testing Life Cycle- Phases 1. Requirement Analysis 2. Prepare Test Plan 3. Test Case Designing 4. Test Case Execution 5. Bug Reporting, Analysis and Regression testing 6. Inspection and release 7. Client acceptance and support during acceptance 8. Test Summary analysis
42 42
Requirement Analysis
Objective: The objective of Requirement Analysis is to ensure software quality by eradicating errors as early as possible in the development process, since errors noticed at the end of the software life cycle are more costly than early ones, and thereby validating each of the outputs. The objective is achieved through three basic issues:
Correctness Completeness Consistency
43 43
Type of Requirement
Functional Data Look and Feel Usability Performance Operational Maintainability Security Scalability Etc.
44 44
Evaluating Requirements
What Constitutes a good Requirement?
Consistent
requirements that are similar are stated in similar terms. Requirements do not conflict with each other.
Complete
all functionality needed to satisfy the goals of the system is specified to a level of detail sufficient for design to take place.
45 45
Requirement Analysis
Analyst not prepared Customer has no time/interest Incorrect customer personnel involved Insufficient time allotted in project
schedule
46 46
Scope Analysis of project Document product purpose/definition Prepare product requirement document Develop risk assessment criteria Identify acceptance criteria Document Testing Strategies. Define problem - reporting procedures Prepare Master Test Plan
47 47
Design-Activities
Setup test environment Design Test Cases: Requirements-based and Codebased Test Cases
48 48
Execution- Activities
Initial Testing, Detect and log Bugs Retesting after bug fixes
Final Testing
Implementation
49 49
Bug Reporting
Analyze the Error/Defect/Bug Debugging the system Regression testing
50 50
51 51
Client Acceptance
52 52
Pre-Acceptance Test Support Installing the software on the client's environment Providing training for using the software or maintaining the software
53 53
54 54
Tools expertise
Database expertise Domain/Technology expertise
55 55
56 56
The roles chart should contain both onsite and off-shore team members.
57 57
Test Manager
Single point contact between Wipro onsite and offshore team
58 58
59 59
Test Lead
Resolves technical issues for the product group
Provides direction to the team members
Performs activities for the respective product group
Reviews and approves the Test Plan / Test Cases
Reviews Test Scripts / Code
Approves completion of integration testing
Conducts system / regression tests
Ensures tests are conducted as per plan
Reports status to the Offshore Test Manager
60 60
Test Engineer
Development of Test cases and Scripts
Test Execution
Result capturing and analysing Defect Reporting and Status reporting
62 62
Integration Testing
System Testing Acceptance Testing
Interface Testing
Regression Testing Special Testing
63 63
Unit Testing
Unit Testing is a verification effort on the smallest unit of the software design: the software component or module.
64 64
65 65
Approach
Uses the component-level design description as a guide. Important control paths are tested to uncover errors within the boundary of the module. Unit testing is white-box oriented, and can be conducted in parallel for multiple components.
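The idea above can be sketched as a small unit test. This is a minimal sketch assuming a hypothetical component `classify_password` (not from the slides); it exercises valid input, invalid input, and the boundaries of the module.

```python
import unittest

def classify_password(pw: str) -> str:
    """Hypothetical unit under test: accepts passwords of 6 to 15 characters."""
    return "valid" if 6 <= len(pw) <= 15 else "invalid"

class TestClassifyPassword(unittest.TestCase):
    def test_valid_input(self):
        self.assertEqual(classify_password("secret99"), "valid")

    def test_invalid_input(self):
        self.assertEqual(classify_password("abc"), "invalid")

    def test_module_boundaries(self):
        # errors often hide at the boundary of the module's input range
        self.assertEqual(classify_password("a" * 15), "valid")
        self.assertEqual(classify_password("a" * 16), "invalid")
```

Run with `python -m unittest` against the file containing the test class.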
66 66
Unit Testing
Interfaces (input/output)
Test Cases
67 67
68 68
Misunderstood or incorrect arithmetic precedence Mixed mode operations Incorrect initialization Precision inaccuracy
69 69
Error description is unintelligible. Error notes do not correspond to the error encountered. Error condition causes system intervention prior to error handling. Exception-condition processing is incorrect. Error description does not provide enough information to assist in locating the cause of the error.
70 70
71 71
The Unit test criteria, the Unit test plan, and the test case specifications are defined. A code walkthrough for all new or changed programs or modules is conducted. Unit test data is created, program or module testing is performed, and a Unit test report is written. Sign-off to integration testing must be obtained; sign-off can be provided by the lead programmer, project coordinator, or project administrator.
72 72
Functional Testing
73 73
Functional Testing
Should
know expected results test both valid and invalid input
Unit test cases can be reused New end user oriented test cases have
to be developed as well.
74 74
User Interface
This stage will also include Validation Testing, which is intensive testing of the new front-end fields and screens: Windows GUI standards; valid, invalid and limit data input; screen and field look and appearance; and overall consistency.
75 75
Horizontal
76 76
Integration testing
77 77
Data can be lost across an interface. One module can have an inadvertent, adverse
effect on another.
78 78
Top-Down is an incremental approach to testing the program structure. Modules are integrated by moving downward through the control hierarchy, beginning with the main control module; this can be done in a depth-first or breadth-first manner.
79 79
Bottom-Up integration, as the name implies, begins construction and testing with atomic modules, i.e., with the components at the lowest levels in the program structure.
80 80
The Login class calls the ConnectionPool object. Integration testing identifies errors not observed during code debugging or reviews.
81 81
System Testing
Purpose
Test the entire system as a whole
Assumptions
Completed
Unit Testing Functional Testing Integration Testing
82 82
Expectations
Verification of the system Software Requirements Business Workflow perspective Final verification of requirements and design External Interfaces Performance tests Affected documentation Non-testable requirements
83 83
Interface Testing
Purpose
Interfaces with the system
Assumptions
Unit, functional and integration testing All Critical Errors
Expectations
Interfaces with External Systems Planning and Co-ordination meetings with the external organizations in preparation for testing. Who will be the primary contacts? When is testing scheduled? If there is no test environment available testing may have to occur on weekends or during non-production hours.
84 84
Expectations (Contd.)
What types of test cases will be run, how many and what are they testing?
Provide copies of test cases and procedures to the participants
If the external organization has specific cases they would like to test, have them provide copies
Who will supply the data and what will it contain? What format will it be in (paper, electronic, just notes for someone else to construct the data, etc.)? Who is responsible for reviewing the results and verifying they are as expected? How often will the group meet to discuss problems and testing status?
85 85
Expectations
(Contd.)
Both normal cases and exceptions should be tested on both sides of the interface (if both sides exchange data). The interface should be tested for handling the normal amount and flow of data as well as peak processing volumes and traffic. If appropriate, the batch processing or file transmission window should be tested to ensure that both systems complete their processing within the allocated amount of time. If fixes or changes need to be made to either side of the interface, the decisions, deadlines and re-test procedures should be documented and distributed to all the appropriate organizations.
86 86
Performance Testing
Purpose
The purpose is to verify the system meets the performance requirements.
Assumptions/Pre-Conditions
System testing successful. Ensure no unexpected performance. Prior to Acceptance Testing. Tests should use business cases, including normal, error and unlikely cases.
87 87
88 88
Regression Testing
89 89
Definition and Purpose Types of regression testing Regression test problems Regression testing tools
90 90
Regression Testing
Definition
Selective retesting to detect faults introduced during modification of a system or system component, to verify that modifications have not caused unintended adverse effects, or to verify that a modified system or system component still meets its specified requirements. ...a testing process which is applied after a program is modified.
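The definition above can be sketched as a re-run of a saved baseline suite after a modification. The `discount` function and baseline values are hypothetical, for illustration only.

```python
def discount(price: float, code: str) -> float:
    """Modified function under regression test (illustrative)."""
    if code == "SAVE10":
        return round(price * 0.90, 2)
    return price

# Baseline of (inputs, expected output) pairs captured from the
# previously accepted release.
BASELINE = [
    ((100.0, "SAVE10"), 90.0),
    ((100.0, ""), 100.0),
    ((19.99, "SAVE10"), 17.99),
]

def run_regression(fn, baseline):
    """Return the (inputs, expected, actual) triples where behaviour regressed."""
    failures = []
    for args, expected in baseline:
        actual = fn(*args)
        if actual != expected:
            failures.append((args, expected, actual))
    return failures

# An empty failure list means the modification had no unintended side effects.
assert run_regression(discount, BASELINE) == []
```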
91 91
Regression Testing
92 92
Corrective Maintenance
Adaptive Maintenance
Perfective Maintenance
Preventive Maintenance
Corrective - fixing bugs, design errors, coding errors. Adaptive - no change to functionality, but the software now works under new conditions, i.e., modifications in the environment.
93 93
Regression Testing
Example 1 - Y2K
During Y2K code changing, regression testing was the essence of the transition phase. What was typically done, was that code was changed at multiple places (it did not turn the original logic upside down, but made subtle changes). Now Regression testing was very important for the fact that even one small piece of code lying untested could lead to huge ramifications in the large amounts of data that is typically handled by these mainframe computers / programs.
94 94
Regression Testing
Example 2 General
Regression testing might even be required when one of the business associates changes its systems (perhaps new hardware). Since our system is hooked on to the associate's system, our test engineers are also required to do regression testing on our system, which has NOT been changed. This example brings to light another fact about regression testing: sometimes even an unchanged system needs to be tested!
95 95
96 96
Acceptance Testing
Purpose
The purpose of acceptance testing is to verify the system from the user's perspective.
Assumptions/Pre-Conditions
Completed system and regression testing Configuration Manager Test data Final versions of all documents ready Overview to the testing procedures Exit decision Specific procedures Acceptance Criteria MUST be documented
Acceptance Testing Project stakeholders
97 97
Acceptance Testing
Expectations
Verification from the user's perspective Performance testing should be conducted again
Extra time
User manual to the testers Non-testable requirements Review with the Sponsor and User Plans for the Implementation
98 98
Field Testing
Purpose
The purpose of field testing is to verify that the system works in the actual user environment.
Assumptions/Pre-Conditions
System and/or acceptance testing successful.
Expectations
Verification that the system works in the actual user environment.
100 100
Evaluation
Errors
Testing
Debug
Reliability Model
Test Configuration
Corrections
Expected Results
Predicted Reliability
NOTES Software Configuration includes a Software Requirements Specification, a Design Specification and source code. A test configuration includes a Test Plan and Procedures, test cases and testing tools. It is difficult to predict the time to debug the code, hence it is difficult to schedule.
101 101
102 102
103 103
No. & Levels of Resources Rounds of Testing Exit Criteria Test Suspension Criteria Resumption Criteria
Types of Applications
Test Strategy
10 104 4
Test Environments
Approach to Testing External Interfaces Approach to Testing COTS products Scope of Acceptance Testing
Test Issues
10 105 5
Test Staffing
Testing of COTS
External Interfaces
Acceptance
10 106 6
Executed at least once? Requirements been tested or verified? Test Documentation Documents updated and submitted
Configuration Manager
Test Incidents
10 108 8
10 109 9
Test Plan
Objective
11 110 0
Importance
The test planning process is a critical step in the testing process. Without a documented test plan the test itself cannot be verified, coverage cannot be analyzed, and the test is not repeatable.
11 111 1
11 112 2
Test Plan
11 113 3
Continue..
11 114 4
Structure
9. Test deliverables (PPlan)
10. Environmental needs (H/w & S/w) 11. Responsibilities (PPlan) 12. Staffing and Training needs (PPlan) 13. Schedule (PPlan) 14. Risks and Contingencies (PPlan) 15. Approvals Ref : Test Plan Template
11 115 5
Testing Process
Software Development / Implementation Process Requirements Definition Specification Definition Code Creation Testing
Release Date
Testing Techniques
Code Based Test Case Design Requirements Based Test Case Design
11 117 7
Testing Techniques
Fault Based
Error Guessing Mutation Fault Seeding
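Mutation and fault seeding from the list above can be sketched as follows: a fault is deliberately seeded (a mutated operator), and an adequate test suite should "kill" the mutant by failing on it. The functions are illustrative, not from the slides.

```python
def add(a, b):
    return a + b          # original program

def add_mutant(a, b):
    return a - b          # seeded fault: "+" mutated to "-"

def suite_passes(fn):
    """Return True if fn passes every test case in the suite."""
    return fn(2, 3) == 5 and fn(-1, 1) == 0

# The original survives the suite; the mutant is killed,
# which is evidence the suite is adequate for this fault class.
assert suite_passes(add) is True
assert suite_passes(add_mutant) is False
```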
11 118 8
Usage Based
Statistical testing (Musa's SRET)
Specific Technique
Object Oriented Testing Component Based Testing
11 119 9
Have I tested, exercised, forced? Have I thought, applied all inputs, completely explored, run all the scenarios? Have I found?
12 120 0
Code Based Test Case Design Requirements Based Test Case Design
12 122 2
Requirements of Test Approach Identify the features to be tested Arrive at High Level
12 123 3
Pass/Fail Criteria
12 124 4
Approach
12 125 5
Test Cases
12 126 6
12 127 7
Identify all potential Test Cases needed to fully test the business and technical requirements Document Test Procedures Document Test Data requirements Prioritize test cases Identify Test Automation Candidates Automate designated test cases
12 128 8
Type: Source
1. Requirement based: Specifications
2. Design based: Logical system
3. Code based: Code
4. Extracted: Existing files or test cases
5. Extreme: Limits and boundary conditions
129 129
13 130 0
13 131 1
Extreme cases
13 132 2
Extracted cases involve extracting samples of real data for the testing process. Randomized cases involve using tools to generate potential data for the testing process.
13 133 3
Characteristics of a good test case: specific; non-redundant; reasonable probability of catching an error; medium complexity; repeatable; always lists expected results.
13 134 4
Test case guidelines: Developed to verify that specific requirements or design are satisfied. Each component must be tested with at least two test cases: positive and negative. Real data should be used to reality-test the modules after successful runs with test data.
13 135 5
Test Cases
Test Data
Test Results
Test Reports
Compare results
Statement Coverage Edge Coverage Condition Coverage Path Coverage Cyclomatic Complexity
13 137 7
Purpose
Understand the Objective Effective conversion of specifications Checking Programming Style with coding standards Check Logic Errors Incorrect Assumptions Typographical Errors
13 138 8
Structural Testing
Ensure Reduced Rework Quicker Stability
Smooth Acceptance
Structure of the Software itself Valuable Source Selecting test cases
13 139 9
All loops are executed at their boundaries and within operational bounds.
All internal data structures are exercised to ensure validity.
Contd..2
140 140
141 141
142 142
Types of Code Based Testing & Adequacy Criteria Involve Control Flow Testing
Statement Coverage
Is every statement executed at least once?
Edge Coverage
Is every edge in the control flow graph executed?
Condition Coverage
Is edge + every Boolean (sub) expression in the control flow graph executed?
Path Coverage
Is every path in the control flow graph executed?
Cyclomatic Complexity
Is the logical structure of the program appropriate?
14 143 3
Test Cases
Independent Path
Logical Decisions
Boundaries
Data Structures
14 144 4
Statement coverage (C) = Number of Executed Statements (P) / Total Number of Statements (T)
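The statement coverage ratio can be sketched directly in code; here statements are identified by line number, which is an illustrative convention.

```python
def statement_coverage(executed: set, total: set) -> float:
    """C = P / T: executed statements over total statements."""
    return len(executed & total) / len(total)

# Illustrative: 5 statements in the module, the tests hit 4 of them.
all_statements = {1, 2, 3, 4, 5}
hit_by_tests = {1, 2, 4, 5}

assert statement_coverage(hit_by_tests, all_statements) == 0.8
```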
14 145 5
Edge coverage (C) = Number of Executed Branches (P) / Total Number of Branches (T)
146 146
147 147
148 148
149 149
150 150
151 151
Example 1: C1 = B1 & B2, where B1 and B2 are Boolean conditions. Condition constraints have the form (D1, D2), where D1 and D2 can be true (t) or false (f). The branch and relational operator test requires the constraint set {(t,t), (f,t), (t,f)} to be covered by executions of C1.
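The constraint set for C1 = B1 & B2 can be checked mechanically; a small sketch that reports which (B1, B2) constraints a test suite has not yet covered:

```python
# Required constraint set for C1 = B1 and B2, as in the example above.
REQUIRED = {(True, True), (False, True), (True, False)}

def missing_constraints(exercised, required=REQUIRED):
    """Return the (B1, B2) constraints the test suite has not covered."""
    return required - set(exercised)

# A suite exercising only (t,t) and (t,f) still misses (f,t):
assert missing_constraints([(True, True), (True, False)]) == {(False, True)}
# Adding (f,t) satisfies the branch and relational operator criterion:
assert missing_constraints([(True, True), (True, False), (False, True)]) == set()
```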
15 152 2
Selects test paths according to the location of definitions and use of variables.
Loop Testing: loops are fundamental to many algorithms. Loops can be classified as simple, concatenated, nested and unstructured.
15 153 3
Simple
Nested
Concatenated
Unstructured
15 154 4
Simple loops of size n: skip the loop entirely.
15 155 5
Types of Code Based Testing(4) - Path Coverage: Nested Testing Nested Loops
Start with inner loop. Set all other loops to minimum values.
Nested
15 156 6
Concatenated
15 157 7
Unstructured loops
Don't test - redesign.
15 158 8
15 159 9
Cyclomatic Complexity
An independent path is any path through a program that introduces at least one new set of processing statements or a new condition (i.e., a new edge).
16 160 0
V(G) < 5: the structure of the module is simple and its logic is easy to test.
16 161 1
Sequence
If
While
Until
Case
16 162 2
On a flow graph:
Arrows called edges represent flow of control.
Contd..2
16 163 3
Any procedural design can be translated into a flow graph. Note that compound Boolean expressions in tests generate at least two predicate nodes and additional arcs.
Contd..2
16 164 4
16 165 5
16 166 6
[Flow graph: nodes 1, 2, 3, 4, 5, 6, 7a (Endif), 7b (Enddo), 8 (End)]
Contd..2
16 167 7
The McCabe Cyclomatic complexity V(G) of a control flow graph measures the maximum number of linearly independent paths through it. The complexity typically increases because of branch points.
16 168 8
To compute the cyclomatic complexity V(G): v refers to the cyclomatic number in graph theory, and G indicates that the complexity is a function of the graph. If e is the number of arcs, n is the number of nodes, and p is the number of connected components or modules, then the number of linearly independent paths is:
V(G) = e - n + 2 * p
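The formula translates directly into code; the edge and node counts below are illustrative, not taken from the slide's flow graph.

```python
def cyclomatic_complexity(e: int, n: int, p: int = 1) -> int:
    """V(G) = e - n + 2p (e arcs, n nodes, p connected components)."""
    return e - n + 2 * p

# Illustrative graph: 9 edges, 8 nodes, one connected component.
assert cyclomatic_complexity(9, 8) == 3
```

V(G) = 3 means three linearly independent paths must be exercised for basis path coverage.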
169 169
170 170
171 171
Note: some paths may only be able to be executed as part of another test.
17 172 2
Graph Matrices
Contd..2
17 173 3
Graph Matrices
Can associate a number with each edge entry. Use a value of 1 to calculate the Cyclomatic complexity
For each row, sum column values and subtract 1.
Sum these totals and add 1.
Contd..2
17 174 4
Probability that a link (edge) will be executed. Processing time for traversal of a link. Memory required during traversal of a link.
Contd..2
17 175 5
Graph Matrices
[Connection matrix: rows and columns are nodes 1, 2, 3, 4, 5, 6, 7a, 7b, 8; an entry of 1 marks each edge of the flow graph]
17 177 7
Static Testing
17 178 8
Reviews
17 179 9
Benefits of Reviews
Identification of the anomalies at the earlier stage of the life cycle Identifying needed improvements Certifying correctness
Encouraging uniformity
Enforcing subjective rules
18 180 0
Types of Reviews
18 181 1
Software design description Source Code Software test documentation Software user documentation System Build Release Notes Let us discuss Inspections, Walkthroughs and Technical Reviews
with respect to Code.
18 182 2
Code Inspection
Code inspection is a visual examination of a software product to detect and identify software anomalies including errors and deviations from standards and specifications.
Determination of remedial or investigative action for an anomaly is a mandatory element of software inspection. Attempting to discover a solution for the fault, however, is not part of the inspection meeting.
18 183 3
18 184 4
Author Reader
Moderator
Inspector Recorder
18 185 5
18 186 6
Inspection Process
Overview Preparation
Inspection
Rework Follow up
18 187 7
Classification of anomaly
Missing Superfluous (additional) Ambiguous Inconsistent Improvement desirable Non-conformance to standards Risk-prone (safer alternative methods are available) Factually incorrect Non-implementable (due to system or time constraints)
18 188 8
Severity of anomaly
Major Minor
18 189 9
The work product is detached from the individual. Anomalies are identified at an earlier stage of the life cycle. Uniformity is maintained.
19 190 0
Adequate preparation time must be provided to participants. Inspection time must be limited to 2-hour sessions, with a maximum of 2 sessions a day.
19 191 1
Inspection team members Software program examined Code inspection objectives and whether they were met. Recommendations regarding each anomaly. List of actions, due dates and responsible people.
19 192 2
Code Walkthrough
19 193 3
Find anomalies
Improve software program Consider alternative implementation if required (not done in inspections)
19 194 4
Inspection process includes Overview, preparation, inspection, rework and follow up.
Walkthrough process includes Overview, little or no preparation, examination (actual walkthrough meeting), rework and follow up.
19 195 5
Inspection takes longer, as the list of items in the checklist is tracked to completion. Less time is spent on a walkthrough, as there is no formal checklist used to evaluate the program.
19 196 6
Recorder
Team member
197 197
198 198
199 199
A technical review is a formal team evaluation of a product. It identifies any discrepancies from specifications and standards, or provides recommendations after the examination of alternatives, or both.
20 200 0
20 201 1
Same as Inspections.
20 203 3
Purpose
Is to find
Functional validity of the system
Sensitivity
Tolerance Operability
Interface errors
Errors in database structures Performance errors Initialization and termination errors
20 204 4
Approach
20 205 5
Categories of Requirements
Functional
Absolutely necessary for functioning of system
Non-functional
Restriction or constraints on system services
206 206
207 207
208 208
21 210 0
Techniques
Flowchart
Use Cases
21 211 1
State Machine
State Diagram
Description
State-based business logic. Generate test cases covering all paths. The diagram may be complicated. For every event, generate test cases using BVA and EP.
21 212 2
Decision Table
[Decision table example: conditions are Login and Password; actions are Successful Login, Unsuccessful Login, and Warning Message (W); columns Value 1, Value 2 and Value 3 give the rule combinations]
Explores combinations of input conditions. Consists of 2 parts: a Condition section and an Action section.
Condition section - lists conditions and their combinations. Action section - lists responses to be produced.
Exposes errors in specification. Columns in the decision table are converted to test cases. Similar to Condition Coverage used in White Box Testing.
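Converting decision-table columns into test cases can be sketched as follows. The login/password rules here are assumed for illustration (the slide's exact table values are not fully recoverable); each column (rule) becomes one test case.

```python
# Each rule pairs a column of condition values with its expected action.
RULES = [
    # (login_valid, password_valid) -> expected action
    ((True, True), "successful login"),
    ((True, False), "unsuccessful login + warning"),
    ((False, None), "unsuccessful login + warning"),  # password not evaluated
]

def login_action(login_valid, password_valid):
    """Hypothetical system behaviour being tested against the table."""
    if not login_valid:
        return "unsuccessful login + warning"
    if password_valid:
        return "successful login"
    return "unsuccessful login + warning"

# Every decision-table column is executed as a test case.
for conditions, expected in RULES:
    assert login_action(*conditions) == expected
```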
21 213 3
Flowchart
Description
Flow based business logic Generate test cases covering all paths Simple to use For every condition generate test cases using BVA, EP
21 214 4
Use Cases :
21 216 6
Techniques
Equivalence partitioning Boundary value analysis Input domain & Output domain Special Value Error based
Cause-effect Graph
Comparison Testing
21 217 7
Invalid Inputs
Valid Inputs
SYSTEM
21 218 8
Example partitions for an input valid between 6 and 15: invalid (less than 6), valid (between 6 and 15), invalid (more than 15, e.g. 17).
Useful in reducing the number of test cases required. It is very useful when the input/output domain is amenable to partitioning.
21 219 9
Here test cases are written to uncover classes of errors for every input condition. The equivalence classes are:
Range: upper bound + 1, lower bound - 1, within bounds
Value: maximum length + 1, minimum length - 1, valid value and valid length, invalid value
Set: in-set, out-of-set
Boolean: true, false
22 220 0
Equivalence Partitioning partitions the input data as a partition of a set; a partition is a collection of mutually disjoint subsets whose union is the entire set. Choose one data element from each partitioned set. The KEY is the choice of equivalence relation! EC-based testing gives a sense of complete testing and helps avoid redundancy.
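Choosing one representative per partition can be sketched for the 6-to-15 length range used in these slides; the validator function is an assumed example.

```python
# One representative value drawn from each mutually disjoint class.
PARTITIONS = {
    "too short (invalid)": "abc",       # length < 6
    "in range (valid)": "abcdefgh",     # 6 <= length <= 15
    "too long (invalid)": "a" * 20,     # length > 15
}

def is_valid(value: str) -> bool:
    """Hypothetical rule under test: length must be 6 to 15."""
    return 6 <= len(value) <= 15

EXPECTED = {
    "too short (invalid)": False,
    "in range (valid)": True,
    "too long (invalid)": False,
}

# One test per partition covers each class without redundancy.
for name, representative in PARTITIONS.items():
    assert is_valid(representative) == EXPECTED[name], name
```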
22 221 1
Example boundary values for the range 6 to 15: 5, 6, 7 and 15, 16.
Complements Equivalence Partitioning: BVA leads to a selection of test cases that exercise bounding values. Design test cases that test the min values of an input, the max values of an input, and just above and below the input range.
22 222 2
Helps to write test cases that exercise bounding values. Complements Equivalence Partitioning. Guidelines are similar to Equivalence Partitioning. Two types of BVA:
Range
Above and below Range
Value
Above and below min and max number
22 223 3
22 224 4
Examples:
For a range of values bounded by a and b, test (a-1), a, (a+1), (b-1), b, (b+1).
If input conditions specify a number of values n, test with (n-1), n and (n+1) input values.
Apply 1 and 2 to output conditions (e.g., generate table of minimum and maximum size). If internal program data structures have boundaries (e.g., buffer size, table limits), use input data to exercise structures on boundaries.
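The first guideline above generates boundary values mechanically; a small sketch:

```python
def boundary_values(a: int, b: int) -> list:
    """For a range bounded by a and b, test (a-1), a, (a+1), (b-1), b, (b+1)."""
    return [a - 1, a, a + 1, b - 1, b, b + 1]

# The 6-to-15 range from the earlier equivalence-partitioning example:
assert boundary_values(6, 15) == [5, 6, 7, 14, 15, 16]
```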
22 225 5
Minimum +1, Nominal/mid, Maximum -1, Maximum, Maximum +1
22 226 6
Description
Inputs Outputs
From input side generate inputs to map to outputs. Ensure that you have generated all possible inputs by looking from the output side.
22 227 7
Ad-hoc / seat-of-the-pants testing. No guidelines; use best engineering judgment. Special test cases / error guessing. It is useful; don't discount its effectiveness!
228 228
229 229
Contd..2
23 230 0
Steps:
A cause-effect graph developed. Graph converted to a decision table. Decision table rules are converted to test cases.
Contd..2
23 231 1
23 232 2
Comparison Testing
In some applications, reliability is critical. Redundant hardware and software may be used.
GUI Testing
23 234 4
Pressing TAB should move the focus (cursor) from left to right and top to bottom in the window. Pressing SHIFT+TAB should move the focus (cursor) from right to left and bottom to top.
Text
Should be left-justified.
23 235 5
Edit Box: You should be able to enter data. Try to overflow the text; input should stop after the specified number of characters. Try entering invalid characters; they should not be allowed.
Radio Buttons: Left and Right arrows should move the ON selection, and so should Up and Down.
Check Boxes Clicking with the mouse on the box or on the text should SET/UNSET the box.
23 236 6
Command Buttons Should have shortcut keys (except OK and Cancel buttons). Click each button with the mouse - should activate. TAB to each button & press Space/Enter - should activate.
Drop Down List Pressing the arrow should give list of options. Pressing a letter should bring you to the first item in the list with that start letter. Pressing Ctrl+F4 should open/drop down the list box.
23 237 7
Combo Boxes Should allow text to be entered. Clicking the arrow should allow user to choose from the list
List Boxes Should allow a single selection to be chosen by clicking with the mouse or using the Up and Down arrows. Pressing a letter should bring you to the first item in the list with that start letter.
23 238 8
Aesthetic Conditions
The general screen background should be of the correct colour (per company standards). The field prompts and backgrounds should be of the correct colour. The text in all fields should use the same font. All field prompts, group boxes and edit boxes should be aligned perfectly.
23 239 9
24 240 0
Navigation Conditions
The screen should be accessible correctly from the menu and toolbar. All screens accessible through buttons on this screen should be accessed correctly. The user should not be prevented from accessing other functions when this screen is active. It should not be possible to open multiple instances of the same screen at the same time.
24 241 1
All read-only and disabled fields should be skipped in the TAB sequence. Microhelp text should not be editable. The cursor should be positioned in the first input field or control when the screen opens. When an error message occurs, the focus should return to the field in error after the message is cancelled. Alt+Tab away and back should not have any impact on the screen.
242 242
243 243
244 244
General Conditions
Help menu should exist. All buttons on all tool bars should have corresponding key commands. Abbreviations should not be used in drop down lists. Duplicate hot keys/shortcut keys should not exist. Escape key and cancel button should cancel (close) the application. OK and Cancel buttons should be grouped separately. Command button names should not be abbreviations.
24 245 5
Red colour should not be used to highlight active objects (many individuals are red-green colour blind).
Screen/Window should not have cluttered appearance. Alt+F4 should close the window/application.
24 247 7
What is a Bug?
Bug
A fault in a program which causes it to perform in an unintended or unanticipated manner, or a deviation from the requirement specification or the design specification, is referred to as a bug.
24 248 8
Submitted
No
24 249 9
Classification of Bugs
Two attributes are used whenever a Bug/Defect is detected
Serious
Minor
25 250 0
Reporting/Logging a Bug/Defect
Summary Description
How to reproduce
Version Module Phase Browser Environment Modified Date
25 251 1
Priority
Testers name Status Database
Type of defect
Reproducible Attachments