
Test Plan Document

Reusability Evaluation Model for OO Software Components

Project Code:
REMFOOSC-2013

Internal Advisor:
Dr. Muhammad Ilyas

External Advisor:
Syeda Iffat Zahra

Project Manager:
Dr. Muhammad Ilyas

Project Team:
Shahnawaz (BSSE-F10-M-44)
Shayan Shaukat (BSSE-F10-M-25)
Rubab Javaid (BSSE-F10-M-62)
Hira Noor (BSSE-F10-M-59)

Submission Date:
April 29, 2014

_____________________
Project Manager's Signature

Document Information

Category            Information
Customer            Department of CS & IT
Project             <Reusability Evaluation Model for OO Software Components>
Document            Test Plan
Document Version    1.0
Identifier          ReMOOSC
Status              Draft
Author(s)           <Shahnawaz, Shayan Shaukat, Hira Noor, Rubab Javaid>
Approver(s)         PM
Issue Date          Feb. 25, 2014
Distribution        1. Advisor  2. PM  3. Project Office

Definition of Terms, Acronyms and Abbreviations

Term    Description
PHP     Hypertext Preprocessor
TP      Test Plan
ML      Multi-Linear
OO      Object Oriented

Table of Contents
1. Introduction
   1.1 Purpose of Document
   1.2 Project Overview
2. Scope of Testing
3. Test Plan Strategy
4. Test Environment
5. Schedule
6. Control Activities
7. Functions to be Tested
8. Functions not to be Tested
9. Test Case Design and Description
10. Traceability Matrix
11. Major Deliverables
12. Risks and Assumptions
13. Exit Criteria
14. References

1. Introduction
Software reusability is a prominent approach for determining how reusable a piece of software is, and for identifying artifacts in currently available software components that can be used to build a new system. The goal of a reusability evaluation model for object oriented software components is to assess the quality of software in terms of reusability. Our objective is to develop a reusability evaluation model for object oriented components that takes the CK metrics suite as input. These inputs are processed by regression algorithms, and the output is the reusability of the object oriented component expressed as a percentile (%).

1.1 Purpose of Document

Everything is developed for the ease of human beings, and every industry builds products or projects for its intended audience or to assist its employees. We cannot release a product for real-world use until we have tested it. Testing establishes whether the product meets the customer's functional and non-functional requirements such as quality, robustness, performance, reusability and user friendliness. It verifies all the parameters that must be fulfilled before real-world use and confirms the performance level of an application or a model. In short, a product is incomplete without a testing process. The software crisis gave the industry a new way of thinking: why are we not reusing our existing work? At the beginning such efforts were very tough, but the object oriented approach offered a path to improvement. IT professionals are the key assets of our industry, and we have tried to lessen their workload by developing this application. The intended audience of our system is the software industry. By calculating the reusability of existing work, a developer can confidently say that x% of the new product is already done. This assists the project manager in project time estimation and also boosts the morale of the working team. The model will therefore help considerably in bringing down the software crisis.
1.2 Project Overview

Software reusability is a prominent approach for determining how reusable a piece of software is, and for identifying artifacts in currently available software components that can be used to build a new system. A number of attributes determine the quality of software, such as understandability, maintainability, reusability and fault proneness. The requirement today is to find out how different metrics can collectively determine the reusability of a software component.

The aim of reusability evaluation of object oriented software components is to predict the quality of these components in terms of reusability. Different reusability evaluation models have been proposed by a number of researchers, but a comparative analysis of different machine learning regression techniques, and the identification of the best machine learning algorithm for a reusability evaluation model, is still unexplored.

In this research-based project we are developing an application for reusability calculation of OO software components. The project has three phases. In the first phase we formulate the four given techniques and develop an application capable of calculating reusability for OO components. The second phase is a comparison of these four regression techniques. In the third phase we develop a generic algorithm that combines the most dominant features of the four regression techniques. The application we have developed is simple and easy to use: to evaluate any OO component, you enter its CK metrics suite values. The final output of the system is displayed as a percentile (%) in tabular format against the given input, and is also shown as a pie graph. The application also helps with data generation, which ultimately enhances the accuracy of the results, and it will help the industry to tackle the software crisis. Deadlines are a key factor in the success of any project, and the application helps meet deadlines by reducing the effort, time and cost of a project. In short, it targets the software pyramid of effort, time and cost, three parameters that definitely affect the quality of a product.
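The flow described above can be summarised as: enter the CK metric values for one component, pass them to a regression model, and display the predicted reusability as a percentage. The document does not include the application's source code; the PHP sketch below only illustrates that flow, and the function name, weights and baseline value are hypothetical placeholders rather than the project's actual model, whose coefficients come from the trained regression algorithms.

<?php
// Hypothetical sketch of the evaluation flow: six CK metric values for one
// OO component are passed to an assumed evaluation function that returns
// reusability as a percentage. Names and numbers are illustrative only.
$ckMetrics = [
    'WMC'  => 12.0,  // Weighted Methods per Class
    'DIT'  => 2.0,   // Depth of Inheritance Tree
    'NOC'  => 3.0,   // Number of Children
    'CBO'  => 5.0,   // Coupling Between Objects
    'RFC'  => 20.0,  // Response For a Class
    'LCOM' => 0.4,   // Lack of Cohesion in Methods
];

function evaluateReusability(array $metrics): float
{
    // Placeholder for the regression step (KNN / multi-linear) described in
    // this document; the real model is trained on the project's data set.
    $weights = ['WMC' => -0.8, 'DIT' => -1.5, 'NOC' => 0.5,
                'CBO' => -1.2, 'RFC' => -0.3, 'LCOM' => -10.0];
    $score = 90.0;                              // assumed baseline score
    foreach ($metrics as $name => $value) {
        $score += $weights[$name] * $value;
    }
    return max(0.0, min(100.0, $score));        // clamp to a percentile (%)
}

echo evaluateReusability($ckMetrics) . "%\n";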
2. Scope of Testing
Research oriented projects are mostly analysis based, and our project is statistics based. The scope describes what is in and what is out of the testing of this application. Testing is required at the following levels:
1. Checking the data type of inputs
2. Data scaling
3. Determining which parameter of the CK metrics suite affects the overall reusability result most dominantly, and in which direction
4. Matching the pie chart result with the results in tabular format
5. Verification of the output results
6. Testing the training results of the ML algorithm

The following items are out of scope:
1. Whether the provided inputs are correct.
2. How to reuse a component; the system only reports how reusable the component is, as a percentile (%).
3. Test Plan Strategy
Planning leads us down the right path and warns us about incoming risks; an outline of the work saves time, cost and effort. The strategy defines a clear path for accomplishing the testing: the resources, procedures, testing types and environment are all defined during test planning. According to our system's perspective, we will perform the following tests.
1. Unit Testing
2. Integration Testing
3. System Testing
1. Unit Testing

Definition: Unit testing is the most basic type of testing. It covers variables, functions, classes, objects, and complete packages, modules or components. All other tests build on unit testing. Unit tests give you the confidence to refactor your code to make it cleaner and more efficient.
Participants: Shahnawaz, Shayan Shaukat, Rubab Javaid, and Hira Noor.

Methodology:
i.   Set up all conditions for testing.
ii.  Call the method (or trigger) being tested.
iii. Verify that the results are correct.
iv.  Clean up modified records.
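As a concrete illustration of the four steps above, the following is a minimal plain-PHP unit test sketch. The function under test (scaleValue) and its expected behaviour are assumptions made for the example; the project's actual units and test harness may differ.

<?php
// Minimal unit-test sketch following the methodology above, written in plain
// PHP with assert(). The unit under test (scaleValue) is hypothetical.

// Hypothetical unit under test: scales a raw metric value into the range 0..1.
function scaleValue(float $value, float $min, float $max): float
{
    return ($max > $min) ? ($value - $min) / ($max - $min) : 0.0;
}

function testScaleValue(): void
{
    // i.  Set up all conditions for testing.
    $min = 0.0; $max = 50.0; $input = 25.0;

    // ii. Call the method being tested.
    $result = scaleValue($input, $min, $max);

    // iii. Verify that the results are correct.
    assert(abs($result - 0.5) < 1e-9);

    // iv. Clean up modified records (nothing to clean for a pure function).
    unset($min, $max, $input, $result);
}

testScaleValue();
echo "Unit test passed.\n";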

2. Integration Testing

Definition: Integration testing finds the bugs that occur when two or more modules are integrated. Its main purpose is to identify functional, requirement-level and performance-level bugs. When the modules are not integrated they perform as per their requirements, but once they are integrated, functional, requirement and performance related issues can occur because of the integration. There are three different integration testing approaches in software testing:
i.   Big Bang
ii.  Top down
iii. Bottom up

Participants: Shahnawaz, Shayan Shaukat, Rubab Javaid, and Hira Noor.


Methodology:
i.   Prepare
ii.  Review
iii. Rework
iv.  Baseline

3. System Testing

Definition: System testing of software or hardware is testing conducted on a complete, integrated system to evaluate the system's compliance with its specified requirements. System testing has three further sub-types:

i.   Functional Testing
ii.  Performance Testing
iii. Acceptance Testing

Participants: Shahnawaz, Shayan Shaukat, Rubab Javaid and Hira Noor.


Methodology:
i.    Execute the test condition.
ii.   Check the output against the expected results.
iii.  Evaluate and document any unexpected results.
iv.   Make sure that any required corrections are migrated and re-tested.
v.    Make sure that the final testing components (conditions, input, and expected results) are accurate, complete and documented in such a way that they are repeatable and reusable.
vi.   Review and obtain acknowledgment of the system test results where appropriate (i.e. new functionality).
vii.  Check that the system meets the performance level required of the given application.
viii. Have the end user review the system to complete acceptance testing.

4. Test Environment
The test environment describes the conditions under which the testing is performed, including the hardware and software specifications. Since our system is not environmentally sensitive, no specific conditions such as temperature or humidity are required, and the application is platform independent. The minimum hardware and software specifications are listed below.

a. Hardware Specifications:
   i.   512 MB RAM
   ii.  2.00 GHz processor
   iii. 20 GB hard disk

b. Software Requirements:
   i.   Any web browser
   ii.  Any text editor capable of handling files with .js, .php and .html extensions (extension letter case is not significant)
   iii. JUnit tester

c. Infrastructure:
   i.   An internet connection and a browser-oriented infrastructure.

No logistics support is required from the department or teachers for testing. However, result validation is carried out by Dr. Tehseen Zia and Syeda Iffat Zahra, with the assistance of Dr. Muhammad Ilyas.
5. Schedule
Testing Activities       Begin         End
Designing Test Cases     03-11-2014    10-12-2013
Executing Test Cases     12-02-2014    20-04-2014
Unit Testing             12-02-2014    31-04-2014
Integration Testing      01-05-2014    15-05-2014
System Testing           15-05-2014    22-05-2014

6. Control Activities
Our whole project team has participated in review meetings conducted with the supervisor and the PM (project manager). These meetings have been held weekly and monthly.
7. Functions to be Tested
R1: Checking the data type of inputs
R2: Data scaling
R3: Which parameter of the CK metrics suite affects the overall reusability result most dominantly, and in which direction
R4: Matching of the pie chart result with the results in tabular format
R5: Verification of the output results
R6: Testing the training results of the ML algorithm

Use Cases:
UC-1: Check Data Types
UC-2: Scaling of Data
UC-3: Effect of CK Metrics
UC-4: Comparing of Results
UC-5: Verification of Results
UC-6: Testing of Results
8. Functions not to be Tested
-  Whether the provided inputs are correct.
-  How to reuse a component; the system only reports how reusable the component is, as a percentile (%).

9. Test Case Design and Description

< Check Data Type >

Test Case ID:           T1
Test Case Version:      T1
Test Date:              21-03-2014
Team:                   Application Development (Shahnawaz, Shayan Shaukat, Rubab Javaid, and Hira Noor)
Reviewed By:            Team Lead
Use Case Reference(s):  UC_1
Revision History:       No
Objective:              The type of the entered CK metric values is a real number.
Product/Ver./Module:    Refer to the next level for data scaling.
Environment:            Necessary and desired properties of the test environment (hardware, software).
Assumptions:            Provided inputs are correct.
Pre-Requisite:          Internet connection is required.

Step No.  Execution description                              Procedure result
1.        Fill in all the input fields.                      Values are entered.
2.        Press the submit button.                           Form validation starts.
3.        No error message appears because all values        Form submission completes successfully.
          are real numbers.

Comments: No error was recorded in the module. All team members are satisfied with the results.
Passed    Failed    Not Executed
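Test case T1 checks only that every submitted CK metric value is a real number. As an illustration, a server-side check of that kind could look like the following PHP sketch; the field names and the use of $_POST are assumptions, not the application's actual code.

<?php
// Sketch of the data-type check exercised by T1: every submitted CK metric
// field must be a real number. Field names and the $_POST source are assumed.
$fields = ['wmc', 'dit', 'noc', 'cbo', 'rfc', 'lcom'];
$errors = [];

foreach ($fields as $field) {
    $raw = $_POST[$field] ?? '';
    // is_numeric() accepts integer and decimal values, which is what T1 expects.
    if (!is_numeric($raw)) {
        $errors[$field] = 'Value must be a real number.';
    }
}

if (empty($errors)) {
    // Step 3 of T1: no error message is shown and the form is submitted.
    echo "Form submitted successfully.\n";
} else {
    // Validation failure path, not reached in T1 because all inputs were real numbers.
    print_r($errors);
}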

< Scaling of Data >

Test Case ID:           T2
Test Case Version:      T2
Test Date:              27-03-2014
Team:                   Application Development (Shahnawaz, Shayan Shaukat, Rubab Javaid, and Hira Noor)
Reviewed By:            Team Lead
Use Case Reference(s):  UC_2
Revision History:       No
Objective:              Verify whether all entered values are properly scaled.
Product/Ver./Module:    Refer to the next level for mathematical evaluation.
Environment:            Necessary and desired properties of the test environment (hardware, software).
Assumptions:            All values are of type Real Number.
Pre-Requisite:          Internet connection is required.

Step No.  Execution description                              Procedure result
1.        Press the submit button.                           Values are forwarded to the scaling function.
2.        The scaling function is executed.                  Scaled values are displayed in tabular form.

Comments: No conflict was found during scaling. All team members are satisfied with the results.
Passed    Failed    Not Executed
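The document does not state which scaling formula the application applies before the regression step. The sketch below uses min-max normalisation to the range 0..1 purely to illustrate what T2 verifies; the per-metric minimum and maximum values shown are hypothetical.

<?php
// Sketch of the scaling step exercised by T2, using assumed min-max
// normalisation; the real application's formula and ranges may differ.
function scaleRow(array $values, array $mins, array $maxs): array
{
    $scaled = [];
    foreach ($values as $name => $value) {
        $range = $maxs[$name] - $mins[$name];
        // Map each metric into 0..1; degenerate ranges collapse to 0.
        $scaled[$name] = ($range > 0) ? ($value - $mins[$name]) / $range : 0.0;
    }
    return $scaled;
}

// Hypothetical minimum and maximum values per CK metric, e.g. taken from the
// training data set.
$mins  = ['WMC' => 0,  'DIT' => 0, 'NOC' => 0,  'CBO' => 0,  'RFC' => 0,   'LCOM' => 0];
$maxs  = ['WMC' => 50, 'DIT' => 6, 'NOC' => 10, 'CBO' => 20, 'RFC' => 100, 'LCOM' => 1];
$input = ['WMC' => 12, 'DIT' => 2, 'NOC' => 3,  'CBO' => 5,  'RFC' => 20,  'LCOM' => 0.4];

print_r(scaleRow($input, $mins, $maxs));   // scaled values shown in tabular form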

< Effects of CK Metrics >

Test Case ID:           T3
Test Case Version:      T3
Test Date:              28-03-2014
Team:                   Application Development (Shahnawaz, Shayan Shaukat, Rubab Javaid, and Hira Noor)
Reviewed By:            Team Lead
Use Case Reference(s):  UC_3
Revision History:       No
Objective:              Determine which parameters of the CK metrics suite affect the final result.
Product/Ver./Module:    Refer to the final result of the reusability evaluation model.
Environment:            Necessary and desired properties of the test environment (hardware, software).
Assumptions:            The form is submitted.
Pre-Requisite:          Internet connection is required.

Step No.  Execution description                              Procedure result
1.        The form is submitted.                             Values are forwarded to the KNN algorithm.
2.        Results are displayed after processing.            The final result is displayed against the provided
                                                             CK metrics suite.

Comments: All team members are satisfied with the results because the results are consistent with our research work.
Passed    Failed    Not Executed
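Steps 1 and 2 of T3 forward the scaled CK values to the KNN algorithm. The following is a minimal k-nearest-neighbours regression sketch of that step; the training rows, the value of k and the Euclidean distance measure are assumptions made for illustration and may differ from the application's implementation.

<?php
// Minimal k-nearest-neighbours regression sketch: predict the reusability of
// a component as the average reusability of its k closest training rows.
function knnPredict(array $query, array $trainX, array $trainY, int $k = 3): float
{
    $distances = [];
    foreach ($trainX as $i => $row) {
        $sum = 0.0;
        foreach ($row as $j => $feature) {
            $sum += ($feature - $query[$j]) ** 2;   // squared Euclidean distance
        }
        $distances[$i] = sqrt($sum);
    }
    asort($distances);                              // nearest rows first
    $nearest = array_slice(array_keys($distances), 0, $k);

    $total = 0.0;
    foreach ($nearest as $i) {
        $total += $trainY[$i];                      // average reusability of the neighbours
    }
    return $total / $k;
}

// Hypothetical scaled CK vectors and their known reusability percentages.
$trainX = [[0.2, 0.3, 0.1, 0.2, 0.1, 0.4],
           [0.6, 0.5, 0.3, 0.7, 0.6, 0.8],
           [0.1, 0.2, 0.2, 0.1, 0.2, 0.3],
           [0.8, 0.7, 0.6, 0.9, 0.8, 0.9]];
$trainY = [72.0, 41.0, 80.0, 30.0];

echo knnPredict([0.25, 0.3, 0.15, 0.2, 0.15, 0.35], $trainX, $trainY) . "%\n";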

< Comparing of Results >

Test Case ID:           T4
Test Case Version:      T4
Test Date:              01-04-2014
Team:                   Application Development (Shahnawaz, Shayan Shaukat, Rubab Javaid, and Hira Noor)
Reviewed By:            Team Lead
Use Case Reference(s):  UC_4
Revision History:       No
Objective:              The graphical and tabular results are the same.
Product/Ver./Module:    Refer to the final result of the reusability evaluation model.
Environment:            Necessary and desired properties of the test environment (hardware, software).
Assumptions:            The form is submitted.
Pre-Requisite:          Internet connection is required.

Step No.  Execution description                              Procedure result
1.        The form is submitted.                             Values are forwarded to the KNN algorithm.
2.        Results are displayed after processing.            Results in tabular and graphical form are displayed;
                                                             both results are the same.

Comments: No conflict was found when matching the graphical and tabular results. All team members are satisfied with the results.
Passed    Failed    Not Executed
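T4 passes when the values plotted in the pie chart are the same as the values shown in the results table. A check of that kind can be sketched as below; both data structures are hypothetical stand-ins for the application's actual output.

<?php
// Sketch of the consistency check behind T4: the values fed to the pie chart
// must match the values shown in the results table. Data is illustrative.
$tableResults = ['reusable' => 64.3, 'non_reusable' => 35.7];   // tabular output (%)
$pieChartData = ['reusable' => 64.3, 'non_reusable' => 35.7];   // data fed to the chart

$match = true;
foreach ($tableResults as $label => $value) {
    if (abs($pieChartData[$label] - $value) > 0.01) {   // small tolerance for rounding
        $match = false;
        break;
    }
}
echo $match ? "Tabular and graphical results are the same.\n"
            : "Mismatch between table and pie chart.\n";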

< Verification of Results >

Test Case ID:           T5
Test Case Version:      T5
Test Date:              02-04-2014
Team:                   Application Development (Shahnawaz, Shayan Shaukat, Rubab Javaid, and Hira Noor)
Reviewed By:            Team Lead
Use Case Reference(s):  UC_5
Revision History:       No
Objective:              Accuracy of the final result.
Product/Ver./Module:    Refer to the final result of the reusability evaluation model.
Environment:            Necessary and desired properties of the test environment (hardware, software).
Assumptions:            The form is submitted.
Pre-Requisite:          Internet connection is required.

Step No.  Execution description                              Procedure result
1.        The form is submitted.                             Values are forwarded to the KNN algorithm.
2.        Results are displayed after processing.            Results in tabular and graphical form are displayed.

Comments: The final result is close to our approximation. All team members are satisfied with the results.
Passed    Failed    Not Executed

< Testing of Results >

Test Case ID:           T6
Test Case Version:      T6
Test Date:              04-04-2014
Team:                   Application Development (Shahnawaz, Shayan Shaukat, Rubab Javaid, and Hira Noor)
Reviewed By:            Team Lead
Use Case Reference(s):  UC_6
Revision History:       No
Objective:              Training of the Multi-Linear (ML) algorithm.
Product/Ver./Module:    Refer to data generation for this application.
Environment:            Necessary and desired properties of the test environment (hardware, software).
Assumptions:            Training is run with different numbers of iterations.
Pre-Requisite:          Internet connection is required.

Step No.  Execution description                              Procedure result
1.        Enter the values for training of the ML            Values are forwarded to the ML algorithm.
          algorithm.
2.        Results are displayed after processing.            All results against each iteration are displayed.
3.        Test the data generation process.                  Results are displayed.
4.        Check the difference between the provided set      A result is displayed against each set of inputs.
          of data and the actually generated data.

Comments: The ML algorithm was finally trained at 10,000 iterations, at which the accuracy level is 96%. All team members are satisfied with the results.
Passed    Failed    Not Executed
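T6 retrains the multi-linear (ML) model over different numbers of iterations and checks the resulting error, with the comment above recording 96% accuracy at 10,000 iterations. The sketch below shows one way such a training loop can be written, using gradient descent on squared error; the training data, learning rate and model form are illustrative assumptions, and only the 10,000-iteration figure comes from the test case.

<?php
// Sketch of an iterative multi-linear regression training loop like the one
// T6 exercises. Data, learning rate and structure are hypothetical.
function trainMultiLinear(array $X, array $y, int $iterations, float $lr = 0.01): array
{
    $n = count($X);
    $m = count($X[0]);
    $weights = array_fill(0, $m, 0.0);
    $bias = 0.0;

    for ($it = 0; $it < $iterations; $it++) {
        $gradW = array_fill(0, $m, 0.0);
        $gradB = 0.0;
        for ($i = 0; $i < $n; $i++) {
            // Prediction of the current linear model for row $i.
            $pred = $bias;
            for ($j = 0; $j < $m; $j++) {
                $pred += $weights[$j] * $X[$i][$j];
            }
            // Accumulate the mean-squared-error gradient.
            $error = $pred - $y[$i];
            for ($j = 0; $j < $m; $j++) {
                $gradW[$j] += $error * $X[$i][$j] / $n;
            }
            $gradB += $error / $n;
        }
        // Gradient descent update after each pass over the data.
        for ($j = 0; $j < $m; $j++) {
            $weights[$j] -= $lr * $gradW[$j];
        }
        $bias -= $lr * $gradB;
    }
    return ['weights' => $weights, 'bias' => $bias];
}

// Hypothetical scaled CK vectors and reusability targets (as fractions of 100%).
$X = [[0.2, 0.3, 0.1], [0.6, 0.5, 0.3], [0.1, 0.2, 0.2], [0.8, 0.7, 0.6]];
$y = [0.72, 0.41, 0.80, 0.30];

$model = trainMultiLinear($X, $y, 10000);    // iteration count taken from T6
print_r($model);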

10. Traceability Matrix


Sr. No.  Requirement ID  Use Case ID  GUI ID  Test Case ID  Status    Date
1        R1              UC_1                 T1            Complete  21-03-2014
2        R2              UC_2                 T2            Complete  27-03-2014
3        R3              UC_3                 T3            Complete  28-03-2014
4        R4              UC_4                 T4            Complete  01-04-2014
5        R5              UC_5                 T5            Complete  02-04-2014
6        R6              UC_6                 T6            Complete  04-04-2014

11. Major Deliverables


At the end of the testing phase, we deliver:
i.  Test plan document
ii. Test cases

12. Risks and Assumptions


i.   We have implemented the first two algorithms of our project and have unit tested both, but the third algorithm is still under development. We have integrated the first two but not the third one, so there may be a risk at the time of integration and system testing.
ii.  All our group members have in-depth knowledge of the project and know the critical details of each of its aspects. We can easily present our project to the market, to end users and to the project manager. We are confident about our final year project, so there is a very low chance of risk.
iii. While the application is in use, we assume that all provided inputs are accurate.
iv.  Internet access is available at the time of working.

13. Exit Criteria

To exit the Unit Testing phase, a 100% success rate must be achieved. Things that must be done on exit from the Unit Test stage:
i.   The project coding is complete.
ii.  All priority bugs have been fixed and closed.
iii. The unit tests have been passed.

The Integration Test exit criteria rely on all the modules being operational. To exit the Integration Testing phase, a 100% success rate must be achieved. Things that must be done on exit from the Integration Test stage:
i.   All code bugs that are exposed are corrected.
ii.  The entire module, after integration, performs its functionality according to the System Specification Design.
iii. All modules are ready for System Testing.
iv.  Black Box Testing is completed.

The System Testing exit criteria must satisfy all the criteria listed below. This verifies that all elements of the project are integrated properly and makes sure that the whole system functions and performs according to the System Specification Document.
i.   All function validation testing is 100 percent successful.
ii.  The Graphical User Interface performs to the System Specification Requirements.
iii. All input fields on the Graphical User Interface work correctly.
iv.  The Quality Assurance Plan is approved.
v.   User satisfaction is approved.
vi.  The application is approved by the end user.

14. References

1. Reusability Evaluation Model for OO Software Components, Web Based Application, 10-04-2014.
2. SRS.
3. Design Document.
4. Integration Testing, Software Testing Fundamentals, http://softwaretestingfundamentals.com/integration-testing/, 20-04-2014.
5. 12 Unit Testing Tips for Software Engineers, http://readwrite.com/2008/08/13/12_unit_testing_tips_for_software_engineers#awesm=~oC8WCLGsabxtM8, 20-04-2014.
6. Integration and testing (I&T), http://searchsoftwarequality.techtarget.com/definition/integration-testing, 25-04-2014.
7. How to Write Good Unit Tests, https://developer.salesforce.com/page/How_to_Write_Good_Unit_Tests, 18-04-2014.
8. Software testing tutorials and automation, http://software-testing-tutorials-automation.blogspot.com/2011/06/what-is-integrated-testing-in-software.html, 25-04-2014.