
Asia Pacific University of Technology and Innovation

Software Quality Engineering


CT051-3.5-3
Title: Reliable Web Attendance (R.W.A)
Intake: UC3F1603SE

Student ID: TP028489


Student Name: Ho Li Yang
Lecturer Name: Mr. Kesava Pillai A/L Rajadorai @Rajoo

Table of Contents

Abstract
Chapter 1: Introduction
    BACKGROUND OF THE PROJECT
    PURPOSE
    AIMS OF THE SYSTEM
    OBJECTIVES OF SYSTEM
    SCOPE OF THE PROJECT
Chapter 2: Methodology
    METHODOLOGY SELECTION
    METHODOLOGY EVALUATION
        Requirements Planning
        User Design
        Construction
        Implementation
    BENEFITS OF RAD
    JUSTIFICATION OF CHOICE
Chapter 3: Quality Assurance Process
    QUALITY ASSURANCE
    MCCALL'S QUALITY MODEL
    PRODUCT REVISION FACTOR
    PRODUCT TRANSITION FACTOR
    PRODUCT OPERATION FACTOR
    VALIDATION & VERIFICATION
    DEFECT TRACKING
    CONCLUSION
Chapter 4: Estimation
    SCHEDULE ESTIMATION
    SIZE ESTIMATION
    EFFORT ESTIMATION
    CONCLUSION
Chapter 5: Quality Assurance Plan
    SQAP PURPOSE
        Scope of Software Quality Assurance Plan
        Referenced Documents
    MANAGEMENT
        Project Organization
        Roles and Responsibilities
    DOCUMENTATION
        Software Requirement Specification (SRS)
        SRS Purpose
        SRS Content
        Software Design Description
        Architecture Design Documents
        ADD Purpose
        ADD Content
        Software Verification and Validation Plan
        Software Verification and Validation Report
        Configuration Management Plan
        CMP Purpose
        CMP Content
            Configuration Identification
            Roles and Responsibilities
    STANDARDS, PRACTICES, CONVENTIONS, AND METRICS
        Documentation Standards
        Coding and Naming Standards
        User Interface Standards
        Software Metrics
    REVIEW AND AUDIT
        Purpose
        Deliverables Reviews
        Management Review
        Product Reviews
    TESTING
        Unit Testing
        Integration Testing
    RISK MANAGEMENT
        Risk Identification
        Risk Quantification
        Risk Response Plan
        Risk Mitigation Plan
        Risk Configuration and Control
        Tools and Practices
        Glossary
Chapter 6: Metrics
    PRODUCT METRICS
    PROCESS METRICS
    SIZE-ORIENTED METRICS
    LOC METRICS
    COMPLEXITY METRICS
    MEASUREMENT
    MEASURES OF SOFTWARE QUALITY

Conclusion and Recommendation

Abstract
First of all, I would like to thank the lecturer of this module, Mr. Kesava Pillai A/L
Rajadorai @Rajoo, for his advice and support. He has encouraged and motivated me to
do well in this project. He also emphasized that this project is very important for my
understanding of the subject and for the future. I would like to thank him for showing
some good examples related to our assignment. I would also like to thank the authorities
of Asia Pacific University (APU) for providing me with a good environment and the
facilities to finish the project.

Chapter 1: Introduction
Background of the project
Nowadays, software applications are used in almost every organization to manage
important information. The significant benefit of a software application is that it replaces
the traditional paperwork of managing information. An attendance system is very
important for every educational organization, as it is used to manage students'
attendance records. An attendance system has to be stable and reliable, and must
simplify the task of attendance marking as much as possible, because every lecturer is
required to mark attendance during every lecture session, and it should not take too
long, as every minute and every second is precious during the lecture. Many attendance
systems, such as APU's web attendance, are quite tedious because every time lecturers
mark attendance they have to select details such as intake, module, start time, class
type, duration, etc. from long drop-down lists, which is confusing and time-consuming.
Sometimes a lecturer marks the wrong attendance for students because they have made
a mistake selecting the details from the drop-down lists. In addition, when the final
examination is near, students have to collect their examination dockets at the
administration office. Collecting examination dockets is tedious because the staff have to
issue dockets one by one for a large number of students. Sometimes it takes a long time
for students to get their examination dockets when the queue is long and the staff's
working efficiency is slow. This creates workload for the staff and also makes the
students unhappy.

Purpose
Given the problems stated above, the Reliable Web Attendance will be a great solution,
as it simplifies the attendance marking process: the lecturer can mark student
attendance in just a few steps. The proposed system would also help when issuing
dockets by allowing students to get their dockets online if they have attended enough
classes in total. It is a web application, which allows users to access it anywhere
through the internet.

Aims of the System

The aim of R.W.A is to provide a simple, efficient and useful attendance web application.
It improves the process of, and reduces the time taken in, marking attendance. It also
eliminates manual tasks such as issuing dockets. Another aim is to reduce absenteeism
in APU.

Objectives of System

Objectives are more specific statements about what R.W.A will be able to do after
completion of the system.
The first objective is to get things done in just a few steps. Through the new system, the
lecturer can mark attendance by just logging in and choosing the module; they do not
have to specify other details such as class code, date and time, class type, etc.
Secondly, R.W.A is meant to reduce the administrators' workload, as it allows students to
get their docket numbers online, so students do not need to visit the administration
office. Another objective is that it highlights the names of low-attendance students and
also warns students who have been absent from too many classes.

Scope of the project

The scope of a system defines the boundaries of the system to be built. In other
words, Reliable Web Attendance's various features must be defined clearly to assure the
quality of the new system. It is assumed that R.W.A is being built for a university.
The features of Reliable Web Attendance to be included are:

1) Marking attendance records
2) Viewing attendance records
3) Getting docket numbers
4) Low attendance warning & highlighting

Chapter 2: Methodology
Methodology Selection
XP is a set of practices that conform to the values and principles of Agile. XP is a
discrete method, whereas Agile is a classification; there are many Agile methods, and
XP is just one of them.
Having said that, none of the other Agile methods is as well defined or as broad in
scope as XP. Scrum, for example, is roughly equivalent to XP's Planning Game practice,
with elements of Whole Team. While there are differences in the details, it is fair to say
that Scrum is a subset of XP. Indeed, many Scrum teams augment their process by
adding in many of the XP practices such as Acceptance Testing, Pair Programming,
Continuous Integration, and especially Test-Driven Development.
Of all the Agile methods, XP is the only method that provides deep and profound
disciplines for the way developers do their daily work. Of those disciplines, Test Driven
Development is the most revolutionary and impactful. (Storani Maurizio, 2008)

Methodology Evaluation
Robustness
Extreme Programming leverages the power of simplicity. The design resembles a jigsaw
puzzle with developers working on many small pieces or iterations. The combination of
such iterations at the end gives the final product. This approach creates working software
faster, with very few defects. Regular testing during the development stage ensures the
detection of bugs, and the use of customer-approved validation tests to determine the
successful completion of a coding block ensures the implementation of only what the
customer wants and nothing more.
One advantage of this approach is that it allows for cost estimates based on software
features instead of developer activity. This allows customers to make intelligent decisions
on what needs inclusion and what needs exclusion depending on the budget. By selecting
the important requirements first, the customer obtains maximum value with the least
amount spent, and can trade off the marginal increase in product utility against the cost
of incorporating additional features. This approach also allows both the user and the
customer to "pull the plug" on development at almost any time and still have highly
valuable functional code, even if incomplete.
Resilience
The traditional approach to programming works best when requirements remain static. In
real life, requirements keep changing, either because of the emergence of new business
opportunities or simply because the initial requirement-gathering phase was incomplete.
Extreme Programming builds in accommodation of such changing requirements by
gathering user stories at the start of each iteration and through feedback during the
course of iterations.
Employee Satisfaction
Extreme Programming, while reducing the importance of individuals in the development
process, also helps increase employee satisfaction and retention. Extreme Programming is
a value-driven approach that sets fixed working time, with little scope for overtime. The
breakdown of project scope into subcomponents and the constant customer feedback
prevent the accumulation of too much work to be completed before a tight deadline.
Lesser Risks
One of the major advantages of Extreme Programming is that it reduces the risks related
to programming. Conventional programming depends a lot on individual superstars or
critical members in the team. Extreme Programming, by breaking the tasks into modules,
spreads the risk and reduces the dependence on any one architect, project manager, or
individual coder.

Justification of Choice
Extreme Programming (XP) was created in response to problem domains whose
requirements change. I assumed that the customers, such as lecturers and students, may
not have a firm idea of what the system should do, and that the system's functionality
may be expected to change every few months. In many software environments,
dynamically changing requirements are the only constant. This is where XP will succeed
for my R.W.A project while other methodologies would not. XP was also set up to
address the problems of project risk. If your customers need a new system by a specific
date, the risk is high. If that system is a new challenge for your software group, the risk
is even greater. If that system is a new challenge to the entire software industry, the risk
is greater still. The XP practices are set up to mitigate the risk and increase the
likelihood of success. XP is set up for small groups of programmers, between 2 and 12,
though larger projects of 30 have reported success. Our programmers can be ordinary;
we do not need programmers with a Ph.D. to use XP. But we cannot use XP on a project
with a huge staff. We should note that on projects with dynamic requirements or high
risk, a small team of XP programmers may be more effective than a large team anyway.
XP requires an extended development team: the XP team includes not only the
developers, but the managers and customers as well, all working together elbow to
elbow. Asking questions, negotiating scope and schedules, and creating functional tests
require more than just the developers to be involved in producing the software. The last
thing on the list is productivity. XP projects unanimously report greater programmer
productivity when compared to other projects within the same corporate environment,
but this was never a goal of the XP methodology. The real goal has always been to
deliver the software that is needed when it is needed.

Chapter 3: Quality Assurance Process


Quality Assurance has its roots in assuring the quality of a manufactured physical
product; this is achieved by inspecting the product and evaluating its quality near its
completion or at various stages of production. Software, however, is not as tangible as
more physical products. Typically, a software product is its functionality, not a physical
artifact. There is no physical software product to evaluate; there is code, and not always
accompanying documentation. This invisible nature of software adds to the
complications of assessing its quality. Industrial products are visible; software products
are invisible. Most of the defects in an industrial product can be detected during the
manufacturing process; defects in software products, however, are invisible, as in the
fact that parts of a software package may be absent from the beginning. (Daniel Galin, 2004)

Software Quality Assurance

1. A planned and systematic pattern of all actions necessary to provide adequate
confidence that a software work product conforms to established technical
requirements.
2. A set of activities designed to evaluate the process by which software work
products are developed and/or maintained.

(IEEE, quoted in Daniel Galin, 2004; also the SEI Carnegie Mellon University glossary
of terms for CMM Key Practices.)
For the purpose of this assignment, software quality assurance (SQA) is considered a
process for the measurement of deliverables and activities during each stage of the
development lifecycle. The objective of SQA is to quantify the quality of the products and
the activities giving rise to them, and also to guide a quality improvement effort. It is
advantageous to integrate it into the software development process. SQA should also
take into consideration the maintenance of a product, the technical solution, and the
product budget and scope. Quality assurance differs from quality control in that quality
control is a set of activities designed to evaluate the quality of a developed or
manufactured product; the evaluation is conducted during or after the production of the
product. Quality assurance, however, reduces the cost of guaranteeing quality through a
variety of activities performed throughout the development and manufacturing process.
For the purpose of this assignment I will focus on the following aspects of SQA. Each
SQA activity that I discuss is modular; the SQA activities take place at each stage of the
development lifecycle. The stages are categorized into areas for requirements capture,
system design, coding and testing, and finally release.
1. Verification: The process of evaluating a system or component to determine
whether the products of a given development phase satisfy the conditions imposed at
the start of that phase.
2. Validation: The process of evaluating a system or component during or at the end
of the development process to determine whether it satisfies specified requirements.
3. Qualification: The process used to determine whether a system or component is
suitable for operational use.
During the analysis, design and coding stages of product development the outputs of each
stage need to be measured, monitored and managed so that each output can be verified
against its predefined exit criteria. When the final product has completed the coding and
integration stages it must be validated against the original user requirements and signed
off by senior team members as passed validation testing. At each stage of this product
development the efforts during the development must be improved upon where possible
in order to cut costs and remain competitive. This is not an easy task when what is being
produced is a program, which in itself is intangible. This is where the complications of
software quality assurance lie.
Verification originated in the aerospace industry during the design of systems. There are
two criteria:
1. The software must perform all intended functions
2. The software must not perform any function that, either by itself or in combination
with other functions, can degrade the performance of the system.

An effective verification effort must show that all requirements have been carried out
correctly, this is done by testing the requirements against the product during delivery.
These tests can be re-executed to achieve the same results should the system be changed
at a later date.
Verification shows that a product meets its specified requirements at predefined
milestones during the development life-cycle. Validation checks that the system meets
the customer's requirements at the completion of the development life cycle. An example
of verification versus validation is depicted below:

Figure: V-model of verification versus validation

Chapter 4: Estimation
I assumed that there are two situations in which I have to estimate the project. In the
first situation, my boss comes to me and gives me a feature list or requirements list, and
I am asked to give the total time required to implement all of these features. I estimate
the project and give my estimates to my boss, who puts them into a larger sub-total for a
large project. In this situation I have no time limit when estimating and am not kept
under pressure by a deadline. This type of situation happens very rarely. In the second
situation I have a deadline and a list of features to implement, and my task is to develop
an estimate for all the features that I can deliver before the deadline. We face this
second situation most of the time. There may be times when, with all resources
available, one cannot deliver the output on or before the deadline. In this situation one
has to negotiate either the deadline or the number of features for implementation.
Estimation alone cannot guarantee project completion at the committed date. One needs
project control and good project management skills to complete the project according to
the estimate. Therefore, in software project management, estimation is just one part and
only helps in planning.

SCHEDULE ESTIMATION
When does the project start? Does it start when it gets formal budget approval? Does it
start when initial discussions about the project begin? Does it start when it is fully
staffed? Capers Jones reports that less than 1% of projects have a clearly defined
starting point. (Jones, 1997)
When does the project end? Does it end when the software is released to the customer?
When the final release candidate is delivered to testing? What if most of the
programmers have rolled off the project a month before the official release? Jones
reports that 15% of projects have ambiguous end times. (Jones, 1997)

There are many tools on the market that help develop Gantt and PERT charts to schedule
and track projects. These programs are most effective when breaking the project down
into a Work Breakdown Structure (WBS) and assigning estimates of effort and staff to
each task in the WBS.

SIZE ESTIMATION

Software size is the key input to most software cost estimating methodologies.
It can be measured by counting the number of Lines of Code (LOC). For this project,
R.W.A, the project manager is going to adopt function points to estimate the project
size, as this does not rely upon a proprietary estimation algorithm. There are several
approaches to estimating the size of R.W.A:

Through expert consensus (Wideband-Delphi)

From historical population data (Fuzzy logic)

From standard components (Component estimating)

From a model of function (Function points)

The estimated size of the project is shown below:

Function                  FP conversion   Qty   Total
Inputs                    4               5     20
Outputs                   5               2     10
Queries                   4               5     20
Interfaces                7               4     28
Logical interface table   10              3     30
Total function points                           108

Function point analysis is a popular method based on a weighted count of common
functions of software. The basic functions are:
Inputs: Sets of data supplied by users or other programs
Outputs: Sets of data produced for users or other programs
Queries: Means for users to interrogate the system
Interfaces: Files/databases shared with other systems
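The weighted count above can be reproduced in a few lines. The following Python sketch (the dictionary name and layout are my own, not part of the report) recomputes the unadjusted function point total from the table's weights and quantities:

```python
# (weight per function, quantity counted for R.W.A), from the size table
FUNCTION_COUNTS = {
    "inputs": (4, 5),
    "outputs": (5, 2),
    "queries": (4, 5),
    "interfaces": (7, 4),
    "logical_interface_tables": (10, 3),
}

def total_function_points(counts):
    """Sum weight * quantity over every function category."""
    return sum(weight * qty for weight, qty in counts.values())

total = total_function_points(FUNCTION_COUNTS)  # 20+10+20+28+30 = 108
```

The result, 108 unadjusted function points, matches the grand total in the table.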

Effort Estimation
Effort is defined as an expenditure of physical or mental work that will be needed on the
part of the team members, and effort is mostly measured in terms of staff hours
(Duncan, 2011).
The main goal here is to define the data being collected well enough so that we know
what we are estimating. If the data from previous projects includes a high percentage of
unpaid overtime and we use that historical data to estimate a future project, then we
have just calibrated a high percentage of overtime into the future project. (McConnell, 2006)

LOC per function point for each type of programming language:

Programming language   LOC per function point
C++                    53
Cobol                  107
Delphi 5               18
HTML 4                 14
Python                 20
ASP.net                30
SQL                    15
Java                   46

Programming language: Java

LOC = (average LOC per function point) x number of function points
    = 46 x 108
    = 4968
KLOC = 4968 / 1000 = 4.968
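The conversion from function points to LOC can likewise be sketched. The LOC-per-function-point figures are the ones from the language table above; the helper name is illustrative only:

```python
# LOC per function point, taken from the language table above
LOC_PER_FP = {
    "C++": 53, "Cobol": 107, "Delphi 5": 18, "HTML 4": 14,
    "Python": 20, "ASP.net": 30, "SQL": 15, "Java": 46,
}

def estimate_loc(language, function_points):
    """Estimated lines of code for a given implementation language."""
    return LOC_PER_FP[language] * function_points

loc = estimate_loc("Java", 108)  # 46 * 108 = 4968
kloc = loc / 1000                # 4.968
```

Switching the implementation language only changes the lookup, which is the appeal of sizing in function points first.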

Common values for the linear productivity factor:

Project Type              Linear Productivity Factor
COCOMO II                 3.13
Embedded Development      3.60
E-commerce Development    3.08
Web Development           2.51
Military Development      3.97

Linear productivity factor selected = 3.60 (the Embedded Development value from the
table above)

Effort = productivity factor x (KLOC ^ penalty)
       = 3.60 x (4.968 ^ 1.030)
       ~ 19 person-months

Duration = 2.5 x Effort^0.38
         = 2.5 x 19^0.38
         ~ 7 months

Estimated people = Effort / Duration
                 = 19 / 7 ~ 3 people

Coding cost:
8 hrs/day x 5 days = 40 hrs per week
40 hrs x 4 weeks = 160 hrs per month
1 hr = RM 15
Cost per month = RM 15 x 160 hrs = RM 2,400
Coding cost per person = RM 2,400 x 7 months = RM 16,800
Total coding cost = RM 16,800 x 3 people = RM 50,400
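The effort, duration, staffing, and cost arithmetic above can be bundled into a single sketch. The constants come straight from the report, and the rounding mirrors the report's own intermediate rounding (effort rounded to 19 person-months, duration truncated to 7 months); this is an illustration of the report's calculation, not a general COCOMO implementation:

```python
import math

PRODUCTIVITY = 3.60     # linear productivity factor used in the report
PENALTY = 1.030         # size penalty exponent
KLOC = 4.968            # from the function point sizing
HOURLY_RATE_RM = 15
HOURS_PER_MONTH = 160   # 8 hrs/day * 5 days/week * 4 weeks

effort = round(PRODUCTIVITY * KLOC ** PENALTY)   # ~19 person-months
duration = int(2.5 * effort ** 0.38)             # ~7 months
people = math.ceil(effort / duration)            # ~3 people

monthly_cost = HOURLY_RATE_RM * HOURS_PER_MONTH  # RM 2,400 per person-month
coding_cost = monthly_cost * duration * people   # RM 50,400 total
```

Note that without the intermediate rounding the duration comes out closer to 7.7 months, so the final figures are sensitive to when rounding is applied.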

Conclusion

Software estimation is simply a useful tool for businesses. It makes sense to use
approaches that help address the challenges the business faces in its estimating and
that produce the best estimates that can possibly be made.

Chapter 5: Quality Assurance Plan

According to IEEE, Quality assurance is a planned and systematic pattern of all actions
necessary to provide adequate confidence that an item or product conforms to established
technical requirements.
Software quality assurance means assuring that there are no defects in the software
and fixing any that exist. Project managers and their teams must ensure that the product
is defect-free before launching it in the market.

SQAP Purpose
The software quality assurance plan (SQAP) defines the activities performed to provide
assurance that the software-related items delivered to the customer conform to the
established and contracted technical requirements. The SQAP also describes how the
project will be audited to ensure that the policies, standards, practices, procedures, and
processes applicable to the project are followed.

Scope of Software quality assurance plan

The scope of this SQAP includes activities, assessments and measurements of the new
system's life cycle processes, from requirement gathering to testing and maintenance,
to provide quality control and set up the required standards. (Tian, 2009)

Referenced Documents

IEEE 730, Standard for Software Quality Assurance Plans

IEEE 828, Standard for Software Configuration Management Plans

IEEE 829, Standard for Software Test Documentation

IEEE 830, Recommended Practice for Software Requirement Specification

IEEE 1008, Standard for Software Unit Testing

IEEE 1012, Standard for Software Verification and Validation Plans

IEEE 1016, Guide to Software Design Description

IEEE 1028, Standard for Software Review and Audit

IEEE 1042, Guide to Software Configuration Management Plans

IEEE 1044, Standard Classification for Software Anomalies

IEEE 1045, Standard for Software Productivity Metrics

IEEE 1058.1, Standard for Software Project Management Plans

IEEE 1059, Guide for Software Verification and Validation Plans

IEEE 1061, Standard for a Software Quality Metrics Methodology

IEEE 1063, Standard for Software User Documentation

IEEE 1074, Standard for Developing SDLC Processes

IEEE 1219, Standard for Software Maintenance

IEEE 1233, Guide to Developing System Requirement Specifications

Management
Project Organization

The Smart Medical System (S.M.S) project is organized as follows:

S.M.S Project Manager
S.M.S Quality Assurance Team
Testers
Developers

Roles and Responsibilities

Project Manager:
- Developing the project plan
- Managing the project schedule
- Managing the project budget
- Managing the project stakeholders
- Managing the project team
- Managing the project risk

Quality Assurance Team:
- Implementing the software quality plan
- Checking project quality and software quality assurance
- Reporting quality assurance problems to the development team
- Working with the PM and QM on the development process

Developers:
- Managing and coordinating human resources

Testers:
- Testing the end product of the proposed project

Documentation
Documenting projects is crucial, as it records what was done in the project for future use. This part identifies the documents produced for development purposes and for verification, validation and software maintenance.

Software Requirement Specification (SRS)


A software requirements specification is a document containing a full description of the purpose and environment of the software under development. The software requirement specification describes what the software will do and how it will be expected to perform. (Tian, 2009)

SRS purpose
The software requirement specification captures all the requirements needed for the S.M.S software development, which in turn guides the design of the software and, finally, its implementation. Besides the functional requirements, it also covers the non-functional system requirements.

SRS content
Functional Requirement: Smart Medical System
Name: Alert on Location
Description: Call the emergency line
Actor(s): Users, operators
Assumption: The call will be redirected to the appropriate personnel
Precondition(s): Tap the emergency button
Post condition(s): Call the emergency line
Primary (Happy) Path(s): NFC
Alternate Pathway(s): None
Exception Pathway(s): None

Nonfunctional requirement

The location of the patient should be encrypted.

Software Quality Attributes

Software Design Description


The Software Design Description (SDD) follows an IEEE standard that specifies an organizational structure for describing a software design. An SDD is a document used to specify the system architecture and application design in a software-related project.

Architecture Design Documents


ADD purpose

The Architectural Design Document (ADD) describes the basic system design for the software to be built during the S.M.S project. It decomposes the software into components, each described in terms of its external interfaces and its dependencies on other components, so that the programmers can work in parallel in the next phase of the project.

ADD content

Use Case for Smart Medical System S.M.S

Software Verification and Validation Plan


Validation testing checks whether the software fulfils its intended purpose, while verification checks whether the output of each development stage satisfies the conditions planned for it. The review method will be used for the verification and validation of the Smart Medical System (SMS) application. The verification and validation plan also reviews whether the software records are acceptable and complete. This documentation does not yet exist within the team, but the phases below are provided for validating and verifying the generated work products.

Phases of Inspection for Smart Medical System

1. Planning
2. Overview Meeting
3. Preparation
4. Inspection Meeting
5. Rework
6. Follow-up
Software Verification and Validation Report
This report will describe the results obtained from carrying out the Software Verification and Validation Plan (SVVP). All the activities from the SVVP will then be documented using the relevant standards.

Configuration Management Plan


CMP Purpose
The purpose of the Configuration Management Plan is to describe how configuration management (CM) will take place during the project lifecycle. This includes documenting how CM is managed, roles and responsibilities, how configuration item (CI) changes are made, and how all aspects of CM are communicated to project stakeholders. Without a documented configuration management plan it is likely that CIs will be wasted, left unfinished, or that redundant work is done because of a lack of version and document control. (Tian, 2009)

CMP Content

Configuration Identification
Configuration identification (CI) involves identifying the components of the Smart Medical System (SMS), uniquely identifying the individual components, and making them accessible in some form. A proper configuration identification schema identifies each component of the network and provides traceability between each component and its configuration status information.

Roles and Responsibilities


Configuration Control Team:
The core responsibility of this team is to analyze the configuration documents, accept or reject them, and observe the configuration status.
Configuration Management Team:
This team is mainly responsible for maintaining the configuration management items as well as keeping the configuration management group up to date.

Configuration Control
Includes the evaluation of all change requests and change proposals, and their
subsequent approval or disapproval. It is the process of controlling modifications
to the system's design, hardware, firmware, software, and documentation.

Standards, Practices, Conventions, and Metrics


Documentation Standards
SQA always requires a standard with complete, well-explained documentation. Once all the documents have been prepared, they need to be approved by the project manager and the SQA team. A complete, standard document delivers a better presentation and helps the developer track progress easily. The standards for SMS documentation are to use the Times New Roman font for clarity and 1.5 line spacing.
Coding and Naming Standards
The Software Quality Assurance team is responsible for handling these standards. Once the development team has accepted them, coding proceeds in the development department. Moreover, a coding standard gives the programmer a better basis for recovery, modification and readability. (Tian, 2009)

User Interface Standards


The SMS development team has to ensure that the application layout follows a consistent structure. The user interface must be user friendly, presenting a stable and clear design of interfaces.

Software Metrics
Software metrics are used to measure software. Their first use is to help us plan and predict software development, and they help to control software quality and work progress more efficiently. For the Smart Medical System, source lines of code (SLOC) are used to identify and measure the size of the project. Additionally, metrics such as the number of faults are used to measure the complexity of the new system.
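As an illustration of how SLOC might be counted for SMS source files, here is a minimal sketch; the rule of skipping blank and comment-only lines is an assumption of this sketch, not a fixed standard:

```python
def count_sloc(source: str) -> int:
    """Count statement lines of code: non-blank lines that are not comment-only."""
    count = 0
    for line in source.splitlines():
        stripped = line.strip()
        # Skip whitespace-only lines and lines that contain only a comment.
        if stripped and not stripped.startswith("#"):
            count += 1
    return count

sample = """
# a comment
x = 1

y = x + 1
"""
print(count_sloc(sample))  # 2
```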

Review and audit


Purpose
Reviews and audits assist the project manager in assuring that the project meets user requirements, and in correcting weaknesses that arise while the system is being developed. According to the IEEE standard for software reviews and audits, the following types of software review will be conducted.
Deliverables Reviews
Software Requirement Review
A software requirement review is done by a group of people who read and analyze the requirements, look for problems, meet to discuss the problems, and agree on actions to address them. The purpose of this review is to certify that the requirements document is an acceptable description of the system to be implemented. (Zelichover, 2009)
The review checklist will be as follows:
1. Completeness
Are all requirements written at a consistent and appropriate level of detail? Does the SRS include all of the known customer or system needs? Are all the tasks the user wants specified?

2. Consistency
This review will ensure internal consistency between the software requirements and check whether the SRS is compatible with the operational environment of the hardware and software. (NAIK, 2008)
3. Compatibility
The compatibility review will ensure that the interface requirements enable compatibility of external interfaces. (NAIK, 2008)
4. Correctness
Is each requirement written in clear, unambiguous language? Is each requirement in the scope of the project? Can all of the requirements be implemented within known constraints?

System Architecture Review

This review will be conducted in SMS to ensure that the architecture of the system is documented and that the chosen technology and design are likely to achieve the project's goals and objectives.

The outcome of an architectural review is a detailed Architecture Review Document. This contains very detailed information on the points observed in the system, the recommendations, and the mitigations. Generally, the teams who are going to make those changes consume this document.
Software Verification, Validation and Configuration Plan Review
The development team must ensure, before delivering the system, that all the tests, reviews, approvals and acceptance have been done and properly documented. Without verification, system delivery remains incomplete.
Management Review
Management review meetings are designed to ensure that all quality-related functions are reviewed at the highest possible level, and that all levels of management affecting quality are made aware of changes, updates, revisions, verification activities and policies.

In the SMS system's case, it is the supervisor's responsibility to conduct regular management review meetings and ensure that this procedure is carried out.

Product Reviews
Product reviews gather comments about the product to make sure it has been developed according to the agreed standards. A user review will also assess the Smart Medical System product: its end-users test and evaluate it, and once the customer can use it without any problems, the product will be officially released.

Testing
Testing is the key activity that must be carried out for the system to be successful. To maximize the usability and efficacy of the system, numerous tests must be carried out before the final deliverable.
Unit Testing:
Unit testing involves taking small, testable portions of the application's software, isolating each from the remaining code, and determining whether it behaves as expected. Each unit can be tested separately before the units are integrated into modules to test the interfaces. Unit testing has shown its efficiency, since a large percentage of defects can be identified during its use.
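A minimal sketch of unit testing with Python's unittest; the emergency-number check is a hypothetical SMS helper invented for this example, not code from the actual system:

```python
import unittest

def is_valid_emergency_number(number: str) -> bool:
    # Hypothetical SMS helper: accept only digit strings of 3 to 11 characters.
    return number.isdigit() and 3 <= len(number) <= 11

class TestEmergencyNumber(unittest.TestCase):
    # The unit is tested in isolation from the rest of the code.
    def test_accepts_short_emergency_line(self):
        self.assertTrue(is_valid_emergency_number("999"))

    def test_rejects_non_numeric_input(self):
        self.assertFalse(is_valid_emergency_number("99a"))

if __name__ == "__main__":
    unittest.main(exit=False)  # exit=False keeps the script alive after the run
```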

Integration Testing:
Integration testing is the logical extension of unit testing. Two units that have previously passed unit testing are joined into a single component, and the interface between them is tested. The various components are then combined to make sure that they function properly as per the requirements.
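The same idea can be sketched for integration testing: two hypothetical units (a patient locator and an alert builder, both invented for this example) are joined, and the interface between them is exercised:

```python
import unittest

# Two units assumed to have already passed unit testing individually.
def locate_patient(patient_id: int) -> dict:
    # Hypothetical locator returning a location record for a patient.
    return {"patient_id": patient_id, "lat": 3.055, "lon": 101.700}

def build_alert(location: dict) -> str:
    # Hypothetical formatter that consumes the locator's output.
    return f"ALERT patient={location['patient_id']} at ({location['lat']}, {location['lon']})"

class TestAlertIntegration(unittest.TestCase):
    def test_locator_output_feeds_alert_builder(self):
        # The interface between the two units is what is under test here.
        alert = build_alert(locate_patient(42))
        self.assertIn("patient=42", alert)

if __name__ == "__main__":
    unittest.main(exit=False)
```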

Risk Management

Project risk management is the art and science of identifying, analyzing, and responding to risk throughout the life of a project, in the best interests of meeting project objectives. Risk management is often overlooked in projects, but it can help improve project success by helping to select good projects, determine project scope, and develop realistic estimates. (Pressman, 2010)
The following is the risk management plan:
Risk Identification
Risk identification is the process of understanding what potential
events might hurt or enhance a particular project.
Risk identification tools and techniques include:

Brainstorming

The Delphi Technique

Interviewing

SWOT analysis

The possible risks of the Smart Medical System are as follows:

Risk Quantification
Once the risks have been identified, they must be prioritized. Risk quantification is important because the team cannot take action against every risk. The risks are classified according to their impact on the project and their probability of occurring within it; a probability/impact matrix is therefore used to rank them.

Probability/Impact Matrix
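The ranking a probability/impact matrix produces can be sketched as follows; the risks and their scores are illustrative, not taken from the SMS risk register:

```python
# Ranking risks by probability x impact; the entries below are illustrative.
risks = [
    ("Server downtime",      0.3, 5),  # (name, probability 0-1, impact 1-5)
    ("Requirement changes",  0.7, 3),
    ("Key developer leaves", 0.2, 4),
]

# A higher probability x impact score means the risk needs a response plan first.
ranked = sorted(risks, key=lambda r: r[1] * r[2], reverse=True)
for name, prob, impact in ranked:
    print(f"{name}: score {prob * impact:.2f}")
# Requirement changes ranks first (0.7 * 3 = 2.10)
```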

Risk Response Plan


The risk response plan is the direct action plan to be taken by the team if any of the identified risks occurs during the project lifetime.
Risk Mitigation Plan
The risk mitigation plan is mainly concerned with taking the required actions to reduce the probability and impact of the risks.

Risk Configuration and Control:


The risk configuration and control plan is concerned with monitoring the risks, and with identifying new risks, throughout the whole life of the project. For the Smart Medical System, the various risks will be analyzed, and management will be informed if any modification happens to the risk status.

Tools and Practices


For SMS, the project manager decided to use interviews as the tool to identify the risks. Interviewing is a fact-finding technique for collecting information in face-to-face, phone, e-mail, or instant-messaging discussions. Interviewing people with similar project experience is an important tool for identifying potential risks. (Pressman, 2010)

Glossary

SMS - Smart Medical System
SQAP - Software Quality Assurance Plan
CCB - Change Control Board
CI - Configuration Item
CM - Configuration Management
SAD - Software Architecture Design
SRS - Software Requirement Specification
SQA - Software Quality Assurance
VVP - Verification and Validation Plan
NFC - Near Field Communication
ADD - Architectural Design Document

Chapter 6: Metrics

Software metrics are used to measure software. Their first use is to help us plan and predict software development; they help to control software quality and work progress more efficiently. Software product metrics are measures of software products such as source code and design documents, while software process metrics are measures of the software development process. For example, the size of the software is a measure of the software product itself, and thus a product metric; the effort required to design a software system may be influenced by how the software is developed, and is thus a process metric. (Li, 2002)

More broadly, software metrics can be classified into three categories: product metrics, process metrics, and project metrics.

Product metrics
Product metrics focus on the quality of the deliverables. They can be combined across several projects to produce process metrics, and they include measures of the analysis model and the complexity of the design model.

Intrinsic product quality is usually measured by the number of bugs (functional defects) in the software, or by how long the software can run before encountering a crash. In operational terms, the two metrics are defect density (rate) and mean time to failure (MTTF).
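Both metrics can be computed directly; the defect count, code size and failure times below are illustrative figures, not SMS measurements:

```python
# Defect density (defects per KLOC) and mean time to failure (MTTF).
defects_found = 24
size_loc = 12_000                          # lines of code
hours_between_failures = [120.0, 200.0, 160.0]

defect_density = defects_found / (size_loc / 1000)  # defects per KLOC
mttf = sum(hours_between_failures) / len(hours_between_failures)

print(f"Defect density: {defect_density:.1f} defects/KLOC")  # 2.0
print(f"MTTF: {mttf:.1f} hours")                             # 160.0
```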

Process metrics
These metrics are used to help in decision-making, and they are intended to provide indicators that lead to long-term software process improvement. The only way to know how and where to improve any process is to:
1. Measure specific attributes of the process.
2. Develop a set of meaningful metrics based on these attributes.
3. Use the metrics to provide indicators that will lead to a strategy for improvement. (Unknown, 2010)

Size-oriented Metrics

These measure the size of the software produced:
LOC - Lines of Code
KLOC - 1000 Lines of Code
SLOC - Statement Lines of Code (ignores whitespace)

Typical measures: Errors/KLOC, Defects/KLOC, Cost/LOC, Documentation Pages/KLOC

LOC Metrics

LOC metrics are easy to use and easy to compute, but they are language- and programmer-dependent. Counting conventions vary:
Count only executable lines.
Count executable lines plus data definitions.
Count executable lines, data definitions, and comments.
Count executable lines, data definitions, comments, and job control language.
Count lines as physical lines on an input screen.
Count lines as terminated by logical delimiters.

Complexity Metrics
LOC is a function of complexity, and is language- and programmer-dependent.
Halstead's Software Science (entropy measures) uses four basic counts:
n1 - number of distinct operators
n2 - number of distinct operands
N1 - total number of operators
N2 - total number of operands
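From these four counts, Halstead's standard derived measures (vocabulary, length, volume, difficulty, effort) can be computed; the counts below are for a hypothetical code fragment:

```python
import math

# Halstead counts for a hypothetical code fragment.
n1 = 10   # distinct operators
n2 = 15   # distinct operands
N1 = 40   # total operators
N2 = 60   # total operands

vocabulary = n1 + n2                     # n  = n1 + n2 = 25
length = N1 + N2                         # N  = N1 + N2 = 100
volume = length * math.log2(vocabulary)  # V  = N * log2(n)
difficulty = (n1 / 2) * (N2 / n2)        # D  = (n1/2) * (N2/n2) = 20
effort = difficulty * volume             # E  = D * V

print(f"Volume: {volume:.1f}")  # about 464.4
```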

Measurement
It is defined as a quantitative indication of the extent, amount, dimension, capacity, or size of some attribute of a product or process. (IEEE, 2004)
The goal of software measurement is to build and validate hypotheses and increase the
body of knowledge about software engineering. This body of knowledge can be used to
understand, monitor, control, and improve software processes and products. Therefore,
building measures is a necessary part of measurement, but not its final goal. (Morasca,
2011)

Software measurement has two types:

Direct measurement
Indirect measurement

Direct measurement relates an attribute to a number or symbol without reference to any other object or attribute (e.g., height). Indirect metrics are usually used when an attribute must be measured by combining several of its aspects (e.g., density); this requires a model of how the measures are related to each other.

Measures of Software Quality

Correctness: the degree to which a program operates according to its specification. Measured by Defects/KLOC (a defect is a verified lack of conformance to requirements) and by failures per hours of operation. (H. Kan, 2006)
Maintainability: the degree to which a program is open to change. Measured by mean time to change, change requests for new versions (analysis, design), and the cost to correct. (H. Kan, 2006)
Integrity: the degree to which a program is resistant to outside attack. Measured by fault tolerance, security and threats. (H. Kan, 2006)
Usability: ease of use. Measured by training time, the skill level necessary to use the system, the increase in productivity, and subjective questionnaires or controlled experiments. (H. Kan, 2006)

Conclusion and Recommendation


In conclusion, this document is proposed for the new system in order to help the developers improve the quality of the system. The researcher proposed the McCall model as a solution, as it attempts to bridge the gap between users and developers by focusing on a number of software quality factors that reflect both the users' views and the developers' priorities. The researcher also described a proven way to increase the quality of the system, the SQA plan: a set of activities designed to evaluate the process by which the products are developed or manufactured. The importance of the SQAP is that it describes the review and audit activities, monitors the process, detects inadequacies, settles disputes, and manages risks. Besides that, this document handled system estimation, as it is very important for a project to assure that all resources are being used efficiently and wisely, and that there is no lack of resources at any point throughout the project.

Software quality does not come for free. It has to be actively pursued. The use of a well-defined model of the software development process and good analysis, design and implementation techniques are a first prerequisite for the proposed system (S.M.S). However, quality must also be controlled and managed. To be able to do so, it has to be defined rigorously. There exist numerous taxonomies of quality attributes. For each, a precise definition has been given, together with a metric that can be used to state quality goals and to check that the quality goals are indeed being satisfied.

As a recommendation, focus on software quality planning: V&V focuses on the quality of products, while QA focuses on the quality of processes. Most of the time, process and product problems go unnoticed, so using quality assessments such as CMMI/ISO 9000 assessments would really help to overcome this problem, as would empowering and embracing QA activities and learning to use walkthroughs, inspections, audits and reviews effectively.

Tailored product and process measures should be used, since what gets measured gets managed:
Process: number of reviews, audits, inspections
Product: internal, external, quality in use
Project: earned value