
Software Metrics
Software Engineering

Definitions

- Measure: a quantitative indication of the extent, amount, dimension, capacity, or size of some attribute of a product or process.
  Example: the number of errors.
- Metric: a quantitative measure of the degree to which a system, component, or process possesses a given attribute; a handle or guess about a given attribute.
  Example: the number of errors found per person-hour expended.

Why Measure Software?

- Determine the quality of the current product or process
- Predict qualities of a product/process
- Improve the quality of a product/process

Example Metrics

- Defect rates
- Error rates
- Measured by: individual, module, during development
- Errors should be categorized by origin, type, and cost

Metric Classification

- Products: explicit results of software development activities; deliverables, documentation, by-products
- Processes: activities related to the production of software
- Resources: inputs into the software development activities; hardware, knowledge, people

Product vs. Process

- Process metrics: insights into the process paradigm, software engineering tasks, work products, or milestones; lead to long-term process improvement
- Product metrics: assess the state of the project; track potential risks; uncover problem areas; adjust workflow or tasks; evaluate the team's ability to control quality

Types of Measures

- Direct measures (internal attributes): cost, effort, LOC, speed, memory
- Indirect measures (external attributes): functionality, quality, complexity, efficiency, reliability, maintainability

Size-Oriented Metrics

- Size of the software produced
- Lines of Code (LOC); 1000 lines of code = 1 KLOC
- Effort measured in person-months
- Errors/KLOC, Defects/KLOC, Cost/LOC, Documentation pages/KLOC
- LOC is programmer- and language-dependent

LOC Metrics

- Easy to use and easy to compute
- LOC of existing systems can be computed, but cost and requirements traceability may be lost
- Language- and programmer-dependent

Function-Oriented Metrics

- Function Point Analysis [Albrecht '79, '83]
- International Function Point Users Group (IFPUG)
- An indirect measure
- Derived using empirical relationships based on countable (direct) measures of the software system (domain and requirements)

Computing Function Points

Count the following:

- Number of user inputs: distinct input from the user
- Number of user outputs: reports, screens, error messages, etc.
- Number of user inquiries: on-line input that generates some result
- Number of files: logical files (database)
- Number of external interfaces: data files/connections used as interfaces to other systems

Compute Function Points

FP = Total Count * [0.65 + 0.01 * Sum(Fi)]

- Total Count is the sum of all the counts, each multiplied by a weighting factor that is determined for each organization via empirical data
- Fi (i = 1 to 14) are the complexity adjustment values
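The formula is easy to mechanize. Below is a minimal Python sketch; the weights are the classic IFPUG "average" weights (an assumption here, since the slides note that each organization calibrates its own factors from empirical data), and the counts are hypothetical.

    # Minimal sketch of the FP computation above. AVG_WEIGHTS are assumed
    # IFPUG "average" weights, not organization-calibrated values.
    AVG_WEIGHTS = {
        "inputs": 4, "outputs": 5, "inquiries": 4, "files": 10, "interfaces": 7,
    }

    def function_points(counts, fi):
        """counts: the five raw counts; fi: the 14 adjustment values (0-5)."""
        assert len(fi) == 14
        total_count = sum(counts[k] * w for k, w in AVG_WEIGHTS.items())
        return total_count * (0.65 + 0.01 * sum(fi))

    # Hypothetical system with every adjustment factor rated 3 ("average"):
    counts = {"inputs": 10, "outputs": 7, "inquiries": 5, "files": 4, "interfaces": 2}
    print(function_points(counts, [3] * 14))   # 149 * 1.07 = 159.43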

Complexity Adjustment

The 14 Fi values answer the following questions:

1. Does the system require reliable backup and recovery?
2. Are data communications required?
3. Are there distributed processing functions?
4. Is performance critical?
5. Will the system run in an existing, heavily utilized operational environment?
6. Does the system require on-line data entry?
7. Does the on-line data entry require the input transaction to be built over multiple screens or operations?
8. Are the master files updated on-line?
9. Are the inputs, outputs, files, or inquiries complex?
10. Is the internal processing complex?
11. Is the code designed to be reusable?
12. Are conversions and installations included in the design?
13. Is the system designed for multiple installations in different organizations?
14. Is the application designed to facilitate change and ease of use by the user?

Using FP

- Errors per FP
- Defects per FP
- Cost per FP
- Pages of documentation per FP
- FP per person-month

FP and Languages

    Language    LOC/FP
    Assembly     320
    C            128
    COBOL        106
    FORTRAN      106
    Pascal        90
    C++           64
    Ada           53
    VB            32
    SQL           12
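One illustrative use of this table (a sketch, not an IFPUG procedure) is "backfiring": turning an FP count into a rough LOC estimate per language. The FP value below is hypothetical.

    # Sketch: estimating delivered LOC from FP via the table above.
    LOC_PER_FP = {"Assembly": 320, "C": 128, "COBOL": 106, "FORTRAN": 106,
                  "Pascal": 90, "C++": 64, "Ada": 53, "VB": 32, "SQL": 12}

    fp = 120                                  # hypothetical measured FP count
    for lang in ("Assembly", "C", "C++", "SQL"):
        print(f"{lang:8s} ~{fp * LOC_PER_FP[lang]:6d} LOC")
    # The same 120-FP system is ~38400 LOC in Assembly but ~1440 in SQL,
    # which is why FP-based productivity comparisons are language-neutral.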

Using FP

- FP- and LOC-based metrics have been found to be relatively accurate predictors of effort and cost
- A baseline of historical information is needed to use them properly
- Language independent (unlike LOC)
- Productivity factors: people, problem, process, product, and resources
- FP cannot easily be reverse engineered from existing systems

Complexity Metrics

- LOC: a function of complexity; language- and programmer-dependent
- Halstead's Software Science (entropy measures):
  - n1: number of distinct operators
  - n2: number of distinct operands
  - N1: total number of operators
  - N2: total number of operands

Example

    if (k < 2)
    {
      if (k > 3)
        x = x*k;
    }

- Distinct operators: if ( ) { } > < = * ;
- Distinct operands: k 2 3 x
- n1 = 10, n2 = 4, N1 = 13, N2 = 7

Halstead's Metrics

- Amenable to experimental verification [1970s]
- Length: N = N1 + N2
- Vocabulary: n = n1 + n2
- Estimated length: N^ = n1 log2 n1 + n2 log2 n2
  - A close estimate of length for well-structured programs
- Purity ratio: PR = N^/N
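These definitions mechanize directly. A small Python sketch, fed the four counts from the example above (in practice, classifying tokens into operators and operands is the hard part):

    import math

    def halstead(n1, n2, N1, N2):
        N = N1 + N2                                        # length
        n = n1 + n2                                        # vocabulary
        N_hat = n1 * math.log2(n1) + n2 * math.log2(n2)    # estimated length
        return N, n, N_hat, N_hat / N                      # ..., purity ratio

    # The four counts from the if (k < 2) example above:
    N, n, N_hat, PR = halstead(n1=10, n2=4, N1=13, N2=7)
    print(N, n, round(N_hat, 2), round(PR, 2))             # 20 14 41.22 2.06

PR is far from 1 here because the fragment is tiny; the estimated length tracks N closely only for well-structured programs of realistic size.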

Program Complexity

- Volume: V = N log2 n
  - The number of bits needed to provide a unique designator for each of the n items in the program vocabulary
- Program effort: E = V/L, where L = V*/V and V* is the volume of the most compact implementation of the design
- E is a good measure of program understandability

McCabe's Complexity Measures

- McCabe's metrics are based on a control flow representation of the program
- A program graph is used to depict control flow
- Nodes represent processing tasks (one or more code statements)
- Edges represent control flow between nodes

Flow Graph Notation

[Figure: flow-graph fragments for the basic constructs: sequence, if-then-else, while, and until]

Cyclomatic Complexity

- The set of independent paths through the graph (the basis set)
- V(G) = E - N + 2, where E is the number of flow graph edges and N is the number of nodes
- V(G) = P + 1, where P is the number of predicate nodes

Example Flow Graph

    i = 0;
    while (i < n-1) do
      j = i + 1;
      while (j < n) do
        if A[i] < A[j] then
          swap(A[i], A[j]);
      end do;
      i = i + 1;
    end do;

[Figure: the corresponding flow graph, with nodes numbered 1-7]

Computing V(G)

- V(G) = 9 - 7 + 2 = 4
- V(G) = 3 + 1 = 4
- Basis set:
  - 1, 7
  - 1, 2, 6, 1, 7
  - 1, 2, 3, 4, 5, 2, 6, 1, 7
  - 1, 2, 3, 5, 2, 6, 1, 7

Another Example

[Figure: a flow graph with nodes numbered 1-9]

What is V(G)?
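The first example can be checked mechanically. A sketch assuming the edge list implied by the basis paths above (nodes 1-7, with 1, 2, and 3 as the predicate nodes):

    # Edges of the example flow graph, read off the basis paths above.
    edges = [(1, 2), (1, 7), (2, 3), (2, 6), (3, 4), (3, 5),
             (4, 5), (5, 2), (6, 1)]
    nodes = {n for edge in edges for n in edge}
    predicates = {1, 2, 3}                 # nodes with two outgoing edges

    print(len(edges) - len(nodes) + 2)     # V(G) = E - N + 2 = 9 - 7 + 2 = 4
    print(len(predicates) + 1)             # V(G) = P + 1     = 3 + 1     = 4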

Meaning

- V(G) is the number of (enclosed) regions/areas of the planar graph
- The number of regions increases with the number of decision paths and loops
- A quantitative measure of testing difficulty and an indication of ultimate reliability
- Experimental data show that V(G) should be no more than 10; testing is very difficult above this value

McClure's Complexity Metric

- Complexity = C + V
  - C is the number of comparisons in a module
  - V is the number of control variables referenced in the module
- Similar to McCabe's, but with regard to control variables

Metrics and Software Quality

FURPS:

- Functionality: features of the system
- Usability: aesthetics, documentation
- Reliability: frequency of failure, security
- Performance: speed, throughput
- Supportability: maintainability

Measures of Software Quality

- Correctness: defects/KLOC (a defect is a verified lack of conformance to requirements); failures/hours of operation
- Maintainability: mean time to change; change request to new version (analyze, design, etc.); cost to correct
- Integrity: fault tolerance, security & threats
- Usability: training time, skill level necessary to use, increase in productivity, subjective questionnaire or controlled experiment

Quality Model

[Figure: quality-factor tree. The product splits into operation, revision, and transition; the factors shown are reliability, efficiency, usability, maintainability, testability, portability, and reusability.]

High-Level Design Metrics

- Structural complexity
- Data complexity
- System complexity
- Card & Glass '80

Structural Complexity

- Structural complexity S(i) of a module i: S(i) = fout(i)^2
- Fan-out is the number of modules immediately subordinate to i (directly invoked)

Design Metrics

- Data complexity D(i):
  - D(i) = v(i)/[fout(i) + 1]
  - v(i) is the number of inputs and outputs passed to and from module i
- System complexity C(i):
  - C(i) = S(i) + D(i)
- As each increases, the overall complexity of the architecture increases

System Complexity Metric

- Another metric: length(i) * [fin(i) + fout(i)]^2
  - Length is LOC; fan-in is the number of modules that invoke i
- Graph-based alternatives:
  - Nodes + edges
  - Modules + lines of control
  - Depth of tree, arc-to-node ratio
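A sketch of the S(i), D(i), C(i) measures above, applied to a small hypothetical architecture in which each module is described by its fan-out and its I/O count v(i):

    # Hypothetical module descriptions: fan-out and v(i).
    modules = {
        "main":   {"fout": 3, "v": 2},
        "parse":  {"fout": 1, "v": 4},
        "eval":   {"fout": 2, "v": 6},
        "report": {"fout": 0, "v": 3},
    }

    for name, m in modules.items():
        s = m["fout"] ** 2                   # S(i) = fout(i)^2
        d = m["v"] / (m["fout"] + 1)         # D(i) = v(i)/[fout(i) + 1]
        print(f"{name:8s} S={s} D={d:.2f} C={s + d:.2f}")  # C(i) = S(i) + D(i)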

Coupling

- Data and control flow:
  - di: input data parameters
  - ci: input control parameters
  - do: output data parameters
  - co: output control parameters
- Global:
  - gd: global variables for data
  - gc: global variables for control
- Environmental:
  - w: fan-out, the number of modules called
  - r: fan-in, the number of modules that call this module

Metrics for Coupling

Mc = k/m, with k = 1 and

m = di + a*ci + do + b*co + gd + c*gc + w + r

- a, b, c, and k can be adjusted based on actual data
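A sketch of Mc = k/m with the tuning constants a, b, c, and k all left at 1 (the slides say they are fit to actual data; the module's counts below are hypothetical):

    def module_coupling(di, ci, do, co, gd, gc, w, r, a=1, b=1, c=1, k=1):
        # m = di + a*ci + do + b*co + gd + c*gc + w + r
        m = di + a * ci + do + b * co + gd + c * gc + w + r
        return k / m

    # 2 data params in, 1 control param in, 1 data param out, 1 global
    # datum, calls 2 modules, called by 1:
    print(module_coupling(di=2, ci=1, do=1, co=0, gd=1, gc=0, w=2, r=1))  # 1/8

Since every coupling path adds to m, a higher Mc indicates a less coupled module.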

Component-Level Metrics

- Cohesion (internal interaction)
  - Difficult to measure
  - Bieman '94, TSE 20(8): data slice from a program slice
- Coupling (external interaction)
- Complexity of program flow

Using Metrics

The process:

- Select appropriate metrics for the problem
- Utilize the metrics on the problem
- Assessment and feedback:
  - Formulate
  - Collect
  - Analyze
  - Interpret
  - Feedback

Metrics for the Object-Oriented

- Chidamber & Kemerer '94, TSE 20(6)
- Metrics specifically designed to address object-oriented software
- Class-oriented metrics
- Direct measures

Weighted Methods per Class

WMC = Sum(ci), i = 1 to n

- ci is the complexity (e.g., volume, cyclomatic complexity, etc.) of each of the class's n methods
- Must normalize
- What about inherited methods? Be consistent.
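A sketch of WMC for Python classes, assuming ci is approximated as 1 + the number of branch points in each method (a common cyclomatic shortcut; the slides leave the choice of ci open):

    import ast

    def wmc(source, class_name):
        tree = ast.parse(source)
        cls = next(n for n in ast.walk(tree)
                   if isinstance(n, ast.ClassDef) and n.name == class_name)
        total = 0
        for fn in (n for n in cls.body if isinstance(n, ast.FunctionDef)):
            branches = sum(isinstance(n, (ast.If, ast.For, ast.While))
                           for n in ast.walk(fn))
            total += 1 + branches                # c_i for this method
        return total

    src = "class C:\n    def a(self):\n        if True: pass\n    def b(self): pass\n"
    print(wmc(src, "C"))                         # c_a = 2, c_b = 1 -> WMC = 3

Note that only methods declared in the class body are counted here, which is one consistent answer to the inherited-methods question above.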

Depth of Inheritance Tree

- DIT is the maximum length from a node to the root (base class)
- Lower-level subclasses inherit a number of methods, making behavior harder to predict
- However, more methods are reused in higher-DIT trees

Number of Children

- NOC is the number of subclasses immediately subordinate to a class
- As NOC grows, reuse increases
- But the abstraction may be diluted

Coupling between Classes

- CBO is the number of collaborations between a class and other classes
- As collaboration increases, reuse decreases
- A CRC card (Classes, Responsibilities, and Collaborations) lists the number of collaborations

Response for a Class

- RFC is the number of methods that could be called in response to a message to a class
- Testing effort increases as RFC increases

Lack of Cohesion in Methods

- LCOM is poorly described in Pressman
- Consider a class Ck with n methods M1, ..., Mn
- Ij is the set of instance variables used by method Mj; there are n such sets I1, ..., In
- P = {(Ii, Ij) | Ii ∩ Ij = ∅}
- Q = {(Ii, Ij) | Ii ∩ Ij ≠ ∅}
- If all n sets Ii are ∅, then P = ∅
- LCOM = |P| - |Q| if |P| > |Q|; LCOM = 0 otherwise

Example

- Take a class C with methods M1, M2, M3
- I1 = {a, b, c, d, e}
- I2 = {a, b, e}
- I3 = {x, y, z}
- P = {(I1, I3), (I2, I3)}
- Q = {(I1, I2)}
- Thus LCOM = |P| - |Q| = 2 - 1 = 1

LCOM Explanation

- LCOM is the number of empty intersections minus the number of non-empty intersections
- This is a notion of the degree of similarity of methods
- If two methods use common instance variables, then they are similar
- An LCOM of zero (reached whenever |P| = |Q| or |P| < |Q|) does not mean the class is maximally cohesive
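The worked example translates directly into code. A sketch assuming the instance-variable sets are already known:

    from itertools import combinations

    def lcom(var_sets):
        """LCOM = |P| - |Q| if |P| > |Q| else 0."""
        p = q = 0
        for a, b in combinations(var_sets, 2):
            if a & b:                # non-empty intersection -> Q
                q += 1
            else:                    # empty intersection -> P
                p += 1
        return p - q if p > q else 0

    # The example above: I1 and I2 overlap; I3 is disjoint from both.
    I1 = {"a", "b", "c", "d", "e"}
    I2 = {"a", "b", "e"}
    I3 = {"x", "y", "z"}
    print(lcom([I1, I2, I3]))        # |P| = 2, |Q| = 1 -> LCOM = 1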

Class Size

- CS:
  - Total number of operations (inherited, private, public)
  - Number of attributes (inherited, private, public)
- A large CS may be an indication of too much responsibility for a class

Number of Operations Overridden

- NOO
- A large NOO indicates possible problems with the design
- Poor abstraction in the inheritance hierarchy

Number of Operations Added

- NOA: the number of operations added by a subclass
- As operations are added, the subclass moves farther away from its superclass
- As depth increases, NOA should decrease

Specialization Index

SI = [NOO * L] / Mtotal

- L is the level in the class hierarchy
- Mtotal is the total number of methods
- Higher values indicate a class in the hierarchy that does not conform to the abstraction

Method Inheritance Factor

MIF = Sum(Mi(Ci)) / Sum(Ma(Ci)), i = 1 to n

- Mi(Ci) is the number of methods inherited and not overridden in Ci
- Md(Ci) is the number of methods declared in Ci
- Ma(Ci) is the number of methods that can be invoked with Ci
- Ma(Ci) = Md(Ci) + Mi(Ci): all that can be invoked = new or overloaded + things inherited
- MIF is in [0, 1]
- MIF near 1 means little specialization; MIF near 0 means a large change
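A rough introspection-based sketch of MIF for Python classes; the dir()/vars() filtering below is an approximation (it ignores object's own methods entirely), not a definitive implementation.

    def declared(cls):               # Md(C): methods declared in C itself
        return {n for n, v in vars(cls).items() if callable(v)}

    def available(cls):              # Ma(C): methods invocable on C
        return {n for n in dir(cls)
                if callable(getattr(cls, n)) and not hasattr(object, n)}

    def mif(classes):
        mi = sum(len(available(c) - declared(c)) for c in classes)  # Sum Mi
        ma = sum(len(available(c)) for c in classes)                # Sum Ma
        return mi / ma

    class Base:
        def f(self): pass
        def g(self): pass

    class Sub(Base):
        def g(self): pass            # overrides g, inherits f

    print(mif([Base, Sub]))          # Mi = 0 + 1, Ma = 2 + 2 -> 0.25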

Coupling Factor

CF = Sum_i Sum_j is_client(Ci, Cj) / (TC^2 - TC)

- is_client(x, y) = 1 iff a relationship exists between the client class and the server class, 0 otherwise
- (TC^2 - TC) is the total number of relationships possible (Total Classes squared, minus the diagonal)
- CF is in [0, 1], with 1 meaning high coupling

Polymorphism Factor

PF = Sum_i Mo(Ci) / Sum_i [Mn(Ci) * DC(Ci)]

- Mn() is the number of new methods
- Mo() is the number of overriding methods
- DC() is the number of descendant classes of a base class
- PF is the number of methods that redefine inherited methods, divided by the maximum number of possible distinct polymorphic situations
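A sketch of CF over a hypothetical three-class system in which A uses B and C, and B uses C:

    from itertools import permutations

    def coupling_factor(classes, uses):
        """Sum is_client over all ordered pairs, divide by TC^2 - TC."""
        tc = len(classes)
        links = sum((ci, cj) in uses for ci, cj in permutations(classes, 2))
        return links / (tc * tc - tc)

    uses = {("A", "B"), ("A", "C"), ("B", "C")}
    print(coupling_factor(["A", "B", "C"], uses))   # 3 of 6 links -> 0.5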

Operation-Oriented Metrics

- Average operation size (LOC, volume)
- Number of messages sent by an operation
- Operation complexity: cyclomatic complexity
- Average number of parameters per operation
  - The larger the number, the more complex the collaboration

Encapsulation

- Lack of cohesion
- Percent public and protected
- Public access to data members

Inheritance

- Number of root classes
- Fan-in: multiple inheritance
- NOC, DIT, etc.

