Lecture 8: Specification and Validation
University of Toronto, Department of Computer Science
© 2000-2003, Steve Easterbrook

Last Week: Modeling and Analysis (III)
- Non-functional Requirements
- Measuring Software Quality

This Week: Communicating Requirements
- The Software Requirements Specification (SRS)
- Validation: Reviews, Inspections, etc.
- Requirements Prioritization

Next Week: Evolving Requirements
- Change management
- Inconsistency management
- Feature Interaction
- Product Families

Software Requirements Specification

How do we communicate the Requirements to others?
- It is common practice to capture them in an SRS
- But an SRS doesn't need to be a single paper document...

Purpose
- Communicates an understanding of the requirements
  - explains both the application domain and the system to be developed
- Contractual
  - may be legally binding!
  - expresses an agreement and a commitment
- Baseline for evaluating subsequent products
  - supports system testing, verification and validation activities
  - should contain enough information to verify whether the delivered system meets requirements
- Baseline for change control
  - requirements change, software evolves

Audience
- Users, Purchasers
  - most interested in system requirements
  - not generally interested in detailed software requirements
- Systems Analysts, Requirements Analysts
  - write various specifications that inter-relate
- Developers, Programmers
  - have to implement the requirements
- Testers
  - determine that the requirements have been met
- Project Managers
  - measure and control the analysis and development processes

SRS Contents
Source: Adapted from IEEE-STD-830

A Software Requirements Specification should address:
- Functionality. What is the software supposed to do?
- External interfaces. How does the software interact with people, the system's hardware, other hardware, and other software?
- Performance. What is the speed, availability, response time, recovery time of various software functions, and so on?
- Attributes. What are the portability, correctness, maintainability, security, and other considerations?
- Design constraints imposed on an implementation. Are there any required standards in effect, implementation language, policies for database integrity, resource limits, operating environment(s), and so on?

Some other topics should be excluded. The SRS:
- should avoid including either design or project requirements
- should not describe any design or implementation details; these belong in the design stage of the project
- should address the software product, not the process of producing the software product

Appropriate Specification
Source: Adapted from Blum 1992, p154-5

Consider two different projects:
A) Small project: 1 programmer, 6 months of work. The programmer talks to the customer, then writes up a 5-page memo.
B) Large project: 50 programmers, 2 years of work. A team of analysts model the requirements, then document them in a 500-page SRS.

                  | Project A                        | Project B
Purpose of spec?  | Crystallizes the programmer's    | Build-to document; must contain
                  | understanding; feedback to the   | enough detail for all the
                  | customer                         | programmers
Management view?  | Spec is irrelevant; resources    | Will use the spec to estimate
                  | have already been allocated      | resource needs and plan the
                  |                                  | development
Readers?          | Primary: spec author.            | Primary: all programmers + V&V
                  | Secondary: customer              | team, managers. Secondary: customers
A complication: Procurement

An 'SRS' may be written by…
- …the procurer:
  - so the SRS is really a call for proposals
  - must be general enough to yield a good selection of bids…
  - …and specific enough to exclude unreasonable bids
- …the bidders:
  - represents a proposal to implement a system to meet the CfP
  - must be specific enough to demonstrate feasibility and technical competence…
  - …and general enough to avoid over-commitment
- …the selected developer:
  - reflects the developer's understanding of the customer's needs
  - forms the basis for evaluation of contractual performance
- …or by an independent RE contractor!

There is a choice over what point to compete the contract:
- Early (conceptual stage)
  - can only evaluate bids on apparent competence & ability
- Late (detailed specification stage)
  - more work for the procurer; appropriate RE expertise may not be available in-house
The IEEE Standard recommends an SRS jointly developed by procurer & developer.

Desiderata for Specifications
Source: Adapted from IEEE-STD-830-1998

- Valid (or "correct")
  - expresses only the real needs of the stakeholders (customers, users, …)
  - doesn't contain anything that isn't "required"
- Unambiguous
  - every statement can be read in exactly one way
- Complete
  - specifies all the things the system must do…
  - …and all the things it must not do!
  - conceptual completeness: e.g. responses to all classes of input
  - structural completeness: e.g. no TBDs!
- Understandable (clear)
  - e.g. by non-computer specialists
- Consistent
  - doesn't contradict itself, i.e. is satisfiable
  - uses all terms consistently
- Ranked
  - must indicate the importance and/or stability of each requirement
- Verifiable
  - a process exists to test satisfaction of each requirement
  - "every requirement is specified behaviorally"
- Modifiable
  - can be changed without difficulty
  - requires good structure and cross-referencing
- Traceable
  - the origin of each requirement must be clear
  - facilitates referencing of requirements in future documentation

Typical mistakes
Source: Adapted from Kovitz, 1999

- Noise: the presence of text that carries no relevant information to any feature of the problem.
- Silence: a feature that is not covered by any text.
- Over-specification: text that describes a feature of the solution, rather than the problem.
- Contradiction: text that defines a single feature in a number of incompatible ways.
- Ambiguity: text that can be interpreted in at least two different ways.
- Forward reference: text that refers to a feature yet to be defined.
- Wishful thinking: text that defines a feature that cannot possibly be validated.
- Jigsaw puzzles: e.g. distributing requirements across a document and then cross-referencing.
- Duckspeak requirements: requirements that are only there to conform to standards.
- Unnecessary invention of terminology: e.g., 'the user input presentation function', 'airplane reservation data validation function'.
- Inconsistent terminology: inventing and then changing terminology.
- Putting the onus on the development staff: i.e. making the reader work hard to decipher the intent.
- Writing for the hostile reader: there are fewer of these than friendly readers.

Ambiguity Test
Source: Adapted from Easterbrook & Callahan, 1997

Natural language?

"The system shall report to the operator all faults that originate in critical functions or that occur during execution of a critical sequence and for which there is no fault recovery response."
(adapted from the specifications for the international space station)

Or a decision table?

Originate in critical functions  | F | T | F | T | F | T | F | T
Occur during critical sequence   | F | F | T | T | F | F | T | T
No fault recovery response       | F | F | F | F | T | T | T | T
Report to operator?              |   |   |   |   |   |   |   |
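
To make the ambiguity concrete, here is a minimal sketch (Python; the condition names are invented for the example) that enumerates the eight columns of the decision table and compares two plausible parses of the sentence. The rows where the parses disagree are exactly where the natural-language version under-determines the behaviour:

    from itertools import product

    # Enumerate all eight columns of the decision table.
    for crit_fn, crit_seq, no_recovery in product([False, True], repeat=3):
        # Reading 1: report if (critical function OR critical sequence)
        # AND there is no fault recovery response.
        reading1 = (crit_fn or crit_seq) and no_recovery
        # Reading 2: report if critical function, OR (critical sequence
        # AND no fault recovery response).
        reading2 = crit_fn or (crit_seq and no_recovery)
        flag = "" if reading1 == reading2 else "  <-- readings disagree"
        print(f"{crit_fn!s:6}{crit_seq!s:6}{no_recovery!s:6}"
              f"{reading1!s:6}{reading2!s:6}{flag}")

For example, a fault in a critical function that does have a recovery response is not reported under reading 1 but is reported under reading 2, so the two parses prescribe different behaviour for the same input.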
Avoiding ambiguity

Review natural language specs for ambiguity:
- use people with different backgrounds
- include software people, domain specialists and user communities
- must be an independent review (i.e. not by the authors!)

Use a specification language:
- e.g. a restricted subset or stylized English
- e.g. a semi-formal notation (graphical, tabular, etc.)
- e.g. a formal specification language (e.g. Z, VDM, SCR, …)

Exploit redundancy:
- restate a requirement to help the reader confirm her understanding
- …but clearly indicate the redundancy
- may want to use a more formal notation for the re-statement (an illustration follows at the end of this page)

Organizing the Requirements

Need a logical organization for the document; the IEEE standard offers different templates.

Example structures - organize by…
- …external stimulus or external situation
  - e.g., for an aircraft landing system, each different type of landing situation: wind gusts, no fuel, short runway, etc.
- …system feature
  - e.g., for a telephone system: call forwarding, call blocking, conference call, etc.
- …system response
  - e.g., for a payroll system: generate pay-cheques, report costs, print tax info
- …external object
  - e.g., for a library information system, organize by book type
- …user type
  - e.g., for a project support system: manager, technical staff, administrator, etc.
- …mode
  - e.g., for a word processor: page layout mode, outline mode, text editing mode, etc.
- …subsystem
  - e.g., for a spacecraft: command & control, data handling, comms, instruments, etc.
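
As an illustration of a more formal re-statement, here is one hypothetical reading of the space-station requirement from the ambiguity test above (the predicate names are invented for the example, and this is only one of the candidate interpretations, not the official one):

    for all faults f:
        (originates_in_critical_function(f) or occurs_during_critical_sequence(f))
        and not has_fault_recovery_response(f)
        implies report_to_operator(f)

Written this way, the bracketing that the natural-language sentence leaves open must be made explicit, which is precisely the benefit of restating the requirement in a more formal notation.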

IEEE Standard for SRS
Source: Adapted from IEEE-STD-830-1993. See also Blum 1992, p160.

1 Introduction (identifies the product & application domain)
  - Purpose
  - Scope
  - Definitions, acronyms, abbreviations
  - Reference documents
  - Overview (describes contents and structure of the remainder of the SRS)

2 Overall Description
  - Product perspective (describes all external interfaces: system, user, hardware, software; also operations and site adaptation, and hardware constraints)
  - Product functions (summary of major functions)
  - User characteristics
  - Constraints (anything that will limit the developer's options, e.g. regulations, reliability, criticality, hardware limitations, parallelism, etc.)
  - Assumptions and Dependencies

3 Specific Requirements (all the requirements go in here, i.e. this is the body of the document; the IEEE standard provides 8 different templates for this section)

Appendices

Index

IEEE STD Section 3 (example)
Source: Adapted from IEEE-STD-830-1993. See also Blum 1992, p160.

3.1 External Interface Requirements
  3.1.1 User Interfaces
  3.1.2 Hardware Interfaces
  3.1.3 Software Interfaces
  3.1.4 Communication Interfaces

3.2 Functional Requirements (organized by mode, user class, feature, etc.), for example:
  3.2.1 Mode 1
    3.2.1.1 Functional Requirement 1.1
    …
  3.2.2 Mode 2
    …
  3.2.n Mode n

3.3 Performance Requirements (remember to state these in measurable terms, e.g. "95% of queries complete within 2 seconds", not "the system responds quickly")

3.4 Design Constraints
  3.4.1 Standards compliance
  3.4.2 Hardware limitations

3.5 Software System Attributes
  3.5.1 Reliability
  3.5.2 Availability
  3.5.3 Security
  3.5.4 Maintainability
  3.5.5 Portability

3.6 Other Requirements
Agreeing on a specification

Two key problems for getting agreement:
1) The problem of validation
   - like validating scientific theories
   - if we build to this spec, will the customer's expectations be met?
2) The problem of negotiation
   - how do you reconcile conflicting goals in a complex socio-cognitive setting?

Validating requirements:
- Inspections and Reviews
- Prototyping

Negotiating requirements:
- Requirements Prioritization
- Conflict and Conflict Resolution
- Requirements Negotiation Techniques

Inquiry Cycle

[Diagram: a cycle running Prior Knowledge (e.g. customer feedback) -> Observe (what is wrong with the current system?) -> Model (describe/explain the observed problems) -> Design (invent a better system) -> Intervene (replace the old system) -> back to Observe.]

Note the similarity with the process of scientific investigation: requirements models are theories about the world, and designs are tests of those theories. Prior knowledge supplies the initial hypotheses; observation looks for anomalies (what can't the current theory explain?); modeling creates/refines a better theory; design devises the experiments to test the new theory; intervention carries out the experiments (manipulates the variables).

The problem of validation

The logical positivist view:
- "there is an objective world that can be modeled by building a consistent body of knowledge grounded in empirical observation"
- In RE, this assumes there is an objective problem that exists in the world:
  - build a consistent model; make sufficient empirical observations to check validity
  - use tools that test consistency and completeness of the model
  - use reviews, prototyping, etc. to demonstrate the model is "valid"

Popper's modification to logical positivism:
- "theories can't be proven correct, they can only be refuted by finding exceptions"
- In RE, design your requirements models to be refutable:
  - look for evidence that the model is wrong
  - e.g. collect scenarios and check the model supports them

The post-modernist view:
- "there is no privileged viewpoint; all observation is value-laden; scientific investigation is culturally embedded"
  - e.g. Kuhn: science moves through paradigms
  - e.g. Toulmin: scientific theories are judged with respect to a weltanschauung
- In RE, validation is always subjective and contextualised:
  - use stakeholder involvement so that they 'own' the requirements models
  - use ethnographic techniques to understand the weltanschauungen

Prototyping

Definitions:
- "A software prototype is a partial implementation constructed primarily to enable customers, users, or developers to learn more about a problem or its solution." [Davis 1990]
- "Prototyping is the process of building a working model of the system" [Agresti 1986]

Approaches to prototyping:
- Presentation prototypes
  - explain, demonstrate and inform, then throw away
  - e.g. used for proof of concept, explaining design features, etc.
- Exploratory prototypes
  - used to determine problems, elicit needs, clarify goals, compare design options
  - informal, unstructured and thrown away
- Breadboards or experimental prototypes
  - explore technical feasibility; test the suitability of a technology
  - typically no user/customer involvement
- Evolutionary prototypes (e.g. "operational prototypes", "pilot systems")
  - development seen as a continuous process of adapting the system
  - the "prototype" is an early deliverable, to be continually improved
Throwaway or Evolve?
Source: Adapted from Blum, 1992, pp369-373

Throwaway Prototyping
- Purpose: to learn more about the problem or its solution… hence discard after the desired knowledge is gained.
- Use: early or late.
- Approach: horizontal - build only one layer (e.g. UI); "quick and dirty".
- Advantages:
  - learning medium for better convergence
  - early delivery -> early testing -> less cost
  - successful even if it fails!
- Disadvantages:
  - wasted effort if requirements change rapidly
  - often replaces proper documentation of the requirements
  - may set customers' expectations too high
  - can get developed into the final product

Evolutionary Prototyping
- Purpose: to learn more about the problem or its solution… and to reduce risk by building parts of the system early.
- Use: incremental; evolutionary.
- Approach: vertical - partial implementation of all layers; designed to be extended/adapted.
- Advantages:
  - requirements not frozen
  - return to the last increment if an error is found
  - flexible(?)
- Disadvantages:
  - can end up with a complex, unstructured system which is hard to maintain
  - early architectural choice may be poor
  - optimal solutions not guaranteed
  - lacks control and direction

Brooks: "Plan to throw one away - you will anyway!"

Reviews, Inspections, Walkthroughs…

Note: these terms are not widely agreed.

Formality:
- informal: from meetings over coffee, to team get-togethers
- formal: scheduled meetings, prepared participants, defined agenda, specific format, documented output

"Management reviews"
- e.g. preliminary design review (PDR), critical design review (CDR), …
- used to provide confidence that the design is sound
- attended by management and sponsors (customers)
- usually a "dog-and-pony show"

"Walkthroughs"
- a developer technique (usually informal)
- used by development teams to improve the quality of the product
- focus is on finding defects

"(Fagan) Inspections"
- a process management tool (always formal)
- used to improve the quality of the development process
- collect defect data to analyze the quality of the process
- written output is important
- major role in training junior staff and transferring expertise

Benefits of formal inspection
Source: Adapted from Blum, 1992, pp369-373 & Freedman and Weinberg, 1990

Formal inspection works well for programming:
- For applications programming:
  - more effective than testing
  - most reviewed programs run correctly first time
  - compare: 10-50 attempts for the test/debug approach
- Data from large projects:
  - error reduction by a factor of 5 (10 in some reported cases)
  - improvement in productivity: 14% to 25%
  - percentage of errors found by inspection: 58% to 82%
  - cost reduction of 50%-80% for V&V (even including the cost of inspection)
- Effects on staff competence:
  - increased morale, reduced turnover
  - better estimation and scheduling (more knowledge about defect profiles)
  - better management recognition of staff ability

These benefits also apply to requirements inspections:
- e.g. see studies by Porter et al.; Regnell et al.; …

Inspection Constraints
Source: Adapted from Blum, 1992, pp369-373 & Freedman and Weinberg, 1990

Size
- "enough people so that all the relevant expertise is available"
- min: 3 (4 if the author is present)
- max: 7 (less if the leader is inexperienced)

Duration
- never more than 2 hours; concentration will flag if longer

Outputs
- all reviewers must agree on the result: accept, re-work, or re-inspect
- all findings should be documented:
  - summary report (for management)
  - detailed list of issues

Scope
- focus on a small part of a design, not the whole thing

Timing
- examine a product once its author has finished it
- not too soon: the product is not ready, and you find problems the author is already aware of
- not too late: the product is in use, and errors are now very costly to fix

Purpose
- remember, the biggest gains come from fixing the process
- collect data to help you not make the same errors next time


Inspection Guidelines
Source: Adapted from Freedman and Weinberg, 1990

Prior to the review:
- schedule formal reviews into the project planning
- train all reviewers
- ensure all attendees prepare in advance

During the review:
- review the product, not its author
  - keep comments constructive, professional and task-focussed
- stick to the agenda
  - the leader must prevent drift
- limit debate and rebuttal
  - record issues for later discussion/resolution
- identify problems but don't try to solve them
- take written notes

After the review:
- review the review process

Choosing Reviewers
Source: Adapted from Freedman and Weinberg, 1990

Possibilities:
- specialists in reviewing (e.g. QA people)
- people from the same team as the author
- people invited for specialist expertise
- people with an interest in the product
- visitors who have something to contribute
- people from other parts of the organization

Exclude:
- anyone responsible for reviewing the author (i.e. line manager, appraiser, etc.)
- anyone with known personality clashes with other reviewers
- anyone who is not qualified to contribute
- all management
- anyone whose presence creates a conflict of interest

Structuring the inspection
Source: Adapted from Porter, Votta and Basili, 1995

Can structure the review in different ways:
- Ad hoc
  - rely on the expertise of the reviewers
- Checklist
  - uses a checklist of questions/issues
  - checklists tailored to the kind of document (Porter et al. have examples)
- Active reviews (perspective-based reading)
  - each reviewer reads for a specific purpose, using specialized questionnaires
  - effectively, different reviewers take different perspectives

The differences may matter; e.g. the Porter et al. study indicates that:
- active reviews find more faults than ad hoc or checklist methods
- there is no effective difference between ad hoc and checklist methods
- the inspection meeting might be superfluous!

Requirements Prioritization
Source: Adapted from Karlsson & Ryan 1997

Usually there are too many requirements:
- Decide which to include in the first release
  - balancing quality, cost and time-to-market
- Assess each requirement's importance to the project as a whole
- Assess the relative cost of each requirement
- Compute the cost-value trade-off (a sketch follows below)

[Cost-value diagram: each requirement is plotted by its share of total value (percent, vertical axis) against its share of total cost (percent, horizontal axis, both 5-30); requirements above the upper boundary line are high priority, those below the lower boundary line are low priority, and the band between is medium priority.]
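
A minimal sketch of the cost-value classification in Python. The ratio thresholds of 2 and 0.5 follow a common reading of Karlsson & Ryan's diagram, where the boundary lines are value-to-cost ratios of 2:1 and 1:2; treat them as assumptions rather than part of the method's definition:

    def cost_value_priority(value_pct: float, cost_pct: float) -> str:
        """Classify one requirement by its share of total value vs. total cost."""
        ratio = value_pct / cost_pct
        if ratio > 2:
            return "high"    # much more value than cost
        if ratio < 0.5:
            return "low"     # much more cost than value
        return "medium"

    # e.g. a requirement carrying 15% of the value at 5% of the cost:
    print(cost_value_priority(15, 5))   # -> high
    print(cost_value_priority(5, 15))   # -> low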
Analytic Hierarchy Process (AHP)
Source: Adapted from Karlsson & Ryan 1997

Create an n x n matrix (for n requirements).

Compare each pair of requirements. For element (x, y) in the matrix enter:
- 1 - if x and y are of equal value
- 3 - if x is slightly preferred over y
- 5 - if x is strongly preferred over y
- 7 - if x is very strongly preferred over y
- 9 - if x is extremely preferred over y
…and for (y, x) enter the reciprocal.

Estimate the eigenvalues, e.g. by "averaging over normalized columns":
- calculate the sum of each column
- divide each element in the matrix by the sum of its column
- calculate the sum of each row
- divide each row sum by the number of rows

This gives a value for each requirement, based on its estimated percentage of the total value of the project.

AHP example
Source: Adapted from Karlsson & Ryan 1997

Pairwise comparison matrix:

       Req1  Req2  Req3  Req4
Req1   1     1/3   2     4
Req2   3     1     5     3
Req3   1/2   1/5   1     1/3
Req4   1/4   1/3   3     1

Normalise the columns, then sum each row and divide by 4:

       Req1  Req2  Req3  Req4  |  sum   sum/4
Req1   0.21  0.18  0.18  0.48  |  1.05  0.26
Req2   0.63  0.54  0.45  0.36  |  1.98  0.50
Req3   0.11  0.11  0.09  0.04  |  0.34  0.09
Req4   0.05  0.18  0.27  0.12  |  0.62  0.16

…Also: one should compute the consistency index, because the pairwise comparisons may not be consistent (a sketch of the whole computation follows below).
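
The whole calculation is small enough to script. Here is a minimal sketch in Python (assuming NumPy is available), using the example matrix above; the slides mention the consistency index but not its formula, so the last step uses Saaty's standard formulation CI = (lambda_max - n) / (n - 1):

    import numpy as np

    # Pairwise comparison matrix from the example above: entry (x, y)
    # records how strongly requirement x is preferred over y.
    A = np.array([
        [1,   1/3, 2, 4  ],
        [3,   1,   5, 3  ],
        [1/2, 1/5, 1, 1/3],
        [1/4, 1/3, 3, 1  ],
    ])

    # "Averaging over normalized columns": divide each column by its sum,
    # then average along each row to estimate the priority eigenvector.
    weights = (A / A.sum(axis=0)).mean(axis=1)
    print(weights.round(2))          # -> [0.26 0.5  0.09 0.16], as on the slide

    # Consistency index: estimate lambda_max from A @ w, then
    # CI = (lambda_max - n) / (n - 1); values near 0 indicate the
    # pairwise judgements are close to consistent.
    n = A.shape[0]
    lambda_max = float(np.mean(A @ weights / weights))
    CI = (lambda_max - n) / (n - 1)
    print(round(CI, 3))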
