ACRuDA Project
DG VII RTD Programme

RA-96-SC.231

Deliverable D3:
The proposed assessment and certification
methodology for digital architectures

Authors: All ACRuDA Partners

Document.ID: WP2/D3/V2

Date: June 12, 1998

Type of document: Deliverable

Status: Proposed

Confidentiality: Public

Work Package Allocation: WP2

Distribution: All

Document Abstract:
Deliverable D3 presents the results of the tasks carried out in ACRuDA Work Package 2. This work consists
mainly of the description of an assessment and certification schema, of assessment and certification procedures,
and of assessment criteria.

List of Contributors (Alphabetical Order):

A. AMENDOLA (ANSALDO S.F., Italy)    P. BENOIT (MATRA T.I., France)

J.L. DUFOUR (MATRA T.I., France)    Ph. GABRIEL (MATRA T.I., France)

Ph. GRANDCLAUDON (SNCF, France)    Ph. KAPCRZAK (RATP, France)

M. El KOURSI (INRETS, France)    H. KREBS (TUV-Rheinland, Germany)

G. LOISEAU (MATRA T.I., France)    J.F. LINDEBERG (SINTEF, Norway)

Ph. MEGANCK (INRETS, France)    S. MITRA (Lloyds Register, UK)

O. NORDLAND (SINTEF, Norway)    P. OZELLO (SNCF, France)

F. POLI (ANSALDO S.F., Italy)    G. SONNECK (SEIBERSDORF, Austria)

R. TOOZE (Lloyds Register, UK)    S. VALENCIA (RATP, France)

Project Sponsor
This report reflects work partially funded by the Commission of the European Communities (CEC) under the
Framework IV Specific RTD programme, project ACRuDA: « Assessment and Certification Rules for Digital
Architectures ».

CONTENT
Foreword for the HTML version of this document
1. INTRODUCTION
1.1. Objectives
1.2. Structure of the document
2. BACKGROUND
2.1. Introduction
2.2. The certification procedure
2.2.1. Licensing
2.2.2. Certification
2.2.3. The certificate
2.2.4. The certification body
2.2.5. Conclusion
3. MAIN CONCEPTS
3.1. Product, Safety Requirements Specification.
3.2. Assessment process
3.3. Certification process
4. PRODUCT DEVELOPMENT PROCESS
4.1. Introduction
4.2. Life cycle of a product
4.2.1. Presentation
4.2.2. Paper study
4.2.3. Model
4.2.4. Prototype
4.2.5. Pre Production
4.2.6. Production
4.3. Development Life Cycle and documentation
4.3.1. Development Life Cycle
4.4. Safety life cycle and documentation
4.4.1. Safety life cycle
4.4.2. Safety Plan
4.4.3. Safety case
4.4.3.1. Introduction
4.4.3.2. General
4.4.3.3. Purpose of the safety case
4.4.3.4. Content of the product safety case
4.5. Quality Assurance provisions
4.5.1. Relationship of Quality Assurance to other Plans
5. ASSESSMENT AND CERTIFICATION PROCESS
5.1. Introduction


5.2. Description of the assessment and certification process


5.3. Role of the different bodies
5.3.1. Role of the European Union (EU)
5.3.2. Role of the authority of an EU member state
5.3.3. Role of the accreditation body of a member state
5.3.4. Role of the notified body
5.3.5. Role of the sponsor
5.3.6. Role of the assessors
5.3.7. Role of the supplier
5.4. Phases of the assessment and certification process
5.4.1. Phase I: preparation of the assessment
5.4.2. Phase II: assessment
5.4.3. Phase III: certification
5.5. Re-use of assessment results and product composed of assessed/certified components
5.6. Capitalisation of assessment work
5.7. Certification report and Certificate
5.8. Assessment inputs
5.9. Essential Quality Requirements for the assessment activities
5.9.1. The normative context for assessment requirements
5.9.2. Quality System of the Assessor
5.9.3. Quality Handbook
5.9.3.1. Background referential of the assessment
5.9.3.2. The quality requirements on methods for assessment
5.9.3.3. The quality requirements on tools for assessment
5.9.3.4. The safety audit
5.9.3.5. The Configuration management
5.9.3.6. The assessment reports and anomalies
5.9.4. Human issues
5.9.4.1. Competence and knowledge of the assessor
5.9.4.2. Organisation of the assessment team
5.9.4.3. Independence of judgement
5.9.4.4. Confidentiality of the developer's innovations
5.9.4.5. Publication of the results of the assessment
5.9.4.6. Subcontractors of the notified body
5.9.4.7. Environment organisation
5.10. General concepts for assessment and certification of software.
5.10.1. Software design
5.10.2. Algorithms and formal methods
5.10.3. Verification and validation
5.10.4. Interfaces
6. HIGH LEVEL ASSESSMENT CRITERIA
6.1. Introduction
6.2. Assessment activities
6.2.1. Referential Examination
6.2.2. Safety Management Assessment
6.2.3. Quality Management Assessment
6.2.4. Organisation Assessment
6.2.5. Development Phase Assessment
6.2.6. Safety Plan Assessment
6.2.7. Safety Case Assessment
6.3. Structure of the criteria
6.4. Process/Project criteria
6.4.1. Contents
6.4.2. Basic Criteria
6.5. Requirements
6.5.1. Contents
6.5.2. Basic criteria
6.6. Design


6.6.1. Contents
6.6.2. Basic criteria
6.7. Validation and off line testing
6.7.1. Contents
6.7.2. Basic criteria
6.8. Fault and failure analyses
6.8.1. Contents
6.8.2. Basic criteria
6.9. Operation, Maintenance and Support
6.9.1. Contents
6.9.2. Basic criteria
6.10. Software Assessment Criteria
6.10.1. Software integrity level
6.10.2. Life cycle issues and documentation
6.11. Hardware Assessment Criteria
6.11.1. Life cycle issues and documentation
7. TERMINOLOGY
7.1. Introduction
7.2. Terminology, Definitions and Abbreviations
8. REFERENCES
8.1. European Council Directives
8.2. European Technical Specifications
8.3. Standards
8.4. ACRuDA Project
8.5. CASCADE Project
8.6. ERTMS project
8.7. Information Technology domain
8.8. Others
9. ANNEX I: STRUCTURE OF A SAFETY PLAN
10. ANNEX II: STRUCTURE OF A PRODUCT SAFETY CASE
11. ANNEX III: STRUCTURE OF AN ASSESSMENT PLAN
12. ANNEX IV: STRUCTURE OF A TECHNICAL ASSESSMENT REPORT
13. ANNEX V: STRUCTURE OF THE CERTIFICATION REPORT
14. ANNEX VI: STRUCTURE OF A CERTIFICATE

List of FIGURES
Figure 1: Product and Safety requirements specification
Figure 2: Assessment process
Figure 3: Certification process
Figure 4: Development life cycle
Figure 5: Relationships between categories and aspects
Figure 6: Equipment/Generic Product life cycle
Figure 7: Safety life cycle
Figure 8: Structure of Plans
Figure 9: Assessment and certification schema
Figure 10: Assessment inputs

Foreword for the HTML version of this document


The text presented here is the HTML version of the original document that was submitted to and accepted by the European Commission.
The text has been reformatted to make it better suited to reading with a web browser, and this foreword has been inserted. All entries in
the table of contents and the list of figures are links to the corresponding chapters and figures, respectively, and all chapter titles and
figure captions are links back to the corresponding entries in the table of contents and list of figures. The final "To top of text" link at the
very end of the document has been added.

Apart from that, only four typing errors in the original text have been corrected:


1. Chapter 4.5 contained a sub-chapter that was erroneously numbered 4.5.1.3 and not mentioned in the table of contents. In this text,
the number has been corrected to 4.5.1 and the corresponding reference added to the table of contents.
2. The identification of the document referenced in [ACR02] in chapter 8.4 was erroneously typed as "24 February 97. Reference:
ACRuDA/INRETS/MK-PM/WP1/D1/97.13/V2" (identical to the identification of [ACR01]). It has been corrected to "29
September 97. Reference: ACRuDA/INRETS/PM-MK/WP1/D2/97.39/V3".
3. Annex II, Chapter 3, section "Safety Plan", erroneously referred to chapter 3.4.2 of this document. The reference has been
corrected to 4.4.2.
4. Annex III, Chapter 5, erroneously referred to chapter 3.4.3 of this document. The reference has been corrected to 4.4.3.

No other corrections have been made. Any other differences between the wording of the original document and this text are unintentional
and have gone unnoticed! Please inform me if you discover any.

Note: If you print this text, some of the images may not appear completely on the printed page. To get a complete print-out of such an
image, right-click it and store it on your hard disk. (This works with both Netscape and Explorer). Then print the image separately (e.g.
by viewing the file with your browser and then printing it!). You may have to adjust the margin settings in your page setup.

1. Introduction
The ACRuDA project aims to develop a methodology for the safety assessment of safety critical digital
architectures. This methodology has to comply with the different requirements of the end-users and the
suppliers. In priority order:

it has to define the minimum activity the assessor must perform to gain confidence in the safety of the architecture,
it has to use the best practices in assessment, so that the end-users maintain or gain confidence in the
automation supported by safety critical digital architectures,
it has to be unambiguous, so that it cannot be misinterpreted by different assessors. This contributes
to the harmonisation of the European market,
it has to be cost effective, so that the effort is well proportioned between the different assessment
activities and does not create gaps or deviations.

The minimum activity of the assessor depends on the complexity of the architecture and on the development
process. In all cases, the activity consists of more than just conformity checking. The assessor has to perform
safety studies and expert analyses to complete the assessment with an evaluation of the effectiveness of the
protections, principles and specific mechanisms developed for the architecture, and to gain confidence in its
safety under the proposed conditions of use.

The best practices in assessment in the European countries have been studied in [ACR02]. The ACRuDA project
has used results from the CASCADE project, the European standards ([CEN01], [CEN02] and [CEN03]), the
directives ([DIN01], [DIN02] and [DIN03]) and the experience of the different ACRuDA partners. This has been
formalised as high level assessment criteria. The set of criteria obtained hereafter is a basic set. It cannot be
applied in this form; the assessors must refine this basic set into a set of detailed criteria that can be applied to
assess a digital architecture. This chapter has been updated once, after the results of the ACRuDA case studies,
and it must be updated regularly as practice evolves in Europe.

The principal aim of this document is to define the framework for the assessment method. The objective is to
ensure that the safety digital architecture meets its safety requirements according to relevant standards and best
practice, as well as any specific safety requirements contained in contractual or technical specifications for
the equipment. The process of certification of safety critical products or systems can involve many different
bodies, for example the sponsor, the supplier, the assessors and the notified body. It is therefore essential to
establish a harmonised, mutually agreed approach to the assessment of the products which takes into account
the needs of each partner.

1.1. Objectives
This document provides information on the way a process or product is to be assessed by a third party. The
foundation of the assessors' work is this type of standard, which constitutes a « code of practice ». The philosophy
of assessment of safety digital architectures in the railway sector is based on both the product and the process.
This document is principally aimed at assessors who need to perform an assessment of a safety critical digital
architecture based on the high level criteria. Each criterion defines a high level requirement that the item under
assessment must fulfil. This is the top level document to be used during an assessment; from it the assessors will
develop the detailed criteria necessary to assess the specific architecture.

1.2. Structure of the document


This document is the key deliverable of the ACRuDA project and provides:

the definition of an assessment and certification process,
the definition of a set of assessment and certification procedures,
the definition of assessment criteria.

Chapter 1: this chapter.


Chapter 2: This chapter provides a brief description of the background to ACRuDA in terms of the certification
and licensing requirements emerging from EU directives.
Chapter 3: In this chapter, the main concepts involved in the assessment and certification process are defined
and some basic assessment procedures are given.
Chapter 4: This chapter presents the development process of a product.
Chapter 5: This chapter presents the assessment and certification processes that underpin the ACRuDA
project.
Chapter 6: In this chapter, the assessment criteria are defined and described.
Chapter 7: This chapter gives the definition of the terms used in this document and in the general certification
language.
Chapter 8: This chapter gives the references of the documents used for ACRuDA project.
Annexes: Annexes I and II relate to documents to be produced by the supplier during the development process.
ANNEX I describes the structure of a safety plan; ANNEX II, the structure of a safety case. Annexes
III to VI relate to the documentation to be produced during an assessment. ANNEX III describes the
structure of an assessment plan; ANNEX IV, the structure of a technical assessment report; ANNEX V, the
structure of a certification report; ANNEX VI, the structure of a certificate.

2. Background
2.1. Introduction
Articles 129b to 129d of the amended EC Treaty established the intention of introducing trans-European
networks in the areas of transport, telecommunication and energy infrastructures. The need for inter-operability
and technical standardisation is stated.

The European Commission has expressed the urgent need for an effective, integrated transport system providing
a high degree of inter-operability between rail, air, road and water transport systems. This is referred to as
cross-modal transport. To achieve this, there must be efficient cross-border and cross-modal operations between
all transport systems. The European Commission has recognised the need for an effective railway as part of this
European transport system.

Furthermore, the European Commission is now, on behalf of the European Council, developing the necessary
legislation, mainly in the form of Council Directives. This legislation will become part of Member State
legislation.

2.2. The certification procedure


Council directive [DIN03], Article 1, contains definitions of:

Technical specifications, Standards, European standards, Common technical specifications,
European technical approval.

For the railway, the following directives, standards, and technical specification must be considered:

the three standards [CEN01], [CEN02] and [CEN03],
the Technical Specification for Interoperability [STI01],
the directives [DIN01] and [DIN02].

[DIN03] refers to the concept of an approval body which certifies that the product or system satisfies the
essential requirements making it fit for use. Safety is the first of the essential requirements that the Council
described in Annex III of [DIN01].

2.2.1. Licensing

[DIN01] and [DIN03] are examples of international legislation that member states are required to implement
in their own national legislation. This will result in all member states having similar laws defining
responsibilities and authorities with respect to various kinds of public transport.

In turn, regulations will be issued to implement such laws, and these will define who is authorised to grant a
license (licensing authority) to operators of transport systems or parts thereof. This licensing authority may be a
department within the government, or an external organisation.

The licensing authority will set up the rules to be applied for granting a license. The key requirement is
that it must be demonstrated that a given system or product is safe to use in its intended application.

As many countries have already privatised transportation, the owner of a public transportation system may not
necessarily be the operator of that system. For the purposes of licensing, this reflects mainly an issue of liability
and does not affect the actual licensing procedure.

The owner/operator will order anything from individual constituents all the way up to a complete transportation
system from one or more suppliers, who themselves will subcontract out to sub-suppliers etc.

In order to get permission to use the system he has ordered, the owner/operator must convince the licensing
authority that it fulfils the requirements that the licensing authority has defined. This evidence must be provided
by an unbiased, independent body: the assessor. The assessor must be accepted by the licensing authority.

It is important here to remember that the owner/operator does not define the requirements: that is done by the
licensing authority. The owner/operator will, however, identify which requirements he wants an assessment
against, depending on the product or system being assessed and the intended application.

2.2.2. Certification

In order to avoid repeating the assessment process each time an existing product or system is deployed, it is
desirable to perform some kind of generic assessment and to document the results in a form that is acceptable to
all licensing authorities. This is the fundamental concept of certification!

Thus certification requires an unbiased, qualified assessment of the generic properties of a system or product.
The term generic is significant: the actual properties of a given object depend on the way it is deployed.
Therefore, the assessment can only evaluate those properties that are common to all reasonably foreseeable
environments and deployments. However, for complex systems, embedded components can certainly be
certified for the context of the system that they are embedded in.

This means that individual products can be certified for use in specific assemblies, which can be certified for use
in specific subsystems that are certified for use in specific systems. For software, this is equivalent to certifying
specific modules for use in specific programmes in specific programme systems.

[DIN01] defines in Article 2 the notified bodies as "the bodies which are responsible for assessing the ...
suitability for use ... or for appraising ... verification ..." and in annex VI refers to the "certificate from the
notified body ...". In other words, assessment and certification are to be performed by the notified bodies.


The liability of the notified body is not clearly defined in the directive, but it is said that the notified body must
take out civil liability insurance, unless this liability is covered by the state on the basis of national law or the
controls are carried out directly by the member state.

2.2.3. The certificate

From the above it is clear that the validity of a certification is heavily dependent on the context of the object
being certified. It is therefore of paramount importance that the certificate clearly indicates the limits and
conditions of validity.

The details of the assessment performed are to be contained in a certification report. This report must be explicitly
identified on the certificate as an integral part of it. It must clearly state the conditions and limitations of the
assessment and, in particular, identify the requirements against which the assessment has been performed.

Annex VI of the above mentioned Council directive [DIN01] defines the "Contents of the EC declaration" as
being:

the Directive references
the name and address of the supplier ...
description of ... constituent (make, type, etc.)
description of the procedure followed ...
all the relevant descriptions ... and in particular its conditions of use
name and address of notified body (bodies) ... together, where appropriate, with the duration and
conditions of validity of the certificate
... reference to the European specification
identification of signatory ...
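
As an illustration only, the declaration contents listed above can be pictured as a simple record. This is a minimal sketch with hypothetical field names, not a form prescribed by [DIN01]:

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class NotifiedBodyRef:
        name: str
        address: str

    @dataclass
    class ECDeclaration:
        """Hypothetical record of the contents listed in Annex VI of [DIN01]."""
        directive_references: List[str]
        supplier_name_and_address: str
        constituent_description: str         # make, type, etc.
        procedure_description: str           # the procedure followed
        conditions_of_use: str
        notified_bodies: List[NotifiedBodyRef]
        certificate_validity: Optional[str]  # duration and conditions, where appropriate
        european_specification_refs: List[str]
        signatory: str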

2.2.4. The certification body

As in the case of assessments performed for the purposes of obtaining a license from the licensing authority,
certification must be performed by a person or body that is recognised and accepted by the licensing authority.
In other words, the certification body must be certified!

It was pointed out earlier that assessment and certification are to be performed by the "notified body". Article
20 of Council directive [DIN01] states:

"2. Member States shall apply the criteria provided for in Annex VII for the assessment of the bodies to be
notified. Bodies meeting the assessment criteria provided for in the relevant European standards shall be
deemed to meet the said criteria."

where Annex VII identifies the minimum criteria which must be taken into account by the member states when
notifying bodies.

These minimum criteria are very generic and refer to the independence and impartiality of the notified body's
staff (they must not be involved in the design, development, manufacture, construction, marketing, maintenance
or operation of the products and systems they assess). The technical qualification of the assessors must be
"adequate". Thus, the licensing authority must define the detailed criteria that a certification body shall fulfil,
just as it defines the criteria for certification. By harmonising the criteria for certification bodies and certification
across borders, we will achieve a situation where certification bodies and certificates throughout Europe will be
recognised by all licensing authorities, as indicated in Article 20 (5) of Council directive [DIN01].

2.2.5. Conclusion
The three directives [DIN01], [DIN02] and [DIN03] are the basis for the definition of a new European model for
assessment and certification in the railway field, but they give neither detailed procedures for the
assessment/certification of products or systems nor detailed criteria for their assessment.
Safety digital architectures can be parts of sub-systems. One objective of ACRuDA is the definition of
assessment/certification procedures and assessment criteria for safety digital architectures.


3. MAIN CONCEPTS
3.1. Product, Safety Requirements Specification.
The [CEN03] standard (chapter 6.5.1, page 45) considers three different categories of programmable electronic
systems:

generic product: (independent of application) A generic product can be re-used for different independent
applications,
generic application: (for a class of application) A generic application can be re-used for different
class/type applications with common function,
specific application: (for a specific application) A specific application is used for only one particular
installation.

The deliverable [ACR01] of the ACRuDA project gives a definition of safety digital architecture and describes
the differences between the basic architecture and the application architecture. ACRuDA deals with the basic
architecture. The basic architecture includes hardware and software, is railway generic and can be used in
different railway applications.

The definition of ACRuDA's basic architecture is similar to the definition of the generic product in [CEN03]. In
this document, the term « Product » is used as equivalent to « Generic product ».

[CEN03] also gives a definition of Product: « a collection of elements, interconnected to form a system,
sub-system, or item of equipment, in a manner which meets the specified requirements ».

A product will be included in different system(s)/sub-system(s)/equipment(s). The supplier of the product can
only make general and theoretical hypotheses about the operational environment. It is necessary for end users
and suppliers of system(s)/sub-system(s)/equipment(s) to verify that the hypotheses about the environment of the
products used are consistent with the real environment.

A product can be composed of several different components. From the safety point of view, some of the
components do not influence safety, while others contribute to it. The latter are called
safety critical components.

Before the beginning of the assessment of a product, it is necessary to give two precise descriptions:

a description of the product in its totality,
a description of the safety requirements of the product. This description is called the safety requirements
specification.

The basic definition of the safety requirements specification is taken from [CEN03] (see Annex A, sub-chapter
A.2, page 52). This definition has been refined by the addition of new items.

For the purpose of assessing a product, the safety requirements specification should contain the following items
(a minimal data-structure sketch follows the list):

the safety functional requirements (the safety functions that the product is required to carry out) (comes
from [CEN03]),
the safety requirements: a safety integrity level (SIL) and/or a numerical safety target that has been derived
from a higher level. A digital architecture must be considered within the context of the overall railway
system. This safety requirement comes from an allocation of safety given at a higher level (in general the
system level) and apportioned to the different sub-levels (the sub-system and equipment levels and finally the
architecture level, which is a sub-level of the equipment level) (comes from [CEN03] but modified for the
ACRuDA project),
the applicable standards and rules (new item, not defined in [CEN03]),
the type of application considered (Interlocking, ATP, ATC, etc.) (new item, not defined in [CEN03]),
a description of the environment of the product (new item, not defined in [CEN03]).
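
As an illustration, the items above can be collected into a simple record. This is a minimal sketch with hypothetical field names, not a structure defined by [CEN03]:

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class SafetyRequirementsSpecification:
        """Hypothetical record of the items listed above."""
        safety_functional_requirements: List[str]  # safety functions the product must carry out
        sil: Optional[int]                         # safety integrity level, 0..4
        numerical_safety_target: Optional[float]   # e.g. a tolerable hazard rate per hour
        applicable_standards_and_rules: List[str]
        application_type: str                      # e.g. "Interlocking", "ATP", "ATC"
        environment_description: str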


Figure 1 shows the process to obtain the definition of the product and the safety requirements specification:

Figure 1: Product and Safety requirements specification

[CEN01], [CEN02] and [CEN03] define safety integrity and the safety integrity levels. Safety integrity
is the probability of a safety critical system satisfactorily performing the required safety functions under all the
stated conditions within a stated period of time. The Safety Integrity Level (SIL) of a safety requirements
specification is one of the possible discrete levels for specifying the safety integrity requirements of the safety
functions to be allocated to the safety related products/systems. Safety integrity level 4 is the highest level of
safety integrity and safety integrity level 0 the lowest. The ACRuDA project only considered SIL 4 products.
In this document, the processes, procedures and criteria are built for SIL 4. For SILs lower than 4, the requirements
will be lower, but the basic procedures will still be applicable.
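
As a minimal sketch of how a numerical safety target relates to a SIL, the following assumes the tolerable hazard rate (THR) bands commonly associated with the CENELEC railway SILs (e.g. SIL 4 for 10^-9 <= THR < 10^-8 per hour); these bands are an assumption of the sketch, not a quotation from [CEN03]:

    def sil_from_thr(thr_per_hour: float) -> int:
        """Return the SIL band containing the given tolerable hazard rate.

        The bands follow the correspondence commonly associated with the
        CENELEC railway standards; the authoritative allocation is given
        by the standards themselves, not by this sketch.
        """
        if thr_per_hour < 1e-8:
            return 4  # 1e-9 <= THR < 1e-8; stricter targets are at least SIL 4
        if thr_per_hour < 1e-7:
            return 3
        if thr_per_hour < 1e-6:
            return 2
        if thr_per_hour < 1e-5:
            return 1
        return 0

    # Example of apportionment from a higher level: a system-level target
    # of 1e-9/h shared equally over five independent contributing
    # equipments gives each a target of 2e-10/h, still in the SIL 4 band.
    assert sil_from_thr(1e-9 / 5) == 4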

3.2. Assessment process


The main objective of the assessment process is to prepare an impartial report giving enough information on the
safety of the product to demonstrate that the product meets the safety requirements specification and to support
the certification of the product.

The assessment process is described in Figure 2. Confidence in the safety is obtained by examination of the
product, by examination of all its representations and by an understanding of its development process. An
assessment is composed of a preliminary analysis of the product, observations, theoretical studies and
experimentation. The assessment of the product is based on the ACRuDA safety case (see also chapter 4.4.3).

The main organisations involved in the assessment process are: the sponsor, the supplier, the notified body and
the assessor(s). The supplier builds and sells the product and is also responsible for establishing the safety
case of the product. The sponsor requests and finances the assessment of the product. In most cases, the end user
of the product or the supplier of the product will be the sponsor. The notified body assesses the product, with
internal and/or external assessors, on the basis of the safety case. At the end of the assessment, the notified body
decides, on the basis of the assessment results, whether or not to certify the product. In this document the term
« Assessors » covers both the internal and external assessors.


Figure 2: Assessment process

There are two main concepts for the assessment: the assessment of conformity and the assessment of
effectiveness.

The assessment of conformity consists of the assessment of the implementation of a product. It assesses the
degree to which a given real product corresponds to its description.

The assessment of effectiveness consists of the assessment of the safety functions, mechanisms and measures
chosen to satisfy the safety requirements specification of a product. The assessment focuses on the pertinence
of the functions, mechanisms and measures, their cohesion (whether all the functions, mechanisms and measures
operate well together), the consequences of risks of hazardous failure (from the construction and operational
points of view), the ease of use (installation, adaptation to the real environment, maintenance...), the safety plan
(tools, methods, organisation of the supplier), and the search for remaining hazardous scenarios.

The apportionment of work between the conformity and effectiveness assessment of a product depends upon the
definition and description of the safety requirements specification and the product. If the safety requirements
specification and the product are well described, defined and detailed, then the main part of the assessment of
the product will involve conformity, and a smaller part will involve effectiveness.

The assessors need well defined criteria and procedures to assess a product. The safety of products is
achieved, for the most part, by procedural measures such as organisation controls, staff controls, staff training,
and so on. It is, however, also necessary to perform technical controls. The assessment criteria must therefore
be both procedure and product based.

These criteria will cover the aspects of effectiveness and conformity. The assessment criteria must list the
assessment inputs necessary to achieve the assessment. The supplier is responsible for the delivery of the
assessment inputs and must verify that all the assessment inputs given to the assessors satisfy the requirements
on content and structure, and that they provide the proof of safety or help to establish it.

The ideal situation is to begin the assessment at the same time as the development of the product. It is necessary
for the assessors to have a good understanding of the product, but they must stay independent and must
not influence the development of the product (sources [ITS01], [ITS02]).

It is the responsibility of the supplier to prove the safety of its product with all the proofs of safety contained in
the safety case. The role of the assessors is to assess the methods used by the manufacturer for testing and safety
analyses, and the results of those tests and safety analyses. When necessary, the assessors can perform further
independent tests and safety analyses to verify the supplier's results or to complete some proof element,
demonstration or safety analysis. It is recommended that such tests and safety analyses be performed using methods
and tools different from those used by the supplier.

3.3. Certification process


Figure 3 shows the ideal set of processes for development, assessment and certification. Each circle is a process
and each arrow represents data flow between two processes. This is an ideal scheme, but in most cases the
processes overlap. The assessment process should run in parallel with the development process.

Besides the supplier and the assessor, there are four more bodies involved in the certification process: the European
Union (EU), the authority of an EU member state, the accreditation body and the notified body. The European
Union is the body that accepts or refuses the notified bodies appointed by the authority of a member state. The
European Union is also in charge of defining a common European policy for assessment and certification. The
accreditation body is responsible for issuing accreditation certificates ([EN01] to [EN07] standards) to the
assessors and to the notified bodies. This accreditation is not explicitly required in the EU directives [DIN01],
[DIN02] and [DIN03]. If a body becomes a notified body, this could be regarded as an accreditation in its own right.


Figure 3: Certification process

A product is developed during a development process. A product is assessed against well defined criteria during
an assessment process. A certification presumes the validity of the assessment and confirms that an assessment
was performed. The next stage is the integration of the product in a system or the installation of the product in
its real environment. This is the approval stage. The approval process is the means to confirm that the use of a
product in a particular environment for a particular goal is acceptable. In safe operational exploitation, a
product is used according to specified procedures. Changes made to the environment of the product can
require modification of the product, which can influence the development process (sources [ITS01],
[ITS02]). A new assessment will be necessary if changes are made to the product (it can be only a simplified «
delta assessment »). The definition of approval procedures and operational exploitation procedures is
outside the scope of the ACRuDA project.

Certification is based on the results of the assessment. It is a formal declaration that confirms the results of the
assessment and the fact that the assessment criteria were correctly applied and satisfied. All the bodies involved
in the certification and assessment process must be recognised and competent for their role in the certification
and assessment process. The certificate is issued by the notified body (sources [ITS01], [ITS02]).

The certification process requires the impartiality and independence of the assessment. The liability of the
different bodies involved in the assessment and certification process is not clearly defined, but will depend on
national rules and laws.

The assessment criteria will be most effective if all countries have a common, harmonised certification
process and apply the same set of criteria. The notified body in each country is responsible for the application
of the criteria. It should be noted that these criteria are not only defined for assessors and notified bodies. They
are also useful for suppliers: it would be very inefficient for a supplier to develop and build a product
without considering the assessment criteria, as this can lead to an unsuccessful assessment.

Summary of the main concepts developed in this chapter, which form the basis of the document:

digital architectures are considered as generic products,
the boundaries, the environment, the conditions of use and the safety requirements specification
of digital architectures shall be clearly and precisely defined,
these generic products can be assessed and certified, but only SIL 4 will be considered in this
document,
the assessment and certification process shall be based on the new European directives, which
define the requirements for notified bodies,
the assessment and certification process shall be based on harmonised criteria and a well defined
assessment methodology,
the definition of approval procedures is outside the scope of this document.

4. PRODUCT DEVELOPMENT PROCESS


4.1. Introduction
Figure 2 shows the assessment process. The first step is to define the product and the safety requirements
specification. Afterwards, the supplier develops the product and supplies enough information to the assessors.
To assess a product, it is necessary to know how the product is specified, designed, built and tested. To achieve
this, the supplier must develop according to a life cycle with well defined phases and clearly linked inputs and
outputs for each phase. Some assessment criteria address this development life cycle. The assessment of
conformity indeed deals with the development process and the development environment.

Some development life cycles are proposed in the European railway standards [CEN01], [CEN02] and [CEN03]
and in the [IEC01] standard.

[CEN01] and [CEN03] define a system life cycle in terms of:


concept
system definition and application conditions
risk analysis
system requirements
apportionment of system requirements
design and implementation
manufacture
installation
system validation (including safety acceptance and commissioning)
system acceptance
operation and maintenance
decommissioning and disposal

and two other phases:

performance monitoring
modification and retrofit

[CEN02] (page 53) defines a life cycle for the development of software and, to a limited extent, a system life cycle:

software requirement specification
software architecture and design
software module design
software code
software module testing
software integration
software validation

and other phases:

software maintenance
software planning

[IEC01] defines:

safety requirements specification (safety functions requirements specification + safety integrity
requirements specification)
validation planning
design and development
integration
operation and maintenance procedures
safety validation

The life cycles proposed in [CEN01] and [CEN03] are essentially for the development of systems and do not
give details on the life cycle of electronic equipment. The life cycle proposed in [CEN02] is essentially for the
development of software, with only a few aspects of the development of systems (hardware/software integration).
The life cycle proposed in [IEC01] is specifically focused on safety and does not give details on the different
phases of the development life cycle. The life cycles proposed in chapters 4.3.1 and 4.4.1 are a synthesis
of the life cycles described in all the above mentioned standards, with some additional contributions from the
ACRuDA project members.

4.2. Life cycle of a product


The following description is derived from [AQC01].

4.2.1. Presentation

A digital architecture can be considered as a product with a life cycle composed of several stages:


paper study
model
prototype
pre-production
production

The definition of the product is refined at each stage, as necessary. The life cycle for the product should be defined
as clearly as possible.

Each stage (paper study, model, prototype, pre-production, production, etc.) should be clearly defined. These
stages may vary in duration and generally overlap in practice.

A new architecture may be an upgrade of an existing architecture. Consequently, the life cycle must give
consideration to existing designs.

4.2.2. Paper study


At the paper study stage, the feasibility and viability of the concepts considered for the product are examined. The
essential output of this stage is documentation.

4.2.3. Model

The aim of this stage is to define a model. To achieve an accurate model, it is important to perform a user
requirements analysis. A preliminary specification and a high level design must be produced to determine what
functionality must be realised by the model, in order to demonstrate the feasibility of the product. It is essential to
produce documentation (not necessarily complete) and to formally control all changes, as there are a
great number of iterations during the design, realisation and test phases to refine the model.

4.2.4. Prototype
In this stage, the specification and design depend not only on the user requirements analysis but also on the
results of the analyses of the model. The prototype takes into account (unlike the model) all the functionality of
the definitive product, but in a provisional form. Testing of the prototype in real or simulated environments is a
very important stage in the development of the product. All procedures and results must be formally documented.

4.2.5. Pre Production


This stage begins with a rigorous user requirements analysis and the development of a specification. The operation
(real or simulated) of the prototype is likely to have modified the initial requirements. The design phase
includes industrialisation, with documents on the definition, manufacture and control of the product in
production. Tests of the pre-production product are necessary to verify the product before the start of
production delivery.

4.2.6. Production

In this stage, the specification and design activities are essentially complete, with changes made only to correct
errors or when end users' demands lead to modifications or evolution of the product. Testing is less rigorous than
for the pre-production product and relies on sampling, depending on the quantity of the product and the quality
assurance level requirements.

4.3. Development Life Cycle and documentation


4.3.1. Development Life Cycle

Chapter 4.2 defines the stages of the life of a product. In each of these stages, the supplier should follow a
defined development life cycle.


The development life cycle is normally considered as a « V ». In the descending branch of the V there is a
succession of analysis and study activities; in the ascending branch, the activities are all testing activities.
Figure 4 shows a development life cycle:


Figure 4: Development life cycle
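
As an illustration of this « V » structure, the sketch below pairs each descending (specification and design) phase with the ascending (testing) phase that verifies its output. The pairing is inferred from the abbreviations listed below (SRS/SVR, SAD/SITR, etc.) and is an assumption of the sketch; Figure 4 remains the authoritative description:

    # Hypothetical pairing of the descending (design) and ascending (testing)
    # branches of the V cycle of Figure 4. Phase names are taken from the
    # abbreviations listed below; the pairing itself is an assumption.
    V_MODEL_PAIRS = {
        "System Requirement Specification (SRS)":      "System Validation (SVR)",
        "System Architecture Description (SAD)":       "System Integration Testing (SITR)",
        "Sub System Requirement Specification (SSRS)": "Sub System Validation (SSVR)",
        "Sub System Architecture Description (SSAD)":  "Sub System Integration Testing (SSITR)",
    }

    for design_phase, testing_phase in V_MODEL_PAIRS.items():
        print(f"{design_phase} is verified by {testing_phase}")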

General abbreviation:


- Req. Spec. : Requirement Specification

- Arch : Architecture

Abbreviation:

- SDP : System Development Plan

- SQAP : System Quality Assurance Plan

- SCMP : System Configuration Management Plan

- SVP : System Validation Plan

- SveP : System Verification Plan

- SQP : System Qualification Plan

- SMP : System Maintenance Plan

- SUM : System User Manual

- SQR : System Qualification Record

- SMR : System Maintenance Record

- SRS : System Requirement Specification

- SRTS : System Requirement Test Specification

- SVR : System Validation Report

- SAD : System Architecture Description

- SDTS : System Design Test Specification

- SITR : System Integration Test Report

- SSRS : Sub System Requirement Specification

- SSRTS : Sub System Requirement Test Specification

- SSVR : Sub System Validation Report


- SSAD : Sub System Architecture Description

- SSDTS : Sub System Design Test Specification

- SSITR : Sub System Integration Test Report

As described in chapter 3.1, the [CEN03] standard (chapter 6.5.1, page 45) considers three categories of
programmable electronic systems: generic products (independent of application), generic applications (for a
class of applications) and specific applications (for one particular installation).

Figure 5 shows the different categories and the different aspects defined in [CEN03]:

Figure 5: Relationships between categories and aspects

The scope of the ACRuDA project is the generic product. It is necessary to define a precise
development life cycle for the product. A product generally comprises hardware and software. There are two
life cycles, one for the software and one for the hardware, with interactions between the two (for
example, software may be modified because of problems with the design of the hardware).

An equipment life cycle is shown in Figure 6. The software life cycle is taken from [CEN02].


Figure 6: Equipment/Generic Product life cycle

Abbreviation:

- Hw : Hardware

- Sw : Software

- SSAD : Sub System Architecture Description

- ERS : Equipment Requirement Specification

- ERVR : Equipment Requirement Verification Report

- ERTS : Equipment Requirement Test Specification

- EVR : Equipment Validation Report

- EAD : Equipment Architecture Description

- EDTS : Equipment Design Test Specification

- EITR : Equipment Integration Test Report

- Sw. Req. Spec. : Software Requirement Specification

- SwRS : Software Requirement Specification

- SwRVR : Software Requirement Verification Report

- Sw. Arch. & Design : Software Architecture & Design

- SwAS : Software Architecture Specification

- SwDS : Software Design Specification

- SwADVR : Software Architect. & Design Verification Report

- SwDTS : Software Design Test Specification

- Sw. Mod. Design : Software Module Design

- SwMDS : Software Detailed Design Specification

- SwMVR : Software Module Verification Report


- SwMTS : Software Module Test Specification

- Sw. Code. : Software source Code

- SwSC&D : Software Source Code and Documentation

- SwSCVR : Software Source Code Verification Report

- Sw. Mod. Testing : Software Module Testing

- SwMTR : Software Module Test Report

- Sw. Integ. : Software Integration

- SwITR : Software Integration Test Report

- Sw. Valid. : Software Validation

- SwVR : Software Validation Report

- Hw. Req. Spec. : Hardware Requirement Specification

- HwRS : Hardware Requirement Specification

- HwRVR : Hardware Requirement Verification Report

- HwRTS : Hardware Requirement Test Specification

- Hw. Design : Hardware Design

- HwDS : Hardware Design Specification

- HwDVR : Hardware Design Verification Report

- HwMTS : Hardware Module Test Specification

- Hw. Manuf : Hardware Manufacture

- HwM : Hardware Module

- HwMVR : Hardware Module Verification Report


- Hw. Test : Hardware Test

- HwTR : Hardware Test Report

- Hw. Valid. : Hardware Validation

- HwVR : Hardware Validation Report

Other phases:

- Sw. Planning : Software planning

- SwDP : Software Development Plan

- SwQAP : Software Quality Assurance Plan

- SwCMP : Software Configuration Management Plan

- SwVP : Software Validation Plan

- SwITP : Software Integration Test Plan

- SwVeP : Software Verification Plan

- SwMP : Software Maintenance Plan

- DPP : Data Preparation Plan

- DTP : Data Test Plan

- Sw. Maintenance : Software Maintenance

- SwMR : Software Maintenance Record

- SwCR : Software Change Record

- Hw. Planning : Hardware planning

- HwDP : Hardware Development Plan

- HwQAP : Hardware Quality Assurance Plan

- HwCMP : Hardware Configuration Management Plan

- HwVP : Hardware Validation Plan

- HwITP : Hardware Integration Test Plan

- HwVeP : Hardware Verification Plan

- HwMP : Hardware Maintenance Plan


- Hw. Maintenance : Hardware Maintenance

- HwMR : Hardware Maintenance Record

- HwCR : Hardware Change Record

- EUM : Equipment Users Manuals

4.4. Safety life cycle and documentation


4.4.1. Safety life cycle
Figure 7 shows the safety life cycle and the associated documentation. This life cycle covers the system aspects,
the sub-system aspects and the equipment aspects. As for the development life cycle, the ACRuDA project deals
only with the equipment aspects (generic product category). All these phases involve a team independent of the
development team.


Figure 7: Safety life cycle


Abbreviation:

- Hw : Hardware

- Sw : Software

- FPS : Functional Performance Specification

- PRAS : Preliminary Risk Analysis Specification

- SsaP : System Safety Plan

- SsaRS : System Safety Requirement Specification

- SsaR : System Safety Report

- SSSaP : Sub System Safety Plan

- SSSaRS : Sub System Safety Requirement Specification

- SSSaR : Sub System Safety Report

- EsaP : Equipment Safety Plan

- EsaRS : Equipment Safety Requirement Specification

- EsaR : Equipment Safety Report

- HwSaP : Hardware Safety Plan

- SwSaP : Software Safety Plan

- HwSaRS : Hardware Safety Requirement Specification

- SwSaRS : Software Safety Requirement Specification

- HwSaAR : Hardware Safety Analysis Report

- SwSaAR : Software Safety Analysis Report

4.4.2. Safety Plan

The safety plan defines the safety requirements for each phase of the system life cycle and covers the whole life
cycle. The safety plan is produced by the supplier.


The Safety Plan is defined as follows in [CEN01]: « a documented set of time scheduled activities, resources and
events serving to implement the organisational structure, responsibilities, procedures, activities, capabilities and
resources that together ensure that an item will satisfy given safety requirements relevant to a given contract or
project ».

In addition, [CEN03] specifies the development of the plan thus: "A Safety Plan shall be drawn up at the start of
the life cycle. This plan shall identify the safety management structure, safety-related activities and approval
mile-stones throughout the life-cycle and shall include the requirements for review of the Safety Plan at
appropriate intervals. The Safety Plan shall be updated and reviewed if subsequent alterations or additions are
made to the original system/sub-system/equipment. If any such change is made, the effect on safety shall be
assessed, starting at the appropriate point in the life-cycle." The Safety Plan is part of the requirements for
the demonstration of the evidence of safety management.

The Safety Plan identifies the safety management structure, safety-related activities and approval milestones
throughout the life-cycle. It also includes the requirements for review of the Safety Plan at appropriate intervals.
The Safety Plan shall define all management and technical activities during the whole safety life-cycle which are
necessary to ensure that the safety-related products and external risk reduction facilities achieve and maintain
the required functional safety. The Safety Plan for a product, such as a digital architecture, is not mentioned
explicitly in the relevant standards, but it is possible to derive a Product Safety Plan from the System Safety Plan
provided by the standards.

A more precise description of the Safety Plan structure is presented in ANNEX I.

The Safety Plan also outlines the methods and techniques to be used to develop, validate and verify the safety
digital architecture against the safety requirements. ANNEX VIII summarises the techniques and tools
prescribed by relevant standards and proposed in various current practices.

The Safety Plan shall be implemented and functional safety internal audits initiated as required. All those
involved in implementing the Safety Plan shall be informed of responsibilities assigned to them under the plan.

4.4.3. Safety case

4.4.3.1. Introduction

This chapter describes the issues to be considered in developing the Safety Case for a product, and addresses the
basic structure for the Safety Case.

Safety is defined as freedom from unacceptable risk of harm, where risk is defined as the probable rate of
occurrence of a hazard causing harm times the degree of severity of the harm (a worked illustration follows the
list below). In general, the aim of a Safety Case is to provide the evidence to demonstrate that in all aspects of
specified operation, the risk of harm is reduced to the lowest practicable level. This is achieved by demonstrating
that:

all the safety requirements have been identified,
the safety requirements are achieved,
the remaining risk of harm is acceptable or tolerable.
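
Written as a formula, the definition of risk given above is (a minimal sketch; the symbols are introduced here for clarity and are not taken from the standards):

    \[
      R \;=\; f \times S
    \]

where \(R\) is the risk, \(f\) the probable rate of occurrence of the hazard causing harm (e.g. occurrences per hour) and \(S\) the degree of severity of the harm. Under this definition, halving either the occurrence rate or the severity halves the risk, so risk reduction measures may target prevention, mitigation, or both.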

The supplier is ideally placed to develop the Safety Case and in practice it is generally his responsibility.

4.4.3.2. General

There are fundamental safety requirements which apply to all safety critical products. These are described
below.

Any credible fault within any part of a safety critical product can be a potential source of a hazard. In
developing a product, the bounds of its operation must be defined in terms of the application(s) in which the
supplier envisages the product will be utilised. While the supplier may not know the exact nature of the hazards
relating to any particular application, he should use his expertise of existing applications and his railway
experience to identify a set of hazards common to the applications envisaged. The supplier must eliminate or
mitigate these wherever possible. The completeness of any such analyses undertaken by the supplier must be
demonstrated in the safety case.

All credible failures in a product must be assumed to be hazardous. Consequently, every effort must be
made to eliminate the hazard or to mitigate its potential consequences to within acceptable and practicable
limits, or evidence must be provided that the failures are not hazardous.

A safety critical product must perform vital operations including fault detection in a reliable and timely manner.

The safety case of a product should provide evidence that it meets the above requirements. It should be
demonstrated that the SIL of the product is commensurate with that of the applications for which it has been
developed. Alternatively, this demonstration may be performed for specific functions rather than the product as
a whole.

The Safety Case must demonstrate to the satisfaction of the notified body, the operator, the user of the product
and the suppliers themselves, that these requirements have been satisfied.

It may be preferable to develop and issue the safety case in stages. These stages may be linked to life cycle
phases, delivery milestones or design reviews. The advantage of this approach is that it gives visibility of the
development of the safety case to the stakeholders in the product, and thus provides opportunity for early
comment.

The proposed contents of the safety case at each stage, and an outline safety case describing the proposed
format and contents should be issued at a very early stage in the project.

4.4.3.3. Purpose of the safety case

In general, a Safety Case must provide a clear, comprehensive, convincing and defensible argument, supported
by calculation, procedure and management, that a product will provide a framework within which a
design may be realised and implemented, and will be acceptably safe throughout its life.

The safety case for a product will assist in the safe implementation of an application, using the product, and will
provide a major contribution to the application safety case.

In turn, the safety case developed for the application will be built into the safety cases of higher level systems,
and finally, into the overall railway system Safety Case. This will contain, or reference, Hazard/Error Logs,
design decisions, a history of development and use, and concluding safety arguments for all the components of
the system, including the safety critical products. In total this will provide the safety argument for the railway.

As well as the aspects of product integration in the application, maintenance must be described in detail, defining
what maintenance is required and by whom, where it will be performed, the training and spares required, and any
maintenance aids needed, to ensure that the level of safety of the product is upheld.

A requirement of [CEN03] is that the operational safety of a railway system must be monitored to ensure that the
safety features of the design remain valid during use. This should include the monitoring of safety-related
performance and its comparison with the performance predicted in the design analyses, the assessment of failures
and accidents to establish actual or potential failure trends, and the identification from these of the changes
required to improve safety performance. This is clearly the responsibility of the end user. However, it is this
requirement which makes it essential that the owner or duty-holder responsible for the operation of the railway
has access to the sources of any information needed to enable the assessment of safety performance and to
propose and implement any changes needed. Clearly, therefore, support may be required from any of the
suppliers, and it is essential that the appropriate commercial agreements are in place to enable the necessary
modifications to be made and any revisions to the safety arguments to be developed.

For details of the requirements for the Safety Case for software programmed into, or used directly for
developing, safety critical elements of the architecture, refer to the CASCADE Generalised Assessment Method
[GAM01]. [GAM01] should also be used to assess any software development tools or utilities which contribute
directly to the integrity level of any aspect of the architecture.

4.4.3.4. Content of the product safety case


The Safety Case shall include sections on the following:

Contents
High-level documentation
Safety Management Documentation
Safety Objective
Description of the Architecture
Functional Elements
Safety Studies
Ownership & Responsibilities- Operation, Evolution, Modification
User Support
Conclusion - Safety Argument Summary

A more detailed description of the safety case is given in ANNEX II.

4.5. Quality Assurance provisions


Quality actions need to be carried out in order, ultimately, to obtain certification of a product. These shall be
described, by the supplier, in a Quality Assurance Plan. The Quality Assurance Plan describes the practices,
means and sequence of activities related to quality. The plan shall apply to all activities and to the whole
product life cycle. The approach for the Quality Assurance Plan is specified in EN 29001.

The manufacturer shall be fully compliant with the requirements of EN 29001.

4.5.1. Relationship of Quality Assurance to other Plans

The Quality Assurance Plan is applicable to all the supplier's activities. These are described in the management
plans that have been prepared to carry out these activities. The measures defined by the Quality Assurance Plan
are, consequently, complementary to the management plans and can be detailed in all other plans derived from
them.

The Quality Assurance program is written based on the approach defined in the management plans. These plans
are based on the development process described in chapter 4.3 of this document.

EN 29001 requires that the supplier sets up an organisation capable of assuring the design and production
quality of the product.

To achieve this, the supplier will ensure that his plans are consistent with the safety and reliability requirements
of the product and that these plans are correctly implemented.

Figure 8 shows the relation between the various plans:


Figure 8: Structure of Plans

The figure makes a distinction between:

1. The organisation documentation: this should describe the organisational structure under which the product
has been developed, and defines the roles, responsibilities and reporting structure of personnel involved in
management, development, safety, maintainability, reliability and user support. A named organisation
chart showing the persons working on the project shall be kept available.
2. The development plan: this plan defines the development of the product in terms of development stages
and establishes the criteria for demonstration and acceptance that each stage has been completed. This is
a "living" document which must reflect not only the original plan, but also the actual life cycle of the
development that took place.
3. The Quality plan: this is the basic quality guideline for the project. The Quality Plan defines the quality
requirements that will be applied to all aspects of the work in developing the product. This will include the
Quality Management System (QMS) used on the project together with a traceable path to enable
demonstration that the QMS is in accordance with EN 29001 and related standards. Separate Software and
Hardware Quality Plans could be prepared in addition to the Quality Plan. The producers (internal entities
or subcontractors) use these plans to prepare their own plans and include specific provisions into their
activities.
4. The Safety plan: the Safety Plan defines the way in which the safety of the product is to be assured.
Details of the techniques and processes to be used, at what stage they are to be used and how the findings of
each analysis are to be addressed as part of the development process shall be described.
5. Configuration management plans: this document describes the principles and processes by which the
product under consideration has been controlled throughout its life cycle from conception through
detailed specification, design, build and validation. The Configuration Management Plan should detail the
timing of design reviews, configuration baselines, status reporting mechanisms and procedures for
deviation from prescribed processes. This document is vital since traceability is a central requirement of a
Safety Case and rigorous traceability is only truly achievable when all evidence is from configured
sources.
6. Verification & Validation (V&V) Plan: this document defines the objective and approach to be adopted
in demonstrating that the requirements described in the requirement specification documentation and the
safety criteria drawn from the various safety analyses have been met. Procedures for, and evidence of,
traceability of specific requirements to particular test elements of V&V activities shall be briefly
described, and appropriate detailed documentation should be referenced.


5. ASSESSMENT AND CERTIFICATION PROCESS

5.1. Introduction
This chapter contains a proposed assessment and certification process which is derived from the European
directives [DIN01], [DIN02] and [DIN03].

The main objective of an assessment is to gain confidence in the fact that the product meets its safety
requirements specification. The final objective is to obtain the certification of the product. The certification is
based on the result of the assessment.

The main concepts for an assessment are repeatability, reproducibility and impartiality (sources [ITS01],
[ITS02]). An assessment is repeatable if the repetition of the assessment of the same product, with the same
safety requirements specification, evaluated by the same assessor, gives the same overall verdict as the first
assessment (sources [ITS01], [ITS02]). An assessment is reproducible if the repetition of the assessment of the
same product, with the same safety requirements specification, evaluated by another assessor, gives the same
overall verdict as the first assessment (sources [ITS01], [ITS02]). An assessment is impartial if it is free from
bias towards achieving any particular result (sources [ITS01], [ITS02]).

An assessment can be concurrent or consecutive. If the assessment is done after the development of the
product, the assessment is consecutive. If the assessment is done in parallel with the development of the
product, the assessment is concurrent. For a consecutive assessment, the totality of the assessment inputs
(documentation, hardware, software, etc.) is available at the beginning of the assessment. For a concurrent
assessment, the assessment inputs become available as the development progresses (sources [ITS01], [ITS02]).
The recommended form is the concurrent assessment, as it allows problems to be resolved at an early stage.
Where existing products are utilised, a consecutive assessment is the only solution. For new designs, all
assessments should be concurrent (sources [ITS01], [ITS02]).

The assessment criteria describe the elements of proof necessary for the assessment. The information on the
product must be as clear and complete as possible, and the assessors should have a good understanding of the
product, particularly the safety requirements specification. An assessment is based on preliminary analysis,
observation, theory, and experimentation.

The preliminary analysis of the product is very important. The assessment requires inputs from the supplier.
These inputs should include a description of the product and a set of requirement specifications which provide a
sufficient level of detail to undertake the assessment.

A product is composed of components and each of these may be composed of lower level components. It is
essential that the requirements at product level are broken down to all components and that each low level
requirement is traceable to the top level requirement. Safety requirements should be clearly identified separately
from other requirements.
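
As an illustration of this breakdown, the following sketch (purely hypothetical names and structure, not part of
the ACRuDA scheme itself) shows one way a supplier could record requirements so that every low-level
requirement is traceable to a top-level one and safety requirements are flagged separately:

    # Hypothetical sketch of a requirements traceability check: every low-level
    # requirement must trace to a known parent, and safety requirements are
    # identified separately from other requirements.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Requirement:
        req_id: str
        text: str
        parent_id: Optional[str]   # None only for top-level (product) requirements
        is_safety: bool            # safety requirements are flagged explicitly

    def dangling(requirements):
        """Return the ids of requirements whose parent link is missing from the set."""
        known = {r.req_id for r in requirements}
        return [r.req_id for r in requirements
                if r.parent_id is not None and r.parent_id not in known]

    reqs = [
        Requirement("PROD-1", "The product shall detect single faults", None, True),
        Requirement("HW-3", "The comparator shall signal divergence", "PROD-1", True),
        Requirement("SW-7", "The voter shall mask one faulty channel", "PROD-9", True),
    ]
    print(dangling(reqs))   # ['SW-7']: a low-level requirement with no top-level trace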

An assessment is successful if all the assessment criteria are satisfied. For each criterion assessed, there are
three possible outcomes:

success: evidence was presented that satisfied the criterion,
fail: evidence was presented that showed the criterion has not been satisfied,
to be confirmed: there was insufficient evidence, time or resources to state whether the criterion passed or
failed.

By the end of the assessment, every criterion classed with a " to be confirmed " verdict must be given a
" success " or " fail " verdict.

5.2. Description of the assessment and certification process


Figure 9 shows the overall assessment and certification schema with the bodies involved, the roles of the bodies
and the data exchanged between the bodies.


Figure 9: Assessment and certification schema

5.3. Role of the different bodies


5.3.1. Role of the European Union (EU)

The EU gives an identification number to each notified body and publishes all the information on the notified
body in the Official Journal of the European Communities. The EU keeps the official list of notified bodies
(sources [DIN01], [DIN02]).
The EU is in charge of defining the directives (for example: the Interoperability Directive for the European
Railway High Speed System) and the standards (for example: [CEN01], [CEN02], [CEN03], [EN01] to [EN07]).

The European Union can be helped by a committee made up of representatives of each country and chaired by
the European Union. If it exists, this committee is in charge of defining the European policy for assessment and
certification in Europe. It defines the procedures, methods, rules and criteria for assessment and certification for
all the countries (sources [DIN01], [DIN02]).

5.3.2. Role of the authority of a EU member state

The authority appoints notified bodies. In order to maintain the notification, the authority must regularly
monitor the competence and the independence of the notified bodies (sources [DIN01], [DIN02]).

Details of the notified bodies are published in the Official Journal of the European Communities (sources
[DIN01], [DIN02]).

The authority establishes the national accreditation body (the accreditation body must function in conformance
with the EN 45ACC and EN 45ASS project standards).

5.3.3. Role of the accreditation body of a member state


The accreditation body accredits the notified body and the assessors to [EN01]. The accreditation body
regularly monitors that the notified body complies with the [EN01] standard.

5.3.4. Role of the notified body


The notified body provides a third party assessment. It is competent to fulfil the tasks related to the assessment
of conformity planned in the European directives (sources [DIN01], [DIN02]).

The notified body is governed by the laws of the member state which notifies it (source [DIN01]).

The notified body can be accredited ([EN01] standard). Otherwise, the member state must justify to the
European Union the competence (for these standards) of the notified body (source [DIN02]).

The employees of the notified body must be independent (remuneration not proportional to the number of
assessments performed) and are bound by professional secrecy (industrial property) (source [DIN01]).

The notified body must take out civil liability insurance (source [DIN01]).

The notified body leads the assessment and defines the procedures and the means to fulfil the assessment of
the product. The notified body may use external assessors to perform some parts of the assessment work.

Upon satisfactory assessment results, the notified body will issue a certification report and a certificate for the
product.

The notified body is responsible for the assessment technical report (source [DIN01]).

The notified body (sources [DIN01], [DIN02]) maintains and publishes the lists of:

assessment requests (past and in progress),
certificates refused,
certificates delivered.

5.3.5. Role of the sponsor

The sponsor is the person or body who requests the assessment to show that the product meets the safety
requirements specification. The sponsor orders the assessment (he requests and finances it). The sponsor is
responsible for the appropriate utilisation of the certification report and the certificate (sources [ITS03],
[ITS04]).

The sponsor may choose a notified body from any European country (sources [DIN01]).

5.3.6. Role of the assessors

The assessors are bodies of proven integrity, independence (notably financial) and technical competence. They
must be independent of the design, manufacture, marketing, maintenance, and operation of the product (sources
[DIN01], [DIN02]).

The employees of the assessors must be independent (i.e. remuneration must not be based on the number of
certificates issued) and are subject to confidentiality requirements (sources [DIN01], [DIN02]).

The assessors must be accredited to [EN01].

5.3.7. Role of the supplier


The supplier designs, develops and validates the product according to current European Standards (quality,
safety, development, organisation, documentation: [CEN01], [CEN02], [CEN03], etc.), and directives.

The supplier demonstrates the safety of the product according to the defined level (in the case of ACRuDA, the
level is SIL4). The evidence and argument that the product is safe is contained in the safety case.

5.4. Phases of the assessment and certification process


The assessment and certification process is divided into three main phases: preparation, assessment, and
certification.

5.4.1. Phase I: preparation of the assessment

The sponsor must give a precise description of the product and define the safety requirements specification and
the boundaries of the assessment. When the sponsor is not the supplier, the participation of the supplier is
strongly recommended.

The sponsor contacts a notified body and requests an assessment of the product (sources [ITS03], [ITS04]).

The notified body consults the assessors (internal and/or external assessors). A preliminary analysis of the
product description and the safety requirements specification must be done by the notified body and the
assessors, to check the completeness and the coherence of the two descriptions. On the basis of the product
description and safety requirements specification, the notified body and the assessors make an assessment
feasibility study. The possible results of this study are:

a) The result of the feasibility study is positive: the notified body and the assessors define an assessment plan.
The structure of the assessment plan is given in ANNEX III. This plan contains a detailed assessment work plan
with a detailed assessment inputs delivery plan. The assessment plan is submitted for approval to the sponsor
and the supplier. The sponsor and the supplier (if the sponsor is not the supplier) prepare an assessment case
which contains:


the description and boundaries of the product,
the safety requirements specification,
the assessment work plan,
the assessment inputs delivery plan,
the confidentiality clauses (delivery of assessment inputs, etc.),
the identification of the notified body and the identification of the assessors.

The assessment case is transmitted to the notified body. The notified body can make remarks and comments on
the assessment case (in particular, it can focus on points which could cause problems for the delivery of the
certificate).

The notified body draws up a contract for the assessment (based on the assessment case). The contract is signed
by the notified body and the sponsor. The assessment is registered by the notified body in its list of assessments
in progress.

The assessment can be divided into one or more work packages. An assessor can be in charge of all the work
packages. If external assessors (subcontractors) are required, contracts are signed between the notified body and
the subcontractors before the beginning of the assessment. These contracts must define the assessment work, the
assessment inputs delivery plan, and the financial terms.

The assessment plan can be annexed to the contract. It can have a draft status. This plan can be modified during
the assessment (because of changes to documentation or tools, additional information, etc.).

b) The result of the feasibility study is negative: the notified body asks the sponsor and the supplier to make the
necessary changes.

5.4.2. Phase II: assessment


The notified body and the assessors proceed with the assessment. The supplier produces the assessment inputs
according to the assessment plan. The sponsor is responsible for the delivery of the assessment inputs. The
assessment inputs are delivered according to a delivery protocol and the confidentiality clauses established
during the preparation phase of the assessment. Confidentiality clauses are applied. If the supplier wants to
protect its industrial knowledge, the assessment inputs go directly from the supplier to the notified body, but the
sponsor is informed of the delivery of the assessment inputs.

The notified body (with internal and/or external assessors) executes the assessment tasks defined in the
assessment plan. For each task, reports are regularly produced to control the progress of the assessment tasks.
The assessment inputs are analysed in conformance with the criteria. The assessors must verify the product
according to the criteria. During the assessment, the notified body and the assessors must seek to understand the
product which they investigate. In particular, they must seek to understand whether the product can behave in
any way contrary to the safety requirements specification. In other words, they seek to discover potential risks
of hazardous failure. It is recommended that the assessors build models and carry out experiments and
observations on the product. All the assessment criteria must be verified.

During the assessment, problems can be detected: non-delivery of an assessment input, refusal to correct a
design, etc. These problems are called anomalies and they must be given particular treatment. They must be
analysed to determine their consequences on the assessment. It is necessary to take these anomalies into
account as early as possible in the assessment process. A special procedure to treat the anomalies must be
defined. All the bodies involved in the assessment must be informed. In general, there are two categories of
anomalies: minor or major. The minor anomalies can be easily corrected and are registered in the assessment
report. The major anomalies are recorded in anomaly reports. An anomaly report can contain:

the activities during which the anomaly was detected,
the description of the anomaly,

Each anomaly report is examined and validated by the notified body and the assessors. These reports are sent to
the sponsor and to the supplier. All the anomalies must be treated. The supplier can dispute an anomaly, but he
must have good arguments to convince the notified body and the assessors. The sponsor can also dispute an
anomaly if he judges that the treatment of the anomaly can have important consequences on the assessment. In
all cases, all the anomalies must be resolved by the end of the assessment, and the decision to close an anomaly
must be taken with the agreement of all the partners involved in the assessment.
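
A minimal data model for this anomaly treatment could look as follows (a sketch with hypothetical names, not
a prescribed format; it only captures the minor/major distinction and the closure-by-agreement rule described
above):

    # Hypothetical sketch: minor anomalies are registered in the assessment
    # report, major anomalies get their own anomaly report, and closure requires
    # the agreement of all the partners involved in the assessment.
    from dataclasses import dataclass, field

    @dataclass
    class Anomaly:
        detected_during: str        # activity during which the anomaly was detected
        description: str
        major: bool
        agreed_closed_by: set = field(default_factory=set)

        def try_close(self, partners, agreeing):
            """An anomaly may only be closed when every partner has agreed."""
            self.agreed_closed_by = set(agreeing)
            return set(partners) <= self.agreed_closed_by

    a = Anomaly("documentation review", "assessment input not delivered", major=True)
    print(a.try_close({"notified body", "assessors", "sponsor", "supplier"},
                      {"notified body", "assessors"}))   # False: anomaly stays open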

At the end of each assessment task, an assessment report, which contains the results of the assessment work, is
written. Each assessor writes an assessment report for the notified body. Each assessment report is examined
and internally approved by the notified body. The assessment reports can contain confidential information; their
diffusion must be controlled. These assessment reports are sent for approval to the sponsor and the supplier.
Confidentiality clauses are applied. If the supplier wants to protect its industrial knowledge, the assessment
reports are only sent to the supplier, but the sponsor is informed of the delivery of the reports. The assessment
reports contain:

the objectives of the task,
the assessment inputs,
the criteria applied,
the description of the work achieved by the assessors,
the techniques, methods and tools used for the assessment work,
the results of the assessment,
a proposal for the verdict of the assessment,
a description of the anomalies detected,
the time and resources used for the task.

Where the work packages are apportioned among several assessors, it is necessary to make a synthesis of
all the assessment reports in a final technical assessment report. The notified body is responsible for the
constitution of the technical assessment report. The technical assessment report contains a description of all the
work achieved by the assessors, all the results of the assessment, and the conclusion of the assessment.
Sometimes, restrictions on the use of the product can be mentioned in the report. All the references of the
technical assessment report must be available.

When the technical assessment report is internally approved by the notified body, it is sent, for approval, to the
supplier. The technical assessment report can contain confidential information; its diffusion must be controlled.
Confidentiality clauses are applied. If the supplier wants to protect its industrial knowledge, the technical
assessment report is only sent to the supplier, but the sponsor is informed of the delivery of the report. The
diffusion of the technical assessment report to entities not involved in the assessment is subject to the approval
of the sponsor and the supplier.

An evaluation rating can be regarded as the assignment of a pass/fail verdict. A pass verdict is assigned if all
criteria are satisfied and, in particular, no risk of hazardous failure has been found. A fail verdict is assigned if
any error is found and is not corrected, or if a risk of hazardous failure is found.

5.4.3. Phase III: certification

This is the final step. The technical assessment report, containing the results of the assessment, is approved by
the notified body and the supplier. Confidentiality clauses will be applied. On the basis of the technical
assessment report, the notified body summarises the conclusion in the certification report. The certification
report is a public document. When an end user uses the product, he may only have access to this document.
Consequently, the certification report must contain all the observations, measures and recommendations
necessary for the safe use of the product.

When the certification report is approved by the notified body and the sponsor, the notified body delivers a
certificate. The certificate is signed by the sponsor and by the notified body. The product is added to the list of
certified products. The certification report and the certificate are published in official national and European
documents.

The structure of the technical assessment report is given in ANNEX IV, the structure of the certification report
in ANNEX V and the structure of the certificate in ANNEX VI.

5.5. Re-use of assessment results and products composed of assessed/certified components


An assessment is a complex process which demands a lot of time, resources and money, depending upon the
complexity of the product and the integrity level. The certification report and the certificate for a product are
valid only for the assessed version and configurations of the product. To limit the amount of work to be done
for an assessment, and when it is possible, it can be worthwhile to re-use assessment results from a previous
product assessment. There are two cases where the re-use of assessment results can be applied (sources
[ITS02], [ITS04]):

a new version of a product,
a new product which uses assessed/certified components, tools, methods, principles, techniques,
theoretical studies, etc. (all of these will be called assessed/certified components in this chapter).

The way of doing the assessment is different in the two cases.

In the first case, it is a new version of the product. The supplier must identify, by a clear and precise analysis,
and must describe in a report, the modifications and the consequences of these modifications on the safety of
the product. The supplier or the sponsor must submit this report to the notified body. The notified body analyses
the report and decides whether it is necessary to re-assess the product or not. A re-assessment is identical to the
assessment described in chapters 5.1 to 5.8, except that some results of the previous assessment of the
product can be re-used. If the re-assessment is successful, the certification report is written and the certificate is
delivered by the notified body. If no re-assessment is needed, the notified body extends the certificate to the
new version of the product (sources [ITS02], [ITS04]).

In the second case, the situation is different because the product is new but it uses some assessed/certified
components. The sponsor asks for the assessment of this new product. This assessment is considered as a totally
new assessment as described in chapters 5.1 to 5.8. As said before, some parts of the product have been
assessed/certified in a previous assessment. It is possible to re-use some results of these previous assessments
during the assessment of this new product. The assessors must carefully verify that the assessed/certified
components are correctly used in the composed product (in particular: verification of the interfaces, and
verification that the use of the assessed/certified components cannot degrade the safety of the composed
product). If the assessment is successful, the certification report is written and the certificate is delivered by the
notified body (sources [ITS02], [ITS04]).

In all cases, all the partners involved in the assessment and certification process must be careful in the re-use of
previous assessment results or with products composed of assessed/certified components.

5.6. Capitalisation of assessment work


During an assessment, the assessors can meet difficulties in the application of methods, techniques and tools. It
is important to record these difficulties and their solutions in a document. The objective is to improve and
facilitate future assessments. The capitalisation is focused on two subjects: the assessment methods and the
development methods. The progress in the assessment and development methods is recorded in a capitalisation
report. This report is the property of the notified body.

This report must cover all the methods, techniques and tools used during the assessment, and the lessons
and benefits found by the assessors.

This report must also record the opinion and judgement of the assessors on the methods, tools and
techniques used by the supplier for the development of the product.

This report can contain some confidential information but some of the results (evolution of the criteria, new
methods, etc.) can be published to the overall community of the assessors. The objective is to improve the global
quality of assessment in the European Community.

5.7. Certification report and Certificate


The certificate attests that the assessment was achieved correctly, with impartiality and competence, in
accordance with the criteria, procedures and schema.


The certificate is valid for the assessed version and configuration of the product. The safety of the product may
reasonably be assumed for the correct use of the product in accordance with the recommendations for use
contained in the certification report.

The certification report and the certificate are the property of the notified body. The reproduction and
publication of the two documents are authorised only if they are reproduced in full.

The notified body can withdraw the certificate (for example if it is discovered that the data supplied during the
assessment were not exact).

A certification report structure is proposed in ANNEX V and the certificate structure is proposed in ANNEX VI
of this document.

5.8. Assessment inputs


The assessment inputs are all the data necessary to achieve the assessment (hardware, software, documentation,
tools, standards, etc.). Figure 10 shows the assessment inputs needed to achieve an assessment: harmonised
criteria (the basis of the assessment); regulations, rules, laws and standards; the product and the safety
requirements specification; and a set of tools and methods.

Figure 10: Assessment inputs

The assessors are not concerned with the relationship between sponsor and supplier. It is recommended to
define, before the beginning of the assessment, a complete list of the assessment inputs with the dates of
delivery. The following points should be defined:

the medium and the format of the assessment inputs (computer medium, tape, paper, etc.),
the program for the delivery of the assessment inputs,
the number of copies of each assessment input to be delivered,
the policy for provisional assessment inputs,
the development environment,
access to the development site.

During the assessment, the assessors will have access to confidential information (industrial protection). All the
bodies involved in the assessment process must have the assurance that all the information will stay confidential.
This will influence many aspects of the assessment process (reception, management, storage, and return of the
assessment inputs).


5.9. Essential Quality Requirements for the assessment activities


5.9.1. The normative context for assessment requirements
The normative texts for the assessment activities on vital architectures are few:

- EN 29001 series
- [EN01] to [EN07] standards
- International standards
- National standards
- Standards on safety analysis methods

5.9.2. Quality System of the Assessor

The assessment phase is an essential phase in the life cycle of the product (see the [CEN01] life cycle). The aim
of the assessment is to give the final users confidence in the safe use of the product. In this context, a quality
system for the assessment activity is highly recommended. This quality system has to be described in a Quality
Handbook.

Hereafter are the ACRuDA recommendations for the content of the Quality Handbook of the assessor.

5.9.3. Quality Handbook


The quality handbook can be written following the EN 29001 series, but some specific issues for the assessment
activities are presented here.

5.9.3.1. Background referential of the assessment

The referential used by the assessor includes the best practices and the applicable norms, standards, and
regulations. The domain of railway safety architectures is subject to numerous regulations and norms.
The assessor should therefore clearly specify, in the quality handbook, the norms that will be checked in his
assessment.

The quality handbook should explain how the assessor makes sure that he always has the up-to-date referential.

The risk of having a bad referential is to obtain a certification which will not be recognised by the other
European partners.

5.9.3.2. The quality requirements on methods for assessment

The quality handbook should explain which provisions are made to identify and qualify the validity domain of
the methods used for assessment.

These methods should be coherent with the referential of the assessor. The assessor should make sure that
these methods have been previously tested in safety applications. The assessor should use methods that have
been defined by national or international standards.

The "training" chapter of the handbook should explain how the assessor is able to use the methods.

It is recommended to use methods which lead to objective results as far as possible. This is the best assurance of
objective and reproducible results. It is recommended to have a wide panel of methods available for the
assessment, so that the verification is strengthened by a diversification of the studies and points of view taken by
the assessor.

The risk of misusing the methods is to obtain an ineffective evaluation.

5.9.3.3. The quality requirements on tools for assessment


For the tools used for measurement and test, it is recommended to apply the requirements of [EN01] on
equipment.

Qualification procedures should be described to qualify the tools of the assessor.

The automation of the tests and analyses is recommended as far as possible, to guarantee the reproducibility of
the evaluation and to limit the human error factor.

The risk of not applying the quality requirements on tools is to obtain bad measurements.

5.9.3.4. The safety audit

A procedure for the safety audit activity should be referenced in the quality handbook. Some quality
requirements for audit traceability must be ensured: an audit plan should be written, and a list of the documents
reviewed should be produced.

5.9.3.5. The Configuration management

The Quality handbook should explain the configuration management applied to:

- internal documentation: quality procedures, referential, tools
- assessment reports, anomalies
- documentation of the developer

The general policy for the configuration management of the product under assessment (versions of the software,
versions of the components of the architectures) should be presented too.
This is a very important requirement for the assessor, because the risks are major:

to assess a version of the product which is not the version under operation,
to have gaps in the evaluation of the development process,
to lose the traceability of the assessment.

Furthermore, in railways the life cycle of a product can last 30 years. This requires a configuration system that
keeps the safety documentation (including the assessment reports) valid throughout this period. The quality
handbook should state whether the assessor provides a service such as recording the different assessment
reports made on a product.

5.9.3.6. The assessment reports and anomalies

The assessment activity will produce assessment reports and anomalies. We suggest referring to the [EN01]
requirements on reports. The closure of safety anomalies should be subject to strict conditions. The
anomalies that remain open should be presented in the final report and should then imply restrictions on the
certification or use of the architecture.

The risk of bad traceability of the reports and anomalies is that important points and restrictions of the
certification may be forgotten.

5.9.4. Human issues

5.9.4.1. Competence and knowledge of the assessor

The legitimacy of the assessor is mainly founded on this characteristic. This issue should therefore be specially
detailed in the Quality Handbook. The competence and the knowledge of the assessors should be closely related
to the technology, processes, and methods used by the developer in railways. The "training" chapter should
explain how the competencies and knowledge are kept up to date.

The use of experts in the assessment team should be organised too. Their activity should be subject to an
assessment project review with the rest of the team, and their independence towards the methods and product
evaluated should be demonstrated. Furthermore, it is good to keep some experts out of the assessment process
as a potential resource in case of conflict.

The risk of a lack of competence is to be unable to perform the investigations correctly or to judge the
technical criteria. The risk of a lack of knowledge in the specialised domains is to overlook a problem, or to
refuse a new technology even if it could give some improvement in the safety process.

5.9.4.2. Organisation of the assessment team

The responsibility of the assessors for their results is important, and it is recommended to have an organisation
that deals with this specificity:

double control teams,
regular internal reviews,
nominative organisation,
description of the signature process and responsibilities of each member,
a project oriented team.

The risk of a bad organisation is to lose time and money, and to put unnecessary stress on the members of the
assessment team.

5.9.4.3. Independence of judgement

The legitimacy of the assessor is mainly founded on this characteristic. It is recommended to satisfy the
requirements of impartiality of [EN01] and the Interoperability Directive.
The risks of a lack of independence of judgement are to: hide some anomalies, reject the product of a
competitor, or accept light justification of the process of the developer.

5.9.4.4. Confidentiality of the developer's innovations

A minimum set of procedures in the quality system should explain the protection measures taken to reduce the
vulnerability of the information given by the sponsor and produced by the assessor. The level of security
achieved should be defined.

5.9.4.5. Publication of the results of the assessment

The assessor should define a procedure to make sure that the information its reports contain, which is needed
by the notified body and the final user, is not altered or changed.

5.9.4.6. Subcontractors of the notified body

The subcontractor should comply with the same quality requirements as the notified body itself. Furthermore,
an acceptance procedure should be defined for receiving the work of the subcontractor.

5.9.4.7. Environment organisation

The relationship between the assessor and other partners should be presented in the Quality Handbook.

5.10. General concepts for assessment and certification of software.


Software is always embedded in a more complex environment, and the interaction between the software and its
environment determines the quality of the software product. Software is not subject to wear and tear, so it will
not deteriorate in the course of time. Therefore, the specification of the software and the processes leading up to
the generation of code will be considered in great detail, whereas the actual performance of the code will play a
less important role.


5.10.1. Software design

Software assessment and certification is basically assessment and certification of design. Therefore, the methods
and tools that are used in the design process must be assessed in order to determine if they lead to the desired
quality. Here, quality encompasses error avoidance, error correction and error tolerance.

Error avoidance is clearly an activity that must be performed during the design process. It can be facilitated by
the use of recognised design and control principles. Error correction is used here to mean an activity performed
by the software (at run-time) to correct recognisable errors before they have any effect. Error tolerance is the
ability of the software to function correctly even if certain boundary conditions are not fulfilled.
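
A classic run-time error correction mechanism of this kind is majority voting over redundant channels; the
sketch below (an illustration of one possible technique, not one mandated by this document) masks a single
recognisable error before it has any effect:

    # Minimal illustration of run-time error correction: 2-out-of-3 majority
    # voting masks one faulty channel before its result can propagate.
    from collections import Counter

    def vote(a, b, c):
        value, count = Counter([a, b, c]).most_common(1)[0]
        if count < 2:
            raise RuntimeError("no majority: fall back to the safe state")
        return value

    print(vote(42, 42, 17))   # 42: the single erroneous channel is out-voted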

5.10.2. Algorithms and formal methods


Assessment and certification of software is thus also assessment and certification of algorithms. Not only the
algorithms for performing the specified functions, but also the error correction algorithms must be assessed. This
allows for the use of very formal proofs, provided a formal description of the algorithm is possible. Since
software is not subject to wear, such proofs can be exceptionally generic.

5.10.3. Verification and validation

Verification is the process of demonstrating that the software truly fulfils the specified requirements; validation
is the process of demonstrating that the requirements were correct. For hierarchically structured software,
validation of requirements at a lower level consists of demonstrating that they correspond to at least part of a
requirement at the next higher level. Then, only the top level requirements must be validated against the safety
and reliability requirements of the encompassing system.

If all requirements at all lower levels can be validated against requirements at a higher level, then verifying the
requirements at the bottom will very often also constitute a verification of the higher levels.

This must of course be confirmed, and such confirmation is part of the assessment and certification
process. But when that can be done, the subsequent validation of the top level requirements becomes the
remainder of the assessment and certification process, and that is where the encompassing hardware and its
operational context must be considered.
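
The hierarchical argument can be pictured as a simple recursive check (hypothetical structure, for illustration
only): a requirement is covered if it is verified directly, or if all of its children, which were validated against it,
are themselves covered.

    # Sketch of the hierarchical V&V argument: verification evidence at the
    # bottom of a validated requirements hierarchy propagates to the top level.
    from dataclasses import dataclass, field

    @dataclass
    class Req:
        req_id: str
        verified: bool = False      # direct verification evidence at this level
        children: list = field(default_factory=list)

    def covered(req):
        if req.verified:
            return True
        return bool(req.children) and all(covered(c) for c in req.children)

    top = Req("TOP-1", children=[
        Req("SUB-1", verified=True),
        Req("SUB-2", children=[Req("SUB-2.1", verified=True)]),
    ])
    print(covered(top))   # True: verifying the bottom covers the top level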

5.10.4. Interfaces

At the beginning of this section it was pointed out that software is always embedded in a more complex
environment and that the interaction between the software and its environment determines the quality of the
software product. This interaction is defined through the interfaces between the software and its environment.
Thus, the correct definition and implementation of the interfaces must be confirmed. Confirming the correct
definition of the interfaces is a part of the validation task; confirming their correct implementation is part of
the verification task.

6. HIGH LEVEL ASSESSMENT CRITERIA


6.1. Introduction
Harmonised criteria are necessary to obtain the mutual recognition of the assessment results of safety critical
digital architectures. The criteria presented hereafter are a first set of basic criteria to guide assessment. They
can help an assessor to organise his assessment plan and assessment activities. Assessment of any safety
critical digital architecture shall be based on a declared set of criteria derived by applying the basic
criteria of this document. This set of basic criteria has to be used with great care. The assessment shall begin
with the formulation of an assessment plan, detailing the scope of the assessment and its basis, such as the safety
and reliability targets, integrity level and norms. The assessment plan and detailed criteria shall be produced by
the assessor.


The supplier of the architecture shall provide all the evidence required to demonstrate compliance with the
detailed criteria. The evidence should be organised in accordance with a Safety Case Structure (see chapter
4.4.3), and shall be readily available for audit, walk-through, review and detailed examination.

The assessment should be based on the judgement resulting from the verification of a set of criteria on the
following properties:

the adequacy of the safety requirements specification of the product,
the effectiveness of the solution proposed by the supplier,
the conformity of the solution implemented by the supplier.

The assessment should be focused, mainly, on the conformity and effectiveness of the techniques and measures.

The assessment of Effectiveness is a judgement about the abstraction of the product, the safety principles or the
method. Effectiveness characterises how effective the techniques and measures are in identifying and
eliminating or mitigating the hazards.

Effectiveness includes:

suitability of the safety principles and mechanisms, standards, safety functions, methods and tools used
to construct a safe product,
cohesion of the set of safety principles and safety critical functions,
cohesion of the set of tasks described in the safety plan.

Conformity deals with the completeness of the implementation and the accuracy of the representation of
the specification. Conformity characterises how accurately the techniques and measures are implemented and
how well they are explained in the supplied documentation.

Conformity can be established through answering the following questions:

does the implementation contain all the requirements that are stated in the specification?
does the implementation not contain more than the requirements stated in the specification?
is the implementation an accurate representation of the specification?
are the methods planned in the safety plan used and applied?
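
The first three questions amount to a two-way comparison between the specification and the implementation,
which the following fragment illustrates (requirement identifiers are hypothetical):

    # Illustrative two-way conformity check: the implementation must contain all
    # the requirements stated in the specification, and nothing more.
    spec_reqs = {"R1", "R2", "R3"}      # requirements stated in the specification
    impl_reqs = {"R1", "R2", "R4"}      # requirements traced to the implementation

    missing = spec_reqs - impl_reqs     # stated but not implemented
    extra = impl_reqs - spec_reqs       # implemented but never specified

    print(missing, extra)               # {'R3'} {'R4'}: both are non-conformities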

Where necessary, a single criterion can be broken down into several lower-level criteria in order to make the
assessment. Each criterion shall be applied according to current best practice and experience. In addition, the
assessor shall assess the design of the architecture independently, for example by carrying out, as a minimum, an
independent hazard analysis.

The assessor shall provide an assessment report which should summarise the approach, findings, criteria and
provide detailed reasons why the elements of the architecture passed or failed the criteria.

The assessor shall make judgements about the evidence presented by the supplier. The assessment criteria must
cover all the techniques and procedures used by the developer to achieve the integrity of the architecture.
According to [CEN01], the means to achieve railway dependability relates to controlling the factors which
influence dependability throughout the life of the system. Effective control requires the establishment of
mechanisms and procedures to defend against sources of error being introduced during the realisation and
maintenance of the system. Such defences need to take account of both random and systematic failure.

The means used to achieve dependability are based on the concept of taking precautions to minimise the
possibility of failure occurring as a result of an error during the realisation phases. Precaution is a combination
of:

prevention: concerned with lowering the probability of the impairment,
protection: concerned with lowering the severity of the consequences of the impairment.

The strategy to achieve dependability for the system, including the use of prevention and/or protection means,
shall be justified in the safety case.


By defining a management process based on a life cycle, [CEN01] elaborates the means to ensure dependability
through minimising the effects of errors and by controlling the factors influencing railway dependability (see
section 6 of the standard). Methods, tools and techniques appropriate to engineering dependable systems are
presented in other CENELEC standards, [CEN02] and [CEN01] and in IEC standard [IEC01].

A general overview of the manner in which methods and techniques are used to support dependability
engineering and management is given in [CEN01] (chapter 5.3.7, figure 12).

The following documentary evidence is a condition (required by the standards [CEN03] and [IEC01]) for the
safety acceptance of the safety-related electronic system.

Evidence of quality management (Quality Management Report)
Evidence of safety management (Safety Management Report)
Evidence of functional and technical safety (Technical Safety Report)

These documents, included in a structured safety justification document (Safety Case), have to present the
methods and techniques used to develop the system and ensure its safety. Examples of methods and techniques
to be used for the validation of safety digital architectures are given in the standards.
This chapter contains the basic criteria which are expected to provide the infrastructure and rules for
understanding an assessment of a safety critical digital architecture. These assessment criteria have been derived
from the State of the Art and the standards [CEN01], [CEN02], [CEN03] and [IEC01]. They provide the basis
for the development of detailed criteria for the individual architectures.

6.2. Assessment activities


The following assessment activities may be used to assess the processes and products of the architecture.

6.2.1. Referential Examination


This activity aims to identify the safety requirements that have to be taken into account. The goal is to list the
documentation, the standards and other information, such as assessment and certification reports, that are
needed for the assessment of the computer architecture.

6.2.2. Safety Management Assessment

The safety management assessment will examine all technical and management activities, during the whole
architecture life-cycle, to ensure that the safety-related systems and external risk reduction facilities allow the
required functional safety to be attained.

Competence of staff, departments or other groups involved in safety management activities will also form part
of this assessment.

6.2.3. Quality Management Assessment


The quality management system shall be examined systematically to assess compliance with EN 29001 and/or
other applicable quality procedures specified by the developer.

6.2.4. Organisation Assessment

The aim of this element of the assessment is to assess the capability of the organisation to administer safety
procedures. It has to ensure that the responsibilities of the staff and their competence and training requirements
are clearly specified, and that this process is being implemented.

6.2.5. Development Phase Assessment


The assessment of the development phase shall examine all development activities in order to verify that they
are undertaken in conformance with the relevant standards and with the required safety integrity level.


6.2.6. Safety Plan Assessment

The structure and the content of the safety plan shall be examined to check whether they conform to the
ACRuDA Safety Plan Requirements.

6.2.7. Safety Case Assessment

The safety case structure and content shall be checked for conformance to the ACRuDA Safety Case Structure
(see chapter 4.4.3).

6.3. Structure of the criteria


These criteria are primarily written for the assessors of safety critical digital architectures, but they are also
expected to provide valuable guidelines for developers and users.

The basic criteria are presented in the form of process and product properties. They state the requirements for
the life-cycle processes and products and each requirement is devised to address a specific set of hazards. These
requirements will, in general, be satisfied by using the relevant techniques and measures recommended by the
safety critical standards. Therefore, with each set of basic criteria, a table of relevant techniques and measures is
attached. These tables also identify the objects to which they apply.

The effectiveness with which these techniques control the hazards or cover the faults depends on various
factors, such as their frequency of application, the accuracy of fault detection and the timeliness of fault
negation. The effectiveness therefore depends on the degree of sophistication used to implement the measure.
For example, the effectiveness of a coding technique could very well depend on the size of the code word: the
bigger the size, the more effective the implementation.
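
As a rough numerical illustration of that last point (assuming a uniform random-error model, which this
document does not itself state), the probability that a corrupted word passes an r-bit check undetected falls off
as 2^-r:

    # Rough illustration: with r check bits and uniformly random corruption, the
    # residual probability of an undetected error is about 2**-r, so a larger
    # code word gives a more effective implementation.
    for r in (8, 16, 32):
        print(f"{r:2d} check bits -> undetected error probability ~ {2.0 ** -r:.2e}")
    # 8 -> 3.91e-03, 16 -> 1.53e-05, 32 -> 2.33e-10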

6.4. Process/Project criteria


A set of well-structured life cycle plans is essential for ensuring the product integrity of digital architectures.
Suitable life cycle frameworks have been described in standards, [CEN03] and [IEC01], and in the ACRuDA
Safety Case Structure (see chapter 4.4.3).

6.4.1. Contents

The structure and activities of the product life cycle shall provide a systematic approach to the development,
production, support and maintenance of the product.

The activities required to identify, control or eliminate hazards at each life cycle phase shall be described. A
structured plan of these activities constitutes the safety plan.

6.4.2. Basic Criteria

1. The life cycle plans shall cover all development phases and describe the processes used to ensure the
quality, reliability, maintainability and integrity of the products.
2. The life cycle plans shall identify all the resources to be used and their essential qualities, such as the
designers and their competence, tools and their reliability, validation teams and their independence.
3. Each development phase shall precisely specify:
the inputs, information and resources required to carry out the activity
summary of the processes
its successful termination conditions
its outputs
4. All development activities shall be covered by an appropriate safety plan. The safety plan should have the
approval of the supplier's project manager and the supplier's internal independent safety organisation.
5. Personnel and responsibilities
Personnel in the safety organisation should be suitably qualified,
The designer/implementor shall be independent of the verifier and validator,
Personnel in the safety organisation shall have the competence to undertake this work.
6. In particular, the safety plan shall be compliant with the ACRuDA Safety Plan Requirements (see chapter
4.4.2). All life cycle activities shall be audited for compliance with the safety plan.
7. The supplier shall produce the safety case of the architecture which shall be compliant with the
requirements of the ACRuDA Safety Case Structure (see chapter 4.4.3).
8. The life cycle plan should cover the following plans:
Configuration and management plan,
Development plan,
Quality plan,
Maintenance plan,
Manufacturing plan,
Safety plan,
Verification and Validation Plan.
9. Techniques and measures equivalent to those recommended by the standards [CEN03] and [IEC01]
shall be used to prepare and implement a life cycle plan. Variance from the recommendations of these
standards should be fully described and justified. The techniques and measures from the standards which
are applicable to life-cycle processes and products are listed in Table 1. This list is not complete. It is
important to note that a SIL 4 architecture will need a combination of techniques and measures with which
to provide a very high degree of protection against any identified hazard. For instance, a SIL 4
architecture safety planning procedure would need to use a combination of several techniques and
measures, such as checklists, audits and document reviews.

Activity/Object           Technique/Measure                                    Reference

Safety Planning and       Checklist; audit; document inspection and            [CEN03]-E.1
Quality Assurance         walk-through of the specifications; review after
                          safety plan change; review after each life cycle
                          phase

Resource Qualities        Repetitive and regular staff training; completely    [CEN03]-E.3
                          independent designers, validators and assessor;
                          highly qualified and experienced staff

Project Management        Definitions of tasks and responsibilities;           [IEC01]-I 7
                          consistency procedures after modification;
                          configuration management; monitoring and control
                          of project status

Documentation             Guidelines for organisation scheme; checklists for   [IEC01]-I 7 & G.2
                          contents, unique form, on-line documents,            [CEN03]-E.8
                          formalised revision, interface description,
                          environment studies, modification and maintenance
                          procedures, manufacturing and application
                          documents

Construction              Use of well tried and approved components            [IEC01]-I 7

Testing and Validation    Black-box testing from cause-consequence diagram,    [IEC01]-I 7
                          boundary value cases; statistical testing with a
                          realistic distribution of input data and assumed
                          failure modes; proven by use

Manufacturing             Requirements, precautions and audit plan of the      [CEN03]-E.9
                          actual manufacturing process by the safety
                          organisation

Installation and          Requirements, precautions and audit plan of the      [CEN03]-E.9
Maintenance               actual installation and maintenance processes by
                          the safety organisation

Table 1: Life Cycle - Techniques and Measures

6.5. Requirements
A systematic approach to requirements development is essential to ensure high integrity.

6.5.1. Contents
The functionality and integrity, reliability and performance requirements of the architecture should be specified.

The desired features, such as protection against some specific component faults or target time for fault
detection, are regarded as an integral part of requirements.

6.5.2. Basic criteria

1. The approach for establishing and identifying detailed requirements shall be described. This should
include procedures for:
derivation of the safety target from the top level architecture,
decomposition of system level requirements to lower level requirements specifications,
verifying the consistency of requirements,
tracing their relationships to the design objects, components and code,
providing traceability to the test specifications to enable testing of each requirement to validate the
system,
mechanisms to ensure that changes to requirements are fully controlled.
2. The safety critical digital architecture shall meet the SIL 4 requirements as prescribed by the standards
[CEN03] and [IEC01].
3. The safety requirements shall consider human factors issues: reliability of the operators, information
overloading, operator errors, etc.
4. Techniques and measures equivalent to those recommended by the standards [CEN03] and [IEC01]
shall be used for the requirements. Any variance from the recommendations of these standards should be
fully described and justified. The techniques and measures from the standards which are applicable to
life-cycle processes and products are listed in Table 2.
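The traceability called for in criterion 1 can be illustrated with a small sketch. The following Python fragment is a minimal, hypothetical example (the data structure and all names are ours, not taken from the standards) of how each requirement can be linked to design objects and test cases so that untraced or untested requirements are flagged automatically:

    # Minimal requirements traceability sketch (illustrative only; names are hypothetical).
    from dataclasses import dataclass, field

    @dataclass
    class Requirement:
        req_id: str
        text: str
        design_objects: list = field(default_factory=list)  # linked design items
        test_cases: list = field(default_factory=list)      # linked test specifications

    def coverage_report(requirements):
        """Flag requirements lacking a design link or a test link."""
        problems = []
        for r in requirements:
            if not r.design_objects:
                problems.append((r.req_id, "no design object traced"))
            if not r.test_cases:
                problems.append((r.req_id, "no test case traced"))
        return problems

    reqs = [
        Requirement("SR-001", "Detect a single fault within 1 s",
                    design_objects=["CMP-Watchdog"], test_cases=["TC-017"]),
        Requirement("SR-002", "Retain safe state on power loss",
                    design_objects=["PSU-Monitor"]),  # no test case traced yet
    ]
    for req_id, issue in coverage_report(reqs):
        print(f"{req_id}: {issue}")   # prints: SR-002: no test case traced

In practice such links would be held in a requirements management tool; the point of the sketch is only that the relationships named in criterion 1 are explicit data that can be checked mechanically.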

Activity/Object: Technique/Measure (Reference)

Requirements Specification: Separation of safety-related functions from non-safety-related functions; Graphical description; Structured specification; Inspection of the specification; Hazard log; Test specification ([IEC01]-2B; [CEN03]-E.2)

Required Design Features: Protection against operator error; Protection against sabotage; Protection against single faults for discrete components and digital electronics; Physical independence (insulation); Target time for detection and negation of a single fault; Retention of the safe state; Target time for detection and negation of multiple faults; Dynamic fault detection; Program sequence monitoring; Measures against power supply malfunction; Secondary protection against systematic faults ([CEN03]-E.5)

Table 2: Techniques and Measures for Requirements

6.6. Design
Digital architectures are designed to reduce random and systematic credible faults to an acceptable level by
using appropriate techniques and measures.

6.6.1. Contents

The design describes all the elements of the architecture, their interrelationships and interfaces, and their role in
fulfilling the requirements.

The techniques and measures used to achieve the design goal are also explained.

6.6.2. Basic criteria

1. The procedures used to derive the design from the requirements and to verify the design against the
requirements shall be described.
2. The safety critical digital architecture shall provide the following functionality:
    implementation of requirements derived from the mitigation or elimination of hazards identified for
    the range of perceived applications of the architecture,
    execution of application programs,
    collection of inputs and delivery of outputs,
    detection and negation of faults,
    provision of timer and watchdog functions,
    fail-safe inputs and outputs,
    facility to install application programs.
3. The hazardous failure rate of any vital hardware component shall be derived from the overall safety
target; see, for example, the ERTMS specifications, which state 10⁻¹⁰ faults/hour [ERT96]. (A worked
reliability sketch is given after this list.)
4. All credible failure modes for each hardware and software element of the architecture shall be identified.
5. The hardware components shall be able to perform the safety function in the presence of two faults
(source [IEC01] part 2 table 2).
6. Faults shall be detected with on-line, high diagnostic coverage (source: [IEC01] part 2, table 2). A fail-safe
architecture depends very much on the effectiveness of its fault detection measures, yet it may not need any
on-line diagnostics. A fail-operational architecture, however, needs detailed on-line diagnostic coverage to
achieve its integrity and reliability, because without it any recovery mechanism is very difficult to
implement.
7. Undetected hazardous faults shall be detected by the (off-line) proof checks (source [IEC01] part 2 table
2).
8. The architecture shall be designed to minimise the credible faults by using a combination of well tried and
well defined fault avoidance and fault tolerance measures.
9. The design specification shall identify the components and modules of the architecture, and describe their
functional and other characteristics (such as their integrity levels, failure rates, performance). It shall also
describe interfaces, internally and with external equipment.
10. The failure modes of all the following components shall be identified along with the techniques and
measures used to eliminate or mitigate the hazards arising from such failures:
main processor, co-processors and micro-controllers,
watchdog and clock,
I/O cards, data path and field bus,
communication network,
operating system or executive program.
11. A quantitative estimate of the reliability of the overall architecture (for the worst case scenarios) shall be
presented. The process, procedures and standards on which these are based shall form part of this
presentation.
12. The design of the architecture shall ensure that higher integrity level modules are not affected by lower
integrity level modules. Appropriate analyses shall be used to justify this.
13. The design shall ensure that the architecture operates correctly in all foreseeable environmental conditions,
such as EMC, noise and heat. The envelope of environmental conditions and requirements shall be
defined in the requirements specification.
14. All software components of the architecture shall conform to [CEN02] norms and the relevant GAM
principles. The detailed software assessment criteria are given in chapter 6.10.
15. The detailed hardware assessment criteria are given in chapter 6.11.
16. Techniques and measures equivalent to those recommended by the standards [CEN03] and [IEC01]
shall be used for the design. Any variance from the recommendations of these standards should be fully
described and justified. The techniques and measures from the standards which are applicable to life-cycle
processes and products are listed in Table 3.
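As an illustration of criteria 3, 6 and 11, the sketch below estimates the hazardous failure rate of a hypothetical dual-channel architecture with fail-safe comparison, using the common approximation that a hazardous event requires a dangerous undetected fault in both channels within the same detection window. The formula is a standard reliability approximation, and every numerical value is an assumption made for the example, not a figure from [CEN03] or [IEC01]:

    # Illustrative hazardous failure rate estimate for a dual-channel (1oo2) architecture.
    # All parameter values are assumptions for this example, not from the standards.

    lambda_d = 1e-5   # dangerous failure rate per channel [1/h] (assumed)
    dc       = 0.99   # diagnostic coverage of the on-line tests (assumed)
    t_detect = 1.0    # fault detection and negation window [h] (assumed)

    lambda_du = lambda_d * (1.0 - dc)   # dangerous *undetected* rate per channel
    # The system becomes hazardous only if both channels fail dangerously and
    # undetected within the same detection window (either channel may fail first):
    lambda_sys = 2.0 * lambda_du * (lambda_du * t_detect)

    print(f"per-channel undetected dangerous rate: {lambda_du:.1e} /h")
    print(f"approximate system hazardous failure rate: {lambda_sys:.1e} /h")
    print("meets the 1e-10 /h target:", lambda_sys <= 1e-10)

With the assumed numbers the estimate is about 2 x 10⁻¹⁴ per hour, which would satisfy the 10⁻¹⁰ faults/hour target quoted above; the worst-case analysis demanded by criterion 11 must of course justify the input values themselves.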

Activity/Object: Technique/Measure (Reference)

Architecture: Dual digital channels based on composite fail-safety with fail-safe comparison; Single digital channel based on inherent fail-safety; Single digital channel based on reactive fail-safety; Diverse digital channels with fail-safe comparison; Justification of the architecture by quantitative reliability analysis of the hardware ([CEN03]-E.4)

Processing Units: Comparator; Majority voting; Self-test by software (single channel only); Self-test by hardware (single channel only); Coded processing (single channel only); Reciprocal comparison by software ([IEC01]-2B)

Invariable memory ranges: Signature of a double word (16 bit); Block replication ([IEC01]-2B)

Variable memory ranges: Galpat or transparent Galpat test; Abraham test; Double RAM with hardware or software comparison and read/write test ([IEC01]-2B)

I/O units and interfaces: Test pattern; Code protection; Multi-channelled parallel output; Monitored outputs; Input comparison ([IEC01]-2B)

Data paths: Complete hardware redundancy; Inspection using test patterns; Transmission protocol; Transmission redundancy; Information redundancy ([IEC01]-2B)

Power supply: Overvoltage protection with shut-off; Voltage control (secondary); Power-down with shut-off; Graceful degradation ([IEC01]-2B)

Watchdog: Separate time basis and time-window; Combination of temporal and logical monitoring of program sequence; Temporal monitoring with on-line check ([IEC01]-2B)

Clock: Reciprocal comparison in redundant configuration; Dual frequency timer

Communication: Separation of electrical energy from communication lines; Spatial separation of redundant lines; Increase of interference immunity; Antivalent signal transmission

Input and Output cards: Idle current principle; Test pattern; Electrical interlocking; Cross-monitoring of redundant units

Software Components: Techniques and measures recommended by [CEN02] and [IEC01] part 3 ([CEN02]; [IEC01]-3)

Table 3: Techniques and Measures for Design
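Several of the processing-unit techniques in Table 3 can be given a compact illustration. The sketch below shows the idea behind reciprocal comparison by software: each channel computes the result independently, the results are cross-checked, and any disagreement negates the outputs and forces the safe state. It is a schematic of the principle under our own simplifying assumptions, not an implementation prescribed by the standards:

    # Schematic sketch of reciprocal comparison by software (illustrative only).

    class SafeStateReached(Exception):
        """Raised when a channel disagreement forces the safe state."""

    def channel_a(inputs):
        return sum(inputs)            # stand-in for the real vital computation

    def channel_b(inputs):
        return sum(reversed(inputs))  # the same function, independently coded

    def vital_cycle(inputs):
        a = channel_a(inputs)
        b = channel_b(inputs)
        # Each channel cross-checks the other's result before any output is
        # driven; a mismatch negates the outputs (fail-safe reaction).
        if a != b:
            raise SafeStateReached("channel disagreement, outputs negated")
        return a

    print(vital_cycle([1, 2, 3]))     # prints 6 when both channels agree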

6.7. Validation and off-line testing


Validation of architectures against their requirements and dynamic off-line testing of their elements and
assemblies are essential to ensure their integrity.

6.7.1. Contents
A well structured validation and test plan is required. This plan shall describe all the activities, from test
environment set-up and test scenario selection to test execution and analysis of the test results. It shall also
describe the test organisation, the test processes and the test documentation. The test specifications,
acceptance criteria and test results form an essential part of the evidence of safety.

6.7.2. Basic criteria

1. The plan shall define the validation test process by:
    identifying the requirements specification on which the validation test is based,
    identifying the different test phases, including unit, integration and requirements testing,
    specifying the test organisation and its responsibilities,
    describing the testing procedures, e.g. fault injection, statistical testing methods or regression testing,
    identifying the test environments and their required integrity and quality.
2. Procedures to ensure and demonstrate independence of test from the design and integration activities shall
be defined. Such procedures shall describe, explicitly, requirements for independence of groups of
personnel.
3. Each test phase shall be accompanied by a test specification describing the test objectives, test scenarios,
configuration data and acceptance criteria.
4. The SIL 4 test techniques and measures (see Table 2.1), recommended by the standards, shall be used.
5. The test results shall record the frequency of tests, the fault detection success rate, the coverage and the
mean detection times.
6. The test results shall be analysed to give quantitative estimates of the hidden faults and of their effects on
reliability estimates (a minimal sketch of such an estimate is given after this list).
7. The Assessor shall witness a representative sample of tests to ensure that the test procedures have been
correctly implemented.
8. The test specification shall cover all credible failure modes.
9. Techniques and measures equivalent to those recommended by the standards [CEN03] and [IEC01]
shall be used for validation and off-line testing. Any variance from the recommendations of these standards
should be fully described and justified. The techniques and measures from the standards which are
applicable to life-cycle processes and products are listed in Table 4.
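One standard way of turning test results into the quantitative estimates asked for in criterion 6 is the classical zero-failure argument from statistical testing: if N independent test cases drawn from the operational input distribution all pass, an upper confidence bound can be placed on the per-demand failure probability. The sketch below applies the textbook bound p <= 1 - alpha^(1/N); it is offered as an illustration, and the standards do not mandate this particular formula:

    # Zero-failure statistical testing bound (textbook result, illustrative use).
    # If N independent tests drawn from the operational profile all pass, then
    # with confidence (1 - alpha) the per-demand failure probability satisfies
    # p <= 1 - alpha**(1/N).

    def upper_bound_failure_prob(n_tests: int, alpha: float = 0.05) -> float:
        return 1.0 - alpha ** (1.0 / n_tests)

    for n in (1_000, 100_000, 10_000_000):
        print(f"N = {n:>10}: p <= {upper_bound_failure_prob(n):.2e} at 95% confidence")

The bound makes the familiar point that demonstrating the very small failure probabilities associated with SIL 4 by testing alone requires an impractically large number of tests, which is why these criteria also demand analytical evidence.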


Activity/Object: Technique/Measure (Reference)

Hardware: Fault injection

Software: Regression testing

Verification & Validation: Project management; Documentation; Functional testing under environmental conditions; Interference immunity testing; Functional testing at ambient conditions; Checklist; Calculation of failure rates; Static and dynamic analysis; 'Worst case' and failure analysis; Simulation; Statistical testing; Surge immunity testing; Expanded functional testing; 'Worst case' testing; Black-box testing ([CEN03]-E9; [IEC01]-2)

Table 4: Techniques and Measures for Validation & Testing

6.8. Fault and failure analyses


An independent fault and failure analysis of the digital architecture shall show that the architecture has been
thoroughly analysed to ensure that all credible faults are identified, the fault control methods are effective, and
the residual faults are non-hazardous.

6.8.1. Contents

1. The analyses should be carried out as detailed in the safety plan. Their application, procedures and scope
should be explained.
2. The main findings of the analyses should be available for examination.

6.8.2. Basic criteria

1. The analyses shall be planned and performed in a timely manner, so that their findings are effectively used
in the development process.
2. Review and incorporation of the findings of the analyses shall be part of a formal implementation process.
3. The analyses shall identify all credible failure modes and estimate their criticality and frequency of
occurrence.
4. The types of failures considered shall be specified. They shall cover, as far as possible, all static and
intermittent failures, combinations of failure modes, hazardous and safe failures, and latent and undisclosed
failure modes.
5. The results and findings of the analyses shall be integrated in the safety case of the architecture; they shall
form the core of the safety argument and the evidence supporting functional and technical safety.
6. The faults arising from the following sources shall be considered:
    hardware and software and their interactions,
    environmental factors, e.g. EMC,
    network elements and data and field buses,
    operators and operating conditions,
    critical operations, including start-up and close-down.
7. Techniques and measures equivalent to those recommended by the standards [CEN03] and [IEC01]
shall be used for the fault and failure analyses. Any variance from the recommendations of these standards
should be fully described and justified. The techniques and measures from the standards which are
applicable to life-cycle processes and products are listed in Table 5. (A minimal fault tree evaluation sketch
is given after this list.)
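To make the role of the quantitative analyses concrete, the sketch below evaluates a small, hypothetical fault tree with AND/OR gates, using the usual probability rules for independent events. The tree structure and all probabilities are invented for the illustration; they are not taken from [CEN03] or [IEC01]:

    # Minimal fault tree evaluation (hypothetical tree, independent events assumed).

    def p_and(*ps):
        """AND gate: all input events must occur."""
        result = 1.0
        for p in ps:
            result *= p
        return result

    def p_or(*ps):
        """OR gate: at least one input event occurs (exact for independent events)."""
        result = 1.0
        for p in ps:
            result *= (1.0 - p)
        return 1.0 - result

    # Hypothetical basic event probabilities per operating hour:
    p_cpu_fault       = 1e-6
    p_comparator_flt  = 1e-7
    p_power_transient = 1e-5
    p_emc_disturbance = 1e-5

    # Top event: undetected wrong-side output =
    #   (CPU fault AND comparator fault) OR (power transient AND EMC disturbance)
    top = p_or(p_and(p_cpu_fault, p_comparator_flt),
               p_and(p_power_transient, p_emc_disturbance))
    print(f"top event probability: {top:.2e} per hour")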

Activity/Object: Technique/Measure (Reference)

Risk reduction: Preliminary Hazard Analysis (PHA); Fault Tree Analysis (FTA); Failure Mode, Effects and Criticality Analysis (FMECA); Hazard and Operability studies (HAZOP); Cause-consequence diagrams; Markov diagrams; Event trees; Reliability block diagrams; Common cause failure analysis; Historical event analysis; Zonal analysis ([CEN03]-E6)

Measures against systematic hardware failures: Failure detection via the technical process (on-line); Program sequence monitoring; Test by additional hardware; Standard test access port and boundary scan architecture; Code protection; Diverse hardware; Fault detection and diagnosis; Error detection and correcting codes; Safety bag techniques; Diverse programming; Dynamic reconfiguration; Failure assertion programming; Recovery blocks; Backward or forward recovery; Graceful degradation; Artificial intelligence fault correction; Re-try fault recovery mechanisms; Memorising executed cases ([IEC01]-2)

Measures against environmental failures: Measures against voltage breakdown; Measures against voltage variations, overvoltage and low voltage; Separation of electrical energy lines from information lines; Failure detection via the technical process (on-line); Program sequence monitoring; Measures against temperature increase; Spatial separation of redundant lines; Test by additional hardware; Protection code; Increase of interference immunity; Antivalent signal transmission; Diverse hardware; Software architecture ([IEC01]-2)

Table 5: Techniques and Measures for Fault & Failure Analysis

6.9. Operation, Maintenance and Support


A digital architecture is a "product", primarily designed to be used as a platform for delivering safety critical
applications. To fulfil this aim, it must be supported by an adequate operation and maintenance programme.

The objective of the overall operation and maintenance activity is to operate and maintain the safety
architecture, its control system and the total combination of safety-related systems and external risk reduction
facilities in such a way that the designed functional safety is maintained.

6.9.1. Contents
User manual, maintenance manual, upgrade and new-release procedures, FRACAS, user support.

6.9.2. Basic criteria

1. There shall be a maintenance plan for the product, which should include the collection of field data.
Inspections and off-line tests shall be performed at regular intervals.
2. A support service plan shall specify support organisation, its responsibilities and policies. The support
procedure shall explain the mechanisms used for fault reporting and incorporating new releases.
3. Safety operation procedures, inspection and maintenance procedures shall be formulated and defined in a
way that ensures safety and minimises operator errors. All relevant issues from the hazard and safety
analyses shall be addressed.
4. The digital architecture components shall be kept as simple as possible, to reflect the limits of human
capacity. Appropriate metrics may be used to assess the relative complexity of these components.
5. Data-driven systems (including parametric or configurable systems) shall be protected against possible
errors arising from entry of incorrect data.
6. The control devices and means of surveillance shall be such that additional hazards due to operator error
are remote.
7. There shall be a well specified procedure for collecting and analysing the product's history-of-use data
(a minimal sketch of such an analysis is given after this list).
8. Techniques and measures equivalent to those recommended by the standards [CEN03] and [IEC01]
shall be used for operation, maintenance and support. Any variance from the recommendations of these
standards should be fully described and justified. The techniques and measures from the standards which
are applicable to life-cycle processes and products are listed in Table 6.
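Criteria 1 and 7 call for the collection and analysis of field data. The sketch below shows one common use of such data: placing a one-sided upper confidence bound on the failure rate after failure-free service, using the chi-square result, which for zero failures reduces to lambda <= -ln(alpha)/T. This is a standard reliability engineering calculation given as an illustration; neither the formula nor the numbers come from the standards cited here:

    # Upper confidence bound on the failure rate from failure-free field experience.
    # For k = 0 observed failures in T cumulative operating hours, the one-sided
    # (1 - alpha) upper bound is lambda <= -ln(alpha) / T (the k = 0 special case
    # of the chi-square bound). Fleet size and service time are assumed values.

    import math

    def lambda_upper_bound(hours: float, alpha: float = 0.05) -> float:
        return -math.log(alpha) / hours

    fleet_hours = 250 * 8760.0   # e.g. 250 units in service for one year (assumed)
    print(f"lambda <= {lambda_upper_bound(fleet_hours):.2e} failures/hour at 95% confidence")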

Activity/Object: Technique/Measure (Reference)

Operation and maintenance procedures: Project management; Documentation; User and maintenance friendliness; Limited operation possibilities; Training in the execution of operational and maintenance instructions; Protection against operating errors; Protection against sabotage ([CEN03]-E10)

Table 6: Techniques and Measures for Operation, Maintenance and Support


6.10. Software Assessment Criteria


6.10.1. Software integrity level
The required software integrity level shall be decided on the basis of the level of risk associated with the use of
the software in the architecture. The safety integrity level shall be specified through the process identified in
[CEN01].

6.10.2. Life cycle issues and documentation

1. Software Planning. The supplier shall produce the following documents and the assessor shall perform a
judgement on their contents:
Software Configuration Management Plan
Software development plan
Software quality assurance plan
Software validation plan
Software maintenance plan
Software/hardware integration plan
Software integration plan
Software verification plan
2. Software Requirements. The supplier shall produce the following documents and the assessor shall
perform a judgement on their contents:
Software requirements specification
Software requirements test specification
Software requirements verification report
3. Software Design: The supplier shall produce the following documents and the assessor shall perform a
judgement on their contents:
Software architecture specification
Software design specification
Software design test specification
Software integration specification
Software architecture and design verification report
4. Software Module Design. The supplier shall produce the following documents and the assessor shall
perform a judgement on their contents:
Software module design specification
Software module test specification
Software module verification report
5. Code. The supplier shall produce the following documents and the assessor shall perform a judgement on
their contents:
Software source code and supporting documentation
Software source code verification report
6. Module Testing. The supplier shall produce the following documents and the assessor shall perform a
judgement on their contents:
Software module test report
7. Software Integration. The supplier shall produce the following documents and the assessor shall perform a
judgement on their contents:
Software integration report
8. Software/Hardware Integration. The supplier shall produce the following documents and the assessor shall
perform a judgement on their contents:
Software/hardware integration report
9. Software Validation. The supplier shall produce the following documents and the assessor shall perform a
judgement on their contents:
Software validation report
10. Software Assessment. The supplier shall produce the following documents and the assessor shall perform
a judgement on their contents:
Software assessment report
11. Software Maintenance. The supplier shall produce the following documents and the assessor shall perform
a judgement on their contents:
    Software maintenance records
    Software maintenance log
12. Techniques and measures equivalent to those recommended by the standard [CEN02] shall be used
for the software development. Any variance from the recommendations of this standard should be fully
described and justified. The techniques and measures from the standard which are applicable to life-cycle
processes and products are listed in Table 7.

Activity/Object: Technique/Measure (Reference)

Software requirements specification: Formal methods (CCS, CSP, HOL, LOTOS, OBJ, temporal logic, VDM, Z, B); Semi-formal methods (logic/function block diagrams, sequence diagrams, data flow diagrams, finite state machines/state transition diagrams, temporal Petri nets, decision/truth tables); Structured methodologies (JSD, MASCOT, SADT, SSADM, Yourdon) ([CEN02])

Software architecture: Defensive programming; Fault detection and diagnosis; Error correcting codes; Error detection codes; Failure assertion programming; Safety bag techniques; Diverse programming; Recovery blocks; Backward recovery; Forward recovery; Re-try fault recovery mechanisms; Memorising executed cases; Artificial intelligence fault correction; Dynamic reconfiguration of software; Software error effect analysis; Fault tree analysis ([CEN02])

Software design and development: Formal methods (CCS, CSP, HOL, LOTOS, OBJ, temporal logic, VDM, Z, B); Semi-formal methods (logic/function block diagrams, sequence diagrams, data flow diagrams, finite state machines/state transition diagrams, temporal Petri nets, decision/truth tables); Structured methodologies (JSD, MASCOT, SADT, SSADM, Yourdon); Modular approach (limited module size, information hiding/encapsulation, limited number of parameters, one entry/one exit point in subroutines and functions, fully defined interfaces); Design and coding standards (existence of a coding standard, coding style guide, no dynamic objects, no dynamic variables, limited use of pointers, limited use of recursion, no unconditional jumps); Analysable programs; Strongly typed programming language; Structured programming; Programming language (ADA, MODULA-2, PASCAL, FORTRAN 77, C, PL/M, BASIC, Assembler, ladder diagrams, functional blocks, statement list, subset of C with coding standards); Language subset; Validated translator; Translator proven in use; Library of trusted/verified modules and components; Functional and black-box testing (test cases from cause/consequence diagrams, prototyping/animation, boundary value analysis, equivalence classes and input partition testing, process simulation); Performance testing (avalanche/stress testing, response timing and memory constraints, performance specification); Interface testing; Data recording and analysis; Fuzzy logic; Object-oriented programming; Software verification and testing; Formal proof; Probabilistic testing (failure probability per demand, failure probability during a certain period of time, probability of error containment, probability of failure-free execution, probability of survival, availability, MTBF or failure rate, probability of safe execution); Static software analysis (boundary value analysis, checklists, control flow analysis, data flow analysis, error guessing, Fagan inspections, sneak circuit analysis, symbolic execution, walkthroughs/design reviews); Dynamic analysis and testing (test case execution from boundary value analysis, test case execution from error guessing, test case execution from error seeding, performance modelling, equivalence classes and input partition testing, structure-based testing); Metrics; Traceability matrix ([CEN02])

Software/hardware integration: Functional and black-box testing (test cases from cause/consequence diagrams, prototyping/animation, boundary value analysis, equivalence classes and input partition testing, process simulation); Performance testing (avalanche/stress testing, response timing and memory constraints, performance specification) ([CEN02])

Software validation: Probabilistic testing (failure probability per demand, failure probability during a certain period of time, probability of error containment, probability of failure-free execution, probability of survival, availability, MTBF or failure rate, probability of safe execution); Performance testing (avalanche/stress testing, response timing and memory constraints, performance specification); Functional and black-box testing (test cases from cause/consequence diagrams, prototyping/animation, boundary value analysis, equivalence classes and input partition testing, process simulation); Modelling (data flow diagrams, finite state machines, formal methods, performance modelling, time Petri nets, prototyping/animation, structure diagrams) ([CEN02])

Assessment techniques: Static software analysis (boundary value analysis, checklists, control flow analysis, data flow analysis, error guessing, Fagan inspections, sneak circuit analysis, symbolic execution, walkthroughs/design reviews); Dynamic software analysis (test case execution from boundary value analysis, test case execution from error guessing, test case execution from error seeding, performance modelling, equivalence classes and input partition testing, structure-based testing, test cases from cause/consequence diagrams, prototyping/animation, process simulation); Cause-consequence diagrams; Event tree analysis; Software error effect analysis; Common cause failure analysis; Markov models; Reliability block diagrams; Field trial before commissioning ([CEN02])

Software quality assurance: Accreditation to EN 29001; Compliance with EN 29000-3; Company quality system; Software configuration management ([CEN02])

Software maintenance: Impact analysis; Data recording and analysis ([CEN02])

Table 7: Techniques and Measures for Software
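Among the software architecture techniques listed in Table 7, defensive programming and failure assertion programming lend themselves to a compact illustration. The sketch below checks the input envelope and asserts a postcondition before the result is released; the function, its parameters and the numerical limits are all invented for the example and are not drawn from [CEN02]:

    # Minimal illustration of defensive programming with a failure assertion.

    class VitalError(Exception):
        """Raised when a defensive check fails; the caller enters the safe state."""

    def compute_braking_distance(speed_kmh: float) -> float:
        # Defensive input check: reject values outside the specified envelope.
        if not (0.0 <= speed_kmh <= 350.0):
            raise VitalError(f"speed out of range: {speed_kmh}")
        v = speed_kmh / 3.6                 # convert km/h to m/s
        distance = v * v / (2.0 * 0.7)      # assumed 0.7 m/s^2 service deceleration
        # Failure assertion: postcondition checked before the result is used.
        if not (0.0 <= distance):
            raise VitalError("postcondition violated: negative distance")
        return distance

    print(f"{compute_braking_distance(160.0):.0f} m")   # prints about 1411 m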

6.11. Hardware Assessment Criteria


6.11.1. Life cycle issues and documentation
1. Hardware requirements. The supplier shall produce the following documents and the assessor shall
perform a judgement on their contents:
Hardware requirements specification
Hardware requirements test specification
Hardware requirements verification report
2. Hardware design. The supplier shall produce the following documents and the assessor shall perform a
judgement on their contents:
Hardware architecture specification
Hardware design specification
Hardware design test specification
Hardware integration specification
Hardware architecture and design verification report
3. Hardware testing. The supplier shall produce the following documents and the assessor shall perform a
judgement on their contents:
Hardware test report
4. Techniques and measures equivalent to those recommended by the standards [CEN03] and [IEC01]
shall be used for the hardware development. Any variance from the recommendations of these standards
should be fully described and justified. The techniques and measures from the standards which are
applicable to life-cycle processes and products are listed in Table 8.

Activity/Object: Technique/Measure (Reference)

Architecture: Separation of safety-related systems from non-safety-related systems; Single electronic structure with self-test and supervision; Dual electronic structure; Dual digital channels based on composite fail-safety with fail-safe comparison; Single electronic structure based on inherent fail-safety; Single electronic structure based on reactive fail-safety; Diverse electronic structure with fail-safe comparison; Justification of the architecture by quantitative reliability analysis of the hardware ([IEC01])

Processing units: Comparator; Majority voter; Self-test by software (one channel); Self-test supported by hardware (one channel); Coded processing (one channel); Reciprocal comparison by software ([IEC01])

Invariable memory ranges: Signature of a double word (16 bit); Block replication ([IEC01])

Variable memory ranges: Galpat or transparent Galpat test; Abraham test; Double RAM with hardware or software comparison and read/write test ([IEC01])

I/O units and interfaces: Test pattern; Code protection; Multi-channelled parallel output; Monitored outputs; Input comparison ([IEC01])

Clock: Reciprocal comparison in redundant configuration; Dual frequency timer ([IEC01])

Power supply: Overvoltage protection with shut-off; Voltage control (secondary); Power-down with shut-off; Graceful degradation ([IEC01])

Watchdog: Separate time basis and time-window; Combination of temporal and logical monitoring of program sequence; Temporal monitoring with on-line check ([IEC01])

Data paths: Complete hardware redundancy; Inspection using test patterns; Transmission protocol; Transmission redundancy; Information redundancy ([IEC01])

Communication: Separation of electrical energy from communication lines; Spatial separation of redundant lines; Increase of interference immunity; Antivalent signal transmission ([IEC01])

Input and output cards: Idle current principle; Test pattern; Electrical interlocking; Cross-monitoring of redundant units ([IEC01])

Table 8: Techniques and Measures for Hardware

7. TERMINOLOGY
7.1. Introduction
This chapter lists working definitions for terms used in the ACRuDA project in relation to safety critical
applications. The following principles have been used in selecting and forming these definitions:

1. Existing definitions in accepted documents (standards, for example) should be used where possible. In
these cases, the source document of the definition is indicated.
2. Where no satisfactory existing definition can be agreed, a new term or phrase should be coined rather
than using an existing one in a new or non-standard way. This reduces confusion.
3. If different definitions exist for the same term, the different definitions are presented (each preceded by
a number in brackets) and a definition in relation to the ACRuDA project will be agreed.

7.2. Terminology, Definitions and Abbreviations


1. Acceptance: The status given to any product by the final user. Source: [CEN01].
2. Accident: An unintended event or sequence of events that results in death, loss of a system or service or
environmental damage. Source: [CEN03].
3. Accreditation:

(1) Procedure by which the technical competence and the impartiality of a testing laboratory are
recognised. Source: [ITS01].
(2) Formal recognition of a laboratory's competence to carry out certain tests or certain established types
of test. Source: [EN01].
4. Accreditation Body: Body which manages a laboratory accreditation system and pronounces
accreditations. Source: [EN01].
5. Accreditation Criteria (for a laboratory): Set of requirements defined and applied by an accreditation
body, which a testing laboratory must satisfy in order to be accredited. Source: [EN01].
6. Accreditation System: System with its own procedures and management rules for carrying out laboratory
accreditation. Source: [EN01].
7. Accredited Laboratory: Testing laboratory that has been accredited. Source: [EN01].
8. Apportionment: A process whereby the RAMS elements for a system are sub-divided between the
various items which comprise the system, to provide individual targets. Source: [CEN01]
9. Approval: The status given to any product by the requisite Authority when the product has fulfilled a set
of predetermined conditions. Source: [CEN01]
10. Assurance of Conformity: Procedure resulting in a statement giving confidence that a product, process or
service fulfils specified requirements.
11. Assessment:

(1) The undertaking of an investigation in order to arrive at a judgement based on evidence, of the
suitability of a product. Source: [CEN01].
(2) The process of analysis to determine whether the design authority and the validator have achieved a
product that meets the specified requirements and to form a judgement as to whether the product is fit for
its intended purpose. Source: [CEN03].
12. Assessment inputs: In the ACRuDA project, the assessment inputs are all the data necessary to achieve
the assessment (hardware, software, documentation, tools, standards, etc.).
13. Assessment Repeatability: repetition of the assessment of the same product, against the same safety
requirements specification and by the same assessor, must give the same overall verdict as the first
assessment. Source: [ITS01].
14. Assessment Reproducibility: assessment of the same product, against the same safety requirements
specification but by another assessor, must give the same overall verdict as the first assessor.
Source: [ITS01].
15. Assessor:

(1) A body with responsibility for undertaking assessments. Source: [CEN01].


(2) The person or agent appointed to carry out the assessment. Source: [CEN03].
16. Attribute: An abstract term qualifying the properties of an item of data.
17. Audit: A systematic and independent examination to determine whether the procedures specific to the
requirements of a product comply with the planned arrangements, are implemented effectively and are
suitable to achieve the specified objectives. Source: [CEN01].
18. Availability: The ability of a product to be in state to perform a required function under given conditions
at a given instant of time or over a given time interval assuming that the required external resources are
provided. Source: [CEN01], [CEN02], [CEN03].


19. Behaviour: The description of any sequence of states and transitions likely to exist in one system.
20. Certification:

(1) Formal declaration which confirms the results of an assessment and the fact that the assessment
criteria were correctly applied. Source: [ITS01].
(2) Action by a third party, demonstrating that the specific sample tested is in conformity with a specific
standard or other normative document.
21. Certification Body or Notified body: impartial and independent body which achieves certifications.
Source: [ITS01].
22. Certification System: A system that has its own rules of procedures and management for carrying out
certification of conformity. Source: [ITS01].
23. Coding: This is the work of translating the results of the detail design into a program using a given
programming language - one of the phases of the software life cycle.
24. Cohesion: The degree to which measures taken, interact with and depend on each other.
25. Commercial off-the-shelf (COTS) software: Software defined by market-driven need, commercially
available, and whose fitness for purpose has been demonstrated by a broad spectrum of commercial users.
Source: [CEN02].
26. Common Cause Failure: A failure which is the result of an event or events which, because of
dependencies, causes a coincidence of failure states of components in two or more separate channels of a
redundant system, leading to the defined system failing to perform its required function. Source: [CEN01].
27. Common Mode Failure: Failure of apparently independent components or communication links due to
an initiating event which affects them all. Source: [IDS01].
28. Common Mode Fault: Fault common to items which are intended to be independent. Source: [CEN03].
29. Compliance: A demonstration that a characteristic or property of a product satisfies the stated
requirements. Source: [CEN01].
30. Component:

(1) A part of a product that has been determined to be a basic unit or building block. A component may be
simple or complex. Source: [CEN03].
31. Configuration: The structuring and interconnection of the hardware and software of a system for its
intended application. Source: [CEN03].
32. Configuration management: A discipline applying technical and administrative direction and
surveillance to identify and document the functional and physical characteristics of a configuration item,
control change to those characteristics, record and report change processing and implementation status
and verify compliance with specified requirements. Source: [CEN01].
33. Conformity: the degree to which a given real product corresponds to its description.
34. Conformance Testing: Testing whose purpose is to check whether the system satisfies its specification.
Source: [LAP01].
Source: [LAP01].
35. Control Flow Analysis: Analysis of the sequence of execution in a computer program. This analysis can
show unreachable code, dynamic halts or false entry points. Source: [IDS01].
36. Coverage: Measure of the representativeness of the situations to which a system is submitted during its
validation, compared to the actual situations it will be confronted with during its operational life. Source:
[LAP01].
37. Criterion: A standard by which a correct judgement may be formed.
38. Criticality (system): Level of safety integrity of function or component. Source: [IDS01].
39. Defensive Programming: Writing programs which detect erroneous input and output values and control
flow. Such programs prevent propagation of errors and recover by software where possible. Source:
[IDS01].
40. Dependability: Trustworthiness of a computer system such that reliance can justifiably be placed on the
service it delivers. Source: [LAP01].
41. Dependent failure: The failure of a set of events, the probability of which cannot be expressed as the
simple product of the unconditional probabilities of the individual events. Source: [CEN01].
42. Design: The pre-build exercise of defining elements and their interconnections such that the product will
meet its specified requirements. Source: [CEN03].
43. Detection (error): The action of identifying that a system state is erroneous. Source: [LAP01].
44. Deterministic Testing: Form of testing where the test patterns are predetermined by a selective choice.
Source: [LAP01].
45. Development Environment: set of organisational measures, procedures and standards which must be
used during the development of the product. Source: [ITS01].


46. Development Process: set of phases and tasks by which a product is built and which translate the
specification into software and hardware. Source: [ITS01].
47. Disturbance: An unexpected influence of the environment on the behaviour of the equipment
48. Diversity: A means of achieving all or part of the specified requirements in more than one independent
and dissimilar manner. Source: [CEN03].
49. Dormant Fault: Internal fault not activated by the computation process. Source: [LAP01].
50. Dynamic Verification: Verification involving exercising the system. Source: [LAP01].
51. Effectiveness: The degree to which the safety measures taken actually achieve the desired results.
52. Element: A part of a product that has been determined to be a basic unit or building block. An element
may be simple or complex. Source: [CEN03].
53. End User: Person, in contact with a product, and who only uses the operational capacity of the product.
Source: [ITS01].
54. Equipment: A functional physical item. Source: [CEN03].
55. Error: A deviation from the intended design which could result in unintended system behaviour or
failure. Source: [CEN02], [CEN03].
56. Fail Safe: A concept which is incorporated into a design of a product such that, in the event of failure, it
enters or remains in a safe state. Source: [CEN03].
57. Failure: A deviation from the specified performance of a system. A failure is the consequence of a fault
or error in the system. Source: [CEN02], [CEN03].
58. Failure cause: The circumstances during design, manufacture or use which have led to a failure. Source:
[CEN01].
59. Failure Mode: The predicted or observed results of a failure cause on a stated item in relation to the
operating conditions at the time of the failure. Source: [CEN01].
60. Failure Mode and Effect Criticality Analysis (FMECA): A type of FMEA intended to determine,
inductively, the nature and criticality of the consequences of failures of the equipment.
61. Failure rate: The limit, if it exists, of the ratio of the conditional probability that the instant of time, T, of
a failure of a product falls within a given time interval (t, t + Δt), to the length of this interval, Δt, as Δt
tends towards zero, given that the item is in an up state at the start of the time interval.
Source: [CEN01].
62. Fault: An abnormal condition that could lead to an error in a system. A fault can be random or
systematic. Source: [CEN01], [CEN02], [CEN03].
63. Fault Avoidance: The use of design techniques which aim to avoid the introduction of faults during the
design and construction of a system. Source: [CEN02].
64. Fault Detection Time: Time span which begins at the instant when a fault occurs and ends when the
existence of the fault is detected. Source: [CEN03].
65. Fault Mode: One of the possible states of a faulty product for a given required function. Source:
[CEN01].
66. Fault Tolerance: The built-in capability of a system to provide continued correct execution, i.e. provision
of service as specified, in the presence of a limited number of hardware or software faults. Source:
[CEN02].
67. Fault Tree analysis: An analysis to determine which fault modes of the product, sub-product or external
events, or combination thereof, may result in a stated fault mode of the product, presented in the form of
a fault tree. Source: [CEN01].
68. FMEA: an acronym meaning Failure Mode and Effect Analysis. A qualitative method of reliability
analysis which involves the study of the fault modes which can exist in every sub-product of the product
and the determination of the effects of each fault mode on other sub-product of the product and on the
required functions of the product. Source: [CEN01].
69. Formal Verification: Showing by formal mathematical proof or arguments that software implements its
(formal mathematical) specification correctly. Source: [IDS02].
70. Formal Mathematical Method: Mathematically based method for the specification, design and
production of software. Also includes a logical inference system for Formal Proofs of Correctness, and a
methodological framework for software development in a formally verifiable way. Source: [IDS01].
71. Formal Mathematical Specification: A specification in a formal mathematical notation. Source:
[IDS01].
72. Formal Proof of Correctness: A way of proving that a computer program follows its specification by a
mathematical proof using formal rules. Source: [IDS01].
73. Formally Defined Syntax: A technique, such as Backus Naur Form (BNF), used to define the syntax of a
language, plus the collateral definition of an annotation language. Source: [IDS01].


74. FRACAS: An acronym meaning Failure Reporting And Corrective Action System. The process of
reporting a failure on test or in service, analysing its cause and implementing corrective action to prevent
or reduce the rate of occurrence. Source: [CEN01].
75. Function: A mode of action or activity by which a product fulfils its purpose. Source: [CEN01].
76. Functional Testing: Form of testing where the testing inputs are selected according to criteria relating to
the system's function. Source: [LAP01].
77. Hazard:

(1) A condition which can lead to an accident. Source: [CEN03].


(2) A physical situation with a potential for human injury. Sources: [CEN01].
78. Hazard Analysis: The process of identifying the hazards which a product or its use can cause. Source:
[CEN03].
79. Hazard Log: The document in which all safety management activities, decisions made and solutions
adopted, are recorded or referenced. Sources: [CEN01], [CEN03].
80. Hazard Sequence: A sequence of hazards that can lead to an accident. Sources: [CEN01].
81. Human Error: A human action (mistake) which can result in unintended system behaviour or failure.
Sources: [CEN03].
82. Information Flow Analysis: Identification of the input variables on which each output variable depends
in a computer program. Used to confirm that outputs depend only on the relevant inputs, as specified.
Source: [IDS01].
83. Independence (human): Freedom from intellectual, commercial and /or management involvement.
Source: [CEN03].
84. Independence (technical): Freedom from any mechanism which can affect the correct operation of more
than one item. Source: [CEN03].
85. Independent Body: A body which is separate and distinct, by way of management and other resources,
from the bodies responsible for the development of the product. Source: [CEN01].
86. Item: Element under consideration. Source: [CEN03].
87. Maintainability: The probability that a given active maintenance action, for an item under given
conditions of use, can be carried out within a stated time interval, when the maintenance is performed
under stated conditions and using stated procedures and resources. Source: [CEN01], [CEN03].
88. Maintenance: The combination of all technical and administrative actions, including supervision actions,
intended to retain an item in, or restore it to, a state in which it can perform its required function. Source:
[CEN01], [CEN03].
89. Measure: Something done with a view to the accomplishment of a purpose.
90. Operating Procedure: set of rules for the definition of the correct utilisation of a product. Source:
[ITS01].
91. Operational Environment: organisational measures, procedures and standards which must be used for
the operation of the product. Source: [ITS01].
92. Pertinence: The effectiveness of a single measure with respect to a specified desired result.
93. Process: A set of operations with defined inputs and outputs.
94. Product:

(1) Set of software and/or hardware which performs a function, designed to be used or included in multiple
systems. Source: [ITS01].
(2) A collection of elements, interconnected to form a system, sub-system, or item of an equipment, in a
manner which meets the specified requirements. Source: [CEN03].
95. Proof Obligations: The requirement to prove a theorem to demonstrate the correctness of a development
step. Source: [IDS01].
96. Prototype: A rapidly produced program which is used to validate (part of) a specification. Source:
[IDS01].
97. Quality: A user perception of the attributes of a product. Source: [CEN03].
98. RAMS: An acronym meaning a combination of Reliability, Availability, Maintainability and Safety.
Source: [CEN01].
99. Random Faults: The occurrence of faults based on probability theory and previous performance. Source:
[CEN03].
100. Random Hardware Failure: Failures, occurring at random times, which result from a variety of
degradation mechanisms in the hardware. Source: [CEN01].


101. Random Testing: See Statistical testing. Source: [LAP01].


102. Real-time Function: Function required to be fulfilled within finite time intervals dictated by the
environment. Source: [LAP01].
103. Real-time Service: Service required to be delivered within finite time intervals dictated by the
environment. Source: [LAP01].
104. Real-time System: System fulfilling at least one real-time function or delivering at least one real -time
service. Source: [LAP01].
105. Recovery (error): Form of error processing where an error-free state is substituted for an erroneous state.
Source: [LAP01].
106. Reliability: Dependability with respect to the continuity of service. Measure of continuous correct service
delivery. Measure of time to failure. Source: [LAP01].
107. Reliability Growth: The system's ability to deliver correct service is improved (stochastic increase of the
successive times to failure). Source: [LAP01].
108. Regression Verification: Verification performed after a correction, in order to check that the correction
has no undesired consequences. Source: [LAP01].
109. Risk:

(1) The combination of the frequency, or probability, and the consequence of the hazardous event.
Sources: [CEN02], [CEN03], [IDS02].
(2) The probable rate of occurrence of a hazard causing harm and the degree of severity of the harm.
Sources: [CEN01].
110. Risk Analysis: Analysis allowing identification of critical points and safety criteria in system elements.
It can be carried out at the different stages of building a system.
111. Safe State: A condition which continues to preserve safety. Source: [CEN03]
112. Safety: Freedom from unacceptable level of risk. Sources: [CEN01], [CEN02], [CEN03].
113. Safety Case: The documented demonstration that the product complies with the specified safety
requirements. Sources: [CEN01], [CEN03]
114. Safety Critical: Carries direct responsibility for safety. Source: [CEN03].
115. Safety Critical Software: Software used to implement a safety critical function. Source: [IDS01].
116. Safety Integrity:

(1) The likelihood of a system satisfactorily performing the required safety function under all the stated
conditions within a stated period of time. Sources: [CEN01].
(2) The likelihood of a safety related system achieving its required safety features under all the stated
conditions within a stated operational environment and within a stated period of time. Source: [CEN03].
117. Safety Integrity Level:

(1) One of four possible discrete levels for specifying the safety integrity requirements of the safety
functions to be allocated to the safety related products/systems. Safety integrity level 4 has the highest
level of safety integrity and safety integrity level 1, the lowest. Sources: [CEN01].
(2) A number which indicates the required degree of confidence that a system will meet its specified
safety features. Source: [CEN03].
118. Safety Involved: Carries indirect responsibility for safety. Source: [CEN03].
119. Safety Plan:

(1) A documented set of time scheduled activities, resources and events, serving to implement the
organisational structure, responsibility, procedures, activities, capabilities and resources that together
ensure that an item will satisfy given safety requirements relevant to a given contract or project. Source:
[CEN01].
(2) The implemented details of how the safety requirements of the project will be achieved. Source:
[CEN03].
120. Safety Process: The series of procedures that are followed to ensure that the safety requirements of a
product are identified, analysed and fulfilled. Source: [CEN03].
121. Safety Related: Carries responsibility for safety. Source: [CEN03].
122. Safety Related Software: Software which carries responsibility for safety. Source: [CEN02].
123. Safety Requirements: the requirements of the safety functions that have to be performed by the safety
related products/systems comprising safety functional requirements and safety integrity requirements.
Source: [CEN01].


124. Safety Requirements Specification: Specification of the safety requirements necessary for a product,
which is the basis for the assessment. The safety requirements specification must specify: the safety
integrity target, the risks, the standards and rules to apply, the safety functions, the type of application
considered (interlocking, ATP, ATC, etc.), the different configurations, and the assumptions on the
environment of the product.
125. Security: Dependability with respect to the prevention of unauthorised access and/or handling of
information. Source: [LAP01].
126. Semantic Analysis: Checking the relationship between input and output for every semantically possible
path through a program, or part of a program. It can reveal semantically possible paths of which the
programmer was unaware, and coding errors. Source: [IDS01].
127. Severity (failure): Grade of the failure consequences upon the system environment. Source: [IDS01].
128. Software: Intellectual creation comprising the programs, procedures, rules and any associated
documentation pertaining to the operation of a data processing system. Source: [CEN02].
129. Software assessment: The process of product evaluation either by an official regulatory body or an
independent third party to establish that it complies with all necessary requirements, regulations and
standards.
130. Software Errors Effects Analysis (SEEA): Analysis intended to determine, inductively and similarly to
FMECA, the nature and criticality of consequences of software failures.
131. Sponsor: person or body who asks for an assessment of a product. Source: [ITS01].
132. Static Code Analysis: Using mathematical techniques to analyse a program and reveal its structure. It
does not need execution of the program, but verifies the program against the specification. Techniques
include control flow, data use, information flow and semantic analysis. Source: [IDS01].
133. Static Verification: Verification conducted without exercising the system. Source: [LAP01].
134. Statistical Testing: Form of testing where the test patterns are selected according to a defined probability
distribution on the input domain. Source: [LAP01].
135. Structural Testing: Form of testing where the testing inputs are selected according to criteria relating to
the system's structure.
136. Sub-system: A portion of a system which fulfils a specialised function. Source: [CEN03].
137. Supplier: in the ACRuDA project, the supplier is the person or body who builds and sells a product or a
system. In the case of a product, the requirements, design and validation phases are carried out by the
supplier. In the case of a system, the requirements can be defined by the end user.
138. System:

(1) A set of sub-systems or elements which interact according to a design. Source: [CEN03].
(2) A Specific installation with a particular goal and a particular operational environment. Source:
[ITS01].
139. Systematic Failure: Failures due to errors in any safety life cycle activity, within any phase, which cause
the system to fail under some particular combination of inputs or some particular environmental condition.
Source: [CEN01].
140. Systematic Faults: An inherent fault in the specification, design, construction, installation, operation or
maintenance of a system, sub-system or equipment. Source: [CEN03].
141. Testing: Dynamic verification performed with valued inputs. Source: [EN01].
142. Testing Laboratory: Laboratory which carries out tests. Source: [EN01].
143. Traceability: The ability to trace the history, application or location of an item or activity, or similar
items or activities, by means of recorded identifications. Source: [NFX01].
144. Validation:

(1) Confirmation by examination and provision of objective evidence that the particular requirements for
a specific intended use have been fulfilled. Source: [CEN01].
(2) The activity of demonstration, by test and analysis that the product meets in all respects its specified
requirements. Source: [CEN03].
145. Validator: The person or agent appointed to carry out validation. Source: [CEN01], [CEN02], [CEN03].
146. Verification:

(1) Confirmation by examination and provision of objective evidence that the specified requirements have
been fulfilled. Source: [CEN01].
(2) The activity of determination by analysis and test that the output of each phase of the life cycle fulfils
the requirements of the previous phase. Source: [CEN02], [CEN03].
147. Verifier: The person or agent appointed to carry out verification. Source: [CEN01], [CEN02], [CEN03].


8. References
8.1. European Council Directives
[DIN01] Council directive 96/48/EC: « Interoperability of the European High speed train network », 23 July
1996.

[DIN02] Council directive 93/465/EC: « Modules related to the different phases of assessment procedures of
conformity and rules of affixing and using the CE mark, intended to be used in the technical harmonisation
directives », 22 July 1993.

[DIN03] Council directive 90/531/EEC: « Procurement procedures of entities operating in the water, energy,
transport and telecommunications sectors », 17 September 1990.

8.2. European Technical Specifications


[STI01] AEIF - European Commission: « Interoperability for the Trans-European High speed Network -
Specification of the Technical Requirements for Interoperability - Control Command », Ref. MT114FE1L - 03
October 1996 - DRAFT.

8.3. Standards
[CEN01] prEN 50126 - CENELEC: « Railway Applications: The specification and demonstration of
dependability, reliability, availability, maintainability and safety (RAMS) », June 1997.

[CEN02] prEN 50128 - CENELEC: « Railway Applications: Software for Railway Control and Protection
System », January 1997.

[CEN03] prENV 50129 - CENELEC: « Railway Applications: Safety Related Electronic Systems for Signalling
», Version 1.0 - January 1997.

[IEC01] IEC 61508: « Functional Safety: Safety Related Systems », Part 1 to 7:

Part 1: General Requirements,


Part 2: Requirements For Electrical/Electronic/Programmable Electronic Systems,
Part 3: Software Requirements,
Part 4: Definitions And Abbreviations Of Terms,
Part 5: Guidelines On The Application Of Part 1,
Part 6: Guidelines On The Application Of Part 2 And 3,
Part 7: Bibliography Of Techniques And Measures.

[EN01] EN 45001: « General criteria for the operation of testing laboratories », 1989.

[EN02] EN 45002: « General criteria for the assessment of testing laboratories », 1989.

[EN03] EN 45003: « General criteria for the accreditation body of testing laboratories », 1989.

[EN04] EN 45011: « General criteria for certification bodies operating product certification », 1989.

[EN05] EN 45012: « General criteria for certification bodies operating certification of quality system », 1989.

[EN06] EN 45013: « General criteria for certification bodies operating certification of personnel », 1989.

[EN07] EN 45014: « General criteria for declaration of conformity by the suppliers », 1989.

[IDS01] Interim Defence Standard 00 - 55 (Draft):

Part 1: Requirement « The procurement of safety critical software in Defence Equipment »,
Part 2: Evidence « The procurement of safety critical software in Defence Equipment »,
issue 1 - 5 April 1991.

[IDS02] Interim Defence Standard 00 - 56 (Draft): « Hazard analysis and safety classification of the computer
and Programmable Electronic System elements of Defence Equipment », issue 1 - 5 April 1991.

[NFX01] Norme Française - NF X50-120: « Qualité - Vocabulaire » (« Quality - Vocabulary »), September 1987.

8.4. ACRuDA Project


[ACR01] ACRuDA Project: « State of the Art - Safety Architecture Synthesis », 24 February 97. Reference:
ACRuDA/INRETS/MK-PM/WP1/D1/97.12/V2

[ACR02] ACRuDA Project: « State of the Art - Method Synthesis », 29 September 97. Reference:
ACRuDA/INRETS/PM-MK/WP1/D2/97.39/V3

8.5. CASCADE Project


[GAM01] CASCADE Project - ESPRIT 9032 - version 1.0 - 17 January, 1997: « General assessment method: it
provides the essentials to prepare and to handle the assessment of safety critical systems ». It is composed of:

Part 1: Rules,
Part 2: Guidelines,
Part 3: Examples.

8.6. ERTMS project


[ERT96] ERTMS project: « RAMS Requirements Specification », Volume 5: Annexes, Safety, 20/12/96.

8.7. Information Technology domain


[ITS01] ITSEC: « Information Technology Security Evaluation Criteria », Version 1.2 - June 1991.

[ITS02] ITSEM: « Information Technology Security Evaluation Manual », Version 1.0 - September 1993.

[ITS03] ECF01: « Schéma Français d’Evaluation et de Certification des Technologies de l’Information -
Présentation du schéma », Version 2.0 - 16 January 1997.

[ITS04] ECF03: « Schéma Français d’Evaluation et de Certification des Technologies de l’Information -
Procédure d’évaluation et de certification », Version 1.0 - 16 January 1997.

[ITS05] ECF04: « Schéma Français d’Evaluation et de Certification des Technologies de l’Information -
Format des rapports et certificats », Version 1.0 - 16 January 1997.

8.8. Others
[AQC01] « Assurance Qualité en Conception » (« Quality Assurance in Design »). Marc Reynier. Collection
MASSON.

[LAP01] J.C. LAPRIE: « Dependability: Basic Concepts and Terminology », Springer-Verlag Wien, New York,
1992.

9. ANNEX I: STRUCTURE OF A SAFETY PLAN



The Safety Plan should include the following topics:

Chapter 1 - General Aspects


Introduction

This section provides an overview of the plan and includes any necessary background material
Aims and Objectives
Scope of the plan
The policy and strategy for achieving safety

This section will provide the policy and strategy for achieving safety, together with the means for
evaluating its achievement, and the means by which this is communicated within the organisation to
ensure a culture of safe working.
Assumptions and Constraints

This part of the plan will detail any assumptions being made in connection with the product development,
together with the constraints under which the safety activities are to be conducted.
Interfaces with other related programs and plans
Applicable standards

Normative and supporting documents should be listed in the Safety Plan.

Chapter 2 - Product Description


Description of the product

The main design characteristics of the product and its main applications are described in this section
The main selected safety measures and techniques

The main selected measures and techniques used to meet the safety requirements should be listed in the
Safety Plan

Chapter 3 - Management Aspects


Details of staff structure

The roles, responsibilities, competencies and relationships of bodies undertaking tasks within the life cycle
will be described in this section, identifying the persons, departments, organisations or other units
which are responsible for carrying out and reviewing each of the safety life-cycle phases. The
relationships between these bodies will also be described.
Qualification and training of the staff

This section describes the procedures for ensuring that all staff involved in safety life-cycle activities are
competent to carry out the activities for which they are accountable (competence of persons).

Chapter 4 - System Life Cycle and Safety Activities


The system life cycle and safety activities

This section outlines the safety life cycle and the safety activities to be undertaken within the life cycle,
along with any dependencies.
The safety analysis, engineering, verification and validation

The analyses and validations to be applied during the life cycle should be clearly identified and should
take into account the processes for ensuring an appropriate degree of personnel independence in tasks and
the necessary safety reviews, to demonstrate compliance of the management process with the Safety Plan.

Chapter 5 - Reporting, Control and Milestones


Details of all safety related deliverables and the milestones

This section presents the list of the main documents to be produced and the associated milestones.
The mechanism of the documentation review and acceptance
Requirements for periodic safety validation and safety review

This section contains the planned safety reviews throughout the life cycle, appropriate to the safety
relevance of the element under consideration, including any personnel independence requirements; these
reviews shall be implemented, reviewed and maintained throughout the life cycle of the system.

Chapter 6 - Safety Case Plan


The safety case structure

The Safety Plan should include a safety case plan, which identifies the intended structure and principal
components of the final safety case.
The mechanism to prepare the Safety Case

The procedures for the preparation and scheduling (drafts, number of versions, etc.) of the safety case
should be described in the Safety Plan.

Chapter 7 - Safety Maintenance


Process for analysing operation and maintenance performance, to ensure that the realised safety is compliant
with the requirements
Maintenance of safety-related documentation

This covers the procedure for maintaining accurate documentation on potential hazards, safety related systems
and external risk reduction facilities.
Each of the components of the Safety Plan shall be formally reviewed by the organisations concerned and
agreement gained on the contents.

10. ANNEX II: STRUCTURE OF A PRODUCT SAFETY CASE

The Safety Case should include the following topics:

Chapter 1 - Contents
This section should contain a description of how the safety case has been constructed and how it will
demonstrate that the product meets the safety requirements for its intended purpose.

Chapter 2 - High Level Documentation


List of Safety Case Documents


This list of documents comprises all the documents which form the safety case or support the safety
arguments contained in the safety case. The structure of this list must be consistent with the
structure of the safety case. If the safety case is self-contained, this section will not be necessary.
Safety Process

This section shall describe the process (plan, execution and evidence) through which the required
level of safety has been met.
Glossary

A comprehensive glossary should be included to provide a clear definition of all technical terms
used.

Chapter 3 - Safety Management Documentation


Documentation which relates to the management, organisation and control of safety should be listed in this
section, which should include a summary of each document listed, for example:

Organisation Documentation

This should describe the organisational structure under which the product has been developed, and
define the roles, responsibilities and reporting structure of the personnel involved in management, quality,
development, safety, maintainability, reliability and user support.
Development Plan

This defines the development of the product in terms of development stages and establishes the criteria
for demonstration and acceptance that each stage has been completed. This is a "living" document which
must reflect not only the original plan, but also the actual life cycle of the development that took place.
Quality Plan

The Quality Plan defines the quality requirements that will be applied to all aspects of the work in
developing the product. This will include the Quality Management System (QMS) used on the project
together with a traceable path to enable demonstration that the QMS is in accordance with EN 29001 and
related standards.
Safety Plan

The Safety Plan defines the way in which the safety of the product is to be assured. Details of the techniques
and processes to be used, at what stage they are to be used, and how the findings of each analysis are to be
addressed as part of the development process shall be described.

Refer to chapter 4.4.2 and to ANNEX I of this document for a detailed description of the Safety Plan
structure. A clear description of the safety case structure should also be included within the Safety Plan.
Those elements of the V & V Plan which relate directly to safety requirements may be addressed or
referenced in this document.

Verification & Validation (V&V) Plan

This document defines the objective and approach to be adopted in demonstrating that the requirements
described in the Requirement Specification documentation, and the safety criteria drawn from the various
safety analyses, have been met. Procedures for, and evidence of, traceability of specific requirements to
particular test elements of V&V activities shall be briefly described, and appropriate detailed
documentation should be referenced. This document should address the V & V of all requirements,
including those relating to safety which may have been covered in the Safety Plan.
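To make the traceability idea concrete, here is a minimal sketch, assuming hypothetical requirement and test identifiers (none of these names come from the ACRuDA criteria), of an automated check that every requirement is exercised by at least one test:

```python
# Minimal traceability check: every requirement must be covered by at
# least one test. All identifiers are illustrative assumptions.

requirements = {"REQ-001", "REQ-002", "REQ-003"}

# Mapping recovered from test reports: test id -> requirements exercised.
test_coverage = {
    "TEST-A": {"REQ-001"},
    "TEST-B": {"REQ-002", "REQ-003"},
}

covered = set().union(*test_coverage.values())
uncovered = sorted(requirements - covered)

if uncovered:
    print("Traceability gap, untested requirements:", uncovered)
else:
    print("Every requirement is traced to at least one test.")
```

In practice such a matrix would be generated from configured documents rather than written by hand, so that the evidence itself comes from controlled sources.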
Configuration Management Plan

This document describes the principles and processes by which the build standard of, and changes to, the
product under consideration have been controlled throughout its lifecycle, from conception through detailed
specification, design, build and validation. The Configuration Management Plan should detail the timing of
design reviews, configuration baselines, status reporting mechanisms and procedures for deviation from
prescribed processes. This document is vital since traceability is a central requirement of a Safety Case
and rigorous traceability is only truly achievable when all evidence is from configured sources.

Despite the rigorous application of management processes, there will be a number of occasions in any
development activity when deviation from strict requirements will be unavoidable in order to maintain
control of overall schedule and cost. For example, the design process in a particular development may
dictate that all safety analyses are completed before a design is committed to production build. Events
may be such that the delays caused in adhering strictly to this requirement could seriously impact the
programme. To deal with this situation, a formal mechanism which manages deviation from procedures or
requirements should be put in place. This will detail the mechanism for recording the assessment,
acceptance and resolution or disposition of the deviation; an illustrative record structure is sketched
below. This deviation procedure should be described together with a reference to the list of deviations
raised during the development and their resolution.
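As a purely illustrative sketch (the field names are assumptions, not prescribed by this guide), a deviation record covering the steps named above might be structured as follows:

```python
# Illustrative deviation record; the fields mirror the steps named above:
# recording, assessment, acceptance and resolution/disposition.
from dataclasses import dataclass
from typing import Optional

@dataclass
class DeviationRecord:
    identifier: str            # unique reference of the deviation
    requirement: str           # procedure or requirement deviated from
    justification: str         # why the deviation was unavoidable
    safety_assessment: str     # assessed impact on safety
    accepted_by: str           # authority that accepted the deviation
    resolution: Optional[str]  # how the deviation was closed; None if open

example = DeviationRecord(
    identifier="DEV-007",
    requirement="All safety analyses completed before production build",
    justification="Schedule constraint on long-lead hardware",
    safety_assessment="No SIL4 function affected; analyses completed later",
    accepted_by="Assessment steering committee",
    resolution=None,
)
print(example)
```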

Clearly, because of the complex functionality of the product and its development process, each plan
described above is likely to comprise a number of individual documents, which should be controlled as
part of the overall configuration management system (as should all documentation). The structure of any
such plan should be described within the Safety Case.

Chapter 4 - Product Element Documentation


Elements in the product, such as microprocessors, power supplies, previously designed modules, etc., which
have received certification in their own right, must be described briefly, together with reference to
documentation against which certification was granted. Reference must also be made to the notified body.

Evidence should be included to demonstrate that the supplier has fully assessed that the use of the
assessed/certified element in the product is entirely within the functional and safety related application
conditions specified for that element.
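The kind of check this evidence supports can be sketched as follows (the parameter names and values are invented for illustration): the intended conditions of use of a pre-certified element must lie entirely within its certified envelope.

```python
# Sketch: verify that the intended use of a pre-certified element stays
# within its certified application conditions. Names/values are invented.

certified_envelope = {"temp_c": (-25.0, 70.0), "supply_v": (4.5, 5.5)}
intended_use       = {"temp_c": (-10.0, 55.0), "supply_v": (4.75, 5.25)}

def within_envelope(use: dict, envelope: dict) -> bool:
    # Every intended range must be contained in the certified range.
    return all(
        envelope[key][0] <= low and high <= envelope[key][1]
        for key, (low, high) in use.items()
    )

print(within_envelope(intended_use, certified_envelope))  # True: acceptable
```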

Chapter 5 - Reference vs. Self-containment


There are two viewpoints about how the basic Safety Case should be presented. The first is that the Safety Case
is a self-contained document with no reference to other documents. There are inherent difficulties with this
approach:

1. Product designs are the intellectual property of the suppliers, and any critical details which are proprietary
to the supplier may not be included.
2. A self-contained safety case would be extremely bulky for a complex product, making distribution and
availability difficult. Changes would be harder to incorporate.
3. A Safety Case of this nature would be difficult to produce as the lifecycle of the product itself develops,
and would increase the tendency to produce the Safety Case at the end of the lifecycle.
4. It is not clear how the traceability of the data contained in the safety case could be shown with no
external referencing.

The second viewpoint is that the Safety Case is a reference document describing the documentation relating to
the product that is needed to demonstrate the safety of that product.

Chapter 6 - Product Section


General Aspects

A high level description of the product, and the scope and limitations of the safety analyses, shall be given
in this section. The preliminary hazard list, the identification of the system boundary with respect to hazards,
and the classification of those hazards in terms of risk and consequence should also be presented.
Safety Objective


The safety objectives must be clearly stated, and a description included of the method for allocating the
safety objectives from the top level product requirements to lower levels of the product; a purely numerical
sketch of such an apportionment follows. Reference should be made to documentation detailing the functional
and safety requirements for all elements of the product, e.g. between and within hardware and software.
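As a purely arithmetic sketch of such an allocation (the target value and the weights are invented for illustration; for SIL4, IEC 61508 places the target rate of dangerous failures in the band 10^-9 to 10^-8 per hour):

```python
# Illustrative top-down apportionment of a product-level hazard rate
# target to sub-functions. Target and weights are invented examples.

product_target = 1e-9  # tolerable hazard rate for the product, per hour

# Assumed relative contributions of each sub-function to the hazard.
weights = {
    "input acquisition": 0.2,
    "safety logic":      0.5,
    "output control":    0.3,
}

for function, weight in weights.items():
    print(f"{function}: allocated target {weight * product_target:.1e} per hour")
```

The sum of the allocated targets equals the product-level target, which is the property any allocation method must preserve.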
Description of the Architecture of the Product

The structure, functionality, operation, interface, and the environment envelope of the product shall be
described. This shall include a list of applications for which the product was considered or envisaged by
the designers. For each application, the element of configurability/adaptability must be considered and
described.
Functional Elements

According to the ACRuDA definition (see deliverable D1 of the ACRuDA project), a safety critical
architecture is a configurable, structured set of components which can be demonstrated to comply
with critical or vital safety criteria or safety levels. In its simplest representation the architecture
consists of programmable electronic elements with input and output ports or devices which
interface with external equipment.

Modes of operation including restricted and degraded operational modes shall be described together
with the failure modes of each of these.

Functional Safety Elements

The safety case must provide evidence that :

degraded input data, e.g. erroneous, missing and irrational data,
errors in the operation of the hardware, e.g. CPU, memories, interfaces, clocks, built-in test,
watchdogs and communication links,
errors in the software,
are detectable, and that action following detection will bring the system to a safe, defined state in a
timely and controlled manner. A minimal sketch of this detection-then-safe-state behaviour is given
after this list.
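The sketch below assumes an invented plausibility range and a simplified safe-state action; neither is taken from this document.

```python
# Sketch: degraded input data is detected and forces a defined safe state.
# The plausibility range and the safe-state action are illustrative only.
from typing import Optional

SPEED_RANGE_KMH = (0.0, 350.0)  # assumed plausible range for a speed input

def enter_safe_state(reason: str) -> None:
    # A real product would de-energise its outputs in a timely, controlled
    # manner; here we only record the transition.
    print(f"Safe state entered: {reason}")

def process_speed(raw: Optional[float]) -> None:
    if raw is None:
        enter_safe_state("missing input")            # missing data
    elif not SPEED_RANGE_KMH[0] <= raw <= SPEED_RANGE_KMH[1]:
        enter_safe_state(f"irrational value {raw}")  # irrational data
    else:
        print(f"Speed accepted: {raw} km/h")

process_speed(120.0)  # normal operation
process_speed(-5.0)   # irrational value -> safe state
process_speed(None)   # missing value -> safe state
```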
Safety Studies

The following should be listed in this section :


Safety analyses undertaken,
the boundaries of these analyses (to what element of the product each is applicable),
the stage in the lifecycle at which each was completed,
any tools used (including databases).
Reference should be made here to well established practices or known standards which define the
appropriate rules for the application of the methods and techniques used.
Evidence of safety may be considered in two categories: functional safety and technical safety. The
elements of these are considered below :
Evidence of Functional Safety

Evidence must be provided that demonstrates that all functional requirements are met and that the
design process to achieve the specified functionality is such that the "services" the product
provides meet Safety Integrity Level 4 (SIL4). Consideration of alternative designs or approaches
should be presented together with the rationale for the approach implemented.

Correct and full operation at the limits of "normal conditions" must be demonstrated.

The safety properties (or principles), such as redundancy, diversity, error detection, self test and
information redundancy, shall be described together with the safety rationale for the design concept
and the underlying assumptions about what is "safe"; a minimal sketch of one such property follows.
Reference shall be made to documentation containing the quantitative analysis which supports the
product design approach.
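As one hedged illustration of such a property, a two-out-of-two comparison with diversely implemented channels might look as follows (the channel computations and the restrictive reaction are invented for the example):

```python
# Illustrative 2oo2 structure: two diversely implemented channels compute
# the same decision; any disagreement is treated as a detected error and
# the output is forced to the restrictive (safe) side.

def channel_a(distance_m: float, speed_ms: float) -> bool:
    # Channel A: is the braking margin greater than 10 s?
    return distance_m / max(speed_ms, 0.1) > 10.0

def channel_b(distance_m: float, speed_ms: float) -> bool:
    # Channel B: diverse formulation of the same criterion.
    return distance_m > 10.0 * speed_ms

def permit_movement(distance_m: float, speed_ms: float) -> bool:
    a = channel_a(distance_m, speed_ms)
    b = channel_b(distance_m, speed_ms)
    if a != b:
        return False  # disagreement detected: restrictive output
    return a

print(permit_movement(500.0, 20.0))  # channels agree: permissive
print(permit_movement(150.0, 20.0))  # channels agree: restrictive
```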

The effects of faults due to hardware or software must be comprehensively analysed, and evidence
that the effects have been addressed in the design should be presented. Established analyses such as
HAZOP, HAZID, Hazard Analysis, Fault Tree and FMECA shall be described and used as part of the
Safety Assurance lifecycle, which shall be described in the Safety Plan documentation. A
description of how the results of these analyses were fed back to the design should be given; a toy
example of the quantitative side of such analyses follows.
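For orientation only, the quantitative side of a fault tree combines basic-event probabilities through its gates. A toy computation, with invented probabilities and an assumed independence of the basic events:

```python
# Toy fault-tree arithmetic: the top event (undetected wrong-side output)
# requires BOTH channels to fail (AND gate); a channel fails if EITHER its
# CPU or its memory fails (OR gate). The probabilities over a reference
# period are invented, and basic events are assumed independent.

p_cpu = 1.0e-5   # probability of CPU failure (invented)
p_mem = 2.0e-6   # probability of memory failure (invented)

p_channel = 1 - (1 - p_cpu) * (1 - p_mem)  # OR gate
p_top = p_channel ** 2                     # AND gate over two channels

print(f"P(channel fails) = {p_channel:.3e}")
print(f"P(top event)     = {p_top:.3e}")
```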

Evidence of Technical Safety

Susceptibility of the product. The suppliers must demonstrate that the product functions correctly
within the range of specified external influences, e.g. temperature, vibration, humidity,
electromagnetic radiation (EMC), power supply variation, contaminants, etc. They must also
demonstrate that for conditions outside the defined specification for normal operation, the product
will default to a defined, safe state and, upon return to normal operating conditions, the product will
behave in a safe, predictable manner.

Influences upon the susceptibility of other equipment. The supplier must define the characteristics of
the product which could influence the operation of the system, application or other equipment, e.g.
thermal radiation, electromagnetic emission, etc. Evidence of testing to confirm that these are as
defined should be provided.

This section of the safety case should contain a description of the safeguards in the design which
protect the safety properties of the product from compromise during the implementation of the
product. For example, a product may have provision to accept site specific software. Reference
should be made to evidence that demonstrates that the safety functionality of the product is
unaffected (ring-fenced) when used within its defined range of application. Any mechanism to
indicate that the ring-fence has been breached should be described in this section; a simple sketch
of such a mechanism follows.
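One simple form such a mechanism could take, shown as a sketch (the use of CRC-32 and the notion of a contiguous protected region are assumptions made for the example):

```python
# Illustrative ring-fence integrity check: a checksum recorded over the
# protected safety region at build time is re-computed at run time; any
# mismatch indicates the fence has been breached, e.g. by site-specific
# software overwriting protected code or data.
import zlib

protected_region = b"safety kernel code and fixed configuration data"
reference_crc = zlib.crc32(protected_region)  # recorded at build time

def ring_fence_intact(current_region: bytes) -> bool:
    return zlib.crc32(current_region) == reference_crc

print(ring_fence_intact(protected_region))                 # True: intact
print(ring_fence_intact(protected_region + b" tampered"))  # False: breached
```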

Verification and Validation

Verification and validation reports shall be included or referenced. V&V activities must demonstrate that
at every level, each requirement within the product has been tested and that there is traceability of
evidence from the highest level requirement through to final testing. The key objective of this activity is to
ensure that there is clear evidence that the product conforms with every requirement.

Chapter 7 - Parts and Materials


There are well established methods and standards which describe the approaches and procedures for developing
a quantitative analysis of the reliability and availability of components. Evidence that such analysis has been
undertaken should be included, with reference made to the procedures and plans used and to the results obtained;
a hedged numerical sketch is given below. What is less commonly addressed at the design stage is the issue of
component specification and obsolescence.
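The sketch assumes a simple series model with constant, invented failure rates; it is not a prescribed method.

```python
# Series reliability model: the board fails if any component fails, so
# constant failure rates add. The rates below are invented examples,
# expressed in failures per hour.

failure_rates = {
    "microprocessor": 2.0e-7,
    "memory":         1.0e-7,
    "power supply":   5.0e-7,
    "discrete parts": 3.0e-7,
}

lambda_total = sum(failure_rates.values())
mtbf_hours = 1.0 / lambda_total

print(f"Board failure rate: {lambda_total:.1e} per hour")
print(f"MTBF: {mtbf_hours:,.0f} hours")
```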

Components (i.e. COTS items: resistors, ICs, ASICs, capacitors, gaskets, etc.) are very "fuzzy" items in that they
have a vast range of parameters which specify their performance. Many of these parameters are a function of
the fabrication process rather than the design and so can vary from supplier to supplier. These parameters need
to be considered by the designer prior to inclusion in the design. Further, suppliers of complex components such
as microprocessors will release only very limited data about the parametric performance of their devices due to
commercial sensitivity. In all cases, availability must be a major consideration in component selection and
multiple sourced components from suppliers with a track record of long term product support should be
considered.

Evidence must be included to show that a formal parts and materials selection process has been established and
followed. The process should include detailed criteria for selection of components and assessment of suitability
by parties other than the design team. Policy and approach taken to minimise the effect of obsolescence should
also be discussed.

Chapter 8 - Ownership - Operation, Evolution, Modification


When the safety case of a rail system is accepted, the ownership of the safety case resides with the operator. If
the operator wishes to make changes, it is for him to judge whether or not these changes impact the safety cases
of system elements such as the product. If they do, then it is his responsibility to ensure that the element safety
cases and the overall rail safety case are satisfactorily revised. It is essential, therefore, that at all points of
hand-over of safety case ownership, the parties satisfy themselves that all documentation is available to
enable any modification of the element to be undertaken and the revised safety case prepared.

Ideally, the most straightforward situation is that in which all product documentation is available to the operator,
who will then be equipped to make safety assessments of modifications at all levels of the system. In practice,
documentation relating to significant elements of a product will not be available because of commercial
confidentiality issues. In this situation, the responsibility for updating the safety case may reside with the supplier.

Modification may be required many years after initial delivery and, because any changes to the product could
affect safety, it is essential that all documentation is available. The safety case should, therefore, contain a
discussion on how issues of availability of documentation for design changes after hand-over, have been
addressed. This should include commercial aspects such as copyright, design authority, non-disclosure
agreements and intellectual property rights together with model agreements for dealing with these issues. As part
of this, there should be a clearly defined process to address the correction of errors which may be discovered
after hand-over of the product safety case.

Chapter 9 - User Support


While ultimately the responsibility for correct usage of the product rests with the user (the term "user" refers
here to the operator or any body utilising the product or its application), there is a responsibility upon the
supplier to take all necessary steps to ensure correct usage. This will be achieved not only by providing
information as described in the preceding sections, but also by offering the facility of a technical support
programme to users of the product.

The safety case should include a model programme covering technical support to the user, repair capability,
repair times and calibration of support equipment supplied, including any test equipment and support tools
(hardware and software). The model should also consider the needs of in-service support which, in addition to
the above, will cover issues such as spares holdings and spares manufacturing capability, including ownership of
support equipment and retention of repair facilities.

The nature and level of support provided by the supplier will be the subject of commercial discussion between
supplier and user. However, while support issues may not directly impact the safety related functioning of the
product, the lack of a clear strategy and of adequate contractual agreements for user support will almost
certainly result in significant difficulties in obtaining appropriate documentation and expert technical help. This
in turn can lead to poorly effected modifications and flawed safety arguments. Ultimately, this could affect the
safety of the system. It is, therefore, essential that a structure for post hand-over support is developed and
presented as part of the safety case.

Chapter 10 - Conclusion, Safety Argument Summary


The conclusion of the safety case shall summarise the evidence and state whether or not the product meets all
its specified safety requirements. The constraints on the precise configuration, operation and application of the
product shall be summarised.

11. ANNEX III : STRUCTURE OF AN ASSESSMENT PLAN

The plan is based on the assessment plan described in [ITS05]. The assessment plan is structured as follows :

Chapter 1 - Introduction
Context


A description of the context of the assessment is presented in this section. This description must contain :

the identity of the supplier,


the identity of the sponsor,
the identity of the assessors,
the identity of the notified body.

Objectives

Scope of the report

Chapter 2 - Management aspects


Special dispositions of the assessment

The assessment process should be considered as a project. The management of this process should therefore
follow project management requirements.

Special dispositions in the quality process may be taken : a specific project organisation of the staff, specific
confidentiality agreements and configuration management requirements can be identified in this chapter, in
addition to the dispositions of the Quality Handbook.

It is recommended to define a specific assessment steering committee. The members should be identified by
name.

Relations between the assessment activity and the supplier’s activity.

In the case of a simultaneous assessment, the relations between the development phases of the architecture and
the assessment activity must be described carefully in this chapter. It is recommended that the assessment starts
during the specification phase of the architecture. Special care has to be taken with the modification process in
the development: the assessor has to define how modifications made during the development impact his
investigations.

In the case of a consecutive assessment, this chapter must identify in the planning the review meetings with the
supplier and the supplier activities needed to answer the assessment needs.

Estimation of time and costs


The duration and costs of the assessment depend on the kind of assessment : simultaneous or consecutive.

The estimation contains a schedule of the activity. This schedule must describe the duration of each activity and
the effort in man-months allocated to each activity.

It is recommended to identify by name the assessors (including subcontractors) who are going to work on each
activity.

The risk of underestimating the time and costs is a decrease in the global quality of the assessment (e.g.
non-exhaustive safety analyses).

Details of the staff structure

The roles, responsibilities, competencies and relationships of all the assessors are given in this chapter.

Qualification and training of the staff

The procedure for ensuring that all staff involved in the safety activities are competent to carry out the activities
for which they are accountable.


Chapter 3 - Description of the product


Functionality of the product

This section shall contain a summary of the operational role and the functions of the product.

History of development

This section shall present the development phases of the product (with tools, standards, methods, techniques,
etc.).

Architecture of the product

This section shall present the high level architecture of the product : the separation between safety and
non-safety components, and the apportionment of the safety functions between hardware and software. A
description of the hardware and software must be given, in particular of the components concerned by safety.

Chapter 4 - Description of the safety requirements specification


A good understanding of the content of the safety requirements specification is necessary for the
understanding of the assessment report.

This chapter shall contain :

the safety integrity target,


the specification of the safety functions,
the specification of the safety mechanisms.

Chapter 5 - Description of the assessment

The assessment work can be divided into work packages.

For each work package, it is necessary to define :

the work to be achieved, with great precision,

the names of the assessors,
a schedule (with durations and meetings) for achieving the work,
the allocated effort in man-months,
all the inputs necessary to achieve the assessment,
the technical assessment report to be produced.

If external assessors take part in the assessment process, a procedure to collect all the reports composing the
final report must be defined. The notified body has to do this work.

The minimum documentation supplied is the documentation identified in the safety case (chapter 4.4.3 of the
present guide).

Several assessment activities can be defined. All these activities are based on the criteria.

Preliminary activities

examine the definition of the product to assess : the assessor should evaluate the relevance of the
definition of the product from the point of view of safety. For example, the assessor can consider that
some elements should be integrated into the definition of the product to assess because they have a
potential impact on the safety of the whole architecture.


examine the safety requirements specification (safety integrity target, standards, risks, etc.) : the
assessment plan must explain whether the allocation of the safety requirements is taken as an input of the
assessment or whether this activity will be examined by the assessor.
preliminary analysis of the architecture : a system approach is recommended for the assessment of
complex architectures, that is to say, first a global comprehension of the architecture with a global safety
study, and then a detailed comprehension of each component.
verification of the set of criteria

Where relevant, the assessor should prefer to evaluate the results of the supplier using a method of evaluation
different from the method used by the supplier.

Writing of the reports

The report should be compliant with the [EN01] standard (chapter 5.4.3 of the standard).

Chapter 6 - Conclusions
In the conclusion of the assessment plan, the assessor should give a justification that the process defined above
covers all the risks identified in the safety requirements specification. This justification should be based only on
the work of the supplier examined by the assessor and on the results of the assessor.

The main conclusion of the assessment states whether the criteria are verified and whether any risk of
hazardous failure remains. The recommendations can recall that the results of the assessment are valid for a
particular version of the product when it is configured in a certain way.

Annex A - Terminology and abbreviations

This annex must identify all the acronyms and terms used in the technical assessment report.

12. ANNEX IV : STRUCTURE OF A TECHNICAL ASSESSMENT REPORT

This is a generic plan which can be adapted by the assessors and the notified body. This plan is based on the
assessment plan defined in [ITS05].

Chapter 1 - Introduction
Context
A description of the context of the assessment is presented in this section. This description must contain :

the identification of the assessment,


the name and the version of the product,
the identity of the supplier,
the identity of the sponsor,
the total duration of the assessment,
the identity of the assessors.

Objectives

This section presents the objectives of the technical assessment report.

Scope of the report


This section must specify all the assessment tasks covered by the technical assessment report. In general, it
covers the whole assessment. If this is not the case, justification must be given.

Organisation

This section presents the organisation of the technical assessment report.

Chapter 2 - Summary
This chapter is the basis of all information on the results of the assessment published by the assessor. Therefore,
this summary must not contain confidential information on the product (commercial, technical, etc.).

This chapter must contain :

the identification of the product and the version,

a brief description of the product,
a brief description of the safety characteristics of the product,
a summary of the main conclusions of the assessment.

Chapter 3 - Description of the product


Functionality of the product
This section must contain a summary of the operational role and the functions of the product. This summary is
composed of :

the type of data to be processed,

the different kinds of users.

History of development
This section must present the development phases of the product (with tools, standards, methods, techniques,
etc.).

Architecture of the product

This section must present the high level architecture of the product : the separation between safety and
non-safety components, and the apportionment of the safety functions between hardware and software. A
description of the hardware and software must be given, in particular of the components concerned by safety.

Description of hardware

This section gives details on all the hardware components necessary for the assessment.

Description of software

This section gives details on all the software components necessary for the assessment.

Chapter 4 : Description of the safety requirements specification

A good understanding of the content of the safety requirements specification is necessary for the understanding
of the assessment report.

This chapter either refers to the safety requirements specification or re-describes it in its totality.


Chapter 5 : Assessment
History of the assessment
This section is designed like chapter 3. This section must present the assessment process and the main stages :

as expected and defined in the assessment plan at the beginning of the assessment,
as actually reached during the assessment.

These main stages can be : meetings, deliveries, end of technical work, etc.

Assessment procedure

A summary of the assessment plan must be presented in this section : the tasks of the assessors defined in the
assessment plan and the work packages achieved. All the differences between the work proposed in the
assessment plan and the work actually achieved must be recorded and justified.

A summary of the conformity of the assessment inputs actually delivered to the assessment inputs expected
must be given. All the differences between the delivered assessment inputs and the expected assessment inputs
must be recorded and justified.

Boundaries of the assessment

This section must identify, clearly and precisely, all the components of the assessed product and all the
hypotheses made on the components not assessed.

Constraints and hypotheses


This section must identify all the constraints encountered and all the hypotheses made during the assessment.

Chapter 6 : Summary of the assessment results


This chapter must supply a summary of the results of the assessment. The structure of the chapter is based on
the effectiveness and conformity assessments and on the criteria.

Each sub-section of the chapter must contain the name of the assessor and the reference of the work package.
All the criteria must be covered.

Chapter 7 - Remaining risks/remaining scenario


This chapter presents all the remaining risks/remaining scenarios discovered during the assessment. For each
risk/scenario, it is necessary to describe (a record sketch follows this list) :

the safety function concerned by the risk/scenario,

a description of the risk/scenario,
the task during which the assessor found the risk/scenario,
the work package during which the assessor found the risk/scenario,
the person who found the risk/scenario,
the date of discovery,
whether the risk/scenario was corrected and, if so, the date of correction,
the origin of the risk/scenario.
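A minimal record structure carrying exactly these fields, as a sketch (the field names are illustrative, not prescribed by this guide):

```python
# Illustrative record for one remaining risk/scenario; one field per item
# in the list above.
from dataclasses import dataclass
from typing import Optional

@dataclass
class RemainingRisk:
    safety_function: str           # safety function concerned
    description: str               # description of the risk/scenario
    task: str                      # task during which it was found
    work_package: str              # work package during which it was found
    found_by: str                  # person who found it
    date_found: str                # date of discovery
    date_corrected: Optional[str]  # date of correction; None if not corrected
    origin: str                    # origin of the risk/scenario
```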

Chapter 8 - Conclusions and recommendations


The main conclusion of the assessment states whether the product meets the safety requirements specification
and whether any risk of hazardous failure remains. The recommendations can contain suggestions towards the
sponsor and the supplier. The recommendations recall that the results of the assessment are valid for a particular
version of the product when it is configured in a certain way.

Annex A - List of assessment inputs


This annex must identify all the assessment inputs, with their version and the date of delivery and reception.

Annex B - Terminology and abbreviations


This annex must identify all the acronyms and terms used in the technical assessment report.

Annex C - Assessed configuration


All the configurations of this product examined during the assessment must be clearly and precisely identified.
All the hypotheses, and the configurations not taken into account, must be described. The hardware and the
software assessed must be described, in particular the elements pertinent to the assessment, namely the safety
functions.

Annex D - Work package reports


This annex is optional if all the assessment reports issued from the work packages are contained in chapter 6 of
the assessment report (only one assessor). If this annex exists, it must be composed of all the work and results
necessary to justify the verdicts of the assessors.

13. ANNEX V : STRUCTURE OF THE CERTIFICATION REPORT

This report is based on the certification report described in [ITS05]. The certification report is public, should not
contain any confidential information and should contain as a minimum :

Chapter 1 - Introduction
Objectives

This section presents the objectives of the certification report.

Terminology and abbreviations


This section identifies all the acronyms and terms used in the certification report.

References

This section defines the list of all the references used in the certification report.

Chapter 2 - Results
Conclusion

This section describes :


the precise identification of the product (with the identification number, version number, etc.),
the conclusions in terms of remaining risks/scenarios,
the recommendations of use.

Context of the assessment

a definition of the criteria used,


the identity of the supplier, and if necessary the identity of subcontractors,
the identity of the sponsor,
the date and the duration of the assessment.

Chapter 3 - Description of the product


Description of the product

This section must contain a detailed description of the operational role, the components and the functions of the
product.

Description of hardware

This section gives details on all the hardware components assessed.

Description of software
This section gives details on all the software components assessed.

Description of the documentation

This section gives details on all the documentation associated with the product. The minimum is the user
documentation.

Chapter 4 - Assessment
Technical assessment report
The reference(s) of the technical assessment report.

Main results of the assessment


This section describes the safety requirements specification, the applied criteria and the results of the assessment.
All remarks useful for understanding the results are supplied.

Chapter 5 - Recommendations of use


This chapter describes all the recommendations necessary for the users of the product.

Some recommendations can be made on the configuration and safe use of the product, notably by describing
procedural, technical and organisational measures. Some recommendations can lead to a restriction of use of the
product.

Chapter 6 - Certification
This chapter describes the scope of the certificate.


14. ANNEX VI: STRUCTURE OF A CERTIFICATE


The certificate is based on the certificate described in [ITS05].

The certificate contains :

the identification of the version of the product assessed,

the identification of the certification report (with how and where the report can be accessed),
the identity of the sponsor and the supplier,
the identification of the notified body : symbol, drawing, mark,
a statement on the significance of the certificate and on the boundaries of the assessment and
certification responsibilities,
the signature of the supplier and the notified body.
