
Pharma IT Journal

The Myth of
Software Validation
This article attempts to dispel industry confusion surrounding the term Software Validation, and to show how it can be reconciled with Computerised System Validation. It then gives some guidance on how to address software components within a wider implementation and validation project, including a discussion of the role that can be played by vendors, internal IT or engineering groups, and other third parties.
By Tim Croft, Take Control Technical Services Pty Ltd
Key Words: Software Validation, Vendor, Regulated User, V-Model, PIC/S

Too often in the therapeutic goods industry of today, the term software is confused with computerised system and, more destructively, the validation of each is treated as interchangeable.

In order to clarify the difference, the use of the term computerised system should be carefully considered. Significantly, it is defined by the Pharmaceutical Inspection Co-operation Scheme (PIC/S) guidance for computerised systems1 as "the controlling system and the controlled process in an operating environment". This definition clearly places emphasis on the computerised system, and does not limit attention to the included software only (refer to Figure 1, extracted from the PIC/S Guidance). The definition has been selected by PIC/S very deliberately, and provides a framework for the entire PIC/S approach to Good Practices for Computerised Systems.

From a GxP perspective, what is important is the controlled process, not the hardware or software in itself: a fact easily missed by technical staff who are close to the computer system, and who are not necessarily associated with GxP-related business processes such as clinical and product data analysis, product manufacture, or regulatory document management.

Successful execution of the controlled process depends on the entire computerised system working properly. The correct versions of operating systems, drivers and platform software must be installed and configured appropriately on the selected hardware in order for a subsequent implementation of application software to sufficiently support the GxP-related process. Note that application software could be custom-coded software, configured off-the-shelf (COTS) packages such as a spreadsheet, or a combination of those, while the GxP-related process is something less tangible, such as quality testing and record-keeping for finished products.

Furthermore, even if the hardware and software components of a computerised system have been perfectly selected, installed and configured to support the defined GxP-related process, failure is guaranteed unless:

- people are working with procedures that are purpose-written in a way that complements the computer operation,
- all associated non-computerised equipment behaves the way the computer system expects, and
- interfaces with external systems and processes (outside the defined scope of the particular computerised system under scrutiny) are performing consistently and accurately.

Figure 1: Illustration of a Computerised System (from PIC/S Guidance1)
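The point of the PIC/S definition is that software is only one element of the validation scope. As a purely illustrative sketch (the class and field names below are invented, not part of any PIC/S or GAMP deliverable), the scope of a computerised system could be recorded so that neglected elements are flagged:

```python
# Hypothetical sketch: recording the scope of a computerised system as
# framed by the PIC/S definition (controlling system plus controlled
# process in an operating environment). All names are illustrative.

from dataclasses import dataclass, field

@dataclass
class ComputerisedSystem:
    name: str
    software: list = field(default_factory=list)    # applications, OS, drivers
    hardware: list = field(default_factory=list)    # servers, instruments
    procedures: list = field(default_factory=list)  # SOPs complementing operation
    equipment: list = field(default_factory=list)   # associated non-computerised kit
    interfaces: list = field(default_factory=list)  # external systems and processes

    def scope_gaps(self):
        """Return the element categories with nothing recorded; each is a
        validation blind spot under the 'system, not software' view."""
        cats = ("software", "hardware", "procedures", "equipment", "interfaces")
        return [c for c in cats if not getattr(self, c)]

lims = ComputerisedSystem(
    name="QC LIMS",
    software=["LIMS application", "database", "server OS"],
    hardware=["application server", "HPLC workstations"],
    procedures=["SOP: sample registration"],
)
print(lims.scope_gaps())  # -> ['equipment', 'interfaces']
```

Such a checklist does nothing clever; its value is simply that the non-software elements cannot silently drop out of scope.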

www.PharmaIT.co.uk

Vol. 2 No. 1 January-March 2008


Fortuitously, PIC/S also provides clues to a potential solution to the problems stemming from confusion between software and a computerised system. According to the PIC/S Guidance scope, activities dwelling in the Regulated GxP Environment include regulatory submissions, research & development, clinical trials, procurement, dispensing/weighing, manufacturing, assembly, testing, Quality Control, Quality Assurance, inventory control, storage & distribution, training, calibration, maintenance, contracts & technical agreements, and associated records & reports. From this broad scope, it is obvious that a Computerised System may or may not involve a traditional IT-based computer system. In reality, it could also be a process automation system, instrumentation, or any other equipment that involves some sort of software (or firmware) running on a hardware platform. In a traditional IT-based computerised system there is a disproportionate emphasis on software (compared, perhaps, to an equally disproportionate emphasis on hardware in process automation and instrumentation systems). Therefore, when considering a traditional IT-based system, there actually is less distinction between Software Qualification and Computerised System Validation. However, as established above, software doesn't work in isolation; success depends on hardware, supporting equipment and processes all performing as the software expects them to.

Significantly, in their recently-published update to their Guidance for Computerized Systems Used in Clinical Investigations, the FDA appear to have deliberately removed all references to software in relation to validation and testing, in favour of a more consistent use of the term Computerized System. It should be noted that caution is to be exercised when interpreting the FDA's use of the term Software Validation in some of their less-recently updated documents, since it often manifests as legacy terminology from the days when computer validation in the United States spawned from the medical device industry. (Unfortunately, nothing it seems can be done about the Anglo-American schism over the use of 'z' or 's' in such written words, however!)

Whilst it appears that all regulatory-related bodies globally are making a valiant attempt to clarify the distinction between software and a computerised system, even PIC/S itself contributes somewhat to the confusion when it identifies a requirement for quality and functionality to be built into the software in a disciplined manner, but for validation of computerised systems. How then to build a workable validation model that satisfies both the need for Computerised System Validation and Software Quality, when the two have overlapping scope but differing goals?

Full Compliance Lifecycle

The answer begins to take shape in the statement from PIC/S1 that a User Requirement Specification (URS) includes non-software (e.g. human processes) and hardware as well as software. Such a URS must therefore pertain to an entire Computerised System, not just software, and since the remainder of a validation project hinges entirely on the URS (assuming adherence to the V-Model, discussed below), it follows that the subject Computerised System will be validated as a whole.

Now, it seems, would be the ideal time to call upon what has become a familiar and trusted friend to many professionals working in the therapeutic product industries: the V-Model. The V-Model has become a quasi-universal standard used in regulated industries globally, and can be found in validation strategies employed by the smallest organisations through to the largest conglomerates. Significantly, the V-Model is typically applied on these sites not only for validation of computerised systems but for validation of almost everything and anything required by GxP regulations (including processes, facilities, equipment and cleaning, with the list always growing). For the purposes of this discussion, focus will be limited to one adaptation of the V-Model that is of particular value when applied to implementation of a computerised system. It is this adaptation that is illustrated in Figure 2.
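One common way to read a V-Model adaptation of this kind is as paired specification and verification stages: the URS is verified by PQ, the Functional Specification by OQ, and the Design Specification by IQ, a pairing the article returns to in its discussion of roles. The sketch below simply encodes that pairing; it is an illustration, not a mandated structure:

```python
# Sketch of the V-Model's paired stages: each specification on the left
# leg is verified by a corresponding qualification on the right leg.
# The URS/PQ, FS/OQ and DS/IQ pairing follows the common adaptation
# discussed in the article.

V_MODEL = [
    # (specification, verifying qualification)
    ("User Requirement Specification", "Performance Qualification"),
    ("Functional Specification",       "Operational Qualification"),
    ("Design Specification",           "Installation Qualification"),
]

def verification_for(spec: str) -> str:
    """Look up which qualification phase verifies a given specification."""
    for s, q in V_MODEL:
        if s == spec:
            return q
    raise KeyError(spec)

# Descending the left leg, building, then ascending the right leg:
left_leg = [s for s, _ in V_MODEL]
right_leg = [q for _, q in reversed(V_MODEL)]
print(" -> ".join(left_leg + ["Build/Configure"] + right_leg))
```

The design choice worth noting is that the pairing itself is data: adding a stage to the model automatically adds it to both legs of the traceability path.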
Figure 2: The V-Model

One significant advantage of using the above model is that a User Requirement Specification for any particular computerised system may be created before a vendor or product is selected. This will ensure that system functionality is selected to match the user's business requirements, not the other way around. Note that any particular set of selected hardware and software components is only one method of implementing a given set of GxP requirements2. Secondly, the V-Model provides a structured, stepwise design methodology that ensures suitability of the final solution at multiple checkpoints along the project's journey. Since costs associated with flaws and re-engineering are borne


disproportionately by the purchaser of a computerised system4, it is then in the interest of that purchaser to have the design phase monitored carefully through to implementation.

PIC/S says that IQ, OQ and PQ (performed on the computerised system as a whole, according to the above model) may be sufficient testing alone for some simpler GxP systems, but increasingly complex systems will require a more detailed software testing effort. Evidently, then, Software Qualification must be incorporated somewhere within this Computerised System Validation process to ensure the presence of built-in quality and functionality as stipulated by the regulatory authorities. Since such software may involve varying amounts of configuration, customisation and bespoke coding3 to complement the standard functions incorporated in off-the-shelf software packages, this Software Qualification will consist of varying activities such as software module specifications, unit/integration testing, source code reviews, vendor qualifications, and industry user group liaison. It must be noted that while IQ, OQ and PQ can ensure high-level compliance with the requirements of a particular installation, the inherent reliability of a software product is attributable to the quality of all software engineering procedures followed during its development. This includes design, coding, verification testing, integration and change control (including after-sales support)1.

PIC/S nominates Design Validation or Qualification as a receptacle for documented evidence of additional software-level testing. It is therefore a natural extension of this that all evidence of Software Qualification is deposited there together. When only off-the-shelf software is selected for a computerised system, this will most likely consist only of evidence showing that the software is in fact off-the-shelf (such as market statistics and user group literature). When custom-built software is utilised, this will be significantly more, including a Vendor Qualification, or alternatively a full set of Software Development Life-Cycle deliverables for each module of software.

However, a Design Qualification serves more purpose than being simply a vehicle to present evidence for Software Qualification, such as traceability. Traceability in this context is showing that each step of the design process does in fact progress from the previous, and that the entire validation effort does stem ultimately from the URS. When it comes to software forming part of a computerised system, the purpose of traceability is to prove a link from selected component functionality and configuration back to the requirements and functional specification, for which existing vendor-supplied software documentation is helpful and important to reference (if available)5. This is where the quality of a vendor, and in particular its experience in a GxP environment, can be invaluable.
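Traceability of this kind can be checked mechanically: every URS item should trace forward to at least one functional-specification item, and every functional-specification item to at least one test. A minimal hypothetical sketch, with all document IDs invented for illustration:

```python
# Minimal sketch of a traceability check across V-Model deliverables.
# Every URS requirement must trace to at least one functional-spec item,
# and every FS item to at least one qualification test. IDs are invented.

urs_to_fs = {
    "URS-001": ["FS-010"],
    "URS-002": ["FS-020", "FS-021"],
    "URS-003": [],            # untraced requirement: a validation gap
}
fs_to_test = {
    "FS-010": ["OQ-110"],
    "FS-020": ["OQ-120"],
    "FS-021": [],             # specified but never tested
}

def trace_gaps(mapping):
    """Return the items that do not trace forward to anything."""
    return sorted(item for item, targets in mapping.items() if not targets)

print(trace_gaps(urs_to_fs))   # -> ['URS-003']
print(trace_gaps(fs_to_test))  # -> ['FS-021']
```

In practice this is the same check a traceability matrix performs on paper; automating it simply makes the gaps impossible to overlook as the documents evolve.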
Thus there is satisfaction that a practical solution to the Software Qualification versus Computerised System Validation dilemma can be fashioned working within the faithful V-Model, and true to the expectations of the authorities. One last thing, though: how to explain this rather involved methodology to colleagues, customers, auditors and inspectors? The answer is simple: the formalisation of procedures and/or plans to govern design, purchase/development (including the basis for vendor selection) and implementation and validation of computerised systems (including the Software Development Life-Cycle where appropriate). This is explicitly identified by PIC/S as mandatory for all GxP-related systems (and even desirable for all others as part of good business practice), and it is strongly suggested that they be governed by a Validation Master Plan1.

GAMP adds one important point to this, though: computerised system quality does not depend only on validation tasks performed prior to the commencement of operation. Equally important are the ongoing tasks, such as maintenance and training, that ensure the validated state of the system remains intact3. Hence it is a vital part of a project Validation Plan to identify such post-implementation activities and ensure that they are adequately addressed prior to official go-live.

Roles

A side-effect of the confusion between software and computerised systems is the expectation that responsibility for validation can be divested wholly to a vendor of software. As discussed above, there are dangers associated with such an attitude, such as (a) an obsession with core software functionality whilst largely ignoring the remainder of what forms the full computerised system, and (b) being left with business processes that are moulded to suit the functionality of one particular piece of software available on the market. In addition, there are several other good reasons to keep the vendor a safe distance from your validation effort.

First of all, PIC/S explicitly says that it is essential for the regulated user to define a requirement specification prior to selection, and to carry out a properly documented supplier assessment and risk assessment. Furthermore, it says that the regulated user has responsibility for conducting a PQ against the URS for both the controlling system and the controlled process1. GAMP then offers its contribution with a stipulation that PQ is to be executed in the specified operating environment, which notably includes associated procedures3, and hence should be performed on-site by the user.

At the very least, the Design Specification and Installation Qualification phases of the V-Model must be implementation-specific (i.e. describing what is physically installed on a current site, rather than typical diagrams and the like) in order to be useful from an engineering perspective. Even non-regulated industries acknowledge the importance of high-quality documentation to describe the current state of computer equipment (particularly in relation to complicated and sprawling networks), and for that documentation to be verified against the pieces of silicon, copper & plastic actually in the field. Therefore it follows that if a software vendor or integrator takes responsibility for the Design Specification and/or Installation Qualification for a particular computerised system, the regulated user must take special care to ensure that they are (a) sympathetic to the existing site computer infrastructure and (b) accurate representations of what is actually installed (including details of configuration). A Software Design Specification alone is simply not sufficient.

This is all reinforced by a declaration that the regulated user holds ultimate responsibility for the reliability of tasks performed by the computerised system (PIC/S p5). Suppliers are not themselves bound by the regulations, but are held accountable by their customers6. A vendor audit (for example) is primarily for the user to present in an audit situation. Note that enlightened suppliers will provide evidence of appropriate quality policies, systems and procedures, which (as recommended by PIC/S) may include a recognised Quality Management System1. All these things will contribute to a customer's confidence in their ability to give evidence of a supplier's suitability in the face of regulatory scrutiny. Note that this concept applies not only to external vendors, but also to an internal Engineering, IT or


Software Development group. In this case, the end-users can be considered the customer or regulated user, and must play a large role in the implementation and validation of the system6, in order to ensure successful execution of their regulatory-critical business processes.

This is not to say, however, that a vendor must never be involved beyond merely the supply of the software off the shelf. The reality of many a regulated user's situation is that the required specialised resources simply are not available in-house to implement and validate an entire new computerised system. PIC/S even explicitly states that "Where regulated users do not have resources for engineering and design within their own organisation, there is heavy reliance on the supplying company's resources."1

Therefore it makes sense to divest some of the activities and deliverables to a vendor, as long as the scope is explicitly defined (in a Project Validation Plan, for example), and that scope is executed under the vigilant guidance of representatives of the regulated user. Ideally, such activities can be defined before a vendor is selected, helping to address the assessment of the capability of potential vendors in areas apart from the pure functionality of the software. Even if the selected vendor is deemed incapable of the level of rigour demanded by GxP, a third-party consultant can always be engaged to supplement a vendor's project responsibilities4. A good guide is that the User Requirement Specification and Performance Qualification should be owned by the end user, the Functional Specification and Operational Qualification by the developer or vendor, and the Design Specification and Installation Qualification by IT or the Integrator6.
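That ownership guide can be written down as a simple mapping, which makes "who owns this deliverable?" an easy question to answer when drafting a Validation Plan. A sketch using the role names from the guide above; a real plan would of course name individuals, not roles:

```python
# Sketch of the deliverable-ownership guide from the article: URS/PQ to
# the end user, FS/OQ to the developer or vendor, DS/IQ to IT or the
# integrator. Role strings are illustrative shorthand.

OWNERSHIP = {
    "User Requirement Specification": "end user",
    "Performance Qualification":      "end user",
    "Functional Specification":       "developer/vendor",
    "Operational Qualification":      "developer/vendor",
    "Design Specification":           "IT/integrator",
    "Installation Qualification":     "IT/integrator",
}

def deliverables_for(owner: str):
    """List the V-Model deliverables a given role is responsible for."""
    return [d for d, o in OWNERSHIP.items() if o == owner]

print(deliverables_for("end user"))
# -> ['User Requirement Specification', 'Performance Qualification']
```

Note how each specification and its verifying qualification share an owner, which keeps authorship and testing responsibility aligned on each rung of the V.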

Summary

Thus it has been shown that the term Software Validation is a misnomer. Software components can be qualified within a wider computerised system validation effort utilising a full life-cycle implementation of the V-Model. Vendors, internal technical groups and other third parties all have an important role to play, but ultimate responsibility lies with the end-user as the regulated user of a computerised system.
Tim Croft
Computer System Validation Specialist
Take Control Technical Services Pty Ltd
Email: tim.croft@tc-ts.com.au

References:
1 The Pharmaceutical Inspection Co-operation Scheme, Good Practices for Computerised Systems in Regulated GxP Environments (Sep 2007)
2 R.E. Chew, 'Enhanced Design Review/Design Qualification', Pharmaceutical Engineering, 30-38 (Jan/Feb 2003)
3 The International Society for Pharmaceutical Engineering, Good Automated Manufacturing Practice (Version 4, 2001)
4 S. Uzzaman, 'Computer Systems Validation: A Systems Engineering Approach', Pharmaceutical Engineering, 52-66 (May/Jun 2003)
5 The ISPE GAMP Forum, 'GAMP Traceability for GxP Regulated Applications', Pharmaceutical Engineering, 52-55 (Jan/Feb 2006)
6 T. Stokes, 'When GCP Meets IT', GCPj, 15-18 (Apr 2006)
