
Thesis submitted for the degree of Master of Information Technology

(Research, IT60)

An Improved Method to Identify Critical Processes

Craig M Huxley

2003

School of Information Systems


Faculty of Information Technology
Queensland University of Technology
A Member of the Centre for Information Technology Innovation


Key Words
Business Process Improvement

Business Process Improvement Targeting

Balanced Scorecard

Delphi Study

Focus Group

Implementation

Multiple Case Studies


Abstract

Nearly 70% of process improvement projects are failing to provide the expected benefits (Grant 2002). The cost of process improvement projects can be quite substantial, and the number of these projects occurring within organisations continues to increase. John Thorp (1998) describes an environment in which managers are struggling to demonstrate the connection between costs and expected business benefits. This eighteen-month master's research project has identified a gap in both the academic literature and the business practices of most organisations. This thesis aims to make explicit the selection of processes to improve and to provide the link between process objectives and organisational goals (Davenport 1993; Hammer and Champy 1993).

Published literature, coupled with the experience of the research team, has resulted in the development of a targeting methodology for defining and ranking critical processes, and then selecting which of those critical processes to improve first. Although the research team believes that the methodology is applicable to many industries, the research was undertaken in the application hosting centre (AHC) and application service provision (ASP) industry. A focus group and a follow-on Delphi study were used to ensure that the processes and functional area focused upon were of importance to the participants of the research.

This research project was funded by the Australian Research Council's Linkage projects scheme and undertaken with the support of REALTECH. The participants included the top three information systems outsourcing companies in Australia and another company in the top ten of this industry.

The study commenced with identifying critical processes in the ASP environment. This involved both a focus group session and a Delphi study. The Delphi study was followed by four action learning cycles using case studies (action, observe, reflect and revise). These action learning cycles using case studies have revealed that the methodology (which includes the steps to implement the methodology) meets the needs of organisations to identify and select 'critical' processes for improvement. It provides businesses and researchers with a logical and explicit method to reduce the 'squeaky wheel' and 'latest fad' approaches to process improvement projects. These prior approaches improve processes that are not necessarily critical to achieving organisational goals, consuming limited resources for little gain. The targeting method makes the alignment of process objectives with organisational goals possible by explicitly linking processes to those goals.

The limitations of this research project are that it does not verify the achievement of business benefits, document the changes to an organisation arising from its use of the targeting methodology, or determine the long-term benefits to an organisation using the targeting methodology. These questions might be answered in a longer and larger study, as this project is limited to an eighteen-month time frame. As for generalisability, the study has focused on the AHC and ASP industries, and the participants, while operating within this industry, are quite different. Across the different phases of this project the participants come from in-house providers, multinational outsourcing providers, commercialised government providers, specialist niche product providers, and enterprise system suppliers.


Table of Contents

1 Introduction.............................................................................................................................1
1.1 The Present Information Systems Environment...............................................................1
1.2 Motivation for Research ..................................................................................................3
1.3 Research Question...........................................................................................................6
1.3.1 Research Objectives....................................................................................................6
1.3.2 Research Outcomes ....................................................................................................7
1.4 Relation of the Research to Previous Work.....................................................................8
1.5 Format of the Thesis........................................................................................................9

2 Context of the Problem.........................................................................................................13


2.1 The Environmental View ...............................................................................................13
2.2 REALTECH Australia ...................................................................................................17
2.3 Process-oriented Administration of Enterprise Systems ...............................................19
2.3.1 Reference Process Models for ES Service Delivery.................................................22
2.3.2 The Participants ........................................................................................................25

3 Literature Review .................................................................................................................35


3.1 Introduction ...................................................................................................................35
3.2 Identifying Critical Processes .......................................................................................38
3.3 Process Selection...........................................................................................................47
3.4 Assessment of the Five Factors .....................................................................................52
3.4.1 Impact of a Process on Organisational Goals ...........................................................53
3.4.2 The Balanced Scorecard ...........................................................................................56
3.4.3 Performance Measures..............................................................................................58
3.4.4 ‘Cause and Effect’ Relationships..............................................................................62
3.4.5 Impact Factor Summary ...........................................................................................65
3.4.6 Dependency ..............................................................................................................66
3.4.7 Probability of Failure of a Process............................................................................68
3.4.8 Cost/ Benefit .............................................................................................................70
3.4.9 Probability of Successful Improvement of a Process................................................71
3.5 Summary of Literature Review ......................................................................................73

4 The Targeting Methodology.................................................................................................74


4.1 The Steps for Business Process Improvement Targeting...............................................75
4.1.1 Step One -Preplanning..............................................................................................78
4.1.2 Step Two –Defining Scope and Introduction............................................................78
4.1.3 Step Three: Assessing Dependency..........................................................................80
4.1.4 Step Four: Assessing Probability of Failure .............................................................83
4.1.5 Step Five: Developing a Balanced Scorecard...........................................................85
4.1.6 Step Six: Assessing Impact.......................................................................................88
4.1.7 Step Seven: Calculating Criticality...........................................................................92
4.1.8 Step Eight: Assessing Cost/benefit ...........................................................................95
4.1.9 Step Nine: Assessing the Probability of Successful Improvement of a Process .......97
4.1.10 Step Ten: Selecting Which of the Critical Processes to Improve First ................98
4.2 Version 1 of the Targeting Method..............................................................................100
4.3 Targeting Methodology Summary ...............................................................................104


5 Research Method and Design ............................................................................................ 107


5.1 Data Collection Phase Description............................................................................. 108
5.2 Design of Data Collection........................................................................................... 109
5.3 Focus Group Method .................................................................................................. 115
5.3.1 Focus Group Purpose, Approach and Outcomes.................................................... 115
5.3.2 The Focus Group as a Research Method ................................................................ 116
5.3.3 Characteristics of Focus Groups............................................................................. 117
5.3.4 Focus Group Recruitment ...................................................................................... 118
5.3.5 Data Collection....................................................................................................... 118
5.3.6 Data Analysis ......................................................................................................... 119
5.4 Delphi Method............................................................................................................. 121
5.4.1 Delphi Study........................................................................................................... 121
5.4.2 The Delphi Study as a Research Method................................................................ 122
5.4.3 Delphi Study Recruitment ...................................................................................... 124
5.4.4 Data Collection....................................................................................................... 125
5.4.5 Data Analysis ......................................................................................................... 126
5.5 Action learning using Case Study ............................................................................... 128
5.5.1 Action Learning Method ........................................................................................ 129
5.5.2 The Case Study Method ......................................................................................... 130
5.5.3 Case Study Recruitment ......................................................................................... 131
5.5.4 Action Learning Case Study Data Collection......................................................... 132
5.5.5 Action Learning Case Study Data Analysis ........................................................... 132
5.6 Ethical Considerations................................................................................................ 134
5.7 Summary of the Research Methodology...................................................................... 135

6 Identifying Critical Processes ............................................................................................ 136


6.1 Focus Group ............................................................................................................... 136
6.1.1 The Focus Group Session....................................................................................... 137
6.1.2 Description of the Focus Group Session ................................................................ 138
6.1.3 Data Analysis of the Focus Group.......................................................................... 147
6.1.4 Focus Group Summary........................................................................................... 151
6.2 Delphi Study................................................................................................................ 153
6.2.1 Purpose of the Delphi Study................................................................................... 153
6.2.2 The Participants...................................................................................................... 153
6.2.3 Description of the Delphi Study ............................................................................. 156
6.2.4 Data Analysis of the Delphi Study ......................................................................... 162
6.3 Identification of Critical Processes Summary............................................................. 177

7 Action learning Using Case Studies .................................................................................. 180


7.1 Case Study Participants .............................................................................................. 182
7.2 Cycle one of the Action Learning................................................................................ 187
7.2.1 Background ............................................................................................................ 187
7.2.2 Purpose and Business Problem............................................................................... 187
7.2.3 Proposed Approach ................................................................................................ 188
7.2.4 Actual Approach and Observation ......................................................................... 189
7.2.5 Reflection Phase Cycle One ................................................................................... 192
7.3 Cycle 2 of Action Learning: Case Study One.............................................................. 199
7.3.1 Purpose and Business Problem............................................................................... 200
7.3.2 Proposed Approach ................................................................................................ 200
7.3.3 Actual Approach and Observations........................................................................ 201
7.3.4 Reflection Phase Cycle Two .................................................................................. 228
7.3.5 Revision Phase Cycle Two..................................................................................... 230


7.4 Cycle 3 of Action Learning: Case Study Two..............................................................237


7.4.1 Purpose and Business Problem ...............................................................................237
7.4.2 Proposed Approach.................................................................................................238
7.4.3 Actual Approach and Observations ........................................................................239
7.4.4 Reflection Phase Cycle Three.................................................................................258
7.4.5 Revision Phase Cycle Three ...................................................................................262
7.5 Cycle 4 of Action Learning; Case Study Three ...........................................................272
7.5.1 Purpose and Business Problem ...............................................................................272
7.5.2 Proposed Approach.................................................................................................273
7.5.3 Actual Approach and Observations ........................................................................274
7.5.4 Reflection Phase Cycle Four ..................................................................................286
7.5.5 Revision Phase Cycle Four.....................................................................................295
7.6 Cross Case Analysis ....................................................................................................303
7.7 Action Learning Summary...........................................................................................307

8 Findings and Limitations....................................................................................................310


8.1 Findings.......................................................................................................................311
8.2 Limitations...................................................................................................................314
8.2.1 Future/Follow-on Research ....................................................................................319
8.2.2 Epilogue..................................................................................................................320
8.3 Summary......................................................................................................................323

9 References............................................................................................................................324

10 Appendices...........................................................................................................................333
10.1 Appendix 1- BSC Implementation Issues.....................................................................333
10.2 Appendix 2- Benefits Documents.................................................343
10.3 Appendix 3- Focus Group Information Pages ...........................344
10.4 Appendix 4- Ethics Documentation .............................................346
10.5 Appendix 5- IDC Definitions .......................................................................................351


List of Figures
FIGURE 1-MODEL OF RESEARCH APPROACH......................................................................................10
FIGURE 2- MODEL SHOWING RELATIONSHIPS BETWEEN DIFFERENT BUSINESS MODELS ....................16
FIGURE 3- MODEL OF THE OVERALL PROJECT AND THE TWO INTERLINKED PROJECTS.......................20
FIGURE 4- REPRESENTATION OF A VALUE CHAIN OF ASP SERVICE DELIVERY .................................23
FIGURE 5-VERSION 1 REFERENCE MODEL OF ASP SERVICE DELIVERY (TAYLOR 2002) ...................23
FIGURE 6- POSSIBLE COMPARISONS OF DIFFERENT TYPES OF ENTITY ...............................................26
FIGURE 7- CATEGORIES OF PERFORMANCE LEVELS ...........................................................................44
FIGURE 8- CATEGORIES OF PERFORMANCE MODEL ...........................................................................44
FIGURE 9- THE THREE FACTORS USED TO ASSESS CRITICALITY .........................................................46
FIGURE 10- THE TWO FACTORS FOR SELECTION ADDED TO CRITICALITY ..........................................50
FIGURE 11- FIVE FACTORS FOR IDENTIFYING AND SELECTING A CRITICAL PROCESS .........................51
FIGURE 12-DESCRIPTION OF ALTERNATIVE TOOLS FOR ACHIEVING ALIGNMENT...............................55
FIGURE 13- EXAMPLE OF CAUSE AND EFFECT FOR BSC (KAPLAN AND NORTON 1996) ....................63
FIGURE 14- EXAMPLE OF PROCESSES LINKED TO GOAL ....................................................................64
FIGURE 15- EXAMPLE OF THE CATEGORIES OF PERFORMANCE..........................................................69
FIGURE 16- DIAGRAM SHOWING THE TWO MAIN AREAS OF THE METHODOLOGY; CRITICALITY AND SELECTION ..............................75
FIGURE 17- THE TEN STEP TARGETING METHOD................................................77
FIGURE 18-VERSION 1 REFERENCE MODEL OF ASP SERVICE DELIVERY (TAYLOR 2002) .................79
FIGURE 19- CATEGORIES OF PERFORMANCE MODEL .........................................83
FIGURE 20-EXAMPLE OF PROCESSES AND OR OBJECTIVES WHICH NEEDED TO BE CONSIDERED TOGETHER ...............................87
FIGURE 21- PORTION OF THE PREVIOUS EXAMPLE, OF A BSC MAP, SHOWING IMPACT ASSESSMENTS ................................89
FIGURE 22- CALCULATING ALONG THE BRANCH ...............................................................................91
FIGURE 23-MODEL OF RESEARCH APPROACH.................................................................................. 107
FIGURE 24- DIAGRAM OF THE RESEARCH APPROACH...................................................................... 108
FIGURE 25- COSMOS CORPORATION’S MODEL OF DIFFERENT RESEARCH STRATEGIES (YIN 1994) ..............110
FIGURE 26- MODEL OF THE ACTION LEARNING CYCLES USING CASE STUDY ................................... 129
FIGURE 27- EXAMPLE OF POST-IT NOTES USED IN FOCUS GROUP .................................................... 141
FIGURE 28- EXAMPLE OF THE TYPE OF RESPONSE FOR IDENTIFYING PROCESSES ............................. 142
FIGURE 29- INFORMATION TECHNOLOGY STRATEGY PROCESSES ................................................... 143
FIGURE 30-HARDWARE, SOFTWARE AND APPLICATION MANAGEMENT PROCESSES ...................... 143
FIGURE 31- ENABLING PROCESSES ................................................................................................. 144
FIGURE 32- SERVICE SUPPORT PROCESSES ..................................................................................... 145
FIGURE 33- EXAMPLE OF MOST CRITICAL AND LESS CRITICAL PROCESSES ...................................... 145
FIGURE 34- INSTRUCTIONS AND CHECKLIST SUPPLIED WITH THE DELPHI STUDY ............................ 157
FIGURE 35- EXAMPLE OF COMMENTS ADDED BY PARTICIPANTS ..................................................... 159
FIGURE 36- COMPARISON OF SD FOR SERVICE SUPPORT FOR THE 3 ROUNDS OF THE DELPHI STUDY .............. 174
FIGURE 37-RATING RESULTS FOR THE PROCESS HEADINGS FROM THE DELPHI STUDY .................... 175
FIGURE 38-THE FOUR PHASES OF ACTION LEARNING ..................................................................... 180
FIGURE 39-DESCRIPTION OF THE ACTION LEARNING CYCLES USING CASE STUDIES ........................ 181
FIGURE 40- SECTION OF 'MAP' DEVELOPED BY RESEARCH TEAM SHOWING THE 'ARROWS'.............. 190
FIGURE 41- AGENDA FOR FIRST MEETING WITH CSC...................................................................... 202
FIGURE 42- REFERENCE MODEL OF ASP SERVICE DELIVERY .......................................................... 204
FIGURE 43- CSC VERSION OF AHC (CHANGES HIGHLIGHTED IN BLUE)........................................... 204
FIGURE 44-CSC BALANCED SCORECARD 'MAP' SHOWING 'CAUSE & EFFECT'.................................. 207
FIGURE 45- COMPARISON OF FIRST STYLE OF BSC REPRESENTATION WITH NEW MIND MAPPER STYLE .............. 210
FIGURE 46- DOCUMENT USED TO PROVIDE VALUES FOR IMPACT, DEPENDENCY AND PROBABILITY OF FAILURE .............. 213
FIGURE 47- PORTION OF LARGER BSC MAP SHOWING PROCESSES LINKED TO OBJECTIVES ............. 215
FIGURE 48- EXAMPLE OF PROCESSES AND OR OBJECTIVES WHICH NEEDED TO BE CONSIDERED TOGETHER .............. 216
FIGURE 49- EXAMPLE OF %'S TAKEN TO PROVIDE IMPACT VALUE .................................................. 217


FIGURE 50- CSC MAP SHOWING LINKED PROCESSES IN RED TO INTERNAL PROCESS OBJECTIVES IN DARK GREEN WITH THE VALUATIONS IN BLACK BELOW THEM .............. 222
FIGURE 51- LEGEND FROM FIGURE 50.............................................................................................224
FIGURE 52- COMPARISON OF DIFFERENCES IN THE TWO PARTICIPANTS’ RATINGS ...........................226
FIGURE 53- GRAPHICAL VIEW OF THE RESULTS...............................................................................227
FIGURE 54- REALTECH VERSION OF REFERENCE MODEL OF ASP SERVICE DELIVERY (ALTERATIONS IN BLUE) .............. 240
FIGURE 55- REALTECH (EXAMPLE) CORPORATE LEVEL BSC .......................................................244
FIGURE 56- REALTECH (EXAMPLE) REMOTE SERVICES BSC .......................................................245
FIGURE 57- RESULTS OF VALUATION OF DEPENDENCY, PROBABILITY OF FAILURE AND IMPACT .....250
FIGURE 58-CHART SHOWING RESULTS OF REALTECH PROCESS IDENTIFICATION PROJECT ..........253
FIGURE 59- EXAMPLE OF THE REALTECH REMOTE SERVICES BUSINESS UNIT BSC.....................256
FIGURE 60- SECTION OF THE LESS DETAILED INFORMATION PROVIDED IN THE 2.2 EXPLANATION STEP .............. 263
FIGURE 61- BSC PERSPECTIVE VIEW OF CITEC STRATEGIC PLAN ....................................................277
FIGURE 62- SECTION OF THE BSC PERSPECTIVE VIEW OF THE CITEC STRATEGIC PLAN ...................278
FIGURE 63- EXAMPLE OF THE TYPE OF LAYERING NEEDED TO SHOW CAUSE & EFFECT LINKAGES ...279
FIGURE 64- COMPLETE STRATEGIC MAP FOR CITEC INCLUDING PROCESSES LINKED TO OBJECTIVES .............. 284
FIGURE 65- CITEC STRATEGIC MAP NOT SHOWING PROCESSES LINKED TO OBJECTIVES ..................285
FIGURE 66- FOUR CYCLES OF ACTION LEARNING.............................................................................307
FIGURE 67- FIVE FACTORS FOR IDENTIFYING AND SELECTING A CRITICAL PROCESS........................312
FIGURE 68- INFORMED CONSENT FOR FOCUS GROUPS PAGE 1 OF 3 ................................................346
FIGURE 69-INFORMED CONSENT FOR FOCUS GROUPS PAGE 2 OF 3 .................................................347
FIGURE 70-INFORMED CONSENT FOR FOCUS GROUPS PAGE 3 OF 3 .................................................348


List of Tables
TABLE 1- IT SERVICES AS DEFINED BY IDC (2002) ...........................................................................13
TABLE 2- CONTACTED COMPANIES AND PARTICIPANTS BY NUMBER OF TYPE ...................................26
TABLE 3- PARTICIPANTS BY NAME AND TYPE ...................................................................................27
TABLE 4- TOP 10 ORGANISATIONS IN OUTSOURCING MARKET: 2001...............................................28
TABLE 5- HIGH LEVEL PROCESS IN WHICH PARTICIPANTS OPERATE IN THE ASP MARKET ...............29
TABLE 6- EFFECT OF FAILURE OF A PROCESS - RANKING GUIDELINES (STAMATIS 1995)...................67
TABLE 7- COMPARING TARGETING FACTOR-DAVENPORT (1993) AND HAMMER & CHAMPY (1993) 74
TABLE 8- EFFECT OF FAILURE OF A PROCESS - RANKING GUIDELINES (STAMATIS 1995)...................82
TABLE 9- EXAMPLE OF TABLE USED TO RECORD ASSESSMENT RESULTS ...........................................82
TABLE 10- EXAMPLE OF TABLE USED TO RECORD ASSESSMENT RESULTS .........................................84
TABLE 11- EXCEL TABLE SHOWING POSSIBLE FORMULA FOR CALCULATING IMPACT........................91
TABLE 12- EXAMPLE OF TABLE USED TO RECORD ASSESSMENT RESULTS .........................................92
TABLE 13- EXAMPLE OF TABLE USED TO CALCULATE CRITICALITY ..................................................93
TABLE 14- EXAMPLE OF THRESHOLD ISSUE ......................................................................................94
TABLE 15- NUMBER OF PARTICIPANTS FROM EACH COMPANY ........................................................ 119
TABLE 16- EXCERPT FROM AN EXCEL SHEET SHOWING DELPHI STUDY PROCESSES ........................ 121
TABLE 17 - COMPANY TYPE FOR DELPHI STUDY ............................................................................. 124
TABLE 18- PARTICIPANTS BY NAME AND TYPE ............................................................................... 125
TABLE 19- START, RETURN AND ANALYSED DATES FOR DELPHI STUDY ......................................... 125
TABLE 20- THE NUMBER OF PARTICIPANTS FROM EACH COMPANY IN THE DELPHI STUDY .............. 126
TABLE 21- NUMBER OF PARTICIPANTS FROM EACH COMPANY ........................................................ 137
TABLE 22- DRAFT VALUE CHAIN FOR ASP SERVICE DELIVERY ....................................................... 139
TABLE 23- AREAS OF IS OUTSOURCING IN WHICH FOCUS GROUP PARTICIPANTS OPERATE ............ 140
TABLE 24- VALUE CHAIN OF ASP AND ENABLING PROCESSES ........................................................ 146
TABLE 25- APPLICATION SERVICE PROVISION VALUE CHAIN ........................................................ 147
TABLE 26- LIST OF ALL THE CRITICAL PROCESSES AND THEIR VALUE AND PROBLEM SCORE .......... 149
TABLE 27 - PARTICIPANT TYPE FOR DELPHI STUDY ........................................................................ 154
TABLE 28- NUMBER OF PERSONS FROM EACH TYPE OF ORGANISATION........................................... 154
TABLE 29- TOTAL NUMBER OF PARTICIPANTS AND TOTAL RESPONSES ........................................... 155
TABLE 30- LIST OF PROCESSES AND PROCESS HEADINGS USED IN DELPHI STUDY ........................... 156
TABLE 31- START, RETURN AND ANALYSED DATES FOR DELPHI STUDY ......................................... 157
TABLE 32-EXAMPLE OF COLLATED AND ANALYSED RESPONSES .................................... 158
TABLE 33- NUMBER OF RESPONSES TO ROUND ONE OF THE DELPHI STUDY ................................... 158
TABLE 34- NUMBER OF RESPONSES TO ROUND TWO OF THE DELPHI STUDY .................................. 160
TABLE 35- LIST OF PROCESS HEADINGS FOR DELPHI STUDY ........................................................... 161
TABLE 36- THE NUMBER OF PARTICIPANTS FROM EACH COMPANY IN THE DELPHI STUDY .............. 162
TABLE 37- COLLATED RESPONSES FOR ROUND ONE OF DELPHI STUDY .......................................... 165
TABLE 38- PROCESS FROM ROUND ONE WITH RESULTING MEAN OF LESS THAN THREE ................... 165
TABLE 39-PROCESSES WITH A RESULTANT MEAN OF GREATER THAN FOUR FOR CRITICALITY ........ 166
TABLE 40- ROUND TWO COLLATED AND ANALYSED RESULTS ........................................................ 168
TABLE 41- PROCESS RATED LESS THAN THREE FOR THE SECOND ROUND OF THE DELPHI STUDY .... 169
TABLE 42- COMPARISON OF SERVICE SUPPORT PROCESSES FROM R1 TO R2.................................. 170
TABLE 43-LIST OF FUNCTIONAL PROCESSES IN ORDER OF CRITICALITY FOR R2.............................. 171
TABLE 44- COLLATED AND ANALYSED DATA FROM THE THIRD ROUND OF THE DELPHI STUDY ...... 173
TABLE 45- LIST OF MOST CRITICAL PROCESSES IN R3 BY MEAN RATING ......................................... 174
TABLE 46- TIMINGS FOR THE CASE STUDIES.................................................................................... 186
TABLE 47-MEETING DATES AND TIMES FOR CASE STUDY 1............................................................. 199
TABLE 48-RELATIONSHIP DRAWING TABLE ................................................................................... 212
TABLE 49- FIGURES FROM VALUATION OF IMPACT, DEPENDENCY AND PROBABILITY OF FAILURE .. 219
TABLE 50- LIST OF RESULTS FOR ASSESSMENT OF CRITICALITY ...................................................... 225
TABLE 51- LIST OF SERVICE PROVIDED WITHIN THE 'REMOTE SERVICES' PRODUCT ........................ 242
TABLE 52- RELATIONSHIP DRAWING TABLE FOR REALTECH ....................................................... 248
TABLE 53- LIST OF MEETING AND TIMES FOR CITEC CASE STUDY ................................................... 274
TABLE 54- LIST OF THE 42 PROCESS USED BY CITEC ....................................................................... 282
TABLE 55- ISSUES IN IMPLEMENTATION OF A BSC.......................................................................... 342
TABLE 56- OPTIONS FOR TREATING CONFIDENTIAL INFORMATION ................................................. 349


List of Abbreviations
Application Hosting Centre- (AHC)
Application Service Provider- (ASP)
Australian Dollar- (AUD)
Australian Postgraduate Award (Industry)- (APAI)
Australian Research Council- (ARC)
Business Process Re-engineering (BPR)
Balanced Scorecard- (BSC)
Chief Information Officer- (CIO)
Computer Sciences Corporation- (CSC)
Corporate Services Agency- (CSA)
[part of the Queensland State Government]
Deloitte Touche Tohmatsu- (DTT)
Department of Primary Industries- (DPI)
[part of the Queensland State Government]
enhanced Telecom Operations Map- (eTOM)
Enterprise Resource Planning System- (ERP)
Enterprise Systems- (ES)
Failure Mode Effects Analysis- (FMEA)
IBM Global Services Australia- (IBM GSA)
International Criticality Safety Benchmark Evaluation Project- (ICSBEP)
Information Systems- (IS)
Information Technology- (IT)
Information Technology Infrastructure Library- (ITIL)
Natural Resources & Mines- (NR&M)
[part of the Queensland State Government]
Process Engineering- (PE)
Queensland University of Technology- (QUT)


Authorship

The work contained in this thesis has not been previously submitted for a degree or diploma at any other higher education institution. To the best of my knowledge and belief, the thesis contains no material previously published or written by another person except where due reference is made.

_________________
Signature of Author

October 2003
Date


Acknowledgements
This Master’s thesis would not have been possible without the significant support of quite a number of people and organisations. Financial support was given by the Australian Research Council in the form of an ARC Industry grant. The grant was applied for with the support of REALTECH AG, which also provided funds, time and personnel support. Special thanks go to Wayne Baker, General Manager, who inherited the research project the day he joined REALTECH Australasia, and to Helena Mendes, Remote Services Manager, whose enthusiasm and time contributed greatly. I hope that the outcomes of the research project provide tangible advantages to the REALTECH Group.

Sincere thanks must also go to the companies who were willing participants in the research project. These participants supplied their time and knowledge; without their expertise and participation the research project would not have been possible.

CSC - Mark Harris and Nigel Hillier
Citec - Peter Marshall, Terry Collins, Phil Murray, Greg Prostamo and Mary-Lou Dutton
Mincom - Mark McCafferty and Adrian Hale
EDS Solutions Consulting - Leon Bray
Deloitte Touche Tohmatsu - Andrew Ogbourne and Alan Scott
IBM GSA - Chris Wilson
Qld Gov. CSA - Craig Vayo and Philip Hood
Pauls (Parmalat) - Tim Noonan
Hitachi Data Systems - Tim Munn

The very considerable efforts of my principal supervisor, Associate Professor Glenn Stewart, and associate supervisor, Associate Professor Michael Rosemann, require sincere thanks and acknowledgment. They took what I like to believe was a non-directive role in the project, providing advice and considerable experience more as coaches than as supervisors, for which I am very grateful. Additionally, I would like to thank Chris Taylor for collaborating with me, thus reducing the effort needed to find and persuade participants. Karen Stark, Roy Chan, Helmut Klaus, Isley Davies and our new group members all require thanks for accepting the noise that was generated on a daily basis in the name of research and for their practical support.

Last but not least I need to thank my family: Dimitti, for taking on the role of major ‘breadwinner’, accepting the lack of support in many areas and still remaining a positive and encouraging influence; Robbie and Thom, for their patience while not understanding what my research was really about; and Will, for arriving just before the finish.

To all these people, and the many more I have not mentioned (I apologise for not naming you), thank you all the same. Without this considerable support the project would not have been realised, let alone provided what I believe is a highly useful outcome. Thank you.


1 Introduction

1.1 The Present Information Systems Environment

The Information Systems (IS) environment of 2002 has changed significantly from that of the millennium bug era. An important focus in the millennium bug era (1997-2000) was to ensure that the organisation had an information system which was year 2000 compliant. Many large organisations took the opportunity to replace their information systems with systems that were far more advanced than their predecessors. These replacement systems of choice were enterprise systems from major vendors such as JD Edwards, Peoplesoft, SAP, Baan, Technology 1 and Navision.

Enterprise Resource Planning (ERP) systems (the most costly of these IS investments) are promoted as offering ‘best practice’ processes that may differ substantively from those existing within an organisation. This proliferation of ERP systems often fails to provide benefits (Sedera, Rosemann et al. 2001a). Low utilisation of the software and a poor understanding of the processes (Becker, Rosemann et al. 2000) have increased the need for a solution that reduces the hit-and-miss approach to process improvement and management seen in many companies (O'Neill and Sohal 1997; Bender 2000).

In response, there has been further research into the area of benefits
realisation (Rosemann and Wiese 1999; Scott and Vessey 2000; Sedera,
Rosemann et al. 2000; Lin and Pervan 2001; Timbrell, Andrews et al. 2001).
Broadly speaking, benefits realisation covers all activities which might
improve the information system lifecycle. This includes both continual
improvement of processes and the more radical business process re-
engineering (BPR) (Davenport 1993; Hammer and Champy 1993).


As the number of process engineering (PE) projects has increased, organisations have found that such projects are not easily conducted (Bender 2000; Lin and Pervan 2001). Melnyk (2000) states that process failure is caused by a focus on cost reduction, projects taking too long and a lack of clear objectives. Research shows that up to 70% of business process re-engineering projects have failed to provide the benefits initially sought (Grant 2002). Bender, Cedeno, Cirone, Klaus, Leahey and Menyhert (2000) cite a failure to achieve business objectives and the neglect of end-users as contributors to the high failure rate of process engineering projects (Bender 2000; Lin and Pervan 2001).

Investment in this type of information technology has increased to become the largest single element of capital expenditure (Thorp 1998). The high cost of these systems means that both advocates and executives are now being asked hard questions concerning the promised improvements to the organisation (Thorp 1998). Mounting pressure from the ‘market’ for companies to produce continued growth has led to a search for greater alignment of all activities, coupled with rapid change in the way companies conduct their business. John Thorp (1998) describes an environment in which managers are struggling to demonstrate the connection between these IT costs and expected business benefits (Thorp 1998).

This research project was prompted by the increasing demand to achieve business benefits through effective service delivery and process improvement, coupled with the increase in the utilisation of ASPs for large enterprise systems.

This research project seeks to improve the way organisations approach process improvement. The following section describes the motivation for the research project.


1.2 Motivation for Research

Given the high failure rate of process improvement projects, there is an urgent need for improvements to the way organisations approach and conduct process improvement (Talwar 1994; Moreton 1997; O'Neill and Sohal 1997; Bender 2000; Melnyk 2000; Grant 2002). The growth of IS outsourcing through ASP and AHC channels has been in excess of 10% per year over the last five years (Australian Bureau of Statistics 2000; Benson 2002; IBIS World 2002). Outsourcing companies seek cost-effective ways of improving their process improvement efforts. These two factors (high process improvement failure rates and outsourcing industry growth) prompted the creation of this research project, which is funded by the Australian Research Council (ARC) in conjunction with an industry partner (REALTECH) as part of an ARC SPIRT project.

One approach to process improvement promoted by enterprise systems suppliers is the use of reference models. A reference model is a representation of ‘best’ or ‘common’ practice for a process. Chan and Rosemann (2001) state that reference models do not show whether there is an alignment between a process and organisational business objectives (Chan and Rosemann 2001). Additionally, Chan and Rosemann (2001) report that:

"while reference models supplied by Enterprise Systems providers are beneficial for the configuration and implementation needs, they do not focus on enterprise-individual aspects, business objectives" (Chan and Rosemann 2001).

This lack of ‘focus on enterprise-individual aspects’ and ‘business objectives’ indicates that more work needs to be done on how organisations can use reference models and still achieve their unique or individual needs, and it forms one driver for the research project.

Chan’s (2001) research also shows that while reference models are beneficial for configuration and implementation of enterprise systems, organisations still require a method by which they can improve processes to support business objectives. This suggests that process alignment is a problem area for many organisations.

The following analogy (which we have composed) explains the concept of process alignment:

Consider each of a sporting team’s players as a process: the backs defend against fast breakaways and attack out wide, while the forwards concentrate on the rucks, mauls and heavier work. For the team to be successful it needs a strategy for how it will win the game: who will do which tasks and what the results of each task should be. Alignment occurs when the strategy is effectively performed and points are scored. Greater alignment might lead to many more points being scored. Non-alignment occurs when individual players implement different strategies for scoring which do not support the team strategy, thus leading to points scored against them.

From this analogy we define a process as aligned when its output meets one or more planned organisational objectives.
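To make this definition concrete, the sketch below expresses it as a simple data structure. It is illustrative only: the process and objective names are hypothetical, and the dictionary representation is just one possible way of recording the links (the linking technique actually used in this research is the BSC mapping described in later chapters).

```python
# Hypothetical mapping of processes to the planned organisational objectives
# that their outputs support; an empty list marks a non-aligned process.
process_objectives = {
    "Service desk":        ["Improve customer satisfaction"],
    "Capacity planning":   ["Reduce cost of service delivery"],
    "Internal newsletter": [],
}

# A process is 'aligned' when its output meets at least one planned objective.
aligned = {p for p, objectives in process_objectives.items() if objectives}
unaligned = set(process_objectives) - aligned

print("Aligned processes:", sorted(aligned))
print("Processes not aligned to any objective:", sorted(unaligned))
```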

Given a view of the organisation as a collection of interdependent processes, and a limited budget for improvement projects, the question is: how does an organisation select the right processes to improve, and how does it ensure these improved processes are aligned with organisational goals? This latter question is the focal research question for the project.

To date, organisations use, and some universities promote the use of, Davenport’s (1993) and Hammer and Champy’s (1993) definitive works on process selection (Davenport 1993; Hammer and Champy 1993). Hammer and Champy (1993) promote a radical approach to process improvement, whereas Davenport (1993) supports an incremental approach. The radical approach ignores current practice and suggests that entirely new processes are developed. The incremental approach suggests that small improvements are implemented continually.


Hammer and Champy (1993) provide three criteria for selecting processes:
1. Dysfunction - Which processes are in the deepest trouble?
2. Importance - Which processes have the greatest impact on the company’s customer?
3. Feasibility - Which process is the most susceptible to successful redesign? (Hammer and Champy 1993)

Davenport (1993) suggests that process improvement teams should “identify the processes to be improved” (Davenport 1993). He also states that organisations should “focus on the most important processes or those that conflict most with the business vision,” and prioritise according to urgency (Davenport 1993).

These criteria leave the research team and the practitioner asking how to identify ‘important’ processes and how to link processes to ‘business vision’. Hammer and Champy’s (1993) criteria also leave questions unanswered for the practitioner: how do you know which are the ‘deepest troubled’ processes, and how do you make the link between a process and its ‘impact on the company’s customer’?

There are several levels of key issues for the researcher and practitioner. At the top level are the issues of process alignment and reference models. At the next level are the questions of which processes are important within the ASP industry and how to identify the link between process, process objective and organisational goals.

We have seen that businesses would benefit from an approach to process improvement which identifies the necessary link between processes, objectives and business vision. In addition, such an approach would enable identification of those processes which are most ‘important’ to the entity in achieving organisational goals. There is little in the literature that directs organisations towards this knowledge gap or region of complexity in process improvement. One means of relating process objectives to business strategy, goals and vision is the Balanced Scorecard (BSC). This relationship between processes and goals leads to the following research questions:

1.3 Research Question

1. Is a modified balanced scorecard approach able to identify those processes which are most ‘important’ in achieving the business objectives and vision?

2. If so, can this modified balanced scorecard methodology for identifying ‘important’ processes be implemented successfully within an entity?

These two questions lead to the research objectives described in the next section.

1.3.1 Research Objectives

The research objective for this project is to develop a methodology, using a BSC, that identifies ‘important’ processes in application hosting and application service provision. The method will provide a step by step guide for the identification of ‘important’ processes and the selection of which of these processes should then be improved 1. Additionally, the method will include the steps required to successfully implement the methodology in organisations. The research focus is those companies providing outsourced IT services via the AHC or ASP business model. It is intended that the step by step guide will be a practitioner’s method, providing instructive documents, tested practices, suitable software and user-friendly forms. The methodology will contain details on how to logically identify ‘importance’ whilst also linking this ‘importance’ with business objectives and vision. The criteria against which this methodology is being validated are the perceptions and responses of the participants and the perceptions of the research team.

1 In line with Hammer & Champy’s criterion of “feasibility - Which process is the most susceptible to successful redesign”: Hammer, M. and J. Champy (1993). Reengineering the Corporation: A Manifesto for Business Revolution. New York, Harper Business.

It is not intended that this research project verify the achievement of business benefits, document the changes to an organisation arising from its use of the targeting methodology, or determine the long-term benefits to an organisation using the targeting methodology. These questions might be answered in a longer and larger study.

The following section identifies the four major outcomes of the research
project and two further smaller outcomes.

1.3.2 Research Outcomes

The major outcomes of this research project are:


I. The development of a methodology to identify ‘critical’ processes and
to decide which of these processes to select for improvement based
on the BSC approach. This methodology is called ‘targeted business
process improvement’ (TBPI) or the targeting method.
II. A testing of the implementation approach used for the targeting
methodology within the application hosting service provision domain.
III. The refinement of the targeting method.
IV. The identification of generic critical processes within the AHSP
domain.
Further outcomes of this research are:
V. A generic definition for a critical process in an application hosting
service provision environment.

VI. A compilation and rating of critical processes taken from focus group
and Delphi study research.


1.4 Relation of the Research to Previous Work

Since Davenport (1993) and Hammer & Champy’s (1993) definitive work in
the area of process re-engineering and process innovation, little has been
published in the area of choosing processes for improvement (Davenport
1993; Hammer and Champy 1993). The area of criticality is dominated by the
work of those in the nuclear (Poullot, Doutriaux et al. 2001), pharmaceutical
(Seely, Hutchins et al. 1999) and automotive industries (Stamatis 1995;
Kinetic 1999). The automotive industry predominantly uses the Failure Mode
Effects Analysis tool (FMEA) and this tool has influenced the targeting
methodology (Stamatis 1995; Kinetic 1999).

The balanced scorecard is also an integral part of the targeting methodology (Kaplan and Norton 1992; Kaplan and Norton 1996; Kaplan and Norton 2001). Previous work on the balanced scorecard is extensive, though most of this is in the form of non-academic writings. The originators of the BSC have used their consultancy practice to test and validate the theories surrounding its use and benefits (Kaplan and Norton 1996a; Kaplan and Bower 2000). The use of the BSC for this particular function has not (to this research team’s knowledge) been discussed in the literature. John Thorp (1998) uses a method similar to the BSC as an aid to aligning IT projects with the goals and strategies of an organisation (Thorp 1998). It is not used to identify critical processes. There is also very little in the literature on specifically how to implement a BSC.

Thus, the key contributions of the research project are:


1. A step by step method of how to identify critical processes
2. An improvement to the way strategic plans are communicated within
organisations
3. An improved understanding of the meaning of a critical process in the
ASP industry
4. A definition of a critical process
5. A method by which to provide a ‘critical path’ for strategic plans


The next section of this chapter describes the format of the thesis.

1.5 Format of the Thesis

Thus far (in Chapter 1) this thesis has described the environment in which the research was initiated and cited some of the research works which indicate that the problem area is significant to both research and business. The chapter also outlines the research objectives and the innovative outcomes of this 18-month project, concluding with a section on the previous research related to this area. The rest of this thesis is organised as follows:

The purpose of chapter 2 is to describe the context within which the research
has been conducted. It starts with the broad Australian and international
environment and drills down to that of the industry partner REALTECH. The
chapter describes the overall research project conducted with sponsors
REALTECH, the Australian Research Council and the Queensland University
of Technology. The chapter concludes with a description of the research
participants in this project.

Chapter 3 is the literature review which cites the works which influenced and
directed the development of the targeting methodology including the
implementation process. It leads the reader from the domain of process
engineering and briefly discusses the problem area of process selection. This
section provides a list of criteria for identification and selection, which is the
basis of the targeting methodology. It then gives a review of the balanced
scorecard. The BSC review then discusses the success and failure of the
BSC and describes how the methodology should allow for these issues.
There are many standard and accepted tools used to calculate some criteria
(cost/benefit and risk analysis) and these are only briefly discussed. The
chapter concludes with a summary of these findings.


Chapter 4 describes in detail (1) the targeting methodology, which uses a rank ordering of factors we have named Impact, Probability of Failure, Dependency, Probability of Success and Cost/Benefit in order to identify which process to improve first; and (2) the steps used to identify and select critical processes for improvement, including the necessary implementation steps. The chapter concludes with a ‘model’ of what is identified as Version 1 of the targeting methodology. At the conclusion of each of the action learning cycles a revised version is provided.
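As a purely illustrative sketch of this rank ordering idea (the actual ratings, formulas and thresholds are those defined step by step in Chapter 4, and the grouping of the factors follows Figures 9-11), one simple way such scores could be combined and ordered is shown below. The multiplicative scoring and all names used here are assumptions made for this example only, not the thesis’s own calculations.

```python
from dataclasses import dataclass

@dataclass
class Process:
    name: str
    impact: int          # rating 1-5: impact of the process on organisational goals
    dependency: int      # rating 1-5: how strongly other processes depend on it
    prob_failure: int    # rating 1-5: probability of failure of the process
    prob_success: int    # rating 1-5: probability of successful improvement
    cost_benefit: float  # expected benefit relative to expected cost

    def criticality(self) -> int:
        # Illustrative only: combine the three 'criticality' factors.
        return self.impact * self.dependency * self.prob_failure

    def selection_score(self) -> float:
        # Illustrative only: weight criticality by the two 'selection' factors.
        return self.criticality() * self.prob_success * self.cost_benefit

processes = [
    Process("Incident management", impact=5, dependency=4, prob_failure=3,
            prob_success=4, cost_benefit=1.8),
    Process("Capacity planning", impact=3, dependency=2, prob_failure=4,
            prob_success=3, cost_benefit=1.1),
]

# Rank order: the process with the highest selection score is improved first.
for p in sorted(processes, key=Process.selection_score, reverse=True):
    print(f"{p.name}: criticality={p.criticality()}, "
          f"selection score={p.selection_score():.1f}")
```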

Chapter 5 describes the research methods used to achieve the research objectives. This involved a focus group session, a Delphi study and four cycles of action learning using case studies. The case studies were conducted in four different organisations.

Figure 1 shows the three main phases of the research approach. Phase 1 is
Model development using the literature review as the guide. The second
phase (Identify critical processes) is the development of the generic definition
of a critical process and identifying ‘critical processes’ using a focus group
and Delphi study. The use of a focus group and Delphi study to identify
critical processes and a single most critical functional area within ASP
provides an appropriate and necessary business focus for the final phase.
The final phase (3) called ‘test and revise’ uses action learning with a pilot
case study and three case studies. This third phase uses a cyclical approach
of act, observe, reflect and refine. The cycle occurs for each of the case
studies.

[Figure 1 shows the three phases: (1) Model Development - literature review; (2) Identify Critical Processes - focus group and Delphi study; (3) Test and Revise - act, observe, reflect and refine through action learning using a pilot study (PS) and case studies 1-3 (C1, C2, C3).]

Figure 1-Model of research approach

Chapter 6 describes the second portion of the research (identifying critical


processes) to develop a generic definition of a critical process and identify
(using a Delphi study) the perceived critical processes within the ASP
industry. The focus group and Delphi study were used to reduce the scope of
the case studies, discussed in chapter 7, and to focus on an area of each
organisation which was critical to that organisation. This chapter also
describes who was involved in the focus group session and the Delphi study.

The chapter takes the reader through a description of the activities from three
perspectives.
1) The academic story concerning the actual conduct of research (i.e. how
the focus group was conducted);
2) The data collection and analysis; and
3) The personal story of the researcher.

The conclusion to this chapter contains the results of the data collection and
explains how these results directed the focus of the following case studies.

Chapter 7 is similar to chapter 6 in its approach to describing the action


learning cycles using case studies. The purpose of this chapter is to test and
refine the targeting methodology. This phase of the research uses a modified
action learning cycle of implement, observe, reflect and revise through four
cycles (pilot study and case studies 1-3). The action learning cycles are
completed with a cross case analysis of the three case studies to bring out
any revision to the targeting methodology which can be taken from studying
the data from all three. This chapter then concludes with what will be called
Version 5 or the tested targeting methodology.

Chapter 8 is the concluding chapter, which describes the significance of the


project to the target groups;
1) The research community;
2) The wider business community; and
3) The specific industry of AHC and ASP.

This chapter will also describe the limitations of the project, by reflecting on
the types of organisations involved, their similarities and differences, the
focus of the study on the functional area of service support and that the study
is situated in the Australian context. Generalisability and validity will also be
discussed, with a final section concerning possible future research directions.

A comprehensive set of references and appendices is provided. Finally the


appendices contain the full set of documents used for implementing the
methodology and where necessary examples of ‘maps’ and collected data.
These appendices are:
Appendix 1- BSC Implementation Issues
Appendix 2- Benefits Documents
Appendix 3- Focus Group Information Pages
Appendix 4- Ethics documentation
Appendix 5- IDC Definitions

2 Context of the Problem

This chapter describes the context within which the research team have
sought to uncover new and practical knowledge of benefit to the industry
partner REALTECH and the wider Australian community. This chapter
describes the general environment of the ASP industry and the importance of
process improvement to the players. The chapter also describes the
environment from the industry partner’s view and then suggests the types of
knowledge that would be useful to their future activities. The chapter provides
a description of the combined research project and shows how this stream is
related to the combined project. Finally, the chapter provides a description of
the participants in this stream of the research.

2.1 The Environmental View

IDC (2002) state that in 2001, IT services in Australia was a 10 billion dollar
industry, with growth of approximately 10.5% over the previous year and
projected growth of 10.4% per year until 2006 (Benson 2002). IT services
within Australia includes:

This table is not available online. Please consult the hardcopy thesis available at the QUT Library.

Table 1- IT services as defined by IDC (2002)

IT services are those services in Table 1 which an outside entity might offer
to another organisation. These are not the IT services provided in-house for a
company. Many companies today outsource some part of their IT needs.
Some have a total outsourcing arrangement and others are using a mixture of
outsourced and in-house services. In-house services are those provided by
the company for itself.

Within IT services in Australia, Benson (author of the IDC ‘IS Outsourcing’


report), states that the IS outsourcing (part of IT services) “sector continues
to make the largest contribution to spending within the overall Australian IT
services market”, with a total of 38% of IT services spending in 2001 (Benson
2002). IS outsourcing has an annual revenue in the Australian market of
approximately AUD 3.8 billion (Benson 2002).
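
As a rough arithmetic check on these cited figures (using approximate values
of AUD 10 billion for IT services in 2001, 10.4% projected annual growth and a
38% outsourcing share, all taken from the IDC estimates above), the
relationships can be reproduced as follows:

# Rough check on the cited IDC figures; all values are approximations.
it_services_2001 = 10.0     # AUD billion, 2001
annual_growth = 0.104       # projected growth per year to 2006
outsourcing_share = 0.38    # IS outsourcing share of IT services spending, 2001

# 38% of roughly AUD 10 billion is roughly AUD 3.8 billion of IS outsourcing revenue.
print("IS outsourcing 2001: ~AUD", round(it_services_2001 * outsourcing_share, 1), "billion")

# Compounding 10.4% growth from 2001 to 2006 projects the IT services market size.
print("IT services 2006 (projected): ~AUD",
      round(it_services_2001 * (1 + annual_growth) ** 5, 1), "billion")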

IDC defines outsourcing as:


“ the contracting by an organisation with a third party for the
management and enhancement of ongoing operations for all or part
of its IT infrastructure, IT functions, business processes, or
business solutions. Outsourcing involves a fixed-term, contractual
arrangement that may involve the transfer of assets or people.”
(Benson 2002)

The definitions of ASP (application service provision) according to Dunn


(2001) "vary widely, but in their most basic forms, ASPs are third-party
technology service providers that deploy and remotely host a software
application" (Dunn 2001). This basic definition agrees with the Gartner
Dataquest (2002) definition which places the ASP and AHC service into the
same definition (Biscotti and Fulton 2002).

That is, “ASP and Application Hosting is a service addressing the


life cycle needs of the application from the initial IT infrastructure
development to maintenance of a complete set of IT business
applications. The provider offers software maintenance, conversion,
enhancement and support in a hosted environment” (Biscotti and
Fulton 2002).

The Gartner Dataquest definition suggests that the business models of the
ASP and the AHC are very similar. That is, ASPs and AHCs provide the same
type of service and may differentiate their service only by its focus. An ASP's
business focus is on applications and the services to support those
applications, while an AHC's focus is on the hosting of the application, not
including any enhancement to the application. Stated this way, the definition
is in agreement with the IDC definition of application outsourcing if we
presume that application outsourcing and ASP are the same.

IDC define application outsourcing as “a service wherein responsibility for the


deployment, management and enhancement of a packaged or customised
software application only is transferred contractually to an external service
provider” (Benson 2002). The important point within this IDC definition is that
the application outsourcer is responsible for enhancements to the application.
These may be the updates provided by the software supplier but can also be
the provision of reports for management, increased functionality of the
application and integration of the application with legacy systems.

The IDC definition suggests a difference between ASP and AHC in


contradiction to that used by Gartner Dataquest. This difference is that an
AHC does not provide enhancements to the application as part of their
service.

The reality is that outsourcing companies offer services tailored to the needs
of each customer. This service provision might be on a one to many or one to
one basis; that is, the application being provided might be configured the
same for all customers (one to many) or configured uniquely for each
customer (one to one).
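
A minimal sketch (purely illustrative; the customer names and configuration
settings below are hypothetical and not drawn from any participant) of the
difference between the two arrangements:

# Illustrative sketch of one-to-many versus one-to-one service provision.
shared_configuration = {"backup_window": "02:00-04:00", "patch_level": "standard"}

def one_to_many(customers):
    # Every customer receives the same shared application configuration.
    return {customer: dict(shared_configuration) for customer in customers}

def one_to_one(customer_settings):
    # Each customer receives a configuration tailored to its own requirements.
    return dict(customer_settings)

print(one_to_many(["customer_a", "customer_b"]))
print(one_to_one({"customer_a": {"backup_window": "23:00-01:00", "patch_level": "custom"},
                  "customer_b": {"backup_window": "03:00-05:00", "patch_level": "standard"}}))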

A further variation is found in the amount of outsourcing being sought by a
customer. Some companies will outsource to an ASP or AHC all their
application and hardware needs (total external sourcing) and some will only
outsource part of their IT needs. Other companies use what they term
in-sourcing, in which the in-house IT department acts as an outsourcing
organisation and tenders for the contract to supply those services.

The operations of the ASP are really a subset of the IS outsourcing business
model and the AHC can be considered a subset of the ASP business model.

Figure 2 shows a simplistic view of the basic relationship between the


different business models within the IT sector.

[Figure 2 shows the business models as nested subsets: AHC (no application enhancement services) within ASP, within IS Outsourcing, within IT Services (consulting, systems integration, hardware support, network management, processing services, education).]

Figure 2- Model showing relationships between different business models

The model is not truly representative as business models tend to vary with
the organisation and change with the contract. Mincom Pty Ltd, for example,
is a software supplier but also provides an IS outsourcing service for some
clients. Some IS outsourcing companies provide ASP services for one client
but not for another. Extending this to AHC services, some outsourcing
companies may only provide application hosting, leaving the enhancement of
the software to someone else. An example of this is Citec who provides an
application hosting service for some customers with the application managed
by a separate organisation. The separate organisation enhances and
provides a help desk facility for the application for the users of the product.

The next section of this chapter describes the environment for the industry
partner REALTECH, who are the industry sponsors of the research project.

2.2 REALTECH Australia

REALTECH Australia is a subsidiary of REALTECH AG which has its


corporate headquarters in Walldorf, Germany. This is the same city in which
the world leader in enterprise systems solutions (SAP) is based. REALTECH
AG, the parent company, was founded in 1994 as a niche player in providing
an application which automated many of the SAP software maintenance
tasks (REALTECH 2002). This product is today called The Guard
(REALTECH 2002). The company is today truly a global entity with offices
and subsidiaries in 11 countries. The Australian subsidiary was established in
1997 and its headquarters are in North Sydney, NSW, a location shared by
many big names in the IT world (SAP, Microsoft, Navision, Oracle,
PeopleSoft, EDS and Sun Microsystems).

REALTECH specialises in technology consulting and development of


application and system management software. At present, REALTECH is a
SAP focussed company with their services and products applied to
implementing, maintaining and developing software exclusively for the SAP
suite of products.

REALTECH also has a ‘Remote Services’ product (to clients using SAP
products) which provides many of the vital system administration services for
a customer’s SAP system (REALTECH 2002; REALTECH 2002a). As such,
they can be seen to be an application service provider. In the Asia-Pacific,
REALTECH has 35 full time employees out of a total 640 for the group.
These are supplemented by contract employees. REALTECH AG’s 2001/02
revenue exceeded €56.5 million; of this, 5% was from the Asia-Pacific
in what is considered a very tight market (REALTECH 2002a). It is this tight
market which prompted REALTECH to find improvements in the way they
operated. The tight market was causing all competitors to compete for a
declining IT budget.

Many client companies had reduced their IT expenditure to levels which were
causing them serious problems with data recovery, hardware and software
maintenance (Baker 2001). This cost cutting has affected the REALTECH
‘Remote Services’ product as evidenced by either a reduction in services
required or withdrawal of service altogether and pressure to reduce their
pricing (Baker 2001).

REALTECH were looking for tools and improved processes to help them
improve their competitive advantage and encourage potential clients to
increase budgets. There are usually only three ways to increase profits for a
company: increase prices, reduce costs, or sell more services or products.
REALTECH anticipated that the outcomes of this project would impact on all
three areas. By improving processes more effectively and efficiently they
could cut their own costs and those of their clients. This would be value
adding for their clients and a price increasing strategy for REALTECH. Being
able to provide a unique product to clients would also attract new clients,
hence selling more services and products.

To find this advantage in the market place, REALTECH needed more than it
could gain by reading publications from the research, which would provide no
more advantage than the many other companies who keep themselves
updated in this way. The real advantage of participation in this research for
REALTECH lay in being closely associated with the research activities and
participating in the regular meetings and updates of the research team.
Companies who are closely associated with a research project have a two
year advantage on companies whose contact with the research is through its
publications alone (Sauer 2001).

REALTECH were consequently interested in developing a partnership with


the research team. The result of this interest was a successful bid for what is
called an ARC (Australian Research Council) Industry SPIRT grant. The ARC
Industry SPIRT research projects are part of the Australian Federal
Government National Competitive Grants Programme (NCGP). One of the
aims of these projects is to support and develop long-term strategic research
alliances between industry and Universities. This is to encourage the
application of advanced knowledge to problems, or to provide opportunities
for Australians to obtain economic or social benefits (Australian Research
Council 2003).

The final signing of contractual documents occurred in December 2001,


though the project was supposed to commence in July 2001 with the
awarding of the ARC grant. The ARC and REALTECH provided funds for two
Australian Postgraduate Awards (Industry) (APAI), and REALTECH provided
the funds for research-related travel and equipment for a maximum of two
years. This time frame is in line with that prescribed in the QUT rules for a Masters of IT,
Research. The research team consists of two chief investigators and two
masters by research investigators.

The overall project was called the “Process-oriented Administration of


Enterprise Systems”. The next section of this chapter describes the overall
project and how the two masters by research projects, which fit within it, are
related.

2.3 Process-oriented Administration of Enterprise Systems

The purpose of the ARC SPIRT project was to increase the effectiveness and
efficiency of administering enterprise systems through a process oriented
approach. “The expected outcomes of this project were to be:
1. Reference solutions for a Balanced Scorecard; and
2. Development of selected reference process models for enterprise system
(ES) Service Delivery” (Rosemann 2002).

Rosemann (2002a) states that reference models are “characterised as


reusable semi-formal descriptions of specific domains” (Rosemann 2002a).
Kaplic and Bernus (2001) define a reference model saying that in most
organisations there exist common business processes, which are similar to
or the same no matter what the objectives of the enterprise. Therefore, the
adoption of such common reusable processes is “a significant improvement
in the efficiency and quality of the planning of new or redesigning of existing
processes” (Kaplic and Bernus 2001). These common reusable processes
are termed reference models.

The two outcomes (reference solutions for a BSC and reference process
models for ES service delivery) were to be the objectives of the two masters
research projects.

Figure 3 is a model of the overall project and shows the two masters projects
of which this thesis was originally tasked with ‘reference solutions for a BSC’.

[Figure 3 shows the two interlinked projects: 'Reference solutions for a BSC' (this thesis, later renamed the 'Targeting Methodology', started September 2001) and 'Reference models for ES Service Delivery' (started December 2001), with an intersection where resources and participants were shared.]

Figure 3- Model of the overall project and the two interlinked projects.

The model (Figure 3) shows that the two projects are interlinked. The first
project (reference solutions for a BSC) provides input, (shown by arrows) to
the second project (reference models for ES service delivery), which, in turn,
provides input back into the first. In particular, the reference models for ES
Service Delivery project provided this research project with a reference
model of ASP service delivery.

The intersection between the two projects was:


1. The use of the same participants for some parts of the data collection
2. Combining the projects for the initial approach to these companies
3. Using one set of documentation for information about benefits and
confidentiality
4. Utilising parts of each project's output as input for the other project
This last intersection will be fully discussed in section 2.3.1 which discusses
the second project ‘reference models for ES service delivery’.

The ‘reference solutions for a BSC’ project (this thesis) re-assessed the
objective (develop a reference solution for a BSC) in the light of a review of
the literature on the BSC. The literature review reveals that seeking to
develop reference solutions for a BSC is difficult. Every company has a
unique combination of strategies, knowledge, skills, view of the environment
and position within that environment. Even with the same strategies or goals,
each company will develop unique objectives in their effort to achieve their
strategies and goals. This uniqueness leads to a lack of useful transferable
solutions for a BSC. Consequently, this thesis developed a new objective
which was to identify those processes which should be improved first with the
assistance of a BSC.

Although the outcome of this thesis is now considerably changed, it still


meets with the original aims for significance and innovation. This was stated
as increasing “the efficiency of administering Enterprise Systems through a
process-oriented approach” (Rosemann 2002). The original project
documents claimed that, to date the concepts used for the management of
business processes had not been adapted for enterprise system service
delivery processes and that they were still focussed on the technical issues
(Rosemann 2002).

The research project ‘reference process modelling for ES service delivery’


started some 4 months later than this thesis and has provided input into this
thesis with a reference model of ASP service delivery. The next section of
this chapter explains briefly how the two research projects interacted and
describes how the reference model of ASP service delivery was developed.

2.3.1 Reference Process Models for ES Service Delivery

One aim of the ‘reference models for ES service delivery’ research project
was to develop a multi-level reference model of ASP service delivery. There
are many activities which occur before an ASP reaches their ‘service
delivery’ activities. These are activities such as marketing, client acquisition,
product development and human resources. These activities were placed out
of scope for the project. The ‘reference models for ES service delivery‘
project focussed on the core activities of ASP service delivery and left the
support and strategic activities at the high level (Taylor 2003).

Most of the constructs and content for the first version of the ASP service
delivery reference model was derived from two existing industry reference
models (Bartett, Hinley et al. 2001; Berkhout, Harrow et al. 2001)
(STRATECAST PARTNERS 2001). The Information Technology Infrastructure
Library (ITIL) is a set of reference models developed for the IT industry in the
UK (Bartett, Hinley et al. 2001; Berkhout, Harrow et al. 2001). ITIL is
focussed on the operational view of IT services (Bartett, Hinley et al. 2001;
Berkhout, Harrow et al. 2001). The e-business Telecom Operations Map
(eTOM) is another widely accepted set of reference models developed
originally for the telecommunications industries (STRATECAST PARTNERS
2001). eTOM differs from ITIL in that eTOM started with a focus on the
strategic view of business (STRATECAST PARTNERS 2001). These large
and complex reference models were used because of their large contributor
base and industry acceptance (Taylor 2003).

The ‘reference models for ES service delivery’ research project initially


attempted to combine the two models (eTOM & ITIL) in order to identify any
gaps in the models. Input from eTOM and ITIL, combined with information
about ASP service delivery provided the basis of the first ASP service
delivery reference model. This model was a high level model of value chain
appearance (see Figure 4). A value chain is a description of those ‘Primary’
activities which are needed to accomplish a task (Porter 1980).

[Figure 4 shows the value chain: Hardware Management, Software Management, Application Management, Security, Service Support.]

Figure 4- Representation of a Value Chain of ASP service delivery

The model (Figure 5) was developed to what is called version 1, (the high
level reference model) with input from the research participants used in both
research projects.

This figure is not available online. Please consult the hardcopy thesis available at the QUT Library.

Figure 5-Version 1 Reference Model of ASP service delivery (Taylor 2002)

Figure 5 shows version 1 of the reference model for application service


provision service delivery. This model shows the strategic service planning
processes at the top of the model. These are broken into two main areas of
business planning and product life cycle management. Below the strategic
service planning processes are those processes which deal with the service
definition. They are defining service scope and defining service levels. At the
bottom of the model are the processes which support or enable the delivery
of the core processes found in the centre. These core processes are service
infrastructure (hosting), service delivery, service support and customer
relationship management.

The model differentiates the client into customer and user (right of model).
Processes such as customer relationship management and service definition
are normally focussed on the customer (management), with service support
processes focussed on the users of the application. The model (version 1) was used in
this thesis (‘targeting methodology’) to act as a starting point when defining
the high level processes of the participants.

The interaction with the reference process modelling for ES service delivery
research project occurred when data collected from this project (targeting
methodology) was used to identify those processes perceived as most
important for the ASP business participants.

This thesis uses a Delphi study in which the research participants rate a
number of perceived critical processes in order of criticality from within the
service delivery area of the model. Subsequently the second research project
(‘reference process modelling for ES service delivery’) focussed on those
processes within service support for the development of new reference
models at lower levels within the model of ASP service delivery.

The next section of this chapter describes the participants who were involved
in the data collection of this thesis in order to validate the findings, which are
grounded in the knowledge of significant and experienced ASP
organisations. We will also describe the development of the ASP service
delivery model version 1.

2.3.2 The Participants

Of the 25 companies contacted as possible participants, ten companies


agreed to participate in at least one of the data collection exercises.

Selection for the focus group activities was originally based on the following
criteria:
1. Head office in Brisbane or knowledgeable staff based in Brisbane
2. A provider of ASP or AHC services or in-house providers of enterprise
systems or similar complex information systems
The reasoning behind the first selection criterion was to improve access to,
and participation by, the participants by having them based in Brisbane.
Travel time and travel costs were also reduced.

The second factor, the choice of the ES service provision domain, was
already in place from the original documentation concerning the overall
project (“process-oriented administration of enterprise systems”). Finally, the
type of industry which used enterprise systems was chosen by the industry
partner REALTECH. By focusing on the ASP and AHC industry the project
would have greater relevance to the industry partner.

The original aim was to have a mix of government and public organisations
and outsourcing and in-house organisations. This approach would have
allowed for a comparison of each area. Figure 6 shows (with arrows) some of
the possible comparisons between an in-house provider and outsourcing
provider of ES and government and public company.

[Figure 6 shows four entity types with arrows indicating possible comparisons: in-house ASP service delivery by a public company, outsourced ASP service delivery by a public company, outsourced ASP service delivery by a government entity and in-house ASP service delivery by a government entity.]

Figure 6- Possible Comparisons of different types of entity

Twenty five companies were approached to participate in the overall


research project. Table 2 shows the break-up of organisations that were
contacted (left) and compares this to the break-up of organisations who
participated (right).

Contacted      Government   Public Co.          Participated   Government   Public Co.
In-house       4            8                   In-house       0            1
Outsourcer     2            11                  Outsourcer     2            7

Table 2- Contacted companies and participants by number of type

Of those contacted, there were twelve organisations providing in-house


services of which four were government and eight were public organisations.
Thirteen outsourcers were contacted of which two were government owned
and 11 were public outsourcers. Ten organisations agreed to participate;
they consisted of nine outsourcers (7 public companies and 2 government
owned organisations) and 1 in-house public company. The forty percent
participation rate was considered to be representative of the industry, given
the stature and experience of the participants.

As can be seen in Table 2, contact was made with a balance of in-house
service providers and outsourcers, though biased towards public companies.
Table 3 shows the break-down of participants, using a similar format as in
Table 2.

Participated                Government Organisation     Public Company
In-house ASP provider       -                           Parmalat (Pauls)
Outsourcing ASP provider    CSA Qld Government          IBM Global Services Aust.
                            Citec                       EDS Consulting
                                                        CSC
                                                        REALTECH
                                                        Deloitte Touche Tohmatsu
                                                        Hitachi Data Systems
                                                        Mincom

Table 3- Participants by name and type

A detailed study of each participant organisation was undertaken. Information


on each company was obtained from various sources but most notably from
an IDC report on the Information Systems (IS) Outsourcing industry (Benson
2002). Another source of information was the company web sites and annual
reports (where available). These are listed in the bibliography.

Annual reports were sourced from REALTECH and Mincom (Mincom 2002;
REALTECH 2002a). Data from the Australian Bureau of Statistics and Paul
Budde Communications were used for corroboration of the stated claims of
market size and market share (Australian Bureau of Statistics 2000; Paul
Budde Communication 2002). The participants were all asked by email to
verify the final descriptions and this was achieved.

The information provided in this section is constrained by a number of issues:


1. The ethics of identifying individuals from companies
2. The ability of the research team to gather the required data from
individuals who, due to their high level of responsibilities, were not able to
devote more time to providing this data
3. The cost restraint in purchasing data on the industry from consulting
companies such as Gartner Dataquest and IBIS World.

Of those companies which agreed to participate in one or more phases of the
research project, three (IBM GSA, EDS Consulting, CSC) make up a
combined 76% market share of the IS outsourcing market with an annual
revenue of nearly AUD 3 billion (Benson 2002). Another participant is Citec,
which is the ninth largest participant in the IS outsourcing market within
Australia.

The top ten outsourcing companies within Australia are shown in Table 4
from IDC data in 2001.

This table is not available online. Please consult the hardcopy thesis available at the QUT Library. (Columns: Outsourcer; Estimated Outsourcing Revenue ($M).)

Table 4- Top 10 Organisations in Outsourcing Market: 2001 Source: IDC (Benson 2002)

In order to provide an initial description of the participants’ activities, the


participants were asked to indicate which high level processes within the
ASP service sector they believed they operated in. This is shown in Table 5
and covers the processes which were used in the ASP service delivery
reference model before version 1. The table includes product development,
IT strategy, business process engineering and business process outsourcing,
which were removed from the ASP service delivery reference model as they
did not reflect the view being sought.

Company | Product Development | H/W Mgmt | S/W Mgmt | Application Mgmt | Security | Service Support | IT Strategy | Bus. Process Eng. | BP Outsourcing
Citec       ✓ ✓ ✓ ✓ ✓ ✓ ✓
CSC         ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓
DTT         ✓ ✓ ✓
EDS         ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓
REALTECH    ✓ ✓ ✓ ✓ ✓ ✓
IBM GSA     ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓
HDS         ✓ ✓ ✓ ✓ ✓
Parmalat    ✓ ✓ ✓ ✓ ✓ ✓
Mincom      ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓
CSA         ✓ ✓ ✓ ✓ ✓ ✓

Table 5- High level process in which Participants operate in the ASP Market

Participants held positions described as Managing Consultant, Account


Executive, General Manager, Senior Manager, Manager, Professional
Services Director, Contract Manager, Manager Service Strategies, Remote
Services Manager, Associate Principal, SAP Business Analyst and
Applications Delivery Manager. A description of duties associated with each
position name is not provided here since we believe that in some cases this
would enable individuals to be identified, in contravention of the project's
ethics submission.

We believe that the calibre of these participants increases the credibility of


the output of the research project. IBM Global Services Australia, EDS, CSC
and Citec, on their own, are considered the clear dominant players within the
Australian market, claiming nearly 80% of the annual IS Outsourcing
revenues. This next section provides information about each of the
companies and the participants.

IBM Global Services Australia


The participant from IBM Global Services Australia (GSA) is a member of the
Global and Asia Pacific focus groups, able to provide a global perspective as
well as understanding the uniqueness of the Asia Pacific and Australian
markets. IBM GSA has 10,500 employees in Australia with revenue of AUD
3.3 billion in 2001 of which AUD 1.725 billion was for Services. The company
operates mostly in the IS outsourcing and network infrastructure
management service fields, competing in the Banking and Finance Services,
Communications, Government, Manufacturing and Transportation
sectors. IBM Global Services support users in Australia, Singapore,
Malaysia, Thailand, Philippines, Indonesia, Hong Kong, Taiwan and Korea
from data centres in Sydney, Melbourne and Ballarat (Benson 2002; IBM
Global Services Australia 2002).

Computer Science Corporation


The participant's organisation, CSC, manage the IS needs for the world's largest mining
company and have access to the collective knowledge of a multinational
organisation which has operated since 1959 in the United States and since
1970 in Australia. Knowledge management is a formalised process within
CSC and the participants were able to utilise this network of people and
databases in support of the research project. CSC has more than 4500
employees in Australia, almost all of whom work in the services market. IDC
report that CSC operates across the banking and financial services,
government and manufacturing industries (Benson 2002).

EDS Consulting
The participant from EDS operates at the operations end of the business,
managing the takeover and set-up of new clients. This participant brings a
wealth of experience and knowledge to the construction of the physical and
informational infrastructure of complex outsourcing contracts. EDS is the
second largest IS outsourcing company in Australia with 26% of the IS
outsourcing market in revenue terms. In Australia the 2001 revenue reached
AUD 1.45 billion. Operations in Australia were established in 1986 and it has
some 7000 employees with operations in Banking and Finance Services,
Communications, Government, the Manufacturing industry and
Transportation sectors (Benson 2002).

Mincom
Mincom was founded in 1979 and is still majority owned by current and
former employees of Mincom (75.80%). The participant from Mincom brings
a specialised knowledge and experience in managing an outsourcing
business with enterprise system products produced by Mincom itself. This
was a unique perspective in the research project, with no other participant
having such close and easy access to this type of knowledge. Mincom has
over 1100 employees in 18 offices and operates in 40 countries. Revenue for
the 2001/2002 financial year was AUD 207.8 million. The company develops
and sells enterprise system software and provides IS outsourcing services for
its range of enterprise systems. It has a reputation for producing the world’s
best enterprise asset management systems for asset intensive industries
(such as utilities, mining and transport) (Hoover's Online 2002; IBIS World
2002; Paul Budde Communication 2002; Mincom 2002).

Hitachi Data Systems


Hitachi Data Systems (HDS) is a private, wholly owned subsidiary of Hitachi
Ltd (Japan's largest electronics company) and was founded in 1989. The participant
from HDS comes from a true niche player within the IS outsourcing industry.
This company specialises in providing remote database services for all types
of applications (not just ES systems). This participant was able to provide
insights to the research gained from being a highly focussed specialist. As a
private company, HDS do not publish revenue or employee figures. All
information about the company is consolidated in the group reports of the
Hitachi Corporation, Japan (Connected Corporation 2001).

Deloitte Touche Tohmatsu


The two participants from Deloitte Touche Tohmatsu (DTT) were senior
managers within the Management Solutions group. They brought a wealth of
knowledge and experience to the research by providing insights gained from
their own experience but also that of the 259 Australian partners and 2554
professional staff. Like many of the large consulting firms DTT operates a
knowledge database on a global scale. The participants were thus able to
provide a global view of the industry. For the financial year ending June 2002,
DTT reported net revenue of AUD 573 million.

The Management Solutions group at Deloitte also provide consulting services,


specialising in Business and Technology Strategy, Collaborative Supply
Chain Management, Strategic Sourcing and Procurement, eMarketplaces,
Systems Implementation, Process Transformation, Finance Transformation,
Human Resources and Change Management, CRM, and Applications
Integration (Deloitte Touche Tohmatsu 2002).

Citec
The fourth of the top ten IS outsourcing participants is Citec (ninth largest)
who provided four participants at different times during the focus group
sessions. Citec is a government owned commercialised company established
in 1964 and commercialised in 1992. Citec participants were managers with
responsibilities which ranged across the strategic management areas. They
brought a unique and valuable knowledge and experience taken from a
customer base of 57% government and 43% public clients. Citec have offices
in 5 Australian states and a 2001 revenue of AUD120 million and total staff of
700 plus people. Citec do not offer full service outsourcing, and focus on
integrated infrastructure management, e-Business solutions and application
outsourcing (Benson 2002).

Corporate Services Agency


Corporate Services Agency (CSA) was the only other government participant
in the research project. CSA was able to add to the input of Citec from the
view of a pure ASP with no hosting of applications. CSA was established in
July 1996 as a shared service provider of finance, human resource and
administrative services. The organisation is jointly owned by the Department
of Natural Resources and Mines (NR&M) and the Department of Primary
Industries (DPI). The participants’ knowledge and experience stem from their
roles as managers of the SAP R/3 system delivered to NR&M and DPI along
with a number of legacy systems. CSA provides a full range of services to
support these systems and effectively operates as an Applications Service
Provider (ASP) to its clients. The agency’s revenue base is around $22M per
annum and there are approximately 260 full time equivalent staff (Corporate
Services Agency 2002).

Parmalat Pty Ltd


The ninth participant and the only in-house organisation was Parmalat, with
its head office in Milan, Italy. The participant from their Brisbane headquarters
was able to provide the in-house view of enterprise system service provision.
Parmalat operates one of the latest versions of SAP R/3 in Australia with the
full suite of e-Business modules. They have a progressive and innovative IT
culture operating within the dairy industry, which is a 24-hours-a-day,
7-days-a-week business. This drives the need for an IT service which is
critical to product management, and thus a participant with proven skills and
knowledge in developing effective and highly efficient processes within enterprise systems.
In 2001, Parmalat operated in 30 countries, with 146 plants employing
approximately 36,000 employees and consolidated (group wide) sales of
AUD 13.5 billion (Parmalat 2001).

REALTECH AG
The tenth company was the industry partner REALTECH and two members
of this company participated in the research. These participants were able to
provide valuable input from both the perspective of an outsourcing company
specialising in remote hardware and operating system maintenance and as a
service provider to outsourcing companies as well. This enables the
participants to have a wide experience with the processes of many Australian
and international companies. The company itself is described earlier in the
chapter.

Summary

In summary the participants involved in this research project were not only
from “highly regarded” companies, but were also able to provide a breadth of
knowledge and experience that would be enviable in any research project
within the IS outsourcing industry. This valuable and unique input has
contributed to the important outcomes and innovative methodology that this
thesis describes.

This chapter described the general environment within the information


systems ASP industry and the importance of process improvement to the
players. The participants are also described in the context of their individual
knowledge and experience. Their companies are briefly described as support
for their value to the research project. The chapter discusses the industry
partner’s view of the environment and then suggests the types of knowledge
that would be useful to their future activities. A description of the overall
research project (process-oriented administration of enterprise systems) is
provided and also how this thesis is related to the reference process
modelling for ES service delivery research project.

So far this thesis has outlined the motivation for the research project and the
context in which it was conceived. It has explained the need for further
research into the area of ‘important’ or critical processes in order to improve
or operationalise the work of Davenport (1993) and Hammer & Champy
(1993). The following chapter (the literature review) grounds the work in the
current research and provides the thesis with the basis for the first version of
the targeting methodology and the linkage between this methodology and the
need for an operational method of process selection.

3 Literature Review

This literature review examines those publications which influenced and


directed the development of the targeting methodology including the
implementation process. This chapter briefly describes the motivation for
developing the targeting methodology and then leads the reader from the
domain of process engineering to the problem area of process selection.

This first section deals with the criteria for identification of critical processes.
The next section provides the criteria for selection of the most critical process
to re-engineer. These two sections are the basis of the targeting
methodology. The emerging methodology reveals a need to couple
processes with objectives. This leads to an in-depth review of the balanced
scorecard (BSC) as one viable strategy. The review discusses the success
and failure of the BSC and describes how the BSC methodology should
provide support for these issues.

There are many standard and accepted tools used to calculate some criteria
for process selection such as cost/benefit and risk analysis, and these
standard approaches are only briefly introduced. The chapter concludes with
a summary of the findings and how they affect the targeting methodology.

3.1 Introduction

Enterprise system (ES) users have recently started to upgrade and


implement new versions of their enterprise system in the ‘vanilla’ form. This
‘vanilla’ form refers to the system set-up as being without any major
configuration of processes to suit the individual processes of the
organisation. The 'vanilla' form contains processes which are considered
'reference' models by the enterprise systems provider (Chan and Rosemann
2001).

The intent for the user of the ‘vanilla’ version is to reduce the implementation
time. The result, though, is that many organisations undergo a process
change during the implementation of the system to convert processes to the
vanilla version and then undertake process improvement to align the
processes of the organisation with the strategic needs of the organisation.
Chan and Rosemann (2001) state that reference models do not show if there
is an alignment between a process and organisational business objectives
(Chan and Rosemann 2001).

This trend has resulted in an increase in the number of process engineering


(PE) projects and organisations have found that process engineering projects
are not easily conducted (O'Neill and Sohal 1997; Bender 2000; Lin and
Pervan 2001).

A common approach to process improvement today is to use reference


process models as the improved process model (Becker, Rosemann et al.
2000). These reference models offering ‘best practice’ processes may differ
substantively from those existing within the organisation (Rosemann 2000).
This way of sharing ‘best practice’ is not new to business with many large
organisations ensuring that particular effective and efficient processes are
used by all departments and or subsidiaries within a corporation (Conn and
Yip 1997).

Up to 70% of business process re-engineering projects have failed to provide


the benefits initially sought (O'Neill and Sohal 1997; Grant 2002). Bender et
al (2000) cite a failure to achieve business objectives and the neglect of end
users as contributors to failure in many process engineering projects (Bender
2000). Given the high failure rate of process improvement projects, there is an
urgent need for improvements to the way organisations approach and
conduct process improvement (Talwar 1994; Moreton 1997; O'Neill and
Sohal 1997; Bender 2000; Melnyk 2000; Grant 2002).

Davenport (1993) suggests to process improvement teams that they “identify


the processes to be improved” (Davenport 1993). He also states that
organisations should “focus on the most important processes or those that
conflict most with the business vision,” and prioritise according to urgency
(Davenport 1993).

Hammer and Champy (1993) also provided three criteria for selecting
processes to improve; namely to focus on those processes which are most
1. Dysfunctional- Which processes are in the deepest trouble?
2. Important- Which processes have the greatest impact on the
company‘s customer?
3. Feasible- Which process is the most susceptible to successful
redesign? (Hammer and Champy 1993)

Hammer & Champy (1993) failed to prioritise or weight these factors and to
suggest just how to operationalise the assessment of the three factors
(dysfunction, importance and feasibility).

Hammer & Champy’s suggestions, like those of Davenport’s (1993), leave


the practitioner asking questions about how to identify ‘important’ processes
and how to link processes to ‘business vision’.
How do you know which are the ‘deepest troubled processes’ and how do
you make the link between process and ‘impact on the company’s
customer’?

The key issues arising from these literature sources appear to be:
1. What is the meaning or definition of important processes?
2. What are the factors which are needed to identify an ‘important’ process?
3. What are the factors which are needed to select an ‘important’ process for
improvement?
4. How do we make the link between processes and goals?

The following sections will describe from the literature the areas of important
processes, factors for identifying and selecting important processes for
improvement and how to link processes to organisational goals.

First we examine the notion of importance and critical processes. Then we


examine the literature on selecting candidate processes for process
improvement projects. Finally we provide some insight from the literature into
the possible tools which might be used to assess the factors which define
criticality and process selection.

3.2 Identifying Critical Processes

We sought to define the concept of a critical process, as a subset of those


processes seen as ‘important’. Critical can be defined as “of decisive
importance with respect to the outcome” (Makins 1992). Stamatis (1995)
supports this view with his own comments that "not all problems are equally
important” (Stamatis 1995) (p xxii). Melnyk (2000) and Stamatis (1995) cite
the Pareto Principle that we should learn to separate the critical few from the
‘trivial’ many (Stamatis 1995; Melnyk 2000). The Pareto principle can be
applied to important processes as we are suggesting that even when
important processes are identified there will always be some that are more
important than others.

The objective then is to identify within the literature what is considered to be


a critical process.

Hax and Majluf (1984), Carpinetti, Gerlamo & Dorta (2000), Rummler and
Brache (1991) and Dervitsiotis (1999) discuss the need to identify critical
processes or those processes that deliver value within the business process
arena. These authors though, fail to provide a clear definition of a critical
process (Hax and Majluf 1984; Rummler and Brache 1991; Dervitsiotis 1999;
Carpinetti, Gerlamo et al. 2000).

Hax and Majluf (1984) state that a critical process is one that affects key
success factors within organisations (Hax and Majluf 1984). Key success
factors are that group of objectives that must be satisfied in order to achieve
organisational goals. Hax and Majluf (1984) link key success factors to
organisational goals. It is not difficult though, to postulate, that if a critical
process is one that has the most effect on key success factors then it would
have an effect or impact on organisational goals.

Rummler and Brache (1991) add further evidence, suggesting that “critical
processes are strategically significant because their performance has an
impact on the success or failure of a business strategy” (Rummler and
Brache 1991). The link between organisational goals and business strategy
is an accepted one; thus we can suggest that Rummler and Brache (1991)
agree with Hax and Majluf (1984).

Dervitsiotis (1999) suggests that “an important issue in identifying critical


processes is the selection of goals, which define an organisation’s
competitive strategy” (Dervitsiotis 1999). He argues that to identify critical
processes the organisation needs to assess the impact of each process on
the goals of the organisation (Dervitsiotis 1999). We now have critical
processes linked to key success factors, business strategy and the goals of
the organisation. In addition, critical processes appear to be those that
impact on the goals, strategies and key success factors.

Carpinetti, Gerlamo & Dorta (2000), in their framework for the deployment of
strategy-related continuous improvements, extend our definition of a critical
process, stating that a “critical process is one that has the most impact on the
performance of an organisation” (Carpinetti, Gerlamo et al. 2000). Carpinetti,
Gerlamo & Dorta (2000) also suggest that a process is critical if it has the
most effect on organisational goals (Carpinetti, Gerlamo et al. 2000).

If there are hundreds of processes in the average business, we can postulate


that not all processes are equal in terms of their impact on objectives. That is,
the processes have different relative contributions to the objectives of the
organisation. As Carpinetti, Gerlamo & Dorta (2000) suggest, the critical
processes have the most impact on organisational goals. This statement
suggests the development of a list in which processes are rank ordered by
how much they impact upon organisational goals. To assess ‘impact’, if
impact is the relative contribution of a process on objectives and goals, we
would need a method which identifies and rates the impact of the process on
goals.
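
One simple way to operationalise such a rank ordering, sketched below under
our own assumptions (the goal names, weights and ratings are hypothetical,
and this is not a method prescribed by the authors cited above), is to rate
each process against each organisational goal and weight the goals by their
relative importance:

# Illustrative sketch: rank ordering processes by assessed impact on weighted goals.
goal_weights = {"retain key clients": 0.5,
                "reduce cost to serve": 0.3,
                "grow hosting revenue": 0.2}

# Impact of each process on each goal, rated 0 (none) to 5 (decisive). Hypothetical values.
impact_ratings = {
    "service support":     {"retain key clients": 5, "reduce cost to serve": 3,
                            "grow hosting revenue": 2},
    "security management": {"retain key clients": 4, "reduce cost to serve": 1,
                            "grow hosting revenue": 1},
    "internal reporting":   {"retain key clients": 1, "reduce cost to serve": 2,
                             "grow hosting revenue": 0},
}

def weighted_impact(ratings):
    # Sum of each goal's weight multiplied by the rated impact on that goal.
    return sum(goal_weights[goal] * score for goal, score in ratings.items())

for process, ratings in sorted(impact_ratings.items(),
                               key=lambda p: weighted_impact(p[1]), reverse=True):
    print(process, "- weighted impact", round(weighted_impact(ratings), 2))

A rank ordered list of this kind makes the relative contribution of each
process to the goals explicit, which is precisely the gap left open by the
authors above.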

Although Carpinetti, Gerlamo & Dorta (2000), Dervitsiotis, (1999), Rummler


and Brache (1991) and Hax and Majluf (1984) are in agreement with the
concept that critical processes impact upon organisational goals, these
authors do not attempt to provide a guide as to how this should be
accomplished.

Is impact the only factor which defines a critical process? Within the literature
we have found three types of critical process. There is the generic business
critical process as defined by Carpinetti, Gerlamo & Dorta (2000), which
suggests that a “critical process is one that has the most impact on the
performance of an organisation” (Carpinetti, Gerlamo et al. 2000).

The second type of critical process is found in the manufacturing and


production industries and suggests that critical processes are where most
problems are encountered (Melnyk 2000). Seely, Hutchins, Luscher, Sniff
and Hassler (1999) extend this definition by associating critical steps and
critical variables with the reliability of equipment and possible failure of that
equipment. Their explanation of a critical step suggests that many steps
would become a process and, thus, is relevant to our study.

The third type is the safety focussed critical processes seen in the nuclear
industry (Poullot, Doutriaux et al. 2001).

The manufacturing and nuclear industries do not relate their critical


processes to objectives or goals. They are instead more focussed on the
notion of failure.

Manufacturing and Production Critical Process


Melnyk (2000) has a manufacturing view of critical processes and describes
critical processes as bottlenecks, processes which consume most of the
available resources, require the most lead time and are where most problems
or rejects are encountered. He adds that critical processes are also the
visible activities or processes from a customer view (Melnyk 2000). Melnyk
(2000) is focused on manufacturing or production processes and critical
processes which are visible to the customer. These suggest problem
processes and processes which are difficult to control. These may lead to
failure or some form of failure of the process.

Authors of an article on pharmaceutical manufacturing, Seely, Hutchins,


Luscher, Sniff and Hassler (1999) define a variable as “a critical variable if its
operating range is near the edge of failure” (Seely, Hutchins et al. 1999).
Seely et al (1999) provide a linkage between critical processes and failure
similar to Melnyk (2000) who links critical processes to problems. It is
plausible to suggest that problems might also be a type of failure; thus both
authors consider a critical process to be a failed or failing process.

Seely et al (1999) further extended their definition by stating that a critical


step is “one that is difficult to control, usually because one or more critical
variables cannot be sufficiently engineered out of the process” (Seely,
Hutchins et al. 1999). Thus a critical variable might result in a critical step
which in turn identifies a critical process. As Melnyk (2000) also suggests a
critical process is one which is difficult to control, then difficulty in controlling
the process is the symptom, and the issue which makes the process critical
is the probability of failure of the process due to the lack of control. We might
then state that a process is critical because of a greater probability of failure
of that process.

The use of Failure Mode and Effects Analysis (FMEA) provides further
insights into the notion of criticality within the automotive industry. FMEA is a
quality improvement method initiated by the US Army in the late 1940s
(Kinetic 1999). The automotive industry used it as the basis of its
quality improvement processes until the development of ISO 9000 (Kinetic
1999). FMEA is commonly defined as “a systematic process for identifying
potential design and process failures before they occur, with the intent to
eliminate them or minimize the risk associated with them” (Stamatis 1995;
Kinetic 1999). The focus within the FMEA methodology is on failure and the
detection of failure.

Further evidence of this focus is seen in the FMEA Risk Priority Number
(RPN). The RPN is an assessment of the risk of probable undesired
outcomes of a production process which provides a rank order list of those
outcomes for which action is required. The undesired outcomes are the types
of failure that might occur in a design or production process. We might thus
postulate that within some industries a failed process, or a process that might
fail, is a critical process.
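To make the RPN arithmetic concrete, the sketch below shows the conventional FMEA calculation in which severity, occurrence and detection are each rated on a 1-10 scale and multiplied together; the failure modes and ratings used are illustrative assumptions, not data from the sources cited above.

def risk_priority_number(severity, occurrence, detection):
    """Conventional FMEA Risk Priority Number.

    severity   - how serious the effect of the failure would be (1-10)
    occurrence - how often the failure mode is expected to occur (1-10)
    detection  - how unlikely the failure is to be detected before it
                 reaches the customer (1-10, 10 = hardest to detect)
    """
    for rating in (severity, occurrence, detection):
        if not 1 <= rating <= 10:
            raise ValueError("FMEA ratings are expressed on a 1-10 scale")
    return severity * occurrence * detection

# Illustrative failure modes for a production process (values assumed).
failure_modes = {
    "label mis-print": risk_priority_number(severity=3, occurrence=6, detection=2),
    "seal rupture": risk_priority_number(severity=9, occurrence=2, detection=5),
}

# Rank order the undesired outcomes so that action is taken on the highest RPN first.
for mode, rpn in sorted(failure_modes.items(), key=lambda item: item[1], reverse=True):
    print(mode, rpn)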

Nuclear Industry Critical Processes


The third type of critical process can be found in the nuclear industry. The
Nuclear Energy Agency in their guide the “ICSBEP Guide to the Expression
of Uncertainties” by the International Criticality Safety Benchmark Evaluation
Project (ICSBEP) studied the factors leading to failure (Poullot, Doutriaux et
al. 2001). One area in which the working group focussed was on
uncertainties in relation to defining criticality. Criticality in this industry is
related to safety. Failure in a process may lead to a safety issue of enormous
proportions, the Chernobyl disaster being an example of a failed process
where the process was critical to safety. Thus we can define critical
processes in this industry as those processes in which failure results in
unsafe or life threatening conditions. The nuclear industry assesses this
criticality of a process by assessing the amount of uncertainty relating to
when failure might occur (Poullot, Doutriaux et al. 2001).


The manufacturing, production and nuclear industries do not link critical


processes to objectives or goals; instead they relate critical processes to
failure or failure avoidance (Stamatis 1995; Kinetic 1999; Seely, Hutchins et
al. 1999; Melnyk 2000; Walker 2000; Poullot, Doutriaux et al. 2001).

If our objective is to identify critical processes and rank order these critical
processes so that we can select the most important for re-engineering then
we should focus on those that are most critical. This then suggests that only
those processes which are most likely to fail should be considered critical.

If failure is related to critical processes, what is failure?

Seely et al. (1999) state that failure in a critical step is when the variation in
performance is outside of stipulated margins. They suggest that there is only
one category of failure, whereas the automotive industry defines failure
using five categories. In the FMEA methodology, failure falls into one of
five possible categories:
1. Complete failure;
2. Partial failure;
3. Intermittent failure;
4. Failure over time; and
5. Over-performance of Function (Stamatis 1995; Kinetic 1999).

These categories of failure enable an organisation to define failure. The
reason for defining the type of failure is that failures differ in their
effects. For example, the effect of a printer failing to operate in the
middle of the night when it is not needed is different from the effect of
the printer failing to operate when a potential client is waiting for an
important document. As some failures are not critical, we need to classify
failure into some useful categories.


Over Performing - Exceeds Need: Utilises more resources than needed
Optimal Performance: Operates within the target range
Sub-Optimal Performance: Operates occasionally outside of the target range
Failure: Unable to achieve the target range

Figure 7- Categories of performance levels

Figure 7 above is a breakdown of the categories of performance related to


failure. Over-performance is considered a type of failure as more resources
are consumed than necessary. Sub-optimal performance is when the
process operates occasionally outside of the optimal range and failure is
when the process is not able to reach the output it is designed for.

[Figure: performance zones showing Optimal Performance, the neighbourhood of acceptable performance, and Failure.]

Figure 8- Categories of performance model

Figure 8 reduces the types of failure to one area, suggesting that some
failure is really within the neighbourhood of acceptable performance as the
effect of the failure is minimal and thus not a reason for process
improvement.
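To illustrate how these categories might be applied in practice, the following sketch classifies a set of observed process outputs against a target range using the category names of Figure 7; the target range, the resource figures and the sample observations are assumptions made purely for the example.

def classify_performance(outputs, target_min, target_max, resources_used, resources_budgeted):
    """Classify observed process performance using the categories of Figure 7."""
    in_range = [target_min <= value <= target_max for value in outputs]
    if resources_used > resources_budgeted:
        return "Over performing - exceeds need"  # utilises more resources than needed
    if not any(in_range):
        return "Failure"                         # unable to achieve the target range
    if all(in_range):
        return "Optimal performance"             # operates within the target range
    return "Sub-optimal performance"             # occasionally outside the target range

# Example: weekly output of a process with an assumed target range of 95-105 units.
print(classify_performance([97, 101, 92, 99], 95, 105,
                           resources_used=80, resources_budgeted=100))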

We now know that failure is related to critical processes and that, if failure
is categorised, it is possible to define the effect of failure on the
organisation. To make use of this information we need a method to assess the
effect of the failure of a process on the organisation.

McFarlan and McKenney (1983) introduce a concept called “dependency” in
relation to IT services which did not perform as intended. They suggest that
the dependence of an organisation on a process is a separate issue from the
impact of a process on the strategies of the organisation (McFarlan and
McKenney 1983). That is, a company may not be absolutely dependent on
the uninterrupted and cost-effective functioning of a process to achieve either
its long-term or short-term objectives, yet the impact of the system may still
be absolutely vital for the firm to reach its strategic objectives (McFarlan and
McKenney 1983) (p15). McFarlan and McKenney (1983) thus treat
dependency and impact as different issues. Their use of dependency relates to
the need for a system to operate optimally. We might therefore use an
organisation’s dependency on a process as an assessment of the effect of
that process failing.

We have seen that critical processes are related to the ‘impact’ of a process
on organisational objectives and goals, and that this ‘impact’ is related to the
contribution of a process when its performance is considered optimal. We
have also seen that the effect of the failure of a process on an organisation is
critical and can be assessed as ‘dependency’.

To understand fully the effect of failure we must also understand how often
the failure occurs. The occurrence of failure might be assessed as the
probability of failure of a process.

The result is that we now have two factors by which to assess failure:
1. The probability of process failure; and
2. Dependency, the effect of the failure of the process on the organisation.

Thus these two factors are also factors with which to assess how critical a
process might be. It might be said that the factors “probability of failure” and
“dependency” are related to the negative effect of a process, while the factor
previously examined, impact, is related to the positive contribution of the
process to organisational objectives and goals.


Summary
To summarise thus far, we now have three factors for criticality:
1. Impact - the relative contribution of a process to the objectives, strategies
and goals of the organisation;
2. Probability of failure - the chance that a process will operate within the
failure zone; and
3. Dependency - the effect on the organisation of the failure of a process.

[Figure: a triangle labelled CRITICALITY, with the factors Dependency, Impact and Probability of Failure forming its three sides.]

Figure 9- The three factors used to assess criticality

The three factors used to assess criticality are shown in the diagram above.
These three factors will allow us to identify a process as critical to the
achievement of organisational goals.

At the beginning of the literature review we listed four key issues which the
literature review should focus upon.
Q 1. What is the meaning or definition of important processes?
Q 2. What are the factors which are needed to identify an ‘important’
process?
Q 3. What are the factors which are needed to select an ‘important’ process
for improvement?
Q 4. How do we make the link between processes and goals?

In relation to key issue #1, a definition of an ‘important’ process would now


be: “An important process is one which has the greatest positive contribution
or negative effect on the organisation over time.”


Key issue #2 asked what factors are needed to identify an ‘important’
process. In this thesis these factors are called impact, dependency and
probability of failure.

The next section will examine the literature surrounding the selection of
processes for improvement in order to seek an answer to key issue #3.

Key issue #4, “how do we make the link between processes and goals?” has
not been achieved. We have yet to show how this link might be assessed
although we have already suggested that a BSC might provide the answer.

The next section will examine the issue of selection.

3.3 Process Selection

Thus far we have shown from the literature how we can derive the three
factors which make up what we term criticality. Our research objective is to
identify processes that are critical and suitable for process improvement. It is
insufficient to just identify critical processes, as not all processes, critical or
not, are susceptible to successful improvement. For this reason we now need
to understand how to select which of the most critical processes might be
successfully improved. In order to do this we should go back to the work of
Hammer & Champy (1993) who provided some useful advice on this matter.

Hammer & Champy concede that they provide little advice on how to make
re-engineering projects successful (Hammer and Champy 1993) (p216).
They do provide three directives for identifying processes to be re-
engineered. The third of these directives is that selection should be based on
feasibility; that is, on the processes most susceptible to successful redesign
(Hammer and Champy 1993).


The intent of Hammer & Champy’s (1993) feasibility factor is concerned with
the “practicability of a [process improvement] plan” (Makins 1992). That is, is
the improvement of the process within the skills and resources of the
organisation? The meaning of ‘most susceptible to successful redesign’ is
part of the meaning of feasibility, but is more focussed on those factors which
might influence the success of the improvement project. The issue then is to
select processes for improvement based on the principles of project selection
(skills and resources): Does the project add value for the organisation and is
the organisation capable of completing the project successfully?

Choosing which of a number of critical processes to improve first may be


similar to selecting which IT project (or projects) a firm should undertake.
Selection requires that there is a way of comparing the different projects.

That is, it is much easier to select a project from within a group of projects if
you can define the benefits (both tangible and intangible), costs and the
ability of the organisation to complete it on time, within budget and with the
available resources (Weiss and Wysocki 1992; Meredith and Mantel Jr 2000;
Jewels 2001; Sisco 2002). We have now three factors which need to be
assessed in order to answer the question of which of the most critical
processes are the most feasible for improvement. These are:
1. Benefits to the organisation of improving the process;
2. Costs to the organisation to improve the process; and
3. Ability of the organisation to successfully improve the process.

Examining the benefits and costs factors we know that for a project to be
useful to an organisation it should have benefits which exceed the costs of
conducting the project (Weiss and Wysocki 1992; De Loof 1997; Davidson
and Griffin 2000; Meredith and Mantel Jr 2000). These comparisons are
commonly referred to as a cost/benefit analysis (Davidson and Griffin 2000).
A cost/benefit analysis will provide an organisation with a ranking of the most
critical processes sorted by those with a positive net benefit to the
organisation. This meets a portion of the feasibility factor provided by
Hammer & Champy (1993). The remaining portion of feasibility is the ability
of the organisation to successfully improve the process. In order to assess
this ability we should identify the probability of
successful process improvement.

The probability of successful improvement of a process is essentially a risk


assessment of the abilities of the organisation or project team to successfully
improve the process so that it will provide the suggested benefits within the
costs allowed for the project. This assessment should take into account
the actual budget and resources provided for the project, the scope of the
project and the time frame in which the project should be completed. In
addition the assessment should consider the skills of the project team, their
historical successes in this type of project and the support of the project
provided by top management.

We have now found two factors for process selection (cost/benefit &
probability of success). Cost/benefit relates to the ‘business case’ used as
justification for choosing which project adds value to the organisation and
probability of success relates to the factors which contribute to the success of
completing the project itself rather than the justification for the project. A
business case is a formal document used to define the costs and possible
benefits of a project. This document can be used to compare projects
requiring funding (Weiss and Wysocki 1992; Thorp 1998; Davidson and
Griffin 2000; Meredith and Mantel Jr 2000; Viljoen and Dann 2000).

We thus have two further factors to use in a method for identifying and
selecting critical processes for improvement (cost/benefit and probability of
success). These two further factors should now allow an organisation to at
least identify and select processes for improvement.


[Figure: the criticality triangle with the two selection factors, Cost/Benefit and Probability of Success, added alongside it.]
Figure 10- The two factors for selection added to criticality

In Figure 10 we have shown the two new factors, cost/benefit and probability
of success. Included is the combination of the previous three factors, impact,
probability of failure and dependency which make up what we have termed
criticality shown in the red triangle.

Figure 11 provides the five factors in one model. Criticality is shown in a red
triangle and is the rank ordering of the processes assessed for dependency,
probability of failure and impact.
1. Dependency is the effect of failure of the process;
2. Probability of Failure is the probability that a process will fail; and
3. Impact is the relative effect of a process on the strategies and business
goals of the organisation.

The top ‘few’ of the ranked processes are then assessed for the value they
might provide to the organisation if they are improved. Of those processes
assessed for cost/benefit, only those having a positive value for the
organisation are assessed for the probability of successful improvement.
Process improvement projects with positive value for the organisation and
the greatest chance of successful improvement are the selected processes.
4. Cost/Benefit is the positive or negative result of comparing the expected
costs of improving a process against the expected benefits resulting from
improving the process; and
5. Probability of Success is the assessment of the risk that the organisation
can successfully improve the process.

[Figure: the five factors in one model - the criticality triangle (Dependency, Impact, Probability of Failure) together with Cost/Benefit and Probability of Success.]

Figure 11- Five factors for identifying and selecting a critical process

The review of the literature has thus far identified the five factors which are
necessary for identifying and selecting a critical process for improvement.
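A minimal sketch of how the five factors might be combined in sequence is shown below. The way the three criticality ratings are combined (a simple product), the ‘top few’ cut-off and all of the example figures are assumptions made for illustration; they are not prescribed by the targeting method itself.

# Illustrative data: three criticality ratings (1-10), an expected net benefit
# from improving the process, and a probability-of-success rating (1-10).
processes = {
    "provision new client": {"impact": 9, "prob_failure": 6, "dependency": 8,
                             "net_benefit": 120000, "prob_success": 7},
    "patch application":    {"impact": 7, "prob_failure": 8, "dependency": 9,
                             "net_benefit": -15000, "prob_success": 9},
    "invoice client":       {"impact": 6, "prob_failure": 3, "dependency": 5,
                             "net_benefit": 40000, "prob_success": 5},
}

TOP_FEW = 2  # how many of the most critical processes to carry forward (assumed)

def criticality(p):
    # One plausible combination of the three factors: a high score requires a
    # process that matters, is depended upon and has a real chance of failing.
    return p["impact"] * p["prob_failure"] * p["dependency"]

# Step 1: rank all processes by criticality and keep only the top few.
ranked = sorted(processes, key=lambda name: criticality(processes[name]), reverse=True)[:TOP_FEW]

# Step 2: of those, keep only processes whose improvement has a positive cost/benefit.
positive = [name for name in ranked if processes[name]["net_benefit"] > 0]

# Step 3: select the remaining candidates with the greatest probability of success.
selected = sorted(positive, key=lambda name: processes[name]["prob_success"], reverse=True)

print("Criticality ranking:", ranked)
print("Selected for improvement:", selected)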

Returning to our key issues we have now addressed the first three.
1. What is the meaning or definition of important processes?
2. What are the factors which are needed to identify an ‘important’ process?
3. What are the factors which are needed to select an ‘important’ process for
improvement?
4. How do we make the link between processes and goals?

As previously suggested the research team believed that a BSC would


enable an organisation to define the link between processes and goals. This
would provide the ‘impact’ factor for criticality. This will be examined later in
the literature review.

In order to provide an operationalisable method for targeting critical


processes for improvement we need to provide suitable methods or tools with
which a practitioner might assess the factors of the method.
1. Dependency, effect of failure of a process on the organisation
2. Probability of failure of a process
3. Impact, relative contribution of a process on organisational objectives and goals
4. Cost/benefit of the process improvement project
5. Probability of a successful process improvement project for that process

The following section will examine some of the possible tools/methods


available to the practitioner for assessing the five factors stated above.

3.4 Assessment of the Five Factors

The purpose of this section is to identify a suitable method by which a


practitioner might conduct the assessments of the five factors which make up
the targeting method. There are many standard and accepted tools used to
calculate some criteria for process selection such as cost/benefit and risk
analysis: these standard approaches are only briefly introduced.

Many organisations suffer from ‘paralysis by analysis’ when they attempt to


provide conclusive proof of the state of a situation or element (Dettmer 1998;
Hacker and Brotherton 1998; Dollinger 1999; Frenzel 1999; Meredith and
Mantel Jr 2000; Kaplan and Norton 2001). This can be as unhelpful and
disastrous as guessing at a state of being from limited knowledge and
experience (Meredith and Mantel Jr 2000).

As with research methods there are two categories in which assessment


methods might be considered:
1. Quantitative assessments- these are based on possibly complex statistical
data which is analysed to provide a more reliable result.
2. Qualitative assessments- these are based on historical data, thoughts and
impressions gained from experience and conversation. The analysis might
occur through conversation or debate, or within a single person’s thinking.


The sophisticated analytical methods have not been well researched or


applied in this thesis. From a more pragmatic view the qualitative approach is
favoured by business in order to avoid “paralysis by analysis”.

One practical method is the heuristic approach called “anchoring and


adjustment style heuristics” (Davidson and Griffin 2000). The approach bases
the assessment of a process factor on historical data. This is the anchoring
portion of the heuristic style. The historical data is adjusted to suit the present
context (time and place) adding knowledge sourced since the original data
was provided. This is the adjustment portion of the anchoring and adjustment
style heuristic approach. The result is an assessment based on fact and
adjusted qualitatively. The approach consumes less time and resources but
is ultimately an approximation of what the assessment might be if a
quantitative approach was used (Davidson and Griffin 2000).
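A minimal sketch of the anchoring and adjustment idea is given below; the function, the 1-10 scale and the example adjustments are assumptions used only to illustrate how a rating anchored in historical data might be adjusted for the present context.

def anchor_and_adjust(historical_rating, adjustments):
    """Anchor on a rating derived from historical data, then adjust it
    qualitatively for what has changed since that data was collected."""
    rating = historical_rating + sum(adjustments.values())
    return max(1, min(10, round(rating)))  # keep the result on the 1-10 scale

# Example: a process historically rated 4, adjusted upwards for recent staff
# turnover and downwards for a newly introduced monitoring tool.
print(anchor_and_adjust(4, {"staff turnover": +2, "new monitoring tool": -1}))  # 5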

The following sections will provide an explanation of how the anchoring and
adjustment style heuristic approach might be applied to assess each of the
five factors in the targeting method (dependency, probability of failure,
impact, cost/benefit and probability of success).

3.4.1 Impact of a Process on Organisational Goals

Of the five factors, impact of a process on organisational goals is the factor


which will be used to assess the relative contribution of a process to
organisational objectives and goals. The contribution is assessed for the
process when the process is considered within the optimum performance
zone. We have previously shown that not all processes are equal and that
the impact of a process on objectives is variable. The variation arises due to
the context in which the process resides. That is, it is affected by the skills,
resources and knowledge of the people involved in the process and the
impact of other events outside of their control.


Our fourth key issue is: How do we make the link between processes and
goals? We thus need a method by which we are first able to link processes to
objectives and goals and then able to assess the impact identified by each
link. This method will then provide a calculable assessment of the factor
impact.

Some of the following methods may be suitable for linking processes to


goals. The review of the literature was unable to find research that compared
and contrasted those methods that were promoted as management tools
which link business activities to strategies and goals. The following is a list of
possible tools which were considered by the research team when looking for
a suitable method of linking processes to objectives and goals:
1. The John Thorp DMR Approach (Thorp 1998)
2. The French ‘Performance Scorecard’ (Mendoza and Zrihen 2001)
3. Analog Devices Ltd’s ‘Corporate Scorecard’ (Kaplan and Norton 1992;
Kaplan and Norton 1996)
4. McKinsey's 7-S Framework (Peters and Waterman 1982)
5. Goldratt’s Theory of Constraints (Mabin, Forgeson et al. 2001)
6. Kaplan and Norton’s Balanced Scorecard (Kaplan and Norton 1992)
7. The ‘Australian Business Excellence Framework’ (Australian Quality
Council 2001)
8. Strengths, Weakness, Opportunities and Threats (SWOT) analysis
9. Porter's Five-Forces Model (Porter 1980)

These methods assist in reducing complexity, identifying important


influences, showing trends from recorded data and formulating organisational
need. Methods 1, 2, 3, 4 and 5 were considered to be similar in intent to the
BSC while 7 and its related frameworks from Europe and the US were
considered to be used at a layer above, dealing solely with the measurement
of performance. The remaining methods (8 & 9) are considered to be applied
at a lower layer organisationally than the BSC. Figure 12 summarises these
nine different tools which may be useful for linking processes to goals, and
the three levels of hierarchy to which they are best suited.


Business Excellence Framework - Measures achievement of goals, not a strategy builder
Balanced Scorecard - Strategy development and measurement using cause & effect
John Thorp DMR Approach - Improves the alignment of IT projects with IT strategies
Performance Scorecard - Precursor to the BSC, circa 1930s
Corporate Scorecard - Precursor to the BSC for measuring corporate strategy achievement
7-S Framework - Similar to the BSC except it uses 7 perspectives instead of 4 and is based in strategy development, not measurement
Theory of Constraints - Generally aimed at problem solving
SWOT Analysis - Assesses the strengths, weaknesses, opportunities and threats of an entity
Porters 5 Forces - Helps to describe the business environment

Figure 12-Description of alternative tools for achieving alignment

Figure 12 lists some possible approaches to linking processes to goals. It


shows the tools as fitting into three separate hierarchies, on the left, and a
brief explanation of why they should sit there, on the right. Each of the
methods in the central box used some form of heuristically based cause &
effect to uncover the linkages between items that were developed in that
method. It is this linking of items that is of use in assessing impact.

The BSC was selected for three reasons:
1. Well known in business circles (Ittner and Larker 1998; Lipe and Salterio
2000)
2. Has good general acceptance within the business community and


academia. For example Harvard Business School, Siemens, IBM, British
Telecom, Mobil, Volvo, AT&T, Toyota Australia, Alcoa and Bayer are users of
the BSC method (Kaplan and Norton 1996).
3. Appeared to fulfil the needs of the targeting methodology. This would be
ascertained from the literature review and tested in the action learning case
studies.

The research project’s intention is to use the BSC to assess the impact of a
process on organisational goals and to provide the link between processes
and organisational goals.

The following sections examine the BSC in detail as the BSC is fundamental
to the successful identification of critical processes. A detailed review is
necessary to allay concerns that it can be a difficult method to implement,
since the research team will not be using it as it was originally intended.
Kaplan (1998) in a review of implementations of the BSC commented that
many failures to implement successfully were due in part to using the BSC
for something other than that which was originally intended (Kaplan 1998).

3.4.2 The Balanced Scorecard

In this section, we introduce the key concepts of the BSC required by our
objective to link processes to organisational objectives and goals. We close
by showing how the BSC links processes to objectives and thus to goals.

Robert Kaplan and David Norton introduced the Balanced Scorecard (BSC)
in 1992 as an improved method of measuring an organisation’s goal
achievement (Kaplan and Norton 1992). It was unique in that it removed the
then heavy focus of organisations on the financial measures of success, such
as profit and share price (Kaplan and Norton 1992).


The four perspectives introduced are:
1. Financial perspective
2. Customer perspective
3. Internal process perspective
4. Learning & knowledge perspective

Business goals are defined for each of these perspectives. For each goal, a
set of specific strategies is developed. For each strategy, a set of objectives
is defined and key performance indicators developed that measure progress
towards the attainment of the objectives.

A Balanced Scorecard is balanced when there are objectives identified for


each of the four perspectives. ‘Cause and effect’ linkages are shown
between each of these objectives and the selected strategies. This ‘strategy
map’ provides a logical picture of how to achieve the goals of the
organisation (Kaplan and Norton 1996a). The objectives or minor goals are
then provided with measures in order for the organisation to monitor progress
towards their achievement (Kaplan and Norton 1992; Kaplan and Norton
1996; Kaplan and Norton 2001).

In 1996 Norton and Kaplan authored a book ‘The Balanced Scorecard’


proposing the BSC as a method of aligning the strategies of an organisation
to their vision and mission. Norton and Kaplan stated that, "the BSC
translates an organisation's mission and strategy into a comprehensive set of
performance measures and management system" and that this
measurement is achieved and evaluated by assessing the “organisational
performance across four perspectives; financial, customers, internal business
processes, and learning and growth" (Kaplan and Norton 1996).

They suggested that their method was far more efficient than many
improvement initiatives such as total quality management (TQM), activity
based cost management and re-engineering as these initiatives fail to link the
organisations’ objectives to their strategies (Kaplan and Norton 1996). The
improvement of financial results is usually the main objective of organisations
and the BSC allows this priority to remain while ensuring that the focus of the
business takes into account the intangible or ‘soft’ issues required for long-
term results (Kaplan and Norton 1996; Kaplan and Norton 2001).

The financial perspective is the traditional view of the organisation. The


objectives here are based around those traditional performance indicators,
which include assessments of measures such as operating costs, return-on-
investment and shareholder value. Within the BSC this perspective is still
required as any business needs to be financially viable to remain in business.

The customer perspective looks at the objectives which the organisation
needs to achieve in order for its customers to perhaps pay higher prices and
buy more products or services. Measures might be customer satisfaction,
retention and repeat custom.

The internal process perspective looks at the processes of the organisation.


The objectives of the internal process perspective deal with those efforts to
improve or change processes to become more effective and efficient. It also
looks at production and processes, measuring performance in terms of time
to market, number of defects, new product development and information
system performance.

The learning and knowledge perspective explores the effectiveness of


management in terms of measures of employee satisfaction and retention
and the completeness of employee skills required for tasks. These tasks are
usually involved with the internal processes of the organisation.

The next section discusses the development of measures within a BSC.

3.4.3 Performance Measures

The choice of performance measures is one of the most critical challenges


facing organisations (Ittner and Larker 1998). Even an excellent set of BSC
measures does not guarantee a winning strategy, as not all long-term


strategies are profitable strategies. The BSC can only assist in translating a
company’s strategy into specific measurable objectives (Kaplan and Norton
1992).

The measures suggested by Kaplan and Norton were described as leading,


lagging and diagnostic indicators of performance used to evaluate progress
(Kaplan and Norton 2001). A good BSC should contain a mix of each type.
1. A leading indicator is one which provides an indication of future
performance.
2. A lagging indicator is an historical performance measure such as profit and
loss.
3. A diagnostic indicator is a measure which assesses the validity of a
leading or lagging indicator.

The use of leading indicators ensures that the organisation is able to alter
their activities in response to measures which indicate future problems in
achieving goals. This type of indicator is difficult to develop and may be
affected by many situations outside of the organisation’s control (Kaplan and
Norton 1996a; Kaplan and Norton 2001).

For example, a leading indicator for an Australian leather producing company


was a measure of the rainfall in the grain growing areas of the US. The cause
and effect applied to this measure was that low rainfall in the grain growing
areas would result in higher prices for grain and fewer grain fed cattle in ‘feed
lots’ in the US. Fewer grain fed cattle led to fewer cattle slaughtered and an
increase in imports of beef to the US market. As most beef imported to the
US came from Australia this would lead to larger numbers of cattle
slaughtered in Australia, thus resulting in a plentiful supply of skins for the
leather production company. The time frame was approximately six months
and meant that depending on the rainfall figures, the company would have
early notice (leading indicator) concerning the lack of or over supply of skins
in the Australian market in six months time.


Diagnostic indicators associated with the leading indicator might be an


assessment of the relationship between rainfall and the cost of grain and the
number of US feed lot cattle and imports from Australia. If trade barriers were
raised by increasing the tariffs on Australian beef then the leading indicator
might be faulty.

Leading indicators can be very difficult to develop and thus create many
difficulties for organisations (Beauchamp 1999; Lipe and Salterio 2000).

Much has been said on the problems with defining and using measures
within the BSC (Brancato 1995; Kaplan and Norton 1996; Lingle and
Schiemann 1996; Abell 1999; Beauchamp 1999; Germain 2000; Lipe and
Salterio 2000; Nickols 2000; Stewart 2001).

Gendron (1997) states that the measurements used “might be large (100 or
more) and collected daily, weekly, monthly, quarterly or yearly” (Gendron
1997). Kaplan and Norton appear to contradict this, suggesting that
companies used too many measures for performance and that the BSC
“forces managers to focus on the handful of measures that are most critical.”
These were the “operational measures that are the drivers of future financial
performance” (Kaplan and Norton 1992). Stewart suggests limiting yourself to
20 measurements or fewer (Stewart 2001).

Of these indicators, financial data make up 1/3 or less (Gendron 1997). Ittner
and Larker (1998) suggest that the financial indicators are usually 50% of all
measures used. A figure such as this alludes to an imbalance in the
measures used by organisations (Ittner and Larker 1998).

Martinson, Davison and Tse (1998) state that there is an increased difficulty
in assessing the suitability of measures due to the confusion caused “by the
changing and individual objectives of an organisation” (Martinson, Davison et
al. 1998). Lipe and Salterio cite their study, which revealed that only “common
measures affect the superiors' evaluations” (Lipe and Salterio 2000). This
concerned an organisation with a number of business units with both
common and unique measures. The unique measures, usually considered the
leading indicators or drivers, were disregarded in cross-business-unit
evaluations (Ittner and Larker 1998). Norton and Kaplan also suggest that an
area lacking in knowledge is performance measures for staff (Kaplan and
Norton 1996) (p149).

There are considerable challenges for organisations implementing the BSC.


The complexity and uniqueness of each set of measures for each
organisation might be outside the level of expertise of those entrusted with
the task of defining measures. Managers state that this is one of the hardest
tasks they will tackle (Kaplan and Norton 2001).

Summary
This section described some of the problems associated with measures.
These included:
1. The development of suitable leading and diagnostic measures
2. The number of measures which should be used
3. The mix of leading, lagging and diagnostic measures
4. The difficulty of management in assessing measures which are unique
5. Lack of knowledge concerning staff performance measures; and
6. The difficulty found in assessing the correctness of measures.

These issues suggest that this part of the BSC is one of the most difficult to
develop and maintain.

The following section examines the facet of ‘cause and effect’ within the
BSC. This facet is particularly important to the targeting method as it is the
basis of the linking of processes to goals and integral to the assessment of
the impact of a process on organisational goals.


3.4.4 ‘Cause and Effect’ Relationships

This section will examine the use of cause & effect within the BSC. Cause
and effect linkages within the BSC methodology offer the opportunity to, first,
link processes to objectives and goals and, second, assess the impact of
these linkages in order to provide a rank order listing of those processes
which contribute the most to organisational objectives and goals.

Norton and Kaplan in their 2001 book ‘The Strategy-Focussed Organisation’,


show how the intangible areas of an organisation affect the financial results
(Kaplan and Norton 2001). This is developed using the ‘cause and effect’
method which explains how an intangible area such as customer satisfaction
is linked through customer loyalty to repeat customers to greater revenue
(Kaplan and Norton 1996a). It is this connection of objectives within the BSC,
which allows the user to visually see linkages relating the customer, internal
processes and learning and knowledge perspectives to the financial
perspective.

Norton and Kaplan state that a “strategy is a set of hypotheses about


cause and effect” (Kaplan and Norton 1996) (P30). This cause and effect
linkage is concerned with the ‘drivers’ behind a strategy. What are the
actions that impact upon the achievement of a given strategy?

Figure 13 is an example of cause & effect between the four perspectives.


This figure is not available online.


Please consult the hardcopy thesis
available at the QUT Library.

Figure 13- Example of cause and effect for BSC (Kaplan and Norton 1996)

Figure 13 is read from bottom to top and provides a useful example of how
increasing the knowledge and skills of employees is the cause used to
improve the way employees use a process. Improved processes caused by
improved knowledge and skills will thus provide increased customer
satisfaction which might result in increased customer buying and thus greater
financial results. The example is a set of cause and effect relationships. Each
cause has effects which create other causes for further effects.

The balanced scorecard supplies the “essential strategic feedback system”


(Van der Zee 1999). “A business strategy can be viewed as a set of
hypotheses about cause-and-effect relationships” and Norton and Kaplan
(1996) suggest that by establishing short-term goals, or milestones, within
the business planning process, management might test the linkages between
strategy and performance measures.

Organisations successful in achieving performance goals but still not
achieving objectives should understand that it is possible their performance
measures are not linked to the objective (Van der Zee 1999). Continuing this
link, Norton and Kaplan (1992) state in their original journal article that “not all
long-term strategies are profitable strategies” (Kaplan and Norton 1992).
Kaplan and Norton (2001) also state that strategy needs to be chosen with
care as the BSC does not test strategies for correctness (Kaplan and Norton
2001).

Mitchell, Coles and Metz (1999) state that, over time, cause and effect links
can be correlated statistically and practitioners should remove measures of
incorrect propositions or propositions which are not linked (Mitchell, Coles et
al. 1999).

The cause and effect process within the BSC methodology will enable this
study to assess the impact of a process on organisational objectives,
strategies and goals. An example of this is shown in Figure 14 where cause
& effect linkages (arrows) are drawn showing the link between processes and
goals.

[Figure: a single Goal at the top linked by cause & effect arrows to Strategies, the Strategies to Objectives, and the Objectives to Processes at the bottom.]

Figure 14- Example of processes linked to Goal
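One way to operationalise linkages of this kind is sketched below: an assumed strategy map is held as a set of cause & effect links, each with a heuristic strength out of 10, and the relative impact of each process on the goal is accumulated along every path. The map, the link strengths and the names are illustrative assumptions, not data from the study or from Kaplan and Norton.

# Each node maps to the nodes it contributes to, with an assumed 1-10 strength
# for the cause & effect link.
links = {
    "Process: back up client data": {"Objective: reliable hosting": 8},
    "Process: resolve incidents": {"Objective: reliable hosting": 6,
                                   "Objective: satisfied customers": 7},
    "Objective: reliable hosting": {"Strategy: retain clients": 9},
    "Objective: satisfied customers": {"Strategy: retain clients": 8},
    "Strategy: retain clients": {"Goal: grow recurring revenue": 9},
}

def impact_on(node, goal):
    """Accumulate the strength of every cause & effect path from node to goal."""
    if node == goal:
        return 1.0
    total = 0.0
    for target, strength in links.get(node, {}).items():
        total += (strength / 10.0) * impact_on(target, goal)
    return total

goal = "Goal: grow recurring revenue"
for process in ("Process: back up client data", "Process: resolve incidents"):
    print(process, round(impact_on(process, goal), 2))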

The issues with ‘cause & effect’ appear to be concerned with the validity of
the cause & effect links over time and that the strategies chosen are correct
for that organisation.

Norton and Kaplan suggest that the decision to use the BSC should be
based on whether the organisational unit has a “strategy to accomplish its
mission,” and that “Balanced Scorecards have been developed for complex
support functions, joint ventures and not-for-profit” organisations (Kaplan and
Norton 1996) (P301). The BSC method is not the only method which uses
cause and effect linkages. Mabin, Menzies, King and Joyce (2001) cite
Goldratt’s (1990) work on the Theory of Constraints, which uses cause and
effect to provide a logical approach to problem solving (Mabin, Menzies et al.
2001a). Cause and effect is also used in McKinsey’s 7-S Framework
(Peters and Waterman 1982), the John Thorp DMR Approach (Thorp 1998),
the French ‘Performance Scorecard’ (Mendoza and Zrihen 2001), Analog
Devices Ltd’s ‘Corporate Scorecard’ (Kaplan and Norton 1992; Kaplan and
Norton 1996) and the ‘Australian Business Excellence Framework’
(Australian Quality Council 2001). Thus we can conclude that cause & effect
is a valid approach to linking objectives to goals, and processes to internal
process perspective objectives.

3.4.5 Impact Factor Summary

In section 3.4 we have described the typical BSC as developed by Kaplan


and Norton (1992). We described the four perspectives that are typically
used in this BSC and also examined the issues within the literature
concerning the measures used in a BSC. It is these measures that appear to
provide the most problems for organisations using the BSC. We saw that the
use of cause & effect to link objectives to goals and objectives to other
business areas is common to many methodologies and considered accepted
practice. A review of the literature relating to implementation issues of the
BSC can be found in the appendix.

It is a common practice in establishing cause & effect to use a heuristic


approach to identify possible linkages (Kaplan and Norton 1992; Kaplan and
Norton 1996; Kaplan 1998; Martinson, Davison et al. 1998; Kaplan and
Norton 2001; Mabin, Forgeson et al. 2001; Mabin, Menzies et al. 2001a).
These links might then be assessed using the anchoring and adjustment
style heuristics method explained earlier. The user of the method would base
judgements on past experience and research. Using this knowledge, they
would assess the probable impact of a process on objectives and objectives
on strategies and strategy on the goal.

Thus far in the literature review we have provided a way of defining an


important or critical process, identifying a critical process using the factors
called dependency, probability of failure and impact. We then select which of
the most critical processes to improve first, using two further factors called
cost/benefit and probability of success. We have examined the BSC method
and suggest that the cause & effect linkages that are developed in a BSC are
suitable for identifying the link between processes and organisational
objectives and goals. These outcomes have provided the answers to the four
key issues identified at the beginning of the literature review. They were:
Q 1. What is the meaning or definition of important processes?
Q 2. What are the factors which are needed to identify an ‘important’
process?
Q 3. What are the factors which are needed to select an ‘important’ process
for improvement?
Q 4. How do we make the link between processes and goals?

Our research objective is to identify processes that are critical and suitable
for process improvement. In order to achieve this objective we have in this
last section suggested that the anchoring and adjustment style heuristics are
used to define the linkages between processes and objectives and that the
same approach is taken to assessing the impact of these linkages.

The following sections introduce the approach which the research team
considers should be taken to assess the factors of dependency, probability of
failure, cost/benefit and probability of success.

3.4.6 Dependency

Using the anchoring and adjustment heuristic requires the practitioner to


base initial judgements on known facts and then assess the factor from this
point. The practitioner requires knowledge or evidence of the effects of failure
for the processes under assessment. This type of knowledge is commonly
experience based. A ten-point ranking system taken from FMEA
documentation can support the decision making by providing a logical list of
possible rankings and explanation of the ranking. Defining the level of effect
for the factor dependency is the adjustment portion of the heuristic approach.


The rankings and their explanation are shown in Table 6 (Stamatis 1995)
(p451).

This figure is not available online.


Please consult the hardcopy thesis
available at the QUT Library.

Table 6- Effect of failure of a process - ranking guidelines (Stamatis 1995)

This table enables the user to identify possible rankings based on their
historical knowledge of the process and its current state. The user applies a
ranking where 1 = least dependent and 10 = most dependent.
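Because Table 6 is not reproduced here, the sketch below uses illustrative anchor descriptions, written in the spirit of FMEA effect-of-failure guidelines, to support the 1-10 dependency ranking; the descriptions themselves are assumptions and not the published scale.

# Illustrative anchor points for the dependency ranking (descriptions assumed).
DEPENDENCY_ANCHORS = {
    1: "Failure of the process has no noticeable effect on the organisation",
    3: "Minor inconvenience; normal operation continues with a little rework",
    5: "Moderate disruption; some customer-facing work is delayed",
    8: "Major disruption; service-level or contractual obligations are breached",
    10: "The organisation cannot operate while the process is unavailable",
}

def nearest_anchor(rating):
    """Return the description of the anchor closest to the chosen rating."""
    closest = min(DEPENDENCY_ANCHORS, key=lambda anchor: abs(anchor - rating))
    return DEPENDENCY_ANCHORS[closest]

print(7, "->", nearest_anchor(7))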

An alternative is to use common risk assessment approaches such as that


suggested by the FMEA methodology. In this style of assessment each
characteristic of a product or process is assessed for possible failure modes
and these failure modes are then assessed for the effect of failure or in
FMEA terms ‘severity’ (Stamatis 1995; Kinetic 1999). This is a complex and
prolonged process but is designed to assess the risk of failure for production
processes and products. Applying it to business processes would produce a
more reliable result but a far greater drain on resources.

The anchor and adjustment heuristic approach is proposed by the research


team as the approach of choice. It is based on historical facts where possible
and takes further refinement from the ‘effect of failure ranking guidelines’.
This approach is also far less time consuming and may produce a similar
outcome to that of a user taking the FMEA or a similar approach.


The next section describes the assessment approach for the probability of
failure factor.

3.4.7 Probability of Failure of a Process

As with dependency the factor ‘probability of failure of a process’ can be


assessed in a number of ways. The suggested approach for this factor is the
anchoring and adjustment style heuristic.

The user should undertake to discover the historical performance of each


process and the number of occurrences of failure (Davidson and Griffin
2000). From this base or anchorage point the user is then able to predict the
probability of failure for each process taking into account the variable
elements of the environment; that is, elements such as the skill levels of
employees and usage of the process. The rating of failure would use a scale
similar to that used for dependency where 1 = least probability of failure and
10 = greatest probability of failure. The scale refers to the possible number of
failures that might occur when using that process and not the category of
failure. The category of failure or the effect of each failure is assessed using
the dependency factor.

Using categories with terms that would possibly suit business processes
would provide the user with a greater knowledge of what to deem failure. In
Figure 15, optimal performance is that area of performance in which the
process is expected to perform. If the process utilises more resources than
expected or provides greater output than needed it would then be considered
an over performing category of failure. This type of failure is similar to sub-
optimal performance in that both types of failure may be considered
acceptable states within an organisation. While making their assessments of
failure, users need to consider that some performance such as sub-optimal
performance may not be considered failure as this type of failure might be
considered within the neighbourhood of acceptable performance.


Over Performing - Exceeds Need: Utilises more resources than needed
Optimal Performance: Operates within the target range
Sub-Optimal Performance: Operates occasionally outside of the target range
Failure: Unable to achieve the target range

Figure 15- Example of the categories of performance.

When failure is defined, then the user is able to assess the probability of that
failure occurring based on the historical data available, possible changes to
the process since the historical data was gathered and their assessment of
the chance of failure in the future.
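As an illustration of anchoring this factor on historical performance, the sketch below converts an observed failure rate into a 1-10 rating and then applies a qualitative adjustment for expected changes; the rate bands and example figures are assumptions made for the example.

def failure_rating(failures, uses, adjustment=0):
    """Anchor: the observed failure rate; adjust: expected change in the environment."""
    rate = failures / uses if uses else 1.0
    bands = [(0.001, 1), (0.005, 2), (0.01, 3), (0.05, 5), (0.1, 7), (0.25, 9)]
    anchor = 10
    for threshold, rating in bands:  # map the historical rate onto a 1-10 scale
        if rate <= threshold:
            anchor = rating
            break
    return max(1, min(10, anchor + adjustment))

# Example: 12 failures in 800 uses, adjusted upwards because usage is growing.
print(failure_rating(failures=12, uses=800, adjustment=+1))  # 6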

Another approach to the assessment of failure is to use scenario analysis


(Viljoen and Dann 2000) (p227-230). In this approach each process would be
considered within many different possible scenarios. The users are required
to consider each facet of the environment: political, economic, socio-cultural
and technological. The probability that the scenario might occur is combined
with the probability of a failure occurring during that scenario. The negative
aspect of this approach is that the user needs to predict all possible likely
scenarios in order to arrive at a reliable assessment. The anchoring and
adjustment approach also attempts to predict the future but bases the
assessment on historical data.

To assess the probability of failure of a process we recommend the use of


anchoring and adjustment style heuristics. The user should first define their
view of failure and then, based on historical data and taking into consideration
the effects of their present and future environment, assess the probability of
failure.

The next section describes the assessment approach for the cost/benefit
factor.


3.4.8 Cost/Benefit

The factor cost/benefit can be assessed in a large variety of ways. The


assessment of cost/benefit is a widespread financial analysis undertaken by
most organisations. The research team suggests that a practitioner wishing
to assess cost/benefit utilise the approach which is recognised within their
organisation. The use of the anchoring and adjustment heuristic in this
assessment is not recommended in light of the many accepted cost/benefit
analysis methods available which provide accurate and valuable results.

There are a considerable number of cost/benefit analysis or investment


analysis methods for the practitioner to choose from, such as:
1. Accounting rate of return;
2. Net present value (NPV);
3. Return On Investment (ROI);
4. Economic Value Added (EVA);
5. Internal Rate of Return (IRR); and
6. Return On Capital Employed (ROCE) (Weiss and Wysocki 1992; Nel
1997; Carnegie, Jones et al. 1998; Botten and McManus 1999; Mah 1999;
Mankiw, King et al. 1999; Meredith and Mantel Jr 2000).

In most assessments of cost/benefit it is first necessary to define the project


scope. The scope of a process improvement project relates to the project
plan. The project plan identifies the scale of the project and the time frame in
which the project should occur. When scope is defined it is then necessary to
identify the resource (human & equipment) components, which both
contribute to the cost of the project (Weiss and Wysocki 1992; Meredith and
Mantel Jr 2000; Jewels 2001). We then have a basis for calculating cost.

Carnegie, Jones, Norris, Wigg and Williams (1998) provide a useful guide to
cost/benefit analysis and suggest that “both the tangible and intangible
benefits” (p823) should be included in any analysis (Carnegie, Jones et al.
1998). The intangible benefits are those attributes which are difficult to
quantify such as goodwill and morale. These benefits, including tangible
benefits such as increased productivity and increased profit, are assessed
and compared to the costs of the project.

A result which shows greater net benefits than costs is considered to have a
positive cost/benefit and is noted for further consideration. Positive cost/benefit
processes then need to be assessed to ascertain the probability that they can
be successfully re-engineered. Those critical processes with high probability
of successful improvement and positive cost/benefit are those which should
be improved first.
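As a brief illustration of one of the techniques listed above, the sketch below applies a net present value (NPV) calculation to a hypothetical improvement project; the discount rate and cash flows are assumptions, and any of the other listed methods could be substituted.

def npv(rate, cash_flows):
    """Net present value: cash_flows[0] is the up-front cost (negative);
    later entries are the expected annual net benefits."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical project: $80k cost now, $35k net benefit per year for three years.
project_npv = npv(rate=0.10, cash_flows=[-80000, 35000, 35000, 35000])
print(round(project_npv))   # about 7,040: positive, so noted for further consideration
print(project_npv > 0)      # True -> next assess the probability of successful improvement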

The next section describes the approach to assessing the last factor for the
targeting method, probability of successful improvement of a process.

3.4.9 Probability of Successful Improvement of a Process

The previous assessment (cost/benefit) provides a ranking of processes with


positive cost/benefit which should then be assessed for the likelihood that
they can be successfully re-engineered. Thus, this section deals with the last
assessment activity for the targeting method, that of assessing the probability
of the organisation successfully conducting the process improvement project.
It is essentially a risk assessment of the process improvement skills of the
organisation or project team.

The use of the anchoring and adjustment style heuristic approach is suitable
for assessing this factor (Davidson and Griffin 2000).

The work of Sedera (2001) is able to support the anchoring and adjustment
process by providing a list of factors that impact on the likely success of a
process modelling project (Sedera 2001). The factors applicable to this
assessment are:
1. Team Orientation;
2. Project Management;
3. Management Support;
4. User Participation;
5. Project Championship; and
6. Communication.

Sedera (2001) has based her work on many of the well supported
frameworks for assessing success in the IS literature, including De Lone and
McLean (1992), Garrity and Sanders (1998), Ballantine, Bonner, Levy, Martin
Munro and Powell (1998), Ishman (1998), Myers, Kappelman and Prybutok
(1998), Seddon (1997), and Jennex, Olfman, Panthawi and Parl (1998).

Suitable historical data would be the success or failure of previous projects


and the impact of each of the Sedera (2001) factors on their success or
failure. The information would then be used for the anchoring phase of the
heuristic. The adjustment phase might also use the Sedera (2001) factors to
provide the user with points of focus in which to judge the possible success
or failure of the processes being assessed. The user is able to adjust the
original data they have collected by using their knowledge and experience
with the members of the organisation to provide an assessment of the
capabilities and support which might be found to influence the success of the
project.

A scale of 1 = least probability of success and 10 = most probability of


success is also appropriate for the ranking.

Alternative approaches based on that used by FMEA practitioners would be


to weight each factor (taken from Sedera (2001)) and derive a score or value
for each project based on the factors (Stamatis 1995). Each factor would
need to be assessed individually and provided with a rating. When this is
complete each factor would then need to be provided with a weighting if the
user did not consider the impact of each factor to be equal. The final rating
for each process would be the mathematical product of the ratings and
weightings.
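A minimal sketch of this weighted alternative is given below; the weights, the ratings and the choice to combine them as a weighted sum are assumptions made for illustration, since the source guidance is not reproduced here.

# Sedera's (2001) factors, each given an assumed weight (summing to 1.0) and an
# assumed 1-10 rating for the candidate improvement project.
factors = {
    "Team Orientation":     {"weight": 0.15, "rating": 7},
    "Project Management":   {"weight": 0.20, "rating": 6},
    "Management Support":   {"weight": 0.25, "rating": 8},
    "User Participation":   {"weight": 0.15, "rating": 5},
    "Project Championship": {"weight": 0.15, "rating": 9},
    "Communication":        {"weight": 0.10, "rating": 6},
}

# Combine each rating with its weighting; the result remains on the 1-10 scale.
probability_of_success = sum(f["weight"] * f["rating"] for f in factors.values())
print(round(probability_of_success, 1))  # weighted score for the assumed figures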


The research team recommends that the anchoring and adjustment style
heuristic approach is considered except in the case where the process
improvement project is so large that it demands an in-depth assessment of
the project’s probability of success.

The following section is a summary of the literature review and the


conclusion of the chapter.

3.5 Summary of Literature Review

This literature review reveals a gap in the area of identifying those processes
which must be focussed on for business process improvement projects. Most
research in this area of the literature tends to be high level and lacking
sufficient depth to direct the user in a practical approach to selecting critical
processes. Defining “critical”, “criticality” and “important” is found
predominantly in the manufacturing and engineering literature. The BSC
literature reveals consensus in the area of linking objectives to strategy and
goals.

The first section provided the criteria for identification and selection which is
the basis of the targeting methodology. Then the literature review identified
some of the approaches a practitioner might use in assessing the five factors
for the targeting methodology. There are many standard and accepted tools
used to calculate some of the factors and a user should understand that the
targeting method provides a level of information suitable for making business
decisions. The quality of this information is an improvement on the current
approach to selecting processes for improvement.

The following chapter describes the targeting methodology. The description
presented is that which the research team used in the action learning cycles.

4 The Targeting Methodology

This chapter explains the steps in the process which are used to identify and
select critical processes for improvement. The chapter describes each step
required for implementation of the targeting methodology as a step by step
approach. Finally, the chapter provides a multi-level text based model of the
targeting method which is the intended approach that the research team
used for the pilot study. This multi-level text based model is used in each of
the case studies to identify changes to the method during the action learning
phases.

This research project does not intend to validate the process over time;
rather, it aims to provide a valid implementation plan for the process and to
improve both the targeting methodology and the implementation through the
cycles of action learning.

The targeting method has five factors, which is two more than Hammer &
Champy (1993) and Davenport (1993). (See Table 7 for a comparison)

Method                                   | Factors / criteria
Targeting Method (Improved Approach)     | Impact; Prob. of Failure; Dependency; Prob. of Success; Cost/Benefit
Hammer & Champy (Hammer and Champy 1993) | Importance for customer; Dysfunction; Feasibility of successful redesign
Davenport (Davenport 1993)               | Develop vision & process objectives; Prioritise according to importance or urgency; Focus on those that conflict with vision

Table 7- Comparing Targeting factors with Davenport (1993) and Hammer & Champy (1993)

It can be argued though, that Hammer and Champy’s work does comprise
the targeting method’s five factors, dealing with the two main areas of
defining criticality and then selection (Hammer and Champy 1993).
Davenport’s factors focus on defining both the positive and negative impacts
of a process alluding to the development of linkages from process to
objectives and goals (Davenport 1993). Davenport (1993) suggests that
selection might be based on importance and conflict with vision. In the
literature review it was shown how the operationalisation of the need to
identify and select critical processes for improvement led to the targeting
method.

The following sections will firstly describe briefly the five factors of the
targeting method and how they are combined to provide a rating or value for
each process under consideration. It will then describe the intended
approach of the research team in implementing the targeting method for the
first action learning pilot case study.

4.1 The Steps for Business Process Improvement Targeting

This section will describe the targeting method on a high level and then
provide a description of the basis of the five factors which make up the
targeting method. Then we will describe the steps which the research team
intend to use when implementing the method. The targeting method has two
main parts, ‘identifying critical processes’ and the ‘selection of which
identified critical processes to improve’. This can be seen in Figure 16 as
broken into the two triangles of three factors each.

[Figure content not reproducible: two overlapping triangles of three factors
each. Impact, Probability of Failure and Dependency combine to form
Criticality; Criticality, Cost/Benefit and Probability of Success combine to
support selection.]

Figure 16- Diagram showing the two main areas of the Methodology; Criticality and Selection

The first triangle (bottom left hand side) in Figure 16 is made up of three
factors;
1. Dependency; the effect of failure of a process on the organisation
2. Probability of failure; the chance that a process will fail
3. Impact; the relative contribution of a process on organisational goals.
These three factors combine to provide a rank order of processes, with the
processes having larger combined assessments for the three factors being
considered as critical. We have termed this combination criticality. It is shown
within the red area of the second triangle.
The second triangle on the right hand side contains criticality and the two
factors concerned with selecting which of the identified ‘critical’ processes
should be improved upon first;
4. Cost/benefit; the analysis of the costs and benefits of improving the
process
5. Probability of success; the chance that the process improvement project
will be successfully completed

Once criticality is defined we are left with a rank order of processes, with the
most critical at the top and least critical at the bottom. The most critical
processes are then assessed for cost/benefit in order to ascertain if
improvement will result in a positive cost/benefit. Those processes with a
positive cost/benefit are then assessed for the probability that the
organisation or project team can successfully improve the process and
achieve the expected benefits. The assessments for criticality, cost/benefit
and probability of successful improvement are then combined to allow the
organisation to make a business decision on which process to improve first.

The diagram below is a representation of the ten step
targeting method in process model form. Each step is indicated by a dotted
horizontal line and a number in a circle. Some steps have more than one part
and these are shown by the rectangular boxes in the centre. The inputs for
the model are on the right and the outputs of the model are on the left.

[Figure content reconstructed as text. Steps, with their inputs and outputs:]
1. Prepare documents. Inputs: terms and definitions; explanation of the
process. Outputs: explanations of terms and process. Introduce the project.
2. Identify processes. Input: Reference Model of ASP service delivery.
Output: list of processes.
3. Assess dependency. Output: assessments of the effect of failure.
4. Assess probability of failure. Output: assessment of the chance of failure.
5. Develop BSC. Input: strategic plan. Output: map showing goals, strategies
and objectives. Define cause & effect. Output: map showing cause & effect
linkages. Link processes to objectives. Input: document listing processes and
objectives. Output: map showing processes linked to objectives.
6. Assess impact. Output: calculated totals for impact.
7. Calculate criticality. Inputs: assessments for dependency, probability of
failure and impact. Output: rank order of critical processes.
8. Assess cost/benefit. Output: list of processes with positive cost/benefit.
9. Assess probability of success. Input: list of processes with positive
cost/benefit. Output: list of processes with positive cost/benefit and high
probability of success.
10. Select processes for improvement. Input: combined list of the assessments
for cost/benefit, probability of success and criticality. Output: rank order
of selected process improvement projects.

Figure 17- The ten step targeting method

We will use the following sections to explain each of the ten steps in sufficient
detail to enable a practitioner to implement the method.

4.1.1 Step One -Preplanning

Before any project begins, the project team needs to prepare introductory
documents for the participants. For the targeting method, the research team had prepared
documents explaining the targeting method, the BSC and the terms that
might be used in the project. In addition to these documents there should
also be an agenda document for each meeting and research into the
participant organisation to provide the project team with a reasonable
background to the organisation.

4.1.2 Step Two –Defining Scope and Introduction

There are two parts within step two. They are the introduction of the project
to the participants and the scoping of the project.

Introduction of the Project


The participants are provided with the explanatory documents developed in
step one and given an overview of the process that will be undertaken to
implement the targeting method. It is similar in detail to that provided in
Figure 17. We used a PowerPoint presentation for this description where
available, and printed slides otherwise. The level of detail is dictated by the
needs of the participants and their experience. The explanation should also
include a description of how the BSC is used within the targeting method.

The next part is to define the scope of the project. Scoping includes decisions
concerning the area of the organisation in which the method will be
implemented and identifying the processes to be assessed. Some
organisations will not have the resources to consider the processes within the
entire organisation and may focus on a particular department or service.


Within this now smaller scope it is possible to define the processes that will
be assessed.

To do this it is necessary to firstly identify the processes that are used in the
entity at a high level.

Identifying Processes
The research team used a high level model (Reference Model of ASP service
delivery) for this task, which initially provides those processes which would
be expected in a typical ASP. The Reference Model of ASP service delivery
shown in Figure 18 depicts a view of the major processes within an ASP
business. Using this reference model organisations’ can add or remove
processes to provide them with a representation of their own environment.

This figure is not available online.


Please consult the hardcopy thesis
available at the QUT Library.

Figure 18-Version 1 Reference Model of ASP service delivery (Taylor 2002)

Each rectangle within the model represents a high level process for many
Application Service Providers. Using the model the organisation can identify
the processes being used in their business. The model helps to initiate
discussion and provide a starting point for the entity to consider their
processes. Processes can be added or removed and names of processes
can also be altered to suit communication within the organisation.

Another approach is for the entity to undertake this task using their
knowledge of the business. They may wish to compile a list of the processes
without using a model.

It is possible that both approaches to identifying processes may result in
some processes being forgotten. Any processes missed during this step will
either be identified at a later stage or are considered of little criticality to the
entity. An entity is an organisation, department, business unit or group.

With the processes identified, the scope of the project recorded and the
explanation of the targeting method and terms complete, we can then move
to step three.

The assessment of the factors which make up criticality can be undertaken in
any order without altering the outcome of the assessment. That is, the
assessment for dependency can be conducted before, after, or in parallel
with the assessments for impact and probability of failure. The practitioner
should choose the order of conducting the assessment to suit the entity
needs.

4.1.3 Step Three: Assessing Dependency

This section outlines the approach taken to conducting step three, the
assessment of dependency.

Dependency is defined as the effect of failure of a process on the
organisation. The factor dependency is measured by assessing the probable
effect of the types of failure of a process on the organisation.

To calculate ratings for dependency an organisation can use a number of
techniques as explained in the literature review; for example, anchoring and
adjustment style heuristics and the FMEA methodology severity assessment.

The first task in this step is to agree on the type of assessment that will be
undertaken. The research team, with the agreement of the case study
participants, chose the anchoring and adjustment style heuristics as more
appropriate to the needs of participants and the research team. A major
influence in this decision was the amount of time available to the participants
and also the complexity of possible alternatives.

Using the anchoring and adjustment heuristic requires the practitioner to
base initial judgements on known facts and then assess the factor from this
point. The practitioner requires knowledge or evidence of the effects of failure
for the processes under assessment and if these effects will be similar in the
future. When using anchoring and adjustment style heuristic decisions, input
by a group of people may be an improvement over that of a few people. A
consensus result can be used or if this is not possible then an average or
mean can be applied.
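
As a small illustration of combining group input, the Python sketch below
returns the consensus ranking where all participants agree and falls back to
the mean otherwise. The process names and rankings are hypothetical.

# A sketch of aggregating participants' dependency rankings (1-10 scale).
from statistics import mean

def aggregate(rankings):
    """Use the consensus ranking if participants agree, otherwise the mean."""
    return rankings[0] if len(set(rankings)) == 1 else round(mean(rankings), 1)

group_rankings = {                    # hypothetical rankings from three participants
    "Help Desk": [8, 8, 8],           # consensus reached
    "Problem Management": [6, 7, 9],  # no consensus, fall back to the mean
}
dependency = {process: aggregate(r) for process, r in group_rankings.items()}
print(dependency)                     # {'Help Desk': 8, 'Problem Management': 7.3}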

A ranking system with 1 = least effect and 10 = most effect is used by the
participants in assessing each process. Participants can apply their own
criteria to the rankings or use the rankings found in the FMEA methodology.

The rankings and their explanation are shown in Table 8 (Stamatis 1995)
(p451).

This Table is not available online.


Please consult the hardcopy thesis
available at the QUT Library.

Table 8- Effect of failure of a process - ranking guidelines (Stamatis 1995)

These rankings from the FMEA methodology use a ten point scale with an
explanation of the type of effect that would be expected for each ranking. The
participants might adjust these rankings by changing the criteria for each
effect to suit their environment.
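
As an illustration only, the sketch below shows the kind of criteria an
organisation might attach to its own ten point dependency scale. Because
Table 8 is not reproduced here, the wording of each criterion is a
hypothetical example, not the FMEA wording of Stamatis (1995).

# Hypothetical criteria for a 1-10 "effect of failure" (dependency) scale.
dependency_scale = {
    1: "No noticeable effect on the organisation",
    3: "Minor effect, absorbed within the team",
    5: "Moderate effect, some rework or customer inconvenience",
    8: "Major effect, service level breaches likely",
    10: "Severe effect, loss of customers or contractual exposure",
}

for rank in sorted(dependency_scale):
    print(rank, "-", dependency_scale[rank])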

The participants then take the list of processes identified in step two and
assess the probable effect of failure of the process on the organisation for
each process. Each process from the list will then have a ranking of between
1 and 10. A table listing the processes down one side and the rankings in a
column can be used to record the results as shown in Table 9.

Process      | Dependency     | Probability of Failure | Impact | Criticality
Process name | Ranking result |                        |        |

Table 9- Example of table used to record assessment results

Within step three we have agreed the approach to be used for the
assessment of the factor, made any adjustment to the approach to suit the
organisation and assessed each of the processes from the list of processes
generated from step two. The results of these assessments are then
recorded in a table listing all the processes and their result, using a scale of
one to ten, with ten being most effect and one being least effect. With this
task complete we turn to step four, the assessment of probability of failure.

4.1.4 Step Four: Assessing Probability of Failure

In this step we assess the probability or chance that a process will fail. The
factor ‘Probability of Failure’ is defined as the chance that a process will fail.
Step four uses a similar approach to that seen in step three, that is, the first
task is to agree the method by which the assessment will be conducted.

The anchoring and adjustment style heuristic is the suggested approach to
this assessment.

The participants should also agree to a definition of failure and the model
below might be a good starting point.

[Figure content: three categories of performance, namely Optimal Performance,
Neighbourhood of acceptable performance, and Failure.]

Figure 19- Categories of performance model

Figure 19 suggests one type of failure, an area of optimal performance and an
area between optimal and failure in which performance is not optimal but is
still considered within acceptable levels. Sub-optimal performance and
over-performance are categories of failure but might be considered to be
within the neighbourhood of acceptable performance where the effect of the
failure is
minimal and thus not a reason for process improvement. When the
participants are agreed on how they will identify failure they can then locate
historical data on the processes being assessed which have failed. This
information will anchor the heuristic in fact and the next step is to use current
experience to adjust the original data taking into account changes in the
environment. Variable elements of the environment such as the skill levels of
employees and usage of the process will influence the outcome of the
rankings.

A ranking system similar to that used for dependency is applied for this
factor. 1 = least probability of failure and 10 = most probability of failure.
Participants can apply criteria to each ranking; for example, 1 = never fails
and 10 = fails more than five times a day
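
To illustrate how the ranking might be anchored in historical data, the
sketch below maps an observed failure frequency to a 1 to 10 ranking. The
frequency bands and the process figures are hypothetical and would be
adjusted to suit the entity's own definition of failure.

# A sketch of anchoring the probability-of-failure ranking in historical data.
def failure_ranking(failures_per_day):
    """Map an observed failure frequency to a 1-10 ranking (illustrative bands)."""
    bands = [(0, 1), (0.1, 3), (0.5, 5), (1, 7), (3, 9), (5, 10)]
    ranking = 1
    for threshold, rank in bands:
        if failures_per_day >= threshold:
            ranking = rank
    return ranking

history = {"Help Desk": 0.2, "Problem Management": 1.5}   # failures per day (anchor)
print({process: failure_ranking(rate) for process, rate in history.items()})

The adjustment phase would then move these anchored rankings up or down to
reflect known changes, such as new staff or increased usage of the process.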

The participants then assess the same processes as were assessed in step
three by recording a ranking in the table used in step three. We will then have
assessments for dependency and probability of failure in the table (Table 10).

Process      | Dependency     | Probability of Failure | Impact | Criticality
Process name | Ranking result | Ranking result         |        |

Table 10- Example of table used to record assessment results

Step four has seen the agreement of participants for the method of
assessment and the categories of failure which will be used to define the
number of probable failures that might occur. We have explained the
anchoring and adjustment style heuristic approach as it applies to this
assessment and how the assessment might be recorded.

The next step in the targeting methodology is the development of a BSC in
which we can identify the linkages of cause and effect between the
processes identified in step two and the objectives, strategies and goals
developed for the BSC. These links are then assessed for their relative
contribution to the organisational goals (impact).

4.1.5 Step Five: Developing a Balanced Scorecard

The third factor (impact) is calculated by first developing a partial Balanced
Scorecard (BSC). It is termed ‘partial’ as there is no need for measures to be
identified. The BSC is used to identify objectives, strategies and goals and
then to identify the cause and effect links which are assessed for their
contribution to organisational goals. Once the partial BSC is completed it is
possible to show linkages between the identified processes and the internal
process objectives within the BSC. A ‘Map’ showing these linkages is then
used as an aid in the assessment of each linkage’s contribution or ‘impact’. In
this way an assessment of impact is calculated.

Goals → Strategies → Objectives → Cause & Effect

To develop a BSC the organisation should first define their goals. There may
be existing formal goals available within the entity and these may be used. In
many cases the development of a BSC is the initiator for the review or formal
description of the goals of the entity. Once goals are described, it is then
necessary to devise strategies which will provide goal achievement. With
strategies in place, the next step is to develop objectives or minor goals.
These objectives are the activities which need to occur for strategies to be
successful.

Kaplan and Norton (1992, 1996 & 2001) say that organisations should use
perspectives in order to ensure that all relevant views of an organisation are
taken into account. This is what the balance is concerned with in the
balanced scorecard (Kaplan and Norton 1992; Kaplan and Norton 1996;
Kaplan and Norton 2001). Perspectives can be the original four perspectives
used by Kaplan and Norton (1992); financial perspective, customer
perspective, internal process perspective and learning and growth
perspective (Kaplan and Norton 1992).

The research team have used different names for some perspectives and
also added a fifth perspective within some organisations. The rule here is to
use appropriate names and further perspectives as an aid to communication
within the entity and to provide a balanced view for developing objectives
(Huxley, Taylor et al. 2002a). Almost all managers today understand the
need for a view of the organisation which is broader than the purely financial.

The final step in developing this partial BSC is to define the cause & effect
relationships between the objectives, strategies and goals. This is usually a
‘rule of thumb’ or heuristic approach in which the developers of the BSC ask
themselves if there is an effect by a particular objective on the other
objectives, strategies or goals. This can be shown by using arrows and in
some software packages the arrows have different thicknesses to define the
strength of effect (Scheer 2000). For the targeting method it is only
necessary at first to show the cause & effect relationships, not to value them.

Once the cause & effect analysis is complete, it is time to link processes to
internal process perspective objectives. The research team considered that
the internal process perspective objectives should be the only objectives
linked to processes. We considered that there was a logical link between
internal process perspective objectives and processes, and could not find
guidance from the literature due to the innovative nature of the method. The
case studies indicate that this view may not be useful for all BSC’s.

The internal process perspective objectives are then linked to the original list
of identified business processes, developed in step two, by using the same
approach to cause and effect. The entity asks – which processes have an
impact on which internal process perspective objectives? A single process
may have relationships with many objectives.

An example of cause and effect relationships for processes to internal
process perspective objectives can be seen in Figure 20. Processes are in
red text and these processes have cause & effect relationships with the
internal process perspective objectives which are identified in green text.

[Figure content not fully reproducible: a portion of a BSC map for a typical
ASP. Objectives such as increase utilisation of low cost solution processes,
reduce non-productive time, reduce unit costs, proceduralise existing informal
processes, develop support team model and reduce activity complexity are
linked to processes such as incident reporting, reporting (internal), problem
management, release management, change/enhancement management, help desk,
configuration, manage capacity, monitor service levels, quality processes and
billing (output).]

Figure 20-Example of processes and or objectives which needed to be considered together

Figure 20 is a portion of a BSC ‘map’, taken from one of the case studies, in
which the processes from the reference model for ASP service delivery are
shown in red text. These have been linked, using heuristics, to the internal
process perspective objectives shown in green text. The blue text on the far
left indicates a financial perspective objective. The blue boxes in the map are
inserted to show the processes and objectives which are taken into
consideration when assessing the relationships between processes and the
objective to the left. The relationships are the contribution of the processes in
assisting the achievement of the objectives. For example; in box one (1)
there are four red text processes relating to one internal process perspective
objective.

The use of heuristics is the common approach to identifying relationships
between the elements of a BSC. The participants identify these relationships
using knowledge gained from experience and research. Ideally the decision
should be based in fact and adjusted to suit the context of the organisation.

The diagram shown in Figure 20 is developed using Mind Jet software called
‘Mind Mapper’. The research team sought an improved approach to
visualising the linkages within a BSC and found that this software, while not a
100% solution, was far more useful than other approaches it had tested or
studied. The software essentially uses a branch network to visualise linkages
and within the diagram further linkages can be added using U shaped
arrows. Colour is easily added as well as icons and ‘floating’ text (text not
attached to a branch). These elements provide a map that offers a number of
levels of communication, reducing the complexity of a BSC. The case studies
relate the story of the introduction of this software.

In step five we have developed a BSC to identify the goals, strategies and
objectives of an organisation. We have then identified the cause and effect
linkages within this BSC and shown them visually. The last task of this step is
to identify the processes which relate to the internal process perspective
objectives and add these to the visual map being developed.

When the relationships within this partial BSC are completed it is now
possible to take the next step of providing an impact rating to these
relationships.

4.1.6 Step Six: Assessing Impact

In step six we assess the impact factor, which is defined as the relative
contribution of a process on organisational objectives and goals. In order to
measure this factor we assess each of the cause & effect linkages developed
in the BSC from step five. These linkages start with the processes and end at
the goal or goals by following the branch structure. The linkages are

Craig Huxley Page 88


A Member of the Centre for Information Technology Innovation

calculated for each branch along which they impact and then added where a
process impacts more than one objective. The calculation is explained later
in this section.

This approach is similar to those used in many BSC software systems, where
relationships are classified as high, medium and low impact (CorVu 1999;
Scheer 2000). In our methodology the user is asked to provide a percentage
score of the impact of the focal process in relation to all the other processes
or objectives which are shown to impact on an individual objective.
Figure 21 uses blue rectangular boxes to indicate how this works.

[Figure content reconstructed as text:]
Box 1, impacting the objective ‘Proceduralise existing informal processes’:
Help Desk 30%; Incident Reporting 20%; Quality Processes 20%; Problem
Management 20%
Box 2, impacting the objective ‘Develop support team model’: Incident
Reporting 35%; Quality Processes 15%; Problem Management 25%; Release
Management 25%
Box 3, impacting the objective ‘Reduce activity complexity’ (40%):
Proceduralise existing informal processes 15%; Develop support team model
30%; Incident Reporting 5%; Monitor Service Levels 5%; Problem Management
10%; Release Management 10%; Quality Processes 5%; Billing (Output) 10%;
Change/Enhancement Management 10%

Figure 21- Portion of the previous example, of a BSC Map, showing impact assessments

The map is read from right to left with the processes identified by the
organisation shown in red text and to the right of the map and the objectives
they link to in green text to the left of the map.

Box 1 at the top of the diagram has four processes: help desk, incident reporting,
quality process and problem management, which all have been assessed as
impacting on the objective to its left, proceduralise existing informal
processes. Proceduralise existing informal processes, develop support team
model and the seven processes within box 3 are all assessed as impacting
upon the objective “reduce activity complexity”. The rating for impact is
shown to the right of each process and objective.

The ratings given to each of the processes or processes and objectives within
a blue box (1, 2 & 3) should not in total add to more than 100% but can
be less (box 1 shows impact of 90% for the four processes). It is possible that
there are other objectives or processes which might impact on the individual
objective which are not mentioned and may contribute in a minor way.
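
This constraint can be checked mechanically. The short Python sketch below
totals the illustrative assessments for box 1 (which sum to 90%) and raises
an error if a box's assessments were to exceed 100%.

# Check that the impact assessments feeding one objective do not exceed 100%.
box_1 = {"Help Desk": 30, "Incident Reporting": 20,
         "Quality Processes": 20, "Problem Management": 20}

total = sum(box_1.values())
assert total <= 100, f"impact assessments total {total}%, which exceeds 100%"
print(f"Box 1 total impact: {total}%")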

It should be noted that some processes are assessed as impacting on more
than one objective. In Figure 21 the process ‘problem management’ is shown
as impacting on three objectives: proceduralise existing informal processes,
develop support team model, and reduce activity complexity. Each branch
along which the process problem management impacts is assessed
individually and the totals for each branch are added together.

At this point if there are processes which have been forgotten they can be
added to the map. In the research team’s experience, the use of a rating
system of between 1% and 100% is more effective than 1 = least and 10 =
most as it allows for increased differentiation of the cause & effect
relationships. It is also possible to ignore the use of percentages and opt for
the equivalent in decimal places; for example, 20% is equivalent to 0.2.

Once these percentages are applied to the map they can then be calculated
to provide a ranking for each process.

For example, in Figure 22 the organisation would multiply the 20%
assessment for problem management by the 15% assessment for
proceduralise existing informal processes. The result is the impact of the
process problem management on the objective ‘reduce activity complexity’.

[Figure content reconstructed as text: Problem Management (20%) impacts the
objective ‘Proceduralise existing informal processes’ (15%), which in turn
impacts the objective ‘Reduce Activity Complexity’; other processes and
objectives (Process A to Process F, Objective 2) impact along the same
branches.]

Figure 22- Calculating along the branch

The same procedure is extended along each branch from process to goal,
resulting in n number of % assessments multiplied by each other. For
example, in Figure 22 the calculation would start with 20% x 15% x …

An Excel table is a suitable approach to recording and calculating the totals
for the assessments. Rows list the processes and columns are used to
record each % assessment along an individual branch. Table 11 shows the
layout and part of the formulae that might be used to calculate the impact of
each of the processes shown in column B.

A | B                              | C            | D            | E            | F
  | Process                        | Branch 1     | Branch 2     | Branch 3     | Total = addition of each column
1 | Help Desk                      | = (%*%* ….)  | = (%*%* ….)  | = (%*%* ….)  | = (C2+D2+E2+..)
2 | Incident reporting             | = (%*%* ….)  | = (%*%* ….)  | = (%*%* ….)  | = (C3+D3+E3+..)
3 | Quality Processes              | = (%*%* ….)  | = (%*%* ….)  | = (%*%* ….)  | = (C4+D4+E4+..)
4 | Problem Management             | = (%*%* ….)  | = (%*%* ….)  | = (%*%* ….)  | = (C5+D5+E5+..)
5 | Release Management             | = (%*%* ….)  | = (%*%* ….)  | = (%*%* ….)  | = (C6+D6+E6+..)
6 | Monitor Service Levels         | = (%*%* ….)  | = (%*%* ….)  | = (%*%* ….)  | = (C7+D7+E7+..)
7 | Billing (Output)               | = (%*%* ….)  | = (%*%* ….)  | = (%*%* ….)  | = (C8+D8+E8+..)
8 | Change/Enhancement Management  | = (%*%* ….)  | = (%*%* ….)  | = (%*%* ….)  | = (C9+D9+E9+..)
Table 11- Excel table showing possible formula for calculating impact
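
The same calculation can be expressed outside a spreadsheet. The Python
sketch below multiplies the percentage assessments along each branch and sums
the branches for each process. The branch figures are illustrative only,
loosely based on the example in Figures 21 and 22, and are not taken from an
actual case study map.

# A minimal sketch of the impact calculation performed by the table above.
def branch_impact(percentages):
    """Multiply the % assessments along one branch (e.g. 20% x 15% x ...)."""
    impact = 1.0
    for p in percentages:
        impact *= p / 100.0
    return impact

# each inner list is one branch of % assessments from a process up to a goal
branches = {
    "Problem Management": [[20, 15, 40], [25, 30, 40], [10, 40]],
    "Help Desk":          [[30, 15, 40]],
}

impact = {process: sum(branch_impact(b) for b in bs) for process, bs in branches.items()}
for process, value in sorted(impact.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{process}: {value:.3f}")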

We have now calculated the ratings for the impact assessments of each
process on the organisational goal (or goals). The result of these calculations
will be a list of processes, with a value indicating the probable impact of the
process on the organisation’s goal or goals. Those processes with the largest
value result are considered to have the greatest impact.

In step six, we have taken the output of step five, which was, the identified
linkages between processes and goals using cause & effect. These linkages
were then assessed for their relative contribution to the objective, strategy
and goal along each branch (impact). The results of these assessments were
then multiplied in order to arrive at a total value for each branch within the
BSC. Where a process impacted more than one objective, the total value for
each branch along which the process impacted was added. The grand totals
indicated which processes had the greatest impact on organisational goals.

The grand totals for impact are then added to the table used to record the
results of the assessments for the criticality factors (dependency, probability
of failure and impact). Table 12 below is an example of the table.

Process      | Dependency     | Probability of Failure | Impact         | Criticality
Process name | Ranking result | Ranking result         | Ranking result |

Table 12- Example of table used to record assessment results

Thus far we have assessed the factors dependency, probability of failure and
impact of each of the processes identified in step 2. The next step requires
that we calculate the criticality, which is the product of the factors assessed
so far.

4.1.7 Step Seven: Calculating Criticality

In step seven we will calculate criticality which is the mathematical product of
impact, probability of failure and dependency. The definition of criticality as
taken from the literature review is “a critical process is one which has the
greatest positive contribution or negative effect on the organisation over
time.” Thus we suggest, as shown in the literature review that criticality is the
product of impact, probability of failure and dependency.

The relationship between each of the three factors which make up criticality
is important to the treatment of the factors during the calculation of criticality.
This research project has not sought to determine whether the relationships
between the three factors that combine to form criticality in the targeting
methodology are dependent or independent of each other. When calculating
the values for each process the method thus requires that each factor is
multiplied rather than added in the same way as the FMEA methodology
operates (Stamatis 1995; Kinetic 1999). Adding the factors suggests that
they are independent of each other (Kinetic 1999).

Step seven then requires that we return to the table, in which the results for
each of the assessments for the processes have been recorded.

Row | A            | B              | C                      | D              | E
2   | Process      | Dependency     | Probability of Failure | Impact         | Criticality
3   | Process name | Ranking result | Ranking result         | Ranking result | =(B3*C3*D3)

Table 13- Example of table used to calculate criticality

The ranking result for each process is placed into the formula shown in cell
E3 of Table 13 above. The output of this calculation is a list of processes
ranked by their criticality. That process with the greatest value in comparison
to the other processes is considered to have the greater criticality.
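
A minimal sketch of this calculation is shown below. The dependency,
probability of failure and impact values are illustrative; the criticality of
each process is simply their product, after which the processes are listed in
rank order.

# Criticality = dependency x probability of failure x impact (illustrative values).
assessments = {
    # process: (dependency 1-10, probability of failure 1-10, impact total)
    "Help Desk":          (8, 4, 12),
    "Problem Management": (6, 7, 25),
    "Release Management": (5, 3, 9),
}

criticality = {p: d * f * i for p, (d, f, i) in assessments.items()}

# rank order, most critical process first
for process, value in sorted(criticality.items(), key=lambda kv: kv[1], reverse=True):
    print(process, value)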

There are some issues that need to be discussed at this point in regards to
the final ranking of the processes in terms of criticality.
Weighting of the three factors:
Thus far we have two factors (dependency and probability of failure) that are
rated out of ten and the third factor (impact), which is a percentage (out of
100). Though it may appear that impact is ten times more important than
dependency and probability of failure, our sub objective here is to derive a
rank order; a list of processes from most critical (highest rank) to least critical
(lowest rank). This difference of scale between impact (1-100), dependency
(1-10) and probability of failure (1-10), will not alter the rank order of
processes in the list. The impact scale (1-100) was selected to make the
decision of assessing impact as simple and logical as possible.

Each organisation might adjust the weighting of each of the three factors to
suit their perception of the importance of one factor in comparison to the
others. This was not suggested by the research team as we are unable to
provide literary evidence of the reasoning behind changes of this nature.

If the results thus far in the targeting method were similar to the example
shown in Table 14, then there is an argument that both Process A and Process
B should be considered critical.

Process   | Dependency | Probability of Failure | Impact | Criticality
Process A | 10         | 1                      | 1      | 10
Process B | 2          | 3                      | 7      | 42

Table 14- Example of threshold issue

The argument suggests that Process A should be considered critical because
of the very large effect of a failure of the process, that is, the level of
dependency the organisation has on the process. In this example, the
probability of failure assessment is almost negligible, and the assessment
for impact on goals is also very low.

In real organisations it may not be possible to know a priori whether this
type of result (a pathological case) will occur, and this requires further
research. In any case
it is necessary to consider each process being assessed for criticality in the
light of the combined results. As previously stated in the literature review,
criticality is the product of impact, dependency and probability of failure.

It may be useful to some organisations to set a threshold figure for each
assessment result; any process exceeding the threshold would then be
considered for assessment by the final two factors (probability of successful
improvement and cost/benefit).

In this step we have brought together the outputs of the steps which provided
assessments for dependency, probability of failure and impact. These
rankings have then been used to provide a criticality value for each of the
processes. They can then be placed into a list of ascending order with the
largest criticality value at the top.

The next step is designed to assist the organisation in selecting which of the
most critical processes it should examine for possible process improvement.

4.1.8 Step Eight: Assessing Cost/benefit

This step is concerned with the assessment of cost/benefit of a process in
order to uncover the value to the organisation in conducting a process
improvement project on a process. If a process improvement project is
considered to be an investment of an organisation’s resources then
economic theory states that only those investments which add positive value
should be undertaken (Carnegie, Jones et al. 1998; Dollinger 1999; Mankiw,
King et al. 1999).

The definition of cost/benefit is the comparison of costs associated with and
caused by the project and the benefits derived from and affected by the
project. A cost/benefit analysis which provides greater value of benefit than
costs is considered to have positive cost/benefit and should be further
considered for improvement.

The steps so far have provided the entity with ratings for the ‘criticality’ of
each process. It is entirely possible that at this point an entity might decide to
improve half or a few of those processes with the highest criticality rating.
The organisation should decide at this point how many processes it can
afford to conduct the assessment of cost/benefit upon.

Initiating a cost/benefit analysis on a large number of processes can be an expensive
and time consuming exercise in itself. Thus it is suggested that some form of
‘culling’ is performed before starting this step. The entity may at this point
also choose to ignore the steps eight, nine and ten and improve processes
purely on their ‘criticality’ rating. Where the processes are complex and there
are many to choose from, it is suggested that the entity take into
consideration a more in-depth approach as suggested by steps eight, nine
and ten.

The suggested approach to assessing this factor is to use one of the
common cost/benefit methods presently available and described in many
financial texts. An organisation that presently employs a cost/benefit analysis
should follow their accepted method. Most approaches should take into
account both tangible and intangible benefits over an agreed time frame as
well as the costs accrued in the improvement project and any costs incurred
by other areas due to the changes made to the process (Carnegie, Jones et
al. 1998) (page 821).
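
As one hedged example of a standard technique, the sketch below applies a
simple net present value calculation. The cash flows, discount rate and time
frame are illustrative assumptions; an organisation would substitute its own
accepted cost/benefit method and figures.

# A simple net present value (NPV) sketch for a proposed improvement project.
def npv(rate, cash_flows):
    """Discount a list of yearly net cash flows (year 0 first) at the given rate."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

improvement_cost = -120_000                    # year 0: project and change-over costs
yearly_net_benefit = [45_000, 60_000, 60_000]  # tangible plus estimated intangible benefits

value = npv(0.10, [improvement_cost] + yearly_net_benefit)
print(f"Net present value: {value:,.0f}")      # a positive value goes forward to step nine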

A positive value would be a result showing that the benefits were financially
greater than costs. A negative value would be associated with costs being
greater than benefits. Those processes with positive value are then taken to
step nine to be assessed for the probability that the organisation or project
team can successfully improve the process. This assessment is the risk of
not receiving a desired return.

In this step we have reduced the number of processes to be considered by
deciding how many processes should be assessed for their possible value
after improvement. Cost/benefit analysis can be costly and there may be
many processes. We have suggested that organisations use a standard
cost/benefit analysis tool that is accepted within their organisation. The step
also stated that those processes that after assessment have a positive
cost/benefit should be considered in step nine.

In step nine we assess the likelihood that the organisation can successfully
improve each of the positive cost/benefit processes.

4.1.9 Step Nine: Assessing the Probability of Successful Improvement of a Process

Step nine, “Probability of Success”, is defined as the assessment of the
likelihood that the organisation can successfully re-engineer a process
providing the expected benefits within the expected time and cost.
The probability of successful improvement of a process can be completed by
using risk analysis techniques. The suggested approach in this methodology
is to use the anchoring and adjustment style heuristics.

The work of Sedera (2001) is able to support the anchoring and adjustment
process by providing a list of factors that impact on the likely success of a
process modelling project (Sedera 2001). The factors applicable to this
assessment are;
1. Team Orientation
2. Project Management
3. Management Support
4. User Participation
5. Project Championship
6. Communication (Sedera 2001)

The anchoring and adjustment style heuristic requires that the user base their
assessment on reliable data. The user then adjusts that data taking into
consideration their current knowledge of the elements that might impact upon
the probable success of the project.

Suitable historical data the user might seek information on would be the
success or failure of previous projects and the impact of each of the Sedera
(2001) factors on their success or failure. The information would then be
used for the anchoring phase of the heuristic. The adjustment phase might
also use the Sedera (2001) factors to provide the user with points of focus by
which to judge the possible success or failure of the processes being
assessed. The user is able to adjust the original data they have collected by
using their knowledge and experience with the organisation to provide an
assessment of the capabilities and support which might be found to influence
the success of the project.

A scale of 1 = least probability of success and 10 = most probability of
success is appropriate for the ranking. Participants can apply criteria to each
level of ranking if this assists in the decision making.

The result of this assessment is that each of the now reduced list of
processes is provided with a rating which signifies their probable chance of
being successfully re-engineered.

In step nine we have assessed the likelihood of the processes with positive
cost/benefit being successfully re-engineered. The assessment was assisted
by the factors developed in Sedera’s (2001) study of the literature. The
anchoring and adjustment style heuristic was suggested as the method of
choice and historical data concerning the success or failure of previous
similar projects was also required. The user then adjusts this original data
with their knowledge and experience of the organisation and rates each
process in the light of the probable outcomes. We are then provided with a
list of processes ranked according to most likely successful improvement.

The next step is the final step of the targeting method, requiring the
organisation to make a business decision based on the output of the previous
three steps (steps seven, eight and nine).

4.1.10 Step Ten: Selecting Which of the Critical Processes to Improve First

In step ten we use the outputs of the assessments for cost/benefit and
probability of success combined with the rating for criticality, to decide which
processes to improve first. Much of the decision making in this step is
dependent on the groupings of the results. If results of each of the
assessments are closely grouped then the decision making may be difficult if
a further reduction in the selection process is needed.

Ultimately, the decision is a business decision as the results of the
methodology are indicative only. Due to the use of heuristics, the quality of
assessments is necessarily based on the skills and knowledge of those
people conducting the assessment.
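
To illustrate how the outputs might be brought together to support that
business decision, the Python sketch below removes processes without a
positive cost/benefit and orders the remainder by probability of success,
then cost/benefit, then criticality. The processes and values are
hypothetical, and this ordering rule is only one possible reading of step
ten, not a prescribed formula.

# Combine the outputs of steps seven to nine into a shortlist for decision making.
candidates = [
    # (process, criticality, cost/benefit value, probability of success 1-10)
    ("Problem Management", 1050, 85_000, 7),
    ("Help Desk",           384, 40_000, 9),
    ("Release Management",  135, -5_000, 5),   # negative cost/benefit, dropped below
]

shortlist = [c for c in candidates if c[2] > 0]
shortlist.sort(key=lambda c: (c[3], c[2], c[1]), reverse=True)

for process, crit, cb, prob in shortlist:
    print(f"{process}: criticality={crit}, cost/benefit={cb}, prob. of success={prob}")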

It is possible that the selected process may be far too large to consider
improvement of the process in its entirety. If this is the case then the
organisation might identify the lower level processes within the larger process
and perform the identification and selection process again at this level. Once
the critical processes have been identified at a granularity that is suitable for
improvement then the entity can undertake a suitably targeted process
improvement project.

In this step we have suggested that the final decision making for the selection
process is not based on scientific fact but the quality of the assessments
used during the targeting method. We have made the decision of which to
improve first based on the outcomes of the assessments of cost/benefit and
probability of successful improvement of the process and the criticality rating
for the processes assessed in steps eight and nine. We have also suggested
that if the chosen process is very large and complex then the organisation
might undertake the process again using the lower level processes within the
larger complex process.

The next section of this chapter is a ten step, text based, model of the
targeting method followed by a more detailed explanation of each step. This
model will be used at the end of each action learning cycle to indicate the
changes made by the research team as a result of the data collected during
and after the case studies conducted within each action learning cycle. The
model identifies the ten steps already discussed and includes major activities
within each of the ten steps. The second part of this model provides further
detail of the methodology.

4.2 Version 1 of the Targeting Method


1. Preplanning
1.1. Assessing participants
1.2. Preparation of any documents
2. Defining Scope
2.1. Introduction of the project as a whole to the project team
2.2. Identify the processes
3. Assessing Dependency; the effect of failure of a process on the
organisation
3.1. Agree on the method to be used for assessing dependency
3.2. Identify the criteria to be used and rate each process
4. Assessing Probability of Failure of the Process
4.1. Agree on the method to be used for assessing probability of failure
4.2. Identify the criteria to be used and rate each process
5. Developing a Balanced Scorecard (BSC)
5.1. Identify the goals and strategies and objectives of the entity
5.2. Identify the cause & effect linkages within the BSC
5.3. Link processes identified earlier (2.2) to internal process objectives
6. Assessing the Impact of Processes on Goals
6.1. Assess the impact of each process on goals using heuristics and total
7. Calculate the Criticality of each Process
8. Assess the Cost/Benefit of Improving the Process
8.1. Agree on the method to be used for assessing cost/benefit
8.2. Identify the criteria to be used and analyse each process selected
9. Assess the Probability of Successful Improvement of the Processes with
positive cost/benefit
9.1. Agree on the method to be used for assessing probability of success
9.2. Identify the criteria to be used and rate each process
10. Selection of which Critical Process to Improve First
10.1. Rank order the processes with positive cost/benefit by greatest
probability of successful improvement. Those processes with the
greatest probability of success and greatest cost/benefit should be
improved first.

An explanation of each part of the generic 10 part process


1. Preplanning
1.1. Assessing Participants
This step involves some research of the participants and their organisation. If
possible, initial contacts should be used to assess the knowledge of the
participants and who should attend the first implementation meeting.
1.2. Preparation of any documents
Before the first meeting of the project team there are a number of documents
which should be produced. The first is a simple outline of the targeting
methodology that is to be used. The second is the terms and meanings of the
words within the BSC and targeting method. The third document is an
agenda.

2. Defining Scope
2.1. Introduction of the project as a whole to the project team
Discussion of the whole project is started here. The level of detail depends
on the number of participants and the time available. It should be verified
that the participants are familiar with the BSC and its terms. Agreement
should also be reached as to the lines of communication and confidentiality.
Define the business area in which to conduct the project. (Not all targeting
projects are implemented for the whole organisation.) The time frame of the
project should then be assessed and the roles and responsibilities for the
project outlined.
2.2. Identify the processes applicable to the project
The processes and the level at which they should be assessed are identified
at this point.

3. Assessing Dependency, which is the effect of failure of a process on the


organisation
3.1. Agree on the method to be used for assessing dependency
The generic method suggests the use of the anchoring and adjustment style
heuristic for this task. Assessment of the process is in relation to each of the
other identified processes.

3.2. Identify the criteria to be used to rate each process and conduct
the assessments

4. Assessing Probability of Failure of the Process


4.1. Agree on the method to be used for assessing probability of failure
The use of anchoring and adjustment style heuristics is suggested.
4.2. Identify the criteria to be used to rate each process
Define the organisation’s understanding of what is meant by failure. This will
enable the user to determine when a process has failed historically and if it
will fail in the future.

5. Developing a Balanced Scorecard (BSC)


5.1. Identify the goals, strategies and objectives of the entity in the
project
If necessary, an explanation of the BSC is given, and the goals and
strategies of the entity are provided.
the strategies and place these objectives within the agreed perspectives of
the BSC.
5.2. Identify the cause & effect linkages within the BSC
Using the experience and skills of the project team (a heuristic approach),
identify the cause and effect linkages within this BSC. If necessary add
objectives to allow for possible causes and effects.
5.3. Link processes identified earlier (2.2) to internal process objectives
With the processes identified earlier (or now), the project team link those
processes which impact upon the internal process objectives within the BSC.
Provide a visual ‘map’ of the BSC for the participants to assess for
correctness of the work so far and make changes from any feedback.

6. Assessing the Impact of Processes on Goals


6.1. Assess the impact of each process on goals
Using anchoring and adjustment style heuristics, assess the impact of each
process on the objectives, strategies and ultimately the goals of the entity.
Each link is assessed one at a time but in relation to all of the processes
and objectives impacting on the objective, strategy or goal. Calculate totals
for each process by multiplying the percentages together along the links and
adding where a process impacts more than one objective.

7. Calculate the Criticality of each Process


Multiply the ratings or values for impact, dependency and probability of
failure. This total is the criticality of each process. Higher values indicate
higher criticality than lower values.

8. Assess the Cost/Benefit of Improving the Process


8.1. Agree on the method to be used for assessing cost/benefit
The organisation should first decide how many of the most critical processes
it will assess for cost/benefit. If the projects are very small then heuristics
may be suitable, otherwise the suggested approach is to use your
organisation’s current cost/benefit analysis method.
8.2. Identify the criteria to be used to rate each process
Dependent on the approach taken above in 8.1, the criteria would include:
costs for resources and non-project related incurred costs, tangible and
intangible benefits. Only those processes with a positive cost/benefit would
then be assessed for the probability of successful improvement.

9. Assess the Probability of Successful Improvement of the Process


9.1. Agree on the method to be used for assessing probability of
success
The suggested approach is to use the anchoring and adjustment style
heuristics.
9.2. Identify the criteria to be used to rate each process
Some criteria which may support the decisions are: Team Orientation,
Project Management, Management Support, User Participation, Project
Championship and Communication.

10. Selection of which Critical Process to Improve First


10.1.Rank order the processes with the greatest probability of successful
improvement.

Those processes with the greatest probability of successful improvement and
the best cost/benefit ratio should be selected first for improvement. This is
essentially a business decision. The rank order is not a scientific approach to
selecting which processes to improve first. It is an improved approach to
current practice.

4.3 Targeting Methodology Summary

This chapter has described the steps that need to be taken to implement the
targeting methodology. It has initially provided the breakdown of the factors
used in the method and then described the approach which was taken with
the methodology for the Action learning case studies.

A ten step methodology for determining the value of each of the five factors
(dependency, probability of failure, impact, cost/benefit and probability of
success), was presented. This methodology is unique in that it combines the
BSC method with the work of Carpinetti, Gerlamo & Dorta (2000) and the
research and methodologies focussed on failure in the manufacturing and
nuclear industries. This combination has provided a method for assessing the
criticality of a process. The method is also unique in providing an improved,
practitioner-ready method of selecting which of the hundreds of processes
within an organisation to improve first.

Impact determination uses the BSC approach in order to identify the links
between processes and objectives, strategies and goals.

Impact is defined as the relative contribution of a process to organisational
goals. Impact is calculated as the product of the percentage impacts of a
process along the links of that process to the goal. The links are assessed
using anchoring and adjustment style heuristics.


Dependency is defined as the assessment of the effect of failure of a process
on the organisation. Dependency is measured as a ranking of the probable
result of the effects of failure of a process on a 1 to 10 scale using
heuristics.

Probability of failure is defined as the chance that a process will fail during a
future time frame. The measure for this factor is a 1 to 10 rating of the
likelihood of failure, assigned using heuristics.

These elements then establish a rank order of the processes in terms of
their criticality. Criticality is simply the product of impact, dependency and
probability of failure. For the most critical processes, further analysis is
required in terms of the cost/benefit of improving each process and the
likelihood of the organisation successfully improving the process (probability
of success).

Cost/benefit is defined as the comparison of costs associated with and
caused by the project against the benefits derived from and affected by the
project. The analysis of cost/benefit uses an organisation's current approach
to cost/benefit analysis for investment decisions.
Probability of success is defined as the likelihood or chance that the
organisation can successfully improve a process and realise the expected
benefits within the expected time and costs. This factor uses anchoring and
adjustment style heuristics to assess the skills and determination of the
organisation to successfully improve the process, achieving the benefits in a
timely and cost effective fashion.

Only those processes that:
1. are most critical,
2. have a positive cost/benefit, and
3. have a strong possibility of being successfully improved
are then analysed for selection.


These are all business decisions. The application of this methodology within
ASP businesses using the case study approach is the test of the targeting
methodology.

The following chapter describes the approach to the research data collection
and justifies the research methods used to test the implementation of the
targeting methodology. It also justifies the approach taken to improving the
method using action learning case studies.


5 Research Method and Design

Chapter 5 describes the research process required to achieve the research
objectives. There was a need to test the method in a number of organisations
and to refine the method after each test. In order to ensure that the research
was beneficial to the business participants it needed to be focussed on an
area of importance to them. These requirements were bounded by an 18
month time frame and limited resources. In addition, each organisation was
only able to provide limited time and personnel for the project.

Figure 23 shows the three main phases of the research approach for the
research project.

Phase 1: Model Development (literature review). Phase 2: Identify Critical
Processes (focus group and Delphi study). Phase 3: Test and Revise (cycles
of act, observe, reflect and refine, using action learning with a pilot study (PS)
and case studies C1 to C3).

Figure 23-Model of research approach

Phase 1 was model development using the literature review as the guide.
The second phase (identify critical processes) was the development of the
generic definition of a critical process and identifying ‘critical processes’ using
a focus group and Delphi study. This approach also provides appropriate
business focus for the final phases. The final phase (3) called ‘test and
revise’ uses action learning with a pilot case study and three case studies
implementing the methodology developed in chapter four. This third phase
uses a cyclical approach of act, observe, reflect and refine. The cycle occurs
for each of the case studies.


5.1 Data Collection Phase Description

General area of interest
Literature Review: increased knowledge of the problem area, of weaknesses
in the research and of approaches to the research; allows definition of the
research question (with a need to reduce scope).
Focus Group: qualitative data (ideas, thoughts, examples); outputs: a
definition of a critical process and examples of these.
Delphi Study: quantitative data (assessment of the perceived criticality of a
process); outputs: processes assessed for criticality.
Action Learning Cycles using a pilot case study and three case studies (four
cycles of implement, observe, reflect and revise): qualitative data (ideas,
issues, lessons); outputs: a tested and improved targeting method.
Thesis: detailed analysis of the collected data, issues arising, lessons learnt
and further research directions.

Figure 24- Diagram of the Research approach


5.2 Design of Data Collection

This section of the research approach makes the case for using the tools
applied in this research project and also examines these tools, which were
used to assist in answering the research questions.

Yin (1994) states that the goal of selecting the appropriate research method
is to “avoid gross misfits; that is, when you are planning to use one type of
strategy but another is really more advantageous” (Yin 1994) (P4). Yin (1994)
is suggesting that there may be a number of viable approaches to answering
the research questions and achieving the research objectives. Thus, the
choice of research strategy is one of choosing the most suitable to the needs
of the research team.

Considering the objective is to improve and test the existing targeting method
so that it is useful for practitioners, the questions we asked ourselves
were:
1. How do we improve the method?
2. How do we know what is flawed?
3. Why is it flawed?
4. How do we know what the change should be?
5. Why is the proposed change appropriate?

These questions, according to Yin (1994), are more explanatory type
questions and are “likely to lead to the use of case studies, histories or
experiments” (Yin 1994) (p6).

Figure 25 shows the comparison of five different research methods that could
be used to answer the questions we have posed. It describes which method
might be used for the different types of questions asked.


This figure is not available online.


Please consult the hardcopy thesis
available at the QUT Library.

Figure 25- COSMOS Corporation’s model of different research strategies (Yin 1994)

Figure 25 describes five different research strategies: Experiment, Survey,
Archival Analysis, History and Case Study. The third approach, Archival
Analysis, examines published data, which would not be suitable for this
project as the targeting methodology is new. The fourth approach, Historical
Analysis, is a survey of past events in an organisational setting and is one of
the options for explanatory questions (Yin 1994). The focus of an historical
analysis is on past events which are reconstructed by the survey participants.

This research project requires an approach that answers the questions of the
research team during and after a planned implementation of the targeting
method. It would be more suitable to collect the data during the
implementation than to come back at a later date. Apart from the issue of
time distorting the memories of the participants there is also the risk of not
being able to locate the original participants through staff attrition and
company transfers.

The experiment approach (top of Figure 25), Yin's (1994) third option for
explanatory questions, requires the research team to have control over
behavioural events. As the methodology is to be tested in real life situations,
this type of control is not considered an option (Yin 1994).


The Survey method is not considered an option by Yin (1994) for these
explanatory questions. It is possible, though, that a survey might be
conducted after the implementation or as part of the data collection process.
This leaves the fifth approach, Case Study, which Yin (1994) states allows
the researcher to focus on the contemporary or current situation and does
not require control of the participants. Thus the case study is an appropriate
method to answer these explanatory questions (Yin 1994).
The issue that now arises is that if a case study is the most appropriate
method of answering these questions (see previous page), how much
improvement will be achieved from one case study? Also, what is the
external validity or generalisability of a single case study? Consequently it
follows that multiple case studies would enable a progression of
improvements and also provide some external validity.

The essence of multiple case studies is to provide a researcher with the
ability to pattern match between the data observed in the case and the
developed theory (Smyth 2001; Smyth 2001a). Yin (1994) states that multiple
case studies are used for literal or theoretical replication to predict similar
results or provide contrasting results for predictable reasons (Yin 1994)
(p46). This is not the aim of this research project as it intends to use each
case study to improve the targeting method. Thus the use of action research
or action learning may be an appropriate way of utilising the multiple case
studies.

Bunning (1993) states that the purpose of action learning is "to make
improvements in the world, and so contribute to private learning” and that
"the purpose of action research is to make improvements in the world and so
contribute to public learning” (Bunning 1993) (p25).

McGill and Beaty (2001) state that "as well as sharing the same trusty
learning cycle, action learning and action research share many of the same
values" (McGill and Beaty 2001) (p22) . They add that the primary difference

Craig Huxley Page 111


A Member of the Centre for Information Technology Innovation

between action research and action learning is that action learning uses a
group of people for the reflective phase (McGill and Beaty 2001) (p21).

Action research and action learning may be very similar tools, the key
distinction being who is intended to learn most: the researcher or the
participants. Stewart (2001) and Stewart and Gable (2001a) describe
research in which case study and action research are combined in order for
the participants to learn from the experience (Stewart 2001; Stewart and
Gable 2001a). It is for this reason that we believe the action learning tool
is the appropriate choice, as the intent of this research project is for the
research team to gain the lion's share of the new knowledge in order to
improve the method.

Action research is used to transfer learning to improve the internal practices
of an organisation. Alternatively, action learning is used to transfer learning to
the team or person conducting the project. In this research project the
research team are conducting the project.

With the knowledge that there are many thousands of processes within an
organisation and that the best research results would come from large
organisations, the research team sought first to understand the perceptions
of the industry in relation to critical processes.

The aim then was to identify those areas of the ASP Service Delivery
industry which were considered to be most important. One approach was to
use a Delphi study asking a larger group of ASP Service Delivery industry
participants to rank a list of possible critical processes. The choice of a
Delphi study was influenced by the reduced need for prior research (which a
survey would have required in order to develop questions) and by the ease
and relative speed of conducting the Delphi study by email (Bowles 1999;
Gatfield, Barker et al. 1999). To use a Delphi study the research team
needed a set of critical processes from the perspective of those in this
industry.


The solution to this need was to use a focus group. Exploratory research
using focus groups requires less preparation and produces roughly 70% as
many ideas as individual interviews (Morgan 1988). This research project
required only the one focus group session. The associated research project
'reference models for ES service delivery' used the same participants in four
further focus group sessions.

Considering the time frame for this research project, this reduction in time
was a very important factor. The reasoning here was that a focus group
would provide the necessary information and also solve a number of
logistical needs in one joint effort. The joint effort was connected with the
associated research project ‘reference models for ES Service Delivery’. This
associated project was to use a series of focus groups as an approach to
improving early versions of newly developed reference models.

Consequently it was seen as possible for both research teams to collaborate
in attracting participants, submitting ethics documentation and reducing the
time needed from participants. As was explained in chapter 2 of this thesis,
the two research projects were then able to provide valuable input to each
other. The output of the Delphi study was able to provide focus for the action
learning case studies and also for the future focus groups of the second
research project (reference models for ES service delivery).

A large benefit of the Delphi study and focus group approach was that it was
able to provide a way of concentrating the effort for the action learning case
studies. This concentration would ensure that the scope of each case study
would be of most benefit to the participants and enable the attainment of the
research objective: the validation of the targeting method. In addition, there
would be reduced logistical problems for the participants and the research
team. Both the participants and the research team were under tight time
constraints: the participants due to their level of seniority within their
organisations and the research team due to the 18 month time frame for the
project. In addition, the number of persons needed for project completion
was reduced when the scope was narrowed.



Thus the result was to use a single focus group to provide, at a minimum, the
necessary list of perceived critical processes for use in the Delphi study and
for the Delphi study to provide the case studies with a process area of
importance to the ASP Service Delivery participants.

Thus far we have stated that the reasons for the focus group and Delphi
study were to improve the research team's understanding of 'critical' from a
practitioner's view and also to derive a suitably important focus for the action
learning case studies. We have justified the use of action learning coupled
with multiple case studies as an appropriate approach to answering the
research questions.

The following section examines the tools and provides information on their
use and constraints.


5.3 Focus Group Method

This section describes the intent of the research team in using the focus
group method and then in the next section we examine the focus group
method and provide the argument for its use in this research project.

5.3.1 Focus Group Purpose, Approach and Outcomes

The purpose of the focus group was to explore the conceptualisation of
critical processes and to agree on a definition of a critical process. The focus
group was also to provide the research team with examples of critical
processes within the ASP Service Delivery industry from the participants'
perspective. If possible, the research team attempted to discover the context
of the identified critical processes by asking the participants to record their
business activities from a list of probable activities. This list was first
assessed by the participants, with some activities added, to provide a view of
their major activities.

The seven participants of the focus group were from three commercial
outsourcing companies and one government owned but commercial
outsourcing company.

The approach taken was an exploratory focus group which used
‘brainstorming’ as one method of exploration. Starter information was
provided and the participants were asked to comment on this information.
Participants were asked to contribute information individually and to comment
on the information provided by the researchers.

The data collection was conducted by audio-taping the session, taking notes
and collecting the written output of the participants. A white board was also
used to provide visual feedback to the participants and this was noted as
well. The outcomes of the focus group were a number of exemplar critical
processes from within the ASP Service Delivery industry and a working
definition of a critical process in this domain. For example, some of the
critical processes provided were project management, information systems
planning and human resource planning.

The exemplar critical processes were then used in the Delphi study and the
definition of critical was used to guide the participants of the Delphi study in
ranking the data (list of critical processes collected from the focus group).

5.3.2 The Focus Group as a Research Method

Sofaer, Kreling, Kenney, Swift and Dewart (2001) suggest that if the previous
work in the field is limited (and this is the case) then the research needs to
be, at least initially, exploratory in approach (Sofaer, Kreling et al. 2001).
Fern (2001) suggests that "creating, collecting, identifying, discovering,
explaining and generating thoughts, feeling and behaviours are all purposes
of exploratory research" (Fern 2001) (p5). Morgan (1988) suggests that focus
groups as a research method are suitable for orienting a researcher to a new
field, generating hypotheses based on participants’ insights, evaluating
research populations and developing questionnaires (Morgan 1988). Thus,
the focus group method was selected to explore the concept of a critical
process and to generate an agreed definition of a critical process for use in
the Delphi study.

Focus groups are useful as self-contained methods of data collection or as a
supplement to others. Morgan states that the "hallmark of focus groups is the
explicit use of the group interaction to produce data and insights that would
be less accessible without the interaction found in a group" (Morgan 1988)
(p12). Producing an agreed definition of what is a critical process fits neatly
with this definition. A survey, while permitting the researcher to ask
participants more than one question, does not provide an avenue for
discussion which would allow participants to improve on their original
thoughts.
Exploration of the conceptualisation of a critical process in the focus group
session was expected to produce both discussion on the characteristics of
critical processes and also examples. These outcomes were used in a
Delphi study to extend the range of applicable processes, rank these
processes and gain greater participation in the determination of candidate
critical processes.

5.3.3 Characteristics of Focus Groups

Focus groups generally have between 6 and 10 participants, although there
are many examples of larger groups (Saulnier 2000). Larger groups are more
difficult to manage and tend to allow some participants to say nothing or
follow the more dominant personalities (O'Neill, Small et al. 1999). O'Neill
(1999) suggests that a focus group should not be used for more than one
session per subject. He cites research showing that attempting to build
consensus with a focus group in more than one session produces outcomes
which are quite misleading.

Questions in a focus group are of the how or what type and, normally, a
specific probe question is designed to follow if the initial question fails to
elicit the desired information. Each session is usually 1 to 2 hours in length
and has three people involved in the session besides the participants: a
moderator, a note taker and a time keeper who might also monitor the audio
equipment (O'Neill, Small et al. 1999).

The moderator is very important and should have experience dealing with
groups. It is this person who introduces the group members, initiates the
discussion, ensures that everybody is given an opportunity to voice their
opinion, calms the outspoken and encourages the reluctant (Saulnier 2000).
The note taker is present in case of technical difficulties and as an aid to the
moderator to ensure that all necessary questions or topics are covered
sufficiently.


5.3.4 Focus Group Recruitment

"A critical aspect of conducting focus groups is to specify the inclusion and
exclusion criteria for participants" (Sofaer, Kreling et al. 2001). Possible
participants for this focus group were sourced from large application service
provision companies in Australia; that is, organisations who provided large
application service delivery to outside organisations or, (in-house), to their
own companies. It was preferred that they were based, or had suitable staff,
in Brisbane but this was not a firm rule. The result provided commercial and
government ASP organisations though no in-house providers were attracted.

Participants were sourced initially by two methods:
1. Known contacts of the research team
2. Contacts identified through research into commercial and government
organisations conducting large application service provision operations
within Australia.

Initial contact with possible participants was by email and if interest was
expressed then a face to face meeting was scheduled. At this first meeting
the research team provided information on the scope of the project and the
benefits of participation. (The combined project was the ‘Process oriented
administration of enterprise systems’ and not only the research project
described in this thesis.) In addition, we described the possible solutions to
problems with confidentiality and inability to participate in all parts of the
project.

5.3.5 Data Collection

Data collection in focus group research is normally undertaken by audio-
taping and note taking. When an understanding of feelings and reactions is
required, video-taping is also recommended (Morgan 1988). The approach
taken for this focus group was to use two audio-taping machines, a dedicated
note taker and a timekeeper in addition to the moderator. Immediately
following the session the note taker, timekeeper and moderator composed
their main thoughts and issues. All audio tapes from the focus group were
transcribed and checked, and this information was sanitised to remove
names or other identifying information. This approach is consistent with those
recommended by Morgan (1988), Fern (2001) and Saulnier (2000) (Morgan
1988; Fern 2001; Saulnier 2000).

5.3.6 Data Analysis

The focus group session was held on May 22nd 2002 in a conference room in
the university research facilities. There were seven participants in the focus
group, all coming from companies that participated in the ASP service
delivery industry. The seven participants were from four organisations, with
two from one organisation and three from another. We believed the resultant
combination of participants provided the session with a valid focus group, as
the three participants supplied by one company were all from different
management areas of that business. Table 15 shows the break-up of
participants for the focus group and how many participants were from each
company.

                  Company A  Company B  Company C  Company D  Total
# of participants      1          1          2          3        7

Table 15- Number of participants from each company

The data captured from the focus group session was both qualitative and
quantitative. The research team undertook to analyse the qualitative data by:
1. The research team compared their impressions of the group consensus for
the definition of critical, with that taken from the transcribed audio recordings.
2. The listing of critical processes was also assessed for completeness and
consensus with the comments of the participants in the audio transcriptions.

The quantitative data was then compared with the comments found in the
transcriptions for completeness where appropriate. That is, the final version
of the definition was validated by looking for comments in the transcriptions
which supported or disagreed with the final definition of a critical process.

The final results for the list of critical processes were then tabulated and used
in the Delphi study. This Delphi study employed the same focus group
members and enlarged the group with further ASP service delivery
participants, in order to confirm the finding of the focus group and generate
further discussion.

The next section describes and discusses the Delphi study.


5.4 Delphi Method

This section describes the intent of the research team in using the Delphi
Study method and then in the next section we examine the Delphi Study
method and provide the argument for its use in this research project.

5.4.1 Delphi Study

The purpose of the Delphi study was to rank, and achieve consensus in the
ranking of, the data collected from the focus group. This data was composed
of the 25 critical processes identified by the participants during the focus
group. The sixteen participants in the Delphi study were from ten
organisations: one a public in-house service provider, two government
owned outsourcers and seven public outsourcers.

The approach taken first was to group the processes under higher level
processes such as hardware management, service support and security.
This was put into an Excel sheet with the first column showing each high
level process with the lower level processes below. This is partly shown in
Table 16.

Table 16- Excerpt from an Excel sheet showing Delphi study processes


Table 16 shows two high level processes (Support Processes & Security)
and four processes identified by the focus group participants below each of
the high level processes. The collated data was placed in Excel sheets with
appropriate instructions and emailed individually to the participants asking
them to rank the exemplar critical processes and make comments on their
reasons if possible. The ranking process was to identify which processes the
participants perceived were most critical to their ASP activities in relation to
the other processes on the sheet. (1 = most critical and 10 = least critical.)

The Delphi study went through three cycles before the research team
considered it was not achieving greater consensus or further useful
comments. The data collection for the Delphi study was the Excel sheets
attached to returned emails. Each response was collated into a single Excel
sheet and a combined result was used for the second and third cycle of the
Delphi study. The outcome of the Delphi study was a ranked list of critical
processes from within the ASP Service Delivery industry. This enabled the
following action learning case studies to focus on the most critical area of
service support.

5.4.2 The Delphi Study as a Research Method

The Delphi Method is an approach to building consensus across a wide range
of industries and participants. It was developed in the 1950s by Dalkey and
Helmer (1963) to ascertain the opinions of experts in a field (Dalkey and
Helmer 1963). Groups of 10 to 50 are common, but any number is possible,
with one study, reported by Reid (1988), using 1,685 individuals (Bowles
1999). Participants are provided with a set of general questions or broad
issues and asked to comment on them. The qualitative responses are collated
and synthesised into statements which form a second questionnaire. To
gather quantitative data, a Likert-type scale is added to the statements, which
are re-sent to the participants asking for the revised list or collection of issues
to again be rated. For example, 10 = extremely critical and 1 = not at all
critical.

This can be repeated many times, though in practice 3 or 4 iterations are
normal. The feedback, synthesised comments or individual statements can
be added to the next round so that participants are able to review their
decisions in the light of the group output (Jeffery, Ley et al. 2000). The output
is a consensus driven opinion or a stability of responses which includes both
quantitative and qualitative data (Chang and Gable 2000; Chang, Gable et al.
2000).

This technique offers a number of benefits:
• The response rates are traditionally high, with about 75% participation
• The participants can be selected on demographics, skill levels and
perceived knowledge
• There is controlled anonymous feedback, with less pressure on panel
members to conform than in a committee
• Its iterative approach leads to systematic refinement and developed
consensus or a stability of responses (Jeffery, Ley et al. 2000)
• Email has enabled it to be an inexpensive way to access a large
number of respondents who may be geographically distant (Bowles
1999; Gatfield, Barker et al. 1999)
• Respondents may be known to each other or anonymous, but their
individual responses remain anonymous

The purpose of the Delphi study in this research project was to identify,
extend and validate those processes which were considered most critical to
the ASP service delivery industry.

The following sections detail the recruitment of participants and the data
collection process for the Delphi study.


5.4.3 Delphi Study Recruitment

Possible participants for this Delphi study were sourced in the same way and
with similar criteria as for the focus group. Target organisations were large
application service provision companies in Australia, who either provided
large application service delivery to outside organisations or, in-house, to
their own companies. There was no preference for geographic location as the
study was conducted by email.

Participants were sourced initially by two methods:
1. Known contacts of the research team
2. Contacts identified through research into commercial and government
organisations providing large ASP operations within Australia.

There were 16 participants from ten organisations for this study as described
in chapter 2.
Participated Government Public Co.
In-house 0 1
Outsourcer 2 7
Table 17 - Company type for Delphi study

Table 17 shows the category of organisation in which the participants for the
Delphi study originated. It shows a heavy bias towards those companies
involved in the commercial outsourcing of ASP service delivery. The
participants in the study held positions described as Managing Consultant,
Account Executive, General Manager, Senior Manager, Manager,
Professional Services Director, Contract Manager, Manager Service
Strategies, Remote Services Manager, Associate Principal, SAP Business
Analyst and Applications Delivery Manager. The companies they were from
are listed in Table 18.


Participated                 Government Organisation    Public Company
In-house ASP provider                                   Parmalat (Pauls)
Outsourcing ASP provider     CSA Qld Government,        IBM Global Services Aust.,
                             Citec                      EDS Consulting, CSC,
                                                        REALTECH, Deloitte Touche
                                                        Tohmatsu, Hitachi Data
                                                        Systems, Mincom

Table 18- Participants by name and type

Seven companies were classified as being public outsourcing companies,
two as government outsourcing companies and one as a public in-house
provider. The next section describes the data collection for the Delphi study.

5.4.4 Data Collection

The Delphi study was started on the 27th May 2002 and had all three rounds
completed within eight weeks. Each round was provided with a two week
turnaround and a week between each round for collation and resending the
following Monday.

Round   Date Issued        Date Received      Date Analysed
1       27th May 2002      10th June 2002     16th June 2002
2       17th June 2002     1st July 2002      7th July 2002
3       8th July 2002      22nd July 2002     28th July 2002
Table 19- Start, return and analysed dates for Delphi study

The last round of the Delphi study was sent out on the 8th of July and was
collated by the 28th of July 2002.


Table 20 lists the number of participants from each of the ten organisations
who participated in the Delphi study.

Company A B C D E F G H I J A-J total


Total # of Participants 1 3 2 2 2 1 1 1 2 1 16
Round One Responses 0 1* 1* 2 2 1 1 1 1* 1 11
Round Two Responses 1 1* 1* 2 1 0 1 1 1* 1 10
Round Three Responses 0 1* 1* 0 2 1 1 1 2 1 10

Table 20- The number of participants from each company in the Delphi study

Table 20 lists the number of responses for each of the three rounds of the
Delphi study. Numbers in the table marked with an * indicate that more than
one person provided input to the one response received from that
organisation.

The responses to this first round were both quantitative and qualitative data.
The quantitative responses, ratings using the ten point Likert scale, were
collated into a combined rating providing a mean, maximum, minimum and
standard deviation result for each process. The qualitative responses were
synthesised in order to find common themes. These were then re-sent (by
email) to the participants with the same instructions. The Delphi study went
through three rounds to achieve consensus in some areas and increasing
disparity in others. Collection of the data was achieved by recording all email
responses into one Excel document.

5.4.5 Data Analysis

Data analysis of the quantitative data (the ratings from responses) was
carried out by statistically deriving the mean rating of a process from all
responses, the standard deviation of each set of ratings for a process and the
maximum and minimum rating for each process from the responses for that
process. This was conducted at the end of each round with this data returned
to the participants in the next round. The final analysis of data was conducted
on the collated results of each of the rounds. The standard deviation of the
mean, maximum, minimum and standard deviation of each of the rounds was
also established. The final result of this analysis was to be able to quantify
which processes were considered most critical to the participants and how
much consensus was achieved in providing the results.
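
To illustrate, the per-process statistics described above could be computed with a few lines of code such as the following; the process names and ratings are hypothetical, and the actual study collated the responses in Excel rather than in code.

```python
from statistics import mean, stdev

# Hypothetical ratings from one Delphi round (10 = extremely critical, 1 = not at all critical).
round_ratings = {
    "Incident management": [9, 8, 10, 9, 7],
    "Capacity planning": [6, 7, 5, 8, 6],
    "Security monitoring": [10, 9, 9, 10, 8],
}

for process, ratings in round_ratings.items():
    summary = {
        "mean": round(mean(ratings), 2),
        "std dev": round(stdev(ratings), 2),  # a smaller spread suggests greater consensus
        "max": max(ratings),
        "min": min(ratings),
    }
    print(process, summary)
```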

The qualitative data (comments by participants) was used to develop a file of
issues raised during the study (Chang and Gable 2000). These were useful
to the participants in understanding the viewpoint of other participants.
Almost all, though, were taken from their organisation’s unique view of their
activities and related to the reasons why a particular process was more
important to that organisation. Many also indicated the important
relationships between processes.

With this set of critical processes, a four cycle action learning process using
case studies was undertaken to test and improve the targeting methodology.
The identification of the most critical processes by the Delphi study enabled
the focus of the action learning case studies to be on those considered most
critical to the ASP service delivery industry.

The following sections discuss the action learning method and case study
method and how these are combined as a tool to test and improve the
targeting methodology.


5.5 Action learning using Case Study

This section describes the action learning using case studies phase. The
action learning approach was used here as the intention of the research team
was to improve the targeting methodology from real life experience. As such
the action learning was conducted in four cycles of implementing the
targeting methodology, observing its use, reflecting on the results and finally
revising the methodology where appropriate. Data collection came from the
case studies. The outcome of the action learning cycles was an improved
and tested targeting methodology.

There was one pilot case study and three case studies used in the action
learning. The purpose of the pilot case study was to undertake an
implementation of the targeting methodology in an organisation which had
already developed a BSC. The approach taken was to implement the
targeting methodology as per the steps outlined in chapter 4. This was
completed with a number of meetings at the company’s offices. The results of
each meeting were then emailed to the participants or presented at the next
meeting. The outcomes were seen to be an improvement on the targeting
methodology in a small time frame (due to not having to develop a BSC)
early on in the research project. The purpose of the three case studies was to
improve the methodology in the ASP Service Delivery industry and therefore
improve the generalisability of the method in the ASP industry.

The outcome of each case study was part of the input for each action
learning cycle. Each case study provided data from the participants
themselves in the form of answered questions, a developed BSC, the
assessments of criticality, cost/benefit, and probability of successful
improvement and from the research team’s notes.

The next section describes the action learning method as it will apply to this
research project.


5.5.1 Action Learning Method

This section will describe the action learning method. Action learning is a four
phase method used to improve the outcomes of a learning process. These
four phases are essentially: act, observe, reflect and revise. McGill and Beaty
(2001) cite Kolb's (1984) learning cycle of experience, reflection,
generalisation and testing. This is similar to that of Bunning (1993) and also
to that which this research team will use.

Figure 26 shows the four cycles that will be used in this research project and
also lists the four phases of action learning, act, observe, reflect and refine.

Act, observe, reflect & refine

PS C1 C2 C3
Action Learning using a;
Pilot study & Case studies 1-3

Figure 26- Model of the action learning cycles using case study

The implementation or action phase is when the researcher implements, or
initiates some action, in order to allow observation of the results of that
action. The second phase, observe, is when the data is collected concerning
the outcomes of the previous action. With the data in hand, the researcher
then takes time to reflect in a group and individual setting on the results of
the first action or implementation. McGill and Beaty (2001) state that action
learning, which uses a ‘set’ or group of people for the reflection phase, the
third phase, is essentially a group process (McGill and Beaty 2001) (p21).
The group in this research project is the research team. The fourth phase is
that of revision, which is when the research team revise the targeting
methodology in the light of the previous reflection of the case study data
taken in the observation phase.

The cycles of action learning are to ensure that:
1. There are sufficient periods of observation to capture all relevant data
(Pedler, Burgoyne et al. 1986; Bunning 1993; McGill and Beaty 2001)
2. Reflection is undertaken after more than one experience (Pedler,
Burgoyne et al. 1986; Bunning 1993; McGill and Beaty 2001)
3. The experience involves more than one context to provide
generalisability (external validity) (Benbasat, Goldstein et al. 1987)
4. Each revision builds on increasing experience and improves the
forthcoming action (implementation) (Pedler, Burgoyne et al. 1986;
Bunning 1993; McGill and Beaty 2001)

Thus this research project uses four cycles of action learning. The four cycles
will use a pilot study and three single case studies as the observation phase
for the action learning.

The following section will discuss the case study method.

5.5.2 The Case Study Method

Yin (1994) defines a case study as an “empirical enquiry that:
1. Investigates a contemporary phenomenon within its real-life context,
especially when
2. The boundaries between phenomenon and context are not clearly evident”
(Yin 1994) (p13). Benbasat, Goldstein and Mead (1987) add that case study
“is an appropriate way to research a previously little-studied area” (Benbasat,
Goldstein et al. 1987). Yin (1994) confirms this by stating that a single case
study is a relevant approach to discovering new or un-researched material in
a natural setting (Yin 1994). This definition fits neatly with the needs of the
research project to:
1. Test the targeting method in a real-life context
2. Contend with the issues that arise from the contextual view or issues from
the methodology context
3. Contend with the issue that the targeting method has never been studied
previously.


Yin cites Schramm (1971), who observes “the central tendency among all
types of case study, is that it tries to illuminate a decision or set of decisions:
why they were taken, how they were implemented, and with what result” (Yin
1994) (p12). The intent of the research team is to test the targeting method
and as such we will seek answers to questions such as why a participant did
not understand some part of the method and how it can be improved.
Thus we believe that case study is the appropriate choice for this research
project.

5.5.3 Case Study Recruitment

The pilot study participant was sourced using different criteria to the three
larger case studies. The intent here was to implement the targeting method
into an organisation that had already completed the development of a BSC.
The reasoning here was that the research team required an initial
implementation that did not require the more difficult facet (BSC) to be
implemented. This would allow the new facets of the methodology to be
tested with much reduced risk. It would provide the team with valuable
experience in a short time frame in an organisation which saw a definite need
for support in the area of identifying critical processes.

Thus the selection criteria for this part of the study were:
1. Have completed a BSC implementation at a departmental level or higher
2. Undertaking some process improvement projects or about to undertake
them
3. Located in Brisbane and able to start immediately

A company known to the research team and meeting these criteria was
approached and agreed to be the participant for the pilot study. They had
previously developed a corporate BSC and had a specific department for
which they would like support as it was undergoing considerable process
change.


Participants for the three case studies were sourced from the Delphi study
and focus group participants. There was a preference for case study sites
with headquarters located in the city of the research team, though interstate
centres were also considered. Each participant in the focus group and Delphi
study was asked about their interest in participating in the case study
research.

Three national companies (REALTECH AG, CSC and Citec) agreed to
participate in the case study phase. All were focused on outsourced services
in the ASP service delivery industry, with two from the commercial arena and
one a government owned commercial entity with customers from both
government and public organisations.

5.5.4 Action Learning Case Study Data Collection

Data collection for these case studies was taken from diary entries during
and after meetings with participants, email correspondence and documents
provided by the company during the implementation of the targeting method.
Participants were also asked to provide comments on the method after
completion of the implementation. The research team also provided input and
this data was collected in the same way as from the participants of the case
studies.

5.5.5 Action Learning Case Study Data Analysis

There were small amounts of quantitative data, though much of this was
used for the scoping of each case study. This scoping data was used to
assist in defining the parameters of the case study. The research team
viewed the positive and negative values of each document and each
document's impact upon the implementation process.


Thus while the documents themselves were quantitative data the impact was
considered from a qualitative aspect. A second source of quantitative data
was in the form of strategic plan documents and these were treated in the
same way as the scoping documents. The third and final source of
quantitative data was from the participants’ answers to five questions after
the completion of the case study. This data was analysed to identify
comments which were able to add to the research team’s list of issues.

The vast majority of the data from the action learning using case study phase
was qualitative data. A meeting was held at the end of each case study to
discuss:
1. The things that went well and why
2. The things that went badly and why
3. The things that didn’t appear to be needed
4. Any suggestions from the participants or the research team
5. What modifications to make and why

The data collection process was to ask all 5 questions for each of the ten
steps of the targeting methodology (see chapter 4). Once comments were
gathered, analysed and conclusions reached, changes were made to the
methodology for the next implementation.

The following section describes the ethical considerations of the research
project and how this research project and the associated project (Reference
models for ES service delivery) collaborated to reduce the workload in this
area and to provide consistency to the research participants.


5.6 Ethical Considerations

Under University policy, most research projects involving humans require
some level of University ethical review. Each submission is put before the
University Human Ethics committee which, after reading and assessing each
submission, will provide guidance to ensure that research activities meet the
Australian guidelines for research ethics. If all suitable and possible
precautions are taken then clearance is given to undertake the activities
necessary for the research project.
This research project required two separate ethics submissions: one asking
for exempted clearance for the pilot study and a second, expedited
submission dealing with the remainder of the project.

There are three types of submission, dependent on the scale of possible risk
to participants in a project. Put simply, they are:
1. Exempted- a project with almost no risk of identifying people or
organisations in the study
2. Expedited- a project which might identify people or organisations and
involves some moderate risk which can be reasonably assessed by the
participants
3. Full submission- a project which involves people unable to assess the
ethical risk to themselves or unable to control the clearance for that risk to be
taken (children and young people, persons with an intellectual or mental
impairment, persons in dependent or unequal relationships, deception or
covert observation) (Queensland University of Technology 2003)

The second ethics submission was a combined submission of this research
project and that of the associated research project (Reference models for ES
service delivery), which was using the same participants for a continuing
series of focus group sessions. By combining the ethics submission for both
projects we ensured that the participants saw a consistent view of this area
and were also able to reduce the workload in preparing the submission. Both
submissions were granted approval.


The documentation for this ethics process, which appears in Appendix 3, is
that which was provided to each participant in this research project. It
contains the details of the focus group study and reflects some of the
requirements of the ethics committee.

5.7 Summary of the Research Methodology

This chapter has described the research process required to achieve the
research objectives. There was a need to test the method in a number of
entities and to develop the method if possible after each test. In order to
ensure that the research was beneficial to the business participants it needed
to be focussed on an area of importance to them. This has been achieved by
using a focus group to identify critical processes and provide a definition of a
critical process for use in a Delphi study. The Delphi study provided the
action learning using case studies with an appropriate focus, which is of most
importance to the ASP service delivery industry. This chapter has also
examined each of these research tools and justified their appropriate use in
the project. It has also provided a description of the ethics requirements and
documents needed to ensure that the project meets Australian ethical
research standards.

The following chapter describes the development of the generic definition of
a critical process and how a list of perceived critical processes was
developed within the focus group. It further describes the process of using
the Delphi study method to develop a rating for each of the perceived critical
processes taken from the focus group data.


6 Identifying Critical Processes

This chapter describes the facilitation of a focus group to develop a generic
definition of a critical process and a starting list of possible critical processes,
and describes the use of a Delphi study to identify the critical processes and
functional areas within the AHC and ASP industry. This chapter also
describes the type of organisations involved in the focus group and Delphi
study. The chapter takes the reader through a description of what occurred
from three perspectives.

These perspectives are:
1) The academic story concerning the actual conduct of the research (that is,
how the focus group and Delphi study were conducted)
2) The data collected and the conclusions from the analysis of the data
3) The personal story of the researcher

The conclusion to this chapter explains how the results of the focus group
and Delphi study provided an important focus for the following case studies.

This chapter will initially describe the focus group and the output from the
focus group session. Then we will describe the Delphi study, the output of
this study and the impact of these two studies on the following case studies.

6.1 Focus Group

The purpose of the focus group was to provide the research team with
insights into the participants’ perceptions of the ASP industry. The objective
of the focus group was to agree on a definition of a critical process and to
discover some examples of critical processes from their industry.


6.1.1 The Focus Group Session

The focus group was held on May 22nd 2002 in a conference room in the
University research facilities. We had invited participants from organisations
which had business activities within the ASP service delivery industry.

Saulnier (2000) states that focus groups generally have between 6 and 10
participants and that they generally average a 20% non-attendance rate
(Saulnier 2000). For this reason we invited all ten organisations who had
agreed to participate in the focus group session, expecting two or three to be
unable to attend. Of those ten companies invited, six said they were able to
attend the session. To ensure that numbers did not fall below six, we
suggested to companies that they might like to send more than one person,
especially in large organisations where all points of view would have been
considered valid.

The day provided us with seven participants from four organisations for the
focus group. We believed this was still a valid size as it was above Saulnier’s
(2000) minimum of six. In addition, the company which supplied three
participants provided people from different management areas of that
organisation (hardware management, network management and services
management). Although the attendance was less than expected, there were
still sufficient persons to convene and run a valid focus group session.

Table 21 shows the break-up of participants for the focus group and how
many participants were from each company.

                  Company A  Company B  Company C  Company D  Total
# of participants      1          1          2          3        7

Table 21- Number of participants from each company

Two organisations provided more than one participant to the focus group,
company C and company D. The focus group participants came from three
commercial outsourcing firms and one government outsourcing organisation.
There were no in-house ASP organisations (government or commercial) that
were able to attend.

The participants were emailed a two page information document which
contained a map to find the research centre and a brief agenda. The focus
group session was scheduled to last two hours, which was considered by the
literature to be an appropriate time frame (Morgan 1988; O'Neill, Small et al.
1999; Hines 2000; Fern 2001).

The complete agenda was to:


1. Introduce the participants
2. Provide an overview of the research project and the reasoning behind this
session
3. Explain the objectives for the session and the rules for participants
4. Discuss the participants’ view of how they would define a critical process
5. Seek agreement on a definition of a critical process
6. Generate examples of critical processes within the ASP service delivery
domain
7. Position these processes within the value chain of the ASP service
delivery domain developed earlier
8. Permit questions and conclusion

6.1.2 Description of the Focus Group Session

It was intended that the session start at three pm and run till five pm. We had
chosen a Wednesday afternoon as that appeared to be a time in which most
participants had said they were available. We had also telephoned each
participant on the previous day to provide a reminder and find out if there
might be any late cancellations. Due to the non-arrival of some participants
we did not formally start the session until twenty past three.

Two audio recorders were used and one of the research team took
comprehensive notes. Another of the research team acted as sound
technician and also as support to the moderator. This support was in the form
of time keeping and preventing the group from moving too far from the focus
of the session. The recording did not start until after the introductions and overview were complete. The research team introduced each of the
participants and then gave a five minute overview on the entire research
project and the aims of the project.

Following the overview we presented the agenda for the session, our aims
and the general rules for a focus group.
1. There should be no derogatory comments about people
2. Each person should be given an opportunity to speak
3. No right or wrong statements, we were interested in the differences as well
as the similarities (Morgan 1988; Hines 2000; Saulnier 2000)
4. Confidentiality of information is not possible once information has been
discussed

With the introductions, overview and rules considered, we asked participants


to initially help us define the context of their organisations. This context was
necessary to show that the participants had valid views of the ASP service
delivery industry. To achieve this, participants were shown a value chain
diagram drawn on the white board as seen in Table 22.

Hardware Mgmt. | Software Mgmt. | Application Mgmt. | Security | Service Support | Business Process Eng. | Business Process Outsourcing

Table 22- Draft value chain for ASP service delivery

Table 22 is a description of the value chain which was used to position the
activities of the focus group participants. They were asked first if this
represented a description of their industry. The responses led to two
additions to the value chain being made;
1. Product Development
2. Information Technology Strategy

This can be seen in Table 23, which adds Product Development and IT Strategy to the value chain columns. Table 23 also provides the activities performed by each of the focus group participants.

Co. | Product Develop. | Hardware Mgmt. | Software Mgmt. | Application Mgmt. | Security | Service Support | IT Strategy | Business Process Eng. | Business Process Outsourcing

A √ √ √ √ √ √ √ √ √
B √ √ √ √ √ √ √ √ √
C √ √ √
D √ √ √ √ √ √ √

Table 23- Areas of IS Outsourcing in which Focus Group Participants operate

In Table 23 each company is represented as company A, B, C and D in the first or left hand column. Company C provides services in product
development, business process engineering and business process
outsourcing. They outsource the hardware, software and application
management as well as the Security, service support and IT strategy
services.

Once the context of each organisation was obtained and the value chain
reconfigured to resemble that of the participants, we introduced the concept
of what a critical process is. The question posed to the group was simply, 'What is a critical process?'
The discussion between the participants was allowed to move between types
of critical processes and why they were critical. To discover more about what
critical meant, we encouraged participants to be more specific when
explaining why they considered a process critical.

Once the participants appeared to have contributed sufficient points of view and related their experiences on the subject, the group was asked if they
would evaluate the ‘working’ definition for critical. This definition is the work of
Stewart (2002) “Those ‘few’ processes which have the ‘greatest’ effect on the
attainment of Corporate Strategic Goals" (Stewart 2002). The participants agreed that the definition was correct in looking at strategic goals or corporate strategic goals. They also agreed that the word effect was useful as it covered both negative and positive effects.

The only change made was that the word 'few' was omitted from the definition. This provided the research team with a definition of a critical process: "those processes which have the greatest effect on the attainment of Corporate Strategic Goals." The participants' reasoning was that in some situations there may be many critical processes.

After reaching consensus over the definition of a critical process, the session
moved onto the next point on the agenda. This was initiated by providing all
participants with a pad of yellow ‘post it’ notes. Figure 27 is an example of the
type of post-it notes that were used.


Figure 27- Example of Post-it notes used in focus group

The participants were then asked to write down the name or description of
five processes which they thought were most critical in their view (One on
each page). In addition to naming the process, participants placed a dot on
an X-Y chart showing the relationship of the process to Value or Problem.
Figure 28 provides an example of the type of response which participants
provided to the request for examples of processes they considered critical to
their business. The diagram shows an x-y chart on the right with a black dot
indicating the relationship between value and problem area for the process
called project management.

[Figure: a named process ('Project Management') with an x-y chart beside it; a dot on the chart marks the process's value and problem rating]

Figure 28- Example of the type of response for identifying processes

There were 23 processes taken from this part of the focus group session.
These processes could be grouped into the following functional areas:
1. Information Technology strategy processes
2. Hardware Management, software management and application
management processes
3. Enabling processes
4. Service support processes

Each process has an x-y chart showing the participant’s perception of the
value and problem rating for that process. This is indicated by the blue dot
which was positioned by each participant for the process that they named
and put forward. The participants agreed that a process with a high value
rating and a high problem rating would be considered a more critical process
than one that might be high as a value adding process or a problem process
on the x-y chart.

Figure 29 shows three critical processes from within the IT strategy domain of the ASP industry: consulting, business analysis and information systems planning. Though each is shown as not being a current problem area, each is considered to be of high value.

[Figure: value-problem x-y charts for Consulting (Issue Identification, Analysis, Presentation), Business Analysis and Information Systems Planning]

Figure 29- Information Technology Strategy Processes

Figure 30 shows processes from within the domain of hardware, software and application management within the ASP industry: monitoring & alarms,
server monitoring, cost control, regular hardware maintenance, procurement
and data backup and integrity. The process ‘cost control’ might be part of all
areas of the ASP value chain.

[Figure: value-problem x-y charts for Monitoring and Alarms, Data Backup & Integrity, Server Monitoring, Procurement, Cost Control and Regular Hardware Maintenance]

Figure 30- Hardware, Software and Application Management Processes

In Figure 30 problem areas appear for all of these processes except monitoring and alarms, which was considered to present few problems.

[Figure: value-problem x-y charts for Human Resource Planning, Information Technology Services, Supplier Relationship Management, Strategic Planning, Research Awareness Training & Knowledge Management, Business Development and Timely Payment to Providers]

Figure 31- Enabling Processes

Figure 31 shows seven critical processes that are viewed as belonging to the
domain of enabling processes: human resource planning, supplier
relationship management, research awareness training and knowledge
management, information technology services (internal IT), strategic
planning, business development, and timely payment to providers. Enabling
processes are those activities that an organisation undertakes to support its customer facing activities. Of these enabling processes, strategic planning is seen to be both a problem process and a high value process.

Figure 32 shows seven critical processes from the service support domain of
the ASP industry. These are the processes which are customer facing in that
they have a direct interface with the client or user of the services being
provided. Though change management, incident management and the commissioning process for new hardware were considered high problem areas, their business value is low. The service level agreement management process was considered to have a high value and also to be a problem process.

[Figure: value-problem x-y charts for Implementation Project Management, Project Management (two charts), Service Level Agreement Management, Change Management, Incident Management and the Commissioning Process for New Hardware]

Figure 32- Service Support Processes

Of note are the differences in the project management x-y charts indicated by
the red stars. Though not at opposite poles of the value-problem chart, the
disparity in results for these three examples of project management was explained by the participants as due to their experience of project
management within their organisation. This is shown in Figure 33 as an
example of a most critical process and two examples of less critical
processes.

[Figure: three value-problem x-y charts for Project Management, one rated as most critical and two as less critical]

Figure 33- Example of most critical and less critical processes

Figure 33 is an example of the difference between a most critical process and two processes which are critical but not as critical as the process on the left of the figure. The left hand process has a high value rating and a high problem rating, whereas of the two examples on the right, the centre one has a high value rating but a low problem rating and the far right example has a high problem rating and a low value rating.

This single method identifies critical processes and gives a visual representation of their state as a problem and/or value process. The approach was easily applied and gave a quantitative result for the critical processes and their relative weights in relation to being a problem or value
process.
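
To make the chart-based ranking concrete, the short Python sketch below scores a handful of processes from hypothetical value and problem readings; the process names and the numeric scores are illustrative assumptions only, since the focus group captured these positions visually on post-it charts rather than as numbers.

# Illustrative sketch only: rank candidate critical processes using the
# value and problem readings from the focus group x-y charts.
# All names and scores below are hypothetical, not the study's data.
processes = {
    "Project management":    {"value": 0.9, "problem": 0.8},
    "Strategic planning":    {"value": 0.8, "problem": 0.7},
    "Monitoring and alarms": {"value": 0.8, "problem": 0.2},
    "Change management":     {"value": 0.3, "problem": 0.8},
}

def criticality(scores):
    # Participants treated a process rating high on BOTH axes as more
    # critical than one rating high on only one, so sum the two readings.
    return scores["value"] + scores["problem"]

ranked = sorted(processes.items(), key=lambda item: criticality(item[1]), reverse=True)
for name, scores in ranked:
    print(f"{name}: value={scores['value']}, problem={scores['problem']}, "
          f"combined={criticality(scores):.1f}")

Under these assumed scores, project management ranks first because it is both high value and a high problem, which matches the way participants read the charts.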

The next step was to ensure adequate coverage of the process domain
found within ASP. The manner in which this was explored is discussed in the
next section.

Classifying the Processes

Each of the ‘post it’ notes with a process was then stuck to the whiteboard
under the functional area of the application service delivery value chain that
was thought most appropriate by the group. We also placed below the value
chain a series of enabling process headings which were used for the
enabling processes identified by the participants. Table 24 lists the processes
which were used for this activity.

Product Develop. | Hardware Mgmt. | Software Mgmt. | Application Mgmt. | Security | Service Support | IT Strategy | Business Process Eng. | Business Process Outsourcing

Human Resource Management | Business Planning | Relationship Management | Risk Management | Internal IT Infrastructure | Finance | Procurement | Marketing

Table 24- Value chain of ASP and enabling processes

Table 24 lists the two sets of ‘headings’ which were used as domain areas for
the identified processes. The participants agreed to this list as a valid set of
headings under which their individual processes might be grouped. With their concurrence on Table 24, we then placed the processes under the relevant domains. The result of this process is shown in Table 25.

Table 25- Application Service Provision Value Chain

Table 25 lists the layout for the processes provided by the participants and positions them within the value chain and under an appropriate heading in the enabling process section. The group considered that cost control could be part of all areas of the value chain and also part of each enabling area. The headings in blue are the enabling process areas and those in green are the value chain high level processes.

With this phase of the focus group complete, we asked if there were any final
questions. No further issues arose so we concluded the focus group session
with thanks for the useful data collected and time given by the participants.

The following section describes the data analysis of the focus group.

6.1.3 Data Analysis of the Focus Group

This section of the chapter examines the data from the focus group session and uses the analysis of that data to support the quantitative output already discussed in the previous section.

The approach taken to this analysis of the qualitative data was:

1. The research team compared their impressions of the group consensus for
the definition of critical, with that taken from the transcribed audio recordings.
2. The listing of critical processes was also assessed for completeness and
consensus with the comments of the participants in the audio transcriptions.
The quantitative data (list of critical processes and value chain) was then
compared with the comments found in the transcriptions for completeness
where appropriate. That is, the final version of the definition was validated by
looking for comments in the transcriptions which supported or disagreed with
the final definition of a critical process.

The major data output from the focus group was the list of critical processes identified by the participants.

Domain Process Problem Value


IT strategy Consulting Low High
Business analysis Low High
Information systems planning Low High
H/ware, s/ware & Monitoring and alarms Low High
maintenance
Server Monitoring Medium Medium
Cost control High High
Data backup & integrity High Low
Procurement High Low
Regular Hardware maintenance Medium Medium
Enabling Human resource planning High Low
Supplier relationship management Low High
Research, awareness, training & knowledge Medium Medium
management
IT services (internal) Low Medium
Strategic planning High High
Business development Medium High
Timely payment to providers Low High
Service Support Implementation project management Low Medium
Project management 1 Low High
Change management High Low
Commissioning for new hardware High Low
Project management 2 Medium Medium
Service level agreement management High High
Incident management Medium Low

Table 26- List of all the critical processes and their value and problem score

This data provided the initial list of processes used in the Delphi study and
also the functional areas for these processes. It also enabled the focus of the
case studies to be around functional areas which the participants had agreed
were critical to their business. The improved focus led to greater involvement because of the greater relevance and increased usefulness of the outcomes.

The research team compared their impressions of the group consensus for the definition of critical with that taken from the transcribed audio recordings. The research team believed that the focus group participants had reached general consensus on the agreed definition of what a critical process is: those processes which have the greatest effect on the attainment of Corporate Strategic Goals.

Our interpretation of the focus group session was that most discussion centred on what a critical process is, with participants offering terms such as core processes, core business processes and essential processes. One description was of a process that was essential or critical because of how quickly the business would fail if that process failed. Another description of these 'critical' processes concerned how dependent the organisation was on the process. The example used was hardware processes, which had backup processes that were 'critical', with the result that the hardware process itself was not as critical as the backup process.

These viewpoints were nicely summarised by one participant who commented that:
“we are trying to be a bit absolute in defining critical in what we
deliver and of course concluding that you can’t because it
depends on circumstances and requirements and other
things……so something like corporate strategic goals is a
catch all which says that whatever the corporate strategy says
is necessary is what defines what is critical”.

Other participants agreed, adding that a critical process was ‘context driven’
and also that a critical process ‘referenced the provider, manufacturer and
the service to the customer’.

The next focus for participants was that of the term ‘few’ in the offered
definition. One participant commented that “how many [processes] you may
already have identified as critical does not have any bearing on whether the
next one you are considering is critical or not”. This participant was referring
to the restriction of the word few in defining critical processes. They added
that “if you have 50 processes you consider critical your business is a bit
complex …. but it may be true.” Another participant stated “I find it difficult to
conjure up a quantity, because to my mind it is dependent on context”.

With these comments the participants agreed that the word ‘few’ would be
removed from the definition and that they agreed to the final version “those
processes which have the greatest effect on the attainment of Corporate
Strategic Goals”.

The listing of critical processes was also assessed for completeness and
consensus with the comments of the participants in the audio transcriptions.
The research team went over the transcriptions looking for mention of
processes which were described as critical but were not offered as a critical
process on a ‘post-it’ note. The scan of the transcriptions added no new
processes. There was some discussion of the value chain and enabling
processes, with participants undecided as to the position of processes such
as customer relationship management.

For example, one participant commented: "where would customer relationships initiatives be I would say they should be up the top there [with
the value chain processes]” with another participant saying “no it’s a support
[enabling] process”. This type of disagreement was concerned with the
context in which the participant used the process under discussion. In
general though the participants found agreement for the value chain and
naming of the enabling processes with one participant saying we were “over
80 percent correct”.

6.1.4 Focus Group Summary

The transcriptions in general supported the quantitative data of the focus group, indicating that we now had a new and agreed:
1. Definition of a critical process as “those processes which have the greatest
effect on the attainment of corporate strategic goals.”
2. List of critical processes within the ASP service delivery industry
3. Value chain and list of enabling processes

This focus group also performed an important function for the participants.
We believed it was important that they meet as a group initially to evaluate
the ‘worth’ of the research project in the context of the other participants. If
they believed that they were amongst their peers they would be more likely to
contribute in order to learn from their peers. To allow for this, the first fifteen minutes were allocated to informal networking.

The next step was to validate these findings in a wider sample. This was undertaken in the Delphi study, described in the next section, which used the list of critical processes and the agreed definition of a critical process as the basis of the first round of the study.

6.2 Delphi Study

This section describes the conduct of the Delphi study and the data obtained
from the study. The results of this study were input for the case studies and
also for the associated research project (‘reference process modelling for ES
service delivery’). This section will next describe the purpose of the Delphi
study and then provide a description of the three rounds and finally discuss
the findings. The chapter will then conclude with a summary of both the focus
group and the Delphi study and how this data was used in the case study
phases of this research project.

6.2.1 Purpose of the Delphi Study

The purpose of this Delphi study was to discover which processes from those
provided by the participants in the focus group session were the most critical
as perceived by a wider community. With this knowledge the research team
was able to narrow the focus of the case studies to an area which the case
study participants considered most critical. This approach ensured greater
business benefit to participants and the data was also then used in the same
way for the associated research project, ‘reference process modelling for ES
service delivery’.

6.2.2 The Participants

Participants for this Delphi study were sourced from large application service
provision companies in Australia; that is, organisations who provided large
application services as a commercial business or organisations providing this
type of service (in-house) to their own company. There was no constraint on
geographic location within Australia as the study was conducted by email.

Participants were sourced initially by two methods:

1. Known contacts of the research team


2. Contacts identified through research into commercial and government organisations providing large ASP service delivery operations. We were constrained to some extent in our selection of participants by needing to introduce the research project, explain the ethics of the project and ensure that participants were able to sign our "agreement to participate" form. For this reason we did not attempt to source participants from outside Australia.

The Delphi study involved sixteen participants from ten organisations; these are described in chapter 2. The companies are categorised as government or public company and as in-house or outsourcing providers, as shown in Table 27. There were seven commercial outsourcers and two government outsourcers.

Participated Government Public Co.


In-house 0 1
Outsourcer 2 7
Table 27 - Participant type for Delphi study

Table 28 shows the number of persons from each of the types of organisations in which we segmented our participants as well as the data
from Table 27.

Participated | # of Persons from Government | # of Government Organisations | # of Persons from Public Co.'s | # of Public Co.'s
In-house | 0 | 0 | 1 | 1
Outsourcer | 5 | 2 | 10 | 7

Table 28- Number of persons from each type of organisation

Table 28 compares the number of people who participated in the Delphi study and the type of organisation in which they operated to the number of
companies in the study. The table shows that the bias seen in the company
distribution is partially reduced by considering the figures for the number of
actual persons involved. That is, five people from two government
outsourcing organisations against ten people from seven commercial outsourcing companies. The in-house figures are unchanged between the two tables, remaining at zero for government and at one person from one organisation for commercial in-house. This should have provided a total of sixteen separate responses from ten organisations for the Delphi study, but this was not the case; six of the participants combined their responses with those of other participants from their own organisation.

Although the Delphi study was conducted with eleven responses on the first
round and ten responses on the second it was still considered by the
research team to be a valid representation of the ASP industry in Australia.
Four of the participant companies shared nearly 80% of the market revenue
in this country. One participant company is also the largest government
provider in Australia. These attributes combine to provide the study with
useful results in this geographic environment.

Table 29 lists the total number of responses for each round of the Delphi
study and the total number of participants.

Total # of Participants 16

# of Round One Responses 11

# of Round Two Responses 10

# of Round Three Responses 10

Table 29- Total number of participants and total responses for each round of the Delphi study

There were eleven responses for the first round and ten for the second and
third rounds of the study. The cause of this was that where there was more
than one participant from a company, in many cases all the participants from
that one company provided a joint response. The combining of responses
resulted in a single response being recorded in the Delphi study results.

6.2.3 Description of the Delphi Study

The Delphi study was started on the 27th May 2002. Participants were
emailed an Excel sheet containing a list of processes and instructions asking
them to provide a rating in the cell next to each process as to the criticality of
that process. The processes used in the Excel sheet were those taken from
the focus group, with the addition of two added critical processes, Contract
Negotiation and Service Level Management. These processes are shown in
Table 30 below as ‘additional processes’ and left to the participants to place
them within the correct functional area.

Table 30- list of processes and process headings used in Delphi Study

Cost control was also added to six of the process headings, as it was originally under product development only, and procurement was added to software management. These additions increased the number of processes to be rated from the original twenty three to thirty two; participants were also allowed to add further processes of their own.

An Excel sheet was used, as any comments could be easily added by the
participants and the data collected was easily collated within the Excel
application.

The instructions were kept simple and were repeated in the email as well as
within the Excel sheet. Included in the Excel sheet was the agreed definition
of critical; “those processes which have the greatest effect on the attainment
of Corporate Strategic Goals”. This was used so that participants would view
each process in the same way as other participants. Figure 34 is an example
of the instructions and checklist which was used for the first round of the
Delphi study.

The Instructions and Checklist

- Add any critical processes that you believe are not on the list
- Which of these processes are most important to you and why?
- Please rate each process in order of criticality (1 most critical – 10 least critical)
- Responses can be emailed and faxed back.

The definition of a critical process is: "those processes which have the greatest effect on the attainment of Corporate Strategic Goals".

Figure 34- Instructions and checklist supplied with the Delphi study

Figure 34 is called the instructions and checklist as each instruction is also used at the completion of a participant's rating of the processes to check that they have completed all the necessary tasks. It was thought that participants
would take approximately fifteen to twenty minutes to complete the response
and this was generally the case.

Round Date Issued Date Received Date Analysed


1 27th May 2002 10th June 2002 16th June 2002
2 17th June 2002 1st July 2002 7th July 2002
3 8th July 2002 22nd July 2002 28th July 2002

Table 31- Start, return and analysed dates for Delphi study

The results for round one were received with the last one arriving on the 10th
of June 2002. Collation of the eleven responses was completed by compiling
all the responses including comments into one Excel sheet. Each process
was then analysed to calculate the mean result, median response and
standard deviation of the responses for that process and also to determine
the maximum and minimum rating. Table 32 below, is an example of the
collation and analysis of the responses in the Delphi study.

Mean Median Min Max Std. Dev. # of People
IT Strategy
Business analysis 3.2 2 1 10 2.4 11
Information Systems Planning 3.2 2 1 10 2.6 11

Table 32- Example of collated and analysed responses

In Table 32 it can be seen that under the process heading IT strategy are two
processes (Business analysis and Information systems planning). The mean
of the eleven responses for both processes is 3.2 with the median 2. The
standard deviation tells us that the maximum rating of 10 for the business analysis process was more than two standard deviations above the mean (3.2 + 2.4 + 2.4 = 8.0). Thus this maximum rating of ten might be considered a statistical outlier; that is, the response is far enough from the mean response to be considered an anomaly and could be removed from the data. Table 33 shows the number of
responses received for round one of the Delphi study.

Company A B C D E F G H I J Total # people


Total # of Participants 1 3 2 2 2 1 1 1 2 1 16
Round One Responses 0 1* 1* 2 2 1 1 1 1* 1 11

Table 33- Number of responses to Round one of the Delphi Study

The asterisk [*] denotes responses received which were the response of
more than one individual from that organisation. For example, in company B
three people contributed to one response. The first round was sent out with
32 processes in the list and the collation revealed that an extra 12 processes had been added with ratings applied to them and two further processes
added without a rating, totalling 46. This can be seen in the data analysis
section. (32 on the outgoing Excel sheet + 12 rated and + 2 unrated within
the responses = total 46 processes)

Second Round
With the first round responses collated and analysed, we were able to
provide participants with the second round data. Once again it was in an
Excel worksheet and contained sanitised comments from participants.
Comments were sanitised by removing information that might identify the participant. These comments related to that participant's reason for the rating they provided. In the figure below we have shown an example of the type of
comment which was left in the Excel sheet in the second round responses by
a participant.

Figure 35- Example of comments added by participants

The small red triangles in the top right hand corner of some cells indicate that
there is a comment attached to that cell. By holding the cursor over the cell
the text box shown in the example appears with the comment. In this way
suitably sanitised comments were able to support the ratings provided by
each participant. The addition of comments from the participants is the main
reason why consensus is generally found in Delphi studies (Dalkey and
Helmer 1963; Bowles 1999; Chang, Smythe et al. 2000). Participants are
able to provide anonymous comments as to their reasons for a particular rating within the study and the other participants may learn of a new
perspective on the subject, thus change their opinion and rating.

The outgoing Excel sheet also contained the responses from the first round
though no names were attributed to any responses. In this way each
participant or group of participants was able to assess their new response in
the light of comments, the ratings of other participants and their previous
response. This approach also ensured that participants would not need to
look for their previous response to see what they had said about their rating
for each process. There were now forty-six processes to rate due to those
that participants had added and this was too many for a participant to
remember the rating they had previously given.

The second round ran from the 17th June to the 1st July 2002, giving participants two weeks to complete and return their responses, and the week from the 1st to the 7th July 2002 was used for collation and analysis. Sixteen emails were sent out to sixteen participants and ten responses were returned for the second round of the Delphi study.

Table 34 shows the break-up of responses for round two and compares
these with those for round one.

Company A B C D E F G H I J A-J total


Total # of Participants 1 3 2 2 2 1 1 1 2 1 16
Round One Responses 0 1* 1* 2 2 1 1 1 1* 1 11
Round Two Responses 1 1* 1* 2 1 0 1 1 1* 1 10

Table 34- Number of responses to Round Two of the Delphi Study

As with round one of the Delphi study, Table 34 shows the responses received: ten from a possible sixteen participants. The asterisk * denotes those
responses which were the collective thoughts of more than one person.
Round two saw a reduction in responses by one with company A providing
one response for the first time and company F unable to provide a response.

Round Three
All received responses were collated and analysed by the 7th July 2002 and the research team were able to send out the last round (round three) on the 8th July 2002. We followed the same process as for round two and emailed
all sixteen possible participants with the data in an Excel sheet for this final
round. Each Excel sheet contained the collated results of the second round
as well the anonymous responses and comments of each participant. The
only change to the previous round was that respondents were asked to rate
the process headings as well. Table 35 below contains the process headings
which participants in the third round were asked to rate for criticality.

IT Strategy
Software Management
Application Management
Service Support
Hardware Management
Product Development
Security
Support Processes
Application Support

Table 35- List of process headings for Delphi Study

In Table 35 the headings are taken from the focus group data apart from the
last heading (application support) which was added by one of the participants
in the first round. This heading might fit neatly into the heading ‘application
management’ but was left as a separate heading to avoid the use of sub-
headings, which might add confusion. There were nine further processes
(process headings) to rate in the final round, with a total of fifty-five
processes. There were forty-six processes from the responses to the first
round, with no changes in the second round, plus the nine process headings
added to the third round. The email with an attached excel sheet was sent to
participants on the 8th of July and all responses were received by the 22nd of
July 2002.

With all responses received by the 22nd, even after further reminder emails, there were ten complete responses for the round. Table 36 lists the
number of participants from each of the ten organisations who participated in
the Delphi study and highlights the responses for the third round.

Company A B C D E F G H I J A-J total


Total # of Participants 1 3 2 2 2 1 1 1 2 1 16
Round One Responses 0 1* 1* 2 2 1 1 1 1* 1 11
Round Two Responses 1 1* 1* 2 1 0 1 1 1* 1 10
Round Three Responses 0 1* 1* 0 2 1 1 1 2 1 10

Table 36- The number of participants from each company in the Delphi study

Numbers in the table with an * indicate that more than one person provided
input to the one response received from that organisation. The table shows
that company A and company D did not provide a response in round three.
Company I reversed the trend, by sending two responses, one for each
participant. The result was ten usable responses for the third round from
thirteen participants.

There were no added processes. This left a total of forty six processes and
nine process headings which were rated for criticality. Participants were
thanked for their participation in the study and sent a sanitised set of the
results.

The following section will analyse the data from the Delphi study, which was used to provide an industry-perceived focus of importance for the action learning (case study) phase of this research project.

6.2.4 Data Analysis of the Delphi Study

This section examines the data collected and then collated from each of the
three rounds of the Delphi study. The aim of this analysis was to assess the level of consensus for the collated results of responses for each round and to
assess which processes and process groups are considered most critical to
the participants.

The approach taken in the analysis was to place all the quantitative data into
one Excel sheet at the end of each round of the study. Each process was
then analysed for the mean response, maximum rating applied, minimum
rating applied, standard deviation of all ratings for that process and the number of responses for that process. Once this was achieved the research team were able to position the processes within each of the process headings in order of criticality. The collated and analysed results of each round of the study were then placed into one further Excel sheet and the
changes in standard deviation of the results for each round were viewed in
order to assess the consensus that may or may not have occurred.

Two important parts of the analysis were:


1. The final ratings for each process and the ratings for the functional
headings
2. The amount of agreement or disagreement which occurred in reaching the
final rating of criticality for each process.

The qualitative data (comments by respondents) was not analysed for use in
any other part of the study as this analysis would add little value to the focus
for the case studies.

This section will provide the data for each round separately and the analysis
of that data will follow immediately after the table of data. We will then consolidate the standard deviation data from all three rounds into one table.

Table 37 is the collated data showing some analysis from round one of the
Delphi study. (1 = most critical and 10 = least critical)

Mean Median Min Max Std. Dev. # of Entries


IT Strategy
Business analysis 3.2 2 1 10 2.4 11
Information Systems Planning 3.2 2 1 10 2.6 11
Consulting- Issue Identification,
3.3 3 1 10 2.4 11
Analysis, Presentation
Cost Control 3.7 3 2 10 2.2 11
Software Management
Data Base backup & Delivery 2.5 2 1 6 1.7 11
Cost Control 3.5 4 2 5 1.2 11
Procurement 4.6 5 3 7 1.1 11
Application Management
Performance Monitoring 3.0 3 3 3 0.0 2
Server Monitoring 3.5 3 1 8 2.0 11
Monitoring & Alarms 3.5 3 1 8 2.1 11
Cost Control 3.9 3 2 8 1.7 11
Service Support
Customer Relationship Mgmt. 1.0 1 1 1 0.0 2
Service Level Management 2.5 2 1 6 1.5 8
Change Management 2.5 2 1 7 1.7 11
Implementation Project Mgmt. 2.5 2 1 7 1.8 11
Project Management 2.6 2 1 7 1.6 11
Incidence Management 2.6 3 1 7 1.6 11
Service Level Agreement Mgmt. 2.8 3 1 7 1.7 11
Cost Control 3.7 4 2 7 1.5 11
Commissioning Process
3.9 3 1 7 2.1 11
for new Hardware
Supplier Relationship Mgmt. 4.3 4 2 7 1.5 11
Hardware Management
Upgrade/Refresh 3.0 3 3 3 0.0 1
Cost Control 4.0 4 2 9 1.9 11
Regular Hardware Maintenance 4.6 5 2 9 2.1 11
Procurement 5.2 5 2 9 2.0 11
Product Development
Contract Negotiation 2.9 3 1 5 1.2 9
Architecture/Design 3.0 3 3 3 0.0 1
Solution Development 4.0 4 4 4 0.0 1
Cost Control 4.0 4 2 6 1.3 11
Marketing 7.0 7 7 7 0.0 1
Security
Physical Security 2.0 2 2 2 0.0 1
Operating System Security 2.0 2 2 2 0.0 1
Network Security 3.2 2 1 7 2.2 11
Cost Control 3.9 3 2 7 1.7 11
Support Processes
Business development 3.1 3 1 7 1.5 11
Research, Business Awareness,
3.2 3 1 7 1.6 11
Training & Knowledge Mgmt.
Strategic Planning 3.4 3 1 7 1.6 11
Human resource planning 3.9 3 2 7 1.6 11
Internal IT Services 4.0 4 2 7 1.4 11
Timely payment to providers 4.3 4 2 8 1.9 11
Billing
Application Support
Application Security 1.0 1 1 1 0.0 1
IS System Configuration 2.0 2 2 2 0.0 1
Service Desk Support 2.0 2 2 2 0.0 1
Programming Development 3.0 3 3 3 0.0 1
System Implementation

Table 37- Collated responses for Round one of Delphi Study

Of the forty six processes within the study, only eight resulted in a mean
rating of less than three. (1= most critical and 10 least critical) Table 38
below lists the eight processes with a resultant mean of less than three.

Mean Median Min Max Std. Dev. # of Entries


Software Management
Data Base backup & Delivery 2.5 2 1 6 1.7 11
Service Support
Service Level Management 2.5 2 1 6 1.5 8
Change Management 2.5 2 1 7 1.7 11
Implementation Project Mgmt. 2.5 2 1 7 1.8 11
Project Management 2.6 2 1 7 1.6 11
Incidence Management 2.6 3 1 7 1.6 11
Service Level Agreement Mgmt. 2.8 3 1 7 1.7 11
Product Development
Contract Negotiation 2.9 3 1 5 1.2 9

Table 38- Process from Round one with resulting mean of less than three

Of these eight, six are under the heading of 'Service Support'. It is possible that the result for service level management, the process second from the top, which received only eight responses, might have been different if all eleven responses had been usable; some responses did not provide a rating for these processes and were thus unusable. At the other end of the scale, using the mean result for each process, there was only one process with a rating greater than five (1 = most critical and 10 = least critical) and seven between four and five. This can be seen in Table 39.

Mean Median Min Max Std. Dev. # of Entries


Software Management
Procurement 4.6 5 3 7 1.1 11
Service Support
Supplier Relationship Mgmt. 4.3 4 2 7 1.5 11
Hardware Management
Cost Control 4.0 4 2 9 1.9 11
Regular Hardware Maintenance 4.6 5 2 9 2.1 11
Procurement 5.2 5 2 9 2.0 11
Product Development
Cost Control 4.0 4 2 6 1.3 11
Support Processes
Internal IT Services 4.0 4 2 7 1.4 11
Timely payment to providers 4.3 4 2 8 1.9 11

Table 39-Processes with a resultant mean of greater than four for criticality

Looking at the maximum rating applied by a participant, we can see in Table 37 that fourteen ratings were more than two standard deviations from the mean. This caused some concern to the research team as these ratings might be considered anomalies within the data, so the participant concerned was contacted to ensure that there had not been some confusion over the rating process. For example, one participant had applied a rating of ten thinking that it was the most critical end of the scale (1 = most critical and 10 = least critical), which was not the case. Statistically, the removal of this data from the study changed the results only minimally.

The second round of the study provided ratings for all forty six processes with
ten usable responses. Table 40 represents the results of assessing the mean
response, maximum rating applied by a participant, minimum rating applied,
standard deviation of all ratings for that process and the number of
responses for that process. In addition, the placement of processes within
their process headings is in order of criticality. Those processes that are thought by the participants to be most critical are positioned at the top of their functional area and those least critical at the bottom.
Mean Median Min Max Std. Dev. # of Entries
IT Strategy
Information Systems Planning 3.0 3 2 5 0.77 10
Consulting- Issue Identification,
3.0 3 1 5 1.26 10
Analysis, Presentation
Business analysis 3.1 3 1 6 1.37 10
Cost Control 3.5 4 1 5 1.20 10
Software Management
Data Base backup & Delivery 1.9 2 1 4 0.87 9
Cost Control 3.3 3.5 1 5 1.10 10
Procurement 5.0 5 4 7 0.89 10
Application Management
Server Monitoring 2.6 3 1 4 0.92 10
Monitoring & Alarms 2.8 3 1 4 0.98 10
Performance Monitoring 3.0 3 2 5 0.89 10
Cost Control 3.6 4 2 5 0.80 10
Service Support
Customer relationship Management 1.7 1 1 5 1.27 10
Incidence Management 2.0 2 1 3 0.77 10
Service Level Agreement Management 2.4 3 1 3 0.80 10
Service Level Management 2.5 3 1 4 0.92 10
Implementation Project Management 2.6 2 2 4 0.80 10
Change Management 2.6 2 1 7 1.71 9
Project Management 2.8 3 2 4 0.60 10
Cost Control 3.4 3 2 5 1.07 9
Commissioning Process for
4.0 4 3 5 0.77 10
new Hardware
Supplier Relationship Management 4.1 4 3 5 0.70 10
Hardware Management
Upgrade/Refresh 3.5 3 3 5 0.81 10
Cost Control 3.6 4 2 5 0.80 10
Procurement 4.2 5 2 5 1.17 10
Regular Hardware Maintenance 4.3 5 1 6 1.49 10
Product Development
Architecture/Design 3.4 3 3 5 0.66 10
Contract Negotiation 3.5 3 3 5 0.67 10
Solution Development 3.6 4 2 5 0.80 10
Cost Control 3.9 4 2 5 0.83 10
Marketing 5.0 5 2 7 1.33 9
Security
Physical Security 1.9 2 1 3 0.54 10
Operating System Security 2.0 2 1 3 0.63 10
Network Security 2.0 2 1 3 0.89 10
Cost Control 3.8 4 3 5 0.60 10
Support Processes
Billing 2.6 3 1 4 1.18 7
Strategic Planning 3.1 3 2 5 0.70 10
Research, Business Awareness,
3.2 3 1 6 1.33 10
Training & Knowledge Mgmt.
Timely payment to providers 3.9 4 1 8 1.81 10
Internal IT Services 4.1 4 3 6 0.83 10
Human Resource Planning 4.1 4 3 7 1.04 10
Business Development 4.4 3.5 2 10 2.29 10
Application Support
Application Security 1.8 1.5 1 5 1.17 10
IS System Configuration 2.2 2 1 4 0.75 10
Service Desk Support 2.5 2 1 5 1.12 10
System Implementation 3.0 3 2 5 1.00 6
Programming Development 4.0 3 3 7 1.33 9

Table 40- Round two collated and analysed results

The same analysis was carried out for this second round of data as for the first round. Table 40 provides the collated and analysed
results of the second round of the Delphi study. The number of processes
that were rated below three on the Likert scale went from eight to seventeen.

Table 41 shows the processes in the second round which had a mean
criticality rating of less than three.

Mean Median Min Max Std. Dev. # of Entries


Software Management
Data Base backup & Delivery 1.9 2 1 4 0.87 9
Application Management
Server Monitoring 2.6 3 1 4 0.92 10
Monitoring & Alarms 2.8 3 1 4 0.98 10
Service Support
Customer relationship Management 1.7 1 1 5 1.27 10
Incidence Management 2.0 2 1 3 0.77 10
Service Level Agreement Management 2.4 3 1 3 0.80 10
Service Level Management 2.5 3 1 4 0.92 10
Implementation Project Management 2.6 2 2 4 0.80 10
Change Management 2.6 2 1 7 1.71 9
Project Management 2.8 3 2 4 0.60 10
Security
Physical Security 1.9 2 1 3 0.54 10
Operating System Security 2.0 2 1 3 0.63 10
Network Security 2.0 2 1 3 0.89 10
Support Processes
Billing 2.6 3 1 4 1.18 7
Application Support
Application Security 1.8 1.5 1 5 1.17 10
IS System Configuration 2.2 2 1 4 0.75 10
Service Desk Support 2.5 2 1 5 1.12 10

Table 41- Process rated less than three for the second round of the Delphi study

Within Table 41 there are four processes highlighted in green, which are the
first processes to be given a mean rating of less than two. There are now
also seventeen processes with a mean criticality rating of less than three, of
which seven are within the Service Support functional area.

Table 42 provides a comparison of the service support functional area results from round one with those from round two.
Mean Median Min Max Std. Dev. # of Entries
Service Support
Service Level Management 2.5 2 1 6 1.5 8
Change Management 2.5 2 1 7 1.7 11
Implementation Project Mgmt. 2.5 2 1 7 1.8 11
Project Management 2.6 2 1 7 1.6 11
Incidence Management 2.6 3 1 7 1.6 11
Service Level Agreement Mgmt. 2.8 3 1 7 1.7 11

Service Support
Customer Relationship Management 1.7 1 1 5 1.3 10
Incidence Management 2.0 2 1 3 0.8 10
Service Level Agreement Mgmt. 2.4 3 1 3 0.8 10
Service Level Management 2.5 3 1 4 0.9 10
Implementation Project Management 2.6 2 2 4 0.8 10
Change Management 2.6 2 1 7 1.7 9
Project Management 2.8 3 2 4 0.6 10

Table 42- Comparison of Service Support Processes from R1 to R2

Comparing the standard deviations (SD) for both rounds, it can be seen that in the second round most (around 70%) of the service support processes had a standard deviation below one. Of the two with standard deviations over one, change management has the same standard deviation in round two as in round one and is only 0.1 less critical in round two than it was in round one. The second process, customer relationship management, was not part of the round one comparison and so cannot be compared. From this information we can infer that while the criticality of these processes has not changed substantially from round one to round two, the amount of consensus for the result is becoming greater.

In this round (round two) the functional area of security also came to be considered the most critical area. This was calculated as the average of the mean ratings for the processes within each functional area. Security had a mean rating of 2.4 against 2.7 for application support and 2.8 for service support.
(1 = most critical and 10 = least critical)

Mean
Security 2.4
Application Support 2.7
Service Support 2.8
Application Management 3.0
IT Strategy 3.2
Software Management 3.4
Support Processes 3.6
Product Development 3.9
Hardware Management 3.9

Table 43-List of functional processes in order of criticality for R2
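
A minimal sketch of the calculation behind Table 43, assuming a plain arithmetic mean of the process means within each area (the document describes it only as the average mean rating); the figures used below are the round two process means for two of the areas, taken from Table 40.

# Sketch: derive a functional-area criticality rating as the average of the
# mean ratings of the processes grouped under that area (round two figures
# for two areas, from Table 40; lower numbers mean more critical).
process_means = {
    "Security": {
        "Physical Security": 1.9,
        "Operating System Security": 2.0,
        "Network Security": 2.0,
        "Cost Control": 3.8,
    },
    "Application Support": {
        "Application Security": 1.8,
        "IS System Configuration": 2.2,
        "Service Desk Support": 2.5,
        "System Implementation": 3.0,
        "Programming Development": 4.0,
    },
}

area_ratings = {
    area: sum(means.values()) / len(means)
    for area, means in process_means.items()
}

for area, rating in sorted(area_ratings.items(), key=lambda item: item[1]):
    print(f"{area}: {rating:.1f}")   # Security: 2.4, Application Support: 2.7

These two results reproduce the 2.4 and 2.7 shown for Security and Application Support in Table 43, which supports the assumption that a simple arithmetic mean was used.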

All other functional areas had mean scores between three and three point
nine. Table 43 above contains the entire list of functional areas and their
criticality rating.

Overall, for all processes in the second round there were no maximum ratings which fell outside two standard deviations above the mean (2 x SD + mean), which suggests that there are no anomalies of this type in round two. In addition, no mean rating was larger than five. It could then be said that the average participant
in the ASP industry considers all processes in this study to be in the lower
portion of the Likert scale of 1= most critical and 10 = least critical.

The data for the third round is found in Table 44 and is listed with the most
critical at the top and least critical at the bottom.

Mean Median Min Max Std. Dev. # of Entries


Service Support 2.0 2 1 3 0.89 5
Incidence Management 1.9 2 1 3 0.83 10
Service Level Agreement
2.3 2 1 4 0.82 10
Mgmt.
Customer Relationship Mgmt. 2.4 2 1 7 2.01 10
Change Management 2.5 2 1 8 2.03 10
Service Level Management 2.7 3 1 5 1.15 10
Implementation Project Mgmt. 3.0 3 2 4 0.67 10
Project Management 3.1 3 2 4 0.57 10
Cost Control 3.4 4 2 5 1.07 10
Supplier Relationship Mgmt. 4.1 4 4 5 0.31 10
Commissioning Process for
4.2 4 4 5 0.42 10
new Hardware
Security 2.3 2 1 4 0.98 5
Network Security 2.0 2 1 4 1.05 10
Physical Security 2.1 2 1 4 0.74 10
Operating System Security 2.2 2 1 4 0.79 10
Cost Control 3.7 4 2 5 0.82 10
Application Support 2.8 2 1 5 1.47 5
Application Security 1.8 2 1 4 0.92 10
IS System Configuration 2.8 3 2 4 0.79 10
Service Desk Support 2.9 2 2 6 1.45 10
System Implementation 3.4 3 2 7 1.50 10
Programming Development 4.3 4 3 6 0.94 10
Software Management 3.2 3 2 5 0.98 5
Data Base backup & Delivery 2.4 2 1 6 1.50 10
Cost Control 3.4 4 2 5 0.96 10
Procurement 5.1 5 4 6 0.74 10
IT Strategy 3.4 4 1 5 1.62 5
Information Systems Planning 3.4 3 2 5 1.26 10
Consulting- Issue Identification,
3.4 3 2 6 1.42 10
Analysis, Presentation
Cost Control 3.6 4 2 5 1.26 10
Business analysis 3.6 3 1 7 1.77 10
Support Processes 3.8 4 3 5 0.75 5
Billing 3.0 3 1 5 1.29 10
Research, Business Awareness,
3.4 3 1 6 1.42 10
Training & Knowledge Mgmt.
Strategic Planning 3.6 3 3 6 1.07 10
Human Resource Planning 4.1 4 3 7 1.20 10
Internal IT Services 4.7 5 3 6 0.94 10
Timely payment to providers 4.8 4 3 8 1.55 10
Business Development 5.1 4 3 10 2.28 10
Application Management 3.8 3 2 7 1.72 5
Server Monitoring 3.2 3 1 7 1.55 10
Monitoring & Alarms 3.3 3 1 7 1.56 10
Cost Control 3.8 4 2 5 0.92 10
Performance Monitoring 3.8 3 3 7 1.31 10
Hardware Management 4.6 5 4 5 0.49 5
Cost Control 3.6 4 2 5 0.96 10
Upgrade/Refresh 4.1 4 3 5 0.87 10
Procurement 4.7 5 4 5 0.47 10
Regular Hardware Maintenance 4.8 5 4 6 0.63 10
Product Development 5.0 5 3 8 1.67 5
Cost Control 3.9 4 2 5 0.87 10
Contract Negotiation 3.9 4 3 5 0.87 10
Solution Development 3.9 4 3 5 0.57 10
Architecture/Design 4.0 4 3 5 0.82 10
Marketing 5.3 5 4 7 0.94 10

Table 44- Collated and analysed data from the third round of the Delphi study

The data in this table is ordered by the criticality of the functional areas; Service Support and Security are the most critical functional areas. Within each functional area, the processes themselves are also positioned so that processes considered most critical by mean rating are at the top of the functional area.

The reader will also notice that the number of responses for the functional
areas is five while the processes received ten usable responses. Some
participants may not have correctly read the instructions for this last round of
the Delphi Study and not given a rating to the functional areas. The top four
functional areas, of which three were the same as in round two (Service
Support, Security and Application Support), hold all the processes which
have a mean rating of less than three. This is seen in Table 45 below; the most critical functional area by mean rating, service support, has a mean rating of 2.0. This functional area has a very low standard deviation of 0.89 and raw ratings with a minimum value of one and a maximum value of three. The small dispersal of the ratings shown by the minimum and maximum also indicates good consensus among the participants.

Mean Median Min Max Std. Dev. # of Entries


Service Support 2.0 2 1 3 0.89 5
Incidence Management 1.9 2 1 3 0.83 10
Service Level Agreement Mgmt. 2.3 2 1 4 0.82 10
Customer Relationship Mgmt. 2.4 2 1 7 2.01 10
Change Management 2.5 2 1 8 2.03 10
Service Level Management 2.7 3 1 5 1.15 10
Security 2.3 2 1 4 0.98 5
Network Security 2.0 2 1 4 1.05 10
Physical Security 2.1 2 1 4 0.74 10
Operating System Security 2.2 2 1 4 0.79 10
Application Support 2.8 2 1 5 1.47 5
Application Security 1.8 2 1 4 0.92 10
IS System Configuration 2.8 3 2 4 0.79 10
Service Desk Support 2.9 2 2 6 1.45 10
Software Management 3.2 3 2 5 0.98 5
Data Base backup & Delivery 2.4 2 1 6 1.50 10

Table 45- List of most critical processes in R3 by mean rating

Table 45 also shows that for round three the number of processes (excluding functional areas) with a criticality rating of less than three was twelve. This is a reduction from round two, where there were seventeen processes with a mean rating below three (17 in R2 – 12 in R3). The
reduction may indicate that some participants were still unsure of how critical
some processes were. In Figure 36 we have provided a comparison of the
functional area of service support for all three rounds of the Delphi study.

Comparison of SD over three rounds of Delphi Study

[Line chart: the standard deviation (0 to 2.5) of the ratings for each service support process, plotted separately for each of the three rounds of the Delphi study.]

Figure 36- Comparison of SD for service support for the 3 rounds of the Delphi study


This chart shows three separate lines with diamond markers indicating the standard deviation (SD) for each of the processes listed beneath them. The separate lines represent the three rounds of the Delphi study, with the first round on the left and the third on the right. There is little similarity between the lines for rounds one and two, but the lines for rounds two and three have a similar shape. The similar shape indicates that the standard deviations of these processes changed little from round two to round three, which in turn suggests that consensus was becoming greater for these processes.

Service level management and change management, though, showed an increase in SD, that is, a reduction in consensus among the participants. Change management and service level management were also found to have a maximum rating greater than the mean rating plus two standard deviations. This indicates that each of these processes received a rating which could be considered an anomaly or statistical outlier and discarded. Discarding that rating would also lead to greater consensus among the remaining results.
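The outlier rule described above can be expressed as a short calculation. The following is a minimal sketch of that rule in Python, assuming an illustrative list of ratings rather than the actual round-three responses; the function name and data are hypothetical.

from statistics import mean, pstdev

def flag_outliers(ratings):
    # Flag ratings greater than the mean plus two standard deviations,
    # the rule used above to identify anomalous Delphi responses.
    m = mean(ratings)
    sd = pstdev(ratings)  # population standard deviation
    threshold = m + 2 * sd
    return [r for r in ratings if r > threshold], threshold

# Illustrative ratings only (1 = most critical, 10 = least critical)
change_management = [2, 2, 1, 2, 3, 2, 1, 2, 2, 8]
outliers, cutoff = flag_outliers(change_management)
print(f"cutoff = {cutoff:.2f}, outliers = {outliers}")  # only the rating of 8 is flagged

Under these illustrative numbers the cutoff is roughly 6.3, so only the rating of eight is flagged; removing it would reduce the standard deviation of the remaining ratings, as noted above.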
A comparison of the mean criticality of the functional areas in the Delphi study is shown in Figure 37. This chart shows that the functional areas of service support and security, developed in the initial focus group, were given the lowest mean ratings (1 = most critical and 10 = least critical).

Criticality of Functional Areas

[Bar chart: mean criticality rating of each of the nine functional areas (1 = most critical, 10 = least critical), from Service Support at 2.0 to Product Development at 5.0; the individual values are listed below.]

Figure 37-Rating results for the process headings from the Delphi Study


The responses shown in Figure 37 provided the research team with a mean
rating of nine functional areas. These functional areas from most critical to
least critical are:
1. Service Support- 2.0
2. Security- 2.3
3. Application Support- 2.8
4. Software Management- 3.2
5. IT Strategy- 3.4
6. Support Processes- 3.8
7. Application Management- 3.8
8. Hardware Management- 4.6
9. Product Development- 5.0

Through all three rounds of the Delphi study, the same three functional areas have remained in the top three, although their order has varied. The third round was the only round in which participants were asked to rate the functional areas directly. Although this may appear to weaken the evidence, it is supported by the round two analysis, in which the mean ratings of the processes within each functional area were indicative of the result shown for the third round.

Delphi Study Summary


This section of the chapter has reported on the collation and analysis of the
data from the Delphi study. The list of processes and the definition of a
critical process taken from the focus group session were used as the basis of
the first of three rounds of the study. The number of processes increased
from twenty seven processes and eight functional areas to a total of forty six
processes and nine functional areas at the end of the three round Delphi
study.

The aim of the study was to identify which functional area and which
processes were considered by the ASP industry to be the most critical. This
aim was achieved as we are now able to report that the functional area of


service support and at least five of the processes within this functional area
are considered by the participants as the most critical to the ASP industry in
Australia.

The functional area of service support would become the focal area for the
case studies, which are used to test the efficacy of the proposed
methodology.

6.3 Identification of Critical Processes Summary

The research project needed to


1. Define critical processes as a generic term
2. Identify critical processes with the ASP industry
3. Provide a rank order of these critical processes and functional areas
4. Identify the most critical functional area and most critical processes

The focus group has provided the research team with data on the extent of
the focus group participants’ operations within the ASP industry. The major
output of the focus group was a list of processes for the first round of the
Delphi study.

In order to define the critical processes and obtain a draft list of critical
processes and functional areas in ASP, a focus group session was held with
participants from major ASP organisations. Participants defined critical processes as those processes which have the greatest effect on the attainment of Corporate Strategic Goals. The next phase of the
study was to test the generalisability of the list of processes and functional
areas in a wider context than that available in the focus group. A Delphi study
was undertaken with more participant organisations. The purpose of this
Delphi study was to identify the most critical processes and select a
functional area for testing the proposed methodology.


Service support and Security are the functional areas which appear to be the
most critical as perceived by the participants and were selected as the focus
of the first two case studies.

The responses to the study comprised both quantitative and qualitative data. The quantitative responses, ratings on the ten point Likert scale (1 = most critical and 10 = least critical), were collated to calculate, for each process, the mean rating, the maximum rating, the minimum rating and the standard deviation of the ratings. The qualitative responses were collated to find common themes, sanitised and resent in the following rounds. The Delphi study went through three rounds to achieve a consensus of mean ratings in over 50% (24) of all processes, with only six responses revealing a growing disparity in ratings. Collection of the data was achieved by importing all responses into one Excel document.
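The collation described above (mean, median, minimum, maximum and standard deviation for each process) could equally be reproduced with a short script. The sketch below uses the pandas library and purely hypothetical process names and ratings; in the study itself the responses were collated in Excel.

import pandas as pd

# Hypothetical ratings (1 = most critical, 10 = least critical); one row per participant
responses = pd.DataFrame({
    "Incidence Management":          [2, 1, 2, 3, 2, 1, 2, 2, 2, 2],
    "Service Level Agreement Mgmt.": [2, 3, 2, 1, 2, 3, 2, 4, 2, 2],
    "Change Management":             [2, 2, 1, 2, 3, 2, 1, 2, 2, 8],
})

# One summary row per process, mirroring the columns of Tables 44 and 45
summary = responses.agg(["mean", "median", "min", "max", "std", "count"]).T
summary.columns = ["Mean", "Median", "Min", "Max", "Std. Dev.", "# of Entries"]
print(summary.sort_values("Mean"))  # most critical (lowest mean) processes first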

The result of the collation and analysis of the Delphi study data is that the
research team is able to identify a functional area which is considered most
critical to the ASP industry. This area is that of Service Support and includes
the processes:
1. Incidence Management
2. Service Level Agreement Management
3. Customer Relationship Management
4. Change Management
5. Service Level Management.

The first two case studies in this research project used the functional area of
service support as their focus, ensuring that the case studies provided useful
business benefits to each participant. The processes identified by the Delphi
study as most critical were used by the associated research project as the
processes chosen to develop detailed reference models.

The next chapter reports on the action learning using case studies. The
targeting methodology was tested in four settings using the four phases of


action learning: implementation (action), observation, reflection and revision. The chapter starts with an examination of the pilot study and progresses through three further case studies.


7 Action Learning Using Case Studies

This chapter describes the testing and refinement of the targeting methodology. We used the action learning methodology, which incorporates four phases: action, observation, reflection and revision (Bunning 1993).

[Diagram of the action learning cycle: Action, Observation, Reflection and Revision.]

Figure 38-The four phases of Action Learning

This iterative cycle of action learning (shown in Figure 38) was applied over
a series of case studies including an initial pilot study. The focus of the
iterative study was to validate and revise the targeting method. Each cycle of
action learning provided a phase of reflection and revision in which the
targeting methodology was revised in the light of experience and collected
data.

The research team implemented the targeting method in the action phase
and collected data on the implementation as part of the observation phase.
The two further phases of reflection and revision were conducted within the
research team (Kolb 1984; Bunning 1993; McGill and Beaty 2001). The
targeting methodology was tested using this four phase action learning
process in which case studies were used as the phases of action and
observation (Kolb 1984). The action learning went through four cycles with a
pilot case study and three further case studies.

Figure 39 on the next page describes the cyclic nature of the action learning.


[Figure: Action Learning Cycles. Using a pilot case study and three case studies (PS, C1, C2, C3), with four cycles of action, observation, reflection and revision.]

Figure 39-Description of the Action learning cycles using case studies

The action learning is iterative in that the four phases of one cycle of action learning are followed by three further cycles. Internal cycles of action learning also occurred within each study, both for the participants and for the research team; that is, there were phases of reflection and revision that occurred during some of the case studies.

The approach taken in this research project was to implement the targeting methodology in each of the case study participant organisations (the action/implementation phase) and, during the implementation, observe the issues arising. The initial implementation followed the outline provided by the steps of the targeting methodology shown in chapter 4. The output from each case study was a revised methodology, informed by interviews and survey questions with the participants. The case studies used the following five questions as part of the data collection.

a) Are the results of the process what you expected?


b) Are the results of the process useful to you as a manager?
c) Are the results of the process useful to you as a company?
d) Was there anything in the identification process which you didn't
understand or had difficulty with?
e) Are you able to suggest any improvements to the way the process
was implemented?


The reflection and revision phases occurred internally within the research
team. Meetings were conducted during and at the end of each case study to
discuss:
1. What went well and why?
2. What went badly and why?
3. What didn’t appear to be needed?
4. Suggestions made by the participants or the research team
5. What modifications to make?

The reflection process was to ask all five questions of the research team’s experience during the case study. Once comments were gathered and conclusions drawn, changes were made to the methodology for the next implementation as the revision phase. We were able to identify the issues
that arose during the implementation and adjust the targeting method to take
account of these changes. Within this chapter the revisions are highlighted at
the end of each cycle. The grey highlighting indicates the changes made to
the text based model of the targeting method. The reflection and revision
phases were the improvement phases of the action learning process.

7.1 Case Study Participants

The participants for the case studies are a pathology company called PS,
Computer Science Corporation (CSC) (an international outsourcing
company), REALTECH Australia (a subsidiary of the international company
REALTECH AG) and Citec (a Queensland government owned commercial
entity).

In order to ensure that the results of the research project were generalisable
the research team aimed to select companies from within the ASP service
delivery environment for the case studies, apart from the pilot study
participant. These companies needed to have operations within Australia so that generalisability of the results was possible with the limited


number of case studies undertaken. The population used for the case studies
was considered too small to attempt to provide generalisability for a greater
geographic area. We intended to seek a mix of government and private
organisations and this was achieved. CSC and REALTECH are private
companies in Australia (no shares traded on the Australian stock-market) and
Citec is a wholly government owned company that operates as a commercial
entity. Citec is the ninth largest outsourcing company in Australia by
revenue (Benson 2002). CSC and REALTECH also extend the
generalisability of the results as CSC is a large outsourcing company (3rd
largest by revenue in Aust.) and REALTECH a relatively smaller niche player
in the Australian market (Benson 2002).

This section briefly summarises the organisations and the participants for the
case studies.

1. PS, a pathology company of international reputation with state-wide facilities. We do not describe this company in detail, to ensure that its confidentiality is maintained. While the industry (pathology) is not the one on which the research was focused, this was not considered inappropriate. The focus of the study is to generalise the targeting method
and not identify the strategies and objectives of a balanced scorecard (BSC)
in application hosting and application service provision. Thus, any
organisation intending to improve processes was considered to be an
appropriate partner. The main reason for selection of this company for the
pilot study was that they had already developed a corporate BSC and were
interested in process improvement. The research team anticipated that there
would be fewer BSC issues and thus we could focus on other aspects of the
targeting method.

The pilot case study was conducted with four participants: the Logistics
Project Manager, Manager Operations, Assistant Manager Operations and
the manager of the department PS1.


2. CSC (Computer Science Corporation) is a private company and the third largest outsourcing provider in Australia, with annual revenue in excess of AUD 1.16 billion. It is worthwhile noting that the top three outsourcers in Australia share 76% of the market revenue. CSC has more than 4500 employees in Australia, almost all of whom work in the outsourcing services industry (Benson 2002). IDC report that CSC operates across the banking and financial services, government and manufacturing industries. Within these it provides services classified by IDC as Systems Integration (SI) 29%, Information Systems Outsourcing 69%, and Network Integration & Management 3% (Benson 2002). (See Appendix 5 for full definitions)

Within CSC, some management staff manage the operations of a single customer account if that account is sufficiently large. BHP-Billiton is Australia’s largest mining company, and it is the service support processes provided by CSC for this customer that are the focus of this case study. The case study was conducted in Brisbane with two participants, Nigel Hillier and Mark Harris, the national Account Manager for BHP-Billiton and the Applications Manager for BHP-Billiton respectively.

3. REALTECH Australia, a wholly owned subsidiary of the international company REALTECH AG, is a niche player in the outsourcing industry. REALTECH Australia is the industry sponsor for this research project. REALTECH has its corporate headquarters in Walldorf, Germany, and is publicly listed on the Neuer Markt in Frankfurt, Germany. The company was started in 1994 and listed in Germany in 1999. It has four offices in Germany, with subsidiaries in Australia, the United States, Italy, New Zealand, Singapore, Spain and the United Kingdom. REALTECH specialises in technology consulting focussed on the improvement and maintenance of SAP R/3 applications and supporting hardware. The company also develops application and system management software. REALTECH provides an ASP service as a ‘Remote Services’ product to clients using SAP products, to monitor the operating system and hardware. The


remote service product has clients which are also ASP service providers as
well as companies supporting their SAP application in-house.

This case study was conducted with two participants Wayne Baker,
Managing Director Australasia and Helena Mendes, Senior IT Consultant and
Remote Services Manager.

4. Citec is a commercialised, wholly Queensland government owned company and the ninth largest outsourcer in Australia (Benson 2002). The corporate headquarters of Citec are in Brisbane, Australia, and the company was established by the Queensland Government State Treasury in 1964 as a data processing unit. In 1976 its role expanded to the management of major state government software applications. Citec became a commercialised entity in 1992 and now has both government and company clients, with a mixture of 57% government and 43% public clients. Citec now has offices in Brisbane, Sydney, Melbourne, Canberra and Adelaide.

In 2001 Citec had revenue of AUD120 million and staff in excess of 700
people. Their revenue was from IS Outsourcing (51%), Network and Desktop
Outsourcing (12%), Content Delivery (32%) and Processing Services (5%).
The major service offerings are integrated infrastructure management
(Connectivity, hosting, managed networks and desktop, server and storage
management services), e-Business solutions (e-Commerce, e-Integration,
consulting and call centre /helpdesk services) and Application outsourcing
(Payroll, disbursement, cheque reconciliation, SAP R/3, Database
administration) (Benson 2002).

The participants for this case study were Peter Marshall, Manager Service
Strategies and Terence Collins, Strategy Development Manager.


The timing of the case studies varied with the size of the individual project
and the availability of the participants. Table 46 highlights the start and finish
times for each of the four case studies.

Company Start Date Finish Date Duration


PS 4th April 2002 3rd June 2002 9 weeks

CSC 29th July 2002 30th Aug. 2002 5 weeks

REALTECH 13th Sept. 2002 29th Sept. 2002 2 weeks

Citec 15th Oct. 2002 10th Dec. 2002 8 weeks

Table 46- Timings for the case studies

The next section describes the first cycle of action learning in the pilot study.


7.2 Cycle one of the Action Learning PS C1 C2 C3

A pilot case study was utilised to explore the issues surrounding the
implementation of the methodology and was cycle one of the action learning
process.

The following section contains the background to the pilot study, the purpose
of the pilot study, implementation of the targeting method, observation and
reflection and then the revision, shown in version 2 of the targeting
methodology.

7.2.1 Background

When PS agreed to participate in this pilot case study, the research team
was aware of their need to improve and change processes within one
department. PS had previously completed considerable work on developing a
corporate balanced scorecard with an external consultant. It was for this
reason that PS was considered to be an ideal candidate for a pilot study as
most of the work of implementing the BSC had already occurred.

7.2.2 Purpose and Business Problem

The purpose of this pilot case study for the participant was to populate the
existing corporate BSC down to the level of a specific department, showing
the cause & effect linkages. This department was in the process of making
changes to the way they operated and were thus very interested in identifying
critical processes. The organisation also believed that the benefits of this study would extend beyond identifying those processes which needed to be improved or changed; for example, acting as a catalyst for change


within the organisation by providing a logical approach to process improvement and further change.

PS had identified a business problem: determining which areas of a particular department it should focus on to provide overall improvements for the organisation. The complexity of this decision was increased by considerable restructuring throughout the organisation at the time.

The pilot case study was started in early April 2002 as it did not need the
output of the focus group and Delphi study. The final communication and end
of the pilot case study was received in late April 2002, some three weeks
later.

7.2.3 Proposed Approach

Initially, the approach proposed was to use a limited number of unstructured interviews with the participants to determine strategies and objectives. It was
considered that this organisation would have reasonable in-house knowledge
of the balanced scorecard due to the previous development of their corporate
BSC. The results of these interviews would be emailed back to participants
for their verification and ultimate agreement.

In this way, the balanced scorecard would be populated down to the level of
a department named PS1. Once the balanced scorecard was populated
down to this level it would then be the task of the PS1 manager to provide the
assessments needed to complete the targeting methodology; that is, to
identify those processes to be included in the project and to assess the
impact of these processes on the internal process objectives, the rating for
dependency of the process and rating for the probability of failure of the
process. The selection target, of which critical processes to improve, could
be accomplished by senior management. A cost/benefit analysis would be


performed on those processes with the greatest criticality. Processes with a positive cost/benefit analysis would then be subjected to a probability of
successful improvement analysis in order to provide a rank order of
processes to improve. Presentation of results to the organisation would be
the final activity. This would either be by email or a formal presentation to the
managers involved dependent on time available.

7.2.4 Actual Approach and Observation

A letter outlining the benefits of the pilot study was forwarded to the CEO of
PS who passed this on to the Logistics Project Manager. The Logistics
Project Manager was the point of contact for this pilot case study. The
research team was contacted by email and a meeting was held to discuss
the time involved, benefits to PS, and other resources which might be
required. Following the first meeting, the Logistics Project Manager, and the
Manager Operations discussed the process and consent was given to start
the project.

A further meeting with the Manager Operations, the Logistics Project Manager and the Assistant Manager Operations was held to outline their
input and time required. At this point the research team had been given a
copy of the existing corporate BSC. The next step taken by the research
team was to reconfigure this text based scorecard into a ‘map’ of goals,
strategies, objectives and cause and effect linkages. The original text based
document (a table) provided the goals, strategies and objectives though there
were no cause & effect linkages shown or discussed. The new ‘map’ of the
BSC (Figure 40), developed by the research team, also showed cause &
effect linkages which the research team had added. This was emailed back
to the Logistics Project Manager for distribution and verification.


Figure 40- Section of 'Map' developed by research team showing the 'arrows'

Figure 40 is a section of a larger BSC ‘map’ in which the research team had
drawn cause & effect linkages. The arrows show these cause & effect
linkages and the section shows strategies in blue and objectives in black text.

It was at this point that some difficulties arose with the study partners
concerning the BSC ‘map’ (shown in Figure 40), namely, “problems
understanding all the arrows”. The Logistics Project Manager and Manager
Operations of PS related that they had spent “some” time attempting to
understand the reconfigured ‘map’, resulting in no increase in understanding.
The Logistics Project Manager stated that they (PS) understood the links in
their own version of the BSC even though these were not visually shown.
After one email and a telephone call from the Logistics Project Manager it
was decided to only provide the “key steps” section (objectives without the
cause & effect linkages) of the BSC to the Assistant Manager Operations and
the manager of the department PS1.

We then developed a ‘map’ containing only the objectives of the corporate BSC and showing no cause and effect linkages, as per the Logistics Project
Manager’s request. Included in the documentation, we provided information
on the need for cause & effect linkages to be able to use the targeting
method. Following our response, we received an email in the last week of


April 2002 explaining that due to staff holidays and a resignation, the project
would have to be put on hold for at least one month. The research team
made contact again in early June 2002 to attempt further completion but was
given reasons for discontinuing the project. As far as this research project
was concerned that was the conclusion of the pilot case study.

Observations from the pilot case study took the form of organisational
specific observation and generic observations;

Organisational Specific Implementation Observations


The participants were not able to make available more than 4 hours contact
time over the period of a month. In addition to this, when it became apparent
that the research team could not adequately understand the cause & effect
linkages of the participants’ corporate BSC, there was a perceived lack of
interest in putting aside further time for the research team to explain their
approach to cause & effect. The research team was unfamiliar with the
corporate structure of the company and lacked sufficient awareness of the
politics surrounding the existing BSC. The research team’s attempt to change
the existing BSC (to visually show cause & effect) was outside the original
intended scope and this led to further problems.

Generic Observations
The level of understanding of the BSC method is difficult to gauge and this
may also affect the approach to developing and implementing the scorecard.
It was taken for granted that the participants were familiar with all aspects of
the BSC methodology as they had developed one. The use of cause and
effect links was an issue in the study. These are developed by heuristics and
are thus open to individual interpretation and may ultimately mean there are
no ‘wrong’ links only that some are more appropriate than others.

The next section describes the learning of the research team in the reflection
phase.


7.2.5 Reflection Phase Cycle One

Key learning from this pilot study was categorised organisational specific and
generic and is detailed below;

Organisational Specific Learning


Insufficient time was sought to ensure that all participants understood the
output required from the BSC; that is, the cause and effect linkages. The
amount of time available may have affected the willingness shown by the
participants.

Ownership of the BSC by the participants may have reduced their willingness to allow the research team’s analysis to contribute to improving it. It is possible that, by initially developing an alternate BSC ‘map’ for PS, the research team caused offence, and that this resulted in the participants’ reluctance to change, or to consider changing, their BSC or their approach to using it. An alternative possibility is that there was too little
time available for any changes to their version of the BSC, as it was to be
presented to the executive of the organisation early in May that year.

Generic Learning
Even though we had stated in our implementation plan a need to assess the participants for their knowledge of the BSC and to provide a list of terms and definitions, this was not done, as the research team took for granted that a company that had developed its own corporate BSC would have a good knowledge of the BSC. Thus, meetings need to be structured with the provision of an agenda. The first meeting should document such items as a list of terms and definitions and a description of a basic BSC, regardless of the perceived knowledge base of the participants. The research team also needs to ensure that it learns more about the organisation before starting the implementation.


A separate BSC needs to be developed for each level of management focus.


This is especially necessary in organisations with few levels of management
and lower level managers who are not involved in or privy to higher level
strategy and strategy formulation.

The issue in which the pilot study stalled was concerned with the research
team’s use of cause & effect linkages within the corporate BSC. Improvement
to the visual ‘map’ of the BSC may reduce confusion and lead to better
communication.

The implementation of the targeting method was incomplete in part due to the research team’s attempt to improve upon the existing BSC by adding
visual cause & effect linkages. The research team failed to take into account
the ownership of previous work and the level of importance given to the
existing BSC. We also found that cause & effect linkages are more difficult to
understand than first realised. This might be improved by using an improved
approach to visualising the BSC so that the number of ‘arrows’ does not add
complexity, instead reducing complexity. Further work needs to be done in
explaining the use of cause and effect in the BSC method.

Although the pilot study was not completed, there were many useful lessons
which arose from the experience.

The following section highlights the revision made to the initial targeting
method. The areas in which changes are made are highlighted in grey on the
concise version and described and highlighted in grey on the following
pages.


Version 2 of the Targeting Method


1. Preplanning
1.1. Assessing participants
1.2. Preparation of any documents
2. Defining Scope
2.1. Introduction of the project as a whole to the project team
2.2. identify the processes
3. Assessing Dependency
3.1. Agree on the method to be used for assessing dependency
3.2. Identify the criteria to be used and rate each process
4. Assessing Probability of Failure of the Process
4.1. Agree on the method to be used for assessing probability of failure
4.2. Identify the criteria to be used and rate each process
5. Developing a Balanced Scorecard (BSC)
5.1. Identify the goals and strategies and objectives of the entity
5.2. Identify the cause & effect linkages within the BSC
5.3. Link processes identified earlier (2.2) to internal process objectives
6. Assessing the Impact of Processes on Goals
6.1. Assess the impact of each process on goals using heuristics and
total
7. Calculate the Criticality of each Process
8. Assess the Cost/Benefit of Improving the Process
8.1. Agree on the method to be used for assessing cost/benefit
8.2. Identify the criteria to be used and rate each process selected
9. Assess the Probability of Successful Improvement of the Processes with
positive cost/benefit
9.1. Agree on the method to be used for assessing probability of
success
9.2. Identify the criteria to be used and rate each process
10. Selection of which Critical Process to Improve First
10.1. Rank order the processes with positive cost/benefit by greatest
probability of successful improvement. Those processes with the
greatest probability of success and greatest cost/benefit should be
improved first


An explanation of each part of the generic 10 part process

1. Preplanning
1.1. Assessing Participants
This step involves research of the participants and their organisation. If
possible initial contacts should be used to assess the knowledge of the
participants and who should attend the first implementation meeting.
1.2. Preparation of any documents
Before the first meeting of the project team there are a number of documents
which should be produced. The first is a simple outline of the targeting
methodology that is to be used. The second is the terms and meanings of the
words within the BSC. Initially meetings need to be structured; provision of an
agenda, list of terms and definitions and description of a basic BSC,
regardless of previous experience

2. Defining Scope
2.1. Introduction of the project as a whole to the project team
Discussion of the whole project is initiated. The level of detail required depends on the number of participants and the time available. It should be verified that the
participants are familiar with the BSC and its terms. Agreement should also
be reached as to the lines of communication and confidentiality.
The first meeting should ascertain through discussion the knowledge base in
regards to the BSC. It is necessary for an effective implementation that the
research team and the participants have a clear understanding and
agreement of what the BSC is, how it is supposed to work and definitions of
the terms used. Outcomes of this meeting should be that all participants
speak the ‘same language’ and there is a plan of how to go forward. Ideally
this plan will list suggested documents from which goals, strategies and
possible objectives may be sourced.
Define the business area in which to conduct the project, (not all targeting
projects are implemented for the whole organisation). The time frame of the
project should then be assessed and the roles and responsibilities for the
project outlined. Define and achieve agreement from the participants as to


the necessary amount of time needed for successful completion of the project.
2.2. Identify the processes applicable to the project
The processes and the level at which they should be seen are identified at
this point.

3. Assessing Dependency of the Organisation on the Process


3.1. Agree on the method to be used for assessing dependency
This project aimed to use heuristics for this task and this was completed by
assessing the dependency of the entity on the process in relation to each of
the other identified processes.
3.2. Identify the criteria to be used to rate each process
If greater reliability were needed, the criteria for dependency would be used here with a weighting for each (availability, reliability, safety, confidentiality, integrity and maintainability). Each process would then be rated by applying a rating to each criterion, multiplying it by the weighting for that criterion, and multiplying all of these weighted terms together for the process.
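Read literally, this step describes a multiplicative weighting scheme. The following is a minimal sketch of that calculation, assuming hypothetical weights, ratings and a 1 to 5 rating scale; a weighted sum is an equally common alternative, and nothing here should be read as the methodology's prescribed arithmetic.

# Hypothetical criterion weights for assessing dependency (step 3.2)
weights = {
    "availability": 0.9, "reliability": 0.8, "safety": 0.5,
    "confidentiality": 0.7, "integrity": 0.8, "maintainability": 0.6,
}

def dependency_score(ratings, weights):
    # Multiply each criterion rating by its weight, then multiply the
    # weighted terms together, as the step describes.
    score = 1.0
    for criterion, weight in weights.items():
        score *= ratings[criterion] * weight
    return score

# Hypothetical ratings on a 1-5 scale for a single process
example_process = {"availability": 5, "reliability": 4, "safety": 2,
                   "confidentiality": 3, "integrity": 4, "maintainability": 3}
print(round(dependency_score(example_process, weights), 2))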

4. Assessing Probability of Failure of the Process


4.1. Agree on the method to be used for assessing probability of failure
The use of heuristics was suggested
4.2. Identify the criteria to be used to rate each process
No criteria are necessary for a heuristic approach to this assessment. An improved heuristic approach is to base judgements on prior knowledge and facts and then predict a possible result. Additional information on failure could include the types of failure: over-performance, failure over time, intermittent failure, partial failure and complete failure.

5. Developing a Balanced Scorecard (BSC)


5.1. Identify the goals, strategies and objectives of the entity in the
project
If necessary, an explanation of the BSC is given, and the goals and strategies of the entity are provided. Develop objectives as the mini-goals for


the strategies and place these objectives within the agreed perspectives of
the BSC.
5.2. Identify the cause & effect linkages within the BSC
Using the experience and skills of the project team identify the cause and
effect linkages within this BSC. If necessary add objectives to allow for
possible causes and effects.
5.3. Link processes identified earlier (2.2) to internal process objectives
With the processes identified earlier (or now) the project team link those
process which impact upon the internal process objectives within the BSC.
Provide a visual ‘map’ of the BSC for the participants to assess for
correctness of the work so far and make changes from any feedback.
A separate BSC needs to be drawn up as each management focus of an
organisation is brought into the project. This is especially necessary in
organisations with few levels of management and lower level managers who
are not involved in strategy formulation. Improvement of the visual ‘map’ of
the BSC may reduce the confusion and lead to better communication.

6. Assessing the Impact of Processes on Goals


6.1. Assess the impact of each process on goals using Heuristics
Using Heuristics assess the impact of each process on the objectives,
strategies and ultimately the goals of the entity. Each link is assessed one at
a time and in relation to all of the processes and objectives impacting on the
objective, strategy or goal. Calculate totals for each process by multiplying
the percentages together along the links.
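As an illustration of the calculation this step describes, the sketch below multiplies hypothetical link percentages along two cause and effect chains for a single process. The chain values are invented, and summing the per-chain totals at the end is only one possible way of combining multiple chains, not a prescription from the methodology.

# Hypothetical cause & effect chains for one process: each list holds the
# estimated contribution (as a fraction) of each link in the chain
# process -> internal process objective -> strategy -> goal.
chains = [
    [0.40, 0.50, 0.60],   # via a first objective
    [0.20, 0.30, 0.60],   # via a second objective
]

def chain_impact(chain):
    # Multiply the percentages together along the links of one chain.
    total = 1.0
    for link in chain:
        total *= link
    return total

per_chain = [round(chain_impact(c), 3) for c in chains]
print(per_chain)       # impact along each chain: [0.12, 0.036]
print(sum(per_chain))  # one possible combined total, approximately 0.156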

7. Calculate the Criticality of each Process


Multiply the ratings or values for impact, dependency and probability of
failure. This total is the criticality of each process
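Since step 7 defines criticality as the product of the three values, it can be written as a one-line calculation. The sketch below uses hypothetical processes and values purely for illustration.

def criticality(impact, dependency, probability_of_failure):
    # Step 7: criticality is the product of impact, dependency and
    # probability of failure for a process.
    return impact * dependency * probability_of_failure

# Hypothetical (impact, dependency, probability of failure) values
processes = {
    "Incident reporting": (0.12, 0.8, 0.3),
    "Billing":            (0.05, 0.6, 0.2),
}
ranked = sorted(processes.items(),
                key=lambda item: criticality(*item[1]), reverse=True)
for name, values in ranked:
    print(f"{name}: {criticality(*values):.4f}")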

8. Assess the Cost/Benefit of Improving the Process


8.1. Agree on the method to be used for assessing cost/benefit
If the projects are very small then heuristics may be suitable, otherwise the
suggested approach is to develop a business case for each process as a
separate project.


8.2. Identify the criteria to be used to rate each process


Dependent on the approach taken above in 8.1, the criteria would include: costs for resources, non-project related incurred costs, and tangible and intangible benefits. Only those processes with a positive cost/benefit would then be assessed for the probability of successful improvement.

9. Assess the Probability of Successful Improvement of the Process


9.1. Agree on the method to be used for assessing probability of
success
The suggested approach here again is to use heuristics.
9.2. Identify the criteria to be used to rate each process
Some criteria which may support the decisions are; Team Orientation,
Project Management, Management Support, User Participation, Modeller’s
Expertise, Project Championship and Communication.

10. Selection of which Critical Process to Improve First


10.1. Rank order the processes with the greatest probability of successful improvement
Those processes with the greatest probability of successful improvement and the best cost/benefit ratio should be selected first for improvement. This is then essentially a business decision. The rank order is not a scientific approach to selecting which processes to improve first; it is an improved approach compared with current practice.
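The selection rule in steps 8 to 10 (keep only processes with a positive cost/benefit, then rank by probability of successful improvement) can be illustrated with a short sketch. The process names and figures below are hypothetical, and using cost/benefit as the tie-breaker is simply one reading of the text.

# Hypothetical per-process figures after steps 8 and 9:
# (process, net cost/benefit, probability of successful improvement)
candidates = [
    ("Incident reporting", 120000, 0.8),
    ("Change/Enhancement Management", 45000, 0.6),
    ("Billing", -10000, 0.9),   # negative cost/benefit: excluded at step 8
]

# Step 10: keep only positive cost/benefit, then rank by probability of success
shortlist = [c for c in candidates if c[1] > 0]
shortlist.sort(key=lambda c: (c[2], c[1]), reverse=True)
for name, benefit, p_success in shortlist:
    print(f"{name}: net benefit {benefit}, P(success) {p_success}")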


7.3 Cycle 2 of Action Learning: Case Study One


PS C1 C2 C3

This section deals with the first of three case studies on organisations within
the ASP industry. We follow the same cycle of action learning (implement,
observe, reflect and revise) as was seen in the pilot case study. Within this
cycle of action learning there was a further cycle of reflection and revision. As
there were nearly four weeks between the first and second meetings, it was possible to conduct a reflection and revision phase after the first meeting, concerning the events which had occurred. Thus, the case study report will
document that some revisions occurred to the targeting method during the
larger cycle of action learning.

CSC had indicated during the second focus group session of the associated
research project (reference models for ES service delivery), that they were
interested in participating in the case study phase of this research project.
They were contacted by email to arrange a meeting to start the case study.

This cycle of action learning took five weeks to complete and required three meetings with the participants of the case study, in addition to emails and phone calls. Each subsequent meeting was scheduled at the end of the preceding one. The table below provides the dates and times of the three meetings.

Meeting Date Time

Meeting One 29th July 2002 1pm

Meeting Two 22nd August 2002 12.30pm

Meeting Three 30th August 2002 11.30am

Table 47-Meeting dates and times for case study 1

The meetings were scheduled around lunch time for the


participants as this provided them with some readily available free time in
their busy schedules.

The following sections describe the purpose and business problem for the
case study, the proposed approach and a description of the actual events of
the case study. Much of the data of the case study is shown at the end of this
section as output.

7.3.1 Purpose and Business Problem

The participants from CSC had stated that they were going to use the case
study as a form of revision for their areas of responsibility. They would also
take the results of the study and use it to help staff to understand the
priorities of their service to BHP-Billiton. They also stated that it would ensure
that any new process improvements would be focussed on appropriate goals.

7.3.2 Proposed Approach

This implementation of the targeting method for this cycle of action learning would take into account the improvements suggested by the previous cycle’s reflection and revision phases. Major changes to the targeting methodology learnt in the previous cycle were:
1. Provision of an agenda, list of terms and definitions and description of the
entire targeting method;
2. Use a limited number of semi-structured meetings with participants to
determine the outputs of the targeting method;
3. Ensure that sufficient time is provided by the participants; and
4. Improve the visual presentation of results for the participants as an aid to
effective communication of these results.
These major revisions to the ten step targeting method were intended to
advance the outcome of this first case study.


7.3.3 Actual Approach and Observations

The research team initially developed an agenda document, seen in Figure 41. The agenda document sets out the need to define the scope of the
service delivery for CSC, (which processes are to be included), and the type
of services provided to BHP-Billiton. The agenda also states that the first
meeting was to entail a description of the targeting method and, if possible,
begin to develop the BSC facet of the targeting model. Finally a date and
time for the next meeting should be set.


Figure 41- Agenda for first meeting with CSC

In addition to the agenda document, the research team had also developed a
Power-point presentation concerned with the explanation of the targeting
method. The research team further developed a simple model of the


targeting method and an explanation of terms. Other documents to be taken to the meeting were the ethics consent forms for the case study.

This first meeting was held on the 29th of July 2002 at 1pm at the CSC offices on Coronation Drive, Brisbane. Although it was not on the
agenda, the first issue to be dealt with was the signing of the ethics
documents, which stated that the participants were happy to participate and
were able to withdraw at any point. We also needed to verbally inform the
participants that they could withdraw at any time and that if they had an
ethical issue that they felt the research team could not deal with they could
contact our University ethics department.

The research team undertook step 2.2 (identification of processes) before step 2.1 (explanation of the project), as we sought to scope the project before
explaining the rest of the steps required to implement the targeting method.
The reasoning behind this change was that with the knowledge of the scope
we would be able to provide relevant examples during the explanation of the
targeting method.

Discussion on defining the scope of this BHP-Billiton (BHP-B) service resulted in the research team agreeing that a detailed understanding of this area was unnecessary for project success. The management level at which the participants operated ensured that they would be extremely familiar with the service provided to BHP-Billiton, and no added value would arise from detailing it, as it changed regularly and at times substantially. The
service provided included all the elements of ASP service delivery.

The next step was to identify the processes to be included in the case study
within the functional process areas of Service Support and Service Delivery.
We used the ‘reference model of ASP service delivery’ developed by the
associated research project (reference process modelling for ES service
delivery) as a starting point. This model is shown in part, in Figure 42.


Figure 42- Reference model of ASP service delivery

This model shows five separate functional areas; service definition at the top,
service infrastructure, service delivery and customer relationship
management as the next layer with service support in the bottom right hand
corner. We had printed this model on A3 size paper (twice the size of this
page) so that the participants were able to make any adjustments to the
model.

The output of this activity was a number of changes to the names of processes and the addition of some new processes. This is identified in Figure 43 below.

Figure 43- CSC version of AHC (changes highlighted in blue)


The addition of “Define Billing” and “Define Reporting” to the Service Definition functional area at the top of the model was the first change. The second change was the addition of “Quality processes” to the Service Delivery/Management area and the addition of “Enhancement” to Change Management in the Service Support functional area, which became “Change/Enhancement Management”. The service delivery, customer relationship management and service support areas were placed within the one functional area named “Service Delivery”. This meant that a new title for the old Service Delivery area needed to be provided (this has not yet been agreed upon). The final alteration was to change the shape of the reference model objects from block arrows directed towards the right hand side of the model to rectangular objects with no apparent flow.

These changes provided a list of twelve ‘Service delivery’ processes which had been identified for consideration in the targeting methodology. They were:
1. Monitor Service levels
2. Manage capacity
3. Configuration
4. Quality processes
5. Billing
6. Reporting
7. Help desk
8. Incident reporting
9. Problem management
10. Change/Enhancement Management
11. Release management

This list of identified processes was then used in the subsequent steps of the
targeting method: assessing the effect of failure of these processes
(dependency), the probability of failure of these processes and the relative
contribution of these processes on internal process objectives (impact).


Explanation of Targeting Method Process


The meeting then turned to an explanation of the methodology to be used by the participants. For this, a printed copy of the targeting method and Power-point slides were used. Approximately 30 minutes were taken to explain not only
the methodology but the use of the balanced scorecard (BSC). Both
participants had heard of the BSC but had not used it. Particular attention
was paid to the issue of ‘cause and effect’ as this facet is the major aspect of
the methodology. Participants were also asked on more than one occasion if
they had any questions or problems with the method. When there was
agreement with the process to be undertaken and understanding of the steps
required we moved to the next agenda item. This was to start the
development of their BSC.

The next agenda item of this first meeting was the start of the development of
a BSC for servicing the BHP-Billiton account. A white board was used to
firstly identify goals then strategies to achieve these goals and finally
objectives. The objectives were placed into the most suitable of the perspectives used. We began with a modified set of five perspectives:
financial perspective, customer/supplier perspective, internal process
perspective, knowledge management perspective and partner perspective2.

The changes to the original BSC perspectives are the addition of supplier to
the customer perspective in order to account for the need in today’s
organisations to view the whole supply chain. The learning and knowledge
perspective becomes the knowledge management perspective as knowledge
management encompasses a greater set of issues than is covered by
learning and knowledge. The partner perspective was thought to be useful
within the IT industry as many companies today have partnerships which
cannot be described as a customer or supplier relationship. Both
organisations might supply a product or service which is beyond the
capability of either organisation individually.

2 These perspectives were proposed in the paper ‘Changing the Four Perspectives of the Balanced Scorecard to Suit IT’ at the ANZAM 2002 Conference (Australian & New Zealand Academy of Management).


For the purposes of the targeting method it was not necessary to develop a complete BSC, so the need for objectives within the knowledge management perspective was less critical and there was no need for measures. The partner perspective was considered by the participants to add no value to the internal purpose of the study and was removed. It should be noted at this point that the case study participants from all three organisations had similar responses concerning the naming of perspectives.

The participants developed the BSC on the white board, with the completed
version shown in Figure 44. This BSC contained goals, strategies, objectives
and cause and effect linkages.

[Figure: the CSC - BHP-Billiton Business Unit balanced scorecard ‘map’. Goals: Return on Investment 14% and Growth of Revenue by 5%. Strategies and objectives are grouped under the Financial, Customer/Supplier, Internal Process and Knowledge Management perspectives.]

Figure 44-CSC balanced scorecard 'Map' showing 'cause & effect'

The diagram shows the four perspectives of the BSC in blue text to the left
hand side, with goals at the top and strategies beneath them. This
information was then put into a ‘map’ to show cause and effect and sent to
the participants for verification and any additions.

With final agreement of the participants we were now able to schedule a date
and time for the next meeting in which it was hoped to assess the
dependency and probability of failure factors. This was given as the 15th of
August 2002 at 12.30pm.


At this time a reflection and revision phase was conducted.

Reflection
One of the issues to come out of the pilot study, and not fully grasped until this point, was the complexity added by a 'map' which showed all the cause and effect linkages in place. While such a map may not look visually complex, in practice it is difficult to absorb all the data it shows. The research team did not need to interpret the data on the map in the same manner as the participants, but it was clear that the participants experienced considerable difficulty in verifying that the links were correct and complete.

The research team needed to find a better mechanism for presenting these
cause & effect links.

Cause & effect is a visual message reflecting how different objectives and
strategies affect other objectives and strategies within the BSC. It is
important that the participants are able to provide plausible links. As the BSC
becomes a more complete picture of the needs of the organisation the
complexity of these cause and effect linkages usually increases. Thus, in the
write-up of the first meeting the diagram used to show the BSC went through
a number of drafts. We attempted to put it into a Power-point slide using the
custom animation functionality to introduce the cause & effect linkages a few
at a time so that the participants might absorb the whole picture. This was not
much better than a static picture and thus ineffective.

The research team thus felt it important to provide an easy-to-understand and visually uncomplicated picture of the participants' BSC. The result was a search for different ways of presenting the information so that it was visually easier to absorb while allowing for greater amounts of information to be added. The approach finally taken for this case study and the next two case studies was to use software produced by Mind Jet called 'Mind Mapper', which presents visual concept analysis maps.

Other possible solutions were to use different modelling software products such as Microsoft Visio, ARIS toolset and DIA. These tool sets support
modelling applications but each of them failed to provide the functionality
required in displaying the information developed in this study. Modelling
packages generally use ‘objects’ which are different shapes and colours.
These objects allow for a small amount of text to be added as most
communication of information is provided by the standard shapes used. We
needed an easy means to show the logical connection of objectives to
strategies and also group the objectives into perspectives: concept mapping
provided this functionality. The mind-mapping software was also inexpensive
and easy to use.

We next sought to render the concept maps more effectively and investigated
the work of Tufte (2002) who writes on the visual display of information. His
book “Visual Explanations: Images and Quantities, Evidence and Narrative”
explains how a visual explanation has a number of levels. The first is the
colour of the explanation and how colour draws the eye to groups of
information. The second is the layout of the explanation and the placement of
objects or information as another way of relating information and the third,
the structure of text based information (Tufte 2002). We made extensive use
of colour, layout and text structure.

While the mind-mapping software has limitations, which were brought out during the final case study, it was considered by the participants to be a considerable improvement on the previous approach. The software is able to take into account the three levels of information suggested by Tufte (2002). Text can be coloured as well as highlighted, and positioning of branches or connections is easily achieved. In addition, the amount of text is not restricted by the software.

Figure 45 compares the new mind-mapper style of BSC with that of the
previous style.

(Figure content: the upper panel repeats the tabular CSC - BHP-Billiton Business Unit Balanced Scorecard from Figure 44; the lower panel shows the same goals, strategies and objectives redrawn in the mind-mapper branch style, with a legend distinguishing strategies, financial objectives, customer/supplier objectives, internal process objectives and knowledge management objectives by colour.)

Figure 45- Comparison of first style of BSC representation with new Mind mapper style

In Figure 45 we can see that the bottom diagram is a branch structure, with colour representing the different types of information, such as strategies and objectives, and with the objectives also differentiated by perspective through the use of different colours. Links which do not fit into the branch structure are shown by drawing a link from objective to objective, or from objective to strategy, as necessary. Apart from the lack of knowledge management objectives in the bottom diagram, there is no difference in the amount of information displayed in the two diagrams. Thus, we decided that it was an improvement to the methodology and concluded that we would continue to use the mind-mapping software for visual explanations of the developed balanced scorecards.

We sought to verify this new form of presentation in the next meeting; discussion of our experience is provided at the end of meeting two.

2nd Meeting
The second meeting with CSC occurred on the 22nd August 2002 at the CSC offices in Coronation Drive, Brisbane. The agenda was:
1. To verify the new style presentation of the BSC
2. To link the identified processes with the internal process objectives in the
BSC and assess the impact of these processes on the agreed goal.
3. To assess the factors of dependency and probability of failure for the
previously identified processes

The feedback on the new style of presentation of the BSC was very
favourable and confirmed the research team’s new approach. One of the
participants even had the software installed on his laptop. The participants
considered the use of colour and the branch style graphics a considerable
improvement on the previous style.

The next agenda item was to take the identified processes from the first meeting and for the participants to decide which processes impacted upon the internal process perspective objectives in their BSC. To do this we used an A4
sheet of paper (Table 48) which listed the processes down one side of the
page, the internal process perspective objectives down the centre of the
page and a column for the results down the right hand side. We have not
provided the results as they are not necessary to the testing and validation of
the targeting methodology.

Processes within Service Delivery        Internal Process Perspective Objectives                    Example results
1) Monitor Service levels                A) Reduce activity complexity                              5,6,7,9,10,11
2) Manage capacity                       B) Increase utilisation of low cost solution processes
3) Configuration                         C) Proceduralise existing informal processes
4) Quality processes                     D) Develop support team model
5) Billing                               E) Reduce non-productive time
6) Reporting                             F) Reconfigure products and services to meet needs
7) Help desk
8) Incident reporting
9) Problem management
10) Change / Enhancement management
11) Release management

Table 48-Relationship Drawing Table

The participants would state which processes impacted on the internal process perspective objective in the centre column and record the processes'
representative number in the relevant cell of the far right column. For
example, reduce activity complexity is shown as having six processes
impacting on it. They are represented by the numbers 5, 6, 7, 9, 10 and 11.
The number 5 represents Billing in the left hand column.
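
As a minimal illustration of how the relationships recorded in Table 48 might be captured for later use in the impact calculations, the sketch below (in Python, with purely illustrative names; none of this is part of the tooling actually used in the case study) stores the objective-to-process links by the processes' representative numbers.

# Hypothetical sketch: recording the Table 48 relationships as a simple mapping.
# The process numbers follow the left-hand column of Table 48.
processes = {
    1: "Monitor Service levels", 2: "Manage capacity", 3: "Configuration",
    4: "Quality processes", 5: "Billing", 6: "Reporting", 7: "Help desk",
    8: "Incident reporting", 9: "Problem management",
    10: "Change / Enhancement management", 11: "Release management",
}

# Objective -> process numbers judged by the participants to impact upon it.
# Only the worked example from the text is filled in here.
impacts_on_objective = {
    "Reduce activity complexity": [5, 6, 7, 9, 10, 11],
}

for objective, numbers in impacts_on_objective.items():
    print(objective + ": " + ", ".join(processes[n] for n in numbers))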

With this task complete, we then looked at each of the processes and assessed the dependency of the business unit on that process. The assessment approach was to use heuristics, or rules of thumb. The participants had an extensive knowledge of their business processes within the service delivery function and believed that the results of this approach would be suitable to their needs. The results of the assessment of dependency and probability of failure differed between the two participants in most cases. The only rating on which they both agreed was the dependency of Incident Management: one rating out of the twenty-two supplied by each participant. Within the time available in the meeting, neither participant was able to persuade the other that their rating was the correct one. Thus, the results showed two ratings, of which the average was used.
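
Where the participants could not agree, both ratings were recorded and their average carried forward. The trivial sketch below illustrates this (Python, illustrative only; the example values are those later shown in Table 49).

# Hypothetical sketch: carry the average forward where two ratings were given.
paired_ratings = {
    "Incident Mgmt. dependency": (6, 6),               # the one rating agreed on
    "Problem Mgmt. probability of failure": (9, 2),    # strongest disagreement
}
averages = {item: sum(pair) / len(pair) for item, pair in paired_ratings.items()}
print(averages)   # averages of 6.0 and 5.5 respectively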

The research team used another A4 size form for this task and this is shown
in Figure 46.

Service Delivery Processes are the processes identified with the IS Outsourcing model.

Impact is measured as                   1% = Least Impact                   100% = Most Impact
Dependency is measured as               1 = Least Dependent                 10 = Most Dependent
Probability of Failure is measured as   1 = Least Probability of Failure    10 = Most Probability of Failure

These values are then multiplied to arrive at the total for 'Criticality' of the Process.

Service Delivery Processes         Impact    Dependency    Probability of Failure    Total
Manage capacity
Incident reporting
Reporting
Monitor Service levels
Configuration
Change/Enhancement management
Quality processes
Release management
Help desk
Problem management
Billing

Figure 46- Document used to provide values for impact, dependency and probability of failure

In the document above, the participants provided their assessed ratings for dependency and probability of failure by answering two questions. For dependency: how much effect will the failure of this process have on the organisation in comparison to all the other processes being assessed? For probability of failure: what is the probability of failure of the process in relation to the probability of failure of the other processes being considered? The rating for impact was yet to be assessed and is therefore shown in grey text in this document.

Due to time constraints (one hour), the meeting was concluded at this stage. We had identified which processes impacted on the internal process objectives using the relationship diagram and had assessed the dependency and probability of failure of the identified processes. While the output was low by comparison to the first meeting, there was a need to revise the material produced during the first meeting in order for the participants to start the valuation and linking phase. The material produced during the first meeting was the BSC, revealing cause and effect linkages, and the new model of ASP service delivery, which represented their view of their business.

This second meeting also meant that we were able to reiterate the steps
still left to be taken and provide time for the participants to reassess the
output so far. They were able to look at the developed BSC and the CSC
version of the reference model of ASP service delivery and discuss the
validity of the reasoning which produced it. There were no changes
requested to the previous work. One comment though was that “different
participants would probably result in different objectives.” The participants
were the applications manager and the national account manager for the
BHP-Billiton account. Mark Harris is a highly experienced applications
specialist and Nigel Hillier has a background in infrastructure for information
systems, coupled with responsibilities for the strategic improvement of the
BHP-B service. Although they both worked closely on an almost day-to-day
basis they had at times differing views of how to achieve the same goals. For
this reason they believed that different participants would provide a further
range of views on how to achieve goals with different objectives dependent
on their experience. The assessments for this methodology are based on the
knowledge and experience of the participants as well as their individual
perceptions and therefore differences of opinion between participants were
bound to occur. A greater number of participants would result in a broader
range of perceptions and possibly result in a more acceptable assessment.

The participants also suggested that we produce a new BSC map showing
the connection of processes to internal process objectives. Thus, for the third
and final meeting we produced a BSC map which included the connection
between the identified processes and internal process objectives.

3rd Meeting
The third and final meeting took place on the 30th of August 2002 at 11.30 in
the morning at the CSC offices in Brisbane. We had emailed the participants
an agenda which stated that the objective of this meeting was to assess the
impact of the identified processes on the goals described in the BSC. The
BSC was that which was developed by the participants in the first meeting
and confirmed during the second meeting. This map of the BSC also
contained the link to processes identified in the first meeting using the
reference model of ASP service delivery as a starting point. In addition to
assessing the impact, the agenda also stated that we would then combine
the three factors of criticality from the targeting methodology to arrive at a list
of the most critical processes within the service delivery functional area.

The research team had printed the BSC map onto A3 size paper (twice the
size of this page) so that there was sufficient space to insert the appropriate
percentage value straight onto the map itself. Each participant had an
individual copy and preferred to work individually on this task. A portion of the
larger map is shown in Figure 47.

(Figure content: the processes Help Desk, Incident Reporting, Quality Processes, Problem Management and Release Management, shown in red, linked to the objectives 'Proceduralise existing informal processes', 'Develop support team model' and 'Reduce activity complexity', with a blank percentage field against each link for the impact value to be recorded.)

Figure 47- Portion of larger BSC map showing processes linked to objectives

In Figure 47 we can see that the processes in red text are linked to the
internal process objectives in green text. This small portion of the larger BSC
map provides an indication of the diagram which the participants used to record their assessment of the impact of processes on objectives, and of objectives on other objectives or strategies.

The participants also needed to assess the impact of a process in a similar way to their assessment for dependency and probability of failure. They
needed to make the assessment in relation to the other processes and
possibly objectives which were linked to the objective or strategy to the left of
the process being assessed.

In Figure 48 we have divided a portion of the BSC map into the six sectors in
which assessment of impact should be undertaken.

(Figure content: a portion of the BSC map divided into six numbered sectors. Each sector groups the processes, and in some cases objectives, whose combined impact on a single objective or strategy, for example 'Proceduralise existing informal processes', 'Develop support team model', 'Reduce activity complexity', 'Reduce non-productive time', 'Increase utilisation of low cost solution processes' and 'Reduce unit costs', was to be assessed together.)

Figure 48- Example of processes and or objectives which needed to be considered together

In the example above, the processes in box 1 (help desk, incident management, quality processes and problem management) each provide a relative contribution to achieving the objective (proceduralise existing informal processes), and these contributions were measured as a combined value of up to 100%. It
is possible that the processes in box 1 are not the only processes that impact
on this objective and, thus, the unnamed processes may provide a small
percentage of the impact assessment. For box 3, the relative contribution
(impact) of up to 100% is taken from the combined assessments of the seven
processes in red text and the two objectives shown in green within the same
box.

This activity resulted in a % value for each process, objective and strategy on
the full BSC ‘Map’. These were then multiplied together to calculate the
impact of a process on the goals of the company. This process is fully
explained in chapter 4, the targeting methodology.

The example shown in Figure 49 is a portion of a BSC map which shows the
impact values provided by the participants.

(Figure content: a portion of the BSC map in which each process, objective and strategy carries three impact values in the form 'NH/MH -AvX- %', for example 10/5 -Av7.5- %, together with the note '7.5% of 45% of 30%' indicating the values multiplied along one chain of links from process to goal.)

Figure 49- Example of %'s taken to provide impact value

The three values shown (10/5 -Av7.5- %) for each process or objective are the ratings from each of the participants (10 and 5) and the average of their values (Av 7.5%). The map is viewed from right to left, with the processes identified by the participants shown in red text, and the internal process perspective
objective they link to in green text to the left. The map portion also shows
three values for the impact rating for each of the processes, objectives and
strategy (10,5 &7.5) below each process or objective.

The impact values circled in blue are used to calculate the impact of a
process on a goal. That is, we multiply 7.5% x 45% x 30%. If a process is
impacting on more than one internal process objective then this type of
calculation is conducted for each incidence and then the values are added.
For example, if monitor service levels impacts on re-configure products and
services to meet needs as in Figure 49 and also impacts on a further
objective (not shown in Figure 49), the assessment for both impacts will be
added together once they are calculated individually for their impact on the
goals.
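
A minimal sketch of this calculation is given below, assuming illustrative Python names (nothing here comes from the tooling used in the project). It multiplies the average impact percentages along one path from a process to the goals, and sums the results where a process impacts more than one objective.

# Hypothetical sketch of the impact calculation described above.
def path_impact(percentages):
    """Multiply the impact percentages (as fractions) along one path to the goals."""
    result = 1.0
    for p in percentages:
        result *= p
    return result

def total_impact(paths):
    """Sum the path impacts when a process impacts more than one objective."""
    return sum(path_impact(path) for path in paths)

# Worked example from Figure 49: 7.5% of 45% of 30%.
single_path = [0.075, 0.45, 0.30]
print(round(path_impact(single_path), 4))    # roughly 0.0101, i.e. about 1%

# If the same process also impacted a second objective (values illustrative only),
# the two path impacts would simply be added together.
print(round(total_impact([single_path, [0.05, 0.25, 0.30]]), 4))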

With the assessment of impact made for each of the processes, objectives
and strategies within the BSC map, we were then able to calculate the
criticality of each process. We had previously assessed the dependency and
probability of failure in meeting two and would now combine the assessments
to find out how critical each of the identified processes were to the CSC
service delivery function of the BHP-Billiton account.

The next step then was to place the ratings for impact, dependency and
probability of failure into a table to enable us to multiply them and arrive at a
total for each process. This is shown in Table 49 (following page) with the
ratings for each of the participants identified as NH and MH and the average
rating identified as Av across the bottom row of the table. The highlighted
columns are the average ratings and average total for each of the three
factors of criticality, and the criticality total.

Impact X Dependency X Prob. of Failure = Totals for Criticality

Process                      Impact (NH/MH/Av)        Dependency (NH/MH/Av)   Prob. of Failure (NH/MH/Av)   Criticality (NH/MH/Av)
Problem Mgmt.                0.018 / 0.022 / 0.020    8 / 7 / 7.5             9 / 2 / 5.5                   1.310 / 0.306 / 0.808
Quality Processes            0.010 / 0.024 / 0.015    8 / 10 / 9              8 / 3 / 5.5                   0.639 / 0.708 / 0.674
Help Desk                    0.011 / 0.009 / 0.011    8 / 10 / 9              7 / 6 / 6.5                   0.597 / 0.556 / 0.577
Monitor Service Levels       0.009 / 0.008 / 0.009    7 / 10 / 8.5            6 / 2 / 4                     0.368 / 0.165 / 0.267
Billing                      0.008 / 0.004 / 0.006    6 / 10 / 8              7 / 4 / 5.5                   0.338 / 0.148 / 0.243
Incident Mgmt.               0.008 / 0.008 / 0.008    6 / 6 / 6               6 / 2 / 4                     0.287 / 0.098 / 0.193
Manage Capacity              0.008 / 0.007 / 0.008    4 / 7 / 5.5             4 / 5 / 4.5                   0.135 / 0.230 / 0.183
Configuration                0.010 / 0.007 / 0.008    3 / 7 / 5               2 / 6 / 4                     0.058 / 0.276 / 0.167
Release Mgmt.                0.005 / 0.027 / 0.013    3 / 7 / 5               3 / 2 / 2.5                   0.042 / 0.379 / 0.211
Change/Enhancement Mgmt.     0.002 / 0.012 / 0.006    5 / 9 / 7               5 / 1 / 3                     0.053 / 0.104 / 0.079
Reporting                    0.004 / 0.007 / 0.006    5 / 7 / 6               5 / 1 / 3                     0.108 / 0.049 / 0.079

(NH and MH are the two participants' ratings; Av is their average.)

Table 49- Figures from valuation of impact, dependency and probability of failure

In Table 49 we have placed down the left hand column the eleven identified
processes from the CSC version of the reference model of ASP service
delivery. The columns to the right of this contain the ratings for impact,
dependency and probability of failure. The three columns to the far right
contain the totals for criticality for NH, MH and the average. The CSC
participants had to this point assessed the criticality of the eleven processes
identified from the first meeting. There were three processes (Problem
Management, Quality Processes and Help Desk) which stood out as the
clear leaders in this assessment. The participants chose to select these
processes for improvement on the basis of this value rather than continue with the
targeting method and use the two additional factors, cost/benefit and
probability of success.

This concluded the third and final meeting with CSC and the research team
reached closure by taking the participants to lunch and providing the
participants with an A0 size map of their BSC with impact ratings shown (as
seen in Figure 50) and documentation of the results of the targeting project.

The following part of this section provides the data output of the case study in
summary.

Output of the Targeting Method


We have described the first case study, the three meetings and the participants' approach to using the targeting methodology. This part of the case study description will summarise the output of the first case study by describing and examining the results of the targeting method. The reflection and revision phases of the action learning will take place following this section.

The major outputs of the targeting method project for the participants were the BSC map, which showed the impact values and cause & effect links, and the results table of the calculation of criticality for their identified processes.

We will provide and explain the BSC map first and then examine the results
of the criticality calculation which were shown previously in Table 49.

The diagram shown in Figure 50 (fold out A3 page) is the BSC map for CSC.
The diagram contains the impact values for both participants and the average
value below each of the processes, objectives and strategies. The processes
(red text) are linked to the internal process perspective objectives (green
text): The financial (blue text) and customer/supplier perspective (aqua text)
objectives and the strategies (lavender text) are also shown linked to the
goals which are in red text within a gold highlighted rectangle.

Figure 50- CSC map showing linked processes in red to internal process objectives in dark green with the valuations in black below them.
(Figure content, printed as a fold-out page: the complete CSC BSC map, with every process, objective and strategy annotated with the two participants' impact values and their average, a colour legend, and two notes explaining what was meant by 'Develop support team model' and 'Reduce activity complexity'.)

In Figure 50 we can see that the processes from the CSC version of the
reference model of ASP service delivery are in red text and these are linked
by sub branches to the internal process objectives shown in green text.

There are two “notes” text areas to the right of the diagram which are the
result of one participant’s comments as to what was originally meant by two
of the objectives. These are ‘develop support team model’ which means to
put together teams with the majority of skills required to solve problems for
clients and ‘reduce activity complexity’ which means such things as
automating tasks, identifying complexity and training staff and clients in
understanding processes. These notes sections of the BSC map act as an
aid to improving communication of the intent of the business unit in achieving
its goals.

The legend taken from Figure 50 is shown again in Figure 51 and enables a reader to see that:

1. The strategies are lavender in colour
2. The financial objectives are bright blue
3. The customer supplier objectives are aqua
4. The internal process objectives are green
5. The knowledge management objectives, if shown in the diagram, would be light orange
6. The processes are in red

Figure 51- Legend from Figure 50

The colour in the diagrams enables a reader to absorb information on one level and, combined with the branch structure, directs a reader to areas of
interest without the necessity of reading large amounts of text. The final map
for this project was printed onto an A0 size sheet (840mm x 1188mm). In this
size the information contained in the diagram was easier to assimilate. The
participants sought to use the BSC map as a way of initiating discussion
within their team. They felt that the content of the map was well enough
presented to enable discussion without a lengthy introduction. This meant that the whole team would become interested in the process improvement activities as well as understanding the strategic intent of the business unit.

The second major output of the case study for the participants was the table
listing the results of the assessment of criticality. Table 50 is useful for
comparing the difference between the two participants from CSC.

Impact X Dependency X Prob. of Failure = Totals for Criticality

Process                      Impact (NH/MH/Av)        Dependency (NH/MH/Av)   Prob. of Failure (NH/MH/Av)   Criticality (NH/MH/Av)
Problem Mgmt.                0.018 / 0.022 / 0.020    8 / 7 / 7.5             9 / 2 / 5.5                   1.310 / 0.306 / 0.808
Quality Processes            0.010 / 0.024 / 0.015    8 / 10 / 9              8 / 3 / 5.5                   0.639 / 0.708 / 0.674
Help Desk                    0.011 / 0.009 / 0.011    8 / 10 / 9              7 / 6 / 6.5                   0.597 / 0.556 / 0.577
Monitor Service Levels       0.009 / 0.008 / 0.009    7 / 10 / 8.5            6 / 2 / 4                     0.368 / 0.165 / 0.267
Billing                      0.008 / 0.004 / 0.006    6 / 10 / 8              7 / 4 / 5.5                   0.338 / 0.148 / 0.243
Incident Mgmt.               0.008 / 0.008 / 0.008    6 / 6 / 6               6 / 2 / 4                     0.287 / 0.098 / 0.193
Manage Capacity              0.008 / 0.007 / 0.008    4 / 7 / 5.5             4 / 5 / 4.5                   0.135 / 0.230 / 0.183
Configuration                0.010 / 0.007 / 0.008    3 / 7 / 5               2 / 6 / 4                     0.058 / 0.276 / 0.167
Release Mgmt.                0.005 / 0.027 / 0.013    3 / 7 / 5               3 / 2 / 2.5                   0.042 / 0.379 / 0.211
Change/Enhancement Mgmt.     0.002 / 0.012 / 0.006    5 / 9 / 7               5 / 1 / 3                     0.053 / 0.104 / 0.079
Reporting                    0.004 / 0.007 / 0.006    5 / 7 / 6               5 / 1 / 3                     0.108 / 0.049 / 0.079

(NH and MH are the two participants' ratings; Av is their average.)

Table 50- List of results for assessment of criticality

In Table 50 the difference in assessment, for the probability of failure for the
problem management process is highlighted in yellow. It is this factor
(probability of failure) of the problem management process which resulted in
the greatest difference of opinion for the two participants. They are almost at
opposite ends of the ten point scale (9 & 2). The difference in assessment by
the two participants in the same process for the factor dependency is one, in
the ten point scale (8 & 7).

The results show that the two participants did not completely disagree on all three factors assessed for any process. Instead, it was more common for the assessment of just one factor for a process to result in disagreement. This is evidenced by the example highlighted in Table 50. Figure 52, on the following page, provides a simple comparison of the ratings provided by the two participants across the three factors for criticality: impact, dependency and probability of failure.

(Chart titled 'Comparison of Differences': the difference between the two participants' ratings for impact, dependency and probability of failure, plotted for each of the eleven processes.)

Figure 52- Comparison of differences in the two participants’ ratings

In Figure 52 above, we can see that the factor of probability of failure caused the most disagreement. The impact factor showed the least disagreement, which possibly indicates that the process for assessing impact was more inclusive and thus generated greater agreement. Figure 53 compares the results of the calculations for criticality of the processes for the two participants and the average result.

(Chart titled 'Comparison of Criticality Values': the calculated criticality of each of the eleven processes, shown for NH, MH and the average.)

Figure 53- Graphical view of the results.

Figure 53 shows that the process Problem Management was assessed as the most critical process, with Quality Processes and Help Desk rated a close
second and third (if reviewed using the average rating). CSC intended to
focus improvement efforts on these three processes as they were from their
point of view the clear leaders in the criticality assessment.

Although the participants were asked on a number of occasions to complete the five questions below, we were unsuccessful in receiving a response.

a) Are the results of the process what you expected?
b) Are the results of the process useful to you as a manager?
c) Are the results of the process useful to you as a company?
d) Was there anything in the identification process which you didn't
understand or had difficulty with?
e) Are you able to suggest any improvements to the way the process
was implemented?
The next section is the reflection phase of this second cycle of action
learning.

7.3.4 Reflection Phase Cycle Two

The lessons learnt from the pilot case study were applied successfully in this case study. That is, sufficient time was sought to complete the project to the conclusion sought by the participants and the research team. One suggestion from the participants was that the process would benefit from being completed over a two-day period with as many of the management team as possible. Their reason for this was that the more managers involved, the greater the quality of the output and the greater the consensus on the results. It would also ensure that the management team thoroughly understood the approach undertaken to achieve their goals and the processes in which they must excel.

The process level within the organisation at which the BSC was developed enabled the participants to effectively link processes to objectives using heuristic methods. Meetings had structure, with the provision of an agenda and the use of relevant documentation in support of the process. A list of terms and definitions, a description of the BSC and an explanation of the process were also provided in printed format. Participants had valid and positive business reasons for participation and thus were more proactive. The company culture was more open to information sharing, and so the results were of benefit to more than the two participants.

The major issue to arise out of this case study was the time taken to arrive at
completion. This was taken into account in the following case studies, with
the second being conducted over two half days three weeks apart.
Information has been added to the BSC maps to more effectively
communicate the meanings of terms used. The final map for this case study
has two ‘notes’ areas which are explanations of two objectives. (See Figure
50)

There was also some difference of opinion over the values applied for impact, dependency and probability of failure in this case study, which is seen in the use of three sets of values (the two participants' ratings and the average). A positive outcome was that this initiated discussion between the participants as to why the perceived difference existed. It may also highlight the need for a more structured approach to assessing these factors. A more structured approach might include both the provision of information to the participants on defining dependency and probability of failure more accurately, and initiating discussion on what criteria to use in assessing these two factors before any assessment takes place.

The change to the use of mind-mapper software for developing and describing the BSC map was also an important output of this case study. The participants were very positive about the use of this software and the diagrams produced with it. The use of the fifth perspective (Partner) for the BSC was considered to be unnecessary for all ASP service delivery companies and should be assessed for suitability for each organisation.

The need to know the extent of the service being offered was also considered, as CSC did not believe this was necessary for them to use the targeting method. The research team concluded that this, too, may be assessed for suitability for each organisation.

This action cycle changed the order of the steps in the targeting methodology
by undertaking step 2.2 (Identify the processes) before 2.1 (Introduction of
the project as a whole to the project team). This was changed to provide
participants early on in the first meeting with an activity which the research
team had considered to be more interesting to participants than the
explanation of the targeting method process. In addition to this change, we
also conducted steps five and six before conducting steps three and four. We
believed that the participants needed to see some early output, which was
interesting to them. To do this, we implemented the process identification
steps first and then explained the targeting method and the terms and
definitions. We then sought to undertake this difficult part (developing the
BSC and impact assessments) before the smaller task of assessing the
factors of dependency and probability of failure.

7.3.5 Revision Phase Cycle Two

The revisions for this cycle of the action learning are fully described in the ten
step targeting methodology on the following pages. The summary of these
changes is described below.
1. Use of mind-mapper software for presenting the BSC maps
2. The need for a comprehensive revision of previous activities and
outstanding activities, if the time frame is greater than two weeks.
3. The use of notes sections or alternative which explains portions of the
BSC map that may be difficult to understand.
4. Improved approach to the assessment of the factors dependency and
probability of failure.
5. Optional use of the BSC perspective ‘Partner’ in organisations and also the
optional need for defining the extent of the service supplied. It would also be
advantageous to obtain agreement of the participants for the names of each
of the other perspectives.
6. Change of order of Steps 2.1 and 2.2 to reverse them so that 2.2 came
before 2.1.
7. Change of order of steps, completing steps 5 and 6 before steps 3 and 4.
That is, developing the BSC and impact assessment before assessing
dependency and probability of failure.

In the text-based ten step model of the targeting methodology we have changed the order of the steps mentioned above in points six and seven. Those parts highlighted in grey are the areas in which changes have been made or should be considered.

Version 3 of the Targeting Method


1. Preplanning
1.1. Assessing participants
1.2. Preparation of any documents
2. Defining Scope
2.1. Identify the processes
2.2. Introduction of the project as a whole to the project team
3. Developing a Balanced Scorecard (BSC)
3.1. Identify the goals and strategies and objectives of the entity
3.2. Identify the cause & effect linkages within the BSC
3.3. Link processes identified earlier (2.2) to internal process objectives
4. Assessing the Impact of Processes on Goals
4.1. Assess the impact of each process on goals using heuristics and
total
5. Assessing Dependency
5.1. Agree on the method to be used for assessing dependency
5.2. Identify the criteria to be used and rate each process
6. Assessing Probability of Failure of the Process
6.1. Agree on the method to be used for assessing probability of failure
6.2. Identify the criteria to be used and rate each process
7. Calculate the Criticality of each Process
8. Assess the Cost/Benefit of Improving the Process
8.1. Agree on the method to be used for assessing cost/benefit
8.2. Identify the criteria to be used and rate each process
9. Assess the Probability of Successful Improvement of the Processes with
positive cost/benefit
9.1. Agree on the method to be used for assessing probability of
success
9.2. Identify the criteria to be used and rate each process
10. Selection of which Critical Process to Improve First
10.1. Rank order the processes with positive cost/benefit by greatest
probability of successful improvement. Those processes with the
greatest probability of success and greatest cost/benefit should be
improved first

An explanation of each part of the generic 10 part process

1. Preplanning
1.1. Assessing Participants
This step involves some research of the participants and their organisation. If
possible initial contacts should be used to assess the knowledge of the
participants and who should attend the first implementation meeting.
1.2. Preparation of any documents
Before the first meeting of the project team there are a number of documents
which should be produced. The first is a simple outline of the targeting
methodology that is to be used. The second is the terms and meanings of the
words within the BSC. Initial meetings need to be structured, with the provision of an agenda, a list of terms and definitions (regardless of previous experience) and a description of a basic BSC.

2. Defining Scope
2.1. Identify the processes applicable to the project
The processes and the level at which they should be seen are identified at
this point. Conduct this step before the introduction of the entire project, though this is dependent on the situation or context in which you are implementing the targeting method.
2.2. Introduction of the project as a whole to the project team
Discussion of the whole project is initiated. The level of detail required depends on the number of participants and the time available. It should be verified that the
participants are familiar with the BSC and its terms. Agreement should also
be reached as to the lines of communication and confidentiality.
The first meeting should ascertain through discussion the knowledge base in
regards to the BSC. It is necessary for an effective implementation that the
research team and the participants have a clear understanding and
agreement of what the BSC is, how it is supposed to work and definitions of
the terms used. Outcomes of this meeting should be that all participants
speak the ‘same language’ and there is a plan of how to go forward. Ideally
this plan will list suggested documents from which goals, strategies and
possible objectives may be sourced.

Define the business area in which to conduct the project (not all targeting
projects are implemented for the whole organisation). The time frame of the
project should then be assessed and the roles and responsibilities for the
project outlined. Define and achieve agreement from the participants as to
the necessary amount of time needed for successful completion of the
project.

3. Developing a Balanced Scorecard (BSC)


Conduct this step and step four before assessing dependency and probability of failure (now steps five and six). Use the mind-mapper software
for visualising the BSC maps. In addition to this there is a need in the
implementation phase to use further documentation to assist participants in
remembering the meanings of objectives and strategies. A further
requirement in this longer form of implementation is to provide summaries of
what has occurred previously and what still needs to be completed to
implement the process.
3.1. Identify the goals, strategies and objectives of the entity in the
project
If required, an explanation of the BSC is given, and the goals and strategies of the entity are provided. Develop objectives as the mini goals for
the strategies and place these objectives within the agreed perspectives of
the BSC. Identify a need for using a fifth perspective such as ‘partner’ before
using it. It would also be advantageous to obtain agreement of the
participants for the names of each of the other perspectives.
3.2. Identify the cause & effect linkages within the BSC
Using the experience and skills of the project team identify the cause and
effect linkages within this BSC. If necessary add objectives to allow for
possible causes and effects.
3.3. Link processes identified earlier (2.2) to internal process objectives
With the processes identified earlier (or now) the project team link those
process which impact upon the internal process objectives within the BSC.
Provide a visual ‘map’ of the BSC for the participants to assess for
correctness of the work so far and make changes from any feedback.

A separate BSC needs to be drawn up as each management focus of an organisation is brought into the project. This is especially necessary in
organisations with few levels of management and lower level managers who
are not involved in strategy formulation. Improvement of the visual ‘map’ of
the BSC may reduce the confusion and lead to better communication.

4. Assessing the Impact of Processes on Goals


4.1. Assess the impact of each process on goals using heuristics
Using heuristics, assess the impact of each process on the objectives, strategies and ultimately the goals of the entity. Each link is assessed one at a time and in relation to all of the processes and objectives impacting on the objective, strategy or goal. Calculate totals for each process by multiplying the percentages together along the links.

5. Assessing Dependency; the effect of failure of a process


5.1. Agree on the method to be used for assessing dependency
An improved approach to the assessment of the factor of dependency is required. It is possible that an improved and more complete explanation of the criteria for assessing dependency would provide an improved result. In addition to clearer information, there should be agreement between participants as to the criteria used and the weightings for each criterion. Step 5.2 identifies the major criteria for dependency. This project aimed to use heuristics for this task, and this was completed by assessing the dependency of the entity on the process in relation to each of the other identified processes.
5.2. Identify the criteria to be used to rate each process
If greater reliability were needed, then the criteria for dependency (availability, reliability, safety, confidentiality, integrity and maintainability) would be used here, with a weighting for each. Each process would then be rated by applying a rating to each criterion, multiplying it by the weighting for that criterion, and multiplying all of these weighted values together for each process.
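
A small sketch of this weighted-criteria variant is shown below, using illustrative Python names, weightings and ratings (the methodology does not prescribe particular values). Following the description above, each criterion's rating is multiplied by its weighting and the weighted values are then multiplied together to give a relative dependency score for the process.

# Hypothetical sketch: weighted-criteria rating of dependency for one process.
criteria_weightings = {                  # weightings are illustrative only
    "availability": 0.9, "reliability": 0.8, "safety": 0.5,
    "confidentiality": 0.6, "integrity": 0.7, "maintainability": 0.4,
}

def dependency_score(ratings, weightings):
    """Multiply each criterion rating by its weighting, then multiply the results together."""
    score = 1.0
    for criterion, weight in weightings.items():
        score *= ratings[criterion] * weight
    return score

example_ratings = {                      # 1 = least dependent, 10 = most dependent
    "availability": 8, "reliability": 7, "safety": 3,
    "confidentiality": 5, "integrity": 6, "maintainability": 4,
}
print(round(dependency_score(example_ratings, criteria_weightings), 1))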

6. Assessing Probability of Failure of the Process


6.1. Agree on the method to be used for assessing probability of failure
The use of heuristics is suggested, as for 5.1.
6.2. Identify the criteria to be used to rate each process
An improved approach to the assessment of the factor of probability of failure is required. It is possible that an improved and more complete explanation of the criteria for assessing probability of failure would provide an improved result. In addition to better information, there should be agreement between participants as to the criteria used. No criteria are required for a heuristic approach to this assessment. Additional information on failure could be the types of failure: over-performance, failure over time, intermittent failure, partial failure and complete failure.

7. Calculate the Criticality of each Process


Multiply the ratings or values for impact, dependency and probability of failure. This total is the criticality of each process.
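
A minimal sketch of this step is given below, with illustrative Python names and values in the style of the averages in Table 49. Note that the averaged totals in Table 49 appear to be the average of the two participants' individual criticality totals, so they differ slightly from the product of the rounded average factors shown in the table.

# Hypothetical sketch: criticality = impact x dependency x probability of failure.
ratings = {
    # process: (impact, dependency, probability_of_failure); illustrative averages
    "Problem Mgmt.":     (0.020, 7.5, 5.5),
    "Quality Processes": (0.015, 9.0, 5.5),
    "Help Desk":         (0.011, 9.0, 6.5),
}

criticality = {
    process: impact * dependency * prob_failure
    for process, (impact, dependency, prob_failure) in ratings.items()
}

# Rank the processes from most to least critical.
for process, value in sorted(criticality.items(), key=lambda item: item[1], reverse=True):
    print(process, round(value, 3))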

8. Assess the Cost/Benefit of Improving the Process


8.1. Agree on the method to be used for assessing cost/benefit
If the projects are very small then heuristics may be suitable, otherwise the
suggested approach is to develop a business case for each process as a
separate project.
8.2. Identify the criteria to be used and rate each process
Dependent on the approach taken above in 8.1, the criteria would include: costs for resources and non-project related incurred costs, and tangible and intangible benefits. Only those processes with a positive cost/benefit would then be assessed for the probability of successful improvement.

9. Assess the Probability of Successful Improvement of the Process


9.1. Agree on the method to be used for assessing probability of
success
The suggested approach here again is to use heuristics.
9.2. Identify the criteria to be used to rate each process

Some criteria which may support the decisions are: Team Orientation, Project Management, Management Support, User Participation, Modeller's Expertise, Project Championship and Communication.

10. Selection of which Critical Process to Improve First


10.1. Rank order the processes with the greatest probability of successful improvement
Those processes with the greatest probability of successful improvement and the best cost/benefit ratio should be selected first for improvement. This is then essentially a business decision. The rank order is not a scientific approach to selecting which processes to improve first; it is an improved approach to current practice.
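
As a minimal sketch of this final selection step (illustrative Python names and figures only; none of these values come from the case studies), processes with a positive cost/benefit are kept and then ranked by probability of successful improvement, with cost/benefit as the tie-breaker.

# Hypothetical sketch: select which critical processes to improve first.
# Each entry: (process, cost_benefit, probability_of_success); values illustrative.
candidates = [
    ("Problem Mgmt.",     120000, 0.7),
    ("Quality Processes",  45000, 0.9),
    ("Help Desk",         -10000, 0.8),   # negative cost/benefit, so excluded
]

# Keep only processes with a positive cost/benefit, then rank them.
shortlist = [c for c in candidates if c[1] > 0]
shortlist.sort(key=lambda c: (c[2], c[1]), reverse=True)

for rank, (process, cost_benefit, p_success) in enumerate(shortlist, start=1):
    print(rank, process, cost_benefit, p_success)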

This second cycle of action learning has provided the research team with seven items with which to improve the targeting methodology, the most important being the use of the Mind Jet software, Mind Mapper, to visualise the BSC map and the impact linkages of the processes on the goals shown in the BSC map.

The next action learning cycle uses the second case study participant
REALTECH AG for the implementation and observation phases of the action
learning.

7.4 Cycle 3 of Action Learning: Case Study Two

PS C1 C2 C3

This cycle of action learning is the third cycle of four to be used to test and
improve the targeting methodology. We follow the same process as occurred
in the previous cycle of action learning, using the four phases,
implementation, observation, reflection and revision. We implemented the
targeting method within this case study participant, REALTECH, and
observed and recorded the comments and issues that arose during the
implementation. Then the research team reflected on the observations and
revised the targeting method to what is now termed version 4 of the targeting
method. This description of the second case study does not include the level
of detail seen in the previous case study. This description focuses on the
changes to the methodology and the issues that arose during the case study
which are different to those found in previous case studies. In this way we
were able to focus on the improvement of the targeting methodology in the
action learning cycles. The reflection phase does include some learning,
taken from comparisons between each of the case studies.

7.4.1 Purpose and Business Problem

REALTECH is the industry partner and sponsor of this research project. They
were interested in using the targeting methodology to identify those areas of
their Remote Services business, which they needed to improve in order to
continue their present market growth. They had commented that the remote
hardware and application maintenance market for which their company
provided unique software and ASP product was ‘very tight’ in Australia. This
meant that companies were cutting back on the maintenance of their IT
systems. REALTECH were seeking to use the targeting method as a way of
showing potential and existing clients that cutting IT budgets was not in their
best interests if they were trying to achieve organisational goals. The
targeting method was perceived by REALTECH to be able to logically show a
measurable link between IT and corporate goals. If this is the case, then
REALTECH would be able to use the targeting method to show customers
why their continued spending on IT was necessary. This would then be one
support for REALTECH’s objective of continued market growth.

The functional area 'Remote Services' was the focus for this iteration of the targeting methodology. The remote services supplied by REALTECH were the monitoring and maintenance of hardware and applications used to support the SAP R/3 enterprise system. REALTECH are a specialist in this area and also produce and market software which automates many of the hardware maintenance tasks for SAP applications.

7.4.2 Proposed Approach

The approach proposed by the research team in this case study was to continue to use the improved targeting methodology steps. In particular, we intended to reduce the length of time the implementation took. This would reduce the need to comprehensively revisit the targeting methodology at subsequent meetings. We would also ensure that the assessment of dependency and probability of failure included a more detailed explanation of the criteria that might be used to assess these two factors. This would improve the heuristic approach and allow for greater agreement on the resulting assessments.

The sequencing of the targeting methodology steps would also be used to increase the proactiveness of the participants. With this new sequence, we would identify the processes before the detailed explanation of the methodology, and develop the BSC map and assess the impact of the identified processes before assessing dependency and probability of failure.


7.4.3 Actual Approach and Observations

The first meeting with REALTECH was on Friday the 13th of September 2002 in the REALTECH boardroom at 122 Arthur Street, North Sydney in NSW. The research team flew to Sydney from Brisbane on an early flight to arrive in time for an 8.30am meeting. We had agreed to a half-day time frame and completed the meeting at 1pm that afternoon. The two participants from REALTECH were Helena Mendes, Senior IT Consultant and Remote Services Manager, and Wayne Baker, Managing Director Australasia. The agenda had been emailed to the participants on the Monday before, along with digital copies of the targeting method process and the terms and definitions information.

The agenda was an updated version of the agenda used for CSC (the first
case study). The first order of business though was the signing of forms to
formally agree to participation in the project. We reiterated the confidentiality
of their information and that anything which was to be published would be
provided to them before publication for their approval.3

The second agenda item dealt with the identification of the processes in
REALTECH which made up the Service Support functional areas. We were
focusing on this functional area as it was considered in the Delphi study to be
the most critical functional area. To do this we again used the reference
model of ASP service delivery as a starting point and included the CSC
version of the model.

The participants made several changes to the model, which are highlighted by the blue boxes in Figure 54 on the following page.

3 The details in this section of the thesis have been approved for release.


[Figure 54 content: diagram of the REALTECH version of the reference model of ASP service delivery, showing the functional areas Service Definition, Remote Support Services, Service Delivery, Service Support, Customer Relationship Management, QA and Service Infrastructure (Hosting), with processes grouped beneath them and the REALTECH alterations, including Proactive Management and Site Visit Management, highlighted in blue.]

Figure 54- REALTECH version of reference model of ASP Service Delivery (alterations in blue)

The changes made by REALTECH to the reference model of ASP service delivery took into account the changes suggested by CSC but also added to and extended the model, through the inclusion of Proactive Management and Site Visit Management.

Within the Customer Relationship Management (CRM) functional area, a new process named Site Visit Management was also added. The REALTECH
participants also separated change/enhancement management into two
separate processes, Change Management and Enhancement Management.
The final changes were to show the crossover of the processes Release Management, Enhancement Management and Change Management from the Service Support functional area into the CRM functional area. This was
meant to illustrate the connection between these three processes and the
processes within the CRM functional area.

An outstanding issue from the CSC model was that the processes Monitor Service Levels, Manage Capacity and Configuration had no functional area name. The REALTECH participants named this functional area Service Delivery. They also added a further process called Proactive Management to the Service Delivery functional area. The participants re-named the functional area that CSC had called Service Delivery, calling it Remote

Support Services. This functional area encompasses Service Support, Customer Relationship Management, Quality Assurance (QA) and Service Delivery.

The research team did not intend to investigate the reasoning behind these
changes, only to identify the major processes within the functional area now
called ‘Remote Support Services’.

In total there were fourteen processes identified from this activity to be used
in the targeting methodology. These were:
1. Billing
2. Reporting
3. Site Visit Management
4. Release Management
5. Change Management
6. Enhancement Management
7. Problem Management
8. Incident Management
9. Help Desk Management
10. Quality Processes
11. Monitor Service Levels
12. Manage Capacity
13. Configuration
14. Proactive Management

The task of identifying which processes would be used in the targeting method led the participants to explain the scope of the services which they provided within their Remote Services product. Based on its experience with CSC, the research team had thought that it would not be necessary to define exactly what this service was, though in fact this did occur. The participants explained the product scope to help explain the reasons for the new processes within their version of the model for ASP service delivery. In hindsight, it was useful in this instance to be able to effectively define the service, as there were many parts of their consulting activities which overlapped with the Remote Services product. The list shown in Table 51 was used to ensure that the participants focussed on the Remote Services product only when assessing the factors of the targeting method.


Remote Services System Checks:
CCMS alerts
Performance checks
Database checks of performance
Database checks of free space
R/3 System log check
Dump analysis
CPU load check

Remote Services Standard Administrative Tasks:
24 x 7 System Monitoring
Reporting on usage and incidents
Importing of Support applications
Applying Kernel patches
Transports of data
User creation and security
Backup scheduling

Remote Services Consulting:
Topology auditing
Hardware sizing
Technical auditing
Project management
Implementation of applications

Table 51- List of services provided within the 'Remote Services' product

There are three distinct areas within the Remote Services product: Remote
Services System Checks, Remote Services Standard Administration Tasks
and Remote Services Consulting. System checks refer to the monitoring and
analysis of the hardware used to support the SAP R/3 systems. Standard
administrative tasks refer to the maintenance of the applications being used.
Remote Services Consulting activities refer to the range of services provided
by a consultant usually within the customer’s premises. The customer might
require all or part of the offered services and in most cases purchase the
consulting services as a one off requirement.

The most common approach to new customers for the Remote Services
product was to provide a quality control service in the form of an audit of their
present maintenance procedures. The new client would then be introduced to
the Remote Services product run by a team of highly knowledgeable SAP
R/3 administration experts. The Remote Services team are able to provide


services which improve the operation of the SAP system, the operating
system and the hardware on which the software resides. This is effectively an
application service provision business. From their premises in North Sydney they are able to connect with customers all over Australia and remotely monitor and maintain their systems on a twenty-four hour, seven day a week basis. As with many businesses, the major management focus is on increasing the client's use of the available services. Current clients are provided with regular written reports and contacted by phone to resolve issues and suggest improvements. Site visits are used to build relationships with clients and as an avenue for raising issues to which the Remote Services team then provide solutions.

The next agenda item was to explain the targeting method process. For this task a printed copy of the targeting method PowerPoint slides was used. Approximately 30 minutes were taken to explain the methodology and the use of the balanced scorecard (BSC). Both participants had heard of the BSC but had not used it. As per the notes in the ten step methodology, we asked the participants if they believed the fifth perspective (Partner) was necessary in providing a balanced view of their environment. They considered that this fifth perspective was unnecessary to their view of the environment as it was covered by the customer/supplier perspective.

Particular attention was paid to the issue of ‘cause and effect’ as this facet is
the major aspect of the methodology. The terms and definitions used in the
targeting method were also discussed with the participants so that we were
all at the same level of understanding of the process. Participants were also
asked on more than one occasion if they had any questions or problems with
the method. The Remote Services manager was interested to find out if there
was an area of her responsibilities that she should be focusing on more than
another. Both participants were also interested in the cause & effect process
as a means of revising their present approach to achieving business goals.

Once the scope, identification of processes and the explanation of the method were complete, our next agenda item was to produce a BSC for the


whole business initially and then drill down into the remote service area. The
research team undertook to develop first a corporate BSC and then develop
a ‘Remote Services’ BSC, treating Remote Services as a separate business
unit. In this respect, this case study differed from the CSC case study but took into account the lessons learnt from the pilot study. That is, if a corporate BSC was used initially, it would be necessary to develop a new BSC at the level of the business unit so that the different environment, skills, abilities and needs of the business unit were taken into account. This provided increased understanding of the objectives and, most importantly, improved 'cause & effect' linkage to their processes. For these reasons the research team developed two BSCs.

The first BSC to be developed was the corporate BSC and this was
developed using the Mind Jet software. We used a data projector linked to a
notebook computer so that the participants had a large scale image of the
developing BSC. Using the mind-mapper software, the participants could
easily change the structure as the discussion took place.

The diagram below (Figure 55) is the corporate BSC developed during the
first REALTECH case study meeting.
[Figure 55 content: mind map of the REALTECH corporate level BSC, titled 'REALTECH GOALS'. The goal of Financial Stability is linked to strategies such as Growth, Long Term Projects and Decrease Overheads, and to objectives including Diversification of Products and Solutions, Increased Customer Base, Geographic Expansion, Greater Utilisation of Consultants, Long Term Contracts, Consultancy Service Diversification, Consultant Diversification, Solution Focus, SAP Partner Support, Selection & Training, Advisory Services & Client Relationship Management, Correct Infrastructure (starred), Redirection of the Organisation into Functional Areas and Reduced Employee Costs. A legend distinguishes goals, strategies and the financial, customer/supplier and internal process objectives.]

Figure 55- REALTECH (example) corporate level BSC


In Figure 55, we can see at the bottom of the diagram the legend used to
explain the colours in use within the diagram. The star located next to ‘correct
infrastructure’ indicates that it was this objective which was used as the goal
for the remote services BSC. The box containing the text “REALTECH’s
goals” indicates the name of the map. The map is developed from right to
left, with the goal developed first and strategies put forward to achieve the
goal. With the strategies in place, the participants provided objectives which
should be reached in order to achieve the strategies. Each objective is
ultimately connected to the goal stated as financial stability.

The corporate BSC is not complete as the purpose of the corporate BSC was
to provide the Remote Services BSC with a goal or goals at a level applicable
to their situation. In this case, that goal is to provide the Correct
Infrastructure. This infrastructure would then support what the participants
considered to be the most important objective of Long Term Projects. Once
we had the goal agreed by the participants we then started the development
of the Remote Services BSC. An example of this BSC is shown in Figure 56.

[Figure 56 content: mind map of the REALTECH Remote Services BSC. The goals Greater Net Profit and Correct Infrastructure are linked to strategies such as Greater Utilisation of Consultants, Increase the Awareness of Existing Customers to Services They Do Not Presently Use, and Sell System Improvements to Existing Customers (Satisfy a Customer Need), and to objectives including Scheduling Improvements (scheduling of consultants and of customer requests), Right Consultants with the Right Training, Correct Hardware, Improve Communication Networks, Provide Customer Surveys (survey client and user), Increase Customer Knowledge of Possible System Improvements, Combine Products for Economy of Scale, Meetings with SAP Client Managers, Increase Site Meetings and Communication with the Customer and Increase Monitoring, Find a New Communication Channel to Customers, Increase Cross Communication Between Staff and Increase Product Training. The legend adds Knowledge Management Objectives to the perspectives used in the corporate BSC.]

Figure 56- REALTECH (example) Remote Services BSC

Figure 56 is an example of the Remote Services BSC developed by the participants. It is read in the same way as the previous BSC example and uses the same legend as the corporate BSC. The participants decided that the goal to achieve correct infrastructure should be 'greater net profit'. They

were undecided as to which of these two items should come first. That is,
does greater net profit lead to the correct infrastructure or does the correct
infrastructure lead to greater net profit? This was considered a “chicken and
egg” style question and the participants chose to leave the question
unanswered. The participants indicated that they believed the answer would
have little impact on the outcome.

The arrows shown in red in the diagram (Figure 56) indicate that there is an impact initiated by one objective on another objective. For example, the internal process objective 'Increase product training' impacts on the customer/supplier objective 'Increase customer knowledge of possible system improvements' (to the left along the branch) and also impacts on the financial objective 'combine products for economy of scale', as indicated by the arrow. These are cause & effect linkages developed by the participants which do not fit into the branch system used to show linkages.

With this part of the meeting complete, the participants agreed to close the session and select a date and time for the next meeting. It was anticipated at this point that there would need to be only one more meeting to conclude the case study. The date for this meeting was the 29th of September 2002 at 1pm. In the two intervening weeks the research team would print the BSC maps onto A3 size paper (twice the size of this page) and send them to the participants for verification and any possible changes. We would also do the same for the REALTECH version of the reference model of ASP service delivery.

This first meeting achieved a substantial amount considering that the participants needed to develop two BSC maps and had spent nearly an hour working on the REALTECH version of the reference model of ASP service delivery.


Reflection
The approach taken to choosing the goals for the ‘Remote Services’ BSC
requires further consideration. The corporate level BSC was not fully
complete and may have benefited from further consideration and adjustment.
In this respect it is reasonable to suggest that the completeness of the high level BSC might be reflected in the selection of goals in the lower level BSC. This also raises the question: how do you know when a BSC is complete? The research team were motivated by the need to complete the targeting methodology with as little wasted effort as possible. This may have resulted in some aspects of the project (such as fully developing each BSC) being abbreviated, to the detriment of the possible additional organisational outcomes of the project.

The revisions made to the methodology in the light of these reflections were
set aside till the end of the case study and are described in the revision
phase.

2nd Meeting
The research team met with the REALTECH participants on Friday the 29th of
September for a four hour meeting. It was again held in the offices of
REALTECH in North Sydney. The agenda for the meeting stated that the
assessment of impact, dependency and probability of failure was still
required to find the criticality of each of the fourteen processes identified
during the first meeting. The first task was to show the linkages between the
fourteen processes and the internal process perspective objectives in the
‘Remote Services’ BSC.

We followed the same approach as was used for CSC by firstly providing an
A4 document with the fourteen processes on the left, the internal process
perspective objectives in the middle and with a right hand column to write in
the links. A small version of the document is shown in Table 52.


Processes within Service Delivery:
1) Monitor Service levels
2) Manage capacity
3) Configuration
4) Quality processes
5) Billing
6) Reporting
7) Enhancement management
8) Help desk
9) Incident reporting
10) Problem management
11) Change management
12) Release management
13) Proactive Management
14) Site Visit Management

Internal Process Perspective Objectives (with the impact links recorded in the right hand column; for example, processes 2, 4, 6, 7, 9 and 10 were recorded against objective A):
A) Attract and contract the right consultants with the right training
B) Improve scheduling of consultants
C) Improve scheduling of customer requests
D) Increase customer surveys
E) Increase meetings with SAP client Managers
F) Increase site meetings & communication with customer & increase monitoring
G) Find new communication channel to customers
H) Increase cross communication between staff
I) Increase product training for staff

Table 52- Relationship drawing table for REALTECH

Table 52 was used by the participants to draw the links between the
identified processes shown on the left and the internal process perspective
objectives shown in the centre column. The cell in the column to the far right
was used to record the numbers which were representative of the processes
considered to be impacting on the objective being assessed. (For example,
Manage capacity is #2) The use of heuristics was again the method of choice
for the participants as it was for the participants of the previous case study.

As was shown in the previous case study, the targeting methodology requires
that the assessment of impact, which is the relative contribution of a process
to organisational goals, should be made in relation to all processes and
objectives impacting on an individual objective. The research team
considered that improved decisions would be made if the processes were
linked to the internal process perspective objectives and then in a separate
step assessed for their impact value in relation to the other processes linked


to the objective. This is the same approach as was used in the CSC case
study.
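A small Python sketch of this two pass approach is given below. The links for objective A are those recorded in Table 52; the impact values are hypothetical placeholders for the later assessment step.

# Pass 1: record which processes the participants linked to each internal process objective.
# Objective A ("Attract and contract the right consultants...") was linked to
# processes 2, 4, 6, 7, 9 and 10 in Table 52.
links = {
    "A": [2, 4, 6, 7, 9, 10],
    # objectives B to I would be recorded in the same way
}

# Pass 2 (separate step): assess an impact value for each linked process in relation to the
# other processes linked to the same objective. These values are placeholders only.
impact = {
    "A": {2: 30, 4: 20, 6: 15, 7: 15, 9: 10, 10: 10},
}

for objective, processes in links.items():
    assessed = impact.get(objective, {})
    missing = [p for p in processes if p not in assessed]
    print(f"Objective {objective}: {len(processes)} linked processes, "
          f"{len(missing)} still to be assessed for impact")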

Once the processes were linked to the internal process perspective objectives we then added this information to the Remote Services BSC. With
a map showing the processes that impacted on objectives, as linked using
the branch structure of the concept map, the participants were then able to
assess impact for each process, as a percentage, on the objective to the left
and follow the linkages (including links defined by arrows) to the Remote
Services goal; the same process as was used in the CSC case study. These
results are shown in Figure 57.

The assessment of dependency and probability of failure was then undertaken by the participants. Unlike the CSC case study, the participants in
this case study agreed to discuss their assessments and provide one agreed
result for each process rather than provide individual assessments. The
research team perceived there to be greater consensus by the REALTECH
participants than the previous CSC case study participants. The consensus
was evidenced by the lack of discussion surrounding the assessments made
and any discussion occurring appeared to be aimed more at clarification of
how a suggested assessment was arrived at than complete disagreement.

The results for this part of the targeting method were then placed into the
document identified as Figure 57 on the following page. The document lists
the results of assessments for the three factors impact, dependency and
probability of failure.

Figure 57 provides the results of the assessment of criticality for each of the processes identified in the first meeting using the reference model of ASP service delivery. The dependency and probability of failure results are based on a scale of one to ten, with one being the least probability of failure or least dependency and ten being the most probability of failure or most dependency. The results of the impact assessment are the calculation of the percentages

applied by the participants to the linkages shown on the Remote Services BSC.

Results of the Identification of Processes for Remote Services within REALTECH

Remote Service Delivery processes identified within the REALTECH version of the ASP Service Delivery model.

Impact is measured as 1% = Least Impact, 100% = Most Impact
Dependency is measured as 1 = Least Dependent, 10 = Most Dependent
Probability of Failure is measured as 1 = Least Probability of Failure, 10 = Most Probability of Failure
These values are then multiplied to arrive at the total for 'Criticality' of the Process.

Service Delivery Processes     Impact   Dependency   Probability of Failure    Total
Manage capacity                 18.3        7                  8              1024.8
Site Visit Management           19          9                  4               684.0
Proactive Management            21          8                  4               672.0
Enhancement management          11          7                  6.5             500.5
Incident reporting              10.4       10                  4.5             468.0
Reporting                        8.5       10                  4.5             382.5
Monitor Service levels           7.1        7                  4.5             223.7
Configuration                    6.5        8                  3.5             182.0
Change management                3.6        7                  4               100.8
Quality processes                3.8        7                  3.5              93.1
Release management               2.2        7                  6                92.4
Problem management               2          9                  4                72.0
Help desk                        3.8        9                  2                68.4
Billing                          1.3       10                  2                26.0

Figure 57- Results of valuation of dependency, probability of failure and impact

The results for criticality in the far right column show that the process
Manage Capacity was considered the most critical and this might be due to
the high rating for probability of failure applied. The high rating for the
probability of failure for the process ‘Manage Capacity’ is eight, which is well
above the average rating (4.4) applied to the other processes. Compared to 'Site Visit Management', the second most critical process, 'Manage Capacity' has a similar impact rating (18.3 versus 19) and a similar dependency rating (7 versus 9). The difference in the criticality rating for the two is clearly the result of the high

probability of failure rating (4 versus 8) for 'Manage Capacity'. The research team did not look into the reasoning why the participants provided the resultant assessments.
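The criticality figures in Figure 57 are simply the product of the three assessments. The short Python sketch below reproduces that arithmetic for the four most critical processes using the values in the figure; it restates the calculation rather than adding anything to the method.

# (impact %, dependency 1-10, probability of failure 1-10) as assessed in Figure 57
assessments = {
    "Manage capacity":        (18.3, 7, 8),
    "Site Visit Management":  (19.0, 9, 4),
    "Proactive Management":   (21.0, 8, 4),
    "Enhancement management": (11.0, 7, 6.5),
}

# Criticality = impact x dependency x probability of failure
criticality = {name: impact * dependency * p_failure
               for name, (impact, dependency, p_failure) in assessments.items()}

# Rank from most to least critical, matching the ordering in Figure 57
for name, score in sorted(criticality.items(), key=lambda item: item[1], reverse=True):
    print(f"{name}: {score:.1f}")

Running the sketch gives 1024.8, 684.0, 672.0 and 500.5, matching the totals shown in Figure 57.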

Once the calculations for criticality were complete and the table shown in Figure 57 was filled in, the participants looked to the selection of which processes to improve. As with the CSC case study, the REALTECH
participants chose to select their processes solely on the basis of criticality
and not to follow the final steps of the targeting methodology, namely, to
assess each of the processes using the factors cost/benefit and probability of
successful improvement. The research team did not believe that they should
pressure the participants to complete all ten steps of the methodology but to
assist where needed and to observe the issues that arose.

This concluded the meeting and the assessment phases of the targeting methodology. The final activity in the observation phase of this cycle was to ask the participants, via email, to answer five questions concerning the project; the results are considered as part of the reflection and revision phases of the action learning.

a) Are the results of the process what you expected?
b) Are the results of the process useful to you as a manager?
c) Are the results of the process useful to you as a company?
d) Was there anything in the identification process which you didn't understand or had difficulty with?
e) Are you able to suggest any improvements to the way the process was implemented?

Question a) sought to find the participants' agreement with the outcome. Questions b) and c) looked to find out possible business benefits and business problems. Questions d) and e) sought to gain further insights concerning the implementation method. The responses are summarised under each question and the analysis of the output, including these responses, will be provided in the reflection section which follows.


The questions were attached to a copy of both the results page (Figure 57), showing the assessments for each of the processes for the three factors dependency, probability of failure and impact, and a pie chart showing graphically the results of the criticality assessment for each process. The participants were also sent an A3 size map of the Remote Services BSC.

The next section outlines the outputs of the project for the participants.

Output
The output of this case study for the participants was similar to that of the
CSC case study, namely the completion of the development of a BSC used
to identify the cause & effect links from process to goals. The participants
also assessed the identified processes for their effect of failure on the
organisation (dependency) and probability of failure.

The major outputs of the targeting method project for the participants were the BSC map, which showed the impact values and cause & effect links, and the results table of the calculation of criticality for their identified processes. In addition, they considered that the REALTECH version of the model of ASP service delivery was useful in defining their activities to their clients' management.

The criticality scores seen in the table previously were also produced in a pie
chart to visually describe the results.


[Figure 58 content: pie chart of the REALTECH results of critical process identification, showing the criticality totals for the fourteen processes: Manage Capacity 1024.8, Site Visit Management 684, Proactive Management 672, Enhancement Management 500.5, Incident Reporting 468, Reporting 382.5, Monitor Service Levels 223.7, Configuration 182, Change Management 100.8, Quality Processes 93.1, Release Management 92.4, Problem Management 72, Help Desk 68.4 and Billing 26, with the nine least critical processes also grouped as an 'Other' segment of 1240.9.]

Figure 58-Chart showing results of REALTECH Process Identification project

The left hand pie chart in the diagram contains the results of five processes
with the highest criticality rating and the sixth segment is made up of the
remaining nine processes. The values used are the results taken from the
assessments of criticality for the fourteen processes.

The BSC map for the remote services business unit was an important output
for the case study and the participants. The participants sought to use this
map to highlight to staff their thinking on how to achieve the goals that had
been set. The impact percentages shown on the map would also highlight the
importance of each process, objective and strategy. This map is shown as
Figure 59 on the A3 size page following. In this diagram the left hand side
contains the goals of the business unit highlighted in gold and the legend
above the major goal. Moving right are the strategies for achieving the goals
which are in black text. The objectives for each of the three different
perspectives are shown as red text for financial, blue for customer/supplier
and green for internal process objectives. The brown text represents the
processes identified from the model of ASP service delivery. Beneath each
strategy and objective are the percentage impacts assessed by the
REALTECH participants. At the end of the process text are the percentages
indicating the rating for the impact factor of each process on the objective to
the left of the process.


The arrows drawn on the diagram represent the additional impact of an objective on another objective or strategy within the BSC map. These also have percentage impact ratings attached. The different colours of each arrow are used to distinguish each line as an aid to clarity. Where a line crosses over another line it is possible for a reader to conclude that an objective impacts on another objective when it does not, as shown below:

[Inline diagram: arrows from Objective A and Objective B crossing as they point towards Objective C and Objective D.]

The eventual target of objectives A and B may be C or D when drawn in black text, but with colour the target of each arrow would be clear.

The next page contains the A3 size map sent to REALTECH; the section that follows is a description of the reflection and revision phases of this action learning cycle.


[Figure 59 content: A3 size mind map of the REALTECH Remote Services business unit BSC. The map links the goals Greater Net Profit and Correct Infrastructure (on the left, in gold) through the strategies and the financial, customer/supplier, internal process and knowledge management objectives to the fourteen identified processes (in brown text on the right), with a percentage impact rating recorded against each strategy, objective and process and additional cross-perspective impacts drawn as coloured arrows.]

Figure 59- Mock up of the REALTECH Remote Services Business unit BSC


7.4.4 Reflection Phase Cycle Three

This section, the third of the four phases of the action learning cycle, will examine the observations and data collected in the case study. We will reflect on the participants' responses and the issues that arose during the case study. There are two elements of reflection in this cycle of action learning: that of the participants, as seen in their responses, and that of the research team, as seen in the reflections that follow those of the participants.

The answers provided by the case study participants are described next.

1. Are the results of the process what you expected?


“Yes in some aspects. I am still having difficulty in understanding how this will
assist me in running the company. It has given me some insight into what I
have already thought we needed, but will it assist me to grow the business.
Not Sure!”
2. Are the results of the process useful to you as a manager?
“Yes they have highlighted the areas that we need to look at but it doesn’t
provide a solution. I know it really can’t, but would like to possibly see some
case studies/results after the study, ie, if we went this way what could we
expect”
3. Are the results of the process useful to you as a company?
“Yes, again it has reinforced what I thought and that is, we must ensure that
we get off our backsides and get to the client, face to face.”
4. Was there anything in the identification process which you didn't
understand or had difficulty with?
“It was difficult at the start to pick up, maybe there could have been an intro
(½ hour) on what is expected etc.”
5. Are you able to suggest any improvements to the way I implemented the
process?
“As per 4 above plus, give a bit more thought on how to deliver each section,
that is, how to open up the discussion and lead it.”


Research Team’s Reflections


The initial explanatory phase of the targeting method received some critical comments in the answers provided by the participants. One participant commented that "more thought on how to deliver each section, [was needed. That is;] how to open up the discussion and lead it". The second participant said that the ten step method was "difficult at the start to pick up".

It may be that the explanation of the ten step method needs to be less
detailed initially with greater detail provided as each step is taken in the
implementation.

Another time related issue observed by the research team concerned the mental effort required to develop two BSCs in one morning session. Due to the participants' extremely busy work schedules there was possibly an unseen pressure on the participants to complete the implementation in a small time period. This pressure may have caused the pace of the analysis, and thus the explanation of the process, to be too quick for the participants to comprehend all parts of the methodology. There may be a balance to be found between the supply of information and the time taken to implement the process. That is, shorter implementation times create difficulties in supplying information in a coherent and easy to absorb manner, while longer implementation times create difficulties for participants in retaining the information supplied.

The first participant's comment ("how to open up the discussion and lead it") suggests that the research team needed to be more involved in the development of the BSC map rather than act as an involved observer. This issue is perhaps contextual to the people and organisation and, as such, the research team believed that the method should be modified to take into account the needs of participants in this area. In some cases the implementation needs to proactively deal with issues surrounding the development of strategy.


Even though the participants believed that the critical processes in the
remote service business unit had been correctly identified and assessed,
there were some concerns as to the use of the results.

For example, one participant commented: "How will this assist me in running the company?" and "Will it assist me to grow the business?", and the other participant highlighted the lack of future direction with the comment that this has "highlighted the areas that we need to look at, but doesn't provide a solution". These comments suggest that the research team should provide within the targeting methodology a component which directs the user to possible approaches they might undertake to make use of the results. That is, to produce a tangible business benefit as an outcome of the methodology.

An alternative view is that the targeting method does not intend to provide
this type of direction, although the practical application of the method may
call for this addition. It was not the intention of the research team to provide a
methodology which included a process improvement element. The objective
of the method is to identify and select the most appropriate process for
process improvement.

The comment by one participant asking "how will this assist me in running the company" might, in this case study, have been addressed by developing the corporate BSC completely. We believed initially that this was not necessary to the implementation of the targeting method, and there is no evidence to disprove that belief. By fully developing the corporate BSC we may have assisted the participants in assessing the importance of the remote services business unit goals to those of the organisation. In this way we would also be assisting the organisation in "growing the business" and "running the company."

The REALTECH participants, like the CSC participants, chose to ignore the
final three steps of the targeting method. That is, steps eight, nine and ten;
step eight is the assessment of the cost/benefit of the process improvement
project for those processes which are most critical, step nine is the


assessment of the probability of successful improvement of those processes which have positive cost/benefit, and step ten is the selection of which process to improve from the results of steps eight and nine and the results of the assessment of criticality. The research team have a number of alternative explanations for this issue.

It is possible that the participants saw the last three steps as time consuming
for little further gain. They may have considered that their present knowledge
of the processes was sufficient to select the processes they would improve.
Their process improvement process may have been very informal and thus
they were not going to improve processes using a project management
approach. An informal process improvement would not need to undertake a
cost/benefit analysis or assessment of the projects probability of success.
The last alternative we considered is that the participants were happy with
the criticality rating as a method for selection of which processes to improve.
This last approach was their stated reason and would fit neatly with an
informal approach to process improvement.

The research team did not raise any further issues from this case study. We
agreed that the changes made as a result of the previous action learning
cycle were useful.
These were:
1. To alter the sequence of some of the steps in the methodology. The new
sequence was well received as evidenced by the early completion of the
case study and the continued interest of both participants during the two
meetings used to implement the targeting method.
2. Use the mind-mapper software for BSC maps, not just to show cause & effect links. The new software proved exceptionally useful as the development of the BSC maps was visually more interesting to the participants. The digital presentation enabled easy changes to the maps, which meant the participants could suggest changes, view each change within minutes and undo it if desired.
3. Provide greater detail of the criteria for assessment of dependency and
probability of failure. This issue might be considered successfully resolved as


indicated by the agreement of the participants in this case study in their assessment of the two factors. It is not possible to compare the results
statistically to the previous case study as this case study had only the
combined result. In the previous study we were able to calculate the
difference between each participant’s results to show their disagreement in
assessing dependency and probability of failure. In this case study there is
only one result for each process assessed, due to the agreement of the
participants.
4. The use of the fifth perspective within the BSC map was again considered
unnecessary. As per the notes in the ten step methodology we asked the
participants if they believed the fifth perspective (Partner) was necessary in
providing a balanced view of their environment.

This section of the third cycle of action learning has examined the issues that
arose during the case study. We also examined the changes made to the ten
step targeting model in the revision phase of the previous action learning
cycle.

The following section is the revision phase which puts forward the changes to
the methodology and inserts them in Version 4 of the Targeting methodology
model.

7.4.5 Revision Phase Cycle Three

This section provides the proposed changes which the research team have
developed after examining the issues in the reflection phase of this action
learning cycle.

1. The targeting methodology should be altered in the explanatory steps to suit the company context and the time frame. That is, in step 2.2 (Introduction
of the project as a whole to the project team) we will provide a less detailed
explanation of the targeting methodology. This will be confined to the detail


seen in Figure 60 below, with any questions answered as well. The participants would be informed that the more detailed explanation would be provided as each step was conducted.

Figure 60- Section of the less detailed information provided in the 2.2 explanation step

2. The research team believed that their role was not to participate in the development of strategy and objectives, but rather to act in a support role by recording information and explaining the steps of the methodology. By changing the methodology so that we fully develop the corporate BSC (when a business unit BSC is being focussed on) we might assist the participants in assessing the importance of the business unit goals to those of the organisation. In this way we would also be assisting the organisation in "growing the business" and "running the company."

3. The final revision to the ten step methodology was to actively encourage
the use of the final three steps of the method. That is, assessing the
cost/benefit of the improvement project for a process as step eight, assessing
the probability of successful improvement of the process as step nine and
combining these with the assessment for criticality by multiplying them as
step ten. One possible approach is to promote the value of project
management for process improvement which would then call for a similar
type of assessment before starting the project. Many successful projects
have a ‘business case’ written before commencement of the project so that


future measures of success are compared with the benefits outlined in the
business case.

Variations to the targeting methodology are shown in grey highlight.


Version 4 of the Targeting Method


1. Preplanning
1.1. Assessing participants
1.2. Preparation of any documents
2. Defining Scope
2.1. Identify the processes
2.2. Introduction of the project as a whole to the project team
3. Developing a Balanced Scorecard (BSC)
3.1. Identify the goals and strategies and objectives of the entity
3.2. Identify the cause & effect linkages within the BSC
3.3. Link processes identified earlier (2.2) to internal process objectives
4. Assessing the Impact of Processes on Goals
4.1. Assess the impact of each process on goals using heuristics and total
5. Assessing Dependency
5.1. Agree on the method to be used for assessing dependency
5.2. Identify the criteria to be used and rate each process
6. Assessing Probability of Failure of the Process
6.1. Agree on the method to be used for assessing probability of failure
6.2. Identify the criteria to be used and rate each process
7. Calculate the Criticality of each Process
8. Assess the Cost/Benefit of Improving the Process
8.1. Agree on the method to be used for assessing cost/benefit
8.2. Identify the criteria to be used and rate each process
9. Assess the Probability of Successful Improvement of the Processes with
positive cost/benefit
9.1. Agree on the method to be used for assessing probability of success
9.2. Identify the criteria to be used and rate each process
10. Selection of which Critical Process to Improve First
10.1. Rank order the processes with positive cost/benefit by greatest
probability of successful improvement. Those processes with the
greatest probability of success and greatest cost/benefit should be
improved first


An explanation of each part of the generic 10 part process

1. Preplanning
1.1. Assessing Participants
This step involves some research of the participants and their organisation. If
possible initial contacts should be used to assess the knowledge of the
participants and who should attend the first implementation meeting.
1.2. Preparation of any documents
Before the first meeting of the project team there are a number of documents
which should be produced. The first is a simple outline of the targeting
methodology that is to be used. The second is the terms and meanings of the
words within the BSC. Initial meetings need to be structured: provide an agenda, a list of terms and definitions (regardless of previous experience) and a description of a basic BSC.

2. Defining Scope
2.1. Identify the processes applicable to the project
The processes, and the level at which they should be seen, are identified at this point. Conduct this step before the introduction of the entire project, though this is dependent on the situation or context in which you are implementing the targeting method.
2.2. Introduction of the project as a whole to the project team
Targeting methodology projects should have a less detailed introduction that lists the basic ten steps of the method. This should then be followed by an introduction of each step in greater detail at the beginning of each step. The detail required depends on the number of participants and the time available. It should be verified that the participants are familiar with the BSC and its terms. Agreement should also be reached as to the lines of communication and confidentiality.
The first meeting should ascertain through discussion the knowledge base in regard to the BSC. It is necessary for an effective implementation that the research team and the participants have a clear understanding and agreement of what the BSC is, how it is supposed to work and definitions of the terms used. Outcomes of this meeting should be that all participants


speak the ‘same language’ and there is a plan of how to go forward. Ideally
this plan will list suggested documents from which goals, strategies and
possible objectives may be sourced.
Define the business area in which to conduct the project (not all targeting projects are implemented for the whole organisation).
project should then be assessed and the roles and responsibilities for the
project outlined. Define and achieve agreement from the participants as to
the necessary amount of time needed for successful completion of the
project.

3. Developing a Balanced Scorecard (BSC)


Introduce this step in detail. Conduct this step and step four before assessing dependency and probability of failure, now steps five and six. Use the mind-mapper software for visualising the BSC maps. In addition, in longer implementations there is a need to use further documentation to assist participants in remembering the meanings of objectives and strategies. A further requirement in longer implementations is to provide summaries of what has occurred previously and what still needs to be completed to implement the process.
3.1. Identify the goals, strategies and objectives of the entity in the
project
If necessary, provide an explanation of the BSC; the goals and strategies of the entity are then provided. Develop objectives as the mini goals for
the strategies and place these objectives within the agreed perspectives of
the BSC. Identify a need for using a fifth perspective such as ‘partner’ before
using it. It would also be advantageous to obtain agreement of the
participants for the names of each of the other perspectives.
3.2. Identify the cause & effect linkages within the BSC
Using the experience and skills of the project team identify the cause and
effect linkages within this BSC. If necessary add objectives to allow for
possible causes and effects.
3.3. Link processes identified earlier (2.2) to internal process objectives
With the processes identified earlier (or now), the project team link those processes which impact upon the internal process objectives within the BSC.


Provide a visual ‘map’ of the BSC for the participants to assess for
correctness of the work so far and make changes from any feedback.
A separate BSC needs to be drawn up as each management focus of an organisation is brought into the project. All BSCs produced in the targeting methodology should be completed as an aid to further use of the results.
and lower level managers who are not involved in strategy formulation.
Improvement of the visual ‘map’ of the BSC may reduce the confusion and
lead to better communication.

4. Assessing the Impact of Processes on Goals


Introduce this step in detail.
4.1. Assess the impact of each process on goals using heuristics
Using heuristics, assess the impact of each process on the objectives, strategies and ultimately the goals of the entity. Each link is assessed one at a time, but in relation to all of the processes and objectives impacting on the objective, strategy or goal. Calculate totals for each process by multiplying the percentages together along the links.
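The sketch below illustrates 'multiplying the percentages together along the links' for a single hypothetical branch of a BSC map; the chain and the percentages are placeholders, as the real values come from the map developed in step 3.

# One hypothetical branch: process -> internal process objective -> strategy -> goal,
# with the impact percentage assessed at each link (placeholder values).
chain = [
    ("Example process", "Internal process objective", 0.30),   # 30% of the objective
    ("Internal process objective", "Strategy", 0.50),          # 50% of the strategy
    ("Strategy", "Goal", 0.60),                                 # 60% of the goal
]

# The total impact of the process on the goal is the product of the link percentages.
total = 1.0
for source, target, share in chain:
    total *= share

print(f"Impact of the process on the goal: {total:.1%}")  # 0.30 * 0.50 * 0.60 = 9.0%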

5. Assessing Dependency: the effect of failure of a process on the organisation
Introduce this step in detail.
5.1. Agree on the method to be used for assessing dependency
An improved approach to the assessment of the factor of dependency is required. It is possible that an improved and more complete explanation of the criteria for assessing dependency would provide an improved result. In addition to better information, there should be agreement between participants as to the criteria used and the weighting for each criterion. Step 5.2 identifies the major criteria for dependency. This project aimed to use heuristics for this task, and this was completed by assessing the dependency of the entity on each process in relation to each of the other identified processes.
5.2. Identify the criteria to be used to rate each process


If greater reliability were needed, then the criteria for dependency would be used here, with a weighting for each (availability, reliability, safety, confidentiality, integrity and maintainability). Each process would then be rated by applying a rating to each criterion, multiplying it by the weighting for that criterion, and multiplying all of these together for each process.
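A sketch of that more reliable rating, using the six criteria named above with invented weights and ratings for a single process, might look as follows. The multiplicative aggregation simply follows the description in 5.2; it is one possible reading of that step rather than a prescribed formula.

# Criteria from 5.2 with hypothetical weights, and hypothetical ratings for one process.
weights = {"availability": 1.5, "reliability": 1.2, "safety": 1.0,
           "confidentiality": 1.0, "integrity": 1.1, "maintainability": 0.8}
ratings = {"availability": 8, "reliability": 7, "safety": 5,
           "confidentiality": 6, "integrity": 7, "maintainability": 4}

# Rate each criterion, multiply by its weighting, then multiply the results together
# to give a single dependency score for the process.
dependency_score = 1.0
for criterion, weight in weights.items():
    dependency_score *= ratings[criterion] * weight

print(f"Dependency score for the process: {dependency_score:,.1f}")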

6. Assessing Probability of Failure of the Process


Introduce this step in detail.
6.1. Agree on the method to be used for assessing probability of failure
The use of heuristics was suggested.
6.2. Identify the criteria to be used to rate each process
An improved approach to heuristics is to base your judgements on prior knowledge and facts and then predict a possible result. An improved approach to the assessment of the factor of probability of failure is required. It is possible that an improved and more complete explanation of the criteria for assessing probability of failure would provide an improved result. In addition to better information, there should be agreement between participants as to the criteria used. No criteria are required for a heuristic approach to this assessment. Additional information on failure could be the types of failure: over performance, failure over time, intermittent failure, partial failure and complete failure.

7. Calculate the Criticality of each Process


Introduce this step in detail. Multiply the ratings or values for impact, dependency and probability of failure. This total is the criticality of each process.
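As a worked example using the REALTECH figures in Figure 57, the process Manage Capacity was rated 18.3 for impact, 7 for dependency and 8 for probability of failure, giving a criticality of 18.3 x 7 x 8 = 1024.8.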

8. Assess the Cost/Benefit of Improving the Process


Introduce this step in detail. Actively encourage the use of this step.
8.1. Agree on the method to be used for assessing cost/benefit
If the projects are very small then heuristics may be suitable, otherwise the
suggested approach is to develop a business case for each process as a
separate project.
8.2. Identify the criteria to be used to rate each process


Depending on the approach taken above in 8.1, the criteria would include costs for resources and non-project related incurred costs, and tangible and intangible benefits. Only those processes with a positive cost/benefit would then be assessed for the probability of successful improvement.
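A minimal sketch of how this screening could be recorded is shown below; the cost and benefit figures are entirely hypothetical and would normally come from the business case prepared for each candidate improvement project.

# Hypothetical business-case figures for two candidate improvement projects.
projects = {
    "Manage capacity": {"costs": 40_000, "tangible_benefits": 65_000, "intangible_benefits": 10_000},
    "Billing":         {"costs": 30_000, "tangible_benefits": 20_000, "intangible_benefits": 5_000},
}

# Net benefit = (tangible + intangible benefits) - costs; only positive results go on to step nine.
carried_forward = []
for name, figures in projects.items():
    net = figures["tangible_benefits"] + figures["intangible_benefits"] - figures["costs"]
    print(f"{name}: net benefit {net:,}")
    if net > 0:
        carried_forward.append(name)

print("To be assessed for probability of successful improvement:", carried_forward)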

9. Assess the Probability of Successful Improvement of the Process


Introduce this step in detail. Actively encourage the use of this step.
9.1. Agree on the method to be used for assessing probability of
success
The suggested approach here again is to use heuristics as for 5.1.
9.2. Identify the criteria to be used to rate each process
Some criteria which may support the decision are: Team Orientation,
Project Management, Management Support, User Participation, Project
Championship and Communication.

10. Selection of which Critical Process to Improve First


Introduce this step in detail. Actively encourage the use of this step.
10.1. Rank order the processes with the greatest probability of successful
improvement.
Those processes with the greatest probability of successful improvement and
the best cost/benefit ratio should be selected first for improvement. This is
then essentially a business decision. The rank order is not a scientific
approach to selecting which processes to improve first; rather, it is an
improvement on current practice.
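
The last three steps can be illustrated together in a small, entirely hypothetical sketch: processes with a negative cost/benefit are dropped at step 8, and the remainder are ranked for step 10 by probability of successful improvement and then by net benefit. The process names and figures are invented for the example.

candidates = [
    # (process, net benefit of improvement in dollars, probability of successful improvement)
    ("Incident Management", 120_000, 0.7),
    ("Billing", -20_000, 0.9),       # negative cost/benefit: dropped at step 8
    ("Change Management", 60_000, 0.8),
]

viable = [c for c in candidates if c[1] > 0]                       # step 8 filter
ranked = sorted(viable, key=lambda c: (c[2], c[1]), reverse=True)  # step 10 ranking

for process, net_benefit, p_success in ranked:
    print(f"{process}: net benefit ${net_benefit:,}, P(success) {p_success}")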


This third cycle of action learning provided the research team with three
revisions to the ten step targeting methodology. The first was a revision to the
approach taken to explaining the methodology, which now takes into account
the time frame of the implementation of the methodology. The second was to
increase the involvement of the research team in assisting the participants to
develop strategy, for example by ensuring that the balanced scorecards
developed during the project are fully developed. The third was the proactive
encouragement of the participants to use the final three steps of the
methodology, which select which of the critical processes to improve first.

The following section is the fourth action learning cycle of the research and
will describe the implementation and observation phases of the cycle using
the case study method. The case study participant for this cycle is Citec, the
ninth largest outsourcing company in Australia. The reflection and revision
phases of the action learning will examine the results of the implementation
and observation phases. This fourth cycle is the last cycle of the action
learning and will complete the testing and improvement of the targeting
methodology.


7.5 Cycle 4 of Action Learning; Case Study Three



This cycle of action learning is the last cycle of the four cycles used to test
and improve the targeting methodology. We followed the same process using
the four phases, implementation, observation, reflection and revision as has
occurred in the previous cycles of action learning. Citec was the case study
participant in this cycle of action learning. During the case study we observed
and recorded the comments and issues that arose during the implementation
as well as the answers to questions asked of the participants at the case
study conclusion. The research team then reflected on the observations and
revised the targeting method, producing version 5.

This description of the third case study does not include the level of detail
seen in the first case study. The description focuses on the changes to the
methodology and the issues that arose during the case study which
differ from those found in the previous case studies. In this way we were
able to focus on the improvement of the targeting methodology in the action
learning cycles. The intent was not to perform a cross case analysis of each
case study although there is some learning taken from comparisons between
the three case studies.

7.5.1 Purpose and Business Problem

Citec staff members had been participants in the focus group and the Delphi
study, and at the original meeting which explained the whole research project
they indicated an interest in participating in this stage of the research project.

The original intent of Citec was to use the case study as their method of
selecting processes for process improvement projects. They were at the time
about to instigate a continual process improvement program in line with their
strategic plan which would see all ‘important’ processes reviewed for possible
improvement. They were to follow the current approach by most


organisations in selecting processes for review based on perceived
importance or perceived problems.

Thus the timing for the implementation of the targeting method was, although
coincidental, useful to Citec. It will be explained in detail in the ‘Actual
approach and observation’ section that, although the main intent of Citec was
to select which processes to improve, the participants also saw the
implementation as an opportunity to review the organisation’s strategic plan.
Unlike the previous case study participants, the Citec participants took
control of the project once they had made the decision to also use the
targeting methodology to improve their strategic plan.

7.5.2 Proposed Approach

The approach taken with this case study was the same as that undertaken
for the previous two case studies and included the revisions from the
reflection and revision phases of the last action learning cycle. These
revisions were to explain the ten step methodology in a broad sense initially
and then to explain in detail each of the ten steps as they were introduced. In
this way we would reduce confusion and also improve the retention of
participants regardless of the time frame of the project.

The second revision was to ensure that the balanced scorecards developed
were fully developed regardless of their need to be complete to support the
targeting methodology. Fully developing each BSC would provide strategic
information to the participants outside of the results of the targeting method.
The third revision was to actively encourage the use of the final three steps of
the targeting methodology. These three steps were the assessment of the
cost/benefit factor (step eight), assessment of the probability of success
factor (step nine) and the rank ordering of the processes with the greatest
probability of successful improvement (step ten).


The next section describes the action and observation phases of the fourth
action learning cycle.

7.5.3 Actual Approach and Observations

The Citec case study differed from the previous case studies in a number of
ways which will be explained in this section. The participants used the
targeting methodology for a purpose which the research team had not
considered previously, to reduce the complexity of their corporate strategic
plan. They were far more proactive in their approach to the case study, which
was seen in how they actively pursued the goals of the project and spent
time on the project without the research team. The research team conducted
seven meetings over eight weeks from 15th Oct. 2002 to the 10th Dec. 2002
for a total of fourteen and a quarter hours. Table 53 lists the meeting times
and the duration of these meetings with the Citec participants.

Meeting:   1         2         3        4         5         6        7         Total of 7 meetings
Date:      15th Oct  22nd Oct  5th Nov  12th Nov  26th Nov  3rd Dec  10th Dec  Total of 8 weeks
Duration:  ¾ Hr      1½ Hrs    2 Hrs    3 Hrs     3 Hrs     2 Hrs    2 Hrs     Total of 14 ¼ Hours

Table 53- List of meeting and times for Citec case study

In Table 53 above, we can see that the meetings with Citec were spread over
an eight week period and, apart from two fortnightly gaps (22 Oct. to 5 Nov.
and 12 Nov. to 26 Nov.), were weekly meetings. Although the meetings were
intended to be of one and a half to two hours duration, we can see that
two meetings (12th Nov. and 26th Nov.) were an hour longer.

The research team met with the participants from Citec on one occasion
before actually starting the case study. This meeting was on the 15th of
October 2002 at the Citec offices located at 317 Edward Street, in the central


business district of Brisbane. The research team met with one participant
(Peter Marshall) on this occasion to provide an explanation of the targeting
methodology and to answer any questions. At this stage we had not received
an agreement from the participants to undertake the case study.

The meeting lasted almost three quarters of an hour and included some
detailed explanations of the outcomes of the targeting methodology. We
discussed starting the case study with the Citec strategic plan and then
developing a balanced scorecard (BSC) for a service which provided
“Desktop pc” support. In this functional area, Citec provided some customers
with a twenty four hour/seven days a week service, which included a help
desk, hardware and software maintenance and database services. Peter
Marshall thanked us for our time and we arranged to meet again the following
week to introduce the project to the second participant Terry Collins.

The following week on the 22nd October the research team and the two
participants met at the Citec offices in Edward Street in the fourth floor
boardroom. This room contained a large board table and a data projector.

The agenda document used at this meeting was similar to those of the
previous case studies. At this first meeting with both participants we did not
achieve all the points on the agenda. The first part of the meeting was as
usual to deal with the signing of consent forms and to answer any questions
concerning confidentiality. With the consent forms signed we would normally
have looked at step 2.1 (Identify the processes) and then step 2.2
(Introduction of the project as a whole to the project team).

In this case we undertook step 2.2 first as the second participant had not
been present at the previous meeting. The participants then suggested that
we use their corporate strategic plan as the starting point and turn this into
their first BSC. The second half of the one and a half hour meeting was taken
up with their explanation of how their corporate strategic plan was meant to
be read and explaining the abbreviations used in it. They suggested that the


research team take the strategic plan and reassemble it into one of our BSC
maps using the mind-mapper software.

By the end of the second meeting we had explained the targeting methodology,
received the formal consent of the participants to undertake the case study and
identified the scope of the BSCs which would be used for the targeting method.
We had not yet identified the processes which would be used in the targeting
method, obtained a description of the scope of the services for the area of the
business we would ultimately focus on, or started developing the first
BSC. We did, however, have a corporate strategic plan which contained five
planks or areas of focus. Each ‘plank’ is defined as a strategic focal area. All
the ‘planks’ combined make up the strategic ‘platform’. These planks were:
1. Strategic Portfolio: Corporate Services Focus
2. Marketing Orientation
3. Integrated Service Solutions Partnership
4. Excellence in People
5. Financial Viability/Internal Efficiencies

The planks formed the five major areas of the strategic plan, with each plank
of the strategic plan containing one or more goals and nearly seventy
strategic objectives. Unknown to the research team at this stage, the
participants intended to use the case study to identify those parts of the
strategic plan which the participants considered most important. They sought
to reduce the complexity of the strategic plan by utilising the targeting
methodology's impact assessment process. Once they had reduced the
strategic plan to those items which were most critical they intended to use the
most critical items as the basis of the first BSC within the targeting
methodology.

The next meeting at the Citec offices was on the 5th of November at 12.30.
This meeting started with a review of the research team’s attempt to
reassemble the Citec strategic plan into a BSC map. The research team had
attempted to place each of the ‘planks’ of the strategic plan into one of the
four perspectives of the BSC. That is, financial perspective,


customer/supplier perspective, internal process perspective and knowledge
management perspective. In addition, we renamed the ‘planks’ to use a
naming convention similar to that used in many BSCs. This BSC map is shown
below as Figure 61.

[Figure 61 content omitted: an encrypted rendering of the Citec strategic plan
arranged as a BSC map, with the goal 'Optimise Financial Returns' at its centre
and strategic goals, objectives and sub-objectives grouped under the financial,
customer/supplier, internal process and knowledge management perspectives,
as indicated by the legend.]

Figure 61- BSC Perspective view of Citec strategic plan

(This diagram is intentionally too small to read and has been encrypted to
preserve the confidentiality of the strategic plan.)
The result of re-naming the Citec strategic plan ‘planks’, goals and objectives
was a combination of strategic goals, sub-goals and objectives. The legend
in the top right hand corner of Figure 61 indicates the naming and colour
convention used in the maps. We provided a coloured text above each goal
to indicate which perspective the goal would align with. For example, in


Figure 62 the top left hand section contains the text customer/supplier
perspective above the gold highlighted text ‘Optimise Market Positioning’. We
provided additional explanations of some goals by adding a block of text
beneath the goal. This can be seen in the diagram as blue text surrounded
by a grey line. This type of further explanation is in line with the changes to
the targeting methodology from previous action learning cycles.

[Figure 62 content: an excerpt of the BSC perspective view showing goals such
as 'Optimise Market Positioning', 'Position Citec as a trusted business partner
within targeted industry verticals', 'Significantly improve the efficiency of the
organisation's service delivery process to the point of competitive advantage'
and 'Significantly improve the organisations' market responsiveness, becoming
a marketing oriented company', each labelled with its BSC perspective; the text
at the right-hand edge of the excerpt is truncated.]

Figure 62- Section of the BSC perspective view of the Citec strategic plan

The BSC developed with Citec included all four perspectives, which was also
an outcome of the previous action learning cycle.

The Citec participants raised an important issue during the meeting which
altered the approach taken with the development of the BSC. One participant
was concerned that introducing new terminology for the strategic plan would
only lead to confusion and disenchantment amongst managers and staff who
had already spent time understanding the present terminology. It was then
agreed that the existing terminology would be used and that we would not
mention that it was a BSC map. In addition to this issue, both participants
explained that the BSC map produced by the research team did not take into
consideration the cause and effect linkages that their present strategic plan
contained. In order to capture these links in a


visual map we would need to ignore the perspective views, as these added
complexity and confusion to communication. We would also need to change
the visual map to show the ‘layered’ approach taken by the Citec strategic
plan. The diagram below, Figure 63, is an example of the layered approach
used in the next version of the strategic plan visual map.

[Figure 63 content: a layered map read from right to left, in which national and
regional objectives feed sub-goals ('Excellence in People' in the first layer,
'Integrated Service Solutions' and 'Marketing Orientation' in the middle layer),
which in turn feed the strategic goals (including 'Become a profitable
organisation') in the right-hand layer.]

Figure 63- example of the type of layering needed to show cause & effect linkages

In Figure 63, the map is read from right to left with the strategic goals and
sub-goals highlighted. The layering we are referring to is shown by the
positioning of the sub-goals and strategic goals. The two vertical lines have
been added to enhance this layering effect. The first layer is the two sub-
goals called Excellence in People. These two sub-goals impact upon all six of
the sub-goals in the next layer. The middle layer contains three sub-goals
‘Integrated Service Solutions’ and three sub-goals ‘Marketing Orientation’.
The link between these six sub-goals and the previous two to the left (in the
first layer) is indicated by the arrows and the branch. The same type of
linkage is shown between the six sub-goals (in the middle layer) and the
three strategic goals (in the right hand layer). All six sub-goals impact upon
all three strategic goals.


With the layout of the map agreed by the participants we concluded this
meeting and arranged to meet again on the 12th of November. The major
outcome of this meeting was the agreed layout of the Citec strategic plan
using the mind-mapper software.4

While the literature suggests that using different names for the perspectives,
and even adding perspectives, is appropriate to suit particular industries, the
research team saw their experience as evidence that the main driver of this
type of change is the promotion of clear communication. This altered the
research team’s original notion that perspective names needed to be changed
to suit particular industries. We now assert that perspective names should be
changed if the change improves communication within an organisation or a
department.

The case study continued with the following two meetings (12th & 26th Nov.)
improving upon the layout of the strategic map and reducing the number of
objectives within the strategic map. The participants used cause and effect
relationships to position the goals and objectives within the map. Much of this
work occurred between meetings without the research team. The participants
also decided to segment the objectives and goals of the strategic plan into
long term and short term objectives and goals. There were some objectives
that were removed as being applicable to a previous time frame and others
that had already been completed. With some objectives removed, the
participants then agreed on the impact percentages for the remaining
objectives and goals within the strategic plan. A further map was printed
showing these impacts. Some had been developed during the meeting and
others by the participants on a separate occasion. The Strategy Development
manager had also taken the map showing impact assessments to a managers’
meeting, at which the other managers provided that participant with their
thoughts on the

4
The issue raised concerning the renaming of the planks and goals within the Citec strategic plan became part of
a presentation to the attendees at the 2002 Australian & New Zealand Academy of Management (ANZAM)
conference.


assessments. This added to the participants' consensus on the output of the
case study.

Although the participants had, by this stage in the case study, a good
knowledge of the targeting method, they chose to undertake a review of the
goals and objectives within their strategic plan. They sought to reduce the
strategic plan by using the impact assessments to identify which of the three
strategic goals they should focus on.

These three strategic goals were:


1. Adopt a Corporate Services focus for Citec’s product set
2. Strategically manage Citec’s portfolio to optimise market positioning and
financial returns
3. Strategically manage the underlying business processes in order to
achieve whole-of-Citec integrated services

Of these three strategic goals, the participants chose to focus the project on
the third goal: to ‘strategically manage the underlying business processes in
order to achieve whole-of-Citec integrated services’. This approach reduced
the number of objectives available to the case study project and the size of
the project.

Initially, the participants assessed the impact of objectives on the goals of the
organisation in the map that had been produced. There were at this stage
no processes linked to the objectives.

Between the meeting on the 26th of November and the next meeting on the
3rd of December the participants undertook to develop a list of processes to
suit the strategic map we had thus far developed. These were high level
processes under six categories (Field Operations, Service Operations,
Human Resources, Financial & Corporate Affairs, Branch Operations and
Technical & Strategic Services).


The table below lists the forty-two processes identified by the Citec
participants to be used in this case study. There are six categories and one
sub-category (Service Operations contains the sub-category of IT Services
Management). These processes are to be linked to the objectives in the
strategic map.

Service Operations
  Software Development
  Technical Information & Communications Technology (ICT) Infrastructure
  IT Services Management (sub-category)
    Incident Management
    Problem Management
    Change Management
    Release Management
    Configuration Management
    Service Level Management
    Capacity Management
    Availability Management
    Financial Management
    IT Service Continuity
    Billing
    Security Management

Field Operations
  Information Gathering & Dissemination
  Strategic Planning
  Product Management
  Partnership Management
  Communications Management
  Bids, Quotes & Bid Management

Branch Operations
  Business Development (New Business)
  Account Management (Existing Clients)
  Sales Support
  Bid Management

Financial & Corporate Affairs
  Financial Administration
  Financial Accounting
  Financial Reporting
  Purchasing
  Budgeting
  Contract Administration
  Administration

Human Resources
  Payroll
  Workforce Planning
  Training & Development
  IR & EEO
  EAS
  Recruitment & Selection

Technical & Strategic Services
  Innovation & Product Development
  Security
  Architecture & Design
  Asset Management
  Information Management

Table 54- List of the 42 processes used by Citec

At the meeting conducted on the 3rd of December the case study participants
undertook an alternative approach to the steps previously conducted at this
stage in the targeting method. The participants linked processes to all
objectives, not just those which might have been categorised as internal
process perspective objectives.

The strategic maps developed to date in the case study did not contain many
of the components of the typical BSC. For example, as there were no
perspective views (financial, customer/supplier, internal process and
knowledge management), identification of which objectives were considered
internal process perspective objectives (or financial perspective objectives,


customer/supplier perspective objectives and knowledge management
perspective objectives) had not occurred.

The case study had, thus far, deviated from the targeting methodology by not
developing a traditional BSC. Although the participants had not developed a
typical BSC they had developed a strategic plan with the goals and
objectives positioned to show the cause & effect linkages. In this way they
had achieved the outcome for steps 3.1 and 3.2 (to identify goals, strategies
and objectives and to show the cause & effect linkages).

We were now up to step 3.3, which is to link the processes to the internal
process objectives. As the participants had not identified which objectives
were internal process objectives, they chose to link their identified processes
with all the objectives within the strategic map. This decision led to the
strategic map, when printed on A0 width paper (840 mm wide), being 2.8
metres long. There were four hundred and twenty-three process-to-objective
links between thirty-four objectives and the forty-two processes identified
earlier by the Citec participants (Table 54).

The research team took the list of links between process and objective
developed during the meeting and over the following week produced a large
scale strategic map which showed the processes linked to the objectives
within the strategic map.

At the last meeting on the 10th of December the research team presented the
participants with the full scale strategic map. Figure 64 is the strategic map
(11% of original size) with the individual processes linked to the objectives.
The diagram also showed the categories of processes which were linked to
the objectives.


Figure 64- Complete strategic map for Citec including processes linked to objectives


Although the strategic map shown in Figure 64 is too small to read, it shows
the overall layout: the colour down the left side of the diagram represents the
423 process links to the thirty-four objectives, and the horizontal lines of
green are the objectives. The diagram is the view that the participants had of
the strategic plan when they were assessing the impact of the identified
processes on the objectives in the strategic map. The 2.8 metre long strategic
map was laid onto the boardroom table so that the participants were able to
see the linkages within the map.

The layout of the map is clearer in Figure 65 where we have removed the
processes and left only the functional areas in which the processes were
categorised.

Figure 65- Citec Strategic Map not showing processes linked to objectives


To the left of the diagram, the processes would have been linked to the
objectives. They are represented in this diagram by the blue, plum, orange
and dark yellow colours.

This was the last part of the targeting methodology completed with the
assistance of the research team. Due to confidentiality, the results of the
participants' assessment of the processes for this case study are not shown
in this thesis. The participants, in their own time, undertook the assessment
of the dependency and probability of failure factors.

There were a number of reasons for the early conclusion of the case study.
Citec had that week hired a new CEO and the CEO put on hold any strategic
activities until he had time to review them. The Christmas period was nearly
upon us and that meant many extra tasks for the participants before the
break. Both participants were to take time off after Christmas and one
participant (the main instigator of the project) was away from the company till
late February 2003. In addition, by the time the participants were ready to
complete the case study the new CEO had scrapped the strategic plan. After
contacting both of the participants (during March 2003) the research team
concluded the case study. We then sent an email to both participants with the
five questions used for the REALTECH case study. The answers are
described and examined in the reflection section.

7.5.4 Reflection Phase Cycle Four

This section is the third phase of four phases of the action learning cycle and
will examine the output of the case study. We will analyse the issues that
arose during the case study and in particular the answers provided by the
participants to the five questions emailed to them at the conclusion of the
case study.

Reflection by the Participants


The major data from the case study is provided by the answers to the
questions by the participants. These answers are shown as quotes here in
this section and examined in the reflection section which follows.

1. Are the results of the process what you expected?


“Yes. Our intention at the outset was to identify and prioritise our strategic
planning objectives by understanding the criticality of the underlying business
processes that contribute to their achievement. By following the process we
were able to establish a core set of objectives and underlying processes that
will give us the best return for our effort.”
“We had wanted from the process a way of weighting and rating the
importance or impact of each of the objectives of the strategic plan. We had
also wanted help in creating something of a critical path in the relationships
between objectives, which objective would better precede the others.”

“We had to make choices on values based on our experience with the
organisation. What I found was that the software package enabled us then to
hold together the choices we made and to present these in a unified
manner.” “The choices were, of course, subjective. The next stage was to run
these past others for validation. We found that others were interested in the
connections and presentation and keen to make suggestions of their own.”

2. Are the results of the process useful to you as a manager?


”Yes, is a number of ways;
1. Clarifying the relationships between objectives and organisational divisions
2. Demonstrating the contribution of a particular business process or
objective to the overall company results, enabling concentration on, or
elimination of, objectives as appropriate.
3. As a communication tool for both staff and senior management”
“Yes, they were useful, in that they made me deliberate longer on the
impacts and critical paths. They were also helpful in creating senior
management interest in the relationships between objectives and in creating
a conversation around objectives and their interrelationships. Equally
important was the ability to reduce a large number of objectives to those with


the most critical impact now and, therefore, those that should be
implemented first. In other words, it enabled something of a project plan to
emerge around strategic objectives.”

3. Are the results of the process useful to you as a company?


”Yes, see above.”
“In addition, even though the executive management team has changed and
the Strategic Planning activity in this form is no longer in vogue, the specific
core objectives identified and the focus on related business processes is
continuing.”

4. Was there anything in the identification process which you didn't
understand or had difficulty with?

“No. The only issue for us was the volume of assessment information we
generated, however we understood this at the start. I believe this was driven
by our desire to take a comprehensive approach to the assessment rather
than apply the approach to a particular segment of our strategic plan.” “The
process is similar to other processes used in project planning and in
recruitment and selection.”

5. Are you able to suggest any improvements to the way I implemented the
process?
”The overall approach was clearly and simply represented. A small case
study over-viewing the entire process would have been useful to demonstrate
the whole process. (Obviously this is an outcome of the research! so I
suggest it to enhance the approach and the model)”

“My only caveat for the whole process was that the discussion that I would
have occasionally with [the second participant]. That is, this is a type of “left-
brain” tool, meaning that it improves analysis. While it is helpful in showing
links between objectives there is no use in over using it for a couple of
reasons. One is the old analysis paralysis syndrome of getting lost in
arguments over the weightings and links and not getting the job done.
Another is not realising that the game keeps changing each day and


yesterdays assessments and links are not today’s. A third is that it can tend
to make one look too much within the organisation when the organisation
needs to be looking outside at clients and market. Fourthly, it is not “right-
brain” in that it does not reflect, and never can reflect, that most effective
strategies and successful outcomes depend more on the personalities of
leaders and their ability to engage and enthuse others as well as follow
through, than on detailed analysis of strategic objectives.”

There is a large amount of very useful information in the comments of the
participants and it will be examined in the next section.

The participants considered that the targeting method process was simple to
understand, confirming the improved approach to explaining the steps. The
comment that “the only issue for us was the volume of assessment
information we generated” was evidence of this improvement. One
participant also suggested that the process was similar “to other processes
used in project planning and in recruitment and selection”, which would have
been a contributing factor to their understanding.

Both participants considered the results of the targeting method to be what
they had expected. One comment suggested that the targeting method
demonstrated “the contribution of a particular business process or objective
to the overall company results, enabling concentration on, or elimination
of, objectives as appropriate”. The second participant supported this type of
contribution by saying that the process provides the “ability to reduce a large
number of objectives to those with the most critical impact now and,
therefore, those that should be implemented first”.

The research team had not previously considered that the targeting method
would be used in this way and it is a useful and valuable addition to the
benefits of the method. This is further evidence that the targeting method
process is a valuable methodology: it can be used as an identification process
for critical processes, as we have suggested, and, as suggested by the
participants, for identifying the critical objectives of an organisation.


The participants had an objective which overrode the research objective of
the case study: to use the targeting method to prioritise strategic
objectives. This new objective is summarised by the answer of one
participant who stated that “our intention at the outset was to identify and
prioritise our strategic planning objectives by understanding the criticality of
the underlying business processes that contribute to their achievement”. The
participants also considered that they were successful in achieving this
outcome.

Additional benefits of the targeting method were provided by what one
participant called a “critical path in the relationships between objectives”.
“[It] clarified the relationships between objectives and organisational divisions”
and, as the other participant noted, the results “were also helpful in creating
senior management interest in the relationships between objectives”. He
summarised the method’s impact on relationships by saying that the
targeting method “enabled something of a project plan to emerge around
strategic objectives”.

The mind-mapper software was given some praise, with one participant
saying that “the software package enabled us then to hold together the
choices we made and to present these in a unified manner”. The participants
were not the only ones to have been involved in the case study, with the
maps being used to successfully communicate the results developed during
the case study meetings to other executives. “Others were interested in the
connections and presentation [of the information] and keen to make
suggestions of their own”.

These comments are positive support for the software and for the visual
layout which had taken so long to develop. The departure from the
traditional balanced scorecard terminology and structure, and the focus on the
cause & effect relationships as the primary concern, were supported by the
comments of the participants and other executives within Citec. Ensuring that


the language used was familiar and thus understood would have also been a
contributing factor to the strategic map’s acceptance.

Longer term benefits of the targeting method were noticed by the participants
when the new executive management team took over. Although the strategic
plan used has “changed and the Strategic Planning activity in this form is no
longer in vogue, the specific core objectives identified and the focus on
related business processes is continuing”. The participants considered that
the output of the targeting method had provided them with the same core
objectives that the new executive management team had identified for future
use. The strategic planning activity which is no longer in vogue is not the
targeting method but the use of silo structured strategic planning. A silo
structured strategic plan is one that does not show or allow for the
intersection of objectives with other objectives or goals (Davidson and Griffin
2000).

The last question asked of the participants was to consider any possible
improvements to the process. The first participant suggested that although
the explanation of the targeting method was simple and clearly presented it
might be improved by providing “a small case study over-viewing the entire
process”. A further comment by this participant was that it was expected that
the research project would provide a suitable case study for this purpose.
Outputs of this thesis are case studies which might be used in future
implementations of the targeting method.

The second participant’s response to the question was quite detailed. He
made several cautionary suggestions which we have summarised:
1. That we should be wary of paralysis by analysis
2. Yesterday’s assessments and links are not today’s
3. “That [the targeting methodology] can tend to make one look too much
within the organisation when the organisation needs to be looking outside at
clients and market”
4. “That most effective strategies and successful outcomes depend more on
the personalities of leaders and their ability to engage and enthuse others”


The research team believe that the first point indicates a possible preference
for the use of heuristics for assessments rather than the use of more detailed
and quantitative methods.

The second point suggests the targeting method should not be treated as an
activity that occurs only once a year, but should instead be reviewed as the
external and internal environment changes. This type of regular review would
ensure the validity of the decisions made with the targeting methodology.

The third point suggests that a focus on internal processes may cause a
reduction in the client or market focus. If the original strategies developed are
appropriate for the organisation and the environment it is active within, then it
follows that there will always be a need to improve the activities of the
organisation in order to more effectively and efficiently achieve the
organisational goals. This line of reasoning supports the use of the targeting
method.

The fourth point suggests that in most cases organisational outcomes are
highly dependent on the leader engaging and motivating the organisational
members. The targeting method in this case study increased executive input
and ‘buy-in’ to the strategic plan as well as improving the understanding of
the strategic plan. This is, at the very least, positive support for the leadership
of the organisation and thus a valuable aid to leadership in motivation and
communication.

The following section describes the issues raised by the research team and
changes made in the previous action learning cycle.

This case study differed from the first two studies in that it was apparent early
in the project that the participants wanted to use the project to achieve some
high level strategic needs. Although this meant that the scope and length of
the case study would be much larger than the previous case studies, the
research team believed that this would be balanced by the proactive


approach of the participants and validate the targeting method. Consequently,
the project started with the strategic plan for the company, which had far more
detail than the previous two case studies. Much of the initial meeting time was
taken up with the transfer of the existing corporate strategic plan into a form
suitable for showing the cause and effect relationships perceived by the
participants. This went through a number of iterations until a final form was
agreed. The final form did not show the objectives as sitting within a
particular perspective, nor any perspective views.

The linkages of cause and effect within the strategic map were between the
forty-two identified processes and the objectives in the map. The participants
did not differentiate between objectives that might belong to the internal
process perspective or to any other BSC perspective. All objectives in
the strategic map were linked to one or more of the identified processes. The
reasoning behind this decision was, initially, to reduce possible confusion
within Citec caused by altering the existing strategic plan and adding
perspectives. The participants and the research team then considered that
linking processes to all objectives was still consistent with the methodology,
as these linkages were based on cause and effect and would not have been
made if they were not logical.

The Citec participants used the impact assessment method to help in
reducing the number of goals, sub-goals and objectives that were used in the
strategic map developed during the case study.

Their approach to the assessment of impact, probability of failure and
dependency for the case study was similar to that used by the REALTECH
participants. The Citec participants discussed each of the assessments using
the heuristic approach, explaining why they were considering a rating if there
was any disagreement with a particular assessment. The cost/benefit and
probability of success assessments were not undertaken at this level of the
process, as the participants did not wish to undertake an involved cost/benefit
analysis on the improvement of such large and comprehensive processes.
Instead, they would drill down into the most critical processes to provide them


with a list of less complex processes and select one of these. Once a
process was selected from within the most critical processes they would
carry out the selection assessments of cost/benefit and probability of
success.

We applied the following revisions from case study two: altering the
explanatory steps to improve the participants' understanding, changing the
methodology to ensure that if a corporate BSC was used then that BSC was
fully developed, and actively encouraging the use of the final three steps of
the method.

The first successful revision was the change to the explanatory steps to
improve the understanding of the participants. The methodology was
explained at a high level on two occasions at the beginning of the case study
and each step was explained in detail as the case study progressed.
Evidence of the success of this approach could be seen in the participants
being able to explain the process to other executives, and in those executives
understanding the methodology sufficiently to then provide useful input.
Further evidence might be that the participants effectively ran the project and
completed some parts of the methodology outside of the meeting times.

The second successful revision was the change to the methodology to
ensure that if a corporate BSC was used then that BSC was fully developed.
The context here was that after developing a corporate BSC a business unit
BSC would also be developed. It was considered that providing a fully
developed corporate BSC would add value to the methodology by providing
the participants with information that would allow the organisation to
understand and value the criticality of the business unit goals in relation to
those of the organisation (when business unit BSCs were also developed). In
this way we would also be assisting the organisation in “growing the business”
and “running the company”. Although in practice this was not tested completely,
the project did complete the corporate BSC in sufficient detail to provide
valuable business benefits. This is evidenced by some of the comments
provided by the participants in the participant reflection section.


The third and unsuccessful revision was to “actively encourage the use of the
final three steps of the method” (Step 8 Assess the Cost/Benefit of Improving
the Process, step 9 Assess the Probability of Successful Improvement of the
Process and step 10 Selection of which Critical Process to Improve First). In
this case study although the participants were interested in using the final
three steps the change in CEO and other executive staff combined with their
intent to completely revise the existing strategic plan, meant that to continue
the project would have provided little value.

This reflection phase has provided considerable information which supports
the targeting methodology’s ten step model. The participants were positive
about the explanatory steps and realised benefits beyond those considered
by the research team. Two of the three revisions from the third action
learning cycle received positive evidence of their appropriateness. The
participants' suggestion for improvement, the use of a small case study as
an example, is useful and will be added to the revisions. The cautions
provided by the participants are more difficult to apply and should be added
to the explanatory notes.

Overall, this case study has provided reasonable evidence that at least the
first seven steps of the ten step model are appropriately structured for use
within the ASP service delivery industry.

The next section describes the final changes to the methodology guided by the
reflection phase of this action learning cycle.

7.5.5 Revision Phase Cycle Four

The final phase of the fourth cycle of action learning is the revision phase. In
this section we will describe the changes to be made to the ten step model
for implementing the methodology.


The revisions which the research team have decided to make to the ten step
model in this cycle of action learning are:
1. The future addition of a small case study to the explanatory step, and
2. The further expansion of the explanatory step to include the cautionary
aspects of using the method.

As a result of the case study, the research team suggests that a small case
study should be written using a fictitious company, with the case study
incorporating all the improvements to the methodology. This might be used to
provide the users of the targeting methodology with a descriptive explanation
of the method. Woven into the case study might be the further benefits and
uses for the method as well as the cautionary issues.

There were only two minor revisions to the model in this cycle, which
indicates the completeness of the ten step model developed in this research
project. The small number and size of the changes required for the revision
phase of the cycle are indicative of the continued validation of the targeting
methodology. The criteria against which this methodology is being validated
are the perceptions and responses of the participants and the perceptions of
the research team.

The next section highlights the areas within the ten step model which have
been revised (shown in grey) and is followed by a cross case analysis and
then a summary of the action learning cycles.


Version 5 of the Targeting Method


1. Preplanning
1.1. Assessing participants
1.2. Preparation of any documents
2. Defining Scope
2.1. Identify the processes
2.2. Introduction of the project as a whole to the project team
3. Developing a Balanced Scorecard (BSC)
3.1. Identify the goals and strategies and objectives of the entity
3.2. Identify the cause & effect linkages within the BSC
3.3. Link processes identified earlier (2.2) to internal process objectives
4. Assessing the Impact of Processes on Goals
4.1. Assess the impact of each process on goals using heuristics, and total
the results
5. Assessing Dependency of the Organisation on the Process
5.1. Agree on the method to be used for assessing dependency
5.2. Identify the criteria to be used and rate each process
6. Assessing Probability of Failure of the Process
6.1. Agree on the method to be used for assessing probability of failure
6.2. Identify the criteria to be used and rate each process
7. Calculate the Criticality of each Process
8. Assess the Cost/Benefit of Improving the Process
8.1. Agree on the method to be used for assessing cost/benefit
8.2. Identify the criteria to be used and rate each process
9. Assess the Probability of Successful Improvement of the Processes with
positive cost/benefit
9.1. Agree on the method to be used for assessing probability of
success
9.2. Identify the criteria to be used and rate each process
10. Selection of which Critical Process to Improve First
10.1. Rank order the processes with positive cost/benefit by greatest
probability of successful improvement. Those processes with the
greatest probability of success and greatest cost/benefit should be
improved first


An explanation of each part of the generic 10 part process

1. Preplanning
1.1. Assessing Participants
This step involves some research into the participants and their organisation.
If possible, initial contacts should be used to assess the knowledge of the
participants and who should attend the first implementation meeting.
1.2. Preparation of any documents
Before the first meeting of the project team there are a number of documents
which should be produced. The first is a simple outline of the targeting
methodology that is to be used. In addition to this outline there should be a
small case study which explains the steps taken, describes the further
benefits of the method and covers the cautionary aspects of the method's
use. The second is the terms and meanings of the words within the BSC.
Initial meetings need to be structured: provide an agenda, a list of terms and
definitions (regardless of previous experience) and a description of a basic
BSC.

2. Defining Scope
2.1. Identify the processes applicable to the project
The processes and the level at which they should be seen are identified at
this point. Conduct this step before the introduction of the entire project,
though this is dependent on the situation or context in which you are
implementing the targeting method.
2.2. Introduction of the project as a whole to the project team
Targeting methodology projects should have a less detailed introduction that
lists the basic ten steps of the method. This should then be followed by an
introduction of each step in greater detail at the beginning of that step. The
detail required depends on the number of participants and the time available.
It should be verified that the participants are familiar with the BSC and its
terms. Agreement should also be reached as to the lines of communication
and confidentiality.


The first meeting should ascertain, through discussion, the knowledge base
in regard to the BSC. It is necessary for an effective implementation that the
research team and the participants have a clear understanding and
agreement of what the BSC is, how it is supposed to work and definitions of
the terms used. Outcomes of this meeting should be that all participants
speak the ‘same language’ and there is a plan of how to go forward. Ideally
this plan will list suggested documents from which goals, strategies and
possible objectives may be sourced.
Define the business area in which to conduct the project (not all targeting
projects are implemented for the whole organisation). The time frame of the
project should then be assessed and the roles and responsibilities for the
project outlined. Define and achieve agreement from the participants as to
the necessary amount of time needed for successful completion of the
project.

3. Developing a Balanced Scorecard (BSC)


Introduce this step in detail. Conduct this step and that in step four before
assessing dependency and probability of failure (now steps five and six). Use
the mind-mapper software for visualising the BSC maps. In addition to this,
there is a need in longer implementations to use further documentation to
assist participants in remembering the meanings of objectives and strategies.
A further requirement in longer implementations is to provide summaries of
what has occurred previously and what still needs to be completed to
implement the process.
3.1. Identify the goals, strategies and objectives of the entity in the
project
If necessary, provide an explanation of the BSC, and obtain the goals and
strategies of the entity. Develop objectives as the mini goals for the
strategies and place these objectives within the agreed perspectives of the
BSC. Identify a need for using a fifth perspective, such as 'partner', before
using it. It would also be advantageous to obtain the agreement of the
participants for the names of each of the other perspectives.
3.2. Identify the cause & effect linkages within the BSC


Using the experience and skills of the project team, identify the cause and
effect linkages within this BSC. If necessary, add objectives to allow for
possible causes and effects.
3.3. Link processes identified earlier (2.2) to internal process objectives
With the processes identified earlier (or now), the project team links those
processes which impact upon the internal process objectives within the BSC.
Provide a visual 'map' of the BSC for the participants to assess the
correctness of the work so far, and make changes from any feedback.
A separate BSC needs to be drawn up as each management focus of an
organisation is brought into the project. All BSCs produced in the targeting
methodology should be completed as an aid to further use of the results.
This is especially necessary in organisations with few levels of management
and lower level managers who are not involved in strategy formulation.
Improvement of the visual 'map' of the BSC may reduce confusion and lead
to better communication.
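
As a purely illustrative sketch (in Python, with hypothetical process and
objective names that are not drawn from the case studies), the information
captured in steps 3.1 to 3.3 might be recorded in a form such as the
following, so that it can be reused in the later assessment steps:

    # Hypothetical sketch only: objectives grouped by BSC perspective,
    # cause-and-effect links between objectives, and processes linked to
    # internal process objectives (step 3.3).
    balanced_scorecard = {
        "financial": ["Increase revenue"],
        "customer": ["Improve customer satisfaction"],
        "internal process": ["Reduce service downtime"],
        "learning and growth": ["Improve staff skills"],
    }

    cause_effect_links = [
        # (cause objective, effect objective)
        ("Improve staff skills", "Reduce service downtime"),
        ("Reduce service downtime", "Improve customer satisfaction"),
        ("Improve customer satisfaction", "Increase revenue"),
    ]

    process_links = {
        # process -> internal process objectives it impacts
        "Incident Management": ["Reduce service downtime"],
        "Change Management": ["Reduce service downtime"],
    }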

4. Assessing the Impact of Processes on Goals


Introduce this step in detail.
4.1. Assess the impact of each process on goals using heuristics
Using heuristics, assess the impact of each process on the objectives,
strategies and ultimately the goals of the entity. Each link is assessed one at
a time, but in relation to all of the processes and objectives impacting on
the objective, strategy or goal. Calculate totals for each process by
multiplying the percentages together along the links.
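
One way the arithmetic described above could be carried out is sketched
below in Python; the processes, objectives and percentages are hypothetical,
and the only element taken from the method itself is the multiplication of the
agreed percentages along the cause and effect links:

    # Hypothetical sketch of step 4.1. Each link carries the percentage
    # contribution agreed by the participants; the impact of a process on a
    # goal is the product of the percentages along each chain of links.
    links = {
        # node -> list of (node it impacts, agreed percentage contribution)
        "Incident Management": [("Reduce service downtime", 0.6)],
        "Change Management": [("Reduce service downtime", 0.4)],
        "Reduce service downtime": [("Improve customer satisfaction", 0.7)],
        "Improve customer satisfaction": [("Increase revenue", 0.5)],
    }

    def impact(node, goal):
        """Total impact of node on goal: multiply the percentages along every
        path of cause and effect links that reaches the goal."""
        if node == goal:
            return 1.0
        return sum(pct * impact(parent, goal) for parent, pct in links.get(node, []))

    for process in ("Incident Management", "Change Management"):
        print(process, round(impact(process, "Increase revenue"), 3))
    # Incident Management 0.21, Change Management 0.14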

5. Assessing Dependency of the Organisation on the Process


Introduce this step in detail.
5.1. Agree on the method to be used for assessing dependency
An improved approach to the assessment of the dependency factor is
required. It is possible that an improved and more complete explanation of
the criteria for assessing dependency would provide an improved result. In
addition to better information, there should be agreement between
participants as to the criteria used and the weightings for each criterion.
Step 5.2 identifies the major criteria for dependency. This project aimed to
use heuristics for this task, and this was completed by assessing the
dependency of the entity on the process in relation to each of the other
identified processes.
5.2. Identify the criteria to be used to rate each process
If greater reliability were needed, the criteria for dependency (availability,
reliability, safety, confidentiality, integrity and maintainability) would be used
here, with a weighting for each. Each process would then be rated by
applying a rating to each criterion, multiplying this by the weighting for that
criterion, and multiplying all of these together for each process.
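
A minimal sketch of how the weighted criteria just described could be
combined into a dependency rating per process is given below; the
weightings and 1-to-5 ratings are invented for illustration, and the
combination by multiplication simply follows the description above:

    # Hypothetical sketch of step 5.2: rate each process against the
    # dependency criteria, weight each rating, and combine the weighted
    # ratings by multiplication into a single dependency score per process.
    weights = {"availability": 0.3, "reliability": 0.25, "safety": 0.1,
               "confidentiality": 0.1, "integrity": 0.15, "maintainability": 0.1}

    ratings = {  # illustrative ratings on a 1-5 scale
        "Incident Management": {"availability": 5, "reliability": 4, "safety": 2,
                                "confidentiality": 3, "integrity": 4, "maintainability": 3},
        "Change Management": {"availability": 4, "reliability": 5, "safety": 2,
                              "confidentiality": 4, "integrity": 5, "maintainability": 4},
    }

    def dependency(process):
        score = 1.0
        for criterion, weight in weights.items():
            score *= ratings[process][criterion] * weight
        return score

    for process in ratings:
        print(process, round(dependency(process), 4))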

6. Assessing Probability of Failure of the Process


Introduce this step in detail.
6.1. Agree on the method to be used for assessing probability of failure
The use of heuristics was suggested, as for step 5.1.
6.2. Identify the criteria to be used to rate each process
An improved approach to heuristics is to base judgements on prior
knowledge and facts and then predict a possible result. An improved
approach to the assessment of the probability of failure factor is required. It
is possible that an improved and more complete explanation of the criteria for
assessing probability of failure would provide an improved result. In addition
to better information, there should be agreement between participants as to
the criteria used. No criteria are required for a heuristic approach to this
assessment. Additional information on failure could include the types of
failure: over-performance, failure over time, intermittent failure, partial failure
and complete failure.

7. Calculate the Criticality of each Process


Introduce this step in detail. Multiply the ratings or values for impact,
dependency and probability of failure. This product is the criticality of each
process.
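
Expressed as a short Python sketch (all figures hypothetical), the calculation
in this step is simply the product of the three assessed factors:

    # Hypothetical sketch of step 7: criticality = impact x dependency x
    # probability of failure, calculated for each identified process.
    processes = {
        # name: (impact, dependency, probability of failure)
        "Incident Management": (0.21, 0.80, 0.30),
        "Change Management": (0.14, 0.65, 0.50),
        "Release Management": (0.09, 0.55, 0.40),
    }

    criticality = {name: impact * dependency * p_fail
                   for name, (impact, dependency, p_fail) in processes.items()}

    for name, value in sorted(criticality.items(), key=lambda item: item[1], reverse=True):
        print(f"{name}: criticality = {value:.3f}")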

8. Assess the Cost/Benefit of Improving the Process


Introduce this step in detail. Actively encourage the use of this step of the
method.


8.1. Agree on the method to be used for assessing cost/benefit
If the projects are very small then heuristics may be suitable; otherwise the
suggested approach is to develop a business case for each process as a
separate project.
8.2. Identify the criteria to be used to rate each process
Dependent on the approach taken above in 8.1, the criteria would include:
costs for resources, non-project related incurred costs, and tangible and
intangible benefits. Only those processes with a positive cost/benefit would
then be assessed for the probability of successful improvement.
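
A minimal sketch of the kind of calculation a business case (or heuristic
estimate) for this step might produce is given below; the cost and benefit
figures are entirely illustrative:

    # Hypothetical sketch of step 8: a simple net cost/benefit per process.
    # Only processes with a positive result proceed to step 9.
    estimates = {
        # name: (resource costs, other incurred costs, tangible benefits, intangible benefits)
        "Incident Management": (120_000, 30_000, 200_000, 25_000),
        "Change Management": (180_000, 60_000, 150_000, 40_000),
    }

    net_benefit = {name: (tangible + intangible) - (resources + other)
                   for name, (resources, other, tangible, intangible) in estimates.items()}

    shortlist = [name for name, value in net_benefit.items() if value > 0]
    print(net_benefit)   # {'Incident Management': 75000, 'Change Management': -50000}
    print(shortlist)     # ['Incident Management']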

9. Assess the Probability of Successful Improvement of the Process


9.1. Agree on the method to be used for assessing probability of
success
The suggested approach here again is to use heuristics.
9.2. Identify the criteria to be used to rate each process
Some criteria which may support the decisions are: Team Orientation,
Project Management, Management Support, User Participation, Modeller's
Expertise, Project Championship and Communication.

10. Selection of which Critical Process to Improve First


10.1. Rank order the processes with the greatest probability of successful
improvement
Those processes with the greatest probability of successful improvement and
the best cost/benefit ratio should be selected first for improvement. This is
then essentially a business decision. The rank order is not a scientific
approach to selecting which processes to improve first; it is an improved
approach to current practice.
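
The final ranking could be sketched as follows (in Python, with hypothetical
figures carried over from the earlier steps); the sort simply orders the
shortlisted processes by probability of successful improvement, with the
cost/benefit figure as a tie-breaker:

    # Hypothetical sketch of step 10: rank the processes that have a positive
    # cost/benefit by probability of successful improvement (cost/benefit as a
    # tie-breaker); the top of the list is improved first.
    candidates = [
        # (process, net cost/benefit, probability of successful improvement)
        ("Incident Management", 75_000, 0.7),
        ("Problem Management", 40_000, 0.9),
        ("Capacity Management", 55_000, 0.7),
    ]

    ranked = sorted(candidates, key=lambda c: (c[2], c[1]), reverse=True)
    for position, (process, value, probability) in enumerate(ranked, start=1):
        print(position, process, value, probability)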


7.6 Cross Case Analysis



A cross case analysis of the cycles of action learning was performed during
each of the reflection and revision phases. The research team made
revisions to the targeting methodology with an understanding of the previous
action learning cycles. A summary of these reflections is described next.

The major issues that occurred in each of the case studies were: time,
information, understanding, communication, language and visual
representation of the results.

An issue that arose in the first action learning cycle, and was still an issue in
the last, was that of time. The experience of the research team was that
there appeared to be no 'happy medium' for the time frame of the project.
Where the time frame was constricted (eight hours over three weeks) there
were problems with explaining the methodology in detail in a short time.
Where the time frame was over five weeks with only three meetings then the
explanation needed revisiting for each meeting. The research team agreed
upon an approach where an overview is provided at the beginning and the
detailed information is given before each step of the method. The final cycle
of action learning has seen the suggested addition of a small case study for
future targeting method implementations.

A second issue was the communication of suitable information and the
understanding of that information, which was also a recurring issue in all
cycles. The explanation of the targeting method was expanded and altered to
assist in participant understanding after all cycles of action learning. The use
of a terms and definitions document early on in the implementation ensured
that all members of the project team had a similar understanding of the terms
used. In the final case study (Citec), one participant requested a small case
study which would provide an overview of the steps required: the information
needed, the time frames used and the possible outputs of the targeting
methodology. A case study was also suggested in the third action learning
cycle (REALTECH), but was not properly considered at that time.

The pilot study (PS) and the third case study (Citec) used a different set of
terms and definitions in their BSC than was used in the first (CSC) and
second (REALTECH) case studies. The first and second case studies used
similar terms and definitions in their BSCs, but there was a difference in
some meanings and in the naming of functional areas within the reference
model used to identify the processes. The first case study (CSC) also
required a way of communicating the participants' understanding of terms in
their BSC to other members of the company. This was achieved by adding a
'notes' section to the visual maps. These differences in terms and definitions
within each company, and the need to extend this knowledge to others not
directly participating, suggest that more attention needs to be paid to
ensuring that all participants are conversant with meanings and that these
are recorded in an easily accessible form.

The communication of the results of each assessment made during the
implementation of the targeting method was a major issue. It was considered
by the research team to be a major cause of the problems in the pilot study,
and even in the last case study (Citec) it was still providing difficulties for the
research team. Although the mind mapper software was a vast improvement
over the original versions of the BSC maps, it still does not provide an
answer to all the configurations that might be needed for strategic maps. The
'layering' of information suggested by Tufte (2002) needs further refinement
to enable the participants to apply the lessons learnt from this author. A
process model which identifies the structure, colour coding and naming
conventions would enhance this aspect of the targeting method.

A final issue that arose with the first (CSC) and second (REALTECH) case
studies was that the participants of both these case studies chose to select
processes based on their criticality and not use the final three steps of the
targeting method. It may be that the organisations involved in the case
studies did not intend to move to a process improvement phase and thus did
not require the final three steps. Initiation of a cost/benefit analysis would not
provide immediately useful information in an industry that changed so
regularly, if process change was not undertaken shortly after the assessment.
Steps eight and nine were the assessment of cost/benefit and the
assessment of the probability of successful improvement. Step ten was the
selection of which process to improve first, based on that process's criticality
rating, positive cost/benefit and probability of successful improvement.

The first seven steps of the targeting method have now been tested and
validated using four cycles of action learning and a cross case analysis.
Although the participants did not use the final three steps of the ten step
method, each participant agreed that these final three steps (Cost Benefit
Analysis, Assessment of Probability of Success and Selection of which
Critical Process to Improve) were important in the selection process.

In case study one, the participants did not intend to formally undertake a
process improvement phase, instead choosing to informally identify possible
changes to the critical processes. In case study two, the participants chose to
use the results of the first three steps to identify for staff those processes
which required most 'attention', as defined by their criticality assessment. In
case study three, the objective of the project was not process improvement,
but the application of the identification of critical processes and critical
strategies, as a way of identifying those parts of the strategic plan which were
most critical. Their intent following this objective was to then identify the high
level processes required to achieve the strategic objectives, although this
was over a longer time frame than this research project.

We have modified the ten step model of the targeting method to reflect the
lessons learnt during the action learning process and now have the lessons
learnt through reflection across the pilot study and the three case studies.

Conclusions
The issues that arise from the cross case analysis are issues that would be
difficult to resolve in the brevity of the ten step model. These key
organisational constraints of time, information, understanding,
communication, language and visual representation of the results should be
understood by the project team at the time of starting a targeting method
implementation. They are usually unique to the organisation, department and
business unit and can even vary with the participants, thus making it difficult
to suggest a generic approach to resolving these issues. It should be noted
that these key organisational constraints are typical of most projects
conducted by organisations. Organisations need to (as with all tools) use the
targeting method with an understanding of the organisation's and project
team's skills, and address these constraints in the light of that knowledge.

The next section is the action learning summary.


7.7 Action Learning Summary



This chapter has taken the targeting methodology developed by the research
team and supported by the literature review and tested it in four cycles of
action learning using case studies. The action learning cycles are shown in
Figure 66 as PS1, C1, C2 and C3. The four phases of each action learning
cycle are action, observe, reflect and revise. We have completed the four
cycles, which has resulted in a tested and revised process targeting
methodology suitable for use within the ASP industry and possibly within
many other industries.

Figure 66- Four cycles of action learning (PS 1, C1, C2 and C3), each with
the phases 1 Action, 2 Observe, 3 Reflect and 4 Revise

The first cycle of action learning used an organisation which had an existing
BSC, and it was anticipated that this would provide an effective way to
initially test the targeting method without some of the complexity of
developing a BSC. One cycle of action learning occurred with this
organisation and the research team learnt some valuable lessons, such as:
seek sufficient time for the project, provide documents that define terms and
definitions used in the method and do not assume knowledge of the
component parts of the method.

The second cycle of action learning contained an internal cycle of action
learning in which the research team reflected on and revised the approach
taken to visualising the BSC maps. The research team also learnt that the
explanatory phase of the targeting method required further improvement and
that the participants needed documentation to refresh their memory of the
meanings of terms. An improved approach to the assessments for the two
factors dependency and probability of failure was also undertaken from the
Craig Huxley Page 307
A Member of the Centre for Information Technology Innovation

reflection phase of this case study. The ten step method also underwent a
reorganisation of the steps in order to provide a more interesting approach to
the project for the participants.

The third cycle of action learning introduced some issues concerned with a
possible extension of the targeting method to provide direction and support
for the process improvement of processes identified by the targeting method.
The research team concluded that this need was outside the scope of the
project, but did suggest that all corporate BSCs produced be completed as
much as possible to provide the management of the organisation with useful
outputs in addition to identified processes. The participants of this case
study, like those of the previous one, chose to select processes from their
criticality rating alone, without considering the assessments for cost/benefit
and probability of successful improvement. This issue was not successfully
resolved in the third
case study due to executive management changes in the organisation which
impacted on the case study.

The final cycle of action learning provided the fewest revisions to the
targeting method, which is evidence of the method's validity. These changes
were based on improvements to the explanatory step of the ten step
methodology. In addition to the changes suggested by the participants, this
case study was unique in that the participants used the project for a task
which had not been considered by the research team: to identify critical
objectives and goals from within their corporate strategic plan.

The last section of this action learning chapter was a cross case analysis of
the pilot study and case studies. The results of this analysis were a set of key
issues which future project teams using the targeting method need to
consider in order to assist the project’s success.

The changes to the methodology were the product of the reflection phase of
the action learning. The research team undertook to review each step of the
methodology in the light of the issues and responses of the participants after
each case study. There were no major changes to the targeting method, or
issues that arose, which indicated that the factors used in the method were
incorrect. Almost all the changes made to the ten step model were indicated
by issues that are familiar to project managers. That is, their broad headings
are: time, information, communication and a process orientation.

The next section is chapter 8 and is the concluding chapter which describes
the significance of the project to the target groups:
1) The research community;
2) The wider business community; and
3) The specific industry of ASP.

This chapter will also describe the limitations of the project, by reflecting on
the types of organisations involved, their similarities and differences, the
focus of the study on service support and the limitations due to being situated
in the Australian context. Generalisability and validity will also be discussed
with a final section concerning possible future research directions.


8 Findings and Limitations

This chapter will summarise the thesis: the purpose and the approach taken.
The limitations of the research project arise from the contexts of the ASP
Industry, the geographic area, the IT industry, the research approach and the
construction of the methodology. The section will then provide conclusions
arising from the research by detailing the areas of specific benefit and
generalisable benefit. Finally, we describe the future possible research areas.
The following sections describe the findings and limitations of the research project.

The purpose of this research project was to improve the existing method for
selecting business processes to improve. The research objective was to
develop a methodology that identified critical processes in application hosting
and application service provision. The method would provide a step by step
guide for the identification of critical processes and the selection of which of
these processes should then be improved first. It was intended that the step
by step guide would be a practitioner’s method, providing instructive
documents, tested practices, suitable software and user friendly forms. The
methodology should contain details on how to logically identify criticality
whilst also linking critical processes with business objectives and goals.

The thesis described how the research team conducted a literature review in
order to discover the elements of a critical process and how to identify a
critical process and choose which critical process to improve first. We then
described the ten step targeting method which was used in four cycles of
action learning to validate and test the methodology. We also described how
we needed to reduce the focus of the case studies and that this was
achieved by conducting a focus group which provided us with a list of critical
processes in the ASP industry and a definition of a critical process. This
definition was “those processes which have the greatest effect on the
attainment of Corporate Strategic Goals”.


The thesis then described how in order to validate and extend the results of
the focus group, a Delphi study was conducted with more participants, and
used to identify the most critical functional area within the ASP industry from
the participants' perspective. We then described in detail the research team's
experience in the case studies and the lessons learnt, which resulted in
version five of the ten step targeting methodology. The thesis then concludes
that the targeting methodology has been tested and validated and that the
objectives of the research project have been achieved. The remaining tasks
of the thesis are to describe the limitations and findings of the research
project.

The next section describes the findings of the research project.

8.1 Findings

The result of this research project is a methodology for identifying and
selecting processes for improvement. We have developed and tested the
methodology, which has five factors:
1. Dependency, which is the effect of failure of the process;
2. Probability of Failure, which is the probability that a process will fail;
3. Impact, which is the relative effect of a process on the strategies and
business goals of the organisation;

4. Cost/Benefit, which is the positive or negative result of comparing the
expected costs of improving a process against the expected benefits
resulting from improving the process; and
5. Probability of Success, which is the assessment of the risk that the
organisation can successfully improve the process.

The top 'few' of the ranked critical processes are then assessed for the value
they might provide to the organisation if they are improved. Of those
processes assessed for cost/benefit, only those having a positive value for
the organisation are assessed for the probability of successful improvement.
Process improvement projects with the greatest positive value for the
organisation, the greatest chance of successful improvement and the largest
criticality rating are the selected processes for improvement.

[Figure: the five factors - Impact, Dependency, Probability of Failure,
Cost/Benefit and Probability of Success - with Impact, Dependency and
Probability of Failure combining to form Criticality]
Figure 67- Five factors for identifying and selecting a critical process

The figure identifies each of the five factors, shown in green triangles within
the larger triangles. The first triangle indicates that the results of the
assessments of three factors (impact, dependency and probability of failure)
combine to form criticality (shown in the red triangle).

The research outputs are significant for the research community, the
business community and the organisations that participated in the research.
The significance of the research to the research community is that the
research team have developed a new methodology which extends the work
of Hammer & Champy (1993) and Davenport (1993). We have taken
knowledge from the nuclear, pharmaceutical and automotive industries and
applied this production and manufacturing focus to business processes. This
aligns with the objectives of the larger combined research project called the
Process-oriented Administration of Enterprise Systems. We have
successfully combined case studies with action learning to test and validate
the targeting method and shown that a short research project can achieve
useful practical results.


The significance of the research project to the business community is that the
methodology provides a practitioner's guide to using the method. The
methodology improves on the existing approaches to process selection,
ensuring that money invested in process improvement provides the largest
possible value. This point alone might save organisations considerable time
and money by ensuring that the processes improved offer the greatest
benefit through improvement to the organisation. We have also shown that
the targeting methodology has further uses, extending the identification of
critical processes to enabling organisations to identify critical objectives and
critical strategies within their strategic plans. The targeting method also
causes organisations to take a process view of themselves, and to
understand the effect of change within critical processes.

The use of the mind mapper software to visualise BSCs was considered by
the participants to be extremely useful in communicating the information
contained in their BSC maps. The use of the software combined with the
layering of the information contained in the BSC maps produced for each
organisation is a significant improvement in the communication of strategic
plans to others in the organisation.

The research is significant to each organisation as it enabled them to identify
those processes within the area of focus which were critical to achieving the
goals which they had suggested. The last case study participant also used
the results of the project to identify those parts of their strategic plan which
were more critical than others. The participants of the focus group and Delphi
study phases of the research now have a greater understanding of what is
defined as a critical process, and of which processes others from within their
industry perceive to be critical and why.

The significance of the results is improved by the significance of the
participants of the research project. Three of the participants make up the top
three IS outsourcing companies in Australia. A fourth participant is the ninth
largest outsourcing company in Australia and the largest Government IS
outsourcing provider in Australia. One participant is a top four global
consulting company and another a highly competitive developer and
outsourcing provider of enterprise system software. These and other
participants contributed to this research project, providing the study with data
that contributes significantly to the value of the outcomes.

The ability to identify critical processes reliably will enable process re-
engineering and process improvement projects to focus on those processes
which have the greatest impact upon the goals of an organisation and thus
greatest value to the organisation. This may lead to competitive advantage in
an environment in which knowledge is considered to be a large part of any
competitive advantage.

Knowledge of the critical processes in this area will enable further research to
focus on improving these processes. This might provide the greatest benefits
to the industry. The dynamic environment of the application hosting and
application service provision marketplace demands that organisations adapt
quickly to market forces, and knowledge that supports effective and efficient
change within organisations may lead to competitive advantage.

The limitations of the research are discussed in the next section.

8.2 Limitations

This research project does not intend to verify the achievement of business
benefit, document the change to an organisation due to its use of the
targeting methodology or determine the long term benefits of applying the
targeting methodology. These questions might be answered in a longer and
larger study. As for external validity, the study has focused on the AHC and
ASP industries within Australia. For the case studies of this project the
participants come from a multinational outsourcing provider, a
commercialised government owned outsourcing provider and a specialist
niche product outsourcing provider.


Validity and Reliability


The focus of the research project was on the application hosting and
application service provision industry in Australia. The study's external
validity is limited in this respect, and we are unable to state that the results
indicate that the targeting methodology would be as useful in the wider
information technology industry or in fact the wider business community,
although it is the research team's belief that the targeting methodology
would be useful to companies in many industries and regions.

The testing of the targeting methodology was carried out in three case
studies within the Australian ASP industry. These organisations are typical of
organisations within the IS outsourcing market in Australia. Although each of
these organisations has services within the application service provision
industry, each organisation has a number of differences. Their size in the
Australian market differentiates them considerably, with the smallest having
less than fifty employees, the next largest seven hundred plus employees,
and the largest in excess of four thousand five hundred employees.

The organisations differ in the breadth of the services they provide, which
follows a similar pattern to their size. That is, CSC, the largest organisation
and third largest IS outsourcer in Australia, offers a large range of information
technology outsourcing to many industries within the business community,
which includes Systems Integration, Information Systems (IS) Outsourcing,
and Network Integration & Management. (IS outsourcing includes: data
centre computing services, distributed or client/server computing, local and
wide area network operations, help desk operations, application development
and maintenance, and related consulting and systems integration activities.)

Citec are the ninth largest IS outsourcing organisation in Australia and are a
wholly Queensland Government owned commercial organisation. Their
service range is similar to CSC involving such activities as IS Outsourcing,
Network and Desktop Outsourcing, Content Delivery and Processing
Services.


REALTECH are a much smaller organisation in Australia, with services based
around the improvement of and maintenance for SAP R/3 applications and
the supporting hardware. They also develop and sell software which is used
to automate these services. REALTECH and CSC are similar in that they are
both private companies in the Australian market (no shares traded on the
Australian stock-market) with parent companies based in Germany and The
United States respectively.

The targeting methodology benefits from these differences, as the diversity of
these organisations enables the method to be considered generalisable. The
results suggest that the targeting methodology is flexible enough to apply in a
diverse range of organisations, from relatively small niche market operations
to very large and diverse organisations, and that the method is equally
applicable in both government and private organisations and across
organisations which offer a range of services.

Being able to generalise due to diversity also means that the results are
limited by the lack of similarities between these organisations. A larger
study of seven to thirteen more organisations would support greater
generalisability.

The issue of reliability is also a limitation of the study, as many of the
'decisions' made by the participants in assessing the five factors for the
targeting methodology are based on heuristics. The use of heuristics is
contextual and, as heuristics are only partly based on fact, will result in
different 'decisions' dependent on the person, time and experience when the
decisions are made.

The issue of internal validity is one which is supported by the multiple case
study approach. The type and number of changes made to the targeting
methodology after each of the case studies was reduced with each case
study. The reduction in changes to the ten step model in each of the revision
phases of the action learning cycles supports the claim that the inferences
made by the research team concerning the validity of the ten step method
were appropriate.

The issue of construct validity was a limitation of the study in that, although
the research team used the ten step model of the targeting method as the
focus of the investigation within each cycle of action learning, we did not
demonstrate that the changes made to the ten step model were indeed
related to the experience of the research team during the case studies.
These changes were based on our perceptions of logic, connecting
experience by cause and effect.

Methodology
Within the methodology itself there are some further limitations. Is the BSC
the most efficient and effective method by which impact can be assessed?
Are the three factors of criticality (impact, probability of failure and
dependency) independent? This study did not address either of these
questions, and they should be the focus of future research. The final three
steps of the targeting method were not completely validated in the case
studies, as none of the participants implemented these last three steps
(step 8, assessment of cost/benefit; step 9, assessment of probability of
successful process improvement; and step 10, the selection of which process
to improve first).

The project did not seek to verify the use of the Balanced Scorecard as the
‘best’ method for identifying critical processes. There are many methods from
which the research team might have chosen to identify and assess the
impact factor. The research team consider that a methodology which enables
cause & effect to be shown within a visual view of a strategic plan would be a
suitable method within the targeting methodology.

The case studies were not used to provide quantitative results on the use of
the BSC. Rather, the case studies used the BSC method as a part of the
identification of critical processes methodology. The results of the case
studies suggest that the BSC is a suitable method to use, although the third
case study indicates that it may not be completely appropriate in all
applications.

It is not possible to develop a generic BSC, as each organisation will have
unique versions of the BSC which reflect their skills, resources, position in
the market and view of the environment. The ability to develop generic
solutions to the BSC aspect of the targeting method would reduce the risk of
poor quality decision making.

The selection of critical processes within organisations still relies on the skills
and knowledge of the management team to identify ‘cause and effect’ factors
which link these processes to the objectives and ultimately the strategies of
the company. It may be that there are no 'wrong' cause & effect links, only
that some are stronger than others. The use of the anchoring and adjustment
style heuristics to assess many of the factors within the targeting method is
also a limitation.

These limitations of the method suggest that it may be very difficult to judge
the quality of the decision making in the targeting method.

The quality of the decision making in the targeting method, although a
limitation of the method, is the same type of decision making that presently
occurs in many organisations. The objective of the research project was to
improve the way organisations selected processes for improvement. The
research team suggests that the targeting method improves the current
approaches to selection of critical processes for improvement, which includes
such approaches as the ‘squeaky wheel’ process, the ‘fad’ process and the
‘pin the tail on the donkey’ process.

The ten step methodology is limited in being able to claim that it is a fully
validated and tested method. The case studies did not manage to complete
the full implementation of the method, and thus we cannot say from this
experience that the method is fully tested. It may be that the organisations
involved in the case studies did not intend to move to a process
improvement phase and thus did not require the final three steps. Although
the final three steps are untested in practice, the participants did not suggest
that these steps were impractical or superfluous, instead suggesting that at
this point in time they did not intend spending the time and thus money on
assessing the processes for selection.

Application to Practice
A further caveat, if not a limitation, of the methodology is the problems
encountered within each of the case studies. These were defined as time,
information, understanding, communication and the visual representation of
the results. Time, cost, scope and communication are constants within which
organisations are expected to operate during all projects (Schwalbe,
2002). These constants should ideally be given careful consideration before
a project is undertaken. A goal of further research might be to focus on
appropriate controls for these constants. Practitioners need to be aware that
these constants impact the application of the Targeting Methodology.

The next section discusses the possible future or follow-on research in this
area.

8.2.1 Future/Follow-on Research

Follow-on research which would enhance the validity of the targeting method
would be to track the improvements and usage of the outputs of the method.
It might gain considerable benefit from being applied in a greater number of
different organisations within the application hosting and application service
provision industries. This can also be extended to organisations in other
industries. Finding organisations that are intending to undertake process
improvement projects would also ensure that all ten steps of the methodology
are fully tested.


Future research might look at the effectiveness of the BSC method as the
most appropriate method to assess the relative contribution of a process to
organisational goals.

Researchers may also undertake to examine the cause and effect aspect of
the BSC to acquire greater understanding of the validity of cause and effect
links and how these are arrived at.

Benefits realisation projects might also gain from research into the
application of the targeting methodology in this type of project. Future
research may also investigate the use of the combined BSC and targeting
method to improve Business/IT alignment and IT governance.

Further research is required in the area of the weightings of the first three
factors (impact, dependency and probability of failure) to understand the
relationships between threshold values and criticality.

8.2.2 Epilogue

Within the three organisations that participated in the implementation of the
targeting method, one company has taken the outputs from the identification
process and used these as the basis of their next strategic plan. Two
companies are using the cause & effect maps to communicate to their teams
the importance of some processes over others and to indicate the path along
which they believe the team should travel to accomplish their goals.

Reviewers’ Comments
This sub-section of the epilogue reflects on three comments made by the
reviewers of the thesis.

Linking Processes to Internal Process Objectives


The literature review proposed that processes should be linked to internal
process objectives and this is, as stated by one of the reviewers, “not based
on theory”. This fact is acknowledged in section 4.1.5, as quoted: “We
considered that there was a logical link between internal process perspective
objectives and processes, and could not find guidance from the literature due
to the innovative nature of the method. The case studies indicate that this
view may not be useful for all BSC’s.”
The third case study also identified that it may not be useful to only link
processes to internal process objectives as there appeared to be no logical
reason preventing processes from linking to all objectives within a cause and
effect map (See paragraph 2, page 288). The research team learnt from the
experiences within the third case study that our earlier assumption was
incorrect and that it also appears appropriate to link processes to any type of
objective in a cause and effect map. Future research is required to identify
the most appropriate set of perspective objectives to use in the Targeting
Methodology.

Possible methods of linking processes to objectives and goals


One reviewer suggested that Requirements Engineering (RE) may be a
suitable approach for linking processes to goals. This section examines that
approach.
Ross and Schoman’s (1977) seminal paper, cited in van Lamsweerde (2000),
defines RE as the
“careful assessment of the needs that a system is to fulfill. It must
say why a system is needed, based on current or foreseen
conditions, which may be internal or an external market. It must
say what system features will serve and satisfy the context. And
it must say how the system is to be constructed” (van
Lamsweerde 2000).

The purpose of the RE approach is to identify new processes for the new
system. The purpose of the Targeting Method is to identify the most
appropriate processes from an existing set of processes for use in business
process improvement. Thus the purposes of these two approaches are quite
different. Although the domain of RE uses similar terms to those used in the
Targeting Method, the intent of the RE method is to identify processes which
should be part of a new application or software system (Russell 2003).

Russell (2003) states that the CREWS L’Ecritoire approach to goal-based
RE has been positively assessed by many industry organisations for a
number of applications. The CREWS L’Ecritoire approach uses scenarios to
act as an aid to identifying processes, which then inform the original goals in
a cyclical fashion to produce a list of processes which can viably be
implemented as a new software system. Although the scenario approach in
RE may assist in the mapping of goals to strategies and strategies to
objectives, this may not be entirely appropriate given the purpose of the
Targeting Method. Further research in this area may assist practitioners in
the first stages of the Targeting Method.

Multiple criteria decision making


The third point concerns the relationship of the Targeting Method to multiple
criteria decision making.

"Multi-criteria decision analysis (MCDA) is a quantitative approach to


evaluating decision problems that involve multiple variables (criteria)"
(MicroImages 2003). Most proponents (Geldermann and Zhang 2001; Bouri,
Martel et al. 2002; Hallerbach and Spronk 2002; MicroImages 2003) of the
MCDA approach suggest that it is a good choice of tool when the “decision -
maker(s) feel the decision is too large and complex to handle intuitively,
because it involves a number of conflicting objectives, or involves multiple
stakeholders with diverse views" (Mabin and Beattie 1999)(p5).

The decision environment in which the Targeting Method is useful is not
totally quantitative, as there are intangible elements (qualitative decisions)
which need to be considered. Thus, the use of MCDA approaches should be
treated with caution. It is possible to use these approaches in analysing the
quantitative elements, but only to further inform the entire process.


As Mabin and Beattie (1999) state, "there is a desire for a formal procedure
so that the decision making process can be made open and transparent, and
is seen to be fair" (p. 5), and they use this as justification for using the MCDA
approach. The Targeting Method is transparent and open, and allows the
business to select an approach which provides the outcomes required, and
deals effectively with both quantitative and qualitative data and can
incorporate MCDA within the method. Future research is required to define
precisely which aspects and approaches should be used to identify the
quantitative and qualitative factors.
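
To illustrate how a simple MCDA technique could be applied to the
quantitative elements within the Targeting Method, the following weighted-sum
sketch shows the general idea; the criteria, weights and scores are
hypothetical, and this is not part of the method itself:

    # Illustrative weighted-sum MCDA scoring; all criteria, weights and scores
    # are hypothetical and would be agreed by the decision makers.
    weights = {"impact": 0.4, "dependency": 0.3, "probability_of_failure": 0.3}

    scores = {
        "Incident Management": {"impact": 0.8, "dependency": 0.9, "probability_of_failure": 0.4},
        "Change Management": {"impact": 0.6, "dependency": 0.7, "probability_of_failure": 0.8},
    }

    ranking = sorted(
        ((sum(weights[c] * s[c] for c in weights), name) for name, s in scores.items()),
        reverse=True,
    )
    print(ranking)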

8.3 Summary

This chapter has described and discussed the limitations of the research
project such as: construct validity, internal validity, external validity and
reliability of the results. We have also described the targeting method's
limitations due to the use of the heuristics style of assessment and the
selection of the BSC as an appropriate tool for assessing impact.

We have suggested that the targeting method is an improved approach to
identifying and selecting critical processes for improvement over the current
approaches used today. We have described the significance of the research
to the business community, the research community and the organisations
which were involved.

Finally, we have described possible future research that might extend and
improve the targeting methodology, such as aligning business and IT
strategically and operationally and improving the governance of IT.
(De Lone and McLean 1992; Ballantine, Bonner et al. 1998; Garrity and
Sanders 1998; Ishman 1998; Jennex, Olfman et al. 1998; Myers, Kappelman
et al. 1998; Seddon, Staples et al. 1999)


9 References

Abell, D. F. (1999). "Competing Today While Preparing for Tomorrow." Sloan Management
Review 40(3): 73-81.
Andersen, H., I. Cobbold, et al. (2001). Balanced Scorecard implementation in SMEs:
reflection on literature and practice. 4th SME-SME International Conference, Allborg
University, Denmark, 2GC Active Management.
Australian Bureau of Statistics (2000). Computing Services Industry. Canberra, Australian
Bureau of Statistics: 16.
Australian Quality Council (2001). Australian Business Excellence Framework. Sydney,
Australian Quality Council.
Australian Research Council (2003). National Competitive Grants Programme.
Commonwealth of Australia, 5th February 2003.
http://www.arc.gov.au/ncgp/linkage/international/default.htm.
Bach, N., P. Calais, et al. (2000). "Marketing residential grid-connected PV systems using a
Balanced Scorecard as a Marketing tool." Renewable Energy 22(1-3): 211-216.
Baker, W. (2001). Personal communication. C. Huxley. Brisbane.
Ballantine, J., M. Bonner, et al. (1998). Developing a 3-D model for Information Systems
Success. Information Systems Success Measurement, series in Information
Technology Management, Idea Group Publishing: pp. 46-59.
Bartett, J., D. Hinley, et al. (2001). Best Practice for Service Delivery ITIL. London, The
Stationery Office.
Beauchamp, G. R. (1999). "Competing on the Edge: The Five "S" Levels of Enterprise
Health." Physician Executive 25: 25-29.
Becker, J., M. Rosemann, et al. (2000). Guidelines of business process modeling. Business
process management: models, techniques and empirical studies. W. v. d. Aalst, J.
Desel and A. Oberweis. Berlin, Springer: 30-49.
Benbasat, I., D. K. Goldstein, et al. (1987). "The Case Research Strategy in Studies of
Information Systems." MIS Quarterly 11(3): 18.
Bender, K. W. C., Jose E; Cirone, John F; Klaus, Kenneth P; Leahey, Lee C; Menyhert,
Tibor D (2000). "Process Innovation-- Case studies of critical success factors."
Engineering Management Journal 12(4): 17-24.
Benson, K. (2002). Australia IS Outsourcing Services - IS Outsourcing Competitive analysis,
IDC: 63.
Berkhout, M., R. Harrow, et al. (2001). Best Practice for Service Support ITIL. London, The
Stationary Office.
Biscotti, F. and R. Fulton (2002). Infrastructure and Applications Worldwide Software Market
Definitions. Gartner Dataquest, November 2002.


Botten, N. and J. McManus (1999). Competitive Strategies for Service Organisations.


Basingstoke Hampshire UK, Macmillan Press Ltd.
Bouri, A., J. M. Martel, et al. (2002). "A multi-criterion approach for selecting attractive
portfolio." Journal of Multicriteria Decision Analysis 11(4-5): 269.
Bowles, N. (1999). "The Delphi technique." Nursing Standard; Harrow-on-the-Hill 13(45): 32-
.
Brancato, C. K. (1995). New Performance Measures- A research report. New York, The
Conference Board.
Bunning, C. (1993). Placing Action Learning and Action Research in Context. Action Learning
at Work. A. Mumford. Hampshire, Gower: 25-29.
Carnegie, G., S. Jones, et al. (1998). Accounting: Financial and Organisational Decision
Making. Melbourne, McGraw-Hill Book Company of Australia Pty Ltd.
Carpinetti, L. C. R., M. C. Gerlamo, et al. (2000). A conceptual framework for deployment of
strategy-related continuous improvements. The TQM Magazine, Bedford. 12: 14.
Chan, R. and M. Rosemann (2001). Integrating Knowledge into Process Models - A Case
Study. 12th Australasian Conference on Information Systems, Coffs Harbour.
Chang, S.-I. and G. G. Gable (2000). A critique of the Delphi method in the context of IS key
issues studies. 4th Pacific Asia Conference on Information Systems, Hong Kong,
Publishing Technology Center, The Hong Kong University of Science and
Technology.
Chang, S.-I., G. G. Gable, et al. (2000). A Delphi examination of public sector ERP
implementation issues. 21st International Conference on Information Systems,
Brisbane, Qld.
Chang, S.-I., E. Smythe, et al. (2000). Methods for distilling IS key issues using a Delphi
approach. 11th Australasian Conference on Information Systems, Brisbane, Qld.
Conn, H. P. and G. S. Yip (1997). "Global transfers of critical capabilities." Business
Horizons 40(1): 10.
Connected Corporation (2001). One of Australia's Leading Storage Service Providers Offers
Connected TLM to Augment Comprehensive Offerings. Connected Corporation, 24
Dec 2002.
http://www.connected.com/news/releases/2001releases/Hitachi.htm.
Corporate Services Agency (2002). csa.qld.gov.au. Corporate Services Agency, 23 Dec
2002. http://www.csa.qld.gov.au/default.htm.
CorVu (1999). The New Balanced Scorecard. CorVu Corporation, August 2001.
http://www.corvu.com.
Dalkey, N. C. and O. Helmer (1963). "An Experimental Application of the Delphi Method to
the Use of Experts." Management Science 9(3): 458-467.
Davenport, T. H. (1993). Process Innovation Reengineering Work through Information
Technology. Boston, MA., Harvard Business School Press,.

Davidson, P. and R. W. Griffin (2000). Management: Australia in a Global Context. Brisbane,
John Wiley & Sons Australia Ltd.
De Lone, W. and E. R. McLean (1992). "Information Systems Success: The Quest for the
Dependent Variable." Journal of Information Systems Research 3(1): pp. 60-95.
De Loof, L. (1997). Information Systems Outsourcing Decision Making: A Managerial
Approach. London, Idea Group Publishing.
Deloitte Touche Tohmatsu (2002). Our History. Deloitte Touche Tohmatsu, 24 Dec 2002.
http://www.deloitte.com.au/.
Dervitsiotis, K. N. (1999). "How to attain and sustain excellence with performance-based
process management." Total Quality Management 10(3): 309-326.
Dettmer, H. W. (1998). Breaking the Constraints to World-Class Performance. Milwaukee,
ASQ Quality Press.
Dollinger, M. (1999). Entrepreneurship: Strategies and Resources. Upper Saddle River, NJ,
Prentice-Hall.
Dunn, P. (2001). "ASPs: The lure of task outsourcing." Hospitals and Health Networks;
Chicago 75(8): 38-40.
Fern, E. F. (2001). Advanced focus Group research. Thousand Oaks, California, SAGE
Publications.
Frenzel, C. W. (1999). Management of Information Technology. Scarborough, Course
Technology ITP.
Garrity, E. J. and G. L. Sanders (1998). Dimensions of IS Success. Information Systems
Success Measurement, series in Information Technology Management, Idea Group
Publishing: 13-45.
Gatfield, T., M. Barker, et al. (1999). "Measuring Communication Impact for University
Advertising Materials." Corporate Communications; Bradford 4(2): 73-79.
Geldermann, J. and K. Zhang (2001). "Software review: "Decision Lab 2000"." Journal of
Multicriteria Decision Analysis 10(6): 317.
Gendron, M. (1997). Using the Balanced Scorecard. august 2001.
Germain, C. J. (2000). "Balance Your Project." Strategic Finance 81(11): 46-52.
Grant, D. (2002). "A Wider View of Business Process Engineering." Communications of the
ACM 45(2): 7.
Hacker, M. E. and P. A. Brotherton (1998). "Designing and Installing Effective Performance
Measurement Systems." IIE Solutions Magazine(August 1998): 18.
Hallerbach, W. G. and J. Spronk (2002). "The relevance of MCDM for Financial Decisions."
Journal of Multicriteria Decision Analysis 11(4-5): 187.
Hammer, M. and J. Champy (1993). Reengineering the Corporation: A Manifesto for
Business Revolution. New York, Harper Business.
Hax, A. C. and N. S. Majluf (1984). Strategic Management: an integrative perspective.
Englewood Cliffs,New Jersey, Prentice-Hall.

Hines, T. (2000). "An evaluation of two qualitative methods (focus group interviews and
cognitive maps) for conducting research into entrepreneurial decision making."
Qualitative Market Research; Bradford 3(1): 9.
Hoover's Online (2002). Mincom Ltd. Hoover's Inc, 23 Dec 2002.
http://www.hoovers.com/co/aag/6/0,2658,100706,00.html.
Huxley, C., C. Taylor, et al. (2002a). Changing the Four Perspectives of the Balanced
Scorecard to Suit IT. Enhancing Business & Government Capability, Beechworth,
Victoria, Australian New Zealand Academy of Management.
IBIS World (2002). Computer Consultancy Services, IBIS World Australia: 32 pages.
IBIS World (2002). Mincom Ltd. Seek.com, 23 Dec 2002.
http://www.seek.co.nz/if.asp?loc=ibis.
IBM Global Services Australia (2002). Corporate Information. IBM Global Services Australia,
23 Dec 2002. http://www-8.ibm.com/services/au/corpinfo.html.
Ishman, M. (1998). Measuring Information Systems Success at the Individual Level in Cross-
Cultural Environments. Information Systems Success Measurement, series in
Information Technology Management, Idea Group Publishing: pp. 60-78.
Ittner, M. G. and D. F. Larker (1998). "Innovations in Performance Measurement: Trends and
Research Implications." Journal of Management Accounting 10: pgs 34.
Jeffery, D., A. Ley, et al. (2000). "Delphi survey of opinion on interventions, service principles
and service organisation for severe mental illness and substance misuse problems."
Journal of Mental Health: Abingdon 9(4): 371-384.
Jennex, M., L. Olfman, et al. (1998). An Organizational Memory Information Systems
Success Model: An Extension of De Lone and Mclean's IS Success Model. 31st
Annual Hawaii International Conference on Systems Sciences, Hawaii.
Jewels, T. (2001). Benefits Realisation from IT Projects. QUT, December 2001.
https://olt.qut.edu.au/it/ITN251/admin/index.cfm?fa=displayPage&rNum=85277&pTy
pe=curr.
Kaplan, R. S. (1998). "Innovation Action Research: Creating New Management Theory and
Practice." Journal of Management Accounting Research 10.
Kaplan, R. S. and M. Bower (2000). Balanced Scorecard Report. Executive-Team
leadership: Insight, Experience and Ideas for Strategy-Focused Organisations.
Boston, Harvard Business School.
Kaplan, R. S. and D. P. Norton (1992). "The Balanced Scorecard: Measures that Drive
Performance." Harvard Business Review 70(1): p9.
Kaplan, R. S. and D. P. Norton (1996). The Balanced Scorecard. Boston Massachusetts,
Harvard Business School Publishing.
Kaplan, R. S. and D. P. Norton (1996a). "Renaissance Solutions, Using the Balanced
Scorecard as a Strategic management System." Harvard Business Review(1): p75.

Kaplan, R. S. and D. P. Norton (2001). The Strategy Focused Organisation: How Balanced
Scorecard companies thrive in the new business environment. Boston, Harvard
Business School Publishing Corporation.
Kaplic, B. and P. Bernus (2001). "Business Process Modelling in Industry: The powerful tool
in enterprise management." Computers in Industry 47(3): 299-318.
Kinetic, L. (1999). FMECA Process. Kinetic, LLC, Accessed 06/06/2002.
http://www.fmeca.com/ffmethod/definiti.htm.
Kolb, D. (1984). Experiential Learning: Experience as the source of learning and
development. Englewood Cliffs NJ, Prentice Hall.
Lin, C. and G. Pervan (2001). IS/IT Investment Evaluation and Benefits Realisation Issues in
a Government Organisation. 12th Australasian Conference on Information Systems,
Coffs Harbour, ACIS.
Lingle, J. H. and W. A. Schiemann (1996). "From Balanced Scorecard to IS Measurement."
Management Review 85(3): 56-61.
Lipe, M. G. and S. E. Salterio (2000). "The Balanced Scorecard: Judgmental Effects of
Common and Unique Performance Measures." Accounting Review 75(3): 16.
Littler, K., P. Aisthorpe, et al. (2000). "A New Approach to Linking Strategy Formulation and
Strategy Implementation: An example from the UK Banking Sector." International
Journal of Information Management 20(6): pgs 411-428.
Mabin, V. J. and M. Beattie (1999). A Practical Guide to Multi-Criteria Decision Analysis: A
Workbook Companion to V.I.S.A. Wellington, Victoria University of Wellington.
Mabin, V. J. and J. Davies (2002). Framing an Ethical Dilemma: A case for questioning
identity. Western Decision Sciences Institute 31st Annual Meeting, Las Vegas.
Mabin, V. J., S. Forgeson, et al. (2001). "Harnessing Resistance: Using the theory of
constraints to assist change management." Journal of European Industrial Training
25(2): 168-191.
Mabin, V. J., M. Menzies, et al. (2001a). "Public Sector Priority Setting Using Decision
Support Tools." Australian Journal Public Administration 60(2): 44-59.
Mah, M. (1999). "High-definition software measurement." Software Development 7(5).
Makins, M., Ed. (1992). Collins English Dictionary. Sydney, Harper Collins Publishers.
Mankiw, N. G., S. P. King, et al. (1999). Principles of Macroeconomics. Sydney, Harcourt
Australia.
Martinson, M., R. Davison, et al. (1998). "The Balanced Scorecard: A foundation for the
strategic management of information systems." Decision Support Systems 25(1):
pgs 71-88.
McFarlan, F. W. and J. L. McKenney (1983). Corporate Information Systems Management;
The Issues Facing Executives. Homewood, Illinois, Richard D. Irwin INC.
McGill, I. and L. Beaty (2001). Action Learning: A guide for professional, management &
educational development. London, Kogan Page Limited.

Melnyk, S. A. (2000). "Value-Driven Process Management: Using Value to Improve
Processes." Hospital Materiel Management Quarterly; Rockville 22(1).
Mendoza, C. and R. Zrihen (2001). "Measuring Up." Financial Management: pgs 26-30.
Meredith, J. R. and S. J. Mantel Jr (2000). Project Management: A Managerial Approach.
Brisbane, John Wiley and Sons.
MicroImages, I. (2003). Multi-Criteria Decision Analysis. MicroImages Incorporated,
September 2003. www.microimages.com.
Microsoft Business Solutions (2001). Navision Axapta, Knowledge Management. Microsoft
Business Solutions, March 2001.
Mincom (2002). Annual Report. Brisbane, Mincom: 52.
Mitchell, D., C. Coles, et al. (1999). The 2,000 Percent Solution. Free Your Organisation
from "Stalled" Thinking to Achieve Exponential Success. New York, AMACOM.
Moreton, P. H. (1997). Business Process Reengineering: Separating Fact from Myth. School
of Management, Faculty of Business. Brisbane, Queensland University of
Technology.
Morgan, D. L. (1988). Focus Groups as Qualitative Research. Newbury Park, California,
SAGE Publications.
Murphy, K. and R. Russell (2002). Use the Balanced Scorecard to Execute CRM Strategy.
GartnerG2 and Balanced Scorecard Collaborative, 03/09/2002.
http://www.gartnerg2.com/research/rpt-0702-0117.asp.
Myers, B. L., L. A. Kappelman, et al. (1998). A comprehensive model for assessing the
quality and productivity of the information systems function: toward a theory for
information systems assessment. Information Systems Success Measurement,
series in Information Technology Management, Idea Group Publishing: pp. 94-121.
Nel, J. (1997). IT investment management: a case study and survey of effects of IT usage
on organisational strategic performance. 3rd Pacific Asia Conference on Information
Systems, Brisbane, Qld, Faculty of Information Technology, Queensland University
of Technology.
Nickols, F. (2000). The Accountability Scorecard. A Stakeholder-Based Approach to
'Keeping Score'. June 2001. http://home.att.net/~nickols/articles.htm.
Olve, N.-G. and A. Sjostrand (2002). The Balanced Scorecard. Oxford, Capstone Publishing
(a Wiley company).
O'Neil Jr., H. F. and E. M. Bensimon (1999). "Designing and Implementing an Academic
Scorecard." Change 31(6): p241-255.
O'Neill, J., B. B. Small, et al. (1999). The Use of Focus groups within a Participatory Action
Research Environment. Using Qualitative Methods in Psychology. M. Kopala and L.
A. Suzuki. Thousand Oaks, California, SAGE Publications: 237.
O'Neill, P. and A. S. Sohal (1997). Business Process Reengineering: Application and
Success in Australia. Melbourne, Monash University: 30.

Parmalat (2001). Corporate. Web Raven, 23 December 2002.
http://www.pauls.com.au/page.cfm?pageID=7.
Paul Budde Communication (2002). Australia - MASP - ASP. Sydney, Paul Budde
Communication Pty Ltd: 11.
Pedler, M., J. Burgoyne, et al. (1986). A Manager's Guide to Self-development. Maidenhead,
McGraw-Hill.
Peters, T. J. and R. H. J. Waterman (1982). In Search of Excellence: Lessons from
America's best-run companies. New York, Warner Books.
Porter, M. (1980). Competitive Strategy: Techniques for analyzing industries and
competitors. New York, The Free Press.
Poullot, G., D. Doutriaux, et al. (2001). ICSBEP Guide to the Expression of Uncertainties,
Nuclear Energy Agency Nuclear Science Committee: 44.
Queensland University of Technology (2003). Human research Ethics. Queensland
University of Technology, March 3rd 2003.
www.qut.edu.au/draa/ethics/human/forms#exempt.
REALTECH, A. (2002). Investor Relations. REALTECH, AG, 18 November 2002.
http://www.realtech.de/international/default.asp.
REALTECH, A. (2002a). Annual Report. Walldorf, Germany: 83.
Rosemann, M. (2000). Using Reference Models within the Enterprise Resource Planning
Lifecycle. Brisbane, Queensland University of Technology: 24.
Rosemann, M. (2002). Process-oriented Administration of Enterprise Systems. Queensland
University of Technology, 5 February 2003.
http://www.citi.qut.edu.au/research/ism/projects/admin_enterprise.jsp.
Rosemann, M. (2002a). "Application Reference Models and Building Blocks for Management
and Control."
Rosemann, M. and J. Wiese (1999). Measuring the Performance of ERP Software - a
Balanced Scorecard Approach. 10th Australasian Conference on Information
Systems, Wellington NZ, ACIS.
Rosemann, M. and J. Wiese (1999). Management of ERP-software using balanced
scorecard. Symposium on IT Balanced Scorecard, Antwerp, Technologisch Instituut
Antwerp.
Rummler, G. A. and A. P. Brache (1991). "Managing the White Space on the Organisation
Chart." Supervision 52(5): 6.
Russell, N. (2003). Brisbane, Queensland University of Technology: 15-24.
Sauer, C. (2001). Project Management in the Australian Construction Industry. Q. Faculty of
IT. Brisbane.
Saulnier, C. F. (2000). "Groups as data collection method and data analysis technique:
Multiple perspectives on urban social work education." Small Group Research 31(5):
607-627.

Saxe, J. G. (1963). The Blind Men and the Elephant; John Godfrey Saxe's version of the
famous Indian legend. New York, Whittlesey House.
Scheer, I. (2000). ARIS Toolset, Scheer.
Scott, J. E. and I. Vessey (2000). Implementing Enterprise Resource Planning Systems: The
Role of Learning from Failure. Information Systems Frontiers, Kluwer Publishers.
Seddon, P. B., S. Staples, et al. (1999). "Dimensions of Information Systems Success."
Communications of the Association for Information Systems (CAIS) 2(20).
Sedera, D., M. Rosemann, et al. (2000). Using Performance Measurement Models for
Benefit Realisation with Enterprise Systems- The Queensland Government
Approach, Brisbane, Queensland University of Technology.
Sedera, W. (2001). Critical success factors of process modeling for enterprise systems.
Doctoral Consortium, 5th Pacific Asia Conference on Information Systems. Seoul.
Sedera, W., M. Rosemann, et al. (2001a). Process modelling for enterprise systems: factors
critical to success. 12th Australasian Conference on Information Systems.
Seely, R. J., H. V. Hutchins, et al. (1999). "Defining critical variables in well-characterised
biotechnology processes." Biofarm 12(4): 33-36.
Sisco, M. (2002). Prioritize your recommendations to focus clients' efforts. Techrepublic.com,
18/03/02.
http://www.techrepublic.com/article_guest.jhtml?id=r00720020313MS01.htm&rcode=&rcode=&page=3.
Smyth, R. W. (2001). Challenges to successful ERP use. 9th European Conference on
Information Systems, Bled, Slovenia.
Smyth, R. W. (2001a). CASE Success Factors; An Evaluation of Factors Involved in the
Successful Adoption of a Computer Aided Software Engineering (CASE) Package.
Faculty of IT. Brisbane, Queensland University of Technology: 350.
Sofaer, S., B. Kreling, et al. (2001). "Family Members and Friends Who Help Beneficiaries
Make Health Decisions." Health Care Financing Review; Washington 23(1): 16.
Stamatis, D. H. (1995). Failure Mode and Effect Analysis: FMEA from Theory to Execution.
Milwaukee, Wisconsin, Quality Press.
Stewart, G. (2001). The use of Yin's case study research approach as a means to
stimulating emancipatory action research. 7th Americas Conference on Information
Systems, Boston, MA.
Stewart, G. (2002). Research discussions on defining a critical process. C. Huxley.
Stewart, G. and G. G. Gable (2001a). "Emancipating IT leadership: an action research
program." Journal of information technology cases and applications 3(2): 7-20.
Stewart, W. E. (2001). "The Balanced Scorecard for Projects." Project Management Journal
32(1): pgs 38-53.
Stratecast Partners (2001). Obtaining Business Value:A Critical Assessment of the
TeleManagement Forum’s Telecom Operations Map (TOM) and eTOM. Frost &
Sullivan, 13/3. http://www.stratecast.com/pdf/tom_toc.pdf.

Talwar, R. (1994). Business Process re-engineering: Myth and reality. London, Kogan Page
Ltd.
Taylor, C. (2003). Development of the High-Level Model for IT Service Delivery. C. Huxley.
Brisbane.
Thorp, J. (1998). The Information Paradox. Realizing the Business Benefits of Information
Technology. New York, McGraw-Hill.
Timbrell, G., N. M. Andrews, et al. (2001). Impediments to inter-firm transfer of best practice
in an enterprise systems context. 7th Americas Conference on Information Systems,
Boston, MA.
Tucker, M. and R. Clark (2000). The implementation of a balanced scorecard (BSC) at the
Royal Bank of Scotland. World Productivity Congress of Management Services.
Tufte, E. R. (2002). Visual Explanations: Images and Quantities, Evidence and Narrative.
Cheshire, Connecticut, Graphics Press.
Van der Zee, D. I. (1999). Alignment is not enough: Integrating Business and IT
Management with the Balanced Scorecard. Symposium on IT Balanced Scorecard,
Antwerp, Belgium, Technologisch Instituut VZW.
van Lamsweerde, A. (2000). Requirements engineering in the year 00: a research
perspective. Proceedings of the 2000 International Conference on Software
Engineering, Limerick, Ireland, IEEE.
Viljoen, J. and S. Dann (2000). Strategic Management: Planning and Implementing
Successful Corporate Strategies. Sydney, Pearson Education Australia Pty Ltd.
Walker, R. W. (2000). Assessment of Technical Risks. 2000 IEEE Conference on
Management of Innovation and Technology, IEEE.
Weiss, J. W. and R. K. Wysocki (1992). 5-Phase Project Management: A practical planning
and implementation guide. US, Perseus Books Publishing.
Yin, R. K. (1994). Case Study Research: design and methods. Thousand Oaks California,
Sage Publications Inc.

10 Appendices

10.1 Appendix 1- BSC Implementation Issues

This section draws on the issues raised in previous sections of the BSC and
has broken these issues into seven categories, namely:
1. The Concept
2. Support Methods
3. Skills of the Organisation
4. Changed Perspective Names /Communication
5. Cause & Effect
6. Measures
7. Change Management

The Concept
Robert Kaplan in his 1998 article ‘Innovation Action Research’ says that
evaluation of an implementation should start with identifying if the
organisation has “implemented the concept in an accurate and faithful way”
(Kaplan 1998). Kaplan and Bower (2000) reported two similar issues in
implementation; implementing the concept prematurely (“without the
oversight of the innovators”) when the concept has yet to progress to be of
most benefit to the organisation and a “lack of knowledge of the concept”
(Kaplan and Bower 2000).

There have been many examples of implementation of the original concept
into organisations for which Kaplan and Norton have not tested the BSC
(Lingle and Schiemann 1996; Beauchamp 1999; Rosemann and Wiese
1999; Van der Zee 1999; Bach, Calais et al. 2000; Sedera, Rosemann et al.
2000; Tucker and Clark 2000; Andersen, Cobbold et al. 2001; Microsoft
Business Solutions 2001; Stewart 2001; Murphy and Russell 2002). It is the
research team’s personal experience that organisations have implemented
the BSC with gaps in their knowledge of how the concept works. This might
be remedied by further reading and experience with the concept.

Mendoza and Zrihen (2001) state that the BSC “has been criticised for the
way its implementation process is structured.” It presupposes that the
organisation will naturally take hold of the ‘key indicators’ and “direct their
efforts towards improving performance around the selected indicators.” The
authors suggest that most of the literature ignores the needs of the
organisation itself and “deny the existence of local specificity” in regard to the
objectives of middle managers (Mendoza and Zrihen 2001). To ignore the
needs of the organisation itself, the organisation would need to set goals
that exclude those needs. That is, it may focus on
shareholder returns to the exclusion of training, wages and team building
within the entity. As for the local specificity, the BSC should be developed on
many levels of the organisation to ensure that goals are both specific to the
local need and still achieve corporate needs. There are tools which can
support this type of problem solving where two parties have opposing needs
and ultimately similar goals. See (Mabin, Forgeson et al. 2001; Mabin,
Menzies et al. 2001a; Mabin and Davies 2002) for some examples.

Support Methods
Beauchamp (1999) stresses that the BSC is only useful for monitoring the
internal organisation, adding that another method should be used to monitor
the performance of a health unit externally (Beauchamp 1999). This alludes
to a version of the BSC that takes into account only the internal customer and
internal processes. Part of this issue may be the mindset of the users
surrounding the meaning of the names of the perspectives. As suggested in
the section on perspectives it is possible to change these names or even just
the meaning of them to suit particular circumstances.

Nickols (2000) suggests that the BSC may “drive the wrong strategy” blindly
“unless there are other gauges of organisational health and performance”
(Nickols 2000). Kaplan and Norton (1992) stated in their original journal
article that, “not all long-term strategies are profitable strategies” (Kaplan and
Norton 1992) and again in their 2001 book, (The Strategy Focused
Organisation) that strategy needs to be chosen with care as the BSC does
not test strategies for correctness (Kaplan and Norton 2001). The BSC does
not claim to develop strategy but to support the implementation and
measurement of achievement of the strategies selected.

Skills of the Organisation


Germain (2000) states that subjective analysis "will help you and your
managers differentiate between strategic and tactical project opportunities,"
and that a further implementation issue is concerned with management’s
knowledge of what is correct strategy (Germain 2000). Incorrect strategy
would replicate the proverbial computer system with ‘garbage in and garbage
out.’ Germain (2000) is not the only writer who raises concern for this area of
the BSC implementation. Littler, Aisthorpe, Hudson and Keasey (2000)
concluded that, “the implementation role of BSC needs to be supported by a
defined strategy formulation process” (Littler, Aisthorpe et al. 2000). These
issues have a similar solution to that of Nickols (2000) (see Support
Methods). Kaplan (2000) stated that strategy development was still an art
and not yet a science (Kaplan and Bower 2000). This is further supported by
Viljoen and Dann (2000) who suggest that strategy is unique to every
organisation and as such companies can only be guided by their experience
and that of others when they make strategic choices (Viljoen and Dann
2000). The solution here then appears to be similar to that used for cause
and effect decisions. Apply an anchoring and adjustment style heuristic (as
discussed in tools for assessing factors) which is based on reliable facts and
as much experience as possible (Davidson and Griffin 2000).

Andersen, Cobbold and Lawrie (2001) from 2GC Active Management state
that “insights from Kaplan & Norton (1996) and Epstein and Manzoni (1997)
suggest that to be successful implementation requires changes to other
management processes” which are impacted by the BSC (Andersen,
Cobbold et al. 2001). Andersen et al. (2001) do not describe which
management processes are impacted by the BSC (Andersen, Cobbold et al.
2001). In addition, as the implementation of the BSC is really a project, good
project management practices would ensure that, during feasibility studies,
these impacts on other management processes are taken into
account and solutions provided. The use of problem solving methods (Mabin,
Forgeson et al. 2001) and well developed cause and effect trees might be a
solution to these issues.

Kaplan and Bower (2000) also state that many BSC implementations suffer
from poor project management (Kaplan and Bower 2000). Ensuring that
correct project management practices are followed and an experienced project
manager is part of the team would be a partial solution to this problem. When
conducting feasibility studies, the project team would need to ensure
that the organisation had the resources, time and skills to implement the BSC
successfully.

Changed Perspective Names /Communication


O’Neil, Bensimon, Diamond and Moore (1999) changed the names of some
of the perspectives of their BSC “to more accurately reflect the needs of
readers and users” (O'Neil Jr. and Bensimon 1999). Sedera, Rosemann and
Gable (2000) state that a further fifth perspective might be added according
to the specific demands and circumstances of an organisation (Sedera,
Rosemann et al. 2000).

Bach, Calais and Calais (2000) raise similar issues with their proposal to use the
BSC as a “personal Scorecard in which to qualify and quantify the feasibility
of investing” in an alternate power source (Bach, Calais et al. 2000). Their
paper explains that while the names of the four perspectives remain the
same, the context is changed to meet the needs of a private consumer.

The “importance of clearly articulated objectives” is also highlighted (Lingle
and Schiemann 1996) along with the need to define clearly the meanings and
definitions of terms and perspectives (Olve and Sjostrand 2002). The names
need to represent the major focuses of the entity from its own point of view.

The issues for implementation are highlighted by Lingle and Schiemann
(1996) and Olve and Sjostrand (2002) in their comments that users of the
BSC must clearly define and articulate the meanings and definitions of terms
within the BSC. Clearly, changing the names of perspectives is a
communication issue that would be served by ensuring that meanings and
definitions are correct for that entity and well communicated to all users.

Cause and Effect


Mitchell, Coles and Metz (1999) relate the story of the six blind men and the
elephant to describe the results of poor cause and effect judgements
(Mitchell, Coles et al. 1999).

The story is about six blind men encountering an elephant
for the first time. Each man, seizing on the single feature
of the animal, which he appeared to have touched first,
and being incapable of seeing it whole, loudly maintained
his limited opinion on the nature of the beast. The
elephant was variously like a wall, a spear, a snake, a
tree, a fan or a rope, depending on whether the blind men
had first grasped the creature's side, tusk, trunk, knee, ear
or tail. (Saxe 1963)

The story highlights the issue that people make judgements based on their
perception of their environment and experiences. Where possible then we
should ensure that those using cause and effect linkages base decisions on
a collection of experiences and knowledge and not just one person.

Davidson and Griffin (2000) describe cause and effect linkages as being
based in some part on heuristics. They state that heuristics "are rules of
thumb that may simplify decision making, but that may also lead to less
optimal decisions" (Davidson and Griffin 2000)(p310).

There are four types of heuristic judgements:
1. Judgmental heuristics- habitual strategies used to assess a situation
2. Availability heuristics- reflect the influence of things such as recent
memory, perception or imagination
3. Representative heuristics- involve the comparison of perceived similar
items or occurrences
4. Anchoring and Adjustment heuristics- involve assessing an event by
starting with an initial piece of fact or evidence and then postulating from
there (Davidson and Griffin 2000).
Of these four types Davidson and Griffin (2000) suggest that #4, anchoring
and adjustment style heuristics, is the better choice when resorting to a
heuristic style of decision.

Mitchell, Coles and Metz (1999) offer advice with cause and effect stating
that over time cause and effect links can be correlated statistically and
practitioners should remove things which are not linked (Mitchell, Coles et al.
1999).
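
As an illustration only, the following minimal sketch (written in Python, with
hypothetical measure names, invented sample data and an arbitrary 0.5
threshold, none of which are drawn from the studies cited above) shows how
such a statistical check of assumed cause and effect links might look. Each
assumed link is tested by correlating the historical values of the 'cause'
measure with those of the 'effect' measure, and weakly correlated links are
flagged for review.

# Minimal sketch: test assumed BSC cause-and-effect links by correlating
# historical values of the 'cause' measure with the 'effect' measure.
# Measure names, sample data and the 0.5 threshold are illustrative only.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Quarterly observations for each measure (hypothetical data).
history = {
    "staff training hours": [12, 15, 18, 22, 25, 30],
    "incident resolution %": [71, 74, 78, 83, 85, 90],
    "marketing spend": [40, 38, 45, 41, 44, 39],
}

# Assumed cause-and-effect links to test (cause, effect).
links = [
    ("staff training hours", "incident resolution %"),
    ("marketing spend", "incident resolution %"),
]

for cause, effect in links:
    r = pearson(history[cause], history[effect])
    verdict = "retain" if abs(r) >= 0.5 else "review or remove"
    print(f"{cause} -> {effect}: r = {r:.2f} ({verdict})")

Such a check can only support, not replace, the judgement of experienced
and knowledgeable people; correlation alone does not establish causation.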

The issues with ‘cause & effect’ would then be concerned with ensuring the
validity of the cause & effect links over time and that cause and effect links
are developed in the anchoring and adjustment heuristic style with as much
input from experienced and knowledgeable people as is possible.

Measures
As mentioned previously the facet of measures within the BSC is not one that
will be used by the targeting methodology. For this reason this section will not
examine the implementation issues for measures.

Change Management
Kaplan and Bower (2000) say that “personal or organisational resistance or Managers
that fail or refuse to act on the signals” from their BSC cause
implementation failure (Kaplan and Bower 2000). They do not define what
‘implementation failure’ constitutes, though presumably it is financial decline
over time. Change management is a much explored area of management
and competent managers will understand the need for change management
in any project such as implementing a BSC. Dependent on the amount and
effect of any change, Viljoen and Dann (2000) suggest an organisation
should:
1. Prepare for change- unfreeze the organisation and gain acceptance for the
need for change
2. Make the change- ensure that employees adopt the change
3. Consolidate the change- refreeze the organisation by providing support for
the new systems and needs. (Viljoen and Dann 2000)
In addition an organisation can use change agents and build partnerships
that assist the change. Finally leaders of change should not forget the soft
issues of change, such as fear of the unknown. A Dilbert principle is useful
for describing the soft issues of change. “People hate change because
compared to their previous state it makes them dumber” (Dilbert Principle).
That is, when change is initiated, new knowledge is brought into an
organisation and the knowledge previously held by employees now represents
a smaller proportion of what is required.

Summary
The issues raised and solutions provided for the seven categories are:
1. The Concept
There have been many examples of implementation of the original concept
into organisations for which Kaplan and Norton have not tested the BSC
(Lingle and Schiemann 1996; Beauchamp 1999; Rosemann and Wiese
1999; Van der Zee 1999; Bach, Calais et al. 2000; Sedera, Rosemann et al.
2000; Tucker and Clark 2000; Andersen, Cobbold et al. 2001; Microsoft
Business Solutions 2001; Stewart 2001; Murphy and Russell 2002). The
targeting methodology is not necessarily an expansion of the concept but an
abbreviation. The needs of the organisation and local specificity (Mendoza
and Zrihen 2001) would be catered for by ensuring that goals for the BSC do
not exclude the needs of some parts of that organisation. As for the local
specificity, the BSC should be developed on many levels of the organisation
to ensure that goals are both specific to the local need and still achieve
corporate needs. There are tools which can support this type of problem
solving where two parties have opposing needs and ultimately similar goals.
See (Mabin, Forgeson et al. 2001; Mabin, Menzies et al. 2001a; Mabin and
Davies 2002) for some examples.

2. Support Methods
The issues uncovered under support methods were more closely connected
to those of communication and to the need to develop strategy with care as
the BSC does not test strategies for correctness (Kaplan and Norton 2001). It
is not within the scope of the research project to identify the multiple
approaches to developing correct strategy. The approach here is to ensure
that those people from whom this information is sought are seen to have the
right skills and experience for developing strategy.

3. Skills of the Organisation


The project team must assess the skills and knowledge of the organisation in
regard to the use and implementation of the BSC, the development
of strategy and the project team's project management skills. The use of
problem solving methods (Mabin, Forgeson et al. 2001) and current project
management tools (business case, Gantt charts & project software) will
assist in this area (Weiss and Wysocki 1992; Meredith and Mantel Jr 2000;
Jewels 2001).

4. Changed Perspective Names /Communication


The issues for changing the names of perspectives and communication are
that the project team must clearly define and articulate the meanings and
definitions of terms within the BSC. The project team should ensure that clear,
consistent and quality communication occurs in the lead-up to, during and
after the project.

5. Cause & Effect


The issues with ‘cause & effect’ are concerned with ensuring the validity of
the cause & effect links over time and that cause and effect links are
developed in the anchoring and adjustment heuristic style with as much input
from experienced and knowledgeable people as is possible.

6. Measures
Defined as out of scope for this research project.

7. Change Management
Undertake an appropriate assessment of the amount of change required for
the project and initiate suitable phases of unfreeze, change and refreeze.
Ensure that those affected understand the need to change and that these
people are given an opportunity to provide input to the project and thus take it
on board as their own. Communication should be sufficient to keep those
affected well informed and timed to allow time for responses.

Implementation issues from the literature:
1. (Andersen, Cobbold et al. 2001) - Requires changes to other management
processes which are impacted by the BSC
2. (Bach, Calais et al. 2000) - The context of perspectives is changed to meet
the needs of a private consumer
3. (Davidson and Griffin 2000) - Heuristics "are rules of thumb that may simplify
decision making, but that may also lead to less optimal decisions" (p310) when
developing cause & effect
4. (Germain 2000) - Implemented the concept "in an accurate and faithful way";
knowledge of what is correct strategy
5. (Kaplan 1998) - Implemented the concept "in an accurate and faithful way"
6. (Kaplan and Bower 2000) - Implementing the concept prematurely; lack of
knowledge of the concept; BSC implementations suffer from poor project
management; personal or organisational resistance
7. (Lingle and Schiemann 1996) - Importance of clearly articulated objectives
8. (Littler, Aisthorpe et al. 2000) - The implementation role of BSC needs to be
supported by a defined strategy formulation process
9. (Mendoza and Zrihen 2001) - Ignores the needs of the organisation or denies
the existence of local specificity
10. (Mitchell, Coles et al. 1999) - Over time cause and effect links should be
correlated statistically and things which are not linked should be removed
11. (Nickols 2000) - May drive the wrong strategy
12. (Olve and Sjostrand 2002) - Define clearly the meanings and definitions of
terms and perspectives
13. (O'Neil Jr. and Bensimon 1999) - Changed the names of some of the
perspectives of their BSC “to more accurately reflect the needs of readers and
users”
14. (Sedera, Rosemann et al. 2000) - Fifth perspective added according to the
specific demands and circumstances of an organisation

Table 55- Issues in implementation of a BSC

The table lists the issues taken from the literature. The previous summary
has provided possible solutions to these issues and these will be taken into
account in the targeting methodology. Some of these issues are the result of
poor project management, communication and understanding of the BSC
concept.

Successful implementation requires good project management, appropriate
knowledge of the concept and how to develop successful strategy. It also
demands attention to the needs of the organisation, including allowance
for local needs within large organisations. It should also include appropriate
consideration of communication.
definitions and meanings of terms within the BSC and also the cause and
effect linkages. The change management needs of the implementation were
considered to have little impact on the targeting methodology as it could be
undertaken by a small group. The correct approach to cause and effect will
help to improve these linkages though the research project does not intend to
verify the validity of the linkages over time. Organisations using the BSC
should approach it with as much knowledge as possible of how the system
was intended to be used, making changes sparingly. The success of the
concept is impacted by the skills and abilities of the management team in
defining the inputs and managing the implementation process.

10.2 Appendix 2- Benefits Documents

IS Management Research Group
Centre for IT Innovation, QUT
Level 5, 126 Margaret St
Brisbane, 4000
AUSTRALIA
Ph: +61 7 3864 9476
Fax: +61 7 3864 9390
Email: c.huxley@qut.edu.au
ci.taylor@qut.edu.au

“Critical, Best-Practice Processes in ERP Service Delivery”

Signing this consent form indicates that the participant involved in the
research and the authorised person (if different) from your organisation have
read and understood the information package included with this form.

It also indicates that this person or persons;


• Has had any questions answered to their satisfaction;
• Understands that if they have any additional questions they can contact
the research team; (see contact details above and included in the
information package)
• Understands that they are free to withdraw at any time, without comment
or penalty; (Withdrawal does not negate your right to see and approve
items for publication which may be of a sensitive nature)
• Has read, understood and initialled accompanying information pages
• Understands that the Focus Group sessions will be audio-taped
• Understands that they can contact the Secretary of the University Human
Research Ethics Committee on 3864 2902 if they have concerns about
the ethical conduct of the project;
• Agrees to participate in the focus groups
YES NO
(Please circle)

Date / / 2002

Participant Name
(Signature) (Printed Name)

Authorised Officer
(If different to participant) (Signature) (Printed Name)

Participant Organisation

Chris Taylor Craig Huxley


IT Researcher IT Researcher

10.3 Appendix 3 Focus Group – Information Pages


Focus Group Specific Information
• Focus groups consist of a ‘round-table’ meeting with researchers and
industry peers for the discussion of the relevant topic.
• Focus group sessions will be audio-taped. The resultant tapes will only be
used by and made available to the researchers to accurately record and
interpret the focus group outputs.
• Data collected from the focus groups will be:
o thoughts, ideas and knowledge on questions raised including;
- Definition of critical process
- Examples of critical processes
- Potential improvements to processes
- Feedback on proposed process models
• Up to 7 focus group sessions are planned (time and research outcome
dependent), each will run for 2-3 hrs
• A non-binding commitment to the first 5 focus groups is requested (lack of
participation due to other commitments is understandable)
• Participation is completely voluntary and each participant has the right to
withdraw at any time, by simply emailing or telephoning either of the
researchers. Data collected from withdrawn organisations will, where
practical, be removed and destroyed on request.
• Data collected from focus groups will be subject to the confidentiality contract
as signed by the participant and the researchers
Project Description
This research project (supported by REALTECH AG and the Australian Research
Council), is aimed at developing knowledge in the field of critical, best practice
processes in ERP (Enterprise Resource Planning) service provision.

The output of this research project can be categorised into two areas. Firstly, a
business strategy implementation tool, the Balanced Scorecard will be used to
identify strategically critical processes in an Application Hosting Centre. Secondly,
best practice process models will be developed for selected critical processes.

The foundation of this research is a number of industry focus groups, with approx.
9-12 participants, used to identify critical processes and then to develop best
practice process models. Case studies will be used, focusing on implementing a
process targeting method, to develop individual lists of critical processes and action
research, involving implementation, benchmarking and validation of developed best
practice models will be done.

Potential participants for this research were identified as being an ERP service
provider, or participant with knowledge in this area.

Benefits for Industry Participants


Hosting
A benefit from this research will be the potential improvements in the quality and
efficiency of your Application Hosting. The ability to identify critical processes will
provide an effective method of aligning process improvement projects to process
and organisational objectives.
Communication
Targeting of specific critical processes will display a level of knowledge and
understanding that will be apparent to prospective clients.
Developed internal process models will provide a common language for discussion
with clients (and potential clients) interested in the quality of your operations, hence
communicating your commitment to quality service provision.
Academic Involvement
This research team brings considerable expertise and provides detailed researched
knowledge and consultancy at no cost to industry participants. Joint research
projects also provide an excellent base for further industry involvement in
developing leading edge methods and techniques.
Exposure
The opportunity to be involved with this collaboration provides a unique setting for
frank discussion with industry peers and academic experts. Also, where appropriate
(and with your approval), companies involved in this project will be identified in
publications submitted to trade journals and magazines.
Real World Intelligence
Involvement in this research project, which is based upon the knowledge and
experience of practitioners within this industry, will provide opportunities for
participants to gain tangible, valuable knowledge.
Risks
Dissemination of Sensitive Information
This research will be dealing with potentially sensitive information and data. There is
a risk that this information could be used by others to the detriment of the provider, if
not controlled within binding confidentiality agreements. The procedures for dealing
with information gathered from industry participants in this research are outlined in
the Confidentiality Agreement. The risk also exists in the focus group forum that
sensitive information may be provided to parties (other focus group participants)
whose actions cannot be controlled by QUT or the researchers.
Confidentiality
See: Individual “Confidentiality Agreement”
Voluntary Participation
Participation is completely voluntary and each participant has the right to withdraw
at any time, by simply emailing or telephoning either of the researchers. Data collected
from withdrawn organisations will, where practical, be removed and destroyed on
request.
Feedback
Feedback will be provided via access to relevant publications by the researchers,
through informal means such as email and telephone as reasonably requested, and
through a formal focus group wrap up meeting
Further Questions
Chris Taylor and Craig Huxley

Ph: +61 7 3864 9476


Fax: +61 7 3864 9390
Email: c.huxley@qut.edu.au
ci.taylor@qut.edu.au

10.4 Appendix 4- Ethics Documentation

Informed Consent Package: Focus Groups – Page 1 of 3


IS Management Research Group
Centre for IT Innovation, QUT
Level 5, 126 Margaret St
Brisbane, 4000
AUSTRALIA
Ph: +61 7 3864 9476
Fax: +61 7 3864 9390
Email: c.huxley@qut.edu.au
ci.taylor@qut.edu.au

“Critical, Best-Practice Processes in ERP Service Delivery”

Signing this consent form indicates that the participant involved in the
research and the authorised person (if different) from your organisation have
read and understood the information package included with this form.

It also indicates that this person or persons;


• Has had any questions answered to their satisfaction;
• Understands that if they have any additional questions they can contact
the research team; (see contact details above and included in the
information package)
• Understands that they are free to withdraw at any time, without
comment or penalty; (Withdrawal does not negate your right to see and
approve items for publication which may be of a sensitive nature)
• Has read, understood and initialled accompanying information pages
• Understands that the Focus Group sessions will be audio-taped
• Understands that they can contact the Secretary of the University
Human Research Ethics Committee on 3864 2902 if they have
concerns about the ethical conduct of the project;
• Agrees to participate in the focus groups
YES NO

(Please circle)

Date / / 2002

Participant Name
(Signature) (Printed Name)

Authorised Officer
(If different to participant) (Signature) (Printed Name)

Participant Organisation

Chris Taylor Craig Huxley


IT Researcher IT Researcher

Figure 68- Informed Consent for Focus Groups page 1 of 3

Informed Consent Package: Focus Groups – Page 2 of 3

Focus Group – Information Pages


Focus Group Specific Information
• Focus groups consist of a ‘round-table’ meeting with researchers and
industry peers for the discussion of the relevant topic.
• Focus group sessions will be audio-taped. The resultant tapes will only
be used by and made available to the researchers to accurately record
and interpret the focus group outputs.
• Data collected from the focus groups will be:
o thoughts, ideas and knowledge on questions raised including;
- Definition of critical process
- Examples of critical processes
- Potential improvements to processes
- Feedback on proposed process models
• Up to 7 focus group sessions are planned (time and research outcome
dependent), each will run for 2-3 hrs
• A non-binding commitment to the first 5 focus groups is requested (lack
of participation due to other commitments is understandable)
• Participation is completely voluntary and each participant has the right
to withdraw at any time, by simply emailing or telephoning either of the
researchers. Data collected from withdrawn organisations will, where
practical, be removed and destroyed on request.
• Data collected from focus groups will be subject to the confidentiality
contract as signed by the participant and the researchers
Project Description
This research project (supported by REALTECH AG and the Australian
Research Council), is aimed at developing knowledge in the field of critical,
best practice processes in ERP (Enterprise Resource Planning) service
provision.

The output of this research project can be categorised into two areas. Firstly,
a business strategy implementation tool, the Balanced Scorecard will be used
to identify strategically critical processes in an Application Hosting Centre.
Secondly, best practice process models will be developed for selected critical
processes.

The foundation of this research is a number of industry focus groups, with
about 9-12 participants, used to identify critical processes and then to develop
best practice process models. Case studies will be used, focusing on
individual Balanced Scorecard analysis, to develop individual lists of critical
processes, and action research, involving implementation, benchmarking and
validation of the developed best practice models, will be undertaken.

Potential participants for this research were identified as being an ERP
service provider, or participant with knowledge in this area.
Figure 69-Informed Consent for Focus Groups page 2 of 3

Informed Consent Package Focus Groups page 3 of 3

Benefits for Industry Participants


Hosting
A benefit from this research will be the potential improvements in the quality and
efficiency of your Application Hosting. The ability to identify critical processes will
provide an effective method of aligning process improvement projects to process and
organisational objectives.
Communication
Targeting of specific critical processes will display a level of knowledge and
understanding that will be apparent to prospective clients.
Developed internal process models will provide a common language for discussion
with clients (and potential clients) interested in the quality of your operations, hence
communicating your commitment to quality service provision.
Academic Involvement
This research team brings considerable expertise and provides detailed researched
knowledge and consultancy at no cost to industry participants. Joint research
projects also provide an excellent base for further industry involvement in developing
leading edge methods and techniques.
Exposure
The opportunity to be involved with this collaboration provides a unique setting for
frank discussion with industry peers and academic experts. Also, where appropriate
(and with your approval), companies involved in this project will be identified in
publications submitted to trade journals and magazines.
Real World Intelligence
Involvement in this research project, which is based upon the knowledge and
experience of practitioners within this industry, will provide opportunities for
participants to gain tangible, valuable knowledge.
Risks
Dissemination of Sensitive Information
This research will be dealing with potentially sensitive information and data. There is
a risk that this information could be used by others to the detriment of the provider, if
not controlled within binding confidentiality agreements. The procedures for dealing
with information gathered from industry participants in this research are outlined in
the Confidentiality Agreement. The risk also exists in the focus group forum that
sensitive information may be provided to parties (other focus group participants)
whose actions cannot be controlled by QUT or the researchers.
Confidentiality
See: Individual “Confidentiality Agreement”
Voluntary Participation
Participation is completely voluntary and each participant has the right to withdraw at
any time, by simply emailing or telephoning either of the researchers. Data collected
from withdrawn organisations will, where practical, be removed and destroyed on
request.
Feedback
Feedback will be provided via access to relevant publications by the researchers,
through informal means such as email and telephone as reasonably requested, and
through a formal focus group wrap up meeting
Further Questions
Chris Taylor and Craig Huxley

Ph: +61 7 3864 9476


Fax: +61 7 3864 9390
Email: c.huxley@qut.edu.au
ci.taylor@qut.edu.au

Figure 70-Informed Consent for Focus Groups page 3 of 3

The three page document shown in Figures 68 to 70 is indicative of the requirements
for the project from the ethics committee. The first page is the formal consent
page and it ensures that each participant is aware of the ethical issues
concerning withdrawal, complaints, questions and formal acceptance to
participate. The following two pages describe the focus group study
specifically, the entire combined project and the benefits of participation.
The major issue raised by the participants was that of confidentiality. We
were asking them to discuss their business (in most cases) in front of their
competitors during the focus group session. In order to provide for the
confidentiality of participants in the case studies, a list of choices for treating
possibly confidential information was provided to participants.

1. Approvals for publications (applicable to all research and results gathered):
Any publication from which entities not involved with the research could
reasonably identify the participants involved in the research will be forwarded
to the identifiable parties for approval before publication.

2. Sanitisation of Data (applicable to all research and results gathered): Data
collected will be edited to remove any information that could lead to the
identification of the involved parties prior to publication unless approval is
provided by the identifiable parties.

3. Voluntary Participation and self-censorship (particularly applicable to focus
groups): All participation in this research is completely voluntary. There is no
compulsion to disclose any sensitive information, particularly within the focus
group meetings.

4. Individual Meetings (applicable to Focus Groups; this option is available but
not encouraged): Individual meetings can be arranged in conjunction with the
focus groups. Individual meetings can be arranged instead of participation in
the focus groups.

5. Option for Embargo on release of thesis (applicable to all research and
results gathered): A stop on the public release of the Master's thesis for a set
time frame (up to 2 years) would limit the readership of the publication to:
• Researcher (Huxley)
• Supervisors (Rosemann and Stewart)
• External Markers (2 selected persons with expert domain knowledge)
This will be offered as an option to be exercised if and when any industry
participant feels that the information to be presented in the thesis is of a
sensitive nature.

Table 56- Options for treating confidential information

It was not necessary to use item 5, the embargo on the thesis, as the
research team ensured that the data was sanitised for the Delphi study and
focus group phases. The pilot study data was also sanitised, as was any data
from the three case studies that may have been commercially confidential
and of value to competitors or other parties. No participants chose to use
item 4. Thus items 1, 2 and 3 were the only choices used for this research
project. Any material for publication was provided to participants who might
be identified in some way, even if the identification was positive.

10.5 Appendix 5- IDC Definitions

“Information Systems (IS) outsourcing services involve a long-term,
contractual arrangement in which a service provider takes responsibility for
managing all or part of a client's information systems operations or
department, based on service level agreement. An IS outsourcing contract
usually includes data centre computing services and may also include such
services as distributed or client/server computing, local and wide area
network operations, help desk operations, application development and
maintenance, and related consulting and systems integration activities”
(Benson 2002).

“Applications Outsourcing (AO) is a service wherein responsibility for the
deployment, management and enhancement of a packaged or customised
software application only is transferred contractually to an external service
provider” (Benson 2002).

“Systems Integration (SI) is a process that includes the planning, design,
implementation, and project management of a solution that addresses a
customer's specific technical or business needs. It includes systems and
custom applications development, as well as implementation and integration
of enterprise package software” (Benson 2002).
