
A Comparison of Systemwide and Hospital-Specific Performance Measurement Tools
Clarence Yap, M.D., associate, McKinsey & Company, New York, New York; Emily Siu, BSc, analyst, Health Results Team, Ontario Ministry of Health and Long-Term Care, Toronto, Canada; G. Ross Baker, Ph.D., professor, Department of Health Policy, Management, and Evaluation, University of Toronto, Ontario, Canada; and Adalsteinn D. Brown, D.Phil., assistant professor, Department of Health Policy, Management, and Evaluation, University of Toronto, and Lead, Health Results Team, Ontario Ministry of Health and Long-Term Care

EXECUTIVE SUMMARY
Balanced scorecards are being implemented at the system and organizational levels
to help managers link their organizational strategies with performance data to
better manage their healthcare systems. Prior to this study, hospitals in Ontario,
Canada, received two editions of the system-level scorecard (SLS), a framework based on the original balanced scorecard that includes four quadrants: system
integration and management innovation (learning and growth), clinical utilization
and outcomes (internal processes), patient satisfaction (customer), and financial
performance and condition (financial). This study examines the uptake of the
SLS framework and indicators into institution-specific scorecards for 22 acute care
institutions and 2 non-acute-care institutions.
This study found that larger (teaching and community) hospitals were significantly more likely to use the SLS framework to report performance data than small hospitals were (p < 0.0049 and p < 0.0507) and that teaching hospitals used the framework significantly more than community hospitals did (p < 0.0529). The
majority of hospitals in this study used at least one indicator from the SLS in their
own scorecards. However, all hospitals in the study incorporated indicators that
required data collection and analysis beyond the SLS framework.
The study findings suggest that SLS may assist hospitals in developing
institution-specific scorecards for hospital management and that the balanced
scorecard model can be modified to meet the needs of a variety of hospitals. Based
on the insight from this study and other activities that explore top priorities for
hospital management, the issues related to efficiency and human resources should
be further examined using SLSs.
For more information on the concepts in this article, please contact Dr. Brown at Adalsteinn.Brown@utoronto.ca. To purchase an electronic reprint of this article, go to www.ache.org/pubs/jhmsub.cfm, scroll down to the bottom of the page, and click on the purchase link.


Since an initial call for greater
research into the use of the
balanced scorecard framework in
hospital care (Forgione 1997), a
number of hospitals have reported
implementing the framework and
achieving success with its use (Kaplan
and Norton 2001; Curtright, Stolp-Smith, and Edell 2000). Many
examples of balanced-scorecard-like measurement systems for
individual hospitals and hospital
departments are presented in the
literature (e.g., Curtright, Stolp-Smith, and Edell 2000; Harber 1998;
Jones and Filip 2000; MacDonald
1998; Meliones 2000; Rimar and
Garstka 1999; Gordon et al. 1998).
A number of papers also document
the development or diffusion of
balanced scorecards in hospital
and healthcare systems (e.g., Baker
and Pink 1995; Castaneda-Mendez,
Mangan, and Lavery 1998; Zelman,
Pink, and Matthias 2003; Chan and
Ho 2000; Magistretti, Stewart, and
Brown 2002; Pink and Freedman
1998; Sahney 1998). The balanced
scorecard is used by different types
of healthcare organizations, and
it is modified to reflect each user
group's realities. Documented users
include hospital systems, hospitals,
long-term-care facilities, national
healthcare organizations such as the
Joint Commission on Accreditation
of Healthcare Organizations, and
health departments of local and federal
governments (Zelman, Pink, and
Matthias 2003).
System-level scorecards (SLSs)
typically indicate the performance
of multiple hospital or healthcare organizations in the dimensions of learning and growth, internal business
processes, customer, and financial
performance. These performance
dimensions are often adapted in
ways that are specific to the hospital.
However, to what extent SLSs reflect the
strategies of the individual hospitals
is unclear. The strategy of the funders of the SLS (typically hospital associations or payers such as government) may
be different, and a common set of
measures for all hospitals may not
include the diversity of an individual
hospital's own strategies. However, the
link between strategy and performance
measurement is critical to the success
of the balanced scorecard (Kaplan and
Norton 2001). A review of the literature
indicates that the most common reason
for the failure of balanced scorecards is
that they are developed by individuals
external to the organization or by
those who are unfamiliar with the
organization's strategy (McCunn 1998).
With the SLS, the critical question is this: To what extent do the strategies of the payer or hospital system resemble the strategies of the system's individual hospitals closely enough to support the use of a standardized scorecard for each hospital's specific management needs?
Such a model may be possible in
hospital systems such as the ones in
Ontario, Canada, where the provincial
government provides about 85 percent
of hospitals' funding (Canadian
Institute for Health Information
2003) and where issues such as capital
expansion and the introduction of
high-cost, high-technology care
modalities are largely centrally
coordinated. SLS may have an added
attraction in systems like those in
Ontario, where many small hospitals
(more than 25 hospitals have fewer
than 50 beds) have limited resources
to support the careful development of
successful scorecards (e.g., Kaplan and
Norton 2001).
At the time of this study, acute
care hospitals in Ontario received
one edition of SLSs called the Hospital
Report (Baker et al. 1999). These scorecards, based on the balanced scorecard
framework established by Kaplan and
Norton (1992), included 38 indicators
reported for virtually every hospital
and an additional 21 reported at the
regional level. These indicators covered system integration and change
management (learning and growth),
clinical utilization and outcomes (internal processes), patient satisfaction
(customer), and financial performance
and condition (financial). Sponsored
by the provincial government and the
provincial hospital association, Hospital
Report 1999 enjoyed the voluntary
participation of 89 hospitals out of 129
eligible hospitals, including 12 teaching
hospitals (100 percent of teaching
hospitals), 63 community hospitals (82
percent of community hospitals), and
14 small hospitals (35 percent of small
hospitals) (Baker et al. 1999).
The uptake of and response to the SLS among Ontario hospitals over time have
led to an expansion of the Hospital
Report series beyond the acute care hospital sector and into four other sectors:
complex continuing care, rehabilitation,
emergency department care, and mental health. An ongoing excerpt from the
SLS focuses on women's healthcare in
hospitals.
Although the SLS provides a common measurement tool for hospitals
and is an important component of
both the provincial government's and
hospitals' own accountability policies,
hospitals need to develop their own,
institution-specific scorecards for a
number of reasons. First, a hospital's
scorecard has to reflect only the institution's own strategies. Second, a
hospital's performance data on the
scorecard should be updated more often than is possible with the yearly SLS.
Third, a hospital needs to disaggregate
scorecard data to the business-unit level
and to align incentives and practices
within the hospital. Fourth, institution-specific scorecards encourage managers
and staff to change their behavior and
activities to achieve corporate strategic
objectives, because the performance
indicators on these institutional scorecards, such as patient satisfaction and
clinical efficiency, show how individuals can directly influence the results
(Oliveira 2001). However, at the time
of the release of the first Hospital Report,
only two Ontario hospitals had published examples of their own scorecards
(Harber 1998; MacDonald 1998), and
a review of hospital performance measurement activities across Canada did
not identify many additional scorecard
examples (Baker et al. 1998).
This article reports on the uptake of
the SLS (Hospital Report) into Ontario hospitals' own performance measurement
systems within one year of the SLS's
release. It also explores some of the
hospital characteristics associated with
the uptake of the standardized model.
Figure 1 summarizes the Hospital Report
framework used in 1999.


FIGURE 1
Summary of the Hospital Report 1999 Framework

Clinical Utilization and Outcomes
  Access
  Outcome
  Clinical efficiency

Financial Performance and Condition
  Financial viability
  Efficiency
  Liquidity
  Capital
  Human resources

Patient Satisfaction
  Patient satisfaction

System Integration and Management Innovation
  Information use
  Internal coordination of care
  Hospital-community integration

METHODS
In 2000, through a mailing list maintained by the Ontario Hospital Association, we contacted all acute care hospitals in Ontario (n=129), requesting
them to send in a copy of their balanced scorecards, executive dashboards,
or other corporatewide performance
measurement tools developed by the
hospitals. Hospitals could return copies
of this material by fax, e-mail, or mail
to us (the researchers) or to the Ontario Hospital Association. After one
month, we sent a reminder by mail
to all hospitals. Through its member
relations department, the Ontario
Hospital Association also nominated hospitals that were most likely to
have a hospital scorecard; from the
nominations, we contacted members
of senior management at ten leading
hospitals by phone or in person to
request copies of their performance
measurement framework. We reviewed conference proceedings and
searched Medline for examples of
hospital scorecards. Based on these
follow-ups, we believe that no other
scorecards were being implemented at
hospitals in Ontario at that time.
We entered information from all
returned material into a database and
created a final list of all indicators contained in any of the submitted performance measurement tools. To classify
the indicators, one of us conducted an initial cataloging and another conducted a second; another researcher then reconciled any differences between the two classifications.
We coded indicators on each hospital's
scorecard as "same," "similar," or "different" from the indicators on the SLS.
We tallied the number of indicators
on each hospital's scorecard that were
similar to indicators on the SLS according to whether the indicators were the
same, and then we separately tallied
the number using only indicators that
were the same. We made several assumptions when classifying indicators
as the same as or similar to indicators on the SLS. In several instances,
we contacted hospitals to clarify our
interpretations. Rules used to code
indicators are available from us and
are included in Appendix 1 (see Note 1). We coded
the use of the quadrants included in
the SLS in a similar fashion.
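
To make the tallying step concrete, the following is a minimal Python sketch of the logic described above; the hospital names, indicator codings, and data layout are hypothetical illustrations and are not drawn from the study data.

from collections import Counter

# Each hospital's scorecard indicators are coded against the SLS as
# "same," "similar," or "different." Two tallies are kept per hospital:
# one counting only "same" indicators and one counting "same" or "similar."
# The hospitals and codings below are hypothetical examples.
coded_scorecards = {
    "Hospital A": ["same", "similar", "different", "same"],
    "Hospital B": ["similar", "different", "different"],
}

def tally(codes):
    counts = Counter(codes)
    same_only = counts["same"]
    same_or_similar = counts["same"] + counts["similar"]
    return same_only, same_or_similar

for hospital, codes in coded_scorecards.items():
    same_only, same_or_similar = tally(codes)
    print(hospital, same_only, "same,", same_or_similar, "same or similar")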
We stratified acute care hospitals
into the three peer groups used in Ontario: community, small, and teaching.
We did not include the two non-acute
hospitals in our tests. We used Fisher's
exact test to compare proportions of
hospitals across peer groups because
this test has a more stringent significance criterion to account for small cell
sizes.
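
As an illustration of this comparison, the sketch below applies Fisher's exact test to a two-by-two table in Python using scipy.stats.fisher_exact. The peer-group totals are inferred from the percentages reported in the Results (5 of roughly 12 teaching hospitals and 2 of roughly 40 small hospitals with scorecards) and are assumptions used for illustration only, not the study's analysis file.

from scipy.stats import fisher_exact

# Rows: teaching hospitals, small hospitals.
# Columns: hospitals with a balanced scorecard, hospitals without one.
# Counts are illustrative assumptions inferred from the reported percentages.
table = [
    [5, 7],    # teaching: 5 of ~12 hospitals returned a scorecard
    [2, 38],   # small: 2 of ~40 hospitals returned a scorecard
]

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print("odds ratio:", round(odds_ratio, 2), "p-value:", round(p_value, 4))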

RESULTS
Thirty acute care hospitals (response
rate of 23 percent) and two non-acute
care hospitals responded. Some hospitals returned a range of performance
measurement tools, including corporate
and unit-level balanced scorecards, draft versions of a scorecard, and executive dashboards with a different framework from the balanced scorecard. A total of 24 hospitals provided
balanced scorecards, which included 22
acute care hospitals (17 percent of all
acute care hospitals) and 2 non-acute
care hospitals (1 mental health care
hospital and 1 chronic care hospital).
Two more acute care hospitals (2 percent of all acute care hospitals) were
developing balanced scorecards at the
time of the survey, one acute care hospital (0.8 percent) stated that it used a
balanced scorecard but did not provide
evidence and thus was coded as not
using a balanced scorecard, and five
acute care hospitals (4 percent) did not
use the balanced scorecard framework
to communicate performance data.
Follow-up by mail and by phone did
not identify any additional scorecards.
Since completion of the study, we
have not found evidence of additional
scorecards at any Ontario hospitals.
Although the response rate was relatively low (23 percent), we believe that
this percentage represents almost the
entire sample of balanced scorecards
at Ontario hospitals during the study
period. Although we may have worked
from a nearly complete sample of
Ontario hospitals that used balanced
scorecards, the relatively low number
of responses means that differences
across hospitals should be interpreted
with caution because they will rarely be
statistically significant.
Based on 1999 data, teaching hospitals had an average of 656 beds, community hospitals 265 beds, and small hospitals 39 beds.


TABLE 1
Use of the SLS Framework Quadrants Among Acute Care Hospitals with Balanced Scorecards (n=22)

                             None   At least 1    At least 2    At least 3    At least 4    Only used the
                                    HR Quadrant   HR Quadrants  HR Quadrants  HR Quadrants  4 HR Quadrants
Community (n=15)               0        15            13          9 (10)          8              7
Small (n=2)                    1         1             1          1               1              0
Teaching (n=5)                 1         4             4          4               4              4
Total number of hospitals      2        20            18         14 (15)         13             11

Acute care institutions that used balanced scorecards tended to be larger; 5 were teaching hospitals (42
percent of all teaching hospitals), 15
were community hospitals (20 percent
of all community hospitals), and 2
were small hospitals (5 percent of all
small hospitals). These differences
across peer groups were significant
using Fisher's exact test and suggest
that larger hospitals were more likely
to have scorecards (teaching versus
small, p < 0.0049; community versus small, p < 0.0507). Similarly, the two
non-acute hospitals (mental health
hospital and chronic care hospital) had
strong research and teaching missions.
Likewise, use of a framework similar to the SLS framework was significantly more common among teaching
hospitals compared to community
hospitals (p < 0.0529) and small hospitals (p < 0.0079). Twenty acute care
hospitals employed a framework with
at least 1 SLS quadrant, and 13 used all 4 of the SLS quadrants. Table
1 summarizes the formats used by the
22 acute care hospitals that provided
balanced scorecards and shows the
frequency with which hospitals used quadrants in their own scorecards
that were the same as those in the SLS
framework. The number in parentheses
indicates the number of hospitals that
used the same or similar quadrants.
Thirteen hospitals (59 percent of
acute care hospitals with a scorecard)
employed a framework similar to that
used by the SLS, with at least four
quadrants to represent system integration and managerial innovation, clinical utilization and outcomes, patient
satisfaction, and financial performance
and condition. Two of these hospitals
also added extra domains of indicators that emphasized or added certain
aspects of performance.
One goal of the balanced scorecard is to identify a core set of indicators that can be used to summarize an organization's performance
(Chan and Ho 2000). On average,
hospitals reported 27.3 indicators per
scorecard (range: 13-53). Another
goal of the balanced scorecard is to
provide a clear set of metrics with
which to evaluate an organization's
strategy (Kaplan and Norton 2001).
Thus, the extent to which hospitals use similar indicators may reflect the
extent to which hospitals are pursuing
similar strategies in Ontario's largely
single-payer hospital system. Table 2
shows the number of indicators that
were used in the 22 acute care hospitals' scorecards, regardless of whether
they followed the SLS framework. The
number in parentheses shows the number of indicators that were the same or
similar.

TABLE 2
Number of SLS Indicators Used Among Acute Care Hospitals with Balanced Scorecards (n=22)

                             1-10     11-20     21-30     31-40     41+
Community                   13 (1)    1 (4)     0 (6)     0 (2)     1 (2)
Small                        2 (0)    0 (0)     0 (2)     0         0
Teaching                     5 (0)    0 (3)     0 (2)     0         0
Total number of hospitals   20 (1)    1 (7)     0 (10)    0 (2)     1 (2)
Although there were differences in
the number of SLS indicators used, the
majority of acute care hospitals with
a scorecard, regardless of peer group,
used at least one indicator from the
SLS. All hospitals used at least one
similar indicator, and 82 percent of
hospitals used at least one indicator
that was the same as an indicator on
the SLS. Table 3 describes the number
of all indicators, regardless of whether
they are from the SLS or not, and
the number of same SLS indicators
included in the corresponding SLS
quadrants by the 13 acute care hospitals that used an SLS-based framework.
The number in parentheses indicates
the number of indicators that are the
same or similar.
Table 4 shows the total number
of indicators different from the SLS used by the 24 hospitals that provided
balanced scorecards, regardless of
whether the hospitals followed the
SLS model. The number in parentheses
reflects the number of indicators when
both same and similar indicators were
counted as being the same as the ones
on the SLS.
All of the acute and non-acute care
hospitals with balanced scorecards had
at least one indicator that was different
from those contained in the SLS, and
all had a fairly high average number
of different indicators. This suggests
that hospitals developed their own
scorecards using data beyond those
easily available in the SLS framework to
reflect the difference in their corporate
strategies.
Interestingly, the choice of the
SLS indicators was not consistently
related to the availability of data. Of
the 41 unique clinical utilization and
outcomes indicators, 25 (61 percent)
were based on routinely collected
discharge abstracts; the remainder were
based on other data sources, including
chart abstracts and admission, discharge, and transfer systems. In contrast, of the 24 unique financial performance and condition indicators, 22 were based on routinely collected financial income and balance sheet data.


TABLE 3
Mean Number and Range of Indicators Included in Balanced Scorecards That Follow the SLS Model (n=13)

                              Clinical           Patient         Financial         System
                              Utilization and    Satisfaction    Performance       Integration
                              Outcomes                           and Condition     and Change

Mean number of indicators     7.6; 3-13          5.6; 2-10       5.5; 1-12         5.2; 2-10
  in SLS quadrant             (19.7; 3-34)       (8.2; 2-16)     (6.1; 1-12)       (5.5; 2-10)

Mean number of SLS            0.54 (15.3)        0.62 (4.1)      1.5 (2.7)         0.38 (1.3)
  indicators in quadrant

TABLE 4
Total Number of Indicators Different from the SLS Indicators (n=24)

Number of different indicators      0      1-10     11-20     21-30     31+
Number of hospitals                 0      3 (2)    9 (7)     9 (11)    3 (4)


DISCUSSION AND CONCLUSION
Like hospitals in other jurisdictions,
hospitals in Ontario face tremendous
challenges as they experience mergers,
closures, and funding restraints (Chan
and Lynn 1998). Managers of hospital
systems are increasingly concerned
about measuring and managing organizational performance in an attempt
to remain focused on delivering high-quality patient care while maintaining
expenditures within global budgets
that are centrally established. Not surprisingly, a number of hospitals have
adopted balanced scorecards, which is likely to help improve organizational
performance. This article is consistent
with earlier work by Chan and Ho
(2000), who studied the uptake of the
balanced scorecard model by hospital
executives across Canada. Their study
reveals that most executives supported
the strategic use of the balanced scorecard in management.
It is unclear whether the adoption
of the SLS framework or indicators
resulted from its relevance to individual hospitals or from ease of using
already-available data to measure performance. Hospitals with their own
balanced scorecard were more likely
to be teaching or community hospitals than smaller hospitals. A lack of

resources may have prevented small
hospitals from developing any type of
scorecard. This suggests that a greater
proportion of small hospitals with
scorecards follow the SLS framework
(containing at least four of the SLS
quadrants) because lack of resources
prevents them from further developing
their scorecards. However, the proportion of small hospitals that used
the framework was not significantly
greater than that of larger hospitals,
which may be a result of the small
sample size. It may also result from the
unsuitability of the SLS framework for
small hospital management, but it is
not possible to test this explanation using
the available data.
Increased attention on balanced
scorecards may also have influenced
the development of an institutional
balanced scorecard and uptake of the
SLS framework by hospitals. According
to the Hospital Report 2004 Strategic
Priorities Survey, 68 of 123 acute care
hospitals (55 percent) reported that
they had a documented and clearly
articulated accountability framework
(e.g., local balanced scorecard) used
in the planning and evaluation of
internal programs. This suggests that
the balanced scorecard may be increasingly recognized as a valuable
management tool for hospitals since
this study. Further assessments on
the use of the Hospital Report (SLS)
framework will examine the use of
the SLS among Ontario acute care
hospitals over time. Increased or sustained use of the framework may signify its relevance, but research should
further evaluate its usefulness to hospital managers.

The relatively strong, early use of a common set of indicators by nearly a
quarter of hospitals in Ontario suggests
that SLSs may have some relevance in
quasi-market systems, where strong
government regulation and a largely
single-payer system can encourage the
development of similar strategies across
separate organizations. Compared to
other systems, healthcare organizations
in Ontario are not unique in terms of
government intervention in operational
or capital financing and other forms of
regulation, although Ontario may have
greater degrees of such intervention.
These findings suggest that system-level
balanced scorecards may have some
relevance to a number of quasi-market
hospital systems. At a minimum, broad
benchmarking activities that provide
a common set of indicators to a large
number of hospitals may be valuable,
as these organizations design and
implement strategies to respond to
common environmental characteristics.
The early uptake of the SLS framework by acute care hospitals with a
balanced scorecard suggests that the
SLS may be a useful starting point as
hospitals develop their own performance measurement system. A larger
proportion of teaching hospitals used
the framework, suggesting that the
framework may be more relevant to
this group of hospitals than to other
peer groups. The greater use of the Hospital Report among teaching hospitals
may be an artifact, but this is unlikely
because of its statistical significance,
one of the few strongly statistically
significant findings in our work.
The use of available data may have
been another factor in the use of the

framework; however, an average of 19
indicators different from the SLS per
scorecard were found even when indicators similar to the SLS were included.
This suggests that hospitals were able
to collect their own data. Furthermore,
not all the SLS indicators were used.
Regardless, the uptake of SLS indicators
at the local level is interesting and
suggests that there may be a role for
these sorts of cross-organizational
performance measurement tools in
supporting hospital management. This
may result from the fact that the framework reduces data-collection burden,
that hospitals have similar strategies,
or that hospitals may have difficulty
obtaining performance measurement
experts to develop more appropriate
indicators.


A number of hospitals in our study have made great efforts to ensure a
broad spectrum of indicators to help
align their strategy. Although some of
these hospitals built from indicators
included in the SLS, their indicators
differed from the SLS framework in
three interesting and common ways,
which reflect the strategic and accountability needs of individual hospitals:
1. Some institutions, particularly
teaching hospitals, focused on
research and teaching indicators
such as new research contracts and
medical student education. These
indicators reflect the relative importance of research and teaching at
these institutions.
2. Some institutions looked at measures of community benefit and related their own performance to the community they serve.
3. Some institutions reorganized indicators to highlight other perspectives, such as the patient perspective or the institutional perspective. Although this makes it difficult to work from the cause-and-effect relationships embedded in the balanced scorecard framework, it may reflect the importance of internal and external communication of needs. It may also help highlight trade-offs between different groups within the hospital, such as between staff and patients.

Likewise, a number of common differences highlight some useful directions for future scorecards that describe an entire system. First, as indicated by 17 of 24 (71 percent) institution-specific scorecards and identified as an important area of focus in the Hospital Report 2004 Strategic Priorities Survey, entire-system scorecards should
examine human resources issues, which
include recruitment, retention, staff
satisfaction, turnover, and quality of
work life. Second, as the province of
Ontario provides the majority of funds
to hospitals, greater focus should be
placed on efficiency as a key source
of competitive advantage. An example
of such an indicator used by certain
hospitals is turnaround time for clinical
information such as health records,
diagnostic imaging, and labs.
The SLS appears to be an important
tool used by healthcare institutions to
measure their individual performance,
but other measures of performance that
these specific institutions used may
not have been accounted for in this
study. These include consulting reports, reports that were under development
or were not shared, and individual
unit-level scorecards not known by
contacts at the hospital. Exclusion of
these documents is unlikely to represent a serious bias to this study, but it
must be acknowledged, because the use of
a hospital-specific balanced scorecard
may be underestimated in our results.
This study is the first to examine the use of the balanced scorecard
among hospitals. Future assessments of
scorecard use are necessary to examine
the continuing impact on hospitals.
The study also presents a somewhat
contrasting perspective on earlier work
that focuses on individual organizations' own scorecards, suggesting
that features of scorecards and other
performance measurement tools may
have some relevance across different
organizations and that joint performance measurement activities should
not be neglected because of differences
in strategy across organizations. Further
work may usefully focus on interventions that may help hospitals integrate
common indicators into institutionspecific scorecards, on factors in healthcare systems that lead to or are associated with common sets of indicators,
and on the growth of the scorecards as
a measurement framework over time.
Within Ontario, ongoing research is
identifying the growth in the scorecard
as a measurement tool across the hospital system and describing similarities
in hospital strategies.
Note
1. To access the appendix, go to www.ache.org/pubs/jhmsub.cfm and scroll down.

Acknowledgments
Paula McColgan helped collect information on hospital scorecards. Special
thanks to Carey Levinton for his guidance and assistance in statistical analysis
and interpretation. The Hospital Report Project is supported by the Ontario
Hospital Association and the Ontario
Ministry of Health and Long-Term Care.
References
Baker, G. R., and G. H. Pink. 1995. "A Balanced Scorecard for Canadian Hospitals." Healthcare Management Forum 8 (4): 7-21.
Baker, G. R., G. M. Anderson, A. D. Brown, I. McKillop, M. Murray, and G. H. Pink. 1999. Hospital Report '99: A Balanced Scorecard for Ontario Hospitals. Toronto, ON: Ontario Hospital Association.
Baker, G. R., N. Brooks, G. Anderson, A. Brown, I. McKillop, M. Murray, and G. H. Pink. 1998. "Healthcare Performance Measurement in Canada: Who's Doing What?" Hospital Quarterly 2 (2): 22-26.
Canadian Institute for Health Information. 2003. Hospital Report 2002: Acute Care. Toronto, ON: Ontario Hospital Association and Government of Ontario.
Castaneda-Mendez, K., K. Mangan, and A. M. Lavery. 1998. "The Role and Application of the Balanced Scorecard in Healthcare Quality Management." Journal for Healthcare Quality 20 (1): 10-13.
Chan, Y. C-L., and S-J. K. Ho. 2000. "The Use of Balanced Scorecards in Canadian Hospitals." Unpublished paper, Michael G. DeGroote School of Business, McMaster University, Hamilton, Ontario.
Chan, Y. C-L., and B. E. Lynn. 1998. "Operating in Turbulent Times: How Ontario's Hospitals Are Meeting the Current Funding Crisis." Health Care Management Review 23 (3): 7-18.
Curtright, J. W., S. C. Stolp-Smith, and E. S. Edell. 2000. "Strategic Performance Management: Development of a Performance Measurement System at the Mayo Clinic." Journal of Healthcare Management 45 (1): 58-68.
Forgione, D. A. 1997. "Health Care Financial and Quality Measures: International Call for a 'Balanced Scorecard' Approach." Journal of Health Care Finance 24 (1): 55-58.
Gordon, D., M. Carter, H. Kunov, A. Dolan, and E. Chapman. 1998. "A Strategic Information System to Facilitate the Use of Performance Indicators in Hospitals." Health Services Management Research 11 (2): 80-91.
Harber, B. 1998. "The Balanced Scorecard Solution at Peel Memorial Hospital." Hospital Quarterly 1 (4): 59-61, 63.
Jones, M. L. H., and S. J. Filip. 2000. "Implementation and Outcomes of a Balanced Scorecard Model in Women's Services in an Academic Health Care Institution." Quality Management in Health Care 8 (4): 40-51.
Kaplan, R. S., and D. P. Norton. 1992. "The Balanced Scorecard: Measures that Drive Performance." Harvard Business Review 70 (1): 71-79.
Kaplan, R. S., and D. P. Norton. 2001. The Strategy-Focused Organization. Boston: Harvard Business School Publishing Corporation.
Macdonald, M. 1998. "Using the Balanced Scorecard to Align Strategy and Performance in Long Term Care." Healthcare Management Forum 11 (3): 33-38.
Magistretti, A. I., D. E. Stewart, and A. D. Brown. 2002. "Performance Measurement in Women's Health: The Women's Health Report, Hospital Report 2001 Series, A Canadian Experience." Women's Health Issues 12 (6): 327-37.
McCunn, P. 1998. "The Balanced Scorecard . . . The Eleventh Commandment." Management Accounting 76 (11): 34-36.
Meliones, J. 2000. "Saving Money, Saving Lives." Harvard Business Review 78 (6): 57-62, 64, 66-67.
Oliveira, J. 2001. "The Balanced Scorecard: An Integrative Approach to Performance Evaluation." Healthcare Financial Management 55 (5): 42-46.
Pink, G. H., and T. J. Freedman. 1998. "The Toronto Academic Health Science Council Management Practice Atlas." Hospital Quarterly 1 (3): 26-34.
Rimar, S., and S. J. Garstka. 1999. "The 'Balanced Scorecard': Development and Implementation in an Academic Clinical Department." Academic Medicine 74 (2): 114-22.
Sahney, V. 1998. "Balanced Scorecard as a Framework for Driving Performance in Managed Care Organizations." Managed Care Quarterly 6 (2): 1-8.
Zelman, W. N., G. H. Pink, and C. B. Matthias. 2003. "Use of the Balanced Scorecard in Health Care." Journal of Health Care Finance 29 (4): 1-16.

PRACTITIONER APPLICATION

Mimi P. Lowi-Young, FACHE, FCCHSE, vice president, Central Canada, and executive director, Ontario Division, Canadian National Institute for the Blind

Since the inception of balanced scorecards, there has been limited research to
understand the link between strategy and performance measurements as well
as the usefulness and applicability of the indicators to individual institutions. The
popularity of balanced scorecards has increased in the hospital sector over the past
few years. The focus on accountability to funders (especially when the hospitals are
funded at 85 percent from a single payer) and to the patients and the community
served by the hospital necessitated the development of a framework to measure
performance. The system-level scorecard provides that overall framework, including
the relevant quadrants regardless of the size or type of hospital. The data can also
serve to provide benchmarking opportunities.
