
Focus On Quality

By Jane Metzger, Emily Welebob, David W. Bates, Stuart Lipsitz, and David C. Classen
doi: 10.1377/hlthaff.2010.0160

Mixed Results In The Safety Performance Of Computerized Physician Order Entry

HEALTH AFFAIRS 29, NO. 4 (2010): 655–663
©2010 Project HOPE—The People-to-People Health Foundation, Inc.

Jane Metzger (jmetzger2@csc.com) is a principal researcher at CSC Healthcare in Waltham, Massachusetts. Emily Welebob is an independent consultant in Indianapolis, Indiana. David W. Bates is division chief for general internal medicine at Brigham and Women's Hospital in Boston, Massachusetts. Stuart Lipsitz is a researcher at Brigham and Women's Hospital. David C. Classen is an associate professor of medicine at the University of Utah in Salt Lake City, and is also with CSC Healthcare.

ABSTRACT Computerized physician order entry is a required feature for hospitals seeking to demonstrate meaningful use of electronic medical record systems and qualify for federal financial incentives. A national sample of sixty-two hospitals voluntarily used a simulation tool designed to assess how well safety decision support worked when applied to medication orders in computerized order entry. The simulation detected only 53 percent of the medication orders that would have resulted in fatalities and 10–82 percent of the test orders that would have caused serious adverse drug events. It is important to ascertain whether actual implementations of computerized physician order entry are achieving goals such as improved patient safety.

Many people have suggested that electronic health records represent essential infrastructure for the provision of safe health care in the United States. For several years, the Institute of Medicine, the Leapfrog Group, the National Quality Forum, and other national groups concerned about patient safety have recommended, in particular, widespread adoption of electronic health records with computerized physician order entry.1–4

Background On Decision-Support Tools

With computerized physician order entry, physicians and other licensed clinicians write their orders for hospitalized patients electronically. This recommendation is based in large part on demonstrations by pioneering organizations. The organizations have shown that important improvements in safety can be achieved when rules-based decision support aids in averting medication errors and adverse events by providing advice and warnings as physicians write orders using a specially programmed computer.5–7

In this application of clinical decision support, physicians are made aware of potential safety issues that can result—for example, when ampicillin is given to a patient with a known allergy to penicillin, or the dose being ordered for a pediatric patient is much higher than the therapeutic range for a child of this age and weight. Prescribing errors such as these can lead to anaphylaxis or seizures, which are known as adverse drug events, if the medications are actually administered. The goal of medication safety decision support in computerized physician order entry is to prevent these types of serious errors as the orders are being written.

A study demonstrated that one in ten patients hospitalized in Massachusetts suffered an adverse drug event that could be prevented by decision-support tools in computerized physician order entry.8 The study spurred the passage of legislation requiring Massachusetts hospitals to implement computerized physician order entry by 2012 as a condition of licensure.9

More recently, "meaningful use" of computerized physician order entry and clinical decision support has been singled out as a requirement for hospitals to qualify for new financial incentives. These incentives will be offered under the




health information technology (IT) stimulus provisions of the American Recovery and Reinvestment Act (ARRA) of 2009.

Computerized physician order entry interacts with other applications in the suite of digital tools that constitute the inpatient electronic health record (for example, to obtain information on allergies and patients' weight) and is typically one of the later modules to be implemented. Hospitals' adoption of the computerized physician order entry module is increasing, but slowly.10,11 Several reports have suggested that the successful application of decision support achieved among pioneering organizations is not being replicated and that implementation can create new problems.12–15 This report summarizes results for sixty-two hospitals across the United States that used a new simulation tool to assess their use of medication safety decision support in electronic health records with computerized physician order entry.

Study Data And Methods

History Of The Assessment Tool The impetus for developing the assessment tool was initially the standard developed by the Leapfrog Group. This is an employer group that seeks to accomplish breakthroughs, or "big leaps," in hospital patient safety through a combination of public awareness and rewards to higher-quality providers. The group selected computerized physician order entry as one of the first three leaps in 2001. There was accumulating evidence concerning the frequency, tragic consequences, and financial costs of adverse drug events in hospitalized patients—and computerized physician order entry and decision support had demonstrated the ability to help avert many of them.16

The Leapfrog standard includes two elements of meaningful use to ensure that computerized physician order entry has been implemented in such a way as to improve medication safety. According to the standards, physicians and other licensed providers must enter at least 75 percent of medication orders using computerized entry. Clinical decision support must also be able to avert at least 50 percent of "common, serious prescribing errors."16

Clinical decision support in this setting is the logic built into the computerized physician order entry system that, for example, checks to see if ampicillin has been ordered for a patient who is known to be allergic to penicillin. This tool was developed to specifically measure the ability of implemented electronic health record systems with computerized physician order entry to detect and avert these common yet serious prescribing errors in "live" hospital settings.

The tool was intentionally designed to give individual hospitals detailed and specific feedback on their performance and to give purchasers, through Leapfrog, an overall score for the hospital that can be used for benchmarking purposes.17,18

This assessment complements efforts by the Certification Commission for Health Information Technology (CCHIT) to evaluate the capabilities available in vendors' electronic medical record products "on the shelf." It evaluates how products were implemented and are actually being used in hospitals.

The development of the tool was initially funded by the Robert Wood Johnson Foundation, the California HealthCare Foundation, and the Agency for Healthcare Research and Quality, and was completed in 2006. In April 2008 the assessment was incorporated into the Leapfrog Annual Safe Practices Survey for the first time, and hospitals completing the assessment received a feedback report. Beginning in 2009, assessment results were also factored into determining the extent to which the computerized physician order entry implementation met the Leapfrog standard.

Design Of The Assessment Tool The assessment methodology is modeled after tools that are commonly used in other industries. It mimics what happens when a physician writes an order for an actual patient in the implemented electronic health record with computerized physician order entry. But it uses test patients—in effect, fictitious patients created for purposes of the assessment—and test orders.

A group of experts on adverse drug events, as well as the use of decision support in computerized physician order entry to decrease adverse drug events, developed test orders that are judged likely to cause serious harm (rather than those with low potential for harm). The test orders belong to the categories of adverse drug events (such as drug-to-allergy or drug-to-diagnosis contraindication) that prior research shows cause the most harm to patients. In most cases, they are actual orders that have caused adverse drug events, taken from primary adverse drug event data collection studies. The assessment offers a one-time, cross-sectional look at whether decision support provides advice to a physician writing such an order.18,19

Decision support for this purpose is a set of tools or logic that can be integrated into the computerized physician order entry system to suggest appropriate orders (such as a dose calculator or a reminder to consider renal function) or to critique them once they have been entered, as through a message or an alert. The assessment gives credit for all of these forms of decision support as relevant advice or information.
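The kind of rules-based check described above, flagging ampicillin ordered for a patient with a penicillin allergy or a dose far above a weight-based ceiling, can be sketched in a few lines. This is an illustrative toy, not the assessment tool or any vendor's logic; the drug names, cross-sensitivity table, and dose limit below are hypothetical examples, not clinical reference data.

```python
# Illustrative sketch of a rules-based medication order check.
# The cross-sensitivity table and dose ceilings are invented examples.

# An allergy to the key also contraindicates the drugs in the value set.
CROSS_SENSITIVE = {"penicillin": {"ampicillin", "amoxicillin"}}

# Hypothetical per-kilogram daily dose ceilings (mg/kg/day).
MAX_DAILY_DOSE_MG_PER_KG = {"ampicillin": 200}

def check_order(patient, order):
    """Return a list of alert strings for one medication order (empty = no alert)."""
    alerts = []
    drug = order["drug"]
    # Drug-to-allergy contraindication, including cross-sensitivity.
    for allergy in patient["allergies"]:
        if drug == allergy or drug in CROSS_SENSITIVE.get(allergy, set()):
            alerts.append(f"ALLERGY: {drug} contraindicated (allergy to {allergy})")
    # Weight-based dose-range check.
    limit = MAX_DAILY_DOSE_MG_PER_KG.get(drug)
    if limit is not None and order["daily_dose_mg"] > limit * patient["weight_kg"]:
        alerts.append(f"DOSE: {order['daily_dose_mg']} mg/day exceeds {limit} mg/kg/day")
    return alerts

if __name__ == "__main__":
    child = {"allergies": ["penicillin"], "weight_kg": 20}
    order = {"drug": "ampicillin", "daily_dose_mg": 6000}
    for alert in check_order(child, order):
        print(alert)
```

An order triggering both rules would return two alerts; a safe order returns an empty list, meaning no message is shown to the prescriber.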



Because of differences in the epidemiology of preventable adverse drug events, different versions of the assessment were designed for adult and pediatric inpatient settings.

Use Of The Assessment Tool A designated team in the hospital performs a self-assessment in the following fashion. First, the team downloads instructions and information profiles for ten to twelve test patients. Then the team downloads around fifty test orders, instructions, and observation sheets to be used in the assessment. A participating physician enters test orders for the test patients into the local electronic health record and observes and notes any guidance provided by decision support, such as the calculated dose, a message or alert displayed, and so on. The team enters the results obtained for each test order (decision support received or not).17 The assessment tool instantly computes an overall score (percentage of test orders identified), as well as the score for the orders in each adverse drug event category. It then displays results to the testing team. The entire process takes no more than six hours, and many hospitals complete it more quickly.

During the development period (2002–6), multiple testing was performed at more than twenty-five different hospitals. The testing reflected all of the leading electronic health record and computerized physician order entry vendors' products and ensured that the test could effectively evaluate each product. Details of the reliability and validity of the assessment methodology are available in the Online Appendix.20

Analysis

Between April and August 2008, eighty-one U.S. hospitals completed the version of the assessment for adult patients. Test orders and posted results were reviewed, and information potentially identifying patients was removed. Nine hospitals were eliminated because registration information indicated that computerized physician order entry was being used only in the emergency department rather than more broadly in the hospital.

In addition, ten hospitals were excluded because they exceeded a deception-analysis threshold based on standard gaming detection strategies (testing irregularities such as exceeding time limits or multiple false positive results). A review of excluded hospitals did not reveal results that would have skewed the findings in this study. The resulting sample contained sixty-two hospitals.

The categories of adverse drug events addressed by the test orders include ones for which decision-support tools are fairly straightforward to implement (drug-to-drug or drug-to-allergy interactions, therapeutic duplication, inappropriate single dose, and inappropriate route of administration). The assessment also included test orders that require more effort to configure or customize—for example, use of an inappropriate daily dose or weight-based dose, drug-to-age or drug-to-diagnosis contraindication, contraindications based on renal status or other metabolic abnormalities indicated by laboratory tests, or lack of monitoring.

The framework for basic and advanced decision support was based on prior work on clinical decision support in computerized physician order entry,21 and the categories are consistent with recent research on preventable adverse drug events.8

Statistical Analysis

The basic unit for the analyses was the hospital. Overall scores for test orders and the scores for orders assigned to the "basic" group were both found to be approximately normally distributed. However, scores for test orders in the "advanced" group were slightly right-skewed. Thus, we present additional statistics to aid in interpretation. A fuller description of our analysis is available in the Online Appendix.20

Tests based on standard assumptions about the population, such as through parametric tests, gave results almost identical to those of tests that made no such assumptions (nonparametric tests). Thus, for simplicity, parametric tests are displayed (see the Online Appendix for details).20

Because the total hospital scores were approximately normally distributed, we assumed a linear regression model with the overall score as the dependent variable (estimating the relationships between the outcome and covariates using ordinary least squares regression). Covariates that we identified as having a possible influence on total hospital score were vendor (there were nine), teaching status, hospital size groupings (number of beds), and whether or not the hospital was part of a health system.

We tested for appropriateness of our linear regression model using techniques that are described in the Online Appendix.20 We also used standard statistical techniques to determine the percentage of the overall variation explained by each factor.22

For dichotomous outcome variables,




such as test order detected (yes, no), percentages were calculated. However, to account for the hierarchical nature of the order data, or the test orders nested within hospitals, the 95 percent binomial confidence intervals for percentages were adjusted using the approach described by David Williams.23

All analyses were conducted using the statistical software package SAS 9.2. All tests were two-tailed, and a p value less than 0.05 (therefore not likely to be due to chance) was considered statistically significant.

Study Results

The types of hospitals included in the sample were broadly representative of larger U.S. hospitals (Exhibit 1). Nearly two-thirds were teaching hospitals. The higher representation of teaching hospitals and lower representation of smaller hospitals is consistent with the current pattern of adoption of computerized physician order entry in hospitals.10,11 Among the sixty-two hospitals, all but one reported using electronic health record applications including computerized physician order entry from one of seven commercial vendors.

Individual Hospitals Scores for individual hospitals ranged from 10 percent to 82 percent of test orders detected. The scores for the top 10 percent, or six hospitals, ranged from 71 percent to 82 percent. Scores for the six hospitals with the lowest scores ranged from 10 percent to 18 percent (Exhibit 2).

Mean hospital scores (Exhibit 3) were higher for orders that would lead to adverse drug events that can be addressed by basic decision support (61 percent) than for those requiring more advanced decision support (25 percent).

Aggregate Results: Adverse Drug Events When results for all hospitals were pooled, the adverse drug event category detected most reliably was drug-to-allergy contraindication. Much higher scores were obtained for each of the categories addressed by basic clinical decision support than for those requiring advanced tools (Exhibit 4).

Drug-to-diagnosis contraindication includes pregnancy, which was also analyzed separately. These potential adverse drug events were only detected 15 percent of the time.

The set of test orders for each hospital includes four that are judged to result in patient fatality. When results for this subset were analyzed, we found that 47 percent (95 percent confidence interval: 36.9–57.6) were not detected by the decision support in use in these hospitals. Although hospitals do have pharmacy and nursing review processes in place that sometimes catch orders like these before the medication reaches the patient, these medication orders are far outside safe limits and would never be appropriate physician orders.

Contributing Factors The information available for exploring contributing factors was limited to the vendor software solution in use, teaching status, hospital size by number of beds, and whether or not the hospital was part of a health system. We assessed the relationship between performance on the assessment and these factors.

High-low scores for hospitals using the same

EXHIBIT 1

Characteristics Of Hospitals Participating In The Study Of Computerized Physician Order Entry (CPOE), 2008

                              Participating hospitals
Characteristic                Number   Percent   All U.S. hospitals (%)   Hospitals with CPOE (%)

HOSPITAL SIZE (BEDS)
<50                           0        0         32.9                     8.9
50–99                         3        4.8       15.9                     11.8
100–199                       14       22.6      22.4                     12.8
200–299                       8        13        12.7                     15.3
300–399                       9        15        7.7                      30.1
400–799                       23       37        7.3                      33.9
800+                          5        8         0.9                      41.2

TEACHING STATUS
Teaching                      39       63        21.7                     –
Nonteaching                   23       37        78.3                     –

SYSTEM STATUS
Independent                   15       24        45                       –
Multihospital health system   47       76        55                       –

SOURCES: Authors' analysis; and Notes 10 (for hospital size data) and 11 (for teaching and system status) in text.

EXHIBIT 2

Hospital Scores For Detection Of Test Orders That Would Cause An Adverse Drug Event In An Adult Patient According To The Software Product (Vendor) Implemented

[Figure: hospital score (percent of test orders detected) for each hospital, grouped by vendor; not reproducible in text.]

SOURCE Authors' analysis.

computerized physician order entry software product ranged by as much as 40–65 percent (Exhibit 2). Some hospitals using each product detected at least 50 percent of the potential adverse drug events. The six top-performing hospitals used six different software products: one homegrown solution and five vendor products.

In a multiple regression model, vendor choice was significantly correlated with performance (p = 0.009, or not likely to be due to chance). This means that there is good statistical evidence to suggest that choice of vendors does have some positive effect on performance. However, vendor choice accounted for only 27 percent of the total variation that we observed in performance.

EXHIBIT 3

Hospital Mean Scores For Detecting Test Orders For Adult Patients Corresponding To Adverse Drug Event Categories Addressed By Basic And Advanced Decision Support

Percent of test orders detected

Adverse drug event categories grouped according to
level of clinical decision support                        Mean (SE)    Median   Interquartile range
Basic—relatively easy to implement                        61.4 (2.4)   61.1     53.9–76.2
Advanced—requires more configuration or customization     24.8 (2.6)   18.8     5.9–38.9
Overall score                                             44.3 (2.3)   41.7     31.6–57.1

SOURCE Authors' analysis. NOTES N = 62 hospitals. SE is standard error. Interquartile range is the range in which the middle 50 percent of the observations are seen—25 percent below the median, 25 percent above. p < 0.0001 using a paired t-test comparing basic and advanced scores. Basic categories: drug-to-drug or drug-to-allergy contraindication, inappropriate single dose, therapeutic duplication, and inappropriate route. Advanced categories: inappropriate cumulative (daily) dose, inappropriate dose (patient weight), age or diagnosis contraindication, contraindication based on renal function or other condition indicated by laboratory tests, lack of monitoring.
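The scoring described under "Use Of The Assessment Tool", an overall percentage of test orders for which advice or an alert appeared, plus a percentage per adverse drug event category, summarized across hospitals as in Exhibit 3, amounts to simple tallying. A minimal sketch with fabricated data (the field names and numbers are invented for illustration):

```python
# Sketch of the assessment's scoring arithmetic; data shapes are assumptions
# for illustration, and all numbers are made up.
from statistics import mean, median

def hospital_score(results):
    """Overall score: percent of test orders for which decision support fired."""
    return 100 * sum(r["detected"] for r in results) / len(results)

def category_scores(results):
    """Percent detected within each adverse drug event category."""
    by_cat = {}
    for r in results:
        by_cat.setdefault(r["category"], []).append(r["detected"])
    return {cat: 100 * sum(v) / len(v) for cat, v in by_cat.items()}

# Fabricated example: one hospital's results for four test orders.
results = [
    {"category": "drug-allergy", "detected": True},
    {"category": "drug-allergy", "detected": True},
    {"category": "drug-diagnosis", "detected": False},
    {"category": "inappropriate dose", "detected": True},
]
print(hospital_score(results))                   # 75.0
print(category_scores(results)["drug-allergy"])  # 100.0

# Summarizing scores across hospitals, as in Exhibit 3 (scores made up).
scores = [61.0, 44.0, 18.0, 75.0, 52.0]
print(mean(scores), median(scores))              # 50.0 52.0
```

The median and interquartile range reported alongside the means matter here because, as noted in the methods, the advanced-category scores were right-skewed.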


EXHIBIT 4

Pooled Hospital Scores For Detecting Test Orders For Adults In Various Adverse Drug Event Categories

                                          Percent     Lower 95 percent      Upper 95 percent
Adverse drug event category               detected    confidence interval   confidence interval

ADDRESSED BY BASIC CLINICAL DECISION SUPPORT
Drug-allergy contraindication             83.3        77.7                  87.8
Inappropriate single dose                 46.4        37.6                  56.6
Therapeutic duplication                   54.5        43.7                  64.9
Drug-drug interaction                     52.4        43.4                  61.3
Inappropriate route                       65.3        55.7                  72.5

ADDRESSED BY ADVANCED CLINICAL DECISION SUPPORT
Inappropriate cumulative (daily) dose     39.1        28.9                  50.4
Inappropriate dosing (patient weight)     36.7        27.9                  46.4
Age contraindication                      14.1        7.9                   24.0
Labs—creatinine                           20.2        12.9                  30.1
Labs—other                                26.1        18.7                  35.1
Drug-diagnosis contraindication           15.0        9.9                   22.1
Corollary orders (monitoring)             27.0        19.7                  35.7

SOURCE Authors' analysis. NOTES N = 62 hospitals. For the basic categories, implementation of applicable decision support is relatively straightforward; for the advanced categories, it requires more effort to configure or customize.
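The confidence intervals in Exhibit 4 account for test orders being nested within hospitals (the Williams adjustment cited in the methods). The exact procedure is not reproduced here, but a standard cluster-sampling variance estimator for a pooled proportion conveys the idea; the per-hospital counts below are fabricated.

```python
# Sketch: 95% CI for a pooled proportion with orders clustered within
# hospitals, via a standard cluster-sampling (ratio-estimator) variance.
# This illustrates the adjustment's purpose; it is NOT the exact Williams
# method the authors used. Data are fabricated (detected, total) pairs.
import math

def clustered_proportion_ci(counts, z=1.96):
    """counts: list of (detected, n_orders) per hospital."""
    k = len(counts)
    total_d = sum(d for d, n in counts)
    total_n = sum(n for d, n in counts)
    p = total_d / total_n
    # Between-hospital variance of the ratio estimator.
    var = (k / (k - 1)) * sum((d - p * n) ** 2 for d, n in counts) / total_n ** 2
    half = z * math.sqrt(var)
    return p, max(0.0, p - half), min(1.0, p + half)

counts = [(30, 50), (10, 50), (45, 50), (25, 50)]
p, lo, hi = clustered_proportion_ci(counts)
print(round(100 * p, 1), round(100 * lo, 1), round(100 * hi, 1))  # 55.0 26.7 83.3
```

Treating the 200 orders as independent would give a far narrower interval; the hospital-to-hospital spread is what widens it, which is why the adjustment matters for the pooled percentages above.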

Teaching status also correlated significantly with performance (p = 0.007, very unlikely to be due to chance), accounting for 10 percent of the observed variation in performance. But hospital size and being part of a hospital system did not correlate significantly with performance. At least as far as we can detect statistically, hospital size and being part of a hospital system do not influence performance one way or the other.

We tested for interactions between vendor and all other variables, and none were significant (p > 0.2). Finally, although we were able to account for only 39 percent of the observed variation in performance (overall R² = 0.39), standard tests (goodness-of-fit statistics and assessments) indicated that our model was appropriate for our study, as detailed in the Online Appendix.20

Discussion

Many of the benefits from inpatient electronic health records and the computerized physician order entry module in particular come from decision support.24 In this study, we found wide variation in the ability of implemented computerized physician order entry decision support to detect medication orders judged likely to cause serious harm to adult patients.

Many hospitals performed poorly, and the mean score was only 44 percent of potential adverse drug events detected. However, top-performing hospitals in this sample achieved scores of 70–80 percent or greater. To achieve these higher results, the hospitals have implemented advanced clinical decision support, as well as basic tools, and their performance provides a benchmark for the continuing efforts of lower-performing hospitals.

A comparison of aggregate scores with the findings from a recent study of adverse drug events, based on chart reviews in six community hospitals not using the computerized physician order entry module of the inpatient electronic health record,8 showed that decision support in the sixty-two hospitals is doing a much better job detecting adverse drug events that occur infrequently than those that occur more frequently. The categories in our study with the three highest detection rates—drug-to-allergy, drug-to-drug, and duplicate medication—together only contributed 7 percent of the adverse drug events in the community hospital chart review study. Fifty-five percent of the adverse drug events in that study required considering laboratory results or patients' age in determining medication appropriateness or dosing. For these categories in aggregate, corresponding test orders were detected only 20.6 percent of the time—not very good odds from the patients' perspective.

Different Products In Use The high variability found, including among hospitals using the same electronic health record software product, emphasizes the importance of this type of assessment to gauge the extent to which decision support is being used. In fact, this is the rationale used by the Leapfrog Group and the National Quality Forum in including the use of the assessment tool in their recommended safe practices.4,16

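The regression behind these figures, overall score on dummy-coded vendor, teaching status, and the other covariates, with an overall R² of 0.39 and a share of variation attributed to each factor, can be sketched with ordinary least squares. The authors used SAS 9.2; the sketch below uses fabricated data, fewer covariates, and attributes each factor's share as its incremental R² when added last, one common (but not the only) way to decompose explained variation.

```python
# Sketch of the analysis described above: OLS of overall hospital score on
# dummy-coded covariates, with R-squared and each factor's incremental
# contribution. All data are fabricated; this is not the authors' SAS model.
import numpy as np

rng = np.random.default_rng(0)
n = 62
vendor = rng.integers(0, 4, n)      # 4 hypothetical vendors (the study had 9)
teaching = rng.integers(0, 2, n)    # teaching status, 0/1
score = 30 + 8 * vendor + 10 * teaching + rng.normal(0, 10, n)

def design(cols):
    """Intercept plus the requested dummy-coded covariate columns."""
    blocks = [np.ones((n, 1))]
    if "vendor" in cols:  # drop one vendor level as the reference category
        blocks.append((vendor[:, None] == np.arange(1, 4)).astype(float))
    if "teaching" in cols:
        blocks.append(teaching[:, None].astype(float))
    return np.hstack(blocks)

def r_squared(cols):
    X = design(cols)
    beta, *_ = np.linalg.lstsq(X, score, rcond=None)
    resid = score - X @ beta
    return 1 - resid @ resid / ((score - score.mean()) ** 2).sum()

full = r_squared(["vendor", "teaching"])
print("overall R2:", round(full, 2))
print("vendor share:", round(full - r_squared(["teaching"]), 2))
print("teaching share:", round(full - r_squared(["vendor"]), 2))
```

Because the models are nested, each factor's share is guaranteed to be non-negative, and the shares plus the unexplained remainder need not sum neatly when covariates are correlated, which is one reason the reported shares (27 percent, 10 percent) fall short of the overall 39 percent.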


Hospital boards and executive teams who are investing in these systems, practicing physicians who adopt the technology in their routine work to improve patient safety, and external stakeholders requiring the use of electronic health records with computerized physician order entry or providing incentives for adoption all need a way to gauge progress in applying decision support to improve medication safety. This becomes acutely important as hospitals attempt to meet meaningful-use requirements to qualify for federal stimulus incentives.

Assessing Patient Safety Given the large investment in achieving meaningful use nationwide, it will be important to ascertain whether actual implementation is achieving important goals such as improved patient safety. To our knowledge, this test is the first such objective evaluation of electronic health record systems in actual use.

The assessment measures the extent to which clinical decision support is providing some form of advice or an alert in response to medication orders that would cause an adverse drug event. The design and scope of decision-support capabilities in electronic health record software products vary, and local hospital configuration and customization are always required.

Although there was a relationship between the product involved and performance on the assessment, only 27 percent of the variation is associated with using different electronic health record products. This suggests that other factors had a bigger influence on the wide variation in performance among the hospitals. Although results are presented for one homegrown system, this is not a sufficient sample from which to draw any general conclusions.

In most hospitals, the use of decision support grows over time. At the outset, managing it is a new process.6 Including too much, poorly designed clinical decision support initially can even contribute to lack of computerized physician order entry adoption.25

Exploring Variability In Use Several factors are contributing to the variability observed. These include the completeness and ease of applying the decision-support tool sets in the electronic health record software; relevant knowledge and experience within the hospital; the availability and commitment of staff resources; and whether the use of clinical decision support began recently or has been honed over many years.

There are multiple possible explanations for the observed correlation between hospital teaching status and performance. These include such factors as research interest and having more staff resources to invest. However, these could not be explored further in this study.

For specific categories of adverse drug events, there are also multiple possible explanations for variability among hospitals. For drug-to-diagnosis contraindications, for example, many hospitals are likely not yet applying decision support because there is no constantly updated electronic problem list maintained by physicians for patients during their hospital stay.

The assessment also does not provide insight into whether or not the information or advice being presented is followed in practice—another prerequisite for improving medication safety.26 A better understanding of all of these factors is needed so that strategies for speeding progress can be devised.

Some reports have suggested one remedy: providing assistance to hospitals in using decision support effectively.3,21,27 This assessment tool provides guidance to the hospital in the form of a scorecard. Several publications exist to guide hospitals in their planning for and decisions about clinical decision support.21,26,28,29 Also, some vendors of computerized physician order entry software provide clinical decision support starter sets or facilitate sharing among customers. Work is now under way to begin to develop libraries of decision-support rules and practical advice, although it is in early stages.

Study Limitations This study has several limitations. Like all voluntary surveys, the study is subject to possible response bias. It is likely that the hospitals completing the assessment were more interested than many of their peers were in clinical decision support because it was initially offered as a learning exercise instead of being required for the annual Leapfrog survey.

Another limitation is that the sixty-two hospitals in the study represent about 8 percent of U.S. hospitals with the computerized physician order entry module of the inpatient electronic health record in use.10 Thus, they might not be representative. The fact that this is the first test of its kind means that there is no "gold standard"


for comparison of these results. However, during development of the assessment tool, the reliability and validity were extensively evaluated and found to be very high.

Finally, because results are self-reported, hospitals may have reported better performance than was actually assessed. However, it should be noted that public reporting of overall scores, which could influence hospitals' behavior, was not implemented until 2009, and hospitals were asked to participate in part to aid them with improving their decision support. To address the potential for gaming, several standard safeguards were built into the test to detect patterns of use that suggest that gaming may be going on. Any assessment that included these characteristics was excluded from analysis.

Conclusions

In sixty-two hospitals using an inpatient electronic health record with computerized physician order entry, we found significant variability in the use of decision support to detect and provide advice or an alert concerning a medication order that would result in serious harm to an adult patient. Some hospitals performed very well, while others performed very poorly. In addition, the studied hospitals as a group were using basic decision support far more than the more advanced tools needed to detect types of orders that are major contributors to adverse drug events in chart-review studies.

These findings point to the importance of evaluations of the use of clinical decision support by hospitals to help guide their continuing efforts to improve medication safety. In addition, incentives to hospitals relating to computerized physician order entry should include some type of demonstration that clinical decision support is actually being employed, instead of being based solely on whether computerized physician order entry is in use.

The broader use of this type of assessment of meaningful electronic health record use should be explored for other software applications used in direct clinical care. ▪

The authors thank Peter Kilbridge, Fran Turisco, and the project expert advisers, as well as individuals in many hospitals, health systems, and software vendor companies, for their involvement and assistance in some phase of the development and testing of the assessment tool used in the study. They also thank the Leapfrog Group for access to the data for analysis. The development of the assessment tool used in this study was supported by grants from the California HealthCare Foundation, the Robert Wood Johnson Foundation, and the Agency for Healthcare Research and Quality (Contract no. 290-04-0016, Subcontract no. 6275-FCG-01). These organizations did not sponsor and were not involved in the reported analysis or preparation of the manuscript. Four of the authors (Jane Metzger, Emily Welebob, David W. Bates, and David C. Classen) were involved in the development of the assessment tool. The content is solely the responsibility of the authors and does not necessarily represent the official views of the funding agency.

NOTES
1 Institute of Medicine. Preventing medication errors: quality chasm series. Washington (DC): National Academies Press; 2007.
2 Institute of Medicine. Crossing the quality chasm: a new health system for the 21st century. Washington (DC): National Academies Press; 2001.
3 Aspden P, Corrigan JM, Wolcott J, Erickson SM. Patient safety: achieving a new standard of care. Washington (DC): National Academies Press; 2003.
4 National Quality Forum. Safe practices for better healthcare: a consensus report [Internet]. Washington (DC): National Quality Forum; 2003 [cited 2010 Feb 21]. Available from: http://www.ahrq.gov/qual/nqfpract.pdf
5 Bates DW, Leape LL, Cullen DJ, Laird N, Petersen LA, Teich JM, et al. Effect of computerized physician order entry and a team intervention on prevention of serious medication errors. JAMA. 1998;280(15):1311–6.
6 Evans RS, Pestotnik SL, Classen DC, Clemmer TP, Weaver LK, Orme JF Jr., et al. A computer-assisted management program for antibiotics and other anti-infective agents. N Engl J Med. 1998;338(4):232–8.
7 Teich JM, Merchia PR, Schmiz JL, Kuperman GJ, Spurr CD, Bates DW. Effects of computerized physician order entry on prescribing practices. Arch Intern Med. 2000;160(18):2741–7.
8 Adams M, Bates DW, Coffman G, Everett W. Saving lives, saving money: the imperative for computerized physician order entry in Massachusetts hospitals [Internet]. Boston (MA): Massachusetts Technology Collaborative and New England Healthcare Institute; 2008 [cited 2010 Feb 21]. Available from: http://web3.streamhoster.com/mtc/cpoe20808.pdf
9 Getz L. On the right track in Massachusetts. For the Record [serial on the Internet]. 2008 Nov 11;20(23) [cited 2010 Feb 21]. Available from: http://www.fortherecordmag.com/archives/ftr_111008p16.shtml
10 Pedersen CA, Gumpper KF. ASHP survey on informatics: assessment of the adoption and use of pharmacy informatics in U.S. hospitals—2007. Am J Health Syst Pharm. 2008;65:2244–64.
11 American Hospital Association. Continued progress: hospital use of information technology [Internet]. Chicago (IL): AHA; 2007 [cited 2010 Feb 21]. Available from: http://www.aha.org/aha/content/2007/pdf/070227-continuedprogress.pdf
12 Nebeker JR, Hoffman JM, Weir CR, Bennett CL, Hurdle JF. High rates of adverse drug events in a highly computerized hospital. Arch Intern Med. 2005;165(10):1111–6.
13 Walsh KE, Landrigan CP, Adams WG, Vinci RJ, Chessare JB, Cooper MR, et al. Effect of computer order entry on prevention of serious medication errors in hospitalized children. Pediatrics. 2008;121(3):e421–7.
14 Koppel R, Metlay JP, Cohen A, Abaluck B, Localio AR, Kimmel SE, et al. Role of computerized physician order entry systems in facilitating medication errors. JAMA. 2005;293(10):1197–203.
15 Han YY, Carcillo JA, Venkataraman ST, Clark RS, Watson RS, Nguyen TC, et al. Unexpected increased mortality after implementation of a commercially sold computerized physician order entry system. Pediatrics. 2005;116(6):1506–12.
16 Leapfrog Group. Fact sheet: computerized physician order entry. Washington (DC): Leapfrog Group; 2009 Mar 3 [cited 2010 Feb 21]. Available from: http://www.leapfroggroup.org/media/file/FactSheet_CPOE.pdf
17 Metzger JB, Welebob E, Turisco F, Classen DC. The Leapfrog Group’s CPOE standard and evaluation tool. Patient Safety and Quality Healthcare [serial on the Internet]. 2008 Jul/Aug: p. 22–5 [cited 2010 Feb 21]. Available from: http://www.psqh.com/julaug08/cpoe.html
18 Kilbridge P, Welebob E, Classen DC. Overview of the Leapfrog Group evaluation tool for computerized physician order entry [Internet]. Lexington (MA): Leapfrog Group and First Consulting Group; 2001 Dec [cited 2010 Feb 21]. Available from: http://www.leapfroggroup.org/media/file/Leapfrog-CPOE_Evaluation2.pdf
19 Kilbridge PM, Welebob EM, Classen DC. Development of the Leapfrog methodology for evaluating hospital implemented inpatient computerized physician order entry systems. Qual Saf Health Care. 2006;15(2):81–4.
20 The Online Appendix can be accessed by clicking on the Online Appendix link in the box to the right of the article online.
21 Kuperman G, Bobb A, Payne TH, Avery AJ, Gandi TK, Burns G, et al. Medication-related clinical decision support in computerized provider order entry systems: a review. J Am Med Inform Assoc. 2007;14(1):29–40.
22 Lin DY, Wei LJ, Ying Z. Model-checking techniques based on cumulative residuals. Biometrics. 2002;58(1):1–12.
23 Williams DA. Extra-binomial variation in logistic linear models. Appl Stat. 1982;31:144–8.
24 Kaushal R, Shojania KG, Bates DW. Effects of computerized physician order entry and clinical decision support systems on medication safety: a systematic review. Arch Intern Med. 2003;163(12):1409–16.
25 Connolly C. Cedars-Sinai doctors cling to pen and paper. Washington Post. 2005 Mar 21. p. A01.
26 Bates DW, Kuperman GJ, Wang S, Gandi T, Kitter A, Volk L, et al. Ten commandments for effective clinical decision support: making the practice of evidence-based medicine a reality. J Am Med Inform Assoc. 2006;13:523–30.
27 Osheroff JA, Teich JM, Middleton BF, Steen EB, Wright A, Detmer DE. A roadmap for national action on clinical decision support. Bethesda (MD): American Medical Informatics Association, CDS Roadmap Steering Committee; 2006 Jun 13 [cited 2008 Sep 14]. Available from: https://www.amia.org/files/cdsroadmap.pdf
28 Osheroff JA, Pifer EA, Sittig DF, Jenders RA, Teich JM. Clinical decision support implementers’ workbook. 2nd ed. Chicago (IL): Health Information Management Systems Society; 2004 [cited 2010 Mar 9]. Available from: http://www.himss.org/cdsworkbook
29 Health Information and Management Systems Society. Improving outcomes with clinical decision support: an implementers’ guide. Chicago (IL): HIMSS; 2005.