February 2008
An Introduction to Evidence-Informed Public Health and
L. Buffett, K. Barnett
Contact:
D. Ciliska
ciliska@mcmaster.ca
905-525-9140
Production of this paper has been made possible through a financial contribution
from the Public Health Agency of Canada. The views expressed herein do not nec-
essarily represent the views of the Public Health Agency of Canada.
Summary
Purpose
This background paper defines and summarizes the concept of Evidence-Informed Public Health (EIPH), recognizing that, to use evidence in public health practice and policy development, one must first critically appraise the available research that provides the basis for that evidence.
This paper addresses the need for critical appraisal of primary research studies and systematic
reviews to inform effective public health practice. It also outlines a hierarchy of quality of research
evidence that can be used to inform public health policy and program delivery.
For that reason, this paper presents some of the more commonly used critical appraisal tools. These
tools provide basic guidelines and checklists for public health professionals to evaluate the quality of
research when reading the literature. Web links in the compendium that accompanies this paper will
direct users to some of the most current and usable tools.
Methods
Relevant literature collected and reviewed for this background paper comprises all literature (grey and
published) used for the environmental scan for the National Collaborating Centre for Methods and
Tools (Ciliska et al., 2006), including an update of that literature, and a review of relevant references
and websites.
Conclusions
The highest quality evidence available is vital to the interactive process of moving knowledge into
practice in the complex world of public health; however, the time constraints typically faced by public
health practitioners can preclude a consistent implementation of the principles of evidence-informed
decision-making. Critical appraisal provides an efficient method of reviewing evidence for its quality,
and is an important part of the process of evidence-informed practice and policy development. The
use of quality checklists and other tools can provide a systematic and effective means to help identify
rigorous studies with valid conclusions for potential implementation and assessment.
Introduction
Evidence-Informed Public Health (EIPH) depends on sound evidence. Although decisions to develop and implement new programs and services must be grounded in best practices, the methods and frameworks needed to translate evidence into practice are often considered time-consuming and difficult to understand.
Public health professionals live in a world of heavy workloads, inadequate staffing and insufficient
dedicated resources. Especially in the face of these realities, critical appraisal of existing evidence is
fundamental to the search for quality evidence to inform the process of public health decision-making.
Critical appraisal, as described by public health professionals within the Environmental Scan for the
National Collaborating Centre for Methods and Tools (NCCMT) (Ciliska et al., 2006), included the use
of users’ guides to assess the rigor/strength of research; standardized methods of quality assessment
of primary studies and reviews of evidence; and up-to-date and easy-to-use tools to rate the quality of
evidence/research. The compendium of tools can assist with this critical appraisal.
Primary Audience
Primary audiences for this paper include busy public health managers and policy-makers who may
have little or no experience in assessing qualitative or quantitative research.
Literature Search
The comprehensive search for published literature from 1996-2006 is described fully in the NCCMT
Environmental Scan (Ciliska et al., 2006). An update of the initial search was conducted in February
2007. Of the literature collected, 51 articles were retrieved for review. Appendix 3 of the Environmen-
tal Scan (Ciliska et al., 2006) provided a valuable list of relevant websites that were scanned in order
to identify current tools and literature. Personal databases of McMaster faculty who teach courses in
critical appraisal were reviewed to ensure the currency and relevance of available literature and web-
sites. Two other reviews that assessed critical appraisal tools, primarily from the perspective of the
systematic reviewer (Deeks et al., 2003; West et al., 2002), provided additional sources for consider-
ation.
Introduction to Evidence-Informed Public Health
EIPH is rooted in evidence-based medicine (EBM), a term coined by Guyatt et al. in 1992
(Cullum et al., 2008). Under Guyatt’s leadership, the Evidence-Based Medicine Work Group
published a series of articles for the Journal of the American Medical Association between
1993 and 2000 that outlined the criteria for evaluating current evidence to support clinical
decisions. These articles formed the basis of most existing critical appraisal tools. Acceptance
of EBM has grown substantially over the past fifteen years among nurses and other health
professionals including public health practitioners (Gandelman et al., 2006; Kohatsu et al.,
2004; Rychetnik & Wise, 2004).
EBM was subsequently expanded to evidence-based public health (EBPH), defined as "the process of integrating science-based interventions with community preferences to improve the health of populations" (Kohatsu et al., 2004).
Evidence-Informed Public Health builds on the ideas of EBM and EBPH, but acknowledges the many
factors, beyond simply the evidence, that influence decision-making. EIPH is a complex, multi-disci-
plinary process that occurs within dynamic and ever-changing communities and encompasses differ-
ent sectors of society.
EIPH has several distinct stages: define, search, appraise, synthesize, adapt, implement and evalu-
ate (see Table 1). At each step, there are resources and best practices that can inform and improve
the process. Methods and tools are available to help public health practitioners and policy makers
hone their EIPH skills.
Table 1. Stages in EIPH, with a description of each stage.
EIPH does not happen in a vacuum. Figure 1 illustrates the intersecting components of effective
public health decision-making, and offers a way to visualize the spectrum of factors to be considered in developing and providing the best public health interventions possible. Generic clinical expertise can provide a valuable understanding of the integration of the components required to make
effective clinical decisions (Dicenso et al., 2005). Effective evaluation of evidence to support public
health practice must address these multidimensional issues.
Three questions need to be answered for the purpose of critically appraising research specific to pub-
lic health practice (Rychetnik et al., 2002):
Critical appraisal tools or checklists can facilitate the process and help practitioners to readily under-
stand why an intervention may appear to be effective in one setting, but ineffective in another (Ry-
chetnik et al., 2002).
Table 2. Question types, with an example of each.
The following example of an application of the PICO formulation uses the effectiveness question from
Table 1: Can a multi-component obesity prevention program in a secondary school increase adoles-
cent physical activity?
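As an illustration only, that effectiveness question can be decomposed into the four PICO elements (Population, Intervention, Comparison, Outcome). The sketch below shows one way to structure such a question before searching; the field names and the "usual curriculum" comparison group are assumptions for demonstration, since the paper does not state a comparator:

```python
from dataclasses import dataclass

@dataclass
class PICOQuestion:
    """One way to hold the four PICO elements of a searchable question."""
    population: str    # P: who the question is about
    intervention: str  # I: the program or exposure of interest
    comparison: str    # C: the alternative being compared against
    outcome: str       # O: the result used to judge effectiveness

    def as_question(self) -> str:
        # Assemble the elements into a single answerable question.
        return (f"In {self.population}, does {self.intervention}, "
                f"compared with {self.comparison}, "
                f"increase {self.outcome}?")

# The effectiveness question above, decomposed (comparison is assumed):
q = PICOQuestion(
    population="adolescents in a secondary school",
    intervention="a multi-component obesity prevention program",
    comparison="the usual curriculum",
    outcome="physical activity",
)
print(q.as_question())
```

Decomposing a question this way makes each element an explicit search term, which is the step that precedes critical appraisal of whatever studies the search returns.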
Types of Research
EBM was founded largely on quantitative research; however, both quantitative and qualitative research contribute important knowledge to public health and can answer different questions of interest related to public health interventions.
Public health programs and actions must be not only effective, but appropriate for our communities
and target populations. EIPH effectively transfers knowledge from both quantitative and qualitative
research.
An understanding of the factors that support or impede the delivery of public health actions is seldom
found solely in quantitative studies of effectiveness (Jackson & Waters, 2005). Qualitative informa-
tion is critical to determining the community relevance of a program or intervention. It provides public
health practitioners with essential information about the effectiveness of interventions, and the contex-
tual circumstances in which these interventions were delivered or could work.
The usefulness of a recently proposed hierarchy of evidence for qualitative research has not yet been
conclusively established (Daly et al., 2007).
In 1998, the Effective Public Health Practice Project (EPHPP) began to systematically summarize
research evidence to inform public health practice and policy for the Ontario Ministry of Health and
local provincial health units. EPHPP developed a standardized tool to appraise individual studies for
systematic reviews and summarized worldwide collaborative efforts to maintain databases for this
purpose (Thomas et al., 2004). An exhaustive report assessed ways of appraising non-randomized in-
tervention studies (Deeks et al., 2003). Of the 197 tools reviewed, only six were considered adequate
for use in performing systematic reviews; the Thomas tool was one of these six superior tools.
Good systematic reviews or overviews are particularly helpful for busy public health practitioners
because the evidence has already been found, the quality of that evidence evaluated, and the find-
ings summarized for a specific question of interest (Ciliska et al., 2001; Jackson & Waters, 2004).
However, one cannot assume that all systematic reviews are good, and therefore, they too need to be
critically appraised.
Pre-processed Evidence
Pre-processed evidence is an important and readily available resource for public health profession-
als. Pre-processed evidence has already been reviewed for methodological rigour by an individual or
group who has then summarized the best quality evidence for consideration in public health practice.
Systematic reviews, guidelines, evidence-based textbook and journal summaries, and clinical decision-making tools all fall under pre-processed information. Many online resources and products are kept current and are
readily accessible.
Economic Evaluation
Public health interventions are challenging to evaluate and synthesize because of their complexity,
the possible involvement of professionals from a variety of disciplines, the unique features of the con-
text of the study setting, the characteristics of the study population, and other methodological issues
(Lin, 2004; Rychetnik et al., 2002; Waters et al., 2006).
Despite the challenges, decisions to implement new public health interventions or programs (or to
maintain current practices) should be based on methodologically sound economic evaluations (Birch & Gafni, 2003), and not on simple cost-benefit analyses. Economic evaluation helps to determine whether the
relative values of different outcomes and consequences of an intervention are worth the associated
costs in both dollars and health risks. Public health practitioners must also evaluate whether the esti-
mated costs associated with effective interventions seem realistic for their local settings.
There are relatively few economic evaluations in public health to date. Appendix 1 contains a link to
the United Kingdom National Health Service Economic Evaluation Database, a free resource that
summarizes evidence and includes an economic evaluation tool.
Meta-Synthesis
Meta-synthesis incorporates the findings from multiple qualitative studies and can increase the
transferability or generalizability of findings for public health practice. Meta-synthesis can enhance our
understanding of the processes involved for the population of interest or for health care delivery, and
inform decision-making for policy and program development.
Once selected, evidence can be organized or grouped according to its susceptibility to bias (Rychet-
nik & Wise, 2004). A hierarchy of evidence provides a way to rate the quality of evidence, where the
same research question has been studied using different research methods or approaches.
By identifying the strongest evidence, the hierarchy allows practitioners to 1) limit a search and 2)
consider when to weigh alternative approaches or interventions for program delivery in public health.
Table 3 summarizes the hierarchy of strength of quantitative evidence for treatment or public health
interventions (adapted from Dicenso, Ciliska, & Guyatt, 2005).
Table 3. Relative strength of quantitative evidence by type of evidence (with 1 being the strongest).
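To make the idea of ranking by susceptibility to bias concrete, the sketch below sorts a set of retrieved studies by an illustrative design hierarchy (1 = strongest). The specific levels and study titles here are assumptions for demonstration; the authoritative ordering is the one given in Table 3:

```python
# Illustrative design ranks, 1 = strongest (an assumption for this sketch;
# Table 3 gives the hierarchy actually used in this paper).
DESIGN_RANK = {
    "systematic review of randomized controlled trials": 1,
    "single randomized controlled trial": 2,
    "cohort study": 3,
    "case-control study": 4,
    "cross-sectional survey": 5,
    "expert opinion": 6,
}

def strongest_first(studies):
    """Sort (title, design) pairs so the least bias-prone designs come first.
    Unknown designs sort to the end rather than raising an error."""
    return sorted(studies,
                  key=lambda s: DESIGN_RANK.get(s[1], len(DESIGN_RANK) + 1))

retrieved = [
    ("Study A", "case-control study"),
    ("Study B", "systematic review of randomized controlled trials"),
    ("Study C", "cohort study"),
]
for title, design in strongest_first(retrieved):
    print(f"{title}: {design}")
# Study B (the systematic review) is listed first.
```

Ordering retrieved studies this way is what lets a practitioner limit a search to the strongest available evidence before appraising individual studies in detail.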
A hierarchy or order can also indicate the strength of pre-processed evidence (see Table 4).
Table 4. Relative strength of pre-processed evidence by type of research (with 1 being the strongest). For example, level 5 (single studies) comprises those studies selected by an organization and pre-processed based on high relevance, and characterized by study designs that minimize bias and thus permit a high strength of inference.
Note: Although samples may have been randomly allocated to either control or intervention groups,
other sources of bias may not have been addressed; for example, researchers may not have imple-
mented the blinding of outcome assessors and/or there may be very high drop-out rates. Given that,
in most interventions, drop-outs are least likely to have accomplished the outcome goal (e.g. quit
smoking, avoid adolescent pregnancy), this can skew the results. Alternatively, some cohort studies
have strong methodology, so excluding them from systematic reviews could be an error.
Tools
This document provides a compilation of selected tools and recommendations for critically appraising
relevant research for EIPH; it does not provide detailed discussion about each of the tools or check-
lists.
Appendix 1 includes a summary of the websites of leading organizations and links to critical appraisal
tools for public health professionals. Recommendations are indicated for the use of various tools by
public health practitioners and policy-makers. Relevant links to current pre-processed evidence are
also provided.
This background document provides a brief discussion of some of the key concepts related to EIPH
and critical appraisal for public health practice. A compendium of resources that are important to
EIPH, including helpful websites and links to relevant tools, can further support the implementation of
EIPH among the target audience.
References
Birch, S. & Gafni, A. (2003). Economics and the Evaluation of Health Care Programmes:
Generalisability of methods and implications for generalisability of results. Health Policy,
64(2), 207-219.
Buffett, C., Ciliska, D., & Thomas, H. (2007a). Can I Use This Evidence in My Program Decision? Assessing Applicability and Transferability of Evidence. Unpublished work.
Ciliska, D. K., Clark, K., Thomas, B. H., Valaitis, R., & Van Berkel, C. (2006). Environmental Scan. Hamilton, ON: National Collaborating Centre, Public Health Methodologies and Tools.
Ciliska, D. K., Cullum, N., & Marks, S. (2001). Evaluation of systematic reviews of treat-
ment or prevention interventions. Evidence-Based Nursing, 4, 100-104.
Cullum, N., Ciliska, D. K., Marks, S., & Haynes, B. (2008). An Introduction to Evidence-
Based Nursing. In: N. Cullum, D. Ciliska, R.B. Haynes, & S. Marks (Eds.), Evidence-
Based Nursing: An Introduction (pp. 1-8). Oxford, U.K.: Blackwell.
Daly, J., Willis, K., Small, R., Green, J., Welch, N., Kealy, M. et al. (2007). A hierarchy of evidence for assessing qualitative health research. Journal of Clinical Epidemiology, 60, 43-49.
Deeks, J. J., Dinnes, J., D'Amico, R., Sowden, A. J., Sakarovitch, C., Song, F. et al. (2003). Evaluating non-randomised intervention studies. Health Technology Assessment, 7(27), 1-187.
Dicenso, A., Ciliska, D. K., & Guyatt, G. (2005). Introduction to Evidence-Based Nursing. In: A. Dicenso, G. Guyatt, & D. K. Ciliska (Eds.), Evidence-Based Nursing: A Guide to Clinical Practice (pp. 3-19). St. Louis, MO: Elsevier/Mosby.
Gandelman, A. A., Desantis, L. M., & Rietmeijer, C. A. (2006). Assessing community needs
and agency capacity: An integral part of implementing effective evidence-based interven-
tions. AIDS Education & Prevention, 18 (4 Suppl A), 32-43.
Jackson, N. & Waters, E. (2004). The challenges of systematically reviewing public health
interventions. Journal of Public Health, 26, 303-307.
Jackson, N. & Waters, E. (2005). Criteria for the systematic review of health promotion and
public health interventions. Health Promotion International, 20, 367-374.
Kohatsu, N. D., Robinson, J. G., & Torner, J. C. (2004). Evidence-based public health: An evolving concept. American Journal of Preventive Medicine, 27, 417-421.
Lin, V. (2004). From public health research to health promotion policy: On the 10 major con-
tradictions. Sozial- und Praventivmedizin, 49, 179-184.
Rychetnik, L., Frommer, M., Hawe, P., & Shiell, A. (2002). Criteria for evaluating evidence
on public health interventions. Journal of Epidemiology and Community Health, 56,
119-127.
Thomas, B. H., Ciliska, D. K., Dobbins, M., & Micucci, S. (2004). A process for systemati-
cally reviewing the literature: Providing the research evidence for public health nursing
interventions. Worldviews on Evidence-Based Nursing, 1, 176-184.
Waters, E., Doyle, J., Jackson, N., Howes, F., Brunton, G., & Oakley, A. (2006). Evaluating
the effectiveness of public health interventions: The role and activities of the Cochrane
Collaboration. Journal of Epidemiology & Community Health, 60, 285-289.
West, S., King, V., Carey, S., Lohr, K. N., McKoy, N., Sutton, S. F. et al. (2002). Systems to rate the strength of scientific evidence (Vol. 47). AHRQ Evidence Report/Technology Assessment.
APPENDIX 1
Purpose
To provide some tools for conducting critical appraisal (step 3 below).
Audience
Public health decision-makers in practice or policy.
How to use this tool
Consider the type of question you are asking (first column); then consider the type of evidence you have found. That will lead you to the appropriate tool.
This is not an exhaustive list of critical appraisal tools; merely a listing of tools that are commonly used.
The status "Recommended" indicates that the tool 1) was judged as relevant for most studies in public health, and 2) includes an explanation of the criteria within the tool, so that their use is self-explanatory.
Diagnostic studies:
http://www.phru.nhs.uk/Doc_Links/Diagnostic%20Tests%2012%20Questions.pdf
Type of Research Website Link Type of Study - Link to Tools
notes on use:
http://sign.ac.uk/guidelines/fulltext/50/notes2.html
notes on use:
http://sign.ac.uk/guidelines/fulltext/50/notes3.html
notes on use:
http://sign.ac.uk/guidelines/fulltext/50/notes4.html
notes on use:
http://sign.ac.uk/guidelines/fulltext/50/notes5.html
Scottish Intercollegiate Guidelines Network (SIGN): http://www.sign.ac.uk/
Systematic reviews and meta-analysis tools and guidelines:
checklist: http://sign.ac.uk/guidelines/fulltext/50/checklist1.html
notes on use:
http://sign.ac.uk/guidelines/fulltext/50/notes1.html
Evidence for Policy and Practice Information and Coordinating Centre (EPPI-Centre) (University of London, UK): http://eppi.ioe.ac.uk/cms/
Methods: Stages of a Systematic Review: http://eppi.ioe.ac.uk/cms/Default.aspx?tabid=89
Quality Assessment and Relevance of Evidence: http://eppi.ioe.ac.uk/cms/Default.aspx?tabid=177

PHRED Effective Public Health Practice Project:
http://oldhamilton.ca/.phcs/ephpp/ReviewsPortal.asp
Clinical Practice Guidelines: What is the best intervention/management of ….? (Considers the best evidence, context and expert opinion.)

Scottish Intercollegiate Guidelines Network (SIGN): http://www.sign.ac.uk/
SIGN 50: A Guideline Developers' Handbook: http://sign.ac.uk/guidelines/fulltext/50/index.html

AGREE (Appraisal of Guidelines Research and Evaluation) Collaboration: http://www.agreecollaboration.org/
Critical appraisal tool for Guidelines (AGREE Tool): http://www.agreecollaboration.org/instrument/ (Recommended)

National Institute for Health and Clinical Excellence: http://www.phel.nice.org.uk/
The Guide to Community Preventive Services. The Task Force on Community Preventive Services (US):
http://www.thecommunityguide.org/library/book/Front-Matter.pdf
Chapter 10, "Methods Used for Reviewing Evidence and Linking Evidence to Recommendations" (tool):
http://www.thecommunityguide.org/methods/methods.pdf