Implementation Science and its Application to Population Health
Rebecca Lobb and Graham A. Colditz
Department of Surgery, Washington University in St. Louis, St. Louis, Missouri
Rebecca Lobb: lobbr@wustl.edu
Abstract
Implementation science studies the use of strategies to adapt and use evidence-based interventions
in targeted settings (e.g., schools, workplaces, health care facilities, public health departments) to
sustain improvements to population health. This nascent field of research is in the early stages of
developing theories of implementation and evaluating the properties of measures. Stakeholder
engagement, effectiveness studies, research synthesis, and mathematical modeling are some of the
methods used by implementation scientists to identify strategies to embed evidence-based
interventions in clinical and public health programs. However, for implementation science to
reach its full potential to improve population health the existing paradigm for how scientists create
evidence, prioritize publications, and synthesize research needs to shift toward greater stakeholder
input and improved reporting on external validity. This shift will improve the relevance of the
research that is produced and provide information that will help guide decision makers in their
selection of research-tested interventions.
Keywords
knowledge translation; research translation; public health impact; community-engaged research
INTRODUCTION: THE EVOLUTION OF IMPLEMENTATION SCIENCE
Implementation science for health evolved from practice-based evaluations that were
conducted in the 1960s and 1970s to understand problems with implementing national
initiatives (e.g., President Kennedy’s New Frontier and President Johnson’s Great Society
and War on Poverty) (34). These programs required the adoption of new ideas by
organizations, as much as individuals, to improve health and reduce the burden of poverty
(34). The emphasis on organizational accountability and application of research questions to
assess whether the policies were implemented as planned led to the emergence of
implementation science (34).
During the 1980s and 1990s, evaluations of the Healthy People objectives further shaped the
field of implementation science. Preliminary assessments of the initial Healthy People found
poor progress with achieving many of the 226 objectives, primarily owing to the lack of
measures. These assessments led to the selection of objectives for Healthy People 2000 that
were more feasible to measure (3, 51). In addition, tools and implementation strategies are
now distributed to states and local health departments to increase the potential for achieving
Healthy People objectives.
Copyright © 2013 by Annual Reviews. All rights reserved
DISCLOSURE STATEMENT
The authors are not aware of any affiliations, memberships, funding, or financial holdings that might be perceived as affecting the
objectivity of this review.
Published in final edited form as:
Annu Rev Public Health. 2013; 34:235–251. doi:10.1146/annurev-publhealth-031912-114444.
Current applications of implementation science for health continue to develop tools and
strategies to improve the success of implementation and to study how programs and
interventions are used in practice-based settings. However, the focus of this field has shifted
toward studying how to accelerate the translation of research-tested interventions into policy
and practice, taking into account the complex, dynamic, adaptive systems in which these
interventions can be implemented (6). This most recent evolution of implementation science
was driven largely by an acceleration in funding from the National Institutes of Health
(NIH) and various stakeholder interests to fill the gap between scientific discoveries and the
application of these innovations to improve population health (34).
The Institute of Medicine’s (IOM) 1990 report that identified the gap between mental health
research and practice as a national public health priority was among the first reviews that
quantified the need for a return on the investment that federal dollars made toward research
(45). At that time, almost 80% of the ~9.4 million drug-addicted and drug-dependent individuals
in the United States went untreated, and the treatment administered did not keep up with the
major scientific advances. In 1997, public funds accounted for 65% of the reported revenues
in drug- and alcohol-treatment programs, creating a pressing interest to find better ways to
implement innovations in effective treatment strategies.
For similar reasons, several years after the IOM report on the gap between mental health
research and practice, the NIH highlighted the importance of moving basic research into
human studies with the Roadmap Initiative (66). Billions of taxpayer dollars had been spent
on scientific discoveries to improve health, but few were being used (53). Whereas the
initial purpose of the Roadmap was to accelerate the translation of research benefits to
patients (66), more recent iterations have expanded this view to include the translational
steps from basic research to use by decision makers and residents (41). Although
stakeholders have not been explicitly defined by the Roadmap Initiative, one can imagine
the proximal and distal stakeholders at each translational step (Figure 1). The first
translational step (T1) seeks to move basic discovery into a candidate health application
primarily through efficacy studies performed by clinical and behavioral scientists. In the
second translational step (T2), health services and public health scientists conduct
effectiveness studies and systematic reviews to assess the value of the health application to a
wider population and develop clinical or community guidelines for using research-tested
interventions. In the third translational step (T3), implementation scientists use methods
similar to those used by scientists in the T2 step to identify strategies that can move clinical
guidelines or evidence-based interventions (EBIs) into clinical and public health practice.
Implementation scientists also collaborate with stakeholders to conduct T4 research, which
seeks to evaluate the real-world use of EBIs and implementation strategies by organizations
and individuals.
Connecting with stakeholders (e.g., researchers from different disciplines, industry, public
health, health care, and community-based groups) to inform the development of research at
translational phases T1 and T2 is supported by the Clinical and Translational Science
Awards (CTSA). Through a consortium of 60 medical institutions in 30 states and the
District of Columbia, the CTSA supports clinical and translational investigators to develop
skills to engage stakeholders in team science to accelerate the translation of research into
clinical practice (11). Stakeholder engagement for T3 and T4 translational research is
supported by the NIH Program Announcements on Dissemination and Implementation
Research in Health. Because T3 translational research has a dual emphasis of dissemination
and implementation, these two types of research are often discussed in the same context.
However, the NIH distinguishes between dissemination, “the targeted distribution of
information and intervention materials to a specific public health or clinical practice
audience,” and implementation, “the use of strategies to introduce or change evidence-based
health interventions within specific settings” (53). This article provides a review only of
implementation science to improve population health as it is currently represented in
translational steps T3 and T4 in the Roadmap. The review is intended to reflect the depth and
breadth of the topic and is not a complete review of the literature.
WHY IS IMPLEMENTATION SCIENCE NEEDED?
Population health can be defined as “the health outcomes of a group of individuals,
including the distributions of such outcomes within the groups” (42). Health outcomes and
the distribution of health outcomes are determined by interacting factors related to biology;
health care systems; psychosocial, economic, and physical environments; and policies across
the life span (42). A goal of implementation science for health is to identify the factors,
processes, and methods that can successfully embed EBIs in policy and practice to achieve
population health. Evidence based refers to interventions that have undergone sufficient
scientific evaluation to be considered effective (56).
Over the past two decades, the United States has experienced a dramatic reduction in
mortality from the leading causes of death, heart disease and cancer. These improvements in
health are due largely to corresponding reductions in tobacco use; increases in use of
screening tests for breast, cervical, and colorectal cancer (18); and improvements in
therapeutics (49). Despite these improvements, the United States continues to be ranked at
the bottom for life expectancy at birth out of 33 countries in the Organizations for Economic
Co-operation and Development (OECD) (49). Reasons for this ranking are complex.
However, key contributors to the relatively poor longevity in the United States compared
with other OECD countries include continued high prevalence of tobacco use among adults
without a college education (9), inactivity, and the increase in overweight and obesity
among the US population (19). These health behaviors and comorbid conditions are major
risk factors for heart disease and cancer.
Of equal concern to public health in the United States are the substantial inequities in the
distributions of these risk factors for heart disease and cancer by social groups defined by
education level, poverty level, race, ethnicity, and insurance status. Although health care
services account for only about 10% of the population’s health (57), utilization of health
care is a critical factor in the low longevity ranking for the United States compared with
other OECD countries because many of the other countries provide universal insurance
coverage at the national or regional level, which facilitates access to life-saving screening
tests and therapeutics. In addition, lack of insurance is disproportionately common among low-
income, less-educated, and racial and ethnic minority groups who are already vulnerable to
health inequities for heart disease, cancer, and associated risk factors and comorbid
conditions. These social characteristics and the characteristics of the environments in which
socially disadvantaged persons live, work, and play (e.g., built environment, food deserts,
workplace policies) contribute to the persistent inequities in health in the United States (65).
Although it is not comprehensive, this list of health issues in the United States provides a
clear picture of the work that remains to be done to improve the health of the nation.
Evidence suggests that only about half of the available medical and public health
innovations are used in practice (10, 24, 38), yet there are sufficient EBIs to reduce by more
than 50% the burden of cancer and chronic and infectious diseases in the United States (12,
13). The following sections provide an overview of the factors that influence use of EBIs
and the ways in which implementation science is being used or can be used to accelerate use
of EBIs to improve population health and reduce social inequities in health.
FACTORS THAT INFLUENCE USE OF EVIDENCE-BASED INTERVENTIONS
Glasgow & Emmons (24) identified multiple, interacting reasons why adoption and use of
EBIs does not routinely occur in practice, including the characteristics of the intervention
(e.g., high cost, intensive time demands, not customizable), research design used to test the
EBI (e.g., participants or setting not representative of target population or setting, failure to
evaluate implementation), the situation of the intended target setting (e.g., schools,
workplaces, health care facilities, public health departments), and interactions among these
factors (24). These barriers can be particularly problematic when trying to use EBIs to
improve health for medically underserved populations and residents in underresourced areas
because so few interventions are tested in these situations. A study conducted in rural West
Virginia illustrates how these factors influence the use of EBIs in practice settings.
Example 1: Barriers to Use of Evidence-Based Interventions in Rural West Virginia
Residents in rural areas experience multiple barriers to high-quality health care that residents
in urban and suburban areas do not encounter, such as longer distances to reach health
services (2) and higher rates of unemployment, uninsurance, and poverty (62). In addition, the
health of rural racial minority groups is particularly disadvantaged with regard to heart
disease, cancer, and diabetes (2). To understand the compatibility of EBIs in rural areas, the
Mary Babb Randolph Cancer Center at West Virginia University released a request for
proposals for community-based cancer education mini-grants. Eligible organizations
belonged to the National Cancer Institute (NCI)-funded Appalachia Community Cancer
Network. The call for proposals required that grantee organizations adapt and implement
EBIs for cancer control, and the funders offered training to grantees to facilitate these
processes (62). Using a multiple-case-study research design, investigators summarized
information ascertained from surveys, interviews, and the final programmatic reports at the
end of the funding period for the 13 grantee organizations (e.g., coalitions, health clinics,
churches, nonprofit community organizations). Results of this study showed that grantees
felt the requirement to use an EBI limited their options for interventions to improve cancer
control. Many grantees commented that it was time consuming and difficult to adapt the
EBIs because the interventions did not take into account older, less educated, or lower-
income populations or the geographically isolated setting, or they required large worksites,
churches, or staffing that were not available (62). All organizations adapted the EBIs. Some
changes were minor, but other changes were substantial in that the core intervention
components were condensed or eliminated.
This case study poignantly demonstrates how the characteristics of the intervention, research
design, and context of the implementation setting affect use of the intervention, even when
funding is provided to encourage use of an EBI. In 2004, Greenhalgh et al. (35) published a
comprehensive meta-narrative review of nearly 500 articles that described factors associated
with the spread and sustained use of EBIs in health service delivery and other organizations
that are the primary target settings for EBIs (35). Among the key findings of this literature
review was that organizations are part of a complex, multidimensional system with multiple
interacting factors that influence use of EBIs at multiple levels (Figure 2) (35). Key features
of complex systems include constant adaptation to nonlinear change, dynamic interactions
between system components, and feedback loops (6, 47). The situations of these complex
systems are influenced by individuals (e.g., perceptions of the innovation, adopter
characteristics), social processes (e.g., communication, linkages, implementation),
characteristics of groups (e.g., system antecedents, readiness to change), and the
sociopolitical, economic environment (20).
The literature suggests that some factors are more influential than others to support the use
of EBIs. For example, rewards for using EBIs, inadequate funding for EBIs, and support
from state legislators are more influential in the adoption of EBIs by public health decision
makers than are factors associated with their personal characteristics (e.g., a felt need to be an
expert to make evidence-based decisions) (40). One study found that public health decision
makers with a bachelor’s degree were 5.6 times (95% confidence interval 1.7, 17.9) more
likely than those with a public health master’s degree to report that they lacked skill to
develop evidence-based programs (40). In addition, an organization’s culture (e.g., the value
that an organization places on using an EBI for decision making) (14) is a strong predictor
of whether a systematic review influences the use of EBIs (17). Perhaps most relevant to
successful implementation is the strategic climate of an organization that motivates and
enables employees to embed EBIs in existing practices through policies, procedures, and
reward systems (1).
Much of what we know about factors that influence use of EBIs in organizations comes
from the fields of sociology, psychology, communications, systems science, and
organizational behavior (35). These fields have been using implementation science for
decades, but its application to questions about public health is relatively new. A rich body of
theory on implementation science for health has emerged over the past decade, with the
development of more than 50 implementation theories (14, 15, 60). Several theories have
similar constructs, and testing of the theories has been limited. No particular theories have
been identified as most useful to predict or explain EBI implementation. For many
implementation theories, the psychometric properties of construct measures have not been
evaluated, the relationships among the constructs have not been assessed, and the predictive
validity is unknown.
A systematic review conducted by Emmons et al. (20) of peer-reviewed literature highlights
the need for additional research on measures for organizational constructs. The review
investigated the definitions of constructs and measures of five key characteristics of
organizations that are antecedents for implementation science for health (i.e., leadership,
vision, managerial relations, climate, absorptive capacity). They found that no measure of a
specific construct was used in more than one study, many studies did not report the
psychometric properties of the measures, some assessments were based on a single response
per organization, and the level of the instrument did not always match the level of analysis
(20).
IMPROVING THE FIT OF EVIDENCE-BASED INTERVENTIONS IN REAL-WORLD SETTINGS
Ideally, we want EBIs to be implemented in policy and practice settings in lieu of
interventions that have not undergone sufficient testing (8). However, as we described
earlier, characteristics of EBIs, the research design used to test the EBI, and the complexity
of the systems in which EBIs are implemented can deter use of these research-tested
interventions in practice. Green (34) used the metaphor of a funnel to describe why EBIs
generally do not fit well with real-life circumstances (Figure 3). At the start of the funnel is
the universe of potential studies defined by funding priorities. These studies become
narrowed by transitions through which researchers review grants, researchers determine
publications in peer-reviewed journals, researchers synthesize the available evidence on
interventions, and researchers (who also are often clinicians) develop guidelines to use EBIs
in practice (34). At each transition in the funnel, the priorities and decisions have been
rarely, if at all, informed by those who will ultimately use the research (34), and evidence
with strong internal validity is overemphasized (32). The result of this discovery-to-practice
process is that initial research often involves populations, procedures, and settings that bear
little relation to the locus of actual implementation. Glasgow and Green have written
extensively about the translation of research to routine practice (24, 26–28, 31–34). We
summarize some of the key points in their discussions and expand on their ideas by
depicting how greater stakeholder input and attention to external validity in translational
steps T2-T4 would improve the relevance and increase the volume of research that passes
through the funnel.
Peer Review of Research Proposals
Starting with the peer-review stage of research, funders can improve reporting on external
validity of research by requiring that reviewers evaluate whether analysis plans include
methods for data collection that will enable reporting on measures of external validity (e.g.,
description of withdrawals and dropouts; description of target population compared with
population reached; frequency, duration, and intensity of intervention). Guidelines such as
the extension of the CONSORT, RE-Aim, TREND, Jadad scale, and STROBE, which have
been used recently to judge reporting on external validity of completed studies, can also be
used to guide reviewers’ judgment on the analysis plans for reporting external validity in
proposed studies (26, 32, 33, 64, 67).
Stakeholder review groups can also be added to the review cycle to inform the relevance of
the scientific evidence from a user perspective. Several study designs have strong internal
validity and use a broad sample of participants, incorporate aspects of general practice, or
have flexible intervention strategies to improve the relevance of research (participatory
research projects, pragmatic trials, or comparative effectiveness studies). Green recommends
these forms of “action research” to produce practice-based evidence that is “closer to the
actual circumstances of practice” (31, p. i23). Combining ratings across multiple stakeholder
groups is already used in some areas. However, interdisciplinary reviews on the exact same
criteria can result in low agreement in ratings across research, practice, and policy groups
compared with agreement among members within these stakeholder groups (46). Adding a
separate review for stakeholders to comment only on relevance from their perspective would
not undermine the ratings on scientific merit or relevance and would contribute toward
funding of research with greater potential for application and scientific discovery. Because
scientists from different fields have unique groups of immediate stakeholders (Figure 1), the
composition of the stakeholders would change depending on the focus of the grant. Yet, the
instructions could be standardized across all proposals.
Publication Priorities
At the next step in the funnel, journals can develop ways to reduce the leaks in research
evidence and improve the relevance of the published manuscripts by requiring reviewers to
consider how well authors report on external validity, in addition to the consideration
currently given to reporting on a study’s internal validity. As noted above, reviewers can use
several guidelines to judge an author’s report on the external validity of completed studies.
This suggested change in review procedures for manuscripts is practical, given that
numerous leading health journals have agreed that it would be important and feasible to
report on external validity (26). Green & Glasgow (32) have recommended that journals
take several other actions to improve reporting on external validity such as publishing
editorials on external validity, featuring some exemplary studies or a special issue that
highlights factors related to external validity, expanding online publication of protocols
relevant to external validity, engaging other journals and organizations in these efforts, and
indexing key words relevant to key dimensions of external validity.
Research Synthesis
Research synthesis can answer a broad range of questions for knowledge users and scientists
using meta-analysis or qualitative methods (e.g., narrative synthesis, grounded theory,
thematic analysis) to summarize the results. Meta-analyses of randomized controlled trials (RCTs)
have dominated the research synthesis literature to address the question, Which interventions
work to improve health? The effective interventions identified by meta-analysis are
regularly updated on several web portals (The Guide to Community Preventive
Services, The Center for Reviews and Dissemination, The US Preventive Services Task
Force Recommendations, The Cochrane Public Health Group, and The Campbell
Collaboration; see sidebar). However, meta-analysis of RCTs does not inform decision
makers about for whom the interventions work, in which settings, and under what intensity,
dose, and duration of intervention. To answer these more complex questions, other forms of
research synthesis can be used.
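To make the mechanics of this dominant synthesis approach concrete, the sketch below pools hypothetical trial results with a DerSimonian-Laird random-effects estimator, one common way such meta-analyses are computed. The effect sizes, standard errors, and variable names are invented for illustration and are not drawn from any study cited in this review.

```python
import math

# Hypothetical log relative risks and standard errors from five trials
# (illustrative values only, not from any study cited in this review).
effects = [-0.35, -0.10, -0.42, -0.05, -0.22]
std_errs = [0.15, 0.20, 0.18, 0.25, 0.12]

def dersimonian_laird(y, se):
    """Pool study effects with the DerSimonian-Laird random-effects estimator."""
    w_fixed = [1.0 / s**2 for s in se]                                # inverse-variance weights
    y_fixed = sum(w * yi for w, yi in zip(w_fixed, y)) / sum(w_fixed)
    q = sum(w * (yi - y_fixed)**2 for w, yi in zip(w_fixed, y))       # Cochran's Q
    df = len(y) - 1
    c = sum(w_fixed) - sum(w**2 for w in w_fixed) / sum(w_fixed)
    tau2 = max(0.0, (q - df) / c)                                     # between-study variance
    w_rand = [1.0 / (s**2 + tau2) for s in se]
    pooled = sum(w * yi for w, yi in zip(w_rand, y)) / sum(w_rand)
    se_pooled = math.sqrt(1.0 / sum(w_rand))
    return pooled, se_pooled, tau2

pooled, se_pooled, tau2 = dersimonian_laird(effects, std_errs)
print(f"Pooled log RR: {pooled:.3f} "
      f"(95% CI {pooled - 1.96 * se_pooled:.3f}, {pooled + 1.96 * se_pooled:.3f}); tau^2 = {tau2:.3f}")
```

A pooled estimate of this kind answers "does it work on average," which is exactly the question the surrounding text argues is necessary but not sufficient for decision makers.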
A broader range of synthesis approaches can complement findings from meta-analysis of
RCTs to provide policy and practice stakeholders with additional information on which they
can base their decision to use EBIs (37). Systematic review and meta-analysis of
observational studies have exploded over the past 25 years with publication guidelines (58)
and regression methods (5, 50) implemented to build on the diversity of exposure and
outcomes in target settings. Synthesis methods such as scoping studies, meta-narrative, and
realist reviews summarize information from a wide range of sources including the non-peer-
reviewed literature and reports from qualitative studies to assess the applicability of results
across diverse populations, settings, and conditions (37). These alternative methods to
research synthesis employ rigorous, systematic processes to summarize information (35, 36,
52). In addition, the opinions of policy and practice experts can be incorporated to guide the
questions that are answered by research synthesis and to comment on the gaps in the
literature that are identified by the synthesis. Inclusion of stakeholder opinions in literature
reviews is common practice in Canada (37, 30) and is recommended by UNICEF/UNDP/
World Bank and the WHO in the Framework for Operations and Implementation Research
in Health and Disease Control Programs (29). Research syntheses that examine a broad
range of evidence are particularly relevant for public health interventions that are
implemented in complex systems (33). As noted by Brownson, the goal of research
synthesis should not only be to identify effective interventions, but to allow “the practitioner
to understand the local contextual conditions necessary for successful implementation” (8, p.
183).
Guidelines
One of the more dominant roles of implementation science is to fill the information gap that
occurs along the research-to-practice continuum after research synthesis. EBIs have been
identified at this point, but stakeholders lack guidelines to determine how to apply and
evaluate the use of EBIs across populations and settings. To date, guideline development has
been narrowly focused on clinical practices and only somewhat focused on public health
practices. The limited guidelines to assist public health decision makers with
implementation of EBIs include health impact assessments and Locate evidence, Evaluate it,
Assemble it, and inform Decisions (L.E.A.D.) (44). More guideline development is needed
to accelerate the integration of EBIs in public health practice (32).
Interventions do sometimes diffuse to policy and practice contexts before there is sufficient
evidence to suggest effectiveness (59, 63). For example, in randomized trials with a control
condition, persons, organizations, or communities often view random assignment to an
intervention as unfair and may adopt aspects of the intervention even though its efficacy or
effectiveness has not been tested (59). Alternative study designs that offer a placebo instead
of no treatment or that wait-list the control group can minimize threats to internal validity of
randomized trials. Another example is described in a review of the literature that identified
primary studies on the validation of cardiovascular risk models and use of these models in
guidelines to treat diabetes. The authors found 45 prediction models, several of which were
incorporated in clinical guidelines for the management of type 2 diabetes, even though the
risk scores had not been externally validated in a diabetes population (63).
Practice and Policy
Green’s image of the funnel moves directly from guidelines to practice and policy. Yet, use
of EBIs by policy and practice stakeholders rarely occurs automatically after research
synthesis and guideline development. For example, political will played a large role in
slowing progress with tobacco control (59) and with expediting the discovery and
implementation of antiviral therapy to treat HIV (21, 55). In addition, improved relevance of
EBIs through participatory research can increase the use of EBIs by policy and practice
stakeholders to expedite use of these innovations. However, even if the EBIs have been
tested in the target population and setting, some characteristics of the intervention will likely
need to be adapted for implementation to be feasible. Adaptation refers to “the degree to
which an EBI is changed or modified by a user during adoption and implementation to suit
the needs of the setting or to improve the fit to local conditions” (54, p. 30). An important
consideration for implementation scientists is the balance between adaptation and fidelity to
the intervention’s core components, those thought to determine its effectiveness (8).
PARTICIPATORY RESEARCH TO IMPROVE IMPLEMENTATION OF
EVIDENCE-BASED INTERVENTIONS
Green (31) emphasizes that scientists should engage in participatory research during the
discovery research translational phase (T2) to produce EBIs that are more relevant and
actionable to policy and practice stakeholders. Participatory research requires ongoing,
bidirectional communication between researchers and stakeholders to develop relationships
that will inform research questions as well as practice needs and inform the adaptation of
interventions instead of development of new innovations that are later discarded because of
poor fit (31, 48). The engagement of stakeholders in research can take on many forms, and
partners can include institutions, organizations or individuals, or researchers from different
disciplines. The specific terms used to describe theories of stakeholder engagement differ
depending on whether one refers to social theory or organizational theory. However,
consistent across theories is the basic tenet that partnership engagement can be thought of as
a continuum of involvement (unlinked, exchange information, work side by side on a
common goal, informal sharing of resources, formal sharing of resources based on a written
contract or memorandum of understanding) (22, 23). Following are examples of T3 evidence
being produced through implementation science along the stakeholder engagement
continuum.
Example 2: Using Statistical Modeling to Guide Policy and Practice Decisions
At the lower end of the stakeholder engagement continuum, implementation scientists can
use innovative statistical techniques such as Markov modeling and optimization to engage
policy, public health, and clinical practice partners in applying EBIs in the real world. For
example, Demarteau et al. (16) assessed the optimal mix of screening and vaccination
against human papillomavirus (HPV) to protect against cervical cancer. The evaluation used
a Markov cohort model to estimate the outcomes of 52 different prevention strategies and an
optimization model in which the results of each prevention strategy of the Markov model
were entered as input data to evaluate the combination of different prevention options. The
model was applied in the United Kingdom and Brazil. Findings from this implementation
study suggested that for a zero increase in budget, the number of cases of cervical cancer
would be substantially reduced in the United Kingdom (41%) and Brazil (54%) by
implementing an optimal combination of HPV vaccination (80% coverage) and screening at
prevaccination coverage (65% United Kingdom, 50% Brazil) while extending the screening
interval to every 6 years in the United Kingdom and 5 years in Brazil (16).
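The following sketch shows the general shape of such an analysis: a toy Markov cohort model run under a handful of prevention strategies, with each strategy's projected cases and cost compared against a fixed budget. All states, transition probabilities, costs, strategy definitions, and the budget are invented for illustration; this is not a reconstruction of the Demarteau et al. model (16).

```python
import numpy as np

# Toy three-state Markov cohort model (healthy -> precancer -> cancer).
# All parameters are illustrative assumptions, not calibrated estimates.
def annual_transition(vaccinated: bool, screened: bool) -> np.ndarray:
    """Return a row-stochastic annual transition matrix under a prevention strategy."""
    p_lesion = 0.02 * (0.2 if vaccinated else 1.0)     # vaccination lowers new lesions
    p_progress = 0.10 * (0.4 if screened else 1.0)     # screening catches precancer early
    return np.array([
        [1 - p_lesion, p_lesion,            0.0],
        [0.05,         0.95 - p_progress,   p_progress],
        [0.0,          0.0,                 1.0],       # cancer treated as absorbing here
    ])

def run_cohort(strategy, years=30, cohort=100_000):
    vaccinated, screened, cost_per_person = strategy
    dist = np.array([1.0, 0.0, 0.0])                   # everyone starts healthy
    m = annual_transition(vaccinated, screened)
    for _ in range(years):
        dist = dist @ m
    return dist[2] * cohort, cost_per_person * cohort  # projected cases, total cost

strategies = {
    "no intervention":         (False, False, 0.0),
    "screening only":          (False, True, 40.0),
    "vaccination only":        (True, False, 90.0),
    "vaccination + screening": (True, True, 130.0),
}

budget = 100.0 * 100_000                               # illustrative fixed budget
for name, strat in strategies.items():
    cases, cost = run_cohort(strat)
    status = "within budget" if cost <= budget else "over budget"
    print(f"{name:24s} cases={cases:9.0f} cost={cost:12.0f} ({status})")
```

An optimization step like the one described above would then search over coverage levels and screening intervals for the mix of feasible strategies that minimizes cases at the given budget.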
Example 3: Using Pragmatic Trials to Create Applicable Evidence-Based Interventions
The pragmatic trial falls mid-range on the stakeholder engagement continuum. This type of trial
incorporates features such as allowing participating organizations to choose between options
of care and determining the effects of an intervention under the usual conditions in which it
will be applied. These features are in stark contrast to the features of traditional efficacy
trials that study the effects of an intervention under ideal circumstances (61). These trials are
not opposites. Rather, they can be considered end points on a pragmatic-explanatory
continuum. PRECIS (pragmatic-explanatory continuum indicator summary) is a tool that has
been developed to inform the design of intervention trials based on the research questions
being addressed (real-world or explanatory) (61). The PRECIS tool consists of ten domains
that guide researchers when deciding where their study should fall on the pragmatic-
explanatory continuum. The PRECIS criteria have also been applied using a retrospective
approach to describe completed trials. Using the PRECIS criteria, nine reviewers
independently scored three completed studies that targeted obesity treatment in primary care
settings. All three trials were rated in the moderate range on the PRECIS scale, but the
ratings varied across PRECIS dimensions, being most pragmatic on the flexibility of the
comparison condition and participant representativeness (25). This type of tool can help
researchers improve the fit of their studies with real-world circumstances and provide a
richer understanding of the potential for implementation success in usual care settings.
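One hedged way to tabulate such ratings is sketched below: each of the ten PRECIS domains (paraphrased from Thorpe et al. (61)) receives a score from 1 (highly explanatory) to 5 (highly pragmatic), and a simple mean summarizes where the trial falls on the continuum. The numeric scale, cutoff labels, and example scores are illustrative assumptions, not part of the published tool or of the Glasgow et al. review (25).

```python
# Hypothetical scoring sheet for rating a completed trial on the ten PRECIS domains.
PRECIS_DOMAINS = [
    "participant eligibility criteria",
    "experimental intervention flexibility",
    "practitioner expertise (experimental)",
    "comparison intervention flexibility",
    "practitioner expertise (comparison)",
    "follow-up intensity",
    "primary outcome relevance to participants",
    "participant compliance",
    "practitioner adherence to protocol",
    "analysis of primary outcome",
]

def summarize(scores: dict) -> None:
    """Print a mean PRECIS score and a rough continuum label (cutoffs are assumed)."""
    missing = [d for d in PRECIS_DOMAINS if d not in scores]
    if missing:
        raise ValueError(f"missing domain scores: {missing}")
    mean = sum(scores.values()) / len(scores)
    label = "pragmatic" if mean >= 3.5 else "moderate" if mean >= 2.5 else "explanatory"
    print(f"mean PRECIS score {mean:.1f} -> {label}")
    for domain in PRECIS_DOMAINS:
        print(f"  {scores[domain]}  {domain}")

# Example ratings for a hypothetical primary-care weight-loss trial.
summarize({d: s for d, s in zip(PRECIS_DOMAINS, [4, 3, 4, 3, 3, 2, 4, 3, 2, 3])})
```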
Example 4: Formal Partnerships between Researchers and Practice Experts
At the higher end of the stakeholder engagement continuum scientists create formal
partnerships with the institutions and organizations that will ultimately implement the EBIs.
The Quality Enhancement Research Initiative (QUERI) at the US Department of Veterans
Affairs is an example of a highly integrated partnership between researchers and
practitioners. Implementation studies that are conducted through QUERI are an exceptional
source of practice-based evidence, and findings can often be generalized to other clinical
settings. QUERI utilizes a six-step process to diagnose gaps in performance to identify and
implement EBIs to reduce the disease burden for conditions common among veterans, such
as heart disease, diabetes, HIV/hepatitis C, and mental health and substance use disorders.
For example, one QUERI study explored the sustainability of a nurse-initiated HIV rapid
testing intervention that was implemented via two 90-minute in-service sessions designed to
teach pre- and posttest counseling techniques as well as the administration, interpretation,
and entering of results into patients’ medical records. Using time-series analysis, the authors
found that, among registered nurses who participated in the in-service, this low-intensity
implementation strategy resulted in a steady rate of HIV rapid testing for the year following
the training. In addition, there was an unexpected increase in HIV blood testing among other
clinical staff, leading to a 70% site-wide increase in HIV testing (43).
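A common way to analyze such an interrupted time series is segmented regression, which estimates the pre-existing trend, the level change at the intervention, and any change in trend afterward. The sketch below fits such a model to simulated monthly testing counts; the data and effect sizes are invented and do not reproduce the QUERI analysis (43).

```python
import numpy as np

# Illustrative segmented regression for an interrupted time series:
# monthly HIV rapid-testing counts before and after a training in-service.
rng = np.random.default_rng(0)
months = np.arange(24)                        # 12 months pre, 12 months post
post = (months >= 12).astype(float)           # indicator for the post-training period
time_since = np.where(months >= 12, months - 12, 0.0)

true_rate = 5 + 0.1 * months + 8 * post + 0.0 * time_since   # simulated process
tests = rng.poisson(true_rate)

# Design matrix: intercept, secular trend, level change, slope change.
X = np.column_stack([np.ones_like(months, dtype=float), months, post, time_since])
beta, *_ = np.linalg.lstsq(X, tests, rcond=None)
labels = ["baseline level", "pre-existing trend",
          "level change at training", "trend change after training"]
for name, b in zip(labels, beta):
    print(f"{name:28s} {b:6.2f}")
```

A flat "trend change after training" coefficient alongside a sustained level change is the kind of pattern that would support the sustainability finding described above.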
Scientists can use multiple research methods to engage stakeholders at varying levels of
integration along the stakeholder engagement continuum to produce evidence that informs
how to use EBIs. The level of engagement is more a function of the researcher’s choice than
of the research method. For example, time-series analysis is often conducted at the lowest
levels of stakeholder engagement (i.e., unlinked or exchange information). However, the
example provided here demonstrates use of this method in a highly integrated partnership
setting.
MOVING FORWARD WITH IMPLEMENTATION SCIENCE TO IMPROVE
POPULATION HEALTH
Implementation science has tremendous potential to improve population health by
accelerating the use of EBIs in practice. We have outlined several ways in which
implementation science is being applied to population health. Following are areas in this
field of research that should be emphasized in the future to ensure that the products gained
through implementation science are in balance with the benefits to population health.
Stakeholder Engagement
We discussed many opportunities to increase stakeholder engagement throughout the
discovery to practice funnel. However, to shift the current paradigm for knowledge
production toward a more user-engaged process, training programs, incentives, and reward
systems for researchers must also change. Participatory engagement skills are not routinely
required in public health doctoral, fellowship, or faculty training programs for health
researchers. University reward structures also slow implementation science because scholars
lack incentives and rewards for collaboration across scientific disciplines and with policy
and practice experts (23). Funding mechanisms also limit opportunities for community-
engaged research when time frames for submission are short, review cycles are long, and
funding periods limit evaluation of continued EBI use over time (31).
Systems Thinking
The introduction of systems thinking to implementation science has revolutionized old ideas
that implementation is a simple linear process or that participatory research in itself will
improve the use of EBIs in practice and policy settings (39). The extent to which an EBI will
be used and the benefit sustained over time is dependent on multiple, multilevel, dynamic
processes that respond to feedback loops through nonlinear adaptation (39). The implication
of this transition to systems thinking is that scientists need to consider how to move our
methods of investigation beyond the reductionist approach toward a systems view to
accelerate the development of EBIs and implementation strategies. Systems science methods
such as agent-based modeling, system dynamics, and network analysis are becoming
increasingly valuable to implementation science (47) and to implementation in practice.
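As a flavor of what an agent-based approach can offer, the sketch below runs a minimal threshold model of EBI adoption spreading across a network of organizations. The network structure, adoption thresholds, and seeding are arbitrary illustrations rather than a calibrated model of any real system.

```python
import random

# Minimal agent-based sketch: organizations adopt an EBI once enough of
# their network peers have adopted (a simple threshold model).
random.seed(1)
N = 200                                                       # organizations (agents)
neighbors = {i: random.sample([j for j in range(N) if j != i], 6) for i in range(N)}
threshold = {i: random.uniform(0.1, 0.5) for i in range(N)}   # peer share needed to adopt
adopted = set(random.sample(range(N), 5))                     # seeded early adopters

for step in range(20):
    newly = set()
    for i in range(N):
        if i in adopted:
            continue
        share = sum(j in adopted for j in neighbors[i]) / len(neighbors[i])
        if share >= threshold[i]:                             # adopt once enough peers have
            newly.add(i)
    if not newly:
        break
    adopted |= newly
    print(f"step {step + 1}: {len(adopted)} of {N} organizations have adopted")
```

Running many such simulations under different seeding or network assumptions is one way to explore, before a costly rollout, how an implementation strategy might spread or stall.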
A Pragmatic Approach to Research
A pragmatic approach to research selects the methods that are best suited to answer the
research question instead of relying on a single research method. Randomized trials are the
gold standard for discovery research that answers questions about what works, but this study
design provides limited information about how to embed EBIs in complex systems that are
constantly adapting to change. More prospective, practice-based cost-effectiveness and
comparative-effectiveness studies are needed to clarify the resources needed to implement
EBIs and identify which interventions or implementation strategies work best in particular
settings. In addition, the development of stakeholder-friendly tools that use mathematical
modeling to select interventions based on an individual’s risk for disease or to choose
implementation strategies based on characteristics of the intervention and target population
will accelerate the use of EBIs in practice and policy settings. The Archimedes Model is one
example of how advanced mathematical modeling is being used to advance implementation
of EBIs in practice. The US Department of Health and Human Services is using the
Archimedes Model to quickly and accurately research, analyze, and evaluate the cost-
effectiveness and implementation strategies for specific health care interventions (4).
Reliability and Validity of Measures
Numerous measures of factors related to implementation of EBIs in target settings have
emerged over the past decade, but evaluations of the psychometric properties associated
with these measures are limited. Further research is needed to clarify the measures with the
strongest properties for particular settings and those that require further development to
improve reliability and validity. Also needed is clarification on whether individual measures
capture unique constructs or overlapping concepts to refine the terms and definitions of
constructs used in theoretical frameworks for implementation.
External Validity
In addition to knowing what works, policy and practice stakeholders want to know, For
whom does it work? In what settings does it work? And at what dose, frequency, intensity,
and duration does it work? Meta-analysis of RCTs is routinely used to provide stakeholders
with information on what works, but the remaining questions that stakeholders ask have not
been adequately addressed by research. More research synthesis using meta-regression of
RCTs, meta-analysis of observational studies, and meta-narrative reviews is needed to
inform stakeholders about whether EBIs have been tested in populations and settings that are
relevant to them and whether the characteristics of the interventions and implementation
strategies are feasible for them to implement. Many of the EBIs recommended to
stakeholders have been developed in studies with highly select participants and in settings
with adequate resources and implemented by highly trained staff under ideal conditions. As
a result, the greatest value of alternative research synthesis at this time is to identify the gaps
in research that need to be filled by pragmatic trials, practice-based research networks, and
ongoing participatory research. Even with an increase in action research (31), we will not
get the perfect solution to external validity that applies to specific settings or populations.
Therefore, the extended benefit of stakeholder engagement in research is the development of
relationships that will inform strategies to guide ongoing adaptation and evaluations of
interventions as the needs and circumstances of target settings change over time (31, 48).
Acknowledgments
This work was supported by the National Cancer Institute (CA U54-155496). R.L. is supported by the Foundation
for Barnes-Jewish Hospital, St. Louis. G.A.C. is also supported by the Cissy Hornung Clinical Research
Professorship, 117489-CRP-03-194-07-CCE, from the American Cancer Society. The authors thank Amy
Ostendorf for lending her graphic design talents to this article.
Glossary
Translation: moving scientific discovery along the pathway from bench to bedside to population
Stakeholder: a party with an interest in research products
Population health: health outcomes of a group of individuals, including the distributions of such outcomes within the group
Efficacy: study of intervention performance in a highly selected population in a controlled setting. The emphasis is on internal validity
Effectiveness: study of intervention performance in diverse populations in real-world settings. External validity is balanced with internal validity
Systematic reviews: use systematic and explicit methods to identify, select, and critically appraise relevant research to answer a clearly formulated question
Evidence-based interventions (EBI): interventions that have undergone sufficient scientific evaluation to be considered effective
Dissemination: targeted distribution of information and intervention materials to a specific public health or clinical practice audience
Implementation: use of strategies to introduce or change evidence-based health interventions within specific settings to improve population health
Research synthesis: the contextualization and integration of findings from individual studies within broader knowledge on the topic, using qualitative and/or quantitative methods
LITERATURE CITED
1. Aarons GA, Horowitz JD, Dlugosz LR, Ehrhart MG. The role of organizational processes in
dissemination and implementation research. 2012:128–153. See Ref. 7.
2. Agency for Healthc. Res. Quality (AHRQ). Health Care Disparities in Rural Areas: Selected
Findings from the 2004 National Healthcare Disparities Report AHRQ Publ. No. 05–P022.
Rockville MD: AHRQ US Dep. Health Hum. Serv; 2005. http://archive.ahrq.gov/research/ruraldisp/
ruraldispar.htm
3. Andersen R, Mullner R. Assessing the health objectives of the nation. Health Aff. 1990; 9:152–162.
4. Archimedes. Press release. San Franc., Calif.: 2012 May 3rd. HHS enlists Archimedes Inc. to
expand government’s use of health care modeling for forecasting quality and cost outcomes. http://
archimedesmodel.com/PR-3-May-2012
5. Berkey CS, Hoaglin D, Mosteller F, Colditz GA. A random-effects regression model for meta-
analysis. Stat. Med. 1995; 14:395–411. [PubMed: 7746979]
6. Best A. Systems thinking and health promotion. Am. J. Health Promot. 2011; 25:eix–ex. [PubMed:
21361801]
7. Brownson, RC.; Colditz, GA.; Proctor, EK. Dissemination and Implementation Research in Health:
Translating Science to Practice. New York: Oxford Univ Press; 2012.
8. Brownson RC, Fielding JE, Maylahn CM. Evidence-based public health: a fundamental concept for
public health practice. Annu. Rev. Public Health. 2009; 30:175–201. [PubMed: 19296775]
9. Byers T. Two decades of declining cancer mortality: progress with disparity. Annu. Rev. Public
Health. 2010; 31:121–132. [PubMed: 20070204]
10. Cilenti D, Brownson RC, Umble K, Erwin PC, Summers R. Information-seeking behaviors and
other factors contributing to successful implementation of evidence-based practices in local health
departments. J. Public Health Manag. Pract. 2012; 18(6):571–576. [PubMed: 23023282]
11. Clin. Transl. Sci. Awards. (CTSA). Progress Report 2009–2011. Bethesda, MD: Natl. Inst. Health;
2009–2011. http://www.palladianpartners.com/NCATS/ctsa_2011/
12. Colditz GA, Samplin-Salgado M, Ryan CT, Dart H, Fisher L, et al. Harvard report on cancer
prevention, volume 5: fulfilling the potential for cancer prevention: policy approaches. Cancer
Causes Control. 2002; 13:199–212. [PubMed: 12020101]
13. Colditz GA, Wolin KY, Gehlert S. Applying what we know to accelerate cancer prevention. Sci.
Transl. Med. 2012; 4 127rv4.
14. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering
implementation of health services research findings into practice: a consolidated framework for
advancing implementation science. Implement. Sci. 2009; 4:50. [PubMed: 19664226]
15. Davies P, Walker AE, Grimshaw JM. A systematic review of the use of theory in the design of
guideline dissemination and implementation strategies and interpretation of the results of rigorous
evaluations. Implement. Sci. 2010; 5:14. [PubMed: 20181130]
16. Demarteau N, Breuer T, Standaert B. Selecting a mix of prevention strategies against cervical
cancer for maximum efficiency with an optimization program. Pharmacoeconomics. 2012;
30:337–353. [PubMed: 22409292]
17. Dobbins M, Cockerill R, Barnsley J, Ciliska D. Factors of the innovation, organization,
environment, and individual that predict the influence five systematic reviews had on public health
decisions. Int. J. Technol. Assess. Health Care. 2001; 17:467–478. [PubMed: 11758291]
18. Edwards BK, Ward E, Kohler BA, Eheman C, Zauber AG, et al. Annual report to the nation on the
status of cancer, 1975–2006, featuring colorectal cancer trends and impact of interventions (risk
factors, screening, and treatment) to reduce future rates. Cancer. 2010; 116:544–573. [PubMed:
19998273]
19. Eheman C, Henley SJ, Ballard-Barbash R, Jacobs EJ, Schymura MJ, et al. Annual report to the
nation on the status of cancer, 1975–2008, featuring cancers associated with excess weight and
lack of sufficient physical activity. Cancer. 2012; 118:2338–2366. [PubMed: 22460733]
20. Emmons KM, Weiner B, Fernandez ME, Tu SP. Systems antecedents for dissemination and
implementation: a review and analysis of measures. Health Educ. Behav. 2012; 39:87–105.
[PubMed: 21724933]
21. Fee E, Krieger N. Understanding AIDS: historical interpretations and the limits of biomedical
individualism. Am. J. Public Health. 1993; 83:1477–1486. [PubMed: 8214245]
22. Gajda R. Utilizing collaboration theory to evaluate strategic alliances. Am. J. Eval. 2004; 25:65–
77.
23. Gehlert S, Colditz GA. Cancer disparities: unmet challenges in the elimination of disparities.
Cancer Epidemiol. Biomarkers Prev. 2011; 20:1809–1814. [PubMed: 21784956]
24. Glasgow RE, Emmons KM. How can we increase translation of research into practice? Types of
evidence needed. Annu. Rev. Public Health. 2007; 28:413–433. [PubMed: 17150029]
25. Glasgow RE, Gaglio B, Bennett G, Jerome GJ, Yeh HC, et al. Applying the PRECIS criteria to
describe three effectiveness trials of weight loss in obese patients with comorbid conditions.
Health Serv. Res. 2011; 47:1051–1067. [PubMed: 22092292]
26. Glasgow RE, Green LW, Ammerman A. A focus on external validity. Eval. Health Prof. 2007;
30:115–117.
27. Glasgow RE, Klesges LM, Dzewaltowski DA, Estabrooks PA, Vogt TM. Evaluating the impact of
health promotion programs: using the RE-AIM framework to form summary measures for decision
making involving complex issues. Health Educ. Res. 2006; 21:688–694. [PubMed: 16945984]
28. Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion
interventions: the RE-AIM framework. Am. J. Public Health. 1999; 89:1322–1327. [PubMed:
10474547]
29. Glob. Fund to Fight AIDS, Tuberc., and Malar. Framework for Operations and Implementation
Research in Health and Disease Control Programs. Geneva: Glob. Fund; 2008. http://
www.theglobalfund.org/en/me/documents/operationalresearch/
30. Graham ID, Logan J, Harrison MB, Straus SE, Tetroe J, et al. Lost in knowledge translation: time
for a map? J Contin. Educ. Health Prof. 2006; 26:13–24. [PubMed: 16557505]
31. Green LW. Making research relevant: If it is an evidence-based practice, where’s the practice-
based evidence? Fam. Pract. 2008; 25(Suppl. 1):i20–i24. [PubMed: 18794201]
32. Green LW, Glasgow RE. Evaluating the relevance, generalization, and applicability of research:
issues in external validation and translation methodology. Eval. Health Prof. 2006; 29:126–153.
[PubMed: 16510882]
33. Green LW, Glasgow RE, Atkins D, Stange K. Making evidence from research more relevant,
useful, and actionable in policy, program planning, and practice: slips “twixt cup and lip.”. Am. J.
Prev. Med. 2009; 37:S187–S191. [PubMed: 19896017]
34. Green LW, Ottoson JM, Garcia C, Hiatt RA. Diffusion theory knowledge dissemination utilization
integration in public health. Annu. Rev. Public Health. 2009; 30:151–174. [PubMed: 19705558]
35. Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service
organizations: systematic review and recommendations. Milbank Q. 2004; 82:581–629. [PubMed:
15595944]
36. Greenhalgh T, Wong G, Westhorp G, Pawson R. Protocol–realist and meta-narrative evidence
synthesis: evolving standards (RAMESES). BMC Med. Res. Methodol. 2011; 11:115. [PubMed:
21843376]
37. Grimshaw, J. A Guide to Knowledge Synthesis. A Knowledge Synthesis Chapter. Ottawa: Can.
Inst. Health Res.; 2012. http://www.cihr-irsc.gc.ca/e/4.html
38. Hannon PA, Fernandez ME, Williams RS, Mullen PD, Escoffery C, et al. Cancer control planners’
perceptions and use of evidence-based programs. J. Public Health Manag. Pract. 2010; 16:E1–E8.
[PubMed: 20357600]
39. Holmes BJ, Finegood DT, Riley BL, Best A. Systems thinking in dissemination and
implementation research. See Ref. 7. 2012:175–191.
40. Jacobs JA, Dodson EA, Baker EA, Deshpande AD, Brownson RC. Barriers to evidence-based
decision making in public health: a national survey of chronic disease practitioners. Public Health
Rep. 2010; 125:736–742. [PubMed: 20873290]
41. Khoury MJ, Gwinn M, Yoon PW, Dowling N, Moore CA, Bradley L. The continuum of
translation research in genomic medicine: How can we accelerate the appropriate integration of
human genome discoveries into health care and disease prevention? Genet. Med. 2007; 9:665–674.
[PubMed: 18073579]
42. Kindig D, Stoddart G. What is population health? Am. J. Public Health. 2003; 93:380–383.
[PubMed: 12604476]
43. Knapp H, Anaya HD, Goetz MB. Attributes of an independently self-sustaining implementation:
nurse-administered HIV rapid testing in VA primary care. Qual. Manag. Health Care. 2010;
19:292–297. [PubMed: 20924249]
44. Kumanyika S, Brownson RC, Cheadle A. The L.E.A.D. framework: using tools from evidence-
based public health to address evidence needs for obesity prevention. Prev. Chronic Dis. 2012;
9:E125. [PubMed: 22789443]
45. Lamb, S.; Greenlick, MR.; McCarty, D. Bridging the Gap Between Practice and Research: Forging
Partnerships with Community-Based Drug and Alcohol Treatment. Washington, DC: Natl. Acad.
Sci.; 1998. p. 288
46. Lobb R, Petermann L, Manafo E, Keen D, Kerner J. Networking and knowledge exchange to
promote the formation of trans-disciplinary coalitions and levels of agreement among trans-
disciplinary peer reviewers. J. Public Health Manag. Pract. 2012 In press.
47. Luke DA, Stamatakis KA. Systems science methods in public health: dynamics, networks, and
agents. Annu. Rev. Public Health. 2012; 33:357–376. [PubMed: 22224885]
48. Moons KG, Kengne AP, Grobbee DE, Royston P, Vergouwe Y, et al. Risk prediction models: II.
External validation, model updating, and impact assessment. Heart. 2012; 98:691–698. [PubMed:
22397946]
49. Natl. Cent. Health Stat.. Health, United States, 2009 With Special Feature on Medical Technology.
Hyattsville, MD: US Dep. Health Hum. Serv.; 2010. http://www.cdc.gov/nchs/data/hus/hus09.pdf
50. Normand S. Meta-analysis: formulating, evaluating, combining and reporting. Stat. Med. 1999;
18:321–359. [PubMed: 10070677]
51. Oberle MW, Baker EL, Magenheim MJ. Healthy People 2000 and community health planning.
Annu. Rev. Public Health. 1994; 15:259–275. [PubMed: 8054085]
52. Pawson R, Greenhalgh T, Harvey G, Walshe K. Realist review—a new method of systematic
review designed for complex policy interventions. J. Health Serv. Res. Policy. 2005; 10(Suppl. 1):
21–34. [PubMed: 16053581]
53. Proctor EK, Landsverk J, Aarons G, Chambers D, Glisson C, Mittman B. Implementation research
in mental health services: an emerging science with conceptual, methodological, and training
challenges. Adm. Policy Ment. Health. 2009; 36:24–34. [PubMed: 19104929]
54. Rabin, BA.; Brownson, RC. Developing the terminology for dissemination and implementation
research. In: Brownson, RC.; Colditz, GA.; Proctor, EK., editors. Dissemination and Implementation
Research in Health: Translating Science to Practice. New York: Oxford Univ. Press; 2012. p. 23–54.
55. Rust G, Satcher D, Fryer GE, Levine RS, Blumenthal DS. Triangulating on success: innovation,
public health, medical care, and cause-specific US mortality rates over a half century (1950–2000).
Am. J. Public Health. 2010; 100(Suppl. 1):S95–S104. [PubMed: 20147695]
56. Schillinger D. An Introduction to Effectiveness, Dissemination and Implementation Research.
Fleisher P, Goldstein E, editors. San Francisco: Community Engagem. Progr., Clin. Transl. Sci.
Inst., Univ. Calif.; 2010. http://ctsi.ucsf.edu/files/CE/edi_introguide.pdf
57. Schroeder SA. We can do better—improving the health of the American people. Shattuck lecture.
N. Engl. J. Med. 2007; 357:1221–1228. [PubMed: 17881753]
58. Stroup D, Berlin J, Morton S, Olkin I, Williamson G, et al. Meta-analysis of observational studies
in epidemiology. A proposal for reporting. JAMA. 2000; 283:2008–2012. [PubMed: 10789670]
59. Susser M. The tribulations of trials–intervention in communities. Am. J. Public Health. 1995;
85:156–158. [PubMed: 7856769]
60. Tabak RG, Khoong EC, Chambers D, Brownson RC. Bridging research and practice: models for
dissemination and implementation research. Am. J. Prev. Med. 2012; 43(3):337–350. [PubMed:
22898128]
61. Thorpe KE, Zwarenstein M, Oxman AD, Treweek S, Furberg CD, et al. A Pragmatic-Explanatory
Continuum Indicator Summary (PRECIS): a tool to help trial designers. J. Clin. Epidemiol. 2009;
62:464–475. [PubMed: 19348971]
62. Vanderpool RC, Gainor SJ, Conn ME, Spencer C, Allen AR, Kennedy S. Adapting and
implementing evidence-based cancer education interventions in rural Appalachia: real world
experiences and challenges. Rural Remote Health. 2011; 11:1807. [PubMed: 21988459]
63. van Dieren S, Beulens JW, Kengne AP, Peelen LM, Rutten GE, et al. Prediction models for the
risk of cardiovascular disease in patients with type 2 diabetes: a systematic review. Heart. 2012;
98:360–369. [PubMed: 22184101]
64. von Elm E, Altman DG, Egger M, Pocock SJ, Gøtzsche PC, Vandenbroucke JP. The Strengthening
the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for
reporting observational studies. Rev. Esp. Salud Publ. 2008; 82:251–259.
65. Warnecke RB, Oh A, Breen N, Gehlert S, Paskett E, et al. Approaching health disparities from a
population perspective: the National Institutes of Health Centers for Population Health and Health
Disparities. Am. J. Public Health. 2008; 98:1608–1615. [PubMed: 18633099]
66. Westfall JM, Mold J, Fagnan L. Practice-based research—“Blue Highways” on the NIH roadmap.
JAMA. 2007; 297:403–406. [PubMed: 17244837]
67. Zwarenstein M, Treweek S, Gagnier JJ, Altman DG, Tunis S, et al. Improving the reporting of
pragmatic trials: an extension of the CONSORT statement. Br. Med. J. 2008; 337:a2390.
[PubMed: 19001484]
WEB PORTALS
The Campbell Collaboration: http://www.campbellcollaboration.org.
The Center for Reviews and Dissemination: http://www.york.ac.uk/inst/crd
The Cochrane Public Health Group: http://ph.cochrane.org
The Community Guide to Preventive Services: http://www.thecommunityguide.org/index.html
The US Preventive Services Task Force Recommendations: http://www.uspreventiveservicestaskforce.org/recommendations.htm
SUMMARY POINTS
1. Measures for and theories of implementation of evidence-based interventions
for health are in the early stages of development.
2. The existing paradigm for how scientists create evidence, prioritize publications,
and synthesize research needs to shift toward greater stakeholder input and
improved reporting on external validity to improve population health.
3. Implementation science can speed translation from discovery to application and
public health benefits.
4. Stakeholder input and partnerships can increase the relevance of research to
practice settings.
5. Practice-based evidence is needed to provide information on external validity
and inform stakeholders’ decisions about the use of evidence-based
interventions.
6. Implementation of evidence-based interventions occurs in target settings that are
part of complex, dynamic systems that respond to feedback loops through
nonlinear adaptation.
FUTURE ISSUES
1. How will tools such as PRECIS lead to more practice-based evidence?
2. How will an emphasis on external validity throughout the discovery-to-practice
continuum better inform practice and policy stakeholders about how to use
evidence-based interventions?
3. In what ways can mathematical modeling be used to accelerate the translation of
research to practice?
4. How can we measure progress with implementation science at a population
level?
5. How can we evaluate progress by including stakeholders’ perspectives
throughout the discovery-to-practice continuum?
6. How can web portals that promote information on EBIs incorporate searchable
information about the persons, settings, and doses of intervention for which the
interventions are or are not effective?
7. How does systems science inform the development of implementation strategies
for EBIs?
Figure 1.
Examples of stakeholders at translational steps in the NIH Roadmap Initiative.
Figure 2.
Factors that influence the spread and sustained use of innovations in health service delivery
and other organizations. From Reference 35 with permission.
Figure 3.
Improving the flow and relevance of research evidence for implementation. Adapted from
Reference 34, with permission from the Annual Review of Public Health, Volume 30 ©
2009 by Annual Reviews, http://www.annualreviews.org.
  • 2. Current applications of implementation science for health continue to develop tools and strategies to improve the success of implementation and to study how programs and interventions are used in practice-based settings. However, the focus of this field has shifted toward studying how to accelerate the translation of research-tested interventions into policy and practice, taking into account the complex, dynamic, adaptive systems in which these interventions can be implemented (6). This most recent evolution of implementation science was driven largely by an acceleration in funding from the National Institutes of Health (NIH) and various stakeholder interests to fill the gap between scientific discoveries and the application of these innovations to improve population health (34). The Institute of Medicine’s (IOM) 1990 report that identified the gap between mental health research and practice as a national public health priority was among the first reviews that quantified the need for a return on the investment that federal dollars made toward research (45). At that time, almost 80% of the ~9.4-million drug-addicted and-dependent individuals in the United States went untreated, and the treatment administered did not keep up with the major scientific advances. In 1997, public funds accounted for 65% of the reported revenues in drug- and alcohol-treatment programs, creating a pressing interest to find better ways to implement innovations in effective treatment strategies. For similar reasons, several years after the IOM report on the gap between mental health research and practice, the NIH highlighted the importance of moving basic research into human studies with the Roadmap Initiative (66). Billions of taxpayer dollars had been spent on scientific discoveries to improve health, but few were being used (53). Whereas the initial purpose of the Roadmap was to accelerate the translation of research benefits to patients (66), more recent iterations have expanded this view to include the translational steps from basic research to use by decision makers and residents (41). Although stakeholders have not been explicitly defined by the Roadmap Initiative, one can imagine the proximal and distal stakeholders at each translational step (Figure 1). The first translational step (T1) seeks to move basic discovery into a candidate health application primarily through efficacy studies performed by clinical and behavioral scientists. In the second translational step (T2), health services and public health scientists conduct effectiveness studies and systematic reviews to assess the value of the health application to a wider population and develop clinical or community guidelines for using research-tested interventions. In the third translational step (T3), implementation scientists use methods similar to those used by scientists in the T2 step to identify strategies that can move clinical guidelines or evidence-based interventions (EBIs) into clinical and public health practice. Implementation scientists also collaborate with stakeholders to conduct T4 research, which seeks to evaluate the real-world use of EBIs and implementation strategies by organizations and individuals. Connecting with stakeholders (e.g., researchers from different disciplines, industry, public health, health care, and community-based groups) to inform the development of research at translational phases T1 and T2 is supported by the Clinical and Translational Science Awards (CTSA). 
Through a consortium of 60 medical institutions in 30 states and the District of Columbia, the CTSA supports clinical and translational investigators to develop skills to engage stakeholders in team science to accelerate the translation of research into clinical practice (11). Stakeholder engagement for T3 and T4 translational research is supported by the NIH Program Announcements on Dissemination and Implementation Research in Health. Because T3 translational research has a dual emphasis of dissemination and implementation, these two types of research are often discussed in the same context. However, the NIH distinguishes between dissemination, “the targeted distribution of information and intervention materials to a specific public health or clinical practice audience,” and implementation, “the use of strategies to introduce or change evidence-based Lobb and Colditz Page 2 Annu Rev Public Health. Author manuscript; available in PMC 2014 January 24. NIH-PAAuthorManuscriptNIH-PAAuthorManuscriptNIH-PAAuthorManuscript
  • 3. health interventions within specific settings” (53). This article provides a review only of implementation science to improve population health as it is currently represented in translational steps T3 and T4 in the Roadmap. The review is intended to reflect the depth and breadth of the topic and is not a complete review of the literature. WHY IS IMPLEMENTATION SCIENCE NEEDED? Population health can be defined as “the health outcomes of a group of individuals, including the distributions of such outcomes within the groups” (42). Health outcomes and the distribution of health outcomes are determined by interacting factors related to biology; health care systems; psychosocial, economic, and physical environments; and policies across the life span (42). A goal of implementation science for health is to identify the factors, processes, and methods that can successfully embed EBIs in policy and practice to achieve population health. Evidence based refers to interventions that have undergone sufficient scientific evaluation to be considered effective (56). Over the past two decades, the United States has experienced a dramatic reduction in mortality for the leading causes of death, heart disease, and cancer. These improvements in health are due largely to corresponding reductions in tobacco use; increases in use of screening tests for breast, cervical, and colorectal cancer (18); and improvements in therapeutics (49). Despite these improvements, the United States continues to be ranked at the bottom for life expectancy at birth out of 33 countries in the Organizations for Economic Co-operation and Development (OECD) (49). Reasons for this ranking are complex. However, key contributors to the relatively poor longevity in the United States compared with other OECD countries include continued high prevalence of tobacco use among adults without a college education (9), inactivity, and the increase in overweight and obesity among the US population (19). These health behaviors and comorbid conditions are major risk factors for heart disease and cancer. Of equal concern to public health in the United States is the substantial inequities in the distributions of these risk factors for heart disease and cancer by social groups defined by education level, poverty level, race, ethnicity, and insurance status. Although health care services account for only about 10% of the population’s health (57), utilization of health care is a critical factor in the low longevity ranking for the United States compared with other OECD countries because many of the other countries provide universal insurance coverage at the na-tional or regional levels, which facilitates access to life-saving screening tests and therapeutics. In addition, an uninsured status is disproportionately high among low- income, limited education, racial and ethnic minority groups who are already vulnerable to health inequities for heart disease, cancer, and associated risk factors and comorbid conditions. These social characteristics and the characteristics of the environments in which socially disadvantaged persons live, work, and play (e.g., built environment, food deserts, workplace policies) contribute to the persistent inequities in health in the United States (65). Although it is not comprehensive, this list of health issues in the United States provides a clear picture of the work that remains to be done to improve the health of the nation. 
Evidence suggests that only about half of the available medical and public health innovations are used in practice (10, 24, 38), yet there are sufficient EBIs to reduce by more than 50% the burden of cancer and chronic and infectious diseases in the United States (12, 13). The following sections provide an overview of the factors that influence use of EBIs and the ways in which implementation science is being used or can be used to accelerate use of EBIs to improve population health and reduce social inequities in health. Lobb and Colditz Page 3 Annu Rev Public Health. Author manuscript; available in PMC 2014 January 24. NIH-PAAuthorManuscriptNIH-PAAuthorManuscriptNIH-PAAuthorManuscript
  • 4. FACTORS THAT INFLUENCE USE OF EVIDENCE-BASED INTERVENTIONS Glasgow & Emmons (24) identified multiple, interacting reasons why adoption and use of EBIs does not routinely occur in practice, including the characteristics of the intervention (e.g., high cost, intensive time demands, not customizable), research design used to test the EBI (e.g., participants or setting not representative of target population or setting, failure to evaluate implementation), the situation of the intended target setting (e.g., schools, workplaces, health care facilities, public health departments), and interactions among these factors (24). These barriers can be particularly problematic when trying to use EBIs to improve health for medically underserved populations and residents in underresourced areas because so few interventions are tested in these situations. A study conducted in rural West Virginia illustrates how these factors influence the use of EBIs in practice settings. Example 1: Barriers to Use of Evidence-Based Interventions in Rural West Virginia Residents in rural areas experience multiple barriers to high-quality health care that residents in urban and suburban areas do not encounter, such as longer distances to reach health services (2) and higher rates of unemployment, uninsured, and poverty (62). In addition, the health of rural racial minority groups is particularly disadvantaged with regard to heart disease, cancer, and diabetes (2). To understand the compatibility of EBIs in rural areas, the Mary Babb Randolph Cancer Center at West Virginia University released a request for proposals for community-based cancer education mini-grants. Eligible organizations belonged to the National Cancer Institute (NCI)-funded Appalachia Community Cancer Network. The call for proposals required that grantee organizations adapt and implement EBIs for cancer control, and the funders offered training to grantees to facilitate these processes (62). Using a multiple-case-study research design, investigators summarized information ascertained from surveys, interviews, and the final programmatic reports at the end of the funding period for the 13 grantee organizations (e.g., coalitions, health clinics, churches, nonprofit community organizations). Results of this study showed that grantees felt the requirement to use an EBI limited their options for interventions to improve cancer control. Many grantees commented that it was time consuming and difficult to adapt the EBIs because the interventions did not take into account older, less educated, or lower- income populations or the geographically isolated setting, or they required large worksites, churches, or staffing that were not available (62). All organizations adapted the EBIs. Some changes were minor, but other changes were substantial in that the core intervention components were condensed or eliminated. This case study poignantly demonstrates how the characteristics of the intervention, research design, and context of the implementation setting affect use of the intervention, even when funding is provided to encourage use of an EBI. In 2004, Greenhalgh et al. (35) published a comprehensive meta-narrative review of nearly 500 articles that described factors associated with the spread and sustained use of EBIs in health service delivery and other organizations that are the primary target settings for EBIs (35). 
Among the key findings of this literature review was that organizations are part of a complex, multidimensional system with multiple interacting factors that influence use of EBIs at multiple levels (Figure 2) (35). Key features of complex systems include constant adaptation to nonlinear change, dynamic interactions between system components, and feedback loops (6, 47). The situations of these complex systems are influenced by individuals (e.g., perceptions of the innovation, adopter characteristics), social processes (e.g., communication, linkages, implementation), characteristics of groups (e.g., system antecedents, readiness to change), and the sociopolitical, economic environment (20). The literature suggests that some factors are more influential than others to support the use of EBIs. For example, rewards for using EBIs, inadequate funding for EBIs, and support Lobb and Colditz Page 4 Annu Rev Public Health. Author manuscript; available in PMC 2014 January 24. NIH-PAAuthorManuscriptNIH-PAAuthorManuscriptNIH-PAAuthorManuscript
  • 5. from state legislators are more influential on adoption of EBIs by public health decision makers than are factors associated with their personal characteristics (e.g., felt need to be an expert to make evidence-based decisions) (40). One study found that public health decision makers with a bachelor’s degree were 5.6 times (95% confidence interval 1.7, 17.9) more likely than those with a public health master’s degree to report that they lacked skill to develop evidence-based programs (40). In addition, an organization’s culture (e.g., the value that an organization places on using an EBI for decision making) (14) is a strong predictor of whether a systematic review influences the use of EBIs (17). Perhaps most relevant to successful implementation is the strategic climate of an organization that motivates and enables employees to embed EBIs in existing practices through policies, procedures, and reward systems (1). Much of what we know about factors that influence use of EBIs in organizations comes from the fields of sociology, psychology, communications, systems science, and organizational behavior (35). These fields have been using implementation science for decades, but its application to questions about public health is relatively new. A rich body of theory on implementation science for health has emerged over the past decade, with the development of more than 50 implementation theories (14, 15, 60). Several theories have similar constructs, and testing of the theories has been limited. No particular theories have been identified as most useful to predict or explain EBI implementation. For many implementation theories, the psychometric properties of construct measures have not been evaluated, the relationships among the constructs have not been assessed, and the predictive validity is unknown. A systematic review conducted by Emmons et al. (20) of peer-reviewed literature highlights the need for additional research on measures for organizational constructs. The review investigated the definitions of constructs and measures of five key characteristics of organizations that are antecedents for implementation science for health (i.e., leadership, vision, managerial relations, climate, absorptive capacity). They found that no measure of a specific construct was used in more than one study, many studies did not report the psychometric properties of the measures, some assessments were based on a single response per organization, and the level of the instrument did not always match the level of analysis (20). IMPROVING THE FIT OF EVIDENCE-BASED INTERVENTIONS IN REAL- WORLD SETTINGS Ideally, we want EBIs to be implemented in policy and practice settings in lieu of interventions that have not undergone sufficient testing (8). However, as we described earlier, characteristics of EBIs, the research design used to test the EBI, and the complexity of the systems in which EBIs are implemented can deter use of these research-tested interventions in practice. Green (34) used the metaphor of a funnel to describe why EBIs generally do not fit well with real-life circumstances (Figure 3). At the start of the funnel is the universe of potential studies defined by funding priorities. These studies become narrowed by transitions through which researchers review grants, researchers determine publications in peer-reviewed journals, researchers synthesize the available evidence on interventions, and researchers (who also are often clinicians) develop guidelines to use EBIs in practice (34). 
At each transition in the funnel, the priorities and decisions have been rarely, if at all, informed by those who will ultimately use the research (34), and evidence with strong internal validity is overemphasized (32). The result of this discovery-to-practice process is that initial research often involves populations, procedures, and settings that bear little relation to the locus of actual implementation. Glasgow and Green have written extensively about the translation of research to routine practice (24, 26–28, 31–34). We Lobb and Colditz Page 5 Annu Rev Public Health. Author manuscript; available in PMC 2014 January 24. NIH-PAAuthorManuscriptNIH-PAAuthorManuscriptNIH-PAAuthorManuscript
  • 6. summarize some of the key points in their discussions and expand on their ideas by depicting how greater stakeholder input and attention to external validity in translational steps T2-T4 would improve the relevance and increase the volume of research that passes through the funnel. Peer Review of Research Proposals Starting with the peer-review stage of research, funders can improve reporting on external validity of research by requiring that reviewers evaluate whether analysis plans include methods for data collection that will enable reporting on measures of external validity (e.g., description of withdrawals and dropouts; description of target population compared with population reached; frequency, duration, and intensity of intervention). Guidelines such as the extension of the CONSORT, RE-Aim, TREND, Jadad scale, and STROBE, which have been used recently to judge reporting on external validity of completed studies, can also be used to guide reviewers’ judgment on the analysis plans for reporting external validity in proposed studies (26, 32, 33, 64, 67). Stakeholder review groups can also be added to the review cycle to inform the relevance of the scientific evidence from a user perspective. Several study designs have strong internal validity and use a broad sample of participants, incorporate aspects of general practice, or have flexible intervention strategies to improve the relevance of research (participatory research projects, pragmatic trials, or comparative effectiveness studies). Green recommends these forms of “action research” to produce practice-based evidence that is “closer to the actual circumstances of practice” (31,p.i23). Combining ratings across multiple stakeholder groups is already used in some areas. However, interdisciplinary reviews on the exact same criteria can result in low agreement in ratings across research, practice, and policy groups compared with agreement among members within these stakeholder groups (46). Adding a separate review for stakeholders to comment only on relevance from their perspective would not undermine the ratings on scientific merit or relevance and would contribute toward funding of research with greater potential for application and scientific discovery. Because scientists from different fields have unique groups of immediate stakeholders (Figure 1), the composition of the stakeholders would change depending on the focus of the grant. Yet, the instructions could be standardized across all proposals. Publication Priorities At the next step in the funnel, journals can develop ways to reduce the leaks in research evidence and improve the relevance of the published manuscripts by requiring reviewers to consider how well authors report on external validity, in addition to the consideration currently given to reporting on a study’s internal validity. As noted above, reviewers can use several guidelines to judge an author’s report on the external validity of completed studies. This suggested change in review procedures for manuscripts is practical, given that numerous leading health journals have agreed that it would be important and feasible to report on external validity (26). 
Publication Priorities

At the next step in the funnel, journals can reduce the leaks in research evidence and improve the relevance of published manuscripts by requiring reviewers to consider how well authors report on external validity, in addition to the consideration currently given to reporting on a study's internal validity. As noted above, reviewers can use several guidelines to judge an author's reporting on the external validity of completed studies. This suggested change in review procedures for manuscripts is practical, given that numerous leading health journals have agreed that it would be important and feasible to report on external validity (26). Green & Glasgow (32) have recommended that journals take several other actions to improve reporting on external validity, such as publishing editorials on external validity, featuring exemplary studies or a special issue that highlights factors related to external validity, expanding online publication of protocols relevant to external validity, engaging other journals and organizations in these efforts, and indexing key words relevant to key dimensions of external validity.

Research Synthesis

Research synthesis can answer a broad range of questions for knowledge users and scientists, using meta-analysis or qualitative methods (e.g., narrative synthesis, grounded theory, thematic analysis) to summarize results. Meta-analysis of randomized controlled trials has dominated the research synthesis literature to address the question, Which interventions work to improve health? The effective interventions identified by meta-analysis are regularly updated on several web portals (The Community Guide to Community Preventive Services, The Center for Reviews and Dissemination, The US Preventive Services Task Force Recommendations, The Cochrane Public Health Group, and The Campbell Collaboration; see sidebar). However, meta-analysis of RCTs does not inform decision makers about for whom the interventions work, in which settings, and under what intensity, dose, and duration of intervention. To answer these more complex questions, other forms of research synthesis can be used.

A broader range of synthesis approaches can complement findings from meta-analysis of RCTs to provide policy and practice stakeholders with additional information on which to base their decisions to use EBIs (37). Systematic review and meta-analysis of observational studies have expanded rapidly over the past 25 years, with publication guidelines (58) and regression methods (5, 50) developed to address the diversity of exposures and outcomes in target settings. Synthesis methods such as scoping studies, meta-narrative reviews, and realist reviews summarize information from a wide range of sources, including the non-peer-reviewed literature and reports from qualitative studies, to assess the applicability of results across diverse populations, settings, and conditions (37). These alternative methods of research synthesis employ rigorous, systematic processes to summarize information (35, 36, 52). In addition, the opinions of policy and practice experts can be incorporated to guide the questions answered by research synthesis and to comment on the gaps in the literature that the synthesis identifies. Inclusion of stakeholder opinions in literature reviews is common practice in Canada (30, 37) and is recommended by UNICEF/UNDP/World Bank and the WHO in the Framework for Operations and Implementation Research in Health and Disease Control Programs (29). Research syntheses that examine a broad range of evidence are particularly relevant for public health interventions that are implemented in complex systems (33). As noted by Brownson, the goal of research synthesis should not only be to identify effective interventions but also to allow "the practitioner to understand the local contextual conditions necessary for successful implementation" (8, p. 183).
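As a concrete illustration of the quantitative end of this synthesis toolkit, the following sketch computes a DerSimonian-Laird random-effects pooled estimate from study-level effect sizes. It is a generic textbook calculation, not code from any of the syntheses cited here, and the example effect sizes are invented.

```python
import numpy as np

def random_effects_pool(effects, variances):
    """DerSimonian-Laird random-effects meta-analysis of study-level effect estimates."""
    effects = np.asarray(effects, dtype=float)
    variances = np.asarray(variances, dtype=float)
    w_fixed = 1.0 / variances
    fixed_mean = np.sum(w_fixed * effects) / np.sum(w_fixed)
    # Cochran's Q and the method-of-moments estimate of between-study variance tau^2
    q = np.sum(w_fixed * (effects - fixed_mean) ** 2)
    c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)
    w_random = 1.0 / (variances + tau2)
    pooled = np.sum(w_random * effects) / np.sum(w_random)
    se = np.sqrt(1.0 / np.sum(w_random))
    return pooled, se, tau2

# Hypothetical log risk ratios and variances from five intervention trials
log_rr = [-0.25, -0.10, -0.40, 0.05, -0.30]
var = [0.04, 0.02, 0.09, 0.03, 0.05]
pooled, se, tau2 = random_effects_pool(log_rr, var)
print(f"pooled log RR = {pooled:.3f} (SE {se:.3f}), tau^2 = {tau2:.3f}")
```

Meta-regression extends the same machinery by modeling effect sizes as a function of study-level covariates (e.g., setting or dose), which is one route to the "for whom and in what settings" questions raised above.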
Guidelines

One of the dominant roles of implementation science is to fill the information gap that occurs along the research-to-practice continuum after research synthesis. EBIs have been identified at this point, but stakeholders lack guidelines for determining how to apply and evaluate the use of EBIs across populations and settings. To date, guideline development has focused largely on clinical practices and only to a limited extent on public health practices. The few guidelines available to assist public health decision makers with implementation of EBIs include health impact assessments and the Locate evidence, Evaluate it, Assemble it, and inform Decisions (L.E.A.D.) framework (44). More guideline development is needed to accelerate the integration of EBIs into public health practice (32). Interventions do sometimes diffuse to policy and practice contexts before there is sufficient evidence to suggest effectiveness (59, 63).
For example, in randomized trials with a control condition, persons, organizations, or communities often view random assignment to an intervention as unfair and may adopt aspects of the intervention even though its efficacy or effectiveness has not been tested (59). Alternative study designs that offer a placebo instead of no treatment or that wait-list the control group can minimize these threats to the internal validity of randomized trials. Another example is described in a review of the literature that identified primary studies on the validation of cardiovascular risk models and the use of these models in guidelines to treat diabetes. The authors found 45 prediction models, several of which were incorporated in clinical guidelines for the management of type 2 diabetes, even though the risk scores had not been externally validated in a diabetes population (63).
Practice and Policy

Green's image of the funnel moves directly from guidelines to practice and policy. Yet use of EBIs by policy and practice stakeholders rarely occurs automatically after research synthesis and guideline development. For example, political will played a large role in slowing progress with tobacco control (59) and in expediting the discovery and implementation of antiviral therapy to treat HIV (21, 55). In addition, improving the relevance of EBIs through participatory research can increase their use by policy and practice stakeholders and expedite the uptake of these innovations. However, even if an EBI has been tested in the target population and setting, some characteristics of the intervention will likely need to be adapted for implementation to be feasible. Adaptation refers to "the degree to which an EBI is changed or modified by a user during adoption and implementation to suit the needs of the setting or to improve the fit to local conditions" (54, p. 30). An important consideration for implementation scientists is the balance between adaptation and fidelity to the intervention's core components, those thought to determine its effectiveness (8).

PARTICIPATORY RESEARCH TO IMPROVE IMPLEMENTATION OF EVIDENCE-BASED INTERVENTIONS

Green (31) emphasizes that scientists should engage in participatory research during the discovery research translational phase (T2) to produce EBIs that are more relevant and actionable for policy and practice stakeholders. Participatory research requires ongoing, bidirectional communication between researchers and stakeholders to develop relationships that will inform research questions as well as practice needs and inform the adaptation of interventions, rather than the development of new innovations that are later discarded because of poor fit (31, 48). The engagement of stakeholders in research can take many forms, and partners can include institutions, organizations, individuals, or researchers from different disciplines. The specific terms used to describe theories of stakeholder engagement differ depending on whether one refers to social theory or organizational theory. However, consistent across theories is the basic tenet that partnership engagement can be thought of as a continuum of involvement (unlinked; exchange of information; working side by side on a common goal; informal sharing of resources; formal sharing of resources based on a written contract or memorandum of understanding) (22, 23). Following are examples of T3 evidence being produced through implementation science along the stakeholder engagement continuum.
Example 2: Using Statistical Modeling to Guide Policy and Practice Decisions

At the lower end of the stakeholder engagement continuum, implementation scientists can use statistical techniques such as Markov modeling and optimization to engage policy, public health, and clinical practice partners in applying EBIs in the real world. For example, Demarteau et al. (16) assessed the optimal mix of screening and vaccination against human papillomavirus (HPV) to protect against cervical cancer. The evaluation used a Markov cohort model to estimate the outcomes of 52 different prevention strategies and an optimization model in which the results of each prevention strategy from the Markov model were entered as input data to evaluate combinations of prevention options. The model was applied in the United Kingdom and Brazil. Findings from this implementation study suggested that, for a zero increase in budget, the number of cases of cervical cancer would be substantially reduced in the United Kingdom (41%) and Brazil (54%) by implementing an optimal combination of HPV vaccination (80% coverage) and screening at prevaccination coverage (65% United Kingdom, 50% Brazil), while extending the screening interval to every 6 years in the United Kingdom and every 5 years in Brazil (16).
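The essential logic of coupling a cohort model to a budget-constrained optimization can be sketched briefly. The following Python example is an illustrative toy, not the Demarteau et al. model: the incidence, efficacy, and cost figures are invented, and only two decision variables (vaccination coverage and screening interval) are searched by brute force under a fixed budget.

```python
import itertools
import numpy as np

def lifetime_cancer_cases(vacc_coverage, screen_interval, cohort=100_000, years=40):
    """Toy cohort model: annual cancer risk reduced by vaccination coverage and by more
    frequent screening (all parameter values are assumptions for illustration)."""
    base_incidence = 0.0008                     # assumed annual incidence without prevention
    vacc_effect = 1 - 0.7 * vacc_coverage       # assumed 70% risk reduction at full coverage
    screen_effect = 1 - 0.5 / screen_interval   # assumed benefit shrinks as interval lengthens
    p_cancer = base_incidence * vacc_effect * screen_effect
    healthy, cases = float(cohort), 0.0
    for _ in range(years):
        new_cases = healthy * p_cancer
        cases += new_cases
        healthy -= new_cases
    return cases

def strategy_cost(vacc_coverage, screen_interval, cohort=100_000):
    """Toy cost model: per-person vaccination cost plus per-round screening cost."""
    return cohort * (vacc_coverage * 300 + (40 / screen_interval) * 50)

# Brute-force search for the combination that minimizes cases within the current budget
budget = strategy_cost(0.0, 3)  # assume current spending: screening every 3 years, no vaccination
options = itertools.product(np.arange(0.0, 0.81, 0.2), [3, 4, 5, 6])
feasible = [(v, s) for v, s in options if strategy_cost(v, s) <= budget]
best = min(feasible, key=lambda vs: lifetime_cancer_cases(*vs))
print("optimal (vaccination coverage, screening interval):", best)
```

The published analysis used a far richer disease model and strategy set, but the same principle applies: the cohort model supplies outcomes per strategy, and the optimization selects the best mix under a resource constraint.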
Example 3: Using Pragmatic Trials to Create Applicable Evidence-Based Interventions

The pragmatic trial falls mid-range on the stakeholder engagement continuum. This type of trial incorporates features such as allowing participating organizations to choose between options of care and determining the effects of an intervention under the usual conditions in which it will be applied. These features are in stark contrast to those of traditional efficacy trials, which study the effects of an intervention under ideal circumstances (61). These trial types are not opposites; rather, they can be considered end points on a pragmatic-explanatory continuum. PRECIS (pragmatic-explanatory continuum indicator summary) is a tool developed to inform the design of intervention trials based on the research questions being addressed (real-world or explanatory) (61). The PRECIS tool consists of ten domains that guide researchers when deciding where their study should fall on the pragmatic-explanatory continuum. The PRECIS criteria have also been applied retrospectively to describe completed trials. Using the PRECIS criteria, nine reviewers independently scored three completed studies that targeted obesity treatment in primary care settings. All three trials were rated in the moderate range on the PRECIS scale, but the ratings varied across PRECIS dimensions, being most pragmatic on the flexibility of the comparison condition and on participant representativeness (25). This type of tool can help researchers improve the fit of their studies with real-world circumstances and provide a richer understanding of the potential for implementation success in usual care settings.
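When PRECIS is applied retrospectively by multiple raters, the bookkeeping reduces to summarizing domain ratings across reviewers and examining where raters disagree. The sketch below is a hypothetical illustration: the domain labels are paraphrased rather than the official PRECIS wording, and the ratings are invented.

```python
import statistics

# Hypothetical ratings (1 = very explanatory, 5 = very pragmatic) from three reviewers
ratings = {
    "participant eligibility criteria": [4, 4, 3],
    "experimental intervention flexibility": [3, 4, 4],
    "comparison condition flexibility": [5, 4, 5],
    "follow-up intensity": [2, 3, 2],
    "primary analysis": [3, 3, 4],
}

for domain, scores in ratings.items():
    # Mean locates the domain on the continuum; the standard deviation flags rater disagreement
    print(f"{domain:40s} mean {statistics.mean(scores):.1f}  spread {statistics.stdev(scores):.2f}")

overall = statistics.mean(statistics.mean(s) for s in ratings.values())
print(f"{'overall pragmatism score':40s} mean {overall:.1f}")
```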
Example 4: Formal Partnerships between Researchers and Practice Experts

At the higher end of the stakeholder engagement continuum, scientists create formal partnerships with the institutions and organizations that will ultimately implement the EBIs. The Quality Enhancement Research Initiative (QUERI) at the US Department of Veterans Affairs is an example of a highly integrated partnership between researchers and practitioners. Implementation studies conducted through QUERI are an exceptional source of practice-based evidence, and findings can often be generalized to other clinical settings. QUERI uses a six-step process to diagnose gaps in performance and to identify and implement EBIs that reduce the disease burden for conditions common among veterans, such as heart disease, diabetes, HIV/hepatitis C, and mental health and substance use disorders. For example, one QUERI study explored the sustainability of a nurse-initiated HIV rapid testing intervention that was implemented via two 90-minute in-service sessions designed to teach pre- and posttest counseling techniques as well as the administration, interpretation, and entry of results into patients' medical records. Using time-series analysis, the authors found that, among registered nurses who participated in the in-service, this low-intensity implementation strategy resulted in a steady rate of HIV rapid testing for the year following the training. In addition, there was an unexpected increase in HIV blood testing among other clinical staff, leading to a 70% site-wide increase in HIV testing (43).

Scientists can use multiple research methods to engage stakeholders at varying levels of integration along the stakeholder engagement continuum to produce evidence that informs how to use EBIs. The level of engagement is more a function of the researcher's choice than of the research method. For example, time-series analysis is often conducted at the lowest levels of stakeholder engagement (i.e., unlinked or exchange of information). However, the example provided here demonstrates use of this method in a highly integrated partnership setting; a minimal sketch of the underlying analysis follows.
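A common way to analyze this kind of monthly count series is segmented (interrupted time-series) regression with a level change and a slope change at the training date. The example below is a generic sketch with simulated data, not the analysis reported in the QUERI study; the variable names and effect sizes are assumptions.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Simulated monthly counts of HIV rapid tests: 12 months before and 12 months after training
months = np.arange(24)
post = (months >= 12).astype(int)                    # indicator for the post-training period
months_since = np.where(post == 1, months - 11, 0)   # slope-change term after the training
true_rate = 20 + 0.2 * months + 15 * post + 0.5 * months_since
tests = rng.poisson(true_rate)

# Segmented regression: baseline trend, immediate level change, and post-training trend change
X = sm.add_constant(np.column_stack([months, post, months_since]))
model = sm.GLM(tests, X, family=sm.families.Poisson()).fit()
print(model.summary())
```

A sustained intervention effect of the kind described above would appear as a positive level-change coefficient that does not decay (i.e., no negative post-training trend) over the follow-up year.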
MOVING FORWARD WITH IMPLEMENTATION SCIENCE TO IMPROVE POPULATION HEALTH

Implementation science has tremendous potential to improve population health by accelerating the use of EBIs in practice. We have outlined several ways in which implementation science is being applied to population health. Following are areas in this field of research that should be emphasized in the future to ensure that the products gained through implementation science are in balance with the benefits to population health.

Stakeholder Engagement

We discussed many opportunities to increase stakeholder engagement throughout the discovery-to-practice funnel. However, to shift the current paradigm for knowledge production toward a more user-engaged process, training programs, incentives, and reward systems for researchers must also change. Participatory engagement skills are not routinely required in public health, doctoral, fellowship, or faculty training programs for health researchers. University reward structures also slow implementation science because scholars lack incentives and rewards for collaboration across scientific disciplines and with policy and practice experts (23). Funding mechanisms further limit opportunities for community-engaged research when time frames for submission are short, review cycles are long, and funding periods limit evaluation of continued EBI use over time (31).

Systems Thinking

The introduction of systems thinking to implementation science has revolutionized old ideas that implementation is a simple linear process or that participatory research in itself will improve the use of EBIs in practice and policy settings (39). The extent to which an EBI will be used, and its benefit sustained over time, depends on multiple, multilevel, dynamic processes that respond to feedback loops through nonlinear adaptation (39). The implication of this transition to systems thinking is that scientists need to move methods of investigation beyond the reductionist approach toward a systems view to accelerate the development of EBIs and implementation strategies. Systems science methods such as agent-based modeling, system dynamics, and network analysis are becoming increasingly valuable to implementation science (47) and to implementation in practice; a small agent-based sketch follows.
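To give the flavor of these methods, the toy agent-based model below simulates adoption of an EBI spreading through a random peer network of clinics, with adoption probability depending on how many neighboring clinics have already adopted. The network structure, parameters, and peer-effect cap are invented for illustration and do not come from any study cited in this review.

```python
import numpy as np

rng = np.random.default_rng(42)
n_clinics, n_months = 200, 36

# Toy peer network among clinics: each pair linked with probability 0.03
links = rng.random((n_clinics, n_clinics)) < 0.03
adjacency = (np.triu(links, 1) | np.triu(links, 1).T).astype(int)

adopted = np.zeros(n_clinics, dtype=bool)
adopted[rng.choice(n_clinics, size=5, replace=False)] = True  # seed sites (e.g., pilot clinics)

base_rate, peer_effect = 0.005, 0.06  # assumed monthly adoption probabilities
share_adopting = []
for _ in range(n_months):
    peers = adjacency @ adopted.astype(int)             # adopting neighbors per clinic
    p_adopt = base_rate + peer_effect * np.minimum(peers, 3)
    adopted = adopted | (rng.random(n_clinics) < p_adopt)
    share_adopting.append(adopted.mean())

print("share of clinics using the EBI at months 6, 12, 24, 36:",
      [round(share_adopting[m - 1], 2) for m in (6, 12, 24, 36)])
```

Even in this minimal form, the simulation exhibits the nonlinear, feedback-driven uptake curves that motivate a systems view of implementation.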
A Pragmatic Approach to Research

A pragmatic approach to research selects the methods best suited to answer the research question instead of relying on a single research method. Randomized trials are the gold standard for discovery research that answers questions about what works, but this study design provides limited information about how to embed EBIs in complex systems that are constantly adapting to change. More prospective, practice-based cost-effectiveness and comparative-effectiveness studies are needed to clarify the resources required to implement EBIs and to identify which interventions or implementation strategies work best in particular settings. In addition, the development of stakeholder-friendly tools that use mathematical modeling to select interventions based on an individual's risk for disease, or to choose implementation strategies based on characteristics of the intervention and target population, will accelerate the use of EBIs in practice and policy settings. The Archimedes Model is one example of how advanced mathematical modeling is being used to advance implementation of EBIs in practice. The US Department of Health and Human Services is using the Archimedes Model to research, analyze, and evaluate the cost-effectiveness of, and implementation strategies for, specific health care interventions quickly and accurately (4).
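The arithmetic at the core of such cost-effectiveness comparisons is the incremental cost-effectiveness ratio (ICER). The sketch below shows the standard calculation with invented numbers for two hypothetical strategies; it is not output from the Archimedes Model or any study cited here.

```python
def icer(cost_new, effect_new, cost_old, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of effect
    (e.g., dollars per quality-adjusted life year, QALY) of the new strategy."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Hypothetical per-patient costs and QALYs for usual care vs. an EBI with implementation support
usual_care = {"cost": 9_500.0, "qalys": 6.10}
ebi_strategy = {"cost": 11_300.0, "qalys": 6.35}

ratio = icer(ebi_strategy["cost"], ebi_strategy["qalys"],
             usual_care["cost"], usual_care["qalys"])
print(f"ICER = ${ratio:,.0f} per QALY gained")  # (11300 - 9500) / (6.35 - 6.10) = $7,200 per QALY
```

Decision makers would compare such a ratio against a willingness-to-pay threshold, alongside the implementation costs and feasibility considerations emphasized throughout this review.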
Reliability and Validity of Measures

Numerous measures of factors related to implementation of EBIs in target settings have emerged over the past decade, but evaluations of the psychometric properties of these measures are limited. Further research is needed to clarify which measures have the strongest properties for particular settings and which require further development to improve reliability and validity. Also needed is clarification on whether individual measures capture unique constructs or overlapping concepts, to refine the terms and definitions of constructs used in theoretical frameworks for implementation.

External Validity

In addition to knowing what works, policy and practice stakeholders want to know, For whom does it work? In what settings does it work? And at what dose, frequency, intensity, and duration does it work? Meta-analysis of RCTs is routinely used to provide stakeholders with information on what works, but the remaining questions that stakeholders ask have not been adequately addressed by research. More research synthesis using meta-regression of RCTs, meta-analysis of observational studies, and meta-narrative reviews is needed to inform stakeholders about whether EBIs have been tested in populations and settings that are relevant to them and whether the characteristics of the interventions and implementation strategies are feasible for them to implement. Many of the EBIs recommended to stakeholders have been developed in studies with highly selected participants, in settings with adequate resources, and delivered by highly trained staff under ideal conditions. As a result, the greatest value of alternative research synthesis at this time is to identify the gaps in research that need to be filled by pragmatic trials, practice-based research networks, and ongoing participatory research. Even with an increase in action research (31), we will not obtain a perfect solution to external validity that applies to every specific setting or population. Therefore, the extended benefit of stakeholder engagement in research is the development of relationships that will inform strategies to guide ongoing adaptation and evaluation of interventions as the needs and circumstances of target settings change over time (31, 48).

Acknowledgments

This work was supported by the National Cancer Institute (CA U54-155496). R.L. is supported by the Foundation for Barnes-Jewish Hospital, St. Louis. G.A.C. is also supported by the Cissy Hornung Clinical Research Professorship, 117489-CRP-03-194-07-CCE, from the American Cancer Society. The authors thank Amy Ostendorf for lending her graphic design talents to this article.

Glossary

Translation: moving scientific discovery along the pathway from bench to bedside to population

Stakeholder: a party with an interest in research products

Population health: health outcomes of a group of individuals, including the distributions of such outcomes within the group

Efficacy: study of intervention performance in a highly selected population in a controlled setting; the emphasis is on internal validity

Effectiveness: study of intervention performance in diverse populations in real-world settings; external validity is balanced with internal validity

Systematic reviews: use systematic and explicit methods to identify, select, and critically appraise relevant research to answer a clearly formulated question
Evidence-based interventions (EBIs): interventions that have undergone sufficient scientific evaluation to be considered effective

Dissemination: targeted distribution of information and intervention materials to a specific public health or clinical practice audience

Implementation: use of strategies to introduce or change evidence-based health interventions within specific settings to improve population health

Research synthesis: the contextualization and integration of findings from individual studies within broader knowledge on the topic, using qualitative and/or quantitative methods

LITERATURE CITED

1. Aarons GA, Horowitz JD, Dlugosz LR, Ehrhart MG. The role of organizational processes in dissemination and implementation research. 2012:128–153. See Ref. 7.
2. Agency for Healthc. Res. Quality (AHRQ). Health Care Disparities in Rural Areas: Selected Findings from the 2004 National Healthcare Disparities Report. AHRQ Publ. No. 05-P022. Rockville, MD: AHRQ, US Dep. Health Hum. Serv.; 2005. http://archive.ahrq.gov/research/ruraldisp/ruraldispar.htm
3. Andersen R, Mullner R. Assessing the health objectives of the nation. Health Aff. 1990; 9:152–162.
4. Archimedes. Press release. San Francisco, Calif.: 2012 May 3. HHS enlists Archimedes Inc. to expand government's use of health care modeling for forecasting quality and cost outcomes. http://archimedesmodel.com/PR-3-May-2012
5. Berkey CS, Hoaglin D, Mosteller F, Colditz GA. A random-effects regression model for meta-analysis. Stat. Med. 1995; 14:395–411. [PubMed: 7746979]
6. Best A. Systems thinking and health promotion. Am. J. Health Promot. 2011; 25:eix–ex. [PubMed: 21361801]
7. Brownson RC, Colditz GA, Proctor EK, eds. Dissemination and Implementation Research in Health: Translating Science to Practice. New York: Oxford Univ. Press; 2012.
8. Brownson RC, Fielding JE, Maylahn CM. Evidence-based public health: a fundamental concept for public health practice. Annu. Rev. Public Health. 2009; 30:175–201. [PubMed: 19296775]
9. Byers T. Two decades of declining cancer mortality: progress with disparity. Annu. Rev. Public Health. 2010; 31:121–132. [PubMed: 20070204]
10. Cilenti D, Brownson RC, Umble K, Erwin PC, Summers R. Information-seeking behaviors and other factors contributing to successful implementation of evidence-based practices in local health departments. J. Public Health Manag. Pract. 2012; 18(6):571–576. [PubMed: 23023282]
11. Clin. Transl. Sci. Awards (CTSA). Progress Report 2009–2011. Bethesda, MD: Natl. Inst. Health; 2011. http://www.palladianpartners.com/NCATS/ctsa_2011/
12. Colditz GA, Samplin-Salgado M, Ryan CT, Dart H, Fisher L, et al. Harvard report on cancer prevention, volume 5: fulfilling the potential for cancer prevention: policy approaches. Cancer Causes Control. 2002; 13:199–212. [PubMed: 12020101]
13. Colditz GA, Wolin KY, Gehlert S. Applying what we know to accelerate cancer prevention. Sci. Transl. Med. 2012; 4:127rv4.
14. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement. Sci. 2009; 4:50. [PubMed: 19664226]
15. Davies P, Walker AE, Grimshaw JM. A systematic review of the use of theory in the design of guideline dissemination and implementation strategies and interpretation of the results of rigorous evaluations. Implement. Sci. 2010; 5:14. [PubMed: 20181130]
16. Demarteau N, Breuer T, Standaert B. Selecting a mix of prevention strategies against cervical cancer for maximum efficiency with an optimization program. Pharmacoeconomics. 2012; 30:337–353. [PubMed: 22409292]
17. Dobbins M, Cockerill R, Barnsley J, Ciliska D. Factors of the innovation, organization, environment, and individual that predict the influence five systematic reviews had on public health decisions. Int. J. Technol. Assess. Health Care. 2001; 17:467–478. [PubMed: 11758291]
18. Edwards BK, Ward E, Kohler BA, Eheman C, Zauber AG, et al. Annual report to the nation on the status of cancer, 1975–2006, featuring colorectal cancer trends and impact of interventions (risk factors, screening, and treatment) to reduce future rates. Cancer. 2010; 116:544–573. [PubMed: 19998273]
19. Eheman C, Henley SJ, Ballard-Barbash R, Jacobs EJ, Schymura MJ, et al. Annual report to the nation on the status of cancer, 1975–2008, featuring cancers associated with excess weight and lack of sufficient physical activity. Cancer. 2012; 118:2338–2366. [PubMed: 22460733]
20. Emmons KM, Weiner B, Fernandez ME, Tu SP. Systems antecedents for dissemination and implementation: a review and analysis of measures. Health Educ. Behav. 2012; 39:87–105. [PubMed: 21724933]
21. Fee E, Krieger N. Understanding AIDS: historical interpretations and the limits of biomedical individualism. Am. J. Public Health. 1993; 83:1477–1486. [PubMed: 8214245]
22. Gajda R. Utilizing collaboration theory to evaluate strategic alliances. Am. J. Eval. 2004; 25:65–77.
23. Gehlert S, Colditz GA. Cancer disparities: unmet challenges in the elimination of disparities. Cancer Epidemiol. Biomarkers Prev. 2011; 20:1809–1814. [PubMed: 21784956]
24. Glasgow RE, Emmons KM. How can we increase translation of research into practice? Types of evidence needed. Annu. Rev. Public Health. 2007; 28:413–433. [PubMed: 17150029]
25. Glasgow RE, Gaglio B, Bennett G, Jerome GJ, Yeh HC, et al. Applying the PRECIS criteria to describe three effectiveness trials of weight loss in obese patients with comorbid conditions. Health Serv. Res. 2011; 47:1051–1067. [PubMed: 22092292]
26. Glasgow RE, Green LW, Ammerman A. A focus on external validity. Eval. Health Prof. 2007; 30:115–117.
27. Glasgow RE, Klesges LM, Dzewaltowski DA, Estabrooks PA, Vogt TM. Evaluating the impact of health promotion programs: using the RE-AIM framework to form summary measures for decision making involving complex issues. Health Educ. Res. 2006; 21:688–694. [PubMed: 16945984]
28. Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am. J. Public Health. 1999; 89:1322–1327. [PubMed: 10474547]
29. Glob. Fund to Fight AIDS, Tuberc. Malar. Framework for Operations and Implementation Research in Health and Disease Control Programs. Geneva: Glob. Fund; 2008. http://www.theglobalfund.org/en/me/documents/operationalresearch/
30. Graham ID, Logan J, Harrison MB, Straus SE, Tetroe J, et al. Lost in knowledge translation: time for a map? J. Contin. Educ. Health Prof. 2006; 26:13–24. [PubMed: 16557505]
31. Green LW. Making research relevant: If it is an evidence-based practice, where's the practice-based evidence? Fam. Pract. 2008; 25(Suppl. 1):i20–i24. [PubMed: 18794201]
32. Green LW, Glasgow RE. Evaluating the relevance, generalization, and applicability of research: issues in external validation and translation methodology. Eval. Health Prof. 2006; 29:126–153. [PubMed: 16510882]
33. Green LW, Glasgow RE, Atkins D, Stange K. Making evidence from research more relevant, useful, and actionable in policy, program planning, and practice: slips "twixt cup and lip." Am. J. Prev. Med. 2009; 37:S187–S191. [PubMed: 19896017]
34. Green LW, Ottoson JM, Garcia C, Hiatt RA. Diffusion theory and knowledge dissemination, utilization, and integration in public health. Annu. Rev. Public Health. 2009; 30:151–174. [PubMed: 19705558]
35. Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q. 2004; 82:581–629. [PubMed: 15595944]
36. Greenhalgh T, Wong G, Westhorp G, Pawson R. Protocol–realist and meta-narrative evidence synthesis: evolving standards (RAMESES). BMC Med. Res. Methodol. 2011; 11:115. [PubMed: 21843376]
37. Grimshaw J. A Guide to Knowledge Synthesis. A Knowledge Synthesis Chapter. Ottawa: Can. Inst. Health Res.; 2012. http://www.cihr-irsc.gc.ca/e/4.html
38. Hannon PA, Fernandez ME, Williams RS, Mullen PD, Escoffery C, et al. Cancer control planners' perceptions and use of evidence-based programs. J. Public Health Manag. Pract. 2010; 16:E1–E8. [PubMed: 20357600]
39. Holmes BJ, Finegood DT, Riley BL, Best A. Systems thinking in dissemination and implementation research. 2012:175–191. See Ref. 7.
40. Jacobs JA, Dodson EA, Baker EA, Deshpande AD, Brownson RC. Barriers to evidence-based decision making in public health: a national survey of chronic disease practitioners. Public Health Rep. 2010; 125:736–742. [PubMed: 20873290]
41. Khoury MJ, Gwinn M, Yoon PW, Dowling N, Moore CA, Bradley L. The continuum of translation research in genomic medicine: How can we accelerate the appropriate integration of human genome discoveries into health care and disease prevention? Genet. Med. 2007; 9:665–674. [PubMed: 18073579]
42. Kindig D, Stoddart G. What is population health? Am. J. Public Health. 2003; 93:380–383. [PubMed: 12604476]
43. Knapp H, Anaya HD, Goetz MB. Attributes of an independently self-sustaining implementation: nurse-administered HIV rapid testing in VA primary care. Qual. Manag. Health Care. 2010; 19:292–297. [PubMed: 20924249]
44. Kumanyika S, Brownson RC, Cheadle A. The L.E.A.D. framework: using tools from evidence-based public health to address evidence needs for obesity prevention. Prev. Chronic Dis. 2012; 9:E125. [PubMed: 22789443]
45. Lamb S, Greenlick MR, McCarty D. Bridging the Gap Between Practice and Research: Forging Partnerships with Community-Based Drug and Alcohol Treatment. Washington, DC: Natl. Acad. Sci.; 1998. p. 288.
46. Lobb R, Petermann L, Manafo E, Keen D, Kerner J. Networking and knowledge exchange to promote the formation of transdisciplinary coalitions and levels of agreement among transdisciplinary peer reviewers. J. Public Health Manag. Pract. 2012. In press.
47. Luke DA, Stamatakis KA. Systems science methods in public health: dynamics, networks, and agents. Annu. Rev. Public Health. 2012; 33:357–376. [PubMed: 22224885]
48. Moons KG, Kengne AP, Grobbee DE, Royston P, Vergouwe Y, et al. Risk prediction models: II. External validation, model updating, and impact assessment. Heart. 2012; 98:691–698. [PubMed: 22397946]
49. Natl. Cent. Health Stat. Health, United States, 2009: With Special Feature on Medical Technology. Hyattsville, MD: US Dep. Health Hum. Serv.; 2010. http://www.cdc.gov/nchs/data/hus/hus09.pdf
50. Normand S. Meta-analysis: formulating, evaluating, combining and reporting. Stat. Med. 1999; 18:321–359. [PubMed: 10070677]
51. Oberle MW, Baker EL, Magenheim MJ. Healthy People 2000 and community health planning. Annu. Rev. Public Health. 1994; 15:259–275. [PubMed: 8054085]
52. Pawson R, Greenhalgh T, Harvey G, Walshe K. Realist review—a new method of systematic review designed for complex policy interventions. J. Health Serv. Res. Policy. 2005; 10(Suppl. 1):21–34. [PubMed: 16053581]
53. Proctor EK, Landsverk J, Aarons G, Chambers D, Glisson C, Mittman B. Implementation research in mental health services: an emerging science with conceptual, methodological, and training challenges. Adm. Policy Ment. Health. 2009; 36:24–34. [PubMed: 19104929]
54. Rabin BA, Brownson RC. Developing the terminology for dissemination and implementation research. In: Brownson RC, Colditz GA, Proctor EK, eds. Dissemination and Implementation Research in Health: Translating Science to Practice. New York: Oxford Univ. Press; 2012. p. 23–54.
55. Rust G, Satcher D, Fryer GE, Levine RS, Blumenthal DS. Triangulating on success: innovation, public health, medical care, and cause-specific US mortality rates over a half century (1950–2000). Am. J. Public Health. 2010; 100(Suppl. 1):S95–S104. [PubMed: 20147695]
56. Schillinger D. An Introduction to Effectiveness, Dissemination and Implementation Research. Fleisher P, Goldstein E, eds. San Francisco: Community Engagem. Progr., Clin. Transl. Sci. Inst., Univ. Calif.; 2010. http://ctsi.ucsf.edu/files/CE/edi_introguide.pdf
57. Schroeder SA. We can do better—improving the health of the American people. Shattuck lecture. N. Engl. J. Med. 2007; 357:1221–1228. [PubMed: 17881753]
58. Stroup D, Berlin J, Morton S, Olkin I, Williamson G, et al. Meta-analysis of observational studies in epidemiology: a proposal for reporting. JAMA. 2000; 283:2008–2012. [PubMed: 10789670]
59. Susser M. The tribulations of trials–intervention in communities. Am. J. Public Health. 1995; 85:156–158. [PubMed: 7856769]
60. Tabak RG, Khoong EC, Chambers D, Brownson RC. Bridging research and practice: models for dissemination and implementation research. Am. J. Prev. Med. 2012; 43(3):337–350. [PubMed: 22898128]
61. Thorpe KE, Zwarenstein M, Oxman AD, Treweek S, Furberg CD, et al. A Pragmatic-Explanatory Continuum Indicator Summary (PRECIS): a tool to help trial designers. J. Clin. Epidemiol. 2009; 62:464–475. [PubMed: 19348971]
62. Vanderpool RC, Gainor SJ, Conn ME, Spencer C, Allen AR, Kennedy S. Adapting and implementing evidence-based cancer education interventions in rural Appalachia: real world experiences and challenges. Rural Remote Health. 2011; 11:1807. [PubMed: 21988459]
63. van Dieren S, Beulens JW, Kengne AP, Peelen LM, Rutten GE, et al. Prediction models for the risk of cardiovascular disease in patients with type 2 diabetes: a systematic review. Heart. 2012; 98:360–369. [PubMed: 22184101]
64. von Elm E, Altman DG, Egger M, Pocock SJ, Gøtzsche PC, Vandenbroucke JP. The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies. Rev. Esp. Salud Publ. 2008; 82:251–259.
65. Warnecke RB, Oh A, Breen N, Gehlert S, Paskett E, et al. Approaching health disparities from a population perspective: the National Institutes of Health Centers for Population Health and Health Disparities. Am. J. Public Health. 2008; 98:1608–1615. [PubMed: 18633099]
66. Westfall JM, Mold J, Fagnan L. Practice-based research—"Blue Highways" on the NIH roadmap. JAMA. 2007; 297:403–406. [PubMed: 17244837]
67. Zwarenstein M, Treweek S, Gagnier JJ, Altman DG, Tunis S, et al. Improving the reporting of pragmatic trials: an extension of the CONSORT statement. Br. Med. J. 2008; 337:a2390. [PubMed: 19001484]
WEB PORTALS

The Campbell Collaboration: http://www.campbellcollaboration.org
The Center for Reviews and Dissemination: http://www.york.ac.uk/inst/crd
The Cochrane Public Health Group: http://ph.cochrane.org
The Community Guide to Preventive Services: http://www.thecommunityguide.org/index.html
The US Preventive Services Task Force Recommendations: http://www.uspreventiveservicestaskforce.org/recommendations.htm
SUMMARY POINTS

1. Measures for and theories of implementation of evidence-based interventions for health are in the early stages of development.
2. The existing paradigm for how scientists create evidence, prioritize publications, and synthesize research needs to shift toward greater stakeholder input and improved reporting on external validity to improve population health.
3. Implementation science can speed translation from discovery to application and public health benefits.
4. Stakeholder input and partnerships can increase the relevance of research to practice settings.
5. Practice-based evidence is needed to provide information on external validity and inform stakeholders' decisions about the use of evidence-based interventions.
6. Implementation of evidence-based interventions occurs in target settings that are part of complex, dynamic systems that respond to feedback loops through nonlinear adaptation.
FUTURE ISSUES

1. How will tools such as PRECIS lead to more practice-based evidence?
2. How will an emphasis on external validity throughout the discovery-to-practice continuum better inform practice and policy stakeholders about how to use evidence-based interventions?
3. In what ways can mathematical modeling be used to accelerate the translation of research to practice?
4. How can we measure progress with implementation science at a population level?
5. How can we evaluate progress by including stakeholders' perspectives throughout the discovery-to-practice continuum?
6. How can web portals that promote information on EBIs incorporate searchable information about the persons, settings, and doses of intervention for which the interventions are or are not effective?
7. How does systems science inform the development of implementation strategies for EBIs?
Figure 1. Examples of stakeholders at translational steps in the NIH Roadmap Initiative.

Figure 2. Factors that influence the spread and sustained use of innovations in health service delivery and other organizations. From Reference 35, with permission.

Figure 3. Improving the flow and relevance of research evidence for implementation. Adapted from Reference 34, with permission from the Annual Review of Public Health, Volume 30 © 2009 by Annual Reviews, http://www.annualreviews.org.