More commitment to deal with the research–practice gap
How best to put evidence into effective practice to achieve an intended reduction in morbidity, mortality or disability has long been a concern in the field of injury research. Research-to-practice gaps have always existed, and progress in closing them has been slow. Factors that contribute to this problem include lapses in communication between researchers and practitioners, and service delivery issues such as lack of public awareness, poor financing and a non-supportive political atmosphere. Scientific publications on intervention effectiveness that do not provide information useful for widescale public health dissemination also add to the problem.1 Additional issues cited by public health practitioners are that interventions may be too narrowly focused, complex, difficult and costly, or may not engage or meet the perceived needs of the community.2–4 Once established, prevention programs must be sustained with adequate infrastructure and long-term intensity, requiring substantial resource investment.2
The process described in the article by Brussoni et al5 (this issue, p 373) began with the academic team accessing systematic reviews or meta-analyses to synthesize information from existing research and evaluation studies on a specific topic (eg, smoke alarm programs) to determine the effectiveness of strategies.6 The researchers then convened local practitioners, policy makers and other professionals with the goal of planning potential programmatic action to deal with a targeted injury problem for which prevention strategies have proved successful. The process culminated in the production of an “effective action briefing”. We applaud the authors and Injury Prevention for providing a forum to continue these discussions.
By providing summaries of a large number of research or evaluation studies, a well-conducted systematic review can be invaluable to practitioners. In public health, the focus on evidence-based interventions has led to several frameworks with which to assess the rigor of the available research. One of the most widely cited is the “hierarchy of evidence”, which places greater weight on evidence from more rigorous study designs.7,8 However, there is growing recognition that even evidence-based guidelines derived from tightly controlled trials, ideally with random assignment, may not provide a sufficient framework for weighing all of the information needed to design an intervention appropriate for a community.9–12 These methods do not take into consideration the diverse circumstances of public health practice,3,9 and many appraisals of evidence do not distinguish between failure of the intervention concept or theory and failure of implementation.10
Even interventions proved effective can be rendered ineffective at any stage of the process, including the initial concept and planning stage (represented in the article by Brussoni et al5). In addition, the complexities of program design and delivery, including inadequate reach into the target population, unanticipated community obstacles, lack of participant acceptance or compliance, and many of the barriers noted in this paper, may lead to failure.13–16 The emerging discipline of translational research, which focuses on the process of moving evidence-based programs from their development into widespread practice, may provide valuable information about factors associated with successful implementation.17 This approach may generate knowledge to help reduce the theory–practice gap but will “require long-term commitment among researchers, practitioners and policy makers”.18
The paper by Brussoni et al5 recommends practice field meetings to facilitate communication among researchers, public health practitioners, policy makers, managers and other professionals from important sectors. These groups identified strategies, policies, target populations, barriers, facilitators and funding streams for implementing injury prevention programs. The process is derived from a report by Kelly et al19 from the Health Development Agency in London, which describes a highly structured approach to developing preventive interventions. The process used by Brussoni et al included about 400 person-hours at meetings (98 people attending meetings lasting >4 h), plus additional hours in preparation, travel and support for the meetings. Some may question whether it is practical for most programs to invest such resources in the earliest stages of planning; actual implementation of interventions will require even more effort and resources than the planning stage.
Others may question whether this overarching structure allows enough flexibility for interventions to be adapted to the target population. If the agenda is set, and the work done, on the researcher’s terms, useful input from the community may be limited or stifled. If the researchers are asking all the questions, then they define the direction of the project and, ultimately, the knowledge gained. Knowledge has been described as a social phenomenon requiring that stakeholders be engaged throughout information gathering: “for all information’s independence and extent, it is people in their communities, organizations, and institutions who ultimately decide what it all means and why it matters”.20
What is missing from Brussoni et al’s5 report is the most important question about this process: does it work? Given that the participants “unanimously agreed” on challenges associated with limited funding and staffing (ie, infrastructure), will any effective intervention action occur as a result of the plan? We look forward to reading future reports about the effect of the process outlined by the authors on the development, implementation and evaluation of effective community-based interventions.
The barriers identified in their paper are common issues that public health practitioners must consider and overcome in the implementation of any intervention. It is neither feasible nor practical for complex, comprehensive public health interventions to be designed as highly controlled studies; such studies will probably fail when their structured program designs do not anticipate barriers and lack the flexibility to deal with these “intervention issues” during implementation. Public health practitioners must use research-based evidence judiciously when planning community interventions but, to be effective, interventions also require competent program planning and evaluation that consider the needs and expectations of the recipients and the interests of key stakeholders.12
“Collaboration implies an equal partnership” and “trusting in the motives and intelligence of people from different backgrounds”.21 Bridging the gap between research and practice will require what some organizational change experts call “knowledge activation”. This process “is about enabling, not controlling … anyone who wants to be a knowledge activist must give up, at the outset, the idea of controlling knowledge creation”.22 The skill of transforming past evidence into effective community intervention is nearly impossible to learn in a classroom and difficult to teach in an academic setting; competence is best gained through practice and through the development of relationships in the community. This underscores the need for universities to place a higher priority on real community experience in the graduate curriculum to prepare the next generation of researchers and practitioners for the challenges of prevention.
A successful leader of an injury prevention program must be “inclusive rather than exclusive, and work as a partner, not as the expert who knows exactly what to do and how to do it”.21 The “fit” between community intervention practitioners and the targeted community, including the attitudes of the people involved, is an important influence on the success of most programs.15 We have repeatedly found that the enthusiasm, skills and cultural competence of the community injury prevention practitioner have a major effect on the success of a program, both in its embrace by the community and collaborators and in its acceptance by the intended audience. This aspect of a program is difficult to evaluate, or even to describe, but it is nevertheless an important element—“human behavior never is, never was, and never will be a spectator sport”.23
Christoffel and Gallagher21 have described several important issues in a systems approach to developing a comprehensive injury prevention program. Additional conceptual frameworks for injury prevention programs move beyond the formative stage of bringing important community planners and resources together towards an actual framework delineating a systems approach to prevention.24,25 In some ways, the process described by Brussoni et al seems similar to the initial steps outlined in the Safe Community and PRECEDE-PROCEED models.26 Active involvement of the intended audience (not mentioned in the paper by Brussoni et al) and of professionals from other disciplines (eg, fire-rescue personnel, teachers, religious leaders and others) is a necessary part of planning and implementing any effective intervention at a community level.
Few of the published multifaceted, community-based interventions reported external validity measures or described any of the program phases (planning, design or implementation) in sufficient detail to identify key success factors.4,27,28 A recent report in Injury Prevention found that few articles about smoke alarm programs included enough detail about implementation to allow replication of the intervention—that is, “to move from understanding the evidence to using it”.16 This lack of published description of effective processes in injury prevention papers may be a key reason for the difficulties in replicating community-based prevention efforts. Journal editors can make an important contribution here by encouraging a focus on implementation methods in scientific articles that deal with community interventions, either by allowing lengthier descriptions of interventions within articles or at least by providing web-based links to such in-depth descriptions.
Another contributing factor is that most grant funding guidelines do not address formative evaluation or community assessment methods, both of which are crucial to effective implementation. We could question the value of answering “what happened” outcome questions if the “why did this happen” questions are neither asked nor answered.
Finally, we believe that starting the dialog and encouraging a commitment from the injury field to deal with the barriers that have been identified can help close this gap. Doing so will require two-way communication, understanding and appreciation of the complex work conducted by both researchers and practitioners, as well as acknowledgement of their interdependence. Additional commitment from funders, journal editors and educational institutions is also necessary. We urge international and national organizations (eg, in the US, the National Center for Injury Prevention and Control, the State and Territorial Injury Prevention Directors’ Association, the Society for Advancement of Violence and Injury Research and the Association of Schools of Public Health) to further the discussion on ways to meet the complex challenge of bridging the research–practice gap.
RECOMMENDATIONS TO BRIDGE THE RESEARCH–PRACTICE GAP
- Researchers and practitioners should engage the community, including stakeholders, as equal partners in the initiation of community-based interventions. Scientific evidence and community knowledge should be integrated into intervention planning.
- Journal editors should allow a focus on implementation methods in scientific articles that deal with community interventions, either by allowing lengthier descriptions of interventions within articles or by providing web-based links to such in-depth descriptions.
- Negative findings warrant careful exploration to determine whether the research failed to find an effect as a result of program design, implementation or evaluation.
- The injury field should have an equal focus on theory and research, practice, and training, including the following:
  - Universities should make it a priority that the next generation of researchers (ie, graduate students) and practitioners acquire real experience in community-based programs.
  - Existing practitioners should receive quality training and opportunities for skills development to enhance their ability to apply scientific evidence and community knowledge at every stage of intervention development, adaptation, implementation and evaluation.
  - Funding guidelines should support the acquisition of comprehensive knowledge by requiring strong formative and process information and outcome data.
Footnotes
Competing interests: None declared.