Evidence synthesis research (systematic reviews, meta-analyses, and more) offers access to an entire body of literature in one place, which is appealing for educational decision-makers who want a comprehensive look at evidence before making program and policy decisions. However, studies on the use of research in education offer little evidence that educators actually use synthesis findings in their decision-making, despite policymakers saying they want to base decisions on an entire body of evidence rather than single studies. At the HEDCO Institute for Evidence-Based Educational Practice, our work centers on evidence synthesis research, with an intentional focus on making our findings relevant, useful, and accessible to educators. Given this mission, we think a lot about the gap between evidence synthesis production and its use in education, which leads us to ask:
How can we narrow the gap between evidence synthesis research and its practical use to better support educators in their work?
Literature on the use of research evidence (URE) in policy and practice offers some promising strategies that can be applied to evidence synthesis research. For example, studies on URE have clearly shown the importance of relationships between researchers and educational decision-makers for improving URE. Meaningful relationships can span from formal, long-term research-practice partnerships, to informal, time-limited consultations that serve as more of a rapid response to educators’ needs.
For evidence synthesis research, this might include formal co-production of evidence (involving educators in all phases of a review as members of the research team) or limited engagement such as “top-and-tail” (involving educators at the start and end of a review) or one-time involvement. It might also entail brokering knowledge from existing reviews: meeting with educators to learn about their research needs, finding syntheses, assessing their quality, and discussing findings with educators in accessible ways.
In our work, we have implemented two of these approaches: (1) involving educators through a top-and-tail approach in four systematic reviews and (2) creating briefs based on existing evidence synthesis research. For the systematic reviews (two on school-based mental health prevention programs and two on the four-day school week), we gathered interest-holder feedback on what information was most important to them and how to present findings in ways that were useful. Their insights shaped our codebooks and data collection, as well as the content and design of our non-technical reports. Interest-holder feedback was pivotal in improving the potential usefulness and accessibility of our findings for decision-makers, and it echoed other strategies drawn from URE research, such as writing in plain language, minimizing jargon, and including visuals.
For the briefs, we held short, 20- to 30-minute meetings with interest-holders to learn about upcoming decisions in their work where research might be useful. These topics were much narrower and more focused than the broader issues we addressed in our own reviews. For example, one state education leader requested research on best practices for implementing state guidance at the local level, whereas a school psychologist asked for research on open enrollment policies in her state ahead of upcoming votes in the legislature. Our team searched for high-quality, rigorous synthesis research to answer these questions, as we regularly do for broader topics of interest. Notably, given the specificity of these questions, synthesis research was difficult to find and often non-existent. Still, we assembled relevant, useful findings from single studies and grey literature and, again, presented them using best practices for communicating evidence to non-research audiences.
Two approaches used to connect educators with evidence synthesis research
Involving educators in a new systematic review
- Interviews with interest-holders at the start of a review to inform data collection efforts
- Interviews at the end of a review to discuss findings and get feedback on presentation of results
Creating briefs for educators based on existing reviews
- One-on-one meetings with interest-holders to learn about upcoming decisions where research might be useful
- Finding existing high-quality evidence syntheses and creating a tailored, accessible brief of findings
These two approaches connected educators with evidence synthesis research in different ways, and educators’ involvement in our systematic review process clearly benefited our work. What is more difficult to measure is how useful these approaches were for educators themselves and the extent to which they meaningfully changed URE in decision-making. Interest-holders in both groups told us they appreciated the chance to offer feedback and liked seeing research presented in an accessible way, suggesting there is a benefit to involvement. But it is unclear whether “the juice is worth the squeeze” given the time it took to recruit and interview participants and the financial resources needed to compensate them appropriately for their time. Likewise, producing each brief on a short timeframe required two staff members working almost exclusively on it for about a week, which is impractical for every decision; a more sustainable model would be more impactful.
Interest-holders also told us they intended to share findings with colleagues and thanked us for providing them with research they could use. However, it has been incredibly difficult to track how and when findings have been used in practice over time. If findings have shaped conversations or shifted mindsets around these topics (i.e., conceptual use), how can we measure it? And if we can measure it, who decides what constitutes “meaningful” use?
Further, our team found evidence that was both high and low quality across all fields of study we searched for these projects. Communicating this mixed quality to educational decision-makers is critical for offering a fully transparent view of the findings, but it is unclear how practically useful it is or how best to present it. This consideration is particularly important when creating unsystematic summaries of existing research – a common approach that many knowledge brokers use – because brokers may unintentionally downplay (a lack of) study quality when presenting the evidence. We recognize that systematically assessing quality of evidence takes time, though, so is the time saved with a more rapid approach worth the trade-off?
As evidence synthesis research continues to proliferate (one study estimated 80 new systematic reviews published per day in 2019), these are questions the broader educational field must consider to assure educators are getting the best evidence in ways that are useful and accessible to support their work.
Please join Dr. Day at her upcoming Knowledge Cafe to dive deeper into these questions!
