This paper, titled “Evaluation of the JBI scoping reviews methodology by current users,” was authored by Hanan Khalil, Marsha Bennett, Christina Godfrey, Patricia McInerney, Zac Munn, and Micah Peters, representing institutions including La Trobe University, LSU Health, Queen’s University, the University of the Witwatersrand, the University of Adelaide, and the University of South Australia. Published in 2020, this original research assessed the practical experiences of authors using the Joanna Briggs Institute (JBI) scoping review methodology.
Scoping reviews, a newer type of evidence synthesis that has gained prominence since 2005, serve to identify, analyze, and report on existing evidence. Unlike traditional systematic reviews, which typically evaluate the comparative effectiveness of interventions, scoping reviews aim to map a body of literature to understand its characteristics, identify knowledge gaps, clarify concepts, or investigate how research is conducted. They often serve as a precursor to more focused systematic reviews or primary research. While systematic reviews have been in use since the 1970s, scoping reviews are a more recent addition, and their methodological approaches remain varied, posing challenges for authors.
Recognizing this, the JBI Database of Systematic Reviews and Implementation Reports published comprehensive guidance for conducting and reporting scoping reviews in 2014, building upon earlier frameworks. This JBI methodology outlines key steps: focusing on a specific research area, formulating a well-defined research question using the Participants, Concept, Context (PCC) framework, documenting a search strategy, charting the data, and presenting the results. Crucially, critical appraisal and meta-synthesis are not required components of JBI scoping reviews, as their primary goal is to provide an overview rather than a critically synthesized answer. This makes scoping reviews more of a hypothesis-generating exercise, in contrast to the hypothesis-testing nature of systematic reviews. JBI’s approach aligns with the PRISMA-ScR (Preferred Reporting Items for Systematic reviews and Meta-Analyses extension for Scoping Reviews) checklist, which helps standardize conduct and reporting.
The primary objective of this study was to evaluate users’ experiences with the JBI scoping review methodology, gather feedback on its development stages, and pinpoint its strengths and limitations. The authors, who were also the original developers of the JBI methodology, aimed for these findings to inform future refinements and enhance the methodological guidance for researchers, academics, clinicians, and policymakers. The study involved an electronic survey distributed to 51 registered users in the JBI Database of Systematic Reviews and Implementation Reports, with 31 participants completing the survey (a 61% response rate). The majority of participants were researchers (55%) and university employees (77%), with many reporting that their scoping reviews led to further work (87%). Most participants found the guidance adequate (31% strongly agreed, 48% agreed), though limitations were identified in specific areas.
Overall, users value the JBI scoping review methodology for its clarity, rigor, and systematic, step-by-step approach, describing it as a well-documented, detailed, and transparent process developed by experts. However, the study highlighted significant areas for improvement. The most prominent feedback indicated a strong need for more detailed guidance on inclusion criteria and the presentation of results, along with clear examples for each step of the methodology. Participants also noted that the current JBI software could be revised to include templates for scoping reviews, as it does for other systematic review types. Furthermore, 14% of participants felt the guidance for data extraction was inadequate, reflecting challenges identified in other studies concerning the unclear nature and extent of the data to be extracted iteratively. Similarly, 14% disagreed that adequate guidance was provided for writing the scoping review report, underscoring an ongoing lack of consensus on standardized procedures for these flexible reviews. The findings from this evaluation are intended to inform updates to the JBI guidance, ensuring its continued relevance and utility.
Reference for this article:
Khalil, H., Bennett, M., Godfrey, C., McInerney, P., Munn, Z., & Peters, M. (2020). Evaluation of the JBI scoping reviews methodology by current users. International Journal of Evidence-Based Healthcare, 18(1), 95–100.
