This document presents an evaluation of the Joanna Briggs Institute (JBI) scoping reviews methodology, drawing on the experiences of its current users. Scoping reviews have become an increasingly prominent tool in evidence synthesis since 2005, serving to identify, analyze, and report on evidence across various research areas. Unlike traditional systematic reviews, which typically assess the comparative effectiveness of interventions, scoping reviews are primarily conducted to understand the characteristics of a body of literature, clarify concepts, investigate research conduct, or identify knowledge gaps. They can also help define parameters for more focused systematic reviews or pinpoint areas requiring further primary investigation.
Recognizing the growing use and the disparate methods often employed in scoping reviews, the JBI Database of Systematic Reviews and Implementation Reports published a comprehensive methodology for their conduct and reporting in 2014. This methodology defines a JBI scoping review as one that focuses on a specific research area, employs a well-defined research question (including information on participants, concept of interest, and context), utilizes a well-documented search strategy, and includes data charting and presentation. Notably, critical appraisal and meta-synthesis are not mandatory components of JBI scoping reviews. These reviews have been applied across diverse fields, such as medication safety, nurse practitioner education, disaster management, and indigenous healthcare models.
To aid in the ongoing refinement of the JBI guidance, the original authors of the methodology undertook a study to evaluate users’ experiences. This evaluation aimed to identify strengths and limitations and gather feedback on the various stages of scoping review development. The study involved an electronic survey, administered via Qualtrics, to 51 corresponding authors of published scoping reviews and protocols in the JBI Database of Systematic Reviews and Implementation Reports.
Of the invited users, 31 completed the survey, yielding a 61% response rate. Most respondents identified as researchers (55%), and 77% were employed by universities. The study revealed that completing a scoping review is a significant undertaking: 42% of participants reported that it took between 6 and 12 months, and 32% spent over a year. Importantly, 87% of participants indicated that their scoping reviews led to further work, such as developing systematic reviews, forming the basis for grant applications, contributing to doctoral studies, or informing future research projects.
Key findings regarding the JBI methodology highlighted both its value and areas for improvement:
- Strengths: Participants generally valued the methodology for its clarity, systematic approach, transparent nature, detailed guidance, and the expertise behind its development. Many praised the well-described, step-by-step approach within the JBI manual.
- Limitations and areas for improvement:
  - A significant and recurring request was for more examples within each section of the methodology, particularly concerning inclusion criteria and the presentation of results.
  - Participants sought greater guidance on formulating inclusion criteria and clearer direction on how to present results.
  - 14% of participants found the guidance for data extraction inadequate, citing uncertainty about the nature and extent of data to extract and the need for extraction forms that can be refined iteratively.
  - A similar proportion (14%) reported inadequate guidance for writing up the scoping review report, reflecting the broader challenge of standardizing such reports given the flexible nature of scoping reviews.
  - Some users expressed a desire for the JBI software to include templates specific to scoping reviews, similar to those available for other systematic review types.
In conclusion, while the JBI scoping review methodology is highly valued by its users, the evaluation underscores a clear demand for additional detailed guidance, especially concerning inclusion criteria and the presentation of results, along with the provision of clear examples for each methodological step. The findings from this study are intended to inform future updates and enhancements to the JBI scoping review methodological chapter, aiming to further clarify and improve this essential evidence synthesis tool.
Reference:
Khalil, H., Bennett, M., Godfrey, C., McInerney, P., Munn, Z., & Peters, M. (2020). Evaluation of the JBI scoping reviews methodology by current users. International Journal of Evidence-Based Healthcare, 18(1), 95–100. https://doi.org/10.1097/XEB.0000000000000202
