Researching Unconventional Therapies: A Methodological Framework

This introductory piece examines the article “How Should We Research Unconventional Therapies? A Panel Report from the Conference on Complementary and Alternative Medicine Research Methodology, National Institutes of Health” by Vickers et al., published in 1997.

This report originated from a series of conferences sponsored by the U.S. National Institutes of Health’s Office of Alternative Medicine (OAM), which was established in 1991 with a mandate to promote and fund research into unconventional forms of health care. A core challenge for the OAM, and indeed for the broader scientific community, was whether standard research methods, developed within a biomedical framework, were suitable for investigating medical systems often based on significantly different “world views” from conventional medicine.

The panel concluded that research in unconventional medicine necessitates a “mosaic” of evidence, meaning that various questions need to be asked to build a comprehensive understanding. Crucially, the choice of research design should be determined by the specific question being asked, rather than the therapy itself. This approach aims to avoid the classic problems in CAM research by closely aligning the research design with the inquiry.

The article outlines the panel’s deliberations in addressing four main questions:

  • What is a good question?
    • Designing effective research questions requires knowledge and understanding of the subject matter, ideally involving a team of experts in both the therapy and research methodology.
    • Good questions must be answerable and important.
    • Answerability implies that a question is explicit, focused, and practicable.
      • Explicitness means clearly defining terms, as words like “effective” can mean different things to patients and scientists.
      • Focus is crucial to avoid overly complex designs with multiple endpoints, which are a common problem in research. Large, global questions should be broken down into manageable stages, with each stage associated with a single study.
      • Practicability means that the question can be addressed given available resources, patient numbers, and suitable research tools. For example, a question about strengthening a “heart chakra” is currently impractical due to the lack of reliable measurement methods.
    • Importance means the answer should ideally lead to the alleviation or prevention of suffering, be of personal significance to the researcher, and be relevant to a social group. Traditional criteria for priority setting (burden of illness, cost to society, knowledge gap, adequacy of current therapy) apply, but for CAM, the extent of public use should also be considered.
  • How can questions be matched with research designs?
    • This is a two-stage process: first, defining the general type of question to deduce an appropriate methodology (e.g., sociological research for public utilization, clinical trial for attributing effect). Second, matching specific aspects of the question to particular study aspects (e.g., qualitative vs. quantitative for sociological research, type of control group in a clinical trial).
    • A primary focus in CAM research is attributing cause and clinical effect: essentially, “does it work?”
    • The Randomized Controlled Trial (RCT) is conventionally considered the “gold standard” for attributing cause and effect due to its ability to reliably distinguish effective from ineffective therapies by comparing like groups through randomization.
    • While some CAM practitioners criticize the RCT as inappropriate or infeasible, the panel conducted a thought experiment and concluded that the only reasons not to conduct an RCT for attributing cause and clinical effect are practical, not theoretical.
    • The panel’s view on RCTs rests on three assumptions: (1) cause precedes effect; (2) mental beliefs do not influence chance events; and (3) the beliefs of the researcher do not have unmediated physical effects. The third assumption specifically excludes “unmediated” effects, acknowledging that observer effects (like the Hawthorne effect) are mediated and controllable.
    • However, the article emphasizes that numerous other study forms (e.g., case-control studies, cohort studies, case series) can be used when an RCT is infeasible. While useful, these are less rigorous and less reliable than RCTs; past examples show interventions appearing effective in case series but failing in subsequent RCTs.
    • It is vital to understand that the RCT is not the only valid type of research. While it is the method of choice for attributing cause and effect, a variety of different types of questions require different types of evidence to create a “mosaic picture” for making judgments about clinical practice.
  • What is a strategic approach to research?
    • A strategic approach means addressing questions in a logical order. A common mistake is prematurely jumping to placebo-controlled RCTs without sufficient preliminary evidence (e.g., from pilot studies or case series) demonstrating the regimen’s benefit.
    • Common failures in strategic research include:
      • Using untested interventions in definitive trials.
      • Researching mechanisms of nonpharmacological therapy before establishing clinical benefit.
      • Developing outcome measures concurrently with therapy evaluation.
      • Replicating negative research.
      • Failing to balance theoretical and practical questions.
    • Good strategic approaches often involve a sequence of studies, starting with pilot or observational studies and progressing to more rigorous controlled trials and long-term follow-ups, as demonstrated by examples in acupuncture, spinal manipulation, and Chinese herbal preparations.
  • How can inappropriate interpretations of trials be avoided?
    • Inappropriate interpretation can render studies worthless and has been a significant problem for CAM research.
    • Common interpretive errors include:
      • Assuming pharmacological effects in animal/in vitro models imply clinical value.
      • Confusing statistical significance with clinical significance.
      • Making overly general statements about efficacy based on trials of specific techniques or patient groups.
      • Using methodologically flawed research as a basis for knowledge claims.
      • Confusing “no evidence of effect” with “evidence of no effect,” especially when studies are underpowered due to small sample sizes.
      • Citing only research that supports a particular viewpoint instead of evaluating the evidence as a whole.
      • Relying on single studies or using subgroup analyses to generate definitive conclusions instead of hypotheses.
    • To avoid these pitfalls, the panel suggested four steps for appropriate interpretation:
      1. Interpreting the study in terms of the question: Was the design appropriate?
      2. Evaluating the study in terms of other relevant evidence: What other data are available?
      3. Assessing the methodological rigor: Was the research properly executed?
      4. Judging the generalizability of the results: To what situations can the results be applied?
    • Ultimately, fostering a critical attitude toward evidence evaluation is an inherent part of science and should be an integral part of research methodology training.
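The panel’s warning about confusing “no evidence of effect” with “evidence of no effect” can be made concrete with a simple power calculation. The sketch below is an illustration (not from the article): it uses a standard normal approximation for a two-group comparison, and the effect size and sample sizes are hypothetical. It shows how a small trial can easily “miss” a genuinely effective therapy, so a nonsignificant result in such a trial is not evidence of no effect.

```python
import math

def normal_cdf(x: float) -> float:
    """Cumulative distribution function of the standard normal."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def power_two_sample(effect_size: float, n_per_group: int) -> float:
    """Approximate power of a two-sided two-sample z-test at alpha = 0.05,
    for a standardized effect size (Cohen's d) and equal group sizes."""
    z_crit = 1.96  # two-sided critical value for alpha = 0.05
    se = math.sqrt(2 / n_per_group)  # SE of the difference in standardized units
    return normal_cdf(effect_size / se - z_crit)

# A moderate real effect (d = 0.5), as might plausibly exist for a therapy:
small = power_two_sample(0.5, 20)   # ~0.35: a real effect is usually missed
large = power_two_sample(0.5, 100)  # ~0.94: adequately powered
```

With 20 patients per group, a nonsignificant result is the most likely outcome even though the therapy works; only the larger trial can reliably distinguish “no evidence of effect” from “evidence of no effect.”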

The panel’s discussions with CAM practitioners revealed that focusing on a single question at a time and matching it to an appropriate research design made the research process feel less “unfair” and less complex to them. This emphasizes the importance of a clear, question-oriented approach in this field.

Reference: Vickers, A., Cassileth, B., Ernst, E., Fisher, P., Goldman, P., Jonas, W., Kang, S., Lewith, G., Schulz, K., & Silagy, C. (1997). How should we research unconventional therapies? A Panel Report from the Conference on Complementary and Alternative Medicine Research Methodology, National Institutes of Health. International Journal of Technology Assessment in Health Care, 13(1), 111-121.
