Open label extension studies and patient selection biases

J Eval Clin Pract. 2008 Feb;14(1):141-4. doi: 10.1111/j.1365-2753.2007.00821.x.

Abstract

Rationale: Long-term observational studies are essential in assessing the long-term safety and tolerability of interventions for chronic diseases, as well as providing information on efficacy outside the clinical trial setting. Concerns have recently been raised over the scientific validity of open label extension studies that follow randomized controlled trials: patients experiencing adverse events are withdrawn before the follow-on period of the study, and those experiencing milder side-effects are less likely to opt to continue into the open label extension.

Methods: The usual method of analysis of the open label extension study, which ignores any patients not continuing into the follow-on period of the study, is outlined. It is shown that ignoring patients who exit the trial is equivalent to assuming the outcome data are missing completely at random. Where this assumption is not met, treatment effect estimates will be biased. An alternative method of analysis is proposed, which does not rely on the often unjustifiable assumption of outcomes being missing completely at random.
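To illustrate why the completers-only analysis embeds the missing-completely-at-random assumption, the following minimal simulation (not taken from the paper; all probabilities are hypothetical) makes dropout depend on outcome and shows how the estimated responder rate is inflated when only continuing patients are analysed.

```python
import random

random.seed(1)

# Simulate 10,000 patients with a true responder probability of 0.30.
# Dropout before the open label extension depends on outcome
# (non-responders are far more likely to leave), so the data are
# NOT missing completely at random.
patients = []
for _ in range(10_000):
    responder = random.random() < 0.30
    drop_prob = 0.20 if responder else 0.60   # outcome-dependent dropout
    continued = random.random() > drop_prob
    patients.append((responder, continued))

completer_outcomes = [r for r, c in patients if c]
completers_only_rate = sum(completer_outcomes) / len(completer_outcomes)  # the "usual" analysis
true_rate = sum(r for r, _ in patients) / len(patients)

print(f"true responder rate:                     {true_rate:.0%}")            # ~30%
print(f"completers-only estimate (assumes MCAR): {completers_only_rate:.0%}")  # inflated, ~46%
```

Under these assumptions the completers-only estimate overstates the responder rate by roughly half, purely because dropout is related to outcome.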

Results: In an example open label extension study with a reported responder rate of 43%, we show how an analysis allowing for patient selection biases produces a responder rate of just 28%.

Conclusions: The method of analysis proposed here minimizes the effect of patient selection biases. Reporting standards for future open label extension studies are recommended to minimize such biases. For studies that have not reported results in detail, we suggest a worst-case sensitivity analysis as a minimum treatment effect estimator.
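As a sketch of the kind of worst-case sensitivity analysis suggested above, every patient who does not continue into the extension is counted as a non-responder. The counts below are hypothetical, chosen only to be consistent with the 43% and 28% figures reported in the abstract.

```python
def responder_rates(n_enrolled, n_continuing, n_responders):
    """Compare the usual open-label-extension analysis with a
    worst-case sensitivity analysis.

    completers-only: responders / patients continuing into the extension
                     (valid only if dropout is missing completely at random)
    worst-case:      responders / all patients enrolled, i.e. every patient
                     who did not continue is counted as a non-responder
    """
    completers_only = n_responders / n_continuing
    worst_case = n_responders / n_enrolled
    return completers_only, worst_case


# Hypothetical counts: 65 of 100 enrolled patients continue into the
# open label extension, of whom 28 respond.
naive, conservative = responder_rates(n_enrolled=100, n_continuing=65, n_responders=28)
print(f"completers-only responder rate: {naive:.0%}")        # ~43%
print(f"worst-case responder rate:      {conservative:.0%}")  # 28%
```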

Publication types

  • Meta-Analysis
  • Research Support, Non-U.S. Gov't

MeSH terms

  • Defibrillators / statistics & numerical data*
  • Ethics, Clinical
  • Ethics, Research
  • Humans
  • Patient Selection*
  • Randomized Controlled Trials as Topic / methods*
  • Research Design*
  • Selection Bias