Abstract
Background: A systematic review may evaluate different aspects of a healthcare intervention. To accommodate the evaluation of various research questions, the inclusion of more than one study design may be necessary. Researchers hold differing opinions about the role of non-randomised studies in systematic reviews.
Objectives: The aim of this study was to evaluate methods studies that have assessed whether reported effects differ by study types.
Methods: We searched PubMed, the Cochrane Database of Systematic Reviews, and the Cochrane Methodology Register. We identified 49 relevant studies, which included 31 systematic reviews and 18 trials.
Results: Thirty-nine of these 49 studies compared the same or a similar intervention across both study designs, while 10 studies included different interventions in their analyses. When the studies compared effect sizes between randomized and non-randomized controlled trials, the effect sizes were statistically different in 35% of cases and not different in 53%. Twelve per cent of the studies reported both differing and non-differing effect sizes.
Conclusions: Different study designs addressing the same question yielded varying results, with differences in about half of all examples. The risk of presenting uncertain results, where the direction and magnitude of the effect cannot be established with confidence, applies to both non-randomized and randomized controlled trials.