A methodological review of time series analysis designs in guideline implementation studies.

Tags: Oral
Matowe L, Thomas R, Ramsay C, Grimshaw J

Background: Time series analysis designs have frequently been used to evaluate quality assurance strategies; they attempt to assess whether an intervention has had an effect significantly greater than the underlying trend (1). They are more robust than simple (uncontrolled) before-and-after studies and are relatively cheap (especially as advances in information systems permit the collection and storage of a large amount of observational clinical and administrative data). This study aimed to review critically the use of time series analysis designs in evaluations of the effectiveness of different guideline dissemination and implementation strategies.

Methods: A methodological review of time series analysis studies identified during a systematic review of guideline implementation studies was undertaken. Two independent reviewers extracted data about the quality of the time series designs, the statistical methods used, and the results reported by the authors. Where possible, we also re-analysed studies that had not used appropriate statistical methods, using either autoregressive integrated moving average (ARIMA) or regression models.
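The abstract does not report the exact model specifications used in the re-analyses. As one common regression approach for interrupted time series data, a segmented regression estimates a change in level and a change in slope at the point the intervention is introduced. The sketch below is illustrative only; the data, variable names, and parameter values are invented, not drawn from the reviewed studies.

```python
import numpy as np

# Hypothetical monthly prescribing rates: 12 observations before and
# 12 after a guideline is introduced at month 12 (all values simulated).
rng = np.random.default_rng(0)
t = np.arange(24, dtype=float)
intervention = (t >= 12).astype(float)      # 0 before, 1 after the guideline
time_after = np.where(t >= 12, t - 12, 0.0)  # months elapsed since the guideline

# True (simulated) effects: level drop of -8, slope change of -0.3.
y = 50 + 0.5 * t - 8 * intervention - 0.3 * time_after + rng.normal(0, 1, 24)

# Segmented regression: fit pre-intervention level and trend, plus the
# change in level and change in trend at the break point.
X = np.column_stack([np.ones_like(t), t, intervention, time_after])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
baseline_level, baseline_slope, level_change, slope_change = beta
print(f"estimated level change: {level_change:.2f}")
print(f"estimated slope change: {slope_change:.2f}")
```

With adequate numbers of data points on each side of the break, the estimated level and slope changes recover the simulated effects; with very short series, such estimates become unstable, which is the design weakness the review highlights.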

Results: 36 evaluations of guideline implementation strategies using time series designs were identified. Common methodological flaws were present in many studies, most notably too few data points to allow robust statistical estimation of the effects of the intervention. Only 11 (31%) of the reviewed studies analysed their data correctly. Re-analysis of the remaining studies using appropriate analysis methods frequently overturned the statistical significance of the reported results.
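One reason ordinary regression or simple before/after tests can be inappropriate for time series data is serial correlation in the residuals, which inflates apparent statistical significance. A simple diagnostic (illustrative here, not necessarily the method used in the review) is the Durbin-Watson statistic; the residuals below are simulated AR(1) noise, not data from the reviewed studies.

```python
import numpy as np

# Simulate residuals with first-order autocorrelation (rho = 0.7),
# as might remain after fitting an ordinary regression to a time series.
rng = np.random.default_rng(1)
e = np.zeros(100)
for i in range(1, 100):
    e[i] = 0.7 * e[i - 1] + rng.normal()

# Durbin-Watson statistic: values near 2 suggest no autocorrelation;
# values well below 2 suggest positive autocorrelation.
dw = np.sum(np.diff(e) ** 2) / np.sum(e ** 2)
print(f"Durbin-Watson statistic: {dw:.2f}")  # well below 2 for this series
```

When such autocorrelation is present, methods that model it explicitly (for example, ARIMA models) give more trustworthy significance tests than ordinary least squares.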

Conclusions: Time series designs are potentially useful for evaluating quality assurance strategies such as guideline implementation methods. However, existing evaluations have methodological flaws and have often been incorrectly analysed. As a result, researchers may have drawn misleading conclusions about the effectiveness of clinical interventions. Future evaluations using time series designs should be better designed and should use appropriate analytical methods.

Reference
1. Grimshaw JM, Campbell MK, Eccles MP, Steen (2000). Experimental and quasi-experimental designs for evaluating guideline implementation strategies. Family Practice 17, S11-S18.