Agreement in results data between conference abstracts and full reports of randomized controlled trials: should we depend on conference abstracts?

Authors
Saldanha I1, Scherer R1, Dickersin K1
1Cochrane Eyes and Vision Group, USA
Abstract
Background: Including conference abstracts in systematic reviews promotes comprehensiveness, but conference abstracts are not usually peer-reviewed and often contain preliminary data. Conclusions about intervention efficacy in randomized controlled trials (RCTs) are generally based on primary outcomes (POs).
Objective: To quantify agreement between PO results of RCTs presented as conference abstracts and their corresponding full reports.
Methods: We included all abstracts of RCTs presented at the 2001-2004 Association for Research in Vision and Ophthalmology conferences. We identified corresponding full reports published through 2013 by electronic searching and by emailing authors. We extracted data about the PO from each abstract and full report. For each abstract-full report pair in which the PO was the same and the direction of results (positive or neutral) could be determined in both publications, we examined whether the results agreed. We classified any discordance as quantitative (a change in magnitude but not direction of effect), qualitative (a change in direction of effect), or both.
Results: Two hundred thirty (44.8%) of the 513 eligible abstracts had been published in full. Among these 230 abstracts, the direction of results for the PO was positive for 64 (27.8%), neutral for 55 (23.9%), and could not be determined for 119 (51.7%). POs differed in 82/230 pairs (35.7%). POs were more likely to differ when the abstract's results were neutral rather than positive (RR = 1.28, 95% CI 1.15 to 1.40). Among the 103 abstract-full report pairs in which the PO was the same and the direction of results could be determined, results agreed in 20 (19.4%), showed quantitative discordance in 74 (71.8%), and showed both quantitative and qualitative discordance in 9 (8.7%; Figure 1).
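As an aside for readers reproducing comparisons of this kind, a relative risk and its log-scale Wald 95% CI can be computed from a 2x2 table as sketched below. The cell counts here are hypothetical (the abstract does not report the underlying counts behind its RR), and the Wald interval is a standard textbook method, not necessarily the one the authors used.

```python
import math

def relative_risk(a, n1, b, n2):
    """Relative risk of an event in group 1 vs group 2, with a
    log-scale Wald-type 95% CI.
    a: events among n1 subjects in group 1;
    b: events among n2 subjects in group 2."""
    rr = (a / n1) / (b / n2)
    # Standard error of log(RR) for independent binomial samples
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Hypothetical counts (NOT from the abstract): suppose 30 of 55
# neutral-result abstracts and 20 of 64 positive-result abstracts
# had a differing PO in the full report.
rr, lo, hi = relative_risk(30, 55, 20, 64)
print(f"RR = {rr:.2f}, 95% CI {lo:.2f} to {hi:.2f}")
```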
Conclusions: POs in more than one-third of RCTs presented as abstracts differed from POs in the full reports; a difference was more likely when the direction of PO results in the abstract was neutral. When the POs were the same in both publications, results agreed in only one-fifth of pairs, and almost one-tenth of pairs had qualitatively different results. Systematic reviewers should be aware of, and cautious about, these differences.