Abstract
Background: There are concerns over the reliability of trial information reported in conference abstracts and whether trials reported only as abstracts should be included in systematic reviews.
Objectives: This study assesses how the information about a trial reported in a conference abstract differs from that in its subsequent full publication.
Methods: Randomized trials reported at the American Society of Clinical Oncology conference (1992) were identified. The Cochrane Central Register of Controlled Trials and PubMed (December 2002) were searched to identify any corresponding full publications. A checklist (based on CONSORT) was used to compare the information in 37 trial abstracts with that in their full publications.
Results: Some aspects of trials were well reported in both the abstract and the full publication: 95% of study objectives, 92% of participant eligibility criteria, 100% of trial interventions and 84% of primary outcomes were the same in both. Other areas were more discrepant. 46% of trials reported the same number of participants randomized in the abstract and full publication; only 22% reported the same number of participants analyzed. The median number analyzed per trial was 96 for abstracts compared with 117 for full publications (p=0.003). This is partly explained by the fact that the abstracts were often preliminary reports of the trial: 82% of trials were closed to follow-up in the full publication compared with 19% in abstracts (p<0.001). Lack of information was a major problem in assessing trial quality: no abstracts reported the method of allocation concealment, 16% reported the method of blinding and 14% reported intention-to-treat analysis. These figures were 49%, 19% and 46% respectively for full publications.
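The comparison of the median number analyzed per trial (96 in abstracts vs. 117 in full publications, p=0.003) is a paired, per-trial comparison. The abstract does not name the test used; the sketch below assumes a Wilcoxon signed-rank test on per-trial pairs, and the counts shown are hypothetical placeholders for illustration only.

```python
# Minimal sketch: paired comparison of the number of participants analyzed
# per trial in the conference abstract vs. the full publication.
# The test choice (Wilcoxon signed-rank) is an assumption; the abstract does
# not state which test produced p = 0.003. Counts are hypothetical.
import numpy as np
from scipy.stats import wilcoxon

abstract_n = np.array([60, 96, 120, 80, 150, 45, 200, 96])    # hypothetical
full_pub_n = np.array([75, 117, 130, 95, 160, 50, 230, 117])  # hypothetical

stat, p_value = wilcoxon(abstract_n, full_pub_n)
print(f"Median analyzed (abstracts): {np.median(abstract_n):.0f}")
print(f"Median analyzed (full publications): {np.median(full_pub_n):.0f}")
print(f"Wilcoxon signed-rank p-value: {p_value:.3f}")
```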
Conclusions: Trial findings in abstracts can be unstable, especially for trials presenting early or preliminary results. Systematic reviewers should contact trialists for more information. If data from abstracts are included in a meta-analysis, a sensitivity analysis should be performed to assess their impact.
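One way to read the sensitivity-analysis recommendation is to pool the trial results with and without the abstract-only trials and compare the two estimates. The sketch below assumes a fixed-effect inverse-variance meta-analysis on log effect estimates; the trial values and the abstract_only flags are hypothetical and purely illustrative.

```python
# Minimal sketch of the recommended sensitivity analysis: pool trial results
# with and without trials available only as conference abstracts.
# Assumes fixed-effect inverse-variance pooling of log effect estimates;
# all numbers and the abstract_only flags are hypothetical placeholders.
import numpy as np

log_effects = np.array([-0.22, -0.10, 0.05, -0.35, -0.18])   # hypothetical log effect estimates
std_errors  = np.array([0.10, 0.12, 0.15, 0.20, 0.11])       # hypothetical standard errors
abstract_only = np.array([False, False, True, True, False])  # trials reported only as abstracts

def pool_fixed_effect(logs, ses):
    """Inverse-variance weighted pooled estimate and its standard error."""
    weights = 1.0 / ses**2
    pooled = np.sum(weights * logs) / np.sum(weights)
    pooled_se = np.sqrt(1.0 / np.sum(weights))
    return pooled, pooled_se

all_est, all_se = pool_fixed_effect(log_effects, std_errors)
pub_est, pub_se = pool_fixed_effect(log_effects[~abstract_only],
                                    std_errors[~abstract_only])

print(f"Pooled estimate, all trials:             {np.exp(all_est):.2f}")
print(f"Pooled estimate, full publications only: {np.exp(pub_est):.2f}")
```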
Acknowledgments: We are grateful to Phil Alderson, Anne Eisinga, Liz MacKinnon, Steve McDonald, Philippa Middleton and Roberta Scherer for their help with this study.