ACcurate COnsensus Reporting Document (ACCORD) checklist: a reporting guideline for consensus methods

Authors
Tovey D1, Harrison N2, Logullo P3, van Zuuren E4, Price A5, Winchester C6, Hughes E7, Blazey P8, Goldman K9, Hungin A10, Gattrell W11
1Journal of Clinical Epidemiology, London, UK
2OPEN Health Communications, London, UK
3University of Oxford, Oxford, UK; EQUATOR Network UK Centre, Oxford, UK
4Leiden University Medical Centre, Leiden, The Netherlands
5Dartmouth Institute for Health Policy & Clinical Practice (TDI), Geisel School of Medicine, Dartmouth College, Hanover, New Hampshire, United States
6Oxford PharmaGenesis, Tubney, UK
7Camino Communications, UK
8Department of Medicine, University of British Columbia, Vancouver, British Columbia, Canada
9AbbVie, North Chicago, Illinois, United States
10Faculty of Medical Sciences, Newcastle University, Newcastle upon Tyne, UK
11Bristol Myers Squibb, Uxbridge, UK
Abstract
Background: Consensus methods underpin the gathering of expert knowledge and are widely used in the development of clinical guidelines, policy recommendations and reporting guidelines. Unfortunately, inadequate or incomplete reporting of consensus methods is common.

Objective: To develop a reporting guideline relevant to all types of consensus methods in biomedical research and clinical medicine.

Methods: Development of the ACCORD reporting guideline followed the process set out by the EQUATOR Network and was supported by the International Society for Medical Publication Professionals (ISMPP). A Steering Committee developed the study protocol, conducted a systematic review to identify potential reporting items, and then oversaw a modified Delphi survey to refine the checklist items. Here we report the results of the Delphi survey and present the final checklist. We also summarise the results of a pilot study (Gattrell et al, ISMPP EU 2024, abstract 18) in which 15 volunteers completed a survey rating the ease of understanding of each of the 35 checklist items, and the overall complexity of the checklist, on a 5-point Likert scale (1=difficult, 5=easy).

Results: The Delphi panel (n=72) included representatives from six continents with a broad range of experience, spanning clinical, research, policy and patient perspectives. The preliminary checklist of 56 items was refined over three rounds of voting, completed by 58, 54 and 51 panellists, respectively. One item, on patient and public involvement, did not achieve stable consensus during voting; the Steering Committee reinstated this item after considering panellist comments. The final checklist comprised 35 items, relating to the article title (n=1), introduction (n=3), methods (n=21), results (n=5), discussion (n=2) and other information (n=3). In the pilot study, the median understandability score across items was 5.0 on the 1-5 scale. No item scored <4.0, and over two-thirds (24/35, 68.6%) scored ≥4.5.

Conclusions: ACCORD is the first reporting guideline applicable to all consensus-based studies, and its items were well understood by pilot volunteers. ACCORD helps authors provide complete and transparent reporting while giving readers clarity about the methods used to reach agreement, ultimately enhancing confidence in consensus panels and generating trust in their recommendations.