Appraisal of search strategies in industry submissions for technology appraisal (ASSIST): Reviewing search methods of industry submissions to NICE using a structured checklist

Authors
Allen A1, Misso K1, Riemsma R1, Kleijnen J1
1Kleijnen Systematic Reviews Ltd, UK
Abstract
Background: The UK National Institute for Health and Clinical Excellence (NICE) undertakes Single Technology Appraisals (STAs) to produce recommendations on the use of new and existing medicines, products and treatments in the NHS. The STA process is open and transparent, and involves the submission of evidence by the manufacturer or sponsor of the technology. NICE contracts independent Evidence Review Groups (ERGs) to appraise and assess the evidence submission. As part of the STA process, the ERG must appraise and critique the search methods which underpin the systematic review and economic model components of the evidence submission. This project describes how an ERG Information Team tested a standardised, structured and reproducible approach to reviewing search methods in manufacturers' submissions (MS). Although an evidence-based checklist [1] exists for the peer review of individual search strategies, the ERG must appraise complete sets of search strategies as well as the clarity with which search methods are reported. Based on the methods described by McGowan [1], Sampson [2,3] and NICE [4], we developed a tool for the assessment of STA search methods.

Objectives: To create and pilot a search appraisal checklist designed specifically for evaluating industry evidence submissions.

Methods: Data were collected on the types of errors made in the study identification sections of two STA manufacturers' submissions. This evidence informed the creation of the ASSIST checklist, which will be piloted on further STAs for fine-tuning and evaluation, with project completion expected in mid-July 2011. The pilot checklist comprised 33 items in 10 domains.

Results and Conclusions: Preliminary findings from the pilot stage show that implementation of the ASSIST checklist enabled standardised appraisal of search methods. The ERG identified several shortcomings in MS searches, including typographical errors, incorrectly combined line numbers, inappropriate explosion of subject headings and errors in the use of study design filters. The finalised checklist and its evaluation will be presented.

References

1. McGowan J, Sampson M, Lefebvre C. An evidence based checklist for the peer review of electronic search strategies (PRESS EBC). Evidence Based Library and Information Practice 2010;5(1):1-6.

2. Sampson M, McGowan J, Cogo E, Grimshaw J, Moher D, Lefebvre C. An evidence-based practice guideline for the peer review of electronic search strategies. J Clin Epidemiol 2009;62(9):944-52.

3. Sampson M, McGowan J, Lefebvre C, Moher D, Grimshaw J. PRESS: peer review of electronic search strategies. Ottawa: Canadian Agency for Drugs and Technologies in Health (CADTH); 2008. Available from: http://www.cadth.ca/index.php/en/hta/reports-publications/search/publication/781

4. National Institute for Health and Clinical Excellence. Single Technology Appraisal: specification for manufacturer/sponsor submission of evidence. London: NICE, October 2009. 76p.