What information resources are searched to prepare systematic reviews of economic evaluations?

Authors
Wood H¹, Arber M¹, Glanville J¹
¹York Health Economics Consortium, University of York, United Kingdom
Abstract
Background: Healthcare decision-makers require timely assessment of cost-effectiveness evidence through systematic reviews (SRs). Efficient and effective search methodology is an essential component of SR production. Although the number of SRs of economic evaluations has increased, the quality of the search methodology used in recent reviews has not been widely investigated.
Objectives: This study sought to identify which information resources were used to identify studies for recent SRs of economic evaluations, and to investigate whether the choice of resources reflected current recommendations for the conduct of such reviews.
Methods: SRs of economic evaluations published since January 2013 were identified from MEDLINE. Two reviewers extracted the following information from those SRs that met the inclusion criteria: the general medical databases, specialist economic databases, and Health Technology Assessment (HTA) sources searched, and any supplementary search techniques used. Results were compared against the information resources recommended by NICE (National Institute for Health and Care Excellence) when searching for economic evidence for single technology appraisals (STAs), and against the summary of current best evidence-based practice provided in SuRe Info (http://vortal.htai.org/?q=node/336).
Results: A total of 65 SRs met the inclusion criteria; data were extracted from 42/65 reviews. Five reviews (12%) met or exceeded the search resources recommended by NICE. Nine reviews (21%) searched at least four of the six types of resource recommended by SuRe Info (specialist economic databases, general databases, HTA databases, webpages of HTA agencies, grey literature, collections of utility studies). None of the reviews searched all six. Although all reviews explicitly described the resources searched, the reporting frequently contained errors or ambiguity in database and interface names.
Conclusions: The information resources used to identify evidence for the majority of recently published SRs of economic evaluations do not reflect current recommendations.