Abstract
Background:
Research on the databases that should be searched to identify economic evaluations (EEs) largely predates the closure of two databases that indexed EEs: NHS EED and HEED. These closures have implications for search methodology when identifying EEs for systematic reviews (SRs).
Objectives:
To assess which databases are now the best sources of EEs and to identify the most efficient combination of databases for an SR of EEs. We also investigated the quality of the MEDLINE search strategies used in the source SRs, since record retrieval depends on search sensitivity as well as on the selection of appropriate databases.
Methods:
We sourced a quasi-gold standard (QGS) set of EEs from SRs of EEs undertaken to inform health technology assessments (HTAs). We calculated the yield for nine databases and for combinations of those databases. We assessed the number and characteristics of references not found in any of the nine databases. Where possible, we re-ran the MEDLINE search strategy reported in each source SR and calculated its sensitivity and precision.
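For context, the performance measures named above are not defined in the abstract itself; the following is a sketch of the conventional definitions used in studies of search performance, which we assume apply here:

\[
\text{yield} = \frac{\text{QGS references found in the database(s)}}{\text{all QGS references}}, \qquad
\text{sensitivity} = \frac{\text{relevant records retrieved by the strategy}}{\text{relevant records available in the database}}, \qquad
\text{precision} = \frac{\text{relevant records retrieved}}{\text{all records retrieved}}
\]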
Results:
Across all databases, records for 337/351 QGS references could be found (yield 96%). Embase yielded the most references (314). The smallest combination of databases that found records for all 337 references comprised four databases: Embase + HTA Database + MEDLINE/PubMed + Scopus. Records for 14/351 references (4%) could not be found in any database tested; these references comprised non-English-language studies, conference abstracts, and non-journal reports and HTAs. Of the 46 source SRs, 29 reported a MEDLINE strategy that could be reproduced. Ten of the 29 strategies (34.5%) missed at least one included record that could be found in MEDLINE. Mean sensitivity was 0.89 and mean precision was 0.016.
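As a check, the headline proportions are mutually consistent: the 14 references not found in any database are the complement of the 337 found among the 351 QGS references:

\[
\frac{337}{351} \approx 0.96, \qquad 351 - 337 = 14, \qquad \frac{14}{351} \approx 0.04
\]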
Conclusions:
Searching beyond key databases for published EEs may be inefficient, as long as these resources are searched using appropriate strategies. Searchers should concentrate on refining search strategies in key databases to ensure satisfactory sensitivity and precision, and additionally consider approaches to identify grey literature.
Patient or healthcare consumer involvement:
No patients were directly involved in the research. However, the results are relevant to patients because they can improve the quality and efficiency of the SRs of EEs on which healthcare decisions are based.