Abstract
Background: Previous studies have investigated the transparent and informative reporting of a wide variety of study designs, including systematic reviews.
Objectives: In this study, we went a step further and attempted to replicate the reported results of meta-analyses of diagnostic test accuracy.
Methods: We selected all systematic reviews of diagnostic test accuracy published in January 2018 that contained a meta-analysis. We requested the protocol from the review authors and used this information, together with the information in the published review, to replicate the reported meta-analysis. Replication was considered unsuccessful if the result differed by more than one percentage point from the reported point estimate, if the reported primary study results differed from the actual primary study results, or if the data from the primary studies could not be extracted without first checking the data in the review.
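Purely as an illustration of the tolerance rule described above (and not part of the published review's materials), the one-percentage-point criterion could be expressed as a simple check; the function and variable names below are hypothetical:

```python
def replication_successful(reported_estimate: float, replicated_estimate: float,
                           tolerance: float = 1.0) -> bool:
    """Illustrative sketch of the one-percentage-point rule.

    Both estimates are expressed in percentage points (e.g. a pooled
    sensitivity of 85.3% is passed as 85.3). Replication counts as
    successful only if the replicated point estimate differs from the
    reported one by no more than `tolerance` percentage points.
    """
    return abs(reported_estimate - replicated_estimate) <= tolerance


# Example: a reported pooled sensitivity of 85.3% replicated as 86.6%
# differs by 1.3 percentage points, so replication is unsuccessful.
print(replication_successful(85.3, 86.6))  # False
```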
Results: Of the 51 included reviews, 16 had a protocol registered in PROSPERO, and the authors of five of those responded to our request for the protocol. Four others replied but did not send the protocol. Eighteen reviews (35%) provided all the details needed to replicate the largest meta-analysis in the review without going back to the included studies. For 14 of these (27.5% of all included reviews), the outcome of the meta-analysis could be replicated. When the correctness of the numbers extracted from the primary papers and the availability of the search strategy were also taken into account, only two meta-analyses were fully replicable.
Conclusions: Published meta-analyses of diagnostic test accuracy were poorly replicable. This was partly because of a lack of information about the methods and data used, and partly because of mistakes in data extraction or data reporting.
Patient or healthcare consumer involvement: Patients and healthcare consumers did not participate in this study.