Abstract
Background:
Data abstraction during systematic reviews is typically error-prone and resource-intensive. We developed Data Abstraction Assistant (DAA) as a free and open-source tool that facilitates data verification and reproducible abstraction by allowing data abstractors to link abstracted information with its source by dropping 'flags' in study articles.
Objective:
To compare the relative effectiveness of three data abstraction approaches in reducing the time taken and errors made:
A) DAA-facilitated single abstraction plus verification;
B) non-DAA-facilitated single abstraction plus verification; and
C) non-DAA-facilitated independent dual abstraction plus adjudication.
Methods:
We enrolled data abstractors and organized them into pairs based on experience. We randomized each pair to abstract data from six studies (two studies under each of approaches A, B, and C). Across all pairs, data were abstracted from 48 studies drawn from four systematic reviews. We defined an 'error' as either omission or incorrect abstraction of information for a given item on the data abstraction form, compared with the information abstracted by two investigators (TL and IJS). Participants self-recorded the total time spent on data abstraction per study, including initial abstraction and verification/adjudication.
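As a rough illustration of the error definition above (a hypothetical sketch, not the trial's actual scoring procedure; the function and data names are invented), counting omissions and incorrect abstractions against an investigator answer key might look like:

```python
def count_errors(abstracted, answer_key):
    """Count errors per the trial's definition: an item is an error if it
    was omitted or abstracted incorrectly relative to the answer key.

    abstracted: dict mapping item name -> abstracted value (items may be missing)
    answer_key: dict mapping item name -> gold-standard value
    """
    errors = 0
    for item, correct_value in answer_key.items():
        value = abstracted.get(item)
        if value is None or value != correct_value:
            errors += 1  # omission or incorrect abstraction
    return errors

# Hypothetical example: one incorrect item, one omitted item
key = {"sample_size": 120, "mean_age": 54.2, "blinding": "double"}
filled = {"sample_size": 120, "mean_age": 45.2}
print(count_errors(filled, key))  # prints 2
```

The per-study error proportion is then simply this count divided by the number of items on the form.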
Results:
All 52 enrolled abstractors (26 pairs) completed the DAA Trial. The data abstraction forms had a median of 121 data items per study (IQR 102 to 150). Mean error proportions were similar across approaches (18%, 17%, and 17% for approaches A, B, and C, respectively), with no statistically significant differences. Mean time per study was similar for approaches A and B (~90 minutes) but significantly longer for approach C (142 minutes), a difference of 52 minutes (95% CI 33 to 71).
Conclusions:
Error proportions were similar among the three data abstraction approaches, but the time spent on single abstraction plus verification (approaches A and B) was much lower than that for independent dual abstraction plus adjudication (approach C). This time saving, achieved without compromising accuracy, implies that systematic reviewers should reconsider their choice of data abstraction approach, such as avoiding independent dual abstraction. Importantly, by linking abstracted data with their exact source, DAA provides an audit trail that is crucial for reproducible research.