Moving from traditional to ‘streamlined’ systematic reviews in the Drug Effectiveness Review Project: assessment of impact on report usability

Authors
Holzhammer B1, Peterson K2, Selph S1, Holmes R2, McDonagh M1
1Pacific Northwest Evidence-Based Practice Center, USA
2Pacific Northwest Evidence-Based Practice Center, USA
Abstract
Background: The Drug Effectiveness Review Project (DERP) is a collaboration of 12 USA state Medicaid agencies that commission comparative effectiveness reviews of drug classes from the Pacific Northwest Evidence-based Practice Center (PNW-EPC) to inform decisionmaking. To reduce the size, timeline and cost of reviews, in 2012 the PNW-EPC implemented streamlining measures and two new abbreviated products: single drug reviews and reviews of reviews.
Objectives: To assess the impact of streamlining on report readability, scope and process, end-user satisfaction and usability in the decisionmaking processes of DERP collaborating agencies.
Methods: We developed a structured questionnaire assessing report usability (including readability, scope, and process domains), using Likert-scale and open-ended items to obtain feedback on the streamlining process overall and on two specific reports. We are soliciting responses from participating organization representatives and drug review committee members. Descriptive statistics and narrative methods are used to summarize responses.
Results: To date, nine representatives have completed questionnaires. The majority of respondents agreed that the shorter length of DERP streamlined reports made them more readable and comprehensible (78%), easier to use for decisionmaking (78%) and timelier (89%). Forty-four percent of respondents stated that excluding indirect evidence made streamlined reports more targeted to the needs of their program. However, 22% believed that excluding such studies removed important evidence or limited the usability of these reports, indicating that indirect evidence remains important when direct evidence is limited. Overall, 89% of respondents reported being satisfied or very satisfied with the streamlined DERP reports they have used in their programs.
Conclusions: This interim analysis found that streamlining the comparative drug effectiveness review process improved the timeliness, readability and comprehensibility of reports and did not reduce usefulness in programmatic decisionmaking. Responses were mixed on the impact of narrowing the scope of reports by excluding indirect evidence.