Abstract
Background: Like medical interventions, psychosocial interventions are not always implemented as intended. Previous authors have identified five domains of 'implementation fidelity': 1-Adherence to protocol; 2-Programme differentiation; 3-Take-up; 4-Quality of delivery; and 5-Participant responsiveness. Many have suggested that information about implementation is often omitted from published trial reports. This work offers a preliminary index for analysing intervention implementation in studies included in systematic reviews.
Objective: To develop a tool for reviewers to measure implementation fidelity of intervention programmes used in controlled trials.
Methods: The domains were selected after analysis of the existing research literature. At this exploratory stage, we chose not to impose a strict scoring method for the fidelity domains. Each domain is rated according to the presence and quality of fidelity information (high, medium, low). The instrument is being developed to provide a quantitative measure for use in future reviews. We report on preliminary testing of the instrument's application in a systematic review of multi-systemic therapy (MST), an intervention for delinquent youth whose developers have paid close attention to fidelity issues.
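To make the rating scheme concrete, the following is a minimal, purely illustrative sketch in Python of how a reviewer's per-trial ratings might be recorded. It assumes the three-level rating described above plus an additional 'not reported' category for domains on which a trial gives no information; the class, field, and category names are our own illustrative choices, not taken from the instrument itself.

from dataclasses import dataclass, fields
from enum import Enum


class Rating(Enum):
    """Presence and quality of reported fidelity information."""
    HIGH = "high"
    MEDIUM = "medium"
    LOW = "low"
    NOT_REPORTED = "not reported"  # assumed extra category, not named in the abstract


@dataclass
class FidelityAssessment:
    """One trial's ratings across the five implementation-fidelity domains."""
    adherence_to_protocol: Rating
    programme_differentiation: Rating
    take_up: Rating
    quality_of_delivery: Rating
    participant_responsiveness: Rating

    def domains_reported(self) -> int:
        """Count the domains for which the trial report gave any fidelity information."""
        return sum(
            1 for f in fields(self)
            if getattr(self, f.name) is not Rating.NOT_REPORTED
        )


# Hypothetical trial reporting on three of the five domains.
trial = FidelityAssessment(
    adherence_to_protocol=Rating.HIGH,
    programme_differentiation=Rating.MEDIUM,
    take_up=Rating.LOW,
    quality_of_delivery=Rating.NOT_REPORTED,
    participant_responsiveness=Rating.NOT_REPORTED,
)
print(trial.domains_reported())  # -> 3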
Results: Ten RCTs of multi-systemic therapy were assessed. While none of the studies reported information on all domains, all contained some data that would be missed by instruments currently used to assess study quality in Cochrane reviews. All studies reported some data on adherence and programme differentiation; most reported some information on take-up and quality of programme delivery; half reported data on participant responsiveness.
Conclusion: Systematic reviews of treatment effects should assess implementation fidelity to ensure that they are synthesising studies of similar interventions. When this has been done, practitioners and policy-makers will be able to make informed decisions about the meaning and applicability of conclusions from reviews. Failure to consider implementation fidelity could result in the acceptance of misleading systematic reviews that may not be testing well-defined interventions implemented as intended. These issues are highly relevant to debates in other healthcare fields (e.g. the call for 'expertise-based' trials in surgery to take account of variation in fidelity). The development of a quantitative measure of implementation fidelity is needed urgently.