Abstract
Background: Health technology assessment (HTA) researchers are often confronted with situations where only case series studies (CSs) are available; however, no universally accepted validated tool exists for assessing the methodological quality of these studies.
Objectives: To describe the processes of developing a quality appraisal checklist for CSs.
Methods: An initial broad list of 30 criteria was compiled through a limited search of the literature. Six HTA researchers from Canada, Australia and Spain participated in a modified Delphi study to develop the checklist. The checklist, culled to 18 criteria, was then updated through an additional search of newly published checklists. An explanatory dictionary was developed for each of the final 20 criteria, and the resultant tool was piloted in a number of HTA reviews.
Results: A four-stage e-mail-based process culled the initial list of 30 criteria to a more ‘user-friendly’ 18-criteria checklist. Two new criteria (i.e. prospective study design and blind assessment of outcomes) were later added to the checklist based on the literature review of other CSs checklists. First-hand experience with the checklist and its dictionary indicated a general level of satisfaction among the researchers. Suggestions were made to improve the clarity and feasibility of the checklist and the dictionary.
Conclusions: This comprehensive checklist and dictionary was considered a valuable tool by its initial users, although it may not include all the criteria seen as crucial for assessing the methodological quality of CSs by reviewers outside of the HTA field. Reviewers involved in the appraisal process should determine which of the 20 criteria are essential in accordance with the specific condition and technology under review. The dictionary may also need to be customized for each review. Validation of the tool is ongoing.