Abstract
Background:
The process of turning evidence from systematic reviews into actionable recommendations when developing guidelines according to standards for trustworthiness can be difficult to understand and learn, both for participating clinical experts and physicians and for lay people. We also know little about what individual clinicians, patients, or healthy lay people would have recommended if they were given the same evidence as the experts. When guidelines and their underlying evidence are stored as structured data elements, that content can be used to build an interactive game for learning guideline methodology and for harvesting data about people's choices.
Objectives:
To develop and test perceived usefulness of an ‘evidence processing game’ which can be used to:
1. teach, in practice, the principles of applying evidence in guidelines developed according to standards for trustworthiness.
2. harvest data about users' choices when they are presented with the original evidence.
3. give systematic reviewers feedback about people's choices when those people are presented with the reviewers' guideline content.
Methods:
We used modern game technology to build an online recommendation-making game based on structured guidelines published in the MAGIC (Making GRADE the Irresistible Choice) authoring and publication platform (www.magicapp.org). Testing will be done through observation, using qualitative research methods.
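To illustrate how guideline content stored as data elements can drive a game round, the sketch below builds one quiz round from a hypothetical structured recommendation and scores a player's choice against the guideline panel's. The field names, the recommendation strengths, and the scoring rule are illustrative assumptions, not the actual MAGICapp data model or the GAME-IT design.

```python
# Illustrative sketch only: field names and scoring are assumptions,
# not the real MAGICapp schema or GAME-IT rules.

# GRADE-style recommendation strengths (direction + strength).
STRENGTHS = ("strong for", "weak for", "weak against", "strong against")

# One hypothetical structured recommendation, as a guideline "data element".
ROUND = {
    "question": "Should patients with condition X receive drug Y?",
    "certainty_of_evidence": "moderate",      # GRADE certainty rating
    "panel_recommendation": "weak for",       # what the guideline panel chose
    "choices": list(STRENGTHS),               # options shown to the player
}

def score_round(panel: str, player: str) -> int:
    """Score a player's choice against the panel's recommendation:
    2 = exact match, 1 = same direction (for/against) but different
    strength, 0 = opposite direction."""
    if player == panel:
        return 2
    # The last word of each strength label is its direction.
    if player.split()[-1] == panel.split()[-1]:
        return 1
    return 0
```

A player picking "strong for" when the panel chose "weak for" would score 1 (same direction, different strength), making the match and mismatch with expert reasoning directly measurable.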
Results:
We will display the game at the conference, along with the initial testing and feedback.
Discussion:
Does clinical experts’ reasoning match that of people playing the game? Can we make evidence-processing training fun?
Implications for guideline developers/users:
GAME-IT offers a new way to harvest information about values and preferences in decision making, and it can potentially make evidence-processing training easier.