Trusting what you see: the importance of images shared with evidence for health decision-making and how to get it right

Article type
Authors
Chapman S1, Morley K2, Ryan-Vig S1
1Cochrane UK
2Cochrane Consumer
Abstract
The images we use when sharing evidence are important. Images can convey ideas quickly and simply, often more universally, and sometimes more powerfully, than text. They can engage and inform the people who may use the evidence in their health decisions. Well-chosen images can draw people in, arouse emotions, and influence whether someone decides to read on. They can also enhance or reinforce written information and contribute to the credibility of the content. By contrast, poor image choices can repel, misrepresent, and undermine the trustworthiness of the written information they accompany, as well as of the organisation sharing the image.

Here, we will encourage people to reflect on the impact of images by exploring some positive examples of image choice, as well as some problematic ones.

We will share some key considerations to guide your image choices, including pitfalls to look out for. We will explore, for example, the importance of depicting a diverse range of people to ensure wide representation and inclusivity, accurately representing the evidence, choosing realistic and relatable images that treat topics sensitively, and avoiding images that stigmatise or reinforce stereotypes. We will also consider how alternatives to stock images, especially art made by people about their health experiences, can be particularly powerful and relatable.

As we look at the impact of image choice, we will draw on Cochrane’s Guide to Choosing Images for Sharing Evidence, produced by Cochrane UK with input from a diverse global advisory group and now available as a Cochrane learning resource.

Patient, public, and/or healthcare consumer involvement: One of the abstract authors is a consumer. The Guide we drew on was developed with the help of a global advisory group that included healthcare consumers. We include, and advocate for, images made by people to reflect their health experiences, and we discuss involving consumers in image choice.