Abstract
Background: The Embase project is managed by a consortium made up of Metaxis Ltd, the Cochrane Dementia and Cognitive Improvement Group and the York Health Economics Consortium. The project’s objectives are to identify reports of randomised trials in Embase and to submit those reports to Cochrane’s Central Register of Controlled Trials (CENTRAL). The project has been managed by this consortium since April 2013 and uses crowdsourcing for much of the screening.
Objectives: This poster presents answers to three questions about the screening crowd:
1. What do we know about the crowd?
2. Do screeners become more confident over time?
3. How engaged is the crowd?
Methods: By September 2015 the project will have generated over 18 months’ worth of data. Previous publications have focussed on crowd performance, but here we focus on the characteristics and behaviour of the crowd. We will analyse the data from the project’s sign-up form, which all screeners must complete. This includes demographic information plus information about prior knowledge of randomised trial design. We will assess screeners’ decision-making over time by looking at time per citation and the proportion of ‘Unsure’ classifications made. To determine screeners’ engagement we will look at their average activity per month and per year.
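As an illustration of the analysis described above, the sketch below computes the three measures (time per citation, proportion of ‘Unsure’ classifications, and average activity per month) from a hypothetical screening log; the column names and data layout are assumptions for illustration, not the project’s actual export format.

```python
# Minimal sketch, assuming a hypothetical screening log with one row per
# classification: screener id, timestamp, seconds spent, and the decision
# ('RCT', 'Not RCT' or 'Unsure'). Column names are illustrative only.
import pandas as pd

log = pd.DataFrame({
    "screener_id": ["a", "a", "b", "b", "b"],
    "classified_at": pd.to_datetime([
        "2014-03-01", "2014-03-15", "2014-03-02", "2014-04-10", "2014-04-11"
    ]),
    "seconds_spent": [42, 30, 55, 20, 18],
    "decision": ["RCT", "Unsure", "Not RCT", "RCT", "Unsure"],
})

# Time per citation and proportion of 'Unsure' decisions, per screener per month.
monthly = (
    log.assign(month=log["classified_at"].dt.to_period("M"),
               unsure=log["decision"].eq("Unsure"))
       .groupby(["screener_id", "month"])
       .agg(citations=("decision", "size"),
            mean_seconds=("seconds_spent", "mean"),
            unsure_rate=("unsure", "mean"))
)

# Engagement: average number of citations screened per active month.
engagement = monthly.groupby("screener_id")["citations"].mean()

print(monthly)
print(engagement)
```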
Results: The 18-month results will be presented in this poster. Interim data suggest that the crowd is 57% female; that most screeners are aged between 31 and 40 years; and that around 4% were not familiar with randomised trial design before taking part.
Conclusions: Crowdsourcing brings many potential benefits, but it is not a simple option. The more we learn about our participants, the better we can make the experience rewarding and fulfilling for them, and the better we can retain their involvement in this project and in others now under way, such as Project Transform.