Abstract
Background: Social media such as wikis, Google Docs, and social reference managers (e.g., Mendeley) could be used when conducting a scoping review to crowdsource (i.e., obtain content by soliciting contributions from an online community) important unpublished work found in the grey literature.
Objectives: To compare the performance of email and three crowdsourcing tools for collecting and sharing citations to be considered for inclusion in a scoping review.
Methods: This study is part of an ongoing scoping review; our methodology has been published (http://goo.gl/MbtIV). In addition to standard databases, the grey literature sources searched for this review were the HTAi vortal, Mednar, OpenSIGLE, Google, Bing, and Yahoo. To identify any missing articles or unpublished work, 40 experts were invited by email to share relevant papers using one of three crowdsourcing tools: an HLWIKI page (http://goo.gl/oeL1I), a Mendeley Group (http://goo.gl/alhpo), and a Google Docs spreadsheet (http://goo.gl/QlyCC). We also tweeted about the study protocol. In each of these crowdsourcing tools, we added some of the articles we had already found to stimulate reciprocal sharing and to give experts an idea of the kinds of papers we were looking for.
Results: Figure 1 presents our flow chart. We emailed 40 different experts, and our protocol was tweeted 12 times (http://goo.gl/oe4jL). Direct email generated the most responses (n = 10), allowing us to identify two papers that met our inclusion criteria. Mendeley and HLWIKI did not generate any new articles. For the Google Docs spreadsheet, two experts proposed two different links to papers, but neither was included in the scoping review.
Conclusions: More research is needed to identify the barriers and facilitators to the use of crowdsourcing in academic research. This would help us understand how to improve the use of crowdsourcing tools to support the conduct of knowledge syntheses.