Using social media and crowdsourcing to gather important publications for a scoping review about wikis and collaborative writing tools

Authors
Archambault PM1, van de Belt T2, Grajales F3, Faber M4, Kuziemsky C5, Gagnon S6, Bilodeau A7, Rioux S6, Fournier C6, Nadeau C6, Emond M6, Aubin K8, Gold I9, Gagnon M8, Turgeon A10, Heldoorn M11, Poitras J12, Kremer J2, Eysenbach G13, Légaré F14
1Department of Family Medicine and Emergency Medicine, Faculté de médecine, Université Laval, Québec, Canada
2Radboud University Nijmegen Medical Centre, Department of Obstetrics and Gynaecology, Division of Reproductive Medicine, Nijmegen, The Netherlands
3IMIA Social Media Working Group, Geneva, Switzerland
4Radboud University Nijmegen Medical Centre, Radboud REshape and Innovation Center, Nijmegen, The Netherlands
5Telfer School of Management, University of Ottawa, Canada
6Centre de santé et de services sociaux Alphonse-Desjardins (CHAU de Lévis), Lévis, Canada
7Institut national de santé publique du Québec, Québec, Canada
8Faculté des sciences infirmières, Université Laval, Québec, Canada
9Association of Faculties of Medicine of Canada, Ottawa, Canada
10Division of Critical Care Medicine, Department of Anesthesia, Faculté de médecine, Université Laval, Québec, Canada
11Federation of Patients and Consumer Organisations in the Netherlands, The Netherlands
12Faculté de médecine, Université Laval, Canada
13Centre for eHealth Innovation, University of Toronto, Canada
14Centre de recherche du Centre hospitalier universitaire de Québec (CRCHUQ), Québec, Canada
Abstract
Background: Social media tools such as wikis, Google Docs and social reference managers (e.g., Mendeley) could be used when conducting a scoping review to crowdsource (i.e., obtain content by soliciting contributions from an online community) important unpublished work found in the grey literature.

Objectives: To compare the performance of email and three crowdsourcing tools in collecting and sharing citations to be considered for inclusion in a scoping review.

Methods: This study is part of an ongoing scoping review. Our methodology has been published (http://goo.gl/MbtIV). In addition to standard databases, the grey literature sources searched for this review were: HTAi vortal, Mednar, OpenSIGLE, Google, Bing and Yahoo. To identify any missing articles or unpublished work, 40 experts were invited by email to share relevant papers using one of three crowdsourcing tools: an HLWIKI page (http://goo.gl/oeL1I), a Mendeley Group (http://goo.gl/alhpo) and a Google Docs spreadsheet (http://goo.gl/QlyCC). We also tweeted about the study protocol. We seeded each of these crowdsourcing tools with some of the articles we had already found in order to stimulate reciprocal sharing and to give experts an idea of the kind of papers we were looking for.

Results: Figure 1 presents our flow chart. We sent emails to 40 different experts and our protocol was tweeted 12 times (http://goo.gl/oe4jL). Direct email generated the most responses (n = 10), which allowed us to identify two papers that met our inclusion criteria. Mendeley and HLWIKI did not generate any new articles. For the Google Docs spreadsheet, two experts proposed two different links to papers, but neither was included in the scoping review.

Conclusions: More research is needed to identify the barriers and facilitators to the use of crowdsourcing in academic research. This would help us understand how to improve the use of crowdsourcing tools to support the conduct of knowledge syntheses.