Patient recruitment challenges have plagued clinical research sponsors, from industry-sponsored to academic and government-funded programs, for decades. Only a small percentage of physicians and patients actually participate in clinical trials. Moreover, the growing number of advanced therapy medicinal products (ATMPs), such as gene therapies and cellular therapies, brings ever more complex and tailored protocols, making inclusion and exclusion criteria even more challenging. Online patient enrollment via forums such as Facebook has in some cases worked very well. So it isn’t a surprise that sponsors would not only embrace online recruitment but, in some cases when they were behind on their enrollment targets, push the envelope to catch up. More recently, the use of online crowdsourcing platforms to connect with and potentially recruit research participants has become ubiquitous in social, behavioral and educational research. In at least some cases this raises ethical concerns, and a recent analysis discusses the implications.
For example, crowdsourcing participants, or “crowd workers,” may become vulnerable as research participants as they become ever more important in the value chain of clinical research, reports author Adrian Kwek. Moreover, ethics reviewers must evolve how they assess the integrity and ethics of a clinical trial, including the processes its sponsors use to target, recruit and enroll research participants. How will ethics bodies review a crowdsourced research protocol?
The research reviews the ethical considerations of crowdsourced research, using the popular crowd-working platform Amazon Mechanical Turk as a central example.
What is Amazon Mechanical Turk?
Amazon Mechanical Turk (MTurk) is a crowdsourcing marketplace that makes it easier for individuals and businesses to outsource processes and jobs to a distributed workforce that can perform these tasks virtually. Tasks range from simple data validation and research to more subjective work such as survey participation and content moderation. MTurk enables companies to harness the collective intelligence, skills and insights of a global workforce to streamline business processes, augment data collection and analysis, and accelerate machine learning development.
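In practice, researchers typically post such tasks ("HITs," or Human Intelligence Tasks) to MTurk programmatically. The sketch below, using the AWS `boto3` MTurk client, shows how a requester might publish a survey task; the survey URL, title, reward and other parameter values are hypothetical placeholders, and the `create_hit` call itself is commented out because it requires AWS Requester credentials.

```python
# Illustrative sketch: building and posting a survey HIT to MTurk.
# All titles, rewards, and the survey URL below are hypothetical.
import xml.etree.ElementTree as ET


def build_external_question(url: str, frame_height: int = 600) -> str:
    """Build the ExternalQuestion XML that points workers at a hosted survey."""
    ns = ("http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/"
          "2006-07-14/ExternalQuestion.xsd")
    root = ET.Element("ExternalQuestion", xmlns=ns)
    ET.SubElement(root, "ExternalURL").text = url
    ET.SubElement(root, "FrameHeight").text = str(frame_height)
    return ET.tostring(root, encoding="unicode")


if __name__ == "__main__":
    # Hypothetical survey URL for illustration only.
    question_xml = build_external_question("https://example.org/survey")
    print(question_xml)

    # With AWS Requester credentials configured, the HIT would then be created:
    # import boto3
    # mturk = boto3.client("mturk", region_name="us-east-1")
    # mturk.create_hit(
    #     Title="10-minute research survey",        # hypothetical task
    #     Description="Answer questions about online study participation.",
    #     Reward="1.50",                            # USD; set per ethical pay guidance
    #     MaxAssignments=100,                       # number of distinct workers
    #     AssignmentDurationInSeconds=1800,
    #     LifetimeInSeconds=86400,
    #     Question=question_xml,
    # )
```

Note that parameters such as `Reward` are set entirely at the requester's discretion, which is one reason fair-pay questions figure so prominently in the ethics debate the article describes.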
A study conducted in 2017 concluded that there was potential for conducting trials with crowdsourced workers, particularly if methods were used to ensure that participants actually receive the intervention. The study's authors noted that this potential represented a significant opportunity to rapidly conduct multiple trials during the development stages of online interventions. Some writers have noted that MTurk has ushered in “a golden age” for online research.
The author examines two reputed threats to crowd workers: 1) undue inducements and 2) dependent relationships. Finding that autonomy-focused arguments about these factors are inconclusive or inapplicable, the author instead applies a specific theory of exploitation to frame the ethics of crowdsourced research, and identifies significant cause for concern about exploitation online. Read the entire journal article to learn how frameworks such as the “WeAreDynamo Guidelines” describe ways of treating crowdsourced research participants ethically.
Call to Action: For a holistic understanding of how research participation is going online via tools such as Amazon Mechanical Turk, follow the source to read the entire article.