• 21st June 2018
• 2 min read

New report: citizen science can make systematic reviews faster and more efficient

If you missed the first report on this topic, you might like to start with Citizen science: Crowdsourcing for research.

Joining forces with diverse groups of people can help produce high-quality systematic reviews more quickly and efficiently, according to a new report from THIS Institute.

Citizen science: Crowdsourcing for systematic reviews looks at how people can contribute their expertise to scientific studies through new online platforms – even if they don’t think of themselves as researchers or scientists. Created in collaboration with RAND Europe, the report is the second in a series of three that THIS Institute is developing to inform its citizen science approach. It focuses on one of the cornerstones of scientific research: the systematic review.

Traditional systematic reviews can take thousands of hours to complete, and the burden is growing as the body of scientific literature expands. But promising early evidence suggests that engaging large groups of people can speed up the systematic review process without sacrificing accuracy. It can also be a rewarding experience for those who take part.

In one crowdsourced review showcased in the report, participants made 15,000 screening decisions in just 100 hours. And with the right control measures, crowdsourced systematic reviews can meet the ‘gold standard’ of quality.

“When it comes to conducting systematic reviews, evidence suggests there is strength – and accuracy – in numbers,” says Rebecca Simmons, Deputy Director of THIS Institute and co-author of the report. “And it’s not just the speed of crowdsourced reviews that’s impressive. Citizen scientists have proven to be conscientious reviewers who make accurate screening decisions.”

The report also highlights a number of challenges to consider when crowdsourcing systematic reviews: recruiting and retaining participants can be resource-intensive, and drop-out rates can be high.

“Evidence shows that providing training, well-defined tasks, feedback and rewards can mitigate these risks and encourage participation,” says co-author Lucy Strang, a Senior Analyst at RAND Europe.

With these learning reports, THIS Institute aims to strengthen the evidence base around citizen science and how best to engage people outside research in improving the quality and safety of patient care. The third report, on using citizen science to generate research ideas and build consensus, is due out later this summer.

Download the report