Duplicative and out of date: systematic reviews of remdesivir highlight wasted research on COVID-19

Systematic reviews of the antiviral drug remdesivir featured considerable duplication, and half were already out of date at the time of publication, according to a recent investigation by Cochrane Australia published in the Journal of Clinical Epidemiology. The findings reveal a lack of coordination among international researchers and point to the adoption of ‘living guidelines’ as a model to address these failures.

In the scramble to understand, prevent and treat COVID-19, countless hours and dollars have been invested in clinical trials, observational studies and systematic reviews. Never before has such a singular and determined global research effort been undertaken.

Remdesivir is an antiviral drug that emerged early in the pandemic as a potential treatment for COVID-19. Although just seven randomised trials comparing remdesivir with placebo or standard care had been published by September 2021, systematic reviews of the drug appeared regularly throughout 2020 and 2021.

This caught the attention of researchers at Cochrane Australia, part of an independent global network of researchers producing trusted, high-quality systematic reviews that underpin best-practice decisions in health care worldwide.

In recent years, Cochrane Australia has embraced an emerging model of clinical guideline development called living guidelines. This model enables rapid inclusion of the latest evidence and has been adopted by Australia’s National COVID-19 Clinical Evidence Taskforce.

The team's recent investigation compared the usefulness and durability of systematic reviews of remdesivir with the living guidelines model used by the Taskforce. It included systematic reviews of remdesivir as a treatment for COVID-19 published up to May 2021. The team found 38 eligible reviews (published in 45 unique reports, including preprints and living review updates), equivalent to a new review being published every nine days.

A systematic review was considered up to date if it included the four major randomised trials of remdesivir versus placebo or standard care that were available at the time of the review’s publication. The team also assessed the level of intentional replication and unintentional duplication among the reviews.

The researchers found that half of the systematic review reports (23/45) were out of date because they failed to include the results from the main trials that were available when the review was published. Even the reviews that were up to date when published soon lapsed as the results of new trials became available: the 11 reviews in this category remained current for just 10 days on average.

A third of the reviews cited other systematic reviews of remdesivir, but only four provided a justification for why another review was necessary. The researchers noted that only one in five reviews was registered in PROSPERO, an international database of prospectively registered systematic reviews that aims to minimise duplication.

The COVID-19 Taskforce guidelines were updated weekly following their launch in April 2020, allowing for the rapid inclusion of trial results. The time from publication of the four major randomised trials of remdesivir to their inclusion in the guidelines was 9, 13, 14 and 27 days.

The WHO Solidarity trial was the largest trial of remdesivir. Within two weeks of the interim results being published, the Taskforce guidelines had been updated, yet none of the nine systematic reviews published immediately following the Solidarity trial included these results.

“While we recognise the huge pressure to generate knowledge about COVID-19,” lead author Steve McDonald says, “this example highlights why researchers need to work smarter to avoid unnecessary waste.

“The persistent duplication we uncovered meant that time and resources were wasted addressing questions that others were already dealing with. More complete registration of reviews in PROSPERO might have led authors to question the need for repeated reviews.

“Our study also exposed the susceptibility of systematic reviews to becoming out of date in fast-paced situations like COVID-19. The ‘living’ approach adopted by the Taskforce shows how results can be rapidly included to maintain currency. Living synthesis approaches during COVID-19 have proven effective—the time and money spent producing duplicative systematic reviews could be better invested elsewhere.”