Amid new policies to make research publications more readily available to both scientists and the public, stakeholders from an international cohort of universities signed off on a set of practices to monitor accessibility in biomedical research. They also committed to tracking their institutions’ adherence to the new guidelines on a digital dashboard.
The practices were drafted based on a study that included 80 researchers across 20 institutions in Australia, Brazil, Canada, Chile, Hong Kong, Nigeria and several European countries. The results were published Jan. 24 in PLOS Biology.
“Policy in the absence of monitoring is not effective,” Kelly Cobey, Ph.D., director of the Open Science and Meta Research program at the University of Ottawa Heart Institute and first author of the study, said in a press release. “We have reached an agreement on how to design a digital dashboard to track open science practices to determine if we are doing a good job implementing them or not.”
The development is part of the greater open science movement, which seeks to break down journal paywalls and dismantle practices that hinder collaboration and undermine public trust. While the movement took off well before 2020, the COVID-19 pandemic added new urgency. Scientists' need to access research papers and replicate their methods can go unmet because of the cost of subscribing to the most prestigious journals, which can run institutions thousands of dollars and puts scientists at universities with fewer resources at a disadvantage.
Policymakers and funders worldwide, from UNESCO to the U.S. government, have begun recommending or requiring that researchers adhere to open science practices to qualify for grants. These include publishing results in journals that can be read for free by anyone, giving other scientists access to data and research software, and registering clinical trials before starting recruitment.
But without clear guidelines and a way to track their progress, it’s hard to see whether the policies are truly making science more transparent and accessible, the researchers wrote in their paper. Furthermore, mandates from funders will only change the culture if they’re being monitored, Cobey told Fierce Biotech Research.
“Simply creating a requirement for open science practices will not drive change unless the funder enforces the action,” she said in an email. Rewarding open science at the level of the institution is also essential for progress, she added.
To establish a set of practices to track, Cobey and her team invited 32 universities that had previously expressed a commitment to open science. Twenty of them chose to participate. The 80 individuals who took part, about half of whom were women and one of whom was nonbinary, included people involved in purchasing academic journals and sharing data, such as library staff and research administrators, as well as staff involved in researcher assessment, such as tenure committee members.
The researchers came up with guidelines through a three-round Delphi study, which combines surveys with anonymous comments from participants. Between rounds, votes were aggregated, anonymized and presented back to the participants along with their own individual scores and feedback as well as those of others. That way, each person could compare others’ rationale with their own before proceeding to the next round.
In the first round, Cobey’s team sent out 17 potential practices for participants to consider, which included items typical of open science requirements from funders, such as clinical trial reporting, as well as others that were geared toward transparency, like describing conflicts of interest. While many of the items were already required by journals, the researchers noted that even seemingly obvious things such as conflict of interest declarations are not necessarily standardized.
After another round of vetting potential practices, the participants met for two days of Zoom meetings. Their work culminated in a list of 19 practices spanning publishing in open access journals, reporting clinical trial data, tracking preprints, sharing data, including funding statements and more. The top priority items were reporting on whether clinical trials were registered before recruitment started and on whether study data were openly available. Seven of the 19 practices were geared toward transparency, including reporting whether author contributions were broken down in the article and whether conflicts of interest were disclosed.
Next, Cobey’s team is developing a digital dashboard that institutions will be able to implement to track their progress on the guidelines. They expect to have a prototype within the next 12 months and plan to pilot it at three institutions before making it more widely available, according to Cobey. While some institutions may choose to make their dashboards public, others may use theirs internally.
There are no plans to use the dashboard broadly to track open science progress at nonparticipating institutions, Cobey said. Rather, it will be available to the community to use as they see fit for their particular setting.
“For implementation to be successful we feel individual institutions would need to undergo their own internal review, processes and consultation to see how to best integrate,” she explained.