The peer review system is broken. We asked academics how to fix it
The peer review process is the cornerstone of modern scholarship. Before new work is published in an academic journal, experts review the evidence, research and arguments to make sure it stacks up.
However, many authors, reviewers and editors have problems with how the modern peer review system works. It can be slow, opaque and demoralising, and it relies on the voluntary labour of already overworked academics.
Read more: Explained: what is peer review?
Last month, one of us (Kelly-Ann Allen) took to Twitter to express her frustration with how difficult it is to find peer reviewers. Hundreds of replies later, we had a huge collection of peer review critiques and suggestions for improvement.
The suggestions for journals, publishers and universities show that there is much to be done to make peer review more accountable, fair and inclusive. We have summarized our full findings below.
Three challenges of peer review
We see three main challenges facing the peer review system.
First, peer review can be a form of exploitation.
Many companies that publish academic journals profit from subscriptions and sales. However, authors, editors, and peer reviewers generally give their time and effort on a voluntary basis, effectively doing free work.
And while peer review is often seen as a collective enterprise of the academic community, in practice a small fraction of researchers do most of the work. A study of biomedical journals found that in 2015, just 20% of researchers performed up to 94% of peer reviews.
Peer review can be a “black box”
The second challenge is the lack of transparency in the peer review process.
Peer review is usually conducted anonymously: researchers do not know who is reviewing their work, and reviewers do not know whose work they are reviewing. This anonymity can encourage honest appraisal, but it can also make the process less open and accountable.
Opacity can also suppress discussion, protect bias, and diminish the quality of reviews.
Peer review can be slow
The final challenge is the speed of peer review.
When a researcher submits an article to a journal, if it survives the initial screening they may face a long wait for review and eventual publication. It is not uncommon for research to be published a year or more after submission.
This delay is bad for everyone. For policymakers, leaders and the public, this means they can make decisions based on outdated scientific evidence. For academics, delays can stall their careers while they wait for the publications they need to secure promotions or tenure.
Read more: Journal articles, grants, jobs… as rejections pile up, it’s not enough to tell academics to ‘suck it up’
Respondents suggested that delays are usually caused by a shortage of reviewers. Many academics report that heavy workloads discourage them from participating in peer review, a problem that has worsened since the onset of the COVID-19 pandemic.
Respondents also noted that many journals rely heavily on US and European reviewers, which limits the size and diversity of the reviewer pool.
Can we fix peer review?
So what can be done? Most of the constructive suggestions from the aforementioned big Twitter conversation fell into three categories.
First, many have suggested that there should be better incentives for conducting peer reviews.
This could include publishers paying reviewers (the American Economic Association journals already do this) or donating an equivalent sum to reviewers’ research departments. Journals could also offer reviewers free subscriptions, publication vouchers, or expedited review of their own submissions.
However, we should recognize that journals offering incentives could create new problems.
Read more: Explained: the ins and outs of peer review
Another suggestion is that universities could better recognize peer review as part of the academic workload, and perhaps reward outstanding contributors to peer review.
Some Twitter commenters argued that tenured scholars should be expected to review a certain number of papers each year. Others thought more should be done to support non-profit journals, given that a recent study found some 140 journals in Australia alone ceased publication between 2011 and 2021.
Most respondents agreed that conflicts of interest should be avoided. Some suggested expert databases would make it easier to find relevant reviewers.
Use more inclusive peer review recruitment strategies
Many respondents also suggested journals could improve how they recruit reviewers and distribute the workload. Reviewers could be selected for either their methodological or their content expertise, and asked to focus on that element rather than both.
Respondents also argued that journals should do more to tailor their invitations to the most relevant experts, with a simpler process for accepting or declining the request.
Others felt that more non-tenured academics, doctoral researchers, people working in related industries and retired experts should be recruited. More peer review training for graduate students and increased representation of women and underrepresented minorities would be a good start.
Rethinking double-blind peer review
Some respondents pointed to a growing movement towards more open peer review, which can create a more humane and transparent approach. For example, the journal Royal Society Open Science publishes all decisions and review letters, and reviewers can choose to identify themselves.
Another suggestion to speed up the publication process was to give more priority to urgent research.
What can be done?
The overarching message from the huge response to a single tweet is that there is a need for systemic changes to the peer review process.
There is no shortage of ideas to improve the process for the benefit of academics and the general public. However, it will be up to journals, publishers and universities to put them into practice and create a more accountable, fair and inclusive system.
The authors would like to thank Emily Rainsford, David V. Smith, and Yumin Lu for their contributions to the original article Toward Improving Peer Review: Crowdsourced Insights from Twitter.