Author: Jadranka Stojanovski, University of Zadar, Croatia jadranka.stojanovski@irb.hr
Reviewers: Karla Avanço, Maciej Maryl
Introduction
Open science initiatives have gained significant traction in Europe since 2014, spearheaded by the European Commission’s efforts to develop guidelines and recommendations and implement open science practices. In recent years, UNESCO has joined these initiatives, expanding its global influence by adopting the UNESCO Recommendation on Open Science in 2021. Notable contributions from China and the United States have further shaped the global publication landscape supporting open science.
The growth of open science can be attributed to several challenges scientific communication faces. These challenges include:
- The 3R Crisis: This crisis encompasses the issues of reproducibility, replicability, and repeatability1 in scientific research, highlighting the need for robust and trustworthy data repositories.
- The Serial Crisis: The commercialisation of scholarly publishing has resulted in escalating prices, restrictive licensing, and the transfer of rights, which hampers access to and dissemination of scientific knowledge.
- The Functionality Crisis: This crisis emerges from the dominance of traditional publishing formats without adequate digital modernisation, leading to outdated procedures and wasted researcher time (Brembs et al., 2023).
- The Peer Review Crisis: Peer review, “a cornerstone of scientific publishing” (Ghosal et al., 2022), is plagued by flaws such as slowness, high costs, secrecy, exclusiveness, inefficiency, biases, lack of transparency, the potential for abuse, and unreliability. Additionally, the absence of appropriate incentives hinders its effectiveness.
Research on peer review (PR) has witnessed steady growth over the past two decades, according to the Dimensions2 and Scopus3 databases (Figure 1). In 2021, more than 15,000 articles on peer review were published, covering various aspects such as research assessment, identification of biases (e.g., gender and nepotism), ethical concerns, criteria for conducting PR, editor responsibilities, reviewer incentives and rewards, considerations of quality and reliability, emerging trends in PR practices, and strategies for teaching and learning about PR. These studies aim to shed light on the advantages and limitations of PR to enhance its efficacy and integrity.
Figure 1. Growth of the number of publications on peer review, 1983-2022, according to the Dimensions (TITLE-ABS = “peer review”) and Scopus (TITLE-ABS-KEY = “peer review”) databases
The importance of PR is further emphasised by the adoption of the Agreement on Reforming Research Assessment in 2022. This reform calls for a shift in research assessment criteria, emphasising quality over traditional metric-based indicators. To recognise contributions that advance knowledge and to assess the potential impact of research results, it is imperative to embrace diverse research activities and practices; foster early sharing and open collaboration; employ evaluation processes and criteria that respect the diversity of scientific disciplines and research types; acknowledge and appropriately evaluate diversity in research roles and careers, including roles beyond academia; and ensure gender equality, equal opportunities, and inclusivity. Elevating PR to a new level will be necessary to achieve these objectives.
By addressing the challenges in scholarly communication and advancing PR practices, the research community can foster an environment that promotes high-quality research, encourages inclusivity, and maximises the potential impact of scientific findings.
Traditional Peer Review
The editorial board typically manages traditional PR, involving single or double anonymisation and engaging two external reviewers whose reports are visible only to the handling editor. While this evaluation system appears solid, it suffers from various shortcomings:
- a lengthy process, slowing down the publication (Nguyen et al., 2015)
- incurring high costs4 (Aczel et al., 2021)
- susceptibility to biases concerning gender, nationality, affiliation, and language of the manuscript (Mavrogenis & Scarlat, 2023)
- potentially disregarding groundbreaking works (Tennant et al., 2017)
- being unable to detect severe errors in methodology, data collection and analysis (Bohannon, 2013)
- inconsistency in the opinions and comments provided by reviewers when evaluating the same paper (Ross-Hellauer, 2017)
- being prone to manipulation, providing fertile ground for unethical behaviours
- lacking recognition, credit, and rewards for reviewers
- squandering knowledge and resources (Stojanovski, 2018)
However, traditional PR remains deeply ingrained in the scholarly publishing landscape, with double-anonymised PR often considered a safeguard for impartiality and objectivity. While inviting institutional publishers to explore alternative models and approaches to improve the PR system, the DIAMAS project therefore acknowledges existing practices in its recommendations. These recommendations include:
- clearly describing reviewer roles and responsibilities
- engaging multiple external reviewers
- minimising potential conflicts of interest
- implementing various forms of reviewer recognition and rewards
- displaying submission and acceptance dates on publications
- maintaining a registry of all data and documents related to the peer review process
- addressing rejection, withdrawal, or retraction procedures in relevant policies
- establishing mechanisms for handling complaints and appeals
- offering training on peer review for editors and reviewers (Ševkušić & Kuchma, 2023)
Open/transparent peer review
Open peer review (OPR) is a practice recommended in open science to transform the PR process into an open scientific discourse. Active participation from authors, open publication of reviewer reports, and the opportunity for the wider community to provide feedback and engage in the assessment (UNESCO Recommendation on Open Science, 2021) allow public discussion of research papers before or after formal publication.
OPR offers several benefits:
- More transparent evaluation fosters greater accountability and effective error detection, building trust and confidence in the research community.
- It facilitates the exchange of knowledge and ideas among researchers, encouraging constructive criticism and helping identify flaws or areas for improvement.
- It contributes to authors’ professional development by providing feedback and engaging them in discussions about their work.
- It fosters a sense of community and collaboration within academia by encouraging open dialogue.
- It enhances the quality, reliability, and effectiveness of the PR process by enabling open discussions among reviewers and the wider community.
- It enhances the visibility, recognition, and reputation of reviewers and their contributions.
- It provides additional context for readers, aiding their understanding of the paper.
- It encourages reviewers to provide constructive criticism and detailed feedback; their reviews can serve as guidance for researchers, particularly early-career researchers, to conduct methodologically rigorous research and improve their reporting.
- It helps validate the quality of editorial work.
In a systematic literature review, Ross-Hellauer (2017) identified seven key traits of OPR, among which open reports and open identities have received the most attention from the scholarly community. Open reports are generally received positively, while attitudes towards open identities have been more sceptical (Ross-Hellauer & Horbach, 2022).
However, opponents of OPR raise valid concerns, particularly regarding the disclosure of reviewer identities. They argue that revealing the identities of reviewers may have negative consequences, such as reducing the level of criticism received, especially when the author is affiliated with reputable institutions. Additionally, disclosing reviewer identities could potentially lead to unnecessary retaliation, harassment, or unwanted repercussions, particularly in small scientific communities where a limited number of researchers work in a specific field and publications are predominantly in the local language. Researchers might be reluctant to disclose their identities due to privacy concerns.
OPR is a complex process in which different stakeholders participate, various peer review outcomes are openly available, and it can take place at various stages of the scholarly publishing workflow (Figure 2). It encompasses a wide range of interactions and can take place on diverse platforms such as preprint servers, journals, or dedicated platforms. This complexity makes it challenging to establish a singular, unequivocal definition of the OPR process, as it is approached from distinct perspectives and implemented in various contexts.
Figure 2. Layers of open peer review
Integration of Research Data
It is intriguing that neither traditional PR nor OPR prioritises the availability of research data, which plays a crucial role in assessing the validity of research and interpreting results. Despite the critical importance of open research data in the open science agenda, its integration into the context of open (or closed) peer review is still rare. The primary object of evaluation in OPR is still the author’s textual description of the research process, results, and findings, accompanied by tables, graphical representations and/or images. Although many journals require authors to provide a Data Availability Statement (DAS), studies have shown that this statement may not always be reliable (Gabelica et al., 2022). In addition, while some journals publish the underlying data as supplementary material, data notes or data descriptors, or provide links to datasets stored in repositories such as Figshare, Dryad, Open Science Framework or Zenodo with the accepted/final version of the article, it is uncommon for journals to routinely provide raw or underlying data for review purposes.
Preprint Reviews
While preprints are often defined as unreviewed manuscripts stored in preprint servers, significant progress has been made in the peer review of preprints in recent years. The increased number of preprints has enabled completely new forms of OPR. Although the practice of publishing manuscripts (preprints) in open digital repositories has existed in certain fields for over three decades (e.g., arXiv.org was created in 1992), preprint servers are rapidly expanding to other disciplines, with biomedicine experiencing significant growth. The publication of preprints introduces exciting possibilities for experimenting with peer review, making it open and inclusive not only for expert reviewers but also for the wider scientific community and the general public. It is important to distinguish between preprint feedback, which includes public comments, and preprint review, which involves a discussion about the rigour and validity of the research, includes a reviewer’s statement about competing interests, and has the reviewer’s identity made public or verified (Avissar-Whiting et al., 2023).
Innovative practices – examples
In terms of innovative practices, the journal Atmospheric Chemistry and Physics (ACP) has been at the forefront of opening its peer review workflow and outcomes (Figure 3). In the initial phase, submitted manuscripts undergo a rapid access review, during which scientific quality, significance and presentation are assessed. Papers rated as “Bad” in any of the three categories are rejected, while other papers are published on the journal’s website as “Discussion papers” and undergo a comprehensive PR by selected reviewers. The PR process also includes an interactive public discussion that extends for several weeks. Reviewers have the option to sign their reviews and can choose to remain anonymous, whereas reader comments must be signed. If a paper is accepted, the final version is published, and the initial version of the paper, along with the reviewer and reader comments, as well as the author’s responses, are permanently stored and available in the “Publication history” section, making them citable.
Figure 3. Peer review history in the journal Atmospheric Chemistry and Physics (https://acp.copernicus.org/articles/23/4863/2023/acp-23-4863-2023.html)
F1000 is another pioneering platform in OPR, where reviewer reports, author responses, and community comments are openly available. F1000 editors firmly believe that their completely transparent review process ensures quality and accountability, and they consider the publication of reviewers’ full names and affiliations as a means to foster constructive reviews and collaborative discussions within the scientific community. The PR process at F1000Research begins with the publication of the manuscript, followed by soliciting expert opinions and reader comments. All reviewers undergo a thorough evaluation by the editorial board to assess their expertise, qualifications, impartiality, and potential conflicts of interest. The author’s responses to the reviewers’ comments are also made publicly available. Based on the feedback and suggestions from the reviewers, an updated version of the paper is published, while the original version remains accessible to the public. In adherence to open data mandates, research data are made available alongside the published article, although exceptions may apply when dealing with sensitive data (Figure 4).
Figure 4. F1000Research example of OPR
In the realm of preprint reviews, innovative approaches have emerged through various platforms such as PREreview, Sciety, Review Commons, Peer Community In (PCI), PeerRef, Rapid Reviews, Qeios, and others (Figure 5). These platforms offer new avenues for conducting peer review specifically tailored to the context of preprints.
Figure 5. Examples of preprint reviews from Review Commons and PeerRef platforms
Conclusion
In conclusion, despite its shortcomings, peer review remains the foundation of scholarly communication, as it provides a seal of quality for scholarly content. Reviewers, although often inadequately rewarded for their tremendous efforts, demonstrate unwavering motivation and consider reviewing a civic duty. They feel a sense of obligation to their colleagues and expect the same level of effort in return for their own work. Early access to intriguing research findings serves as an additional motivation, as long as it is not exploited for privileged information (Pros and cons of open peer review, 1999). Nonetheless, there are several avenues through which the work of reviewers can be supported, made more transparent, and ultimately more valuable. Open peer review, with publicly accessible reports and author responses, and the option for reviewers to participate anonymously or with their identity disclosed, can contribute to this advancement.
The significance of OPR is further emphasised by the ongoing reform of research assessment systems, which seeks to elevate the evaluation process to a higher, more qualitative level. Additionally, OPR, encompassing various levels of openness such as openly available multiple versions of publications, peer review history, datasets, code, and preregistration, holds the potential to address issues inherent in the current research assessment systems. Serving as a crucial link in achieving the goals of open science, OPR is increasingly gaining acceptance in scholarly communication, indicating that open reviewers’ reports and other layers of openness may soon become standard practice.
While embracing OPR and its benefits, it is vital to acknowledge and address the challenges associated with its implementation. Striking a balance that considers the limitations of both traditional PR and OPR, while respecting the specific needs and contexts of diverse research communities, is of utmost importance. By navigating these challenges thoughtfully, we can foster a more robust and effective peer review process that aligns with the principles of openness and advances the pursuit of scientific knowledge.
Footnotes
1 The terms have varying definitions depending on the research team and experimental setup. Repeatability refers to the ability to repeat a study within the same team using the same experimental setup. Replicability, conversely, pertains to the ability to replicate the study by a different team while utilising the same experimental setup. Reproducibility involves conducting the study with different teams and different experimental setups. However, it is important to note that these definitions are not consistently applied, leading to ongoing debates and discussions within the scientific community. The terminology in this area is still evolving and subject to interpretation. Different scientific fields may have unique perspectives on these terms, leading to further complexity. Additionally, different dimensions of the 3Rs can exist, including aspects related to data (numbers), findings, and conclusions. Nevertheless, it is crucial to emphasise that the ability to repeat research plays a fundamental role in scientific methods. By openly sharing research data, software, and other research outcomes, we can enhance the rigour of scientific studies. While precision in terminology is desirable, it is much more important to prioritise the collective effort to improve research practices.
2 https://app.dimensions.ai/discover/publication
4 Estimated costs are over €2×10⁹ in 2020 for the US, China, and the UK alone, excluding manuscript revisions and editorial mediation expenses.
Literature
Aczel, B., Szaszi, B., & Holcombe, A. O. (2021). A billion-dollar donation: Estimating the cost of researchers’ time spent on peer review. Research Integrity and Peer Review, 6(1), 1-8. https://doi.org/10.1186/s41073-021-00118-2
Avissar-Whiting, M., Belliard, F., Brand, A., Brown, K., Clément-Stoneham, G., Dawson, S., … Williams, M. (2023, April 3). Advancing the culture of peer review with preprints. https://doi.org/10.31219/osf.io/cht8p
Bohannon, J. (2013). Who’s afraid of peer review? Science, 342(6154), 60-65. https://doi.org/10.1126/science.342.6154.60
Brembs, B., Huneman, P., Schönbrodt, F., Nilsonne, G., Susi, T., Siems, R., Perakakis, P., Trachana, V., Ma, L., & Rodriguez-Cuadrado, S. (2023). Replacing academic journals. https://doi.org/10.5281/zenodo.7643806
UNESCO. (2021). Draft Recommendation on Open Science. https://unesdoc.unesco.org/ark:/48223/pf0000378841
F1000Research: Open Data, Software and Code Guidelines. https://f1000research.com/for-authors/data-guidelines
Gabelica, M., Bojčić, R., & Puljak, L. (2022). Many researchers were not compliant with their published data sharing statement: a mixed-methods study. Journal of Clinical Epidemiology, 150, 33-41. https://doi.org/10.1016/j.jclinepi.2022.05.019
Ghosal, T., Kumar, S., Bharti, P. K., & Ekbal, A. (2022). Peer review analyze: A novel benchmark resource for computational analysis of peer reviews. PLoS ONE, 17(1), e0259238. https://doi.org/10.1371/journal.pone.0259238
Mavrogenis, A. F., & Scarlat, M. M. (2023). Quality peer review is mandatory for scientific journals: ethical constraints, computers, and progress of communication with the reviewers of International Orthopaedics. International Orthopaedics, 47(3), 605-609. https://doi.org/10.1007/s00264-023-05715-y
Pros and cons of open peer review. (1999). Nature Neuroscience, 2, 197-198. https://doi.org/10.1038/6295
Ross-Hellauer, T. (2017). What is open peer review? A systematic review [version 2; peer review: 4 approved]. F1000Research, 6, 588. https://doi.org/10.12688/f1000research.11369.2
Ross-Hellauer, T., & Horbach, S. P. J. M. (2022, December 21). ‘Conditional Acceptance’ (additional experiments required): A scoping review of recent evidence on key aspects of Open Peer Review. https://doi.org/10.31222/osf.io/r6t8p
Smith, R. (2006). Peer review: A flawed process at the heart of science and journals. Journal of the Royal Society of Medicine, 99(4), 178-182. https://doi.org/10.1177/014107680609900414
Ševkušić, Milica, & Kuchma, Iryna. (2023). DIAMAS deliverable: D3.1 IPSP Best Practices Quality evaluation criteria, best practices, and assessment systems for Institutional Publishing Service Providers (IPSPs) (Under review by the European Commission). Zenodo. https://doi.org/10.5281/zenodo.7859172
Stojanovski, J. (2018). Otvoreni recenzijski postupak / Open Peer Review. In I. Hebrang Grgić (Ed.), Otvorenost znanosti i visokom obrazovanju / Openness in Science and Higher Education (pp. 80-92). Zagreb: Školska knjiga. https://www.bib.irb.hr/959161/download/959161.Jadranka_poglavlje.pdf
Tennant, J. P., Dugan, J. M., Graziotin, D., et al. (2017). A multi-disciplinary perspective on emergent and future innovations in peer review [version 3; peer review: 2 approved]. F1000Research, 6, 1151. https://doi.org/10.12688/f1000research.12037.3
