Author: Françoise Gouzi

Reviewers: Carol Delmazo

Innovation in the Arts, Humanities, and Social Sciences (AHSS) often defies conventional academic evaluation. Groundbreaking digital tools, dynamic research outputs, and novel methodologies are shaping the future of scholarship—yet, outdated assessment methods fail to capture their true impact. In response, the OPERAS Innovation Lab team took a bold step forward and in 2024 released the Guidelines on Publishing and Evaluating Innovative Outputs in Social Sciences and Humanities (available on Zenodo). Building on the OPERAS Innovation Lab case studies of innovative outputs, we reflect on how to recognize and reward innovation, ensuring that research excellence is no longer measured by citations alone but by real-world influence, interdisciplinary reach, and societal impact.

The primary focus of these guidelines is on evaluating the innovative project as a whole – encompassing the project, service, or tool itself – rather than solely on the content it produces. While there are instances where the project and its content are intertwined, the emphasis here is on assessing the project’s overall innovation. For example, a new publishing platform that enables scholars to publish innovative research in digital humanities would be evaluated as an innovative project, not just for the innovative research it publishes.

Through our theoretical discussions, we identified the need for a comprehensive approach to research assessment that goes beyond traditional metrics and includes a wider range of contributions. We also highlighted the importance of research infrastructures in supporting innovation. 

Several cross-cutting questions on evaluation changes in Arts, Humanities, and Social Sciences have emerged as essential pillars to build the new evaluation framework. The challenges we identified are the following:

Current evaluation methods are inadequate

Traditional ways of assessing research, which mostly focus on counting citations and publications, don’t work well for AHSS projects. These projects often produce different kinds of results, such as digital tools or datasets, that are not always captured by standard metrics.

There is a need for a more inclusive approach

The framework calls for a shift towards a more comprehensive and inclusive way of evaluating research. This new approach should consider a wider range of contributions and use both qualitative and quantitative measures.

Research infrastructures support innovation

Research infrastructures play a crucial role in supporting innovation by providing financial and human resources, technical tools, and services. They empower researchers and institutions to develop, produce, and share a variety of digital tools and research outputs.

International declarations support change

Several international agreements and declarations, like the Barcelona Declaration, DORA and CoARA, also support recognizing diverse research outputs and call for changes in how research is evaluated, advocating for “bibliodiversity”.

Collaboration is key

The framework emphasizes the importance of collaboration among researchers, institutions, and policymakers to drive systemic change in research assessment practices.

Context and Methodology

The San Francisco Declaration on Research Assessment (DORA) and the Coalition for Advancing Research Assessment (CoARA) advocate for a comprehensive approach to research assessment. They recommend evaluating all research outputs, including datasets and software, alongside traditional publications and emphasize the importance of considering a diverse range of impact measures, including qualitative indicators such as influence on policy and practice, to gain a holistic understanding of research contributions. This approach goes beyond a narrow focus on publication metrics, recognizing the multifaceted value and impact of scholarly activities.

Traditional research assessment metrics, primarily focused on articles and monographs, often fail to capture the value of research diversity. This limitation stems from the lack of established quality assurance mechanisms for non-traditional scholarly work. To fill this gap, it is essential to develop appropriate assessment criteria for innovative formats and contributions that fall outside conventional publishing models.

Building Criteria for the Evaluation of Innovation

The Innovation Lab evaluation framework is informed by four case studies: the SHAPE-ID Toolkit, OAPEN Recommender System, Transformations: A DARIAH Journal, and the Journal of Digital History (JDH). These studies highlight the challenges and opportunities of fostering innovation within SSH. The framework was shaped during the validation workshop conducted by the OPERAS Innovation Lab team at the OPERAS Conference in Zadar (24-26 April 2024). 

Establishing a clear and understandable framework is essential to effectively assess innovative projects and their outputs in AHSS. The following ten criteria, each accompanied by a comprehensive explanation, are designed to guide the assessment of innovative projects and contribute to a more robust and inclusive research evaluation system.

Novelty

Originality of the idea. Does the project bring new research methods, practices, and innovative content? Novelty is not necessarily disruptive, but it changes traditional forms and ways of doing research. A project’s novelty may also lie in repurposing existing technologies and standards for novel content.

Utility

Addressing a need. Is the innovation responding to a need identified in research practices or the discipline? Will the innovative project help specific academic communities?

User engagement

This includes both the usability of the application and the functionalities that enable users to collaborate or contribute. It can include usage metrics or other indicators of community uptake.

User responsiveness

The capacity to provide a robust, accessible service or tool that adapts to different devices and contexts of use.

Impact and collaboration

Capacity to contribute to science communication and interaction with society, entrepreneurship, knowledge valorisation, and industry-academia cooperation. The service should provide a framework for both interdisciplinary exchange and disciplinary expertise.

Bridging potential

Capacity of the project or tool to propose multi-, inter-, and trans-disciplinary as well as inter-sectoral approaches, crossing the boundaries between disciplinary bodies of knowledge.

Support and training for users 

Capacity to document and describe the new technology and make that documentation readily available to the tool’s users. Are training materials and a support/helpdesk available?

Sustainability

Capacity to support and maintain the innovation over time thanks to funding, institutional support (a robust infrastructure), human resources, and Open Science policies or rules.

Peer review/evaluation process or workflow 

Open Peer Review is preferred to a single- or double-anonymised approach. The capacity to provide an open and transparent evaluation workflow enables better recognition for reviewers, more equity for authors, and a transparent framework for discussing ongoing research.

Reusability / reproducibility

Openness of research and results that are verifiable and reproducible (data reuse, protocols, and workflows under open licences). Is the project’s data harvestable by other services and available to other users and projects? A minimal harvesting check is sketched below.
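
As one concrete illustration of harvestability, many repositories expose their metadata through OAI-PMH, a standard harvesting protocol used by aggregators. The short Python sketch below queries Zenodo’s public OAI-PMH endpoint; the commented-out set name is only an assumption and would need to be replaced with the project’s own community or set.

```python
# Minimal sketch: checking that a repository's records are harvestable via OAI-PMH.
# Zenodo's endpoint is public; the commented-out "set" value is hypothetical.
import requests
import xml.etree.ElementTree as ET

OAI_ENDPOINT = "https://zenodo.org/oai2d"
params = {
    "verb": "ListRecords",
    "metadataPrefix": "oai_dc",  # Dublin Core, the minimal format every OAI-PMH endpoint must support
    # "set": "user-example-community",  # hypothetical set name to restrict the harvest to one community
}

response = requests.get(OAI_ENDPOINT, params=params, timeout=30)
response.raise_for_status()

# Count the records returned on the first page of the harvest.
ns = {"oai": "http://www.openarchives.org/OAI/2.0/"}
root = ET.fromstring(response.content)
records = root.findall(".//oai:record", ns)
print(f"Harvested {len(records)} records from the first OAI-PMH page")
```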

Editorial and institutional challenges of this evaluation framework

To properly assess the innovation dimension in a qualitative evaluation process, reviewers should rely on Open Science requirements to ensure the fairness and openness of the project. The FAIR principles and the CARE principles (which describe how data should be handled so that Indigenous governance over data and its use is respected) provide guidance on how research outputs should be organized and treated to ensure accessibility, understanding, exchange, and reuse.

From a technical point of view, and as technologies are constantly changing, assigning persistent identifiers (PIDs) to the project will enable reliable, robust, and long-term access to the different dimensions of the innovative output. PIDs improve the accessibility and openness of digital research outputs and enable sustainable referencing.
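
To make the role of PIDs more tangible, the sketch below resolves a DOI (here, the DOI of the Guidelines themselves) through standard DOI content negotiation and retrieves machine-readable metadata. This is only one of several PID workflows (Handle, ARK, ORCID, ROR) a project might rely on, and is offered as a minimal illustration rather than a prescribed method.

```python
# Minimal sketch: verifying that a PID (a DOI in this case) resolves and returns
# machine-readable metadata via standard DOI content negotiation.
import requests

doi = "10.5281/zenodo.14221728"  # DOI of the Guidelines cited in this post
resp = requests.get(
    f"https://doi.org/{doi}",
    headers={"Accept": "application/vnd.citationstyles.csl+json"},
    timeout=30,
)
resp.raise_for_status()

metadata = resp.json()
print(metadata.get("title"))      # human-readable title of the output
print(metadata.get("publisher"))  # registering repository, e.g. Zenodo
```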

Promoting visibility and reuse within the scholarly communication ecosystem is essential to fostering more comprehensive recognition and adoption of innovative outputs. Platforms such as OpenAIRE, OpenAlex, and the SSH Open Marketplace offer valuable channels for disseminating and showcasing these outputs.
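
As a minimal, hedged example of what such visibility looks like in practice, the sketch below checks whether an output is indexed in OpenAlex using its public REST API (OpenAIRE and the SSH Open Marketplace offer comparable interfaces with different schemas). The DOI used here is that of the FAIR principles paper listed in the references.

```python
# Minimal sketch: checking whether an output is indexed in OpenAlex.
import requests

doi = "10.1038/sdata.2016.18"  # FAIR principles paper, cited in the references
resp = requests.get(f"https://api.openalex.org/works/doi:{doi}", timeout=30)

if resp.status_code == 404:
    print("Output not (yet) indexed in OpenAlex")
else:
    resp.raise_for_status()
    work = resp.json()
    print(work["display_name"])                       # title as recorded by OpenAlex
    print("Open access:", work["open_access"]["is_oa"])
    print("Cited by:", work["cited_by_count"])
```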

To truly integrate innovative outputs into the research assessment landscape, a concerted effort is needed from various stakeholders. These include research teams, research performing organisations, and international institutions, which must collaborate to advocate for the inclusion of diverse outputs in national research assessment frameworks.

References

ALLEA, All European Academies. (2023). Recognising Digital Scholarly Outputs in the Humanities. ALLEA. https://doi.org/10.26356/OUTPUTS-DH

Gouzi, F., Barbot, L., Durco, M., Chambers, S., & Tasovac, T. (2024). DARIAH Data Policy. Zenodo. https://zenodo.org/records/10409010

Maryl, M., Wnuk, M., Gouzi, F., & Umerle, T. (2024). Guidelines on Publishing and Evaluating Innovative Outputs in Social Sciences and Humanities. Zenodo. https://doi.org/10.5281/zenodo.14221728

Tasovac, T., Romary, L., Tóth-Czifra, E., Ackermann, R. C., Alves, D., et al. (2023). The Role of Research Infrastructures in the Research Assessment Reform: A DARIAH Position Paper. ⟨hal-04136772⟩

Tóth-Czifra, E., Błaszczyńska, M., Buchner, A., & Maryl, M. (2021). OPERAS-P Deliverable D6.6: Report on quality assessment of innovative research in SSH. Zenodo. https://doi.org/10.5281/zenodo.4922537

Wilkinson, M. D., Dumontier, M., Aalbersberg, I. J. J., Appleton, G., Axton, M., Baak, A., Blomberg, N., et al. (2016). The FAIR Guiding Principles for Scientific Data Management and Stewardship. Scientific Data, 3, 160018. https://doi.org/10.1038/sdata.2016.18
