
Recommendations on Open Science Rewards and Incentives: Guidance for Multiple Stakeholders in Research

Open Access | May 2025


Introduction

Why Open Science is important

Science is a cumulative process (Merton, 1973) that builds on previous knowledge across all types of research outputs (Dasgupta and David, 1994; Walsh, Cohen and Cho, 2007). Although sharing research outputs as common goods should be the norm, in practice it is not.

The Open Science (OS) movement was forged in response to this concern. It refers to a range of activities (Grattarola et al., 2024), including the sharing of research outputs. OS enables replication, improves productivity, limits redundancy, and helps create more robust research methods and a rich network of resources, thus increasing research efficiency (Murray and O’Mahony, 2007; Shibayama and Baba, 2011; Walsh, Cohen and Cho, 2007). Ultimately, it contributes to the collective building and valorisation of scientific knowledge and to societal progress (Cole et al., 2024).

How modern science is recognised

Recognition and credit are fundamental aspects of modern scientific practice, serving as a foundation for any reward mechanism. This involves acknowledging contributions to scientific work, attributed to individuals, groups, or institutions (Shibayama and Baba, 2011). Crediting is the first step in valuing scientific contributions, typically quantified through various metrics that help build a scientist’s reputation. This process plays a critical role in the broader context of rewarding researchers, which encompasses aspects such as academic promotions, grant opportunities, and access to resources that support future discoveries (Latour and Woolgar, 1986; Shibayama and Lawson, 2021). Scientific discoveries gain credit through community attribution, peer review, and citations, and sometimes through patents (ALLEA, 2023). However, sharing intermediate or pre-publication outputs remains less established, as it does not align neatly with the conventional crediting systems in academia (Shibayama and Lawson, 2021). Despite the recognised importance of open sharing, many studies highlight that academia often fails to adequately value and reward efforts toward opening the scientific process (Hicks et al., 2015; Munafò et al., 2017; Thelwall, Kousha and Waltman, 2015; Wilsdon et al., 2015). Yet engaging in OS practices requires significant time, energy, and expertise, particularly in making data and software findable, accessible, interoperable, and reusable according to the FAIR principles (Wilkinson et al., 2016).

The current sharing practice of academics

In the ‘publish or perish’ culture, some outputs (such as data, databases, or algorithms) may give academics a competitive advantage, which can discourage them from sharing those outputs (Dasgupta and David, 1994; Haas and Park, 2010; Haeussler et al., 2014; Merton, 1973). Moreover, some commercialisation contexts, regulatory constraints, privacy issues or data-reuse concerns, as well as shortages of funds, time, capacity or technical resources, can also be barriers (Haas and Park, 2010; Walsh, Cohen and Cho, 2007). As a result, the amount of outputs shared through open mechanisms is still limited in many communities and disciplines, and many resources are shared in one-to-one transactions (Shibayama and Baba, 2011; Tenopir et al., 2015; Wallis, Rolando and Borgman, 2013). Thus, the degree of openness remains largely at the discretion of individual academics (Blume, 1974; Hackett, 2008; Nelson, 2016). However, academics broadly agree that open sharing is beneficial to science, and numerous studies have shown that, when outputs are requested, such requests are generally honoured (Czarnitzki, Grimpe and Pellens, 2015; Haas and Park, 2010; Shibayama and Baba, 2011; Walsh, Cohen and Cho, 2007). A clear consensus on how outputs should be shared and rewarded now needs to be established.

The current normative incentives for sharing

Since the Budapest Declaration (BOAI, 2002) that propelled the Open Access (OA) concept, various stakeholders, including governments, funding agencies, and research organisations, have developed policies to promote Open Science (Manco, 2022). Notable initiatives like the Scientific Electronic Library Online (SciELO), the Wellcome Trust, the National Institutes of Health (NIH), Queensland University of Technology and the Gates Foundation have been pioneers in introducing these policies. Despite these efforts, researchers often find that OS activities are insufficiently recognised in formal assessments, which discourages them from engaging in sharing practices (Arthur et al., 2021).

Efforts to address this gap have been spearheaded by initiatives within the Responsible Research Assessment (RRA) movement, such as the DORA declaration (DORA, 2012), the Leiden Manifesto (Hicks et al., 2015), the Metric Tide (Wilsdon et al., 2015) and the Dutch initiative ‘Science in Transition’ (Dijstelbloem et al., 2013). These efforts have broadened the spectrum of recognised research outputs, including datasets and software, though not always explicitly focusing on OS. The European Union has been a leader in incorporating OS into the RRA discourse. For instance, the European Commission’s Working Group on rewards under Open Science developed the Open Science Career Assessment Matrix (OS-CAM), which proposes criteria for evaluating OS activities at all career stages (Cabello Valdes et al., 2017). Furthermore, the European Research Area Policy Agenda for 2022–2024 has prioritised transforming research assessment systems to include OS practices, which was supported by the EU Council’s conclusions (EC-DGRI, 2022; EU Council, 2022). In this context, the Coalition for Advancing Research Assessment (CoARA, 2022) was established to harmonise research assessment practices with an emphasis on recognising OS activities. Subsequently, the Horizon Europe programme has incorporated OS into its evaluation of all research proposals and project assessments (EU Parliament and Council, 2021; EU AGA, 2023), and ongoing Horizon Europe projects such as ‘GraspOS’ (EU Horizon RIA GraspOS project, 2023) and Open and Universal Science, ‘OPUS’ (OPUS project, 2022) have been specifically designed to support reforms of RRA systems that include OS practices. Lastly, cOAlition S funders, including the European Commission, have recently introduced a proposal named ‘Towards Responsible Publishing’ (cOAlition S, 2023), which calls for the incorporation of OS practices into funders’ assessment policies and the elimination of journal metrics in the evaluation of researchers.

Several countries, such as the Netherlands (Kramer and Bosman, 2024; VNSU et al., 2019), France (CNRS, 2019), Norway (UHR Working group, 2021) and Finland (Working group for responsible evaluation of a researcher, 2020), as well as the Latin America and Caribbean region (CLACSO, 2019), have also initiated efforts to integrate OS practices into research assessments (Rijcke et al., 2023). Simultaneously, bottom-up international initiatives like the Research Data Alliance (RDA) and CODATA working groups have emerged, focusing on articulating OS concerns and offering recommendations for recognition and credit (CODATA WG, 2024; RDA-EoR IG, 2023; RDA-SHARC IG, 2017).

Objective

In this paper, we provide a set of recommendations developed by the RDA-SHAring Rewards & Credit interest group (RDA-SHARC IG) to help implement rewarding schemes for OS practices. These recommendations specifically emphasise incorporating sharing activities into research evaluation schemes as an overarching, valuable, and hopefully efficient strategy to promote OS practices. They target a wide range of stakeholders across the research and innovation landscape, as highlighted by the UNESCO Recommendation on OS (UNESCO, 2021, section 12), underlining the importance of a collaborative effort among researchers, research institutions and any organisation performing research (public and private), funders, government policymakers and publishers to transform the research culture toward OS (Nosek, 2024).

Methodology

Our values

The foundation of our methodology lies in the ethical principles and values of science. Traditional norms of science, known as CUDOS (Communalism, Universalism, Disinterestedness, Organized Skepticism), as described by Merton (1942; 1973), initially characterised the ethos of scientific practice. However, these norms, which tended to isolate science from society, no longer align with today’s inclusive science landscape. International efforts like the Singapore Statement on Research Integrity (2010), UNESCO Recommendations on Science and Open Science (2017), the Hong Kong principles (Moher et al., 2020), the European Code of Conduct for Research Integrity (ALLEA, 2023) and the CARE principles (Carroll et al., 2020) have established more comprehensive guidelines, promoting greater integrity, inclusivity, respect for indigenous rights, and further unified policies. These guidelines form the general framework for our recommendations.

Identifying the needs and research focus areas

Our process began with a Birds of a Feather (BoF) session at Research Data Alliance (RDA) Plenary 9 (2017), focusing on challenges in sharing data and rewarding such efforts. This session led to the creation of the RDA-SHARC interest group (RDA-SHARC IG, 2017), which i) developed a human-readable FAIR assessment tool (David et al., 2024) and ii) formed a core sub-working group (namely, the authors of the present work) to develop our recommendations. The needs were further refined through subsequent RDA plenary sessions, regular teleconference meetings, and email and other asynchronous exchanges (e.g., via Google Docs) (Figure 1, step 1).

Figure 1

Flowchart of developing the RDA-SHARC IG Recommendations on Open Science rewards and incentives to various stakeholders. The process included 4 steps, namely: 1) identifying the needs and research focus areas, 2) agreeing on terms and concepts (developing rewards-related terminology), 3) mapping existing policies and rewarding initiatives, 4) developing a set of recommendations out of SHARC IG meetings, a global survey and feedback from RDA sessions.

Agreeing on terms and concepts: terminology

This preparatory step led us to develop a shared terminology around research recognition and rewards in OS (Grattarola et al., 2023f), establishing a common understanding of the terms and concepts that map this landscape (Figure 1, step 2). This included considering intangible rewards like acknowledgments, citations, and co-authorship (Hicks, 2012; Latour and Woolgar, 1986) and tangible ones such as funding and career promotion (Haeussler et al., 2014; Nelson, 2016; Shibayama and Lawson, 2021). Opportunities for future collaboration were also reported as possible rewards for sharing (Haeussler et al., 2014; Shibayama and Baba, 2011).

Developing mapping tools

To further facilitate the use of our recommendations, we built, as a third step (Figure 1), two mapping tables: one summarising the main OS policies across countries, pointing to rewards-related information whenever specified (Grattarola et al., 2023e); another showcasing examples of existing OS rewarding tools (Grattarola et al., 2023a; Grattarola et al., 2023b; Grattarola et al., 2023c; Grattarola et al., 2023d). These mapping tools were identified through a survey conducted by the authors and through discussions held at SHARC meetings and RDA plenaries. Details of the survey methodology and results are available in Grattarola et al. (2024).

Finalising recommendations

Our final step involved developing actionable recommendations based on the needs and research focus areas identified during the SHARC IG working sessions and meetings (Figure 1, step 4). These recommendations aimed to i) guide researchers in the existing OS rewarding landscape on how to get credit in practice, and ii) raise awareness among various stakeholders in the research assessment system regarding which rewarding mechanisms to provide and implement to ensure the system works effectively. The recommendations were informed by the survey results and continuous feedback from RDA-SHARC members and participants (Grattarola et al., 2024). A first version was presented and discussed at RDA Plenary 20 (March 2023). Feedback from the audience was then integrated, as far as possible, into a second version, which was submitted to the RDA community review process and endorsed as an RDA Interest Group output in June 2024 (RDA-SHARC IG, 2017). This resulted in the final version detailed in the present work.

Which actions to implement first will depend on the stage each stakeholder is at. Therefore, we intentionally did not prioritise the actions in our recommendations.

Overview and Discussion

Tables 1, 2, 3, 4 and 5 distil the comprehensive recommendations and specific examples for fostering Open Science (OS) practices across different stakeholders: research performing organisations, funders, publishers, government policymakers, and researchers. These are discussed below.

Table 1

Recommendations to Research performing organisations.

Scope: Promoting RRA
Recommended action: Participate in building and promoting relevant frameworks and initiatives related to responsible research assessment (e.g., join forums such as CoARA, the Coalition for Advancing Research Assessment)
Examples*/details: Sign the DORA Declaration; sign the CoARA Agreement

Scope: Engaging with OS communities
Recommended action: Be part of the OS conversation by joining relevant communities, such as the Research Data Alliance
Examples*/details: List of examples of OS communities of practice

Scope: Adopting formal OS policies
Recommended action: Establish institutional prerequisites to enable the practice of OS:
* Post institutional OS policies in a visible and easy-to-find place (website), covering all facets of OS (publications, data, software, citizen science)
* Mandate deposit of ALL research outputs (e.g., publications, datasets, code) in the institutional or another compliant repository, made publicly available under an open licence (no later than the time of an associated publication, as far as possible). In case of legitimate constraints, use a ‘dark’ deposit with open metadata: a ‘dark’ (or restricted) deposit is a work in a repository whose full text stays hidden from the public (not OA), while the associated metadata remain publicly accessible so that authors’ scholarly records stay discoverable
* Mandate a DMP/software management plan for all research projects in which staff/postgraduate students are involved
* Require research data to be managed in line with the FAIR principles
* Ensure that all publications (co-)authored by staff/postgraduate students contain data availability statements (example: OS at the Finnish Meteorological Institute)
* Encourage staff/postgraduate students to retain sufficient IP rights to comply with OA requirements (examples: Harvard University’s Rights Retention policy; UK institutional rights retention policies)
* Minimise the administrative burden generated by some OS activities and provide support to facilitate these steps while promoting trust and transparency
Recommended action: Include criteria for open research activities in recruitment, evaluation and rewarding policies:
* Consider the Hong Kong principles to reinforce open science and research integrity (example: WCRI Hong Kong principles)
* Consider/create indicators (qualitative and/or quantitative) in general, as well as disciplinary data-level metrics, for crediting data sharing in evaluation schemes (examples: CoARA agreement on RRA; DORA RRA documents; EC’s OS Career Assessment Matrix (OS-CAM))
* Boost appreciation of researchers who excel in research data management and OS practices, including well-documented, FAIR and open digital outputs, during their annual reviews, by integrating these activities into the institutional research evaluation scheme (examples: NOR-CAM Assessment Framework; TU Delft strategic plan 2025; BIH QUEST programme; researcher assessment at FMI)
* Promote that non-OA outputs (closed, i.e., only accessible behind a paywall) should not be reported for performance evaluation procedures (example: CNRS policy, p. 11)

Scope: OS capacity building
Recommended action: Provide OS capacity-building support:
* Provide OS courses (ideally as part of the annual mandatory training for research staff and mandatory subjects for postgraduate students)
* Organise institutional working groups and workshops
* Provide digital training materials and newsletters
* Ensure that the various facets of OS are coherently developed and do not work in silos
(Examples: FAIR & OS training initiatives; UNESCO’s index; mandatory OS course for PhD candidates at Maastricht University; mandatory OS course for PhD candidates at Erasmus University Rotterdam; UU Digital Competence Centers; NFDI, DE)
* Establish dedicated human resources/units, such as an OS regulatory adviser and data stewards & managers; appoint professionalised data stewards; and engage libraries
* Facilitate collaboration with related OS groups and people
(Examples: TU Delft Data Stewardship project and Data Champions initiative; CNRS DDOR)

Scope: OS infrastructure
Recommended action: Provide infrastructure and material resources for OS:
* Provide or work with a trusted repository (certification based on CoreTrustSeal, Nestor Seal DIN 31644 or ISO 16363) (example: EC’s expectations for trusted repositories, pp. 155–156)
* Provide digital services and operational tools (e.g., DMP tool, FAIR data management, anonymisation and analysis tools, entry points for OS help)
* Develop/refine systems that track/monitor research outputs, including OS outputs (example: Korean NTIS platform)

Scope: OS funding
Recommended action: Provide financial support for OS:
* Cover costs associated with registering PIDs (e.g., DOIs) for all research outputs, including datasets
* Determine reasonable OA costs to support while transitioning to the Diamond OA model (example: the new Gates Foundation OA policy)
* Cover costs associated with research data/software management (example: RADS Initiative estimates of institutional expenses for public access to research data)
* Provide templates for cost calculation of OS activities in order to facilitate their inclusion in funding applications
* Financially support sustainable tools, initiatives and infrastructure development for OS locally, nationally and internationally (examples: SCOSS; Liverpool University Press’s Opening the Future programme for Diamond OA books; the 2024 Report on the Sustainability of Diamond OA in Europe)

Scope: OS rewards
Recommended action: Implement various types of rewards:
* Awards and gifts to researchers who contribute very actively to OS (example: Open Research Awards: a Primer from UKRN)
* Organise free time (sabbatical time)
* Salary bonuses for researchers actively engaged with OS
* Create data champions schemes (example: TU Delft Data Champions initiative)
* Create an OS stamp/badge/label (e.g., in a PhD degree certificate) (examples: OS badges/certificates/tokens)

* The examples referred to in the table point to initiatives/policies active in 2024.

Table 2

Recommendations to Funders.

Scope: Engaging with OS communities
Recommended action: Be part of the OS conversation by joining relevant communities, such as the Research Data Alliance
Examples*/details: List of examples of OS communities of practice; RDA’s Research Funders and Stakeholders on Open Research IG; RDA’s National PID Strategies WG

Scope: Adopting formal OS policies
Recommended action: Adopt and publish formal policies requiring/strongly encouraging OS activities:
* Be specific about whether it is a requirement or a recommendation (e.g., require vs encourage preprints)

Scope: OS evaluation
Recommended action: Align OS outputs with traditional ones:
* Recognise well-documented, FAIR and open digital outputs as first-class contributions during the project lifecycle and in the research assessment framework (examples: NOR-CAM Assessment Framework; EC’s OS Career Assessment Matrix (OS-CAM))

Scope: Monitoring OS outputs
Recommended action: Monitor compliance in OS implementation and make it transparent to relevant stakeholders:
* Share funded OS activities with open scholarly infrastructure, academic databases and search engines (examples: transition of the Open Funder Registry into the Research Organisation Registry; OpenAlex, an open bibliographic database; funders’ support of the Barcelona Declaration on Open Research Information)
* Share/credit the array of research outcomes from funded projects and explore project identifiers such as RAiD as an opportunity to link project outcomes (examples: RAiD; Korean NTIS platform, which links outputs based on the national R&D project number)

Scope: OS funding
Recommended action: Create calls financing OS-driven activities:
* Calls financing data sharing and reuse, and support for software that is critical to research (examples: DataWorks! Prize; Essential Open Source Software for Science)
* Short-term funding for early career researchers to improve OS sharing
Recommended action: For all research projects, systematically allocate a portion of the proposal budget to OS activities, such as data management and sharing (example: a pilot incentive programme from the Uruguayan ANII research funding agency)
Recommended action: Ensure that enough funding is dedicated to appropriate resources for staff and OS infrastructure devoted to the development of shared data platforms (i.e., with standardisation, quality control and analysis tool services that will enable real-time use of data within a project collaboration and future reuse by all) (example: LifeWatch services)

* The examples referred to in the table point to initiatives/policies active in 2024.

Table 3

Recommendations to Publishers.

Scope: Unambiguous identification
Recommended action: Make the use of ORCID mandatory in all research outputs (as it is the only universal and free identifier):
* Make the ORCID search easier in the manuscript submission system
Examples*/details: Getting started with your ORCID record

Scope: Findable data & software citation
Recommended action: Require that authors cite the data and software they produce and/or reuse in the methods/reference section or in a data/software availability statement
Examples*/details: AGU’s Data & Software Availability Statement

Scope: Pre-printing
Recommended action: Provide support for preprints to facilitate Open Access and open peer review
Examples*/details: eLife’s New Model; Peer Community in

Scope: Open peer review
Recommended action: Foster discussion on the implementation of open peer-review models and the recognition of expert efforts in open peer review
Examples*/details: Open Research Europe: Open Peer-Review Publishing Model

Scope: Recognising contributorship
Recommended action: Adopt the CRediT taxonomy to enable the mention of OS activities as part of contributors’ research outputs
Examples*/details: Implementing CRediT; ESIP Research Artefact Citation (see Activities/Large Spreadsheet of Research Artefacts)

Scope: Encouraging OS activities
Recommended action: Adopt the OS badges initiative to award badges based on pre-registration/open data/open materials (example: CoS Badges initiative)
Recommended action: Encourage OA publishing in all LMICs by revising the criteria for publishing fees and adjusting them based on meaningful indicators (for instance, to the national gross domestic expenditure on R&D (GERD) and not only to the country’s GDP) (examples: Research and Development Expenditure (% of GDP); Research4Life)

Scope: Assessing openness
Recommended action: Assess journals for transparency and openness:
* Start by assessing OA and use the TOP Factor for more advanced assessment (example: TOP Factor)
Recommended action: Establish data and software review mechanisms where relevant:
* Establish data editors who work with the publication stakeholders to assess the quality and FAIRness of data/software (example: role of data editors in astronomy)

* The examples referred to in the table point to initiatives/policies active in 2024.

Table 4

Recommendations to Government policymakers.

Scope: Promoting national overarching policies on OS
Recommended action: Develop overarching policies requiring/strongly encouraging OS activities at all levels, including an increase in OS awareness among decision-makers. Ensure that the national policies will allow to:
* Harmonise practices
* Provide a budget
* Monitor implementation across disciplines and institutions
* Include rewarding mechanisms as key elements of OS policies (positive aspects rather than a ‘burden’ and requirements only):
  • Create observatories of practices that showcase the rewarding mechanisms in place or being piloted in real life
  • Provide funding to compare, value and harmonise mechanisms and to study such mechanisms in depth
  • Facilitate networking and sharing of practices across institutions at the national level
  • Harmonise the way mechanisms are assessed
  • Participate in international comparisons and organise involvement in international initiatives (e.g., SCOSS, CoARA, RDA)
  • Facilitate the implementation of evaluation criteria, considering all aspects of OS (i.e., not only open publications and open data, but also actual reuse of existing data and citizen science activities engaging the public in the scientific process)
Examples*/details: RDA-SHARC list of examples of national/institutional OS policies

* The examples referred to in the table point to initiatives/policies active in 2024.

Table 5

Recommendations to Researchers.

Scope: Raising awareness of OS policies
Recommended action: Be aware of the existing and relevant institutional, countrywide, regional and community research policies, including laws, regulations and agreements
Examples*/details: RDA-SHARC list of examples of national/institutional OS policies

Scope: Raising awareness of OS training
Recommended action: Be aware of OS training sessions and resources provided by institutions or communities
Examples*/details: UNESCO OS Capacity Building Index; OS Loterre Thesaurus

Scope: OS capacity building
Recommended action: Maximise digital presence as much as possible, using PIDs for individuals and for all outputs (e.g., ORCID; DOI or another identifier for Open Access publications, Open Access datasets and open source software) (examples: Parsec Digital Presence checklist; PLOS Handbook/Guide)
* Include citation elements for research data/software created in the References section of a paper. To support indexing and reuse:
  • Use a structured style that includes the nature of the published object (e.g., data, software), such as the American Psychological Association (APA) style
  • Include a persistent identifier (DOI, preferred) or a URL
  • Use labels/bracketed descriptions (e.g., [Dataset], [Software], [Collection], [ComputationalNotebook])
* Include a data/software availability statement in any paper, describing where and how data are available and, if possible, how to cite them
(Examples: AGU’s Data and Software Availability and Citation Checklist & Templates)
* Update CV and reporting information with OS activities

Scope: Recognising contributorship
Recommended action: Acknowledge OS contributorship:
* Specify all kinds of contributorship early in projects (example: The Turing Way project’s Acknowledging Contributors)
* Use the CRediT taxonomy (example: Implementing the CRediT Taxonomy):
  • Allocate the terms appropriately to project contributorship and contributions to research outputs
  • Advocate for institutional acknowledgement and adoption of the taxonomy for research outputs
Recommended action: Cite or acknowledge researchers’ OS outputs while leveraging PIDs (example: F1000 Open Data, Software and Code Guidelines):
* Cite data and research outputs in the Data Availability Statement and References sections of papers
* Acknowledge and cite the OS tools used, e.g., with an identifier or a ‘How to cite’ statement (if any)

Scope: Raising awareness of OS costs
Recommended action: Be aware of how to include OS costs in all funding applications
Examples*/details: Curation and Data Management Services

Scope: Raising awareness of OS financial rewarding
Recommended action: Solicit dedicated financial reward or support:
* Apply to specific funds for OS activities wherever relevant
* Apply to OS prizes/awards, if any
Examples*/details: RDA-SHARC list of examples of existing financial rewarding tools

Scope: Raising awareness of OS symbolic rewarding
Recommended action: Get symbolic rewards:
* Apply for OS certificates, OS ambassador or OS badge schemes
* Apply for training badges
* Join OS acknowledging opportunities to gain visibility/reputation
Examples*/details: RDA-SHARC list of examples of existing symbolic rewarding tools

* The examples referred to in the table point to initiatives/policies active in 2024.

Recommendations to research performing organisations (Table 1)

To gain insight and learn how to support OS activities, institutions should first actively join RRA- and OS-related communities/initiatives (e.g., DORA, CoARA, RDA) and encourage their personnel to be active in them. Formal OS policies should be adopted and posted on institutional websites, ideally in a discoverable and usable format (e.g., human and machine readable), and communicated to the communities they serve. Central to these policy measures, research outputs should be deposited in community-trusted repositories (e.g., institutionally supported or CoreTrustSeal-certified repositories) and made publicly available and reusable under permissive licences. To make these outputs fully reusable, a data management plan (DMP) should be required for all research projects and the FAIR principles should be applied as much as possible. In particular, all publications (co-)authored by researchers/staff and students should contain ‘Data Availability Statements’ and data citation references (and likewise for other research outputs such as software).
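Where institutions want such DMPs to be monitorable, they can additionally encourage machine-actionable ones. The sketch below is a minimal, hypothetical example in the spirit of the RDA DMP Common Standard for machine-actionable DMPs; the field selection is simplified and all identifiers and URLs are placeholders, not a definitive schema.

```python
# A minimal, illustrative machine-actionable DMP, loosely following the
# RDA DMP Common Standard (maDMP). Identifiers and URLs are placeholders.
import json

dmp = {
    "dmp": {
        "title": "DMP for the EXAMPLE project",  # hypothetical project
        "created": "2024-01-15",
        "dmp_id": {"identifier": "https://doi.org/10.1234/example-dmp",
                   "type": "doi"},  # placeholder DOI
        "dataset": [{
            "title": "Field survey measurements",
            "dataset_id": {"identifier": "https://doi.org/10.1234/example-data",
                           "type": "doi"},
            "distribution": [{
                "access_url": "https://repository.example.org/dataset/42",
                "license": [{
                    "license_ref": "https://creativecommons.org/licenses/by/4.0/",
                    "start_date": "2024-06-01"  # date from which the licence applies
                }]
            }]
        }]
    }
}

print(json.dumps(dmp, indent=2))  # serialise for exchange with DMP tools
```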

Furthermore, OS practices expected by a policy should be monitored and rewarded, implying that they should be considered as part of the criteria for recruitment and evaluation. A prerequisite for OS monitoring is engagement with persistent identifier (PID) infrastructures, such as DataCite, which enable tracking OS activities and outputs through relevant metadata. Even though openly shared datasets, software, protocols, and other research outputs are increasingly accompanied by Digital Object Identifiers (DOIs) and can be tracked, these efforts are not always fully credited as part of research evaluation and recruitment procedures. There is a need to develop new metrics and indicators for evaluating OS practices, aligning with principles of openness, transparency, and collaboration, and thereby crediting the creator. Assessing scientific production traditionally relies on citation-based metrics from databases like Web of Science or Google Scholar. However, discussions in the research community have moved beyond traditional metrics (from PubMed Medline, Scopus etc.; Datacite, 2024) and have explored alternative approaches potentially better suited to OS activities (Bosman, Debackere and Cawthorn, 2024; Das, 2015; Ugwu Okechukwu et al., 2023).
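As an illustration of PID-based monitoring (a hedged sketch, not a method described by the authors), an institution could query DataCite’s public REST API for datasets whose creators carry its affiliation identifier. The ROR ID and the exact query fields below are assumptions to be adapted to the local metadata practice.

```python
# Hedged sketch: listing an institution's openly registered datasets via
# the DataCite REST API. The ROR ID is a placeholder and the query-field
# syntax may need adjusting to how affiliations are recorded.
import requests

DATACITE_API = "https://api.datacite.org/dois"
ROR_ID = "https://ror.org/00x0x0x00"  # hypothetical institutional ROR ID

params = {
    "query": f'creators.affiliation.affiliationIdentifier:"{ROR_ID}"',
    "resource-type-id": "dataset",  # restrict results to datasets
    "page[size]": 100,
}
resp = requests.get(DATACITE_API, params=params, timeout=30)
resp.raise_for_status()

for record in resp.json().get("data", []):
    attrs = record["attributes"]
    title = (attrs.get("titles") or [{}])[0].get("title", "untitled")
    print(f'{attrs.get("doi")}: {title}')
```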

Capacity building is critical to implementing OS policies. OS capacity building should be improved by incorporating OS education into research workflows (such as in curricula, training programmes, and working groups), so that it becomes part of the culture. Infrastructure and material resources for OS, such as digital services and tools, should be provided by institutions (e.g., FAIR data management services, DMP tools, anonymisation tools, and guidance towards trusted repositories). Notably, OS practice should be facilitated and streamlined by services wherever relevant, such as automated metadata completion via persistent identifiers, and sufficient copyright and intellectual property rights should be retained to comply with OA and OS requirements.

Another important aspect is financial support for OS, including PID-related costs (e.g., DOI registration for all research outputs, including datasets), costs associated with research data/software management, and investments in national/regional OS initiatives such as Diamond OA. To support OS activities, it is important to include related costs in funding applications, create funding opportunities to work with relevant OS communities, and establish other incentives for OS activities. Various types of OS rewarding solutions need to be explored and implemented, ranging from awards, salary bonuses, champion and badging schemes to additional free time (e.g., sabbaticals), depending on context. These should also be integrated and recognised as part of recruitment, promotion and tenure schemes (e.g., recognising Open Access to research outputs). Token recognition systems (e.g., blockchain-backed) are also emerging as a new opportunity to reward the contributions that academics make to the scientific ecosystem (Finke and Hensel, 2024). This adds to existing citation mechanisms, which recognise data, software, and other research outputs.

Recommendations to funders (Table 2)

For funders to support OS, it is important that they develop policies that require, or at a minimum encourage, OS activities in their communities and integrate them into their proposal workflows. To develop these policies, funders should gain a better understanding of current open research practices and capabilities by conducting landscape analyses, engaging with the OS community, leveraging expertise, and identifying initial steps (i.e., low-hanging fruit) that can be taken to monitor and guide these activities. Mapping key stakeholders in OS would be prudent, to avoid being overwhelmed and to interface with the OS community via these stakeholders. For reference, Aligning Science Across Parkinson’s (ASAP, 2021) is an example of a more forward-looking funder policy.

OS monitoring is still a relatively new and developing aspect of the research community, where organisations like UNESCO are guiding these conversations. It is difficult for funders to track these conversations, however, so it is important for these groups to engage funders where reasonable (e.g., Scilifelab Data Centre and ReSA, 2024), for instance to develop a common framework and schema in which policy recommendations and requirements can be aligned. For the funders’ sake, these communities should also work towards ensuring that the underlying sources and workflows used to provide information for monitoring and assessment are clear. Funders are limited in how they can interface with OS infrastructure, so it is important for infrastructure providers to take a simple approach to how they need funders to provide them with information (for instance, asking funders to interact with APIs or choosing between XML and CSV). The support of funders like Arcadia for projects such as OpenAlex (Portenoy, 2024) underscores the importance of investing in collaborative, open scholarly infrastructure to be used as sources for OS monitoring. This commitment is shared by other funders, such as the Bill & Melinda Gates Foundation and the French National Research Agency, which have demonstrated their support by signing the 2024 Barcelona Declaration on Open Research Information.
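For instance (a hedged sketch, not a workflow prescribed in this paper), a funder could use OpenAlex, mentioned above, to get a first view of the open-access status of the works it funded; the funder ID below is hypothetical.

```python
# Hedged sketch: grouping a funder's works by open-access status via the
# OpenAlex API. Replace FUNDER_ID with the funder's actual OpenAlex ID.
import requests

FUNDER_ID = "F4320300000"  # hypothetical OpenAlex funder ID
resp = requests.get(
    "https://api.openalex.org/works",
    params={
        "filter": f"grants.funder:{FUNDER_ID}",
        "group_by": "open_access.oa_status",  # gold, green, hybrid, closed, ...
    },
    timeout=30,
)
resp.raise_for_status()

for group in resp.json()["group_by"]:
    print(f'{group["key_display_name"]}: {group["count"]}')
```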

Initiatives like the national PID strategies out of RDA (Brown et al., 2022) are helpful to funders as they outline the infrastructure components required to enable OS. An example is RAiD (Research Activity Identifier), which allows funders to interlink outputs and resources, but also to better understand (interdisciplinary) collaboration in the projects they fund. Not every funder has the capability to implement a data management plan workflow, but an output-based approach offers an alternative route to monitoring and assessment. In line with PIDs, which make researchers’ outputs searchable and discoverable and guarantee their long-term accessibility and tracing, it is worth mentioning emerging decentralised PID approaches such as dPIDs (Hill, Koellinger and Van Winkle, 2024) and dARK (Matas et al., 2023) as potential new monitoring systems to be explored.

New approaches to funding OS need to be explored and implemented, in which funding is allocated to support policies. These can include prizes celebrating OS, such as the ‘DataWorks! Prize’; developing ‘OS champions’, as at the Michael J. Fox Foundation (US); or encouraging and funding DMPs and data publishing, as at ANII (Uruguay). Coordination is also key, as a number of funders are limited in how much they can allocate to OS compared with funders that allocate more towards big initiatives and infrastructure projects. The decision regarding what to fund in OS often depends on the funder’s vision, mission, goals, and values.

Supporting OS requires certain commitments from funders beyond infrastructure alone. Diversity, Equity, Inclusion, and Accessibility (DEIA) should be integrated into programmes, together with fostering team science, collaboration, and greater transparency, in line with the CARE principles (Carroll et al., 2020). These are key tenets of OS, but it is also important that funders consider which principles and values matter to them and how these align with OS (e.g., supporting preprints and Open Access for the public good). Such principles and values can serve as a compass to guide funders through a dynamic OS landscape. Funders should also look internally at how they dedicate staff time and resources to support OS (e.g., setting up teams and roles).

Recommendations to publishers (Table 3)

Piwowar and Chapman (2008) investigated the data sharing policies of 70 journals and found that researchers share data more frequently when journals have such a policy, and that the probability of sharing data correlates positively with the strength of the policy (Mongeon et al., 2017). Publishers’ policies are therefore key to OS implementation. Over time, many publishers have established sharing policies in line with recommendations to research funders and institutions, yet journals still need to provide clearer instructions to authors, reviewers and staff to encourage OS and foster rewarding schemes for it.

Journals should facilitate researcher-authors’ compliance with good OS practices as a prerequisite to credit. This entails implementing a number of connected measures: first, establishing a clear mandate to use unique PIDs for both individuals and their research outputs, to enable their digital connectivity to the scholarly record and the attribution of their work; second, making a clear request that all data and software related to a published manuscript adhere to the FAIR principles, along with guidance on how to do so and where to deposit these resources to enable reuse; third, providing support for preprints, which also helps facilitate Open Access; and fourth, requiring the full and proper citation of all data and software, whether created, used or reused from others’ research, in all publications, as this is indispensable for receiving credit.

Requesting FAIR data and software implies that editorial staff and reviewers are able to verify the proper citation of data and software and to ensure that all supplementary resources are openly available, free of charge, even if the article is not. For this, journals should assign specific editors, such as ‘data editors’, to assess the quality and FAIRness of data and software (e.g., The American Naturalist). Supporting the FAIR principles in their policies, combined with clear instructions on how authors should comply, will help journals make strides towards more automated reviews.
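As a purely illustrative sketch (no specific journal workflow is prescribed here), one first-pass check a data editor might automate is whether a submitted dataset’s DOI metadata declares an open licence, here via the DataCite API; the DOI used is a placeholder.

```python
# Hedged sketch: flagging whether a dataset DOI declares an open licence in
# its DataCite metadata. A real editorial check would go much further
# (documentation, accessibility, FAIRness of the files themselves).
import requests

def declares_open_licence(doi: str) -> bool:
    resp = requests.get(f"https://api.datacite.org/dois/{doi}", timeout=30)
    resp.raise_for_status()
    rights = resp.json()["data"]["attributes"].get("rightsList") or []
    open_markers = ("creativecommons.org", "opensource.org")
    return any(
        marker in (entry.get("rightsUri") or "")
        for entry in rights
        for marker in open_markers
    )

print(declares_open_licence("10.5281/zenodo.0000000"))  # placeholder DOI
```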

Peer review is essential to the scientific method, and publishers should endeavour to recognise its importance and promote transparency through open peer-review models (with or without reviewers’ anonymity). This can be an additional way to expand OS and improve responsible research assessment. Journals should systematically implement existing tools, such as the CRediT taxonomy, to clarify individual contributions and roles in research works, and systematically use existing guidelines, such as the TOP Factor, to assess their own openness and transparency.

Finally, to foster greater inclusivity, it is crucial to reconsider the current calibration of OA publishing fees, which is based solely on a country’s GDP for Low- and Middle-Income Countries (LMICs). This approach unfairly impacts countries such as Uruguay, whose GDP is not considered low even though its R&D funding is. In such cases, it is imperative to employ more meaningful economic indicators to avoid exacerbating disparities in global knowledge access and to calibrate more equitable costs. Programmes such as Research4Life provide one mechanism publishers can use to calibrate costs. More concrete examples are provided in Table 3.

Recommendations to government policymakers (Table 4)

A government’s adoption and promotion of a national OS policy is an important driver of its implementation. It demonstrates political willingness and helps facilitate the harmonisation of practices across a variety of institutions and disciplines: giving common guidelines and a roadmap to all universities and research institutes facilitates a consistent uptake of OS across territories, institutes and disciplines. Some countries were early in setting up a national OS strategy (Sveinsdottir, Davidson and Proudman, 2021), and a few of them, such as France (MHERI, 2021) and the Netherlands (Gielen et al., 2022), have included rewarding mechanisms. The French national OS plan mentions a number of measures to make OS practices sustainable, among them the requirement for changes in the evaluation system. In the Dutch national OS strategy, a requirement for realising OS is to ‘Make OS rewarding through incentives (Recognition & Rewards)’.

It is important to recognise that international reference texts such as the UNESCO Recommendation on OS (UNESCO, 2021) and the OS policies for European countries (CoNOSC, 2022) have stimulated such national strategies and policies. By the end of 2023, eleven countries had national policies stemming from UNESCO’s OS recommendations (Austria, Colombia, Cyprus, Ireland, Italy, Latvia, Lesotho, Romania, South Africa, Spain, and Ukraine), so the number of countries having such national policies had doubled since the recommendation. Four countries included OS principles in their national Science Technology and Innovation policies (Estonia, Ghana, Sierra Leone, and Slovenia); eleven countries (Botswana, Côte d’Ivoire, Croatia, Kenya, Mozambique, Namibia, Nigeria, Somalia, United Republic of Tanzania, Uganda, and Venezuela) are currently developing OS policies taking into account the UNESCO recommendation though not specifically mentioning rewarding and crediting measures (UNESCO, 2023).

Our overarching recommendation is for governments to develop national OS policies. Table 4 gives examples of such national strategies in various countries that policymakers can adapt to their own contexts. Considering such policies, a number of specific elements need attention:

First, incorporating effective reward mechanisms into national OS policies is important. Clear incentives are needed, as opposed to framing OS activities as burdensome requirements. These incentives are vital for fostering the acceptance and successful implementation of OS policies within the scientific community.

Second, compiling and documenting use cases via dedicated websites would highlight real-life mechanisms that have been implemented or piloted. Given the substantial diversity among institutions and policies across various domains and contexts, it is clear that rewarding different scientific activities is not a ‘one size fits all’ effort. Showcasing use cases would accelerate the implementation of systems that work effectively across most domains. At the same time, it would accommodate specific mechanisms where necessary. Additionally, it would help avoid repeating mistakes or duplicating efforts.

Third, systematic and rigorous approaches to analysing OS activities, particularly reward mechanisms, are needed. The French national OS plan, for example, launched a specific call for research proposals in 2023 to study OS activities, including reward systems. To achieve a comprehensive understanding, we recommend prioritising and encouraging funding for projects dedicated to the in-depth analysis of these mechanisms, or providing direct funding for such research initiatives.

Finally, various practices are often established and tools or mechanisms tested, but frequently in silos, without coordination between institutions. Such coordination can be organised and highlighted at the national level; thus, facilitating networking and sharing of practices across institutions nationally is highly recommended. Further, although international initiatives such as RDA and CoARA are pivotal for harmonising assessment methods and mechanisms, there is still a notable lack of dedicated efforts to standardise the assessment of rewards for OS activities at the national level across institutions and disciplines. Addressing this gap should be a priority to advance OS on a global scale.

Recommendations to researchers (Table 5)

At the individual level, and in the current research ecosystem, getting some kind of reward from OS activities results from several distinct mechanisms that researchers must be aware of.

First, the normative context framing one’s research activity (in particular national and institutional frameworks, where they exist) sets the tone for what must, can, or should be done, and sometimes describes how. It is therefore imperative that everyone is aware of the policies and regulations in place and of the possible means accompanying their implementation. OS frameworks are increasingly endorsed worldwide and may provide opportunities to obtain or apply for various kinds of training and support (material, financial, human), for instance through specific funds, prizes or awards (Grattarola et al., 2023a; Grattarola et al., 2023d), or by anticipating an OS budget in funding applications. Researchers need to watch this evolving context to anticipate requirements and seize opportunities.

Second, a number of actions are necessary to maximise one’s digital presence and visibility on the basis of crediting processes in research (detailed in Stall et al., 2023). The prerequisite for crediting is an identification scheme for researchers and their outputs that is unambiguous, persistent and embedded in the scholarly digital ecosystem. Attributing a PID with rich associated metadata to a research object makes it searchable and discoverable and guarantees its long-term accessibility and tracing. This is easily achievable for datasets or databases that are digital by nature. For physical/material resources, it first requires that their description be digitised and made accessible on the web (e.g., via metadata-only datasets, data papers, or landing pages). Identification through PIDs is now supported by robust organisations, especially DataCite, operating DOIs for digital objects, and ORCID, for individual researchers. Making those identified elements visible is the next step to getting or giving credit. It is essential that researchers refer systematically to all their own OS-identified outputs wherever relevant through citation and/or acknowledgement, notably in papers, CVs, and reporting activities. It is equally essential that researchers cite or acknowledge others’ outputs that they reuse in their own research. This is also intrinsically linked with how co-authorship is managed within projects/teams. It is important to consider the diverse contributor roles, and it is advisable to establish how to handle co-authorship from the beginning of a project to ensure that everyone’s contribution (including, e.g., technicians or data collectors) is included.
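To illustrate how PIDs make such citation straightforward (an illustrative sketch, not part of the recommendations themselves), a ready-made reference for any DOI-identified output can be retrieved through DOI content negotiation, which the DataCite and Crossref resolvers support; the DOI below is a placeholder.

```python
# Hedged sketch: fetching an APA-style citation for a (placeholder) dataset
# DOI via DOI content negotiation, for pasting into a References section.
import requests

doi = "10.1234/example-dataset"  # placeholder DOI, for illustration only
resp = requests.get(
    f"https://doi.org/{doi}",
    headers={"Accept": "text/x-bibliography; style=apa"},
    timeout=30,
)
resp.raise_for_status()
# Expected shape: "Author, A. (2024). Title [Data set]. Repository. https://doi.org/..."
print(resp.text)
```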

Third, obtaining symbolic rewards such as OS badges and certificates or OS ambassador roles can serve as a form of recognition for researchers who engage in OS practices (e.g., the Open Science Badges of the Center for Open Science). These recognition schemes can help build trust in researchers’ work and enhance their credibility as researchers (Schneider et al., 2022). By earning badges, researchers demonstrate their commitment to OS and gain visibility in their community for it. Incorporating digital badges into an author’s record as a contribution to overall metrics remains to be explored and implemented in scholarly research infrastructures. More practical information is provided in Table 5.

Finally, credit and recognition can also be obtained for research outputs with a commercial dimension, through patents obtained on the basis of the results. Obtaining a patent means that researchers or their employer legally own the intellectual property rights. Researchers should be aware that patenting and OS practices are compatible (EC Innovation Council and SMEs Executive Agency, 2023): open sharing of findings can take place as soon as a patent application is filed, or even before filing in certain jurisdictions, such as the US and South Korea, which provide ‘grace periods’ (Nuechterlein et al., 2023). In such cases, applicants should be advised to encourage the ‘free non-commercial use by [other] researchers of knowledge disclosed in patents’. Given that large, detailed and consistent datasets are an asset not only for researchers but also for companies, monetary reward opportunities can arise that provide incentives for data sharing (ALLEA, 2022).

Examples of national and institutional OS plans, OS and FAIR awards, and dedicated funds for OS are given in Grattarola et al. (2023a, b, c, d, e).

Concluding Remarks

Opening science today necessitates integrating transformative changes in research culture, workflows, governance structures and assessment mechanisms, and involves extending these changes across all scientific communities. Achieving this goal is not feasible through the efforts of an individual researcher without support from other stakeholders in the research ecosystem and global coordination of their collective actions. These stakeholders include research performing and funding organisations, publishers, and government policymakers.

Given the historical organisation of science, the transition to OS can be challenging, burdensome, and costly for researchers who generate scientific outputs. Identifying mechanisms to facilitate and reward those at the forefront of this transition is essential for accelerating the entire process. This study has practical implications, providing actionable recommendations that embrace a holistic approach to guide the development and implementation of rewarding schemes at various levels – where they exist, or to assist in their creation where they are needed.

Finally, it is important to note that incentivising OS practices, such as data sharing, might lead some researchers to engage in strategic sharing to accumulate rewards, effectively ‘gaming’ the system rather than focusing on the production of new, high-quality knowledge. Therefore, to prevent a similar ‘publish or perish’ dynamic within OS practices – where rewards may drive efforts focused more on quantity than on substantive contributions – it is crucial that any OS reward and incentive schemes incorporate stringent eligibility criteria for rewards, based on rigorous quality assessments of outputs and governed by principles of research integrity and responsible conduct (such as the World Conference on Research Integrity Hong Kong principles).

Data Accessibility Statements

The present recommendations are openly available on GitHub [https://bienflorencia.github.io/rda-sharc-reco/], DOI: https://doi.org/10.15497/RDA/000117.

Abbreviations

AGA: Annotated Grant Agreement

AGU: American Geophysical Union

ALLEA: All European Academies

ANR: Agence Nationale de Recherche (French National Research Agency)

BoF: Birds of a Feather

CARE: Collective benefit, Authority to control, Responsibility, Ethics

ChatGPT: Chat Generative Pre-trained Transformer

CLACSO: Consejo Latinoamericano de Ciencias Sociales

CoARA: Coalition for Advancing Research Assessment

CODATA: Committee On Data

CRediT: Contributor Roles Taxonomy

CUDOS: Communality, Universalism, Disinterestedness and Organised Scepticism

DDOR: Direction des données ouvertes de la recherche (DDOR) du CNRS

DMP: Data Management Plan

DEIA: Diversity, Equity, Inclusion, and Accessibility

DOI: Digital Object Identifier

DORA: Declaration On Research Assessment

FAIR: Findable, Accessible, Interoperable, Reusable

GERD: Gross domestic Expenditure on R&D

LMIC: Low- or Middle-Income Country

NIH: National Institutes of Health

OA: Open Access

ORCID: Open Researcher and Contributor ID

OS: Open Science

OS-CAM: Open Science Career Assessment Matrix

OSTP: Office of Science and Technology Policy

PID: Persistent IDentifier

RAiD: Research Activity Identifier

RDA: Research Data Alliance

RRA: Responsible Research Assessment

SciELO: Scientific Electronic Library Online

SCOSS: Sustainability Coalition for Open Science Services

SHARC: SHAring Rewards and Credit

TOP: Transparency Openness Promotion

UKRN: UK Reproducibility Network

UNESCO: United Nations Educational, Scientific and Cultural Organisation

Acknowledgements

The authors wish to thank the contributors to initial phases of the present work:

Romain David, ERINHA, FR; Alison Specht, University of Queensland, AU; Gabrielle Bertier, University Toulouse III, FR; Mohammed Yahia, INIST, FR; Louise Bezuidenhout, Data Archiving and Networked Services (DANS), Royal Netherlands Academy of Arts and Sciences, NL; Michele de Rosa, 2.0 LCA consultants; Laurent Dollé, Université Libre de Bruxelles, BE; Sofie Beckaert, University Gent, BE, and Elena Bravo, ISS, IT.

The authors also wish to thank Anne-Sophie Archambeau, Jane Carpenter, Anna Cohen Nabeiro, Aurélie Delavaud, Fiona Murphy, Sophie Parmelon, Anne-Marie Tassé and Martina Zilioli for their inputs.

Finally, the authors wish to acknowledge the fruitful comments from all other members of the RDA-SHARC interest group and all the attendees to the various SHARC working sessions held at RDA plenaries.

Competing Interests

The authors have no competing interests to declare.

Author Contributions

All authors contributed substantially to the conception, design and interpretation of the work and were involved in critically revising the successive drafts. FG, HS and LM were further involved in the acquisition and analysis of data for the survey; LM provided the first draft of the recommendations and coordinated all the tasks needed to complete the work.

Submitted on: Nov 4, 2024 | Accepted on: Apr 22, 2025 | Published on: May 6, 2025

© 2025 Laurence Mabile, Hanna Shmagun, Christopher Erdmann, Anne Cambon-Thomsen, Mogens Thomsen, Florencia Grattarola, published by Ubiquity Press
This work is licensed under the Creative Commons Attribution 4.0 License.