Summary

The Publish-Review-Curate (PRC) model adopted by MetaROR (MetaResearch Open Review) removes the binary “accept/reject” gate from the assessment of scholarly contributions. Articles submitted to the platform are evaluated through peer reviews and an editorial assessment, all of which are openly accessible, so the value of each contribution is conveyed qualitatively rather than through a single verdict.
While this transparency accelerates knowledge flow, funders still struggle to recognise PRC outputs when allocating scarce resources. This mixed‑methods project will:
- Map funders’ readiness to use reviewed preprints
- Uncover barriers faced by reviewers and panel members
- Lay the foundations for co-designing a prototype and piloting machine‑ and human‑readable artefacts (“PRC Evaluation Summaries”) that allow evaluators to grasp at a glance the strengths, weaknesses and societal relevance of research hosted on MetaROR
Project team
- Stephen Pinfield, Senior Research Fellow, RoRI and University of Sheffield
- Ludo Waltman, Senior Research Fellow, RoRI and CWTS-Leiden
- André Brasil, Research Fellow, RoRI and CWTS-Leiden
Partner organisations
- Austrian Science Fund (FWF)
- Dutch Research Council (NWO)
- Swiss National Science Foundation (SNSF)
- Social Sciences and Humanities Research Council of Canada (SSHRC)
The project in detail
Background and rationale
- Recognition & rewards in flux: Global initiatives (DORA, the Leiden Manifesto, CoARA) are shifting assessment away from metrics such as the Journal Impact Factor and the h-index, promoting recognition of a broader set of research outputs. Yet many funders, particularly those in the Global South, resist implementing these new approaches. Moreover, even when funders do take innovative action, some reviewers hold on to traditional methods and revert to legacy metrics when decisions get tight.
- MetaROR’s promise: By publishing transparent reviews and assessment reports for each contribution, MetaROR surfaces nuanced judgements of rigour, novelty and usefulness that could replace binary proxies.
- Gap: Funders lack simple, trustworthy ways to interpret narrative information at scale. A structured “PRC Evaluation Summary” layer, co‑designed with funders, could bridge this gap (see the sketch after this list).
- Main actors and audience: The project focuses on two distinct groups: funders and evaluators, the latter represented by the leadership of evaluation panels at each funder, since this group helps shape evaluation strategies. Recommendations from the project will also be directed to these two groups.
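To make the idea of a machine‑ and human‑readable “PRC Evaluation Summary” more concrete, below is a minimal sketch, in Python, of the kind of fields such an artefact might carry. The field names, example DOI and URL are illustrative assumptions only; the actual metadata schema will be co‑designed with funders and the MetaROR team during the design sprints (work package 6).

```python
import json
from dataclasses import dataclass, field, asdict
from typing import List

# Hypothetical sketch only: field names and example values are illustrative
# assumptions, not the project's agreed schema.
@dataclass
class PRCEvaluationSummary:
    article_doi: str                   # persistent identifier of the reviewed preprint
    platform: str                      # hosting platform, e.g. "MetaROR"
    review_count: int                  # number of openly accessible peer reviews
    editorial_assessment_url: str      # link to the open editorial assessment
    strengths: List[str] = field(default_factory=list)   # narrative highlights
    weaknesses: List[str] = field(default_factory=list)
    societal_relevance: str = ""       # short narrative statement

# Example instance, serialised to JSON so the same record is machine-readable
# yet still easy for a panel member to skim.
example = PRCEvaluationSummary(
    article_doi="10.1234/illustrative-doi",                      # placeholder, not a real DOI
    platform="MetaROR",
    review_count=2,
    editorial_assessment_url="https://example.org/assessment",   # placeholder URL
    strengths=["transparent methods", "novel dataset"],
    weaknesses=["limited generalisability"],
    societal_relevance="Evidence relevant to open peer review policy.",
)
print(json.dumps(asdict(example), indent=2))
```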
Objectives and research questions
| Objective | Research questions |
| --- | --- |
| Map perceptions | How do funders currently view reviewed preprints and PRC artefacts? Which incentives or constraints shape adoption? |
| Diagnose resistance | Why do some expert reviewers (and some funders) continue to consult traditional bibliometrics despite new policies? |
| Co‑design solutions | Which metadata fields, visual cues and narrative elements help evaluators make confident, fair judgements under time pressure? |
| Pilot & evaluate | Does the prototype “PRC Evaluation Summary” improve funder decision‑making efficiency, transparency and satisfaction? |
Work packages
| Work package | Activities | Deliverables |
| --- | --- | --- |
| 1. Landscape exploration (funders) | Discussion with the MetaROR steering group | Survey design informed by findings |
| 2. Funder survey | Online survey of organisations in the RoRI network | Report on funder perceptions and readiness |
| 3. In‑depth interviews | Semi‑structured interviews with programme officers selected from the survey | Thematic analysis identifying pain points and emerging good practice |
| 4. Reviewer survey (conditional) | If respondents cite reviewer resistance, deploy a survey to reviewers (via funders) | Dataset on reviewer attitudes |
| 5. Synthesis & dissemination | Analyse results from the survey(s) and interviews, relating these to the relevant literature | Policy brief; working paper (preprint) |
| 6. Design sprints with the MetaROR team | Co‑create metadata schema, UX wireframes and dashboard outputs | Alpha prototypes; documentation |
| 7. Pilot implementation | Integrate prototypes into MetaROR | An evaluation component integrated into MetaROR |
Project timeline
The MetaROR platform launched in November 2024, and this project, which builds on the platform, runs until April 2026.
| Work package | Months |
| --- | --- |
| 1. Landscape exploration (funders) | April – June 2025 |
| 2. Funder survey | July – August 2025 |
| 3. In‑depth interviews | September – October 2025 |
| 4. Reviewer survey (conditional) | November – December 2025 |
| 5. Synthesis & dissemination | January – April 2026 |
| 6. Design sprints with the MetaROR team | TBD |
| 7. Pilot implementation | TBD |
Outputs
This project is at an early stage, as the MetaROR platform needed to be operational for a period before the research could take place. We anticipate project outputs to include:
- Evidence on funder and reviewer behaviour around PRC outputs
- An interoperable “PRC Evaluation Summary” specification ready for deployment on MetaROR
- Policy guidance enabling funders to cite PRC outputs in calls and panel briefs
- A prototype dashboard visualising PRC signals, such as traffic‑light badges and narrative highlights (linked to the “Pilot & evaluate” objective)
- A contribution to a cultural shift away from journal‑level metrics toward more equitable funding decisions
Additional information
To build MetaROR into a community-driven collaboration that reflects the rich and growing diversity of metaresearch, we hope to further expand its project and editorial team. We invite anyone interested in contributing to MetaROR’s development and implementation to reach out to us.