D1.1 OSPO-RADAR Stakeholder Requirements and Specifications
Project Title | Open Source Program Office Research Assets Dashboard and Archival Resource |
Project Acronym | OSPO-RADAR |
Grant Agreement No. | 2025-25188 |
Start Date of Project | 2025-06-01 |
Duration of Project | 24 months |
Project Website | https://www.softwareheritage.org/2025/04/02/ospo-radar/ |
Work Package | WP1, Project management, specifications, planning, and testing |
Authors | Morane Gruenpeter, Renaud Boyer, Sabrina Granger |
Reviewer(s) | Clare Dillon, Christopher Erdman, Bastien Guerry, Daniel S. Katz, Violaine Louvet, Micha Moskovic ADD COMMUNITY REVIEWERS |
Date | 2025-09-30 |
Version | V1.0 FOR COMMUNITY REVIEW |
Document log
Issue | Date | Comment | Author/Editor/ Reviewer |
v.0.1 | 06-08-2025 | Inception and first structure draft | M. Gruenpeter & R. Boyer |
v.0.2 | 21-08-2025 | Writing sprint and gaps identification | M. Gruenpeter, R. Boyer & S. Granger
v.0.3 | 04-09-2025 | Draft of main sections | M. Gruenpeter, R. Boyer & S. Granger
v.0.4 | 22-09-2025 | Added workflows, use cases & personas | M. Gruenpeter, R. Boyer & S. Granger
1.0 | 26-09-2025 | Version 1.0 shared for community review | M. Gruenpeter, R. Boyer & S.Granger |
Abstract
The OSPO-RADAR project will develop the technical tooling to enable Academic Open-Source Programme Offices (OSPOs) to efficiently archive, manage, and showcase their institutions' software production. This will elevate research software to a first-class research output and enable an evidence-based approach to software development. Built on the integration with Software Heritage, the portal will offer enhanced metadata management, streamlined workflows, and institutional visibility, fostering a sustainable ecosystem for open-source software management.
Terminology
Terminology/Acronym | Definition |
ARDC | Archive, Reference, Describe & Cite. Referencing software means making software artifacts identifiable by attributing SoftWare Hash Identifiers (SWHIDs). |
CFF | Citation File Format |
CRUD | Create, Read, Update, and Delete |
DMP | Data Management Plan: “...is a living summary document that provides assistance with organising and planning all the phases in the lifecycle of data. It explains, for each dataset, how project data will be managed, from creation or collection to sharing and archiving.” (Université Paris Saclay, 2019) Software can be tracked in a software management plan (SMP). |
FAIR | Principles based on community expectations in respect of research outputs - findable, accessible, interoperable, and reusable. |
JATS | Journal Article Tag Suite: XML format used to describe scientific literature published online |
MVP | “A Minimum Viable Product is a version of a product with just enough features to be usable by early customers who can then provide feedback for future product development[1]” |
PID | Persistent identifier: generally expected to be unique, resolvable, and persistent. |
RSMD | Research Software MetaData (guidelines) |
SIRS | Scholarly infrastructures for research software (report) |
SPDX | System Package Data Exchange is an open standard capable of representing systems with digital components as bills of materials. |
SWHID | SoftWare Hash Identifiers are designed to identify permanently and intrinsically all the levels of granularity that correspond to concrete software artifacts: snapshots, releases, commits, directories, files and code fragments. SWHID became ISO/IEC international standard 18670 on April 23, 2025. |
SWORD | Simple Web-service Offering Repository Deposit is an interoperability standard developed by JISC extending ATOM. |
Table of contents
2/ Stakeholder groups and personas
2.1/ What is the Open Source Programme Office Scope in Academia?
2.2/ Scholarly Ecosystem stakeholders
2.3/ Personas, as a collective image of a segment of the target audience
3/ Current landscape and associated challenges
3.1/ The existing infrastructures in the ecosystem
3.2/ Building on existing infrastructures and components
3.3/ The Role of CodeMeta in OSPO-RADAR
4.3/ Objectives (aligned with the SIRS pillars)
4.4/ Existing features OSPOs can use
5/ Minimum Viable Product: features and workflows overviews
5.1/ Account creation, login and management
5.2/ Populate the software source code dashboard
5.3/ View dashboard, curate and filter
5.4/ Public view and search capabilities
6/ Non-functional requirements to address
6.2/ Performance / compatibility
6.4/ Sustainability & maintainability
7/ What’s next? Interoperability & reusability of the Dashboard.
7.1/ Integration with existing tools
7.2/ Functionalities and capabilities to keep in mind after MVP
8/ The road ahead: a sustainable service model
Appendix B: Persona Academic OSPO Manager
Appendix C: A collection of use cases from the RSMD workshop (2023)
Appendix D: Review grid of the RSAC components specifications
The OSPO-RADAR project (Open Source Program Office Research Assets Dashboard and Archival Resource) addresses a growing need in the research ecosystem: the effective management, preservation, and recognition of software as a first-class research output.
At the highest policy level, such as the UNESCO recommendations on Open Science, the DORA declaration, and the French National Plan for Open Science, software is increasingly acknowledged as a key scholarly product. Yet, unlike articles or datasets, the infrastructures supporting the archival, curation, and metadata management of software remain underdeveloped.
The inherent complexity of software, with its dynamic nature, dependency layers, and varied documentation, poses unique challenges for discoverability, preservation, and attribution. For Open Source Program Offices (OSPOs) within academic and research institutions, these gaps translate into operational difficulties. OSPOs play a critical role in managing and promoting open source practices, yet they face recurring obstacles in:
(C. Dillon, 2025)
Effective metadata management is central to overcoming these challenges. Metadata links software to related publications, datasets, and contributors, thus strengthening its academic recognition and impact. Standards such as CodeMeta, widely adopted by the community and endorsed by EOSC, RDA, and Force11, provide a strong foundation. However, they require further refinement to address OSPO-specific workflows, such as capturing license compatibility or documenting diverse forms of software documentation.
To meet these needs, OSPO-RADAR proposes the development of a Dashboard. Built on the Software Heritage archive and grounded in metadata standards, the dashboard will offer OSPOs a unified, interoperable, and scalable platform to:
This dashboard aims to simplify workflows, enhance discoverability, and increase the visibility of research software, thereby empowering OSPOs to better support their institutions and contribute to the sustainability of open science.
The SIRS report (Scholarly Infrastructures for Research Software, EOSC 2020) distills four essentials for a healthy software scholarship ecosystem—archive, reference, describe, cite—and couples them with a cross-cutting call for interoperability and researcher education (notably via publishers). Together, these set the north star for OSPO-RADAR: move beyond principles into day-to-day operations that institutions can actually run.
Why SIRS matters now
Despite broad consensus, practice lags: hand-offs between repositories remain brittle, identifiers aren’t used consistently, metadata is shallow or siloed, and citation guidance is fragmented. Infrastructures have the leverage to normalize practice, provided the tooling makes compliance the path of least resistance. OSPO-RADAR operationalizes SIRS recommendations via ready-to-run flows, opinionated defaults, and actionable signals that the OSPO offices can adopt quickly.
Archive
Reference
Describe
Cite
The scope of D1.1 is to capture the requirements, expectations, and challenges of Open Source Program Offices (OSPOs) in academic and research institutions, in order to shape the functional design of OSPO-RADAR. The methodology combined two complementary approaches: (1) gathering structured feedback through a survey distributed across the OSPO and research software community, and (2) validating and enriching these findings through a community review process, during September and October 2025.
The OSPO-RADAR survey collected responses from 16 organizations, with a majority representing universities (68.8%), followed by research institutes (25%). The maturity of OSPOs varied: most respondents indicated that their initiatives are just getting started (53.8%), while others reported defined roles and goals (23.1%) or defined policies with limited tooling (15.4%). Only one organization reported advanced workflows with automation.
Key findings included:
These results provide a clear picture of the diverse contexts and priorities of OSPOs in academia, confirming both the demand for dedicated tooling and the importance of interoperability with existing platforms.
In addition to the survey, a community review process was implemented to validate and refine the preliminary findings. This process combined synchronous and asynchronous exchanges to maximize participation and depth of feedback:
This combined approach confirmed the survey’s main conclusions, while emphasizing the importance of flexibility to accommodate institutions at different maturity levels. It also reinforced the value of transparency and continued engagement, ensuring that OSPO-RADAR remains shaped by its community.
Within the academic landscape, Open Source Programme Offices (OSPOs) play a strategic role in bridging institutional research practices with the broader open-source ecosystem. While corporate OSPOs primarily address compliance, risk management, and efficiency, academic OSPOs are distinguished by their focus on advancing Open Science and supporting research software as a first-class scholarly output.
The scope of an academic OSPO typically includes:
In this way, OSPOs in academia contribute not only to the efficient management of research software assets but also to the cultural shift required for software to be fully recognized as a critical component of scholarly communication.
The Software Source Code Identification Working Group (SCID WG), a joint effort under the Research Data Alliance (RDA) and FORCE11, produced a landmark output on use cases and identifier schemes for persistent software source code identification. Published in 2020, the report identifies a broad range of stakeholders that are listed below.
Infrastructures and tooling:
Organization types and actors:
Research Data Alliance/FORCE11 Software Source Code Identification WG et al. (2020). Software Source Code Identification: Use cases and identifier schemes for persistent software source code identification (1.1). Zenodo. https://doi.org/10.15497/RDA00053 |
Persona’s name | Job title | Main needs | Pain points |
Sofia | Academic OSPO Manager |
|
|
Christine | Academic library director |
|
|
Leo | Infrastructure technical manager |
|
|
Jitin | Researcher |
|
|
Mostafa | Lead of a research unit |
|
|
A first observation from both the OSPO-RADAR survey and the SIRS Gap Analysis Report (Azzouz-Thuderoz et al., 2023) is that current infrastructures have both strengths and limitations. For instance, a study (Carlin et al., 2023) highlighted the fact that software isn’t included as a type of research output within many repositories in the UK.
(Carlin et al., 2023) A concerning lack of software stored in institutional repositories in the UK
While there are multiple services supporting the archiving, referencing, and publishing of research software, OSPOs still lack an integrated view and proper tooling to manage their software assets.
“No OSPO is an Island” (Gilles Mathieu)
Stakeholder feedback illustrates these gaps:
Infrastructure type | Purpose | Potential challenges for the end-users |
Scholarly repositories, Institutional repositories | Store, manage, and provide access to datasets, enabling their sharing, preservation, and reuse. | Software is not always included as a type of research output within a repository: deposit is not possible.
Archival coverage issue: “the vast majority of repositories do not feed the universal archive [SWH] yet” (Azzouz-Thuderoz et al., 2023).
Difficult to identify and track the evolution of software pieces.
Descriptions may lack information relevant for an OSPO, as the extraction of intrinsic metadata, which is found in the source code itself, is not supported.
Intrinsic identifiers are not supported by most of the infrastructures. |
Aggregators | Integrate, harmonize, and offer access to information originating from different sources, which should otherwise be accessed independently. Enrich the aggregated content with information that was not available at the sources. | |
Forges | Facilitate collaborative work on a software project. A forge contains tools like a versioned source code repository, discussion forums, an automated testing environment and so on. (French Ministry of Higher Education and Research.) | Coverage: even if an institutional forge is implemented, end-users may collaborate in external instances. Archival: forges are collaborative tools, but are not designed for long-term access. |
Publishing platforms | Infrastructures associated with journals or publishing companies that are used to deposit scholarly publications | “Only a few publishers have already explicitly integrated this recommendation [the software associated with the publication should be equipped with proper metadata].” Examples of successful implementations: Dagstuhl, IPOL, eLife, JTCAM, and the journals hosted on the Episcience platform |
The added value of OSPO-RADAR is to transform a fragmented ecosystem into an actionable, interoperable, and institutionally relevant view of research software, equipping OSPOs with the tools they currently lack to manage, sustain, and showcase their software assets.
“The issue [...] is in the interoperability of disparate platforms, scientific disciplines and descriptors for research artefacts, i.e., individual repositories cannot communicate in a common language to describe their data to each other.”
(Carlin et al., 2023)
The OSPO-RADAR dashboard relies on machine-actionable metadata to archive, track, and valorize institutional software assets. CodeMeta provides a JSON-LD schema that captures key descriptive, administrative, and provenance metadata, while mapping to external standards such as schema.org, Dublin Core, and DataCite. By adopting CodeMeta as the backbone metadata model, OSPO-RADAR ensures:
The following CodeMeta properties are central to OSPO-RADAR use cases:
Property | Description | OSPO Use Case Example | Related PID / Standard |
name | Human-readable title of the software | Institutional inventory of software assets | schema.org name |
description | A description of the item. | A research output abstract. | schema.org description |
author / contributor | People or organizations responsible for the software | Linking to ORCID and ROR for attribution. Adding author role. | ORCID, ROR |
version | Software version or release | Tracking evolution of institutional outputs | Semantic Versioning |
license | Applicable license(s) | Monitoring license compatibility and compliance | SPDX License IDs |
programmingLanguage | Primary implementation languages | Reporting language trends in institutional software | schema.org mapping |
relatedPublication | Publications describing or citing the software | Linking code to scholarly infrastructure (e.g. HAL) bibliographic records | DOI, HAL ID |
funding | Funding sources that supported development | Tracking grants and sponsor contributions | Grant DOIs, funder ROR IDs |
readme / documentation (proposed) | Documentation sources (README, build instructions, manuals) | Ensuring reproducibility and usability in reporting | To be added via CodeMeta PR |
Initial stakeholder consultations confirm that CodeMeta should be extended for OSPO workflows. The following gaps are prioritized:
These requirements will be discussed with the CodeMeta governance community (SciCodes, EOSC, RDA, Force11) to ensure global alignment.
Within OSPO-RADAR, CodeMeta will serve four core functions:
By adopting and extending CodeMeta, OSPO-RADAR ensures that the dashboard is technically robust and aligned with international standards, supporting OSPOs in managing software as a recognized scholarly output.
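To make the model above concrete, the sketch below builds a minimal codemeta.json record covering the central properties from the table. Every value (project name, person, ORCID, ROR, DOI, grant number) is a hypothetical placeholder, and the snippet uses the CodeMeta 2.0 JSON-LD context.

```python
import json

# Minimal, illustrative codemeta.json record. All identifiers and names
# below are placeholders, not real records.
record = {
    "@context": "https://doi.org/10.5063/schema/codemeta-2.0",
    "@type": "SoftwareSourceCode",
    "name": "example-analysis-toolkit",
    "description": "A hypothetical toolkit used to illustrate the schema.",
    "author": [
        {
            "@type": "Person",
            "givenName": "Ada",
            "familyName": "Example",
            "@id": "https://orcid.org/0000-0000-0000-0000",  # placeholder ORCID
            "affiliation": {
                "@type": "Organization",
                "@id": "https://ror.org/000000000",  # placeholder ROR
            },
        }
    ],
    "version": "1.2.0",
    "license": "https://spdx.org/licenses/MIT",
    "programmingLanguage": "Python",
    "referencePublication": "https://doi.org/10.0000/example",  # placeholder DOI
    "funding": "Grant 2025-25188",
}

print(json.dumps(record, indent=2))
```

Note that CodeMeta 2.0 expresses the link to an associated paper with referencePublication; the relatedPublication label used in the table above may map onto that term or be proposed as an extension.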
The OSPO-RADAR project aims to deliver a Source Code Assets Dashboard (OR Dashboard) that enables academic and research OSPOs to archive, manage, and valorize their software outputs. Built on top of the Software Heritage archive and the CodeMeta standard, it provides tools for metadata curation, reporting, and institutional visibility.
This section lists the Software Heritage capabilities OSPOs can use today—without waiting for an OSPO-RADAR dashboard. Through web tools, APIs, and a push-based deposit service (SWORD v2), OSPOs can trigger on-demand archiving (Save Code Now), push tarballs or metadata-only records, run bulk save jobs, obtain stable SWHIDs for referencing, and generate publisher-ready citations from intrinsic metadata (CodeMeta/CITATION.cff).
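As a sketch of how the on-demand archiving trigger looks in practice, the snippet below builds a Save Code Now request against the public Software Heritage API. The endpoint shape follows the documented /api/1/origin/save/ route; the bearer token and repository URL are placeholders, and error handling is omitted for brevity.

```python
import json
import urllib.request

API = "https://archive.softwareheritage.org/api/1/origin/save"

def save_code_now_url(visit_type: str, origin_url: str) -> str:
    """Build the Save Code Now endpoint for an origin (e.g. a git repository)."""
    return f"{API}/{visit_type}/url/{origin_url}/"

def request_save(origin_url: str, token: str, visit_type: str = "git") -> dict:
    """POST a Save Code Now request; `token` is an SWH API bearer token.

    This performs a network call and needs a valid token, so it is only
    defined here, not executed.
    """
    req = urllib.request.Request(
        save_code_now_url(visit_type, origin_url),
        method="POST",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Show the URL that would be requested for a (placeholder) repository.
print(save_code_now_url("git", "https://github.com/example/project"))
```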
This section captures the basic operations that the OSPO-RADAR Dashboard needs to support. They describe how a user interacts with the system, how the Dashboard connects with the Software Heritage archive, and what minimal features are required for a Minimum Viable Product (MVP). The idea is to keep things lightweight and usable, while ensuring that provenance and integration are handled correctly from the start.
The MVP delivers the minimal end-to-end path, including:
Following the community review of the proposed MVP, a clear boundary will be set for the OSPO-RADAR project (2025-2027) and all other use cases, features and tools will be deferred to the backlog as issues.
#W1 Account creation: General description of the feature’s workflow How new users request access, with validation by an administrator and retrieval of an API token from Software Heritage. This sets up the user profile and organization link. Main actor: OSPO manager / dashboard admin | |
Needs or pain-points to be tackled | OSPO-RADAR capabilities |
Adding a new OSPO to the dashboard is a process that requires manual validation to confirm the participants' identity and train them on the tool. |
|
The OSPO must then be autonomous in managing user accounts linked to its organization. |
|
We should be able to differentiate between two different user levels in an OSPO: a contributor level to manage only the collection & metadata, and a manager level who can also make changes to the organization's settings (description, users, etc.). |
|
It is important to minimize the amount of personal data collected during account creation. |
|
Whenever possible, existing organization/user accounts from Software Heritage's centralized authentication should be reused. |
|
User story As a prospective OSPO collaborator, I want to request an account on the OSPO-RADAR Dashboard, so that I can access my institution’s collection and contribute entries. | |
Associated use case A prospective user opens the OSPO-RADAR Dashboard and submits an account request. The Dashboard records the request and immediately notifies the designated Admin for that organization. A validation phase follows:
End state: Either the request is closed as rejected (user informed), or the user is active, linked to the correct organization, and the SWH API token is stored for later authenticated operations. | |
Role | Description (+Personas) | Create collection & Bulk addition | Update & Validate entries in collection | View collection |
Admin - manager | Full control: can create sub-collections, bulk add, update/validate entries, and view. | Yes | Yes | Yes |
Curator - contributor | Can enrich and maintain collections: update/validate entries and view. | No | Yes | Yes
Researcher | Read-only access to consult collections. | No | No | Yes |
Large public | General audience with view-only access. | No | No | Yes |
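The role matrix above can be read as a simple permission lookup. A minimal sketch follows; the role and action names are illustrative, not the final API.

```python
# Permission matrix from the table above, as a plain lookup.
# Role and action names are illustrative placeholders.
PERMISSIONS = {
    "admin":      {"create_collection", "bulk_add", "update_validate", "view"},
    "curator":    {"update_validate", "view"},
    "researcher": {"view"},
    "public":     {"view"},
}

def can(role: str, action: str) -> bool:
    """Return True if the given role is allowed to perform the action."""
    return action in PERMISSIONS.get(role, set())

print(can("admin", "bulk_add"))    # admins may bulk add
print(can("curator", "bulk_add"))  # curators may not
```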
#W2 General description of the feature’s workflow Standard authentication with error handling and password reset. Nothing fancy, but secure and consistent with institutional practices. Ensure that different institution profiles can be accessed by role-based levels of access. Main actor: User (contributor) / User (admin) | |
Needs or pain-points to be tackled | OSPO-RADAR capabilities |
The authentication mechanism must be secure: requiring complex passwords and recommending two-factor authentication. |
|
It should be easy to change a password without needing to contact a manager or Software Heritage. |
|
User story
| |
Associated use case
| |
Name / Anonymous | Comment | +Upvotes |
AAI authentication (eduGAIN) | ||
Feedback summary and response: | ||
#W3 General description of the feature’s workflow Allows an OSPO manager to submit a list of software projects/URLs to populate the collection. The workflow calls “Bulk Save Request[6]” and annotates the origin with the institution information. Main actor: Admin / OSPO Manager | |
Needs or pain-points to be tackled | OSPO-RADAR capabilities |
The initial import process needs to be simple and not rely on a specific tool. A simple list of software URLs (link to the project on a forge, a package manager, etc.) should work. |
|
A detailed report should be available after the import to know what has been added to the collection, archived, etc. |
|
User story As an OSPO, I want to import a list of software to my collection and view a status report of this import, so that I can quickly populate the collection from existing sources, ensure each origin is archived and assigned an SWHID, and act on a clear per-item outcome without manual re-entry. | |
Associated use case
| |
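A minimal sketch of the per-item status report this workflow calls for: each submitted URL gets an outcome that can be surfaced to the OSPO manager. The `submit` callable stands in for the actual archiving request; the real Bulk Save Request API is not modeled here.

```python
# Sketch of a bulk-import status report mirroring workflow #W3.
# `submit` is any callable that submits one origin and returns a status
# string; a stub is used below in place of the real archiving call.
def build_import_report(urls, submit):
    report = []
    for url in urls:
        try:
            status = submit(url)
        except Exception as exc:  # record failures instead of aborting the batch
            status = f"failed: {exc}"
        report.append({"origin": url, "status": status})
    return report

# Placeholder origins and a stub submitter that accepts everything.
urls = ["https://github.com/example/a", "https://gitlab.com/example/b"]
report = build_import_report(urls, lambda url: "accepted")
for row in report:
    print(row["origin"], "->", row["status"])
```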
#W4 General description of the feature’s workflow For one-off additions. A researcher provides a URL, a curator is notified and validates it. If the project isn’t there yet, a Save Code Now request is triggered before adding it to the collection. Have software contributions automatically included in institutionally validated exports that can be reused in CVs, annual activity reports, and grant applications. Main actor: Researcher / Scientist / Research manager | |
Needs or pain-points to be tackled | OSPO-RADAR capabilities |
Avoid friction due to account creation, login / password etc. and at the same time the form to signal software must not be easily discoverable outside the institution. |
|
OSPO staff need simple curation tools/workflows to triage submissions before anything appears on the public “front-shop” view. |
|
If a project isn’t archived, a simple mechanism in the workflow should make it easy to do. |
|
Researchers mention software in articles deposited in scholarly repositories (e.g. HAL). To ease the addition of these inputs to an institutional collection, a notification mechanism to the OSPO could facilitate the input. | Optional: COAR Notify from other infrastructures to include software origins in a collection (relates to the workflow of identifying software in articles) |
User story As a researcher, I want my software contributions to appear in the collection. | |
Associated use case
| |
Metadata fields covered:
| |
#W5 General description of the feature’s workflow For one-off additions. A curator searches for a software project by its name or URL in Software Heritage’s database. If the project isn’t archived, the Curator triggers Save Code Now and then adds it to the collection. Main actor: Curator | |
Needs or pain-points to be tackled | OSPO-RADAR capabilities |
Provide access to Software Heritage’s search engine. |
|
Provide a simple interface to query the search engine and identify matching projects, to enable one-off additions. |
|
User story As a Curator, I want the piece of software I’ve identified to appear in the collection after searching by name or URL, so that I can quickly curate the catalog without manual re-entry and ensure an SWHID exists. | |
Associated use case
| |
Workflow overview: #W5 search and add software
Name / Anonymous | Comment | +upvoters |
Feedback summary and response: | ||
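The SWHIDs that these workflows guarantee are intrinsic identifiers: they can be recomputed from the artifact itself. As a sketch, the content-level identifier reuses Git's blob hashing, following the SWHID scheme described in the Terminology section.

```python
import hashlib

def content_swhid(data: bytes) -> str:
    """Compute a content-level SWHID (swh:1:cnt:...).

    Content identifiers are computed like Git blob hashes:
    sha1 over b"blob <length>\\0" followed by the raw bytes.
    """
    digest = hashlib.sha1(b"blob %d\x00" % len(data) + data).hexdigest()
    return "swh:1:cnt:" + digest

# The empty file yields the well-known Git empty-blob hash.
print(content_swhid(b""))  # swh:1:cnt:e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
```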
#W6 General description of the feature’s workflow Once the collection is populated, users (OSPO Manager / Curator) need a simple list view of software projects associated with their institution, with basic filters (dates, research teams, licenses, programming languages). The goal is clarity and fast retrieval—not advanced analytics—while laying the groundwork for later reporting. Main actor: OSPO Manager / Contributor (Curator) | |
Needs or pain-points to be tackled | OSPO-RADAR capabilities |
No reliable baseline inventory of software produced vs. used across a wide, multi-unit community; data scattered in many tools; hard to start a landscape analysis.
“Since we are composed of many of the universities in XXX we are an interesting catalyst and example (for the XXX Research Council). Some initial work has been done with XXX and we expect some analysis in July but also via the service we procure in the Fall.”
“While we are just starting out, XXX, and our organisation (XXX), have particular expertise when it comes to research software but we are starting with a landscape analysis of software produced and used in our community which is [country]-wide in the Life Sciences. The challenge is having a baseline to start so that we can build on with guidance and overall monitoring.”
|
#W7 General description of the feature’s workflow Institutions need more than a static list of software projects. They need dynamic, up-to-date views that support monitoring, decision-making, and reporting. Filtering allows OSPO managers to quickly identify trends, gaps, or compliance issues without manual data collection. Main actor: OSPO Manager | |
Needs or pain-points to be tackled | OSPO-RADAR capabilities |
Need to scope the landscape by organizational unit/team. Licensing posture unclear across the portfolio. Use filters (e.g., by department, programming language, license type, activity date, contributor) to explore the institutional software portal and generate real-time insights. |
|
Metadata fields covered
| |
User story As an OSPO manager, I can apply filters in the collection view of the OSPO-RADAR Dashboard to instantly see which departments have the most active projects, which licenses are most commonly used, or which projects have not been updated recently, so that I can provide accurate reports to leadership and funders, and identify areas needing support. | |
Associated use case
| |
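A sketch of the kind of in-memory filtering the collection view needs; the field names and records below are illustrative placeholders, not the dashboard's actual data model.

```python
# Illustrative records for the collection view; fields mirror the filters
# named above (department, license, activity date).
projects = [
    {"name": "toolkit-a", "department": "Physics",
     "license": "MIT", "last_update": "2025-01-10"},
    {"name": "pipeline-b", "department": "Biology",
     "license": "GPL-3.0-only", "last_update": "2023-06-01"},
]

def filter_projects(items, **criteria):
    """Keep only projects matching every given field=value criterion."""
    return [p for p in items if all(p.get(k) == v for k, v in criteria.items())]

for p in filter_projects(projects, license="MIT"):
    print(p["name"])
```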
#W8 General description of the feature’s workflow Curators open a software record in the internal dashboard, edit/enrich key metadata, validate fields (licenses, teams, ORCID links, funding), and save. Admin/Managers may review and publish changes to the public “front-shop” view. The goal is to improve data quality with minimal friction while keeping provenance clear (what came from the source vs what the institution added). Main actor: Curator / Contributor Secondary actor: Admin / Manager (review & publish) | |
Needs or pain-points to be tackled | OSPO-RADAR capabilities |
Missing or inconsistent metadata across records. Difficulty normalizing licenses, languages, teams, and identifiers. |
|
Capturing funding and related outputs (papers, datasets) minimally. |
|
Linking people and affiliations reliably. |
|
Distinguishing what comes from the code repo vs institutional curation. | Provenance badges (Intrinsic from repo / Extrinsic curated); field-level source indicators. |
Hard to tell how the institution relates to a piece of software (created vs contributed vs used vs dependent). | Add four explicit checkboxes on submit/curate forms: Created, Contributed to, Used, Depends on.
|
Quickly spotting incomplete records. | Bonus: Data-quality hints (e.g., “Missing license”, “No ORCID”, “Unknown language”). |
User Story As a curator, I want to flag external projects our teams Contributed to so we can credit staff contributions and surface them in evaluations. | |
Associated use case
| |
(Guerry, 2025)
#W9 General description of the feature’s workflow Generate a CSV file containing an institutional inventory of software components, their source repository URLs, and their corresponding SoftWare Hash Identifiers (SWHIDs). Main actor: OSPO Manager / Research Support Staff | |
Needs or pain-points to be tackled | OSPO-RADAR capabilities |
Institutions need certainty that every software component in their inventory is safely archived and has a clear, reproducible SWHID. Without this, exported lists may include unarchived or untraceable code, undermining trust in reporting and compliance. | Metadata fields covered:
|
User story As an OSPO manager, I can export a CSV with three columns: Component/Project Name, URL (origin), SWHID; so that my institution has an authoritative list of its archived software assets, which can be used for internal tracking or shared with funders. | |
Associated use case
| |
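The three-column export itself is straightforward; below is a sketch with a hypothetical record (in the dashboard, records would come from the curated, archived collection, and the SWHID would be real rather than the zero placeholder used here).

```python
import csv
import io

# Placeholder record for the export; the SWHID is an all-zero dummy value.
records = [
    {"name": "example-toolkit",
     "origin": "https://github.com/example/toolkit",
     "swhid": "swh:1:snp:" + "0" * 40},
]

# Write the three-column CSV described in workflow #W9.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["name", "origin", "swhid"])
writer.writeheader()
writer.writerows(records)

print(buf.getvalue().strip())
```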
Name / Anonymous | Comment | +upvoters |
Feedback summary and response: | ||
#W10 General description of the feature’s workflow Beyond the internal dashboard, institutions often need a public-facing view of their software outputs. The idea is to expose a “front-shop” that displays selected projects from the dashboard, with metadata from intrinsic sources or from institutional annotations. This is a read-only public portal that surfaces a curated subset of records. It includes:
The public site is fed directly from the dashboard (source of truth). Institutions control visibility through publish/unpublish, feature/unfeature toggles. The list of published entries can be displayed on any institutional website, using the dashboard API endpoint. Main actor: All (public visitors, researchers, funders, OSPO staff) | |
Needs or pain-points to be tackled | OSPO-RADAR capabilities |
Institutions want to showcase software outputs publicly without duplicating data. | The front-shop portal is automatically fed from the curated internal collection; OSPO staff select which records are public. No re-entry required. |
Need to control publication while keeping workflows lightweight. | Publish/unpublish and feature/unfeature toggles in the dashboard, with instant sync to the public site. |
Make it easier for machines to navigate the pages by implementing Signposting patterns | Public pages include Signposting HTTP link headers and structured links (to SWHID, DOI, related publications), enabling automated harvesting and interoperability. |
Institutions need to know the impact and reach of their portfolio. | Built-in basic analytics: counts of page views, downloads, and outbound clicks, aggregated per project and per institution. |
User story
| |
Associated use case
| |
Name / Anonymous | Comment | +upvoters |
Feedback summary and response: | ||
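A sketch of the Signposting Link header a public project page could emit: the relation types (cite-as, describedby, author) follow the Signposting conventions, while all URLs and identifiers below are placeholders.

```python
# Illustrative Signposting links for a public software page: a stable SWHID
# to cite, a metadata record describing the software, and an author ORCID.
# Every URL is a placeholder.
links = [
    ("https://archive.softwareheritage.org/swh:1:snp:" + "0" * 40, "cite-as"),
    ("https://doi.org/10.0000/example", "describedby"),
    ("https://orcid.org/0000-0000-0000-0000", "author"),
]

def link_header(entries):
    """Serialize (url, rel) pairs into an HTTP Link header value."""
    return ", ".join(f'<{url}>; rel="{rel}"' for url, rel in entries)

print("Link:", link_header(links))
```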
To allow the greatest number of people to access the site's content, we aim for AA conformance with the Web Content Accessibility Guidelines (WCAG).
We will depend on external APIs (search, deposit, etc.) so we’ll need to find ways of displaying loading states.
The public website must:
The dashboard itself will require JavaScript and at least a tablet-sized screen to work properly.
The OSPO-RADAR Dashboard will comply with EU and French law and transparent policies for cookies, privacy, and terms:
The OSPO-RADAR Dashboard will be developed as an open source project, under a permissive license, and can be self-hosted, forked, and extended without lock-in. We will:
OSPO-RADAR’s users and partners emphasize different starting points and levels of ambition. Taken together, these inputs point to a modular product that is easy to plug in, easy to ignore if out-of-scope, and easy to reuse elsewhere.
Key signals from partners
Design implication: Favor simple UX and a minimal, well-chosen feature set over breadth. Make integration the default path to value.
Evaluate integrations
Interaction pattern
From checklists to guided flows
Outcomes
Once the dashboard is in place, a further opportunity emerges when it is combined with regular institutional reporting. These reports, drawing on the dashboard's data, the Software Heritage archive, and large-scale analysis of the archive, could transform raw information into actionable insights. They would enable institutions to:
As Roberto Di Cosmo has often emphasized, university OSPOs are the local execution engines of open-source and open-science policy[ae]: they sit at the interface between researchers and institutional functions (tech transfer, legal, research office, libraries/open science), track national and local policy evolution, and build a coherent, institution-wide view of software production. In practical terms, they coach and equip research teams across the full lifecycle: Archive & Reference (deposit in Software Heritage and, where relevant, HAL; assign stable SWHIDs), Describe & Cite (maintain codemeta.json; generate publisher-ready software citations linked to research outputs and evaluation), Compliance & License (default-open decision flow, compatibility checks, and legal support), and Development & Dissemination (good forge hygiene, CI/testing, packaging, onboarding, and community practices) (Di Cosmo at UGA).
OSPO-RADAR is designed as the operational gateway for these tasks, giving OSPOs the workflows, metadata, and dashboards needed to make policy actionable at scale.
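The "Describe & Cite" task above centers on maintaining a `codemeta.json` file at the repository root, using the CodeMeta vocabulary (schema.org terms). The sketch below shows a minimal illustrative record; all field values are placeholders, and the `@context` URL assumes CodeMeta v3 (adjust to the version actually in use).

```python
import json

# Minimal illustrative codemeta.json record. Every value here is a
# placeholder; the @context URL assumes CodeMeta v3.
record = {
    "@context": "https://w3id.org/codemeta/3.0",
    "@type": "SoftwareSourceCode",
    "name": "example-tool",
    "codeRepository": "https://example.org/lab/example-tool",
    "license": "https://spdx.org/licenses/MIT",
    "version": "1.2.0",
    "dateModified": "2025-09-01",
    "author": [{"@type": "Person", "givenName": "Ada", "familyName": "Example"}],
}

# Serialize to the codemeta.json file that would sit at the repository root.
print(json.dumps(record, indent=2))
```

Because the record is plain JSON-LD, the same fields can be harvested by the dashboard, by HAL, and by citation generators without re-entry.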
The pilot report on French academic institutions preserved in Software Heritage (Di Cosmo, July 2025) demonstrates the feasibility of such a product. Using the institutional analysis pipeline, Software Heritage was able to provide:
The OSPO-RADAR dashboard, complemented by institutional reports, establishes a product ecosystem that extends beyond the project’s two-year timeline. Reports can be offered as:
For OSPOs, institutional-level reports represent far more than static analyses: they function as strategic tools that allow institutions to monitor software production across departments and labs, identify flagship projects and emerging communities, benchmark practices such as metadata adoption and project sustainability, demonstrate impact to funders and policy bodies, and support compliance with Open Science mandates, including archiving and citation readiness. By packaging the dashboard’s data into exportable reports, OSPO-RADAR extends its role beyond day-to-day management, providing OSPOs with a means to communicate effectively with leadership, funders, and external stakeholders.
In this way, OSPO-RADAR moves from a one-off project deliverable to a sustainable service model, positioning Software Heritage as both an archival resource and a provider of institutional intelligence for research software and beyond.
Description/Link |
https://www.zotero.org/groups/5682994/faircoreeosc_d./library |
Alliez, P., Cosmo, R. D., Guedj, B., Girault, A., Hacid, M.-S., Legrand, A., & Rougier, N. (2020). Attributing and Referencing (Research) Software: Best Practices and Outlook From Inria. Computing in Science & Engineering, 22(1), 39–52. https://doi.org/10.1109/MCSE.2019.2949413 |
Azzouz-Thuderoz, M., Del Cano, L., Castro, L. J., Dumiszewski, Ł., Garijo, D., Gonzalez Lopez, J. B., Gruenpeter, M., Schubotz, M., & Wolski, M. (2023). SIRS Gap Analysis Report. https://zenodo.org/records/10376006 |
Bilder, G., Lin, J., & Neylon, C. (2020). The Principles of Open Scholarly Infrastructure. https://doi.org/10.24343/C34W2H |
Carlin, D., Rainer, A., & Wilson, D. (2023). Where is all the research software? An analysis of software in UK academic repositories. PeerJ Computer Science, 9, e1546. https://doi.org/10.7717/peerj-cs.1546 |
Declaration on Research Assessment. (2013). San Francisco Declaration on Research Assessment. DORA. https://sfdora.org/read/ |
Di Cosmo, R. (2025, September 8). Academic Software Landscape overview through the Software Heritage looking glass. Access to research software: Opportunities and challenges, OECD. Zenodo. https://zenodo.org/doi/10.5281/zenodo.17075792 |
Di Cosmo, R., Gruenpeter, M., & Zacchiroli, S. (2018). Identifiers for Digital Objects: The Case of Software Source Code Preservation. 1–9. https://doi.org/10.17605/OSF.IO/KDE56 |
Di Cosmo, R. (2023, March). SWHID specification kickoff meeting. SWHID kick-off meeting, Online Conference. https://hal.science/hal-04121507 |
Dillon, C. (2025). Academic OSPOs, What are they & Why we need them! 5ème séminaire de l’écosystème Recherche Data Gouv, Lille. https://rdg-seminaire5.sciencesconf.org/program/details |
Directorate-General for Research and Innovation (European Commission) & EOSC Executive Board. (2022). Strategic Research and Innovation Agenda (SRIA) of the European Open Science Cloud (EOSC). Publications Office of the European Union. https://data.europa.eu/doi/10.2777/935288 |
Eglen, S., & Nüst, D. (2019). CODECHECK: An open-science initiative to facilitate sharing of computer programs and results presented in scientific publications. Septentrio Conference Series, 1. https://doi.org/10.7557/5.4910 |
EOSC Executive Board & EOSC Secretariat. (2020). Scholarly infrastructures for research software. Report from the EOSC Executive Board Working Group (WG) Architecture Task Force (TF) SIRS. European Commission. Directorate General for Research and Innovation. https://data.europa.eu/doi/10.2777/28598 |
Garijo, D., Arroyo, M., Gonzalez, E., Treude, C., & Tarocco, N. (2024). Bidirectional Paper-Repository Tracing in Software Engineering. Proceedings of the 21st International Conference on Mining Software Repositories, 642–646. https://doi.org/10.1145/3643991.3644876 |
Granger, S., Gruenpeter, M., Monteil, A., Nivault, E., & Sadowska, J. (2022, October 26). Modérer un dépôt logiciel dans HAL: Dépôt source et dépôt SWHID. Inria ; CCSD ; Software Heritage. https://inria.hal.science/hal-01876705 |
Gruenpeter, M., Sadowska, J., Nivault, E., & Monteil, A. (2022). Create software deposit in HAL. Inria ; CCSD ; Software Heritage. https://inria.hal.science/hal-01872189 |
Gruenpeter, M., Granger, S., Monteil, A., Chue Hong, N., Breitmoser, E., Antonioletti, M., Garijo, D., González Guardia, E., Gonzalez Beltran, A., Goble, C., Soiland-Reyes, S., Juty, N., & Mejias, G. (2023). D4.4—Guidelines for recommended metadata standard for research software within EOSC. https://doi.org/10.5281/ZENODO.8199104 |
Guerry, B. (2025, September 24). Renforcer la visibilité et l’interconnexion entre les OSPOs du secteur public. Inauguration de l’Open Source Program Office de l’Université Grenoble Alpes, Saint Martin d’Hères. https://ospo-uga.sciencesconf.org/data/pages/bg_dinum_uga_2025_v1.0.pdf |
Katz, D. S., & Barker, M. (2023). The Research Software Alliance (ReSA). Upstream. https://doi.org/10.54900/zwm7q-vet94 |
Le Berre, D., Jeannas, J.-Y., Cosmo, R. D., & Pellegrini, F. (2023). Higher Education and Research Forges in France—Definition, uses, limitations encountered and needs analysis [Report]. Comité pour la science ouverte. https://doi.org/10.52949/37 |
Malone, J., Brown, A., Lister, A. L., Ison, J., Hull, D., Parkinson, H., & Stevens, R. (2014). The Software Ontology (SWO): A resource for reproducibility in biomedical data analysis, curation and digital preservation. Journal of Biomedical Semantics, 5(1), 25. https://doi.org/10.1186/2041-1480-5-25 |
Mayernik, M. S. (2016). Research data and metadata curation as institutional issues. Journal of the Association for Information Science and Technology, 67(4), 973–993. https://doi.org/10.1002/asi.23425 |
Rios, F. (2018). Incorporating Software Curation into Research Data Management Services: Lessons Learned. International Journal of Digital Curation, 13(1), Article 1. https://doi.org/10.2218/ijdc.v13i1.608 |
Task Force on Best Practices for Software Registries, Monteil, A., Gonzalez-Beltran, A., Ioannidis, A., Allen, A., Lee, A., Bandrowski, A., Wilson, B. E., Mecum, B., Du, C. F., Robinson, C., Garijo, D., Katz, D. S., Long, D., Milliken, G., Ménager, H., Hausman, J., Spaaks, J. H., Fenlon, K., … Morrell, T. (2020). Nine Best Practices for Research Software Registries and Repositories: A Concise Guide (arXiv:2012.13117). arXiv. http://arxiv.org/abs/2012.13117 |
Treloar, A., & Wilkinson, R. (2008). Rethinking Metadata Creation and Management in a Data-Driven Research World. 2008 IEEE Fourth International Conference on EScience, 782–789. https://doi.org/10.1109/eScience.2008.41 |
Université Paris Saclay. (2019, November 29). Introduction to data management plans. Université Paris-Saclay. http://www.universite-paris-saclay.fr/en/recherche/science-ouverte/les-donnees-de-la-recherche/introduction-data-management-plans |
van de Sandt, S., Nielsen, L. H., Ioannidis, A., Muench, A., Henneken, E., Accomazzi, A., Bigarella, C., Lopez, J. B. G., & Dallmeier-Tiessen, S. (2019). Practice meets Principle: Tracking Software and Data Citations to Zenodo DOIs (Version 1). arXiv. https://doi.org/10.48550/ARXIV.1911.00295 |
Young, J., Barba, L. A., Choudhury, S., Flanagan, C., Lippert, D., & Littauer, R. (2024). A Definition of an Academic OSPO. https://doi.org/10.5281/ZENODO.13910683 |
Identified Risk / Cause of Failure | Proposed Mitigation Measure |
The product doesn’t match functional needs. | Define key functionalities and a clear MVP. |
No active users or engagement. | Use an agile, iterative approach with frequent user testing and feedback. |
Users themselves don’t know or can’t articulate what they need. | Early engagement, lightweight prototypes, and validation sessions. |
Too many divergent use cases → scattered development and feature overload. | Define scope via governance and Product Owner oversight. |
Service architecture or delivery model (SaaS, archive integration) is unclear or unmanageable. | Clarify delivery model early; validate feasibility with the technical team. |
Operating costs are too high. | Design for efficiency; plan the budget and sustainability model from the start. |
Users can’t operate the tool autonomously → overload on the helpdesk. | Ensure usability; provide training and clear documentation. |
Governance is unclear → difficulty prioritizing features, risk of scope creep. | Establish a governance structure and validation mechanism. |
Some partners perceive the product as a competitor → threatens collaboration. | Communicate clearly why Software Heritage is the right actor to deliver this solution. |
Not all potential user groups were identified. | Map stakeholders early; revisit and update regularly. |
Another team or competitor develops a similar service in parallel. | Monitor the ecosystem and position OSPO-RADAR as complementary. |
Insufficient resources for development, deployment, and maintenance. | Secure adequate funding; adjust scope if needed. |
Academia and industry have fundamentally different needs. | Keep modular design; allow context-specific configurations. |
Negative reputation risk (e.g., concerns over tracking or privacy). | Include privacy protections (e.g., hide personal data when needed). |
Name: Sofia
Age: 38
Location: Lisbon, Portugal
Languages: Portuguese, English, Spanish
Professional life
Needs
Relations
Influences / Information sources
Motivation / Drivers
Tech Stack
Superpowers
Daily Routine
09:00 → Morning espresso + email check
10:30 → Meeting with research office on Horizon Europe reporting
12:00 → Lunch with IT lead
14:00 → Internal OSPO training session
15:30 → Review open source policy draft
16:00 → Review SonarQube analysis reports and CI/CD pipeline status
17:00 → OSPO network call
21:00 → Check GitHub PRs on OSPO-RADAR, CodeMeta
“Open is not just about the license, it's a culture: we are committed to implementing open practices for articles, for data, and for source code.”
Actor - Who? | Action - What? | Reason - Why? | Scenario | |
metadata | Stakeholder that performs the action | Action needed | The goal of the action: why this actor needs this action | As a [actor] I can [action] so that [reason]
CodeRepository | Software developer | open an issue | contribute to an existing tool to improve it | |
identifier | Software author or responsible party asks for a PID; the PID provider mints/calculates it | Ask / Assign / Calculate | To make the resource easily identifiable to the world | As an author/responsible party for the software, I ask for a PID (extrinsic) or enable automatic calculation (intrinsic, e.g., by having the software archived in SWH) so that my software has a unique PID |
author | Curator, or harvested from the code repository/README | Entered by a curator or harvested from another site or from within the software | Attribution / contact | As a collaborator, I can identify a creator of the code so that they get attribution in a citation |
dateModified | Developer / researcher | As a developer / researcher, I can check the modified date so that I know whether the software is up to date (i.e., is still actively worked on). | ||
BuildInstructions | A user | needs to install or run the software | To use the software locally or on their own hardware (such as an HPC cluster or a cloud) | |
DateCreated | Developer of the software | As a developer, I can get credit and attribution for software I worked on. | ||
Readme | A user | Wants to understand if the software is useful for their purpose | To select the software they are going to use | |
Readme | A user | Wants to understand the requirements of using a given software (see also software requirements) | To decide whether or not the requirements can be met before running the software | |
Readme | A user | Wants to know who to acknowledge | To add a reference to the software tool in a publication, or a poster or any other scholarly output | |
datePublished | Developer / Aggregator | Citation | So that people know when it was released. | |
version | User | refer a specific version of software | To be able to cite it for reproduction of older results from colleagues | As a user I can refer to a specific version of a software so that I can cite it when reproducing older research. |
Keyword | Search engine | Helps findability of the resource, and also its classification via vocabularies/ontologies | Resources need to be classified in order to be easily findable | A postdoctoral researcher looking for software to help analyse their data |
ProgrammingLanguage | User / search engine | search for language-specific tools | To enable interoperability with other tools written in the same language, and to allow reuse by users who know that language | A teacher wanting to illustrate a way to solve a problem using a specific language |
runtimePlatform | User / Contributor | Use (or develop) the software | To have a compatible environment where the software works as intended | As a user (or contributor), I can use (or develop) the software so that I have a compatible environment where the software works as intended. |
license | Authors of the software | Make a decision about a proper license and add (a) LICENSE file(s) | To make a clear statement of the terms and conditions that apply to using the software | As a developer, I want to be able to clearly state who is allowed to do what with my software. As a user, I want to know what I am allowed to do with this software (use it, build on it) |
developmentStatus | Developers | Description of development status, e.g. active, inactive, suspended; see repostatus.org | Informs the public whether a software project is live or outdated; important for deciding whether I, as a researcher, want to use or build on this software for my research | As a user, I want to know whether I can use this software for my research. As a developer, I want to know whether anyone is actively maintaining this software |
embargoDate | Authors, when publishing software via a repository | Date when the embargo is over | Some software might be restricted for a period of time; users need to know when that restriction ends. | |
OperatingSystem | User/service | Reinstall / reuse | The operating system should be described so that the user or service can install or run the tool | An IT user wanting to compare the performance of different operating systems used to run a given kind of software |
SoftwareRequirements | User / other software / dependencies | Install before reuse, or combine with other tools | To ensure all the prerequisites for the software are available before reuse, and to inform about dependencies of the resource | A user wanting to be sure that the software does not use a dependency which they cannot use in their own environment (incompatibilities) |
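The intrinsic-identifier row above can be made concrete. For a single file, the SWHID content identifier is the SHA-1 of a Git-style blob header followed by the file's bytes, so anyone can recompute it locally and check it against the Software Heritage archive. A minimal sketch:

```python
import hashlib

# Sketch of computing an intrinsic content identifier (SWHID, "cnt" type):
# the SHA-1 of "blob <length>\0" followed by the raw file bytes, which is
# the same scheme Git uses for blob objects.
def swhid_content(data: bytes) -> str:
    header = f"blob {len(data)}\0".encode()
    return "swh:1:cnt:" + hashlib.sha1(header + data).hexdigest()

# The empty file yields the well-known Git empty-blob hash.
print(swhid_content(b""))  # swh:1:cnt:e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
```

Because the identifier is derived from the content itself, no registry lookup is needed: the same bytes always produce the same SWHID, which is what makes the "automatic calculation" path in the table possible.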
Section | Subsection | Suggested usage | The question to answer |
Overview | Text from proposal | ||
Objectives | Archive | Archive | |
Reference | Reference | ||
Describe | Describe | ||
Cite | Cite | ||
Out of Scope | Identify elements in the proposal that are out of scope for this particular subcomponent, and identify other limitations due to resources or feasibility | What are the limitations of the subcomponent in achieving the four pillar objectives: Archive, Reference, Describe, and Cite? |
Requirements | User stories | As a __ I can ___ so that ___ | What is the user’s story? Why does this user want to achieve a particular goal? |
User requirements | Identifying the needs, goals, and tasks directly from the SIRS report | What is the user’s need to achieve the goal and have a happy ending? | |
Functional requirements | Identification of application requirements, server, database, etc. | What can the system / infrastructure provide as new or improved functionalities to obtain the happy ending? ||
Non-functional requirements | What can the system / infrastructure provide as non-technical additions to obtain the happy ending? ||
Specifications | Architectural design | Sequence diagram | What components are related and what is the workflow to obtain the objectives? |
Functional specifications | A detailed breakdown of the implementation for software developers: capabilities, appearance, and interactions with users (equivalent to an issue / ticket / task that can be resolved with one PR / diff) | What will the subcomponent team implement as part of the infrastructure features/functionalities to answer the identified requirements? |
Service specifications | How do I test that the service is working properly? | ||
Operational specifications | |||
Integration with EOSC Core components | List the FC4E CC with links to each specs and identify the points of integration / interoperability | How does this subcomponent interact with each one of the FC4E Core Components? | |
External references | implementation infrastructure | Relevant documentation on infrastructure website | Where can I get more information about the implementation’s current state? |
Software Heritage | Relevant documentation on SWH | Where can I get more information about the SWH archive's current state? |
OSPO-RADAR has received funding from the Sloan Foundation under Grant Agreement no. 2025-25188.
[a]This seems like the abstract for the project, not for the deliverable.
[b]it's also future tense rather than present or past
[c]in a specific context, right?
[d]In EVERSE project we are developing a dashboard. Software quality pipelines will do assessments using quality indicators (metrics) and the results will be added the dashboard. Software citation, and other metadata will also be included in the framework. I see a big overlap here. Perhaps we should chat about this.
See:
- https://github.com/EVERSE-ResearchSoftware/DashVERSE
- https://github.com/EVERSE-ResearchSoftware/QualityPipelines
[e]I see my comment was added anonymously :)
I am Faruk Diblen.
[f]should this have a bullet?
[g]I think "unambiguously referable" really means something like "The specific version of software I used in a paper has a stable DOI or other identifier."
[h]Can you elaborate more? What information is missing here?
[i]Findability is, I think, something we can measure. "Unambiguously referable" is not, I think, specific enough for us to know how to measure it. I'm trying therefore to intuit what the author meant, and then translate that into something we can measure. In my own case, referring to specific versions of specific software packages is what I need to attest to the full provenance of data in my research. I do this down to an enumeration of all the versions of all the external libraries as well. I think that is what we are trying to get at.
[j]Here we are separating reference from (citation) credit, so it is not about the extrinsic identifier (DOI) getting you into a metadata record, but a reference to the specific artifact, that can be a directory, a commit or a release and that we are sure that it is this specific thing, in our case using a hash.
[k]I don't see the connection between this and the earlier text in the subsection
[l]it might be worth saying something about what fraction of research organizations have OSPOs and if there is anything unique about this subset compared with the larger set of research organizations.
[m]maybe say that this goal "bridging ..." is one that is common to many universities, but they have different ways of doing this and are at different levels of success. OSPOs are one way, and are the focus of this work (assuming this is true. One question is if this project will benefit the organizations that don't have OSPOs and if that is an intended goal of the project)
[n]Could an “Engineer as Software User” be part of this persona or might it be added as additional persona? IMHO, any potential software user from outside the academic environment matters
[o]Can you explain how this persona is interacting with the Dashboard or with the Software Heritage archive? This persona matters in the ecosystem but it would be useful to understand what is the connection with this infrastructure
[p]A potential software user might discover the software through the Dashboard, and its information might be relevant to understand software authorship, maintenance, etc.
[q]How will they access the dashboard? It is an internal dashboard for an academic OSPO. With a visitor view to browse entities. Not sure this view is adapted to this user. Can you check workflow 10 and see if it is?
[r]We will capture all comments at the end of the review and see how to address globally. I'm asking here in case you have specific user in mind for whom this is the view they will use and explain in depth the use case.
[s]why "thus"?
[t]is a line needed below this in this column as well?
[u]It would be valuable for an OSPO administrator to be able to delegate the ability to update and validate entries for specific projects to project owners or department/lab administrators. This ability to assign edit rights that are limited to specific projects may be similar to this "Curator - contributor" role, but with a more limited scope.
[v]This may not only be an initial import, it could be an on-going workflow. If an OSPO maintains their own registry in parallel, there will need to be the ability to perform batch updates on a periodic basis to keep the data in the two systems synchronized.
[w]I assume this duplicate check will not allow multiple OSPOs to enter the same project URL. If an institution uses an open source library or tool but does not actively contribute to the code, will there be a way to capture the details of that location implementation so that it can be included as part of the institution's dashboard? How would this look if multiple institutions do this for the same open source project?
[x]Implementation detail - If this check relies on matching email domains, ensure multiple domains can be listed and allow OSPO admins to set these values.
[y]It would also be nice to be able to identify repos that do not have an identified license, or README, or other recommended project files.
[z]If these are to be used as an institutional resource, there will need to be the ability to set institutional branding, be adjustable to meet baseline institutional web resource guidelines (e.g. colors, logos, header/footer links) and support the use of a custom domain or subdomain.
[aa]Funders will ask for indicators of impact
[ab]n° of citations/mentions
[ac]This component is being used in.... (dependency metrics)
[ad]Consider if this will be a sticking point for institutions outside of the EU. If so, might there be partners willing to host in other jurisdictions?
[ae]what happens if university OSPOs remain rare or even decrease?