1 of 13

Identifying outcome and impact indicators and improving their collection throughout the Belgian Node

ELIXIR Impact implementation study - Show and Tell - 23 March 2023

Kim De Ruyck

2 of 13

The Belgian ELIXIR Node

Partners

  • All 5 Flemish universities, 2 Walloon universities, Sciensano, VIB (lead)

Funding

  • Research Foundation Flanders - FWO
    • Calls for International Research Infrastructures
    • Grants
      • 2019-2022: 10 groups from 6 Flemish partners
      • 2023-2026: 13 groups from 6 Flemish partners

  • European grants: ELIXIR-Converge, EuroScienceGateway, Genomic Data Infrastructure…

3 of 13

The Belgian ELIXIR Node

Domains

  • Research data management
  • Reproducible data analysis
  • Sensitive data infrastructure & federated learning framework
  • Services for the interpretation of human research data
  • Plant biodiversity services
  • Training

Node Services

  • Call every two years
  • 25 Node Services from 14 groups in 8 partners

4 of 13

Collecting outcome & impact indicators

Specified by FWO

FWO-IRI grant proposal (every 4 years, March-April)

  • ‘Scientific context and Impact analysis’ template

    • Describe the impact and achievements already obtained or to be expected (scientific, economic, societal and/or policy-relevant impact); demonstrate the output of the previous funding (i.e. key publications, patents, visibility, training, development of collaborations and projects…)
    • Describe the impact of the RI on the position of the users/stakeholders
    • Describe how this IRI proposal is expected to enhance the acquisition of new projects and attract funding opportunities for the Flemish infrastructure users
    • Describe how you will maximize the impact of the proposed RI through collaboration with (an)other RI(s) and through expanding the community

  • Feedback from the Science Commission

    • Report more actively and in more detail on the use of the infrastructure by Flemish, other Belgian and international researchers
    • Solicit active feedback from all these researchers on their priorities

5 of 13

Collecting outcome & impact indicators

Specified by FWO

Specified by us

FWO-IRI grant reporting (yearly, March-April)

  • Key Performance Indicators:

    • mainly technical information, very few outcome/impact indicators

  • Additional information:

    • scientific output measured by publications and valorisations
    • usage of the infrastructure
    • international importance (new projects)
    • public research funds and contracts in collaboration with third parties

6 of 13

Collecting outcome & impact indicators

Specified by us

Node Services evaluation/call (yearly or every 2 years, June)

  • Template based on Core Data Resources

    • Key publications and citations
    • Usage numbers
    • Data throughput
    • Counterfactual: what if the resource had not existed?
    • Accelerating science: how does the resource accelerate science?
    • Narratives on how the resource influenced research efficiency and research-related knowledge

7 of 13

Collecting outcome & impact indicators

Specified by us

Scientific Advisory Board meeting (yearly, October)

  • Lists

    • Publications
    • Number of training courses and attendees
    • Events organised
    • Projects
    • External mentions (in guidelines, websites, presentations…)

8 of 13

Gap analysis

  • Impact framework made for one service (e.g. data submission tool)

  • Mapping of current information collected

  • Examples from other Nodes’ use cases, impact training course…

9 of 13

Impact framework

10 of 13

Mapping exercise

11 of 13

Mapping exercise: main conclusions

  • We already collect a lot of information, but it is spread out over several documents and sometimes underused

  • We collect a lot of usage-related evidence of outcome and impact

  • Impact-related information is sometimes ‘hidden’ in reporting documents for specific services or in Node Service evaluation forms

  • We identified a number of additional indicators

  • The funding agency’s requirements are very vague, so everything is included

12 of 13

Lessons learnt

  • Alignment needed

    • Required by the funding agency and desired by us versus currently collected
    • Funded services versus Node Services (e.g. not all funded services are part of the SDP)
    • Timing (e.g. emerging versus mature NS, NS evaluations versus project reporting)

  • Reorganise what we have collected / prioritise what we collect

    • Project reporting (KPIs): very technical => impact-related
    • Focus on use and needs (cf. the Science Commission)

  • Guide our partners (especially for the non-quantitative indicators)

    • Examples

13 of 13

Conclusion

This is not the end

We haven’t got all the answers yet…

… but we came a step closer to listing suitable, specific outcome and impact indicators, and methods to collect these periodically throughout the Node

Happy to answer questions