1 of 24

Improving reproducibility of geospatial conference papers

Lessons learned from a first implementation of reproducibility reviews

Daniel Nüst (AGILE 2020 Reproducibility Chair), Frank O. Ostermann, Carlos Granell, Alexander Kmoch (all Reproducibility Committee)

https://doi.org/10.7557/5.5601

2 of 24

https://agile-online.org/

AGILE council | annual conference | PhD schools | initiatives

GIScience teaching/research @ European research agendas


3 of 24

https://reproducible-agile.github.io/

2017, ‘18 & ‘19: Workshops on reproducibility
2019: Reproducible publications at AGILE conferences (initiative)
2020: First AGILE reproducibility review


4 of 24

AGILE Reproducible Paper Guidelines: Contents & First Revision


5 of 24

AGILE Reproducible Paper Guidelines 🇬🇧 🇪🇸

https://doi.org/10.17605/OSF.IO/CB7Z8

Created by AGILE Initiative in 2019, see report at https://osf.io/hupxr/

Transparency & Reproducibility in GIScience: https://osf.io/phmce/wiki/home/

Promotion
Acknowledge spectrum


6 of 24

The guidelines

Author guidelines:
  • Data in Research Papers
  • Computational workflows in Research Papers
  • Pre-submission checklist
  • Writing the DASA (Data and Software Availability) section

Rationale/Motivation/Vision

Reviewer guidelines

Reproducibility reviewer guidelines (WIP)


7 of 24

The guidelines for

reproducibility reviewers (WIP)

Ideal vs. realistic

Role

Skills

Do’s & don’ts


8 of 24

Reproducibility Review at�AGILE Conference 2020


9 of 24

Review process


Reproducibility review after accept/reject decisions, triggered by a regular reviewer

Reproducibility review & communication

Community conference & coronavirus

Badges on proceedings page

Presentation at conference

10 of 24

Reproducibility review results

6 reproducibility reports published

16 not possible or not attempted (5 of which after communication with the authors):

  • no starting point in the paper
  • documentation insufficient for a third party
  • sensitive/confidential/commercial data
  • proprietary software
  • software paper
  • (conceptual papers)


11 of 24

Reproducibility review reports



14 of 24


Independent execution of computations underlying research articles.

15 of 24

Findings

Overall

  • Saw full spectrum of reproducibility
  • Compared to previous years’ submissions, the guidelines and increased community awareness markedly improved reproducibility
  • 5 of the 6 reproduced papers have a DASA section; all embrace the guidelines
  • Reproducibility reports included many recommendations for improvement; authors received them well and even incorporated them into revisions before publication: a reward!
  • Good practices spread slowly
  • Process


Read full report at https://osf.io/7rjpe/

16 of 24

Findings

Challenges for reproducibility reviewer:

  • Inconsistencies (identifiers, links) between paper and code
  • Lack of connections between artefacts (code ↔ figure)
  • Workspace layout: no documentation, absolute paths
  • Unknown runtime and no demo subsets of data
  • No guidance on effort and stopping points

All of these require effort beyond mere workflow execution.
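Several of these pain points (absolute paths, missing demo data) are cheap to avoid in the analysis scripts themselves. A minimal Python sketch of a workspace-friendly pattern, with hypothetical file and folder names: build all paths relative to a project root, and offer a demo-subset switch so a reviewer can test the workflow without the full data and runtime:

```python
from pathlib import Path
import tempfile

def input_path(root: Path, demo: bool = False) -> Path:
    """Build input paths relative to a project root instead of
    hard-coding absolute paths (e.g. C:/Users/alice/...) that break
    on a reviewer's machine. demo=True selects a small subset file
    so the workflow can be tested quickly. File names are hypothetical."""
    name = "records_demo.csv" if demo else "records_full.csv"
    return root / "data" / name

# Usage: a temporary directory stands in for a project checkout.
with tempfile.TemporaryDirectory() as tmp:
    root = Path(tmp)
    (root / "data").mkdir()
    input_path(root, demo=True).write_text("id,value\n1,2\n")
    rows = input_path(root, demo=True).read_text().splitlines()
    print(len(rows))  # header + one record
```

Because nothing depends on where the project folder lives, a reviewer can unpack the supplement anywhere and run it unchanged.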



17 of 24

🙌


How to put your community on a path towards more reproducibility in 5 easy hard steps

  1. Build a team of enthusiasts (workshop, social events)
  2. Assess the current state and raise awareness (workshop, paper)
  3. Institutional support (🙏 AGILE Council 🙏 + committee chairs)
  4. Positive encouragement (no reproduction != bad science)
  5. Keep at it!

18 of 24

Next steps

Do it again in 2021 🎉

Revise guidelines 🛠️ 🇮🇹 🇫🇷 🇨🇳

Grow the reproducibility reviewer team: early-career researchers (ECRs), credit via ORCID, skills

Continue research 🕵️
Ostermann, F., Nüst, D., Granell, C., Hofer, B., & Konkol, M. (2020). Reproducible Research and GIScience: an evaluation using GIScience conference papers. EarthArXiv. https://doi.org/10.31223/x5zk5v


Continue community engagement towards opening scholarship:
  • Scope & requirements: acceptance condition?
  • Open review if tenured
  • Format-free submission
  • CRediT
  • Phase out when standard practice...

19 of 24

Thank you!

I look forward to your questions!
@nordhomen | d.n@wwu.de

Slides: https://doi.org/10.7557/5.5601

Reproducibility Committee 2020 + Initiative:
Daniel Nüst (University of Münster, GER)
Frank Ostermann (University of Twente, NLD)
Carlos Granell (Universitat Jaume I, ESP)
Alexander Kmoch (University of Tartu, EST)
Barbara Hofer (University of Salzburg, AUT)
Rusne Sileryte (TU Delft, NLD)
Markus Konkol (University of Twente, NLD)

https://reproducible-agile.github.io/

Word-stem cloud of all AGILE 2020 submissions (full/short/poster)


Slides published under CC BY 4.0

20 of 24

Bonus slides


21 of 24

The guidelines for data

“What if…” and Examples (not shown)


22 of 24

The guidelines�for workflows

Examples (not shown)


23 of 24

The guidelines for reproducibility reviewers (WIP)

Examples for “Do’s and Don’ts”:

  • Do shift burden to author
  • Do encourage and set examples
  • Do not accept private data sharing
  • Document your work in report (impact)
  • Be kind (career stage, knowledge, privileges)
  • No rummaging


24 of 24

0% of rejected papers have a DASA section (correlation, not causation)
48% of accepted full papers have a DASA section


Reproducible research and GIScience: an evaluation using AGILE conference papers. https://doi.org/10.7717/peerj.5072