Improving reproducibility of geospatial conference papers
Lessons learned from a first implementation of reproducibility reviews
Daniel Nüst (AGILE 2020 Reproducibility Chair), Frank O. Ostermann, Carlos Granell, Alexander Kmoch (all Reproducibility Committee)
https://doi.org/10.7557/5.5601
AGILE council | annual conference | PhD schools | initiatives
GIScience teaching/research @ European research agendas
https://reproducible-agile.github.io/
2017, ’18 & ’19: Workshops on reproducibility
2019: Reproducible publications at AGILE conferences (initiative)
2020: First AGILE reproducibility review
AGILE Reproducible Paper Guidelines: Contents & First Revision
Created by AGILE Initiative in 2019, see report at https://osf.io/hupxr/
Transparency & Reproducibility
GIScience
https://osf.io/phmce/wiki/home/
Promotion
Acknowledge spectrum
The guidelines
Author guidelines
Data in Research Papers
Computational workflows in Research Papers
Pre-submission checklist
Writing a DASA (Data and Software Availability) section
Rationale/Motivation/Vision
Reviewer guidelines
Reproducibility reviewer guidelines (WIP)
The guidelines for reproducibility reviewers (WIP)
Ideal vs. realistic
Role
Skills
Do’s & don’ts
Reproducibility Review at�AGILE Conference 2020
Review process
Proceedings:�https://www.agile-giscience-series.net/review_process.html
Process documentation:�https://osf.io/7rjpe/
Reproducibility review takes place after the accept/reject decision and is triggered by a regular reviewer
Reproducibility review & communication
Community conference & coronavirus
Badges on proceedings page
Presentation at conference
Reproducibility review results
6 reproducibility reports published
16 not possible/not attempted (5 of which after communication with authors)
Reproducibility review reports
Independent execution of computations underlying research articles.
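In code terms, a reproducibility review boils down to independently re-running the workflow and comparing the recomputed outputs with the published ones. A minimal sketch of that comparison (all function names, values, and the tolerance are illustrative assumptions, not taken from any reviewed paper):

```python
def rerun_workflow():
    """Stand-in for independently executing a paper's computational workflow."""
    return 3.14159  # recomputed result (illustrative)

published_value = 3.1416  # value reported in the (hypothetical) paper
tolerance = 1e-3          # acceptable numerical deviation (illustrative)

# A result counts as reproduced if the independent run matches within tolerance
reproduced = abs(rerun_workflow() - published_value) < tolerance
print("reproduced" if reproduced else "not reproduced")
```

In practice, of course, the hard part is everything before this comparison: obtaining the data and code, rebuilding the computational environment, and executing the workflow at all.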
Findings
Overall
Read full report at https://osf.io/7rjpe/
Findings
Challenges for the reproducibility reviewer:
All efforts beyond mere workflow execution
🙌
How to put your community on a path towards more reproducibility in 5 easy hard steps
Next steps
Do it again in 2021 🎉
Revise guidelines 🛠️ 🇮🇹 🇫🇷 🇨🇳
Grow reproducibility reviewer team
ECRs, credit @ ORCID, skills
Continue research 🕵️
Ostermann, F., Nüst, D., Granell, C., Hofer, B., & Konkol, M. (2020). Reproducible Research and GIScience: an evaluation using GIScience conference papers. EarthArXiv. https://doi.org/10.31223/x5zk5v
Continue community engagement towards opening scholarship
Scope
Requirements
Acceptance condition?
Open review if tenured
Format-free submission
CRediT
Phase out when standard practice...
Slides: https://doi.org/10.7557/5.5601
Reproducibility Committee 2020 + Initiative
Daniel Nüst (University of Münster, GER)
Frank Ostermann (University of Twente, NLD)
Carlos Granell (Universitat Jaume I, ESP)
Alexander Kmoch (University of Tartu, EST)
Barbara Hofer (University of Salzburg, AUT)
Rusne Sileryte (TU Delft, NLD)
Markus Konkol (University of Twente, NLD)
https://reproducible-agile.github.io/
Word-stem cloud of all AGILE 2020 submissions (full/short/poster)
Slides published under CC BY 4.0
Bonus slides
The guidelines for data
“What if…” and Examples (not shown)
The guidelines for workflows
Examples (not shown)
The guidelines for reproducibility reviewers (WIP)
Examples for “Do’s and Don’ts”:
0% of rejected papers have a DASA section (correlation, not causation)
48% of accepted full papers have a DASA section
Reproducible research and GIScience: an evaluation using AGILE conference papers�https://doi.org/10.7717/peerj.5072