1 of 11

Breakout

Instructions

2 of 11

Central challenge of the use case

(phrased as a “How to…?” challenge)

Ask “Why?”

    • …to find higher-order challenges.
    • Rephrase these as “How to…?” challenges.

Ask “Why else?”

    • …to find more higher-order challenges.
    • Rephrase these as “How to…?” challenges.

Ask “What’s stopping you?”

    • …to find tactical challenges.
    • Rephrase these as “How to…?” challenges.

Ask “What else is stopping you?”

    • …to find more tactical challenges.
    • Rephrase these as “How to…?” challenges.

[Diagram: “Why?” points upward toward higher-order challenges; “What’s stopping you?” points downward toward tactical challenges.]

3 of 11

NSF topics to address in breakout sessions, from the Dear Colleague Letter

(http://www.nsf.gov/pubs/2014/nsf14059/nsf14059.jsp):

Metrics of impact:

  • Introduction of appropriate metrics that match the effort necessary for successful development and maintenance of scientific software and data frameworks
  • Development and use of metrics that measure software and data framework usage and impact on science, engineering and education
  • Establishment of metrics that recognize open access policies and sharing
  • Comparison of impact for publication of software and data in a citable form before paper publication, as advocated in initiatives such as SageCite (http://blogs.ukoln.ac.uk/sagecite/), versus current practices
  • Creation of specific project metrics that assess and monitor effective availability and accessibility of software and data
  • Identification of sources of information about researchers' productivity and impact
  • Development of ways in which researchers' scientific activity can be automatically captured and validated

Citation and attribution:

  • Novel mechanisms for citation of software and datasets as distinct products of scholarship, promoting standards of academic credit and rigor for these cyberinfrastructure components
  • Novel citation methods for new forms of publication and scientific expression so that researchers are able to ensure their work is citable, and others are able to discover and access it
  • Citation patterns that include a role for citations (e.g. to value activities such as “data provider/curator” and/or “software tool provider” alongside “data analyzer” or “computational modeler”), which can help create a credit market for data and software sharing
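The role-based citation pattern in the last bullet can be sketched in code. This is a hypothetical illustration only: the record fields, names, roles, and citation format below are assumptions for the sake of the example, not part of any existing citation standard.

```python
# Hypothetical sketch: a citation record that credits roles such as
# "data provider/curator" and "software tool provider" alongside the
# traditional "data analyzer". All names and fields are illustrative.

def format_citation(record):
    """Render a citation string listing each contributor with a role."""
    people = "; ".join(
        f"{c['name']} ({c['role']})" for c in record["contributors"]
    )
    return f"{people}. {record['title']} ({record['year']}). {record['identifier']}"

record = {
    "title": "Example Ocean Model Dataset",
    "year": 2014,
    "identifier": "doi:10.0000/example",  # placeholder identifier
    "contributors": [
        {"name": "A. Researcher", "role": "data analyzer"},
        {"name": "B. Curator", "role": "data provider/curator"},
        {"name": "C. Developer", "role": "software tool provider"},
    ],
}

print(format_citation(record))
```

Making the role machine-readable in this way is what would let aggregators count curation and tool-building contributions separately, creating the “credit market” the bullet describes.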

4 of 11

How to use your time:

  • Summary of the challenge (20%)
  • Why is it important? (20%)
  • Why not solved? (20%)
  • Critical Actions (~3-5) (40%)

5 of 11

What is a “Critical Action”?

Criteria:

  • Relevant
  • Concrete
  • Rational
  • Aggressive
  • Understandable

Examples:

“$5 million over 3 years to build a framework.”

“A mandate from a federal funding agency to use a common framework.”

6 of 11

Useful Metrics

Karl Benedict - UNM Libraries

Ray Idaszak - RENCI

Jennifer Lin - PLoS

Fiona Murphy - Wiley

Mark Parsons - Research Data Alliance

Mark Taylor - Elsevier

Curt Tilmes - NASA

Kes Schroer - Dartmouth

7 of 11

Post-it notes

  • How do we measure contributions?
  • Who cares?
  • Measure the impact of data/software as well as the science.
  • What barriers might exist to getting measured (e.g., “publication costs”)?
  • What constitutes “publication”?
  • Alternatives to “citation” as a measure
  • Weighting contributions
  • Align metrics with the problem
  • How do we define metrics (multiple metrics) to reduce the chance of gaming?
  • Linkage between values and metrics
  • Align sources of metrics (publishers, repositories, search engines, …) with clearly defined information requirements
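The post-it about using multiple metrics to reduce gaming can be sketched as combining several independent signals into one composite score. This is a hypothetical illustration, not a proposed standard: the signal names and baseline values are assumptions. A geometric mean is used because inflating any single signal moves the composite far less than it would move a simple average.

```python
import math

# Hypothetical sketch: combine several independent usage signals into a
# composite score. With a geometric mean of normalized signals, gaming
# one signal (e.g., scripted downloads) has a dampened effect compared
# to an arithmetic mean. Signal names and baselines are illustrative.

def composite_score(signals, baselines):
    """Geometric mean of signals normalized by community baselines."""
    ratios = [signals[k] / baselines[k] for k in baselines]
    return math.prod(ratios) ** (1 / len(ratios))

baselines = {"citations": 10, "downloads": 1000, "distinct_users": 50}

honest = {"citations": 20, "downloads": 2000, "distinct_users": 100}
gamed = {"citations": 10, "downloads": 100000, "distinct_users": 50}

print(composite_score(honest, baselines))  # every signal doubled
print(composite_score(gamed, baselines))   # only downloads inflated 100x
```

Doubling every signal doubles the composite, while a 100x inflation of downloads alone yields a composite far below the 100x the gamer paid for, which is the dampening the post-it asks for.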

8 of 11

Summary of Challenge

  • How to classify and weight different types of contributions?
  • How to map metrics to relevant communities and their needs and values?
  • How to ensure metrics are non-proprietary, independent, reproducible, transparent, simple, and trustworthy?

9 of 11

Why is it important?

  • Incentivize the right behavior to advance science

10 of 11

Why not solved?

  • Recognition of data and software as scholarly outputs is relatively recent.
  • Academic snobbery
  • Need to overcome status quo of articles as the only first-class products.
  • No current standards for software “publication” metadata; data metadata remains a challenge.

11 of 11

Critical Actions (Demands)

  1. NSF (and others) pay to convene key players to identify and harmonize standards on roles, attribution, value, and transitive credit (in an extensible framework), with the promise of recognition by key sponsors.
  2. Agencies/publishers/societies/foundations fund implementation grants to identify and measure data and software impact in ways that are relevant to stakeholders and communities.
  3. Identify a model to iterate and improve the standards framework.
  4. Define discovery and use metadata standard(s) for software.