ArcLight MVP*

*(minimum viable product)

Wind-Down Deck

Stanford University Libraries

University of Michigan

Additional support provided by

Georgia Tech, National Library of Medicine,

and Rockefeller Archive Center

Team Members

#    Role                 Who
1    Tech lead            Stanford: Jessie Keck (1 FTE)
1    Product owner        Stanford: Mark Matienzo (0.8 FTE)
5    Developers           Stanford: Jessie Keck, Jack Reed, Darren Hardy, Camille Villa (1 FTE each)
                          Michigan: Gordon Leacock (0.8 FTE)
2    UX designers         Stanford: Gary Geisler, Jennifer Vine (~0.5 FTE each)
1    DevOps liaison       Stanford: Erin Fahy
2    Management liaisons  Stanford: Tom Cramer (DLSS Direct Liaison)
                          Michigan: Nabeela Jaffer
14   Domain experts       Stanford: Laura Wilsey (DLSS), Jenny Johnson (SPEC), Michelle Paquette (SPEC), Frank Ferko (ARS), Charles Fosselman (EAL), Sarah Patton (Hoover)
                          Michigan: Mike Shallcross*, Max Eckard*, Dallas Pillen* (0.4 FTE each)
                          Georgia Tech: Wendy Hagenmaier* (0.2 FTE)
                          NLM: John Rees* (0.2 FTE)
                          RAC: Hillel Arnold*, Bonnie Gordon*, Patrick Galligan* (~0.05 FTE each)
3    Tech stakeholders    Stanford: Stu Snydman
                          Michigan: Roger Espinosa, Tom Burton-West, Chris Powell

* indicates person with time allocation to testing, indexing, documentation, etc.

Useful Links

Demo Videos

Accomplishments

  • Development of a minimum viable product for archival discovery and delivery
  • Engine built on Blacklight 7.0.0-alpha and Bootstrap 4
  • A successful project that started with community development from day 1
  • Demonstration of digital object viewers in the context of archival description using oEmbed/sul-embed; extensible to other systems
  • Significant performance improvements to the ead_solr (indexer) gem
  • Deployment of application on AWS Elastic Beanstalk
  • 100% test coverage, and a clean code base enforced by Rubocop
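
The oEmbed-based viewer integration mentioned above follows the standard oEmbed flow: the consumer builds a request URL against a provider endpoint, then renders the `html` field of the JSON reply. A minimal sketch in Ruby; the endpoint and resource URLs here are illustrative placeholders, not sul-embed's actual configuration:

```ruby
require 'json'
require 'uri'

# Build an oEmbed request URL for a resource. The endpoint and
# maxwidth default are hypothetical, not taken from sul-embed.
def oembed_request_url(endpoint, resource_url, maxwidth: 600)
  params = URI.encode_www_form(url: resource_url, format: 'json', maxwidth: maxwidth)
  "#{endpoint}?#{params}"
end

# A provider's JSON reply carries the embeddable HTML snippet
# (sample response hand-written for illustration).
sample_response = <<~'JSON'
  {"version": "1.0", "type": "rich", "html": "<iframe src=\"https://example.org/iframe/1\"></iframe>"}
JSON

embed = JSON.parse(sample_response)
puts embed['html'] if embed['type'] == 'rich'
```

Because the contract is just a URL scheme plus a JSON shape, the same pattern extends to viewers from other repository systems, which is what makes the integration reusable beyond Stanford.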

End User Value Delivered

  • A demonstrable proof-of-concept application for prototyping archival discovery paradigms, already under experimentation by other institutions and archivists
  • Search, browse, filter, and sort functionality, with discovery at both the collection and component level
  • Integrated digital object delivery
  • Lays groundwork for future prototyping of indexing directly from ArchivesSpace or other systems
  • Lays groundwork for integration of other request management systems (e.g. Aeon or sul-requests)

Lessons Learned

  • High-level goals for each sprint helped maintain focus
  • Feedback from stakeholders (internal or otherwise) during a community sprint can be hard to come by in a timely manner, but all the feedback we received was constructive
  • Having a product owner with a clear vision of the product was essential
  • The team learned iteratively how to scope tickets/stories to the right “size”, and empowered developers to help identify those boundaries
  • Pairing occurred naturally based on time zone (EST/PST)
  • Integrating a new team member mid-project helped build confidence in the work

Next steps (near term)

  • Accessibility testing and subsequent improvements (Stanford, Michigan, NLM)
  • UI improvements (Stanford)
  • More testing and outreach to gather feedback
    • Deployment testing
    • Indexing
    • Follow up meeting with Stanford Special Collections
    • Conference presentations
  • Additional QA to identify bugs

Next steps (medium term)

  • Identify future work cycles
    • Indexing/publishing directly from ArchivesSpace
    • Aeon integration
  • Identify potential community contributors
  • Develop Stanford-internal plan for evaluation and future implementation