Each entry below gives the citation, the type of article, and a synopsis. Tags applied to entries in the review: accessibility; content access; device usability; discoverability; identifying users; information retrieval; interface design; measuring costs; measuring effectiveness; measuring efficiency; measuring learnability; measuring usability; quality assessment (of content); user behavior; user perception; user requirements.
(Note: Synopses and identified gaps may include reviewer commentary as well as direct text from abstracts and/or the bodies of articles.)
User and usability studies
Agosti, M., Crivellari, F., Di Nunzio, G. M., & Gabrielli, S. (2011). Understanding user requirements and preferences for a digital library Web portal. International Journal on Digital Libraries, 11(4), 225–238. Retrieved from http://link.springer.com/10.1007/s00799-011-0075-7
Literature Review. User study of needs and perceptions of The European Library (TEL) Web portal. A previous assessment of TEL relied on web log analysis. 216 students from the University of Padua, Italy, used TEL in computer laboratories over two semesters, completing a five-part survey questionnaire after free navigation and searching on TEL. The questionnaire results, combined with the web logs, can be used to improve web portals, including TEL.
Alonso Gaona García, P., Martín-Moncunill, D., Sánchez-Alonso, S., & Fermoso García, A. (2014). A usability study of taxonomy visualisation user interfaces in digital repositories. Online Information Review, 38(2), 284–304. Retrieved from http://www.emeraldinsight.com/doi/abs/10.1108/OIR-03-2013-0051
Research study/article. Analyzes the usability of different approaches to visualization interfaces, asking whether visualization techniques and "awareness of HCI methodologies" could improve services. The first part of the study involved eight interfaces with simple taxonomies and data and analyzed participants' visual perception. The second part used a simulated collection of 40,000 digital resources from the Europeana digital library, described by styles and periods from the Art & Architecture Thesaurus (AAT), to measure the usability of taxonomy visualisation user interfaces. 32 middle-aged, educated, web-savvy, English-speaking participants from Europe and Latin America were selected. Users' perception in the first part of the study did not always match their ability to find information in the second part. The study showed that visualization techniques offer good overviews of digital collections and that users' visual perception strongly informs their tendency to select a particular collection. The article concludes with recommendations for the development of graphical interfaces, the final aim being to offer repository owners a scientific basis for choosing an interface that helps users effectively locate and visualise resources in large digital collections.
Balatsoukas, Panos. “An eye-tracking approach to the evaluation of digital libraries.” In User Studies for Digital Library Development, edited by Milena Dobreva, Andy O’Dwyer and Pierluigi Feliciati, 95–103. London: Facet Publishing, 2012.
Case Study. Discusses using eye-tracking technology as a tool for evaluating digital library usage, focusing on data collection and analysis and using the Europeana study as a case study.
Bull, S., Craft, E., & Dodds, A. (2014). Evaluation of a Resource Discovery Service: FindIt@Bham. New Review of Academic Librarianship, 20(2), 137–166. Retrieved from http://www.tandfonline.com/doi/abs/10.1080/13614533.2014.897238#.VKX0KIrF_Qw
Research study/article. Bull et al. evaluate the University of Birmingham’s Primo discovery service, known as "Find it @Bham," a year after its launch. The methodology includes a large-scale online survey and focus groups to determine user satisfaction. The level of appreciation was compared across user groups, and recommendations for improvements and future studies were compiled. Issues of relevancy ranking were examined in particular.
Cervone, F. (2014). Evidence-based practice and web usability assessment. OCLC Systems & Services: International Digital Library Perspectives, 30(1), 11–14. doi:10.1108/OCLC-10-2013-0036
Literature Review. The author discusses the application of evidence-based practice to web usability assessment, focusing on quality control and designing for user ease.
Chapman, J. C. (2010a). DOs and DON’Ts: A Primer for User-Friendly Finding Aid Design. Journal for the Society of North Carolina Archivists, 8(1), 2–28.
Usability study. The author tackles the usability of archival collection finding aids, first reviewing basic usability principles and then examining four problems users confront in finding aids, along with tips for "tech heavy" and "tech lite" solutions. Clear layout, organization, and arguments; the usability test outcomes are applicable to a wider audience beyond archivists and finding aids. Related to the other Chapman article.
Chapman, J. C. (2010b). Observing Users: An Empirical Analysis of User Interaction with Online Finding Aids. Journal of Archival Organization, 8(1), 4–30. doi:10.1080/15332748.2010.484361
Usability study. The author discusses results from a usability study of the finding aid web displays of one collection housed in a university. Two groups of six people focused on the navigation, structure, terms, and searchability of finding aids. Related to the other Chapman article.
Dobreva, M., & Chowdhury, S. (2010). A User-Centric Evaluation of the Europeana Digital Library. In G. Chowdhury, C. Koo, & J. Hunter (Eds.), The Role of Digital Libraries in a Time of Global Change (Vol. 6102, pp. 148–157). Berlin, Heidelberg: Springer Berlin Heidelberg. doi:10.1007/978-3-642-13654-2_19
Case Study. Europeana, a digital library built to provide a single access point to European cultural heritage, pays special attention to user needs and behaviour. This paper presents user-related outcomes on the dynamics of user perception from a study involving focus groups and media labs in four European countries.
Ellis, Shaun; Callahan, M. (2012). Prototyping as a Process for Improved User Experience with Library and Archives Websites. Code4Lib Journal, (18), 1–14. Retrieved from http://journal.code4lib.org/articles/7394
Case Study. Ellis and Callahan describe the redesign of the Princeton University Finding Aids site to make the EAD components, not collections, the "atomic unit". They worked with stakeholders to create a list of desirable features. In addition to "atomizing" content, they wanted to make all previously scanned or digitized content available for viewing. For testing and gathering feedback, they used Bootstrap to create prototypes and Trello to manage feedback.
Emanuel, J. (2013). Usability testing in libraries: methods, limitations, and implications. OCLC Systems & Services: International Digital Library Perspectives, 29(4), 204–217. Retrieved from http://www.emeraldinsight.com.ezproxy.lib.purdue.edu/doi/full/10.1108/OCLC-02-2013-0009
Literature Review. This literature review argues that usability studies can serve as a research method (as well as an evaluation method) as long as certain parameters are addressed. The author reviews social science literature and discusses methodologies for usability, which other reviews do not, bridging the gap between evaluation and research.
Gueguen, G. (2010). Digitized Special Collections and Multiple User Groups. Journal of Archival Organization, 8(2), 96–109. doi:10.1080/15332748.2010.513324
Case Study. This article examines usability research to predict how such systems might effectively be used and highlights a digital library and finding aid system that utilizes a single repository of digitized objects to fuel two types of user-discovery systems: a typical digital collections interface with item-level access and a finding aid that incorporates digitized items at the aggregate level.
Hariri, N., & Norouzi, Y. (2011). Determining evaluation criteria for digital libraries’ user interface: a review. The Electronic Library, 29(5), 698–722. Retrieved from http://www.emeraldinsight.com/doi/abs/10.1108/02640471111177116
Literature Review. Reviews the literature on digital libraries (DLs) and user interfaces in order to identify, determine, and suggest evaluation criteria for DL user interfaces. The study's objectives are threefold: explore which criteria have a significant relationship with the DL user interface; identify a set of criteria that appears useful for evaluating DL user interfaces; and determine which evaluation criteria occur most frequently in the texts reviewed.
Hu, C.-P., Hu, Y., & Yan, W.-W. (2014). An empirical study of factors influencing user perception of university digital libraries in China. Library & Information Science Research, 36(3), 225–233. Retrieved from http://www.sciencedirect.com/science/article/pii/S0740818814000450
Research study/article. This study explored the different services that impact a user's perception of a digital library. The researchers grouped 27 "service elements" into five "functional service types" (information providing services, information organization services, interactive services, information retrieval services, and individual services) and used this data in a survey administered to college students in China, designed to determine whether these services correlated with positive user perceptions of service quality. Results showed that the main purpose of a digital library for college students was to retrieve scholarly works; therefore, services associated with information providing and retrieval should be optimized. Additionally, the results showed that digital library interfaces should provide space for users to comment, share, and communicate about digital objects. Finally, digital libraries should incorporate customizable features, including social network integration, microblogging capabilities, and functions that allow users to create their own library from the collections.
Joo, S. (2010). How are usability elements - efficiency, effectiveness, and satisfaction - correlated with each other in the context of digital libraries? Proceedings of the American Society for Information Science and Technology, 47(1), 1–2. Retrieved from http://doi.wiley.com/10.1002/meet.14504701323
Research study/article.
Joo, S., & Yeon Lee, J. (2011). Measuring the usability of academic digital libraries. The Electronic Library, 29(4), 523–537. Retrieved from http://www.emeraldinsight.com/doi/abs/10.1108/02640471111156777
Case Study. Describes the design of a usability evaluation instrument, specific to academic libraries, that measures efficiency, effectiveness, satisfaction, and learnability. The authors performed a literature review, developed the instrument based on it, and then validated the instrument with users, testing on the digital library at Yonsei University.
Kelly, E. J. (2014). Assessment of Digitized Library and Archives Materials: A Literature Review. Journal of Web Librarianship, 8(4), 384–403. doi:10.1080/19322909.2014.954740
Literature Review.
Khoo, M., & Hall, C. (2012). What Would "Google" Do? Users’ Mental Models of a Digital Library Search Engine. In P. Zaphiris, G. Buchanan, E. Rasmussen, & F. Loizides (Eds.), Theory and Practice of Digital Libraries (Vol. 7489, pp. 1–12). Berlin, Heidelberg: Springer Berlin Heidelberg. doi:10.1007/978-3-642-33290-6_1
Theoretical; Case study.
Klas, Claus-Peter. “Expert evaluation methods.” In User Studies for Digital Library Development, edited by Milena Dobreva, Andy O’Dwyer and Pierluigi Feliciati, 75–84. London: Facet Publishing, 2012.
Literature Review. Reviews expert evaluation, a cost-effective and valuable method for digital library evaluation, focusing on developers on one side and users on the other. The author finds there is little literature on digital library evaluation and that the practice remains ill-defined; the goal is to increase the functionality and task-solving capabilities of digital libraries.
Lown, C., Sierra, T., & Boyer, J. (2012). How Users Search the Library from a Single Search Box. College & Research Libraries, 74(3), 227–241. Retrieved from http://crl.acrl.org/content/74/3/227.short
Case Study; Field study. Log analysis/analytics of single-search-box user behavior. This article reports on two semesters' worth of data (fall 2010–spring 2011) gathered from patron/public use of NCSU Libraries' single search box. The search box, called QuickSearch, searches multiple sources of data, including the library's digital collections (in addition to the catalog, article searches, FAQs, databases, the library website, and more). The article discusses use trends overall and at different times during the academic calendar; which types of content were accessed more often; and which types of searches were not well served by the integrated search box. Although the integrated search box covered special collections digital materials, the results did not specifically discuss this material; it is interesting, though, that the thirteenth most common search term for spring 2011 was "digital repository."
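As a reading aid, here is a minimal sketch of the kind of transaction-log tally such studies perform. The log format, file name, and column names are assumptions for illustration only; this is not the NCSU QuickSearch code.

import csv
from collections import Counter

def module_clickthroughs(log_path: str) -> dict:
    """Tally which result module (catalog, articles, digital collections, ...)
    users clicked, and return each module's share of all click-throughs."""
    counts = Counter()
    with open(log_path, newline="", encoding="utf-8") as f:
        # Assumed columns: timestamp, query, module (hypothetical log schema).
        for row in csv.DictReader(f):
            counts[row["module"]] += 1
    total = sum(counts.values()) or 1  # guard against an empty log
    return {module: n / total for module, n in counts.most_common()}

# Usage (hypothetical file and output):
# module_clickthroughs("quicksearch_log.csv")
# -> {"catalog": 0.41, "articles": 0.33, "digital_collections": 0.05, ...}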
Lundrigan, C., Manuel, K., & Yan, M. (2014). “Pretty Rad”: Explorations in User Satisfaction with a Discovery Layer at Ryerson University. College & Research Libraries, crl13-514. Retrieved from http://crl.acrl.org/content/early/2014/01/17/crl13-514.short
Case Study. Survey of single-search-box user satisfaction. Like the Lown, Sierra, and Boyer study, this study focused on an integrated search service (specifically, Summon), here referred to as a web-scale discovery (WSD) service. It reports the results of a user satisfaction assessment project completed by Ryerson University Library and Archives during its September 2011 release of Summon. Questionnaires were used for the study; results indicated that students were moderately satisfied with Summon.
Mansor, Y., & Ripin, F. H. M. (2013). Usability Evaluation of Online Digital Manuscript Interface. Library Philosophy & Practice, 1–12. Retrieved from http://digitalcommons.unl.edu/libphilprac/986/
Case Study.
McKay, D., & Buchanan, G. (2013). Boxing clever. In Proceedings of the 25th Australian Computer-Human Interaction Conference on Augmentation, Application, Innovation, Collaboration - OzCHI ’13 (pp. 497–506). New York, New York, USA: ACM Press. Retrieved from http://dl.acm.org/citation.cfm?id=2541016.2541031
Research study/article. Log analysis/analytics of single-search-box user behavior, with data collected in the first semesters of 2012 and 2013 at an Australian university. The study does not report whether the search box's scope included digital library materials, but it does report typical user behavior in a single-search-box case study setting, and its results focus on measured improvement in user behavior indicating learnability. "This study, then, suggests that users are considerably more savvy and aware of effective search techniques than has been believed in the past. In light of this, it behoves us to better support users not just in supporting their native query strategies (e.g. pasting citations into the search box, and searching over multiple metadata types without search operators), but also in evaluating the large number of search results returned by web-scale search."
Mischo, W. H., Schlembach, M. C., Bishoff, J., & German, E. M. (2012). User Search Activities within an Academic Library Gateway: Implications for Web-scale Discovery Systems. In M. Popp & D. Dallis (Eds.), Planning and Implementing Resource Discovery Tools in Academic Libraries (pp. 153–173). Hershey, PA: IGI Global. doi:10.4018/978-1-4666-1821-3.ch010
Usability study. The authors examine transaction logs from the Illinois gateway, the University of Illinois' discovery layer. Drawing on 1.4 million searches over the 2010–2011 academic year, this large study tested search assistance techniques, conducted user interviews, and treated the transaction logs themselves as valuable research tools. The study covers average search terms and times, obstacles users face when searching OPACs, interface problems, and click-through rates. The article is accompanied by many visuals, making it a worthwhile reference piece and a solid contribution to the sub-group's work.
Nasreen, N., & Alawi, G. A. A. A. (2011). Impact of Digital Library and Internet Technology on Learner’s Usability and Satisfaction. In 2011 IEEE International Conference on Technology for Education (pp. 128–135). IEEE. Retrieved from http://ieeexplore.ieee.org/lpdocs/epic03/wrapper.htm?arnumber=6004372
Research study/article. Nasreen and Alawi aim "to study the relationship between digital library, internet and education in higher education". Their study focuses on use of the Berkeley Digital Library (BDL) by sixteen undergraduates who were trained for two months on using BDL correctly. The authors used three questionnaires (11 questions on experience, 8 questions on usability attributes, and 3 questions measuring satisfaction). They conclude that design affects user interaction and that digital libraries should be more user-friendly and accessible.
Nicholas, D. (2010). The Virtual Scholar: the Hard and Evidential Truth. In I. Verheul, A. M. Tammaro, & S. Witt (Eds.), Digital library futures: User perspectives and institutional strategies (pp. 23–32). New York: De Gruyter Saur.
Research study/article. Evaluates information-seeking behavior using deep log studies conducted over seven years on hundreds of thousands of scholars using ScienceDirect, Oxford Scholarship, OhioLINK, the British Library Learning Institute, and other platforms. Average users enter 2.3 search terms; there is a great deal of bouncing around; scholars from research-intensive universities have more per capita searches and spend less time per visit than those at teaching institutions; and searchers do not use advanced features.
Nicholas, David and David Clark. “Evidence of user behaviour: deep log analysis.” In User Studies for Digital Library Development, edited by Milena Dobreva, Andy O’Dwyer and Pierluigi Feliciati, 85–94. London: Facet Publishing, 2012.
Case Study. The authors present deep log analysis as a way to further understand users and their activities in digital libraries. They promote its use while also acknowledging the downsides of the methodology. The chapter delves into some technical language, but nothing out of our league.
Nichols, A., Billey, A., Spitzform, P., Stokes, A., & Tran, C. (2014). Kicking the Tires: A Usability Study of the Primo Discovery Tool. Journal of Web Librarianship, 8(2), 172–195. doi:10.1080/19322909.2014.903133
Case Study. This article measures the usability of Primo and mentions in the "Technical Challenges" section that in-house digital library objects do not display correctly in Primo.
Perrin, J. M., Clark, M., De-Leon, E., & Edgar, L. (2014). Usability Testing for Greater Impact: A Primo Case Study. Information Technology and Libraries, 33(4), 57–66. Retrieved from http://ejournals.bc.edu/ojs/index.php/ital/article/view/5174
Case Study. Members of the Texas Tech University Libraries Usability Task Force conducted a usability study on their discovery system (Ex Libris' Primo). With eight participants, the team analyzed how discoverable certain items, including digital objects, were for users. Their results showed that the time and ease needed to identify digital objects (electronic theses, digitized special collections) were comparable to those for identifying a book, and that identifying digital objects was easier and less time-consuming than identifying research articles in a database. They concluded that they could further improve the ease and speed of discovery by labeling resources according to content type (e.g., electronic thesis and dissertation) rather than by the name of the digital repository ("ThinkTech").
Prom, C. (2011). Using Web Analytics to Improve Online Access to Archival Resources. American Archivist, 74(1), 158–184.
Case Study. The author investigated the usability of an online finding aid management system, Archon, through a series of evaluations over a two-year period. The initial evaluation asked how users were getting to the Archon site and, once there, how they navigated it to identify needed resources. Based on that analysis, several strategic changes were made to how digital objects are made accessible: digital images were ingested directly into the Archon finding aid management system, and links were created to digitized documents in an outside e-records repository. Two years later, revisiting these decisions showed that finding aids with added objects and/or links were downloaded more frequently than finding aids with no associated digital objects. The digital images housed in Archon were also viewed far more than the digital documents linked from Archon to the outside repository. The author hypothesized that the digital documents were less discoverable because users had to click multiple links to reach the content, and wrote that the institution should have invested in making Archon robust enough to integrate the digital documents alongside the digital images.
Savić, D. (2014). Using Google Search Appliance (GSA) to Search Digital Library Collections: A Case Study of the INIS Collection Search. Grey Journal (TGJ), 10(3), 123–132. doi:10.4403/jlis.it-10071
Case Study. Reports experiences and lessons learned from three years of using the Google Search Appliance at the International Nuclear Information System (INIS). The paper is highly descriptive of the system implemented and does not discuss any qualitative or quantitative data gathered on usability, although it does report an increase in overall traffic to the site and in downloads from the site.
St. Jean, B., Rieh, S. Y., Yakel, E., & Markey, K. (2010). Unheard Voices: Institutional Repository End-Users. College & Research Libraries, 72(1), 21–42. Retrieved from http://crl.acrl.org/content/72/1/21.short
Research study/article. St. Jean, Rieh, Yakel, and Markey discuss the lack of research on institutional repository (IR) end users. They interviewed 20 end users to explore patterns in their perceptions, motivations, and use of an IR, as well as the level of credibility an IR holds, and they recommend ways IRs could improve their user interfaces and underlying structure to better serve end-user needs.
Symonds, E. (2011). A practical application of SurveyMonkey as a remote usability-testing tool. Library Hi Tech, 29(3), 436–445. doi:10.1108/07378831111174404
Case Study. Symonds examines the usefulness of SurveyMonkey as an assessment tool at the University of Louisville. The limitations of testing tools were mostly financial and time-related, but SurveyMonkey answered some basic redesign questions and informed some redesign decisions. Despite its drawbacks (users had to summarize their research steps, accuracy was questionable, and typing out search strategies took considerable participant effort and may have omitted unconscious decisions), the team would use it again because it worked well enough and was free.
Teruggi, D. (2010). Who are the Users of Digital Libraries, What do they Expect and Want? The Europeana Experience. In I. Verheul, A. M. Tammaro, & S. Witt (Eds.), Digital library futures: User perspectives and institutional strategies (pp. 33–40). New York: De Gruyter Saur.
Case Study. Teruggi describes how Europeana developed a user experience study using online surveys, emailed feedback, log analysis (web analytics), focus groups, user testing, and expert analysis; the majority of the article describes these feedback measures. The primary conclusion is that user analysis is an ongoing priority for Europeana and that user contributions to the website are an increasingly important factor for Europeana to consider.
Turner, N. B. (2011). Librarians Do It Differently: Comparative Usability Testing with Students and Library Staff. Journal of Web Librarianship, 5(4), 286–298. Retrieved from http://www.tandfonline.com/doi/abs/10.1080/19322909.2011.624428#.VKhYQ4rF_Qw
Research study/article. Turner examines the differences between how librarians and students search for information in an academic library setting. The article focuses mostly on discovery through the main library website rather than the digital library in particular. However, Turner does note that students looking for primary source material (most often found in the digital library) often combined the subject of their search and the desired format into one search term (for example, "erie canal primary materials" or "erie canal letters").
Wu, M., & Chen, S. (2014). Graduate students appreciate Google Scholar, but still find use for libraries. The Electronic Library, 32(3), 375–389. Retrieved from http://www.emeraldinsight.com.ezproxy.lib.purdue.edu/doi/full/10.1108/EL-08-2012-0102
Case Study. Wu and Chen analyze how graduate students perceive and use Google Scholar. Only minor mentions are made of institutional repositories and how they are underrepresented in Google Scholar; digital libraries are not mentioned.
Zhang, T., Maron, D. J., & Charles, C. C. (2013). Usability Evaluation of a Research Repository and Collaboration Web Site. Journal of Web Librarianship, 7(1), 58–82. Retrieved from http://www.tandfonline.com/doi/abs/10.1080/19322909.2013.739041#.VKYDYYrF_Qw
Case Study. Zhang, Maron, and Charles evaluated the usability of a discipline-specific academic research repository from the perspectives of the end user and the contributor. The study highlights the need to analyze contributor workflows for inputting content into a repository, as well as the potential for collaborative workspaces in research and institutional repositories.
quantitative:
Zimmerman, Don and Paschal, Dawn Bastian. (2009). An Exploratory Usability Evaluation of Colorado State University Libraries’ Digital Collections and the Western Waters Digital Library Web Sites. The Journal of Academic Librarianship, 35(3), 227–240.
Case Study. Criteria: ease of use during searching, and participants' perceptions of the web site and its ease of use.
Collection Guides (“Finding Aids”) Usability Study - Novice User Group, Report of the Findings (2010)
Hong, W., Thong, J., Wong, W., and Tam, K. (2002). Determinants of User Acceptance of Digital Libraries: An Empirical Examination of Individual Differences and System Characteristics. Journal of Management Information Systems, Winter 2001–2002, 18(3), 97–124.
Theoretical. Uses the Technology Acceptance Model (TAM), which explains IS adoption behavior, to identify critical external variables that significantly affect users' intention to use digital libraries (see the definitions of perceived usefulness and perceived ease of use in the tagging dictionary). Individual differences include "computer self-efficacy" and "knowledge of search domain"; system characteristics include relevance, terminology, and screen design. All five variables affect perceived ease of use; the three system characteristics, together with perceived ease of use, affect perceived usefulness; and the two perceptions in turn drive behavioral intention.
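As a reading aid only (a sketch of the hypothesized paths as summarized above, not the authors' fitted model), the structure can be written as:

\[ \mathrm{PEOU} = f(\mathrm{CSE},\ \mathrm{KSD},\ \mathrm{REL},\ \mathrm{TERM},\ \mathrm{SD}) \]
\[ \mathrm{PU} = g(\mathrm{PEOU},\ \mathrm{REL},\ \mathrm{TERM},\ \mathrm{SD}) \]
\[ \mathrm{BI} = h(\mathrm{PU},\ \mathrm{PEOU}) \]

where CSE is computer self-efficacy, KSD is knowledge of search domain, REL is relevance, TERM is terminology, SD is screen design, PEOU is perceived ease of use, PU is perceived usefulness, and BI is behavioral intention.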
Tsakonas, G. and Papatheodorou, C. (2007). Exploring usefulness and usability in the evaluation of open access digital libraries. Information Processing and Management, 44 (2008), 1234–1250. doi:10.1016/j.ipm.2007.07.008
Theoretical. Provides a triangular framework of interactions between system, content, and user. Between system and user is usability, consisting of ease of use, aesthetics, navigation, terminology, and learnability. Between user and content is usefulness, consisting of relevance, format, reliability, level, and coverage. Between content and system is performance, consisting of precision, recall, relevance, and response time.
Kramer, Elsa F. (2005). "IUPUI Image Collection: A Usability Survey" in Digital Library Usability Studies (Bradford, GBR: Emerald Group Publishing, 2005), 346–359.
Case Study.
Lessons Learned: A Primo Usability Study 2016
Leveraging Encoded Archival Description for Access to Digital Content: A Cost and Usability Analysis 2012
OAC First Round Usability Test Findings 2008
OAC Second Round Usability Test Findings 2009
Seek and You May Find: Successful Search in Online Finding Aid Systems 2010
Shape and usability of an educational resource on purpose: Digital Library in Health Promotion (2014; a Google Translate version of the Spanish article, “Forma y Usabilidad,” is also included)
University of Oregon, Orbis Cascade Alliance, Northwest Digital Archives Final Narrative Report, LG-51-08-0164-08 (see usability study methodology) 2010
Usability Issues of Faceted Navigation in Digital Libraries 2014
Usability of Online Archival Resources: The Polaris Project Finding Aid 2001
Usability Testing of Digital Libraries: the Experience of Eprints 2014
What Would Users Do? An Empirical Analysis of User Interaction with Online Finding Aids 2009
H. Rex Hartson, Priya Shivakumar, Manuel A. Pérez-Quiñones. "Usability inspection of digital libraries: a case study." International Journal on Digital Libraries (2004) 4:2, 108–123.
Case Study. Analysis of a single system by the three evaluators themselves; the focus is primarily on the interface, with deep consideration of the cost to implement repairs.
qualitative:
Appendices of A Multi-Dimensional Framework for Academic Support: A Final Report
Design and process of a contextual study of information literacy: An Eisenhardt approach
First Entry: Report on a Qualitative Exploratory Study of Novice User Experience with Online Finding Aids 2006
Historians and the Use of Primary Source Materials in the Digital Age
How Students Research: Implications for the Library and Faculty
Lessons Learned: How College Students Seek Information in the Digital Age 2009
Modes of Seeing: Digitized Photographic Archives and the Experienced User (2010)
Observing Student Researchers In Their Native Habitat
Researchers at Work: Assessing Needs for Content and Presentation of Archival Materials 2016
Some basic issues of diversity: a contextual inquiry.
Studying Students: The Undergraduate Research Project at the University of Rochester
The Next Generation of Academics: A Report on a Study Conducted at the University of Rochester (2008)
Universal methods of design: 100 ways to research complex problems, develop innovative ideas, and design effective solutions
Using Contextual Inquiry to Build a Better Intranet
What Humanists Want: How Scholars use Source Materials (2010)
concepts:
An Exploratory Study into Determining the Relative Importance of Key Criteria in Web Usability: a Multi-Criteria Approach (2008)
Petras, V., Stiller, J. and Gäde, M. (2013). Building for Success? Evaluating Digital Libraries in the Cultural Heritage Domain. In Recent Developments in the Design, Construction and Evaluation of Digital Libraries: Case Studies (pp. 141–161). Hershey, PA: IGI Global.
Literature Review. Lists several frameworks for evaluating digital libraries and compares measures used by Europeana, The European Library, Multimatch, and CACAO with regard to system-centric versus user-centric evaluation. Divides user groups into general users, cultural heritage professionals, educational users, and tourist users. Divides system components and interaction patterns into content and internal organization, management and maintenance, and user access and interaction, drawing these components from Saracevic's (2001) 21 elements or constructs of digital library systems. User-centric evaluation is composed of usability and user satisfaction, and methods of data collection are compared; system-centric evaluation centers on information retrieval experiments.
Consolidating the ISO Usability Models
Theoretical.
Developing a Holistic Model for Digital Library Evaluation (2010)
Theoretical.
Evaluating Digital Libraries: A User-Friendly Guide (2005) -- very basic and general overview
Buchanan, Steven and Salako, Adeola. (2009). "Evaluating the Usability and Usefulness of a Digital Library". Library Review, 58(9), 638–651.
Case Study.
Banati, Hema; Bedi, Punam; and Grover, P.S. (2006). "Evaluating Web Usability from the User’s Perspective." Journal of Computer Science, 2(4), 314–317.
Theoretical. The authors review existing criteria for usability, add some of their own, and propose a formula for weighting the factors (illustrated below). For them, the criteria for usability include efficiency, effectiveness, learnability, memorability, appearance of the site, emotional satisfaction, work satisfaction, state of features, and trustworthiness of the site.
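The authors' exact weighting formula is not reproduced in this synopsis; as an illustration only, a generic weighted usability score over criterion scores \(c_i\) with weights \(w_i\) takes the form

\[ U = \frac{\sum_{i=1}^{n} w_i c_i}{\sum_{i=1}^{n} w_i} \]

so that a site scoring well on heavily weighted criteria (say, efficiency and trustworthiness) rates highly overall even if lightly weighted criteria lag.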
Joo, S. and Xie, I. (2013). Evaluation Constructs and Criteria for Digital Libraries: A Document Analysis. In Recent Developments in the Design, Construction and Evaluation of Digital Libraries: Case Studies (pp. 126–140). Hershey, PA: IGI Global.
Theoretical. Provides detailed criteria for the following constructs: collection, information organization, interface design, system performance, effects on users, services, preservation, administration/sustainability, user engagement, and context.
Saracevic, Tefko. (2004). Evaluation of Digital Libraries: An Overview. https://comminfo.rutgers.edu/~tefko/DL_evaluation_Delos.pdf (accessed 5/7/16)
Literature Review. The purpose of this report is to provide an overview of works on digital library evaluation that included data. The approach was first to analyze and then isolate (or even deconstruct) some 80 evaluation studies along the lines of: 1. constructs that were evaluated; 2. contexts in which evaluations were conducted; 3. criteria chosen as a basis for evaluation; and 4. methods used. For each of these lines of analysis, the approaches taken in evaluation studies are identified, enumerated, and approximately ranked: a list of constructs evaluated as to entities or processes is given; contexts or approaches taken in evaluation are enumerated; the numerous criteria used as a basis of evaluation are classified; and finally the methodologies used are identified, listed, and discussed. Evaluation of digital libraries is not widely practiced; the corpus in this analysis represents the majority of efforts in digital library evaluation that contain data. The conclusions, among other things, speculate as to the reasons for the relatively low presence of evaluation in digital library research and practice.
From digital libraries to digital preservation research: the importance of users and context (2009)
Measuring the User Experience: Collecting, Analyzing, and Presenting Usability Metrics by Tom Tullis and Bill Albert. (Outline of book with notes about useful info included; online version of book: http://www.sciencedirect.com/science/book/9780123735584)
Understanding User Acceptance of Digital Libraries: What are the roles of interface characteristics, organizational context, and individual differences? (2002)
Usability Assessment of Academic Digital Libraries: Effectiveness, Efficiency, Satisfaction, and Learnability (2005)
Usability of Digital Libraries: A Study based on the areas of information science and human-computer interaction (2005)
User’s Views about the Usability of Digital Libraries (2005)
Users’ Evaluation of Digital Libraries (DLs): Their Uses, Their Criteria, and Their Assessment (2007)
What is Usability in the Context of the Digital Library and How Can It Be Measured? (2005)
Varghese, R. (2008) User Studies in the electronic environment: Review and brief analysis. International Information and Library Review 2008 40(2):83-93. doi:10.1016/j.iilr.2008.02.002