Where’s the EASY Button? Uncovering E-Book Usability

Kat Landry Mueller, Zachary Valdes, Erin Owens, Cole Williamson

Abstract


E-book platforms have multiplied among vendors and publishers, complicating not only acquisitions and collection development decisions but also the user experience. Using a methodology of task-based user testing, the researchers sought to measure and compare user performance on eight common tasks across nine e-book platforms: EBSCO eBooks, ProQuest Ebook Central, Gale Virtual Reference Library (GVRL), Oxford Reference, Safari Books Online, IGI Global, CRCnetBASE, Springer Link, and JSTOR. Success and failure rates per task, average time spent per task, and user comments were evaluated to gauge the usability of each platform. Findings indicate that platforms vary widely in users' ability and speed in completing known-item searches, navigation tasks, and identification of specialized tools, with implications for library acquisition and user instruction decisions. Results also suggest several key vendor design recommendations for an optimal user experience. The study did not aim to declare a "winning" platform, and every platform tested demonstrated both strengths and weaknesses; overall, however, performance and user preference favored ProQuest's Ebook Central platform.



DOI: https://doi.org/10.5860/rusq.59.1.7224


