RUSQ, Vol. 53, Issue 3, p. 253
The Website Design and Usability of US Academic and Public Libraries: Findings from a Nationwide Study
Anthony S. Chow, Michelle Bridges, Patricia Commander

Anthony S. Chow (aschow@uncg.edu) is Assistant Professor, Department of Library and Information Studies, The University of North Carolina at Greensboro School of Education. Michelle Bridges (michellek.bridges@cms.k12.nc.us) is Media Specialist, McClintock Middle School, Charlotte, North Carolina. Patricia Commander (mackpm@wssu.edu) is Health Sciences Librarian, Winston-Salem State University, Winston-Salem, North Carolina.

This paper describes the results of a nationwide study that examined the design, layout, content, site management, and usability of 1,469 academic and public library websites from all 50 states in the United States. Our findings show common trends in homepage design, navigation, and information architecture. Library websites were found to consistently provide information about hours of operation (97.9 percent), library address (91 percent), news and events (88.9 percent), access to OPACs (84.6 percent), online renewal (77.7 percent), contact information (72.5 percent), and the ability to give feedback (74.2 percent). Websites were mainly designed (33 percent) and managed (50 percent) by librarians as part of their professional job duties, and the majority of libraries did not conduct any web usability testing (72.3 percent). This study provides a profile of how the nation’s academic and public libraries design and manage their websites and how this compares with recommended best practices from the research literature. Library websites rated high in general usability based on recognized heuristics; however, a need to conduct usability evaluations remains. A basic set of guidelines for library webpage design is proposed.

Library websites are essential in a variety of ways. They are the public face of the institution. They are a nexus of information provision and access. They are often the first and only place users go for information and the only way library services are used by virtual patrons who never physically visit the library.

The relationship between user and website, however, is extremely fickle. Library websites need to be easily navigable, with obvious signposts that quickly lead users to the information they need. Websites have as little as 25–35 seconds to convince users that the information they are looking for is available.1 Users quickly scan a webpage to determine whether it has what they need: Can this site answer my question? If so, where is the answer I am looking for? Can I find it with minimal mental effort while having my question answered with maximum effectiveness and satisfaction?

This study examines library web design. Academic and public libraries across the United States were surveyed about their website content, design, and maintenance through the lens of recommended website guidelines. In addition, the study evaluated a random sample of library websites from all fifty states on their design features and adherence to usability standards and compared these results with the survey responses.


Literature Review
Human-Computer Interaction (HCI) and Usability

Designing technology solutions to be “user friendly” for a wide array of people with different technology skills is a difficult task. Interest in designing computers around the needs and abilities of their human users became an important topic with Licklider’s (1960) paper “Man-Computer Symbiosis,” which called for heightened awareness of the relationship between design and user.2 During the late 1970s and early 1980s, the introduction of IBM’s first personal computer ushered in a new era in which novice users had access to computers.3 Many of these users had problems with the new technology, and designing products to be easier to use became a priority; companies quickly realized, however, that most programmers and engineers were not very effective at designing technology for the novice user. Thus, the field of human-computer interaction (HCI) emerged.

According to Eason, HCI specifically seeks to address six factors in human-computer interaction: safety, utility, effectiveness, efficiency, usability, and appeal.4 A central tenet of designing technology interfaces within an HCI context is that the people who will be using them must be consulted from the very beginning. This approach is referred to as user-centered design (UCD), which is “the practice of creating engaging, efficient user experiences” and places the human user as the starting point for designing effective technology solutions.5 UCD requires a systematic process of analysis, design, and development that involves iterative testing with representative users at each phase. Although many developers tend to think of effective web design in aesthetic terms, the functionality of a website’s interface design and information architecture are equally important and need to be specifically designed for a targeted group of users.6

What is Usability?

The degree to which users seeking information find a website relevant and easy to use reflects the site’s general usability; the International Organization for Standardization (ISO) formally defines usability as the “extent to which the product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use.”7 Applied to web design, usability treats a website as a software development project rather than relying on a mere “intuitive determination of user friendliness.”8 As Nielsen points out, “Usability allows us to make everyday life more satisfying by empowering people to control their destiny and their technology rather than be subjugated by computers.”9

According to Nielsen, methods for user testing were well established in 1983 by John Gould and Clayton Lewis, who held that there were three primary principles of usability:

  1. Establish an early focus on users and run field studies before starting any design work.
  2. Conduct empirical usability studies throughout development.
  3. Use an iterative design process.10

Information Seeking and Website Usability

Theories of information seeking behavior serve as an appropriate window into understanding the information needs of users on the web. Taylor defined the information need of users as a “vague sense of dissatisfaction” wherein their active cognitive state is troubled by a “certain incompleteness in his picture of the world.”11 Belkin, Oddy, and Brooks refer to this information need as an Anomalous State of Knowledge (ASK), which begins a negotiation process between user and information system; users actively begin seeking information in what Dervin refers to as a “sense making” process where “knowledge is the sense made at a particular point in time-space by someone.”12

Pirolli and Card developed information foraging theory to describe information seeking using a hunting analogy: information needs create a hunger that starts a cognitive hunt. Their behavioral approach to information seeking is referred to as the adaptive control of thought in information foraging (ACT-IF) model, in which humans are informavores constantly on the hunt for the information they seek.13 Morville and Rosenfeld believe the information architecture of a website should be developed with clear and strong information scents so users can quickly identify the right pathways to find what they are looking for.14

According to Lazar, usability is much more important for websites because they are different from traditional information systems: “Because the user may access a Web site infrequently, the site must be easy to use each time it’s accessed. Web users must be able to figure out immediately how to use an interface. If previous knowledge is required, the interface is confusing, information is not easy to find, or the user has to ask for outside assistance, he or she might visit another Web site because there is virtually no cost involved in switching.”15

The web is designed for user control and autonomy. Nielsen states, “The original ideology of hypertext and the World Wide Web, as expressed by Vannevar Bush (1945), Ted Nelson (1960), and Tim Berners-Lee (1991) makes individual users the masters of the content and lets them access and manipulate it in any way they please.”16

Nielsen argues that site designers must always remember what he calls Jakob’s Law of the Internet User Experience: “users spend most of their time on other websites.” His research has found patterns in user behavior in specific domains (e.g., investors and financial analysts), and he believes it is possible to “derive high-level design patterns for other domains as well. Such patterns must both retain sufficient flexibility and give users a sense of consistency and mastery in the things that matter.”17

Burnett and Erdelez found that researchers in the field continue to make an impassioned plea that “we remember the centrality of the user in information provision”; it is important to recognize the relationship between the physical and architectural design of a system and its impact on how users seek information. The visual cues offered to users explicitly influence what they perceive to be available and whether it matches the information they are looking for.18 Richardson suggests that there are actually three interrelated elements necessary for successful provision of reference services to users: information resources, information technology, and users.19

Website Usability Evaluation

While usability as a concept can be complex to understand, apply, and implement, it can be distilled into three elegantly simple processes:

  1. Identify and engage representative users as design partners from the very start of the project.
  2. Test iteratively at all design and development stages, including paper prototypes that detail only the information architecture and mockups of the projected digital environment.
  3. Continuously improve, refine, and collect representative user feedback.20

Jordan defined two categories of usability evaluation and testing: empirical (with representative users) and nonempirical (without representative users). Empirical usability testing methods include focus groups, surveys, interviews, and usability tests with well-defined metrics (qualitative and quantitative data) and performance tasks (e.g., task analysis, task completion, and time to complete tasks). Nonempirical methods include creating feature checklists (what features are most important?), task analysis (what tasks are most important?), and cognitive walkthroughs (what are the most efficient pathways to information?).21 Hornbaek approaches this dichotomy as the difference between usability evaluations that represent user perceptions, which are subjective, and those that do not, which are considered objective.22

Alshamari and Mayhew classified usability evaluations into four types: model- or metrics-based (using a model to generate usability measures), inquiry (communicating with users to derive insights into usability problems), inspection (having experts use the interface to find problems), and testing (collecting data by having users attempt to complete tasks).23 Furthermore, they contextualize general usability evaluation as a complex interaction among six interrelated factors: evaluator, users, tasks, reporting, test environment, and prioritizing the problems found.24

Nielsen’s five-user principle suggests that 85 percent of usability problems can be found by testing with only five users, and this has been supported by multiple research studies.25 This principle, based on a combination of probability theory and empirical evidence, has nevertheless been found to have significant caveats depending on the unique contexts and differences of individual users.26 According to Alshamari and Mayhew, “it can now be concluded that if the website has different types of users, it is vital to consider user numbers and their characteristics seriously.”27
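The probability model behind the five-user heuristic can be illustrated with a short calculation. The following is a minimal sketch, assuming the commonly cited average problem-discovery rate of roughly 0.31 per user; real values vary by project and user group, which is precisely the caveat raised above.

```python
# Proportion of usability problems expected to be found by n test users,
# following the classic discovery model P(n) = 1 - (1 - L)^n.
# L (the chance that a single user exposes a given problem) is assumed
# here to be 0.31, the average value commonly cited by Nielsen.

def proportion_found(n_users: int, discovery_rate: float = 0.31) -> float:
    return 1 - (1 - discovery_rate) ** n_users

if __name__ == "__main__":
    for n in (1, 3, 5, 10, 15):
        print(f"{n:2d} users -> {proportion_found(n):.0%} of problems found")
    # With L = 0.31, five users uncover roughly 84 percent of problems,
    # in line with the 85 percent figure quoted in the text.
```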

Hornbaek examined 180 usability studies published in the literature and classified their usability measures using the ISO 9241 definition of usability: effectiveness, efficiency, and satisfaction.28 He found six major issues with current usability evaluations:

  1. Measures of the quality of interaction (assessed by experts) are used in only a few studies.
  2. Approximately one quarter of the studies do not assess the outcome of the users’ interaction with the system they are testing.
  3. Measures of learning and retention are hardly employed.
  4. Some studies treat measures of how users interact with interfaces as being synonymous with quality-in-use.
  5. Measures of user satisfaction with interfaces are in disarray, and most studies ignore the validated questionnaires that are available.
  6. Some studies mix together users’ perceptions of phenomena with objective measures of the phenomena.29

Library Website Design and Usability

Many studies in the literature describe usability testing and research on the website of a single institution; broad studies covering many public and academic libraries, however, are not common. Liu examined 111 Association of Research Libraries (ARL) websites and found that the sites tended to include search, resources by subject or subject guides, an about section, library services, site search, ask-a-librarian, news or events, and contact information. Common design patterns included columns organized by category and mouse-over links with sidebars.30 Solomon conducted a survey of public library websites in Ohio using a checklist of 61 usability guidelines, features, and content. She found that only 35 of the 211 websites surveyed met 80 percent of her criteria and noted that important features such as privacy policies, site searches, and feedback mechanisms were missing.31

Additionally, usability studies have shown that designing websites according to usability guidelines is important because “patrons who cannot successfully complete specific tasks may not revisit the site.”32 In a survey of web developers in academic libraries, Connell found that only 46.8 percent had conducted usability testing of any kind on their websites.33 Chen, Germain, and Yang found that 49 percent of Association of Research Libraries member libraries had web usability policies, standards, or guidelines, and that 85 percent of those libraries had conducted usability testing on some part of their website.34

Shieh and Liu noted that the quality of the information architecture greatly influences the user’s experience and satisfaction with a library website.35 King suggests that one must first envision a site as a business with information being the product: “Usability studies play a vital role in making sure library users can find information on your Web site quickly and accurately.”36

Further studies offered recommendations for what a library website should look like. Liu found that “the universe of information presented on academic library homepages still focuses on library functions, requires numerous pathways for access, has overwhelming options, and takes a ‘one-design-for-all’ approach that fails to recognize users as individuals.”37 Liu recommended reducing the intimidating appearance of library homepages by “employing an appealing graphical design that accommodates usability and accessibility requirements.”38 Other researchers, such as Poll, created lists of the main topics users expect to find on library websites and summarized the main checklist points for a high-quality library website: language adequate to the population served, clear structure, options for different user groups, up-to-date information, and short, concise information.39

A comprehensive review of the literature revealed, however, that no large-scale study had been conducted to determine the current design and usability of academic and public library websites. Unanswered questions such as “Did sites follow a consistent design?,” “Did they conform to website design guidelines?,” “What content did they possess?,” and “Who designed and maintained them?” helped inform the study’s five research questions:

  • RQ1: What is a standard design layout for academic and public library websites?
  • RQ2: What are the common features and content academic and public library websites include?
  • RQ3: Who designs and maintains academic and public library websites?
  • RQ4: To what extent do academic and public library websites adhere to recommended design guidelines?
  • RQ5: What is the general usability of library websites?


Method

The study used both a library website usability checklist (LWUC), which was designed for the study and derived from the literature to empirically evaluate randomly selected websites, and an online survey, which collected self-reports from academic and public libraries across the United States. In total, 1,469 websites were analyzed in the study, which provides a 99 percent confidence level with a margin of error of ±4 percent.40

The Library Website Usability Checklist (N = 203)

The Library Website Usability Checklist (LWUC) is a website evaluation tool designed specifically for the study. It contains 67 questions divided into five discrete sections: site information, recommended website features, content, feature placement, and recommended information architecture and usability factors. Section 1 collected general information about the site being evaluated: library name, URL, and webmaster email address. Section 2 comprised 19 recommended website features from the extant literature (e.g., navigation, search tools, and graphic design elements) adopted from three studies: Raward; Solomon; and Neal and Herzig.41 Section 3 was a checklist of 28 types of library web content recommended by previous research (e.g., location information, a link to the online public access catalog [OPAC], and circulation information) adapted from five studies: Raward; Solomon; Neal and Herzig; Poll; and Duncan and Holliday.42

Section 4 sought information about the location and placement of five standard web features and content: library name and logo, search box, main navigation tools, library location information, and library contact information. Section 5 addressed factors of information architecture and usability identified by Morville and Rosenfeld, who describe nine essential questions they believe a quality homepage should answer.43 Usability was assessed for each site by attempting to measure the degree of effectiveness, efficiency, and satisfaction based on whether each member of the research team was able to answer the nine questions.44 Effectiveness was further broken down into task completion and quality of output, while efficiency was more specifically defined as deviations from the critical path, error rate, time-on-task, and mental effort. Each of these factors was rated on a scale from 1 to 10.
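The way these individual ratings roll up into summary scores can be sketched briefly. The following is a minimal illustration, assuming a simple unweighted mean at each level, which is consistent with the values later reported in table 8.

```python
# Minimal sketch of how the 1-10 ratings roll up into usability scores:
# effectiveness averages two sub-ratings, efficiency averages four, and
# the grand rating averages the three dimensions. The example values are
# the study's reported means (see table 8).
from statistics import mean

effectiveness = mean([9.26, 8.90])           # task completion, quality of output
efficiency = mean([9.10, 9.11, 8.96, 8.99])  # critical-path deviations, error rate,
                                             # time-on-task, mental effort
satisfaction = 8.64                          # rated directly

grand_usability = mean([effectiveness, efficiency, satisfaction])
print(round(effectiveness, 2), round(efficiency, 2), round(grand_usability, 2))
# -> 9.08 9.04 8.92, matching the values reported in table 8
```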

To increase inter-rater reliability and protect against threats to face and construct validity, three members of the research team tested the checklist on the same website and then compared responses for each question. By comparing ratings for each question, each member was able to clarify a common understanding of each question and increase precision in terms of what specific website features were appropriate to count for each checklist item. The process was repeated with four additional library websites until inter-rater ratings were 95 percent similar for subjective questions and within 1 point for scale items.
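A brief sketch of this kind of agreement check may help clarify the procedure; the item ratings below are hypothetical, and the two thresholds (matching answers for categorical items, within 1 point for scale items) are taken from the description above.

```python
# Hypothetical illustration of the inter-rater check described above:
# categorical (yes/no) checklist items count as agreement when all raters
# match; 1-10 scale items count when ratings fall within 1 point.

def categorical_agreement(ratings_by_item):
    """Share of items on which every rater gave the same answer."""
    agree = sum(1 for ratings in ratings_by_item if len(set(ratings)) == 1)
    return agree / len(ratings_by_item)

def scale_agreement(ratings_by_item, tolerance=1):
    """Share of scale items where all ratings fall within `tolerance` points."""
    agree = sum(1 for r in ratings_by_item if max(r) - min(r) <= tolerance)
    return agree / len(ratings_by_item)

# Hypothetical ratings from three evaluators on four checklist items
yes_no_items = [("yes", "yes", "yes"), ("no", "no", "yes"),
                ("yes", "yes", "yes"), ("no", "no", "no")]
scale_items = [(8, 9, 8), (7, 9, 8), (10, 10, 9), (6, 6, 7)]

print(categorical_agreement(yes_no_items))  # 0.75
print(scale_agreement(scale_items))         # 0.75 (the 7, 9, 8 item spans 2 points)
```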

Sampling and Sampling Frame

The researchers randomly selected four library websites from each state and the District of Columbia. Websites were stratified into four categories, with one of each selected per state: rural public libraries (serving populations of 25,000–50,000), urban public libraries (populations of 50,000 or more), private academic libraries, and public academic libraries. See table 1.

Public library websites were identified using the Public Libraries (www.publiclibraries.com) and Library Sites (www.librarysites.info) websites. If a selected city did not meet the needed stratification of either urban or rural, the selection was repeated. The process continued in this manner until one urban library and one rural library were selected for each state.

Academic libraries were randomly selected from a Microsoft Excel spreadsheet downloaded from the Carnegie Foundation website that lists every private and public university in each state and in Washington, DC. For the purposes of viability, only four-year institutions awarding bachelor’s, master’s, or doctoral degrees were included in the sample. This process was repeated until one private and one public academic library were selected for each state plus Washington, DC.
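For illustration, a stratified selection of this kind might look like the following sketch. The file and column names (State, Control, Level, Name) are hypothetical placeholders rather than the Carnegie list’s actual fields, and the research team’s exact procedure may have differed.

```python
# Sketch of stratified random selection of academic libraries: one private
# and one public four-year institution per state (plus DC). All file and
# column names below are hypothetical placeholders.
import pandas as pd

institutions = pd.read_excel("carnegie_institutions.xlsx")  # hypothetical file

# Keep only four-year, degree-granting institutions
four_year = institutions[institutions["Level"] == "4-year"]

# Draw one institution per (state, public/private control) stratum
sample = (
    four_year
    .groupby(["State", "Control"], group_keys=False)  # Control: "Public" / "Private"
    .apply(lambda g: g.sample(n=1, random_state=42))
)
print(sample[["State", "Control", "Name"]])
```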

Over a two-month period, 203 library websites (4 libraries for each state plus 3 libraries from Washington, DC) were evaluated using the LWUC evaluation tool by a member of the research team.

The Library Website Survey (N = 1,266)

The Library Website Survey (LWS) was emailed to 9,000 academic and public libraries across the United States and yielded 1,266 respondents, a response rate of 14.1 percent. The email distribution list was compiled from each state’s state library website and the American Library Directory. This represents a valid sample of the nation’s libraries at a 99 percent confidence level with a sampling error of ±3.4 percent.45
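The reported sampling error can be approximated with the standard margin-of-error formula and a finite population correction. This is a sketch only: it assumes maximum variability (p = 0.5) and uses the roughly 9,000 libraries contacted as the population size, whereas the calculator cited by the authors may make different assumptions.

```python
# Margin of error at a 99% confidence level for n respondents drawn from a
# population of N libraries, assuming maximum variability (p = 0.5) and
# applying a finite population correction. N = 9,000 is taken from the
# number of libraries emailed; this is an approximation for illustration.
import math

def margin_of_error(n, N, z=2.576, p=0.5):
    standard_error = math.sqrt(p * (1 - p) / n)
    fpc = math.sqrt((N - n) / (N - 1))  # finite population correction
    return z * standard_error * fpc

print(f"{margin_of_error(1266, 9000):.1%}")  # roughly 3.4%, as reported
```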

The survey was a revised version of the LWUC evaluation tool, with most of the sections condensed. The instrument comprised 44 questions broken down into five sections: general information (4 questions), web design and management (5 questions), feature checklist (5 questions), content (22 questions), and page location and placement (8 questions). Additional questions asked libraries for general demographic information, such as library type, size of user population, and categorization of the library (urban or rural, public or private), as well as information about how their website was designed and managed.

Participants

There were 1,266 participants who completed at least a portion of the Library Website Survey (LWS), with 1,016 respondents (79.6 percent) completing it in full. User population varied greatly among respondents. The two largest groups of respondents served either a population of fewer than 2,500 users (24.7 percent) or a population of 10,000–35,000 users (24.6 percent). More than three-quarters (76.9 percent) of responding libraries were public libraries, while 23.1 percent were academic libraries. Of the public libraries, more than two-thirds (69.9 percent) were rural libraries. Of the 331 academic libraries that answered the survey, 48.3 percent were private institutions and 51.7 percent were public institutions (see figure 1).


Results
Homepage Design for Academic and Public Libraries

There was a high degree of agreement between library responses and researcher evaluations on homepage design elements. Main navigation was located at the top center (38.4 percent) or left side (36.3 percent) of the page. Surprisingly, 37.8 percent reported not having a search feature available on the homepage, and 30 percent of search tools were located at the top right of the page. Library names and logos were located either at the top center (45.4 percent) or top left (43.3 percent) of the page. The location of library contact information was the most varied: the largest share was found at the bottom center of the homepage (21.6 percent), followed by not on the homepage at all (16.2 percent), top center, side left, top right, and the center of the page. Library location information, such as address and directions, was either not present (21.4 percent) or located predominately at the bottom center (18.7 percent) or top center (15.3 percent) of the page.

Our randomly selected, independent evaluations aligned well with the survey results. Table 2 indicates the most frequently selected location for each web element in both the surveys and the evaluations, along with the overall average of the two.

Figure 2 shows the homepage design trends graphically.

Website Features

The library websites evaluated using our Library Website Usability Checklist made a positive first impression: 86 percent of academic library websites and 73 percent of all public library websites were rated favorably. In terms of overall features offered to their patrons, however, public library websites appeared to offer more features and services than academic library websites (see table 3).

Many features were available on both academic and public library websites; there were, however, some differences. Academic libraries were more likely to have a search feature, consistent fonts, high contrast between text and backgrounds, a sitemap, and a clean, uncluttered design. Public libraries were more likely to have a library tag line, ability to resize text, and the option to choose another language.

Website Content

Library websites were found to contain contact information, such as a general phone number and email address, on a remarkably consistent basis between library surveys (98.1 percent) and researcher evaluations (96.5 percent). Other frequently found information included contact information for key staff (72.5 percent of surveys, 71.3 percent of evaluations) and location information, such as an address, map, or directions (92.4 percent of surveys, 91.0 percent of evaluations). Almost all library websites posted opening times or library hours (97.9 percent of surveys, 97.5 percent of evaluations), while only around half included a creation or copyright date (41.8 percent of surveys, 59.2 percent of evaluations).

Library websites also allowed patrons to make comments or suggestions about the library (69.0 percent of surveys, 59.4 percent of evaluations) or to give feedback about the website (74.2 percent survey, 93.5 percent evaluation); half (50.8 percent) of the library websites evaluated also included frequently asked questions (FAQs). Links to the online public access catalog (OPAC) were also common (84.6 percent survey, 97.0 percent evaluation), while search tips for how to use them were less common (54 percent survey, 65 percent evaluation). Most library websites contained circulation information (79.3 percent survey, 93.5 percent evaluation) and library policy information (76.8 percent survey, 89.0 percent evaluation).

Most library websites also allowed users to access personal accounts (94.5 percent evaluation), allowed them to renew books or materials online (77.7 percent survey, 81.0 percent evaluation), and gave access to electronic resources, such as databases, online reference sources, or e-books (90.4 percent survey, 96.5 percent evaluation).

For academic library websites specifically, user access to course reserves was prevalent (92.9 percent), and 84.3 percent of both public and academic library websites offered patrons information about interlibrary loans (ILL). Public library websites consistently provided information about services for children and teenagers (84.2 percent survey, 91.1 percent evaluation) and information about branch libraries (80.4 percent) as well.

Other information found on most websites included library news and events (88.9 percent survey, 83.4 percent evaluation), a description of library services or a link to their library services (87.7 percent survey, 84.3 percent evaluation), and an “About Us” section (74.4 percent survey, 81.1 percent evaluation).

Using the average of the library responses and researcher evaluations, table 4 lists the content found on 80 percent or more of library websites.

Table 5 lists the content found on between 50 and 80 percent of library websites.

Table 6 lists the recommended content areas that were found on less than half of the websites, with particular deficits in the use of Web 2.0 tools, the ability to resize text, a site map, and noting when the site was last updated.

Web Design and Management, Information Architecture, and Usability

Most library websites appear to be managed by a librarian as part of his or her job (50.0 percent). The next largest group was the “other” category (18.4 percent) that included volunteers, state libraries, or committees.

Similarly, library websites are often designed by a librarian who does it as part of his or her job (33.0 percent), followed again by “other” (28.4 percent), and an outside company (21.2 percent). See figure 3.

More than half of the library websites (54.3 percent) had been designed or redesigned within the last two years and were updated either daily (56 percent) or weekly (37.0 percent). See figure 4.

Usability testing was not a high priority for most library websites as 72 percent reported that they did not conduct usability testing when designing their current website. A minority (18.9 percent) reported conducting usability testing before launching a new site, 8.3 percent did so during the launch of a new site, and 10.0 percent of respondents conducted usability tests after the launch of a new website.

Usability Evaluations

Morville and Rosenfeld held that well-designed homepages should allow users to answer nine questions.46 The randomly selected library websites evaluated during the study rated well, as six of the nine questions were answered successfully more than 80 percent of the time. The three questions that were not answered as successfully focused on establishing the uniqueness of the library, locating a search box, and providing feedback. The nine questions are sorted by success rate in table 7.

Answering the nine questions represented a preliminary usability test of the library website homepages that allowed each to be evaluated for effectiveness, efficiency, and satisfaction. Usability was extremely high. The websites evaluated were high in effectiveness (M = 9.08, SD = 1.40), which was calculated using the ratings for overall task completion and quality of output. Efficiency (M = 9.04, SD = 1.15) was also high and represented the average of four factors: deviations from the critical path, error rate, time-on-task, and mental effort. The sites also scored high on satisfaction (M = 8.64, SD = 1.56), and the average of all combined usability factors, or grand usability rating, was M = 8.92 (see table 8).


Discussion

Library websites in general appear to have the basic information available for patrons. The data collected from a combination of nationwide library responses and randomly selected evaluations serve as a preliminary step toward establishing library website standards and guidelines. They also allow the study’s five research questions to be answered.

RQ1: What is a standard design layout for academic and public library websites?

The results suggest that the majority of library websites share four common design features: main navigation that tends to be horizontal and located at the top center of the page or vertical on the left side of the page; a library logo located at the top of the page, either centered or in the top left corner; contact and location information centered in the bottom page footer or in a left sidebar; and a search feature that, when available, is usually found at the top right of the page.

Both academic and public library websites were found mostly to be uncluttered, with clean graphics and no splash pages or unnecessary images, and they tended to be organized in a logical, hierarchical fashion. The use of color was also effective, as most sites had high contrast between background and text and used font styles and text formatting consistently for increased readability. The sites were also compatible with multiple browsers and had user-friendly headings, making it easy to know where one was when navigating.

RQ2: What are the common features and content academic and public library websites include?

Based on the results of the library website usability checklist, and consistent with the findings of Liu, the majority of library websites readily provided contact information, directions, hours of operation, and access to their OPAC.47 Other frequently found content and features included access to patron circulation information; library policies; the ability to renew, reserve, or check out books; information on children’s and youth services; and access to electronic content. Information about the library, other branches, library news, and general library services was also typically available.

Content and features not as readily available included the date of page creation or updates, copyright notices, opportunities to provide feedback, frequently asked questions (FAQs), information about special collections, and ready access to Web 2.0 and social networking tools such as Facebook, Twitter, blogs, photos, RSS feeds, and LibGuides. Information about, or access to, virtual reference services through email, instant messenger, and video was also often hard to find.

RQ3: Who designs and maintains academic and public library websites?

Based on our representative sample, the majority of library websites are designed and managed by librarians who work on the website part-time as part of their regular duties (50 percent). The second highest category, “other” (18.4 percent), represented a combination of volunteers, the state library, or entire web committees. Given the complexity and rapid change involved in designing and maintaining effective websites, libraries might consider dedicating more resources to meet the growing online demands of patrons. Most websites were, however, designed or redesigned within the past two years (54.3 percent), which suggests that continued improvement and refinement of library websites is occurring for over half the participating libraries.

RQ4: To what extent do academic and public library websites adhere to recommended design guidelines and RQ5: What is the general usability of library websites?

The library websites evaluated in this study followed a typical homepage layout and were somewhat aligned with the design guidelines proposed by Raward, Solomon, and Neal and Herzig.48 Navigation was located either at the top center or on the left side, and the library’s name or logo appeared above the navigation at the top center or top left of the page. Frequently missing, however, was basic yet important content for users: library contact and location information.

HCI guidelines call for organizations to study how their users interact with and are affected by the technology they use, and user-centered design (UCD) principles call for representative users to be included in all phases of the design and development process so that their experiences are “engaging and efficient.”49 The finding that more than 70 percent of libraries that responded to our survey had not conducted usability testing supports Connell’s finding that it is not a priority in the majority of libraries and suggests that most libraries are not practicing HCI guidelines or UCD principles, which require direct user input and testing.50 This also indicates that other facets of usability design standards and heuristics, along with recommended web design stages, may be unknown and unused, reflecting what Liu referred to as the “one-design-for-all” approach.51

Despite this finding, preliminary analysis of library website homepages, both through survey self-reports and through evaluations of a random sample of library websites in every state using the checklist developed for this study, suggests that some common design conventions are being followed. For example, Morville and Rosenfeld recommended nine questions that a well-designed homepage should allow users to answer easily; six of the nine were consistently answered on over 80 percent of library websites.52 This preliminary usability evaluation suggests that library homepages are relatively high in effectiveness, efficiency, and satisfaction, at least in terms of the researchers being able to answer questions derived from information architecture guidelines for well-designed homepages.

It is essential to emphasize, though, that this does not reflect the perspective of library users. Only 30 percent of participants reported that their website had been tested for usability, which suggests that the perspective of users has not been systematically taken into account. The usability of these sites for general users remains unknown, although the principles of HCI and UCD suggest that it is difficult to achieve high levels of usability for specific, unique users without close collaboration with the users a site seeks to serve.


Study Limitations and Implications

There are three primary limitations to the study. First, the study did not involve library users in the survey or usability tests. In usability testing, evaluations conducted in the absence of actual users are considered “nonempirical.” The rationale for not working with users had to do with the scope and viability of the study, and doing so represents the next step in our research. Second, the sample was small, with only 14.1 percent of the nation’s public and academic libraries responding to the survey; a larger sample would increase the overall external validity and generalizability of the study findings. Lastly, the use of library self-reports introduces potential error, as library representatives completed the survey about their own institution’s website. The independent checklist used by the researchers was introduced as a validity and reliability check to triangulate the data and protect against this limitation, helping ensure that the data and the conclusions we have drawn from them are relatively consistent.

There are four major implications of the study. First is the identification and articulation of library website design standards; as one of the largest studies of library websites ever conducted, the features identified both through self-reports and through independently evaluated, randomly selected library websites in each state collectively represent a valid, reliable, preliminary list of design and content features. Because the stratified random sampling frame included academic and public libraries from both urban and rural areas, we now have a preliminary profile of a typical academic and public library website in terms of design, content, maintenance, and general usability. Second, the study identified a general trend that human-computer interaction techniques, user-centered design, and usability testing (all of which require direct input from representative users), along with dedicated website management, are not a high priority for the libraries that participated in the study. More than 70 percent of respondents reported that they had not conducted a usability test, and approximately half assigned web design and management to a librarian as a part-time duty. Third, despite this lack of consistent emphasis on designing library websites specifically for users, library website homepages, as assessed against heuristic standards through nonempirical (without users) usability testing, ranked high in overall usability and information architecture.

Despite the large number of websites examined for this study, the patron user experience in terms of general satisfaction and how they use and perceive library websites remains unknown. An initial hypothesis, based on this fact, would be that library websites could improve their general usability by more systematically working with users to design, test, and redesign their web information spaces. While rating high on general information architecture and homepage usability guidelines is excellent, further study is required to understand how usable library websites are for their specific users.


Conclusions and Future Research

The findings of the study suggest that academic and public library websites share some standard design features. Initial examination of homepage information architecture and content suggests that, in general, library websites rate high in usability based on nonempirical, heuristic evaluations. Without data and input from actual users, however, the overall usability of US library websites remains unknown. Every website visitor comes to his or her own conclusions about what to like and dislike about a site. The systematic process of web design called for by HCI and UCD increases the probability that patrons will have a positive experience and be able to use web spaces specifically designed around their information-seeking needs and predicted behavior.

Through both library self-reports and independent evaluations, the study’s findings suggest that much of the basic content and many of the features typically called for by patrons, as identified by previous research, are available.53 From a usability perspective, however, there are opportunities for improvement. Understanding and working with users to ensure that a website is appropriately designed and refined requires significant time and effort and is a pervasive, ongoing process. Interpreting user testing and implementing changes to a website based on it require skills that librarians who do not perform these tasks full-time may not possess and certainly do not have time to develop. Future research will stratify and analyze the data in terms of urban and rural and public and private libraries; in addition, we will seek to repeat the study with a focus on library users, their information needs, and their general perceptions of usability through the primary factors of effectiveness, efficiency, and satisfaction.

Ultimately, usability matters most on a personal, subjective basis. The primary author’s children wanted to visit the library late on a Saturday evening, so the library’s website was consulted to quickly determine when the library closed for the weekend. The problem was that, unlike on many library websites, the business hours were not on the homepage at all. Twenty-one links sat in the left sidebar of a web design shell apparently run by the city, and the footer contained information about the city rather than the library. Several frustrating minutes ticked by before the business hours were finally found under a link labeled “branches.” The author still packed the kids into the car and drove to the library, arriving before it closed, but the overall usability of the website left a lot to be desired even though the information was eventually found. What if the information had never been found? The trip to the library would never have happened. The impact on potential patrons and their ability to use the library’s services in a highly efficient, effective, and satisfying fashion is where the true power and value of usability for library websites can be found.


Acknowledgments

We humbly thank all libraries across the country that took the time to complete our survey.

Thanks to my former graduate assistant, Amy Figley, MLIS, for her dedicated and exemplary editing and project management throughout the project and the writing of the manuscript, and to Jenna Stout, my current graduate assistant, for her editing and for ensuring that our Chicago formatting and design met journal guidelines.


References
1. Jakob Nielsen and Hoa Loranger, Prioritizing Web Usability (Berkeley, CA: New Riders, 2006).
2. Jakob Nielsen, Usability Engineering (Boston: Academic Press, 1993).
3. Daniel McCraken, Rosalee Wolfe, and Jared Spool, User-Centered Web Site Development: A Human-Computer Interaction Approach (Upper Saddle River, NJ: Pearson Education, 2004), 4.
4. Ken Eason, Information Technology and Organizational Change (London: Taylor & Francis, 1988).
5. Jesse James Garrett, The Elements of User Experience: User-Centered Design for the Web and Beyond, 2nd ed. (Berkeley, CA: New Riders, 2011).
6. Garrett, The Elements of User Experience; Laura Manzari and Jeremiah Trinidad-Christensen, “User-Centered Design of a Web Site for Library and Information Science Students: Heuristic Evaluation and Usability Testing,” Information Technology & Libraries (September 2006): 163–69; Peter Morville and Louis Rosenfeld, Information Architecture for the World Wide Web: Designing Large-Scale Web Sites, 3rd ed. (Sebastopol, CA: O’Reilly, 2008).
7. Anthony Chow and Tim Bucknall, Library Technology and User Services (Cambridge, UK: Chandos, 2011); Jakob Nielsen, “Usability 101: Introduction to Usability,” useit.com, January 4, 2012, accessed January 16, 2012, www.useit.com/alertbox/20030825.html; International Organization for Standardization 9241-11, “Ergonomic Requirements for Office Work with Visual Display Terminals (VDTs), Part 11: Guidance on Usability” (Geneva, Switzerland, 1998).
8. Manzari and Trinidad-Christensen, “User-Centered Design of a Web Site for Library and Information Science Students”; Jakob Nielsen, “Heuristic Evaluation,” in Usability Inspection Methods, J. Nielsen and R. L. Mack, eds. (New York: Wiley, 1994), 25–62.
9. Jakob Nielsen, “25 Years in Usability,” useit.com, April 21, 2008, accessed January 16, 2012, www.useit.com/alertbox/25-years-usability.html.
10. Ibid. 
11. Donald Case, Looking for Information: A Survey of Research on Information Seeking, Needs, and Behavior (Oxford: Elsevier, 2007), 72; Robert Taylor, “Question-Negotiation and Information Seeking in Libraries,” College & Research Libraries 29 (1968): 178–94.
12. Nicholas J. Belkin, Robert N. Oddy, and Helen M. Brooks, “ASK for Information Retrieval: Part I. Background and Theory,” Journal of Documentation 38, no. 2 (1982): 61–71; Brenda Dervin, “Sense-Making Theory and Practice: An Overview of User Interests in Knowledge Seeking and Use,” Journal of Knowledge Management 2, no. 2 (1998): 36–46.
13. Peter Pirolli and Stuart K. Card, “Information Foraging,” Psychological Review 106, no. 4 (1999): 643–75.
14. Chow and Bucknall, Library Technology and User Services; Morville and Rosenfeld, Information Architecture for the World Wide Web.
15. Jonathan Lazar, Web Usability: A User-Centered Design Approach (Boston: Pearson Education, 2006), 9.
16. Jakob Nielsen, “Mastery, Mystery, and Misery: The Ideologies of Web Design,” useit.com, August 30, 2004, accessed January 16, 2012, www.useit.com/alertbox/20040830.html.
17. Ibid. 
18. Gary Burnett and Sanda Erdelez, “Forecasting the Next 10 Years in Information Behavior Research: A Fish Bowl Dialogue,” American Society for Information Science and Technology 2009 Annual Meeting Coverage (2009): 44–48.
19. John V. Richardson Jr., “The Future of Reference: The Intersection of Information Resources, Technology, and Users,” Reference Services Review 31, no. 1 (2003): 43–45; Denice Adkins and Sanda Erdelez, “An Exploratory Survey of Reference Source Instruction in LIS Courses,” Reference & User Services Quarterly 46, no. 2 (Winter 2006): 50–60.
20. Garrett, The Elements of User Experience; Anthony Chow, “The Usability of Digital Information Environments: Planning, Design and Assessment,” in Trends, Discovery and People in the Digital Age, Wendy Evans and David Baker, eds. (Cambridge, UK: Chandos, 2013), 13–38.
21. Patrick Jordan, An Introduction to Usability (Philadelphia: Taylor & Francis, 1998).
22. Kasper Hornbaek, “Current Practice in Measuring Usability: Challenges to Usability Studies and Research,” International Journal of Human-Computer Studies 64 (2006): 79–102.
23. Majed Alshamari and Pam Mayhew, “Technical Review: Current Issues of Usability Testing,” The Institution of Electronics and Telecommunication Engineers (IETE) 26, no. 6 (2009): 402–6.
24. Ibid. 
25. Ibid.; Carl Turner, James Lewis, and Jakob Nielsen, “Determining Usability Test Sample Size,” in International Encyclopedia of Ergonomics and Human Factors, 2nd ed., vol. 3 (2006): 3084–88.
26. Robert Virzi, “Refining the Test Phase of Usability Evaluation: How Many Subjects Is Enough?” Human Factors 34 (1992): 457–68; Gitte Lindgaard and Jarinee Chattratichart, “Usability Testing: What Have We Overlooked?” (Computer and Human Interaction 2007 Proceedings, San Jose, California, April 28–May 3, 2007): 1415–24; Alshamari and Mayhew, “Technical Review.”
27. Alshamari and Mayhew, “Technical Review.”
28. Hornbaek, “Current Practice in Measuring Usability.”
29. Ibid. 
30. Shu Liu, “Engaging Users: The Future of Academic Library Web Sites,” College & Research Libraries 69 (2008): 6–27.
31. Laura Solomon, “Sinking or Swimming? The State of Web Sites in Ohio’s Public Libraries,” 2005, accessed February 9, 2012, www.designforthelittleguy.com/study.pdf.
32. Yu-Hui Chen, Carol A. Germain, and Huahai Yang, “An Exploration into the Practices of Library Web Usability in ARL Academic Libraries,” Journal of the American Society for Information Science & Technology 60, no. 5 (2009): 953–68.
33. Ruth S. Connell, “Survey of Web Developers in Academic Libraries,” Journal of Academic Librarianship 34, no. 2 (2008): 121–29.
34. Chen, Germain, and Yang, “An Exploration into the Practices of Library Web Usability in ARL Academic Libraries.”
35. Jiann-Chang Shieh and Chih-Feng Liu, “The Usability Evaluation Study of University Library Websites,” Journal of Educational Media & Library Sciences 47, no. 2 (2009): 163–97.
36. David King, “The Mom-and-Pop Shop Approach to Usability Studies,” Computers in Libraries 23, no. 1 (2003): 13.
37. Liu, “Engaging Users.”
38. Ibid. 
39. Roswitha Poll, “Evaluating the Library Website: Statistics and Quality Measures” (paper presented at the World Library and Information Congress: 73rd IFLA General Conference and Council, Durban, South Africa, August 2007), http://archive.ifla.org/IV/ifla73/papers/074-Poll-en.pdf.
40. “Survey Random Sample Calculator,” Custom Insight, accessed February 16, 2013, www.custominsight.com/articles/random-sample-calculator.asp.
41. Roslyn Raward, “Academic Library Website Design Principles: Development of a Checklist,” Australian Academic & Research Libraries 32, no. 2 (2001): 1–7; Solomon, “Sinking or Swimming?”; Diane Neal and Cary Herzig, “A&TRT’s Website Scorecard: Evaluate (and Improve) Your Library’s Website,” Texas Library Journal 83, no. 2 (2007): 56–61.
42. Raward, “Academic Library Website Design Principles”; Solomon, “Sinking or Swimming?”; Neal and Herzig, “A&TRT’s Website Scorecard”; Poll, “Evaluating the Library Website”; Jennifer Duncan and Wendy Holliday, “The Role of Information Architecture in Designing a Third-Generation Library Web Site,” College & Research Libraries 69, no. 4 (2008): 301–18.
43. Morville and Rosenfeld, Information Architecture for the World Wide Web.
44. International Organization for Standardization 9241-11, “Ergonomic Requirements for Office Work with Visual Display Terminals (VDTs), Part 11: Guidance on Usability”; Jordan, An Introduction to Usability.
45. “Survey Random Sample Calculator.”
46. Morville and Rosenfeld, Information Architecture for the World Wide Web.
47. Liu, “Engaging Users.”
48. Raward, “Academic Library Website Design Principles”; Solomon, “Sinking or Swimming?”; Neal and Herzig, “A&TRT’s Website Scorecard.”
49. McCraken, Wolfe, and Spool, User-Centered Web Site Development; Garrett, The Elements of User Experience
50. Connell, “Survey of Web Developers in Academic Libraries.”
51. Liu, “Engaging Users.”
52. Morville and Rosenfeld, Information Architecture for the World Wide Web.
53. Liu, “Engaging Users”; Poll, “Evaluating the Library Website.”

Figures

Figure 1

Library Website Survey Participant Demographics



Figure 2

Location of Main Homepage Elements



Figure 3

Academic and Public Library Website Management



Figure 4

Who Designed Your Website?



Tables
Table 1

Web Evaluation Sampling Frame


Study’s Sampling Frame
Type of Library Evaluations per State Total (All 50 States plus DC)
Urban Public 1 51
Rural Public* 1 50
Private Academic 1 51
Public Academic 1 51
Total 4 203

*Washington, DC, has only one public library


Table 2

Academic and Public Library Website Design Elements


Location of Primary Web Elements
Web Element Average Survey Top Selected Researcher Top Selected
Navigation
Side left 30.0% (n = 498) 36.3% (n = 379) 60.4% (n = 119)
Top center 29.0% (n = 493) 38.4% (n = 400) 47.2% (n = 93)
Search Tool Placement
Not on homepage 37.0% (n = 463) 37.8% (n = 382) 41.5% (n = 81)
Top right 29.0% (n = 365) 30% (n = 303) 31.8% (n = 62)
Name and Logo
Top left 45.0% (n = 598) 43.3% (n = 460) 69.7% (n = 138)
Top center 39.0% (n = 518) 45.4% (n = 483) 17.7% (n = 35)
Contact Information
Bottom center 21.0% (n = 293) 21.6% (n = 227) 33.5% (n = 66)
Not on homepage 14.0% (n = 192) 16.2% (n = 170) 11.2% (n = 22)
Top center 11.0% (n = 158) 13.5% (n = 142) 8.1% (n = 16)
Side left 11.0% (n = 154) 10.6% (n = 112) 21.3% (n = 42)
Location Information
Not on homepage 20.0% (n = 267) 21.4% (n = 226) 20.8% (n = 41)
Bottom center 19.0% (n = 261) 18.7% (n = 197) 32.5% (n = 64)
Top center 13.0% (n = 179) 15.3% (n = 162) 8.6% (n = 17)
Side left 9.0% (n = 125) 8.8% (n = 93) 16.2% (n = 32)

Table 3

Library Website Features


Are These Features Available? Public Library Website Academic Library Website
Are there clear navigation tools on all pages? 88.0% 88.0%
Is there navigation back to the homepage from every page? 90.0% 92.2%
Is there a search tool of the site?* 53.0% 69.3%
Is the date of the last update indicated? 17.0% 19.6%
Is there a tag line that briefly describes what the webpage/library does?* 36.0% 9.8%
Are the library’s name and logo in a reasonable size and location? 91.0% 88.2%
Are font styles and text formatting limited and consistent?* 92.1% 98.0%
Are high contrast colors used between the text and the background?* 87.9% 98.0%
Can the text be resized?* 20.2% 2.0%
Does the graphic design feel clean and uncluttered?* 83.0% 89.1%
Are graphics used appropriately to address specific needs (no random splash pages or huge graphics with no seeming purpose)? 88.0% 87.0%
Is the website multi-browser friendly? 98.0% 99.0%
Does the website give its users the ability to pick their language?* 29.7% 3.0%
Is the website organized logically so that similar sections are grouped together in the organization hierarchy? 87.0% 85.3%
Is there a site map?* 27.7% 38.0%
Are headings user friendly? 88.9% 92.2%
Are headings, titles, and links jargon free?* 66.3% 49.0%
Are abbreviations and acronyms spelled out or explained? 8.9% 8.8%

*Indicates a difference of more than 5% between public and academic libraries


Table 4

Content Library Websites Had Consistently


Library Website Content Survey Evaluation Average
Are opening times/library hours posted? 99% 99% 99%
Does the site include library contact details (general phone and email)? 99% 98% 99%
Is there a link to access electronic resources including databases, online reference, and e-books? 92% 97% 95%
Does the site include a link to the OPAC? 88% 98% 93%
Does the site include location information such as an address, map, or directions? 94% 92% 93%
Does the site include information about services for children and teens? 84% 98% 91%
Are there clear navigation tools on all pages? 91% 89% 90%
Does the site provide circulation information (how to get a library card, loan periods, fines, etc.)? 82% 95% 88%
Is there information about library news and events? 91% 84% 88%
Does the website describe library services or is there a link to library services? 90% 84% 87%
Can users renew books or materials online? 80% 94% 87%
Is it possible to get help or feedback? 78% 95% 87%
Does the site provide information about library policies? 78% 90% 84%

Table 5

Library Website Content Found on Some Sites


Library Website Content Survey Evaluation Average
Does the site include contact information for key staff individuals? 75% 72% 74%
Are virtual reference services present? 58% 86% 72%
Can comments or suggestions be made about the site? 71% 61% 66%
Are there search tips for the OPAC? 59% 67% 63%
Is there a search tool of the site? 59% 62% 60%
Is there a link to Special Collections? 46% 60% 53%
Is there a creation or copyright date on the website? 45% 59% 52%

Table 6

Library Website Content Found on Less Than 50 Percent of Websites


Library Website Content Survey Evaluation Average
Can users customize their experience / Are there Web 2.0 tools used on the website? 20% 74% 47%
Does the library have an RSS feed for blogs, new materials, events, etc? 38% 52% 45%
Can the text be resized? 62% 11% 37%
Is there a site map? 38% 33% 36%
Is the date of the last update indicated? 36% 19% 27%

Table 7

Answers to the Nine Home Page Questions


Can the Question be Answered from the Homepage? Yes No
1. Where am I? 98% (197) 2% (4)
8. How do I contact a human? 98% (195) 2% (4)
5. What is available at this site? 95% (190) 5% (10)
9. What is their address? 88% (177) 12% (24)
6. What is happening there? 83% (167) 17% (34)
3. How do I get around this site? 82% (165) 18% (36)
4. What is important and unique about this organization? 79% (158) 21% (41)
2. I know what I am looking for, how do I search for it? 67% (134) 33% (67)
7. Do they want my opinion about this site? 58% (116) 42% (85)

Table 8

Home Page Usability Factors


Home Page Usability Factor Ratings
Effectiveness 9.08
Task Completion 9.26
Quality of Output 8.90
Efficiency 9.04
Deviations from Critical Path 9.10
Error Rate 9.11
Time-on-Task 8.96
Mental Effort 8.99
Satisfaction 8.64
GRAND USABILITY RATING 8.92

