rusq: Vol. 50 Issue 2: p. 170
Finding Articles and Journals via Google Scholar, Journal Portals, and Link Resolvers: Usability Study Results
Lydia Dixon, Cheri Duncan, Jody Condit Fagan, Meris Mandernach, Stefanie E. Warlick

Lydia Dixon previously worked as Library Applications Specialist, Libraries and Education Technologies, James Madison University, Harrisonburg, Virginia
Cheri Duncan is Assistant Professor and Head of Acquisitions, Libraries and Education Technologies, James Madison University, Harrisonburg, Virginia
Jody Condit Fagan is Associate Professor and Content Interfaces Coordinator, Libraries and Education Technologies, James Madison University, Harrisonburg, Virginia
Meris Mandernach is Assistant Professor and Collection Management Librarian, Libraries and Education Technologies, James Madison University, Harrisonburg, Virginia
Stefanie E. Warlick is Assistant Professor and Health and Human Services Librarian, Libraries and Education Technologies, James Madison University, Harrisonburg, Virginia

Abstract

Finding journal titles and finding journal articles are two of the toughest tasks users face on academic library webpages. Challenges include choosing the best tools, using terms that make sense, and guiding the user through the process. In addition, the continued development of Google Scholar raises the question of whether it could become a better tool for finding a full-text article than link resolver software or journal portals. To study these issues, researchers at James Madison University analyzed results from two usability tests. One usability test focused on library homepage navigation and included two tasks related to finding articles by citation and journals by title. The other test asked participants to find citations in three web interfaces: the library’s journal portal, Google Scholar, and the library’s link resolver form. Both usability studies revealed challenges with finding journal titles and journal articles. The latter study showed that Google Scholar supported better user performance and higher user satisfaction than either the journal portal or the link resolver form. Based on the findings from the two usability studies, specific changes were made to the library webpages and to several library systems, including the catalog and link resolver form.


In the academic environment, finding articles by citation and finding journal titles are two common tasks. Letnikova found 86 percent of the twenty-two academic library homepage studies she reviewed included a task asking participants to find a journal title in print or online.1 Although many users will simply search Google, the library still needs to provide intuitive pathways for these tasks from its website. The library website should remain an authoritative source, with definitive answers about the institution’s access to a particular article or journal.

Finding known articles and journals poses a challenge for many users. New students may not understand what a journal is or what different elements of an article citation mean. Even experienced students and faculty struggle with potential complications such as embargoed holdings, platform changes, or subscription lapses.

Most academic libraries have two pieces of software to assist users with these tasks. Journal portals provide a quick journal title search. Results show what dates of coverage are available for each title, broken down by information provider. Link resolver software connects users from one provider’s database to full text in another provider’s database. Link resolver software also features a web form in which users can enter an article citation to obtain full-text options. Google Scholar can also find journals and articles and has the ability to use link resolver software to connect users with their local library.

It is important to note that many libraries are exploring “discovery tools”—an emerging type of software combining library catalogs, journal lists, and articles into one search interface.2 The investigation of such tools is in its infancy; however, they may provide additional options for finding journal titles and finding citations without requiring the user to differentiate between these two tasks.

In the fall of 2009, many librarians at James Madison University (JMU) were unfamiliar with the libraries’ link resolver form and relied heavily on the journal portal for finding journals by title and articles by citation. The library homepage navigation also reflected this emphasis on the journal portal. Yet anecdotal evidence suggested that users found the journal portal extremely confusing.

This article therefore investigates three research questions:

  1. What difficulties do users encounter when trying to find a journal title from the JMU Libraries’ homepage?
  2. What difficulties do users encounter when trying to find an article by citation from the JMU Libraries’ homepage?
  3. Of the three interfaces easily available to the library, which web interface supports finding an article by citation most effectively: the journal portal, the link resolver form, or Google Scholar?

These questions were examined by analyzing results from two usability studies conducted at JMU. While these studies’ findings are specific to JMU Libraries, most libraries have similar journal portal and link resolver software, and everyone with an Internet connection has access to Google Scholar.


LITERATURE REVIEW

Conducting usability studies of a library’s web interface provides concrete evidence about user behavior and preferences. The literature documents the benefits of usability studies along with basic principles and practices.3 While there is little evidence that systems incorporating user input during development turn out to be more efficient or effective end products, users excel at determining whether an interface is intuitive and can be navigated efficiently. Bailey notes that users provide much-needed insight into novice user behavior and are better at defining the parameters of the system than at contributing to the design of its infrastructure.4 Usability studies provide guidance by gathering information from an end-user perspective.

Specific usability methods have been developed for libraries,5 and Letnikova provided a summary of academic library usability case studies and compiled a standard list of questions for testing.6 Most library usability studies are qualitative in nature, using as few as five test subjects to inform design characteristics.7 For a quantitative usability study that allows the results to be generalized to broader user behavior, twenty users need to be observed.8

The JMU studies touch on several relevant areas of information-retrieval and search behavior. In a seminal article, Kuhlthau urged researchers to add user-oriented approaches to information-seeking studies as opposed to solely focusing on systems.9 In their 2004 article, Järvelin and Ingwersen suggested system efficiency can be assessed along several dimensions, including not only the quality of documents retrieved but also the searcher’s effort (time), satisfaction, and the tactics employed. “The real issue in information retrieval systems design,” say Järvelin and Ingwersen, “is not whether its recall-precision performance goes up by a statistically significant percentage. Rather, it is whether it helps the actor solve the search task more effectively or efficiently.”10 It is within Kuhlthau’s and Järvelin and Ingwersen’s context that the present article’s study is situated. Rather than study a statistical sample of citations in the three systems examined, this study focused on how effective the interfaces were at helping users complete the tasks.

Lookup tasks, or known-item searches, have been studied repeatedly by information scientists in the context of the library catalog.11 Known-item searches, wrote Marchionini, begin with “carefully specified queries” that should “yield precise results with minimal need for result set examination and item comparison.”12 This article examines known-item searching for article citations, which is a common physical and virtual reference need.13

When usability studies at libraries have concentrated on known-item searching, these studies have involved locating specific journals or books, rather than articles.14 Letnikova noted finding journals proved to be one of the most difficult tasks on academic library homepages.15 Ipri, Yunkin, and Brown included a journal title task on a fourteen-task test with five graduate and five undergraduate students. They found many users had trouble distinguishing “Journals” tabs from “Articles and Databases” tabs and combined article and journal searching on one tab.16 Mvungi, de Jager, and Underwood found confusion among their five participants over the difference between electronic journals and print journals.17 In contrast, Whang and Ring tested twenty undergraduate and thirteen graduate students and found that 90 percent of undergraduates and 100 percent of graduates were able to find a specific journal title using either the library catalog or the library’s SFX journal finder.18 These studies show differences depending on local context.

Fewer studies have examined the task of finding an article by citation. Ascher, Lougee-Heimer, and Cunningham had eight participants perform five tasks at a health sciences library, one of which was finding an article given the citation.19 In this study the participants were instructed to find the article from the library homepage rather than from a particular interface. Most of their participants used PubMed, and all users successfully found the journal’s page. However, they experienced problems related to local authentication.

Terminology is also a major challenge to finding articles: library jargon, nonspecific terms, or variant terms (e.g., serials, journals, periodicals) are barriers, especially when used inconsistently throughout the library website.20 Kupersmith’s website notes that terms like “journal article” or “find article” are cited as more helpful than “databases,”21 but choosing specific words to further distinguish between article- and journal-related tasks is still challenging. McHale used a card sort to help choose new language for her library’s website redesign, and “find a journal by title” and “find articles and more” both ended up in the “search” category.22

Very few library usability studies have focused on evaluating the interface of journal portals or link resolver forms with known-item citations. The University of North Carolina at Greensboro (UNCG) developed a journal portal and studied its use.23 Because the portal was locally developed, developers were able to respond to barriers such as retrieving no hits when a user enters an ampersand or colon. Ellington conducted a usability study on UNCG’s journal portal with forty participants; two of the questions related to finding known citations in the journal portal. Her users commented that they liked finding a link that allowed them to enter article citation information, and they performed better on the task with complete citation information, including volume and issue, rather than just article title, journal title, and date.24 Jayaraman and Harker studied what makes link resolver software effective, but they focused on the linking action rather than the web entry form.25

Library studies using Google Scholar have focused on searching for topics or general search terms and comparing results with subscription sources. Callicott and Vaughn highlighted differences in content and the user experience between traditional library resources and Google Scholar.26 They found that, although Google Scholar guarantees results, constructing complicated queries or limiting the results retrieved is difficult. Lee found that while users prefer the simplicity of the Internet search box, they readily admit they are trading quality for speed and ease of use.27 Donlan and Cooke noted that Google Scholar provides a helpful filter for the web, but it is still unclear what this search engine indexes.28 In addition, users might be prompted to pay for access to journal articles when attempting to access library resources remotely.29 Studies were not found that evaluate user success with Google Scholar for finding journal article citations.

This article, which analyzes results from two usability studies in an academic library, adds to existing research on finding journal titles and attempts to fill gaps in the research relating to finding articles by citation and the relative usability of journal portals, link resolver web forms, and Google Scholar.


BACKGROUND AND METHOD

JMU is an undergraduate-focused institution with approximately eighteen thousand students. The JMU Libraries’ usability lab features one workstation with two pieces of usability software: Techsmith’s Morae (version 2) (www.techsmith.com/morae.asp), which records participant actions during the usability studies, and Bailey’s Usability Testing Environment (UTE) (version 2) (www.mindd.com/Content.aspx?pid=UTEStandard), which presents participants with tasks in the web browser environment. The UTE also presents end-of-task questions to measure time on task and task success.

Two studies conducted in April 2009 were covered by an institutional review board-approved protocol. The authors recruited participants for both studies through a blast e-mail sent to all students and faculty, and interested respondents were randomly selected to include a variety of grades and majors. There was no overlap in the two studies’ participants. Both studies began with several pre-study questions and ended with the System Usability Scale (SUS). The SUS is a ten-item scale using statements of subjective assessment and covering a variety of aspects of system usability.30
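For readers unfamiliar with the SUS, each participant rates ten statements on a five-point scale (1 = strongly disagree, 5 = strongly agree), and the responses are converted into a single 0–100 score. The short sketch below applies Brooke’s standard scoring rules to one invented set of responses; it is included only as an illustration of how the scores reported later in this article are computed, not as part of the study instrument.

    def sus_score(responses):
        """Convert ten SUS item responses (1-5) into a 0-100 score
        using Brooke's standard scoring rules."""
        if len(responses) != 10:
            raise ValueError("SUS requires exactly ten item responses")
        total = 0
        for item, rating in enumerate(responses, start=1):
            # Odd-numbered (positively worded) items contribute rating - 1;
            # even-numbered (negatively worded) items contribute 5 - rating.
            total += (rating - 1) if item % 2 == 1 else (5 - rating)
        return total * 2.5

    # Invented example: a generally positive set of responses scores 85.0.
    print(sus_score([5, 2, 4, 1, 4, 2, 5, 1, 4, 2]))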

The first study, the “homepage study,” had twelve tasks chosen to measure library homepage navigation for central functions. Only two of these tasks will be discussed in this article:

  1. Does the library have access to the journal Brain Research?
  2. Find the full text for the journal article “Anxiety in High-Functioning Children with Autism” by Alinda Gillott, Fredd Furniss, and Ann Walter that was published in a 2001 issue of the journal Autism.

The homepage study included twenty-one participants with a range of experience: eight freshmen, five sophomores, three upper-classmen, one graduate student, and four faculty. Twelve of the participants were from the arts and humanities, seven were from the sciences, and two were from the school of business. Sixty-two percent of the homepage study participants said they visited the library website at least once per month.

The “find-a-citation” study consisted of twelve tasks that asked participants to find four citations in each of three web interfaces: a simplified version of the library’s journal portal, Periodical Locator (PL), which isolated the journal search function; Google Scholar (GS); and the library’s link resolver form, known as Check for Full Text (CFFT). Both CFFT and PL are Serials Solutions products. Twenty participants were chosen to make our findings generalizable to the JMU student population.31 However, as a rule of thumb, Manning, Raghavan, and Schütze have suggested that fifty information needs are necessary to evaluate an information-retrieval system, so the twelve tasks used in this study were not sufficient to support generalizations about the tested systems.32

The find-a-citation study had nineteen students and one faculty member. Fifteen of the twenty participants had been at JMU for just one or two years. Ten of the participants were from the arts and humanities, seven were from the sciences, and three were from the school of business. Eleven participants indicated they needed full-text journal articles on a monthly basis or more, while nine said they needed full-text articles “a few times per semester” or less. No one thought it was very difficult to “find full text for journal articles.” The authors asked participants if they had previously seen the “Check for Full Text” link resolver button in research databases or the link in GS; more than half had used the button in research databases, and one-third had used the link resolver from GS.

While many readers will be familiar with these interfaces, it is important to consider how each one operates when trying to find an article by citation. To use PL, the user must identify and search on the journal title. The results show which information providers offer access to each journal title and the various dates of coverage for each provider. A user looking for an article must click on the information provider having the appropriate dates of coverage, then conduct a search for the article citation on that provider’s site. If PL does not have any matches by journal title, the user sees a “no results were found” screen, which offers tips for conducting additional searches.

To use GS, the user should enter at least the article title from the citation. Entering only the journal title will often return too many results to effectively find a specific citation. If GS finds a match, the user can click directly on the result (often the article title) and get to the full text. Since the library’s link resolver software is enabled in GS, participants had the additional option to click on “Check for Full Text @ JMU,” which would take them to the link resolver results screen. While clicking on the result itself is the fastest way to get to full text, the “Check for Full Text @ JMU” link offers the most options. If GS does not find a match on the user’s query, it generally still has enough information to display some results, however irrelevant they may be.

CFFT requires the user to identify an article citation’s parts and enter each part into the correct field. The user submits the form and must then choose the correct link on the results screen. Depending on the completeness of the user’s entry and the accuracy of the software, the results screen might have article links or journal links, or it might indicate no full text was found. Clicking on the article links delivers the user to full text. Clicking on the journal links requires the user to perform an additional search on the publisher’s website, as with PL. Finally, if no full text is found, the user is offered options to search the library catalog or the library’s journal portal.
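Behind interfaces like CFFT and the “Check for Full Text @ JMU” link in GS, link resolvers typically receive the citation as an OpenURL, in which each part of the citation is passed as a named key-value pair. The sketch below uses a hypothetical resolver address and only the citation parts given in the homepage study’s article task; it is meant to illustrate why these tools need the citation broken into labeled parts, not to reproduce the Serials Solutions implementation.

    from urllib.parse import urlencode

    # Hypothetical resolver address; a real one is supplied by the library.
    RESOLVER_BASE = "https://linkresolver.example.edu/openurl"

    def citation_to_openurl(atitle, jtitle, date, **extra):
        """Encode an article citation as an OpenURL 1.0 (KEV journal format)
        query string of the kind a link resolver form builds from its fields.
        Extra keyword arguments (e.g., volume, issue, spage) are added as
        rft.* keys when they are known."""
        params = {
            "url_ver": "Z39.88-2004",
            "rft_val_fmt": "info:ofi/fmt:kev:mtx:journal",
            "rft.genre": "article",
            "rft.atitle": atitle,  # article title
            "rft.jtitle": jtitle,  # journal title
            "rft.date": date,      # year of publication
        }
        params.update({"rft." + key: value for key, value in extra.items()})
        return RESOLVER_BASE + "?" + urlencode(params)

    # Citation parts from the homepage study's article task (titles and year only).
    print(citation_to_openurl(
        atitle="Anxiety in High-Functioning Children with Autism",
        jtitle="Autism", date="2001"))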

The find-a-citation study began with a practice task and seven pre-study questions (available from the authors on request). The user then completed four tasks using the modified journal portal, PL, followed by a post–interface questionnaire, the SUS. Next, the user completed four tasks using Google Scholar (GS) and again completed an SUS. Finally, the user completed four new tasks in the link resolver form (CFFT) and an SUS. Each task asked the user to find full text for a given citation. All participants took the test in the same order: PL, GS, and finally CFFT. To measure whether they had found full text, each of the twelve tasks offered the participants a multiple-choice question when they clicked “Answer,” which required them to choose the correct first three words of the article full text. “No full text is available” was always an option. In each set of tasks, one task asked the participant to find full text for a citation for which JMU had no full text access. Participants timed out if they failed to provide an answer after five minutes, and for each question, users had the option to skip the task. At the end of the study, participants answered three post–study survey questions about interface preferences.

The authors address limits of this method in the discussion section.


RESULTS
Homepage Study

Although there were twelve tasks in this study, ten were unrelated to the topic of this article and concerned other navigation tasks on the library homepage. It is worth noting that participants were generally successful on the other ten tasks and completed each in a short amount of time.

The task on the homepage study that asked users to find a specific journal title had the most incorrect answers of any of the study’s twelve tasks. Eight of twenty-one people got the task wrong, and one person skipped the task. Also, this task was the second most time-consuming: even when users answered correctly, their time on task ranged from less than one minute to about four minutes, with an average of one minute, fifty-three seconds. For this task, participants chose to begin searching with either the library catalog search box on the library homepage, the journal portal, “Research Databases,” or “Find Articles.” All eight participants who got this task wrong entered their search in the library catalog search box on the library homepage.

On the post–study survey for the homepage study, many comments related to the journal task. One participant stated, “Searching for journals is the most difficult thing for me.” Another commented, “I could not come up with articles and journals even though I thought I was on the correct page.” A third stated, “Sometimes it takes a while to find the correct link to find journal articles during research.”

Another task on the homepage study asked participants to find an article when given a citation. Of the twelve tasks in this study, this took the longest for successful participants to complete. Five people skipped the task after making an initial attempt, and another person answered incorrectly. This task took an average of four minutes, fifteen seconds.

Of those that answered correctly, six people tried to use the library catalog search box on the library homepage, entering the article title. An additional seven people tried this strategy after first trying to search for the article in the journal portal. The search box does not support article searching. Users who skipped the task or selected the wrong journal might have been confused by the specific journal title chosen: a search on “autism” in the journal portal retrieves six results. The actual full title of the journal being searched for is Autism: The International Journal of Research and Practice.

Find-a-Citation Study

There were 240 tasks in the find-a-citation study, and participants were successful on 84 percent of them. Figure 1 shows participants’ respective success on each of the three interfaces. Participants took an average of ten minutes, fifty-nine seconds to complete the PL tasks; five minutes, forty-five seconds for the GS tasks; and seven minutes, nine seconds for the CFFT tasks. These figures include time spent on wrong answers, skipped tasks, and timeouts. Using a paired-samples t-test, the differences between the averages of participants’ performance on the three interfaces were significant at the .05 level.
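As an illustration of how such a paired comparison works, the sketch below runs a paired-samples t-test on hypothetical per-participant completion times in seconds; because each participant used every interface, the observations are matched by participant. The numbers are invented and stand in for the study data, which are not reproduced here.

    from scipy import stats

    # Hypothetical total task times (seconds) for the same twenty participants
    # on two interfaces; these are NOT the study's actual measurements.
    pl_times = [680, 640, 710, 655, 620, 700, 645, 690, 630, 660,
                675, 650, 640, 705, 665, 635, 655, 695, 670, 645]
    gs_times = [350, 320, 360, 345, 330, 355, 340, 365, 325, 350,
                345, 335, 330, 360, 340, 325, 345, 355, 350, 335]

    # Paired-samples t-test: observations are matched by participant.
    t_stat, p_value = stats.ttest_rel(pl_times, gs_times)
    print(f"t = {t_stat:.2f}, p = {p_value:.4g}")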

Although users were generally successful with all three interfaces, it took participants much longer to arrive at the correct answer with PL than the other two interfaces. When using PL, successful participants took an average of two minutes, four seconds, compared to one minute, seven seconds for GS and one minute, thirty-six seconds when using CFFT.

Across all three interfaces, participants were more successful at finding the correct answer when the article was available in full text than when no full text was present. On tasks where no full text was available, participants were most often correct when using CFFT: sixteen were correct, three timed out, and one was wrong. GS ranked a close second: fifteen participants were correct, two were wrong, two timed out, and one skipped the task. Only half of the participants were correct when using PL; nine timed out, and one skipped the task. Figure 2 shows a summary of results on tasks where full text was not available.

Periodical Locator Interface

For the PL interface, users spent a lot of time reading the results screen, even when the answer was visible. Based on the screen capture video, users were unclear which links pointed to print holdings and which to online full text. Also, while completing many tasks in the PL interface, participants left to try another strategy (e.g., the library catalog) and then returned to the interface.

In the PL interface, four of the five wrong answers occurred when participants searched for the article title rather than the journal title. Eleven participants timed out on PL tasks, nine on the task where no full text was available. An additional participant chose to skip this task. In the screen-capture videos, users were seen clicking on various links, including links to individual research databases, header navigation links, and the library catalog.

Google Scholar Interface

When examining videos from the GS interface, even successful participants sometimes searched first on the journal title, then the article title. In the GS interface, there were only two wrong answers. One participant found the correct article, but then seemed to second-guess himself and chose the wrong answer. The other participant searched for a journal title, rather than the article title, and then made a seemingly random guess.

Three different participants timed out when using GS, with two of these occurring when there was no full text available. In GS, two additional participants skipped the task where no full text was available.

Check for Full Text Interface

For the CFFT form, participants chose the wrong answer in ten tasks. Most errors occurred when users copied and pasted data into several combinations of fields but did not fill out enough information to return article-level links. In some cases, participants gave up and chose a random answer to move on to the next task, a limitation the authors will address in the discussion section.

There were three timeouts and one skip when using the CFFT interface. All three timeouts occurred when no full text was available; looking at the videos, these participants kept trying different actions until they timed out.

System Usability Scale and Post–Study Survey Results

After completing four tasks in each interface, participants rated that interface using the SUS. Figure 3 shows a comparison of SUS scores, which have a range of 0–100, 100 being the best score; GS scored highest, followed by CFFT, then PL. The differences between PL and GS and between CFFT and GS were statistically significant; the differences between PL and CFFT were not statistically significant.

In response to the post–study survey, all participants chose GS as either their first or second preference, with sixteen participants ranking it as their first choice. Twelve of the sixteen participants who chose the GS interface as their first choice chose the CFFT interface as their second choice, with the other four choosing the PL interface. The four who ranked GS as their second choice were split between the CFFT and PL interfaces for their first choice. Figure 4 shows the full breakdown of data for this question.


DISCUSSION

This study approached known-item information-seeking from the user-oriented, task-focused approach recommended by Kuhlthau and Järvelin and Ingwersen, and it focused on the user’s effort, satisfaction, and tactics.33

The first research question was, “What difficulties do users encounter when trying to find a journal title from the JMU Libraries’ homepage?” In agreement with Letnikova’s findings at other libraries,34 the results of the “homepage study” revealed that locating a journal title was one of the most difficult tasks to perform on the JMU Libraries’ website. Terminology was a major issue because the links did not clearly differentiate where to begin searching for this task. For finding a journal, most participants either entered the journal title in the front-and-center search box, which only targets the library catalog, or clicked the journal portal link. Participants who had difficulty primarily struggled to interpret the results from both the catalog and the journal portal. This supports Marchionini’s assertion that known-item searches should produce precise result sets with little need for item comparison.35 It also corroborates Mvungi, de Jager, and Underwood’s findings, which suggest that results screens obscure whether a journal is online or in print.36 A dropdown menu within the catalog search box included the term “periodicals,” which some participants failed to equate with “journals.” Although these findings may not seem surprising in the context of other libraries’ studies, it was important to determine what the specific problems were at JMU in order to make effective changes.

The second research question was, “What difficulties do users encounter when trying to find an article by citation from the JMU Libraries’ homepage?” Finding a known article was the other one of the two most difficult tasks to perform on the library homepage. Study participants most often tried to use the journal portal by clicking on the “Periodical Locator” link. More than half eventually resorted to using the library catalog search box on the homepage, which does not support citation searching. A common problem involved entering the article title into a search box, whether using the library catalog or the journal portal. Four of the participants chose the link “Find Articles” from the library homepage, which led to a list of the libraries’ general research databases and other resources, such as PL and GS.

The final research question was, “Of the three interfaces easily available to library users, which web interface supports finding a citation effectively: the journal portal (PL), the link resolver form (CFFT), or Google Scholar (GS)?” GS was the users’ favorite tool for finding citations in the find-a-citation study, as determined by the SUS and the post–study questions. Participants were also the most successful and completed tasks more quickly when using GS. Although the link resolver was enabled in GS, most participants clicked on the article title for the full text rather than the link resolver. In GS, it is more effective to search on the article title, but several participants still searched first on the journal title when trying to find a citation. Since the study was conducted on campus, users were automatically authenticated for access to JMU resources. However, off-campus users would have to enable the link resolver in GS to find JMU full-text subscription resources. This finding points to the need for additional instruction about GS on the library website.

The CFFT interface posed some significant challenges for users, but employing the form saved many participants time. The most common problem observed in the study was the failure to complete the most important fields of the form. Participants often omitted important elements resulting in journal-level, rather than article-level, links. Some did not notice the date ranges on the link resolver results screen. Other participants clicked on journal-level or database-level links, even when an article link was available. Yet, if participants completed the CFFT form correctly and found the article link, they completed the task quickly. Participants seemed to have more confidence that full text was not available when using the CFFT form than with PL, perhaps because the form appeared to be a more robust tool with clearly labeled data entry fields.

The PL interface slowed users down. Even the task where all participants were successful took longer than a similar task in the CFFT interface. If users entered the article title instead of a journal title, they retrieved no results, which increased the time on task. When there were no results, videos showed that users were not reading the textual suggestions on the “no results” screen in PL, but instead would try to take immediate action. They would change the dropdown menus, click the browser’s back button, or select a link from the “no results” page header. Even for successful tasks, using PL to find an article involved interpretation of the results screen and navigation of an additional website (e.g., the journal publisher’s website), introducing an automatic extra step.

The observations regarding the question of which interface most effectively allows searching for articles by citation would suggest that librarians should direct users to GS as a first choice and that it should be featured most prominently for finding articles by citation. However, in addition to this study’s limitations, discussed below, there are several other important considerations. Since Google does not reveal what GS indexes, it is difficult to determine appropriateness and completeness of coverage.37 When JMU librarians were presented with the results of this study, some thought GS should be the top-recommended tool, but others had understandable concerns about using a commercial tool with unpublished policies rather than licensed vendor software. The study team encouraged JMU librarians to test GS for their disciplines and to recommend its use and instruction at their discretion.

This study revealed several additional trends in user behavior that did not fall within the purview of any of the three research questions.

First, none of the interfaces gave the user confidence that no full text was available. Figure 5 shows an example of the circuitous pathway taken by one participant that illustrates the general pattern of “trying everything” when full text was not available.

Second, contrary to anecdotal evidence, users attempted to use the dropdown menus in the journal portal (PL) and on the library website. Unfortunately, the dropdown menu options were not helpful for these tasks. For example, the users who tried to use the library homepage search box to find a journal title did not interpret the option “periodicals” as a way to find journals, and the library homepage search box contained no dropdown options that would support finding an article by citation. On the journal portal (PL), the dropdown menu includes options such as “title equals” and “title contains all words,” and although participants did sometimes try these options, this did not address the most common problems of entering the article title rather than the journal title.

Another interesting behavior observed was that users clicked on “Refine/Alter Search” when their initial attempt at using the CFFT link resolver form failed, often because they had left only one or two fields blank. At the time of the study, the form provided no guidance about which fields were most important.

Limitations of This Study

This study had several limitations, some inherent to the method and some that arose during execution. First, twelve citations are not a representative sample of the enormous pool of citations users might search for. This means the study’s findings need to be reviewed with an understanding of contextual issues (for example, the content searched by each interface). Second, although the researchers attempted to find citations of equal difficulty by ensuring PDF access was available, there is a risk that differences in the difficulty level of citations influenced the results. Participants were given different citations for each task, which meant that comparisons of performance between interfaces on the same citations were not possible. Comparing performance on specific citations would have required a larger pool of different participants and comparisons between participant groups, rather than within one participant group.

Another limitation was that some results might have been influenced by the order in which participants used the interfaces. Each participant began the session with PL, then used GS, then the CFFT form. Therefore, fatigue and learning effects could have been present over the course of each session. One specific example of how learning effects could have influenced results can be seen in participants’ performance on the tasks where no full text was available. Participants who correctly answered the task where no full text was available found the answer faster in both GS and CFFT than in PL, which was the first interface. It is also important to note that the practice question that came before the first four PL tasks used the CFFT link resolver form, so participants might have been accustomed to entering article titles because “article title” is one of that form’s fields. Although the order of interfaces may have had some influence, the things people struggled with on each interface seemed to be different, indicating that some findings were probably not driven by the order of interfaces.

Another potential limitation of the study is participants’ use of guessing. Observing Morae’s screen capture revealed that in some cases an incorrect answer meant participants genuinely believed a wrong answer; at other times, it seemed that a wrong answer meant they simply wanted to move on and guessed. In retrospect, an answer option of “don’t know” should have been offered on each multiple-choice question so it would be possible to determine whether participants really thought their answer was right or whether they were guessing.

Changes Made to the Libraries’ Website

In response to the above findings, the libraries made several changes to their website. On the basis of find-a-citation usability study results suggesting that journal portals were not the best tool for finding a known article, a new “Find Articles by Citation” webpage was created. Results from the two usability studies discussed in this article and web traffic statistics drove the decisions about what information was included on this new webpage. First, a quick link was offered to research databases for finding articles by topic, to redirect users who had misunderstood the intent of this page. The most prominent visual item on the page, however, was the link resolver form (CFFT). GS was offered next on the page as a secondary tool. A direct link to the journal portal was not included, since usability results suggested the journal portal was not effective for this task. A link to the journal portal still appeared in a dropdown menu in the page header.

On the basis of usability results, several changes to the CFFT link resolver form itself were also identified and implemented. First, the most important fields were highlighted using a red font. Additionally, small question marks that provide tips about the data required for each field were added to this page. For example, the tip for the date field shows the required date formats.

In an attempt to alleviate the confusion between the tasks “finding journals” versus “finding articles,” the articles tab on the homepage library catalog search box was revised to read “Articles and Journals,” similar to Ipri, Yunkin, and Brown.38 Also, the links on the tab itself were changed to “Find Articles by Topic,” “Find Articles by Citation,” and “Find Journal Titles (Periodical Locator).” An additional link was added from the library’s “Find Articles” page to the new “Find Articles by Citation” page. Figure 6 shows what this tab looked like before and after the changes. As shown, the old version of the “Articles” tab had numerous links and options. An analysis of the use data of these links, using Google Analytics, supported the decision to remove these options.

To clarify the task of finding journal titles, links to the library’s journal portal were modified to read “Find Journal Titles (Periodical Locator).” Also, a dropdown list option of “Journal Titles” was added to the library homepage search box that targets the journal portal rather than the library catalog. Unfortunately, because of limitations of the journal portal software, the label of the category itself cannot be changed, as recommended by Mvungi, de Jager, and Underwood.39

The researchers observed many usability issues with the journal portal software, especially on the results screen. Unfortunately, the portal software does not offer customization of the results screen, nor is there flexibility to change the system response on the basis of the user’s query. For example, a message such as “it looks like you might have entered an article title, not a journal title,” or corrections for misspellings would be helpful. However, the word “journal” was added to the dropdown menu options, to read “Journal Title Contains All Words,” since users used the dropdown options.

Other issues unearthed during the usability studies with the display of library catalog results still remain, but display customization is limited. In 2010, JMU will be implementing a discovery service and plans to include aggregator titles in the discovery service results, which may resolve some of the issues found in this study.

Effect of Changes

Web statistics gathered before and after the changes showed large differences in web traffic. Data collected with Google Analytics, which tracks users’ clicks on hyperlinks, was compared for the same two-week period during the spring 2009 and fall 2009 semesters (the changes were made during the summer of 2009). The data was normalized to adjust for a 10 percent increase in overall web traffic between spring and fall. However, adjustments could not be made for behavioral differences that might exist between a typical spring and a typical fall semester.
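As a rough sketch of the normalization described above, the snippet below removes an assumed 10 percent overall traffic increase from a fall click count before computing the percentage change. The click counts are invented, chosen only so the adjusted figure matches the roughly 150 percent increase reported below.

    def normalized_change(spring_clicks, fall_clicks, traffic_growth=0.10):
        """Percentage change from spring to fall after removing the effect
        of an assumed overall rise in site traffic."""
        adjusted_fall = fall_clicks / (1 + traffic_growth)
        return (adjusted_fall - spring_clicks) / spring_clicks * 100

    # Invented counts: 400 clicks in spring, 1,100 raw clicks in fall.
    print(f"{normalized_change(400, 1100):.0f}% increase after normalization")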

Use of the new “Articles and Journals” tab increased about 150 percent from the old “Articles” tab. In the spring, after clicking on the “Articles” tab, only 38 percent of users then clicked on a hyperlink on that tab. In the fall, 70 percent of people who clicked on the “Articles and Journals” tab then clicked on a hyperlink. This seems to indicate that the revised tab is better serving users. When looking at what type of action the user performed on the tab in the spring, 67 percent of people chose an action related to finding articles by topic, and 33 percent chose an action relating to the journal portal, PL. In the fall, 80 percent of users chose the “Find Articles by Topic” or “Find Articles by Citation” links, compared with 20 percent who chose “Find Journal Titles.” Based on reference and instruction interactions, users need to find articles by topic much more frequently than they need to find journal titles, so this change seems appropriate.

The new “Find Articles by Citation” link received only 144 clicks in two weeks, compared with 1,878 clicks on “Find Articles by Topic.” This also seems like an appropriate proportion of use based on reference transactions. However, when looking at the data about the “Find Articles by Citation” page itself, there is an indication that users are not finding what they want. Of the people who come to the page, 53 percent use the “Look Up” button to submit the link resolver form (CFFT), and only 2 percent use GS. Forty-five percent of users leave the page. Specifically, 28 percent of people go to the libraries’ homepage, and 6 percent go to the journal portal (PL) using the “Quick Links” dropdown menu. There are plans to investigate the use of this page again after a full semester’s data are available.

Future Research

The results of this study suggest future avenues for research. The research team plans to continue to analyze web traffic using Google Analytics on both the library homepage and the new “Find Articles by Citation” page to make further refinements. Another interesting question is where users begin the research process when looking for the full text of an article, and how satisfied they are with their chosen approach. Rather than start users in a particular interface, a research protocol could have them begin with no applications open, requiring them to navigate to their chosen website to begin. Another area of research could investigate differences in vendors’ journal portals and link resolver forms.

GS offered the best interaction for users in this study; however, the lack of information about what GS covers makes it difficult to recommend it as the primary choice for a typical website user. While one could investigate GS’s coverage with a research approach, the content covered changes continually.


CONCLUSION

Finding journal titles and finding articles by citation will remain challenging tasks for users. The principle of least effort, also known as Zipf’s law, suggests that individuals “will adopt a course of action that will expend the probable least average of their work.”40

Librarians can change the information architecture of their homepages to respond to this principle and to make the pathways more intuitive. They may also be able to make some customizations in third-party software, such as journal portals and link resolver forms. In addition, librarians should remain open-minded toward commercial tools, like GS, that have the potential to increase success and save time for users. Subject experts should use GS when searching within their discipline until they have an intuitive sense of whether it is a good option for users. Finally, librarians need to remember that many users do not begin their search on the library website. If the top tools offered on the library website are user-friendly and effective rather than frustrating and time-consuming, users will have a reason to begin their search there.


References and Notes
1. Galina Letnikova, “Developing a Standardized List of Questions for the Usability Testing of an Academic Library Web Site,” Journal of Web Librarianship 2, no. 2 (2008): 381–415.
2. Marshall Breeding, “Next-Generation Library Catalogs,” Library Technology Reports 43, no. 4 (July/August 2007); Susan Marcin and Peter Morris, “OPAC: The Next Generation: Placing an Encore Front End onto a SirsiDynix ILS,” Computers in Libraries 28, no. 5 (May 2008): 6–9, 62, 64.
3. Jeffrey Rubin and Dana Chisnell, Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests, 2nd ed. (Indianapolis, Ind.: Wiley, 2008); Joseph S. Dumas and Janice Redish, A Practical Guide to Usability Testing, rev. ed. (Exeter, England, and Portland, Ore.: Intellect, 1999).
4. Bob Bailey, “Users Are Not Good Designers,” www.usability.gov/pubs/032005news.html (accessed Oct. 26, 2009).
5. Nicole Campbell, Usability Assessment of Library-Related Web Sites: Methods and Case Studies (Chicago: Library and Information Technology Association, 2001); Elaina Norlin and C. M. Winters, Usability Testing for Library Web Sites: A Hands-On Guide (Chicago: ALA, 2002).
6. Letnikova, “Developing a Standardized List.”
7. Jakob Nielsen, “Why You Only Need to Test with 5 Users,” www.useit.com/alertbox/20000319.html (accessed Oct. 27, 2009).
8. Jakob Nielsen, “Quantitative Studies: How Many Users to Test?” www.useit.com/alertbox/quantitative_testing.html (accessed Oct. 19, 2009).
9. Carol C. Kuhlthau, “Inside the Search Process: Information Seeking from the User’s Perspective,” Journal of the American Society for Information Science 42, no. 5 (1991): 361–71.
10. Kalervo Järvelin and Peter Ingwersen, “Information Seeking Research Needs Extension Toward Tasks and Technology,” Information Research 10, no. 1 (October 2004), http://informationr.net/ir/10-1/paper212.html (accessed June 16, 2010).
11. Frederick G. Kilgour, “Effectiveness of Surname-Title-Words Searches by Scholars,” Journal of the American Society for Information Science 46, no. 2 (1995): 146–51; Frederick G. Kilgour and Barbara B. Moran, “Retrieval Effectiveness of Surname-Title-Word Searches for Known Items by Academic Library Users,” Journal of the American Society for Information Science 50, no. 3 (1999): 265–70; Min-Yen Kan and Danny C. C. Poo, “Detecting and Supporting Known Item Queries in Online Public Access Catalogs,” JCDL 2005, June 7–11, 2005, Denver, Colo., 91–99.
12. Gary Marchionini, “Exploratory Search: From Finding to Understanding,” Communications of the ACM 49, no. 4 (Apr. 2006): 42.
13. Pascal Lupien and Lorna Evelyn Rourke, “Out of the Question!… How We Are Using Our Students’ Virtual Reference Questions to Add a Personal Touch to a Virtual World,” Evidence Based Library and Information Practice 2, no. 2 (2007): 67–80; Barbara J. Cockrell and Elaine Anderson Jayne, “How Do I Find an Article? Insights from a Web Usability Study,” Journal of Academic Librarianship 28, no. 3 (May 2002): 122–32.
14. Burton Callicott and Debbie Vaughn, “Google Scholar vs. Library Scholar: Testing the Performance of Schoogle,” Internet Reference Services Quarterly 10, no. 3 (2006): 71–88; Brenda Battleson, Austin Booth, and Jane Weintrop, “Usability Testing of an Academic Library Web Site: A Case Study,” Journal of Academic Librarianship 27, no. 3 (2001): 188–98; Susan H. Mvungi, Karin de Jager, and Peter G. Underwood, “An Evaluation of the Information Architecture of the UCT Library Web Site,” South African Journal of Library & Information Science 74, no. 2 (2008): 171–82; Michael Whang and Donna M. Ring, “A Student-Focused Usability Study of the Western Michigan University Libraries Home Page,” Journal of Web Librarianship 1, no. 3 (2007): 67–88; Tom Ipri, Michael Yunkin, and Jeanne M. Brown, “Usability as a Method for Assessing Discovery,” Information Technology & Libraries 28, no. 4 (2009): 181–83.
15. Letnikova, “Developing a Standardized List.”
16. Ipri, Yunkin, and Brown, “Usability as a Method.”
17. Mvungi, de Jager, and Underwood, “An Evaluation of the Information.”
18. Whang and Ring, “A Student-Focused Usability Study.”
19. Marie T. Ascher, Haldor Lougee-Heimer, and Diana J. Cunningham, “Approaching Usability: A Study of an Academic Health Sciences Library Web Site,” Medical Reference Services Quarterly 26, no. 2 (2007): 37–53.
20. Cockrell and Jayne, “How Do I Find an Article?”
21. John Kupersmith, “Library Terms Evaluated in Usability Tests and Other Studies,” www.jkup.net/terms-studies.html (accessed Dec. 9, 2009).
22. Nina McHale, “Toward a User-Centered Academic Library Home Page,” Journal of Web Librarianship 2, no. 2 (2008): 139–76.
23. Terry W. Brandsma, Elizabeth R. Bernhardt, and Dana M. Sally, “Journal Finder, a Second Look: Implications for Serials Access in Today’s Library,” Serials Review 29, no. 4 (2003): 287–95.
24. Beth Ellington, “The Usability of the Journal Finder Interface,” Journal of Web Librarianship 2, no. 2 (2008): 307–37.
25. Shobana Jayaraman and Karen Harker, “Evaluating the Quality of a Link Resolver,” Journal of Electronic Resources in Medical Libraries 6, no. 2 (2009): 152–62.
26. Callicott and Vaughn, “Google Scholar vs. Library Scholar,” 71–88.
27. Hur-Li Lee, “Information Structures and Undergraduate Students,” Journal of Academic Librarianship 34, no. 3 (2008): 211–19.
28. Rebecca Donlan and Rachel Cooke, “Running with the Devil: Accessing Library-Licensed Full Text Holdings through Google Scholar,” Internet Reference Services Quarterly 10, no. 3/4 (2005): 149–57.
29. Ibid.
30. John Brooke, “SUS: A ‘Quick and Dirty’ Usability Scale,” in Usability Evaluation in Industry, ed. P. W. Jordan et al. (London: Taylor and Francis, 1996), www.usabilitynet.org/trump/documents/Suschapt.doc (accessed Oct. 27, 2009).
31. Nielsen, “Quantitative Studies: How Many Users to Test?”
32. Christopher D. Manning, Prabhakar Raghavan, and Hinrich Schütze, An Introduction to Information Retrieval (Cambridge: Cambridge Univ. Pr., 2008).
33. Kuhlthau, “Inside the Search Process”; Järvelin and Ingwersen, “Information Seeking Research.”
34. Letnikova, “Developing a Standardized List.”
35. Marchionini, “Exploratory Search.”
36. Mvungi, de Jager, and Underwood, “An Evaluation of the Information.”
37. Chris Neuhaus, Ellen Neuhaus, and Alan Asher, “Google Scholar Goes to School: The Presence of Google Scholar on College and University Web Sites,” Journal of Academic Librarianship 34, no. 1 (2008): 39–51.
38. Ipri, Yunkin, and Brown, “Usability as a Method.”
39. Mvungi, de Jager, and Underwood, “An Evaluation of the Information.”
40. Donald O. Case, “Principle of Least Effort,” in Theories of Information Behavior, ed. Karen E. Fisher, Sanda Erdelez, and Lynne McKechnie, 289–92 (Medford, N.J.: Information Today, 2005).

Figures

Figure 1. Test Results on All Three Interfaces
Figure 2. Results for Tasks Where Full Text Was Not Available
Figure 3. SUS Scores for All Three Interfaces
Figure 4. Participants’ Tool Preferences
Figure 5. Circuitous Path Taken by a Participant Searching for Full Text
Figure 6. Changes to the “Articles” Tab on JMU Libraries Homepage

