Making a Case for User Experience Research to Drive Technical Services Priorities
Emma Cross (emma.cross@carleton.ca) is a Cataloguing and Metadata Librarian, Cataloguing, Metadata, and Digitization Department, Carleton University, Ottawa, Ontario. Shelley Gullikson (shelley.gullikson@carleton.ca) is a Systems Librarian, Library Technology Services, Carleton University, Ottawa, Ontario.
Manuscript submitted July 30, 2019; returned to authors for revision October 7, 2019; revised manuscript submitted December 3, 2019; returned to authors for additional revision February 4, 2020; revised manuscript submitted February 19, 2020; accepted for publication February 23, 2020.
The authors wish to thank the students who participated in the study and provided so much useful information. Thanks also to Amber Lannon, Associate University Librarian, Academic Services, and Erika Banski, Head, Cataloguing, Metadata and Digitization Department, Carleton University Library, for their support and ongoing interest in this study. Preliminary results from this study were presented at the Access 2017 conference in Saskatoon, Saskatchewan, Canada.1
This paper takes a technical services perspective on user experience (UX) research into student searching behaviors. In this observational study, students were free to search as they normally would while conducting research for an upcoming essay or assignment. Researchers took careful note of the search process, including how searches were composed and which metadata fields students looked at in their results lists. The findings of the study, and how local technical services staff responded to them, are discussed in this paper. The project was a useful way to prioritize the work of technical services based on insights from user searching behavior and to help ensure library resources are discoverable in the most effective manner.
User experience (UX) can be difficult to pin down in a single definition, or as Buley notes, it is “a famously messy thing to describe.”2 UX describes the overall experience and emotions of users as they engage in a service, product, or space. The UX of the circulation desk would include how easy it is to find, whether there is a line, the friendliness of the staff, the user’s physical comfort while being served, whether staff can meet the user’s need, etc. UX research describes the work done to understand the user and their experience. UX design describes the work done to create a good user experience and is generally undertaken in conjunction with UX research to iterate improvements. Any or all of these things—the experience, the research, the design—can be referred to with the shorthand of “UX.”
UX was popularized as a concept in libraries in 2010. Its focus on users makes it attractive to public services staff, who use UX techniques to improve spaces, services, and the overall user experience. The roots of UX in usability and human-computer interaction make it a natural fit for systems staff in libraries, who use UX techniques to improve interfaces and task flows in digital library spaces and services. Technical services staff, however, despite their long history of conducting user research into bibliographic records and search behavior, have been largely absent from the emergence of UX in libraries over the past decade.
This paper aims to fill a gap in the literature by taking a technical services perspective on UX research into student search behaviors. This paper shows how Carleton University Library’s technical services department collected and used observational data to improve the search experience for their students. This UX study, and how the technical services department responded to the resulting data, could be used as a model for other technical services departments to study their own students and respond to user needs.
In this paper, the authors discuss a UX research project in which ten undergraduate and ten graduate students were observed as they conducted research for essays and assignments. The researchers analyzed the observation data to determine how the user experience was—or could be—affected by technical services work. The results were presented at a technical services staff meeting for discussion. The outcomes of this study have been extremely helpful for adjusting technical services workflow in response to user behavior, needs, and expectations at the Carleton University Library.
Carleton University is a comprehensive university located in Ottawa, Canada. In 2018–19, the student population was just over twenty-seven thousand undergraduate students and four thousand graduate students, and the library had an annual acquisitions budget of approximately $7 million. The library has a collection of over 1,000,000 print monographs, 872,000 electronic books (e-books), and 78,000 electronic journals.
Literature Review
As alluded to above, the literature has no standard definition of UX. In 2010 there was a wave of papers describing the concept of UX to a library audience, and in one of the first, Walker said in part, “the study of user experience helps those providing library services understand how our patrons use the services we offer, and how they integrate them into their daily lives.”3 She went on to explain that “user experience design seeks to understand and assess users’ actual behavior and performance, rather than their opinions and attitudes.”4 This focus on understanding actual user behavior in the context of their lives is key to UX.
There is very little literature on UX in libraries from the technical services perspective. Much has been written about the UX of discovery platforms, library catalogs, and other elements of library search.5 However, this research has been largely undertaken by public services or systems staff, with the analysis and conclusions geared toward instruction, reference work, or interface design. One exception is Walsh, who observed three graduate students and two faculty members searching for monographic series in a library catalog, which led to UX recommendations related to both cataloging practice and interface design.6 Walsh’s 2012 paper may stand alone as an example of UX research with a technical services perspective, but technical services staff were conducting research into users’ search behavior years before UX appeared in the library literature.
Yee’s 1991 review of research on the user interfaces in OPACs covered user studies related to various issues with the OPAC interface. For each issue she provided both “record design solutions” (i.e., recommendations for catalogers) and “system design solutions” (i.e., recommendations for OPAC designers). Yee asserted that questions about the design of OPAC interfaces “should be answered based on research into user needs, and based on dialogue between the record designers and system designers who together create the user interfaces for our online public access catalogs.”7 Yee’s vision did not become reality; the user research literature on OPAC design in the 2000s is, with a few exceptions, the domain of system designers and public services staff.
There was a small surge of cataloging-related user research as the Functional Requirements for Bibliographic Records (FRBR) were introduced. Pisanski and Žumer conducted user research on the bibliographic model behind FRBR, but their aim was to see whether the structure of FRBR made sense within the mental models of users, not to make suggestions related to cataloging work.8 The eXtensible Catalog project aimed to build a better (and FRBR-based) catalog and, to this end, employed interviews with eighty students and faculty members across the four participating universities to better understand user needs related to resource discovery.9 Zhang and Salaba also took a user research approach to FRBR, but—like so much of the user research into online catalogs—they were primarily interested in how users interacted with the interface of a FRBR-based catalog, rather than the records within it.10
In 2015, Wilson discussed the possible relevance of ethnographic research methods to catalogers who want to better understand user behavior.11 She examined how ethnographic methods such as observation and interviews could provide a richer picture of user behavior than other qualitative methods commonly used in libraries. In particular, she critiqued what she called the “think aloud” method, where users complete assigned tasks while verbalizing their thoughts. She found it wanting: “While this method can reveal how a user would undertake a contrived task, it does not reveal what features a user would wish or need to exploit if left to their own devices, or what sort of queries they would typically bring to the catalog of their own accord. It does not therefore construct a picture of the actual uses to which the catalog might be subject in the real world.”12 While certainly participants can “think aloud” while completing any task—assigned or self-directed—Wilson’s underlying frustration with basing research on “contrived” tasks was not new.
Markey reviewed twenty-five years of research into end-user searching, looking solely at research based on transaction logs to capture only “user-initiated searches in which no observers were present.”13 Based on this review, she made recommendations to improve search effectiveness by helping users access controlled vocabulary, as well as recommendations for future directions in user research. One of her recommendations was: “let us avoid research protocols that assign tasks to end users. As much as possible, researchers should design experiments that capture what end users really do, not what researchers want or expect them to do.”14
Markey was likely reacting to the majority of early user research into searching, which assigned participants specific tasks. Only a few studies allowed participants to search as they normally would. In 1990, Charles and Clark asked users who had just completed a search using a CD-ROM database to replicate their search strategy in an online database and then observed those searches.15 In their 1998 study, Twidale and Nichols clearly state that “volunteers undertook authentic activities, bringing along a search task that they had to undertake anyway.”16 Komlodi observed eight attorneys searching “for a topic of their choice” in 2004.17 Anderson conducted a longitudinal ethnographic study of research practice in 2005 that included observation of searching behaviors, though searching was not the study’s primary driver.18 These examples are the exceptions rather than the rule; most researchers have observed users performing assigned search tasks rather than searching as they naturally would.
After the landmark ethnographic study of students at the University of Rochester in 2007, ethnographic methods became more popular in library research.19 One might assume there would be an increasing appetite for observing users searching how they would normally search, but the literature does not bear this out; again, it is difficult to find more than a few examples. As part of the Ethnographic Research in Illinois Academic Libraries (ERIAL) Project in Illinois in 2012, researchers observed students searching for sources they needed for their coursework.20 More recently, Leeder and Shah’s 2016 study asked students to collaboratively search for sources on their research topic and “[t]he goal of this task was to capture participants’ authentic behavior in an exploratory search condition.”21 Most current studies of user search behavior in libraries continue to use a task-based methodology rather than, as Markey suggested, “experiments that capture what end users really do.”22
Observation studies that capture natural user behavior are much more common in the physical library. One example is “‘Sweeping the Library’: Mapping the Social Activity Space of the Public Library” by Given and Leckie, who studied how patrons used the space in two large public libraries.23 They described the observational method as particularly applicable in situations where “observed behaviors may not match what individuals say that they do on a written or oral survey and therefore might be able to provide concrete evidence to support a particular library design or certain types of policy decisions.”24 Given and Leckie note that it is important to be mindful that observational studies provide “an insightful glimpse into ‘what’ is happening in libraries” but require additional methods to capture the “why” of patron behavior.25
To address the question of “why,” the authors’ study also incorporates the idea of “emplacement, the interrelationship of body, mind, and place” recently described by Polkinghorne, Given, and Carlson.26 In their paper “Interviews that Attend to Emplacement: The ‘Walk-Through’ Method,” the authors examine the limitations of the traditional sit-down interview for collecting data on user behavior, which underreports the role of place in people’s experiences. Their study of undergraduate use of library space incorporated both a traditional sit-down interview and a “walk-through” interview where participants led a researcher around the library spaces they had described in the sit-down interview. “During the walk-through interviews, participants clarified details they provided in the sit-down interview, they recalled new details that they had not mentioned previously, and in some cases, they raised entirely new topics beyond those first explored in the sit-down interview.”27 Polkinghorne, Given, and Carlson conclude by stating that the walk-through interview “elicits greater detail because participants are powerfully prompted by perceiving and moving in a place.”28 Thus, the authors’ own study could be considered a form of “digital emplacement,” with students taking researchers on a “click-through” interview that provides a great deal of detailed information about how they experience the online environment (or online “place”) as they work on research for an assignment.
Data Collection and Analysis
The research project aimed to understand and possibly improve the user experience of search by observing students conducting research for an upcoming assignment. At the time of the study, Carleton University Library had both an online library catalog (Innovative Interfaces’ Millennium) and a discovery layer (ProQuest’s Summon). Summon was most visible on the library’s website, with a search box on the main page, but links to the catalog were available nearby. The library website also provided a list of databases—subscription and otherwise—and various library guides. Students in the study were free to do their research in any way they liked and were not limited to using the library website.
The authors recruited undergraduate students and graduate students via the library website, library Twitter account, and emails to student academic societies, the Graduate Student Association, and members of the Student Library Advisory Committee. Ten undergraduate and ten graduate students volunteered to participate, and the twenty individual sessions were held between February and March 2017. As is common for this type of study, each student was given a gift card at the start of their session in appreciation for their time. They could keep the gift card—a $10 Starbucks card they could use at the library’s café—regardless of whether they continued with the study. All twenty students completed their sessions.
Each session was held in a private room in the library, equipped with a desktop computer and a small table. Students could choose to use the room’s computer or their own laptop. One of the authors moderated the session, while the other observed and took notes. Students were told that the authors wanted to observe them searching for information they needed for an upcoming research assignment, and that they should do what they would normally do, not what they thought they should do. They were specifically told that even though the authors were from the library—and the session was held in the library—they should not feel obliged to use library resources if they would not normally do so. They were assured that there was no right or wrong way to do anything during the session (see appendix for the session script).
Students were asked for consent to record their screen while they were searching. Although all gave their consent, continuing with the session was not contingent on consent for recording. After signing consent forms, they were asked what year they were in and their major or field of study. They were also asked for a short description of what they would be working on during the session and what they hoped to find.
Finally, the students were asked to think aloud as they worked. They were asked to mention some specific things: what they were looking for, whether they found something helpful, whether anything confused them, and whether something did not work the way they expected. They were also asked to explain any decisions they were making—a decision to look at something, to ignore something, to change their strategy, to continue, or to give up. If they remained quiet, students were prompted with these topics or asked neutral questions such as “What are you looking at now?” or “Is that what you expected?”
Students were told that the sessions would last thirty minutes, and were notified when twenty-five minutes had passed and given the option to continue or stop. In some cases, it was clear before the twenty-five-minute mark that the students were satisfied with the number of resources they had found and were finished with the searching stage of their process. In these cases, sessions ended at this point. Overall, sessions ranged from ten to forty minutes, but most were between twenty-five and thirty minutes.
Once each session was over and the student had left the room, the authors discussed what had been observed and made additions to the observation notes. When all the sessions were completed, one of the authors watched the video captures, noting the stated topic, the search terms used, which tool was searched (Summon, Google Scholar, etc.), and any filters or facets used. It was also noted how many search results each student scanned, how many results were examined more closely, and how many were saved. The students’ stated reasons for changing their search terms or search strategy were captured as well. The other author analyzed and coded her notes, identifying themes most relevant to technical services. Session notes were carefully reviewed to identify frequently occurring keywords and concepts. Related comments and actions were grouped together using different colored highlighters to create key themes.
Findings
Based on the detailed written transcripts and the video, five main themes emerged. These are not novel themes, having been discussed elsewhere in the library literature, but they were the most striking and the most relevant to technical services staff.
Overwhelming Use of the Single Search Box
Students in the study showed a strong preference for the single search box provided by the library’s discovery layer, Summon. Of the twenty participants, fifteen used Summon, twelve used Google Scholar, and two used the classic library catalog. Nine participants used both Summon and Google Scholar, six used Summon but not Google Scholar, and three used Google Scholar but not Summon. In comparison, only seven participants used subject-specific databases—four undergraduates and three graduate students. In terms of total searches during the twenty sessions, participants completed seventy-eight searches in Summon, thirty-four in Google Scholar, and nine in the library catalog. The two students who used the library catalog were both undergraduates. Aside from catalog use, there were no striking differences between graduate and undergraduate students in which of these tools they used.
The Get it! Button
Carleton University has a Get it! button that students click to access full text through the link resolver. A logical corollary to the overwhelming use of unified search platforms is the popularity of the Get it! button with students. One graduate student said, “I love the Get it! link—it makes my life much easier.” Another noted, “Get it! is really useful.” Four of the twenty students in the study expressed genuine enthusiasm about the Get it! button. Even students who did not mention or recognize the Get it! button used it seamlessly to access full text. Every student appeared to be clear about what the button does: click on it to access full text content.
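For readers less familiar with the machinery behind a button like Get it!, the sketch below shows in rough terms how a citation is packaged into an OpenURL query for a link resolver, which then checks the library’s knowledge base for licensed full text. This is a minimal illustration, not Carleton’s actual implementation: the resolver base URL and the citation values are hypothetical, though the key-value pairs follow the standard OpenURL 1.0 (ANSI/NISO Z39.88-2004) journal format.

```python
from urllib.parse import urlencode

# Hypothetical base URL for an institutional link resolver.
RESOLVER_BASE = "https://resolver.example.edu/openurl"

def build_openurl(citation: dict) -> str:
    """Package article metadata as an OpenURL 1.0 query string.

    A link resolver compares these citation details against the
    library's knowledge base to decide whether licensed full text
    exists and, if so, which platform to send the user to.
    """
    params = {
        "url_ver": "Z39.88-2004",                       # OpenURL version
        "rft_val_fmt": "info:ofi/fmt:kev:mtx:journal",  # journal article format
        "rft.atitle": citation["title"],                # article title
        "rft.jtitle": citation["journal"],              # journal title
        "rft.volume": citation["volume"],
        "rft.spage": citation["start_page"],
        "rft.date": citation["year"],
        "rft.issn": citation["issn"],
    }
    return RESOLVER_BASE + "?" + urlencode(params)

# The metadata behind a single hypothetical "Get it!" click:
print(build_openurl({
    "title": "A User Experience Primer",
    "journal": "Feliciter",
    "volume": "56",
    "start_page": "195",
    "year": "2010",
    "issn": "0000-0000",  # placeholder, not the journal's real ISSN
}))
```

When the knowledge base entry for the cited journal is accurate, the resolver can send the user straight to full text, which is why the knowledge base work discussed later in this paper matters so much to the Get it! experience.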
Students in the study also recognized that they could access Carleton Library content in Google Scholar. Indeed, there was a moment of joy when a student made this realization during a session. “Hey, look! Carleton offers to ‘get this’ in Google Scholar. Hey! That is great!” an undergraduate exclaimed.
Most Frequently Cited Metadata Fields
After careful analysis of the transcripts, the same pattern appeared repeatedly, with participants: 1) rapidly scanning the search results list; 2) quickly reviewing the titles for relevant keywords; 3) checking the date (the majority of students were not interested in older material); 4) if the title and date sparked interest, clicking on the record to read the abstract; and 5) if title, date, and abstract met the searcher’s criteria, downloading or saving the item for further reading. This pattern was completed at high speed (see “Speed, Impatience, and Ease of Access” below). For example, an undergraduate told us, “I scan the list and look at titles,” while a graduate student said, “I’m looking at title and dates. The earliest in the list are the most recent. I also read the abstract.”
One puzzling observation was that students looked for an abstract even for books. With unified search platforms intermingling a large number of journal articles with a small number of books, students appear to be conditioned to expect an abstract for every resource; when a record, even a monograph record, had no abstract, they moved on. A summary or table of contents in monograph records appeared to be useful and was briefly mentioned by a few students. Given this, it could be helpful for library instruction sessions to prompt students to look at subject headings when summaries or tables of contents are not provided. Students did not seem to realize that if they scrolled down slightly, they would find the subject headings showing what a book is about. Only two students mentioned subject headings.
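To make the technical services connection concrete, the sketch below maps the display elements students scanned onto the MARC fields that typically supply them. This is a simplified, hypothetical record (plain Python values rather than true MARC), but the field tags are standard: 245 title, 264 publication date, 505 contents note, 520 summary, 650 subject headings.

```python
# Hypothetical monograph record reduced to a {MARC tag: value} mapping.
# The tags are standard MARC; the content is invented for illustration.
record = {
    "245": "Studying how undergraduates search",                      # title statement
    "264": "2017",                                                    # publication date
    "505": "Introduction -- Methods -- Findings -- Conclusion",       # formatted contents note
    "520": "An observational study of student search behavior.",      # summary note
    "650": ["College students -- Research", "Information retrieval"], # subject headings
}

def offers_abstract_like_text(rec: dict) -> bool:
    """Students treated a record as informative only when it showed
    abstract-like text. For a monograph that means a 520 summary or a
    505 contents note; without these, the 650 subject headings are the
    best remaining clue to what the book is about."""
    return "520" in rec or "505" in rec

print(offers_abstract_like_text(record))  # True: this record has both notes
```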
Overwhelming Popularity of Keyword Searching
Almost all the searches in the twenty sessions were keyword searches. Out of a total of 121 searches, only two were subject searches and two were searches by author. Students typically started with general keyword terms and then refined their searches. There appeared to be little deliberation about what terms to use. Indeed, it was common for students to work at speed with no pause to reflect on their keywords, even when they stated that a search was not producing the expected results.
Thus, the choice of keywords is where students could use more assistance. A great deal rides on keywords: a poor choice can mean missing relevant resources and wasting time. Because most participants looked solely at titles to determine whether a record was relevant, keyword choice was even more crucial. Some students chose keywords that did not seem to match their stated topics, such as the undergraduate who was looking for “ethics and privacy concerns related to digitization in libraries” but used the search string “privacy and open access.” Synonyms and related words occasionally proved problematic; an undergraduate who said they were looking for information on “when girls start ballet” searched for “ballet and girlhood” and found nothing relevant in the results list. In addition, students frequently mistyped and misspelled words, likely because they were working very quickly.
Speed, Impatience, and Ease of Access
As mentioned earlier, a majority of students in the study worked at high speed. In fact, on many occasions it was difficult to keep track of what the students were doing while making written notes. A number of students mentioned they would work faster on their own laptop because they were familiar with how it was laid out and configured. Students quickly skimmed the list of search results and rarely went beyond the first page (or first screen), averaging fewer than seven results examined per search. Undergraduates looked at fewer results than graduate students (not quite six versus almost ten results examined per search). Apart from this, the only striking difference in search behavior between graduate and undergraduate students was that the library catalog was searched by two undergraduates but no graduate students.
Other Findings
In addition to these five main themes relevant to technical services, there were other interesting observations about students’ searching behavior. The authors observed that students skipped over materials that required more effort to obtain, including books on course reserve, books in the storage facility, slow-loading documents, and books out on loan. Surprisingly, no students in the study skipped over a resource just because it was not available online; however, students clearly expressed a preference for online resources for ease of access. They mentioned working off campus and being unable or unwilling to visit the library. One undergraduate, upon finding a print book in the results list, said, “This is useful if I can find it. It is not online so I will have to search the library itself. This makes me cry a little.”
Students had no qualms about clearly stating they were busy and did not wish to waste time. They wanted and needed research to be as quick and easy as possible. “My sister is a grad student and showed me some quick ways to do things. She told me not to waste time,” said one undergraduate. Similarly, students expressed frustration when research took too long or they got stuck and could not find relevant resources. After waiting not quite a minute for search results to load in Summon, a graduate student remarked, “Hmmm, usually the library is kind of fast. I don’t have the patience to wait so I go to Google,” and then immediately repeated the search in Google Scholar.
Almost half the students (seven, both undergraduate and graduate) apologized at the end of their session for not doing “proper library research.” Students apologized even when they had completed competent online searches. Furthermore, most students were not dissatisfied with the searches they completed during the sessions; indeed, five of the students explicitly stated they were happy with the work they had done. One student did both, expressing happiness with their searches and then apologizing immediately afterwards. This kind of apology may have arisen from having two librarians observe and take notes on their search behavior. However, if students are apologizing but are not actually unhappy, this could point to a mismatch between how they actually search and what they perceive as “proper library research.” Students who complete searches with Google Scholar rather than the library’s own search tools may think what they are doing is not “proper library research.” As many of the students referred to Google Scholar as simply “Google,” perhaps they have been told in the past that Google should not be used for academic research. This could be an interesting area for further research.
Students mentioned getting information about research from peers and family (four students) and faculty (four students). The use of peer networks especially came up in relation to Google Scholar. One student told us, “Some people have changed their computer so they can access library material via Google Scholar.” Another said, “A friend told me about Google Scholar—the library is not teaching this.”
Finally, for many students in the study, the research process was not a linear one of searching, selecting, reading, then writing; searching, selecting, reading, and writing were blended. Some students created a document during the session which included citations, notes, and preliminary outlines and thoughts. When emailing this document to themselves at the end of the session, the students often remarked that they were happy with the work they had completed during the session.
Using UX Study Results
After preliminary analysis of the results, findings were presented to the library’s technical services staff, followed by a moderated discussion. The presentation and discussion lasted eighty minutes in total. A written transcript was made of the question-and-answer session to assist with the analysis of the research data. The staff were asked to provide comments, observations, interpretations, opinions, and ideas on the information presented. They were also asked if there was anything they found surprising. The discussion was lively and staff appeared engaged.
The link resolver and the knowledge base generated the most discussion. Staff said that learning more about how students actually do research was valuable. For example, “We can put staff time and energy into making sure [link resolver] works,” and “this validates where we need to spend time. We can call out vendors where there is a consistent problem. I can be pushy to get Summon issues resolved. If that is what students are relying on, then we have to make sure what we have is right.”
The presentation sparked an interesting discussion about what resources are indexed in Summon. One staff member wondered whether all library databases were included in a Summon search, because if databases were missing, those resources might not be used by many students. It was decided to review the database content covered in Summon and try to get more included. When staff saw how much Summon was being used, there was general agreement that it was worth taking the time to carefully review the Summon documentation to see exactly how and from where Summon obtains information. After hearing the findings of this research project, staff stated they felt more confident in deciding what should be at the top of the “to do” list, even for time-consuming projects. For staff interested in additional information, the authors provided a citation for Wilson’s “The Knowledge Base at the Center of the Universe.”29
Technical services staff were clearly disappointed that the library catalog was used so little and that students overwhelmingly searched by keyword rather than by subject heading or name. There was a very interesting discussion on these points, especially for catalogers. Staff noted that keyword searching is problematic because titles cannot reliably reflect the content of books, and they recommended that students be taught how to search for books using subject headings. We explained that students were generally pleased with their searches and that no students expressed a need or a desire to learn about searching with subject headings. Then a senior staff member stated, “I’m not buying into this discussion that keyword searching is a bad search. Remember that keyword also searches subject. Indexing is the most important part of this. If you search something using keywords then it is still a good result.” The tone of the meeting shifted slightly after this comment.
A cataloger suggested that, to help students, copy cataloging could favor monograph records that contain tables of contents and summaries when such records are readily available. There was a realization that this would also strengthen keyword search results. Finally, seeing that relatively few students search directly in the library catalog, catalogers gained an understanding that MARC records are now mostly accessed and displayed through the discovery layer.
During the meeting, staff said the student search behavior was familiar; many people search the internet this way, so why would students search for resources any differently? Also familiar was that students skipped over material that was more difficult to obtain, with one staff member noting that she did this when she was a student. Staff generally felt that students were not being lazy, but rather that they were busy and had to use their time efficiently. This prompted a discussion about being more careful about what materials were put in library storage, as they would be far less likely to be used given the time required for retrieval. Broken links and the library’s e-resource troubleshooting form were also discussed. Staff suspected that only a small number of broken links are ever reported and recommended that the library make it as easy as possible to report broken links and access issues, since reported problems were likely the “tip of the iceberg.” A note was made to investigate whether and how a link to the library’s e-resource troubleshooting form could appear in Summon search results.
This study began as a grassroots initiative of the library’s technical services department to set practical student-centered priorities for workflow to complement the department’s more general priorities. The study provided useful information for staff about how students look for information for essays and assignments and why adjustments in priorities are necessary. The concrete steps outlined below are based on staff discussions and relate to technical services functions. This could be helpful to managers in technical services at other institutions wishing to develop a student-centered approach to library service.
Concrete steps taken as a result of this study, in priority order:
- Prioritized ongoing work to keep the library’s knowledge base up to date. This includes: checking that metadata for all packages and titles owned by the library are included in the knowledge base; keeping up to date with titles added to and dropped from packages and making sure this information is updated in the knowledge base; and reporting errors and omissions to the knowledge base vendor.
- Confirmed that the library’s e-resource troubleshooting form is easily accessible in discovery layer search results.
- Reviewed relevant documentation from the vendor about the discovery layer to maximize access to library resources.
- Reviewed keyword indexing in the discovery layer and shared this information with staff to enhance their understanding of how the discovery layer works.
- Investigated how MARC contents and summary notes in monograph records appear in searches via the discovery layer, and began preferring MARC records with these fields when they are readily available (see the sketch following this list).
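To illustrate why that last step matters, here is a minimal sketch, assuming the discovery layer’s keyword index covers the title, contents, and summary fields. The MARC tags are standard, but the sample records and query are hypothetical; the point is simply that a record carrying 505/520 notes matches keyword queries that a title-only record misses.

```python
import re

def keywords(rec: dict) -> set:
    """Naive keyword extraction over the fields assumed to feed the
    discovery layer's keyword index: 245 title, 505 contents, 520 summary."""
    text = " ".join(rec.get(tag, "") for tag in ("245", "505", "520"))
    return set(re.findall(r"[a-z]+", text.lower()))

# Two hypothetical records for the same book: one title-only, one with
# contents and summary notes added during copy cataloging.
sparse = {"245": "Urban water systems"}
rich = {
    "245": "Urban water systems",
    "505": "Stormwater runoff -- Wastewater treatment -- Drinking water",
    "520": "Surveys municipal water infrastructure and its maintenance.",
}

query = {"wastewater", "infrastructure"}
print(query <= keywords(sparse))  # False: the title alone misses both terms
print(query <= keywords(rich))    # True: the notes fields supply the matches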
In addition to these concrete actions, technical services staff said they felt more confident in deciding what tasks should take priority. A technical services supervisor explained, “Now I can attack the right problems with purpose. We can put staff time where it is relevant.” This is a very positive outcome for this UX research project.
Discussion
The idea of convenience as a key factor in the research process is not new.30 However, it does appear that ease of access in an online environment and the abundance of information change searching behavior. Users’ expectations and perceptions about the availability of information result in a general tendency not to follow through: students need not make a sustained effort to find any particular resource if they can easily find something else just as good. Students in this study demonstrated limited knowledge of library resources beyond using Summon and Google Scholar to quickly access full text content. Individual databases and subject guides created by subject librarians were rarely mentioned. In a few instances, students tried to find a library subject guide because it had been mentioned in a library instruction session but were unable to locate it. While this raises a number of issues for front-facing library services, particularly instruction, it clearly indicates the centrality of technical services work in helping students find and access library resources.
The participants in this study demonstrated an overwhelming preference for searching Summon and Google Scholar. How does this affect technical services operations? It appears that the library catalog may no longer play a central role in student research and the gradual shift in technical services work is starting to be discussed in the library literature. Wilson discusses the evolution in technical services operations, outlining the development of knowledge bases: “initially created as a byproduct of OpenURL link resolvers and A-to-Z lists, they have evolved into useful tools in their own right,” also supporting unified search platforms and e-resource management in key areas such as licensing, usage statistics, and resource sharing.31 Wilson concludes, “It’s safe to say that the knowledge base has truly become the center of the management universe for academic and research libraries.”32 The results of this study indicate the shift from library catalog to unified search platforms is also well underway among students.
Meeting with technical services staff to discuss our results was a productive way to engage and orient staff to changes in student research patterns. Rather than listening to a presentation based on the library literature, staff heard how the students at their own library are searching. It was rewarding to see movement in the attitudes and opinions of staff. While a few people offered a “knee-jerk” response, falling back on traditional approaches, most staff appeared to listen with an open mind, perhaps because they recognized some of the search behaviors being described.
Almost two years have now passed since the data were collected, and hindsight offers a long-term perspective on the value of UX research for technical services operations. After the meeting with technical services staff to discuss the UX data, there was an initial flurry of enthusiasm and clarity about which projects and tasks deserved higher priority because they directly help students. However, technical services staff work in very busy departments, juggling multiple projects with competing priorities, ongoing technological change, and staff turnover. The results of the UX study, especially the central importance of the knowledge base, continue to influence the priorities of the department. The extent to which students use the discovery layer and rely on the knowledge base and the link resolver to access library content made a lasting impression on staff. However, a one-off discussion of user experience is not sufficient; even with the best of intentions, a clear focus on user-centered priorities can fade over time in a busy workplace. Thus, it would be helpful to hold a regular technical services UX discussion, perhaps annually or biannually, to maintain focus on user needs and address ongoing changes in technology. Repeating the study every year would be too labor intensive, but an ongoing commitment to UX research and updates would be beneficial and should be added to the library’s strategic plan. Indeed, the Carleton University Library migrated to the Alma library services platform as part of a consortium in January 2020, so it is clearly time for a follow-up study of search behavior in this new environment.
Limitations
This was a small-scale study with twenty participants at a single academic library, which suited the objectives of this research project. The point of the study was not to make broad generalizations about how students do research but to provide insight into the user experience of research at Carleton University and how it could be improved. Using the data, technical services staff have been able to refocus and realign priorities based on UX research.
Conclusion
Recent trends and changes in library technical services have resulted in an environment where staff no longer work with a single library catalog but are adding metadata in a variety of formats to a growing number of databases. These databases may include the knowledge base, classic catalog, institutional repository, course reserve software, and data repositories such as Dataverse. To direct effort where it is most useful, staff in technical services require more information about how users search and access library resources, including common problems encountered. By adopting a UX focus, libraries can try to ensure the policy decisions taken in technical services are making library resources accessible in the most effective manner and not making research more complicated for users in a fractured digital environment.
In this study, the authors observed how students search online when conducting academic research, paying special attention to themes and issues relevant to technical services. There is a long history of technical services research into user behaviors specific to catalog records and catalog searching, but not into the overall user experience of the search process. This research helps fill a gap in the library literature, which has very little on UX from a technical services perspective, or technical services from a UX perspective. UX research findings can help reorient existing workflows and priorities in technical services to have a user focus. This UX study, and how the technical services department responded to the data, could be used as a model for other technical services departments to respond to user needs. In our experience, it is refreshing for technical services staff to see their work from a user-oriented perspective and empowering to have the data to provide student-centered services.
References
- Shelley Gullikson and Emma Cross, “User Experience From a Technical Services Point of View,” Access 2017, https://www.youtube.com/watch?v=bap1zrcx-ZU&list=PLomHagvStAaDzXullxohONtPPcD_T2AO4&index=14.
- Leah Buley, The User Experience Team of One (Brooklyn, NY: Rosenfeld Media, 2013), 4.
- Cecily Walker, “A User Experience Primer,” Feliciter 56, no. 5 (2010): 195.
- Walker, 196.
- Julia Gross and Lutie Sheridan, “Web Scale Discovery: The User Experience,” New Library World 112, no. 5–6 (2011): 236–47, https://doi.org/10.1108/03074801111136275; Kylie Jarrett, “Findit@Flinders: User Experiences of the Primo Discovery Search Solution,” Australian Academic & Research Libraries 43, no. 4 (2012): 278–99, https://doi.org/10.1080/00048623.2012.10722288; Andrew D. Asher, Lynda M. Duke, and Suzanne Wilson, “Paths of Discovery: Comparing the Search Effectiveness of EBSCO Discovery Service, Summon, Google Scholar, and Conventional Library Resources,” College & Research Libraries 74, no. 5 (2013): 464–88, https://doi.org/10.5860/crl-374; Sonya Betz and Ian Roberton, “Integrating Discovery to Improve the User Experience,” in Exploring Discovery: The Front Door to your Library’s Licensed and Digitized Content, ed. Kenneth J. Varnum (Chicago: ALA Editions, 2016), 155–68; Karine Larose et al., Library Search UX Report Summer 2016 (London: Imperial College London Library Services, 2017), http://hdl.handle.net/10044/1/44345; Rice Majors, “Comparative User Experiences of Next-Generation Catalogue Interfaces,” Library Trends 61, no. 1 (2012): 186–207, https://doi.org/10.1353/lib.2012.0029; Bill McMillin, Sally Gibson, and Jean MacDonald, “Mapping the Stacks: Sustainability and User Experience of Animated Maps in Library Discovery Interfaces,” Journal of Electronic Resources Librarianship 28, no. 4 (2016): 219–31, https://doi.org/10.1080/1941126X.2016.1236537; Emery Shriver, “Managing Discovery Problems with User Experience in Mind,” Code4Lib Journal no. 44 (2019), https://journal.code4lib.org/articles/14481.
- Larisa Walsh, “The Faceted Catalog as a Tool for Searching Monographic Series: Usability Study of Lens,” Cataloging & Classification Quarterly 50, no. 1 (2012): 43–59, https://doi.org/10.1080/01639374.2011.610433.
- Martha Yee, “System Design and Cataloging Meet the User: User Interfaces to Online Public Access Catalogs,” Journal of the American Society for Information Science 42, no. 2 (1991): 92, https://doi.org/10.1002/(SICI)1097-4571(199103)42:2<78::AID-ASI2>3.0.CO;2-2.
- Jan Pisanski and Maja Žumer, “Mental Models of the Bibliographic Universe. Part 1: Mental Models of Descriptions,” Journal of Documentation 66, no. 5 (2010): 643–67, https://doi.org/10.1108/00220411011066772.
- Nancy Fried Foster et al., ed., Scholarly Practice, Participatory Design and the eXtensible Catalog (Chicago: Association of College and Research Libraries, 2011); Jennifer Bowen, “The eXtensible Catalog: Taking Control of Library Metadata” (presentation, June 4, 2009), https://hdl.handle.net/1813/12873.
- Yin Zhang and Athena Salaba, “What Do Users Tell Us About FRBR-Based Catalogs?” Cataloging & Classification Quarterly 50, no. 5–7 (2012): 705–23, https://doi.org/10.1080/01639374.2012.682000.
- Victoria Wilson, “Catalog Users ‘In the Wild’: The Potential of an Ethnographic Approach to Studies of Library Catalogs and Their Users,” Cataloging & Classification Quarterly 53, no. 2 (2015): 190–213, https://doi.org/10.1080/01639374.2014.980022.
- Wilson, 205–6.
- Karen Markey, “Twenty-Five Years of End-User Searching, Part 1: Research Findings,” Journal of the American Society for Information Science & Technology 58, no. 8 (2007): 1072, https://doi.org/10.1002/asi.20462.
- Karen Markey, “Twenty-Five Years of End-User Searching, Part 2: Future Research Directions,” Journal of the American Society for Information Science & Technology 58, no. 8 (2007): 1128, https://doi.org/10.1002/asi.20601.
- Susan K. Charles and Katharine E. Clark, “Enhancing CD-ROM Searches with Online Updates: An Examination of End-User Needs, Strategies, and Problems,” College & Research Libraries 51, no. 4 (1990): 321–28.
- Michael Twidale and David Nichols, “Designing Interfaces to Support Collaboration in Information Retrieval,” Interacting with Computers 10, no. 2 (1998): 185, https://doi.org/10.1016/S0953-5438(97)00022-2.
- Anita Komlodi, “Task Management Support in Information Seeking: A Case for Search Histories,” Computers in Human Behavior 20, no. 2 (2004): 171, https://doi.org/10.1016/j.chb.2003.10.013.
- Theresa Dirndorfer Anderson, “Relevance as Process: Judgements in the Context of Scholarly Research,” Information Research: An International Electronic Journal 10, no. 2 (2005), http://informationr.net/ir/10-2/paper226.html.
- Nancy Fried Foster and Susan Gibbons, Studying Students: The Undergraduate Research Project at the University of Rochester (Chicago: Association of College and Research Libraries, 2007); Donna Lanclos and Andrew D. Asher, “‘Ethnographish’: The State of Ethnography in Libraries,” Weave: Journal of Library User Experience 1, no. 5 (2016): http://doi.org/10.3998/weave.12535642.0001.503; Bryony Ramsden, “Ethnographic Methods in Academic Libraries: A Review,” New Review of Academic Librarianship 22, no. 4 (2016): 355–69, http://doi.org/10.1080/13614533.2016.1231696.
- Lynda M. Duke and Andrew D. Asher, College Libraries and Student Culture: What We Now Know (Chicago: American Library Association, 2012).
- Chris Leeder and Chirag Shah, “Library Research as Collaborative Information Seeking,” Library & Information Science Research 38, no. 3 (2016): 204, https://doi.org/10.1016/j.lisr.2016.08.001.
- Markey, “Twenty-Five Years of End-User Searching, Part 2,” 1128.
- Lisa M. Given and Gloria J. Leckie, “‘Sweeping the Library’: Mapping the Social Activity Space of the Public Library,” Library & Information Science Research 25 (2003): 365–85.
- Given and Leckie, “‘Sweeping the Library,’” 383.
- Given and Leckie, “‘Sweeping the Library,’” 383.
- Sarah Polkinghorne, Lisa M. Given, and Lauren Carlson, “Interviews that Attend to Emplacement: The ‘Walk-Through’ Method,” Proceedings of the Annual Conference of the Canadian Association of Information Science (2017): 1, https://journals.library.ualberta.ca/ojs.cais-acsi.ca/index.php/cais-asci/article/view/1028.
- Polkinghorne, Given, and Carlson, “Interviews that Attend to Emplacement,” 3.
- Polkinghorne, Given, and Carlson, “Interviews that Attend to Emplacement,” 3.
- Kristen Wilson, “The Knowledge Base at the Center of the Universe,” Library Technology Reports 52, no. 6 (2016), https://doi.org/10.5860/ltr.52n6.
- Shawn V. Lombardo and Kristine S. Condic, “Convenience or Content: A Study of Undergraduate Periodical Use,” Reference Services Review 29, no. 4 (2001): 327–38, https://doi.org/10.1108/EUM0000000006494; J. Patrick Biddix, Chung Joo Chung, and Han Woo Park, “Convenience or Credibility? A Study of College Student Online Research Behaviors,” Internet and Higher Education 14, no. 3 (2011): 175–82, https://doi.org/10.1016/j.iheduc.2011.01.003; Lynn Silipigni Connaway, Timothy J. Dickey, and Marie L. Radford, “If it is Too Inconvenient I’m Not Going After it: Convenience as a Critical Factor in Information-Seeking Behaviors,” Library & Information Science Research 33, no. 3 (2011): 179–90.
- Wilson, “The Knowledge Base at the Center of the Universe,” 8.
- Wilson, “The Knowledge Base at the Center of the Universe,” 8.
Appendix. Session Script
We’re interested in getting a better understanding of how people search for information related to their academic research. We’d like to observe you searching for information you need for an upcoming project—a paper or assignment. We know that it’s strange having people watch you do this, but we’d really like you to do what you normally do. We don’t want you to feel that you’re being evaluated; what will be most helpful to us is to see what you actually do when you look for information. It doesn’t matter if you think there’s a better way, we just want to know what it is that you do. So even though we’re from the library, please don’t feel that you should be using library resources if you don’t normally do that. This isn’t a test; from our point of view there is no right or wrong way to do anything in the next 30 minutes or so.
With your permission, we’d like to record your screen while you’re searching. This will help us so that we don’t have to take as many notes. We have a consent form here that we’d like you to sign and you can opt out of the video recording if you prefer. [Go over the consent form and give them the Starbucks card.]
I have a few quick questions before we get started:
- What year are you in? / Are you doing your Masters or your PhD?
- What is your major? / What is your area of study?
- And finally, can you tell me a little bit about what you’re working on today and what you’re hoping to find?
As you’re searching, it would be extremely helpful for you to say what you’re thinking as you go along. Tell us what you’re looking for, if you find something that helps you, if you’re confused by anything, if something didn’t work the way you expected. Tell us about how you’re making decisions—decisions to look at something, to ignore something, to change your strategy, to continue or to give up. It can be difficult to think out loud, so I might ask you some questions, particularly if you’ve been quiet for a while. Another way to think of it that might be helpful is to tell us the story of what you’re doing.
Do you have any questions for me before we start? Please start when you’re ready.