
Fighting an Uphill Battle

Troubleshooting Assessment Practices in Academic Libraries

Lindsey Lowry (lrlowry@ua.edu) is Electronic Resources Librarian and Assistant Professor at the University Libraries, The University of Alabama.

Manuscript submitted October 14, 2020; returned to author for minor revision December 12, 2020; revised manuscript submitted December 22, 2020; accepted for publication January 5, 2021.

The author would like to acknowledge Dr. Millie Jackson and Dr. Kevin Walker for help with this research as well as Alice Daugherty for her help and mentorship on this and other projects.

Scholarly literature provides many examples of librarians who have assessed troubleshooting data in various capacities and demonstrated the benefits that can be gleaned from such an analysis. Though some studies have confirmed that troubleshooting data is often being tracked, the frequency with which that data is being assessed in libraries is not well established. For this study, the author surveyed academic librarians who are currently involved in e-collection management to determine to what extent and for what purposes troubleshooting assessments are being carried out. The results reveal that though many librarians can see the benefits of assessing troubleshooting data, the obstacles to gathering, analyzing, and acting on results are often too great to overcome.

The effective troubleshooting of electronic resource (e-resource) access problems is of paramount importance for librarians aiming to provide seamless service for library users. The complicated and intertwined nature of discovery services, link resolvers, knowledge bases, and related systems creates fertile ground for access errors, and collection managers responsible for addressing e-access problems rely on a wealth of knowledge about how each of these systems integrates with the others to successfully resolve outages. In many libraries, users report e-access problems through an online form, by e-mail, through a dedicated ticket system, or by some other means for library staff to address and resolve. The abundant data within these types of communications provides an opportunity for librarians to assess that data and use it to improve troubleshooting workflows, access to e-resources, and overall service to users. While many libraries engage in ongoing data collection for various services, such as gate counts, circulation metrics, reference interactions, or instruction assessments, the extent to which libraries assess troubleshooting data or workflows, and for what purposes, is not well established.

For this study, the author created and distributed a survey (see appendix) intended to collect data from academic librarians to answer the following questions:

  1. To what extent are librarians assessing troubleshooting data and workflows in academic libraries?
  2. For what purposes are troubleshooting assessments carried out?
  3. What barriers exist for librarians to perform such an analysis on troubleshooting data?
  4. Is undertaking a troubleshooting assessment a worthwhile endeavor to improve services?

Literature Review

Benefits to Mining Troubleshooting Data

A number of authors have analyzed troubleshooting data and published findings that demonstrate the benefits of performing a troubleshooting assessment. For instance, in the absence of a dedicated ticket tracking system for troubleshooting, Browning's team at Auraria Library in Denver, Colorado, examined e-mail chains from e-access problem reports to "answer some fundamental questions about the nature of Auraria's access problems."1 As a result, Browning created a new "quarterly e-resources spreadsheet" with which student workers could systematically check for outages before they were reported. Furthermore, Auraria Library added "Report a Problem" links on the A-Z databases page and amended the link on its link resolver landing page, hoping to increase visibility, which ultimately led to more reports of outages from students and faculty. Browning also used the data from the study to advocate for a new position to help with e-resource access and noted that one clear conclusion of the study was that "troubleshooting needs more focused and dedicated attention."2

Like Browning, Wright studied outages that occurred over one calendar year and implemented changes to the troubleshooting workflow at the University of Michigan to proactively address frequently occurring access issues. More specifically, Wright created an “outage framework” with the implementation of a ticketing system and a controlled vocabulary to classify each of the incoming tickets. At the conclusion of the study, Wright opined that “no one institution can systematically rid itself of the kinds of errors seen repeatedly, across platforms, vendors and content delivery services.”3 Wright continued, “Improving our ability to describe errors, to capture examples of them and the attempts made to fix them, is the first part of what is sure to be an arduous but ultimately worthwhile process.”4

Similarly, Goldfinger and Hemhauser used the resulting data from their study of troubleshooting tickets to propose projects at the University of Maryland, College Park, intended to mitigate future outages and access issues for users. These proposals included updating a local Frequently Asked Questions service page, wherein users could be directed to a "report a problem" link for certain types of outages, and making future changes if a more in-depth analysis revealed additional frequently occurring problems that could be alleviated by providing users with more information.5 Goldfinger and Hemhauser also proposed adding standardized responses for staff to use in communications when resolving frequently occurring issues. Furthermore, a local internal troubleshooting guide for training purposes could enhance staff understanding of certain issues and provide tips for troubleshooting. In addition to providing proposals for enhanced services as a result of the study, Goldfinger and Hemhauser concluded that "Similar future studies at other institutions can surely also suggest local enhancements to optimize the existing troubleshooting framework at each given institution."6 They encourage other librarians to conduct their own local analyses.7

Brett at the University of Houston, Lowry at The University of Alabama, and Gould and Brett at the University of Tennessee, Knoxville, and Texas A&M University, respectively, examined troubleshooting data in a somewhat different light: in three separate studies, rates of access problems across multiple research institutions were used to form comparative analyses.8 Brett first concluded that it was indeed possible to perform a comparative analysis between institutions when troubleshooting data is analyzed, and illuminated similarities and differences in a comparison between two universities, highlighting where improvements could be made to the University of Houston's services. For example, Brett discovered that the University of Houston had more tickets concerning problems with EZproxy and IP addresses than the University of Maryland, College Park. Proposed improvements included better tracking of EZproxy changes and adding more information and "report a problem" links in key areas of the library's website to serve patrons at the point of need and, hopefully, minimize EZproxy- or IP-related outages.9 In 2020, Lowry built upon Brett's study by including a third institution in a comparative analysis and concluded that, as a result of both a comparative and a local analysis of troubleshooting tickets, the best course of action for The University of Alabama Libraries would be to "empower public services faculty and staff to better understand and report access issues so that frustrations are minimized."10 Lowry indicated that the results of the study "are highly indicative that research libraries experience some types of access problems at approximately the same rates," and that efforts to improve discovery should be "at the forefront of the minds of librarians when communicating and negotiating with vendors."11 Finally, Gould and Brett compared rates of access problems at the University of Tennessee and Texas A&M University, ultimately advocating for a standardized or controlled vocabulary to be established by librarians and the National Information Standards Organization (NISO) to foster collaboration between institutions and to simplify the process of comparing outages across institutions to improve e-access for all library patrons.12

Taking a slightly different approach, Ashmore and Macaulay of Samford University analyzed unfilled interlibrary loan (ILL) requests to detect patterns.13 The resulting workflow improvements included increased access to ILLiad, wherein librarians could download reports into Excel for further analysis rather than relying on e-mail chains. The study also identified user groups who might need additional library instruction and led to improved collaboration among the different library departments. Moreover, Ashmore and Macaulay examined potential interface design changes that would improve wayfinding for patrons, as well as improved staff training on troubleshooting. Ashmore and Macaulay deemed the project successful, noting a number of benefits and that "this process was a service opportunity offering a good way to establish positive relationships with users by saving their time."14

Considering the many service benefits demonstrated in the literature, Samples and Healy were straightforward in their recommendation: "Librarians should take the time outside of troubleshooting to mine their own data regarding access failure to improve electronic resource troubleshooting workflows."15 Likewise, perhaps Wright elucidated the benefits of analyzing troubleshooting data most robustly: "With enough data gathered through systems like Footprints and shared with both vendors and other institutions, libraries stand poised to improve the functionality of e-resources, not just for their own patrons, but for patrons everywhere."16 Indeed, Goldfinger and Hemhauser, Wright, and Brett each noted that obtaining more robust data on e-access outages is a key component of communicating with vendors about access problems.17 Carter and Traill also opined that "tracking complicated troubleshooting leads to a more sophisticated understanding of both the frequency of various problem types and their levels of complexity," noting, in short, that formalized tracking of troubleshooting problems "helps to ensure that problems are resolved."18 Carter and Traill remarked that "reviewing data on reported issues is critical for revising and improving the workflow of troubleshooting," and found that methodical and detailed problem tracking, plus periodic and ongoing analysis, in conjunction with their recommended training strategies, provides the best possible service environment for library patrons.19

Barriers to Analyzing Troubleshooting Data

Though authors have advocated for librarians to analyze local troubleshooting data and workflows, the literature also highlights many barriers. Samples and Healy indicated that 56 percent of Association of Research Libraries (ARL) libraries surveyed were either not tracking troubleshooting data or had an unclear method for doing so, meaning that no troubleshooting assessment occurred in these instances. They remarked that the lack of troubleshooting data tracking at ARL libraries likely means that the troubleshooting practice has few quality-control measures in place and "decreases the return on investment for these electronic resources."20 The time required to perform such an analysis and the lack of appropriate tools were cited as barriers to analyzing troubleshooting data. In fact, interviewees for Samples and Healy's study indicated that among the barriers to creating proactive troubleshooting workflows, "finding the time to pull details from emails or correlate information in Excel from forms with disparate fields or fields that have changed over time" weighed heavily as problematic.21 Browning indicated that implementing software for tracking troubleshooting requests (software that could potentially provide robust data for analysis) would have required more time and resources than Auraria Library's staff could offer at the time of the study.22 Rathmel et al. likewise indicated that survey respondents reported staff time and budgets as impediments to implementing robust tracking tools for troubleshooting.23

Furthermore, while Rathmel et al. and Heaton found e-mail to be the most frequently used tool for troubleshooting, it lacks the functionality for easy archiving and reporting of the metrics necessary for an in-depth analysis of the data within.24 Rathmel et al. described e-mail as "ubiquitous" and of no extra cost to institutions, unlike specialized ticket tracking systems or customer relationship management (CRM) tools that may provide robust data but are otherwise unobtainable.25 Samples and Healy state that "counting emails is easy, but figuring out what the email is really reporting and using emails to expose large patterns or repeated problems with a particular vendor can be prohibitively time consuming."26 Borchert detailed the difficulties her team faced when using e-mail to track and respond to requests for troubleshooting: "E-mail messages can be buried in an inbox full of other messages, and because several people received the e-mail, no one knew when someone else had already responded to it. Also, if we had a pattern of access problems, it was not readily apparent because the old e-mails were deleted once the immediate problem was handled."27 Ashmore and Macaulay eventually switched from using e-mail to analyze unfilled ILL reports to downloading reports from ILLiad, which enabled greater examination of information than the original e-mail chains provided.28 Finally, despite the fact that e-mail was found to be one of the most widely used tools for tracking data related to troubleshooting, Rathmel et al. found that ticketing systems providing better functionality for data tracking were not widely implemented in libraries, with only 26 percent of respondents indicating that such software was in place.29 Regarding the complicated nature of e-resource workflows, Collins reiterated that "workflow processes should not be memory-bound or isolated within individual silos such as e-mail; otherwise, ineffective knowledge management is likely to result."30

Interviewees in Samples and Healy's study likewise indicated that analyzing troubleshooting data is not straightforward, as sometimes the problem and resolution are not clear from the data provided in the tickets.31 Wright indicated that detecting patterns within troubleshooting data can be difficult, and that "attributing outages to the correct source of the problem swiftly becomes a point of contention."32 Brett, Goldfinger and Hemhauser, and Wright all indicated difficulty in categorizing tickets to determine patterns.33 In fact, Brett, who set out to compare rates of access outages between two institutions, noted that a standardized vocabulary of outage types, such as the one developed by Goldfinger and Hemhauser, is necessary to categorize each ticket instance and to enable vendors to address problems on a wide scale.34 Goldfinger and Hemhauser's methodology for examining troubleshooting ticket data included a team of library staff determining a controlled vocabulary for access outage types, and required the team to reach a consensus about each problem report before classifying it under a specific heading. Likewise, Goldfinger and Hemhauser noted that the lack of a standardized, controlled vocabulary in the discipline made comparisons across institutions impossible. Browning indicated that the classification "Category of Problem" was vague and subjective, but that a controlled vocabulary to classify tickets is what made the analysis worthwhile.35
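The classification step these studies describe can be prototyped with very little tooling. The sketch below is offered only as a hypothetical illustration of tallying problem reports against a small, locally agreed-upon controlled vocabulary; the file name, column name, and vocabulary terms are assumptions made for this example and are not drawn from any cited study.

```python
# Illustrative sketch only: tally troubleshooting tickets against a small,
# locally defined controlled vocabulary. The file name ("tickets.csv"), the
# column name ("problem_type"), and the vocabulary terms are hypothetical.
import csv
from collections import Counter

# Hypothetical mapping from staff-entered labels to controlled headings
CONTROLLED_VOCAB = {
    "proxy error": "EZproxy/authentication",
    "ip not recognized": "EZproxy/authentication",
    "bad link": "Link resolver/knowledge base",
    "wrong coverage": "Link resolver/knowledge base",
    "platform down": "Vendor platform outage",
}

def classify(free_text_label: str) -> str:
    """Map a staff-entered label to a controlled heading, or flag it for review."""
    return CONTROLLED_VOCAB.get(
        free_text_label.strip().lower(), "Unclassified (needs consensus review)"
    )

def tally(path: str) -> Counter:
    """Count tickets per controlled heading from a CSV export of problem reports."""
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts[classify(row["problem_type"])] += 1
    return counts

if __name__ == "__main__":
    for heading, n in tally("tickets.csv").most_common():
        print(f"{heading}: {n}")
```

As Goldfinger and Hemhauser's consensus-based approach suggests, the difficult part is agreeing on what the headings mean rather than producing the counts; unmatched labels are surfaced here precisely so that they can receive that kind of review.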

Method

For this study, the author created a survey using Qualtrics with questions related to the assessment of troubleshooting data in libraries. The author requested that only one member from each institution respond to the survey to prevent multiple responses from the same library. Furthermore, participants were asked to indicate whether they were currently employed at an academic library in higher education; those who indicated "No" were directed to the end of the survey and excluded from the sample. Participants were directed along a specific path within the survey according to whether they indicated that an assessment of troubleshooting data had been conducted at their library. Respondents who indicated that their institution did not perform data analyses on e-access problems or troubleshooting workflow were directed to later questions in the survey and skipped questions that asked for more information about a data analysis. Additionally, only the survey questions about demographics were required, so response rates to individual questions within the survey vary.

The survey was distributed to four library professional discussion lists: NASIG’s SERIALST listserv (serialst@simpleslist.com); the Electronic Resources in Libraries ERIL listserv (eril-l@lists.erl-l.org); the ALCTS E-Resources listserv (alcts-eres@lists.ala.org); and the American Library Association’s University Libraries section listserv (uls-l@lists.ala.org). By choosing these discussion lists, the author hoped to target those library professionals who both work in academic libraries and who also actively work to troubleshoot e-resource access problems as part of regular job responsibilities. The study was approved by the Institutional Review Board of The University of Alabama, and the author collected responses for fourteen days in June 2020. A total of 174 responses were collected, of which 143 were complete. The results presented here represent an analysis of those completed responses.

Results

Demographics

All of the participants in the final sample indicated that they were currently employed in academic libraries. One response was excluded because the participant indicated employment at another type of library.

The approximate full-time enrollment (FTE) of the schools represented in the sample ranged from 200 students to 110,000 (see table 1). The majority of responses, 60 percent (n=86), reported an FTE between 200 and 9,900. Additionally, most respondents, 74 percent (n=106), indicated that their libraries were not ARL members.

Tracking and Data Analysis

Of 143 responses, 51 percent (n=73) indicated that e-access problems were being tracked in some way. Of those 73 respondents, the most frequently cited tool was e-mail at 61 percent (n=45), followed closely by Springshare products (LibGuides/LibAnswers) at 47 percent (n=34). No respondents indicated using an ILS to track troubleshooting data, and twelve respondents selected "Other," indicating tools such as Trello, SharePoint, and home-grown solutions (see figure 1).

When asked what types of data were tracked, respondents provided a variety of answers. Some of the more common data points cited were the date and time of the report, who reported the problem (faculty, staff, or student), who resolved the problem, and the vendor involved. Less common answers included tracking the access point or origin of the user's request, the IP range of the reporting user, and the time staff spent resolving the problem. Interestingly, while some respondents indicated that only one or two data points were tracked, others reported recording large numbers of data points, with some tracking eight to ten per issue. Other participants took a less formal approach, such as tracking only the number of reports received in a given timeframe or only the resource and vendor involved. Some participants indicated that e-mails or tickets were simply filed for later analysis and that no formal data points had been established.
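For illustration, the kinds of data points respondents described could be captured in something as simple as a shared spreadsheet or CSV log. The sketch below shows one hypothetical record structure; the field names, file name, and sample values are assumptions for this example, not a schema reported by any respondent.

```python
# Illustrative sketch only: one possible record structure for logging the data
# points respondents described (date, reporter type, resolver, vendor, time spent).
# Field names and the CSV file name are hypothetical, not a prescribed schema.
import csv
from dataclasses import dataclass, asdict, fields
from datetime import datetime
from pathlib import Path

@dataclass
class TroubleTicket:
    reported_at: str       # ISO timestamp of the report
    reporter_type: str     # "faculty", "staff", or "student"
    resource: str          # resource or platform name
    vendor: str            # vendor involved
    problem_type: str      # local category or controlled-vocabulary term
    resolved_by: str       # staff member or unit that resolved the issue
    minutes_spent: int     # staff time spent on resolution

def append_ticket(ticket: TroubleTicket, path: str = "troubleshooting_log.csv") -> None:
    """Append one ticket to a running CSV log, writing a header if the file is new."""
    log = Path(path)
    write_header = not log.exists() or log.stat().st_size == 0
    with log.open("a", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=[fld.name for fld in fields(TroubleTicket)])
        if write_header:
            writer.writeheader()
        writer.writerow(asdict(ticket))

if __name__ == "__main__":
    append_ticket(TroubleTicket(
        reported_at=datetime.now().isoformat(timespec="minutes"),
        reporter_type="student",
        resource="Example Journal Package",
        vendor="Example Vendor",
        problem_type="proxy error",
        resolved_by="ER unit",
        minutes_spent=20,
    ))
```

A lightweight log of this kind could later feed the sort of category tally sketched in the literature review, without requiring a dedicated ticketing system.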

E-Access Problems Assessment

Of the 143 respondents, fifteen (10 percent) indicated that a formal analysis of e-access problems had been conducted in the past, and 19 percent (n=27) were uncertain whether a formal analysis had occurred. Respondents who answered affirmatively were asked for what purposes the analysis was undertaken and could select multiple options. The most common purposes were "To identify common points of failure" (n=10), followed closely by "For reporting purposes" (n=7) (see figure 2). Five of the fifteen respondents indicated that an analysis had been undertaken for training purposes, to justify staffing decisions, and/or to identify gaps in the troubleshooting workflow. Two respondents indicated that analyses were performed to present or publish the findings, while one respondent indicated that data was analyzed for communicating with vendors about renewals. Moreover, 53 percent (n=8) of the formal analyses reported had been undertaken within the last year, while one respondent indicated that an analysis had occurred more than five years ago.

Eleven respondents (73 percent) indicated that an assessment of e-resources access reports was beneficial to users and services, with one respondent indicating that an analysis was not beneficial. Three respondents were uncertain whether benefits were realized as a result of an assessment. Only one respondent indicated that comparative analysis across more than one institution had been undertaken, while ten (71 percent) of the remaining respondents indicated that a comparative analysis of troubleshooting instances might be worthwhile in the future.

Local Troubleshooting Practices Assessment

For survey questions regarding an assessment of troubleshooting practices, rather than of e-access problem reports, 70 percent of the full sample (n=101) responded that no assessment of local troubleshooting practices had ever been performed, and 20 percent were uncertain whether one had been performed. The most commonly indicated reason for undertaking an assessment of troubleshooting practices was to identify gaps in the troubleshooting workflow (23 percent, n=9), followed by training purposes and documentation purposes, each at 20 percent (n=8), as shown in figure 3. Most assessments of this type had been performed within the last year (31 percent, n=4) or within the past three years (46 percent, n=6). Additionally, most of the thirteen respondents who reported such an assessment, 85 percent (n=11), indicated that it resulted in improved services.

Barriers and Future Directions

The primary barrier to performing a troubleshooting analysis was time/staff constraints, with 106 respondents, or 74 percent, indicating difficulty in this area. The second most common barrier was "difficulty in organizing or obtaining data about problem reports," with 41 percent (n=58) of respondents noting this as an obstacle to performing a troubleshooting assessment. The third most common barrier was the lack of appropriate tools to conduct an assessment (see figure 4). Among respondents who selected "Other," additional trends emerged, including the lack of a request for such an analysis from administration, lack of interest on the part of staff and administration, and resistance to beginning a new project. Of the 143 respondents, 60 percent (n=86) indicated a decisive interest in performing or repeating a troubleshooting assessment in the future, and just 6 percent (n=8) indicated no interest.

Discussion

Though the professional literature establishes that there are many returns to be gained from an assessment of local troubleshooting metrics and data, the results of this study demonstrate that very few libraries are actually engaging in troubleshooting data assessment, even though many actively collect or track the data necessary for an analysis. Specifically, 51 percent of the sample reported tracking troubleshooting data in some way, yet only 10 percent of respondents reported assessing e-resource access problems and 9 percent reported assessing local troubleshooting practices. More in-depth study is needed to understand why a majority of libraries track troubleshooting data if not for assessment purposes.

Likewise, the results of this study strongly indicate that significant constraints prevent librarians from taking a deep dive into data related to troubleshooting, even though many respondents expressed interest in conducting a future assessment. More specifically, the limitations of time and staffing, plus the lack of available tools to collect and organize data, prevent librarians from performing analyses that might lead to improved services. In fact, every barrier offered in the survey was cited by multiple respondents, suggesting that several obstacles, some more frequently cited than others, deter librarians from undertaking a troubleshooting assessment project. Interestingly, many barriers reported by respondents are local rather than broad concerns, such as a disorganized or transitioning workflow for resolving and tracking problems, or the perception that a study of reports would not yield any new information. At least one respondent reported no barriers or difficulties preventing an assessment project. The comments from participants about additional barriers provide compelling evidence that the decision to undertake a troubleshooting assessment project is highly specific to institutional need. Librarians seem to assess troubleshooting data with a specific goal of addressing a need or concern rather than on a vague, exploratory basis, and are unwilling to exert great effort on a troubleshooting assessment without the promise of returns.

The survey results demonstrate that e-mail was the most consistently used tool for troubleshooting in this study, as it was in studies by Heaton and Rathmel et al.36 The reason for e-mail's persistence as the most widely used tool for troubleshooting is clear: troubleshooting largely involves effective communication, and e-mail is nearly universal for interoffice correspondence. However, synthesizing the contents of an e-mail chain and gleaning organized, usable data is no small task. While good communication is paramount to a successful patron interaction and troubleshooting resolution, tools designed primarily for communication do not provide the luxury of easy data collection and analysis. Likewise, dedicated ticket tracking systems that could provide a more sophisticated level of data organization were used by only 24 percent of respondents, supporting Rathmel et al.'s conclusion that ticket tracking systems for e-resources troubleshooting are not widely implemented, despite the fact that these systems often provide a more robust way to collect and report data than e-mail.37 Samples and Healy found that a higher percentage of ARL libraries (43 percent) indicated using a ticket tracking system for troubleshooting, suggesting that ARL libraries in particular seem to have easier access to more robust troubleshooting and data tracking tools.38 However, as Browning notes, the time and staff required to implement a robust ticket tracking system are greater than some libraries can take on, which may explain why ticket tracking tools have not been more widely adopted.39

Of the fifteen respondents who indicated that an assessment had taken place, the most frequently cited reason was to identify common points of failure, mirroring the goals of many of the studies cited here. This result suggests that most libraries assess troubleshooting data to find and minimize frequently occurring problems and/or to create proactive measures to reduce common access issues, as has been done in many published studies. The second most common reason for an assessment, "For reporting purposes," gives rise to potential areas of additional study. For instance, future studies might consider how many libraries report troubleshooting data and metrics to administration or governing bodies and what is done with the reported data. The author posits that, in some cases, when troubleshooting data are reported to other bodies, the data may be assessed outside the knowledge or control of the librarians who gathered them, or stored for potential future assessment.

Moreover, 73 percent of respondents who had conducted a troubleshooting analysis indicated that services or workflow had improved as a result of such a study, and 94 percent of respondents indicated interest in performing a future assessment. The literature and the results of this study support the idea that troubleshooting data assessment is a worthwhile endeavor with desirable results, but one whose obstacles are often insurmountable. A future study might more closely examine the specifics of how troubleshooting and e-access have improved following an assessment so that librarians can see the tangible impacts of the work of assessing troubleshooting data. In fact, a pre- and post-assessment of troubleshooting tickets to gauge the efficacy of measures undertaken to improve services would be ideal for those hoping to learn whether goals had been met. A study that can demonstrate measurable impacts on services would make an excellent addition to the existing literature on troubleshooting studies.

Interestingly, only one respondent indicated that data had been used for a comparative analysis with other institutions. The literature demonstrates that comparing rates of e-access problems across institutions may provide benefits for libraries at large, rather than simply for local analyses. However, the lack of tools, time, and staff available to perform local analyses, as cited in this study, is enough to deter a large percentage of librarians from assessing local data, much less from making comparative analyses. Nonetheless, ten out of fourteen respondents indicated that a comparative analysis between institutions might be worthwhile. It is important to have data related to potential widespread or ongoing access concerns when communicating with vendors about problems, and comparing data across institutions could reveal industry-wide concerns to be addressed. In fact, Goldfinger and Hemhauser state that "if more libraries determined the external causes of access problems, libraries might be better able to work with vendors to prevent the problems outside of libraries' control" and advocate for a standardized vocabulary to classify types of outages across institutions.40

Finally, the functionality or failure of e-resources is an important consideration when assessing library and resource value and return on investment. One study participant indicated that troubleshooting assessment data was used when negotiating lower subscription costs and hosting fees with vendors. Indeed, data from troubleshooting reports could help librarians demonstrate a resource's reliability when few or no tickets exist for it, or flag concerns when multiple problems have been reported. Moreover, the number of troubleshooting tickets answered in a given time period, a common metric collected by librarians in this study, can help demonstrate the value of staff time spent resolving problems. Browning used troubleshooting data at Auraria Library to advocate for a new Electronic Resources Librarian position to handle some of the workflow needed to effectively respond to and resolve e-access problem reports.41

Conclusion and Future Directions

This study shows that while an assessment may provide tangible benefits to libraries, the obstacles to successfully completing one may be too great to overcome. However, if librarians responsible for e-collections management choose to assess troubleshooting instances and workflows, such an assessment need not be prohibitive. For those librarians unsure whether an assessment would be worthwhile, the author suggests starting by creating measurable and attainable goals, then deciding whether those goals are worth pursuing given the staff time and effort required. The author believes that although there is much to be gained, conducting an assessment project is a highly localized decision that should not be made without great care and consideration.

Additional studies might examine more extensively how librarians who have performed troubleshooting assessments overcame these obstacles. The author also encourages librarians who set out to assess troubleshooting data and practices to continue publishing, presenting, and comparing data to capture trends over time and to set examples for other librarians to follow. The more librarians analyze the types of outages experienced, the better prepared we may be as a community to serve our library patrons, communicate with library vendors about services rendered, and maximize the return on investment from e-resources management.

References

  1. Sommer Browning, “Data, Data, Everywhere, nor Any Time to Think: DIY Analysis of E-Resource Access Problems,” Journal of Electronic Resources Librarianship 27, no. 1 (2015): 26, https://doi.org/10.1080/1941126X.2015.999521.
  2. Browning, “Data, Data, Everywhere,” 34.
  3. Jennifer Wright, “What Broke, Who Broke it, and How to Track It,” Library Resources & Technical Services 60, no. 3 (2016): 212, https://doi.org/10.5860/lrts.60n3.204.
  4. Wright, “What Broke,” 212.
  5. Rebecca Kemp Goldfinger and Mark Hemhauser, “Looking for Trouble (Tickets): A Content Analysis of University of Maryland, College Park E-resource Access Problem Reports,” Serials Review 42, no. 2 (2016): 91–92, https://doi.org/10.1080/00987913.2016.1179706.
  6. Goldfinger and Hemhauser, “Looking for Trouble,” 92.
  7. Goldfinger and Hemhauser, “Looking for Trouble.”
  8. Kelsey Brett, “A Comparative Analysis of Electronic Resources Access Problems at Two University Libraries,” Journal of Electronic Resources Librarianship 30, no. 4 (2018): 198–204, https://doi.org/10.1080/1941126X.2018.1521089; Lindsey Lowry, “Where Do Our Problems Lie?: Comparing Rates of E-access Problems across Three Research Institutions,” Serials Review 46, no. 1 (2020): 26–36, https://doi.org/10.1080/00987913.2020.1733173; Elyssa M. Gould and Kelsey Brett, “A Tale of Two Universities: Electronic Resources Troubleshooting Comparisons,” Serials Librarian 79, no. 1–2 (2020): 1–8, https://doi.org/10.1080/0361526X.2020.1760184.
  9. Brett, “A Comparative Analysis,” 203.
  10. Lowry, "Where Do Our Problems Lie?," 34.
  11. Lowry, "Where Do Our Problems Lie?," 35.
  12. Gould and Brett, “A Tale of Two Universities,” 7.
  13. Beth Ashmore and David Macaulay, “Troubleshooting Electronic Resources with ILL Data,” Serials Librarian 70, nos. 1–4 (2016): 288–94, https://doi.org/10.1080/0361526X.2016.1153336.
  14. Ashmore and Macaulay, “Troubleshooting Electronic Resources,” 293.
  15. Jacquie Samples and Ciara Healy, “Making it Look Easy: Maintaining the Magic of Access,” Serials Review 40, no. 2 (2014): 114, https://doi.org/10.1080/00987913.2014.929483.
  16. Wright, “What Broke,” 212.
  17. Goldfinger and Hemhauser, “Looking for Trouble (Tickets)”; Wright, “What Broke”; Brett, “A Comparative Analysis.”
  18. Sunshine Carter and Stacie Traill, “Essential Skills and Knowledge for Troubleshooting E-resources Access Issues in a Web-scale Discovery Environment,” Journal of Electronic Resources Librarianship 29, no. 1 (2017): 4, https://doi.org/10.1080/1941126X.2017.1270096.
  19. Carter and Traill, “Essential Skills and Knowledge,” 5.
  20. Samples and Healy, “Making it Look Easy,” 114.
  21. Samples and Healy, “Making it Look Easy,” 113.
  22. Browning, “Data, Data, Everywhere,” 26.
  23. Angela Rathmel et al., “Tools, Techniques, and Training: Results of an E-resources Troubleshooting Survey,” Journal of Electronic Resources Librarianship 27, no. 2 (2015): 98, https://doi.org/10.1080/1941126X.2015.1029398.
  24. Rathmel et al., “Tools, Techniques, and Training,” 95; Robert Heaton, “Tools for Troubleshooting: Which Ones and What For,” Journal of Electronic Resources Librarianship 30, no. 1 (2018): 12, https://doi.org/10.1080/1941126X.2018.1443903.
  25. Rathmel et al., “Tools, Techniques and Training,” 91.
  26. Samples and Healy, “Making it Look Easy,” 112.
  27. Carol Ann Borchert, "Untangling the Jungle of E-journal Access Issues using CRM Software," Library Collections, Acquisitions, & Technical Services 30, nos. 3–4 (2006): 226, https://doi.org/10.1016/j.lcats.2006.10.002.
  28. Ashmore and Macaulay, “Troubleshooting Electronic Resources,” 292.
  29. Rathmel et al., “Tools, Techniques and Training,” 97.
  30. Maria Collins, "Evolving Workflows: Knowing when to Hold 'em, Knowing When to Fold 'em," Serials Librarian 57, no. 3 (2009): 262, https://doi.org/10.1080/03615260902877050.
  31. Samples and Healy, “Making it Look Easy,” 113.
  32. Wright, “What Broke,” 212.
  33. Brett, “A Comparative Analysis”; Goldfinger and Hemhauser, “Looking for Trouble”; Wright, “What Broke.”
  34. Brett, “A Comparative Analysis,” 203; Goldfinger and Hemhauser, “Looking for Trouble.”
  35. Browning, “Data, Data, Everywhere,” 29–30.
  36. Heaton, “Tools for Troubleshooting”; Rathmel et al., “Tools, Techniques and Training.”
  37. Rathmel et al., "Tools, Techniques and Training," 97.
  38. Samples and Healy, “Making It Look Easy,” 110.
  39. Browning, “Data, Data, Everywhere,” 26.
  40. Goldfinger and Hemhauser, “Looking for Trouble,” 92.
  41. Browning, “Data, Data, Everywhere,” 34.

Appendix

  1. Are you currently employed at an academic library in higher education?
    • Yes
    • No
  2. What is your school’s approximate full time enrollment (FTE)? ________________________________
  3. Is your library a member of the Association for Research Libraries (ARL)?
    • Yes
    • No

Definitions

The following questions will assess the extent to which your library has collected and analyzed data related to troubleshooting of e-resource access problems.

For the purposes of the study, the following definition of terms will apply:

Reports of e-access problems: A report received by library staff and originating from a library user in which the user informs staff that he or she is unable to access an electronic resource. This communication is often transmitted via a web form, ticketing system, e-mail, telephone, or the like.

Local troubleshooting practices: The workflow by which a library receives and resolves reports of e-access problems.

Track reports: Recording data or information related to user reports of e-access problems in an archived or historical manner, e.g., an Excel spreadsheet containing data about troubleshooting tickets as they occur over time.

  1. Does your library track reports of e-access problems?
    • Yes
    • No
    • Not Sure
  2. What types of tools does your institution use in order to track reports of e-access problems? Choose all that apply.
    • E-mail
    • Spreadsheet
    • Dedicated ticket tracking system (Footprints, SysAid, JIRA, etc.)
    • LibGuides/LibAnswers or other Springshare product
    • ILS system (SirsiDynix, Voyager, etc.)
    • Library Service Platform (LSP)
    • Electronic Resource Management system (ERM)
    • Other ___________________________________
  3. In a few words, please describe some of the types of data or metrics that are tracked: (e.g., Vendor involved, time to resolution, type of problem, etc.)_______________________________________________
  4. Has a formal analysis of reports of e-access problems ever been conducted at your institution?
    • Yes
    • No
    • Not Sure
  5. For what purpose(s) was an analysis of reported e-access problems performed? Choose all that apply.
    • To identify gaps in the troubleshooting workflow
    • To identify common points of failure
    • For reporting purposes
    • To justify staffing decisions
    • For training purposes
    • To improve documentation
    • Other ___________________________________
  6. Approximately how long ago was the most recent analysis of reports of e-access problems performed?
    • Within the past year
    • Within the past three years
    • Within the past five years
    • More than five years ago
    • Not sure
  7. In your opinion, did the results of an analysis of reports of e-access problems lead to improved troubleshooting practices and/or improved services for your users?
    • Yes
    • No
    • Not sure
  8. Have the results of an analysis been used to compare with that of any other institutions? (including consortial partners, branches, and/or peer institutions)
    • Yes
    • No
    • Not sure
  9. If no, in your opinion, would a comparative analysis of reported e-access problems between institutions be worthwhile?
    • Yes
    • No
    • Maybe
  10. Has a formal assessment or analysis of local troubleshooting practices ever been conducted at your institution?
    • Yes
    • No
    • Not sure
  11. For what purpose(s) was an assessment or analysis of local troubleshooting practices performed? Choose all that apply.
    • To identify gaps in the troubleshooting workflow
    • To identify common points of failure
    • For reporting purposes
    • To justify staffing decisions
    • For training purposes
    • To improve documentation
    • Other ___________________________________
  12. Approximately how long ago was the most recent assessment of local troubleshooting practices performed?
    • Within the past year
    • Within the past three years
    • Within the past five years
    • More than five years ago
    • Not sure
  13. In your opinion did the results of an assessment of local troubleshooting practices lead to improved workflow and/or improved services for your users?
    • Yes
    • No
    • Not sure
  14. What barriers or difficulties in analyzing reports of e-access problems or local troubleshooting practices exist at your institution? Choose all that may apply
    • Time/Staff constraints
    • Difficulty in organizing or obtaining data about problem reports
    • Not enough data to analyze
    • Lack of appropriate tools to conduct an assessment
    • An analysis is not needed
    • Other ___________________________________
  15. Would you consider performing an assessment of troubleshooting activities or reported e-access issues in the future? (If you have already conducted an assessment, would you consider performing another in the future?)
    • Yes
    • No
    • Maybe

Figure 1. Tool types used by respondents (N = 73)


Figure 2. Purposes for an Analysis of E-Access Problems (N = 15)


Figure 3. Purposes for an Assessment of Troubleshooting Practices (N = 13)


Figure 4. Barriers to Performing a Troubleshooting Assessment (N = 141)

Table 1. Full-Time Enrollment (FTE) of Institutions (N = 143)

FTE              n     % of sample
0–9,900          86    60
9,900–19,800     26    18
19,800–29,700    15    10
29,700–39,600    9     6
39,600–49,500    3     2
49,500–59,400    3     2
> 59,400         1     <1
