Using Perceptions and Preferences from Public Services Staff to Improve Error Reporting and Workflows

Dawn McKinnon (dawn.mckinnon@mcgill.ca) is Electronic Resources & Serials Librarian at McGill University in Montreal.

Submitted July 7, 2015; returned to author September 21, 2015 for revisions; revised manuscript submitted November 17, 2015; returned to author for minor revisions December 10, 2015; accepted for publication January 24, 2016.

The ongoing transition from purchasing mostly print materials to electronic resources (e-resources) continues to pose workload challenges in libraries. In response, many libraries have focused on improving workflows to increase efficiency and thereby provide better service. This paper discusses a project undertaken to tackle one aspect of these challenges, in which data were gathered on how front-line library staff report errors found in the library catalog and discovery layer, along with their preferences and perceptions regarding reporting errors to Collection Services staff. It also identifies improvements that can be made to error reporting, workflows, and communication between Collection Services and front-line staff to create a more service-oriented and efficient working environment in the library.

Over the past decade, most academic libraries have transitioned from purchasing mostly print materials to electronic resources (e-resources). This transition has been well documented in library literature, particularly the workload struggle that libraries face during the transition and the challenges caused by the ad hoc fashion in which e-resources are often managed.1 As libraries attempt to work more efficiently to improve service, a few studies have analyzed staff reporting of access issues and catalog errors, but these tend to focus on improving workflows from a technical services perspective or on the factors that lead staff to report problems.2

McGill University Library is a large research library in North America. Within its Collection Services department, the e-resources division handles cataloging, access, and troubleshooting related to e-resources such as electronic journals (e-journals) and databases. Like many libraries, this division has undergone considerable change during the transition from print to electronic. To address some of these challenges and to help fill a gap in the literature, a research project was undertaken to gather data on how the library’s front-line staff report errors they find in the discovery layer and catalog, their preferences for reporting, and their perceptions of the response times and quality of the responses provided by Collection Services staff. As many libraries face friction between front-line staff and those who work “behind the scenes,” the author aims to share lessons learned from this project and continue the discussion on the need for best practices in this area. This paper discusses the project, analyzes the results, and identifies where improvements can be made to error reporting, workflows, and communication between Collection Services staff and front-line staff to create a more service-oriented and efficient working environment.

It should be noted that the word “errors” in this paper refers to questions asked and errors reported to Collection Services, including, but not limited to, e-resource access problems, questions about subscriptions and renewals, and cataloging errors. “Front-line staff” refers to librarians and nonlibrarian staff who work with patrons in public services, including subject librarians, library assistants and supervisors who work at the service desks, and the interlibrary loans staff.

Literature Review

A recurring theme in the literature pertaining to e-resource management is the fluctuating roles and responsibilities of e-resources librarians.3 As roles remain in flux, frustrations with workflow inefficiencies are often highlighted. Waterhouse discusses the challenges of having “many systems involved in managing and delivering e-resources” at the University of Illinois Springfield (UIS), including SFX, Serials Solutions 360, and WorldCat Local.4 In addition to the many systems, inefficiencies also occurred because the “acquisitions, processing, and cataloging workflows were quite separate from those of e-resource management” and the staff supporting e-resources were unfamiliar with each workflow.5 Mackinder calls workflows the “seemingly endless challenge” because “the staff time and effort involved in crafting, implementing, and revising process documentation can be overwhelming,” in part because the workflows are not linear.6 Four years after creating her “ER lifecycle,” she is still “workflow brainstorming” because “change is the status quo” in this field.7

As librarians describe their unique challenges with workflows, software is often evaluated in the literature as a possible solution. Duke University Libraries turned to IBM’s BlueWorks Live and Business Process Manager to improve e-resources workflows following what they called a “fallout” from cumulative errors made over several years.8 At Ohio State University Libraries, Feather examined tools to improve e-resources communication workflows, productivity, and efficiency. This makes sense, as her study and others found email to be a main tool used for reporting access issues and troubleshooting e-resources.9 Electronic Resource Management (ERM) systems are one of the latest tools discussed to help libraries with their workflows. Although ERMs are good at “issuing renewal reminders . . . they are less successful with more complex workflow issues.”10 At UIS, the ERM is but one piece of the workflow, and staff also rely on the library’s intranet and face-to-face meetings.11 In 2008, Emery reported that “in theory, ERMs are a winner . . . yet, in practice, we have discovered that ERMs do not immediately solve all the problems as we expected.” Grogg examined ERMs in 2008 and again in 2011 with Collins, but still found unfavorable reviews where workflow was concerned, calling it “one of the biggest deficiencies (and disappointments) of ERMS functionality.”12 ERMs are continually improving, but their pros and cons are still being discussed, as evidenced by a 2014 ALA Midwinter Meeting panel on this topic.13 Nearly all panelists expressed that ERMs helped overcome some workflow problems but that they are only one tool for e-resource management.

Moving beyond software, defining “core competencies” for e-resources librarians is another approach found in the literature.14 Proponents contend that “cross functional, cross-trained” teams skilled in communication, problem-solving, and licensing models, who are flexible, persistent, and understand the organizational structure, will have a “high rate of problem resolution and user satisfaction.”15 In addition, the phrase “best practices” is often used but has yet to be fully fleshed out. For example, Samples and Healy describe a “need for libraries to develop best practices for troubleshooting electronic resources.”16 Pomerantz surveyed more than two hundred librarians and concluded that there is “a great deal of variation in practices and inconsistency in training experiences” and that a “set of best practices” is needed.17 Although Samples and Healy were referring mostly to proactive troubleshooting and Pomerantz was referring to the role acquisitions librarians play in e-resource management, the sentiment applies to the broader picture, as shown by the development of Techniques for Electronic Resource Management (TERMS).18 TERMS began in 2008 following a discussion about “what was lacking both in current practice and with the systems available” and has grown to be a reference point for managing e-resources.19 The academic literature is sparse on systematic implementation of TERMS, although TERMS workshops have started occurring at conferences such as Electronic Resources and Libraries (ER&L).20 TERMS provides “feedback from those in the field who are actively managing electronic resources” with what Mackinder calls invaluable “real-world data” that creates a “shared understanding” that can help with e-resources management.21

All these efforts are necessary because e-resources teams “must be responsive to the high expectations of users and other library staff.”22 Samples and Healy identified that “initiating a troubleshooting workflow can come from two main avenues—library staff and patrons.”23 Library literature includes an abundance of papers on patron perceptions and opinions, particularly as LibQUAL assessment moves into its second decade, yet little has been written about front-line staff expectations and preferences regarding e-resource error reporting.24 Foster and Williams’ 2010 article is one of the few recent studies to include library staff, focusing on the factors that lead staff to report errors and how to encourage more reporting. They refer to front-line staff as a “vital group in identifying problems” as they are “best positioned to discover problems with resources” that may not be revealed through other work done by e-resources staff.25

Several papers discuss using error reporting to improve e-resources workflows, but many of the data sets are now nearly a decade old.26 Samples and Healy’s 2013 survey polled libraries about error reporting forms and showed that just over half of respondents (57 percent) had a single form designed for both staff and patrons, and they echoed Dowdy and Raeford’s sentiment that “effective communication across units is hampered by inefficient and largely non-automated techniques.”27

Given that front-line staff are well situated to discover problems and that e-resources workflows constantly need improvement, the project discussed in this paper focuses on one gap in the literature: front-line staff’s preferences and perceptions around reporting errors found in the library’s discovery layer and catalog.

Background

McGill University is a research university with approximately 22,000 undergraduate students and 10,000 students in masters, doctoral, and postdoctoral programs. McGill Library is an Association of Research Libraries (ARL) member, with 174 employees—63 librarians and 111 full-time library staff, located in ten urban branches and one suburban branch. Its Collection Services department manages tasks related to cataloging, metadata, acquisitions, processing and most aspects of maintaining the discovery layer. In 2012–13, through attrition and austerity measures, the number of Collection Services staff decreased from 55 to 36 and the department was restructured to rebalance workloads. The ten-person “Serials, E-resources and Acquisitions” division became the “E-resources and Serials” division with two librarians and four staff, managing cataloging and access related to print and e-journals, databases, and streaming media, with primary responsibility for maintaining the discovery layer. This division also triages questions from patrons and staff sent to the Collection Services email account, as the majority of the questions are related to e-resources. Other types of questions, such as print cataloging or processing questions, are forwarded to the appropriate division.

During this period, the library administration moved to an “e-preferred” collection policy, prompted by concerns about the lack of shelf space for print material and by the greater purchasing power gained from buying e-books in bulk packages. The e-book collection continued to grow as faculty and students provided positive feedback about e-books from publishers with unlimited simultaneous users. As the number of e-book acquisitions grew faster than they could be cataloged, the backlog swelled to over one million e-books. A new “E-books Cataloging” division was created, and the e-resources staff member who worked on e-books moved to this new division along with two others who had previously worked with print material. Formerly, at least one staff member in each division was dedicated to acquisitions. For example, one staff member in the e-resources division worked primarily on acquisitions tasks for e-resources; several people in (print) Cataloging completed acquisitions tasks for print material. These disparate acquisitions staff were merged into a new “Collection Development” division. Through attrition, Collection Development decreased from eight people to five during this period, and staff were not replaced because of financial constraints. As staffing numbers were reduced through attrition, the library decided to outsource most cataloging of current physical material (i.e., shelf-ready monographs). The remaining “Processing and Cataloging” division handles rush monograph cataloging and related end processing. A cataloging backlog of rare material became a priority for the library administration, who wanted to highlight unique items in the collection. Several people who had been doing a variety of cataloging and processing tasks were moved into the Rare and Special Collections Cataloging division to address this priority, bringing that division up to seven people.

During these organizational changes, in 2012–13 the e-resources team completed a soft implementation of the discovery layer while maintaining the traditional catalog. The number of questions directed to Collection Services increased during and after the discovery layer implementation for three broad reasons: public services staff did not know who to contact for help, the discovery layer came with a learning curve, and it exposed more e-resources and access issues than the library’s traditional catalog.

A year after the restructuring, in November 2013, the library prioritized the need to “improve mechanisms for reporting and responding to problems” with the discovery systems and the catalog during a strategic planning session.28 Before the planning session, the process for reporting problems to Collection Services consisted of a mix of phone calls, email, web forms, and in-person visits. Front-line staff had difficulty remembering which form or email address to use to report problems. Foster and Williams reported a similar issue at Milner Library, where the “reason most often given for why someone was not likely to report a problem was being unsure of how or to whom to report” it.29 The planning session also revealed that front-line staff felt Collection Services responses were often delayed or nonexistent. Many e-resources staff were frustrated and overwhelmed by the organizational and workload changes, and the lack of clear workflows.

To address some of these issues, email service accounts were created to relieve front-line staff from remembering who performed each task and to allow for workload sharing. However, so many service accounts were created that front-line staff then had difficulty remembering which account to use for each problem. In 2014, a single Collection Services email account was created and staff were encouraged to use it for all questions, from purchasing to cataloging to access. This mailbox is triaged by the E-resources and Serials division.

In June 2014, the library officially launched its discovery layer and a new link resolver. Noticeable changes included making the new discovery layer the default search on the library’s website, updating the look and behavior of the link resolver, and removing all e-resources from the legacy catalog. Although it is not prominently displayed on the website, the legacy catalog remains available and can be used to locate circulation information for nonelectronic materials such as print books and journals.

The staff restructuring, in tandem with the migration to the new discovery layer and link resolver, was the catalyst for this research project. The number of issues reported increased dramatically while there were fewer staff to respond, which exposed inefficiencies and gaps in existing workflows. The library’s strategic goal of improving mechanisms for reporting problems became paramount for the e-resources staff as they searched for new ways to manage the workload.

Method

Data were collected in three ways: statistics on reported errors, an online survey, and personal interviews. Statistics were collected for errors reported to Collection Services during a one-month period. This provided a sample that could be analyzed and compared against data and comments collected through the online survey and interviews. Data were compiled and analyzed in Microsoft Excel spreadsheets. Comments were summarized to ensure anonymity, which was important for gaining the staff’s trust and helped produce a higher response rate.

Statistics on Reported Errors

During October 2014, errors reported to the Collection Services and e-resources mailboxes were monitored, as were errors reported through the “Report a Problem” form in the legacy catalog, which is typically used to correct print holdings or other errors. Errors reported directly to e-resources staff by phone, email, or in person were also included. Although errors had never been systematically tracked, October was selected for the project because, anecdotally, it seemed to be the month in which the most errors were reported every year. As with many academic libraries, students seem to start using the library’s resources more heavily in October because of midterm exams and papers and the beginning of group project work that is due at the end of the term.

Errors are normally triaged by several Collection Services staff. As a pilot method for this project, the author triaged the majority of the errors. Responses were provided by the author and other Collection Services staff. To mimic normal working conditions outside the project, work was done only during regular business hours and staff were not encouraged to work faster than normal. Even with these parameters, the pilot method of one person triaging the errors may have created artificial response times, as discussed later in this paper. To provide more conclusive results in this area beyond a pilot, the triage method would need to be assessed and addressed further.

Since Collection Services does not use an automated mechanism for tracking errors, during the project the following was entered into an Excel spreadsheet for each error (a sketch of such a record follows the list):

  • date and time the issue was submitted (using the email timestamp or time the person phoned/visited)
  • date and time the issue was first viewed/heard by Collection Services staff
  • name and division of the person reporting the issue (e.g., front-line staff, ILL, etc.)
  • method of delivery (e.g., email, phone, in person)
  • division responsible for responding (e.g., e-resources, e-books, Collection Development)
  • if/how and when an acknowledgement was provided to the sender (e.g., verbally or by email)
  • description of the issue

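To make the tracked fields concrete, the sketch below expresses one such record as a small data structure. It is illustrative only: the field names, types, and comments are assumptions mirroring the list above, not the actual spreadsheet layout used during the project.

```python
# Illustrative sketch only: a per-error record mirroring the fields listed above.
# Field names and types are assumptions; the project tracked these in Excel.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class ErrorReport:
    submitted_at: datetime                 # email timestamp, or time of the call/visit
    viewed_at: Optional[datetime]          # when Collection Services staff first saw it
    reporter_name: str                     # person reporting the issue
    reporter_division: str                 # e.g., "front-line staff", "ILL"
    delivery_method: str                   # e.g., "email", "phone", "in person"
    responsible_division: str              # e.g., "e-resources", "e-books", "Collection Development"
    description: str                       # brief description of the issue
    acknowledged_at: Optional[datetime] = None    # when an acknowledgement was sent, if any
    acknowledgement_method: Optional[str] = None  # e.g., "email", "verbal"
    resolved_at: Optional[datetime] = None        # filled in only once the error is resolved
```
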
Noting the time differences between when errors were submitted and when they were first viewed by Collection Services staff served two purposes:

  1. to detect delays between when errors are submitted and when they are viewed by Collection Services staff;
  2. to provide possible explanations for longer response times when errors are submitted after business hours.

The staff who triage errors sent by email use a schedule so that at a given time, typically only one person is managing the inbox. This prevents multiple people from accidentally working on the same problem at the same time. Anecdotally, it was common practice for staff to begin resolving issues immediately upon first viewing of the report, and thus “viewed by Collection Services” captured the start of the process to resolve the error.

Tracking how and when acknowledgements were provided was in response to concerns from front-line staff who felt that reported errors were never addressed. As common practice, the e-resources division sends email acknowledgements for errors that they expect will take longer than a day to be resolved and when errors are forwarded outside of the division. Acknowledgements are not sent automatically, and occasionally staff forget to send them as it is not an explicit policy. This practice was not altered during the project.

For resolved errors, the following was added to the Excel spreadsheet (a worked example of the two response-time measures follows the list):

  • resolution date and time
  • “response time from issue sent”: the time difference between when the issue was sent and when it was resolved
  • “response time from issue viewed”: the difference in time between when the issue was viewed by Collection Services staff and when it was resolved
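
As a worked illustration of these two measures, the short sketch below computes them for a single hypothetical error that was emailed after business hours, read the next morning, and fixed shortly after it was read. The timestamps are invented for the example.

```python
from datetime import datetime

# Invented timestamps for one hypothetical error reported by email overnight.
submitted_at = datetime(2014, 10, 6, 22, 15)  # emailed at 10:15 p.m.
viewed_at = datetime(2014, 10, 7, 9, 0)       # first read by staff the next morning
resolved_at = datetime(2014, 10, 7, 9, 40)    # fixed forty minutes after being read

response_from_sent = resolved_at - submitted_at   # "response time from issue sent"
response_from_viewed = resolved_at - viewed_at    # "response time from issue viewed"

print(response_from_sent)    # 11:25:00
print(response_from_viewed)  # 0:40:00
```

The gap between the two measures is largest for errors submitted outside business hours, which is why both were recorded.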

Online Survey

The author searched for an existing survey tool to evaluate staff preferences and perceptions, particularly related to error reporting. Foster and Williams designed an online survey tool to gather feedback from library employees, but as their survey is longer and more detailed than desired, it was not used.25 Thus, the author drafted a survey and collaborated with colleagues to establish validity (see the appendix).

The survey was created using LimeSurvey, open-source survey software (https://www.limesurvey.org). One advantage of using this program is that the raw data can be received in a variety of formats and there are settings to ensure anonymity. The survey was designed to be completed in ten minutes or less to elicit a high response rate. Visually, it is a single online page of ten numbered questions; some questions have multiple, related parts, so participants are actually asked fifteen questions.

The final two questions are open-ended and the remaining questions are a mixture of multiple choice and forced choice. All questions are optional, and participants can exit at any time. Controls were not in place to prevent individuals from responding multiple times since this appeared to be a low risk. It was assumed that staff would take the survey seriously and want to improve service.

The questions were grouped into these categories:

  • current behavior when reporting errors (questions 1–3)
  • expectations and preferences for reporting errors (questions 4–7a)
  • perceptions of response times and quality of responses (questions 7b–9)
  • comments (questions 10a and 10b)

In addition to general comments, respondents were asked to describe a time when they were not satisfied with Collection Services and to describe what could have been done differently for a more satisfactory result. The objective of this question was to gather commonalities between the historical “worst case scenarios.”

The questions, methods for distribution and dissemination of results were approved through the university’s Institutional Review Board (IRB). Because of the potentially sensitive results and the possibility for employees to provide unfavorable feedback on their colleagues’ work or to be accidentally identified, raw results were only viewed by the author and were anonymized and summarized before they were shared.

The survey was distributed by email to all (174) full-time library employees. The email instructed employees who were not front-line staff not to respond. A second email was sent to each division that was not considered front-line, reminding them again not to respond. Collection Services staff were also reminded verbally, as responding would mean reporting on their own work and would invalidate results. Although there is no guarantee that other (non-front-line) staff did not respond, this risk is assumed to be low, not only because of the strong reminders but also because staff were interested in the results and supportive of the survey and of improving service. The survey email was sent by the library’s communication officer, as she does not supervise any employees; this minimized potential pressure to respond or to answer favorably, as stipulated by the IRB. This left 103 front-line staff as potential respondents; 56 people responded, yielding a 54 percent response rate.

Personal Interviews

At the end of the survey, participants were invited to email the author if they were interested in completing an in-depth interview. Six weeks after the survey closed, an additional email request for volunteers from the front-line staff was sent to all full-time employees using the same email distribution method as was used for the survey. This resulted in eight volunteers who completed the interviews in January and February 2015. A separate consent form was used for this part of the project, as the data was confidential but no longer anonymous. As stipulated by the IRB, only questions from the online survey could be asked during the interview; however, they could be asked in a different order. To facilitate an easy flow of conversation, all interviews started with the final question from the survey, asking interviewees if they would like to provide general feedback. The author then asked the questions from the online survey in their original order, skipping questions if they had already been answered through the normal course of conversation.

Results

Methods for Reporting Errors

During the project, 296 errors were reported in a variety of ways as shown in table 1. As nearly three-quarters of the errors were sent through the Collection Services email account, it is clear that using this single email account was the preferred reporting method during the project period.

Survey questions 1–3 asked which methods were used the last time respondents reported different types of errors, including missing information in a record, inability to find known items using the discovery layer, and subscription or access problems. Respondents could select multiple responses, but for all error types there was a clear preference for emailing service accounts, as shown in table 2. One interviewee stated, “In the past it wasn’t always clear who we were supposed to report to . . . it’s much clearer now with one service account.” Most interviewees echoed this sentiment, specifying that not having to find email addresses or names of people responsible for each division makes reporting errors faster and less frustrating. Responses in the survey’s “Other” comment box and some interviewees cited time constraints as a reason why an issue might not be reported.

Some survey comments and three interviewees mentioned a newly created pilot web form. It was not included as an option in the survey, as it was still being tested and not yet available to all staff. Due to pressure to re-create the old “Report a problem” form that was available in the legacy catalog, the e-resources division designed a new web form that includes fields for the title, URL, format (e.g., e-book, database, etc.), type of problem (e.g., broken link, missing print holdings, etc.) and a comment box for additional information. Upon clicking “Submit,” an email is sent to Collection Services. Many said completing the new pilot web form is faster than writing an email. One interviewee said, “Once the form was created, I stopped using the service account to report basic errors.”
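
The paper does not describe how the pilot form is implemented. As a purely hypothetical sketch, the handler below shows one way such a submission could be turned into an email addressed to the Collection Services service account; the route, field names, and address are invented for illustration and are not the library’s actual form.

```python
# Hypothetical sketch of a "report a problem" handler; not the library's actual form.
from email.message import EmailMessage
from flask import Flask, request

app = Flask(__name__)
SERVICE_ACCOUNT = "collection.services@example.edu"  # placeholder address

@app.route("/report-a-problem", methods=["POST"])
def report_a_problem():
    form = request.form
    msg = EmailMessage()
    msg["To"] = SERVICE_ACCOUNT
    msg["Subject"] = f"[Report a problem] {form.get('problem_type', 'unspecified')}"
    msg.set_content(
        f"Title: {form.get('title', '')}\n"
        f"URL: {form.get('url', '')}\n"
        f"Format: {form.get('format', '')}\n"                  # e.g., e-book, database
        f"Type of problem: {form.get('problem_type', '')}\n"   # e.g., broken link
        f"Comments: {form.get('comments', '')}\n"
    )
    # In production the message would be handed to a mail server; here it is only
    # constructed, so the sketch runs without any mail infrastructure.
    return "Thank you. Your report has been sent to Collection Services."
```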

Types of Errors Reported

Nearly 78 percent of the errors were related to e-books (29.4 percent) and e-resources (48.3 percent), as shown in table 3. As e-book errors are resolved by a separate division, for this project they were considered separately from errors related to e-resources (databases, e-journals, etc.), which are resolved by the E-resources and Serials division.

Resolution Rates and Response Quality

Of the 296 errors reported during the project, 59 percent (175) were resolved by e-resources staff (see table 4). Although 10 percent (29) were assigned but unresolved at the end of the data collection period, the majority of these were resolved in the weeks immediately after the project closed.

A quarter of the errors were coded as “forwarded internally” and were no longer deemed e-resources’ responsibility. These errors were passed to other Collection Services divisions, and were typically subscription problems sent to Collection Development (5 percent) and access errors sent to the E-books Cataloging division (18 percent). It should be noted that while eighty-seven errors were coded as “e-book” errors (table 3), only seventy-five were forwarded to the E-books Cataloging division (table 4). The remaining twelve e-book errors were handled by the e-resources staff member triaging the mailbox because they were short, customer-service questions, such as how to print from an e-book platform, rather than errors that required cataloging expertise. Partly because of the outcome of this project, all e-book questions are now forwarded to the E-books Cataloging team. The remaining forwarded errors were unique, “one-off” questions related to database maintenance, interlibrary loans, and patron reporting. It was outside the scope of the project to track response times for other divisions, although anecdotally, all e-book errors were resolved within two days. Errors forwarded to Collection Development are discussed in more detail later.

Five percent of the errors were sent to OCLC, the vendor responsible for the library’s discovery layer and copy cataloging records. None of these errors had been resolved by January 31, 2015, three months after the study closed, but e-resources staff continue to track and follow up on these errors. OCLC provided various responses, including that some features were not yet available and that system updates were being developed that would include resolutions.

Of the survey respondents who answered questions about quality (question 7b), about two-thirds were “satisfied” with the response they received the last time they reported an issue to Collection Services. Question 9 asked respondents to select the statement that best represents them regarding reporting errors; half of those who answered the question indicated that they felt their errors were answered to the best of the staff’s abilities, as shown in table 5. All of the interviewees indicated this sentiment as a typical experience, except for subscription problems and errors forwarded to OCLC.

Question 10 asked respondents to describe a time when they were not satisfied and to comment on what Collection Services could have done differently. Fourteen people provided examples and four suggested improvements without specific examples. These comments can be grouped into the following themes:

  • frustration with errors that cannot be resolved by Collection Services, in particular errors forwarded to OCLC
  • frustration with little or no follow-up communication on outstanding errors
  • poor treatment by Collection Services staff
  • too much reliance on front-line staff to report errors

The remaining participants did not respond to this question or wrote that they had never had a bad experience. All interviewees said that most of the time they are generally happy with response quality. Three interviewees said they had never had a negative experience.

Response Times

Over half of the online survey respondents indicated that they preferred resolutions within the same day or next day, as shown in table 6.

When respondents were asked to indicate the response time for the last error reported to Collection Services (question 4), nearly the same number of respondents indicated that it had occurred within the same day or by the next day.

This is consistent with the response times collected during the project, where 156 errors were resolved within 24 hours of being submitted, representing 53 percent of all errors reported, or 89 percent of errors resolved by the e-resources division. Of those resolved within the twenty-four-hour period, more than half (84) were resolved within sixty minutes of submission (see table 8).

There were differences in response times when calculating from the time the issue was viewed by Collection Services staff rather than when it was sent, particularly within the twenty-four-hour window. For example, the number of errors resolved within ten minutes increased from twenty-six (counting from time sent) to eighty-seven (counting from time viewed). However, at the twenty-four-hour turnaround time, both counts are equal (156 of 296 resolutions, from time sent and from time viewed). Of the errors resolved by the e-resources staff, all but one were resolved within five days of viewing; the outlier was resolved in fourteen days because it required the help of two different vendors.
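
As an illustration of how these cumulative buckets behave, the sketch below tallies a handful of invented resolution times against the ten-minute, sixty-minute, and twenty-four-hour thresholds, once counted from submission and once from first viewing. The sample values are not the project data; they simply show why the short buckets can differ sharply between the two baselines while the twenty-four-hour counts coincide.

```python
from datetime import timedelta

def resolved_within(times, threshold):
    """Count how many resolution times fall at or under a threshold."""
    return sum(1 for t in times if t <= threshold)

# Invented resolution times for four errors, measured two ways.
from_sent = [timedelta(minutes=8), timedelta(hours=3),
             timedelta(hours=11, minutes=25), timedelta(days=2)]
from_viewed = [timedelta(minutes=5), timedelta(minutes=8),
               timedelta(minutes=40), timedelta(days=1, hours=22)]

thresholds = [("10 min", timedelta(minutes=10)),
              ("60 min", timedelta(minutes=60)),
              ("24 hr", timedelta(hours=24))]

for label, times in (("from time sent", from_sent), ("from time viewed", from_viewed)):
    counts = ", ".join(f"<= {name}: {resolved_within(times, limit)}" for name, limit in thresholds)
    print(f"{label:17} {counts}")
```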

Every interviewee indicated that most of the time they felt errors were resolved in a timely manner. Four people indicated that “occasionally” their reported errors were never answered, but that this happened less frequently now than in years past. Some survey respondents and interviewees perceived problems with response times for errors related to subscriptions and renewals. Several people stated that even though many of the errors forwarded internally were resolved quickly, the poor response times and lack of follow-up on the few outstanding subscription errors were so significant that they overshadowed the positive resolutions. As with all data collected, these comments were summarized for anonymity and then shared first with Collection Services division coordinators, along with recommendations for moving forward. The proposed solutions to these challenges are discussed in the Recommendations section.

Communication

During the project, email was the most commonly used method of communication, with 94.9 percent of errors reported via one of three email options (the sum of 73.3 percent to Collection Services, 13.5 percent to individuals, and 8.1 percent to the e-resources service account). Survey results indicated that email acknowledgements are preferred, with thirty-two participants preferring them and only seven indicating that they did not (in response to question 5). However, many respondents commented that acknowledgements are preferred only when resolutions cannot be provided within the same day. During the project, acknowledgements were delivered 65 percent of the time. They were not sent for errors that were expected to be resolved within a few hours, as an email confirming resolution was sent instead; this is common practice for the e-resources division. The interviewees were split on the usefulness of acknowledgements: two said acknowledgements helped them track outstanding errors, two wanted only resolutions, and the remainder were neutral on this topic.

Most survey respondents and interviewees indicated neutral or positive encounters regarding communication with Collection Services staff, but as indicated earlier, several respondents mentioned that they felt Collection Services staff were sometimes unresponsive or rude, although all said that this happened less often than in previous years. They wished updates were sent proactively by Collection Services staff, particularly when there are delays. Several others mentioned feeling “lost” or not knowing the procedures for following up on their errors themselves.

Discussion

It is indicative of the times that the majority of the errors reported during the project concerned e-resources rather than print material. This shift has been documented by many in recent years, including Henderson and Bosch, who predicted in 2010 that a “shift from print to digital is likely to accelerate greatly.”30 This is certainly true for McGill Library, further adding to the evidence found in the literature for a need to focus on e-resources workflows and management. Like Dowdy and Raeford at Duke University, taking stock of the existing environment and workflow helped staff at McGill Library determine a course of action for improvement.31 In part, this project aimed to establish commonalities in the types of errors reported to better understand the situation. The following themes emerged from the interviews and the online survey comments:

  • frustration with data errors that cannot be fixed in-house and must be forwarded to OCLC
  • cases where front-line staff felt the communication from Collection Services was unpleasant
  • difficulty receiving answers for subscription questions
  • front-line staff feel they are relied upon too heavily to report errors found in the discovery layer, that this is beyond their responsibilities, and that Collection Services should be doing more to proactively fix these errors

Examples of errors that are forwarded to OCLC typically involve incorrect metadata. Some metadata can be corrected locally, while other metadata can be fixed only by OCLC. For example, when the discovery layer provides a link to an e-book or e-journal that has the same title as the item in the record but is actually a different item (with a different author, ISSN, etc.), the correction can only be made by OCLC. It is often several months before these errors are resolved.

Difficulty with subscription issues stems partly from silos of information, where updates are not shared across Collection Services divisions, which can cause delays when changes are made to subscriptions. McGill Library is not alone in this struggle; Samples and Healy also found silos to be one of the main points of workflow failure reported by ARL e-resources librarians.32

The final theme revealed through survey comments was that front-line staff feel they are relied upon too heavily to report errors found in the discovery layer, that this is beyond their responsibilities, and that Collection Services should be doing more to proactively fix these errors. This fits with Samples and Healy’s research on the need for proactive troubleshooting best practices and Dowdy and Raeford’s recommendation for proactive quality control in e-resources. Examples include using a wiki or other mechanism to inform public services staff about planned database downtimes, and conducting subscription inventories to ensure that the database, e-journal, and e-book collections are all activated properly.33 Ensuring that current subscriptions have the correct links and that obsolete subscriptions are removed takes the onus of reporting access issues off patrons and staff.

One area that appears to have improved since the strategic planning session in 2013 is the confusion regarding whom in Collection Services to contact for help. During the project, 74 percent of errors were reported through the Collection Services email account, and the majority of survey respondents indicated that they used this method the last time they reported errors. Other libraries have also found email, along with web forms, to be among the most popular ways to report access errors.34 As previously mentioned, the e-resources division had created a new web form for reporting errors to automate a step in the workflow and improve efficiency. During the research project, it was tested by several staff members who provided positive feedback. Several interviewees mentioned that the form was a “huge improvement,” and one interviewee said, “I’m reporting more lately because I love the form.”

This feedback suggests that streamlining the reporting process has made it easier for front-line staff to report errors. Triaging through the one service account is also easier for Collection Services staff as several people can monitor the account, staff can share the workload, and scheduling is less of a concern. As Feather notes, there is a danger of using personal accounts as “if one person is absent and receives a message, no one else will be able to respond to it in a timely manner.”35

Even though many front-line staff had previously said they were unsure of who to contact, it is interesting that emailing individuals in Collection Services is the second-highest survey response. This may be due to habit or because of a friendship between a front-line staff member and a Collection Services employee. It could also suggest that some people find they receive better service by emailing an individual that they know.

Not surprisingly, no one selected using the “chat with a librarian” service (QuestionPoint) to report errors. It was added to the survey to see if anyone preferred using this method. This service is only occasionally staffed by Collection Services librarians, and is not currently used to communicate between staff at McGill Library, so the result was expected.

Resolution Rates

As it was outside the scope of the project to analyze errors forwarded to other Collection Services divisions, resolution rates were only included for errors answered by the e-resources division. At the end of the project, the division staff was surprised at the high number of resolutions: 83 percent of those assigned to e-resources (or 59 percent of all errors) were resolved within the project timeframe, and the remainder were resolved in the weeks after the project ended. It speaks to human nature that staff remember the errors that they were unable to resolve or that took longer than expected.

In contrast, all errors reported to OCLC remained unresolved during the project timeframe and for many months afterward. Some were never resolved and some were marked as “features” for the future. Although these represented only 5 percent of reported errors during the project, the volume of comments and level of frustration from survey respondents and interviewees far outweighed what the statistics demonstrate. If OCLC errors continue to be unresolved for long periods of time, front-line staff may stop reporting them, as shown at Milner Library, where staff do not always report problems that they “figure can’t be fixed.”36

Response Times

Many survey respondents and interviewees noted that most of the time, responses arrived within the preferred timeframe of the same day or the next day, and this timeframe was consistent with data collected during the project. However, unlike the other data collected in the research project, the pilot method of one person triaging the errors is not true to the working environment and may have affected response times. Anecdotally, response times during the project appeared to resemble response times external to the project, but this cannot be stated with certainty. It is notable that during the project, the differences in response times counting from when the issue was sent versus when it was viewed by Collection Services occurred only within the twenty-four-hour timeframe. After that point, both counts are equal (156 of 296 resolutions, from time sent and from time viewed). Thus, for the duration of the project, about half of all reported errors were resolved within twenty-four hours regardless of whether the count began when they were sent or when they were viewed.

Response times for errors relating to subscriptions were highlighted in the survey and interviews as an area that required particular improvement. This perception underscored the need to break down information silos between the Collection Services divisions and create a better workflow for e-resources acquisitions. As Mackinder pointed out, e-resources workflows “are in a near-constant state of flux by forces that are mostly outside of our control” including “shifting staff dynamics.”37 This holds true at McGill Library, where Collection Development, the division responsible for acquisitions, has faced high turnover since 2012 and new employees face a steep learning curve as they try to cope with the volume of work.

Communication

Poor communication from Collection Services staff was emphasized throughout the survey responses and interviews. It is common practice for e-resources staff to send acknowledgements, and this occurred 65 percent of the time during the project. Beyond showing receipt of the issue, the acknowledgement also demonstrates to the front-line staff member that the question is understood and implies that the issue will be investigated.

Interviewees were split on the usefulness of acknowledgements but over half (57 percent) of survey respondents preferred acknowledgements, particularly for responses that would take “a long time,” or “longer than a day.” This suggests that e-resources should continue their common practice and send acknowledgements for errors when there are delays.

Front-line staff also asked for more follow-up communication about subscription problems. Unlike fixing broken links, subscription errors typically take extra time to resolve because resolutions require responses from vendors. Given the staffing changes mentioned earlier, Collection Development is particularly low on human resources; the division is stretched thin and has little time for providing updates. As Dowdy and Raeford found at Duke, “there was a lack of transparency with information” that is “time consuming to dig out,” and it is difficult to know when something has dropped out of the process.38

Recommendations

The recommendations below are specific to McGill Library, but similar improvements could be made in many other libraries. It is evident from the project data that the library should continue promoting the single Collection Services email account. Given the popularity of web forms at other academic libraries and the positive feedback received thus far, the new web form for reporting errors should be made available to all staff.

Many of the front-line staff preferences reported throughout the project point to implementing better workflows for reporting access errors and being proactive about managing e-resources. To facilitate workflows, several other divisions in the library use formal ticketing systems, which could be investigated by the e-resources division as a possible way of showing front-line staff the status of outstanding errors, work assignments, and whom to contact for more information. This option is popular among libraries: according to the Samples and Healy study, 43 percent of libraries that responded to their survey use a ticketing system to manage errors.39 Alternatively, a simpler, informal approach may be more appropriate, such as a dedicated page on the library’s intranet. Both solutions should increase transparency and help with proactive and reactive troubleshooting. Each would need to be evaluated for effectiveness and for how much it increases the workload.

In addition to investigating a tracking mechanism, it is clear that all communication surrounding acquisitions needs improvement, both within the Collection Services divisions and with the front-line staff. Librarians in acquisitions “may see the complexity of their ‘acquire’ portion of the lifecycle, yet not have much sense of the ‘provide access’ and ‘provide support’ workflows that make what is acquired actually accessible.”40 Pomerantz’s research noted the need for staff to collaborate and “to develop a set of best practices for the acquisition of electronic resources” to help cope with the changes in the acquisitions model from print to electronic.41 As a direct result of this project, a monthly meeting of all Collection Services staff who work on acquisitions-related tasks, regardless of division, was recommended. While Collection Development handles the bulk of this work, the E-books Cataloging and E-resources and Serials staff provide access, troubleshoot problems with new subscriptions, and liaise with front-line staff and vendors. The meetings allow everyone to share information and to collaborate on additional improvements to the workflow. They also help resolve subscription problems more quickly.

As there were several examples from the survey and interview data that indicated that Collection Services staff were sometimes rude or sarcastic, and that front-line staff sometimes felt that they were “bothering” them, a future look into the tone of responses is warranted. Investigation and resolution of this issue was outside the scope of the project but one possible approach could involve creating template or standard responses when troubleshooting with front-line staff.

The idea that e-resources units should collaborate closely with front-line staff to provide excellent service is repeated in many studies and was demonstrated through this project.42 As recommended by some survey respondents, Collection Services information sessions on various topics could facilitate such collaboration. These could include an overview of each division’s primary function and its employees, as well as open sessions where front-line staff can have their questions answered by a panel of Collection Services staff. It is also recommended that this type of information be added to the library’s intranet. Similarly, supervisors from each Collection Services division are encouraged to visit each branch library at least annually to facilitate knowledge sharing between front-line staff and Collection Services.

Limitations

As the online survey was sent to all staff and results were anonymous, there is a risk that employees who are not front-line staff responded. There is also a risk that the same person could have completed the survey multiple times. However, it is assumed that these risks are minimal.

Typically two or three people in the e-resources division triage errors. It is a limitation of the method that the same individual triaged the errors during the month of data collection, creating an artificial environment that may have affected response times. Other staff may triage at different rates. To determine whether response times during the pilot are representative of the real working environment, the study would need to be replicated using standard procedures (i.e., having the entire team triage errors). As much as possible, precautions were taken to remind staff to respond at a normal rate (i.e., not faster than usual), and work was done only during business hours to minimize the potential for misrepresentation.

Response times may also have been affected by the way the information was collected, as each method for reporting presents information in a different manner. A web form collects specific, sparse information compared with a phone call. As this is true for work outside of the project as well, there was no tabulation for differences in response rates based on the reporting method.

Another limitation of this study is that it was beyond the scope to track resolution times for errors forwarded to other divisions in Collection Services. Even when resolutions by other divisions were known, the affected 25 percent are listed as “forwarded internally” rather than “resolved.” If the project is repeated, full data should be captured to provide a more comprehensive picture.

Conclusion

This project focused on error reporting by front-line staff, identifying how errors are reported and preferences for reporting them. It also shed light on many other areas where Collection Services can improve, including workflows and communication. It demonstrated that in all aspects, from receiving to tracking to resolving errors, efficiency will improve when Collection Services divisions can successfully communicate and collaborate with each other and with front-line staff.

References

  1. A. S. Chandel and Mukesh Saikia, “Challenges and Opportunities of E-resources,” Annals of Library & Information Studies 59, no. 3 (2012): 148–54; Beverly Dowdy and Rosalyn Raeford, “Electronic Resources Workflow: Design, Analysis and Technologies for an Overdue Solution,” Serials Review 40, no. 3 (2014): 175–87, http://dx.doi.org/10.1080/00987913.2014.950040; Jill Emery, “Beginning to See the Light: Developing a Discourse for Electronic Resource Management,” Serials Librarian 47, no. 4 (2005): 137–47, http://dx.doi.org/10.1300/J123v47n04_13; Jill E. Grogg, “Electronic Resource Management Systems in Practice,” Journal of Electronic Resources Librarianship 20, no. 2 (2008): 86–89; Richard P. Jasper, “Collaborative Roles in Managing Electronic Publications,” Library Collections, Acquisitions, & Technical Service 26, no. 4 (2002): 355–61.
  2. Jeannette Ho, “Enhancing Access to Resources through the Online Catalog and the Library Web Site: A Collaboration Between Public and Technical Services at Texas A&M University Libraries,” Technical Services Quarterly 22, no. 4 (2005): 19–37, http://dx.doi.org/10.1300/J124v22n04_02; Taryn Resnick et al., “E-resources: Transforming Access Services for the Digital Age,” Library Hi Tech 26, no. 1 (2008): 141–54, http://dx.doi.org/10.1108/07378830810857861; Jacquie Samples and Ciara Healy, “Making it Look Easy: Maintaining the Magic of Access,” Serials Review 40, no. 2 (2014): 105–17, http://dx.doi.org/10.1080/00987913.2014.929483.
  3. Chandel and Saikia, “Challenges and Opportunities of E-resources”; Jasper, “Collaborative Roles in Managing Electronic Publications”; Lisa Mackinder, “The Seemingly Endless Challenge: Workflows,” Serials Librarian 67, no. 2 (2014): 158–65, http://dx.doi.org/10.1080/0361526X.2014.940481; Sarah B. Pomerantz, “The Role of the Acquisitions Librarian in Electronic Resources Management,” Journal of Electronic Resources Librarianship 22, no. 1–2 (2010): 40–48, http://dx.doi.org/10.1080/1941126X.2010.486726.
  4. Janetta Waterhouse, “Managing eResource Processes and Projects,” Serials Librarian 67, no. 1 (2014): 62, http://dx.doi.org/10.1080/0361526X.2014.899292.
  5. Ibid.
  6. Mackinder, “The Seemingly Endless Challenge,” 158–62.
  7. Ibid., 162–65.
  8. Dowdy and Raeford, “Electronic Resources Workflow,” 176.
  9. Feather, “Electronic Resources Communications Management,” 208; Resnick et al., “E-resources,” 154; Samples and Healy, “Making it Look Easy.”
  10. Grogg, “Electronic Resource Management Systems,” 89.
  11. Waterhouse, “Managing eResource Processes and Projects,” 64–67.
  12. Maria Collins and Jill E. Grogg, “Building a Better ERMS,” Library Journal 136, no. 4 (2011): 23; Grogg, “Electronic Resource Management Systems,” 89.
  13. Caitlyn Lam, “Technical Services Report: Report of the ALCTS/LITA Electronic Resources Management Interest Group Meeting, American Library Association Midwinter Meeting, Philadelphia, January 2014,” Technical Services Quarterly 31, no. 3 (2014): 271–77, http://dx.doi.org/10.1080/07317131.2014.908597.
  14. Taryn Resnick, “Core Competencies for Electronic Resource Access Services,” Journal of Electronic Resources in Medical Libraries 6, no. 2 (2009): 101–22.
  15. Ibid., 115–16.
  16. Samples and Healy, “Making it Look Easy,” 111.
  17. Pomerantz, “The Role of the Acquisitions Librarian in Electronic Resources Management,” 46.
  18. TERMS: Techniques for Electronic Resources Management blog, accessed March 1, 2015, http://6terms.tumblr.com.
  19. Jill Emery and Graham Stone, “Introduction and Literature Review,” Library Technology Reports 49, no. 2 (2013): 5.
  20. Jill Emery and Graham Stone, “Developing Workflow from TERMS: Techniques for Electronic Resource Management” (workshop presented at the Electronic Resources & Libraries Conference, Austin, TX, March 19, 2014).
  21. Mackinder, “The Seemingly Endless Challenge,” 160.
  22. Feather, “Electronic Resources Communications Management,” 228.
  23. Samples and Healy, “Making it Look Easy,” 110.
  24. Colleen Cook and Michael Maciel, “A Decade of Assessment at a Research-Extensive University Library using LibQUAL+®,” Research Library Issues, no. 271 (2010): 4–12; Bruce Thompson, Martha Kyrillidou and Colleen Cook, “Library Users’ Service Desires: A LibQUAL+ Study,” Library Quarterly 78, no. 1 (2008): 1–18, http://dx.doi.org/10.1086/523907.
  25. Anita Foster and Sarah C. Williams, “We’re All in this Together: Library Faculty and Staff and their Reporting of Electronic Resources Problems,” Journal of Electronic Resources Librarianship 22, no. 3–4 (2010): 126, http://dx.doi.org/10.1080/1941126X.2010.535738.
  26. Ho, “Enhancing Access to Resources”; Resnick et al., “E-resources.”
  27. Samples and Healy, “Making it Look Easy,” 110; Dowdy and Raeford, “Electronic Resources Workflow,” 177.
  28. McGill University Library, “Immediate Priorities,” accessed July 7, 2015, https://www.mcgill.ca/library/about/planning/immediate-priorities.
  29. Foster and Williams, “We’re All in this Together,” 130.
  30. Kittie S. Henderson and Stephen Bosch, “Seeking the New Normal: Periodicals Price Survey 2010,” Library Journal 135, no. 7 (2010): 36.
  31. Dowdy and Raeford, “Electronic Resources Workflow,” 176.
  32. Samples and Healy, “Making it Look Easy,” 111.
  33. Dowdy and Raeford, “Electronic Resources Workflow,” 187; Samples and Healy, “Making it Look Easy,” 114.
  34. Feather, “Electronic Resources Communications Management,” 208; Resnick et al., “E-resources,” 154; Samples and Healy, “Making it Look Easy,” 112.
  35. Feather, “Electronic Resources Communications Management.”
  36. Foster and Williams, “We’re All in this Together,” 130.
  37. Mackinder, “The Seemingly Endless Challenge,” 158.
  38. Dowdy and Raeford, “Electronic Resources Workflow,” 176.
  39. Samples and Healy, “Making it Look Easy,” 110.
  40. Ibid., 106.
  41. Pomerantz, “The Role of the Acquisitions Librarian,” 46.
  42. Foster and Williams, “We’re All in this Together”; Ho, “Enhancing Access to Resources”; Jasper, “Collaborative Roles in Managing Electronic Publications.”

Appendix

The following appendix includes the questions from the online survey and the first page of the survey, which indicated participants’ consent. The same questions were used in the personal interviews.

  1. Think about the most recent occasion when you were unable to find an item record (print or electronic) in WorldCat Local but you were certain that the Library owned or subscribed to that item. What did you do? Check any that apply.
    • This has never happened to me / I can’t remember.
    • Verbally told my colleague who works in Collection Services.
    • Verbally told a colleague who does not work in Collection Services.
    • Emailed one of the Collection Services general mailboxes.
    • Emailed a Collection Services staff member directly.
    • Emailed a colleague outside of Collection Services (e.g., another librarian or a supervisor).
    • Used the “Catalog Correct” function to report it.
    • Used the “Chat with a librarian” function to report it.
    • I did not report it.
    • Other
  2. Think about the most recent occasion when you noticed that a WorldCat Local record was missing some information (such as an e-book record missing the link or a print book missing a call number). What did you do? Check any that apply.
    • This has never happened to me / I can’t remember.
    • Verbally told my colleague who works in Collection Services.
    • Verbally told a colleague who does not work in Collection Services.
    • Emailed one of the Collection Services general mailboxes.
    • Emailed a Collection Services staff member directly.
    • Emailed a colleague outside of Collection Services (e.g., another librarian or a supervisor).
    • Used the “Catalog Correct” function to report it.
    • Used the “Chat with a librarian” function to report it.
    • I did not report it.
    • Other
  3. Think about the most recent occasion when you noticed or suspected a problem with a subscription to a resource (e.g., hitting a paywall when looking for articles in e-journals, unable to access an electronic resource that we subscribe to). What did you do? Check any that apply.
    • This has never happened to me / I can’t remember.
    • Verbally told my colleague who works in Collection Services.
    • Verbally told a colleague who does not work in Collection Services.
    • Emailed one of the Collection Services general mailboxes.
    • Emailed a Collection Services staff member directly.
    • Emailed a colleague outside of Collection Services (e.g., another librarian or a supervisor).
    • Used the “Catalog Correct” function to report it.
    • Used the “Chat with a librarian” function to report it.
    • I did not report it.
    • Other
  4. Think about the most recent occasion when you reported a problem regarding items in the Classic Catalog and/or WorldCat Local.
    1. Did you receive a verbal or email acknowledgement that someone in Collection Services has received your error report? Choose one of the following answers.
      • Yes
      • No
      • Unsure
    2. How long did it take for you to receive an answer or resolution regarding the reported problem? (Please choose the closest response, even if you received an answer but were not satisfied with the resolution.)
      • A few minutes
      • Within the same day
      • The next working day
      • Within a week
      • Within a month
      • I never heard back about the problem
      • Unsure
      • Other / Comments:
    3. Still thinking of this same occasion, would you consider this to be a typical response time for receiving answers/resolutions to errors/problems reported to Collection Services? Choose one of the following answers.
      • Yes, this was a typical response time.
      • No, it took less time than usual to receive a response.
      • No, it took longer than usual to receive a response.
      • Unsure
  5. After you report an error found in the Classic Catalog or WorldCat Local, would you prefer to receive an acknowledgement that someone in Collection Services has received your error report, even if an answer or resolution cannot be provided right away? Choose one of the following answers.
    • Yes, I prefer an email acknowledgement.
    • Yes, I prefer a verbal acknowledgement.
    • Yes, I prefer either an email or verbal acknowledgement.
    • No, I prefer not to receive an acknowledgement. I prefer only to be informed when the issue has been resolved.
    • It doesn’t matter to me.
    • Unsure
    • Other
  6. When you report an error found in the Classic Catalog or WorldCat Local, what is the preferable time frame for a response to be communicated? (Response in this case means that your question has been addressed, the error has been fixed, your question has been referred to someone else, or a tentative course of action has been presented; it does not necessarily mean you have received a satisfying resolution.) Choose one of the following answers.
    • A few minutes
    • Within the same day
    • By the next working day
    • Within a week
    • Within a month
    • Depends on the problem
    • I don’t have expectations for response times
    • Unsure
    • Other
  7. Think about an occasion when you reported an error found in the Classic Catalog or WorldCat Local and received a response from someone who works in Collection Services.
    1. Did you receive a response that answered your question? Choose one of the following answers.
      • Yes
      • Somewhat
      • No
      • Unsure
      • No answer
    2. Still thinking of the same occasion, were you satisfied with the response you received? Choose one of the following answers.
      • Yes
      • Somewhat
      • No
      • Unsure
      • No answer
  8. a) Select the statement that best describes you. Typically, when I report errors regarding items in the Classic Catalog to Collection Services, I: (Choose one of the following answers.)
    a. Feel like my problems are addressed in a timely manner.
    b. Feel like my problems are addressed eventually but they are not a priority.
    c. Feel like my problems are rarely addressed or not looked into at all.
    d. Feel like my problems are noted but are part of a larger problem that has not yet been resolved.
    e. Depends—sometimes a, b, c, or d.
     b) Select the statement that best describes you. Typically, when I report errors regarding items in WorldCat Local to Collection Services, I: (Choose one of the following answers.)
    a. Feel like my problems are addressed in a timely manner.
    b. Feel like my problems are addressed eventually but they are not a priority.
    c. Feel like my problems are rarely addressed or not looked into at all.
    d. Feel like my problems are noted but are part of a larger problem that has not yet been resolved.
    e. Depends—sometimes a, b, c, or d.
  9. Select the statement that best describes you. Typically, when I report errors regarding items in the Classic Catalog and/or WorldCat Local to Collection Services, I: (Choose one of the following answers.)
    a. Feel like the problems are resolved to the best of the staff’s abilities.
    b. Feel like Collection Services is aware of the problem but they do not or cannot resolve it.
    c. Feel like my particular case has been noted but it is part of a larger problem that has not yet been resolved.
    d. Feel like my problems are eventually resolved but they are not a priority.
    e. Depends—sometimes a, b, c, or d.
  10. a) Think about a time when you were not satisfied with a response that you received to a reported error or problem. What could Collection Services staff have done differently?
    b) Do you have any other comments you wish to include relating to errors and questions sent to Collection Services?

Table 1. Actual Methods for Reporting Errors

Method | % of Total Reported | No. Reported
E-mail to Collection Services mailbox | 73.3 | 217
E-mail to e-resources staff member | 13.5 | 40
E-mail to e-resources mailbox | 8.1 | 24
Phone | 3.0 | 9
Feedback form on the Library’s website | 1.0 | 3
“Report a problem” form within the catalog | 0.7 | 2
In person | 0.3 | 1
Total |  | 296

Table 2. Online Survey Responses on Methods Used for Reporting Errors

Responses | Q1. Unable to Find Item Record | Q2. Catalog Record Missing Information | Q3. Suspected Subscription Problem
This has never happened to me / I can’t remember. | 6 | 2 | 4
Verbally told my colleague who works in Collection Services. | 5 | 1 | 1
Verbally told a colleague who does not work in Collection Services. | 5 | 0 | 1
E-mailed one of the Collection Services general mailboxes. | 26 | 30 | 29
E-mailed a Collection Services staff member directly. | 7 | 10 | 10
E-mailed a colleague outside of Collection Services (e.g., another librarian or a supervisor). | 2 | 1 | 2
Used the “Report a problem” form within the traditional catalog. | 3 | 4 | 1
Used the “Chat with a librarian” function to report it. | 0 | 0 | 0
I did not report it. | 4 | 2 | 1
Other | 7 | 3 | 4

Table 3. Types of Errors Reported to Collection Services

Type of Error | % of Total Reported | No. Reported
e-resources (databases, e-journals, etc.) | 48.3 | 143
e-books | 29.4 | 87
Print books | 6.4 | 19
Print serials | 6.4 | 19
Acquisitions and subscriptions | 5.1 | 15
Database maintenance | 1.7 | 5
Films | 0.7 | 2
VPN | 0.7 | 2
Using the library’s website | 0.7 | 2
Inter-library loans | 0.3 | 1
Reports from the library’s ILS | 0.3 | 1
Total |  | 296

Table 4. Errors Resolved, Unresolved and Forwarded

Status | % of Total Reported | No. Reported
Resolved by e-resources in October | 59.1 | 175
Assigned to e-resources but unresolved as of October 31 | 9.8 | 29
Forwarded internally | 25.3 | 75
Forwarded externally to OCLC (unresolved) | 5.7 | 17
Forwarded externally to another vendor | 0.3 | 1
Total |  | 296

Table 5. Online Survey Responses on Perception of Response Quality

“Select the statement that best describes you. Typically, when I report errors regarding items in the Classic Catalog and/or WorldCat Local to Collection Services, I:”

Statement | No. Who Selected This Response | % Who Answered This Question
Feel like the problems are resolved to the best of the staff’s abilities. | 22 | 50
Feel like Collection Services is aware of the problem but they do not or cannot resolve it. | 2 | 5
Feel like my particular case has been noted but it is part of a larger problem that has not yet been resolved. | 5 | 11
Feel like my problems are eventually resolved but they are not a priority. | 0 | 0
Depends—sometimes a, b, c, or d. | 15 | 34
Total | 44 | 

Table 6. Preferences for Response Times

“When you report an error found in the Classic Catalog or WorldCat Local, what is the preferable time frame for a response to be communicated?”

Response Time | No. Who Selected This Response | % Who Answered This Question
A few minutes | 1 | 2
Within the same day | 12 | 27
By the next working day | 19 | 42
Within a week | 5 | 11
Within a month | 0 | 0
Depends on the problem | 4 | 9
I don’t have expectations for response times | 1 | 2
Unsure | 0 | 0
Other | 3 | 7
Total | 45 | 

Table 7. Perceptions of Response Times

“Thinking of the last time you reported an issue, how long did it take for you to receive an answer or resolution regarding the reported problem?”

Response Time | No. Who Selected This Response | % Who Answered This Question
A few minutes | 3 | 6.7
Within the same day | 21 | 46.7
The next working day | 7 | 15.6
Within a week | 7 | 15.6
Within a month | 1 | 2.2
I never heard back about the problem | 0 | 0.0
Unsure | 2 | 4.4
Other | 4 | 8.9
Total | 45 | 

Table 8. Response Times from When Errors Were Sent to Collection Services

Response Time | No. of Errors (cumulative count) | % of All Errors Reported | % of Resolved Errors
10 min or less | 26 | 9 | 15
30 min or less | 51 | 17 | 29
60 min or less | 84 | 28 | 48
Within a half day (4h) | 133 | 45 | 76
Within 24h | 156 | 53 | 89
Within 5 days | 173 | 58 | 99