Technical Services Assessment: A Survey of Pennsylvania Academic Libraries
Rebecca L. Mugridge
Rebecca L. Mugridge (rmugridge@albany.edu) is Associate Director for Technical Services and Library Systems, State University of New York at Albany.
Abstract
Academic libraries regularly conduct assessment of library services through the use of rubrics or assessment tools such as LibQUAL (www.libqual.org/home). Technical services activities are frequently assessed; however, the assessment is typically limited to the evaluation of specific processes. This study was designed to explore assessment activities in Pennsylvania’s academic libraries. The author designed a survey to investigate whether technical services activities are assessed, how they are assessed, who is responsible for assessment, how the results of assessment activities are shared with others, and how those results are used to improve services or for other purposes. Sixty-three libraries responded to the survey (a 53 percent response rate). Survey results show that 90 percent of academic libraries in Pennsylvania have conducted some form of assessment of technical services activities but that most of that assessment is quantitative in nature.
Assessment is a topic of great interest to academic library directors and administrators. For the purposes of this paper, assessment is defined as the process of evaluating a procedure, service, product, or person to determine its value or effectiveness. The Association of Research Libraries (ARL) has sponsored a biennial conference on the topic since 2006. The Association of College and Research Libraries (ACRL) sponsored a report published in 2010, The Value of Academic Libraries: A Comprehensive Research Review and Report, which encouraged libraries to use assessment to demonstrate the value and impact of libraries on their communities.1 The LibQUAL suite of tools (www.libqual.org) is used to assess and evaluate library services; however, the focus is on public services rather than on technical services.
In 2012, the Association for Library Collections and Technical Services (ALCTS) sponsored an interactive electronic discussion forum (e-forum) on the topic “Technical Services Statistics and Assessment,” which focused primarily on the collection and use of statistics.2 The discussion covered issues such as what statistics are collected, the difficulty of using automated systems to collect statistics, the reasons why statistics are collected, and how librarians make use of and report the statistics that they collect and maintain. The discussion concluded by questioning how well statistics address concerns that were raised in The Value of Academic Libraries, including how well they demonstrate the impact of libraries on their customers.
There are many reasons why technical services assessment can benefit library managers and administrators. Assessment findings can be used to improve effectiveness, identify areas that need improvement, and communicate with customers and other stakeholders. Communicating with customers and other stakeholders can take several forms. For example, a customer service survey, while clearly seeking feedback from customers, is also a communication vehicle that serves as an outreach tool to those customers, indicating that their opinion is important and that their feedback is valuable. Communicating the results of technical services assessment activities to stakeholders and customers shows that their opinion was heard and that it will be acted on. The results of technical services assessment activities can be used to communicate with administrators and to help make the case for increased funding, staffing, or other resources. Technical services assessment findings can also be used to inform decision-making and reduce costs, including those related to processing, vendor services, staffing, and supplies. The author believes that the evaluation and assessment of the activities and effectiveness of technical services units require more than the simple collection and reporting of statistics. It is necessary to make use of both quantitative and qualitative assessment tools to articulate technical services’ effect on the teaching and research mission of a college or university. This study is intended to investigate whether libraries assess technical services activities, how they are assessed, who is responsible for assessment, how the results of assessment activities are shared with others, and how those results are used to improve services or for other purposes.
The author examined the library science literature published between 2000 and June 2013 to determine current practices and trends in the area of technical services assessment, discovering few publications that address technical services assessment as a whole. Neither the ARL Library Assessment Conferences nor the ACRL initiatives focused significant attention on the assessment of technical services activities, despite the fact that technical services librarians and staff make up a significant portion of the employees, and therefore human resources budget, in academic libraries. A review of the proceedings of the 2006, 2008, and 2010 ARL-sponsored Library Assessment Conferences shows no sessions that specifically address the assessment of technical services activities.3 The most recent Library Assessment Conference was in 2012; while the proceedings are not yet available, a review of the program shows that no sessions address the assessment of technical services activities.4
Wright and White conducted a research project for ARL on the topic of library assessment, which was published as a SPEC Kit in 2007.5 One question they asked was which units were assessed during the five years before the survey’s distribution. Of the sixty-seven libraries that responded to this question, 75.8 percent indicated that they had done some form of assessment of cataloging, 79 percent of acquisitions, and 67.2 percent of preservation. The most frequently cited form of assessment in all three functions was statistics collection and analysis.6
The library science literature reveals many articles that address processes and workflows within and across technical services units. Webber reported on the application of program assessment techniques to electronic resources management, finding that all libraries can benefit from the use of these techniques to improve performance.7 Herrera et al. assessed the serials and monographic ordering process, using a survey to identify strategic improvements.8 Dragon and Barricella used a time-and-path study at East Carolina University’s Joyner Library to assess technical services workflow.9 Herrera also assessed cataloging and database maintenance to evaluate customer satisfaction and assist with departmental strategic planning.10 Yue and Kurt reported on the assessment of print serials management practices at the University of Nevada, Reno, nine years after they ceased checking in print periodicals.11 Their report is a reminder of the importance of follow-up assessment after workflow changes have been implemented.
Chase and Krug discussed the experiences of the Appalachian College Association (ACA) as they participated in a Council on Library and Information Resources (CLIR) grant to improve technical services work processes.12 Andreadis et al. reported on the effort to redesign technical services workflow at Denison University and Kenyon College in an effort to make better use of staff and other resources.13 Similarly, Loring addressed the assessment of technical services workflow at Smith College.14 Medeiros reported on how the Tri-College Library Consortium of Bryn Mawr, Haverford, and Swarthmore Colleges assessed issues related to the management of electronic resources.15 Godbout discussed how Wells College streamlined the workflow between the acquisitions and cataloging units.16 Using tools acquired from a workshop on continuous improvement, they were able to implement changes that made a measurable improvement in productivity. Schroeder and Howland conducted a study of shelf-ready processing, finding that shelf-ready materials were cheaper and faster to process than materials handled in-house.17 Stouthuysen et al. presented the results of their research applying a time-driven activity-based costing (TDABC) model to the acquisitions process in a Belgian university library in an effort to improve cost management.18 Their findings show that TDABC is well suited for use in a library setting and may lead to cost efficiencies.19
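TDABC reduces to two calculations: a capacity cost rate and a per-activity cost. The following Python sketch illustrates the model in its simplest form; all figures (staff cost, practical capacity, and minutes per activity) are invented for illustration and are not drawn from Stouthuysen et al.

```python
# Minimal sketch of the time-driven activity-based costing (TDABC) model.
# All figures below are hypothetical and not taken from Stouthuysen et al.

# Step 1: the capacity cost rate is the cost of the resources supplied
# divided by the practical capacity of those resources, in minutes.
total_cost = 120_000.00               # hypothetical annual acquisitions staff cost
practical_capacity = 2 * 1_600 * 60   # two FTE at ~1,600 practical hours each
cost_per_minute = total_cost / practical_capacity

# Step 2: the cost of an activity is the time it consumes times the rate.
activity_minutes = {
    "place order": 6.0,
    "receive item": 4.0,
    "process invoice": 3.0,
}
for activity, minutes in activity_minutes.items():
    print(f"{activity}: ${minutes * cost_per_minute:.2f} per occurrence")
```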
Fewer studies address the assessment of preservation activities. Brown reported on the results of her research that investigated the use of general preservation assessment to develop a preservation plan.20 She followed that paper with one that addressed to what extent libraries implemented the recommendations that resulted from their assessment activities.21 Miller reported on several online tools that are intended to assist archivists with assessing their preservation needs.22
The value of cataloging has been addressed in numerous studies. Stalberg and Cronin reported on the efforts of the ALCTS Technical Services Directors of Large Research Libraries Interest Group Task Force on Cost/Value Assessment of Bibliographic Control.23 The task force identified seven operational definitions of value plus many elements that contribute to the cost of cataloging. In their final report, they made many recommendations for further investigation of these issues. El-Sherbini and Chen investigated the use of non-Roman subject headings and their effect on access to library resources in the online catalog, finding that a majority of users would like to be able to search non-Roman headings.24 Mitchell investigated the value of metadata to libraries, archives, and museums by analyzing three approaches to assessment: pure counting, user-based, and case study–focused.25
Two studies addressed technical services webpages. Groves evaluated twenty academic libraries’ technical services webpages, determining that few libraries list their online work tools and that there is very little overlap of online work tools between those that do.26 Mundle, Huie, and Bangalore conducted an evaluation of ARL library catalog department websites.27
While the library science literature includes reports of assessment activities in technical services units, such as workflow analysis; statistics collection; assessment of training, documentation, and websites; and the value of cataloging and metadata, the author was unable to find any studies that consider a holistic assessment of technical services activities and their impact on the faculty, staff, students, or other customers. This study is intended to supplement existing literature by examining the assessment of technical services activities in Pennsylvania academic libraries.
The author designed a survey to gauge the existence and extent of technical services assessment in Pennsylvania academic libraries. Pennsylvania has more than one hundred institutions of higher education, and the author felt that the large number of libraries would provide a robust source of data about typical assessment activities. The author chose SelectSurvey software (http://selectsurvey.net) to develop the survey, which included twelve questions. The survey was kept deliberately brief to encourage a high response rate.
The author identified all Pennsylvania academic libraries by accessing a spreadsheet available on a website maintained by the Pennsylvania Department of Education.28 This website allows users to download a spreadsheet listing all the academic libraries in the state. The spreadsheet provides the institution name, the library name, the library director’s name and phone number, and other information. It does not supply the library directors’ email addresses. Because the author planned to invite library directors to participate in the survey through personally addressed emails, their email addresses had to be identified and recorded. This was done by searching each institution’s website, identifying the director, dean, or university librarian, and recording the email addresses on a locally saved copy of the spreadsheet. This process revealed that many smaller institutions lacked a library website; these institutions tended to be technical or art institutes, or small seminaries or other religious institutions. This led the author to limit the survey population to institutions that had “college” or “university” in their names, a restriction that still kept the pool large enough to gather useful data. The spreadsheet was alphabetized by institution name, duplicates were deleted, director names were updated, and email addresses were added. This resulted in a list of 120 academic library directors, each of whom was sent an invitation to participate in the survey.
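The list-building steps described above (filtering by institution name, alphabetizing, and deduplicating) lend themselves to scripting. The sketch below, using the pandas library, is a hypothetical reconstruction; the file name and column names are assumptions, as the spreadsheet’s actual layout is not reproduced here.

```python
import pandas as pd

# Hypothetical reconstruction of the list-building steps described above.
# The file name and column names are assumptions, not the actual layout
# of the Pennsylvania Department of Education spreadsheet.
libraries = pd.read_excel("pa_academic_libraries.xlsx")

# Keep only institutions with "college" or "university" in their names.
mask = libraries["Institution Name"].str.contains(
    r"\b(?:college|university)\b", case=False, regex=True, na=False)
survey_pool = libraries[mask].copy()

# Alphabetize by institution name and remove duplicate institutions.
survey_pool = (survey_pool
               .sort_values("Institution Name")
               .drop_duplicates(subset="Institution Name"))

# Director email addresses were gathered by hand from each library's
# website and recorded in a new column before invitations were sent.
survey_pool["Director Email"] = ""

print(len(survey_pool))  # the study arrived at a pool of 120 directors
```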
The survey invitation, provided in appendix A, requested that either the library directors or deans complete the survey or forward it to the person in their organization who held primary responsibility for carrying out technical services assessment activities. For the purposes of the survey, technical services were defined as cataloging and metadata; acquisitions; preservation, bindery, and physical processing; and electronic resources management units or staff. The author indicated in the email that the survey consisted of twelve questions and that it should take ten to twelve minutes to complete. Respondents were assured of confidentiality and that institution names were collected to avoid duplication. Confidentiality was indicated in the survey itself by making the institution name question optional. The email also included an invitation to share any documentation related to technical services assessment, such as links to online statistics or reports, print documents reporting on assessment activities, or procedural documents regarding assessment activities.
The survey was attached as a Microsoft Word file to the email messages to give the directors the option of completing the survey offline. Two follow-up emails were sent in subsequent weeks, resulting in sixty-three completed responses by the deadline, August 31, 2012. Seven of those surveys were returned as Word attachments; in those cases, the author manually entered the answers into SelectSurvey to allow the survey software’s statistics reporting to function accurately. (See appendix B for the survey.)
Of the 120 surveys mailed, sixty-three respondents completed the survey by the deadline (a 52.5 percent response rate). All but one response included the institution’s name. Of the responding libraries, sixteen (25 percent) were libraries at public institutions, and forty-seven were libraries at private institutions. Of the public institutions, six were community colleges, five were libraries in the four “state-related” universities,29 and five were libraries in the Pennsylvania State System of Higher Education (PASSHE), i.e., the state university system.30 Four of the survey responses represented libraries at ARL institutions. The responding libraries, including both public and private institutions, employed an average of thirteen librarians and seventeen staff in the library. Of those employees, an average of two librarians and four staff worked in technical services. The number of librarians ranged from 1 to 171; the two largest employed 171 and 135, respectively, and the next largest employed 50. The outliers were not excluded from the survey analysis.
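For transparency, the headline figures in this paragraph follow directly from the reported counts; the short calculation below reproduces them (the 25 percent figure for public institutions is sixteen of sixty-three responses, or 25.4 percent rounded down).

```python
# Reproducing the response-rate arithmetic reported above.
invitations_sent = 120
responses = 63
print(f"Response rate: {responses / invitations_sent:.1%}")  # 52.5%

public, private = 16, 47
assert public + private == responses
print(f"Public share: {public / responses:.1%}")  # 25.4%, reported as 25 percent
```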
The survey asked participants to indicate whether their library assessed technical services activities. Sixty libraries answered this question, with 60 percent (thirty-six respondents) of the libraries indicating that they assessed technical services activities. Three libraries skipped this question. There was very little difference between public and private institutions’ assessment activities as reported in the responses to this question: 60 percent of public institutions and 59.1 percent of private institutions reported that they had assessed technical services activities.
Although only 60 percent of responding libraries reported assessing technical services activities, 90.5 percent responded to the next question in the survey, which was intended to gather information about specific assessment methods used by the libraries. The question was, “Which specific assessment methods do you currently use or have used in the past to assess technical services activities?” In retrospect, this discrepancy might have been avoided by stating a time frame in this question; for example, asking whether assessment had been done in the past five years and then asking what kinds of assessment were conducted during that period. The discrepancy may indicate that assessment is regularly conducted in only 60 percent of the responding libraries, whereas 90.5 percent of the libraries have at one time conducted some form of assessment. Table 1 illustrates the types of assessment methods reported by responding libraries.
Other methods of assessment reported by survey respondents included
- comparing statistics for online resources, interlibrary loans, and acquisitions with those from similar institutions;
- return-on-investment studies of specific technical services functions;
- participating in a 360 degree review process;
- comparing practices with other institutions; and
- conducting a self-study exercise.
An analysis of the responses to this question shows a difference in the practice of technical services assessment between public and private institutions: 81.2 percent of public institutions reported that they have conducted some assessment of technical services activities, compared with 93.6 percent of private institutions. All five of the libraries in state-related institutions reported that they conduct assessment of technical services activities; however, in two of those cases that assessment consisted solely of gathering statistics. Two of the five libraries in PASSHE institutions reported that they do not assess technical services activities. Two of the community college libraries (33.3 percent) do not assess technical services activities.
According to the survey results, responding libraries have relied on quantitative assessment methods more often than qualitative ones. The top two methods of assessment cited are quantitative: gathering statistics and gathering usage data. The survey did not ask which statistics were collected, nor what usage data were gathered or how. Other methods, such as benchmarking, conducting customer service surveys or focus groups, gathering input or anecdotes from other staff or customers, and using an anonymous suggestion box, are used less frequently than quantitative assessment methods.
The most frequently selected reason for assessing technical services activities was to improve or streamline processes, followed closely by the goal of improving services. Other reasons that libraries identified were to make better decisions, to inform strategic planning activities, to explore the possibility of offering new services, to reallocate staff or other resources, and to compare with other institutions. Table 2 illustrates the goals of technical services assessment activities.
Survey respondents supplied additional goals:
- Build better collections
- Identify activities and services that could be eliminated
- Demonstrate the value of technical services activities to the university and library
- Demonstrate value to scholarship and research
- Establish best practices based on national standards
Only one library reported that assessment of technical services activities is conducted to demonstrate the value of technical services to the university and library.
As mentioned in the methods section, for the purposes of this survey, technical services was defined as people or units responsible for cataloging and metadata; acquisitions; electronic resources management; and preservation, bindery, and physical processing. The survey question identified four areas of responsibility commonly found under the umbrella of technical services. These vary from institution to institution and are combined into a variety of departmental configurations within those institutions. The goal of this part of the survey was not to identify how the technical services units or departments are configured, but whether the activities traditionally performed by technical services units were assessed within the past five years. Units responsible for cataloging, metadata, and acquisitions were the most likely to have undergone some form of assessment in responding libraries, followed by electronic resources management and units responsible for preservation, binding, and physical processing. Table 3 illustrates the units that were assessed by responding libraries within the last five years.
Primary responsibility for conducting technical services assessment lies with the library director, dean, or university librarian in twenty-one of the responding libraries (38 percent). Others identified as holding primary responsibility for technical services assessment include the division head, department head(s), unit head(s), a committee, and in two cases, a single librarian. Table 4 illustrates who in responding libraries holds primary responsibility for technical services assessment.
In addition to the people or units identified in the survey question, six respondents supplied answers to this question. In each case the written response indicated that no single person or unit held primary responsibility for technical services assessment; rather, multiple individuals or units shared that responsibility:
- It varies; we do have a department which does assessment, but work is also done at the division, department, and unit levels
- The technical services librarian provides the library director with information used for assessment
- Department heads and unit heads
- Director, associate director, and staff in technical services
- Library administrative team (associate dean/director, associate director, and assistant director in consultation with department heads and supervisors)
- Library director and a committee
Libraries report the results of their technical services assessment activities in many ways, with the most prevalent being through the library’s annual report. Other ways that these activities are shared are through informational reports to library administration, a mass email to all library employees, a library or campus newsletter article, presentations, or a website. Table 5 illustrates the various ways assessment results are communicated to others.
In addition to the responses identified in the survey question, sixteen additional responses were supplied by the survey respondents. These included a variety of written reports, and the following other methods:
- Assessment report
- Five-year audit report
- Department outcome assessment reports
- Emails and presentations when appropriate to faculty and students
- Internal discussions between department heads
- Report to the Provost
- Information is included in the College’s Fact Book
- Part of performance evaluation
- Annual assessment report
- Report within WEAVEonline (www.weaveengaged.com/weaveonline.html)
- Internal communications
- Discussions with library director
- Library committee report
- Surveys and questionnaires submitted to external accrediting or collegial organizations
- Internal self-study results were made available to the finalists in our library director search
The survey question requesting examples of outcomes required respondents to record their answers in a text box and generated thirty-five responses. Several themes emerged from the outcomes described in those responses. Table 6 summarizes these themes as reported by survey respondents.
Specific responses from the survey illustrate these themes. For example, fourteen libraries indicated that they had reallocated staff on the basis of the results of their assessment activities. Several libraries reported that many positions were eliminated, and one library was able to justify filling a vacant position with statistics collected as part of their assessment activities. Some of their comments include the following:
- We reallocated a position from print to electronic resources management.
- We have shifted staff from bindery preparation to assist with storage activities.
- We have shifted staff to monitor reports of incorrect links or problems with electronic resources based on statistics and feedback from the librarians.
- We have increased the number of student assistants.
- We have reallocated staff time among acquisitions, cataloging, and serials in response to e-resources.
- We have realigned staff responsibilities.
- We have shifted and eliminated duties.
- Staff have been reassigned to different tasks (metadata cataloging, serving on the reference and circulation desks, and creating library exhibits).
Ten libraries reported that they streamlined processes because of their assessment activities. In some cases, libraries eliminated procedural steps to streamline their processes, and in other cases, they eliminated entire functions or services. Some of their comments include the following:
- We changed some ordering procedures to provide quicker access and less hassle for the business department.
- We adopted shelf-ready processing.
- We have trimmed costs by cutting back on stripping and covering.
- We eliminated shelflisting and writing call numbers on the verso of the title pages.
- We are currently assessing approval plan returns with the hope of eliminating all (or most), in order to move into more shelf-ready plans.
- We have streamlined our government documents workflow.
- We ceased binding, check-in, and claiming.
- We changed our monthly authority control processing to a quarterly process, thereby saving money.
Another common theme that emerged from this question involved collection development decisions. Ten libraries reported that they made collection decisions based on their technical services assessment activities. These decisions included weeding, reallocation of funds, and transferring materials from one collection within the library to another. Some of the respondents’ comments included:
- We decided to add new online resources to the collection.
- We purchased additional databases for specific disciplines.
- We reorganized our collection to co-locate reference books with circulating books, and to allow more reference books to circulate.
- We are currently weeding most of our collection. We are making better decisions on what needs to remain in our collection, what can be de-accessioned and what we need to purchase.
Four libraries changed vendors or vendor services because of their assessment activities. Of those, one library changed its book jobber and another library cancelled the approval plan because of usage statistics. A third library is considering cancelling their approval plan, and a fourth library reported that they had consolidated their print, electronic serials, and standing orders under one vendor.
Three of the responding libraries reported that they made changes to staff training because of their assessment activities. One library reported that they were providing more training to their staff in new technologies, including electronic resources management. Another library identified cross-training as an area that warrants more attention. Finally, a third library reported that they are developing training materials and adapting policies to achieve efficiencies.
Improved communication was an outcome of assessment identified by three libraries in the survey. In one case, the consolidation of print, serials, and standing orders with one vendor improved communications with that vendor. In another case, communication with teaching faculty regarding collection building was improved because of their assessment activities. In another, new services for faculty were offered that directly improved communication with them. Those services included new-publications email notification, new-book display shelves, and an improved book order and request system.
Finally, two of the responding libraries changed their integrated library system (ILS) because of their technical services assessment activities. One library reported that they were not satisfied with the services offered by their current ILS vendor and are migrating to another system. The second library reported that they are upgrading their current ILS to the software-as-a-service (SaaS) model to eliminate the need to do manual backups and upgrades, thereby enabling the technical services/information technology librarian to devote more time to other services.
The author analyzed the survey results to determine whether the type of assessment conducted affected the outcomes that were reported. Libraries that only gathered statistics or usage data were less likely to report any outcomes. Twenty-two libraries only gathered statistics or usage data. Of those libraries, only ten (45.5 percent) provided examples of outcomes from their assessment activities. This is in contrast to the thirty-five libraries whose assessment went beyond gathering statistics or usage data. Of those thirty-five, twenty-five libraries (71.4 percent) reported outcomes.
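The comparison above rests on two subgroup proportions. A minimal check of the arithmetic, using only the counts reported in this paragraph:

```python
# Checking the outcome-reporting rates cited above, using only the
# counts reported in this paragraph.
groups = {
    "gathered statistics or usage data only": (10, 22),
    "assessment beyond statistics or usage data": (25, 35),
}
for label, (with_outcomes, total) in groups.items():
    print(f"{label}: {with_outcomes / total:.1%} reported outcomes")
# -> 45.5% and 71.4%, matching the figures in the text
```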
The method of assessment also affected the types of outcomes reported by responding libraries. The ten libraries that only gathered statistics or usage data reported outcomes that included reallocating staff, streamlining processes, making collection development decisions, and changing vendor or vendor services. Table 7 illustrates these findings.
Thirty-five libraries used assessment methods that went beyond collecting statistics or usage data. Those libraries reported outcomes that include the four cited in the previous paragraph, but also included adjusting staff training, improving communication, implementing new services, and changing ILSs. It is likely that some forms of assessment, e.g., gathering input from non–technical services librarians and staff, collecting anecdotes or feedback from customers, administering customer service surveys, benchmarking, providing a suggestion box, or conducting focus groups may elicit information and feedback that is useful for a variety of management purposes. Table 8 demonstrates this.
This study revealed that 90 percent of responding academic libraries in Pennsylvania have conducted some form of assessment of technical services activities. The most commonly used form of assessment consists of collecting and reporting statistics, but survey respondents report using a variety of qualitative methods as well. These methods include the use of customer service surveys, focus groups, benchmarking, workflow analysis, collecting feedback from customers, gathering input from non–technical services librarians and staff, and using a suggestion box. The top three goals of technical services assessment are to (1) improve or streamline processes, (2) improve services, and (3) make better decisions. Cataloging and metadata and acquisitions units were the most likely to be evaluated, followed distantly by electronic resources management and preservation, bindery, and processing units. This may reflect the variety and relatively small size of the academic institutions that were surveyed. Smaller institutions are almost certainly more likely to have units or people responsible for acquisitions and cataloging, and they may be less likely to have a unit or person specifically responsible for electronic resources management or preservation.
According to the survey results, responsibility for technical services assessment resided with the dean or director more often than with other administrators or managers. Again, this may be due to the relatively small size of the academic institutions surveyed. Larger institutions may be more likely to push responsibility for technical services assessment down to the managerial or administrative head of those units or divisions. The results of assessment activities are reported primarily through either the library’s annual report or informational reports to the library’s administration. Outcomes of assessment activities included the reallocation of staff, more streamlined processes, and making decisions related to collection development.
It is clear that while most Pennsylvania academic libraries perform some assessment of technical services activities, the assessment is heavily weighted toward quantitative assessment and the collection of statistics. Academic libraries would benefit from an assessment toolkit that facilitates the planning and implementation of a qualitative assessment program. Such a toolkit should include instructions and suggestions for how academic library managers and administrators could create an assessment program that evaluates their technical services units and activities. It should also include examples of customer service surveys, focus group questions, benchmarking surveys, workflow analysis projects, and other types of qualitative and quantitative assessment practices that administrators could emulate, adopt, and modify for use in their libraries.
Further research on this topic would be useful. Studies that focus on specific assessment methods, such as the use of customer service surveys, focus groups, or benchmarking, would be helpful. Research on whether the type of assessment conducted correlates to specific outcomes would also be of interest. The author’s current research involves the use of benchmarking as a tool for assessment in cataloging, and her future research plans include the replication of this study on a national scale. The author is hopeful that the increased attention on assessment will lead to a more programmatic and consistent use of assessment tools to evaluate the effect of technical services activities on their customers.
References and Notes
1. Association of College and Research Libraries, Value of Academic Libraries: A Comprehensive Research Review and Report, researched by Megan Oakleaf (Chicago: Association of College and Research Libraries, 2010), accessed June 13, 2013, www.acrl.ala.org/value.
2. Association for Library Collections and Technical Services, “Turning Statistics into Assessment: How Can Technical Services Measure the Value of Their Services?” accessed October 9, 2013, www.ala.org/alcts/turning-statistics-assessment-how-can-technical-services-measure-value-their-services.
3. Proceedings of the Library Assessment Conference: Building Effective, Sustainable, Practical Assessment, September 25–27, 2006, Charlottesville, VA (Washington, DC: Association of Research Libraries, 2007); Proceedings of the 2008 Library Assessment Conference: Building Effective, Sustainable, Practical Assessment, August 4–7, 2008, Seattle, WA (Washington, DC: Association of Research Libraries, 2009); Proceedings of the 2010 Library Assessment Conference: Building Effective, Sustainable, Practical Assessment, October 24–27, 2010, Baltimore, MD (Washington, DC: Association of Research Libraries, 2011), accessed June 13, 2013, http://libraryassessment.org/bm~doc/proceedings-lac-2010.pdf.
4. Library Assessment Conference: Building Effective, Sustainable, Practical Assessment, October 29–31, 2012, Charlottesville, VA (Washington, DC: Association of Research Libraries, 2012), accessed June 13, 2013, http://libraryassessment.org/bm~doc/2012Program.pdf.
5. Stephanie Wright and Lynda S. White, SPEC Kit 303: Library Assessment (Washington, DC: Association of Research Libraries, 2007).
6. Ibid., 22.
7. Sheri Webber, “Applying Program Assessment Techniques to Electronic Resources Management,” Technical Services Quarterly 22, no. 1 (2004): 9–20.
8. Gail Herrera et al., “Technical Services Serials and Monographic Ordering Assessment,” Technical Services Quarterly 24, no. 1 (2006): 45–62.
9. Patricia Dragon and Lisa Sheets Barricella, “Assessment of Technical Services Workflow in an Academic Library: A Time-and-Path Study,” Technical Services Quarterly 23, no. 4 (2006): 1–16.
10. Gail Herrera, “Technical Services Cataloging and Database Maintenance Assessment,” Technical Services Quarterly 23, no. 3 (2006): 51–72.
11. Paoshan W. Yue and Lisa Kurt, “Nine Years after Implementing the Unthinkable: The Cessation of Periodical Check-in at the University of Nevada, Reno,” Serials Librarian 61, no. 2 (2011): 231–52.
12. Anne Chase and Tony Krug, “New Techniques in Library Technical Services at the Appalachian College Association,” in Library Workflow Redesign: Six Case Studies, ed. Marilyn Mitchell (Washington, DC: Council on Library and Information Resources, 2007), 8–20.
13. Debra K. Andreadis et al., “Cooperative Work Redesign in Library Technical Services at Denison University and Kenyon College,” in Library Workflow Redesign: Six Case Studies, ed. Marilyn Mitchell (Washington, DC: Council on Library and Information Resources, 2007), 39–49.
14. Christopher B. Loring, “Increasing Productivity through Workflow Redesign at Smith College,” in Library Workflow Redesign: Six Case Studies, ed. Marilyn Mitchell (Washington, DC: Council on Library and Information Resources, 2007), 50–59.
15. Norm Medeiros, “Managing Electronic Resources in the Tri-College Consortium,” in Library Workflow Redesign: Six Case Studies, ed. Marilyn Mitchell (Washington, DC: Council on Library and Information Resources, 2007), 60–72.
16. Muriel Godbout, “Preparing an Item for Circulation While Streamlining the Workflow between the Acquisitions and Cataloging Offices,” Indiana Libraries 26, no. 4 (2007): 59–67.
17. Rebecca Schroeder and Jared L. Howland, “Shelf-Ready: A Cost-Benefit Analysis,” Library Collections, Acquisitions & Technical Services 35, no. 4 (2011): 129–34.
18. Kristof Stouthuysen et al., “Time-Driven Activity-Based Costing for a Library Acquisition Process: A Case Study in a Belgian University,” Library Collections, Acquisitions & Technical Services 34, no. 2 (2010): 83–91.
19. Ibid., 90–91.
20. Karen E. K. Brown, “Use of General Preservation Assessments: Process,” Library Resources & Technical Services 49, no. 2 (2005): 90–106.
21. Karen E. K. Brown, “Use of General Preservation Assessments: Outputs,” Library Resources & Technical Services 50, no. 1 (2006): 58–72.
22. Mary Miller, “‘Archivist, Assess Thyself’: On-line Tools for Preservation Assessments,” MAC Newsletter 35, no. 3 (2008): 25–26.
23. Erin Stalberg and Christopher Cronin, “Assessing the Cost and Value of Bibliographic Control,” Library Resources & Technical Services 55, no. 3 (2011): 124–37.
24. Magda El-Sherbini and Sherab Chen, “An Assessment of the Need to Provide Non-Roman Subject Access to the Library Online Catalog,” Cataloging & Classification Quarterly 49, no. 6 (2011): 457–83.
25. Erik Mitchell, “Assessing the Value of Metadata in Information Services,” Technical Services Quarterly 30, no. 2 (2013): 187–200.
26. Deana Groves, “Online Work Tools: A Look at 20 Academic Libraries’ Technical Services Web Pages,” Library Collections, Acquisitions & Technical Services 29, no. 4 (2005): 395–402.
27. Kavita Mundle, Harvey Huie, and Nirmala S. Bangalore, “ARL Library Catalog Department Web Sites: An Evaluative Survey,” Library Resources & Technical Services 50, no. 3 (2006): 173–85.
28. The website hosting the directory of Pennsylvania academic libraries is no longer available. Another resource for similar information is the 2012 Directory of Pennsylvania Libraries, accessed June 13, 2013, www.portal.state.pa.us/portal/server.pt/community/bureau_of_library_development/8810/online_library_directory/606694.
29. Pennsylvania has four state-related universities: Lincoln University, Temple University, Pennsylvania State University, and the University of Pittsburgh.
30. The Pennsylvania State System of Higher Education includes fourteen universities: Bloomsburg University of Pennsylvania, California University of Pennsylvania, Cheyney University of Pennsylvania, Clarion University of Pennsylvania, East Stroudsburg University of Pennsylvania, Edinboro University of Pennsylvania, Indiana University of Pennsylvania, Kutztown University of Pennsylvania, Lock Haven University of Pennsylvania, Mansfield University of Pennsylvania, Millersville University of Pennsylvania, Shippensburg University of Pennsylvania, Slippery Rock University of Pennsylvania, and West Chester University of Pennsylvania.
Appendix A. Survey Invitation

I would like to invite your institution to participate in a brief survey on Pennsylvania academic library technical services assessment practices. The purpose of the survey is to investigate what assessment activities are conducted, who is responsible for technical services assessment, how the results of the assessment activities are shared with others, and how those results are used to improve services, or for other purposes. Please forward this message to the person in your organization who holds primary responsibility for carrying out technical services assessment activities:
https://surveys.libraries.psu.edu/TakeSurvey.aspx?SurveyID=7l30969
The survey includes 12 questions and should take no longer than 10–12 minutes to complete. All questions are optional, and you can quit the survey at any time. No identifying information will be shared in any way, whether through presentation or publication; all survey respondents and institution names will remain confidential.
For the purposes of this survey, technical services are defined as units responsible for Cataloging/Metadata, Acquisitions, Electronic Resources Management, and Preservation/Bindery/Physical Processing.
For the purposes of this survey, customers are defined as faculty, staff, students, and/or members of the general public that use your resources for research or other purposes.
In addition to the online survey, I would be interested in obtaining any documentation related to technical services assessment that you are able to share. This may include:
- Links to online statistics or reports
- Print documents reporting on assessment activities (e.g., annual report)
- Procedural documents regarding assessment activities
Please send URLs for documentation to rlm31@psu.edu or mail print documents to Rebecca Mugridge, 126 Paterno Library, University Park, PA 16802.
The survey will be open until August 31, 2012. I’ve attached two Word versions (.doc and .docx) if you would prefer to complete it on paper.
The results of the survey (without any identifying information) will be shared at a presentation at the Pennsylvania Library Association Annual Conference, September 30, 2012, in Gettysburg, PA.
Thank you,
Rebecca L. Mugridge
*****************
Head, Cataloging and Metadata Services
Pennsylvania State University Libraries
126 Paterno Library
University Park, PA 16802
email: rlm31@psu.edu
phone: 814-865-1850 fax: 814-863-7293
Appendix B. Survey

Academic libraries regularly assess the services that they provide to their customers, including the faculty, staff, or students of their institutions, and often members of the general public. Library technical services units serve customers by acquiring library materials, providing timely access to them, and preserving those collections for future use. This study will look at the assessment activities used by Pennsylvania academic libraries’ technical services units to evaluate the success of their activities.
For the purposes of this survey, technical services are defined as units responsible for Cataloging/Metadata, Acquisitions, Electronic Resources Management, and Preservation/Bindery/Physical Processing.
For the purposes of this survey, customers are defined as faculty, staff, students, and/or members of the general public that use your resources for research or other purposes.
Please complete only one survey response per institution.
1. What is the name of your institution? (Optional: This information will not be shared; it is only to ensure that there is only one survey response per institution.)
2. Is your institution public or private?
- a. Public
- b. Private
3. How many employees (Full Time Equivalent) work in the Library? You may answer in fractions, e.g., 4.5 FTE.
- a. Librarians
- b. Staff
- c. Hourly Staff/Students
4. How many employees (Full Time Equivalent) work in Technical Services? You may answer in fractions, e.g., 4.5 FTE.
- a. Librarians
- b. Staff
- c. Hourly Staff/Students
5. Does your library conduct assessment of technical services activities?
- a. Yes
- b. No
6. Which specific assessment methods do you currently use or have used in the past to assess technical services activities? Select all that apply:
- a. Gather statistics
- b. Gather usage data
- c. Collect anecdotes or feedback from customers
- d. Conduct customer service surveys
- e. Conduct focus groups
- f. Gather input from non-technical services librarians or staff
- g. Anonymous suggestion box
- h. Benchmark with other institutions
- i. Other (please describe)
7. What are the goals of the technical services assessment activities at your institution? Select all that apply:
- a. Improve services
- b. Explore offering new services
- c. Improve or streamline processes
- d. Reallocate staff or other resources
- e. Compare with other institutions
- f. Make better decisions
- g. Inform strategic planning activities
- h. Other (please describe)
8. Which of the following departments or units has your library assessed within the past five years?
- a. Cataloging/Metadata
- b. Acquisitions
- c. Electronic Resources Management
- d. Preservation/Bindery/Physical Processing
9. Who has primary responsibility for conducting technical services assessment activities?
- a. Library Director, Dean, University Librarian
- b. Division Head
- c. Department Head(s)
- d. Unit Head(s)
- e. Committee
- f. Single librarian
- g. Single staff member
- h. Other (please describe)
10. How do you report the results of your technical services assessment activities? Select all that apply:
- a. Informational report to library administration
- b. Library newsletter article
- c. Campus newsletter article
- d. Mass email to library employees
- e. Mass email to campus employees
- f. Annual report
- g. Presentations
- h. Web site
- i. Other (please describe)
11. Please provide examples of outcomes that you have made to technical services’ policies, procedures, or services based on information that you learned from your assessment activities.
12. Please provide any additional information about your library’s technical services assessment activities that might help with the analysis of this survey.
Tables
Table 1. Methods of Assessment (N = 63)
Methods of assessment | Libraries | Percent |
Gather statistics | 53 | 84.1 |
Gather usage data | 31 | 49.2 |
Gather input from nontechnical services librarians or staff | 28 | 44.4 |
Collect anecdotes or feedback from customers | 19 | 30.2 |
Conduct customer service surveys | 16 | 25.4 |
Benchmark with other institutions | 12 | 19.0 |
Anonymous suggestion box | 8 | 12.7 |
Conduct focus groups | 6 | 9.5 |
Table 2. Goals of Technical Services Assessment Activities (N = 63)
Goals of Assessment | Libraries | Percent |
Improve or streamline processes | 43 | 68.3 |
Improve services | 40 | 63.5 |
Make better decisions | 39 | 61.9 |
Inform strategic planning activities | 35 | 55.6 |
Explore offering new services | 25 | 39.7 |
Reallocate staff or other resources | 19 | 30.2 |
Compare with other institutions | 14 | 22.2 |
Table 3. Technical Services Departments or Units Assessed within the Last Five Years (N = 62)
Department or Unit | Libraries | Percent |
Cataloging/metadata | 35 | 56.5 |
Acquisitions | 35 | 56.5 |
Electronic resource management | 28 | 45.2 |
Preservation/bindery/physical processing | 16 | 25.8 |
Table 4. Primary Responsibility for Conducting Technical Services Assessment (N = 56)
Person or Unit Responsible | Libraries | Percent |
Library director, dean, university librarian | 21 | 37.5 |
Division head | 11 | 19.6 |
Department head(s) | 8 | 14.3 |
Unit head(s) | 4 | 7.1 |
Committee | 3 | 5.4 |
Single librarian | 2 | 3.6 |
Single staff member | 0 | 0.0 |
Table 5. Methods of Reporting Assessment Results (N = 62)
Methods of Reporting | Libraries | Percent |
Annual report | 38 | 61.3 |
Informational report to library administration | 32 | 51.6 |
Mass email to library employees | 7 | 11.3 |
Library newsletter article | 5 | 8.1 |
Presentations | 5 | 8.1 |
Campus newsletter article | 1 | 1.6 |
Mass email to campus employees | 0 | 0.0 |
Table 6. Outcomes Based on Assessment Activities (N = 35)
Outcome Reported | Libraries | Percent |
Reallocated staff | 14 | 40.0 |
Streamlined processes | 10 | 28.6 |
Made collection development decisions | 10 | 28.6 |
Changed vendor or vendor services | 4 | 11.4 |
Adjusted staff training | 3 | 8.6 |
Improved communication | 3 | 8.6 |
Implemented new services | 2 | 5.7 |
Changed integrated library systems | 2 | 5.7 |
Table 7. Outcomes Based on Gathering Statistics or Usage Data (N = 22)
Outcome Reported | Libraries | Percent |
Reallocated staff | 4 | 18.2 |
Streamlined processes | 2 | 9.1 |
Made collection development decisions | 3 | 13.6 |
Changed vendor or vendor services | 2 | 9.1 |
Adjusted staff training | 0 | 0.0 |
Improved communication | 0 | 0.0 |
Implemented new services | 0 | 0.0 |
Changed integrated library systems | 0 | 0.0 |
Table 8. Outcomes Based on Forms of Assessment in Addition to Gathering Statistics or Usage Data (N = 35)
Outcome Reported | Libraries | Percent |
Reallocated staff | 10 | 28.6 |
Streamlined processes | 8 | 22.9 |
Made collection development decisions | 7 | 20.0 |
Changed vendor or vendor services | 2 | 5.7 |
Adjusted staff training | 3 | 8.6 |
Improved communication | 3 | 8.6 |
Implemented new services | 2 | 5.7 |
Changed integrated library systems | 2 | 5.7 |