Library Technology Reports: Vol. 47, Issue 4: p. 27
Chapter 4: Vendor Profiles
Marshall Breeding
Andromeda Yelton

Abstract

Chapter 4 examines in depth several major ILS vendors: proprietary vendors Polaris, Biblionix, SirsiDynix, and Innovative Interfaces; and Koha support companies ByWater Solutions and LibLime. These vendors serve a range of library types and span the spectrum of satisfaction ratings. For each vendor, respondents’ comments illuminate possible reasons behind the ratings.


As the survey both asks for numerical ratings of satisfaction and provides a free comment field, it is possible to explore the reasons behind libraries’ ratings of various vendors. However, readers are cautioned against taking these subjective impressions as definitive. Libraries’ experiences with their vendors vary, and every vendor has both satisfied and unsatisfied customers. In addition, there are a few factors that make inferences from comments a challenge.

First, the majority of people who fill out the survey do not leave comments; those who do may not speak for everyone. In fact, it seems likely that people with unusually positive or—especially—negative views are more likely to comment. Therefore, the comments may present an exaggerated view of companies’ strengths and weaknesses.

Second, while some issues recur frequently, others may be mentioned only a handful of times. Are they outliers, or do they represent views of the many libraries that did not comment? We have tried to consider how these minority views fit within the general themes for each vendor and quote, or exclude, them responsibly.

Third, many libraries are simply not in a good position to comment in depth, because a consortium or IT office handles contact with their vendor. These libraries typically fill out the numerical questions, but their comment fields address only their lack of contact.

Finally, no matter the general trends for each vendor, libraries should make their own decisions based on their individual circumstances. Many ILS products are best suited to a particular niche, and libraries’ satisfaction may mostly reflect whether they are in that niche, not the skills of the company or the quality of its software. Also, libraries’ overall satisfaction with customer support sometimes appears to have much more to do with the representatives assigned to their institution than with the company as a whole. Libraries are urged to think about how their specific experiences may vary from the average.


Polaris Library Systems: Polaris

Polaris has earned outstanding ratings for ILS, company, and support satisfaction from 2007 through 2010 (see figure 24). Among commenters, the major reason for this high satisfaction appears to be an excellent relationship with the company; for instance, a 2008 commenter said, “Polaris has the best customer service of any company I've dealt with, even outside the library industry.” Numerous comments through all four years of the survey compliment the quality and responsiveness of customer service. A few also praise the company for listening and note that its users group is an effective forum for two-way communication; as one 2010 commenter puts it, “A company that listens to their customers wants and needs via the Polaris Users Group enhancement process and does a great job implementing changes that benefit all.”

Although there are no areas in which Polaris receives persistently negative comments, libraries do seem to have had mixed experiences in terms of usability and migration. Two commenters refer to the software as “easy to navigate” and “smooth, intuitive,” while a third complains of the “hassle” of performing certain searches. Comments on migration note its difficulty and sometimes express dissatisfaction with quality assurance during the process, but typically express satisfaction at the end result. One hurricane-afflicted library liked “their flexibility and willingness to readjust schedules because of our circumstances.”

Comments on functionality are harder to interpret. Presumably, given the high ILS satisfaction ratings, most libraries are pleased with Polaris's functionality; however, the specific comments on it are most likely to be negative. This may simply mean that people are more likely to comment on things that don't work well than on things that do. Two academic libraries note that they don't think their demographic is the company's main focus. One 2010 commenter says it is “not the most advanced product out there,” but another is pleased that “updates don't cost anything,” and several praise Polaris for developing functionality on the basis of listening to users.

One library summarizes this issue as follows: “I like the Polaris company and its philosophy of service. However, we really miss some of the functionality we had on [previous vendor's system]. That being said, there was NO WAY I felt we could continue a relationship with [previous vendor].” Overall, it seems that some libraries are wholly satisfied with Polaris's functionality, and others are willing to make tradeoffs in this area in order to receive superlative customer service.


Biblionix: Apollo

Biblionix's ratings for ILS satisfaction, company satisfaction, and customer support satisfaction (see figure 25) have been consistently outstanding—all average above 8 for all four years of the survey. The comments throughout this period also paint a consistent picture. Commenters note that Apollo is very well-suited to the needs of small and mid-sized public libraries.

Libraries comment favorably on the software. Several note that it is easy to use, and this has been particularly helpful for training volunteers. Others are pleased with the system's flexibility, including a variety of reporting options. Several mention a responsive and forward-looking development process; for example, “They listen to librarians’ needs and then design user-friendly and relevant upgrades with those needs in mind.” One extremely satisfied library says, “They integrate new technologies and services before any of our neighbor's systems do, and they make us look trendy to our members.”

Where Biblionix truly garners praise, however, is in the area of customer service. Comments like “totally satisfied,” “very pleased,” and “terrific” are common across all four years of the survey. Libraries specifically praise the company's responsiveness to both feedback and support requests and its speed in addressing issues. Astonishingly, though some new adopters experience the typical stresses of migration, others comment favorably even on this process, due to the quality of support; for example, “Migration was completed overnight, with minimal disruption to staff or customers.” Several libraries have comments like “the service is the best I have ever had.”


SirsiDynix: Symphony (Unicorn), Horizon, Dynix

As the largest company in the industry, supporting multiple ILS products, SirsiDynix received a large number of responses, many with sharp comments. In 2010, for example, the survey attracted 282 responses from libraries using Symphony, 80 of which provided comments; 185 responses from Horizon with 61 comments; and 13 responses from the legacy Dynix Classic ILS with 6 comments. From these comments a number of themes emerge, many of them addressing changes in the company that respondents perceive as having hurt service and product development.

The 2007 survey fell on the heels of the acquisition of SirsiDynix by Vista Equity Partners and an announcement that product development would focus solely on Unicorn, subsequently renamed Symphony. The company has since softened its position on Horizon and continues at least some development of that product. The comments from that first survey in 2007 reflected the high level of concern many customers expressed regarding those events. The subsequent surveys make it possible to see whether time has healed those wounds and whether libraries using Symphony and Horizon have come around to a more positive outlook. In general, while company satisfaction (see figures 26 and 27) seems to have recovered somewhat, the comments have not lost much of their bite; the majority of comments offered continue to slant toward the negative, though at least a minority reflect strong satisfaction with Symphony and appreciate its maturity and stability. In addition, some Horizon customers, while dismayed that they will eventually have to migrate, like the software they have and appreciate that SirsiDynix has continued to support it. In 2007, at least one response complained that, following the merger, the Sirsi and Dynix sides of the company did not communicate well with each other; no such comments appeared in subsequent iterations of the survey, following the company's aggressive business integration process.

In 2010, SirsiDynix made further changes in its organization, centralizing support in its Provo facility, a change intended to strengthen the company's support capacity. Comments in the 2010 survey indicated considerable resistance to this strategy. Criticism was especially strong from international users of both Symphony and Horizon, who expressed concern about the loss of local expertise and support options. While some libraries were quite happy with their individual support representatives, others reported difficulty locating people with relevant expertise or dissatisfaction with response times. These concerns about the new support strategy are not reflected in the overall numerical rating for support satisfaction, though, which did not change in 2010; it may be too early for the effects of this change to be apparent. Again, it will be important to watch survey results in the next year or so to see whether this strategy achieves its desired results.

Many comments offered by respondents using SirsiDynix automation products complained about the company's business transitions and their impact on product options and direction. A few libraries using Horizon voiced support for the capabilities of that system and concern that they would be shuffled toward Symphony, a product some perceived as inferior. That said, the plurality of migrations away from Dynix and Horizon in 2007–2010 was to Symphony; it may be that libraries apparently satisfied with SirsiDynix's direction are simply less likely to comment.


Innovative Interfaces: Millennium

In 2010 the Perceptions survey attracted more responses from libraries using Millennium from Innovative Interfaces than any other automation system. Of the 395 responses received, 110 provided comments. A minority complained about specific problems with Millennium, casting it as clumsy and antiquated, but others called it a solid, modern system and praised aspects of its capabilities and functionality; most seemed pleased with the software. Similarly, some were dissatisfied with support, while others were happy that Innovative worked with them to customize the product around their needs (see figure 28). Overall, satisfaction with the ILS has remained roughly constant from 2007 to 2010, while satisfaction with the company and its support have risen slightly.

The dominant theme of the comments, however, was cost. Some respondents did not appreciate the way pricing was structured, with any new component priced separately. The key issue, as revealed by the comments, lies in the opinion that the costs of operating Millennium press the limits of what budgets can tolerate and in the perception that alternative arrangements might be less expensive or include more upgrades in the base price. In some cases, this concern has led libraries to consider migration despite being otherwise satisfied with the ILS.


ByWater, LibLime, and Independent Installations: Koha

Support for the open source ILS Koha provides an interesting point of comparison. Two firms providing hosting and support services for Koha—ByWater Solutions and LibLime—were well represented in the survey, as were libraries that have implemented Koha independently. Those depending on support from ByWater Solutions gave very high ILS satisfaction ratings (7.86); those using Koha with support from LibLime gave lower ILS satisfaction scores (6.90). Libraries’ satisfaction with the ILS may have been linked to their satisfaction with support: 8.44 for ByWater customers and 5.64 for LibLime customers, though one commenter notes that support has improved since the acquisition of LibLime by PTFS. Company satisfaction was similar across all support strategies (see figure 29).

It is difficult to interpret company satisfaction ratings when a library operates an ILS without support from a commercial company (and, indeed, some commenters noted that these questions did not apply well to them). Such ratings may be directed toward the library's own efforts, toward the broader community of libraries that provide peer support, or toward other entities. These independent users rated the ILS as highly as ByWater customers did (7.87), but their satisfaction with support was lower (7.38).

It should be noted that these are not the only companies providing Koha support; libraries using nine different support companies, as well as independent users and those not specifying a support vendor, participated in the 2010 survey. This is a substantial increase over past years; there were only four support vendors mentioned in 2009, and only one (LibLime) in 2008 and 2007. As there are typically only a handful of respondents in each category, we cannot meaningfully analyze the data. However, it will be interesting to watch the rapid growth in the Koha support market in future years.


Concluding Thoughts

The data gathered across the four years of the Perceptions survey provide considerable insight into the dynamics of the library automation industry. As we take the data apart and look at different sectors, each reveals its own distinct issues and concerns. Sifting the results by library size and type affords a more nuanced understanding of trends that are not apparent in the aggregated data.

The survey serves as a barometer to measure the pressure of the industry: the force of library expectations versus what their automation providers deliver. Libraries today have fewer resources to spend on automation and must deliver their services efficiently and effectively. Measuring the levels of satisfaction in the performance of the current systems and the vendors that support them provides useful information to libraries reflecting on whether to continue with their current automation strategy or to explore new tangents.

For the companies and other organizations that provide and support automation systems, the survey provides a source of constructive criticism. The numeric ratings provide a four-year running indicator of the effectiveness of their support programs and whether changes made have produced positive results. The public nature of the results may not feel entirely comfortable—the comments offered by survey respondents include sharply negative statements as well as positive ones. Redacted only to preserve confidentiality, the comments bring to the surface issues and concerns that prevail among library customers and hopefully provide insight to the vendors on what is and is not working about their current product and support offerings. An independent survey such as this one elicits different comments than those that might be offered in response to a company's own efforts to solicit feedback from its customers. Several of the vendors covered in the survey report to the author that their own metrics trend more positively.

Only within the ranks of small libraries do we find superlative satisfaction with the automation scenario. Once we excavate below the surface layer of highly satisfied libraries, we find strata of trends that run in different directions. In this report we have explored some of the differences that arise as we look at public versus academic libraries and at libraries with differing collection sizes. While some companies and products perform better than others, none provides a resoundingly satisfactory solution for most libraries of substantial size and complexity.

The survey seems to reinforce the idea that the costs of current systems press the limits of what libraries can bear. Of the comments dealing with cost, almost all reflected concern; some stated that current costs already exceed what their libraries can tolerate. Hardly any comments reflected a sense that libraries feel they receive excellent value for their investments.

Analysis of the results fails to confirm open source library automation as a panacea. While those already involved with open source continue to support the concept strongly, the survey does not validate the open source ILS as the key to satisfaction. Outside the ranks of those already involved, we detected no evidence of libraries poised to abandon proprietary systems in droves. We saw combinations of open source ILS products and support companies that produced widely varying levels of support and product satisfaction. Companies providing services surrounding an open source ILS face the same kinds of challenges in satisfying their clients as their counterparts in proprietary software.

The four-year view of the survey data both answered and raised questions. In some cases, it confirmed commonsense assumptions: for instance, cost is a major ongoing concern, and libraries with low company loyalty are more likely to migrate away from their current ILS and work with a new vendor. Other trends revealed in the survey results seem more baffling and warrant further investigation. Given that ILS satisfaction, company satisfaction, and customer support satisfaction have remained more or less constant over the years, why has company loyalty risen so sharply?

The data on the open source market are particularly open to interpretation. Are open source library automation systems nearing the maximum market penetration they can achieve given their reputation for requiring technical skills, or will the rising percentage of highly interested libraries propel them forward? What does it mean for ILS adoption, and the software marketplace, that libraries’ interest in open source ILSes is polarizing? As the number of companies supporting open source ILSes rises dramatically, will we increasingly see different entities providing software and support? Given that many open source users commented that the survey did not mirror their situation, will we find ourselves needing to consider what the ILS marketplace means in different terms?

The survey data show that, on average, libraries are moderately—sometimes extremely—satisfied with their software and fairly loyal to their vendors. However, cost pressures, troubled relationships with vendors, and alternate models such as discovery layers and open source software drive widespread reevaluation; 21 percent of libraries surveyed in 2010 are shopping for a new ILS. While this benchmark stands a bit lower than in the economically stronger years of 2007 and 2008, it suggests that we may be in store for new rounds of turnover in automation systems. In broadest strokes, the survey results do not paint a picture of libraries in turmoil against their automation systems and vendors. Rather, they reflect levels of disconnect between expectation and performance that may drive libraries out of their patterns of inertia and lead vendors toward new models of technology and service with the potential to narrow the gaps of discontent.



Figures

Figure 24. Satisfaction with Polaris over time.

Figure 25. Satisfaction with Biblionix over time.

Figure 26. Satisfaction with Symphony (Unicorn) over time.

Figure 27. Satisfaction with Horizon over time.

Figure 28. Satisfaction with Millennium over time.

Figure 29. Satisfaction with Koha for major support strategies, 2010.





