Chapter 5. Principle 5: Measure Use and Encourage Reuse

The fifth principle of social media optimization (SMO), measure use and encourage reuse, recognizes that our users want to share, repost, and embed resources across multiple online social environments and that libraries can apply both quantitative and qualitative approaches to measuring how content is used and disseminated through social networks. This chapter begins by exploring definitions of social network success, then details the process of tracking and evaluating shared content, and concludes with a discussion of cultivating a culture of sharing.

Measure Use

The practice of measuring use provides an essential view into your social network activity and a deeper understanding of your community and your content. Measuring use is closely linked with the success of social networking, as measurement can show progress, growth, and change. In this section, we introduce and discuss quantitative and qualitative approaches for measuring the use and evaluating the success of social network activity.

Assessing Your Social Network Activity

Effectively assessing social network activity first requires a definition of success that can be measured over time using multiple methods.

Defining and Measuring Success

Success in social networking does not have a single fixed definition. In the business world, for example, social networking is often deployed for marketing purposes, with the goal of growing sales and revenue. In libraries, social networking is often used for similar purposes, with the goal of increasing the usage of resources and services. Growth often serves as a guiding theme for social media. In our experience with social networks, we have found that a focus on community growth is a productive and mission-driven framework for goal setting. From this perspective, we seek fundamentally to grow our community of users. We measure our success on social networks according to the following three key factors:

  • community growth
  • community engagement
  • connectedness

In achieving these goals, we can produce benefits that flow throughout our organization, with the effect of increasing, for example, our Web traffic, our workshop attendance, and our patrons’ sense of togetherness. We measure this success through a combination of quantitative metrics and qualitative feedback.

Community Growth

To measure community growth, we track the number and type of community members on each social network. At the beginning of each month, we make a proactive effort to record the number of community members on each network and the percent change in number from the previous month (table 5.1).

For Facebook, we record page likes. For Twitter and Instagram, we record followers. These metrics present a high-level view into our social networking activity and let us track our progress in building the membership of our communities. To add nuance to this analysis, we periodically examine the makeup of our community. This process—described in detail in the introduction—helps us understand which user types constitute our community, such as undergraduate students, alumni, campus organizations, or local businesses. When we compare this community analysis with the detailed analytics produced by Google Analytics, Facebook Analytics, and Twitter Analytics, we can start to see the interrelationship between content and community. We seek to grow certain defined types of community through each social network; therefore, we seek to publish content that is meaningful and relevant to those communities. Our guiding question through this reflective process can be expressed by the following: What content will be most engaging to our target community?
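
The percent-change figures in table 5.1 are simple to compute once the monthly counts are recorded. Below is a minimal Python sketch using a few of the counts from table 5.1 as sample data; the data layout is illustrative rather than a prescribed format.

```python
# Hypothetical monthly snapshots of community size per network,
# in the spirit of table 5.1 (date, likes/followers per platform).
rows = [
    {"date": "10/1/2015", "facebook": 1527, "twitter": 1567, "instagram": 85},
    {"date": "11/1/2015", "facebook": 1529, "twitter": 1573, "instagram": 92},
    {"date": "12/1/2015", "facebook": 1547, "twitter": 1599, "instagram": 94},
]

def percent_change(previous, current):
    """Month-over-month growth, e.g. (1529 - 1527) / 1527 = 0.13%."""
    return (current - previous) / previous * 100

for prev, curr in zip(rows, rows[1:]):
    changes = {
        network: percent_change(prev[network], curr[network])
        for network in ("facebook", "twitter", "instagram")
    }
    print(curr["date"], {k: f"{v:.2f}%" for k, v in changes.items()})
```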

Community Engagement

To measure community engagement, we primarily study the analytics available through Google, Facebook, and Twitter. These platforms provide post-level metrics that help shed light on engagement. Engagement is a broad measure that typically reflects user interactions. For example, engagement metrics can include the interactions of liking a post, replying to a post, or sharing a webpage. Exact engagement measures vary by platform but are all connected by the theme of interaction and the attempt to quantify the level of interest that your community shows in your content. Insights into engagement can be used to evaluate and shape the nature of your content, with the overall goal of publishing content that is more relevant and meaningful to your community.

Connectedness

To measure connectedness, we employ a range of qualitative methods that include focus groups and online surveys. The use of focus groups and online surveys to understand and evaluate your community and content is detailed in the chapter on Principle 4. The user feedback generated through these methods can be combined with Web and social analytics to provide a three-dimensional view of your user community that helps shed light on connectedness. Quantitative analytics can tell you how your users behave, and qualitative assessment methods can reveal the motivations behind those behaviors. Connectedness encompasses a combined understanding of your community makeup, your users’ level of engagement, and your users’ motivations for joining your community and interacting with your content.

Tools and Metrics

Social network activity can be measured through a variety of widely available analytics tools.1 The metrics produced by analytics tools are best interpreted in combination with complementary user feedback mechanisms. For this reason, implementing a framework of assessment and measurement will aid in interpreting analytics and productively applying insights to your social networking activity. In this section, we provide further examples of analytics in action.

Google Analytics

Google Analytics is an easy-to-implement and free-to-use Web analytics tool.2 Google Analytics produces an extensive view of a website’s traffic, ultimately offering clues and insights into the behavior of your site’s visitors.3 Librarians from varying contexts have shared their experiences with Google Analytics, offering wide-ranging examples for implementation and use.4 For understanding social network interactivity, Google Analytics produces individualized measurements for social traffic. At the MSU Library, we tune in to two key metrics available through Google Analytics: network referrals and landing pages. These metrics are available in the Acquisition menu in Google Analytics (figure 5.1). Network referrals (figure 5.2) produce visitor metrics for a variety of originating social networks. If a Web user visits your site from a social network, Google Analytics will track the behavior of that visitor according to a series of related site metrics. Figure 5.2 shows the social network referrals for the MSU Library website during the 2015 calendar year, along with the site metrics recorded for each of our top ten originating social networks: sessions, page views, average session duration, and pages per session. These site metrics point toward certain user behaviors that can help shape social sharing strategy.
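
Network referral data does not have to be read from the dashboard alone; it can also be pulled programmatically. The sketch below is a minimal illustration using the Google Analytics Core Reporting API (v3) through the google-api-python-client and oauth2client libraries, assuming a service-account credential. The view ID, key file name, and date range are placeholders, and your authentication setup and API version may differ.

```python
import httplib2
from googleapiclient.discovery import build
from oauth2client.service_account import ServiceAccountCredentials

SCOPES = ["https://www.googleapis.com/auth/analytics.readonly"]
KEY_FILE = "service-account.json"  # placeholder credential file
VIEW_ID = "ga:12345678"            # placeholder Google Analytics view ID

credentials = ServiceAccountCredentials.from_json_keyfile_name(KEY_FILE, SCOPES)
analytics = build("analytics", "v3", http=credentials.authorize(httplib2.Http()))

# Sessions, pageviews, and session duration broken down by originating
# social network, roughly mirroring the Network Referrals report (figure 5.2).
response = analytics.data().ga().get(
    ids=VIEW_ID,
    start_date="2015-01-01",
    end_date="2015-12-31",
    metrics="ga:sessions,ga:pageviews,ga:avgSessionDuration",
    dimensions="ga:socialNetwork",
    sort="-ga:sessions",
    max_results=10,
).execute()

for network, sessions, pageviews, duration in response.get("rows", []):
    print(f"{network}: {sessions} sessions, {pageviews} pageviews, "
          f"{float(duration):.0f}s avg. session")
```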

Facebook Insights

Beyond Google Analytics, leading social networks provide platform-specific analytics that offer a high level of detail. Facebook’s internal analytics tool is called Insights and can be accessed from the menu bar at the top of any page that you manage. Facebook Insights is organized according to six major categories:

  • Overview
  • Likes (figure 5.3): Shows how many likes your page has gained and lost and where new likes come from.
  • Reach (figure 5.4): Indicates the number of Facebook users who see your posts on their news feeds. Posts that receive more reactions, comments, and shares are more likely to appear on users’ news feeds. The reach category also shows the level of engagement of your posts. Engagement measures reactions, comments, and shares (figure 5.5).
  • Page views (figure 5.6): Reports which sections of your page have been visited and the traffic sources.
  • Posts (figure 5.7): Provides metrics, such as reach, for individual posts.
  • People (figure 5.8): Shows demographic information about the users who like your page, who have seen your posts, and who have interacted with your posts by reacting, commenting, or sharing.

The analytics produced through Facebook are designed to help you understand your Facebook content and community so that you are better equipped to create and publish content that is more shareable and engaging for your community.
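
Many of these Insights metrics can also be retrieved programmatically through the Facebook Graph API, which makes it easier to archive monthly snapshots alongside other analytics. The sketch below is a hedged illustration assuming a page access token with the appropriate permissions; the page ID, token, API version, and metric names are placeholders or assumptions and should be checked against the current Graph API documentation.

```python
import requests

PAGE_ID = "your-page-id"          # placeholder
ACCESS_TOKEN = "your-page-token"  # placeholder page access token
GRAPH_URL = f"https://graph.facebook.com/v2.8/{PAGE_ID}/insights"

# Request a few page-level Insights metrics (names assumed from the
# Graph API's page insights documentation; verify before relying on them).
params = {
    "metric": "page_impressions,page_engaged_users,page_fans",
    "period": "day",
    "access_token": ACCESS_TOKEN,
}
response = requests.get(GRAPH_URL, params=params)
response.raise_for_status()

for metric in response.json().get("data", []):
    latest = metric["values"][-1]  # most recent data point for this metric
    print(metric["name"], latest.get("end_time"), latest.get("value"))
```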

Twitter Analytics

Twitter similarly offers platform-specific analytics.5 As with Facebook, Twitter Analytics offers a variety of metrics that are unique to the platform, organized into categories: Tweets, Audiences, Events, Twitter Cards, Videos, App Manager, and Conversion Tracking. Only two of these categories will be relevant for most libraries: Tweets and Twitter Cards. The remaining categories are designed primarily for e-commerce and paid content or for use in conjunction with apps designed specifically for Android and Apple platforms. In exploring the full range of analytics categories, you may find metrics that are helpful for your local strategies. At the MSU Library, we focus on the following categories and metrics:

  • Tweets: This category presents key tweet-level metrics (figure 5.9):
    • Impressions: The number of times a user sees a tweet in the timeline.
    • Engagements: The number of times a user interacts with a tweet, measured by retweeting, replying, following, liking, or clicking within the tweet.
    • Engagement Rate: The number of engagements divided by the number of impressions (a short calculation sketch follows this list).
  • Twitter Cards: This category presents the performance of tweets that include Twitter Cards (we address the use of Twitter Cards in the chapter on Principle 2):
    • URL Clicks: Shows click behavior on tweets that have Twitter Cards installed.
    • Install Attempts: Shows app installs originating from tweets that have Twitter Cards installed. This metric will be relevant only for libraries with native Android and Apple apps.
    • Retweets: Shows retweet behavior on tweets that have Twitter Cards installed.
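
Twitter Analytics lets you export tweet activity as a CSV file, so engagement rate can also be recomputed or aggregated outside the dashboard. The sketch below assumes a hypothetical export file with impressions and engagements columns; the actual file name and column headers in your export may differ.

```python
import csv

def engagement_rate(engagements, impressions):
    """Engagement rate = engagements / impressions (0 when nothing was seen)."""
    return engagements / impressions if impressions else 0.0

total_impressions = 0
total_engagements = 0

# "tweet_activity.csv" is a hypothetical Twitter Analytics export;
# the column names ("impressions", "engagements") are assumed.
with open("tweet_activity.csv", newline="") as f:
    for row in csv.DictReader(f):
        total_impressions += int(row["impressions"])
        total_engagements += int(row["engagements"])

print(f"Overall engagement rate: "
      f"{engagement_rate(total_engagements, total_impressions):.2%}")
```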

Analytics Case Study 1: Google Analytics

Evaluating the analytics generated through Google Analytics is one possible method for understanding and measuring the success of a social media strategy. Our Google Analytics (figure 5.2) show that the majority of our Web visitors who arrive via social networks originate from Facebook, Twitter, and Pinterest. The Sessions metric provides the clue: of all the visits—counted as a session—from social networks to our library’s website during the year 2015, 70 percent originated from Facebook, 19 percent from Twitter, and 8 percent from Pinterest. These metrics inform our library’s social media strategy, which revolves primarily around Facebook and Twitter. In that sense, we expect to see that the majority of our website’s social network visits originate from our two main social networks.

Additional metrics offer more nuanced views into the behavior of our Web visitors. Pageviews represents the total number of pages viewed by visitors from each network. Avg. Session Duration shows the average length of time that a visitor spends on our website. Pages / Session shows the average number of pages that each visitor views during a session. We can see from our analytics that visitors from Twitter tend to spend more time on our website, with a 3:11 average session duration. Visitors from Pinterest, however, tend to view more pages, with just over five pages viewed per session. This tells us that Twitter users stay on our site longer, but that Pinterest users range more widely throughout our site. These intriguing metrics signal the behavior of our users but cannot on their own produce a full view of the user. More investigation is required to fully understand why Twitter users and Pinterest users behave differently on our library’s website. In this way, social analytics can produce insights that in turn generate follow-up research questions. These questions are best answered by speaking directly with our library’s community, for example through interviews, focus groups, or surveys. At the MSU Library, we regularly evaluate the effort that we dedicate to each social network with respect to the Web traffic it generates and then enrich that evaluation through conversations with our users.

Analytics Case Study 2: Twitter Analytics

At the MSU Library, we use Twitter Analytics to help us understand our community and our content. For the month of April 2016, we can see from our Twitter Analytics that certain tweets have generated comparatively high levels of impressions and engagement (figure 5.9). This view, available through the Tweets category, shows the Top Tweets for this time period. We can see at a glance that these tweets share related characteristics. Firstly—as described in our Social Media Guide in the introduction—we strive to convey a welcoming and friendly voice through our social network posts so that our library is recognized as an open and accessible member of the campus community. Secondly, we strive to publish content that reflects the values and experience of our community. The top four tweets from April 2016 show that our voice is positive and warm, while the content itself is in tune with the experience of our community of mostly undergraduate students. Twitter Analytics helps us evaluate the relationship between our content and our community. If well matched, our content will be engaging to our community, resulting in increased resource usage and an overall increased sense of connectedness.

Analytics Case Study 3: Going beyond the Metric

Metrics are an important aspect of evaluating social network success—but metrics alone can’t provide a full measure of success.6 In this case study, we discuss the complex nature of defining and measuring success for one aspect of social networking—social links. These links can come in the form of buttons that activate a share or as linked text or icons that lead a user to the social network account pages for a company or organization. Social links of this kind are prevalent across the Web. Websites from major brands to minor bloggers include social links on their homepages and throughout their sites. An extensive report released by GOV.UK detailed the performance of social links across the GOV.UK website. The report summed up: “During the time period we analysed, GOV.UK URLs were shared a total of 14,078 times to Facebook and Twitter using our sharing buttons—that’s 0.2% of the total of 6.8 million pageviews.”7 GOV.UK has approached this question with the implicit assumption that the click-through performance of social links is the primary measure of value. Based on this performance-based value proposition, the writers conclude: “From what we’ve seen so far, our users aren’t exactly demonstrating an overwhelming case for us retaining social sharing buttons.”8 GOV.UK click-through rates are indeed low. It’s easy to understand why they would consider removing these buttons, provided that value is measured by click-through rate.
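
The GOV.UK figure is easy to reproduce: click-through rate is simply the number of button-driven shares divided by the number of pageviews over the same period. A quick check with the numbers quoted above:

```python
shares = 14078        # GOV.UK URLs shared via the sharing buttons
pageviews = 6800000   # total pageviews in the same period

click_through_rate = shares / pageviews
print(f"{click_through_rate:.1%}")  # -> 0.2%
```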

Another example comes from Erik Runyon, the Director of Web Communications at the University of Notre Dame.9 Social links are present on project homepages throughout the Notre Dame Web domain, and these links are prominently displayed using icons and large text. An analysis of the click-through rate of these social links found the following:

  • Twitter: 0.370%
  • Facebook: 0.059%
  • LinkedIn: 0.008%
  • YouTube: 0.112%
  • Flickr: 0.028%

Runyon concludes, “Even though these numbers are low, I wouldn’t advocate for pulling them from your site. Let’s be honest, finding you on social media isn’t why most people visit your site. But obviously some people do want to engage further.”10

With this in mind, let’s study an example from the MSU Library. Our social links are displayed as icons in the footer of our library’s homepage (figure 5.10).

Google Analytics allows us to measure the click-through rate of these buttons, relative to other clickable items on our homepage:

  • Twitter: 0.053%
  • Facebook: 0.043%
  • Tumblr: 0.043%
  • Pinterest: 0.031%
  • YouTube: 0.018%
  • WordPress: 0.018%

These metrics indicate that the level of engagement for these buttons is quite low in the context of the page as a whole. What should our response be? GOV.UK notes in its report that it will benchmark its click-through rates against comparable figures from other sites. Given the similarly low click-through rates across many sites, this will likely offer only limited insight. GOV.UK also says that it will run A/B testing on the position of buttons.11 But again, click-through rates are already so low that alterations in button placement are unlikely to move the needle significantly.
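
If a library does run an A/B test on button placement, the outcome is a comparison of two click-through rates, which can be checked with a standard two-proportion z-test. The sketch below uses hypothetical click and pageview counts; with rates this low, very large samples are needed before any difference becomes statistically meaningful.

```python
import math

def two_proportion_z_test(clicks_a, views_a, clicks_b, views_b):
    """Two-sided z-test comparing two click-through rates."""
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    pooled = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
    z = (p_a - p_b) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return p_a, p_b, z, p_value

# Hypothetical test: footer placement (A) vs. header placement (B).
p_a, p_b, z, p_value = two_proportion_z_test(53, 100000, 70, 100000)
print(f"A: {p_a:.3%}  B: {p_b:.3%}  z={z:.2f}  p={p_value:.3f}")
```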

Instead of shifting the placement of social links, we might benefit from shifting our value perspective to a more fully contextual understanding of our users’ relationship to social links. Discussion of social links is often centered on the click-through, a useful but limited metric. A more interesting line of investigation might instead center on the page view, or more specifically, the experience of the page view. From this point of view, we can begin to ask a number of broader user-centered questions: What do our users expect to find on our pages? Do they want to see social links? How do our users feel when they see social links? How are our users’ perceptions of us shaped when they see social links? By taking a more holistic view of the experience of our users, we can start to understand the more subtle and complete effects of social links. Users might not be clicking on our social links very often, but could the buttons be serving a purpose in a different way?

This line of questioning expands the analysis far beyond the click-through metric. Analytics can tell us what our users do on our websites, but to discover why our users behave in certain ways, we must supplement our quantitative analytics with other qualitative user feedback mechanisms such as interviews and focus groups that can begin to answer more complex questions: Will a user see a Twitter button on a library webpage and later think of tweeting at the library to ask a reference question? If your library’s Tumblr or Instagram features images from special collections, will a user see those buttons and later scroll through the feeds and decide to visit special collections in person or digital collections online? How can we enable the kinds of social network interactions that bring users into the world of the library? How can we utilize social networking to expand the library community? Are social links an effective way to do any of these things?

Quantitative analytics show us that user engagement is low for social links. These low click-through metrics also tell us that qualitative follow-up is necessary to understand why users behave in this way and to gain a fuller insight into the nature and effect of social links. It will be essential to ask our users why they’re not clicking, what they’re seeing and feeling when they encounter social links, and what behavior follows the page view.

In the ongoing evaluation of the value and success of social networking, we will benefit from combining metrics such as the click-through with qualitative feedback mechanisms such as focus groups and user interviews. A well-rounded, user-centered analysis will offer a more complete understanding of social networking value and success.

Encourage Reuse

Just as the methods for measuring use are diverse, so are the approaches for encouraging reuse. Social networks are purposefully built for sharing and reusing Web content. Many users want to share, repost, and embed resources into multiple online social environments, and the built-in functionality of social networks enables and rewards this kind of reuse. Reblogging on Tumblr, retweeting on Twitter, and sharing on Facebook are just a few examples of platform features that facilitate reuse. Libraries can further amplify interactive, community-based content sharing by encouraging reuse among our user communities. Cultivating a culture of sharing and reuse is a key principle of SMO and can be achieved by promoting reusable content and by leveraging the interactive Web.

Reusable Content

Not all content may be reused. Content across the Web may be fully copyright-protected by the owner or creator, with no allowable sharing or reuse. Luckily, shareable and reusable content can easily be found by following licensing guidelines, often in the form of a Creative Commons (CC) license.12 When libraries create shareable content as described in the chapter on Principle 1, we can also apply a CC license to our content, where allowable, to encourage sharing. We can also find CC-licensed content to share through our organizational social network accounts, and we can encourage our users to follow CC guidelines for finding and sharing content through their own social network accounts. A few key sources offer ready-to-use CC-licensed content: Flickr, Google, Wikimedia Commons, and Creative Commons itself.

The Flickr community maintains one of the Web’s most extensive collections of CC-licensed content.13 The Flickr search tool—also one of the best on the Web—allows users to discover images and video according to a preferred reuse license (figure 5.11).
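
The same license filtering is available programmatically through the Flickr API’s flickr.photos.search method, which accepts a license parameter. The sketch below assumes a Flickr API key and uses license ID 4, which has corresponded to the CC BY license; verify the current ID-to-license mapping with the flickr.photos.licenses.getInfo method.

```python
import requests

API_KEY = "your-flickr-api-key"  # placeholder

# Search Flickr for photos matching a query, restricted to one CC license.
params = {
    "method": "flickr.photos.search",
    "api_key": API_KEY,
    "text": "library reading room",
    "license": "4",   # assumed: 4 = CC BY (check licenses.getInfo)
    "per_page": 10,
    "format": "json",
    "nojsoncallback": 1,
}
response = requests.get("https://api.flickr.com/services/rest/", params=params)
response.raise_for_status()

for photo in response.json()["photos"]["photo"]:
    # Construct each photo's page URL from its owner and ID.
    print(photo["title"],
          f"https://www.flickr.com/photos/{photo['owner']}/{photo['id']}")
```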

Google also offers extensive image searching, with a dedicated search parameter for Usage Rights (figure 5.12). Whereas the Flickr engine searches its own collection of user-contributed images, the Google engine indexes the open Web to find images that match search queries.

Wikimedia Commons, the repository for objects published through Wikipedia, offers over 32,000,000 freely usable media files.14 Wikimedia contains images, video, and audio that can be shared and reused. Finally, Creative Commons itself offers a search portal for discovering reusable content from more than ten different content sources.15 Not only are these tools excellent sources of content for our own sharing, but through user education, we can help our patrons understand and adopt practices of sharing and reuse.16 Many excellent LibGuides can be found that can help librarians and library patrons understand and apply the complexities of copyright and content licensing.17 Through the SMO principle of encouraging reuse, we librarians can demonstrate a positive model of sharing for our user communities.

The Interactive Web

The interactive Web presents opportunities to engage users directly with reusable content. By creating Web applications and content promotions, we can invite users into our collections for a creative exploration of shareable and reusable content.

At the MSU Library, we have created targeted campaigns using content from our Acoustic Atlas digital collection. Our Acoustic Atlas Ringtones webpage invites users to download, use, and share sound files representing the calls of various animals of the Western US.18 Examples from other digital collections further highlight the interactive Web, such as the Chronicling America collection from the Library of Congress. This archive of American newspapers dating from 1836 to 1922 encourages users to read and share historical accounts from across the United States. The chronological scope of this collection neatly ends at 1922, thereby placing most content in the public domain, as copyright generally expires for works published in the United States before January 1, 1923.19 By intentionally making this veritable trove of public domain material available, the collection scope of Chronicling America itself encourages reuse.

The Digital Public Library of America (DPLA) has creatively engaged users through interactive web application development. The DPLA makes its material openly available for reuse through an API (application programming interface) and maintains a growing list of apps that provide unique points of discovery for its vast collection of digitized objects.20 To pick just one example of many, web developer and librarian Adam Malantonio created “Historical Cats,” a Twitter bot that randomly finds and tweets an object from the DPLA collection.21 The DPLA also hosts an annual conference, DPLAfest, that includes a “hackfest” where conference attendees can gather, brainstorm, and build new apps together that reuse DPLA material.22 Furthermore, DPLA representatives travel across the nation to participate in library conferences and lead hackfests and other collaboration events.23 In this way, the DPLA has become a leader in encouraging reuse. First, the technical structure of the DPLA allows users to access, reuse, and share its collection in new and creative ways. Second, the DPLA cultivates and promotes a culture of reuse by facilitating new application development at professional conferences.
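
The DPLA API itself is straightforward to query: a request to the items endpoint with an API key returns JSON records that can be reused in bots and apps like Historical Cats. The sketch below assumes a DPLA API key; the field names follow the DPLA’s v2 API documentation but should be verified there.

```python
import random
import requests

API_KEY = "your-dpla-api-key"  # placeholder; request a key from dp.la

# Search the DPLA for items matching a query (in the spirit of Historical Cats).
response = requests.get(
    "https://api.dp.la/v2/items",
    params={"q": "cats", "page_size": 50, "api_key": API_KEY},
)
response.raise_for_status()
docs = response.json().get("docs", [])

if docs:
    item = random.choice(docs)
    title = item.get("sourceResource", {}).get("title")
    print(title, item.get("isShownAt"))  # isShownAt links to the providing institution
```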

Conclusion

In complement to Principles 1–4, the fifth principle of SMO—measure use and encourage reuse—brings into view the full picture of optimizing social network activity for building and engaging community. Our discussion of this final principle provides an approach for defining goals and articulating success. With the parameters of success established, it becomes possible to apply quantitative Web analytics together with qualitative user feedback mechanisms to measure and evaluate a library’s social networking activity. This evaluation can then be used in two primary ways: first, to inform and improve future social network activity, and second, to understand and justify the use of social networking by providing evidence of community growth and engagement. SMO can help shape your social network activity so that it reflects your unique community of library users, ultimately serving to expand the world of the library by connecting our users with our collections, services, and people.

Notes

  1. Ines Mergel, “A Framework for Interpreting Social Media Interactions in the Public Sector,” Government Information Quarterly 30, no. 4 (October 2013): 327–34, http://dx.doi.org/10.1016/j.giq.2013.05.015.
  2. “Google Analytics Solutions,” Google website, accessed June 1, 2016, https://www.google.com/analytics.
  3. Jody Condit Fagan, “The Suitability of Web Analytics Key Performance Indicators in the Academic Library Environment,” Journal of Academic Librarianship 40, no. 1 (January 2014): 25–34, http://dx.doi.org/10.1016/j.acalib.2013.06.005.
  4. Le Yang and Joy M. Perrin, “Tutorials on Google Analytics: How to Craft a Web Analytics Report for a Library Web Site,” Journal of Web Librarianship 8, no. 4 (2014): 404–17, http://dx.doi.org/10.1080/19322909.2014.944296; Kirk Hess, “Discovering Digital Library User Behavior with Google Analytics,” Code4Lib Journal, no. 17 (June 2012), http://journal.code4lib.org/articles/6942; Tabatha Farney, “Google Analytics and Google Tag Manager,” Library Technology Reports 52, no. 7 (August/September 2016).
  5. “Analytics,” Twitter website, accessed June 25, 2016, https://analytics.twitter.com.
  6. Scott W. H. Young, “Measuring the Value of Social Media Buttons,” Scott W. H. Young (blog), March 10, 2014, http://scottwhyoung.com/social-media/measuring-value-of-social-media-buttons.
  7. Graham Francis and Ashraf Chohan, “GOV.UK Social Sharing Buttons: The First 10 Weeks,” Inside GOV.UK (blog), February 20, 2014, https://insidegovuk.blog.gov.uk/2014/02/20/gov-uk-social-sharing-buttons-the-first-10-weeks.
  8. Ibid.
  9. Erik Runyon, “Social Media Click Stats,” WeedyGarden (blog), February 27, 2014, https://erikrunyon.com/2014/02/social-media-click-stats.
  10. Ibid.
  11. Scott W. H. Young, “Improving Library User Experience with A/B Testing: Principles and Processes,” Weave 1, no. 1 (2014), http://dx.doi.org/10.3998/weave.12535642.0001.101.
  12. Creative Commons website, accessed June 2, 2016, https://creativecommons.org.
  13. “Explore / Creative Commons,” Flickr website, accessed June 2, 2016, https://www.flickr.com/creativecommons.
  14. “Commons: Reusing Content Outside Wikimedia,” Wikimedia Commons website, accessed June 2, 2016, https://commons.wikimedia.org/wiki/Commons:Reusing_content_outside_Wikimedia.
  15. “Search,” Creative Commons website, accessed June 2, 2016, https://search.creativecommons.org.
  16. Molly Kleinman, “The Beauty of ‘Some Rights Reserved’: Introducing Creative Commons to Librarians, Faculty, and Students,” College and Research Libraries News 69, no. 10 (2008): 594–97.
  17. Hannah Bennett, Patty Guardiola, and Rebecca Stuhr, “Finding Open Access Images: Overview,” April 3, 2016, University of Pennsylvania Library Guide, http://guides.library.upenn.edu/open_access_images; Meg Kribble, “Finding Public Domain & Creative Commons Media,” June 9, 2016, Harvard Law School Library Guide, http://guides.library.harvard.edu/Finding_Images.
  18. “Wild Ringtones for Your Phone,” Acoustic Atlas website, accessed June 5, 2016, http://acousticatlas.org/ringtones.php.
  19. “Welcome to the Public Domain,” Stanford University Library Copyright & Fair Use website, accessed June 12, 2016, http://fairuse.stanford.edu/overview/public-domain/welcome.
  20. “App Library,” Digital Public Library of America (DPLA) website, accessed June 3, 2016, https://dp.la/apps.
  21. Adam Malantonio, “Historical Cats,” Digital Public Library of America (DPLA) website, accessed June 3, 2016, https://dp.la/apps/20.
  22. “DPLAFest 2015,” Digital Public Library of America (DPLA) website, accessed June 3, 2016, https://dp.la/info/get-involved/dplafest/april-2015.
  23. “Clepanapy,” “Hacking the DPLA: DPLA to Present at Code4Lib Conference in February 2013,” News (blog), January 15, 2013, Digital Public Library of America (DPLA) website, accessed June 3, 2016, https://dp.la/info/2013/01/15/hacking-the-dpla-dpla-to-present-at-code4lib-conference-in-february-2013.

Figure 5.1. Acquisition menu in Google Analytics
Figure 5.2. Network referrals in Google Analytics
Figure 5.3. Facebook Insights: Net Likes
Figure 5.4. Facebook Insights: Post Reach
Figure 5.5. Facebook Insights: Reactions, Comments, and Shares
Figure 5.6. Facebook Insights: page views
Figure 5.7. Facebook Insights: post-level metrics
Figure 5.8. Facebook Insights: people who liked your page
Figure 5.9. Twitter Analytics by post
Figure 5.10. MSU Library homepage
Figure 5.11. Flickr search, with license filters
Figure 5.12. Google Image Search, with Usage Rights filter

Table 5.1. Growth over time of MSU Library communities on Facebook, Twitter, and Instagram

Date        Facebook Likes   % Change   Twitter Followers   % Change   Instagram Followers   % Change
10/1/2015   1,527            —          1,567               —          85                    —
11/1/2015   1,529            0.13%      1,573               0.38%      92                    8.24%
12/1/2015   1,547            1.18%      1,599               1.65%      94                    2.17%
1/1/2016    1,542            –0.32%     1,641               2.63%      105                   11.70%
2/1/2016    1,556            0.91%      1,661               1.22%      118                   12.38%
3/1/2016    1,559            0.19%      1,671               0.60%      138                   16.95%
4/1/2016    1,563            0.26%      1,685               0.84%      159                   15.22%
5/1/2016    1,563            0.00%      1,704               1.13%      176                   10.69%
6/1/2016    1,570            0.45%      1,722               1.06%      213                   21.02%
