Chapter 2. Major Altmetrics Tools

The altmetrics landscape is shaped not only by its thought leaders, outspoken critics, and promoters, but also by the very tools used to produce, aggregate, and contextualize the raw data behind altmetrics. In bibliometrics, the vast majority of available data is produced by a very small number of providers, mainly through costly library subscriptions. With altmetrics, by contrast, usable data can be generated or harvested from a wide variety of sources with different cost structures, accessibility levels, and intended audiences and purposes.

There are many reasons for this divergence between bibliometrics and altmetrics. One big reason is the very nature of the metrics themselves—since bibliometrics are based on journal articles, the big providers are concerned with indexing those articles, creating links between their citations, and using this data as the basis for the calculated metrics. Since the field of altmetrics has no strictly set definition or defining set of metrics, an individual altmetric can be generated from a large variety of online tools, including social media websites, information-sharing sites, online scholarly networks, and other tools used to create, collect, share, organize, and manage many types of information. Some tools are specifically created for the purpose of altmetrics, while many take advantage of existing data generated for both scholarly and nonscholarly purposes. Likewise, some are freely available online, while others require a subscription or registration to access and are variously funded by grants, advertisements, companies, or the aforementioned subscriptions.

Given all of this diversity, it’s not easy to keep track of all of the sources and tools that fall under the broad altmetrics umbrella. In this chapter, we will take a look at many of the tools that make up this increasingly diverse landscape and discuss methods for evaluating new and existing tools as they continue to evolve.

Nonacademic Tools

We begin our tour by focusing on tools that define today’s online user experience—websites, including social media tools, visited or used by, well, just about everyone. None of these sites was developed for the purpose of altmetrics or even with a particularly academic focus. Nonetheless, they can give us some insight into the impact of scholarship, particularly as it affects the public.

Facebook

Perhaps the best known of all social media tools, Facebook is used by individuals, groups, businesses, and other organizations to connect and share information of all kinds, including photos and videos. Sometimes, Facebook is even used to share academic information like journal articles, video presentations, and blog posts. The number of times a URL has been shared or Liked can be counted and reported by outside tools such as altmetrics harvesters, which we will discuss later in the chapter. These metrics can be used as an early indicator of interest or attention regarding any scholarly contribution that can be traced to a URL.

Twitter

Twitter serves a purpose very similar to Facebook’s in that it connects individuals, businesses, and other entities for the purpose of sharing information, including photos and videos. However, Twitter’s most distinguishing feature is that information bites, or Tweets, are restricted to 140 characters. Twitter also seems to be used more often for academic purposes, with people and organizations from publishers to individual journals to editors, researchers, and other academic individuals and entities widely represented. As on Facebook, when a URL is Tweeted or Retweeted, the number of Tweets can be counted, as well as the total reach of those Tweets—that is, the total number of Twitter users that follow everyone who has Tweeted the URL, meaning that they may have read the Tweet or clicked on the URL.
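
To make the reach calculation concrete, here is a minimal sketch in Python using invented Tweet records rather than live Twitter data; a real harvester would gather follower counts through the Twitter API and deduplicate accounts.

# A conceptual sketch of the Tweet count and "reach" described above.
# The records are invented; each one represents a user who Tweeted or
# Retweeted the article's URL, along with that user's follower count.
tweets = [
    {"user": "journal_account", "followers": 12000},
    {"user": "lab_member", "followers": 450},
    {"user": "science_writer", "followers": 8300},
]

tweet_count = len(tweets)
# Reach is the combined audience of everyone who shared the URL. It is an
# upper bound on exposure, not a count of people who actually read or
# clicked the Tweet.
reach = sum(t["followers"] for t in tweets)
print(f"{tweet_count} Tweets, potential reach of {reach:,} followers")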

YouTube

YouTube is a popular video-sharing website where individuals and entities can create a YouTube account, allowing them to upload videos, subscribe to other users’ video feeds, and comment on or Favorite a video. However, many videos are discovered through YouTube search, Google search, or the sharing of YouTube videos on social media sites and elsewhere. Metrics include the total number of views for a video, along with the number of comments and Favorites that a video has received. Videos can serve a variety of academic purposes, from a recording of a lecture to a demonstration of a methodology to a supplement to published research. The number of views or subscribers can demonstrate the relative interest in a video or account. YouTube metrics are particularly useful for things like conference presentations, an area of scholarship that often lacks useful metrics.

Amazon

Amazon may not seem like an intuitive addition to the list. Amazon’s main function is to buy and sell all kinds of goods, but it started in 1995 as an online bookstore of sorts before expanding into other types of goods. Amazon still enjoys heavy revenue from its print and e-book holdings, with over $5 billion earned from books alone in 2013.1 Amazon provides a Best Sellers Rank for all books on its website, as shown in figure 2.1—that is, how often a book is purchased as compared to other books in the same category. This rank can demonstrate overall interest in a book, though since there’s no way to know who, exactly, is buying it (or for what purpose), it says little about scholarly interest specifically. Since Amazon users can also leave a rating and a review for any good, Amazon can also serve as a place to retrieve overall ratings and book reviews, keeping in mind that Amazon ratings and reviews can be added by any Amazon user for any reason and may reflect aspects of the buying process or impressions of the book rather than a reasoned critique of its contents.

Goodreads

Like Amazon, Goodreads can give us metrics only for a specific type of scholarship, that is, books. However, unlike Amazon, which gives us sales metrics, Goodreads can tell us self-reported readership metrics (see figure 2.2). Goodreads is a website and mobile app designed as a sort of “online bookshelf” for readers where they can keep track of books read, rate them, and look for book recommendations from other Goodreads readers. Another similarity to Amazon is the ability to retrieve the overall rating and book reviews from Goodreads members, keeping in mind again that the reviews may be coming from a diverse pool of readers.

SlideShare

As we move down the list, we’re slowly branching away from “tools everyone uses” toward “tools used more often by academics,” and SlideShare is the first tool listed here that counts academics among its primary, though not exclusive, users. On SlideShare, users can upload a “slidedeck,” or series of slides, like those from PowerPoint or similar programs. Other users can follow a user, receiving notifications when that person uploads new presentations. Slidedecks are searchable by keyword or by user-supplied tags. Metrics include the total number of views, Favorites, comments, and downloads, and users can access detailed metrics for each slidedeck, including number of views over time, as shown in figure 2.3. As with other sources, metrics can hint at overall interest in a presentation but cannot differentiate between academic interest and interest from the general public.

GitHub

GitHub is a useful website for anyone who writes programming code because it allows individuals to upload code, collaborate on it with others, and freely share it. In turn, GitHub tracks watchers, collaborators, and “forks.” A fork is a copy of the code that someone else develops and uses for their own purposes, similar to creating a derivative work from a Creative Commons–licensed work. For programmers, this represents one of the only ways to track the impact of written code, since citations are not easily tracked within code. However, because programming spans academic, business, and other realms, these metrics can show the impact of code only on other coders, and not necessarily within academia.
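
As an illustration, the minimal sketch below pulls these repository-level counts from GitHub’s public REST API; the repository shown is GitHub’s own demo repository, and unauthenticated requests are rate-limited, so production use would add an access token.

import requests

def github_metrics(owner, repo):
    # Query GitHub's public REST API for a single repository.
    resp = requests.get(f"https://api.github.com/repos/{owner}/{repo}", timeout=10)
    resp.raise_for_status()
    data = resp.json()
    return {
        "stars": data["stargazers_count"],      # users who starred the repository
        "forks": data["forks_count"],           # derivative copies of the code
        "watchers": data["subscribers_count"],  # users following repository activity
    }

print(github_metrics("octocat", "Hello-World"))  # GitHub's demo repository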

Academic Tools and Peer Networks

The following are online tools used for organizing and sharing information, and each generates some type of metric that can be considered a type of altmetric. The main difference between these tools and those in the previous category is that these tools have been created for an academic audience, making academics the core user base for them. Because of this, the metrics generated from these tools can tell us more about the scholarly impact of contributions like journal articles. However, adoption of these tools throughout academia can vary widely, as their features may appeal to some disciplines more than others. These limitations should be kept in mind when using altmetrics information from these tools to portray the impact of a work, particularly when directly comparing works from different disciplines, an issue we will cover in greater detail in chapter 3.

Institutional Repositories

Institutional repositories (IRs) are familiar to many academic librarians since libraries are often responsible for the creation and maintenance of their institution’s IR. But while many librarians are familiar with the role IRs play in contributing to open access, fewer are familiar with the role they play in the production of altmetrics. Many IRs contain metrics about the repository’s artifacts such as views and downloads. These metrics can also serve as a powerful incentive for researchers to place their artifacts in the repository. Stacy Konkiel, former scholarly communications librarian, has written and presented extensively on the subject of IRs and altmetrics.2

CiteULike

CiteULike is a social bookmarking website specifically designed for researchers to save and organize journal citations into their personal libraries. These libraries can be set for public or private viewing. Metrics can then be generated based on the number of public CiteULike libraries that contain a particular article. Since private libraries can’t be viewed and relatively little is known about the CiteULike user base, these metrics are best compared to those of other similar articles, though any metric can show a level of interest in the article.

CiteULike

www.citeulike.org

Mendeley

Like CiteULike, Mendeley is a free citation manager, helping researchers save and organize citations and PDFs. Users must register for an account online before downloading the Mendeley desktop program or using its online tools for citation management. However, Mendeley also hosts a social media component through its website by integrating the ability to follow individuals, join groups, and browse articles by discipline. The number of Mendeley users who have saved an article to their citation libraries is tracked, along with some demographic information about those users, as figure 2.4 demonstrates. These metrics are publicly available, meaning that they can be retrieved and analyzed by other tools. Having detailed demographics attached to the metrics helps move them from “someone is interested in this work” to “faculty and researchers in specific areas are interested in this work.” Recent studies have shown a modest correlation between Mendeley readership and later citation counts, meaning that this particular metric serves as a decent early indicator of scholarly impact, a point discussed in more detail in chapter 3.
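
To illustrate how that demographic detail adds meaning to a raw count, here is a small sketch that tallies invented reader records of the kind shown in figure 2.4; in practice, the records would come from Mendeley’s readership data for a specific article.

from collections import Counter

# Invented reader records for one article; real data would come from
# Mendeley's readership statistics.
readers = [
    {"discipline": "Information Science", "status": "Ph.D. Student"},
    {"discipline": "Information Science", "status": "Librarian"},
    {"discipline": "Computer Science", "status": "Researcher"},
]

print(f"Total Mendeley readers: {len(readers)}")
print("By discipline:", dict(Counter(r["discipline"] for r in readers)))
print("By academic status:", dict(Counter(r["status"] for r in readers)))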

Mendeley

www.mendeley.com

Academia.edu

Academia.edu is our first example of a “closed” peer network system. As on Mendeley, researchers can create a free profile, upload citations and full-text works, follow other authors, and track their usage metrics over time. However, unlike Mendeley, this information is available only to individuals who have registered for an account, so it is closed to other tools, which cannot retrieve these metrics. Nonetheless, these metrics can show interest in works over time, and Academia.edu remains a very popular research network for researchers across many disciplines.

Academia.edu

www.academia.edu

ResearchGate

ResearchGate is a closed peer network system designed for researchers in the sciences, with metrics accessible only to its users. After registering for a free account, ResearchGate users can upload their citations and full-text articles and get metrics for views, bookmarks, and downloads. Additionally, ResearchGate produces an author-level metric, the RG score, which aims to approximate the level of influence the user has within ResearchGate. The RG score is one of the only altmetrics whose primary focus is author-level impact (albeit limited to impact within the ResearchGate system)—that is, a metric derived from the sum of a researcher’s scholarly contributions, rather than from metrics for individual contributions (like journal articles) that are then summed for an individual author.

ResearchGate

www.researchgate.net

Social Science Research Network (SSRN)

The Social Science Research Network is one of the oldest peer networks, having existed in some form since 1994. SSRN is known primarily for allowing users to share pre-publication versions of articles, as well as white papers. Like the other peer networks detailed above, registration is free, and authors can add their own papers and retrieve metrics for those papers. Because it focuses on articles that have yet to be published, SSRN can be useful for gathering early indicators, such as views and downloads, prior to an article’s formal publication.

Social Science Research Network

www.ssrn.com

Altmetrics Harvesting Tools

This final category of altmetrics tools includes tools that are most commonly associated with altmetrics because they are primarily concerned with harvesting, or gathering, altmetrics from many sources, including many of the sources detailed above. More importantly, these sources not only harvest altmetrics, but also work to contextualize the data in meaningful ways. This helps to provide a more in-depth understanding of what altmetrics can actually say about a scholarly work, particularly as it compares to similar works. Each tool has different features, strengths, and weaknesses, and they all serve similar but distinct purposes with different intended audiences.

Altmetric

The London-based company Altmetric provides a series of tools, all under the Altmetric banner, that increase in complexity from a tool designed to generate altmetrics for a single journal article to a tool that aggregates and compares altmetrics at the institutional level. Each tool is built on altmetrics harvested and contextualized from the same sources, many of which are detailed above. However, all metrics are derived from journal articles only—more specifically, journal articles with a retrievable DOI, PubMed ID, or arXiv ID with “friendly metadata.” This essentially limits the content for which the Altmetric tools can pull data to those journal articles that can be correctly identified.

Altmetric

www.altmetric.com

With these limitations in mind, Altmetric is still able to pull together some powerful altmetrics data, starting at the individual article level with its bookmarklet.
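
The same article-level data can also be retrieved programmatically. Below is a minimal sketch against Altmetric’s free v1 DOI endpoint; the endpoint and field names reflect the API as of this writing and may change, and the DOI shown is only a placeholder.

import requests

def altmetric_summary(doi):
    # Look up article-level altmetrics by DOI; a 404 means Altmetric has
    # not recorded any attention for this article.
    resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=10)
    if resp.status_code == 404:
        return None
    resp.raise_for_status()
    data = resp.json()
    return {
        "score": data.get("score"),                       # the Altmetric score
        "tweeters": data.get("cited_by_tweeters_count"),  # unique Twitter accounts
        "facebook": data.get("cited_by_fbwalls_count"),   # public Facebook posts
        "mendeley": data.get("readers", {}).get("mendeley"),
    }

print(altmetric_summary("10.1038/nature12373"))  # placeholder DOI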

Altmetric Bookmarklet

The Altmetric Bookmarklet is a browser bookmarklet for Chrome, Firefox, or Safari that provides altmetrics directly from a journal article’s web page. The bookmarklet web page walks through the steps to install and use it. Once it is launched, the signature “Altmetric donut” is displayed, along with the “Altmetric score,” some basic altmetrics, and links to more information at the bottom, as shown in figure 2.5. The colors in the donut indicate the altmetrics sources (Twitter, Facebook, Mendeley, etc.), and the Altmetric score in the middle expresses the level of attention the article has received in one unified number, as measured by the article’s altmetrics interactions.3 The higher the score, the greater the level of attention according to Altmetric’s calculations. These numbers can, in theory, be directly compared between different journal articles.

Altmetric Bookmarklet

www.altmetric.com/bookmarklet.php

Clicking through for more details allows the user to view the individual sources that make up the displayed altmetrics and provides some key contextual information. The Score tab gives a more detailed analysis of the Altmetric score, along with ranked and percentile comparisons for the score (see figure 2.6).

Similar to the Score tab, the other tabs within the Altmetric bookmarklet break down the altmetrics data into finer detail, including individual Tweets, Facebook posts, and so on, that are included in the total for that source. This level of detail is an example of the high level of accessibility and openness prominent among altmetrics tools, a concept we’ll return to in chapter 3.

Altmetric Bookmarklet Integrations

While the bookmarklet works well as a stand-alone product for use by individuals on their Internet browsers, the same functionality has also been incorporated into an increasing number of other tools, providing seamless altmetrics data within those tools. Notable examples include Altmetric’s integration within individual journal articles in Scopus, integration with institutional repositories such as DSpace, and integration with journal articles through specific publishers such as SAGE, HighWire, and Nature Publishing Group. These collaborations give increased exposure to Altmetric and, more generally, to altmetrics data, and we expect these types of collaborations to continue to grow in the future.

Altmetric Explorer and Institutional

Altmetric not only provides altmetrics data at the individual journal article level, but also offers two products, Explorer and Institutional, that provide summaries of this data at higher levels of evaluation—that is, they allow an individual to view altmetrics data for many journal articles, grouped by authors or by source (journal). While Explorer and Institutional have slightly different interfaces, owing to their different audiences, they both allow for more meaningful analysis and comparison of altmetrics. Furthermore, this data can be filtered and sorted in many different ways, allowing for a variety of analyses to take place.

Explorer is targeted toward publishers, librarians, and authors, while Institutional is (not surprisingly) targeted toward institutions and groups, but each provides a similar service. Explorer emphasizes use of the Altmetric donuts for individual article comparisons, while Institutional favors a less journal-centric and higher-order view (see figure 2.7).

Impactstory

Impactstory (formerly known as Total-Impact) was created to help researchers demonstrate research impact using altmetrics. Accordingly, Impactstory is designed for use by these researchers (rather than departments or institutions) by collating and contextualizing a researcher’s scholarly outputs within that person’s Impactstory profile page. This profile page can then be used in any situation in which a researcher needs to demonstrate impact, such as grant applications, tenure, or promotion, or as part of a review.

Impactstory

www.impactstory.org

Although Impactstory was originally funded through several grants, the company has recently decided to charge its users a modest fee ($45 a year, though fees may be waived based on financial need). New users can sign up for a seven-day trial to set up a profile and determine whether it’s worth the cost for them.

Once a researcher has created an account, that person can add scholarly works manually or can import works from SlideShare, ORCID, Scopus, and more. Works are sorted into types of work, and the user’s home page will display an overview of all altmetrics, along with selected works highlighted in the center of the page, as shown in figure 2.8.

Impactstory will then display all available altmetrics for these works using badges like Discussed, Saved, and Viewed. Like other altmetrics harvesters, Impactstory excels in providing contextualized metrics based on raw altmetrics data it collects from other sites. If any metric is higher than 75 percent of comparable works, the badge will be designated as “Highly,” such as “Highly Viewed.” Badges can be clicked on for more detail about the comparison (see figure 2.9 for an example). As explained on the website, Impactstory will compare an article based on its primary reader group on Mendeley.4 So if an article is read primarily by people affiliated with information science, all metrics will be compared to other information science articles published that same year.
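
A conceptual sketch of that comparison rule, not Impactstory’s actual code, might look like the following: the comparison pool is drawn from works of the same year and primary Mendeley discipline, and any metric above the 75th percentile of that pool earns a “Highly” badge.

def badge(metric_name, value, comparison_values, threshold=0.75):
    # comparison_values hold the same metric for works from the same year and
    # the same primary Mendeley reader discipline (values here are invented).
    below = sum(1 for v in comparison_values if v < value)
    percentile = below / len(comparison_values)
    label = metric_name.capitalize()
    return f"Highly {label}" if percentile >= threshold else label

# Example: 1,400 views compared against other information science articles
# published the same year.
peer_views = [120, 300, 450, 900, 1100, 2500]
print(badge("viewed", 1400, peer_views))  # prints "Highly Viewed"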

PlumX

PlumX was created by two entrepreneurs to help researchers and institutions meaningfully measure and engage with altmetrics data, and it serves as a direct competitor to Altmetric Institutional. Within PlumX, altmetrics are gathered from a variety of sources, including EBSCO abstract views and downloads (which are exclusive to PlumX, since PlumX’s maker, Plum Analytics, was acquired by EBSCO in January 2014). This data is gathered for all researchers and the scholarly works (or “artifacts,” as PlumX calls them) that are entered for those researchers. Adding works for scholars functions much as it does in Impactstory: researchers and artifacts can be added by DOI, URL, or PubMed ID or uploaded from other systems such as Web of Science or Scopus. Once researchers and their artifacts have been added, researchers can be organized into groups (e.g., departments within an institution or labs within a research facility). Altmetrics data can then be viewed at the institutional level, as demonstrated in figure 2.10, as well as at the group, author, or individual artifact level.

PlumX

www.plu.mx

One of the more distinctive forms of engagement that PlumX provides is the Plum Print. This feature is designed to allow users to view types of engagement with altmetrics through a visual display—for example, the degree of social media interaction versus citations. The larger a branch of the sunburst, the greater the number of altmetrics in that category, as shown in figure 2.11.
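
The grouping behind the Plum Print can be sketched as a simple roll-up of raw counts into the five categories; the counts and the source-to-category mapping below are illustrative examples, not PlumX’s complete classification.

# Invented raw counts for one artifact.
raw_metrics = {
    "abstract_views": 820, "downloads": 310,  # Usage
    "mendeley_readers": 95,                   # Captures
    "blog_mentions": 4, "news_mentions": 1,   # Mentions
    "tweets": 57, "facebook_shares": 12,      # Social Media
    "scopus_citations": 23,                   # Citations
}

categories = {
    "Usage": ["abstract_views", "downloads"],
    "Captures": ["mendeley_readers"],
    "Mentions": ["blog_mentions", "news_mentions"],
    "Social Media": ["tweets", "facebook_shares"],
    "Citations": ["scopus_citations"],
}

# Sum each category; relative sizes drive the sunburst segments in figure 2.11.
plum_print = {cat: sum(raw_metrics[k] for k in keys) for cat, keys in categories.items()}
print(plum_print)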

Kudos

Kudos is a relatively new online platform for researchers designed to help them better market their research and track their impact over time. Through Kudos users can associate their published articles with supplemental information and other files like videos, data files, or other articles in one Kudos article web page, as shown in figure 2.12. Users can then track how the sharing of these Kudos web pages affects metrics like views and downloads (see figure 2.13).

Kudos is free for users and is supported by publishers and institutions, which pay a fee for access to their own metrics. Kudos imports and displays metrics from a variety of sources, including data from Altmetric and Thomson Reuters (for Web of Science’s times cited), along with tracking the number of views of the researcher’s Kudos web pages.

Evaluating Tools

Since the field of altmetrics is still emerging, change and experimentation are the only norms upon which we can rely, making a fully up-to-date introduction to the tools that make up the field virtually impossible. What doesn’t change, however, is the set of core values and priorities that good tools bring to this evolving environment. With that in mind, it’s important not only to be familiar with current tools, but also to be able to effectively evaluate new tools from an altmetrics perspective as they are added to the metrics landscape or evolve from their current iterations. Here are some factors to consider when assessing potential altmetrics tools.

Audience

Some tools are targeted toward the individual researcher, while others are designed for institutional use. Identifying the target audience will also help identify the intended uses, including the most likely scenarios in which this tool could be useful to your library or its users.

Cost

While the cost structure is usually relatively simple to determine, it is worthwhile to dig deeper and learn a bit more about the financial environment under which this tool operates. This will help identify tools that may implement a subscription or may be more likely to be bought by a larger company in the future.

Metrics and Accessibility

Understanding a bit about the metrics within the tool is important since metrics can tell different stories regarding research impact. For example, whether a tool is generating metrics for an abstract view versus a full-text article view versus a full-text article download can greatly change the understanding of the metric and what it says about the article itself.

Access to the metrics depends largely on whether the tool is open or closed—that is, whether registration and login are required to view personal metrics or whether the metrics can be retrieved by anyone, including altmetrics harvesting tools. Accessibility can ultimately limit the success of a tool, particularly due to “sign-up fatigue,” the reluctance to register for and maintain tool after tool. If metrics can be harvested and aggregated by one tool, there is little need to manage them within each tool that creates them.

Unique Features

Finally, learning more about what a tool can provide for its intended users helps determine its relative usefulness for those users. In other words, as the business saying goes, has the developer “built a better mousetrap” that makes this tool more useful or appealing than existing ones?

Conclusion

The altmetrics landscape comprises a diverse set of tools and resources that can be used to measure the many ways in which researchers and other people view, save, and interact with scholarly content. But, like many 21st-century innovations, the tools themselves emerge, evolve, and disappear rapidly, making it difficult to stay on top of the most recent developments. Using evaluative criteria can help those working with altmetrics better understand the benefits and downsides of using data generated from any given source. However, understanding the central altmetrics tools is only part of the landscape equation. In the next chapter, we will take a look at some of the broader topics surrounding altmetrics, including barriers to broader acceptance of altmetrics, the impact of metrics on different scholarly disciplines, and future directions for altmetrics.

Further Resources

Barker, Kimberley R., and Andrea Horne Denton. “Altmetrics: The Movement, the Tools and the Implications.” April 16, 2014. www.slideshare.net/CMHSL/altmetrics-2014415slideshare.

This presentation, from two health science librarians at the University of Virginia, does a nice job of summarizing the background of altmetrics and takes a look at many of the metrics and tools, with lots of pictures and descriptions. This presentation also serves as an excellent example of a librarian presentation, one of the many ways in which librarians can be involved with altmetrics, as we’ll discuss in greater detail in chapter 4.

Chin Roemer, Robin, and Rachel Borchardt. “From Bibliometrics to Altmetrics: A Changing Scholarly Landscape.” College and Research Libraries News 73, no. 10 (November 2012): 596–600. http://crln.acrl.org/content/73/10/596.full.

This article, written by the authors of this report, is now slightly outdated but gives a nice, succinct summary of the metrics and tools available within the fields of altmetrics and bibliometrics.

Fenner, Martin. “Altmetrics and Other Novel Measures for Measuring Scientific Impact.” In Opening Science: The Evolving Guide on How the Web is Changing Research, Collaboration and Scholarly Publishing, edited by Sönke Bartling and Sascha Friesike. Springer, 2014. http://book.openingscience.org/vision/altmetrics.html.

Fenner leads the Article-Level Metrics (ALMs) initiative at PLOS and writes frequently on the subject of altmetrics. This online book chapter does a great job of covering altmetrics sources and tools, along with helpful terminology, a research summary, and more. The entire book, Opening Science, is open to comments and revisions, so the chapter is likely to change over time.

Notes

  1. Jeff Bercovici, “Amazon vs. Book Publishers, by the Numbers,” Forbes, February 10, 2014, www.forbes.com/sites/jeffbercovici/2014/02/10/amazon-vs-book-publishers-by-the-numbers.
  2. Stacy’s publications are accessible through her Google Scholar profile: http://scholar.google.com/citations?user=eslVzYQAAAAJ&hl=en&oi=ao. Stacy is now a Research Metrics Consultant for Altmetric, an altmetrics tool covered later in this chapter.
  3. More information about the Altmetric score and how it is calculated is available on the website: https://www.altmetric.com/whatwedo.php.
  4. “‘Highly Cited’ and Other Impact Badges,” ImpactStory Feedback website, accessed March 12, 2015, http://feedback.impactstory.org/knowledgebase/articles/400281--highly-cited-and-other-impact-badges.
Figure 2.1. Amazon Best Sellers Ranks for the 2014 book Beyond Bibliometrics: Harnessing Multidimensional Indicators of Impact, including #38 in Bibliographies & Indexes.

Figure 2.2. Detailed Goodreads book metrics, including ratings, readers (“added by”), and users who have the book on their future reading list (“to-reads”).

Figure 2.3. SlideShare graph showing number of views by month since this slidedeck was uploaded in 2010.

Figure 2.4. Mendeley readership metrics for one article, including number of readers, discipline, academic status, and country.

Figure 2.5. The Altmetric bookmarklet donut shows the summary altmetrics data for this Nature journal article.

Figure 2.6. A sample Score tab displaying a detailed breakdown of the Altmetric score, including comparative percentiles for the article.

Figure 2.7. This example shows the summary of altmetrics data in Altmetric Institutional for the fictitious Lilliput University. Filtering options are along the left-hand side, while tabs for more granular detail are along the top.

Figure 2.8. Carl Boettiger’s Impactstory home page, with different types of scholarly contributions along the left, selected works in the center of the page, and key profile metrics on the right. https://impactstory.org/CarlBoettiger.

Figure 2.9. An article’s Impactstory metrics as compared to similar articles published in the same year.

Figure 2.10. An overview of PlumX altmetrics data for journal articles written by members of the Smithsonian Institution. Note the tabs for different artifact types and links to individual researcher profiles and Smithsonian organizations.

Figure 2.11. Plum Print showing Usage, Captures, Mentions, Social Media, and Citations for an individual article.

Figure 2.12. A sample Kudos article web page, with a short explanation of the article, link to the full-text download, list of other author publications, and supplementary information along the right.

Figure 2.13. This chart shows how several metrics for this article have changed over time—the A marks activities, such as sharing the article’s Kudos web page via Twitter. This helps show researchers which activities have led to increased interactions (views, downloads, etc.) with the article. In this case, the latest two activities led to an increase in people viewing the article’s Kudos web page, as well as in the number of people who download the article. Image courtesy of Kudos.

