Altmetrics is a hot buzzword. What does it mean? What's behind the buzz? What are the risks and benefits of using alternative metrics of research impact – altmetrics – in our discovery and evaluation systems? How are altmetrics being used now, and where is the field going?
This special section of the Bulletin of the Association for Information Science and Technology focuses on these questions. Essays from seven perspectives highlight the role of altmetrics in a wide variety of settings.
Before you dive in, if you are new to altmetrics, let me give you a quick informal introduction. For decades, the most common metric for evaluating research impact has been the number of times a research article is cited by other articles. This is reported either as the raw count of citations received by the article in question or through an impact-by-association proxy: the number of citations received by the journal that published the article, summarized by a formula called the journal impact factor.
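For readers who haven't met it, the impact factor is a simple ratio computed over a two-year window; sketched in notation (this is the standard two-year variant):

\[
\mathrm{JIF}_{y} \;=\; \frac{\text{citations received in year } y \text{ by items published in years } y-1 \text{ and } y-2}{\text{citable items published in years } y-1 \text{ and } y-2}
\]

So a journal whose 2011–2012 output of 320 citable items drew 1200 citations in 2013 would have a 2013 impact factor of 1200/320 = 3.75.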
Citations are not the only way to represent the impact of a research article. A few alternative indicators have been the subjects of webometrics and bibliometrics research for years, including download counts and mentions in patents. However, as scholarly communication moves increasingly online, more indicators have become available: how many times an article has been bookmarked, blogged about, cited in Wikipedia and so on. These metrics can be considered altmetrics – alternative metrics of impact. (Appropriately enough, the term altmetrics was first proposed in a tweet [https://twitter.com/jasonpriem/status/25844968813].)
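As a toy illustration (the DOI and every count below are invented, not drawn from any real service), an article's altmetric profile is essentially a set of per-source event counts sitting alongside its citation count:

# Toy altmetrics profile for one article; all values are invented.
article = {
    "doi": "10.1234/example.5678",  # hypothetical DOI
    "citations": 12,                # the traditional indicator
    "altmetrics": {                 # web-native indicators
        "bookmarks": 85,    # saves in a reference manager
        "blog_posts": 3,    # mentions on scholarly blogs
        "wikipedia": 1,     # citations from Wikipedia articles
        "tweets": 140,      # mentions on Twitter
        "downloads": 2300,  # full-text downloads
    },
}

# Evidence of impact can be reported per source, in days rather than years.
for source, count in sorted(article["altmetrics"].items()):
    print(f"{source:>10}: {count}")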
We might even consider nontraditional applications of citation metrics to be altmetrics – citations to datasets as first-class research objects, for example. Other examples include citation counts filtered by type of citation: citations from editorials, citations from review articles or citations made in the context of experimental replication. All of these are alternative indicators of impact. Compared with citation counts alone, altmetrics promise several benefits:
A more nuanced understanding of impact, showing us which scholarly products are read, discussed, saved and recommended as well as cited.
Often more timely data, showing evidence of impact in days instead of years.
A window on the impact of web-native scholarly products like datasets, software, blog posts, videos and more.
Indications of impacts on diverse audiences including scholars but also practitioners, clinicians, educators and the general public.
Of course, these indicators may not be “alternative” for long. At that point, hopefully we'll all just call them metrics.
Dive in, read all about it and let us know what you think. Continued conversation, background information and crowdsourced lists of new research and resources can be found on Twitter using the hashtag #altmetrics (https://twitter.com/search/realtime?q=%23altmetrics), in the altmetrics Mendeley group (www.mendeley.com/groups/586171/altmetrics/papers/) and probably at a conference near you.
Thanks very much to all authors in this collection for voluntarily making their articles openly available for reuse under a Creative Commons Attribution (CC-BY) license (http://creativecommons.org/licenses/by/3.0/).
Happy reading!
Heather Piwowar is a postdoc at Duke University, studying the adoption and use of open research data. She is also a co-founder of ImpactStory (http://impactstory.org/), an open-source web tool that helps scholars track and report the broader impacts of their research. @researchremix
Altmetrics: an overview and evaluation
by Ann E. Williams
The primary purpose of this paper is to provide a description, overview, and evaluation of altmetrics, an understudied yet increasingly important area of inquiry for scholars, academics, and professional researchers. The paper is organized into six parts: the first defines altmetrics and clarifies the concept in its various forms; the second examines how altmetrics work; the third presents multiple typologies under which altmetrics can be classified and studied; the fourth details the technological capabilities of altmetrics; the fifth presents a critical evaluation of the “pros and cons” of altmetrics; and the sixth outlines directions for future and ongoing research.
Altmetrics, or alternative metrics, are a new way of measuring and monitoring the reach and impact of scholarship and research through online interactions.
Altmetrics can answer questions such as: How many times has an article been downloaded or bookmarked? Who is blogging or tweeting about it? Has it been cited in Wikipedia?
Altmetrics are intended to complement, not replace, more traditional measurements of academic success (citation counts, journal prestige as captured by the impact factor, and author h-index) to give a more complete picture of how research and scholarship are used.
Research impact: Altmetrics make their mark
Alternative measures can yield useful data on achievement — but must be used cautiously
Altmetrics offer researchers a way to showcase the impact of papers that have not yet gathered many citations, and to demonstrate engagement with the public. They can be accessed through journals or independent websites, and can track the impact of particular data sets or papers, or evaluate the combined influence of publications and products produced by multiple researchers in a department.
Altmetric is a London-based start-up focused on making article-level metrics easy. Its mission is to track and analyse the online activity around scholarly literature.
Instantly get article-level metrics for any recent paper, for free. Simply drag the button below to your bookmarks bar, navigate to a journal article page, and hit "Altmetric it!"
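If you prefer scripting to bookmarklets, Altmetric also offers a free public API. Below is a minimal sketch, assuming the v1 DOI endpoint and two commonly documented response fields ("score" and "cited_by_tweeters_count"); verify the exact field names against the current API documentation. The DOI used here is purely illustrative.

# Minimal sketch: look up Altmetric data for one DOI.
# Assumes the free v1 endpoint; response field names should be
# verified against the current Altmetric API documentation.
import json
import urllib.error
import urllib.request

doi = "10.1234/example.5678"  # purely illustrative DOI
url = f"https://api.altmetric.com/v1/doi/{doi}"

try:
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    print("Title:          ", data.get("title"))
    print("Altmetric score:", data.get("score"))
    print("Tweeters:       ", data.get("cited_by_tweeters_count"))
except urllib.error.HTTPError as err:
    # The API responds 404 when it holds no data for the identifier.
    print(f"No Altmetric data for {doi} (HTTP {err.code})")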
Impactstory is a tool that gathers altmetrics from many sources under one roof. It is a way to promote, manage, and share your research and scholarship. Some researchers are using Impactstory as an alternative to a static online CV/resume.
Plum is a for-profit company that offers analytics for 20 different kinds of “artifacts,” including journal articles, book chapters, datasets, presentations and source code. It aggregates data based on a variety of sources at a variety of different levels, including artifact, author, lab, department, and journal.
Towards a common model of citation: some thoughts on merging altmetrics and bibliometrics
Reporting back: This article is based on presentations that Mike Taylor gave at the PLoS article-level metrics workshop in San Francisco and at the World Social Science Forum (WSSF) in Montreal, both in October 2013.
Further reading
Haustein, S., Peters, I., Bar-Ilan, J., Priem, J., Shema, H., & Terliesner, J. (2013, April 6). Coverage and adoption of altmetrics sources in the bibliometric community. arXiv.
Konkiel, S. (2014). The ultimate guide to staying up-to-date on your articles' impact. Impactstory blog.
Priem, J., Taraborelli, D., Groth, P., & Neylon, C. (2010). Altmetrics: A manifesto. http://altmetrics.org/manifesto/
Priem, J., Piwowar, H., & Hemminger, B. (2012, March 20). Altmetrics in the wild: Using social media to explore scholarly impact. arXiv.
Roemer, R. C., & Borchardt, R. (2012). From bibliometrics to altmetrics: A changing scholarly landscape. College & Research Libraries News, 73(10), 596-600.
Sud, P., & Thelwall, M. (2014). Evaluating altmetrics. Scientometrics, 98, 1131–1143.
Stuart, D. (2023, December 12). Have we reached the limits of altmetrics? Research Information.