Altmetrics: an overhyped fad or an important tool for evaluating scholarly output?
Potential Advantages of Altmetrics as a Measure of Scholarly Productivity
Altmetrics' reliance on the social web to evaluate scholarly publications allows researchers to receive much faster feedback on the impact of their academic work. Because of the lengthy peer-review process for academic journals, it takes at least 2 years for articles to accumulate an appreciable number of citations. While traditional bibliometric measures are tied to the print journal model, altmetrics were created specifically for the web. Heather Piwowar, the co-founder of Impactstory, describes the advantage of altmetrics over traditional bibliometrics as "evidence of impact in days instead of years." If a researcher is looking for influential articles on a topic, searching the Web of Science or Scopus and ranking articles by citations will reveal very few articles published within the last 2 years. However, the peer scholarly network Social Science Research Network (SSRN) ranks academic papers by the number of downloads. By consulting SSRN, researchers can identify the papers that are most popular among scholars long before those papers are cited in scholarly journals.
Altmetrics offer promise to researchers who are less likely to publish journal articles. Because traditional bibliometric assessments are tied to citation counts in academic journals, they are less effective for evaluating the impact of monographs or book chapters. Altmetrics provide a more diverse set of evaluations, and researchers can find metrics that are a good match for their research activities. For example, an author of a book can look at reader reviews on Goodreads, along with the number of readers in Mendeley, to gauge the influence of his or her book. Another option is to share research on Academia.edu, a social networking site with strong representation from humanities professors. Some altmetrics resources go beyond evaluating books and articles to measuring the impact of less traditional works such as datasets, interviews, and blogs. While altmetrics offer promise for evaluating scholarship outside of articles, the existing tools do a better job of evaluating journal articles than other scholarly works.
Altmetrics democratize the measurement of the impact of scholarly works because they measure how undergraduates and other lay readers view academic works. By drawing on feedback from a wide variety of sources and a diverse group of readers, altmetrics are a good example of crowdsourcing. This is in stark contrast to citations in scholarly journals, which reflect how scholars and other experts are using academic works to the exclusion of lay researchers. For instance, any reader can mention an academic study on Facebook or post a review of a book on Goodreads. Similarly, statistics on the number of times articles have been downloaded or viewed reflect the reading and research preferences of anyone who accessed the articles, rather than just professional researchers.
Another example of how altmetrics gather metrics from a broader readership is the effort to measure mentions of scholarly works on popular news websites. Non-academics are more likely to visit news websites than to read scholarly publications. By tracking how many times their works are mentioned on news websites, researchers can receive feedback on the societal impact of their work. It's worth noting that many scholarly articles are still hidden behind paywalls, and it's hard to say how many journalists, let alone laypersons, actually read the entire study. The popular press is notorious for simplifying or misinterpreting academic research. However, without news sites mentioning research studies, even fewer members of the general public would be aware of their existence.
Why is it important to measure the broader impact of scholarly work? Funding organizations would like metrics to measure the impact of the research that they fund. This is part of a larger trend of funding agencies wanting research to be more widely available. For example, some agencies are requiring that recipients publish their findings in open access (OA) sources.
Challenges Facing Altmetrics
The social web favors popular, trendy topics over obscure, complicated subjects. Works on Ebola or the refugee crisis in Europe will attract more attention on social media than a publication on hydrogen absorption by zirconium alloys at high temperatures. This bias toward popular topics is evident in Altmetric.com's annual ranking of the 100 articles with the highest altmetrics scores. Some of the highest ranked articles from 2015 include "Accelerated Modern Human-Induced Species Losses: Entering the Sixth Mass Extinction" and "Shaping the Oral Microbiota Through Intimate Kissing." Nearly all of the articles on the list were likely to be of interest to a broad readership. Complicated articles such as "Exploration of Highly Active Bidentate Ligands for Iron (III)-Catalyzed ATRP" are only of interest to experts in a certain field. Researchers may wonder if highly cited articles will have strong altmetrics scores. Preliminary research suggests that articles with the most citations in Web of Science didn't always have high altmetrics scores.
Most researchers are more familiar with traditional bibliometrics than altmetrics. This is understandable, considering the term "altmetrics" was only coined in 2010. Scholars might find altmetrics indicators interesting, but they also want to know how well they correlate with bibliometric measures they are more familiar with. Despite the shortcomings of bibliometric measures, many researchers will be more comfortable with scholarly citations than with altmetrics. Multiple studies examining the correlation between scholarly citations and altmetrics indicators show a positive, but weak to moderate, correlation between citation counts and altmetrics.
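The kind of weak positive correlation those studies report can be illustrated with a small computation. The sketch below implements Spearman's rank correlation in plain Python; the per-article counts are invented for illustration and are not drawn from any of the cited studies:

```python
def spearman_rho(xs, ys):
    """Spearman rank correlation for tie-free data:
    rho = 1 - 6 * sum(d^2) / (n * (n^2 - 1))."""
    def ranks(vals):
        order = sorted(range(len(vals)), key=lambda i: vals[i])
        r = [0] * len(vals)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r

    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical per-article counts (illustrative only, not real study data)
citations = [100, 80, 60, 40, 20]
altmetric_mentions = [10, 50, 20, 5, 15]

print(round(spearman_rho(citations, altmetric_mentions), 2))  # 0.2
```

A rho of 0.2 is positive but weak: high-citation articles tend, only loosely, to attract more social web attention, which is the pattern the correlation studies describe.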
Gaining acceptance from faculty tenure committees will be crucial for altmetrics to move from the curiosity stage to a serious tool for evaluating scholarly output. Librarians can make faculty members aware of altmetrics, but, ultimately, professors will decide their importance for promotion decisions. Even though altmetrics offer hope for evaluating works in the fine arts and humanities, those in these fields might be more resistant to quantitative measures than social science and STEM professors.
It's important to realize that altmetrics are a diverse collection of tools. A professor might be contemptuous of mentions on Twitter but receptive to statistics on article downloads. Faculty members will have to decide which altmetrics statistics they want to use for evaluating scholarly impact. How much weight should be given to readers in Mendeley versus article downloads? Should Twitter and Facebook statistics be considered?
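A committee's answers to those questions amount to choosing weights for a composite score. The sketch below shows one way that choice could be expressed; the indicator names and weights are hypothetical, not a standard:

```python
# Hypothetical weights a tenure committee might agree on; nothing here
# is an established standard, and Twitter/Facebook are deliberately
# excluded in this example.
WEIGHTS = {"mendeley_readers": 0.5, "downloads": 0.3, "tweets": 0.2}

def composite_score(metrics, weights=WEIGHTS):
    """Weighted sum over whichever indicators the committee chose to count;
    indicators without an assigned weight contribute nothing."""
    return sum(weights.get(name, 0.0) * value for name, value in metrics.items())

article = {"mendeley_readers": 40, "downloads": 200, "tweets": 15}
print(composite_score(article))  # 0.5*40 + 0.3*200 + 0.2*15 = 83.0
```

The point of the sketch is that the weighting is a policy decision, not a technical one: two committees with different weights will rank the same article differently.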
Another obstacle to faculty accepting altmetrics is the concern about indicators being gamed by researchers to falsify the influence of their work. Fears about scholars or publishers falsifying scholarly impact metrics are not limited to altmetrics. It's also possible to manipulate bibliometric indicators; in an experiment, researchers successfully created fake papers in Google Scholar to fabricate citations. In addition, publishers have manipulated citations to increase the impact factor of their journals.
The Outlook for Altmetrics
As it stands, the following observations can be made:
* Altmetrics provide flexibility in evaluating researchers--Some scholarly works will be difficult to evaluate based on how many times they have been cited in scholarly publications. It's valuable to have additional options when h-indexes, impact factors, and scholarly citations fail to properly evaluate a researcher's works.
* Standardizing altmetrics has pros and cons--More agreement about which specific measures should be used for evaluating academic works will help improve the authority of altmetrics and make it easier to compare researchers. An important advantage of altmetrics is granting researchers the flexibility to use metrics that are a good fit for them. Attempts to centralize altmetrics should provide loose definitions that account for disciplinary differences.
* Altmetrics are connected to recent scholarly publishing trends--OA journals from publishers such as PLOS have aggressively adopted altmetric indicators. Altmetrics harvesters are using ORCID IDs, which are unique author identifiers, to collect authors' works.
* Altmetrics are better at measuring a work's attention than its quality--Critics correctly point out that article views and social media mentions are not indicators of a work's quality. However, bibliometrics are not an indicator of quality either. An article published in a journal with a high impact factor is not necessarily a quality article.
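The harvesting pattern mentioned above can be sketched in a few lines. The code below only builds request URLs for Altmetric's free public DOI endpoint; the endpoint shape (api.altmetric.com/v1/doi/{doi}) is an assumption based on Altmetric's documented v1 API, and a real harvester would first pull the DOI list from an author's ORCID record before fetching each URL:

```python
from urllib.parse import quote

# Assumed shape of Altmetric's free public v1 API; verify against the
# current API documentation before relying on it.
ALTMETRIC_BASE = "https://api.altmetric.com/v1/doi/"

def altmetric_url(doi):
    """Build the request URL for one DOI, percent-encoding reserved
    characters but leaving the DOI's internal slash intact."""
    return ALTMETRIC_BASE + quote(doi, safe="/")

# DOIs taken from this article's reference list, as a harvester might
# receive them from an ORCID works record.
dois = ["10.1002/asi.23309", "10.1016/j.joi.2014.09.005"]
for doi in dois:
    print(altmetric_url(doi))
```

No network request is made here; the sketch stops at URL construction to stay independent of the live service.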
Altmetrics are a valuable addition to traditional bibliometrics as a resource for evaluating scholarly impact because they provide much faster feedback. Altmetrics have the potential to be especially helpful in evaluating the impact of scholarly works such as books, performances, and visual works that have been neglected by traditional bibliometrics. My prediction is that altmetrics will complement rather than replace the impact factor, h-indexes, and scholarly citations. It will be an uphill battle for altmetrics indicators to be given the same weight as citations in the scholarly world. If academia resists altmetrics, it will be swimming against the tide of important trends such as the social web, OA publishing, and the democratization of information.
Altmetric (2015). The Altmetrics Top 100: What Academic Research Caught the Public Imagination in 2015? Retrieved from altmetric.com/top100/2015.
Boon, C.Y., and Foon, J.J. (2014). "Altmetrics Is an Indication of Quality Research or Just Hot Topics." IATUL Annual Conference Proceedings, (35), 1-8.
Bornmann, L. (2014). "Do Altmetrics Point to the Broader Impact of Research? An Overview of Benefits and Disadvantages of Altmetrics." Journal of Informetrics, 8(4), 895-903. DOI:10.1016/j.joi.2014.09.005.
Carpenter, T.A. (2012, Nov. 14). Altmetrics--Replacing the Impact Factor Is Not the Only Point. Retrieved from scholarlykitchen.sspnet.org/2012/11/14/altmetrics-replacing-the-impact-factor-is-not-the-only-point.
Colquhoun, D. (2014, Jan. 16). Why You Should Ignore Altmetrics and Other Bibliometric Nightmares. Retrieved from dcscience.net/2014/01/16/why-you-should-ignore-altmetrics-and-other-bibliometric-nightmares.
Costas, R., Zahedi, Z., and Wouters, P. (2015). "Do 'Altmetrics' Correlate With Citations? Extensive Comparison of Altmetric Indicators With Citations From a Multidisciplinary Perspective." Journal of the Association for Information Science and Technology, 66(10), 2003-2019. DOI:10.1002/asi.23309.
Delgado López-Cózar, E., Robinson-García, N., and Torres-Salinas, D. (2014). "The Google Scholar Experiment: How to Index False Papers and Manipulate Bibliometric Indicators." Journal of the Association for Information Science and Technology, 65(3), 446-454. DOI:10.1002/asi.23056.
Issues, Controversies, and Opportunities for Altmetrics (2015). Library Technology Reports, 51(5), 20-30.
Piwowar, H. (2013). "Altmetrics: What, Why and Where?" Bulletin of the American Society for Information Science and Technology (ASIS&T). Retrieved from asis.org/Bulletin/Apr-13/AprMay13_Piwowar.html.
Roemer, R.C., and Borchardt, R. (2015). Meaningful Metrics: A 21st Century Librarian's Guide to Bibliometrics, Altmetrics, and Research Impact. Chicago: ACRL.
Marc Vinyard (marc.vinyard@pepperdine.edu) is the reference and instruction librarian at Pepperdine University Libraries. His research interests include altmetrics, bibliometrics, library instruction, assessment of reference services, and business information. He has written articles for The Charleston Advisor, Searcher, and the Journal of Library Administration.
ALTMETRICS RESOURCES TO EXPLORE

Information professionals have many free altmetrics resources they can work with. Librarians may have already been using some of these resources without realizing they fall under the altmetrics mantle.

* Altmetric bookmarklet (altmetric.com)--Provides statistics for social media, Mendeley, and CiteULike. Librarians can register for a free account with Altmetric Explorer, which has more search capabilities.
* ResearchGate (researchgate.net)--Peer scholarly network with a heavy STEM presence; provides metrics on readers.
* Academia.edu (academia.edu)--Peer scholarly network for the humanities.
* Publisher websites (varies)--Many publishers track how many times articles have been downloaded, and they have partnered with Altmetric.
* Institutional repositories (varies)--Readership reports on downloads of documents.
* Mendeley (mendeley.com)--Reference manager that provides statistics on the number of readers.
* CiteULike (citeulike.org)--Statistics on how many works have been bookmarked.
* Social Science Research Network (SSRN) (ssrn.com)--Rapid dissemination of academic papers, with rankings by document views.
* Impactstory (impactstory.org)--Collects social media metrics.
* Open Syllabus Project (opensyllabusproject.org)--Search more than 1 million course syllabi to discover books on reading lists.

Libraries that have a strong interest in promoting altmetrics could subscribe to the commercial PlumX database, which is one of the most comprehensive resources for altmetrics. This resource is an altmetrics harvester that pulls data from a variety of places.
Publication: Computers in Libraries, Dec. 1, 2016