https://doi.org/10.25547/FE3W-AG83

Read it in French

This observation was written by Caroline Winter, with thanks to Mike Taylor for his feedback and contributions.

At a glance:

Title: Altmetrics for research evaluation
Creator: n/a
Publication Date: n/a
Keywords: research evaluation, open infrastructure, scholarly communication

In recent years, there have been numerous calls to change research evaluation policies to rely less on journal-level citation metrics such as the Journal Impact Factor (JIF), including two key international initiatives. The San Francisco Declaration on Research Assessment (DORA), developed at the Annual Meeting of the American Society for Cell Biology in 2012, calls for the use of article-level metrics, greater transparency in research evaluation policies and procedures, and consideration of all types of research outputs, not just journal articles.

The Leiden Manifesto, authored by Diana Hicks, Paul Wouters, Ludo Waltman, Sarah de Rijcke, and Ismael Rafols and published in Nature in 2015, calls attention to the problem that research evaluation—at the level of the individual, the journal, and the institution—is increasingly reliant on metrics that are often applied and interpreted inappropriately. It presents ten principles to guide research evaluation policy at all levels, emphasizing that quantitative metrics and qualitative measures should be used in combination to produce well-informed, fully contextualized research assessment.

Many in the scholarly community have proposed using alternative metrics, or altmetrics, alongside citation metrics to quantify research impact in a more nuanced way. In their study of metrics use in review, promotion, and tenure, for example, Stacy Konkiel, Cassidy Sugimoto, and Sierra Williams (2016) note that altmetrics can provide a fuller picture of how research is used by groups outside the academy, such as policy makers and the public, and can capture engagement with research outputs other than journal articles, such as code, reports, datasets, and software. They note, echoing concerns held by the broader academic community, that quantitative measures alone cannot adequately capture a complete picture of research impact.

About Altmetrics

Unlike citation metrics, which trace citations to formal scholarly publications (primarily journal articles) in other formal scholarly publications, altmetrics trace various forms of engagement with many types of scholarly work, not only articles and monographs but also datasets, code, and educational resources.

As Jason Priem, Dario Taraborelli, Paul Groth, and Cameron Neylon point out in Altmetrics: A Manifesto (2010), because much scholarly and public discourse now takes place online, it is possible to capture data about forms of engagement with scholarly research that could not be captured or measured before. The manifesto notes that, unlike citation counts, which are slow moving, relatively narrow, and often lacking in transparency, altmetrics accumulate quickly and reflect the diversity of the scholarly communication ecosystem by looking at outputs beyond formal scholarly publications.

“Altmetrics” is an umbrella term that refers to a variety of metrics measuring many types of engagement, building on the fields of bibliometrics, scientometrics, and webometrics (Taylor 2020). These types of engagement are often categorized as follows:

  • Usage: downloads, views
  • Captures: bookmarks, saves, shares
  • Mentions: in blogs, news articles, Wikipedia
  • Social media: shares and likes on Facebook, Instagram, LinkedIn, Pinterest, Twitter
  • Citations: in scholarly resources and databases, as well as in reports, patents, policy documents, and other forms of grey literature (King and Thuna 2013; Tananbaum 2013)

Altmetrics data is generated using application programming interfaces (APIs), which harvest data about usage, captures, mentions, and other forms of engagement from a variety of online sources.

Data can be collected only for research with a persistent identifier (PID), such as a DOI, and only when that PID is used (see “The UK Persistent Identifier (PID) Consortium” and “ORCID Update: Integrating ORCID iDs into Research Funding Workflows”). This means that information for books, book chapters, and other resources that do not have DOIs or other PIDs cannot be collected (see Taylor 2020).
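
To illustrate the central role of PIDs in this process, the brief sketch below shows how a script might look up attention data for a single DOI through a public altmetrics endpoint. It is an illustration only: it assumes Altmetric’s free details endpoint (https://api.altmetric.com/v1/doi/<doi>) and the response field names shown in the comments, all of which should be verified against the provider’s current documentation.

```python
# Minimal sketch: look up attention data for one research output by its DOI.
# Assumes Altmetric's public details endpoint; the endpoint, rate limits, and
# field names are assumptions to check against current documentation.
from typing import Optional

import requests

def fetch_altmetrics(doi: str) -> Optional[dict]:
    """Return a small summary of attention data for a DOI, or None if not tracked."""
    resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=10)
    if resp.status_code == 404:
        # No PID match: outputs without a (used) DOI simply cannot be found.
        return None
    resp.raise_for_status()
    data = resp.json()
    return {
        "title": data.get("title"),
        "score": data.get("score"),                       # weighted attention score
        "news": data.get("cited_by_msm_count", 0),        # mainstream media mentions
        "tweets": data.get("cited_by_tweeters_count", 0),
        "policy": data.get("cited_by_policies_count", 0),
    }

if __name__ == "__main__":
    print(fetch_altmetrics("10.1007/s11192-020-03735-8"))  # Taylor 2020, cited below
```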

As altmetrics have gained currency over the past decade, numerous tools and services have been developed to collect, analyze, and share the data. The altmetrics landscape is evolving rapidly, but several dedicated altmetrics tools and services are in wide use at the time of writing.

Altmetrics and Policy

The Austrian Transition to Open Access 2 (AT2OA2), a national initiative to transition research publications to open access, is partnering with Altmetric to study the effect of open access on the visibility of publications using altmetrics. This appears to be the first instance of altmetrics being used in a national research evaluation program (Day 2021).

In addition to their relevance to shifts in research evaluation policy, altmetrics also make visible the impact of research on policy development. Altmetric, for instance, collects data from policy documents, and its founder, Euan Adie, launched the policy database Overton in 2019, which makes it possible to trace references to scholarly publications in various types of policy documents at various levels of granularity, including by discipline and field. One analysis revealed, for instance, that the greatest number of research citations in Overton’s corpus were to social sciences and humanities research (Szomszor and Adie 2022).

Altmetrics for Research Evaluation in the Press

In an article in Science, Jeffrey Brainard (2022) explores researchers’ use of Twitter for sharing research about COVID-19 (see “Open Scholarship and COVID-19”). He notes that, although the evidence linking altmetric data to citation practices is still mixed, such data does demonstrate public interest and engagement in research.

An article in Forbes by Alex Zhavoronkov (2022) focuses on attention rather than on research impact, bringing in a discussion of celebrity scientists. It includes a statement from Kathy Christian of Altmetric that researchers should aim to reach the audiences most interested in their work rather than trying to increase their Altmetric score for its own sake.

Altmetrics and research evaluation have also been covered in the academic press, such as in the Chronicle: a 2013 article describes one researcher’s experiments with including altmetric data in his tenure package, and an interview with Jason Priem from 2016 discusses how the altmetrics landscape has changed since they were first introduced. A 2017 article in Research Information expresses cautious optimism for the possibilities offered by altmetrics and emphasizes that, having learned how metrics can be misused, altmetrics providers, librarians, and others in the research community recognize the importance of education and training about what metrics mean and how they should be interpreted.

Responses from the INKE Community

Altmetrics have long been a topic of interest for INKE community members. In 2013, the Canadian Association of Research Libraries (CARL) released Altmetrics in Context, a primer aimed at researchers that outlines the basic principles of altmetrics and some relevant tools and services (King and Thuna 2013).

In 2019, the Public Knowledge Project (PKP) announced the launch of Paperbuzz, an open plugin for Open Journal Systems (OJS) built in partnership with Impactstory that draws on data from the not-for-profit organization Crossref. Impactstory is an open source altmetrics tool developed by OurResearch, the nonprofit organization that created Unpaywall.
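
As a rough illustration of the kind of open data source that a Paperbuzz-style plugin draws on, the sketch below queries the Crossref Event Data API for events associated with a DOI and tallies them by source. The endpoint, parameter names (obj-id, mailto, rows), and response structure are assumptions to check against Crossref’s documentation, and Paperbuzz’s own API may differ.

```python
# Hedged sketch: count open engagement events for a DOI via Crossref Event Data,
# the kind of open source Paperbuzz builds on. The endpoint and field names
# (obj-id, mailto, message.events, source_id) are assumptions to verify.
from collections import Counter

import requests

API = "https://api.eventdata.crossref.org/v1/events"

def count_events_by_source(doi: str, email: str) -> Counter:
    """Tally events (Wikipedia, blogs, news, etc.) recorded for one DOI."""
    params = {"obj-id": doi, "mailto": email, "rows": 1000}
    resp = requests.get(API, params=params, timeout=30)
    resp.raise_for_status()
    events = resp.json().get("message", {}).get("events", [])
    return Counter(event.get("source_id", "unknown") for event in events)

if __name__ == "__main__":
    print(count_events_by_source("10.1371/journal.pone.0115253", "you@example.org"))
```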

A study by Paul Arthur and Lydia Hearn (2021) highlights the relationships between altmetrics and policy, noting that altmetrics enable the evaluation of the types of impacts emphasized in policies such as the Research Excellence Framework (REF) in the UK and the Crowdsourcing and Citizen Science Act (2016) in the US. Arthur and Hearn note that “the policy environment is shifting, and moves are underway to develop more open, congruent ways for universities to reshape their assessment of research for societal benefit, but many barriers still remain.” Some of these barriers include a lack of quality data (dynamic data aggregated from complex, heterogeneous sources) as well as dependence on commercial tools and data sources. The incompleteness of available datasets is another concern, since only research outputs that have PIDs can be tracked, and data is collected only from selected online platforms. Arthur and Hearn also point out that altmetrics measure only the flow of information into the online sphere; they do not capture the two-way flow of information that constitutes true community engagement.

The Social Media Engine, led by Luis Meneses with CO.SHS, provides a use case for leveraging altmetric data for greater research impact. It generates a list of topics from Érudit’s corpus of publications in the humanities and social sciences, tweets links to OA publications related to topics discussed on social media, and monitors data from Altmetric to identify trending publications (Meneses et al. 2019).
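
The minimal sketch below is a purely hypothetical skeleton of this kind of workflow, not the Social Media Engine’s actual implementation: it assumes a simple record structure with topic labels and an attention score (for example, drawn from an altmetrics provider) and selects open access items that overlap trending topics or show notable attention.

```python
# Purely illustrative skeleton of a Social Media Engine-style selection step,
# NOT the project's actual implementation. The record fields, threshold, and
# topic matching are hypothetical simplifications.
from dataclasses import dataclass

@dataclass
class Record:
    title: str
    url: str
    topics: set[str]
    attention_score: float  # e.g., pulled from an altmetrics provider

def select_for_promotion(records: list[Record], trending: set[str],
                         score_threshold: float = 10.0) -> list[Record]:
    """Pick OA records that overlap trending topics or show notable attention."""
    return [r for r in records
            if (r.topics & trending) or r.attention_score >= score_threshold]

if __name__ == "__main__":
    corpus = [Record("Open data in the humanities", "https://example.org/1",
                     {"open data", "humanities"}, 4.0),
              Record("Citizen science and policy", "https://example.org/2",
                     {"citizen science"}, 22.0)]
    for record in select_for_promotion(corpus, trending={"open data"}):
        print(record.title, record.url)  # in practice, queued for sharing online
```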

Responses from the Broader Academic Community

In 2013, SPARC released a primer about article-level metrics that clarifies how they are distinct from altmetrics and discusses their possibilities and limitations.

Altmetrics and research evaluation are also of interest to the European Commission, which established an Expert Group on Altmetrics in 2016. Its report Next-Generation Metrics: Responsible Metrics and Evaluation for Open Science (2017) emphasizes that metrics will have a key role in advancing open scholarship but that new and existing metrics must be used responsibly and transparently. It refers to recent initiatives to explore the challenges associated with the use of metrics, including DORA and the Leiden Manifesto. It also refers to the Dutch movement Science in Transition, founded in 2013, which focuses on improving the reproducibility and overall quality of scientific research, and to The Metric Tide, a 2015 report about the role of metrics in research assessment in the UK.

The ScholCommLab has a research stream dedicated to Altmetrics and Societal Impact, and its numerous related publications address issues including social media metrics, altmetrics and citation patterns, and the role of metrics in review, promotion, and tenure policies.

Metrics for Research Evaluation and Open Scholarship

Altmetrics have a key role to play in advancing open scholarship as part of the digital infrastructure upon which open scholarship is built. For instance, altmetrics can enhance understanding of the advantages of OA publishing: Michael Taylor (2020) describes an Open Access Altmetrics Advantage, in which open access articles, books, and book chapters tend to receive more of the kinds of online attention that altmetrics capture.

In his talk about library services and altmetrics for open scholarship, David Walters (2015) argues that there is a reciprocal relationship between altmetrics and open scholarship: links are more likely to be shared when they lead to open content rather than to paywalls, for instance, and research shared on social media tends to be cited more often. He notes that libraries are key drivers of the cultural change required for advancing open scholarship, including through their adoption of innovations such as altmetrics.

Altmetrics offer incentives for doing open scholarship. In an institutional context, including altmetrics in review, promotion, and tenure policies and processes would provide greater recognition for the diverse forms of research outputs that researchers produce and their impact beyond the scholarly community (see “The Review, Promotion, and Tenure Project at the ScholCommLab”). This would also help drive a cultural shift toward valuing those forms of outputs and impacts (see Miedema et al. 2018).

One of the points raised in the European Commission’s Mutual Learning Exercise (MLE) on Open Science: Altmetrics and Rewards was that the momentum behind altmetrics points to an ongoing shift in how research impact is evaluated, moving away from a more insular understanding of impact within the research community toward one that foregrounds societal impact in an international policy context (Miedema et al. 2018).

In addition to the danger of altmetrics being misused as citation metrics have been, however, the academic community’s dependence on commercial publishers for altmetrics data and analysis has been cited as a barrier to open scholarship (Haustein 2016; Konkiel et al. 2014). Open and transparent metrics are an essential component of the open scholarship ecosystem, as emphasized in The Metric Tide report (Wilsdon et al. 2015).

Also, as the MLE report by Frank Miedema et al. points out, “it is extremely difficult for researchers to adopt Open Science practices without a broad institutional shift in support and evaluation structures governing their work” (3). As the research metrics landscape continues to develop, discussions about altmetrics and their role in research evaluation and other related policies open up larger questions about how to define and measure research impact, what kinds of impact are more valued, and even the role of research in society.

Works Cited

Arthur, Paul Longley, and Lydia Hearn. 2021. “Reshaping How Universities Can Evaluate the Research Impact of Open Humanities for Societal Benefit.” The Journal of Electronic Publishing 24 (1). https://doi.org/10.3998/jep.788.

Brainard, Jeffrey. 2022. “Riding the Twitter Wave.” Science, March 24, 2022. https://www.science.org/content/article/twitter-transformed-science-communication-pandemic-will-last.

Day, Laura. 2021. “Altmetric Partners with Austria’s National Open Access Transformation Initiative.” Altmetric News (blog). November 22, 2021. https://www.altmetric.com/news/altmetric-partners-with-austrias-national-open-access-transformation-initiative/.

DORA (San Francisco Declaration on Research Assessment). n.d. “Read the Declaration.” https://sfdora.org/read/.

European Commission, Directorate-General for Research and Innovation, James Wilsdon, Judit Bar-Ilan, Robert Frodeman, Elisabeth Lex, Isabella Peters, and Paul Wouters. 2017. Next-Generation Metrics: Responsible Metrics and Evaluation for Open Science. Luxembourg: Publications Office of the European Union. https://data.europa.eu/doi/10.2777/337729.

Haustein, Stefanie. 2016. “Grand Challenges in Altmetrics: Heterogeneity, Data Quality and Dependencies.” Scientometrics 108 (1): 413–23. https://doi.org/10.1007/s11192-016-1910-9.

Hicks, Diana, Paul Wouters, Ludo Waltman, Sarah de Rijcke, and Ismael Rafols. 2015. “Bibliometrics: The Leiden Manifesto for Research Metrics.” Nature 520 (7548): 429–31. https://doi.org/10.1038/520429a.

King, Pam, and Mindy Thuna. 2013. Altmetrics in Context. Canadian Association of Research Libraries. https://www.carl-abrc.ca/doc/CARL2013-altmetrics-EN-FA.pdf.

Konkiel, Stacy, Cassidy Sugimoto, and Sierra Williams. 2016. “The Use of Altmetrics in Promotion and Tenure.” Educause Review. https://er.educause.edu/articles/2016/3/the-use-of-altmetrics-in-promotion-and-tenure.

Meneses, Luis, Alyssa Arbuckle, Hector Lopez, Belaid Moa, Ray Siemens, and Richard Furuta. 2019. “Social Media Engine: Extending Our Methodology into Other Objects of Scholarship.” Pop! Public. Open. Participatory, no. 1 (October). https://popjournal.ca/issue01/meneses.

Miedema, Frank, Katja Mayer, Kim Holmberg, and Sabina Leonelli. 2018. “Mutual Learning Exercise: Open Science: Altmetrics and Rewards.” European Commission. https://ec.europa.eu/research-and-innovation/sites/default/files/rio/report/MLE%20OS_Final%20Report_0.pdf.

Priem, Jason, Dario Taraborelli, Paul Groth, and Cameron Neylon. 2010. Altmetrics: A Manifesto. http://altmetrics.org/manifesto/.

Szomszor, Martin, and Euan Adie. 2022. “Overton — A Bibliometric Database of Policy Document Citations.” arXiv. https://doi.org/10.48550/arXiv.2201.07643.

Tananbaum, Greg. 2013. Article-Level Metrics: A SPARC Primer. SPARC. https://sparcopen.org/wp-content/uploads/2016/01/SPARC-ALM-Primer.pdf.

Taylor, Michael. 2020. “An Altmetric Attention Advantage for Open Access Books in the Humanities and Social Sciences.” Scientometrics 125 (3): 2523–2543. https://doi.org/10.1007/s11192-020-03735-8.

Walters, David. 2015. “Institutional Services and Altmetrics as Drivers for a Cultural Transition to Open Scholarship.” Altmetric 2:am Conference. https://www.youtube.com/watch?v=BNkGlWUCUiU.

Wilsdon, James, Liz Allen, Eleonora Belfiore, Philip Campbell, Stephen Curry, Steven Hill, Richard Jones, et al. 2015. The Metric Tide: Report of the Independent Review of the Role of Metrics in Research Assessment and Management. Higher Education Funding Council for England. http://www.hefce.ac.uk/pubs/rereports/Year/2015/metrictide/.

Zhavoronkov, Alex. 2022. “Measuring Attention In Science And Technology.” Forbes, March 17, 2022. https://www.forbes.com/sites/alexzhavoronkov/2022/03/17/measuring-attention-in-science-and-technology/.