Altmetrics (Chapter from Beyond Bibliometrics: Harnessing Multidimensional Indicators of Scholarly Impact)

This chapter discusses altmetrics (short for “alternative metrics”), an approach to uncovering previously invisible traces of scholarly impact by observing activity in online tools and systems. I argue that citations, while useful, miss many important kinds of impacts, and that the increasing scholarly use of online tools like Mendeley, Twitter, and blogs may allow us to measure these hidden impacts. Next, I define altmetrics and discuss research on altmetric sources: both research mapping the growth of these sources and scientometric research measuring activity on them. Following a discussion of the potential uses of altmetrics, I consider the limitations of altmetrics and recommend areas ripe for future research.


💡 Research Summary

The chapter provides a comprehensive overview of altmetrics—alternative metrics that capture scholarly impact through online activity such as reference‑manager saves, micro‑blog mentions, and blog discussions. The author begins by critiquing traditional citation‑based bibliometrics, arguing that citations, while valuable for tracing scholarly knowledge flows, miss many forms of influence that occur outside the scholarly literature, including policy uptake, educational use, industry adoption, and public discourse.

Altmetrics are defined as quantitative indicators derived from the digital traces left when scholars and the broader public interact with research on platforms like Mendeley, Twitter, Reddit, blogs, news outlets, and patent databases. The chapter outlines two major strands of altmetrics research. The first maps the growth, structure, and disciplinary distribution of altmetric sources, often using data from commercial aggregators (Altmetric.com, PlumX) and open event‑data services (Crossref Event Data). The second investigates correlations between altmetric signals and conventional citations. Empirical studies consistently show a moderate positive correlation between Mendeley saves and later citations, suggesting that saves reflect early interest that can translate into scholarly impact. Twitter mentions, by contrast, are strongly associated with short‑term attention but have limited predictive power for long‑term citation counts. Blog posts and news articles tend to signal broader societal relevance, linking research to policy documents, patents, or public debates.
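Validation studies of this kind typically report a rank correlation (usually Spearman's rho, since count data are heavily skewed) between an altmetric signal and later citations. A minimal pure-Python sketch of that calculation, using invented per-article counts and an illustrative `spearman_rho` helper (neither the numbers nor the code come from the chapter):

```python
# Illustrative sketch of the rank-correlation analysis used in altmetrics
# validation studies: comparing per-article Mendeley saves against later
# citation counts. All numbers below are invented for demonstration.

def ranks(values):
    """Return 1-based ranks of values (this toy data has no ties)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman_rho(xs, ys):
    """Spearman rank correlation for tie-free data:
    rho = 1 - 6 * sum(d^2) / (n * (n^2 - 1))."""
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

# Hypothetical per-article counts (not real study data).
mendeley_saves = [12, 45, 3, 80, 22, 7, 55, 30, 2, 18]
citations = [5, 8, 1, 60, 30, 4, 12, 15, 0, 9]

rho = spearman_rho(mendeley_saves, citations)
print(f"Spearman rho = {rho:.3f}")  # → Spearman rho = 0.855
```

In real studies the "moderate positive correlation" reported for Mendeley saves is computed the same way, only across thousands of articles and with tie-corrected estimators from a statistics library.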

Potential applications are explored in depth. For individual researchers, altmetrics provide real‑time feedback on how their work is being discussed, shared, and applied, allowing them to showcase a more diverse impact portfolio in tenure and grant applications. Institutions can aggregate altmetric data to visualize departmental or institutional digital influence, inform strategic hiring, and identify emerging research areas. Publishers use altmetrics as marketing tools to highlight article visibility, encourage author engagement, and differentiate journals. Funding agencies and research‑assessment bodies are encouraged to incorporate altmetrics alongside citations, peer review, and societal‑impact narratives to develop multidimensional evaluation frameworks that recognize contributions beyond academia.

The chapter does not shy away from limitations. Data transparency is a major concern because many altmetric providers operate as black boxes, offering limited insight into data collection methods and coverage. The prevalence of automated bots, coordinated promotion campaigns, and self‑citation can inflate altmetric scores, raising questions about validity. Disciplinary, linguistic, and geographic biases are evident: STEM fields and English‑language outputs dominate altmetric activity, potentially marginalizing humanities, social sciences, and non‑English scholarship. Moreover, altmetrics tend to capture “visibility” rather than “substantive influence,” risking over‑interpretation of superficial engagement as meaningful impact.

To address these challenges, the author proposes several research agendas. First, the development of open standards and APIs for altmetric data, coupled with robust spam‑detection algorithms, would improve reliability and reproducibility. Second, integrative models that combine altmetrics with citations, patent citations, policy citations, and qualitative assessments (e.g., content analysis, sentiment analysis) could offer a richer picture of impact. Third, longitudinal studies are needed to test whether early altmetric signals genuinely predict downstream societal outcomes such as policy changes, technology transfer, or public health improvements. Fourth, mixed‑methods approaches that triangulate quantitative altmetric scores with qualitative case studies can help distinguish genuine influence from mere buzz. Finally, ethical and legal frameworks must be established to protect privacy, ensure informed consent for data use, and prevent misuse of altmetric indicators in high‑stakes evaluation contexts.
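The spam-detection agenda above can be made concrete with a toy heuristic (my illustration, not a method from the chapter): discount mentions from accounts that post at implausibly high volume. The event structure and the per-account cap are invented for the sketch:

```python
# Toy illustration of one simple bot-filtering heuristic for an altmetric
# event stream: drop mentions from accounts whose total mention volume
# exceeds a cap. The schema and threshold are hypothetical.
from collections import Counter

def filter_suspect_mentions(events, max_mentions_per_account=50):
    """Keep only events from accounts at or below a per-account mention cap."""
    per_account = Counter(e["account"] for e in events)
    return [e for e in events
            if per_account[e["account"]] <= max_mentions_per_account]

# Hypothetical event stream: one bot-like account plus two ordinary readers.
events = ([{"account": "bot_123", "doi": f"10.1234/a{i}"} for i in range(200)]
          + [{"account": "reader_a", "doi": "10.1234/x"},
             {"account": "reader_b", "doi": "10.1234/y"}])

clean = filter_suspect_mentions(events)
print(len(events), "->", len(clean))  # → 202 -> 2
```

Production systems combine many such signals (posting cadence, account age, text similarity across posts), but the core idea is the same: separate organic engagement from automated amplification before scores are computed.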

In sum, the chapter positions altmetrics as a promising, albeit still nascent, complement to traditional bibliometrics. By capturing the multidimensional pathways through which research diffuses into the digital public sphere, altmetrics can enrich our understanding of scholarly impact and inform more holistic evaluation practices—provided that methodological rigor, transparency, and ethical safeguards are put in place.

