Citations — how often a research paper is referenced by others — have long been a quantitative proxy for academic impact, underpinning much of the research evaluation used by universities, funders, journals and rankings.
Altmetrics, as their name suggests, promise an alternative: a faster, broader view of how research travels beyond academia, albeit one that critics say blurs quality with noise.
“Altmetrics capture real-time signals of how research is picked up in the wider world: through news coverage, government reports, social media debate, etc,” says Ann Campbell, director of research impact and comparative analytics at Digital Science, a London-based research technology company that owns the online tool Altmetric.
Altmetric tracks about 250mn (and rising) mentions of research, from fleeting social media posts to citations in patents and policy documents. Mentions are linked to DOIs, the standard digital IDs pegged to academic work. When those are missing, Altmetric relies on text searches of author names, journal titles and dates to make the match.
Proponents say the appeal is breadth: altmetrics track influence that citations may overlook, from local policy debates to clinical guidelines. “Not everything that matters is a journal article,” Campbell says. “Altmetrics capture when research travels beyond the academic page.”
Yet the tools are far from flawless. Stefanie Haustein, associate professor at the University of Ottawa in Canada, is blunt: she says the promise of altmetrics has fizzled. The term was coined in 2010 by information scientist Jason Priem, though the idea of measuring research beyond citations predates him.
“Altmetrics were always reduced to the metric,” Haustein says. One visible example is Digital Science’s Altmetric Attention Score, a single number displayed inside a colourful “donut” badge, which shows where the research has been mentioned.
For Haustein, that stripped away the nuance. “Their true power, if there was any, lay in who was saying what about the research,” she says. “But the content was ignored. Any tweet was as good as the next.”
Her own studies found that much of the activity recorded on X, formerly Twitter, came from academics rather than the wider public, and at times from bots. More recent evidence suggests that papers attracting heavy online mentions are not necessarily peer-reviewed or high quality, while rigorous work can go unnoticed.
For her part, Campbell at Digital Science says the Altmetric tool has matured, adding sentiment analysis to distinguish positive from negative mentions, and an “attention over time” feature to separate fleeting noise from lasting signals.
Despite the flaws, altmetrics are finding a role in research evaluation. Some universities and funders now use them to supplement assessments, adding evidence of policy impact and media reach. Business schools, too, have begun citing them in accreditation and promotion reviews as proof of relevance beyond the academy.
Howard Thomas, a veteran business school dean and adviser to accreditor EFMD International, says this reflects a deeper problem with the dominance of citations, which have been one crude currency of academic success since the 1970s.
“It’s academics talking to academics,” Thomas says. “They’ve created an iron cage for themselves where they reinforce their own work. But to what degree does that influence management practice?”
Altmetrics are imperfect, but for him they at least test whether research escapes that cage. Thomas believes the methods will improve, giving more weight to the reputation of who engages with research as well as the volume of attention.
Below, the FT distils four academic studies drawn from the top 10 most-viewed articles first published between July 2024 and August 2025, according to OpenAlex, and scored by Altmetric.
Hybrid work, WFH and innovation
This 2024 study — “Employee innovation during office work, work from home and hybrid work” — probes a question that has been vexing employers since the coronavirus pandemic: how do different work modes affect innovation?
Michael Gibbs, of Chicago Booth School of Business, and his co-authors analysed data from more than 48,000 staff at Indian IT services firm HCL Technologies, tracking the quantity and quality of ideas submitted via the firm’s innovation portal.
When staff worked fully from home, the volume of ideas stayed steady compared with working in the office but their quality slipped: 9 percentage points fewer were then shared with clients, and the chance of receiving a high client rating fell by 18 points.
When the company moved to a hybrid model, with staff splitting time between home and the office, the picture worsened. With hybrid teams, submissions fell by around 22 per cent, a drop the authors link to co-ordination challenges. In-office “water cooler” exchanges and virtual “coffee rooms” each offer a way for colleagues to collaborate. Hybrid muddles both: some colleagues cluster in offices; others dial in remotely.
For managers, the takeaway is that hybrid is not necessarily the “best of both worlds” unless it is carefully structured. Work-life balance may improve, but the cost can be a quieter pipeline of ideas.
Outrage fuels misinformation
Angry birds: a 2024 study looked at political differences in social media suspensions. Donald Trump was banned from Twitter in 2021 after the storming of the Capitol but reinstated in 2022 © Sipa US/Alamy
This study, published in 2024, shows how moral outrage fuels the spread of misinformation on social media. For “Misinformation exploits outrage to spread online”, the authors, who include William Brady of Kellogg School of Management in Illinois, pored over 1mn news links on Facebook and 45,000 posts on X.
They then ran two experiments with about 1,500 participants, testing whether outrage made people more likely to share headlines, and whether it affected their ability to tell the truth from lies.
Three results stand out. First, misinformation evokes more outrage than trustworthy news. On Facebook, links from low-quality sources triggered more “angry” reactions, while on X they provoked a higher share of outrage-laden replies.
Second, outrage boosts sharing. Posts that attracted anger were more widely shared, and the effect was stronger for misinformation than for trustworthy news.
Third, outrage seemed to drive people to share for reasons other than accuracy: to show emotion or signal loyalty to their group. People were more likely to share outrage-provoking false stories without reading them, and outrage did nothing to improve their grasp of truth versus lies.
The implications are stark. Anger-fuelled misinformation may be harder to counter with fact-checks or prompts that ask users to consider accuracy before sharing, since people often share it for reasons unrelated to the truth. The study suggests platforms may even amplify this content, since algorithms reward engagement.
The findings suggest a sobering reality: misinformation spreads not despite its flaws, but because it provokes anger that platforms are designed to reward.
Cutting landfill food waste
Titled “Of the first five US states with food waste bans, Massachusetts alone has reduced landfill waste”, this paper examines whether state-level bans on commercial food waste in the US have delivered the promised reductions.
Fiorentia Zoi Anglou and Ioannis Stamatopoulos, of the University of Texas McCombs School of Business, and Robert Evan Sanders, of University of California, San Diego, Rady School of Management, compiled waste data across 36 US states between 1996 and 2019 to assess the first five bans, enacted from 2014.
Policymakers had predicted 10-15 per cent reductions in waste. In practice, the researchers found no discernible effect in California, Connecticut, Vermont or Rhode Island. Massachusetts was the exception, with a gradual 13 per cent decline in waste disposal.
In the other states, landfill waste levels moved almost exactly as if no bans had been introduced, suggesting the rules had little impact. Massachusetts succeeded because it built more composting sites, kept the rules simple and enforced them hard, with more than twice as many inspections as any other state.
The findings matter for both climate and policy, since food waste generates about half of food system greenhouse gas emissions, and composting curbs methane far more than landfilling. But without infrastructure and oversight, the paper shows that even well-intentioned laws can fall flat.
Social media suspensions
The 2024 paper “Differences in misinformation sharing can lead to politically asymmetric sanctions” tackles a charged question: are social media platforms biased when they suspend more conservatives than liberals? In short, the evidence suggests not.
The authors, among them Tauhid Zaman, of Yale School of Management, say the imbalance is explained by conservatives posting more links from news outlets rated as untrustworthy.
Analysing 9,000 politically active users during the 2020 US presidential election, they found accounts using pro-Donald Trump hashtags were about four times more likely to be suspended than those using pro-Joe Biden hashtags. But these same users shared far more links from outlets rated untrustworthy, whether judged by professional fact-checkers, politically balanced groups of Americans, or even Republicans alone.
The researchers modelled neutral rules that penalised any user for sharing links from untrustworthy sites. Even under these neutral conditions, conservatives still came out more likely to be suspended, because they shared more of those links.
The implication is a dilemma: if platforms enforce their rules, they look biased; if they do not, misinformation spreads.
Top 10 papers tracked by Altmetric
Source: Digital Science, year to July 2025
1. Misinformation exploits outrage to spread online, by Killian L McLoughlin, William J Brady, Aden Goolsbee, Ben Kaiser, Kate Klonick, MJ Crockett
2. Differences in misinformation sharing can lead to politically asymmetric sanctions, by Mohsen Mosleh, Qi Yang, Tauhid Zaman, Gordon Pennycook, David G Rand
3. Of the first five US states with food waste bans, Massachusetts alone has reduced landfill waste, by Fiorentia Zoi Anglou, Robert Evan Sanders, Ioannis Stamatopoulos
4. The social costs of keystone species collapse: evidence from the decline of vultures in India, by Eyal Frank, Anant Sudarshan
5. AI can help humans find common ground in democratic deliberation, by Michael Henry Tessler, Michiel A Bakker, Daniel Jarrett, Hannah Sheahan, Martin J Chadwick, Raphael Koster, Georgina Evans, Lucy Campbell-Gillingham, Tantum Collins, David C Parkes, Matthew Botvinick, Christopher Summerfield
6. Employee innovation during office work, work from home and hybrid work, by Michael Gibbs, Friederike Mengel, Christoph Siemroth
7. Megastudy testing 25 treatments to reduce antidemocratic attitudes and partisan animosity, by Christopher J Bryan, Hanne Collins, Charles Dorison, Aaron C Kay, Nour Kteily, Maytal Saar-Tsechansky and 76 others
8. Better late than never? Gift givers overestimate the relationship harm from giving late gifts, by Cory Haltman, Atar Herziger, Grant E Donnelly, Rebecca Walker Reczek
9. A welfare analysis of tax audits across the income distribution, by William C Boning, Nathaniel Hendren, Ben Sprung-Keyser, Ellen Stuart
10. Are we worse off after policy repeals? Evidence from two green policies, by Dinesh Puranam, Sungjin Kim, Jihoon Hong, Hai Che