Category: research metrics

At what point do academics forgo citations for journal status?

The limitations of journal-based citation metrics for assessing individual researchers are well known. However, the way in which these assessment systems differentially shape research practices within disciplines is less well understood. Presenting evidence […]

For China’s ambitious research reforms to be successful, they will need to be supported by new research assessment infrastructures

The Chinese government recently announced that research assessment in China should no longer be predominantly focused on metrics, Web of Science-based indicators, and what has become known as ‘SCI worship’. In this post, Lin Zhang and Gunnar Sivertsen discuss how China’s new research policy might be implemented and the parallels it has to recent attempts to reform […]

The careers of carers – A numerical adjustment cannot level the playing field for researchers who take time off to care for children

Quantitative measures of the effect of caring for children on research outputs (published papers and citations) have been used by some universities as a tool to address gender bias in academic grant and job applications. In this post, Adrian Barnett argues that these adjustments fail to capture the real impacts of caring for children and should be replaced with contextual […]

Book Review: Scholarly Communication and Measuring Research – What Does Everyone Need to Know?

Academics are required not only to find effective ways to communicate their research, but also, increasingly, to measure and quantify its quality, impact and reach. In Scholarly Communication: What Everyone Needs to Know, Rick Anderson puts us in the picture. And in Measuring Research: What Everyone Needs to Know, Cassidy Sugimoto and Vincent Larivière critically assess over 20 tools currently available for evaluating the quality […]

Are altmetrics able to measure societal impact in a similar way to peer review?

Altmetrics have become an increasingly prominent part of scholarly communication, although the value they indicate is contested. In this post, Lutz Bornmann and Robin Haunschild present evidence from their recent study examining the relationship of peer review, altmetrics, and bibliometric analyses with societal and academic impact. Drawing on evidence from REF2014 submissions, they argue that altmetrics may provide evidence for wider […]

Now is the time to update our understanding of scientific impact in light of open scholarship

Sascha Friesike, Benedikt Fecher and Gert G. Wagner outline three systemic shifts in scholarly communication that render traditional bibliometric measures of impact outdated and call for a renewed debate on how we understand and measure research impact. New digital research infrastructures and the advent of online distribution channels are changing the realities of scientific knowledge creation and dissemination. Yet, the […]

Better, fairer, more meaningful research evaluation – in seven hashtags

Considering the future of research assessment, Elizabeth Gadd outlines how she believes research evaluation could be made better, fairer, and more meaningful. The resulting seven guiding principles, neatly framed as hashtags, range from understanding our responsibilities to researchers as people, through to ensuring our evaluations are a more formative process, offering valuable, constructive feedback. Imperial College recently held an event […]

There is an absence of scientific authority over research assessment as a professional practice, leaving a gap that has been filled by database providers

Research metrics have become more established as a means of assessing research performance. This is understandable given research institutions’ and funders’ demand for assessment techniques that are relatively cheap and universally applicable, even though the use of such metrics remains strongly contested within scientific communities. But to what extent does the academic research field of evaluative citation analysis confer legitimacy on […]

Against metrics: how measuring performance by numbers backfires

A growing number of companies, government agencies, and higher education institutions are in the grip of what Jerry Z. Muller has termed “metric fixation”. But by tying rewards to metrics, organisations risk incentivising gaming and encouraging behaviours that may be at odds with their larger purpose. The culture of short-termism engendered by metrics also impedes innovation and stifles the entrepreneurial element […]

2017 in review: round-up of our top posts on metrics

Mendeley reader counts offer early evidence of the scholarly impact of academic articles
Although the use of citation counts as an indicator of scholarly impact has well-documented limitations, it does offer insight into which articles are read and valued. However, one major disadvantage of citation counts is that they are slow to accumulate. Mike Thelwall has examined reader counts from Mendeley and found them […]