Category: metrics

2017 in review: top posts of the year

As 2017 nears its end and before our focus is drawn to whatever the new year might have in store, now is the perfect time to look back and reflect on the last twelve months on the Impact Blog. Editor Kieran Booluck reports on another year in which our readership has grown, and also shares a selection of the most […]

2017 in review: round-up of our top posts on metrics

Mendeley reader counts offer early evidence of the scholarly impact of academic articles

Although the use of citation counts as an indicator of scholarly impact has well-documented limitations, it does offer insight into which articles are read and valued. However, one major disadvantage of citation counts is that they are slow to accumulate. Mike Thelwall has examined reader counts from Mendeley and found them […]

Where are the rising stars of research working? Towards a momentum-based look at research excellence

Traditional university rankings and leaderboards are largely an indicator of past performance of academic staff, some of whom conducted the research for which they are most famous elsewhere. Paul X. McCarthy has analysed bibliometric data to see which research institutions are accelerating fastest in terms of output and impact. The same data also offers a glimpse into the future, helping […]

Metrics, recognition, and rewards: it’s time to incentivise the behaviours that are good for research and researchers

Researchers have repeatedly voiced their dissatisfaction with how the journals they publish in are used as a proxy for the evaluation of their work. However, those who wish to break free of this model fear negative consequences for their future funding and careers. Rebecca Lawrence emphasises the importance of addressing researchers’ recognition and reward structures, arguing it is time to […]

Better information on teaching is required to redress the balance with research

How universities allocate resources – and how academics allocate their own time – between research and teaching is a perennial problem in higher education. The labour market for research is intensely competitive and truly global, while the market for academics focused on teaching is notable for its lack of competition. An obvious result is that academics’ promotion prospects depend primarily […]

Does research evaluation in the sciences have a gender problem? What do altmetrics tell us?

Many measures used for research evaluation, such as citations or research output, are hindered by an implicit gender bias. Stacy Konkiel examines whether altmetrics, which track how research is discussed, shared, reviewed, and reused by other researchers and the public, might be better suited to help understand the influence of research in a more gender-balanced way. Findings suggest […]

Reshaping the tenure and promotion process so that it becomes a catalyst for innovative and invigorating scholarship

The metrics used to identify excellence, and on which current tenure and promotion decisions are based, have become a barrier to more exciting and innovative scholarship. Christopher P. Long suggests an overhaul of tenure and promotion practices, advocating a holistic approach in which structured mentoring plays a key role, alongside values-based metrics that will empower faculty to tell more textured […]

The methodology used for the Times Higher Education World University Rankings’ citations metric can distort benchmarking

The Times Higher Education World University Rankings can influence an institution’s reputation and even its future revenues. However, Avtar Natt argues that the methodology used to calculate its citations metric can have the effect of distorting benchmarking exercises. The fractional counting approach, applied only to a select number of papers with high author numbers, has led to a situation whereby […]

Research assessments based on journal rankings systematically marginalise knowledge from certain regions and subjects

Many research evaluation systems continue to take a narrow view of excellence, judging the value of work based on the journal in which it is published. Recent research by Diego Chavarro, Ismael Ràfols and colleagues shows how such systems underestimate and prove detrimental to the production of research relevant to important social, economic, and environmental issues. These systems also reflect the biases […]