Category: bibliometrics

Towards more consistent, transparent, and multi-purpose national bibliographic databases for research output

National bibliographic databases for research output collect metadata on universities’ scholarly publications, such as journal articles, monographs, and conference papers. As this sort of research information is increasingly used in assessments, funding allocation, and other academic reward structures, the value of developing comprehensive and reliable national databases becomes ever clearer. Linda Sīle, Raf Guns and Tim Engels outline […]

There is an absence of scientific authority over research assessment as a professional practice, leaving a gap that has been filled by database providers

Research metrics have become more established as a means of assessing research performance. This is understandable given research institutions’ and funders’ demand for assessment techniques that are relatively cheap and universally applicable, even though the use of such metrics remains strongly contested within scientific communities. But to what extent does the academic research field of evaluative citation analysis confer legitimacy on […]

Into oblivion: a closer look at the business, management and accounting research literature in Ibero-America

Faced with institutional requirements to publish in top-tier, international journals, researchers from Ibero-American countries often express concern that their work is becoming distant from their local communities. The value of participating in international debates and being able to influence the direction of research globally is sometimes offered as justification for this. But does this withstand scrutiny? Julián David Cortés-Sánchez has […]

How to compare apples with oranges: using interdisciplinary “exchange rates” to evaluate publications across disciplines

Academic research performance is typically assessed on the basis of scientific productivity. While the number of publications may provide an accurate and useful metric of research performance within one discipline, interdisciplinary comparisons of publication counts prove much more problematic. To solve this problem, Timo Korkeamäki, Jukka Sihvonen, and Sami Vähämaa introduce interdisciplinary “exchange rates”, which can be used to convert […]

Making visible the impact of researchers working in languages other than English: developing the PLOTE index

As outlined in the Leiden Manifesto, if impact is understood in terms of citations to international publications, a bias is created against research which is regionally focused and engaged with local societal problems. This is particularly critical for researchers working in contexts with languages other than English. Peter Dahler-Larsen has developed the PLOTE index, a new indicator which aims to […]

Six principles for assessing scientists for hiring, promotion, and tenure

The negative consequences of relying too heavily on metrics to assess research quality are well known: they can foster practices harmful to scientific research, such as p-hacking, salami science, or selective reporting. The “flourish or perish” culture defined by these metrics in turn drives the system of career advancement in academia, a system that empirical evidence has shown to be problematic […]

Against metrics: how measuring performance by numbers backfires

Ever more companies, government agencies, and higher education institutions are in the grip of what Jerry Z. Muller has termed “metric fixation”. But by tying rewards to metrics, organisations risk incentivising gaming and encouraging behaviours that may be at odds with their larger purpose. The culture of short-termism engendered by metrics also impedes innovation and stifles the entrepreneurial element […]

How to keep up to date with the literature but avoid information overload

The sheer number of online services and social media platforms available to academics makes it possible to receive a constant stream of information about newly published research. However, much of this may serve only as a distraction from your research, and staying on top of it all can even come to feel like a burden. Anne-Wil Harzing offers some simple advice […]

Random audits could shift the incentive for researchers from quantity to quality

The drive to publish papers has created a hyper-competitive research environment in which researchers who take care to produce relatively few high-quality papers are out-competed by those who cut corners so their bibliometrics look good. Adrian Barnett suggests one way to push back against the pressure to “publish or perish” is to randomly audit a small proportion of researchers and […]