Category: Evidence-based Research

Book Review: Evidence-Based Policy Making in the Social Sciences: Methods that Matter edited by Gerry Stoker and Mark Evans

In Evidence-Based Policy Making in the Social Sciences: Methods that Matter, editors Gerry Stoker and Mark Evans showcase tools for generating evidence-based policy insights. Released amidst discussions of a ‘post-truth’ era, the book is recommended to students looking to broaden their understanding of methods for providing meaningful evidence for policy creation, but it leaves open the question of how social scientists can […]

Developing social science identities in interdisciplinary research and education

While it is no longer uncommon for social scientists to be included in research groups tackling complex problems in the natural sciences, limited understanding of the different disciplinary areas within the social sciences remains a challenge. Eric Toman describes the approach social science faculties at his university have taken to address this and also outlines how graduate training programmes have […]

Elephant paths: Wider methodological transparency is needed for legal scholarship to thrive.

Mariana Gkliati calls for a reconsideration of traditional research methods in legal studies and of how these methods are communicated. Most legal scholars seek to fit their conceptual analysis into narrow and strictly legal boxes, often relying on tacit knowledge from the field. Drawing on behavioural psychology and the metaphor of elephant paths, the informal shortcuts that overlay official routes from place to place, […]

Fundable, but not funded: How can research funders ensure ‘unlucky’ applications are handled more appropriately?

Having a funding application rejected does not necessarily mean the research is unsupportable by funders; the application may simply have been unlucky. There is a significant risk to wider society in the rejection of unlucky but otherwise sound applications: good ideas may slip through the cracks, or be reworked and dulled down to sound more likely to provide reliable results. Oli Preston looks at how […]

Getting our hands dirty: why academics should design metrics and address the lack of transparency.

Metrics in academia are often an opaque mess, filled with biases and ill-judged assumptions, and used in overly deterministic ways. By getting involved in their design, academics can productively push metrics in a more transparent direction. Chris Elsden, Sebastian Mellor and Rob Comber introduce an example of designing metrics within their own institution. Using the metric of grant income, their tool ResViz shows […]

What impact evidence was used in REF 2014? Disciplinary differences in how researchers demonstrate and assess impact

A new report produced by the Digital Science team explores the types of evidence used to demonstrate impact in REF2014 and pulls together guidance from leading professionals on good practice. Here Tamar Loach and Martin Szomszor present a broad look at the types of evidence in use in the REF impact case studies and reflect on the association between use of evidence […]

Political History in the Digital Age: The challenges of archiving and analysing born digital sources.

The vast bulk of source material for historical research is still paper-based. But this is bound to change. Dr Helen McCarthy considers the lessons from the Mile End Institute’s conference on Contemporary Political History in the Digital Age. The specific challenges of using ‘born digital’ sources are an area that requires considerable attention. For political historians, the advent of ‘e-government’ […]

Evaluating research assessment: Metrics-based analysis exposes implicit bias in REF2014 results.

The recent UK research assessment exercise, REF2014, attempted to be as fair and transparent as possible. However, Alan Dix, a member of the computing sub-panel, reports how a post-hoc analysis of public domain REF data reveals substantial implicit and emergent bias in terms of discipline sub-areas (theoretical vs applied), institutions (Russell Group vs post-1992), and gender. While metrics are generally […]

To fight the slow pace of gender equality in the workplace, attack the root cause: invisible, unconscious bias.

Gender diversity is correlated with better business results and carries enormous economic value. But unconscious bias continues to negatively affect women in the workplace in a number of ways, writes Caroline Turner. Those who manage teams must actively reveal and uproot these biases. This piece is part of a wider series on Women in Academia and coincides with LSE Women: making history […]

Accounting for Impact? How the Impact Factor is shaping research and what this means for knowledge production.

Why does the impact factor continue to play such a consequential role in academia? Alex Rushforth and Sarah de Rijcke look at how considerations of the metric enter in at every stage, from early research planning through to publication. Even with initiatives against the use of impact factors, scientists themselves will likely err on the side of caution and continue to […]