
Guidance publications – Are we getting it right?

At the end of September we ran a ‘quick poll’ asking for feedback on the current and proposed guidance publications that DCC produces with Jisc support. This aimed to consult the UK institutional RDM community on topics and priorities, and complemented broader conversations that DCC colleagues Magdalena Getter and Diana Sisu had over the summer to better understand how our output is perceived.

For the poll we targeted subscribers of the Jiscmail research-dataman list and asked a few questions about the topical focus of the guidance, and which formats are seen as most useful. By next week we’ll have agreed how to respond to the results in a way that ties in with Jisc’s Research at Risk programme. As soon as we can after that, we’ll post a publications schedule to the list and on these pages.

The poll brought responses from 30 people, and almost all gave some feedback on our current output. They picked 12 of our recent publications, and rated them using these three criteria:

  • Do they provide informative content?
  • Is that content clearly presented?
  • Does the publication help you address your organisation’s data management challenges?

We used a five-point scale from ‘strongly agree’ to ‘strongly disagree’. Each response was weighted from 2 (‘strongly agree’) down to -2 (‘strongly disagree’), and a publication’s score on each criterion is the sum of those weights across the responses it received.
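As a minimal sketch of that scoring (the function and data here are illustrative, not the DCC’s actual tooling):

```python
# Likert weights as described above: +2 for 'strongly agree'
# down to -2 for 'strongly disagree'.
WEIGHTS = {
    "strongly agree": 2,
    "agree": 1,
    "not sure": 0,
    "disagree": -1,
    "strongly disagree": -2,
}

def criterion_score(responses):
    """Sum the weights of all responses a publication received on one criterion."""
    return sum(WEIGHTS[r] for r in responses)

# Three hypothetical respondents rating a publication as 'informative':
print(criterion_score(["strongly agree", "agree", "agree"]))  # 4
```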

The table below shows the results. Some individual titles were rated by only one or two people, so comparing titles on such small numbers would be misleading, but the scores give us a baseline to track from now on. And I’m happy to say that 11 of the 12 publications passed the ‘agree’ test on all three criteria (one case study wasn’t thought helpful enough). Averaging the scores gave 70% overall, with 80% for ‘informative’ and 73% for ‘clearly presented’.

For us the bottom line is, ‘does the guidance help people address their organisation’s data management challenges?’ The average score on this was 55%, so there is definite room for improvement, and I hope we can make our forthcoming output even more practical. Underlining that need for practicality, the format people see the greatest need for can be summed up in two words: “examples, please!”

Respondents also commented on how we might improve the current publications. We can incorporate some of these ideas in future updates of the titles (named in brackets), and through our new output, e.g.:

  • “Estimating storage requirements remains to be a key issue. More suggestions on this and how others have estimated need are required!” (How to Discover Requirements)
  • “It’s very useful. Include ‘Consent’ in section 3. Update ‘About this work’ and combine with ‘About DCC’. Supply electronically with space for local University contact details.” (DCC Checklist for a Data Management Plan)
  • “Maybe some practical examples; this area is becoming more nuanced by discipline and so more subject or discipline specific guides/case studies and examples would be useful” (Five steps to decide what data to keep)
  • “Split into two publications – one on creating citations for existing datasets, one for creating the citation for your own dataset, [and] including data access statement” (How to cite datasets and link to publications)
  • “More details about particular techniques and links to more examples and tools that we could use” (How to Develop RDM Services)
  • “More focus on clarifying some of the issues raised in the recent RCUK concordat. We need more precise requirements on issues like preservation and curation” (Funder policy summary)

Stepping back from individual titles, we also want to improve on how we gather this feedback. We’ll be consulting more widely and deeply in future, using these or similar metrics, and badgering people for comments both online and wherever we set out our publications stall.


Guidance publication titles and rating scores

(strongly agree = 2; agree = 1; not sure = 0; disagree = -1; strongly disagree = -2. The three criterion columns are summed weights across responses; the score average is those three totals added together and divided by the number of responses. The final row gives column totals and the overall score average.)

Title                                                  Responses  Informative  Clearly presented  Helps meet challenges  Score average
How to License Research Data                               5           9              6                    6                 4.2
How to Discover Requirements for RDM Services              4           6              5                    3                 3.5
How to Appraise & Select Research Data for Curation        3           5              5                    3                 4.3
DCC Checklist for a Data Management Plan (fold out)        2           2              3                    2                 3.5
Five steps to decide what data to keep                     2           3              2                    4                 4.5
Funder policy summary                                      2           3              3                    3                 4.5
How to Track the Impact of Research Data with Metrics      2           4              4                    2                 5.0
Five Things You Need to Know About RDM and the Law         1           2              2                    1                 5.0
How to Cite Datasets and Link to Publications              1           1              1                    1                 3.0
How to Develop a Data Management and Sharing Plan          1           2              1                    1                 4.0
How to Develop RDM Services                                2           3              3                    2                 4.0
RDM strategy: moving from plans to action                  1           1              2                   -1                 2.0
“All” / “various”                                          2           4              4                    4                 6.0
Total                                                     28          45             41                   31                 4.2
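Each row’s score average works out as the three criterion totals added together and divided by the number of responses; as a quick check using the first row’s figures (the figures are from the table above, the variable names are illustrative):

```python
# First row: How to License Research Data.
# 5 responses; criterion totals 9 (informative), 6 (clearly presented),
# 6 (helps meet challenges).
responses = 5
criterion_totals = [9, 6, 6]

# Score average = sum of the three criterion totals / number of responses.
score_average = sum(criterion_totals) / responses
print(score_average)  # 4.2
```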