Responsible use of Research Metrics
Keele University Policy Statement on the Responsible use of Research Metrics
In January 2018, on the recommendation of its Research Committee, Keele University signed the San Francisco Declaration on Research Assessment (DORA). DORA is a set of recommendations designed to ensure that “…the quality and impact of scientific outputs…is measured accurately and evaluated wisely.” To date, signatories include seven Russell Group universities, including Imperial College London, the University of Manchester and UCL.
Keele University is committed to the critical role that peer review and expert judgement play in the assessment of research, but also recognises the value that quantitative metrics can add in complementing and supporting decision-making. The move to find new ways to assess the quality of research outputs is fully in line with the wishes of research funders, notably the European Commission, the Wellcome Trust and the Office for Students (previously HEFCE).
The University has developed the following set of principles outlining its approach to research assessment and management, including the responsible use of quantitative indicators. These principles draw upon DORA and are designed to encapsulate current good practice and to act as a guide for future activities.
As a responsible employer, Keele is committed to:
- Being explicit about the criteria used in appointment, tenure and promotion decisions, clearly highlighting, especially for early-stage investigators, that the scientific content of a paper is much more important than publication metrics or the identity of the journal in which it was published.
- For the purposes of research assessment, considering the value and impact of all research outputs (including datasets and software) in addition to research publications, and considering a broad range of impact measures including qualitative indicators of research impact, such as influence on policy and practice.
- Making assessments based on scientific content rather than publication metrics when involved in committees making decisions about funding, appointments, tenure or promotion.
- Wherever appropriate, citing the primary literature in which observations are first reported, rather than reviews, in order to give credit where credit is due.
- Using a range of article metrics and indicators in personal/supporting statements as evidence of the impact of individual published articles and other research outputs.
- Challenging research assessment practices that rely inappropriately on journal impact factors, or equivalent quantitative measures of qualitative properties, and promoting and teaching best practice that focuses on the value and influence of specific research outputs.
- Encouraging researchers towards 'open science' and 'reproducible research', acknowledging that these may not have the same relevance or value across all disciplines.
- Encouraging and supporting interdisciplinarity.
- Taking into account the diverse range of possible research outputs beyond journal articles, and recognising that some outputs cannot be evaluated through standard journal metrics.
- Being sensitive to factors that may result in legitimate delays in research publication, including personal factors that may have affected the applicant’s record of outputs.
David Amigoni and Claire Ashmore, on behalf of the Keele University DORA operational group