Consultation on the second Research Excellence Framework – PowerPoint presentation

  1. Consultation on the second Research Excellence Framework
  Follow us on Twitter at REF consultation events: #REF2021
  David Sweeney, Director of Research, Knowledge Exchange and Education at HEFCE

  2. What is the REF?
  UK-wide framework for assessing research in all disciplines. It is a peer review process.
  The purpose of the REF is to:
  • Inform the allocation of research funding by the four UK funding bodies
  • Provide accountability for public funding and make the case for future investment
  • Provide benchmarks and reputational yardsticks
  Additional purposes as recognised by the Stern Review:
  • It provides a rich evidence base to inform strategic decisions about national priorities across science, social science, engineering, medicine and arts and humanities research.
  • It can create a strong performance incentive for universities and for individual academics.
  • It can be used by universities and other bodies to inform decisions on resource allocation.

  3. REF Evidence

  4. Lord Stern’s independent review
  • Spending Review 2015 – Review announced
  • December 2015 – Review to be chaired by Lord Stern
  • January 2016 – Call for evidence published
  • July 2016 – Independent review published
  • Government response to the independent review: ‘I have asked HEFCE to work together with the other HE funding bodies to develop detailed proposals which take forward the recommendations from the Review, including any further testing and piloting.’

  5. Impact of Lord Stern’s recommendations
  Key principles:
  • Lower burden
  • Less game-playing
  • Less personalisation, more institutionally focused
  • Recognition for investment
  • Making space for long-term research
  • More rounded view of research activity
  • Interdisciplinary emphasis
  • Broaden impact

  6. Consultation on the second Research Excellence Framework
  Follow us on Twitter at REF consultation events: #REF2021
  Kim Hackett, REF Manager

  7. Approach to the consultation
  REF 2021 will be based on peer review (informed by metrics, where appropriate) and will assess research outputs, impact and environment.
  Consultation proposals are based on:
  • The evaluation of REF 2014, and
  • Implementation of the recommendations in Lord Stern’s independent review:
    • 9 recommendations on the next REF
    • 3 recommendations on the wider context of research assessment (not directly addressed in the consultation)

  8. Features of the REF consultation
  We are seeking views on the following features of the next REF:
  • Overall approach
  • Unit of Assessment structure
  • Expert panels
  • Staff
  • Collaboration
  • Outputs
  • Impact
  • Environment
  • Institutional-level assessment
  • Outcomes and weighting
  • Timetable

  9. Features of the REF consultation
  Overall approach
  • Broad continuity with REF 2014
  Unit of Assessment structure
  • ~36 UOAs
  • 4 main panels
  • Continuity between exercises
  • Consistency and streamlining across panels
  • Reduced visibility
  • High volumes of submissions
  Expert panels
  • Development of submissions guidance and panel criteria
  • Representativeness of expert panels

  10. Features of the REF consultation
  Staff and outputs
  • All research-active staff to be submitted
  • HESA definition of ‘research active’ – consistent & auditable
  • Staff mapping to REF UOAs – HESA cost centres
  • De-coupling staff and outputs, overall multiplier of two
  • Flexibility on number of outputs per staff member – 0/1 to 6 outputs
  • No individual staff circumstances but code of practice
  • Portability of outputs – recognition of HEI investment
  • Definition of ‘demonstrably generated’
  • Particular impacts on certain groups

  11. Summary of key issues
  Challenges raised with the proposed use of HESA data
  Defining research-active staff:
  • Contracts (as recorded through HESA) don’t necessarily reflect activity
  • Including staff on T&R contracts according to standard practice – not all are research-active
  • R contracts in some cases reflect KE / innovation activity
  • No clear-cut way to identify who are independent researchers (particularly those on R-only contracts)
  Mapping to UOA by cost centre:
  • Allocation of staff to cost centres reflects teaching purpose – not REF purpose
  • May result in some very small (e.g. 1 staff member) submissions
  • Panels would receive material for assessment not within their remit
  • Would not allow HEIs to submit cross-departmental units
  • Would present challenges for presenting a coherent environment

  12. Features of the REF consultation
  Illustration of proposals
  1. Number of outputs
  • Identify research-active staff in cost centres
  • Map to REF UOAs
  • Calculate average FTE over a set period per UOA
  • Multiply by 2
  2. Selected outputs for submission
  • Identify eligible outputs in the REF period (demonstrably generated at the HEI)
  • Select outputs up to the number required
  • Allocate each output to at least one research-active staff member (max 6 outputs per person)
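
To make the arithmetic of this illustration concrete, here is a minimal sketch in Python of the two-step calculation. The data structures (yearly FTE figures per UOA, an output-to-author allocation), the function names and the rounding of average FTE are assumptions for illustration, not part of the consultation proposals.

```python
from statistics import mean

OUTPUT_MULTIPLIER = 2       # de-coupled staff and outputs: overall multiplier of two
MAX_OUTPUTS_PER_PERSON = 6  # proposed flexibility of 0/1 to 6 outputs per person

def required_outputs(fte_by_year):
    """Step 1: required output count per UOA = average FTE over the set period x 2.
    Rounding to the nearest whole output is an assumption."""
    return {uoa: round(mean(ftes) * OUTPUT_MULTIPLIER)
            for uoa, ftes in fte_by_year.items()}

def allocation_is_valid(allocation, staff_in_uoa):
    """Step 2 check: every selected output is attributed to at least one
    research-active staff member returned in the UOA, and nobody exceeds six."""
    counts = {person: 0 for person in staff_in_uoa}
    for output, authors in allocation.items():
        if not authors:
            return False                 # each output needs at least one author
        for person in authors:
            if person not in counts:
                return False             # author must be research-active staff in the UOA
            counts[person] += 1
            if counts[person] > MAX_OUTPUTS_PER_PERSON:
                return False
    return True

# Example: a UOA averaging 10.5 FTE over the period would be required to submit 21 outputs.
print(required_outputs({"UOA 11": [10, 11, 10.5]}))  # {'UOA 11': 21}
```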

  13. Features of the REF consultation
  Staff (continued)
  • Staff identifier
  • Categories of staff eligibility
  • Remove Category C staff
  • Clarity of definition for research assistants
  • Fractional appointments
  Collaboration
  • Responding to the Dowling Review
  • Output flexibility
  • Research environment

  14. Features of the REF consultation
  Outputs
  • Implementation of the Open Access policy
  • Reserve outputs for late publications
  • Interdisciplinary research:
    • Interdisciplinary ‘champions’
    • Mandatory identifier
    • Environment template
    • Interdisciplinary research advisory panel (IDAP)
  • Metrics:
    • Not a replacement for peer review
    • Quantitative metrics appropriate to the discipline
    • Forum for Responsible Metrics

  15. Features of the REF consultation
  Impact
  • Broadening and deepening the definition of impact, and guidance on certain types of impact
  • Align the definition of impact with Research Councils UK
  • Move the impact template to the assessment of environment
  • Number of case studies to be submitted, with a minimum of one
  • Mandatory and optional fields within the case study template
  • Impact arising from research, research activity or a ‘body of work’
  • Two-star threshold for underpinning research & standards of rigour
  • Impact evidence for audit and for assessment
  • REF 2014 – REF 2021
  • (Institutional-level assessment)

  16. Features of the REF consultation
  Environment
  • More structured environment template
  • More quantitative data
  • Common indicators
  • Equality and diversity
  • Open access and open data
  • Impact strategy
  • Forum for Responsible Metrics
  • (Institutional-level assessment)

  17. Features of the REF consultation
  Institutional-level assessment
  Impact
  • Impact case studies arising from multi- and interdisciplinary and collaborative work
  • 10-20 per cent of total impact case studies
  • Minimum of one
  Environment
  • Environment statement to avoid duplication of content across UOAs
  • Engagement and impact strategies
  • Progress made against previous plans
  • Promoting IDR, supporting collaboration
  • Supporting ECRs
  • Promoting equality and diversity practices
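
As a rough, hypothetical illustration of the proposed 10-20 per cent range with a minimum of one: the consultation does not specify how fractions are rounded, so the rounding below is an assumption.

```python
def institutional_case_study_range(total_case_studies):
    """Sketch of the proposal: institutional-level impact case studies would make up
    10-20 per cent of an HEI's total case studies, with a minimum of one."""
    low = max(1, round(0.10 * total_case_studies))
    high = max(1, round(0.20 * total_case_studies))
    return low, high

print(institutional_case_study_range(30))  # an HEI with 30 case studies -> (3, 6)
print(institutional_case_study_range(4))   # a small submission -> (1, 1)
```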

  18. Features of the REF consultation
  Outcomes and weightings
  • Weighting for outputs maintained at 65 per cent
  • Total weighting for impact no less than 20 per cent
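
To show how such weightings would combine element sub-profiles into an overall quality profile, here is a short, hedged sketch. The 15 per cent weighting for environment is an assumption (the remainder implied by the proposed figures), and the sub-profiles are invented for illustration.

```python
# Assumed weightings: outputs 65%, impact 20%, environment the remaining 15%.
WEIGHTS = {"outputs": 0.65, "impact": 0.20, "environment": 0.15}

def overall_profile(sub_profiles):
    """Weighted sum of sub-profiles, each given as percentages of activity
    judged 4*, 3*, 2*, 1* and unclassified."""
    overall = [0.0] * 5
    for element, weight in WEIGHTS.items():
        for i, share in enumerate(sub_profiles[element]):
            overall[i] += weight * share
    return [round(x, 1) for x in overall]

# Invented sub-profiles for a hypothetical submission.
print(overall_profile({
    "outputs":     [40, 40, 20, 0, 0],
    "impact":      [60, 30, 10, 0, 0],
    "environment": [20, 60, 20, 0, 0],
}))  # -> [41.0, 41.0, 18.0, 0.0, 0.0]
```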

  19. Features of the REF consultation
  Timetable
  • 1 August 2013 – Start of period for income and impacts
  • 1 January 2014 – Start of period for outputs
  • 17 March 2017 – Consultation deadline
  • Mid-2017 – Publish initial decisions on the next REF
  • Mid-2017 – Appoint panel chairs
  • 2018 – Publish guidance on submissions and panel criteria
  • 2019 – Invite HEIs to make submissions
  • 31 July 2020 – End of assessment period (for research impacts, the research environment and related data)
  • November 2020 – Closing date for submissions
  • 31 December 2020 – End of publication period for research outputs and outputs underpinning impact case studies
  • 2021 – Assessment year
  • December 2021 – Publication of outcomes
  • Spring 2022 – Publication of submissions and reports

  20. Breakout discussions
  Follow us on Twitter at #REF2021

  21. Next steps (closing plenary)
  Follow us on Twitter at #REF2021

  22. Next steps
  • 14-week consultation
  • Deadline: midday on Friday 17 March 2017
  • Analysis of responses in spring/summer 2017
  • Initial decisions on the next REF to be published in summer 2017
  • Advisory panels:
    • Equality and Diversity Advisory Panel (EDAP)
    • Interdisciplinary Advisory Panel (IDAP)
    • Forum for Responsible Metrics
  • Pilot activity through 2017

  23. REF2021 Timetable
  REF 2014
  • Pilot work (2010): Impact pilot; Recruit main panels
  • Criteria phase (2011-2012): Guidance on submissions (July 2011); Panel criteria (January 2012)
  • Submissions phase (2012-13): REF submissions system launched; Submissions deadline: 29 November 2013
  • Assessment phase (2014): Panel assessment; Outcomes published: December 2014
  REF 2021
  • Pilot work (2017): Institutional-level pilot; Recruit main panels
  • Criteria phase (2018-2019): Develop and publish guidance on submissions and panel criteria
  • Submissions phase (2019-20): Launch REF submissions system; Submissions deadline
  • Assessment phase (2021): Panel assessment; Outcomes published: December 2021
