
FET Open: main features and evaluation process (Salvatore SPINELLO)



  1. FET Open: main features and evaluation process. Salvatore SPINELLO, Research Programme Officer, Research Executive Agency. ECML-PKDD in Skopje, Sept. 18th 2017

  2. Content • FET-Open in general • Gatekeepers • Evaluation process • Some statistics • DEDALE: a very good example • FET Innovation Launchpad

  3. FET vs H2020. [Chart: breakdown of the H2020 budget of €74.8 billion. *OTHER includes: Spreading excellence & widening participation; Science with and for society; JRC; EIT.]

  4. FET: Novel ideas for radically new technologies. Visionary thinking … but a very concrete mission: €2.6 billion to initiate radically new lines of technologies:
  • collaborative research
  • extend Europe's capacity for advanced and paradigm-changing innovation
  • foster scientific collaboration across disciplines on radically new, high-risk ideas

  5. The power of FET: complementary schemes. [Diagram of the three schemes:]
  • FET-Open (open, light and agile): early ideas; independent research projects; exploring novel ideas.
  • FET Proactive (roadmap-based research): exploration and incubation; common research agenda; developing topics & communities.
  • FET Flagships (roadmap-based research): large-scale partnering initiatives; critical mass, making a case; addressing grand challenges.

  6. FET-Open RIA: supporting early stages of research to establish a new technological possibility
  • Collaborative projects, up to €3 million funding (indicative)
  • Single-step submission, '1+15' pages
  • Early stages of R&I on any new technological possibility
  • Proposals evaluated and ranked in one single panel
  • Scope defined by the FET gatekeepers

  7. FET gatekeepers: long-term vision, S&T breakthrough, novelty, foundational, high-risk, interdisciplinary, Future and Emerging Technologies.

  8. FET gatekeepers: Long-term vision. A new, original vision of technology-enabled possibilities going far beyond the state of the art.

  9. FET gatekeepers: Interdisciplinary. The proposed collaborations must go beyond current mainstream collaboration configurations in joint S&T research, and must aim to advance different scientific and technological disciplines.

  10. FET gatekeepers: S&T breakthrough. Scientifically ambitious and technologically concrete breakthroughs plausibly attainable within the lifetime of the project.

  11. FET gatekeepers: Novelty. New ideas and concepts, rather than the application or incremental refinement of existing ones.

  12. FET gatekeepers: High-risk. Balancing high risk against being utterly unrealistic; high-risk is not a synonym for not-doable.

  13. FET gatekeepers: Foundational. The breakthroughs can establish a basis for a new line of technology not currently anticipated.

  14. Evaluation process (overview). [Flow diagram, handled by the Research Executive Agency within 5 months: proposal submission, eligibility check, creation of a pool of experts, CoI check, expert assignment, remote evaluations, quality check, cross-reading, panel review, ethics screening/assessment, feedback to applicant.]

  15. Pool of (excellent) experts. A few months before the call deadline:
  • We identify gaps in the scientific disciplines covered by the previous group of expert evaluators
  • We identify high-quality experts to fill those gaps (EMI, publication databases, h-index, etc.)
  • We contact these new experts to check their availability
  https://ec.europa.eu/research/participants/portal/desktop/en/experts/index.html

  16. Experts
  • Independence: they evaluate in a personal capacity. They represent neither their employer nor their country!
  • Impartiality: they must treat all proposals equally and evaluate them impartially on their merits, irrespective of their origin or the identity of the applicants.
  • Objectivity: they evaluate each proposal as submitted, meaning on its own merit, not on its potential if certain changes were to be made.
  • Accuracy: they make their judgment against the official evaluation criteria and the call or topic the proposal addresses, and nothing else.
  • Consistency: they apply the same standard of judgment to all proposals.

  17. Evaluation process: expert assignment. Four independent remote evaluators (per proposal) are selected from the pool.

  18. Allocation of proposals to experts:
  • Quantify and categorize semantic similarities between linguistic items: build a semantic model based on their distributional properties in large samples of documents.
  • Select characteristic keywords from the submitted proposals and from the experts' 'fingerprint' documents and publications.
  • Compute the distance between proposals and experts: similarities between the proposals' needs and the experts' skills.
  • Global allocation between all experts and all proposals: coverage optimization formulated as a constrained integer programming problem.
  • Officials, helped by the Vice-Chairs, validate/adjust the pre-assignments given by the system.
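  As a rough illustration of the pipeline above, here is a minimal sketch of keyword-based matching followed by a capacity-constrained allocation. It is not the Agency's actual system: TF-IDF with cosine similarity stands in for the semantic model, a greedy loop stands in for the constrained integer program, and all texts, capacities and the number of evaluators per proposal are toy assumptions.

```python
# Illustrative sketch only: TF-IDF similarity as a stand-in for the semantic model,
# greedy capacity-constrained assignment as a stand-in for the integer program.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

proposals = {
    "P1": "graph neural networks for molecular property prediction",
    "P2": "quantum sensing with nitrogen-vacancy centres in diamond",
}
experts = {
    "E1": "machine learning on graphs, cheminformatics, molecules",
    "E2": "quantum optics, solid-state spin qubits, sensing",
    "E3": "statistical learning theory, optimisation, deep networks",
}

EVALUATORS_PER_PROPOSAL = 2  # the real call selects 4; reduced for the toy data
EXPERT_CAPACITY = 2          # assumed maximum number of proposals per expert

# One vector space over all texts: proposal keywords and expert 'fingerprints'.
vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform(list(proposals.values()) + list(experts.values()))
prop_vecs = matrix[: len(proposals)]
exp_vecs = matrix[len(proposals):]

# Similarity between each proposal's needs and each expert's skills.
sim = cosine_similarity(prop_vecs, exp_vecs)

# Greedy allocation under capacity constraints, highest similarities first.
load = {e: 0 for e in experts}
assignment = {p: [] for p in proposals}
pairs = sorted(
    ((sim[i, j], p, e) for i, p in enumerate(proposals) for j, e in enumerate(experts)),
    reverse=True,
)
for score, p, e in pairs:
    if len(assignment[p]) < EVALUATORS_PER_PROPOSAL and load[e] < EXPERT_CAPACITY:
        assignment[p].append(e)
        load[e] += 1

print(assignment)  # pre-assignments, to be validated/adjusted by officials and vice-chairs
```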

  19. Evaluation process: remote evaluations. Each expert prepares his/her Individual Evaluation Report (IER).

  20. Evaluation criteria (RIA)
  • Excellence (threshold: 4/5, weight: 60%): clarity and novelty of the long-term vision, and ambition and concreteness of the targeted breakthrough towards that vision; novelty, non-incrementality and plausibility of the proposed research for achieving the targeted breakthrough, and its foundational character; appropriateness of the research methodology and its suitability to address high scientific and technological risks; range and added value from interdisciplinarity, including measures for exchange, cross-fertilisation and synergy.
  • Impact (threshold: 3.5/5, weight: 20%): importance of the new technological outcome with regard to its transformational impact on technology and/or society; impact on future European scientific and industrial leadership, notably from the involvement of new and high-potential actors; quality of methods and measures for achieving impact beyond the research world and for establishing European leadership, as perceived by industry and society.
  • Implementation (threshold: 3/5, weight: 20%): soundness of the workplan and clarity of intermediate targets; relevance of expertise in the consortium; appropriate allocation and justification of resources (person-months, equipment).
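  To make the thresholds and weights concrete, a minimal sketch of how they could combine into a pass/fail check and a weighted total on the 0-5 scale; the function and the example scores are illustrative, and the official aggregation and ranking procedure may differ.

```python
# Thresholds and weights for the FET-Open RIA criteria, as listed on the slide.
CRITERIA = {
    "excellence":     {"threshold": 4.0, "weight": 0.60},
    "impact":         {"threshold": 3.5, "weight": 0.20},
    "implementation": {"threshold": 3.0, "weight": 0.20},
}

def weighted_total(scores: dict[str, float]) -> tuple[bool, float]:
    """Return (passes_all_thresholds, weighted total on a 0-5 scale)."""
    passes = all(scores[c] >= CRITERIA[c]["threshold"] for c in CRITERIA)
    total = sum(scores[c] * CRITERIA[c]["weight"] for c in CRITERIA)
    return passes, total

# Example: 4.5 / 4.0 / 3.5 passes all thresholds with a weighted total of
# 4.5*0.6 + 4.0*0.2 + 3.5*0.2 = 4.2.
print(weighted_total({"excellence": 4.5, "impact": 4.0, "implementation": 3.5}))
```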

  21. Evaluation process: quality check. Quality check of the IERs, possibly with several iterations if necessary, to ensure full compliance with the evaluation criteria/sub-criteria.

  22. Evaluation process: cross-reading.
  • Collation of the 4 IERs (no consensus); 3 median scores calculated on the 4+4 single scores per criterion.
  • Highlight and analyse 'diverging' opinions.
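  A small sketch of this collation step, assuming a plain median is taken over the collected per-criterion scores and that wide spreads are flagged as 'diverging'; the scores shown and the 1.5-point spread threshold are illustrative assumptions, not part of the official procedure.

```python
from statistics import median

# Toy data: individual scores collected per criterion (fewer than the slide's 4+4).
scores = {
    "excellence":     [4.5, 4.0, 4.5, 3.5],
    "impact":         [4.0, 3.5, 4.5, 3.0],
    "implementation": [4.0, 4.0, 3.5, 4.5],
}

# One median per criterion, i.e. the "3 median scores" mentioned on the slide.
medians = {criterion: median(values) for criterion, values in scores.items()}
print(medians)  # {'excellence': 4.25, 'impact': 3.75, 'implementation': 4.0}

# Flag 'diverging' opinions: criteria where evaluators differ by a wide margin
# (the 1.5-point spread used here is an assumed cut-off).
diverging = {c for c, v in scores.items() if max(v) - min(v) >= 1.5}
print(diverging)  # {'impact'}
```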

  23. Evaluation process: panel review (central meeting of cross-readers).
  • Detailed discussion in clusters of all 'highly scored' proposals, with special attention to 'diverging' opinions.
  • Final score decided by consensus or, if necessary, by vote.
  • Final objective: the ranking list.
