Efficiency in Health Research – Time for a Haircut from the Barber-Surgeon


  1. Efficiency in Health Research – Time for a Haircut from the Barber-Surgeon. Joel Ray

  2. CSIM Annual Meeting 2019. Conflict Disclosures: “None”

  3. [Image slide]

  4. [Image slide]

  5. What DS said: • “The Canadian Institutes of Health Research (which replaces the Medical Research Council and certain other agencies) is charged with the task of reorganizing health research to better serve the health of Canadians.” • “I suggest that it will fail to do so unless it dramatically increases support for randomised clinical trials (RCTs) and those who design and conduct them.”

  6. … sort of, again … but we all must go on trial, and some are going to get wet

  7. [Image slide]

  8. • CIHR was created in June 2000 by the Canadian Institutes of Health Research Act • Mandate: “to excel, according to internationally accepted standards of scientific excellence, in the creation of new knowledge and its translation into improved health for Canadians, more effective health services and products and a strengthened Canadian health care system.”

  9. [Chart: CIHR planned spending on Funding Health Research and Training – $1.14 billion, $1.15 billion, $1.05 billion]

  10. CIHR's core values • Excellence • Scientific Integrity and Ethics • Collaboration • Innovation • Public Interest – “The public interest is of paramount importance in the creation and use of health knowledge through all research and related activities supported by CIHR.”

  11. CMAJ 1995;152

  12. N Engl J Med 2002;346:285-287

  13. [Image slide]

  14. https://blogs.scientificamerican.com/observations/are-we-measuring-research-success-wrong/

  15. Joshua M. Pearce says … • Academic researchers are, for the most part, competitive. • These intellectual gladiators like to succeed, but more than that, they like to win. • Historically, this “winning” was determined by solving problems no one else had ever solved before, thereby driving a particular scientific discipline forward.

  16. • Recently, however, many universities have been overrun by administrators without sufficient academic qualifications to obtain tenure in their own disciplines. • These administrators needed some relatively simple way to determine which academic researchers were winning. The metric that has gained traction among such administrators is “research expenditures.”

  17. [Diagram: the researcher vs. the research institute]

  18. • As a metric, “research expenditures” enables administrators to compare individual faculty members on what appears to be a level playing field. • It also boils down the research efforts of an entire university to a single number to be used for simpleminded ranking.

  19. • The more grants you win, the more time you have to spend administering the grant: managing budgets, writing reports and meeting with grant administrators. • This reduces the time and effort you can put into research. • What if the collective effect of focusing on research expenditures is actually slowing science down?

  20. There exists a real and undocumented conflict of interest between a research institute and the researcher who resides therein

  21. [Image slide: $]

  22. $ indirect costs • Each research institution paid by CIHR receives a ~17% “indirect costs” (ICP) reimbursement (even though the real costs are actually closer to 40%). • The ICP provides funding to universities, colleges and research hospitals (including scientists’ salaries) to help cover a portion of the indirect costs associated with the research funded by the federal granting councils. Tenth-year Evaluation of the Indirect Costs Program. NSERC-SSHRC Evaluation Division. June 23, 2014

  23. University of Toronto

  24. $ indirect costs • Hence, the institution has a potential conflict of interest. • The more operating funds their researchers get → the more ICP the institution gets. • Thus, the incentive is for their researchers to bring in more tri-council grant money (in addition to the many positive aspects of having successful researchers, including institutional prestige).

  25. • But CIHR may not see this as a necessary good, • as it may promote excesses • and betray the public trust. • CIHR is a FUNDING agency, so its obligation is to ensure that funds are spent in a productive manner, for health promotion.

  26. So other metrics have been proposed

  27. The “Hirsch score” • In 2009, Dr. Greg Hirsch proposed a “deliverable scoring metric” • to keep track of, and give value to, research being done in the Dept. of Surgery at Dalhousie. • Annually, all deliverables are accounted for and compiled.

  28. [Image slide]

  29. [Image slide]

  30. [Image slide]

  31. [Image slide]

  32. Research output score (ROP) • Sum of grant points (g), publication points (p) and PhD supervision points (s): ROP = g + p + s

  33. For grant points

  34. g = [(number of PI or co-PI grants over “Z” years) × (some sort of weighting factor) × (total dollar value)] + [(number of co-applicant grants over “Z” years) × (some sort of weighting factor) × (total dollar value)] (both bracketed terms marked “My addition”)
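A minimal sketch of the grant-points term as reconstructed above. The slide leaves both weighting factors and the “Z”-year window unspecified, so the function name, argument names and the example weights (1.0 for PI/co-PI, 0.5 for co-applicant) are illustrative assumptions, not the deck’s definitions:

```python
def grant_points(n_pi_copi, pi_dollars, n_coapplicant, coapp_dollars,
                 pi_weight=1.0, coapp_weight=0.5):
    """Grant points g (slide 34):
    g = (# PI/co-PI grants over Z years) x weight x (their total dollar value)
      + (# co-applicant grants over Z years) x weight x (their total dollar value)
    The two weighting factors are unspecified on the slide; 1.0 and 0.5
    here are illustrative assumptions only."""
    return (n_pi_copi * pi_weight * pi_dollars
            + n_coapplicant * coapp_weight * coapp_dollars)
```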

  35. For supervision points • Consider each PhD student supervision as 1 point • Consider each Master’s student supervision as 0.5 points • Consider each PhD or MSc thesis committee non-supervisor role as 0.2 points • Consider supervision of a clinical trainee (med student, resident or fellow) as 0.2 points • All of the above must have a formal protocol of research written up. • Sum up over “X” years
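The point values below come straight from the slide; only the function and argument names are mine, as a minimal sketch:

```python
def supervision_points(n_phd, n_msc, n_committee, n_clinical):
    """Supervision points s (slide 35), summed over "X" years.
    Each counted trainee must have a formal written research protocol."""
    return (1.0 * n_phd          # each PhD student supervision: 1 point
            + 0.5 * n_msc        # each Master's student supervision: 0.5 points
            + 0.2 * n_committee  # non-supervisor thesis committee role: 0.2 points
            + 0.2 * n_clinical)  # med student / resident / fellow: 0.2 points
```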

  36. For publication points

  37. p = ( … ) × (journal impact factor)
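The first factor of the slide’s formula did not survive extraction, so the sketch below assumes each publication simply contributes its journal impact factor once; that per-publication factor of 1 is a guess, not the slide’s definition:

```python
def publication_points(journal_impact_factors):
    """Publication points p (slide 37): each publication contributes
    (missing first factor) x (journal impact factor). The missing
    factor is assumed to be 1 per publication -- an illustrative guess."""
    return sum(1.0 * jif for jif in journal_impact_factors)
```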

  38. • There are critics of the journal impact factor, etc.

  39. https://arxiv.org/pdf/1507.02099.pdf

  40. PLOS ONE | DOI:10.1371/journal.pone.0173152 March 9, 2017

  41. After 1 year, 5 years and 10 years of a scientist’s academic appointment … • At the level of one’s own research institute, ROPs can be compared. • Annually, each scientist submits a standardized ROP = g + p + s. • An ROP percentile is then created for each person, standardized to the number of years since their first academic appointment (accounting for leaves of absence).
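A minimal sketch of the percentile step, assuming “standardized to the number of years since first appointment” means ranking each scientist only against peers with the same number of active years; that cohort definition, and all names below, are my assumptions:

```python
def rop_percentiles(rops_by_years):
    """rops_by_years maps years-since-first-appointment (net of any
    leave of absence) to a list of (scientist, ROP) pairs.
    Returns {scientist: ROP percentile within their own cohort}."""
    percentiles = {}
    for cohort in rops_by_years.values():
        ranked = sorted(cohort, key=lambda pair: pair[1])  # low ROP first
        n = len(ranked)
        for rank, (scientist, _rop) in enumerate(ranked):
            percentiles[scientist] = 100.0 * rank / max(n - 1, 1)
    return percentiles
```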

  42. But what about how well they spend their grant money? • CIHR must respect the public trust • i.e., taxpayers fund researchers’ research • so these monies must be well spent • as the pool of money is clearly limited • So, productive people who get things done should be well funded • Unproductive people should not be funded (or funded less)

  43. Efficiency in labor

  44. Flip this around: how much ROP per grant $?

  45. I propose a new additional (dimensional) metric for research output: “Productivity of Research Output (PROP)” = Output / Input = (ROP / R&D expenditures) × 1000 = ((g + p + s) / research funding) × 1000
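A minimal sketch of the PROP calculation exactly as defined on the slide (only the function and argument names are mine):

```python
def prop(rop, research_funding):
    """PROP (slide 45): (ROP / research funding in dollars) x 1000."""
    return rop / research_funding * 1000
```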

  46. PROP example 1 • Researcher A: ROP = 20, grants = $100,000 → PROP = (20 / 100,000) × 1000 = 0.2 • Researcher B: ROP = 10, grants = $100,000 → PROP = (10 / 100,000) × 1000 = 0.1 • While grants (g) are a part of ROP, grants cancel out in this scenario

  47. PROP example 2 • Researcher A: ROP = 20, grants = $1,000,000 → PROP = (20 / 1,000,000) × 1000 = 0.02 • Researcher B: ROP = 20, grants = $100,000 → PROP = (20 / 100,000) × 1000 = 0.2 • Grants (g) are a part of ROP, and grants do not cancel out in this scenario

  48. PROP example 3 • Researcher A: ROP = 100, grants = $1,000,000 → PROP = (100 / 1,000,000) × 1000 = 0.1 • Researcher B: ROP = 10, grants = $100,000 → PROP = (10 / 100,000) × 1000 = 0.1 • Grants (g) are a part of ROP, and grants do not cancel out in this scenario
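Using the prop() sketch above, the three slide examples reproduce directly:

```python
print(prop(20, 100_000), prop(10, 100_000))     # example 1: 0.2 vs 0.1 (equal funding, so ROP decides)
print(prop(20, 1_000_000), prop(20, 100_000))   # example 2: 0.02 vs 0.2 (equal ROP, 10x funding gap)
print(prop(100, 1_000_000), prop(10, 100_000))  # example 3: 0.1 vs 0.1 (10x ROP at 10x funding ties)
```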

  49. PROP • Not only applied to an individual researcher • Can be applied to a research institute, where the institutional mean PROP, X̄ = (Σᵢ PROPᵢ) / n, where i indexes each of the institute’s n appointed scientists
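A minimal sketch of the institutional average (names are mine):

```python
def institutional_mean_prop(props):
    """Mean PROP across an institute's n appointed scientists (slide 49)."""
    return sum(props) / len(props)
```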

  50. Exceptions to the rule are necessary. “Someone has painted an amusing picture of Newton ordered by the director of a modern organization to make a progress report on his theory of gravitation.” L.A. Rogers, “What Constitutes Efficiency in Research?” Address of the President at the Twenty-fourth Annual Meeting of the Society of American Bacteriologists, Detroit, Michigan, December 1922.

  51. • Could allow individuals whose work does not fall under these metrics to apply for an exemption • e.g., those solely involved in policy work

  52. Research output score (ROP) • Sum of grant points (g), policy points (p) and PhD supervision points (s): ROP = g + p + s

  53. CIHR's core values • Excellence • Scientific Integrity and Ethics • Collaboration • Innovation • Public Interest – The public interest is of paramount importance in the creation and use of health knowledge through all research and related activities supported by CIHR.

  54. What should CIHR do with researchers who have a low PROP? • Those < 10th percentile would be offered strategies (e.g., a course to improve their efficiency). • If they remain < 10th percentile after some amount of time → exclude them from applying as a PI or co-PI for tri-council funding, on the likely assumption that they are unable to meet productivity requirements. • This is objective. • This might save CIHR both time (for reviews and other administration) and money.
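A minimal sketch of the slide’s two-stage rule, reusing the percentile idea above; the 10th-percentile threshold is from the slide, but the remediation flag and all names are my paraphrase:

```python
def review_researcher(prop_percentile, remediation_completed):
    """Slide 54: <10th percentile -> offer efficiency strategies;
    still <10th percentile after remediation -> exclude from applying
    as PI or co-PI for tri-council funding."""
    if prop_percentile >= 10:
        return "eligible"
    if not remediation_completed:
        return "offer strategies (e.g., efficiency course)"
    return "exclude as PI/co-PI applicant"
```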
