Rethinking Evidence Synthesis
Prof Enrico Coiera
Director, Centre for Health Informatics
Australian Institute of Health Innovation
Macquarie University, Sydney, Australia
Variation in care is high
• The CareTrack study found that only 57% of Australians receive care in line with level 1 evidence or consensus guidelines (Med J Aust 2012; 197(2): 100-105).
• Causes of practice variation include:
• Patient-specific needs, e.g. co-morbidity
• Patient preferences
• Clinician preferences
• Working with out-of-date evidence
Evidence synthesis is slow
• In Australia we don’t always deliver the care that guidelines and experts agree on as appropriate. (PMID 22794056)
• Systematic reviews can take years to complete and are extremely resource-intensive, so many are out of date, some as soon as they are published. (PMID 17638714)
• Systematic reviews could be updated as soon as new study results are available (this means we need to do the right trials). (PMID 20644625)
Clinical evidence is often biased
• Due to biases in the design, conduct, reporting, and synthesis of clinical research, about 85% of it is wasted. (PMID 24411643)
• Trials that are funded by industry are less likely to be published within 2 years, and when they are, they are more likely to have favourable results. (PMID 20679560)
• When trials are published, some outcomes are incompletely reported or not reported at all; safety outcomes are affected more than efficacy outcomes. (PMID 23861749)
• When reviewers synthesise the results from many clinical studies, those with financial conflicts of interest are more likely to report favourably. (PMID 25285542)
RCTs and guidelines have limitations
• They do not represent real-world populations:
• Co-morbidities are excluded
• Trials may be highly geographically localized, introducing biases
• They are often too small to detect small effect sizes and too short to detect long-term effects.
• Patients have their own preferences once benefits and harms are explained.
Panel
• Dr Guy Tsafnat, “The automation of evidence summarisation”. Leads the Computable Evidence Lab, which is dedicated to the automation and optimisation of evidence-based medicine.
• Dr Julian Elliott, “Combining human effort and machines”. Head of Clinical Research at Alfred Hospital and Monash University, and Senior Researcher at the Australasian Cochrane Centre.
• Dr Adam Dunn, “When biases in evidence synthesis lead to harm or waste”. Leads the Computational Epidemiology Lab, monitoring biases in the design, reporting, and synthesis of clinical trials.
• Dr Blanca Gallego-Luxan, “Learning from ‘patients like mine’”. Leads the Health Analytics Lab, designing, analysing and developing models derived from complex empirical data.
MQ | AIHI | CHI
Systematic Reviews: a robust model for evidence-based medicine
* Tsafnat, Glasziou, Choong, et al., Syst Rev 3:74, 2014
AIHI | CHI | CEL
The Manual Process
* Tsafnat, Glasziou, Dunn, Coiera, The BMJ, 346:f139, 2013
Preparation → Retrieval → Appraisal → Synthesis → Meta-Analysis → Write-up
Automation
* Tsafnat, Glasziou, Dunn, Coiera, The BMJ, 346:f139, 2013
Preparation → Retrieval → Appraisal → Synthesis → Meta-Analysis → Write-up
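The Meta-Analysis stage of this pipeline is, at its core, inverse-variance pooling of per-study effect estimates. Below is a minimal sketch of a fixed-effect meta-analysis; the three log odds ratios and their variances are hypothetical, not taken from any trial cited here.

```python
import math

def fixed_effect_meta(estimates, variances):
    """Inverse-variance fixed-effect pooling of per-study effect estimates."""
    weights = [1.0 / v for v in variances]          # weight = 1 / variance
    total_w = sum(weights)
    pooled = sum(w * e for w, e in zip(weights, estimates)) / total_w
    se = math.sqrt(1.0 / total_w)                   # standard error of pooled effect
    ci = (pooled - 1.96 * se, pooled + 1.96 * se)   # 95% confidence interval
    return pooled, se, ci

# Three hypothetical trials reporting log odds ratios and their variances
log_ors = [-0.4, -0.2, -0.5]
variances = [0.05, 0.08, 0.10]
pooled, se, ci = fixed_effect_meta(log_ors, variances)
print(f"pooled log OR = {pooled:.3f}, 95% CI ({ci[0]:.3f}, {ci[1]:.3f})")
```

A random-effects model (e.g. DerSimonian–Laird) would additionally estimate between-study heterogeneity before weighting; the fixed-effect form above is the simplest case.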
Clinical Queries
Guidelines
Search Automation: saved strategies
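One way to picture "saved strategies" is as stored query strings that can be re-run against PubMed on a schedule. The sketch below builds an E-utilities `esearch` URL for a hypothetical saved strategy; the strategy name, query text, and date cut-off are illustrative assumptions, and nothing is actually fetched.

```python
from urllib.parse import urlencode

# NCBI's real E-utilities esearch endpoint; the code only builds URLs
ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

# A "saved strategy" is just a named, stored PubMed query (hypothetical example)
saved_strategies = {
    "btx_upper_limb": ('("botulinum toxin" OR BTX-A) AND "cerebral palsy" '
                       'AND randomized controlled trial[pt]'),
}

def build_search_url(name, mindate=None):
    """Turn a saved strategy into an esearch URL, optionally date-restricted."""
    params = {"db": "pubmed", "term": saved_strategies[name], "retmax": 100}
    if mindate:  # incremental update: only records added since the last run
        params.update({"datetype": "edat", "mindate": mindate, "maxdate": "3000"})
    return ESEARCH + "?" + urlencode(params)

url = build_search_url("btx_upper_limb", mindate="2014/01/01")
```

Re-running a strategy with a `mindate` of the last run is what turns a one-off search into an automated surveillance feed for a review.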
Citation Networks
* Robinson, Dunn, Tsafnat, Glasziou, Journal of Clinical Epidemiology 67(7), 2014
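A citation network can be represented simply as directed links between trial reports. The sketch below uses hypothetical article identifiers to rank articles by how often they are cited, one signal such networks expose for finding related trials; it is an illustration, not the method of the cited paper.

```python
from collections import defaultdict

# Hypothetical citation links among trial reports: (citing ID, cited ID)
citations = [
    ("A", "B"), ("A", "C"), ("B", "C"), ("D", "C"), ("D", "B"),
]

cited_by = defaultdict(set)   # cited article -> set of citing articles
for citing, cited in citations:
    cited_by[cited].add(citing)

# In-degree ranking: heavily cited reports are candidate "index" studies,
# and following their citers can surface trials a keyword search missed
ranking = sorted(cited_by, key=lambda a: len(cited_by[a]), reverse=True)
```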
Information Extraction from Trials
* Kiritchenko et al., BMC Med Inform Decis Mak, 10, 2010
Example abstract (from Kawamura et al., Dev Med Child Neurol 49, 2007): This study compared the effects of low and high doses of botulinum toxin A (BTX-A) to improve upper extremity function. Thirty-nine children (22 males, 17 females) with a mean age of 6 years 2 months (SD 2y 9mo) diagnosed with spastic hemiplegia or triplegia were enrolled into this double-blind, randomized controlled trial. The high-dose group received BTX-A in the following doses: biceps 2U/kg, brachioradialis 1.5U/kg, common flexor origin 3U/kg, pronator teres 1.5 U/kg, and adductor/opponens pollicis 0.6U/kg to a maximum of 20U. The low-dose group received 50% of this dosage. Outcomes were measured at baseline and at 1 and 3 months after injection, and results were analyzed with a repeated-measures analysis of variance.
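As a toy illustration of pulling structured data out of abstracts like the one above, the sketch below extracts the per-muscle BTX-A doses with a regular expression. Real systems such as the one described by Kiritchenko et al. use trained classifiers over full trial reports; this regex, and its assumption that doses appear as "muscle N U/kg", are purely illustrative.

```python
import re

# Dose sentence from the Kawamura et al. abstract quoted above
abstract = (
    "The high-dose group received BTX-A in the following doses: biceps 2U/kg, "
    "brachioradialis 1.5U/kg, common flexor origin 3U/kg, pronator teres 1.5 U/kg, "
    "and adductor/opponens pollicis 0.6U/kg to a maximum of 20U."
)

# Pattern: a lowercase muscle name followed by a dose like "1.5U/kg" or "1.5 U/kg"
dose_pattern = re.compile(r"([a-z][a-z /]+?)\s+(\d+(?:\.\d+)?)\s*U/kg")

doses = {muscle.strip().removeprefix("and "): float(amount)
         for muscle, amount in dose_pattern.findall(abstract)}
```

Even this crude pattern recovers all five muscle–dose pairs from the sentence, which is why dosage fields are among the easier targets for trial information extraction.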
Julian’s video goes here
CENTRE FOR HEALTH INFORMATICS | AUSTRALIAN INSTITUTE OF HEALTH INNOVATION
The evidence-practice disconnect
Systematic reviews are fundamentally limited by the quality and transparency of the primary evidence on which they are based…
Registering clinical trials (2003). doi:10.1001/jama.290.4.516
Solutions: (a) improve the quality and transparency of the studies that can be included in reviews, or (b) create new forms of evidence synthesis that do not rely on the current ways that clinical studies are reported.
• Synthesis bias: when reviews include evidence selectively or when results and conclusions don’t match. Systematic reviews with conflicts of interest produced more favourable conclusions. doi:10.7326/m14-0933
• Publication bias: when clinical studies are never published, or published after a long delay. 66% of trials had published results. doi:10.7326/0003-4819-153-3-201008030-00006
• Reporting bias: when reports of clinical studies miss or misrepresent parts of what was measured. 40–62% of studies had ≥1 primary outcome changed, introduced, or omitted. doi:10.1371/journal.pone.0066844
• Design bias: when clinical studies are not designed to answer the right questions at the right times. Industry statin trials used more surrogate outcomes, fewer safety outcomes, and were faster. doi:10.1038/clpt.2011.279
• Sharing of patient-level data: the third movement in the push for completeness and transparency, with pressure on funders and companies, and the technologies they need. A new future for clinical research through data sharing (YODA Project). doi:10.1001/jama.2013.1299
• Linking trial design to practice: making (post-approval) clinical trials match practice to properly address safety and effectiveness, and the technologies they need. Right answers, wrong questions in clinical research. doi:10.1126/scitranslmed.3007649
• Bigger, better studies using EHRs: connecting research and practice to fix enrolment and make trials much more efficient, and the technologies they need. A new architecture for connecting clinical research to patients through EHRs. doi:10.1136/amiajnl-2014-002727
• Focuses on research into peer review and research integrity
• High visibility: permanent, unrestricted, free online access
• Highly respected editorial board
• Rapid and thorough peer review
Editors-in-Chief: Stephanie Harriman (UK), Maria Kowalczuk (UK), Iveta Simera (UK), Elizabeth Wager (UK)
www.biomedcentral.com | www.researchintegrityjournal.com