

  1. UTD HLTRI at TAC 2019: DDI Track Ramon Maldonado , Maxwell Weinzierl, & Sanda M. Harabagiu The University of Texas at Dallas Human Language Technology Research Institute http://www.hlt.utdallas.edu/~{ramon, max, sanda}

  2. Outline 1. Introduction 2. The Approach 1. Pipeline Overview 2. Preprocessing 3. Multi-Task Transformer 4. Postprocessing 3. Results 4. Conclusion

  3. Introduction
     Multi-task neural model for:
     • Task 1: entity identification
     • Task 2: relation identification
     • Task 3*: concept normalization
     • Task 4: normalized relation identification

  4. Introduction
     Problem
     • Sentence-level
     • Binary relation identification
     Our Approach
     • Multi-task learning
       – Sentence classification
       – Mention boundary detection
       – Relation extraction
       – PK effect classification
     • Pre-trained Transformer for a shared representation

  5. Outline 1. Introduction 2. The Approach 1. Pipeline Overview 2. Preprocessing 3. Multi-Task Transformer 4. Postprocessing 3. Results 4. Conclusion

  6. The Approach
     [Figure: FDA Label Drug-Drug Interaction Pipeline. Structured Product Labels (SPLs) are preprocessed (spaCy tokenization, BERT word-piece tokenization, and annotation propagation over mentions, relations, and pseudo-triggers), passed through the Multi-task Transformer Net for Identifying Drug-Drug Interactions (a shared representation feeding a Sentence Classifier, Mention Boundary labeler, Relation Extractor, and PKE Classifier), and postprocessed (mention filtering, continuation linking, unused mention/relation filtering, and normalization against UMLS, SNOMED-CT, and MED-RT) to produce Task 1 mentions, Task 2 relations, Task 3 normalized mentions, and Task 4 label interactions with PK effects.]

  7. Outline 1. Introduction 2. The Approach 1. Pipeline Overview 2. Preprocessing 3. Multi-Task Transformer 4. Postprocessing 3. Results 4. Conclusion

  8. Preprocessing
     • Binary relations
       – (Trigger, Precipitant, Effect) -> (Trigger, Precipitant) and (Trigger, Effect) (see the sketch below)
       – Pseudo-triggers for SIs in some PDIs
       – PK effects as attributes
     • Mention annotation propagation
       – Eases the learning problem
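A minimal sketch of the decomposition into binary pairs, assuming tuple-style annotations; decompose_interaction and the relation labels are illustrative names, not the submission's actual preprocessing code:

```python
def decompose_interaction(trigger, precipitant, effect=None):
    """Split a ternary (Trigger, Precipitant, Effect) annotation into binary pairs.

    Illustrative helper: effect is None for interactions that only specify
    a precipitant (e.g. when a pseudo-trigger is introduced).
    """
    pairs = [("TRIGGER-PRECIPITANT", trigger, precipitant)]
    if effect is not None:
        pairs.append(("TRIGGER-EFFECT", trigger, effect))
    return pairs

# Example: an interaction with an explicit effect mention
print(decompose_interaction("may increase", "ketoconazole", "plasma concentrations"))
# [('TRIGGER-PRECIPITANT', 'may increase', 'ketoconazole'),
#  ('TRIGGER-EFFECT', 'may increase', 'plasma concentrations')]
```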

  9. Preprocessing
     • Tokenization (see the sketch below)
       – spaCy
       – WordPiece using the BERT vocabulary
     • C-IOBES tagging
       – Continuation tags necessary for disjoint spans
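A sketch of the two-stage tokenization, assuming the spaCy and Hugging Face transformers packages; this mirrors the slide but is not the submission's code. C-IOBES tags (IOBES plus a C- prefix for the detached piece of a disjoint mention) would then be assigned to each word piece.

```python
import spacy
from transformers import BertTokenizer

nlp = spacy.blank("en")                                      # spaCy word-level tokenizer
bert = BertTokenizer.from_pretrained("bert-base-uncased")    # WordPiece with the BERT vocab

sentence = "Coadministration with ketoconazole increases plasma concentrations."
words = [tok.text for tok in nlp(sentence)]
wordpieces = [bert.tokenize(w) for w in words]

for word, pieces in zip(words, wordpieces):
    # Each spaCy token is split into one or more word pieces;
    # the exact splits depend on the BERT vocabulary.
    print(word, pieces)
```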

  10. Outline 1. Introduction 2. The Approach 1. Pipeline Overview 2. Preprocessing 3. Multi-Task Transformer 4. Postprocessing 3. Results 4. Conclusion

  11. Multi-Task Transformer
      Multi-Task Transformer network for Identifying Drug-Drug Interactions (MTTDDI)
      [Architecture diagram: a BERT Sentence Encoder over the word pieces t1..tn ([CLS] ... [SEP]) produces a shared representation feeding four heads: a Sentence Classifier (softmax) that flags sentences containing interactions, a Mention Boundary Labeler (CRF) that assigns mention type and boundary labels to every word in the sentence, a Relation Extractor (softmax over trigger, argument, and context embeddings) that produces relation labels for all mention pairs, and a PKE Classifier that assigns PK effect codes when a relation r is a PKI.]
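A highly simplified PyTorch-style sketch of four task heads sharing a BERT encoder, in the spirit of the MTTDDI diagram; the class name, layer sizes, the plain softmax in place of the figure's CRF, and the use of the pooled [CLS] vector as the context embedding are all assumptions made for brevity.

```python
import torch
import torch.nn as nn
from transformers import BertModel

class MTTDDISketch(nn.Module):
    """Simplified multi-task heads over a shared BERT encoder (illustrative only)."""

    def __init__(self, n_boundary_tags, n_relation_labels, n_pk_codes, hidden=768):
        super().__init__()
        self.encoder = BertModel.from_pretrained("bert-base-uncased")
        # Sentence classifier: does the sentence contain an interaction?
        self.sentence_head = nn.Linear(hidden, 2)
        # Mention boundary labeler: one C-IOBES tag per word piece
        # (the figure places a CRF here; a token-level classifier is used for brevity).
        self.boundary_head = nn.Linear(hidden, n_boundary_tags)
        # Relation extractor over trigger, argument, and context embeddings
        # (context is approximated by the pooled [CLS] vector in this sketch).
        self.relation_head = nn.Linear(3 * hidden, n_relation_labels)
        # PK effect classifier, used only when the predicted relation is pharmacokinetic
        self.pk_head = nn.Linear(3 * hidden, n_pk_codes)

    def forward(self, input_ids, attention_mask, trigger_idx, arg_idx):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        tokens, cls = out.last_hidden_state, out.pooler_output
        pair = torch.cat([tokens[:, trigger_idx], tokens[:, arg_idx], cls], dim=-1)
        return {
            "sentence": self.sentence_head(cls),
            "boundaries": self.boundary_head(tokens),
            "relation": self.relation_head(pair),
            "pk_effect": self.pk_head(pair),
        }
```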

  12. BERT Sentence Encoder
      [Same MTTDDI architecture diagram as slide 11, highlighting the BERT Sentence Encoder.]

  13. Sentence Classifier
      [Same MTTDDI architecture diagram as slide 11, highlighting the Sentence Classifier.]

  14. Mention Boundary Labeler
      [Same MTTDDI architecture diagram as slide 11, highlighting the Mention Boundary Labeler.]

  15. Relation Extractor
      [Same MTTDDI architecture diagram as slide 11, highlighting the Relation Extractor.]

  16. Pharmacokinetic Effect Classifier
      [Same MTTDDI architecture diagram as slide 11, highlighting the PKE Classifier.]

  17. Outline 1. Introduction 2. The Approach 1. Pipeline Overview 2. Preprocessing 3. Multi-Task Transformer 4. Postprocessing 3. Results 4. Conclusion

  18. Postprocessing
      • Filtering
        – Invalid boundary tag sequences
        – Repeated mentions
        – Mentions not involved in an interaction
      • C-spans linked to the closest mention
      • Reconstruct ternary interactions from binary relations through a shared trigger (see the sketch below)
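A sketch of rebuilding ternary interactions from the predicted binary relations by grouping them on a shared trigger; the relation labels and field names are assumptions carried over from the preprocessing sketch, not the actual postprocessing code.

```python
from collections import defaultdict

def reconstruct_ternary(binary_relations):
    """binary_relations: list of (label, trigger_span, argument_span) tuples."""
    by_trigger = defaultdict(lambda: {"precipitants": [], "effects": []})
    for label, trigger, argument in binary_relations:
        if label == "TRIGGER-PRECIPITANT":
            by_trigger[trigger]["precipitants"].append(argument)
        elif label == "TRIGGER-EFFECT":
            by_trigger[trigger]["effects"].append(argument)

    ternary = []
    for trigger, args in by_trigger.items():
        for precipitant in args["precipitants"]:
            # Interactions without an effect mention keep effect=None
            for effect in args["effects"] or [None]:
                ternary.append((trigger, precipitant, effect))
    return ternary
```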

  19. Postprocessing
      • Normalization
        – String matching
        – SNOMED-CT: specific interactions
        – MED-RT: drug classes
        – UNII: precipitants
        – Terminologies augmented with atoms from UMLS
      • Map precipitants first to MED-RT, then to UNII if no match was found (see the sketch below)
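A sketch of the precipitant normalization fallback (MED-RT first, then UNII), assuming the terminologies have been loaded into simple string-to-code dictionaries; the function name and the codes in the toy lexicons are placeholders.

```python
def normalize_precipitant(mention, medrt_lexicon, unii_lexicon):
    """Map a precipitant string to (source, code), trying MED-RT before UNII."""
    key = mention.lower().strip()
    if key in medrt_lexicon:            # drug classes
        return ("MED-RT", medrt_lexicon[key])
    if key in unii_lexicon:             # individual substances
        return ("UNII", unii_lexicon[key])
    return None                         # left unnormalized

# Toy lexicons with placeholder codes (the real lexicons are built from
# MED-RT and UNII, augmented with atoms from the UMLS)
medrt = {"cyp3a4 inhibitors": "MEDRT:0001"}
unii = {"ketoconazole": "UNII:0002"}
print(normalize_precipitant("Ketoconazole", medrt, unii))   # ('UNII', 'UNII:0002')
```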

  20. Postprocessing
      Task 4
      • Inferred from unique interactions between normalized mentions (see the sketch below)
      • PK effect codes from MTTDDI
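A minimal sketch of the Task 4 derivation: label-level interactions are the unique normalized interactions aggregated over all sentences of a label; the tuple layout is an assumption.

```python
def label_level_interactions(sentence_level):
    """sentence_level: iterable of (interaction_type, normalized_precipitant, effect_or_pk_code) tuples."""
    # Deduplicate across the whole label; PK effect codes come straight from MTTDDI.
    return sorted(set(sentence_level))
```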

  21. Outline 1. Introduction 2. The Approach 1. Pipeline Overview 2. Preprocessing 3. Multi-Task Transformer 4. Postprocessing 3. Results 4. Conclusion

  22. Results
      Evaluated MTTDDI against two alternate configurations:
      • UTDHLTRI Run3: no sentence filtering / targeted training
      • Run3 + Filtering: dedicated learners

      System             Task 1   Task 2   Task 3   Task 4
      Best Submission     65.38    49.03    62.39    17.56
      Median              48.97    37.13    45.53    17.56
      UTDHLTRI Run3       35.04    27.48    28.66    17.56
      Run3 + Filtering    56.03    42.29    45.73    24.07
      MTTDDI              54.39    41.34    44.08    25.20

      * Bold indicates the best score; italics indicate the best score among LDIIP systems.

  23. Outline 1. Introduction 2. The Approach 1. Pipeline Overview 2. Preprocessing 3. Multi-Task Transformer 4. Postprocessing 3. Results 4. Conclusion

  24. Questions
