Large-Scale Live Active Learning: Training Object Detectors with Crawled Data and Crowds

  1. Large-Scale Live Active Learning: Training Object Detectors with Crawled Data and Crowds. Sudheendra Vijayanarasimhan and Kristen Grauman, Department of Computer Science, University of Texas at Austin, Austin, Texas.

  2. Object Detection. Challenge: the best results require a large number of cleanly labeled training examples.

  3-10. Ways to Reduce Effort
  Active learning: minimize effort by focusing label requests on the most informative examples [Kapoor et al. ICCV 2007, Qi et al. CVPR 2008, Vijayanarasimhan et al. CVPR 2009, Joshi et al. CVPR 2009, Siddique et al. CVPR 2010].
  Crowd-sourced annotations: package annotation tasks to obtain labels from online human workers [von Ahn et al. CHI 2004, Russell et al. IJCV 2007, Sorokin et al. 2008, Welinder et al. ACVHL 2010, Deng et al. CVPR 2009].
  [Diagram: the active learning loop. Labeled data train the current category model; an active selection function scans the unlabeled data and picks the actively selected example(s); an "annotation request" goes to the annotator(s), whose answers are added back to the labeled data.]
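
To make the loop in the diagram concrete, here is a minimal sketch of pool-based uncertainty sampling with a linear SVM. The `oracle_label` callback is a hypothetical stand-in for the human annotator and the feature matrices are assumed to be precomputed; this illustrates the general loop, not the paper's implementation.

```python
# Minimal sketch of the pool-based active learning loop (uncertainty sampling
# with a linear SVM). `oracle_label` is a hypothetical stand-in for the human
# annotator; feature matrices are assumed to be precomputed.
import numpy as np
from sklearn.svm import LinearSVC

def uncertainty_sampling(X_labeled, y_labeled, X_pool, oracle_label,
                         n_rounds=10, batch_size=5):
    for _ in range(n_rounds):
        # retrain the current category model on everything labeled so far
        clf = LinearSVC(C=1.0).fit(X_labeled, y_labeled)
        # active selection: smallest distance to the hyperplane = most uncertain
        margins = np.abs(clf.decision_function(X_pool))
        query = np.argsort(margins)[:batch_size]
        # "annotation request": ask the annotator(s) to label the selected examples
        new_y = np.array([oracle_label(x) for x in X_pool[query]])
        X_labeled = np.vstack([X_labeled, X_pool[query]])
        y_labeled = np.concatenate([y_labeled, new_y])
        X_pool = np.delete(X_pool, query, axis=0)
    return LinearSVC(C=1.0).fit(X_labeled, y_labeled)
```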

  11-14. Problem
  Thus far, these techniques have only been tested in artificially controlled settings:
  - they use "sandbox" datasets, where the dataset's source and scope are fixed;
  - the computational cost of active selection and of retraining the model is generally ignored (linear or quadratic time);
  - crowd-sourced collection requires iterative fine-tuning.

  15-20. Goal
  Take active learning and crowd-sourced annotation collection out of the "sandbox":
  - break free from dataset-based learning;
  - collect information on the fly (no manual intervention);
  - handle large-scale data.
  Our Approach: Live Learning. A live active learning system that autonomously builds models for object detection (e.g., a "bicycle" category model).

  21-26. Our Approach: Live Learning
  [Pipeline diagram: the keyword "bicycle" retrieves unlabeled images; object windows are generated from them (unlabeled windows); the system selects images to annotate (actively selected examples); annotations are collected online, yielding labeled data; an object representation and classifier are trained on these, producing the category model.]
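
A rough sketch of the live-learning loop shown in this pipeline follows. `crawl_images`, `propose_windows`, `extract_features`, and `post_to_crowd` are hypothetical stand-ins for the web crawler, window generator, feature coder, and crowd-sourcing interface; none of these names come from the paper.

```python
# Rough sketch of the live-learning loop: crawl, propose windows, select
# uncertain windows, collect crowd annotations, retrain. All helper callables
# are hypothetical stand-ins, not the paper's code.
import numpy as np
from sklearn.svm import LinearSVC

def live_learning(keyword, seed_X, seed_y, crawl_images, propose_windows,
                  extract_features, post_to_crowd, n_iterations=5, k=100):
    X, y = seed_X, seed_y
    detector = LinearSVC().fit(X, y)
    for _ in range(n_iterations):
        images = crawl_images(keyword)                        # e.g. keyword image search
        windows = [w for im in images for w in propose_windows(im)]
        feats = np.array([extract_features(w) for w in windows])
        margins = np.abs(detector.decision_function(feats))   # uncertainty of each window
        query = np.argsort(margins)[:k]                       # most uncertain windows
        labels = post_to_crowd([windows[i] for i in query])   # online annotation collection
        X = np.vstack([X, feats[query]])
        y = np.concatenate([y, np.asarray(labels)])
        detector = LinearSVC().fit(X, y)                      # updated category model
    return detector
```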

  27-29. Main Contributions
  - Linear classification: a part-based linear detector built on non-linear feature coding.
  - Large-scale active selection: a sub-linear time hashing scheme for efficiently selecting uncertain examples [Jain, Vijayanarasimhan & Grauman, NIPS 2010].
  - Live learning results: active detection at an unprecedented scale and degree of autonomy, for the first time.
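
The second contribution replaces the linear scan over all unlabeled windows with a hashing scheme whose query is the SVM hyperplane itself. Below is a toy sketch of the two-bit hyperplane hash as I read Jain, Vijayanarasimhan & Grauman (NIPS 2010): points close to the hyperplane (small |w·x|, i.e. uncertain examples) are more likely to collide with the hyperplane query than points far from it. Multiple hash tables, bit-length choices, and the embedded (EH-Hash) variant are omitted.

```python
# Toy sketch of a two-bit hyperplane hash: database points x get bits
# [sign(u.x), sign(v.x)], the query hyperplane normal w gets [sign(u.w), sign(-v.w)],
# so collisions are most likely when x is nearly perpendicular to w (small |w.x|).
# Simplified illustration, not the paper's implementation.
import numpy as np

rng = np.random.default_rng(0)

def make_hash(dim, n_bits):
    # one independent pair of random directions (u, v) per two-bit hash
    return rng.standard_normal((n_bits, 2, dim))

def hash_point(H, x):
    # database point: bits [sign(u.x), sign(v.x)] for every (u, v) pair
    return tuple((H @ x > 0).astype(int).ravel())

def hash_query(H, w):
    # hyperplane query: bits [sign(u.w), sign(-v.w)] for every (u, v) pair
    u, v = H[:, 0, :], H[:, 1, :]
    bits = np.stack([u @ w > 0, -(v @ w) > 0], axis=1)
    return tuple(bits.astype(int).ravel())

# usage: bucket the unlabeled pool once, then probe with the current hyperplane
X = rng.standard_normal((10000, 128))           # unlabeled feature vectors
w = rng.standard_normal(128)                    # current SVM normal
H = make_hash(128, n_bits=4)
buckets = {}
for i, x in enumerate(X):
    buckets.setdefault(hash_point(H, x), []).append(i)
candidates = buckets.get(hash_query(H, w), [])  # likely low-|w.x| examples
```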

  30-31. Outline
  [Pipeline diagram from the previous slide, with the object representation and classifier stage highlighted: linear classification, i.e., fast/incremental training using a linear SVM.]
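
Since the outline points to fast/incremental training with a linear SVM, here is a minimal sketch using scikit-learn's SGDClassifier with hinge loss as an assumed stand-in for the paper's linear solver: each newly crowd-labeled batch updates the detector without retraining from scratch.

```python
# Incremental updates to a linear SVM objective (hinge loss) via SGD.
# SGDClassifier is an assumed stand-in for the paper's linear solver.
import numpy as np
from sklearn.linear_model import SGDClassifier

detector = SGDClassifier(loss="hinge", alpha=1e-4)  # linear SVM trained by SGD
classes = np.array([-1, 1])                         # background vs. object window

def incremental_update(X_batch, y_batch):
    # fold each newly crowd-labeled batch into the existing model
    detector.partial_fit(X_batch, y_batch, classes=classes)
```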
