COS429 FINAL PROJECT Object Detection on PASCAL VOC 2012 Yinda - PowerPoint PPT Presentation


  1. COS429 FINAL PROJECT Object Detection on PASCAL VOC 2012 Yinda Zhang @ CS 105, Dec 18, 2015

  2. WHAT TO DO Classification: Cat Detection: Cat + Bounding box

  3. CHALLENGES • Appearance • Viewpoint • Occlusion • Multiple objects

  4. MOST EXTREME CASE

  5. EVALUATE

  6. EVALUATE

  7. EVALUATE Intersection / Union > 0.5? (IoU criterion)
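The IoU test on this slide can be sketched as follows (an illustrative helper, not part of the course code; boxes are assumed to be (x1, y1, x2, y2) corner tuples):

```python
def iou(box_a, box_b):
    """Intersection over Union of two boxes given as (x1, y1, x2, y2)."""
    # Corners of the intersection rectangle
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    # Clamp to zero when the boxes do not overlap
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# Half-overlapping boxes: intersection 50, union 150 -> IoU = 1/3
# iou((0, 0, 10, 10), (5, 0, 15, 10))  # ~0.333, below the 0.5 threshold
```

A predicted box counts as a correct detection only if its IoU with a ground-truth box exceeds 0.5.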

  8. AVERAGE PRECISION Precision: % of detections that are correct; Recall: % of ground truth detected; AP: area under the precision-recall curve; mAP: average of AP over multiple classes [figure: precision-recall curve with AP as the area under it]
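The precision/recall/AP definitions above can be sketched as a short computation (illustrative only; `average_precision` and its arguments are hypothetical names, and this uses the plain area-under-the-curve form rather than the official VOC interpolated AP):

```python
def average_precision(scores, is_correct, num_gt):
    """AP as area under the precision-recall curve.

    scores: confidence of each detection;
    is_correct: whether each detection matches an unclaimed
                ground-truth box (IoU > 0.5);
    num_gt: total number of ground-truth objects.
    """
    # Rank detections by descending confidence
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    tp = fp = 0
    ap = 0.0
    prev_recall = 0.0
    for i in order:
        if is_correct[i]:
            tp += 1
        else:
            fp += 1
        precision = tp / (tp + fp)
        recall = tp / num_gt
        # Rectangle under the curve for this recall increment
        ap += precision * (recall - prev_recall)
        prev_recall = recall
    return ap

# Three detections, two ground truths:
# average_precision([0.9, 0.8, 0.7], [True, False, True], 2) -> 5/6 ~ 0.833
```

mAP is then simply the mean of the per-class AP values over the 20 VOC classes.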

  9. DATASET • 20 classes, 11530 images with 27450 objects labelled • Development toolkit

  10. BASELINE •Output the center box, always classify as “cat” •Run image classification, and randomly generate a box •Sliding window: a window slides over the image and classification is performed at each location (DPM) •Region proposal: generate candidate regions from the image, and perform classification on each.
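The sliding-window baseline can be sketched as a generator (illustrative; `sliding_windows` is a hypothetical helper, and a real detector would also vary window scale and run a classifier on each extracted patch):

```python
import numpy as np

def sliding_windows(image, win_h, win_w, stride):
    """Yield (y, x, patch) for every window position in the image.

    A classifier would be applied to each patch, and high-scoring
    positions kept as detections (typically after non-max suppression).
    """
    H, W = image.shape[:2]
    for y in range(0, H - win_h + 1, stride):
        for x in range(0, W - win_w + 1, stride):
            yield y, x, image[y:y + win_h, x:x + win_w]

# A 100x100 image with a 20x20 window and stride 20 gives a 5x5 grid
# of positions, i.e. 25 candidate windows.
```

Region-proposal baselines replace the exhaustive grid with a few hundred candidate regions per image, which is why they scale better to many classes and window shapes.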

  11. BASELINE • YOLO, http://arxiv.org/abs/1506.02640 • ResidualNet, http://arxiv.org/abs/1512.03385 • Faster RCNN, http://arxiv.org/abs/1506.01497 • Fast RCNN, http://arxiv.org/abs/1504.08083 • Inside-Outside Net, http://www.seanbell.ca/tmp/ion-bell2015.pdf • Exemplar SVM, http://www.cs.cmu.edu/~tmalisie/projects/iccv11/ • DPM, http://www.cs.berkeley.edu/~rbg/latent/ • RCNN, http://arxiv.org/abs/1311.2524

  12. TODO ✓ Implement a detection system •From scratch: your own idea or previous work •Improve upon released code of previous work •“script_train.m” •load data, perform training, and save model •“script_test.m” •load data and model, perform testing, and visualize result

  13. TODO ✓ Report •CVPR format: http://www.pamitc.org/cvpr16/author_guidelines.php •Group members, name + ID •Methods •Evaluation: APs, mAP, PR curve, successful/failed detection results •Discussion •Job Assignment: Who did what

  14. TODO ✓ Evaluate on the eval set; use the test set only for the extra bonus ✓ Proposal deadline: Dec 18 ✓ Project deadline: Jan 12 ✓ Name your submission •xj_yindaz_mingru_cos429fp.pdf •xj_yindaz_mingru_cos429fp.zip

  15. GRADING ✓ Implementation (40%) •The amount of working code •The data/result visualization/analysis •Existing code does not count ✓ Correctness (20%) •“script_train.m” and “script_test.m” are runnable •mAP > 20%

  16. GRADING ✓ Report Writing (10%) •Right format •All required contents ✓ Code Clarity (10%) •Code is clean, well-organized, easy to read ✓ Algorithm Novelty (10%) •Create your own idea •Improve upon your baseline

  17. GRADING ✓ Performance (10%) •Rank mAP on eval set from all groups ✓ Extra Bonus (a looooot of marks) •If your mAP is above 70% •Evaluate on testing set as a proof
