  1. Learning Loss for Active Learning Rymarczyk D., Zieliński B., Tabor J., Sadowski M., Titov M.

  2. Agenda 1. Active Learning introduction 2. Base methods in AL for Deep Learning 3. Learning Loss for Active Learning 4. Our ideas for Active Learning 5. Future plans 6. Bibliography

  3.–7. Active Learning [animated diagram]: a model trained on the labeled part of the dataset predicts labels for the unlabeled part; selected samples are sent to the oracle for true labels and moved to the labeled set, and the cycle repeats.

  8. Challenges in Active Learning 1. The criteria by which samples are chosen for the labelling process. 2. How many samples should be included in the labelling process? 3. Is the oracle infallible? 4. Multi-oracle scenarios. 5. Online learning. 6. Can we use unlabeled data, and how? 7. How does the oracle assess the AL system? 8. ...


  10. Base methods in AL for Deep Learning Random sampling: the samples to label are drawn at random from the unlabeled dataset.

  11. Base methods in AL for Deep Learning Core-Set approach - k-Center Greedy algorithm / k-means++: pick samples so that the selected set covers the feature space of the whole dataset.

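A minimal sketch of the greedy k-Center selection behind the Core-Set approach (Sener & Savarese, 2017), assuming `feat_labeled` and `feat_unlabeled` are embedding matrices taken from some intermediate layer; the function name and the brute-force distance computation are illustrative, not the paper's implementation:

```python
import numpy as np

def k_center_greedy(feat_labeled, feat_unlabeled, budget):
    """Repeatedly pick the unlabeled point farthest from its nearest center
    (centers = labeled points plus the points picked so far)."""
    # distance of every unlabeled point to its nearest labeled point
    # (brute force; chunk the computation for large pools)
    dists = np.min(
        np.linalg.norm(feat_unlabeled[:, None, :] - feat_labeled[None, :, :], axis=-1),
        axis=1,
    )
    picked = []
    for _ in range(budget):
        idx = int(np.argmax(dists))       # farthest point becomes a new center
        picked.append(idx)
        new_d = np.linalg.norm(feat_unlabeled - feat_unlabeled[idx], axis=1)
        dists = np.minimum(dists, new_d)  # update nearest-center distances
    return picked
```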

  13. Base methods in AL for Deep Learning Uncertainty based approach - entropy [diagram: predicted probabilities 0.1, 0.4, 0.3, 0.9, 0.1, 0.2, 0.4, 0.5 over the unlabeled pool; the most uncertain predictions are queried]
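
Entropy scoring itself is a one-liner; a sketch of selecting a batch of the most uncertain samples, where `probs` holds the softmax outputs for the unlabeled pool:

```python
import numpy as np

def entropy_sampling(probs, budget):
    """probs: (N, C) softmax outputs for the unlabeled pool.
    Returns indices of the `budget` highest-entropy (most uncertain) samples."""
    entropy = -np.sum(probs * np.log(probs + 1e-12), axis=1)
    return np.argsort(-entropy)[:budget]

# toy example: with two classes, p = (0.5, 0.5) is maximally uncertain,
# p = (0.9, 0.1) is a confident, low-entropy prediction
p = np.array([[0.1, 0.9], [0.5, 0.5], [0.9, 0.1]])
print(entropy_sampling(p, budget=1))  # -> [1]
```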

  15. Base methods in AL for Deep Learning Experiment - learning episode [diagram]: start with 1000 labeled CIFAR10 samples; the model predicts labels for the unlabeled pool, and 1000 samples are chosen to be labeled with true labels and added to the labeled set.

  16. Base methods in AL for Deep Learning Experiment - 10 x learning episodes [diagram]: after 10 episodes the labeled set grows to 10000 CIFAR10 samples.

  17. Base methods in AL for Deep Learning Experiment - Results

  18. Learning Loss for Active Learning Motivation: 1. None of the basic methods uses information from the inner layers of the NN. 2. The best measure of the NN's error is the value of the loss function. 3. More advanced methods require: a. modifications of the architecture, b. training another neural network, c. training a generative model, d. finding adversarial examples, e. Bayesian deep learning, f. model ensembles.

  19. Learning Loss for Active Learning Architecture modifications - learning loss module:

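The module in the paper attaches to several intermediate feature maps of the target network: each map is reduced with global average pooling, passed through a fully connected layer and ReLU, the resulting vectors are concatenated, and one final FC layer outputs the predicted loss. A minimal PyTorch sketch; the channel sizes are illustrative (ResNet-18-style backbone):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LossPredictionModule(nn.Module):
    """GAP -> FC -> ReLU per feature map, concatenate, FC -> scalar loss."""
    def __init__(self, in_channels=(64, 128, 256, 512), hidden=128):
        super().__init__()
        self.fcs = nn.ModuleList(nn.Linear(c, hidden) for c in in_channels)
        self.out = nn.Linear(hidden * len(in_channels), 1)

    def forward(self, feature_maps):
        parts = []
        for fmap, fc in zip(feature_maps, self.fcs):
            pooled = torch.flatten(F.adaptive_avg_pool2d(fmap, 1), 1)  # (B, C)
            parts.append(torch.relu(fc(pooled)))
        return self.out(torch.cat(parts, dim=1)).squeeze(1)  # (B,) predicted losses
```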

  21. Learning Loss for Active Learning Loss function for learning the loss
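
The paper does not regress the loss with MSE; it uses a pairwise ranking loss: for a pair (i, j), L = max(0, -1(l_i, l_j) * (l̂_i - l̂_j) + ξ), where 1(l_i, l_j) = +1 if l_i > l_j and -1 otherwise, and ξ is a margin. A PyTorch sketch, pairing the two halves of the batch:

```python
import torch

def loss_prediction_loss(pred_loss, true_loss, margin=1.0):
    """Pairwise ranking loss for the loss module (Yoo & Kweon, 2019).
    pred_loss, true_loss: (B,) per-sample losses; B is assumed even."""
    half = pred_loss.size(0) // 2
    dp = pred_loss[:half] - pred_loss[half:2 * half]             # predicted gaps
    dt = (true_loss[:half] - true_loss[half:2 * half]).detach()  # true gaps
    sign = torch.where(dt > 0, torch.ones_like(dt), -torch.ones_like(dt))
    # penalise pairs whose predicted ordering disagrees with the true ordering
    return torch.clamp(margin - sign * dp, min=0).mean()
```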

  22. Learning Loss for Active Learning Results from the paper. Experiment details on CIFAR10: network trained for 200 epochs with lr=0.1; after 160 epochs lr=0.01; after epoch 120 the loss prediction module does not influence the conv weights of the target network.

  23. Our ideas for Active Learning 1. Remove the loss prediction module and use a decoder or VAE instead. Send the samples with the highest reconstruction loss to the labelling process.

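A sketch of idea 1, assuming a trained autoencoder `ae` (a VAE would work the same way) and an unlabeled DataLoader that does not shuffle, so positions map back to dataset indices; all names are illustrative:

```python
import torch

@torch.no_grad()
def select_by_reconstruction_loss(ae, unlabeled_loader, budget, device="cpu"):
    """Score each unlabeled sample by per-sample MSE reconstruction error and
    return the indices of the `budget` worst-reconstructed samples."""
    scores = []
    for x, _ in unlabeled_loader:      # labels are unused
        x = x.to(device)
        recon = ae(x)                  # assumed: ae returns the reconstruction
        err = ((recon - x) ** 2).flatten(1).mean(dim=1)
        scores.append(err.cpu())
    return torch.topk(torch.cat(scores), budget).indices
```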

  25. Our ideas for Active Learning 2. We should try to make an adversarial example of each image and choose those which require the smallest modification to do so. DONE: https://arxiv.org/pdf/1802.09841.pdf
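
The cited paper (DFAL) uses DeepFool to estimate the distance to the decision boundary; the sketch below is a cruder stand-in for the same idea, taking as a sample's "margin" the smallest FGSM step size that flips its prediction. The epsilon grid and function name are our own illustration:

```python
import torch
import torch.nn.functional as F

def fgsm_flip_eps(model, x, eps_grid=(0.002, 0.005, 0.01, 0.02, 0.05, 0.1)):
    """Per sample, the smallest epsilon whose FGSM step flips the prediction
    (inf if none in the grid does) - a rough proxy for boundary distance."""
    model.eval()
    x = x.clone().requires_grad_(True)
    logits = model(x)
    pred = logits.argmax(dim=1)
    grad = torch.autograd.grad(F.cross_entropy(logits, pred), x)[0].sign()
    margins = torch.full((x.size(0),), float("inf"), device=x.device)
    with torch.no_grad():
        for eps in eps_grid:
            flipped = model(x + eps * grad).argmax(dim=1) != pred
            margins = torch.where(flipped & torch.isinf(margins),
                                  torch.full_like(margins, eps), margins)
    return margins  # query the samples with the smallest margins first
```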

  26. Our ideas for Active Learning 3. We should act like GANs: train a discriminator to distinguish between the labeled and unlabeled datasets. DONE: https://arxiv.org/pdf/1907.06347.pdf
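
A sketch of the idea in the spirit of Discriminative Active Learning (Gissin & Shalev-Shwartz, 2019): fit a binary classifier on fixed features to separate the labeled pool from the unlabeled pool, then query the points it most confidently calls "unlabeled". Using logistic regression as the discriminator is our simplification:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def discriminative_selection(feat_labeled, feat_unlabeled, budget):
    """Fit a labeled-vs-unlabeled discriminator on features and return the
    indices of the unlabeled samples with the highest P(unlabeled)."""
    X = np.vstack([feat_labeled, feat_unlabeled])
    y = np.concatenate([np.zeros(len(feat_labeled)), np.ones(len(feat_unlabeled))])
    clf = LogisticRegression(max_iter=1000).fit(X, y)
    p_unlabeled = clf.predict_proba(feat_unlabeled)[:, 1]
    return np.argsort(-p_unlabeled)[:budget]
```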

  27. Our ideas for Active Learning 4. The neural network learns the easy examples first. Can the history of learning differentiate between the labeled and unlabeled datasets?

  28. Our ideas for Active Learning 4. History of learning [diagram]: every 20 epochs, record the model's predictions for the labeled and unlabeled dataset into a history record.

  29. Our ideas for Active Learning 4. History of learning [diagram]: train a RandomForest on the history records to distinguish labeled from unlabeled samples; out of 1000 unlabeled samples, take the 100 with the highest predicted probability of being unlabeled.
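
A sketch of the slide 29 pipeline under our reading of the diagram: a sample's history record is the concatenation of its predicted class probabilities saved every 20 epochs; a RandomForest learns to separate labeled from unlabeled histories, and the 100 most "unlabeled-looking" of the 1000 candidates are queried. Shapes and names are illustrative:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def select_by_history(hist_labeled, hist_unlabeled, budget=100):
    """hist_*: (N, T*C) arrays; each row concatenates a sample's predicted
    class probabilities over T snapshots (one every 20 epochs), C classes."""
    X = np.vstack([hist_labeled, hist_unlabeled])
    y = np.concatenate([np.zeros(len(hist_labeled)), np.ones(len(hist_unlabeled))])
    rf = RandomForestClassifier(n_estimators=200).fit(X, y)
    p_unlabeled = rf.predict_proba(hist_unlabeled)[:, 1]
    return np.argsort(-p_unlabeled)[:budget]  # e.g. 100 of 1000 candidates
```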

  30. Our ideas for Active Learning 4. History of learning

  31. Our ideas for Active Learning 5. Different moments of the history of learning

  32. Our ideas for Active Learning 6. Maybe the NN is already past the critical point - do not fine-tune

  33. Our ideas for Active Learning 7. Take the history of the inner layers

  34. Our ideas for Active Learning 8. Why is entropy so good? Can we do better?

  35.–40. Our ideas for Active Learning 9. Is history even worth something? [results plots]

  41. Future plans 1. Investigate ways of finding dataset outliers. 2. Do more research on the history of learning.

  42. Future plans 1. IDEA: use augmentation to check how stable an image's prediction is across different transformations y', y'', y'''. Take the samples with the highest m.
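
The slide leaves the measure m unspecified; the sketch below assumes it is the disagreement between predictions on augmented views, scored as the variance of the softmax outputs for y', y'', y''' summed over classes:

```python
import torch

@torch.no_grad()
def augmentation_disagreement(model, x, augmentations):
    """Score each sample by how much its softmax output varies across
    augmented views (a stand-in for the slide's measure m)."""
    model.eval()
    probs = torch.stack([model(aug(x)).softmax(dim=1) for aug in augmentations])
    # per-class variance across views, summed: 0 when all views agree exactly
    return probs.var(dim=0).sum(dim=1)  # (batch,); query the highest scores
```

`augmentations` is any list of batch-level transforms, e.g. the identity, a horizontal flip, and a small crop-and-resize.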

  43. Bibliography
  1. Yoo, Donggeun, and In So Kweon. "Learning Loss for Active Learning." Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2019. https://arxiv.org/pdf/1905.03677.pdf
  2. Ducoffe, Melanie, and Frederic Precioso. "Adversarial active learning for deep networks: a margin based approach." arXiv preprint arXiv:1802.09841 (2018). https://arxiv.org/pdf/1802.09841.pdf
  3. Gissin, Daniel, and Shai Shalev-Shwartz. "Discriminative active learning." arXiv preprint arXiv:1907.06347 (2019). https://arxiv.org/pdf/1907.06347.pdf
  4. Sener, Ozan, and Silvio Savarese. "Active learning for convolutional neural networks: A core-set approach." arXiv preprint arXiv:1708.00489 (2017). https://arxiv.org/pdf/1708.00489.pdf
