Learning with Bad Training Data via Iterative Trimmed Loss Minimization
Yanyao Shen, Sujay Sanghavi (University of Texas at Austin)
Motivation 1: Bad Training Labels in Classification
Supervised learning: noise in the training labels makes classifiers inaccurate. Systematic label noise (e.g., a fraction of "horse" images mis-labeled as "bird") is not fixed by simply collecting a larger dataset.

Motivation 2: Mixed Training Data
Unsupervised learning: spurious samples mixed into the training set give bad generative models (e.g., a GAN trained on the contaminated data produces poor samples).

Motivation 3: Backdoor Attacks
[Figure: two panels of images, one classified as "ship" and one classified as "horse".]
Observation: Initial Epochs Can Differentiate
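A natural reading of the title's iterative trimmed loss minimization, consistent with this observation, is to alternate between fitting the model on a currently trusted subset and re-selecting the fraction of samples with the smallest loss under that fit. The sketch below illustrates this alternation on a least-squares toy problem; the function name, the keep_frac value, and the synthetic corrupted data are assumptions for illustration, not details taken from the slides.

```python
import numpy as np

def iterative_trimmed_least_squares(X, y, keep_frac=0.8, rounds=10):
    """Alternate between (1) fitting on the currently trusted subset and
    (2) re-selecting the keep_frac fraction of samples with the smallest
    squared loss under the current fit (a sketch of the trimmed-loss idea)."""
    n = len(y)
    keep = np.arange(n)                                       # start by trusting every sample
    for _ in range(rounds):
        w, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)  # fit on trusted subset
        losses = (X @ w - y) ** 2                              # per-sample loss on ALL data
        keep = np.argsort(losses)[:int(keep_frac * n)]         # keep the lowest-loss samples
    return w, keep

# Toy usage: 20% of responses are grossly corrupted (illustrative data only).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
w_true = rng.normal(size=5)
y = X @ w_true + 0.1 * rng.normal(size=200)
y[:40] += 10.0 * rng.normal(size=40)                           # the corrupted fraction
w_hat, trusted = iterative_trimmed_least_squares(X, y, keep_frac=0.8)
print(np.linalg.norm(w_hat - w_true))                          # small if trimming succeeded
```

The keep_frac parameter plays the role of an assumed lower bound on the clean fraction; samples the current model fits poorly are treated as more likely to be corrupted, which is exactly what the early-epoch observation above suggests.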