
Estimating and Mitigating Gender Bias in Deep Image Representations - PowerPoint PPT Presentation



  1. Balanced Datasets Are Not Enough: Estimating and Mitigating Gender Bias in Deep Image Representations. Tianlu Wang, University of Virginia.

  2. Gender Bias in Visual Recognition Systems: Deep Neural Network.

  3. Gender Bias in Visual Recognition Systems: Trained Deep Neural Network (example prediction: "tie").

  4. Quantifying Bias: "Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification".

  5. Quantifying Bias: "Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification".

  6. Quantifying Bias: "Men Also Like Shopping: Reducing Gender Bias Amplification using Corpus-level Constraints".

  7. Quantifying Bias: "Men Also Like Shopping: Reducing Gender Bias Amplification using Corpus-level Constraints".

  8. Quantifying Bias: "Women Also Snowboard: Overcoming Bias in Captioning Models".

  9. Quantifying Bias: Is this prediction from the trained deep neural network biased?

  10. Quantifying Bias: Model Leakage. A gender classifier outputs man or woman.

  11. Quantifying Bias: Model Leakage. Train a gender classifier, then test it: accuracy > 50% means gender information leaks; accuracy == 50% (chance) means it does not.

  12. Quantifying Bias: Model Leakage. Model leakage: the gender prediction accuracy (man vs. woman) of a classifier trained on the model's predictions.
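
A minimal sketch of how model leakage could be computed, assuming per-image output vectors from the recognition model and binary gender annotations are already available as arrays; the MLP attacker and the train/test split are illustrative choices, not the authors' exact attacker.

```python
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

def leakage(features, gender, seed=0):
    """Accuracy of a gender 'attacker' trained on the given per-image vectors.

    features: (n_images, n_labels) array of scores or binary labels
    gender:   (n_images,) array of binary gender annotations
    """
    X_tr, X_te, y_tr, y_te = train_test_split(
        features, gender, test_size=0.2, random_state=seed, stratify=gender)
    attacker = MLPClassifier(hidden_layer_sizes=(300,), max_iter=500,
                             random_state=seed)
    attacker.fit(X_tr, y_tr)
    return attacker.score(X_te, y_te)

# model leakage:   attacker sees the recognition model's output vectors
# dataset leakage: the same attack run on the ground-truth annotation vectors
# (pred_scores, gt_labels, gender are hypothetical arrays, e.g. shape (22000, 80))
# model_leakage   = leakage(pred_scores, gender)
# dataset_leakage = leakage(gt_labels, gender)
```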

  13. Object & Action Recognition Models. COCO object recognition: 22k images (16k man & 6k woman), 80 objects (kite, ski, handbag, tie, ...), recognition performance (F1) 53.75%, model leakage 70.46%. imSitu action recognition: 24k images (14k man & 10k woman), 211 activities (cooking, shooting, lifting, ...), recognition performance (F1) 40.11%, model leakage 76.93%.

  14. Quantifying Bias: Dataset Leakage. Dataset leakage: the gender prediction accuracy of a classifier trained on the ground-truth annotations. Does the model inherit 100% of the dataset leakage?

  15. Performance Matters! Ground-truth labels (F1 score = 100%): dataset leakage = 67.72%. Model predictions (F1 score = 53.75%): model leakage = 70.46%. Random-guess predictions (F1 score ≈ 0): no leakage.

  16. Quantifying Bias: Adjusted Dataset Leakage. Ground-truth labels are randomly perturbed so that they match the model's performance. Adjusted dataset leakage: the gender prediction accuracy of a classifier trained on the perturbed annotations.
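
A minimal sketch of the adjusted-dataset-leakage pipeline, reusing the hypothetical `leakage` attacker from the earlier sketch: ground-truth labels are randomly flipped until their F1 against the originals matches the model's F1, and the attacker is then trained on the perturbed labels. The flip mechanism and flip rate are illustrative assumptions.

```python
import numpy as np

def perturb_labels(gt_labels, flip_rate, seed=0):
    """Randomly flip a fraction of the binary ground-truth label entries.

    Raising flip_rate lowers the F1 of the perturbed annotations measured
    against the originals; it is tuned until that F1 matches the model's F1.
    """
    rng = np.random.default_rng(seed)
    flips = rng.random(gt_labels.shape) < flip_rate
    return np.where(flips, 1 - gt_labels, gt_labels)

# adjusted dataset leakage: the same gender attacker as before, but trained on
# annotations degraded to the recognition model's performance level
# perturbed = perturb_labels(gt_labels, flip_rate=0.3)  # flip_rate tuned to match F1
# adjusted_dataset_leakage = leakage(perturbed, gender)
```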

  17. Quantifying Bias: Adjusted Dataset Leakage (results chart).

  18. Quantifying Bias: Bias Amplification. Bar chart of model leakage vs. adjusted dataset leakage for COCO object recognition (gap: 9.93) and imSitu action recognition (gap: 20.47). Δ = Model Leakage - Adjusted Dataset Leakage > 0: bias amplification!
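
As a worked example with the COCO numbers from this slide (model leakage 70.46%, Δ = 9.93), the implied adjusted dataset leakage is about 60.5%:

```python
model_leakage = 70.46             # COCO object recognition, from the slide
adjusted_dataset_leakage = 60.53  # implied by the slide's Δ = 9.93

delta = model_leakage - adjusted_dataset_leakage
if delta > 0:
    print(f"bias amplification: delta = {delta:.2f}")  # prints 9.93
```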

  19. Eliminating Bias. Chart: F1 Score (%) vs. Bias Amplification in COCO.

  20. Eliminating Bias: Adding Noise. Architecture: Convolutional Neural Network (ResNet-50) + fully-connected layer + per-object logistic regressors (handbag, fork, vase, spoon, knife, car, oven, ...).
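
A sketch of the base recognition model in the slide's figure (ResNet-50 backbone, a fully-connected layer, and independent per-label logistic regressors). Layer sizes and the torchvision call are assumptions, not the authors' exact configuration; the "adding noise" baseline then randomly corrupts this model's outputs or features, trading F1 for lower leakage, as the next slide's curve shows.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet50

class RecognitionModel(nn.Module):
    """ResNet-50 features -> fully-connected layer -> one logistic regressor per label."""

    def __init__(self, num_labels=80, feat_dim=1024):
        super().__init__()
        backbone = resnet50(weights=None)
        self.encoder = nn.Sequential(*list(backbone.children())[:-1])  # drop final fc
        self.fc = nn.Linear(2048, feat_dim)
        self.classifiers = nn.Linear(feat_dim, num_labels)  # one sigmoid unit per object

    def forward(self, images):
        feats = self.encoder(images).flatten(1)        # (B, 2048)
        feats = torch.relu(self.fc(feats))             # (B, feat_dim) shared features
        return torch.sigmoid(self.classifiers(feats))  # (B, num_labels) object scores
```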

  21. Eliminating Bias: Adding Noise. Chart: F1 Score (%) vs. Bias Amplification in COCO for the original model and the randomization baseline.

  22. Eliminating Bias: Balanced Datasets. Four COCO training sets with man/woman ratios ranging from the original (roughly 71% man, 29% woman) to fully balanced (50% / 50%). Balanced 1: F1 42.89%, model leakage 63.22%. Original: F1 53.75%, model leakage 70.46%. Balanced 3: F1 52.60%, model leakage 67.78%. Balanced 2: F1 51.95%, model leakage 64.45%. Fewer images: lower performance, lower model leakage.
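
A minimal sketch of how a gender-balanced subset could be drawn by downsampling the over-represented gender; this illustrates the idea behind Balanced 1-3, not their exact construction, and the array names are assumptions.

```python
import numpy as np

def balance_by_gender(image_ids, gender, seed=0):
    """Downsample the majority gender so the kept subset is 50% man / 50% woman.

    image_ids: (n,) array of image ids; gender: (n,) binary array.
    The shrunken training set is what drives the F1 drop seen for Balanced 1-3.
    """
    rng = np.random.default_rng(seed)
    ids_man = image_ids[gender == 1]
    ids_woman = image_ids[gender == 0]
    n = min(len(ids_man), len(ids_woman))
    return np.concatenate([rng.choice(ids_man, n, replace=False),
                           rng.choice(ids_woman, n, replace=False)])
```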

  23. Eliminating Bias: Balanced Datasets. Chart: F1 Score (%) vs. Bias Amplification in COCO for the original, randomization, and balanced 1-3 models. Balancing the co-occurrence of gender and target labels does not reduce bias amplification.

  24. Eliminating Bias: Balanced Datasets. Related work: "Gender Bias in Coreference Resolution: Evaluation and Debiasing Methods".

  25. Eliminating Bias: Balanced Datasets. Balancing the co-occurrence of gender and target labels does not reduce bias amplification.

  26. Eliminating Bias: Using Extra Annotations. Variants: original, blackout-face, blur-segm, blackout-segm, blackout-box.
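
A minimal sketch of the blackout-box variant, assuming a COCO person bounding box is available from the extra annotations; blur-segm and blackout-segm would operate on the person segmentation mask instead, and blackout-face on a face box. The function below is illustrative, not the authors' preprocessing code.

```python
def blackout_box(image, person_box):
    """Zero out the annotated person bounding box (x, y, w, h) in an image.

    image: (H, W, C) uint8 array. Returns a copy with the person region removed,
    so the recognition model can only rely on contextual cues.
    """
    x, y, w, h = person_box
    out = image.copy()
    out[y:y + h, x:x + w] = 0
    return out
```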

  27. Eliminating Bias: Using Extra Annotations. Chart: F1 Score (%) vs. Bias Amplification in COCO for the original, randomization, balanced 1-3, blackout-face, blur-segm, blackout-segm, and blackout-box models.

  28. Eliminating Bias: Adversarial Training. Architecture: Convolutional Neural Network (ResNet-50) + fully-connected layer + per-object logistic regressors, with a gender classifier (man / woman) attached through a gradient reversal layer.

  29. Eliminating Bias: Adversarial Training. Goal: maintain object features while destroying gender features; the gradient reversal layer still optimizes the gender classifier itself. A good or a bad gender classifier?
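
A minimal PyTorch sketch of the gradient reversal mechanism the slide describes: the gender head itself is optimized to classify gender well (a good classifier), but the reversed gradient pushes the shared features to defeat it, while the object losses keep those features useful for recognition. The class and variable names, and the branch point after the fully-connected layer, are assumptions layered on the hypothetical RecognitionModel sketched earlier.

```python
import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    """Identity on the forward pass; multiplies the gradient by -lambda on the way back."""

    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lam * grad_output, None  # no gradient w.r.t. lam

class AdversarialGenderHead(nn.Module):
    """Gender classifier attached to the shared features through gradient reversal."""

    def __init__(self, feat_dim=1024, lam=1.0):
        super().__init__()
        self.lam = lam
        self.gender_head = nn.Linear(feat_dim, 2)  # man / woman logits

    def forward(self, feats):
        return self.gender_head(GradReverse.apply(feats, self.lam))

# Training step (sketch), with `feats` the shared image features:
#   obj_scores    = object_head(feats)            # task branch, kept accurate
#   gender_logits = adv_head(feats)               # reversal lives inside adv_head
#   loss = bce(obj_scores, obj_labels) + ce(gender_logits, gender_labels)
#   loss.backward()   # the reversed gradient pushes feats to destroy gender cues
```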

  30. Eliminating Bias: Adversarial Training. Chart: F1 Score (%) vs. Bias Amplification in COCO for the original, randomization, balanced 1-3, blackout-face, blur-segm, blackout-segm, blackout-box, adv @ image, adv @ conv4, and adv @ conv5 models.

  31. Visualization of Adversarial Training. The adversarial architecture with an added mask prediction branch: the predicted mask is applied to the image, revealing which regions adversarial training removes.

  32. Adversarial Training: Removing Face Area.

  33. Adversarial Training: Removing Face and Skin.

  34. Adversarial Training: Removing Entire Person.

  35. Adversarial Training: Removing Contextual Cues.

