Men Also Like Shopping: Reducing Gender Bias Amplification using Corpus-level Constraints


  1. Men Also Like Shopping: Reducing Gender Bias Amplification using Corpus-level Constraints. Jieyu Zhao 1,3, Tianlu Wang 1, Mark Yatskar 2,4, Vicente Ordonez 1, Kai-Wei Chang 1,3. 1 University of Virginia, 2 University of Washington, 3 UCLA, 4 Allen Institute for AI

  2. Dataset Gender Bias. [Pie chart: 33% male, 66% female (imsitu.org).]

  3. Model Bias After Training. [Pie chart: 16% male, 84% female (imsitu.org).]

  4. Why does this happen? Exploiting the correlation is good for accuracy.

  5. Algorithmic Bias in a Grounded Setting. [Diagram: gender statistics for concepts such as faucet, cooking, dusting, and fork, compared across the Dataset, the World, and the Model.]

  6. Algorithmic Bias in a Grounded Setting. [Diagram, continued: example image labeled “woman cooking”.]

  7. Algorithmic Bias in a Grounded Setting. [Diagram, continued: example images labeled “woman cooking” and “man fixing faucet”.]

  8. Algorithmic Bias in a Grounded Setting. [Diagram, continued: RBA added alongside Dataset, World, and Model.]

  9. Algorithmic Bias in a Grounded Setting. RBA reduces amplification by ~50% with negligible loss in performance. [Diagram, continued.]

  10. Contributions (imSitu vSRL for events; COCO MLC for objects):
• High dataset gender bias: 47% of events and 38% of objects exhibit strong bias.
• Models amplify existing gender bias: ~70% of objects and events show bias amplification.
• Reducing bias amplification (RBA): ~50% reduction in amplification with insignificant loss in performance.

  11. Outline: 1. Background: imSitu vSRL (events), COCO MLC (objects); 2. Dataset Bias (data); 3. Bias Amplification (model); 4. Reducing Bias Amplification (RBA)

  12. imSitu Visual Semantic Role Labeling (vSRL) (events): images from the Internet, verbs from FrameNet, nouns from WordNet. Example frame: COOKING with roles AGENT: woman, FOOD: vegetable, CONTAINER: pot, TOOL: spatula. (Yatskar et al. CVPR ’16; Yang et al. NAACL ’16; Gupta and Malik arXiv ’16)
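
For concreteness, a frame like the one above is just a verb plus a mapping from roles to nouns; a minimal sketch in Python (the structure and names are illustrative, not imSitu’s actual schema):

```python
# One imSitu-style situation frame (illustrative structure, not the real file format).
frame = {
    "verb": "cooking",
    "roles": {
        "AGENT": "woman",
        "FOOD": "vegetable",
        "CONTAINER": "pot",
        "TOOL": "spatula",
    },
}
```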

  14. imSitu vSRL model: a Convolutional Neural Network provides potentials (regression) for a Conditional Random Field that predicts the full frame, e.g. COOKING with AGENT: woman, FOOD: vegetable, CONTAINER: pot, TOOL: spatula. (Yatskar et al. CVPR ’16; Yang et al. NAACL ’16; Gupta and Malik arXiv ’16)

  16. imSitu vSRL model, continued: the CRF needs to model correlation between variables, and the model can use that machinery to amplify gender bias. (Yatskar et al. CVPR ’16; Yang et al. NAACL ’16; Gupta and Malik arXiv ’16) A code sketch follows.
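
To make that machinery concrete, here is a minimal sketch of joint MAP inference over a verb and an agent gender, where a learned verb-gender compatibility table plays the role of the CRF’s pairwise factor (all names and shapes are illustrative assumptions, not the authors’ code):

```python
import numpy as np

def map_inference(s_verb, s_gender, s_joint):
    """Jointly pick the (verb, gender) pair with the highest score.

    s_verb:   (V,) image-conditioned scores per verb
    s_gender: (2,) image-conditioned scores for the agent's gender
    s_joint:  (V, 2) learned verb-gender compatibilities -- the pairwise
              factor that lets the model exploit (and amplify) dataset bias
    """
    scores = s_verb[:, None] + s_gender[None, :] + s_joint
    v, g = np.unravel_index(np.argmax(scores), scores.shape)
    return int(v), int(g)
```

If s_joint strongly favors (cooking, woman) because of the training data, it can override ambiguous image evidence in s_gender; that is the route from a 2:1 training ratio (slide 2) to a 5:1 prediction ratio (slide 3).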

  17. COCO Multi-Label Classification (MLC) (objects): images from the Internet with captions, e.g. “a woman is smiling in a kitchen near a pizza on a stove”. The gender label (WOMAN) is inferred from the caption; COCO object labels: PIZZA yes, ZEBRA no, FRIDGE yes, CAR no, …
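
A minimal sketch of how the gender label might be inferred from captions (illustrative keyword matching; the word lists and rules are my assumptions, not the authors’ exact pipeline, and the object labels come from COCO’s own annotations rather than from the caption):

```python
MALE_WORDS = {"man", "men", "boy", "boys", "male"}
FEMALE_WORDS = {"woman", "women", "girl", "girls", "female"}

def gender_from_captions(captions):
    """Return 'man' or 'woman' when the captions mention exactly one gender,
    else None (the image would be skipped as gender-ambiguous)."""
    tokens = {tok for cap in captions for tok in cap.lower().split()}
    has_m = bool(tokens & MALE_WORDS)
    has_f = bool(tokens & FEMALE_WORDS)
    if has_m != has_f:  # exactly one gender mentioned
        return "man" if has_m else "woman"
    return None

# gender_from_captions(["a woman is smiling in a kitchen near a pizza on a stove"])
# -> "woman"
```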

  18. COCO MLC model: as in vSRL, a Convolutional Neural Network provides potentials (regression) for a Conditional Random Field that jointly predicts the person’s gender (WOMAN) and the object labels (PIZZA yes, ZEBRA no, FRIDGE yes, CAR no, …).

  19. Related Work
• Implicit bias: image search (Kay et al., 2015); search advertising (Sweeney, 2013); online news (Ross and Carter, 2011); credit scoring (Hardt et al., 2016); word vectors (Bolukbasi et al., 2016)
• Classifier class imbalance: Barocas and Selbst, 2014; Dwork et al., 2012; Feldman et al., 2015; Zliobaite, 2015

  20. Outline: 1. Background: imSitu vSRL (events), COCO MLC (objects); 2. Dataset Bias (data); 3. Bias Amplification (model); 4. Reducing Bias Amplification (RBA)

  21. Defining Dataset Bias (events): the training gender ratio of a verb. Training-set examples: COOKING {AGENT: woman, FOOD: stir-fry} and COOKING {AGENT: man, FOOD: noodle}. For cooking: #(cooking, man) / (#(cooking, man) + #(cooking, woman)) = 1/3 (a code sketch follows slide 22).

  22. Defining Dataset Bias (objects): the training gender ratio of a noun. Training-set examples: MAN {snowboard: yes, refrigerator: no, bowl: no} and WOMAN {snowboard: yes, refrigerator: no, bowl: no}. For snowboard: #(snowboard, man) / (#(snowboard, man) + #(snowboard, woman)) = 2/3
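
Both definitions reduce to the same co-occurrence computation; a minimal sketch (function and variable names are mine):

```python
from collections import Counter

def training_gender_ratios(pairs):
    """pairs: iterable of (concept, gender) co-occurrences, e.g. ("cooking", "man").
    Returns b(concept, man) = #(concept, man) / (#(concept, man) + #(concept, woman)).
    Assumes, for this toy setting, each concept co-occurs with both genders."""
    counts = Counter(pairs)
    concepts = {c for c, _ in counts}
    return {c: counts[(c, "man")] / (counts[(c, "man")] + counts[(c, "woman")])
            for c in concepts}

# Toy data mirroring the slides: cooking with 1 man, 2 women -> 1/3;
# snowboard with 2 men, 1 woman -> 2/3.
ratios = training_gender_ratios([
    ("cooking", "man"), ("cooking", "woman"), ("cooking", "woman"),
    ("snowboard", "man"), ("snowboard", "man"), ("snowboard", "woman"),
])
assert abs(ratios["cooking"] - 1/3) < 1e-9 and abs(ratios["snowboard"] - 2/3) < 1e-9
```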

  23. Gender Dataset Bias. [Histogram: % of items vs. training gender ratio (0 = female bias, 0.5 = unbiased, 1 = male bias), for imSitu verbs and COCO nouns.]

  24. Gender Dataset Bias. [Histogram, annotated with example imSitu verbs: lecturing, cooking, coaching, washing, shopping, repairing, braiding.]

  25. Gender Dataset Bias. [Histogram, annotated with example COCO nouns: bed, refrigerator, skateboard, ski, fork, surfboard.]

  26. Gender Dataset Bias: 64.6% of imSitu verbs and 86.6% of COCO nouns are gender-biased. [Histogram as above.]

  27. Gender Dataset Bias: 64.6% of imSitu verbs are biased, 46.9% strongly (> 2:1); 86.6% of COCO nouns are biased, 37.9% strongly (> 2:1). [Histogram as above.]

  28. Outline: 1. Background: imSitu vSRL (events), COCO MLC (objects); 2. Dataset Bias (data); 3. Bias Amplification (model); 4. Reducing Bias Amplification (RBA)

  29. Defining Bias Amplification (events): the predicted gender ratio of a verb on the development set. Example frames: COOKING {AGENT: woman, FOOD: stir-fry} and COOKING {AGENT: man, FOOD: noodle}. What does the model predict on unseen data?

  30. Defining Bias Amplification (events), continued: over the model’s predictions on the development set, #(cooking, man) / (#(cooking, man) + #(cooking, woman)) = 1/6, which is more female-biased than the 1/3 training ratio.
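
A minimal sketch of the amplification measurement this implies (my naming; the idea is to average the shift of the predicted ratio in the direction of each concept’s training bias):

```python
def mean_bias_amplification(train_ratio, pred_ratio):
    """For concepts biased toward men in training (ratio > 0.5), amplification
    is predicted - training; for concepts biased toward women, it is the mirror
    image. Positive values mean the model exaggerates the dataset's bias."""
    diffs = []
    for c, b in train_ratio.items():
        if b > 0.5:                      # male-biased in training
            diffs.append(pred_ratio[c] - b)
        elif b < 0.5:                    # female-biased in training
            diffs.append(b - pred_ratio[c])
    return sum(diffs) / len(diffs)

# cooking: training ratio 1/3 (female-biased), predicted ratio 1/6
# -> amplification 1/3 - 1/6 = +1/6: the model became *more* female-biased.
print(mean_bias_amplification({"cooking": 1/3}, {"cooking": 1/6}))  # 0.1666...
```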

  31. Model Bias Amplification. [Scatter plot: predicted gender ratio vs. training gender ratio (0 = female bias, 0.5 = unbiased, 1 = male bias) for imSitu verbs and COCO nouns; the diagonal marks a matched gender ratio.]

  32. Model Bias Amplification. [Scatter plot, continued: the Amplification Zone marks predictions more extreme than the matched gender ratio.]

  33. Model Bias Amplification. [Scatter plot, continued, with example verbs labeled: assembling, autographing, cooking, washing.]

  34. Model Bias Amplification: imSitu verbs: 69% amplified, mean amplification .05; COCO nouns: 73% amplified, mean amplification .04. [Scatter plot as above.]

  35. Model Bias Amplification: imSitu verbs: 69% amplified, mean amplification .05 (.07 for verbs with > 2:1 initial bias); COCO nouns: 73% amplified, mean amplification .04 (.08 for nouns with > 2:1 initial bias). [Scatter plot as above.]

  36. Summary: can we remove gender bias amplification and still maintain performance? [Scatter plot: predicted vs. training gender ratio, with the matched-ratio diagonal.]

  37. Summary: can we remove gender bias amplification and still maintain performance? Performance goal: as good as the original model. Fairness goal: not more biased than the data it was trained on. [Scatter plot as above.]

  38. Outline: 1. Background: imSitu vSRL (events), COCO MLC (objects); 2. Dataset Bias (data); 3. Bias Amplification (model); 4. Reducing Bias Amplification (RBA)

  39. Reducing Bias Amplification (RBA)
• Corpus-level constraints on model output (ILP) ★ doesn’t require model retraining
• Reuse of model inference through Lagrangian relaxation ★ can be applied to any structured model (sketched in code after slide 41)

  40. Reducing Bias Amplification (RBA) as an Integer Linear Program: CRF inference maximizes the base-model score, max_{y_1, …, y_n} Σ_i s(y_i, image_i).

  41. Reducing Bias Amplification (RBA) as an Integer Linear Program: max_{y_1, …, y_n} Σ_i s(y_i, image_i), subject to, for every concept, training ratio − margin ≤ predicted ratio f(y_1, …, y_n) ≤ training ratio + margin. [Plot: predicted vs. training gender ratio with a margin band around the matched-ratio diagonal; points inside are within the margin, points outside violate it.]
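
A minimal sketch of corpus-constrained inference via Lagrangian relaxation, under simplifying assumptions: one discrete output per instance, and the ratio constraints already linearized into a matrix A so that a corpus of predictions is feasible iff A @ counts <= 0 (all function and variable names are mine, not the authors’ code):

```python
import numpy as np

def rba_inference(scores, A, num_iters=100, step=0.1):
    """scores: (n_instances, n_outputs) base-model scores s(y_i, image_i).
    A: (n_constraints, n_outputs) linearized constraint coefficients; a
       corpus is feasible iff A @ counts <= 0, where counts[j] is the number
       of instances assigned output j.
    Per-instance inference (here a plain argmax) is reused unchanged at every
    iteration, so the base model never needs retraining."""
    lam = np.zeros(A.shape[0])                 # Lagrange multipliers, kept >= 0
    preds = scores.argmax(axis=1)
    for _ in range(num_iters):
        penalized = scores - lam @ A           # penalize constraint-violating outputs
        preds = penalized.argmax(axis=1)       # independent per-instance inference
        counts = np.bincount(preds, minlength=A.shape[1])
        violation = A @ counts                 # aggregate violation per constraint
        if np.all(violation <= 0):             # all corpus constraints satisfied
            break
        lam = np.maximum(0.0, lam + step * violation)  # subgradient update
    return preds
```

In this setup each gender-ratio margin contributes a pair of rows of A (an upper and a lower bound on the ratio), matching the margin band in the plot above.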
