Maximum Likelihood Training

• Maximize the likelihood of predicting the next word in the reference given the previous words:

  $\ell(E \mid F) = -\log P(E \mid F) = -\sum_{t=1}^{T} \log P(e_t \mid F, e_1, \ldots, e_{t-1})$

• Also called "teacher forcing"
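As a minimal sketch of the loss above: assuming we already have the model's per-step probabilities of the reference tokens (computed with the true previous words fed in, i.e. teacher forcing), the loss is just the summed negative log probabilities. `mle_loss` is a hypothetical helper, not part of any particular toolkit.

```python
import math

def mle_loss(ref_token_probs):
    """Teacher-forced negative log-likelihood of a reference.

    ref_token_probs[t] is the model's probability of the correct next
    word e_t, given F and the *gold* previous words e_1 .. e_{t-1}.
    """
    return -sum(math.log(p) for p in ref_token_probs)

# Three reference tokens, each predicted with moderate confidence.
loss = mle_loss([0.5, 0.25, 0.5])  # = -(log 0.5 + log 0.25 + log 0.5)
```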
Problem 1: Exposure Bias

• Teacher forcing assumes feeding the correct previous input, but at test time we may make mistakes that propagate

[Figure: an encoder feeding a chain of per-step classifiers, each conditioned on the previous output word]

• Exposure bias: the model is not exposed to mistakes during training, and cannot deal with them at test time
• Really important! One main source of commonly witnessed phenomena such as repeating.
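A toy illustration of the propagation problem: a bigram "model" trained with teacher forcing only ever sees gold previous words, so a single test-time mistake puts it in a context it never learned, and everything after it derails. The lookup table and `decode` function are invented for illustration only.

```python
# Learned (previous word -> next word) pairs, seen only in gold contexts.
next_word = {"<s>": "the", "the": "cat", "cat": "sat"}

def decode(first_pred, max_len=4):
    """Free-running decoding: each step is conditioned on the model's
    own previous output, not the gold word."""
    out, prev = [first_pred], first_pred
    while prev in next_word and len(out) < max_len:
        prev = next_word[prev]
        out.append(prev)
    return out

assert decode("the") == ["the", "cat", "sat"]  # correct first step
assert decode("dog") == ["dog"]                # one mistake derails the rest
```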
Problem 2: Disregard for Evaluation Metrics

• In the end, we want good translations
• Good translations can be measured with metrics, e.g. BLEU or METEOR
• Really important! Causes systematic problems:
  • Hypothesis-reference length mismatch
  • Dropped/repeated content
A Clear Example

• My (winning) submission to the Workshop on Asian Translation 2016 [Neubig 16]

[Bar charts: Length Ratio (80–100) and BLEU (23–27) for MLE, MLE+Length, and MinRisk]

• Just training for (sentence-level) BLEU largely fixes length problems, and does much better than heuristics

Lexicons and Minimum Risk Training for Neural Machine Translation: NAIST-CMU at WAT2016 (Neubig 16)
Error and Risk
Error

• Generate a translation: $\hat{E} = \mathrm{argmax}_{\tilde{E}}\, P(\tilde{E} \mid F)$
• Calculate its "badness" (e.g. 1−BLEU, 1−METEOR): $\mathrm{error}(E, \hat{E}) = 1 - \mathrm{BLEU}(E, \hat{E})$
• We would like to minimize error
• Problem: argmax is not differentiable, and thus not conducive to gradient-based optimization
In Phrase-based MT: Minimum Error Rate Training

• A clever trick for gradient-free optimization of linear models
• Pick a single direction in feature space
• Exactly calculate the loss surface in this direction only (over an n-best list for every hypothesis)

[Figure, panels (a)–(d): n-best candidates for sentences F1 and F2 with features φ1–φ3 and per-hypothesis errors; each hypothesis's score as a line over the step size α; the resulting piecewise-constant per-sentence error; and the total error surface, whose minimum (α ← 1.25 in direction d3) gives the updated weights λ1=−1, λ2=1, λ3=1.25]
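The line search above can be sketched as follows. This version sweeps a grid of step sizes γ along the chosen direction and picks the one minimizing total corpus error; real MERT instead computes the exact intersection points of the score lines, so the grid here is an approximation. All names (`mert_line_search`, the toy n-best list) are hypothetical.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def mert_line_search(nbests, lam, d, gammas):
    """Approximate MERT line search along direction d from weights lam.

    nbests: one list per sentence of (feature_vector, error) pairs for
            its n-best hypotheses. For each step size gamma, the model
            picks the highest-scoring hypothesis under w = lam + gamma*d;
            we return the gamma with the lowest summed error.
    """
    best_gamma, best_err = None, float("inf")
    for g in gammas:
        w = [l + g * di for l, di in zip(lam, d)]
        total = sum(max(hyps, key=lambda h: dot(w, h[0]))[1]
                    for hyps in nbests)
        if total < best_err:
            best_gamma, best_err = g, total
    return best_gamma, best_err

# One sentence, two hypotheses: moving weight from feature 1 to
# feature 2 switches the winner to the zero-error hypothesis.
nbests = [[((1.0, 0.0), 0.6), ((0.0, 1.0), 0.0)]]
g, e = mert_line_search(nbests, lam=(1.0, 0.0), d=(-1.0, 1.0),
                        gammas=[0.0, 0.25, 0.5, 0.75, 1.0])
```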
A Smooth Approximation: Risk [Smith+ 2006, Shen+ 2015]

• Risk is defined as the expected error:

  $\mathrm{risk}(F, E, \theta) = \sum_{\tilde{E}} P(\tilde{E} \mid F; \theta)\, \mathrm{error}(E, \tilde{E})$

• This includes the probability in the objective function → differentiable!
Minimum Risk Annealing for Training Log-Linear Models (Smith and Eisner 2006) Minimum risk training for neural machine translation (Shen et al. 2015)
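A minimal sketch of why risk is differentiable: with the model distribution written as a softmax over (toy) scores that stand in for the parameters θ, the expected error is a smooth function of those scores, so gradients flow. The `softmax`/`risk` helpers are invented for illustration.

```python
import math

def softmax(scores):
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def risk(scores, errors):
    """Expected error sum_E~ P(E~|F; theta) * error(E, E~), where the
    distribution over the (toy, enumerable) candidate set is a softmax
    over per-candidate scores."""
    return sum(p * e for p, e in zip(softmax(scores), errors))

errors = [0.0, 0.5, 1.0]
# Putting score mass on the low-error candidate lowers the risk,
# and the dependence on the scores is smooth.
```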
Sub-sampling

• Create a small sample of sentences (5-50), and calculate risk over that:

  $\mathrm{risk}(F, E, S) = \sum_{\tilde{E} \in S} \frac{P(\tilde{E} \mid F)}{Z}\, \mathrm{error}(E, \tilde{E})$

• Samples can be created using random sampling or n-best search
• If random sampling, make sure to deduplicate
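The sub-sampled risk can be sketched as below: deduplicate the sample, renormalize by Z (the total probability mass of the unique sampled translations), and take the expected error over the sample. `sampled_risk` is a hypothetical helper.

```python
def sampled_risk(sample):
    """Risk over a sub-sample S, with Z = sum of the unique sampled
    translations' model probabilities.

    sample: list of (translation, model_probability, error) triples;
    duplicates are removed first, as the slide advises for random
    sampling.
    """
    unique = {}
    for trans, p, err in sample:
        unique[trans] = (p, err)  # deduplicate by translation string
    z = sum(p for p, _ in unique.values())
    return sum((p / z) * err for p, err in unique.values())

# Duplicate draws of "a" count once: Z = 0.6 + 0.2, risk = 0.2/0.8 * 1.0
r = sampled_risk([("a", 0.6, 0.0), ("a", 0.6, 0.0), ("b", 0.2, 1.0)])
```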
<latexit sha1_base64="QJ/ljc72z58oUdsvi8ZHPU5Q/Xw=">ACK3icbVBNSwMxEM36bf2qevQSLEILWnZFUA+C6MVjFast3VKy6bQNzWaXZFYsy/4gL/4VQTyoePV/mNYKfj0IPN6bmcy8IJbCoOu+OBOTU9Mzs3PzuYXFpeWV/OralYkSzaHKIxnpWsAMSKGgigIl1GINLAwkXAf906F/fQPaiEhd4iCGZsi6SnQEZ2ilVv7UBylbqY9wi6kGoTqRHZxlxdp2vUSP6M5F0e8xTOvZNq2XfBl1aeVLoX4o2rRWauULbtkdgf4l3pgUyBiVv7Rb0c8CUEhl8yYhufG2EyZRsElZDk/MRAz3mdaFiqWAimY6OzeiWVdrUrmfQjpSv3ekLDRmEAa2MmTYM7+9ofif10iwc9BMhYoTBMU/P+okmJEh8nRtDAUQ4sYVwLuyvlPaYZR5tvzobg/T75L6nulg/L3vle4fhknMYc2SCbpEg8sk+OyRmpkCrh5I48kGfy4tw7T86r8/ZOuGMe9bJDzjvHwvDpmQ=</latexit> <latexit sha1_base64="QJ/ljc72z58oUdsvi8ZHPU5Q/Xw=">ACK3icbVBNSwMxEM36bf2qevQSLEILWnZFUA+C6MVjFast3VKy6bQNzWaXZFYsy/4gL/4VQTyoePV/mNYKfj0IPN6bmcy8IJbCoOu+OBOTU9Mzs3PzuYXFpeWV/OralYkSzaHKIxnpWsAMSKGgigIl1GINLAwkXAf906F/fQPaiEhd4iCGZsi6SnQEZ2ilVv7UBylbqY9wi6kGoTqRHZxlxdp2vUSP6M5F0e8xTOvZNq2XfBl1aeVLoX4o2rRWauULbtkdgf4l3pgUyBiVv7Rb0c8CUEhl8yYhufG2EyZRsElZDk/MRAz3mdaFiqWAimY6OzeiWVdrUrmfQjpSv3ekLDRmEAa2MmTYM7+9ofif10iwc9BMhYoTBMU/P+okmJEh8nRtDAUQ4sYVwLuyvlPaYZR5tvzobg/T75L6nulg/L3vle4fhknMYc2SCbpEg8sk+OyRmpkCrh5I48kGfy4tw7T86r8/ZOuGMe9bJDzjvHwvDpmQ=</latexit> <latexit sha1_base64="QJ/ljc72z58oUdsvi8ZHPU5Q/Xw=">ACK3icbVBNSwMxEM36bf2qevQSLEILWnZFUA+C6MVjFast3VKy6bQNzWaXZFYsy/4gL/4VQTyoePV/mNYKfj0IPN6bmcy8IJbCoOu+OBOTU9Mzs3PzuYXFpeWV/OralYkSzaHKIxnpWsAMSKGgigIl1GINLAwkXAf906F/fQPaiEhd4iCGZsi6SnQEZ2ilVv7UBylbqY9wi6kGoTqRHZxlxdp2vUSP6M5F0e8xTOvZNq2XfBl1aeVLoX4o2rRWauULbtkdgf4l3pgUyBiVv7Rb0c8CUEhl8yYhufG2EyZRsElZDk/MRAz3mdaFiqWAimY6OzeiWVdrUrmfQjpSv3ekLDRmEAa2MmTYM7+9ofif10iwc9BMhYoTBMU/P+okmJEh8nRtDAUQ4sYVwLuyvlPaYZR5tvzobg/T75L6nulg/L3vle4fhknMYc2SCbpEg8sk+OyRmpkCrh5I48kGfy4tw7T86r8/ZOuGMe9bJDzjvHwvDpmQ=</latexit> Policy Gradient/REINFORCE • Alternative way of maximizing expected reward, minimizing risk ` reinforce ( X, Y ) = − R ( ˆ Y , Y ) log P ( ˆ Y | X ) • Outputs that get a bigger reward will get a higher weight • Can show this converges to minimum-risk solution
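As a sketch of how this loss produces updates (a toy categorical "policy" stands in for a real decoder; all names are hypothetical), note that the gradient of −R·log P(ŷ) with respect to the logits is R·(softmax − one-hot):

```python
import math
import random

def reinforce_grad(logits, reward_fn, rng):
    """One REINFORCE sample/gradient for a categorical policy over outputs."""
    # Softmax over logits
    z = sum(math.exp(l) for l in logits)
    probs = [math.exp(l) / z for l in logits]
    # Sample an output y-hat from the model's own distribution
    y_hat = rng.choices(range(len(probs)), weights=probs, k=1)[0]
    r = reward_fn(y_hat)
    # d/d logit_i of -R * log P(y_hat) is R * (p_i - 1[i == y_hat])
    grad = [r * (p - (1.0 if i == y_hat else 0.0)) for i, p in enumerate(probs)]
    return grad, y_hat, r

rng = random.Random(0)
grad, y_hat, r = reinforce_grad([0.0, 1.0, 2.0], lambda y: 1.0, rng)
```

Bigger rewards scale the whole gradient up, which is exactly the "higher weight" in the bullet above.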
But Wait, why is Everyone Using MLE for NMT?
When Training goes Bad...

Chances are, this is you 😕

Minimum risk training for neural machine translation (Shen et al. 2015)
It Happens to the Best of Us

• Email from a famous MT researcher: "we also re-implemented MRT, but so far, training has been very unstable, and after improving for a bit, our models develop a bias towards producing ever-shorter translations..."
My Current Recipe for Stabilizing MRT/Reinforcement Learning
Warm-start

• Start training with maximum likelihood, then switch over to REINFORCE
• Works only in scenarios where we can run MLE (not latent variables or standard RL settings)
• MIXER (Ranzato et al. 2016) gradually transitions from MLE to the full objective
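The simplest version of this schedule is a hard switch (a minimal sketch; function and argument names are made up for illustration):

```python
def warm_start_schedule(total_steps, mle_steps):
    """Which objective to use at each training step: MLE first, then REINFORCE.

    A MIXER-style refinement would instead gradually shrink the number of
    reference-supervised prefix tokens; this sketch is only the hard switch.
    """
    return ["mle" if step < mle_steps else "reinforce"
            for step in range(total_steps)]

schedule = warm_start_schedule(total_steps=5, mle_steps=3)
```

In a real training loop, each entry would select which loss to compute for that step's batch.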
Adding a Baseline

• Basic idea: we have expectations about our reward for a particular sentence

                               Reward   Baseline   R-B
  "This is an easy sentence"   0.8      0.95       -0.15
  "Buffalo Buffalo Buffalo"    0.3      0.1        0.2

• We can instead weight our likelihood by R-B to reflect when we did better or worse than expected:

$$\ell_{\mathrm{baseline}}(X) = -(R(\hat{Y}, Y) - B(\hat{Y})) \log P(\hat{Y} \mid X)$$

• (Be careful not to backprop through the baseline)
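A minimal sketch of the baseline-corrected loss, reusing the toy rewards and baselines from the table above (the sampled log-probabilities are invented for illustration):

```python
import math

def baseline_loss(logp_yhat, reward, baseline):
    """Baseline-corrected REINFORCE loss for one sampled output.

    The baseline enters as a plain constant: in an autograd framework you
    would detach it so no gradient flows through it, per the caveat above.
    """
    return -(reward - baseline) * logp_yhat

# "This is an easy sentence": did worse than expected (R - B = -0.15)
loss_easy = baseline_loss(math.log(0.9), reward=0.8, baseline=0.95)
# "Buffalo Buffalo Buffalo": did better than expected (R - B = 0.2)
loss_hard = baseline_loss(math.log(0.5), reward=0.3, baseline=0.1)
```

Minimizing the loss raises P(ŷ) when R−B is positive and lowers it when negative, which is exactly the "better or worse than expected" weighting.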
Increasing Batch Size

• Because each sample will be high variance, we can sample many different examples before performing an update
• We can increase the number of examples (roll-outs) done before an update to stabilize
• We can also save previous roll-outs and re-use them when we update parameters (experience replay, Lin 1993)
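Experience replay can be sketched with a small ring buffer (a hypothetical minimal implementation, not from the slides):

```python
import random
from collections import deque

class ReplayBuffer:
    """Minimal experience-replay buffer for saved roll-outs."""

    def __init__(self, capacity):
        # deque with maxlen silently discards the oldest roll-outs
        self.buf = deque(maxlen=capacity)

    def add(self, rollout):
        self.buf.append(rollout)

    def sample(self, batch_size, rng=random):
        # Mix fresh and stale roll-outs into one larger, lower-variance update
        return rng.sample(list(self.buf), min(batch_size, len(self.buf)))

buf = ReplayBuffer(capacity=2)
for r in ["rollout-1", "rollout-2", "rollout-3"]:
    buf.add(r)
batch = buf.sample(5)
```

One caveat worth keeping in mind: replayed roll-outs come from an older policy, so off-policy corrections (e.g. importance weights) may be needed in practice.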
Adding Temperature

$$\mathrm{risk}(F, E, \theta, \tau, S) = \sum_{\tilde{E} \in S} \frac{P(\tilde{E} \mid F; \theta)^{1/\tau}}{Z}\,\mathrm{error}(E, \tilde{E})$$

• Temperature adjusts the peakiness of the distribution

(Figure: the same distribution plotted at τ = 1, τ = 0.5, τ = 0.25, and τ = 0.05, growing sharper as τ decreases)

• With a small sample, setting temperature > 1 accounts for unsampled hypotheses that should be in the denominator
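The P^(1/τ)/Z rescaling from the equation above can be sketched directly on a small sampled distribution (the probabilities are made-up toy values):

```python
def temper(probs, tau):
    """Rescale a distribution by temperature tau: p_i^(1/tau) / Z."""
    scaled = [p ** (1.0 / tau) for p in probs]
    z = sum(scaled)  # renormalize within the sample
    return [s / z for s in scaled]

flat = temper([0.7, 0.2, 0.1], tau=2.0)   # tau > 1 flattens the distribution
peaky = temper([0.7, 0.2, 0.1], tau=0.5)  # tau < 1 sharpens it
```

With a small sample, the flattened (τ > 1) version shifts mass away from the top hypothesis, compensating for the unsampled hypotheses missing from Z.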
Contrasting Phrase-based SMT and NMT
Phrase-based SMT MERT and NMT MinRisk/REINFORCE

                          PBMT+MERT                               NMT+MinRisk
  Model                   PBMT                                    NMT
  Optimized Parameters    5-30 log-linear weights (others MLE)    Millions
  Objective               Error                                   Risk