Linear classifiers: prediction equations



  1. Linear classifiers: prediction equations. LINEAR CLASSIFIERS IN PYTHON. Michael (Mike) Gelbart, Instructor, The University of British Columbia

  2. Dot Products
      x = np.arange(3)
      x
      array([0, 1, 2])
      y = np.arange(3, 6)
      y
      array([3, 4, 5])
      x * y
      array([0, 4, 10])
      np.sum(x * y)
      14
      x @ y
      14
      x @ y is called the dot product of x and y, and is written x ⋅ y.
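A quick check (not from the slides) that the different ways of computing the dot product in NumPy agree:

    import numpy as np

    x = np.arange(3)       # array([0, 1, 2])
    y = np.arange(3, 6)    # array([3, 4, 5])

    # Elementwise product, then sum: 0*3 + 1*4 + 2*5 = 14
    print(np.sum(x * y))   # 14
    print(x @ y)           # 14; @ is the dot product operator
    print(np.dot(x, y))    # 14; equivalent to x @ y for 1-D arrays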

  3. Linear classifier prediction
      raw model output = coefficients ⋅ features + intercept
      Linear classifier prediction: compute the raw model output, then check the sign. If positive, predict one class; if negative, predict the other class.
      This is the same for logistic regression and linear SVM: fit is different, but predict is the same.
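A minimal sketch of the sign rule described above; the coefficient and intercept values and the class labels are made up for illustration:

    import numpy as np

    # Hypothetical coefficients and intercept for a two-feature linear classifier
    coefficients = np.array([1.0, -2.0])
    intercept = 0.5

    def predict(features):
        raw_model_output = coefficients @ features + intercept
        # Check the sign: positive -> one class, negative -> the other
        return 1 if raw_model_output > 0 else 0

    print(predict(np.array([3.0, 1.0])))  # raw output =  1.5, predicts 1
    print(predict(np.array([0.0, 1.0])))  # raw output = -1.5, predicts 0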

  4. How LogisticRegression makes predictions
      raw model output = coefficients ⋅ features + intercept
      lr = LogisticRegression()
      lr.fit(X, y)
      lr.predict(X)[10]
      0
      lr.predict(X)[20]
      1
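The slides don't show how X and y were loaded; below is a self-contained sketch using a stand-in scikit-learn dataset, so the exact predictions will differ from the slide's 0 and 1:

    import numpy as np
    from sklearn.datasets import load_breast_cancer
    from sklearn.linear_model import LogisticRegression

    X, y = load_breast_cancer(return_X_y=True)  # stand-in for the slides' data

    lr = LogisticRegression(max_iter=5000)  # larger max_iter so the solver converges
    lr.fit(X, y)

    # Predicted class labels for two individual examples
    print(lr.predict(X)[10])
    print(lr.predict(X)[20])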

  5. How LogisticRegression makes predictions (cont.)
      lr.coef_ @ X[10] + lr.intercept_  # raw model output
      array([-33.78572166])
      lr.coef_ @ X[20] + lr.intercept_  # raw model output
      array([ 0.08050621])
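Continuing the sketch above (again with stand-in data, so the numbers won't match the slide): the raw model output is also what scikit-learn's decision_function returns, and its sign determines the predicted class.

    raw = lr.coef_ @ X[10] + lr.intercept_  # raw model output, shape (1,)
    print(raw)
    print(lr.decision_function(X)[10])      # same value via the scikit-learn API
    print(raw[0] > 0, lr.predict(X)[10])    # positive raw output <-> class 1 here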

  6. The raw model output (figure)

  7. The raw model output (figure)

  8. The raw model output (figure)

  9. Let's practice!

  10. What is a loss function? LINEAR CLASSIFIERS IN PYTHON. Michael Gelbart, Instructor, The University of British Columbia

  11. Least squares: the squared loss
      scikit-learn's LinearRegression minimizes a loss:
      $\sum_{i=1}^{n} (\text{true } i\text{th target value} - \text{predicted } i\text{th target value})^2$
      Minimization is with respect to coefficients or parameters of the model.
      Note that in scikit-learn, model.score() isn't necessarily the loss function.
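A minimal sketch of the squared loss written out by hand; the function name and the toy targets are illustrative:

    import numpy as np

    def squared_loss(y_true, y_pred):
        # Sum over examples of (true target value - predicted target value)^2
        return np.sum((y_true - y_pred) ** 2)

    y_true = np.array([3.0, -0.5, 2.0])
    y_pred = np.array([2.5,  0.0, 2.0])
    print(squared_loss(y_true, y_pred))  # 0.25 + 0.25 + 0.0 = 0.5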

  12. Classification errors: the 0-1 loss
      Squared loss is not appropriate for classification problems (more on this later).
      A natural loss for a classification problem is the number of errors.
      This is the 0-1 loss: it's 0 for a correct prediction and 1 for an incorrect prediction.
      But this loss is hard to minimize!
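The 0-1 loss written out the same way, again with made-up labels:

    import numpy as np

    def zero_one_loss(y_true, y_pred):
        # 0 for each correct prediction, 1 for each error, summed over examples
        return np.sum(y_true != y_pred)

    y_true = np.array([1, 0, 1, 1])
    y_pred = np.array([1, 1, 1, 0])
    print(zero_one_loss(y_true, y_pred))  # 2 errors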

  13. Minimizing a loss
      from scipy.optimize import minimize
      minimize(np.square, 0).x
      array([0.])
      minimize(np.square, 2).x
      array([-1.88846401e-08])
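Going one step beyond the slide, one could apply the same minimize call to the squared loss from slide 11 to fit linear regression coefficients directly; the synthetic data and the intercept-free setup here are assumptions for the sketch:

    import numpy as np
    from scipy.optimize import minimize
    from sklearn.linear_model import LinearRegression

    # Synthetic data: y depends linearly on two features, plus a little noise
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 2))
    y = X @ np.array([2.0, -1.0]) + 0.1 * rng.normal(size=100)

    def squared_loss(w):
        # Squared loss as a function of the coefficients (no intercept, for brevity)
        return np.sum((y - X @ w) ** 2)

    w_hat = minimize(squared_loss, np.zeros(2)).x
    print(w_hat)                                                  # close to [2, -1]
    print(LinearRegression(fit_intercept=False).fit(X, y).coef_)  # essentially the same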

  14. Let's practice!

  15. Loss function diagrams. LINEAR CLASSIFIERS IN PYTHON. Michael (Mike) Gelbart, Instructor, The University of British Columbia

  16. The raw model output (figure)

  17. 0-1 loss diagram

  18. Linear regression loss diagram

  19. Logistic loss diagram

  20. Hinge loss diagram

  21. Hinge loss diagram
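The diagram slides above are not reproduced here, but plots of this kind can be sketched by writing each loss as a function of the raw model output for an example whose true class is the positive one (the logistic and hinge formulas below are the standard ones, not taken from the slides):

    import numpy as np
    import matplotlib.pyplot as plt

    z = np.linspace(-3, 3, 300)           # raw model output
    zero_one = (z < 0).astype(float)      # 1 when the raw output has the wrong sign
    logistic = np.log(1 + np.exp(-z))     # loss minimized by logistic regression
    hinge = np.maximum(0, 1 - z)          # loss minimized by linear SVMs

    plt.plot(z, zero_one, label="0-1 loss")
    plt.plot(z, logistic, label="logistic loss")
    plt.plot(z, hinge, label="hinge loss")
    plt.xlabel("raw model output")
    plt.ylabel("loss")
    plt.legend()
    plt.show()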

  22. Let's practice!
