Optimizing the Algorithm
Hard-Margin Objective
● Current objective:
● Want to relax hard constraint
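For reference, the standard hard-margin structural SVM objective (written here in generic joint-feature-map notation, which may differ from what the original slide showed) requires every training example to beat every incorrect output by a fixed margin:

\[
\min_{w}\ \tfrac{1}{2}\lVert w\rVert^2
\quad\text{s.t.}\quad
w^\top\big(\Psi(x_i,y_i)-\Psi(x_i,\bar y)\big)\;\ge\;1
\qquad\forall i,\ \forall\bar y\ne y_i .
\]

These constraints are infeasible for noisy or non-separable data, which is why the next slide relaxes them.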
Soft-Margin Objective
● New objective:
Loss Function
● Takes into account window overlaps up to 50%
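The exact loss is not recoverable from the slide; one overlap-based loss consistent with the bullet, given purely as an assumed illustration, penalizes a predicted window in proportion to how little it overlaps the ground-truth window, saturating at 1 once the overlap drops below 50%:

\[
\Delta(y,\bar y)\;=\;\min\!\big(1,\;2\,(1-\mathrm{overlap}(y,\bar y))\big),
\qquad
\mathrm{overlap}(y,\bar y)\;=\;\frac{\mathrm{Area}(y\cap\bar y)}{\mathrm{Area}(y\cup\bar y)} .
\]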
Soft-Margin Objective
● New objective:
● Will use 1-slack cutting plane
○ Want a faster, more scalable, and parallelizable solver
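Assuming the usual n-slack margin-rescaling form (one slack variable per training example), the relaxed objective reads:

\[
\min_{w,\;\xi\ge 0}\ \tfrac{1}{2}\lVert w\rVert^2+\frac{C}{n}\sum_{i=1}^{n}\xi_i
\quad\text{s.t.}\quad
w^\top\big(\Psi(x_i,y_i)-\Psi(x_i,\bar y)\big)\;\ge\;\Delta(y_i,\bar y)-\xi_i
\qquad\forall i,\ \forall\bar y\in\mathcal Y .
\]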
1-Slack Formulation
● New objective:
● Extremely sparse solutions
○ Number of non-zero dual variables independent of number of training examples
● Size of cutting plane model and number of iterations bounded by:
○ Regularization constant
○ Desired precision of solution
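In the 1-slack reformulation (following Joachims, Finley, and Yu, "Cutting-Plane Training of Structural SVMs"; the slide's exact notation is not recoverable), the n per-example slacks are replaced by a single slack ξ shared by one joint constraint per labelling of the whole training set:

\[
\min_{w,\;\xi\ge 0}\ \tfrac{1}{2}\lVert w\rVert^2 + C\,\xi
\quad\text{s.t.}\quad
\frac{1}{n}\sum_{i=1}^{n} w^\top\big(\Psi(x_i,y_i)-\Psi(x_i,\bar y_i)\big)
\;\ge\;\frac{1}{n}\sum_{i=1}^{n}\Delta(y_i,\bar y_i)\;-\;\xi
\qquad\forall(\bar y_1,\dots,\bar y_n)\in\mathcal Y^{\,n} .
\]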
1-Slack Formulation
● New objective:
● where the entries are the most violated constraints
● Greedily find all entries
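Under the same assumed notation, the most violated constraint for the current w maximizes loss plus score independently for each training example, which is what lets the entries be found greedily, one example at a time:

\[
\bar y_i\;=\;\operatorname*{argmax}_{\bar y\in\mathcal Y}\Big[\Delta(y_i,\bar y)+w^\top\Psi(x_i,\bar y)\Big],
\qquad i=1,\dots,n .
\]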
1-Slack Formulation
● New objective:
● Same as:
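Moving the joint constraint into the objective, the 1-slack problem is equivalent to an unconstrained regularized risk minimization, which the following slides approximate:

\[
\min_{w}\ \tfrac{1}{2}\lVert w\rVert^2 + C\,R(w).
\]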
Cutting Plane Approximation
● Find lower-bound approximation by piecewise linear functions
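The underlying fact: any convex function f admits an affine lower bound (a "cutting plane") at each point w_j, built from a subgradient g_j, and taking the maximum over several such planes yields a piecewise linear lower bound:

\[
f(w)\;\ge\;f(w_j)+\langle g_j,\;w-w_j\rangle
\qquad\text{for all }w,\quad g_j\in\partial f(w_j).
\]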
Cutting Plane Formulation
● Current objective:
● Bounded by:
● R(w) is the empirical risk
○ How violated the constraints are
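With the assumed notation from above, the empirical risk measuring how violated the constraints are is the structured hinge loss:

\[
R(w)\;=\;\frac{1}{n}\sum_{i=1}^{n}\max_{\bar y\in\mathcal Y}
\Big[\Delta(y_i,\bar y)-w^\top\big(\Psi(x_i,y_i)-\Psi(x_i,\bar y)\big)\Big].
\]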
Approximating R(w)
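Since R(w) is convex (a pointwise maximum of affine functions), it can be bounded from below by the cutting planes collected so far. Writing plane j as ⟨a_j, w⟩ + b_j, a standard choice (assumed here) builds it from the most violated constraints found at iteration j:

\[
R(w)\;\ge\;R_t(w)\;=\;\max\Big(0,\;\max_{j=1,\dots,t}\big[\langle a_j,w\rangle+b_j\big]\Big),
\qquad
a_j=-\frac{1}{n}\sum_{i=1}^{n}\big(\Psi(x_i,y_i)-\Psi(x_i,\bar y_i)\big),
\quad
b_j=\frac{1}{n}\sum_{i=1}^{n}\Delta(y_i,\bar y_i).
\]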
Cutting Plane Algorithm
● Reduced objective:
● Iterate j = 1, ..., t until convergence:
1. Find the new solution by solving the reduced objective
■ t will typically be small (~10-100), so we can use off-the-shelf solvers
2. Save the hyperplane
■ needed to update the approximation of R(w)
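As a concrete illustration of the loop above, here is a minimal sketch in Python of 1-slack cutting-plane training. Everything in it is an assumption made for illustration: it instantiates the structural SVM for plain multiclass classification (block joint feature map, 0/1 loss) rather than the window-detection task on the slides, the function names (fit_one_slack, most_violated, solve_reduced_qp) are invented, and SciPy's SLSQP routine stands in for an off-the-shelf solver of the small reduced QP.

import numpy as np
from scipy.optimize import minimize

def joint_feature(x, y, n_classes):
    # Psi(x, y): copy x into the block of class y.
    phi = np.zeros(len(x) * n_classes)
    phi[y * len(x):(y + 1) * len(x)] = x
    return phi

def most_violated(w, x, y, n_classes):
    # Separation oracle: argmax over ybar of Delta(y, ybar) + w . Psi(x, ybar).
    scores = [(ybar != y) + w @ joint_feature(x, ybar, n_classes)
              for ybar in range(n_classes)]
    return int(np.argmax(scores))

def solve_reduced_qp(planes, C, dim):
    # Reduced objective: min 0.5*||w||^2 + C*xi  s.t.  xi >= 0, xi >= b_j + a_j . w.
    def obj(z):
        w, xi = z[:dim], z[-1]
        return 0.5 * w @ w + C * xi
    cons = [{'type': 'ineq', 'fun': lambda z: z[-1]}]            # xi >= 0
    for a, b in planes:                                           # xi >= b_j + a_j . w
        cons.append({'type': 'ineq',
                     'fun': lambda z, a=a, b=b: z[-1] - (b + a @ z[:dim])})
    res = minimize(obj, np.zeros(dim + 1), method='SLSQP', constraints=cons)
    return res.x[:dim], res.x[-1]

def fit_one_slack(X, Y, n_classes, C=10.0, eps=1e-3, max_iter=50):
    n, d = X.shape
    dim = d * n_classes
    w, xi, planes = np.zeros(dim), 0.0, []
    for _ in range(max_iter):
        # 1. Greedily find the most violated constraint, one entry per example.
        ybars = [most_violated(w, X[i], Y[i], n_classes) for i in range(n)]
        # 2. Turn it into a cutting plane  xi >= b + a . w.
        a = -np.mean([joint_feature(X[i], Y[i], n_classes)
                      - joint_feature(X[i], ybars[i], n_classes)
                      for i in range(n)], axis=0)
        b = float(np.mean([Y[i] != ybars[i] for i in range(n)]))
        if b + a @ w <= xi + eps:   # new constraint violated by less than eps: converged
            break
        planes.append((a, b))
        # 3. Re-solve the reduced objective over all planes collected so far.
        w, xi = solve_reduced_qp(planes, C, dim)
    return w

# Tiny usage example on synthetic, linearly separable data.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 2))
Y = (X[:, 0] + X[:, 1] > 0).astype(int)
w = fit_one_slack(X, Y, n_classes=2)
pred = [int(np.argmax([w @ joint_feature(x, c, 2) for c in range(2)])) for x in X]
print("training accuracy:", np.mean(np.array(pred) == Y))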