

  1. Introduction Lijun Zhang zlj@nju.edu.cn http://cs.nju.edu.cn/zlj

  2. Outline  Mathematical Optimization  Least-squares  Linear Programming  Convex Optimization  Nonlinear Optimization  Summary

  3. Outline  Mathematical Optimization  Least-squares  Linear Programming  Convex Optimization  Nonlinear Optimization  Summary

  4. Mathematical Optimization (1)
   • Optimization Problem: $\min_x\ f_0(x)$ subject to $f_i(x) \le b_i,\ i = 1, \dots, m$
   • Optimization Variable: $x = (x_1, \dots, x_n)$
   • Objective Function: $f_0 : \mathbb{R}^n \to \mathbb{R}$
   • Constraint Functions: $f_i : \mathbb{R}^n \to \mathbb{R},\ i = 1, \dots, m$
   • $x^\star$ is called optimal, or a solution, if it has the smallest objective value among all vectors satisfying the constraints: for any $z$ with $f_1(z) \le b_1, \dots, f_m(z) \le b_m$, we have $f_0(z) \ge f_0(x^\star)$
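
  A tiny worked instance of this definition (my own illustration, not from the slides): take $f_0(x) = x_1^2 + x_2^2$ with the single constraint $-x_1 - x_2 \le -1$, i.e. $x_1 + x_2 \ge 1$. The point $x^\star = (1/2, 1/2)$ is feasible with $f_0(x^\star) = 1/2$, and every feasible $z$ satisfies $f_0(z) \ge 1/2$ (it is the closest feasible point to the origin), so $x^\star$ is the solution.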

  5. Mathematical Optimization (2)
   • Linear Program: $f_i(\alpha x + \beta y) = \alpha f_i(x) + \beta f_i(y)$ for all $x, y \in \mathbb{R}^n$ and all $\alpha, \beta \in \mathbb{R}$
   • Nonlinear Program: if the optimization problem is not linear
   • Convex Optimization Problem: $f_i(\alpha x + \beta y) \le \alpha f_i(x) + \beta f_i(y)$ for all $x, y \in \mathbb{R}^n$ and all $\alpha, \beta \in \mathbb{R}$ with $\alpha + \beta = 1$, $\alpha \ge 0$, $\beta \ge 0$

  6. Applications
   • Abstraction: $\min_x\ f_0(x)$ subject to $f_i(x) \le b_i,\ i = 1, \dots, m$
     - $x$ represents the choice made
     - The constraints $f_i(x) \le b_i$ represent firm requirements that limit the possible choices
     - $f_0(x)$ represents the cost of choosing $x$
   • A solution corresponds to a choice that has minimum cost, among all choices that meet the requirements

  7. Portfolio Optimization (1)
   • Variables
     - $x_i$ represents the investment in the $i$-th asset
     - $x \in \mathbb{R}^n$ describes the overall portfolio allocation across the set of assets
   • Constraints
     - A limit on the budget
     - The requirement that investments are nonnegative
     - A minimum acceptable value of expected return for the whole portfolio
   • Objective
     - Minimize the variance of the portfolio return

  8. Portfolio Optimization (2)
   • We want to spread our money over $n$ different assets; the fraction of our money we invest in asset $i$ is denoted $x_i$, so $\sum_{i=1}^{n} x_i = 1$ and $0 \le x_i \le 1$ for $i = 1, \dots, n$
   • Denote the returns of these investments as $p_1, \dots, p_n$. The expected returns, usually calculated from some kind of historical average, are $\mu_1, \dots, \mu_n$. We specify a target expected return $r$, which means $\mathrm{E}\big[\sum_{i=1}^{n} p_i x_i\big] = \sum_{i=1}^{n} \mathrm{E}[p_i]\, x_i = \sum_{i=1}^{n} \mu_i x_i \ge r$

  9. Portfolio Optimization (3)
   • We want to solve for the $x$ that achieves this level of return while minimizing the variance of our return: $\mathrm{Var}\big(\sum_{i=1}^{n} p_i x_i\big) = \sum_{i=1}^{n}\sum_{j=1}^{n} x_i x_j\, \mathrm{Cov}(p_i, p_j) = x^\top \Sigma x$
   • Our Optimization Program: $\min_x\ x^\top \Sigma x$ subject to $\sum_{i=1}^{n} \mu_i x_i \ge r$, $\sum_{i=1}^{n} x_i = 1$, $x \ge 0$
   • Quadratic program with linear constraints, convex
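
  A minimal sketch of this quadratic program, assuming the CVXPY modeling library and synthetic values for $\mu$, $\Sigma$, and $r$ (all illustrative, not from the slides):

    import cvxpy as cp
    import numpy as np

    # Illustrative data: n assets, random expected returns, random PSD covariance
    rng = np.random.default_rng(0)
    n = 5
    mu = rng.uniform(0.01, 0.10, n)          # expected returns
    G = rng.standard_normal((n, n))
    Sigma = G @ G.T / n                      # covariance matrix (PSD)
    r = 0.05                                 # target expected return

    x = cp.Variable(n)                       # portfolio weights
    prob = cp.Problem(
        cp.Minimize(cp.quad_form(x, Sigma)), # portfolio variance x' Sigma x
        [mu @ x >= r,                        # minimum expected return
         cp.sum(x) == 1,                     # budget constraint
         x >= 0])                            # nonnegative investments
    prob.solve()
    print(x.value, prob.value)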

  10. Device Sizing
   • Variables: $x \in \mathbb{R}^n$ describes the widths and lengths of the devices
   • Constraints
     - Limits on the device sizes
     - Timing requirements
     - A limit on the total area of the circuit
   • Objective: minimize the total power consumed by the circuit

  11. Data Fitting
   • Variables: $x \in \mathbb{R}^n$ describes parameters in the model
   • Constraints
     - Prior information
     - Required limits on the parameters (such as nonnegativity)
   • Objective: minimize the prediction error between the observed data and the values predicted by the model

  12. Solving Optimization Problems
   • General Optimization Problem
     - Very difficult to solve
     - Constraints can be very complicated, and the number of variables can be very large
     - Methods involve some compromise, e.g., long computation time, or accepting a suboptimal solution
   • Exceptions
     - Least-squares problems
     - Linear programming problems
     - Convex optimization problems

  13. Outline  Mathematical Optimization  Least-squares  Linear Programming  Convex Optimization  Nonlinear Optimization  Summary

  14. Least-squares Problems (1)
   • The Problem: $\min_x\ f_0(x) = \|Ax - b\|_2^2 = \sum_{i=1}^{k} (a_i^\top x - b_i)^2$
     - $A \in \mathbb{R}^{k \times n}$, $a_i^\top$ is the $i$-th row of $A$, $b \in \mathbb{R}^k$
     - $x \in \mathbb{R}^n$ is the optimization variable
   • How to solve it?

  15. Least-squares Problems (1)
   • The Problem: $\min_x\ f_0(x) = \|Ax - b\|_2^2 = \sum_{i=1}^{k} (a_i^\top x - b_i)^2$
     - $A \in \mathbb{R}^{k \times n}$, $a_i^\top$ is the $i$-th row of $A$, $b \in \mathbb{R}^k$
     - $x \in \mathbb{R}^n$ is the optimization variable
   • Setting the gradient to 0: $\nabla f_0(x) = 2A^\top (Ax - b) = 0 \ \Rightarrow\ x = (A^\top A)^{-1} A^\top b$
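
  A minimal NumPy sketch of this solution with random illustrative data (not from the slides); in practice np.linalg.lstsq is preferred over forming $(A^\top A)^{-1}$ explicitly, for numerical stability:

    import numpy as np

    # Illustrative data: k equations, n unknowns
    rng = np.random.default_rng(0)
    k, n = 20, 5
    A = rng.standard_normal((k, n))
    b = rng.standard_normal(k)

    # Normal equations A'A x = A'b (the closed form from the slide)
    x_normal = np.linalg.solve(A.T @ A, A.T @ b)

    # Numerically preferable: QR/SVD-based least-squares solver
    x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)

    assert np.allclose(x_normal, x_lstsq)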

  16. Least-squares Problems (2)
   • A Set of Linear Equations: $A^\top A x = A^\top b$
   • Solving least-squares problems
     - Reliable and efficient algorithms and software
     - Computation time proportional to $n^2 k$; less if structured
     - A mature technology
   • Challenging for extremely large problems

  17. Using Least-squares
   • Easy to Recognize
   • Weighted least-squares: $\min_x\ \sum_{i=1}^{k} w_i (a_i^\top x - b_i)^2$
     - The weights $w_i > 0$ express the different importance of the terms

  18. Using Least-squares
   • Easy to Recognize
   • Weighted least-squares: $\min_x\ \sum_{i=1}^{k} w_i (a_i^\top x - b_i)^2$
     - The weights $w_i > 0$ express the different importance of the terms
   • Regularization: $\min_x\ \sum_{i=1}^{k} (a_i^\top x - b_i)^2 + \rho \sum_{i=1}^{n} x_i^2$
     - More stable
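
  A minimal NumPy sketch of both variants, with illustrative data (the weights w and the parameter rho below are my own choices, not from the slides); each reduces to an ordinary least-squares solve:

    import numpy as np

    rng = np.random.default_rng(0)
    k, n = 20, 5
    A = rng.standard_normal((k, n))
    b = rng.standard_normal(k)

    # Weighted least-squares: scale each row by sqrt(w_i), then solve
    w = rng.uniform(0.5, 2.0, k)             # illustrative weights w_i > 0
    sw = np.sqrt(w)
    x_wls, *_ = np.linalg.lstsq(A * sw[:, None], b * sw, rcond=None)

    # Regularization (ridge): solve (A'A + rho I) x = A'b
    rho = 0.1                                # illustrative parameter
    x_reg = np.linalg.solve(A.T @ A + rho * np.eye(n), A.T @ b)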

  19. Outline  Mathematical Optimization  Least-squares  Linear Programming  Convex Optimization  Nonlinear Optimization  Summary

  20. Linear Programming
   • The Problem: $\min_x\ c^\top x$ subject to $a_i^\top x \le b_i,\ i = 1, \dots, m$
   • Solving Linear Programs
     - No analytical formula for the solution
     - Reliable and efficient algorithms and software
     - Computation time proportional to $n^2 m$ if $m \ge n$; less with structure
     - A mature technology
   • Challenging for extremely large problems

  21. Using Linear Programming
   • Not as easy to recognize
   • Chebyshev Approximation Problem: $\min_x\ \max_{i=1,\dots,k} |a_i^\top x - b_i|$
   • Equivalent linear program: $\min_{x,t}\ t$ subject to $a_i^\top x - t \le b_i$ and $-a_i^\top x - t \le -b_i$, $i = 1, \dots, k$
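
  A minimal sketch of this reduction, assuming SciPy's linprog and random illustrative data (not from the slides); the decision vector stacks $x$ and the extra scalar $t$:

    import numpy as np
    from scipy.optimize import linprog

    rng = np.random.default_rng(0)
    k, n = 20, 5
    A = rng.standard_normal((k, n))
    b = rng.standard_normal(k)

    # Variables z = (x, t); objective: minimize t
    c = np.zeros(n + 1)
    c[-1] = 1.0

    # Constraints: a_i'x - t <= b_i  and  -a_i'x - t <= -b_i
    ones = np.ones((k, 1))
    A_ub = np.vstack([np.hstack([A, -ones]),
                      np.hstack([-A, -ones])])
    b_ub = np.concatenate([b, -b])

    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] * (n + 1))  # all variables free
    x, t = res.x[:n], res.x[-1]
    print(t, np.max(np.abs(A @ x - b)))             # the two values agree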

  22. Outline  Mathematical Optimization  Least-squares  Linear Programming  Convex Optimization  Nonlinear Optimization  Summary

  23. Convex Optimization
   • Why Convexity? “The great watershed in optimization isn’t between linearity and nonlinearity, but convexity and nonconvexity.” — R. Rockafellar, SIAM Review 1993

  24. Convex Optimization
   • Why Convexity? “The great watershed in optimization isn’t between linearity and nonlinearity, but convexity and nonconvexity.” — R. Rockafellar, SIAM Review 1993
   • Local minimizers are also global minimizers.

  25. Convex Optimization Problems (1)
   • The Problem: $\min_x\ f_0(x)$ subject to $f_i(x) \le b_i,\ i = 1, \dots, m$
   • The functions $f_0, \dots, f_m$ are convex: $f_i(\alpha x + \beta y) \le \alpha f_i(x) + \beta f_i(y)$ for all $x, y \in \mathbb{R}^n$ and all $\alpha, \beta \in \mathbb{R}$ with $\alpha + \beta = 1$, $\alpha \ge 0$, $\beta \ge 0$
   • Least-squares and linear programs are special cases

  26. Convex Optimization Problems (2)
   • Solving Convex Optimization Problems
     - No analytical solution
     - Reliable and efficient algorithms (e.g., interior-point methods)
     - Computation time (roughly) proportional to $\max\{n^3, n^2 m, F\}$, where $F$ is the cost of evaluating the $f_i$'s and their first and second derivatives
     - Almost a technology

  27. Using Convex Optimization  Often difficult to recognize  Many tricks for transforming problems into convex form  Surprisingly many problems can be solved via convex optimization

  28. An Example (1)
   • $m$ lamps illuminating $n$ (small, flat) patches
   • Intensity $I_k$ at patch $k$ depends linearly on the lamp powers $p_j$: $I_k = \sum_{j=1}^{m} a_{kj} p_j$, where $a_{kj} = r_{kj}^{-2} \max(\cos \theta_{kj}, 0)$, with $r_{kj}$ the distance and $\theta_{kj}$ the angle between patch $k$ and lamp $j$
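
  A short NumPy sketch of this illumination model; the lamp positions, patch positions, and normals below are my own hypothetical geometry, not from the slides:

    import numpy as np

    rng = np.random.default_rng(0)
    m, n = 10, 20                                  # m lamps, n patches
    lamps = rng.uniform([0, 0, 3], [10, 10, 5], size=(m, 3))
    patches = np.column_stack([rng.uniform(0, 10, (n, 2)), np.zeros(n)])
    normals = np.tile([0.0, 0.0, 1.0], (n, 1))     # flat patches facing up

    # a_kj = r_kj^{-2} max(cos theta_kj, 0)
    d = lamps[None, :, :] - patches[:, None, :]    # (n, m, 3) displacements
    r = np.linalg.norm(d, axis=2)                  # distances r_kj
    cos_theta = np.einsum('kji,ki->kj', d / r[..., None], normals)
    A = np.maximum(cos_theta, 0) / r**2            # illumination matrix
    I = A @ rng.uniform(0, 1, m)                   # intensities for random powers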

  29. An Example (2)
   • Achieve desired illumination $I_{\mathrm{des}}$ with bounded lamp powers: $\min_p\ \max_{k=1,\dots,n} |\log I_k - \log I_{\mathrm{des}}|$ subject to $0 \le p_j \le p_{\max},\ j = 1, \dots, m$
   • How to solve it?

  30. An Example (3)
   1. Use uniform power: $p_j = p$, vary $p$
   2. Use least-squares: $\min_p\ \sum_{k=1}^{n} (I_k - I_{\mathrm{des}})^2$
      - Round $p_j$ if $p_j > p_{\max}$ or $p_j < 0$
   3. Use weighted least-squares: $\min_p\ \sum_{k=1}^{n} (I_k - I_{\mathrm{des}})^2 + \sum_{j=1}^{m} w_j (p_j - p_{\max}/2)^2$
      - Adjust weights $w_j$ until $0 \le p_j \le p_{\max}$

  31. An Example (4)
   4. Use linear programming: $\min_p\ \max_{k=1,\dots,n} |I_k - I_{\mathrm{des}}|$ subject to $0 \le p_j \le p_{\max},\ j = 1, \dots, m$
   5. Use convex optimization: $\min_p\ \max_{k=1,\dots,n} h(I_k / I_{\mathrm{des}})$ subject to $0 \le p_j \le p_{\max},\ j = 1, \dots, m$, with $h(u) = \max\{u, 1/u\}$

  32. An Example (5)
   • $\min_p\ f_0(p) = \max_{k=1,\dots,n} h(I_k / I_{\mathrm{des}})$ subject to $0 \le p_j \le p_{\max}$, where $h(u) = \max\{u, 1/u\}$
   • $h$ is convex on $u > 0$, so $f_0$ is a pointwise maximum of convex functions and hence convex: this is a convex optimization problem
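
  A minimal CVXPY sketch of formulation 5, with the illumination matrix A, desired level I_des, and power limit p_max as illustrative data (not from the slides); inv_pos supplies the $1/u$ branch of $h$ in disciplined-convex form:

    import cvxpy as cp
    import numpy as np

    # Illustrative data: n patches, m lamps
    rng = np.random.default_rng(0)
    n, m = 20, 10
    A = rng.uniform(0.0, 1.0, (n, m))        # illumination matrix a_kj
    I_des, p_max = 1.0, 2.0

    p = cp.Variable(m)
    I = A @ p                                # intensities, affine in p
    # h(u) = max{u, 1/u}: inv_pos gives 1/u for u > 0, keeping convexity
    f0 = cp.max(cp.maximum(I / I_des, cp.inv_pos(I / I_des)))
    prob = cp.Problem(cp.Minimize(f0), [p >= 0, p <= p_max])
    prob.solve()
    print(prob.value, p.value)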

  33. Outline  Mathematical Optimization  Least-squares  Linear Programming  Convex Optimization  Nonlinear Optimization  Summary

  34. Nonlinear Optimization
   • An optimization problem in which the objective or constraint functions are not linear, but are not known to be convex
   • Sadly, there are no effective methods for solving the general nonlinear programming problem
   • Could be NP-hard
   • We need to compromise

  35. Local Optimization Methods
   • Find a point that minimizes $f_0$ among feasible points near it
   • The compromise is to give up seeking the optimal
   • Fast, can handle large problems
   • Require differentiability
   • Require an initial guess
   • Provide no information about the distance to the (global) optimum
   • Local optimization methods are more art than technology

  36. Comparisons

      Method                                          Formulating the Problem   Solving the Problem
      Local optimization for nonlinear programming    Straightforward           Art
      Convex optimization                             Art                       Standard
