
Nonlinear Equations: How can we solve these equations? - PowerPoint PPT Presentation



  1. Nonlinear Equations

  2. How can we solve these equations? • Spring force: F = k x, with k = 40 N/m. What is the displacement when F = 2 N?
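The spring relation is linear, so no iterative method is needed; inverting F = k x gives the displacement directly. A minimal check with the slide's values (variable names are mine):

```python
k = 40.0   # spring constant, N/m (from the slide)
F = 2.0    # applied force, N
x = F / k  # F = k*x is linear, so invert directly; displacement in m
```

This gives x = 0.05 m. The drag problem on the next slide is nonlinear in v, which is what motivates the root-finding methods that follow.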

  3. How can we solve these equations? • Drag force: F = (1/2) c_d ρ A v^2 = f(v), with the combined coefficient (1/2) c_d ρ A = 0.5 kg/m. What is the velocity when F = 20 N?

  4. With (1/2) c_d ρ A = 0.5 kg/m, define g(v) = f(v) − F = 0 and find the root (zero) of the nonlinear equation g(v). Nonlinear Equations in 1D. Goal: Solve f(x) = 0 for f: ℝ → ℝ. Often called Root Finding.
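The root-finding formulation above can be written down directly. A sketch, assuming the combined drag coefficient 0.5 kg/m reconstructed from the slide:

```python
import math

F = 20.0  # target drag force, N

def g(v):
    # residual g(v) = f(v) - F, with f(v) = 0.5 * v**2 (0.5 kg/m coefficient)
    return 0.5 * v**2 - F

# For this simple f the root is also available analytically, which is
# handy for checking the iterative methods on the later slides:
v_exact = math.sqrt(F / 0.5)  # about 6.32 m/s
```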

  5. Bisection method

  6. Bisection method

  7. Convergence. An iterative method converges with rate r if: lim_{k→∞} ||e_{k+1}|| / ||e_k||^r = C, with 0 < C < ∞. r = 1: linear convergence. Linear convergence gains a constant number of accurate digits each step (and C < 1 matters!). For example: Power Iteration.

  8. Convergence. An iterative method converges with rate r if: lim_{k→∞} ||e_{k+1}|| / ||e_k||^r = C, with 0 < C < ∞. r = 1: linear convergence. r > 1: superlinear convergence. r = 2: quadratic convergence. Linear convergence gains a constant number of accurate digits each step (and C < 1 matters!). Quadratic convergence doubles the number of accurate digits in each step (however, it only starts making sense once ||e_k|| is small, and C does not matter much).
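The rate r can be estimated numerically from three consecutive errors, since ||e_{k+1}|| ≈ C ||e_k||^r implies log(e_{k+1}/e_k) / log(e_k/e_{k-1}) → r. A small sketch (the helper name is mine):

```python
import math

def observed_rate(errors):
    """Estimate the convergence rate r from the last three errors via
    log(e_{k+1}/e_k) / log(e_k/e_{k-1}), a standard heuristic."""
    e0, e1, e2 = errors[-3:]
    return math.log(e2 / e1) / math.log(e1 / e0)

# Linearly convergent sequence (e_{k+1} = 0.5 * e_k):
lin = [0.5**k for k in range(1, 8)]
# Quadratically convergent sequence (e_{k+1} = e_k**2):
quad = [1e-1, 1e-2, 1e-4, 1e-8]
```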

  9. Convergence. • The bisection method does not estimate x_k, the approximation of the desired root x. It instead finds an interval smaller than a given tolerance that contains the root.

  10. Example: Consider the nonlinear equation f(x) = 0.5 x^2 − 2 and solving f(x) = 0 using the Bisection Method. For each of the initial intervals below, how many iterations are required to ensure the root is accurate within 2^{-6}? A) [−10, −1.8] B) [−3, −2.1] C) [−4, 1.9]
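Under the interval-length criterion from the bisection summary (length (b − a)/2^n after n iterations), the required n can be computed directly. A sketch; the tolerance 2^{-6} is an assumption, as that number did not survive extraction cleanly:

```python
import math

def bisection_iterations(a, b, tol):
    # smallest n with (b - a) / 2**n <= tol
    return max(0, math.ceil(math.log2((b - a) / tol)))

def f(x):
    return 0.5 * x**2 - 2.0

tol = 2.0**-6                                    # assumed tolerance
n_A = bisection_iterations(-10.0, -1.8, tol)     # interval length 8.2
n_C = bisection_iterations(-4.0, 1.9, tol)       # interval length 5.9
# Interval B) [-3, -2.1] gives f(-3) > 0 and f(-2.1) > 0: no sign
# change, so bisection does not apply there at all.
```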

  11. Bisection method. Algorithm: 1. Take two points, a and b, on each side of the root such that f(a) and f(b) have opposite signs. 2. Calculate the midpoint m = (a + b)/2. 3. Evaluate f(m) and use m to replace either a or b, keeping the signs of the endpoints opposite.
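The three steps above translate almost line-for-line into code. A sketch (function and variable names are mine):

```python
def bisect(f, a, b, tol=1e-8, max_iter=100):
    """Bisection: f must be continuous with f(a), f(b) of opposite signs."""
    fa, fb = f(a), f(b)          # the two start-up evaluations
    if fa * fb > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    for _ in range(max_iter):
        m = 0.5 * (a + b)        # step 2: midpoint
        fm = f(m)                # one evaluation per iteration
        if fa * fm <= 0:         # step 3: root lies in [a, m]
            b, fb = m, fm
        else:                    # root lies in [m, b]
            a, fa = m, fm
        if b - a <= tol:
            break
    return 0.5 * (a + b)

# The slide's example f(x) = 0.5 x^2 - 2 has a root at x = 2:
root = bisect(lambda x: 0.5 * x**2 - 2.0, 0.0, 3.0)
```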

  12. Bisection Method - summary. • The function must be continuous with a root in the interval [a, b]. • Requires only one function evaluation per iteration! (The first iteration requires two function evaluations.) • Given the initial interval [a, b], the length of the interval after n iterations is (b − a)/2^n. • Has linear convergence.

  13. Newton’s method. • Recall we want to solve f(x) = 0 for f: ℝ → ℝ. • The Taylor expansion f(x_k + h) ≈ f(x_k) + f'(x_k) h gives a linear approximation for the nonlinear function f near x_k.
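Setting the linear model f(x_k) + f'(x_k) h = 0 and solving for h gives the Newton update x_{k+1} = x_k + h = x_k − f(x_k)/f'(x_k). A sketch (names are mine):

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Newton's method: solve f(x_k) + f'(x_k) h = 0 for h, set x_{k+1} = x_k + h."""
    x = x0
    for _ in range(max_iter):
        h = -f(x) / fprime(x)   # step from the linearized model
        x += h
        if abs(h) <= tol:       # step size as a simple stopping test
            break
    return x

# Example: root of f(x) = x^2 - 2 starting near 1.5
r = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, 1.5)
```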

  14. Newton’s method (figure: the tangent line at the current iterate x_0 crosses the x-axis at the next iterate).

  15. Example. Consider solving the nonlinear equation 5 = 2.0 e^x + x^3. What is the result of applying one iteration of Newton’s method for solving nonlinear equations with initial starting guess x_0 = 0, i.e. what is x_1? A) −2 B) 0.75 C) −1.5 D) 1.5 E) 3.0
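The equation on this slide did not survive extraction cleanly; reading it as 5 = 2.0 e^x + x^3 (an assumption) makes one Newton step from x_0 = 0 land exactly on one of the answer choices:

```python
import math

# Residual form of the (reconstructed) equation 5 = 2.0 * e^x + x^3:
def f(x):
    return 2.0 * math.exp(x) + x**3 - 5.0

def fprime(x):
    return 2.0 * math.exp(x) + 3.0 * x**2

x0 = 0.0
x1 = x0 - f(x0) / fprime(x0)   # = 0 - (-3)/2 = 1.5, i.e. answer D
```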

  16. Newton’s Method - summary. • Must be started with an initial guess close enough to the root (convergence is only local); otherwise it may not converge at all. • Requires a function and a first-derivative evaluation at each iteration (think of it as two function evaluations). • Typically has quadratic convergence: lim_{k→∞} ||e_{k+1}|| / ||e_k||^2 = C, with 0 < C < ∞. • What can we do when the derivative evaluation is too costly (or difficult to evaluate)?

  17. Secant method. Also derived from the Taylor expansion, but instead of using f'(x_k), it approximates the tangent with the secant line through the last two iterates: x_{k+1} = x_k − f(x_k) (x_k − x_{k−1}) / (f(x_k) − f(x_{k−1})).

  18. Secant Method - summary. • Still local convergence. • Requires only one function evaluation per iteration (only the first iteration requires two function evaluations). • Needs two starting guesses. • Has slower convergence than Newton’s Method: superlinear convergence, lim_{k→∞} ||e_{k+1}|| / ||e_k||^r = C, with 1 < r < 2.
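The secant update, with the derivative replaced by the finite-difference slope through the last two iterates, can be sketched as (names are mine):

```python
def secant(f, x0, x1, tol=1e-12, max_iter=50):
    """Secant method: replace f'(x_k) with the slope through the last two iterates."""
    f0, f1 = f(x0), f(x1)                # two evaluations to start
    for _ in range(max_iter):
        df = (f1 - f0) / (x1 - x0)       # secant slope in place of f'
        x0, f0 = x1, f1                  # shift the iterate window
        x1 = x1 - f1 / df                # Newton-like update with df
        f1 = f(x1)                       # one new evaluation per iteration
        if abs(x1 - x0) <= tol:
            break
    return x1

# Same test problem as before: root of 0.5 x^2 - 2 is x = 2
r = secant(lambda x: 0.5 * x**2 - 2.0, 1.0, 3.0)
```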

  19. 1D methods for root finding:
  • Bisection. Update: check signs of f(a) and f(b); error bound e_n = (b − a)/2^n. Convergence: linear (r = 1, c = 0.5). Cost: one function evaluation per iteration, no need to compute derivatives.
  • Secant. Update: x_{k+1} = x_k + h, h = −f(x_k)/df, df = (f(x_k) − f(x_{k−1}))/(x_k − x_{k−1}). Convergence: superlinear (r ≈ 1.618), local convergence properties, convergence depends on the initial guess. Cost: one function evaluation per iteration (two evaluations for the initial guesses only), no need to compute derivatives.
  • Newton. Update: x_{k+1} = x_k + h, h = −f(x_k)/f'(x_k). Convergence: quadratic (r = 2), local convergence properties, convergence depends on the initial guess. Cost: two function evaluations per iteration, requires first-order derivatives.
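The convergence column can be observed directly by counting iterations to a fixed accuracy on the running example f(x) = 0.5 x^2 − 2 (a sketch; the starting points are arbitrary choices of mine):

```python
def f(x):
    return 0.5 * x**2 - 2.0

def fp(x):
    return x

root = 2.0   # exact positive root, used only to measure the error

# Newton from x0 = 3
x, newton_iters = 3.0, 0
for _ in range(100):
    x -= f(x) / fp(x)
    newton_iters += 1
    if abs(x - root) <= 1e-10:
        break

# Secant from x0 = 1, x1 = 3 (f(x1) is recomputed here for brevity;
# caching it would restore the one-evaluation-per-iteration cost)
x0, x1, secant_iters = 1.0, 3.0, 0
for _ in range(100):
    df = (f(x1) - f(x0)) / (x1 - x0)
    x0, x1 = x1, x1 - f(x1) / df
    secant_iters += 1
    if abs(x1 - root) <= 1e-10:
        break

# Bisection, by the table, needs about log2((b - a)/tol) iterations
# regardless of f, so it is the slowest but the most predictable.
```

Newton reaches the tolerance in fewer iterations than the secant method, matching the quadratic-versus-superlinear entries in the table.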
