Nonlinear Equations: Root Finding (snivelyj/ep501/EP501_9.pdf)

  • Nonlinear Equations: Root Finding

    J. B. Snively (EP501)

  • Root Finding

    For the continuous nonlinear function f(x), find the value x=α such that f(α)=0.

    Or, transform to solve x=g(x), and find x=α, such that α=g(α) and f(α)=0.

  • Root Finding Two Step Process:

    1) Bound the solution
    2) Refine the solution

    Trial and Error Methods

    Closed Domain Methods
    • Interval Halving
    • False Position

    Open Domain Methods
    • Fixed-point iteration
    • Newton’s Method
    • Secant Method

  • Root Finding Two Step Process:

    1) Bound the solution 2) Refine the solution

    Trial and Error Methods

  • Trial and Error

    a = minimum guess value
    δa = guess increment

    while |f(a)|>tol do (not converged)
        a = a + δa
    end.

    We advance a until f(a) is sufficiently close to 0.

  • Trial and Error

    a = minimum guess value
    δa = guess increment

    while |f(a)|>tol do (not converged)
        a = a + δa
    end.

    Advance a until f(a) is satisfactorily close to 0.

    Try for textbook problem (in degrees):
    Let a = 30º to start, tol = 0.0001.
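
The slides give pseudocode only, and the textbook problem's f(φ) is not reproduced here, so the minimal Python sketch below uses a stand-in function; the name `trial_and_error`, the placeholder `f`, and the increment choice are illustrative assumptions, not the course's actual code.

```python
import math

def trial_and_error(f, a, da, tol=1e-4, max_steps=100000):
    """Advance a by fixed increments da until |f(a)| <= tol (or we give up)."""
    k = 0
    while abs(f(a)) > tol and k < max_steps:
        a += da
        k += 1
    return a, k

# Placeholder standing in for the textbook f(phi), in degrees (an assumption).
f = lambda deg: math.cos(math.radians(deg)) - 0.85

root, steps = trial_and_error(f, a=30.0, da=0.001, tol=1e-4)
print(root, f(root), steps)
```

Note that the increment δa must be small enough that f changes by less than the tolerance per step near the root; otherwise the loop can step over the root without ever satisfying |f(a)| ≤ tol.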

  • Root Finding Two Step Process:

    1) Bound the solution 2) Refine the solution

    Closed Domain Methods • Interval Halving • False Position

  • Bisection Method

    a = lower bound guess value
    b = upper bound guess value

    while |f(c)|>tol do (while not converged)
        c = (a+b)/2
        if f(a)*f(c) < 0, b = c; else a = c
    end.

    Alternatively, iterate while |b-a|>tol, i.e., we can define the tolerance in terms of how close f(c) is to zero or how close a and b are to a single value.

    Let a = 30º, b = 40º to start, tol = 0.0001.
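
A minimal Python sketch of the interval-halving loop above; the function name `bisection` and the stand-in `f` are assumptions, since the textbook problem's f(φ) is not given here.

```python
import math

def bisection(f, a, b, tol=1e-4, max_iter=100):
    """Interval halving: keep the half of [a, b] over which f changes sign."""
    if f(a) * f(b) > 0:
        raise ValueError("f(a) and f(b) must have opposite signs (root not bracketed)")
    for k in range(1, max_iter + 1):
        c = 0.5 * (a + b)
        if abs(f(c)) <= tol:          # could instead test (b - a) <= tol
            return c, k
        if f(a) * f(c) < 0:
            b = c                     # root lies in [a, c]
        else:
            a = c                     # root lies in [c, b]
    return c, max_iter

# Placeholder standing in for the textbook f(phi), in degrees (an assumption).
f = lambda deg: math.cos(math.radians(deg)) - 0.85
print(bisection(f, 30.0, 40.0, tol=1e-4))
```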

  • False Position Method

    a = lower bound guess value
    b = upper bound guess value

    while |f(c)|>tol do (while not converged)
        c = b - f(b)(b-a)/(f(b)-f(a))
        if f(a)*f(c) < 0, b = c; else a = c
    end.

    Alternatively, iterate while |b-a|>tol, i.e., we can define the tolerance in terms of how close f(c) is to zero or how close a and b are to a single value.

    Here, c is the "midpoint" defined by a linear interpolation between (a, f(a)) and (b, f(b)), rather than by interval halving!
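
The same bracketing loop with the midpoint replaced by the linear-interpolation point; again a hedged Python sketch with an assumed stand-in `f`.

```python
import math

def false_position(f, a, b, tol=1e-4, max_iter=100):
    """Regula falsi: c is the x-intercept of the line through (a, f(a)) and (b, f(b))."""
    if f(a) * f(b) > 0:
        raise ValueError("f(a) and f(b) must have opposite signs (root not bracketed)")
    for k in range(1, max_iter + 1):
        c = b - f(b) * (b - a) / (f(b) - f(a))
        if abs(f(c)) <= tol:
            return c, k
        if f(a) * f(c) < 0:
            b = c
        else:
            a = c
    return c, max_iter

# Placeholder standing in for the textbook f(phi), in degrees (an assumption).
f = lambda deg: math.cos(math.radians(deg)) - 0.85
print(false_position(f, 30.0, 40.0, tol=1e-4))
```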

  • Root Finding Two Step Process:

    1) Bound the solution 2) Refine the solution

    Open Domain Methods • Fixed-point iteration • Newton’s Method • Secant Method

  • Root Finding

    For the continuous nonlinear function f(x), find the value x=α such that f(α)=0.

    Or, transform to solve x=g(x), and find x=α, such that α=g(α) and f(α)=0.

  • Fixed-Point Iteration

    [Courtesy of A. Z. Liu]

    Fall 2015 EP501 Numerical Methods Lecture 5

    [Two-panel figure: y = x and y = g(x), with iterates x1, x2, x3 and g(x1), g(x2), g(x3) marked; left panel converging, right panel diverging.]

    Figure 2: Illustration of fixed-point iteration. True solution is at the cross point between y = x and y = g(x). On the left arrows indicate how the iteration converges to the true solution while on the right, the iteration does not converge.

    The convergence criterion is |e_{i+1} / e_i| = |g'(ξ)| < 1. (1.16)

    Convergence is linear since e_{i+1} is linearly dependent on e_i. Rapid convergence occurs when this ratio is close to zero. Re-arranging f(x) to get a different g(x) may reduce |g'(x)| for faster convergence (but still linear).

    Example: Solve f(x) = x² − sin x = 0 in the range x = 0.5 to 1. We can define

    x = g(x) = x² − sin x + x (1.17)

    and get

    g'(x) = 2x − cos x + 1 > 1 when x > 0.5. (1.18)

    This g does not give a convergent iteration. Alternatively, we can define x = g(x) = √(sin x), and get

    g'(x) = cos x / (2 √(sin x)) < 1. (1.19)

    Using this g, the iteration converges.

    Fixed-point iteration is not a preferred method because its convergence is unreliable. It is introduced here because Newton's method, described next, is also an iterative process with some similar properties.

    b. Newton's Method

    This is the most well-known and powerful method. The new estimate of the root, x_{i+1}, is calculated as

    x_{i+1} = x_i − f(x_i) / f'(x_i). (1.20)

    This process continues until

    |x_{i+1} − x_i| ≤ ε1 and/or |f(x_{i+1})| ≤ ε2. (1.21)

    Newton's method can also be obtained from the Taylor series

    f(x_{i+1}) = f(x_i) + f'(x_i)(x_{i+1} − x_i) + ···, (1.22)

    by truncating after the linear term, setting f(x_{i+1}) = 0, and solving for x_{i+1}.

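
A quick numerical check of the convergence criterion (1.16) for the two rearrangements in the example above; this Python sketch simply samples the analytic derivatives (1.18) and (1.19) across [0.5, 1] and is an illustration, not part of the notes.

```python
import math

# Derivatives of the two rearrangements of f(x) = x^2 - sin(x):
g1_prime = lambda x: 2*x - math.cos(x) + 1                       # g1(x) = x^2 - sin x + x
g2_prime = lambda x: math.cos(x) / (2*math.sqrt(math.sin(x)))    # g2(x) = sqrt(sin x)

xs = [0.5 + 0.05*i for i in range(11)]        # sample points spanning [0.5, 1]
print(max(abs(g1_prime(x)) for x in xs))      # > 1: fixed-point iteration fails here
print(max(abs(g2_prime(x)) for x in xs))      # < 1: fixed-point iteration converges
```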

  • Fixed-Point Iteration

    x1 = guess value
    g(xi) = … , where g(xi) = xi when f(xi) = 0

    while |f(xi)|>tol do
        xi+1 = g(xi)
    end.

    Advance xi until f(xi) is satisfactorily close to 0.

    Let φ = 30º to start, tol = 0.0001.
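
A direct Python transcription of this loop, applied to the example from the notes (f(x) = x² − sin x with the convergent choice g(x) = √(sin x)), since the textbook problem's own g(φ) is not reproduced here; the names are assumptions.

```python
import math

def fixed_point(f, g, x, tol=1e-4, max_iter=100):
    """Iterate x <- g(x) until |f(x)| <= tol, mirroring the pseudocode above."""
    k = 0
    while abs(f(x)) > tol and k < max_iter:
        x = g(x)
        k += 1
    return x, k

# Example from the notes; the textbook problem's f and g would be substituted here.
f = lambda x: x**2 - math.sin(x)
g = lambda x: math.sqrt(math.sin(x))
print(fixed_point(f, g, 1.0, tol=1e-4))   # converges toward x ~ 0.8767
```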

  • Need to Define Problem: Try for textbook problem (in degrees):

  • Found Root

    [Plot: f(φ) versus φ (deg.), axis range 25º to 40º.]

    a = 32.0156, f(a) = 9.6426e-06, k = 4

  • Need to Define Problem: Try for textbook problem (in degrees):

    Another fixed-point formulation leads to a different result, finding another root that may not be relevant to the problem.

  • Found (Wrong) Root

    [Plot: f(φ) versus φ (deg.), axis range -20º to 40º.]

    a = -9.7467, f(a) = -8.5049e-06, k = 17

  • Newton’s Method Iteration

    x1 = guess value

    while |f(xi)|>tol do

    xi+1=xi-f(xi)/f’(xi)

    end.

    Advance xi until f(xi) is satisfactorily close to 0.

    Have the option to evaluate f'(xi) numerically.
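
A minimal Python sketch of the Newton iteration above, using the example function from the notes; f' is supplied analytically here, though it could just as well be approximated numerically (e.g., by a finite difference), as the slide notes.

```python
import math

def newton(f, fprime, x, tol=1e-4, max_iter=50):
    """Newton's method: x <- x - f(x)/f'(x) until |f(x)| <= tol."""
    k = 0
    while abs(f(x)) > tol and k < max_iter:
        x = x - f(x) / fprime(x)
        k += 1
    return x, k

# Example from the notes; the analytic derivative is an assumption for illustration.
f      = lambda x: x**2 - math.sin(x)
fprime = lambda x: 2*x - math.cos(x)
print(newton(f, fprime, 1.0, tol=1e-4))
```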

  • Secant Method Iteration

    x0 = guess value 0

    x1 = guess value 1

    while |f(xi)|>tol do

    slope=(f(xi)-f(xi-1))/(xi-xi-1)

    xi+1=xi-f(xi)/slope

    end.

    Advance xi until f(xi) is satisfactorily close to 0.
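
A minimal Python sketch of the secant iteration above, again on the notes' example function (an assumption for illustration); keeping the two most recent iterates is the only change from Newton's method.

```python
import math

def secant(f, x0, x1, tol=1e-4, max_iter=50):
    """Secant method: Newton's iteration with f' replaced by the slope
    through the two most recent iterates."""
    k = 0
    while abs(f(x1)) > tol and k < max_iter:
        slope = (f(x1) - f(x0)) / (x1 - x0)
        x0, x1 = x1, x1 - f(x1) / slope
        k += 1
    return x1, k

# Example function standing in for the problem's f (an assumption).
f = lambda x: x**2 - math.sin(x)
print(secant(f, 0.5, 1.0, tol=1e-4))
```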