Divide and Average Algorithm


  1. #1
    Unregistered
    Guest

Divide and Average Algorithm

    I am attempting to write a program that uses a divide-and-average algorithm to approximate the square root of any positive number A, which will be input into the program. I then need to take a positive initial approximation X and find a new approximation by calculating the average (X + A/X)/2. I am really confused as to where I should start, and I am trying to find out how a square root approximation actually works. Any help would be great.

    sincerely,

    confused

  2. #2
    ....
    Join Date
    Aug 2001
    Location
    Groningen (NL)
    Posts
    2,386
    I don't know anything by the name "Newton's Square Root Theory." I
    wonder if you might be thinking of the fact that the ancient "divide-
    and-average" algorithm for approximating sqrt(a) is in fact just
    Newton's method applied to the function f(x) = x^2 - a.

    First, here's the divide-and-average algorithm.

    Suppose that you wish to calculate the square root of a number A. The
    divide-and-average algorithm is:

    1. Choose a rough approximation G of sqrt(A).
    2. Divide A by G and then average the quotient with G, that is,
    calculate:

    G* = ((A/G)+G)/2

    3. If G* is sufficiently accurate, stop. Otherwise, let G = G* and
    return to step 2.

    Here's an example: To calculate the sqrt(2), choose G = 1.5.

    G* = (2/1.5 + 1.5)/2 = 1.41666666667
    G* = (2/1.41666666667 + 1.41666666667)/2 = 1.41421568627
    G* = (2/1.41421568627 + 1.41421568627)/2 = 1.41421356237
    G* = (2/1.41421356237 + 1.41421356237)/2 = 1.41421356237

    The number of correct decimal places more or less doubles with each
    repetition of step 2.
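
    The three steps above can be sketched in C roughly as follows. The
    thread doesn't specify a stopping rule, so this sketch uses a
    hypothetical tolerance parameter "tol" and stops once successive
    guesses differ by less than tol:

    ```c
    #include <math.h>
    #include <stdio.h>

    /* Divide-and-average approximation of sqrt(a), for a > 0.
       g is the current guess G; "next" is G* = ((a/g) + g)/2. */
    double divide_and_average(double a, double tol)
    {
        double g = a / 2.0;              /* step 1: rough initial guess */
        double next = (a / g + g) / 2.0; /* step 2: divide, then average */
        while (fabs(next - g) > tol) {   /* step 3: accurate enough yet? */
            g = next;
            next = (a / g + g) / 2.0;
        }
        return next;
    }

    int main(void)
    {
        printf("%.11f\n", divide_and_average(2.0, 1e-12));
        return 0;
    }
    ```

    Running this prints 1.41421356237, matching the worked example above.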

    Second, Newton's method is a method in calculus for determining a
    zero of a function. Suppose f has a zero near a; then if we set
    x_1 = a and define:

    x_{n+1} = x_n - f (x_n)/f'(x_n), n = 1, 2, 3, ...

    in many cases the sequence x_1, x_2, ... will converge to the zero
    near a.

    It turns out that if f(x) = x^2 - a, then f'(x) = 2x, so the Newton
    step becomes

    x_{n+1} = x_n - (x_n^2 - a)/(2 x_n) = ((a/x_n) + x_n)/2,

    which is exactly the G* formula above. With any positive starting
    guess (for example x_1 = a/2), Newton's method is the
    divide-and-average algorithm.
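
    To see the equivalence numerically, here is a small sketch of the
    generic Newton iteration x_{n+1} = x_n - f(x_n)/f'(x_n), with f and
    its derivative hard-coded for a = 2 (the function names f, fp, and
    newton are illustrative, not from the thread):

    ```c
    #include <math.h>
    #include <stdio.h>

    /* f(x) = x^2 - a and f'(x) = 2x, for a = 2 */
    static double f(double x)  { return x * x - 2.0; }
    static double fp(double x) { return 2.0 * x; }

    /* Run a fixed number of Newton steps from the starting guess x. */
    double newton(double x, int steps)
    {
        for (int i = 0; i < steps; i++)
            x = x - f(x) / fp(x);
        return x;
    }

    int main(void)
    {
        /* starting from x_1 = a/2 = 1, the iterates 1.5, 1.41666...,
           1.41421568..., ... are the divide-and-average sequence */
        printf("%.11f\n", newton(1.0, 6));
        return 0;
    }
    ```

    The iterates it produces are the same numbers as in the
    divide-and-average example, confirming the two methods coincide.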


