# Thread: Divide and Average Algorithm

1. ## Divide and Average Algorithm

I am attempting to write a program that uses a divide-and-average algorithm to approximate the square root of any positive number A, which will be input into the program. I then need to take an initial positive approximation X and find a new approximation by calculating the average (X + A/X)/2. I am really confused as to where I should start. I am trying to find out how a square root approximation actually works. Any help would be great.

sincerely,

confused

2. I don't know anything by the name "Newton's Square Root Theory." I
wonder if you may be thinking of the fact that the ancient "divide-
and-average" algorithm for approximating sqrt(a) is in fact just
Newton's method, applied to the function f(x) = x^2 - a.

First, here's the divide-and-average algorithm.

Suppose that you wish to calculate the square root of a number A. The
divide-and-average algorithm is:

1. Choose a rough approximation G of sqrt(A).
2. Divide A by G and then average the quotient with G, that is,
calculate:

G* = ((A/G)+G)/2

3. If G* is sufficiently accurate, stop. Otherwise, let G = G* and
return to step 2.

Here's an example: To calculate the sqrt(2), choose G = 1.5.

G* = (2/1.5 + 1.5)/2 = 1.41666666666
G* = (2/1.41666666666 + 1.41666666666)/2 = 1.41421568628
G* = (2/1.41421568628 + 1.41421568628)/2 = 1.41421356238
G* = (2/1.41421356238 + 1.41421356238)/2 = 1.41421356238

The number of correct decimal places more or less doubles with each
repetition of step 2.
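The three steps above can be sketched in Python. This is just one way to write it; the function name `divide_and_average`, the tolerance `tol`, and the iteration cap are my own choices, not anything standard:

```python
def divide_and_average(a, guess, tol=1e-12, max_iter=100):
    """Approximate sqrt(a), starting from a rough positive guess."""
    g = guess
    for _ in range(max_iter):
        g_new = (a / g + g) / 2      # step 2: divide, then average
        if abs(g_new - g) < tol:     # step 3: stop when the guesses stabilize
            return g_new
        g = g_new
    return g

print(divide_and_average(2, 1.5))  # approximately 1.41421356...
```

Stopping when successive guesses differ by less than `tol` is a common convenience; you could instead stop when G*^2 is close enough to A.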

Secondly, Newton's method is a method in calculus for determining a
zero of a function. Suppose f has a zero near a; then if we set
x_1 = a and define:

x_{n+1} = x_n - f (x_n)/f'(x_n), n = 1, 2, 3, ...

in many cases the sequence x_1, x_2, ... will converge to the zero
near a.
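The general iteration can be sketched as a small Python function (the name `newton` and the stopping rule are my own; a careful implementation would also guard against f'(x_n) = 0):

```python
def newton(f, fprime, x1, tol=1e-12, max_iter=100):
    """Iterate x_{n+1} = x_n - f(x_n)/f'(x_n) starting from x1."""
    x = x1
    for _ in range(max_iter):
        x_new = x - f(x) / fprime(x)
        if abs(x_new - x) < tol:   # stop when the iterates settle down
            return x_new
        x = x_new
    return x

# Zero of f(x) = x^2 - 2 near 1.5, i.e. sqrt(2):
print(newton(lambda x: x * x - 2, lambda x: 2 * x, 1.5))
```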

It turns out that if f(x) = x^2 - a, then Newton's iteration simplifies
to exactly the divide-and-average step:

x_{n+1} = x_n - (x_n^2 - a)/(2 x_n) = (x_n + a/x_n)/2

so, whatever rough starting guess we choose (x_1 = a/2 works fine),
Newton's method is the divide-and-average algorithm.
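Here is a quick numerical check of that equivalence, as a sketch; the small tolerance is there only to absorb floating-point rounding, since the two expressions are algebraically identical:

```python
a = 2.0
x = g = 1.5   # same starting guess for both iterations
for _ in range(6):
    x = x - (x * x - a) / (2 * x)   # Newton step on f(x) = x^2 - a
    g = (a / g + g) / 2             # divide-and-average step
    assert abs(x - g) < 1e-12       # the iterates agree
print(x)  # approximately 1.41421356...
```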