I am attempting to write a program that uses a divide-and-average algorithm to approximate the square root of any positive number A, which will be input into the program. I then need to take a positive initial approximation X and find a new approximation by calculating the average (X + A/X)/2, repeating this until the approximation is good enough. I am really confused as to where I should start, and I am also trying to understand why this averaging step actually homes in on the square root. Any help would be great.
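To show what I mean by the update step, here is a rough sketch of the loop as I picture it (Python is just my guess at a language, and the stopping tolerance and starting guess are my own assumptions, not something given in the assignment):

```python
def divide_and_average_sqrt(a, tolerance=1e-10):
    """Approximate the square root of a positive number a
    by repeatedly averaging x with a/x."""
    if a <= 0:
        raise ValueError("a must be positive")
    x = a  # any positive initial approximation works; a itself is one choice
    while True:
        new_x = (x + a / x) / 2  # the average of x and a/x
        if abs(new_x - x) < tolerance:  # stop once the guess stops changing
            return new_x
        x = new_x

print(divide_and_average_sqrt(2))
```

The idea, as I understand it, is that if x is too big, then a/x is too small (and vice versa), so their average should land closer to the true square root each time.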

sincerely,

confused