i created the following program to calculate the digits of pi; however, it begins to approach pi, then starts to fall apart and eventually just returns values of 0. i believe that this is because it is rounding all the variables to too few digits. how can i maximize the number of digits it uses? right now i have it printing 18 places after the decimal point, because that is all it was producing anyway.

Code:
#include <stdio.h>
#include <math.h>
#include <stdlib.h>

int main(void)
{
    long double r, n, a, s, p;
    r = 0;
    a = .5;                       /* a = cos(pi/n), starting with n = 3 */
    while (r == r) {              /* always true: loop forever */
        n = 3 * pow(2, r);        /* number of sides of the polygon */
        s = 2 * sqrt(1 - (a * a));/* s = 2*sin(pi/n), the chord length */
        p = (s * n) / 2;          /* p = n*sin(pi/n), which approaches pi */
        printf("pi=%1.18Lf", p);
        getchar();
        system("cls");
        a = sqrt((1 + a) / 2);    /* half-angle formula: cos(pi/(2n)) */
        r++;
    }
    return 0;
}