I need immediate help with a problem due in a few hours.
Distance on the Earth's surface. Write a program that takes the latitudes and longitudes of two points on the earth's surface and computes the distance between them in nautical miles, miles, and kilometers. The angular distance (delta) between two points is given by:
cos(delta) = sin(lat1) * sin(lat2) + cos(lat1) * cos(lat2) * cos(long1 - long2)
This formula assumes that both longitudes are measured the same way, either east of Greenwich or west of Greenwich. Multiplying the angular distance (in radians) by the radius of the earth gives the surface distance.
This is a problem I have no clue where to start. Does anyone have a helpful code solution?