This small program is an example from a book on C programming I'm currently working through. It runs perfectly, and I understand most of it. That said, I cannot understand why the initialization of num does not cause an error.
Code:
#include <stdio.h>

long factorial( long num );

int main( int argc, const char * argv[] ) {
    long num = 5L;

    printf( "%ld factorial is %ld.", num, factorial( num ) );
    return 0;
}

long factorial( long num ) {
    if ( num > 1 )
        num *= factorial( num - 1 );
    return ( num );
}