I need an experienced programmer's opinion about gcc and writing C programs. I've written a fairly large program - 11,000+ lines of code - all in a single file.
It compiles and runs fine on one Linux server, yet on another Unix server it compiles cleanly but misbehaves when executed: it seems to stumble over the simplest areas of code, code it should handle without trouble. There's no error in the code, but for some reason the executable won't run properly - very peculiar behavior.
Does gcc have trouble dealing with very large files or something? Or is the different system the problem here? I just don't get it! For example, in the main function, if I omit the following line, which initialises a global variable (a struct variable):
the program will work - now what the hell is going on here? It's the same for certain other spots in the code - it just doesn't seem to like them for some reason, even though it's perfectly correct code.
Would breaking the code down into smaller files and linking them during compilation be likely to help? Does that improve the chances of long stretches of code working reliably?
Has anyone experienced this sort of problem before, and how did you get around it?
Cheers - your help and words of wisdom will be much appreciated.