Just a follow-up post on a strange and seemingly very difficult problem I finally cracked, which in the end turned out to be very simple.

I'd written an 11,000+ line program that lived in a single file. It compiled and ran successfully on a Linux system, but when I installed it on another Unix system, it compiled fine yet would not run. However, if I removed certain lines it would run, even though the code I was removing was perfectly valid. It was extremely perplexing!
Well, the problem was that this second Unix system had less memory, and that's what caused the weird behaviour. As soon as I reduced the size of the variables I'd declared, it worked fine. I calculated my program was demanding about 50 MB of memory - that's heaps! I reduced it to just a few megs and it runs fine now. Here I was thinking there was something unreliable about gcc when it compiles large single C files - not at all. GCC is HIGHLY reliable. My faith is fully restored; it's us programmers that stuff things up.
Actually, this is a great tip for anyone experiencing problems with programs being installed on different systems. Keep in mind your program's memory demands, and keep everything to a minimum if you want it to work universally.