I'm really sorry if this is a dumb question, but ever since I started using C, I've never understood why functions are defined with much larger return types than they need.
I'm planning to write a commercial API, so I want my code written as well as possible so I won't have to go back and fix things later.
My question is this: there are millions of functions out there, but many are defined with much larger types than they require.
An example of this is the int SDL_Flip(SDL_Surface *screen); function in the Simple DirectMedia Layer (SDL) library. The function only ever returns two possible values: 0 on success and -1 on error. So why is it defined as returning an integer? Wouldn't it be better to define it as returning a char?
int main(int argc, char *argv[]) is another one. I'd be very surprised if there were a single program that could meaningfully use all 4,294,967,296 possible return values of a 32-bit int. I'm sure most people just use two: one for success and another for an error. I'm guessing main() is an exception, though, because it's not like any other function.
So why are so many programs and functions like this? I'm sure changing them would never gain the slightest speedup, but wouldn't it at least make your executables a few bytes smaller?
Thanks for any replies. Once again, sorry if this is a stupid question.