Because the heap is often the only area of a process that can change size at runtime. I don't see how that's related to C, though: C itself defines no "stack" or "heap" -- only automatic and dynamic storage duration. In most implementations, however, the stack is small and its size is fixed at compile time (or at process initialisation).
So I'd say: "chances are, given most implementations, the stack will be small and of fixed size, so the allocation might simply not fit on the stack". Perhaps also mention that heap allocators may be optimised for large allocations. And releasing memory early is difficult on the stack, since an object would have to "surface to the top" (i.e. fall out of scope) before it could be freed. Consider:
Code:
int huge[100000] = {0};     /* ~400 KB with 4-byte int */
int alsoHuge[100000] = {0}; /* another ~400 KB */
/* ... use huge and alsoHuge ... */
Now I'm done with huge, but I (being the compiler or implementation) would have to 'pop' alsoHuge to release huge, since huge was probably pushed first.
Of course that could be solved by an optimising compiler. Except when the dependency isn't linear...