> However the danger with allocating on the stack is that of stack smashing, a very big danger indeed.
It's no less dangerous than heap smashing, but it's certainly more immediately obvious.
If you write bad code, it really doesn't matter where you allocate the memory; you're screwed either way.
> What techniques do you use to determine beforehand (if possible) maximum memory usage bounds
> (naturally this would require keeping track of the possible call graphs especially wrt recursion)?
On a desktop environment, I generally don't care at all.
The default stack in such an environment is typically measured in megabytes (8 MB is a common Linux default) - that's room for thousands of ordinary stack frames.
As a rough guess, the deepest call chain in a typical program is somewhere between log2 of the number of functions and the square root of the number of functions, excluding recursion.
Efficient recursive functions, like say a well-written quicksort, recurse to a depth of about log2 of their input size: 1 million records is around 20 frames, 1 billion around 30.
Bad recursive functions, like say a naive Fibonacci, typically start taking too long to be of any practical use well before their recursion gets deep enough to be a problem.
Mind you, that doesn't stop the newbies from trying int array[1000][1000], which we usually see every couple of months on the forums.
But hey, if you're curious, use -Wstack-usage=<byte-size> on your GCC command line.
Code:
$ g++ -std=c++11 -Wall -Wstack-usage=512 proj.cpp
proj.cpp: In function ‘std::__cxx11::string test0()’:
proj.cpp:8:8: warning: stack usage is 576 bytes [-Wstack-usage=]
string test0(void)
^
proj.cpp: In function ‘std::__cxx11::string test1()’:
proj.cpp:20:8: warning: stack usage is 640 bytes [-Wstack-usage=]
string test1(void)
^
proj.cpp: In function ‘std::__cxx11::string test2()’:
proj.cpp:34:8: warning: stack usage is 640 bytes [-Wstack-usage=]
string test2(void)
^