Why does the program below generate a segmentation fault? 100,000 x 26 should occupy 3MB at most... is that too much?
Thanks.
Code:
#include <stdio.h>

int main() {
    char m[100000][26];
    return 0;
}
No idea why. It runs normally for me. Use malloc, though, if you can't think of anything else.
Code:
char **m = malloc(100000 * sizeof(char *));
int i;
for (i = 0; i < 100000; i++)
    m[i] = malloc(26 * sizeof(char));
Seems to me the default stack size is usually around 1MB (although a Google search indicates 8MB for OS X), so the best bet is you're exceeding your stack limit. This is also usually a configurable value, so you may be able to change it to be more to your liking. Or you can go with the heap, as C_ntua suggests, which would be more portable.
In the wonderful world of protected mode programming it's not too much. However, something tells me there are better data structures out there that you can use to get the same functionality.
Quote:
Why does the program below generate a segmentation fault? 100,000 x 26 should occupy 3MB at most... is that too much?
Yes, it is too much. The default stack size is usually 1MB, and if you ever find that this is not enough then basically you're doing something wrong. You need to use dynamic memory allocation.
Note that the suggestion of 100000 allocations of 26 bytes each is not particularly efficient. You should only allocate the entries in such an array that you are actually using. Also, you shouldn't declare something huge as a catch-all size and cross your fingers that it is big enough. Always allocate only approximately how much you need, and if you need more later then allocate a new block twice as big and copy the old items across.
Better yet, use C++ and a std::vector, which will do all of this for you.