1. ## Limited memory allocation

So I found out my next program will need a lot of memory, and I want to work out an estimate of how much.

I first wanted to make an end_list[10000][100] array for the structure End, but I could only get it working at [100][100], so I decided to split it into more structures and arrays, but even that doesn't work.

So why is this happening, and how can I solve it?

Code:
```cpp
#include <stdio.h>
#include <string.h>
#include <time.h>
#include <iostream>
#include <stdlib.h>
using namespace std;

int main()
{
    struct Start {char A[5]; char B[100];};
    struct End1  {char A[5]; char B[100];};
//    struct End2  {char A[5]; char B[100];};
//    struct End3  {char A[5]; char B[100];};
//    struct End4  {char A[5]; char B[100];};

    Start start_list[100];
    End1 end_list1[100][100];
//    End2 end_list2[100][100];
//    End3 end_list3[100][100];
//    End4 end_list4[100][100];

    system("pause");
}
```
Edit: I put in system("pause") so you can check how much memory is taken; hopefully you have Windows :P

2. Both of your struct types have a size of at least 105 characters (5 + 100 + any padding inserted by the compiler). A 10000x100 array of these will consume 105000000 (105x10000x100) bytes, minimum. That is just over 100 megabytes.

I won't bore you with a discussion of operating system quotas and the like. Suffice it to say that most modern operating systems (not just Windows) impose quotas on processes, which limit the amount of memory a running process can use. That quota puts an upper limit on the total size of variables that, in C, can be declared local to functions, as statics, and so on.

A common solution is to use dynamic memory allocation (for example, malloc()) to allocate memory for large arrays. That does not eliminate the problem... it simply uses memory in a way for which the operating system allows a larger quota. All it means is that the upper limit on array sizes is increased, not made infinite. If the amount of memory used by such an array is a large percentage of available RAM, then the operating system and the offending program slow to a crawl (as the virtual memory system swaps processes between RAM, which is fast, and the swap file on disk, which is slow).

Quotas can be changed, but that tends to affect the stability of the operating system.

The better approach would be to design your program in a smarter way so that it does not need such large arrays. That involves some trade-offs.

3. Since you're creating these arrays as local variables, they will be stored on the stack, and you're running out of stack space. Try dynamically allocating them, or make them "static" (or global).

4. Originally Posted by grumpy
A 10000x100 array of these
I'm not seeing where it's a 10000x100 array. At most I see a 100x100 array, in which case it's 1050000 bytes (minimum). It's likely that even this size will overflow most stacks, since I believe the default stack size on Windows and Linux is 1 MB (1048576 bytes), although I could be wrong about that.

5. In Johnathon's description he mentions a 10000x100 array, but in his code he had trouble with even a 100x100 array. Anyway, grumpy's suggestion to redesign the thing to use less memory is certainly the first thing to try. If that can't be done, then 100 MB is not that big on a modern machine, but of course it won't fit on the stack (unless you increase the stack space).

6. Originally Posted by oogabooga
(unless you increase the stack space).
And how would I do that?

7. It's system-specific. Did you try simply putting the word static in front of your array definition?

8. Don't blindly reach for the static keyword: how exactly are you going to use this array? Have you tried using, say, std::vector instead for dynamic memory allocation?