Thread: array limits in C++

  1. #1
    Registered User
    Join Date
    Dec 2004
    Posts
    1

    array limits in C++

    Hi all,
    this is a newbie question. I started with C++ only a week ago.
    I want to define an array of doubles C[T][T]. The compilation goes fine (I use Borland C++ 5.5.1 for Win32, in case it matters).
    For T=1000 the program does not run; for T=200 it runs. Are there any limits on the size of arrays? If so, is there a way to circumvent them, or an alternative way to define matrices?
    Any hint about what's going on would be valued!
    Thanks

  2. #2
    Registered User
    Join Date
    Dec 2004
    Location
    UK
    Posts
    109
    As far as I know, the only limit on the size of an array is how much memory you have.

    A double is 64 bits (8 bytes), so a million cells (1000*1000) come to roughly 8 megabytes. That shouldn't be a problem (if you are running a 32-bit application) unless there are compiler issues.

    Sorry I can't be of greater help.

  3. #3
    Guest Sebastiani's Avatar
    Join Date
    Aug 2001
    Location
    Waterloo, Texas
    Posts
    5,708
    If you declared the array as a local variable, then you probably exceeded the amount of stack space that the OS gave your program to run under. Try dynamically allocating the data with malloc/new.
    Code:
    #include <cmath>
    #include <complex>
    bool euler_flip(bool value)
    {
        return std::pow
        (
            std::complex<float>(std::exp(1.0)), 
            std::complex<float>(0, 1) 
            * std::complex<float>(std::atan(1.0)
            *(1 << (value + 2)))
        ).real() < 0;
    }
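    To make that advice concrete, here is a minimal sketch of the new/delete approach. The names C and T are taken from the original question; the exact stack limit varies by OS and compiler, but heap allocation sidesteps it entirely:
    Code:
    ```cpp
    #include <cstdio>

    int main()
    {
        const int T = 1000;             // the size that crashed as a local array

        // allocate the matrix on the heap: one array of row pointers,
        // then one T-element row per pointer
        double **C = new double*[T];
        for (int i = 0; i < T; ++i)
            C[i] = new double[T];

        C[999][999] = 3.14;             // usable just like a static C[T][T]
        std::printf("%f\n", C[999][999]);

        // release in reverse order
        for (int i = 0; i < T; ++i)
            delete[] C[i];
        delete[] C;
        return 0;
    }
    ```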

  4. #4
    Code Goddess Prelude's Avatar
    Join Date
    Sep 2001
    Posts
    9,897
    >If so, is there a way to circumvent them?
    The limit on array sizes is the edge of the comfort zone with your common sense. The way to circumvent problems with static arrays is to allocate dynamic memory using new, or figure out how to avoid using such a large array in the first place. I have never needed a 1000x1000 matrix of doubles. Hell, I haven't even needed a 200x200 matrix of doubles in real world programming (and I would probably be fired if I tried to use one). 99.9% of huge arrays can be cut in size drastically by rethinking your problem.

    And 70% of all statistics are a lie.
    My best code is written with the delete key.
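    A sketch of the "allocate dynamic memory" route using std::vector, which keeps all element storage on the heap and frees it automatically (assuming your compiler's standard library is usable; the sizes here are just the ones from the question):
    Code:
    ```cpp
    #include <cstdio>
    #include <vector>

    int main()
    {
        const int T = 1000;

        // a vector of vectors: the ~8 MB of element storage is
        // heap-allocated, so it never touches the stack
        std::vector<std::vector<double> > C(T, std::vector<double>(T, 0.0));

        C[200][300] = 2.5;
        std::printf("%f\n", C[200][300]);
        return 0;                        // memory is freed automatically
    }
    ```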

  5. #5
    Registered User
    Join Date
    Dec 2004
    Location
    UK
    Posts
    109
    doh

    Totally forgot about stack limits. Kind of assumed he'd be working with a dynamic array if he was using something that big.

  6. #6
    Banned master5001's Avatar
    Join Date
    Aug 2001
    Location
    Visalia, CA, USA
    Posts
    3,685
    > And 70% of all statistics are a lie.
    That they are

    Realistically, you shouldn't be using huge 2d arrays anyway, since they are more time consuming to index than one-dimensional arrays. And there are no limits to array sizes, only limits to memory allocations. And since you are trying to create a big fat static 2d array, you should also know that the stack for any program is not going to be big enough to accommodate an array that size.

    (you beat me this time sigfriedmcwild)
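    The one-dimensional alternative mentioned above can be sketched like this (C and T are the names from the original question; the index arithmetic row * T + col is the usual flat-array convention):
    Code:
    ```cpp
    #include <cstdio>
    #include <cstdlib>

    int main()
    {
        const int T = 1000;

        // a single heap allocation holding all T*T doubles;
        // element (row, col) lives at index row * T + col
        double *C = (double *)std::malloc(T * T * sizeof(double));
        if (C == NULL)
            return 1;                    // allocation failed

        C[42 * T + 7] = 1.5;             /* the equivalent of C[42][7] */
        std::printf("%f\n", C[42 * T + 7]);

        std::free(C);
        return 0;
    }
    ```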

