Hi all,

Once again I have a question and I'm hoping to find its answer on this forum. It's about a program I recently wrote for integer factorization.

I won't post the whole code here, since it's six pages long, and half of that is just the "ifs" and "switches" that keep the output from ever coming out as, say, "5^0" or "2 ^1". The basic way the program works:

1. Asks user for the number they want to factorize

2. Generates primes up to that number into a 2D array (the second dimension is explained later), and at the same time checks whether the entered number is itself prime.

3. Divides by two while possible, incrementing a variable every time it does so.

4. Does the same thing except with division by three.

5. If, after the divisions by two and three, the result is one, it prints something like "2^2 x 3".

6. If, after the divisions by two and three, the result isn't one, division continues with the primes in the array generated at the beginning. Whenever a prime divides the number, its entry in the second dimension (0 by default) is incremented. This repeats while division is possible.

7. The primes whose second-dimension value is greater than 0 are transferred to a new array, which is then printed. The same switches/ifs as above ensure the output has the proper form.

The complete source code is included in the attachment. The code works fine, except for one thing: it can't factorize numbers above thirty-two thousand something (I can't remember the exact number). I have a feeling it has something to do with the number of elements an array can have, but I'm not sure. I've tried changing the data type, but even with an unsigned long long int nothing changes.

Help is appreciated.