hi forum

i've been trying to solve this problem:

The following iterative sequence is defined for the set of positive integers:

n → n/2 (n is even)

n → 3n + 1 (n is odd)

Using the rule above and starting with 13, we generate the following sequence:

13 → 40 → 20 → 10 → 5 → 16 → 8 → 4 → 2 → 1

It can be seen that this sequence (starting at 13 and finishing at 1) contains 10 terms. Although it has not been proved yet (Collatz Problem), it is thought that all starting numbers finish at 1.

Which starting number, under one million, produces the longest chain?

the code i've written looks like this:

#include <stdio.h>
#include <stdlib.h>

int main()
{
    int *a, seq, maxseq = 1, i, j, num = 1;

    a = malloc(1000000 * sizeof(int));
    if (a == NULL) {
        printf("........!");
        return 1;
    }

    a[1] = 1;
    for (i = 2; i < 1000000; i++) {
        seq = 0;
        j = i;
        while (j >= i) {          /* iterate until the chain drops below i ... */
            if (j % 2)
                j = 3 * j + 1;
            else
                j /= 2;
            seq++;
        }
        seq += a[j];              /* ... then reuse the cached chain length */
        a[i] = seq;
        if (seq > maxseq) {
            maxseq = seq;
            num = i;
        }
    }
    free(a);
    printf("%d", num);
    return 0;
}

the problem is that every time i try to run it, i get a message that says "code.c has stopped working" and it shuts down. it works when i run it up to 100,000 or so, but it won't work for 1,000,000 — can anyone help??

btw i'm using Code::Blocks with the GNU GCC compiler, don't know if that's relevant..