Convert a 32-bit number to an array of digits

Hi,

I am attempting to convert a 32-bit number into an array of its digits, the indices of which can then be used to fill a structure. I obtained this code, but I am having a little difficulty in understanding some parts of it:

Code:

    int *numArrayFill(u32 number)
    {
        unsigned int length = (int)(log10((float)number)) + 1; // calculates length of number

        int *numArray = (int *) malloc(length * sizeof(int)), *curr = numArray;

        do {
            *curr++ = number % 10;
            number /= 10;
        } while (number != 0);

        return numArray;
    }

The work of the function seems to be done by this line:

Code:

`int *numArray = (int *) malloc(length * sizeof(int)), *curr = numArray;`

but it's this line that I don't understand.

Also, since I assume that numArray is an array of ints, do I then have to declare an array in my calling function to receive it?

Thanks

Dave