# Thread: 10 digit computation problem

1. ## 10 digit computation problem

My program is supposed to display a ten-digit number one digit per line, starting with the last digit on top. I keep getting an error when I divide the number entered by 10000000000.

The error says "integer constant too large for 'long' type".

I attempted using long long int for the variables and there was no change in the error message. I also tried dividing by 10 and then dividing by 100000000; the program ran, but its output was all garbage when ten digits were entered.

Code:
```c
#include <stdio.h>

int main(void)
{
    int n, d1, d2, d3, d4, d5, d6, d7, d8, d9, d10;
    int a, b, c, d, e, f, g, h, i;

    printf("Enter a number:");
    scanf("%d", &n);

    d1 = n % 10;

    a = n % 100;
    d2 = a / 10;

    b = n % 1000;
    d3 = b / 100;

    c = n % 10000;
    d4 = c / 1000;

    d = n % 100000;
    d5 = d / 10000;

    e = n % 1000000;
    d6 = e / 100000;

    f = n % 10000000;
    d7 = f / 1000000;

    g = n % 100000000;
    d8 = g / 10000000;

    h = n % 1000000000;
    d9 = h / 100000000;

    d10 = i / 1000000000;

    printf("%d\n%d\n%d\n%d\n%d\n%d\n%d\n%d\n%d\n%d\n",
           d1, d2, d3, d4, d5, d6, d7, d8, d9, d10);
}
```
Any solution ideas would be appreciated.

2. The error message is self-explanatory. There is no guarantee that a long integer type can hold a ten-digit value. In fact, an unsigned long type is only guaranteed to be able to represent a maximum value of at least 4294967295 (which means it can represent some, but not all, ten-digit values).

A long long type is guaranteed to be able to hold ten-digit values - although the limits are still finite. The thing to remember is that literal values (like 1000000) have type int by default - the compiler doesn't magically decide to make a literal value have type long long unless you tell it to, even if the value is large. If you want a literal value to be of type long long, write it as 1000000LL. Also, when printing a long long, a different format specifier than %d is needed: %lld.

You would also be better off using an array and a loop, rather than trying to hardcode ten separate variables named d1, d2, ..., d10. That way, all you need to do is divide by ten (repeatedly) in the loop, not divide by hardcoded powers of 10.

3. You also need a different format specifier in the scanf call to read a long long: %lld.

4. You aren't showing any code that reproduces the problem. The code there should work fine for 32-bit ints. It only tries to divide by one billion, not ten billion. As it is using signed ints, this will produce a maximum of 2 for the highest digit.

You should be using unsigned types, and yes, as mentioned, just divide by 10 after extracting each digit. You don't need anywhere near that many variables.

5. I'm trying to use %u for an unsigned int - is that the correct format specifier? Also, what would be the correct one for unsigned long long int? I rewrote the code into a loop. 9 digits works perfectly, but as soon as I enter 10 I get complete garbage output.

6. Originally Posted by Josh Genser
9 digits works perfectly, but as soon as I enter 10 I get complete garbage output.
Your unsigned ints are probably 32 bits, thus the biggest possible value is 4,294,967,295. Bigger values will overflow and will look like "garbage".