My program is supposed to display a ten-digit number with the last digit on top. I keep getting an error when I divide the number entered by 10000000000.

The error says "integer constant too large for 'long' type".

I attempted using long long int for the variables and there was no change in the error message. I also tried dividing by 10 and then by 100000000; that version compiled and ran, but its output was all garbage when ten digits were entered.

Any solution ideas would be appreciated.

Code:
#include <stdio.h>

int main(void)
{
    int n, d1, d2, d3, d4, d5, d6, d7, d8, d9, d10;
    int a, b, c, d, e, f, g, h, i;

    printf("Enter a number:");
    scanf("%d", &n);

    d1 = n % 10;
    a = n % 100;        d2 = a / 10;
    b = n % 1000;       d3 = b / 100;
    c = n % 10000;      d4 = c / 1000;
    d = n % 100000;     d5 = d / 10000;
    e = n % 1000000;    d6 = e / 100000;
    f = n % 10000000;   d7 = f / 1000000;
    g = n % 100000000;  d8 = g / 10000000;
    h = n % 1000000000; d9 = h / 100000000;
    d10 = i / 1000000000;

    printf("%d\n%d\n%d\n%d\n%d\n%d\n%d\n%d\n%d\n%d\n",
           d1, d2, d3, d4, d5, d6, d7, d8, d9, d10);
}