Thread: difference between %i and %d

1. difference between %i and %d

Hi all,

I've been debugging some code for almost two weeks now, and I found that the problem was that in an fscanf/sscanf statement I had used %i instead of %d...

Can someone please explain to me what is the difference between %d and %i?
Also, what is a "decimal integer"?

Thanks
Spiros

2. For the scanf() family of functions, %d is always assumed to be an integer in base 10. %i is also read as an int, but depending on the format it is supplied in, it will be interpreted as base 8, 10, or even 16.

3. "decimal" means "base 10". %d will always read a number in as though it were base 10; %i follows the same rules as constants in a program -- a number starting with 0 is octal, a number starting with 0x is hex.

4. According to my Schildt book on 'C', it says "You may use either %i or %d to indicate a signed decimal number. These format specifiers are equivalent; both are supported for historical reasons."

It may be that %i is no longer supported by your compiler. Other than that, either should work if both are supported.

5. understood, cheers guys!

6. Originally Posted by kcpilot
According to my Schildt book on 'C', it says "You may use either %i or %d to indicate a signed decimal number. These format specifiers are equivalent; both are supported for historical reasons."

It may be that %i is no longer supported by your compiler. Other than that, either should work if both are supported.
And here is reason 4177 for not reading your Schildt book on 'C'.

7. From what I hear, Schildt is a moron who should never have written a book on C.

8. I think Schildt was confusing scanf() with printf(). printf()'s %i and %d are identical for historical reasons (i.e., symmetry with scanf()) -- but as far as I know, neither format specifier is deprecated.

On the topic of Schildt's books . . . they always seem to use while(!feof(fp)).