difference between %i and %d


  1. #1
    Registered User
    Join Date
    Nov 2004
    Posts
    55

    difference between %i and %d

    Hi all,

    I've been debugging some code for almost two weeks now, and I found that the problem was that in a fscanf/sscanf statement I used %i instead of %d...

    Can someone please explain to me what is the difference between %d and %i?
    Also, what is a "decimal integer"?

    Thanks
    Spiros

  2. #2
    Deathray Engineer MacGyver
    Join Date
    Mar 2007
    Posts
    3,211
    For the scanf() family of functions, %d is always assumed to be an integer in base 10. %i is assumed to be an int, but depending on what format it is supplied in, it will be interpreted as being in base 8, 10, or even 16.
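
    Here's a quick sscanf() sketch of that behaviour (the literal strings are just made-up sample input, not from the original code):
    Code:
    #include <stdio.h>

    int main(void)
    {
        int d = 0, i = 0;

        sscanf("010", "%d", &d);   /* %d always reads base 10, so d == 10          */
        sscanf("010", "%i", &i);   /* the leading 0 makes %i read octal, so i == 8 */
        printf("%%d read 010 as %d\n", d);
        printf("%%i read 010 as %d\n", i);

        sscanf("0x10", "%i", &i);  /* the 0x prefix makes %i read hex, so i == 16  */
        printf("%%i read 0x10 as %d\n", i);

        return 0;
    }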

  3. #3
    and the Hat of Guessing tabstop
    Join Date
    Nov 2007
    Posts
    14,185
    "decimal" means "base 10". %d will always read a number in as though it were base 10; %i follows the same rules as constants in a program -- a number starting with 0 is octal, a number starting with 0x is hex.

  4. #4
    Registered User
    Join Date
    Jan 2007
    Location
    Euless, TX
    Posts
    144
    According to my Schildt book on 'C', "You may use either %i or %d to indicate a signed decimal number. These format specifiers are equivalent; both are supported for historical reasons."

    It may be that %i is no longer supported by your compiler. Other than that, either should work if both are supported.

  5. #5
    Registered User
    Join Date
    Nov 2004
    Posts
    55
    understood, cheers guys!

  6. #6
    and the Hat of Guessing tabstop
    Join Date
    Nov 2007
    Posts
    14,185
    Quote Originally Posted by kcpilot
    According to my Schildt book on 'C', "You may use either %i or %d to indicate a signed decimal number. These format specifiers are equivalent; both are supported for historical reasons."

    It may be that %i is no longer supported by your compiler. Other than that, either should work if both are supported.
    And here is reason 4177 for not reading your Schildt book on 'C'.

  7. #7
    Deathray Engineer MacGyver
    Join Date
    Mar 2007
    Posts
    3,211
    From what I hear, Schildt is a moron that should never have written a book on C.

  8. #8
    Frequently Quite Prolix dwks
    Join Date
    Apr 2005
    Location
    Canada
    Posts
    8,045
    I think Schildt was confusing scanf() with printf(). printf()'s %i and %d are identical for historical reasons (i.e., symmetry with scanf()) -- but as far as I know, neither format specifier is deprecated.

    On the topic of Schildt's books . . . they always seem to use while(!feof(fp)).
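
    The usual fix is to loop on the return value of the read call instead of calling feof(); a minimal sketch ("data.txt" is just a placeholder name):
    Code:
    #include <stdio.h>

    int main(void)
    {
        FILE *fp = fopen("data.txt", "r");
        if (fp == NULL) {
            perror("fopen");
            return 1;
        }

        int n;
        /* Loop on fscanf()'s return value, not on feof(): feof() only
           becomes true after a read has already failed, so a
           while (!feof(fp)) loop typically processes the last item twice. */
        while (fscanf(fp, "%d", &n) == 1) {
            printf("read %d\n", n);
        }

        fclose(fp);
        return 0;
    }
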
    dwk

    Seek and ye shall find. quaere et invenies.

    "Simplicity does not precede complexity, but follows it." -- Alan Perlis
    "Testing can only prove the presence of bugs, not their absence." -- Edsger Dijkstra
    "The only real mistake is the one from which we learn nothing." -- John Powell


    Other boards: DaniWeb, TPS
    Unofficial Wiki FAQ: cpwiki.sf.net

    My website: http://dwks.theprogrammingsite.com/
    Projects: codeform, xuni, atlantis, nort, etc.

