Noob question about #define

This is a discussion on Noob question about #define within the C Programming forums, part of the General Programming Boards category.

  1. #1
    Registered User
    Join Date
    May 2008
    Posts
    52

    Noob question about #define

    I'm learning C, finally.

    I heard somewhere that #define a 20 is better than doing a = 20; but I never got why.

    Can you explain to me the advantages of using #define instead of a = 20?

    (You can explain it through a lower-level language if that's easier; I know assembly.)

  2. #2
    Kernel hacker
    Join Date
    Jul 2007
    Location
    Farncombe, Surrey, England
    Posts
    15,677
    Depends on the compiler and on the actual definition of "better".

    Macros (#define) replace text when compiling (before the code itself is actually parsed and translated). This means that every time your macro name occurs, it will be replaced with its definition (so each 'a' will be replaced with '20' in your example).

    In reasonably new compilers, using const variables should be a better solution, e.g.
    Code:
    const int a = 20;
    Using a variable which is non-const would be a bad solution for two reasons:
    1. If you accidentally write something like if (a = b), then a is no longer 20, but whatever b is.
    2. The compiler doesn't know that a is a constant, so it will produce extra code to read a from its memory location instead of using the constant value directly.

    A third option is to use
    Code:
    enum
    {
       a = 20,
       ...
    };
    which again allows the compiler both to understand that the value is a constant and to prevent bad code (e.g. a = b).

    There are very few cases in C where a macro is the only (reasonable) solution; those are the times to use one. Any other time, the alternatives should be preferred.
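    The alternatives discussed above can be sketched side by side; this is a minimal comparison, and all the names in it are invented for illustration:

    ```c
    #include <assert.h>

    /* Three ways to give the value 20 a name, per the discussion above. */
    #define HEIGHT_MACRO 20             /* plain text replacement by the preprocessor */
    static const int height_const = 20; /* typed and read-only, checked by the compiler */
    enum { HEIGHT_ENUM = 20 };          /* a true integer constant expression */

    int main(void)
    {
        int arr[HEIGHT_ENUM];           /* enum constants work as array sizes in any C standard */
        arr[0] = HEIGHT_MACRO;

        assert(HEIGHT_MACRO == 20);
        assert(height_const == 20);
        assert(HEIGHT_ENUM == 20);
        assert(arr[0] == 20);
        /* height_const = 5; would be rejected by the compiler, unlike a plain int */
        return 0;
    }
    ```

    Note that only the enum (and the macro) can size the array under C89; the const variable cannot, which comes up again later in the thread.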

    --
    Mats
    Compilers can produce warnings - make the compiler programmers happy: Use them!
    Please don't PM me for help - and no, I don't do help over instant messengers.

  3. #3
    Jack of many languages Dino's Avatar
    Join Date
    Nov 2007
    Location
    Katy, Texas
    Posts
    2,309
    When doing a #define, in assembly terms, you are creating a symbol. If you know assembly, then you already appreciate the use of symbols versus hard-coded constants.
    Mac and Windows cross platform programmer. Ruby lover.

    Quote of the Day
    12/20: Mario F.:I never was, am not, and never will be, one to shut up in the face of something I think is fundamentally wrong.

    Amen brother!

  4. #4
    Kernel hacker
    Join Date
    Jul 2007
    Location
    Farncombe, Surrey, England
    Posts
    15,677
    Quote Originally Posted by Dino View Post
    When doing a #define, in assembly terms, you are creating a symbol. If you know assembly, then you know how to appreciate the use of symbols versus hardcoded constants.
    I'm not sure I completely agree with your wording there, but I see what you are trying to say. So, to put it another way:
    Code:
    ...
       func(20);
    ...
    is not quite as clear as
    Code:
    const int height = 20;  // We could use any other form of making "height" the value 20 here. 
    ...
       func(height);
    ...
    Obviously, calling it 'a' is probably meaningless - unless you are dealing with a triangle-type problem with the sides/angles called a, b and c.


  5. #5
    Jack of many languages Dino's Avatar
    Join Date
    Nov 2007
    Location
    Katy, Texas
    Posts
    2,309
    Yes. The point being: instead of using what is today called a "magic number" (what I call a "hard-coded value"), the #define can make your life easier later on when that 20 needs to become a 25. Change it in one place and you are done.

    Todd

  6. #6
    Kernel hacker
    Join Date
    Jul 2007
    Location
    Farncombe, Surrey, England
    Posts
    15,677
    Quote Originally Posted by Dino View Post
    Yes. The point being, instead of using what is called today a "magic number" (whereas I call it a "hard coded value"), the #define can make your life easier later on when that 20 needs to become a 25. Change it in one place and you are done.

    Todd
    I completely agree. Although I also think there are several ways of doing this BETTER than #define, as I expressed in my first post - a define is still better than typing 20 in several places, in case you have to change it later. It matters particularly because the same constant value is often used for different purposes: e.g. we may have a 100-long array and loops that traipse through it, but also a 100 x 200 pixel box that we want to draw on the screen. If we then need the array to grow to 200, we don't want the box to become a square.
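    A minimal sketch of that point, with invented names: two constants that happen to be 100 today but mean different things, so each gets its own name and changing one leaves the other alone.

    ```c
    #include <assert.h>

    /* TABLE_LENGTH and BOX_WIDTH are invented for this sketch. They share
       the value 100 today but are logically unrelated. */
    #define TABLE_LENGTH 100   /* number of entries in the array */
    #define BOX_WIDTH    100   /* width of the on-screen box, in pixels */
    #define BOX_HEIGHT   200   /* height of the on-screen box, in pixels */

    static int table[TABLE_LENGTH];

    int main(void)
    {
        int i;
        for (i = 0; i < TABLE_LENGTH; i++)  /* the loop tracks TABLE_LENGTH automatically */
            table[i] = i;

        assert(table[TABLE_LENGTH - 1] == TABLE_LENGTH - 1);
        /* Bumping TABLE_LENGTH to 200 later leaves BOX_WIDTH at 100, which
           scattering a literal 100 through the code could not guarantee. */
        assert(BOX_WIDTH == 100 && BOX_HEIGHT == 200);
        return 0;
    }
    ```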


  7. #7
    Frequently Quite Prolix dwks's Avatar
    Join Date
    Apr 2005
    Location
    Canada
    Posts
    8,046
    Note that a #define has one other advantage that hasn't been mentioned yet: with most compilers, you can change it quite easily on the command line. This makes defines good for values that might change between platforms, builds, or whatever.

    I usually use enums, personally.

    There have been lots of threads debating this. The one that comes to mind is: Marching coordinates
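    A sketch of that command-line trick, assuming gcc (most compilers have an equivalent flag, e.g. /D for MSVC):

    ```c
    #include <stdio.h>

    /* Build normally:                 gcc prog.c             -> prints "MAX_ARR = 5"
       Override from the command line: gcc -DMAX_ARR=100 prog.c -> prints "MAX_ARR = 100"
       The #ifndef guard keeps 5 as the default when no -D is given. */
    #ifndef MAX_ARR
    #define MAX_ARR 5
    #endif

    int main(void)
    {
        printf("MAX_ARR = %d\n", MAX_ARR);
        return 0;
    }
    ```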
    dwk

    Seek and ye shall find. quaere et invenies.

    "Simplicity does not precede complexity, but follows it." -- Alan Perlis
    "Testing can only prove the presence of bugs, not their absence." -- Edsger Dijkstra
    "The only real mistake is the one from which we learn nothing." -- John Powell


    Other boards: DaniWeb, TPS
    Unofficial Wiki FAQ: cpwiki.sf.net

    My website: http://dwks.theprogrammingsite.com/
    Projects: codeform, xuni, atlantis, nort, etc.

  8. #8
    Registered User
    Join Date
    May 2008
    Posts
    87
    No one has explicitly mentioned the preprocessor. All of your #-statements are preprocessor commands. The preprocessor runs over your code before the compiler gets its go. That's why you may have noticed an obvious difference between, say, a #define statement and a regular C statement -- the #define doesn't end in a semicolon!

    When you build your code, the preprocessor runs over your code and does whatever it is told to do by the #-statements. Here is a simple example of a #define in use:

    Code:
    #define MAX_ARR 5
    
    int main() {
      int arr[MAX_ARR];
      int i;
    
      for (i = 0; i < MAX_ARR; i++)
        arr[i] = i;
    
      return 0;
    }
    When I "compile" this code, first the preprocessor runs over it.
    Code:
    #define MAX_ARR 5
    is a preprocessor command. It says: anywhere* you see the text "MAX_ARR" in the source code, replace it with the text "5". It is useful here because if I ever wanted to change the size of my array, I would otherwise need to update every place in my code that references the size. Since I did it this way, I only need to change the one line at the top, rather than - in this case - the two lines in my code. You can probably imagine how to extend this example into something more useful. After all preprocessor commands have run, the compiler compiles my modified source text.

    * Note that the preprocessor won't replace text that occurs inside a string literal (something between double quotes.)
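    To make the substitution concrete, this is roughly what the compiler sees after the preprocessor has replaced every MAX_ARR with the text 5 (the assert is added here only to make the sketch checkable):

    ```c
    #include <assert.h>

    int main(void) {
      int arr[5];               /* was: int arr[MAX_ARR]; */
      int i;

      for (i = 0; i < 5; i++)   /* was: i < MAX_ARR */
        arr[i] = i;

      assert(arr[4] == 4);      /* the loop filled arr[0..4] with 0..4 */
      return 0;
    }
    ```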

    Another common preprocessor command is #include. That tells the preprocessor to plop all the source text from the "included" file into your source file before compiling.

    Hope that helps.

  9. #9
    C++まいる!Cをこわせ! Elysia's Avatar
    Join Date
    Oct 2007
    Posts
    22,411
    This is all assuming C, because C doesn't really support constant expressions the way C++ does (some have mentioned constant expressions, but those are C++!).
    You can't just do "const int size = 5" and "int array[size]" as in C++; you'd have to use a define.
    The other thing about defines is that they can be macros.
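    A quick sketch of that last point, with an invented SQUARE macro; a function-like macro is something a const variable cannot replace, and the parentheses are what make it expand safely:

    ```c
    #include <assert.h>

    /* SQUARE is an invented example. The parentheses around each use of x
       (and around the whole body) protect against precedence surprises,
       because the preprocessor substitutes text, not values. */
    #define SQUARE(x) ((x) * (x))

    int main(void)
    {
        assert(SQUARE(5) == 25);
        /* Without the inner parentheses, SQUARE(1 + 2) would expand to
           1 + 2 * 1 + 2, which is 5, not 9. */
        assert(SQUARE(1 + 2) == 9);
        return 0;
    }
    ```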
    Quote Originally Posted by Adak View Post
    io.h certainly IS included in some modern compilers. It is no longer part of the standard for C, but it is nevertheless, included in the very latest Pelles C versions.
    Quote Originally Posted by Salem View Post
    You mean it's included as a crutch to help ancient programmers limp along without them having to relearn too much.

    Outside of your DOS world, your header file is meaningless.

  10. #10
    Kernel hacker
    Join Date
    Jul 2007
    Location
    Farncombe, Surrey, England
    Posts
    15,677
    Quote Originally Posted by Elysia View Post
    This is all assuming C because it doesn't really support constant expressions (some have mentioned constant expressions, but they're C++!).
    You can't just do "const int size = 5" and "int array[size]" as in C++, you'd have to use a define.
    The other thing about defines are that they can be macros.
    I believe constant expressions are allowed in C99, but not C89. I could be wrong.
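    For what it's worth, a sketch of the distinction: with a C99 compiler the array below is accepted (as a variable-length array), while a strict C89 compiler rejects it, because a const variable is not a constant expression in C.

    ```c
    /* C99: compiles; array becomes a variable-length array (VLA).
       C89: error, because size is not a constant expression. */
    static const int size = 5;

    int main(void)
    {
        int array[size];   /* VLA in C99; invalid in C89 */
        array[0] = 42;
        return array[0] == 42 ? 0 : 1;   /* exits with status 0 */
    }
    ```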


  11. #11
    C++まいる!Cをこわせ! Elysia's Avatar
    Join Date
    Oct 2007
    Posts
    22,411
    Hmm. Well, I'm no expert on the differences between C89/C99.
    But obviously, if that is the case, then constant expressions are encouraged and defines are left for macros (or for a typedef-like shorthand that typedef itself cannot provide, like shortening some text you have to write).
