Maths with #define

This is a discussion on Maths with #define within the C++ Programming forums, part of the General Programming Boards category; When something is declared using #define is it automatically an integer or what? And is it wise to do the ...

  1. #1
    Registered User rogster001's Avatar
    Join Date
    Aug 2006
    Location
    Liverpool UK
    Posts
    1,425

    Maths with #define

    When something is declared using #define, is it automatically an integer, or what? And is it wise to do the following? i.e. should mathematical operations be done on numbers declared using #define?

    Code:
    #define dwnMax 60
    #define acrMax 60
    const int Mapsize = dwnMax * acrMax;

  2. #2
    C++ Witch laserlight's Avatar
    Join Date
    Oct 2003
    Location
    Singapore
    Posts
    21,650
    Quote Originally Posted by rogster001
    When something is declared using #define is it automatically an integer or what?
    It is whatever it is. Since dwnMax will be replaced by 60, which is an int, you could loosely say that dwnMax's type is int.

    Quote Originally Posted by rogster001
    And is it wise to do the following? i.e. should mathematical operations be done on numbers declared using #define?
    It can be done. Since you should be avoiding macros in the first place, I would say no, it should not be done.
    C + C++ Compiler: MinGW port of GCC
    Version Control System: Bazaar

    Look up a C++ Reference and learn How To Ask Questions The Smart Way

  3. #3
    Deprecated Dae's Avatar
    Join Date
    Oct 2004
    Location
    Canada
    Posts
    1,034
    I don't think it's automatically an integer; it's just substitution. Whatever you #define is pasted wherever you write the macro's name, during preprocessing. In this case it works: the initializer simply turns into 60 * 60. It's plain text replacement. You could always just write 60 * 60, since it's obvious, or

    Code:
    const int dwnMax = 60;
    const int acrMax = 60;
    const int Mapsize = dwnMax * acrMax;
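    Because it really is pure text replacement, a macro whose body is an expression can bite in ways a const variable cannot. A small sketch (the HALF_MAX macros here are hypothetical, not from the thread):

    ```cpp
    #include <cassert>

    // Hypothetical macros for illustration: the body is pasted in as text,
    // so an unparenthesized expression can change meaning at the use site.
    #define HALF_MAX 60 / 2          // no parentheses
    #define HALF_MAX_SAFE (60 / 2)   // parenthesized

    int main() {
        // 120 / HALF_MAX expands to 120 / 60 / 2, which is 1, not 4.
        assert(120 / HALF_MAX == 1);
        // With parentheses the expansion is 120 / (60 / 2), which is 4.
        assert(120 / HALF_MAX_SAFE == 4);
    }
    ```

    A const int has a fixed value before it is ever used in an expression, so this trap simply cannot occur.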
    Warning: Have doubt in anything I post.

    GCC 4.5, Boost 1.40, Code::Blocks 8.02, Ubuntu 9.10 010001000110000101100101

  4. #4
    Cat without Hat CornedBee's Avatar
    Join Date
    Apr 2003
    Posts
    8,893
    Preprocessor macros are text substitutions. Parametrized cut and paste. Nothing more, nothing less.
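    The "parametrized cut and paste" can be seen with the classic example (a hypothetical SQUARE macro, a sketch only):

    ```cpp
    #include <cassert>

    // The argument text is pasted in unevaluated, exactly as written.
    #define SQUARE(x) x * x

    int main() {
        assert(SQUARE(3) == 9);        // expands to 3 * 3
        // SQUARE(1 + 2) expands to 1 + 2 * 1 + 2, which is 5, not 9.
        assert(SQUARE(1 + 2) == 5);
    }
    ```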
    All the buzzt!
    CornedBee

    "There is not now, nor has there ever been, nor will there ever be, any programming language in which it is the least bit difficult to write bad code."
    - Flon's Law

  5. #5
    Registered User
    Join Date
    Jul 2009
    Posts
    36
    You have a little control over how mathematical "constants" are typed, by adding a literal suffix (otherwise the type is determined by the form of the literal itself). For example:

    Code:
    #define XYZ1 2.76f                  // float
    #define XYZ2 1234567890UL           // unsigned long
    #define XYZ3 1234567890123456789ULL // unsigned long long

    Note there are no trailing semicolons: the macro body is pasted verbatim, so a stray semicolon would land in the middle of any expression that uses the macro.
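    A quick way to verify that the suffix really does pin down the type, using XYZ1/XYZ2 definitions like the ones above (a minimal sketch):

    ```cpp
    #include <type_traits>

    // Same style of definitions as above; the suffix fixes the literal's type.
    #define XYZ1 2.76f
    #define XYZ2 1234567890UL

    // decltype sees whatever the macro expands to, so the checks pass.
    static_assert(std::is_same<decltype(XYZ1), float>::value,
                  "f suffix gives float");
    static_assert(std::is_same<decltype(XYZ2), unsigned long>::value,
                  "UL suffix gives unsigned long");

    int main() {}
    ```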

  6. #6
    Malum in se abachler's Avatar
    Join Date
    Apr 2007
    Posts
    3,189
    Quote Originally Posted by rogster001 View Post
    When something is declared using #define, is it automatically an integer, or what? And is it wise to do the following? i.e. should mathematical operations be done on numbers declared using #define?

    Code:
    #define dwnMax 60
    #define acrMax 60
    const int Mapsize = dwnMax * acrMax;
    Specifically the preprocessor replaces the defined variable with the text of the definition before it begins compilation. This is what is known as a multi-pass compiler. Single pass compilers have to resolve the definition immediately, which limits the definition ostensibly to native types, while a multi pass compiler can resolve the macro in situ, which means it can resolve to the type that is appropriate for the context.

    All modern C/C++ compilers that I know of are multi-pass. The argument over multi-pass versus single-pass used to be a heated debate back in the day, when a million-line codebase could take literally days to compile on a multi-pass compiler. The inefficient code generated by a single pass compilation is now regarded as not worth the increase in speed.

    You can see the legacy of single pass compilation in C/C++'s requirement for forward declaration. This is more or less unnecessary with multi-pass compilers, but it's part of the standard and so can't be changed, which is one reason I am so vocal against adding anything (e.g. the STL) that isn't absolutely critical to the survival of the language to the standard.
    Last edited by abachler; 09-24-2009 at 07:41 PM.
    Until you can build a working general purpose reprogrammable computer out of basic components from radio shack, you are not fit to call yourself a programmer in my presence. This is cwhizard, signing off.

  7. #7
    Cat without Hat CornedBee's Avatar
    Join Date
    Apr 2003
    Posts
    8,893
    Quote Originally Posted by abachler View Post
    Specifically the preprocessor replaces the defined variable with the text of the definition before it begins compilation. This is what is known as a multi-pass compiler.
    Huh? No, even with the preprocessor, most C/C++ compilers are single-pass. Since both preprocessing and actual compilation require everything to be defined first, they can be single-pass, and they can be (and usually are) integrated so that preprocessing is done on the fly. In other words, the preprocessor feeds the compiler tokens, and for macro expansions it feeds it tokens from its token cache.

    Single pass compilers have to resolve the definition immediately,
    I have no idea what this means.

    which limits the definition ostensibly to native types, while a multi pass compiler can resolve the macro in situ, which means it can resolve to the type that is appropriate for the context.
    Also wrong. Macros are text (or rather, token) replacements, and the preprocessor doesn't know anything about types.
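    A concrete way to see that the preprocessor performs token replacement with no knowledge of types is to stop the compiler after the preprocessing stage. A sketch assuming GCC is installed; map.c is a hypothetical file name:

    ```shell
    # Write the original snippet from the thread to a file.
    cat > map.c <<'EOF'
    #define dwnMax 60
    #define acrMax 60
    const int Mapsize = dwnMax * acrMax;
    EOF

    # -E stops after preprocessing; by then the macros are already gone.
    gcc -E map.c | grep Mapsize
    # prints: const int Mapsize = 60 * 60;
    ```

    No type information is involved at any point: the preprocessor hands the compiler proper the tokens 60 * 60, and only the compiler decides what type that expression has.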

    All modern C/C++ compilers that I know of are multi-pass.
    Strange, I know not a single one, unless you count MSC's template instantiation.

    The inefficient code generated by a single pass compilation is now regarded as not worth the increase in speed.
    Inefficient machine code? Nah, optimizers work on an intermediate representation and don't care in the least whether the front-end is single- or multi-pass.

    You can see the legacy of single pass compilation in C/C++'s requirement for forward declaration. This is more or less unnecessary with multi-pass compilers,
    True, see Java.

    but it's part of the standard and so can't be changed, which is one reason I am so vocal against adding anything (e.g. the STL) that isn't absolutely critical to the survival of the language to the standard.
    Making standard C single-pass-capable was critical to the survival of the language, I'll bet.
    I'll also argue that a language without a standard library, i.e. something you can rely on being available, is brutally handicapped.
    All the buzzt!
    CornedBee

    "There is not now, nor has there ever been, nor will there ever be, any programming language in which it is the least bit difficult to write bad code."
    - Flon's Law
