## Maths with #define

1. When something is declared using #define, is it automatically an integer, or what? And is it wise to do the following, i.e. should mathematical operations be done on numbers declared using #define?

Code:
```
#define dwnMax 60
#define acrMax 60
const int Mapsize = dwnMax * acrMax;
```

2. Originally Posted by rogster001
When something is declared using #define is it automatically an integer or what?
It is whatever it is. Since dwnMax will be replaced by 60, which is an int, you could loosely say that dwnMax's type is int.

Originally Posted by rogster001
And is it wise to do the following, i.e. should mathematical operations be done on numbers declared using #define?
It can be done. Since you should be avoiding macros in the first place, I would say no, it should not be done.

3. I don't think it's automatically an integer; it's just substitution. Whatever you #define is pasted wherever you use its name, during preprocessing. In this case it works: it's just turned into 60 * 60. It's like text replacement. You could always just write 60 * 60, since it's obvious, or

Code:
```
const int dwnMax = 60;
const int acrMax = 60;
const int Mapsize = dwnMax * acrMax;
```

4. Preprocessor macros are text substitutions. Parametrized cut and paste. Nothing more, nothing less.

5. You do have some control over the type of numeric "constants" by using literal suffixes (otherwise the compiler determines the type from the form of the literal itself). For example:

Code:
```
#define XYZ1 2.76f                  // float
#define XYZ2 1234567890UL           // unsigned long
#define XYZ3 1234567890123456789ULL // unsigned long long
```

6. Originally Posted by rogster001
When something is declared using #define, is it automatically an integer, or what? And is it wise to do the following, i.e. should mathematical operations be done on numbers declared using #define?

Code:
```
#define dwnMax 60
#define acrMax 60
const int Mapsize = dwnMax * acrMax;
```
Specifically the preprocessor replaces the defined variable with the text of the definition before it begins compilation. This is what is known as a multi-pass compiler. Single pass compilers have to resolve the definition immediately, which limits the definition ostensibly to native types, while a multi pass compiler can resolve the macro in situ, which means it can resolve to the type that is appropriate for the context.

All modern C/C++ compilers that I know of are multi-pass. The argument over multi-pass versus single-pass used to be a heated debate back in the day, when a million-line code base could take literally days to compile on a multi-pass compiler. The inefficient code generated by a single-pass compilation is now regarded as not worth the increase in speed.

You can see the legacy of single-pass compilation in C/C++'s requirement for forward declaration. This is more or less unnecessary with multi-pass compilers, but it's part of the standard and so can't be changed, which is one reason I am so vocal against adding anything (e.g. the STL) that isn't absolutely critical to the survival of the language to the standard.

7. Originally Posted by abachler
Specifically the preprocessor replaces the defined variable with the text of the definition before it begins compilation. This is what is known as a multi-pass compiler.
Huh? No, even with the preprocessor, most C/C++ compilers are single-pass. Since both preprocessing and actual compilation require everything to be defined first, they can be single-pass, and they can be (and usually are) integrated so that preprocessing is done on the fly. In other words, the preprocessor feeds the compiler tokens, and for macro expansions it feeds it tokens from its token cache.

Single pass compilers have to resolve the definition immediately,
I have no idea what this means.

which limits the definition ostensibly to native types, while a multi pass compiler can resolve the macro in situ, which means it can resolve to the type that is appropriate for the context.
Also wrong. Macros are text (or rather, token) replacements, and the preprocessor doesn't know anything about types.

All modern C/C++ compilers that I know of are multi-pass.
Strange, I know not a single one, unless you count MSC's template instantiation.

The inefficient code generated by a single pass compilation is now regarded as not worth the increase in speed.
Inefficient machine code? Nah, optimizers work on an intermediate representation and don't care in the least whether the front-end is single- or multi-pass.

You can see the legacy of single pass compilation in C/C++'s requirement for forward declaration. This is more or less unnecessary with multi-pass compilers,
True, see Java.

but it's part of the standard and so can't be changed, which is one reason I am so vocal against adding anything (e.g. the STL) that isn't absolutely critical to the survival of the language to the standard.
Making standard C single-pass-capable was critical to the survival of the language, I'll bet.
I'll also argue that a language without a standard library, i.e. something you can rely on being available, is brutally handicapped.