-
K&R book question
I'm reading this magnificent book and I came across a paragraph that I do not understand. It's the last paragraph on page 39.
"Enumerations provide a convenient way to associate constant values with names, an alternative to #define with the advantage that the values can be generated for you. Although variables of enum types may be declared, compilers need not check that what you store in such a variable is a valid value for the enumeration. Nevertheless, enumeration variables offer the chance of checking and so are often better than #defines. In addition, a debugger may be able to print values of enumeration variables in their symbolic form."
Can someone rephrase this for me? Thanks a lot.
-
Code:
// good: the compiler assigns the values 0, 1, 2 for you
enum { ORANGE, APPLE, PEAR };
typedef enum { RED, AMBER, GREEN } light;
light lamp = RED;
// bad: every value written and maintained by hand
#define RED 0
#define AMBER 1
#define GREEN 2
int lamp = RED;
> an alternative to #define with the advantage that the values can be generated for you.
You don't have to invent all the numbers yourself; imagine having to number several dozen constants by hand.
Then imagine having to renumber them all when you want to insert something new in the middle.
> Nevertheless, enumeration variables offer the chance of checking and so are often better than #defines
In the enum case, the compiler may be able to warn you about dumb things like
lamp = 42;
lamp = ORANGE;
With a #define, RED is just the number 0 by the time the compiler sees it, so it has no way to catch this.
> In addition, a debugger may be able to print values of enumeration variables in their symbolic form.
Given
int lamp = RED;
you'd always see 0 when the variable is printed in the debugger, and then you'd have to go look up what 0 means yourself. With
light lamp = RED;
a good debugger will print RED when you examine the variable.