Thread: Using a raw number vs. reading from a variable, what's faster?

  1. #1
    Registered User
    Join Date
    Aug 2013
    Posts
    73

    Using a raw number vs. reading from a variable, what's faster?

    I know it's maybe a bit of a noobie question, but I want to know:
    Is using a raw number faster than reading it from a variable? (I'm not talking about the time it takes to initialize the variables, only the read time.)
    I'm talking about situations where I define a global variable and use it so the code is easier to read and understand. Does it cost extra time?

  2. #2
    Registered User
    Join Date
    Nov 2012
    Posts
    1,393
    Could you write some code snippets to demonstrate the two different ways you have in mind? By "raw numbers" I'm guessing you mean numeric constants. If you write some code samples, you can also check the generated assembly for those samples to see whether there is a difference. Normally the optimizer should be able to make an intelligent choice between the two possibilities if there is a speed difference to be had.
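    For instance, a minimal pair of snippets you could try (the function names and file name here are made up purely for illustration):
    Code:
    /* constant.c */
    #define LIMIT 10
    
    int sum_with_macro(void)
    {
        int total = 0;
        int i;
        for (i = 0; i < LIMIT; i++)  /* LIMIT is replaced by the literal 10 */
            total += i;
        return total;
    }
    
    int sum_with_literal(void)
    {
        int total = 0;
        int i;
        for (i = 0; i < 10; i++)     /* literal written directly */
            total += i;
        return total;
    }
    Compiling with something like gcc -O2 -S constant.c and comparing the two functions in the resulting constant.s would show whether the generated instructions differ at all (most likely they are identical).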

  3. #3
    and the Hat of Guessing tabstop
    Join Date
    Nov 2007
    Posts
    14,336
    If you were using a global variable, it would make it harder to read and understand the code, so that's surely not what you're doing. If you mean #define, then that name is replaced by the actual number before the compiler actually sees it.

    And for that matter, for a number to be used it pretty much has to end up in a register anyway, whether it be directly loaded or copied from memory. Directly loading may be marginally faster, but I doubt the difference would be measurable.
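    You can watch that replacement happen by running only the preprocessor. A minimal sketch (the file name is made up):
    Code:
    /* macro.c */
    #define TEN 10
    
    int f(void)
    {
        return TEN;   /* the compiler proper never sees the name TEN */
    }
    Running gcc -E macro.c prints the translation unit after preprocessing; the function body comes out as return 10; so by the time code is generated there is no name left to look up.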

  4. #4
    Registered User
    Join Date
    Aug 2013
    Posts
    73
    Yeah, I meant using a global variable (#define)

    Code:
    #define ZERO 0
    #define TEN 10
    
    int i;
    
    // this is using the constants
    for (i = ZERO; i < TEN; i++) {
        // .....
    }
    
    // this is using "raw" numbers
    for (i = 0; i < 10; i++) {
        // .....
    }

  5. #5
    C++ Witch laserlight
    Join Date
    Oct 2003
    Location
    Singapore
    Posts
    28,413
    Quote Originally Posted by patishi
    Yeah, I meant using a global variable (#define)
    A macro constant is not a "global variable" in the sense that we commonly speak of global variables. Your macro constant will be replaced at compile time with its value, hence there is no accessing of a variable: the resulting code could well be identical.

    Furthermore, a compiler may (or may not) apply optimisations with respect to variables with constant values. As such, you should strive to use what expresses your intention better and makes your code easier to read and understand. In this case, the use of the named constant would be better if the name were descriptive: ZERO and TEN are not really better than 0 and 10, since all you did was use the English names of those magic numbers.
    Quote Originally Posted by Bjarne Stroustrup (2000-10-14)
    I get maybe two dozen requests for help with some sort of programming or design problem every day. Most have more sense than to send me hundreds of lines of code. If they do, I ask them to find the smallest example that exhibits the problem and send me that. Mostly, they then find the error themselves. "Finding the smallest program that demonstrates the error" is a powerful debugging tool.
    Look up a C++ Reference and learn How To Ask Questions The Smart Way

  6. #6
    Registered User
    Join Date
    Aug 2013
    Posts
    73
    Yeah, of course, I just gave those as an example. My real intention is to use
    Code:
    enum { WHITE, BLACK };
    instead of 0 and 1 in a game, for example, or
    Code:
    enum { FALSE, TRUE };

    So are you saying that a #define constant is faster than using a normal constant variable?

  7. #7
    C++ Witch laserlight
    Join Date
    Oct 2003
    Location
    Singapore
    Posts
    28,413
    Quote Originally Posted by patishi
    My real intention is to use << an enum >>
    That could be a good idea.

    Quote Originally Posted by patishi
    So are you saying that a #define constant is faster than using a normal constant variable?
    No, I am saying that "you should strive to use what expresses your intention better and makes your code easier to read and understand". If you require your constant to have a limited scope, then a variable declared const could well be appropriate, since variables obey the normal rules of scope whereas macros do not. On the other hand, if you want to use the constant to declare an array, and you don't want the array to be variable length, then you would likely declare it as a macro constant (the sketch below illustrates why).
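    A minimal sketch of that array point (this is C semantics; the names are made up):
    Code:
    #define BOARD_SIZE 8           /* macro: a true integer constant expression */
    enum { NUM_SQUARES = 64 };     /* enum constant: also a constant expression */
    const int board_size = 8;      /* const variable: not a constant expression in C */
    
    int rows[BOARD_SIZE];          /* fine */
    int squares[NUM_SQUARES];      /* fine */
    /* int cols[board_size]; */    /* error at file scope; inside a function it
                                      would be a variable length array (C99) */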

  8. #8
    Registered User
    Join Date
    Jun 2005
    Posts
    6,815
    Quote Originally Posted by patishi
    So are you saying that a #define constant is faster than using a normal constant variable?
    Nobody said that. The only way you could find out whether that is true is to test it: write two versions of the code that differ only in this specific respect, develop a test harness so you can gather proper timing measurements by doing the same operations (say) a few billion times, and measure. Repeat for all possible compilers, compiler settings (particularly optimisation settings), and operating systems.
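    A rough sketch of such a harness (the iteration count and workload are placeholders, and even then the differences may drown in measurement noise):

    Code:
    #include <stdio.h>
    #include <time.h>
    
    #define N 10    /* swap this for an enum constant or a const int and re-run */
    
    int main(void)
    {
        volatile long sink = 0;  /* volatile so the optimiser cannot delete the loop */
        long rep;
        int i;
        clock_t start = clock();
    
        for (rep = 0; rep < 100000000L; rep++)
            for (i = 0; i < N; i++)
                sink += i;
    
        printf("elapsed: %f s (sink = %ld)\n",
               (double)(clock() - start) / CLOCKS_PER_SEC, sink);
        return 0;
    }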

    On balance, after a couple of years of your life have been wasted working this out, you will probably find there is little difference, but that the specifics vary between compilers. Sometimes a #define will be faster, sometimes an enum will, and most of the time the difference will not be significant enough to worry about.

    Personally, I'd stick with the enums. The advantages in terms of having understandable code that is easier to get right vastly exceed the occasional saving of CPU cycles you might achieve if you find that some other approach is a picosecond faster on average.

    And, by using enums, I would avoid the label "premature optimiser", which you have just earned.
    Right 98% of the time, and don't care about the other 3%.

    If I seem grumpy or unhelpful in reply to you, or tell you you need to demonstrate more effort before you can expect help, it is likely you deserve it. Suck it up, Buttercup, and read this, this, and this before posting again.

  9. #9
    Registered User
    Join Date
    Apr 2013
    Posts
    1,658
    In general, compile-time constants will probably end up as immediate values within instructions, while variables will be loaded from memory, so compile-time constants will be slightly faster. It doesn't matter whether the compile-time constants are defines, numbers, or enums.
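    A sketch of what that can look like (the assembly in the comments is typical x86-64 output, not guaranteed; the exact instructions vary by compiler and settings):
    Code:
    int limit = 10;    /* a real variable: stored in memory */
    
    int from_constant(void)
    {
        return 10;     /* typically:  mov eax, 10       (immediate operand) */
    }
    
    int from_variable(void)
    {
        return limit;  /* typically:  mov eax, [limit]  (load from memory) */
    }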

  10. #10
    Registered User MutantJohn
    Join Date
    Feb 2013
    Posts
    2,665
    I think the OP has a worthwhile question but more often than not, algorithms >>>>>> such small changes to code.

  11. #11
    Registered User
    Join Date
    Jun 2005
    Posts
    6,815
    Quote Originally Posted by MutantJohn
    I think the OP has a worthwhile question but more often than not, algorithms >>>>>> such small changes to code.
    The question is about trying to find a general answer where there is no general answer, and where any real-world difference probably doesn't matter.

    There is no single answer of the form "#defines are more efficient than enums". If the compiler is smart enough, it will pick the more efficient of the two, so performance does not need to be a factor in deciding which to use. If the compiler is not that smart, it is pretty rare for the difference to even matter to the user of the software, and the right answer depends on host machines, compilers, and even compiler settings (i.e. it depends).

    It takes considerable effort to get non-trivial code working and to keep it working when maintaining it. If there is no demonstrated need (e.g. one identified by testing and profiling the working software, once some overall performance deficiency is experienced by end users or shows up as excess power usage in a server farm), there is no point worrying about such low-level details. Pick the one that is easiest to understand and maintain, that a compiler can detect problems with, and so on. Usually that means it is better to use enums over #defines.
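    As one concrete example of a problem a compiler can catch for enums but not for #defines, most compilers (gcc and clang with -Wall, for instance) warn about an unhandled enumerator in a switch. A sketch:
    Code:
    enum colour { WHITE, BLACK };
    
    const char *colour_name(enum colour c)
    {
        switch (c) {
        case WHITE:
            return "white";
        /* forgetting BLACK here draws a warning ("enumeration value 'BLACK'
           not handled in switch"); with #define WHITE 0, #define BLACK 1 and
           a plain int parameter, it would compile silently */
        }
        return "unknown";
    }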

  12. #12
    Registered User
    Join Date
    Dec 2006
    Location
    Canada
    Posts
    3,229
    Pretty sure enums are compile-time constructs only. They should produce code exactly the same as if you used #define (or a literal value).

  13. #13
    Algorithm Dissector iMalc
    Join Date
    Dec 2005
    Location
    New Zealand
    Posts
    6,318
    Quote Originally Posted by cyberfish
    Pretty sure enums are compile-time constructs only. They should produce code exactly the same as if you used #define (or a literal value).
    Indeed. In effect an enum constant is just an integer literal with compile-time type safety around it.
    Other than possibly checking that an enum-typed variable holds a value corresponding to one from the enumeration, which I doubt is common, I would expect identical assembly code.

    I would recommend, as a first step here, making sure you know the correct terminology for things, i.e. learn what these are (a small snippet tagging each follows the list):
    Global variable
    Local variable
    Enumeration definition
    Enumeration variable
    Integer literal
    Pre-processor macro
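    For example (a sketch; all the names are made up):
    Code:
    #define TEN 10                 /* pre-processor macro */
    
    int g_count;                   /* global variable */
    
    enum colour { WHITE, BLACK };  /* enumeration definition */
    
    void demo(void)
    {
        int local = 42;            /* local variable; 42 is an integer literal */
        enum colour c = WHITE;     /* enumeration variable */
        (void)local; (void)c;      /* silence unused-variable warnings */
    }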
    My homepage
    Advice: Take only as directed - If symptoms persist, please see your debugger

    Linus Torvalds: "But it clearly is the only right way. The fact that everybody else does it some other way only means that they are wrong"
