Thread: NULL Define - Why?

  1. #16
    Programming Sex-God Polymorphic OOP's Avatar
    Join Date
    Nov 2002
    Posts
    1,078
    Originally posted by Magos
    I prefer to use NULL since it's more obvious what you're doing, making a pointer into a NULL pointer. By setting it to 0 manually, you can misread and think that you set what it's pointing at to 0.
    It's NOT as obvious what you are doing when you use NULL. C++ doesn't define a NULL pointer as a pointer with the value NULL, NULL means nothing directly in C++ -- a "NULL pointer" is defined as a pointer with the value 0. Using NULL doesn't make anything more clear, it just opens up the question "what is NULL defined as here." If someone doesn't understand that when you set a pointer's value to 0 that it's a NULL pointer, then they don't understand the language.

    Originally posted by Magos
    Get rid of the magic numbers and use definitions instead, it makes the code easier to read and understand.
You don't use magic numbers when there is a possibility that you would at some point want to change the value. That is not the case with 0. Setting a pointer's value to 0 is a direct part of the language -- defining NULL as 0 doesn't make your code any better, because you'd never want to change NULL to another value when using it for a NULL pointer, and if you did, you'd only open yourself up to errors. The reason you'd use a constant (macros, particularly in places like this, are generally poor practice) is when you have a number that you want to be able to easily change when you compile. You would never want to change NULL to a different value, and if you did, your code wouldn't compile, because the only integer literal you can assign directly to a pointer is 0. NULL just adds the possible question "what is NULL defined as here?" That is silly. A NULL pointer has the value 0, period. You'd never want to change that. Unless NULL becomes a direct part of the language, it's not as safe as using 0, and despite what you claim, using 0 is clearer to someone who knows the language, because that is how the language defines a NULL pointer.
    Last edited by Polymorphic OOP; 12-14-2002 at 06:12 PM.

  2. #17
    Carnivore ('-'v) Hunter2's Avatar
    Join Date
    May 2002
    Posts
    2,879
    Maybe polymorphic is an expert at C++ and can understand the magic 0's, but I'm not, and I personally find NULL to be more convenient (though maybe less safe), especially when passing parameters... if there is this function:
    Code:
    void foob(int x, int y, Something *something, DWORD flags, char *text);
    Then to call it, you could do one of these:
    Code:
    foob(0, 0, NULL, 0, NULL);
    foob(0, 0, 0, 0, 0);
    I find the first one easier to read, because it is immediately apparent at a glance exactly which parameters are pointers and which aren't (true, NULL can be used for other things, but only someone stupid would do so).

    And besides, the NULLs can be used as landmarks, so you know the 2 parameters in front are the position, and the flags are between the 2 NULLs.

  3. #18
    Banned master5001's Avatar
    Join Date
    Aug 2001
    Location
    Visalia, CA, USA
    Posts
    3,685
    I think Polymorphic is correct. There is no mandate saying I can't go around and change NULL to something other than zero. In C, NULL is usually defined as ((void *)0), whereas in C++ it is simply 0. Therefore it isn't as obvious per se. I'd like to think that people don't redefine NULL as something other than zero, but we don't live in a perfect world. A common example would be something like this:

    Code:
    #ifdef NULL
    #undef NULL
    #define NULL ((MyClass *)0)
    #endif
    As you can see, NULL is still a pointer to nothing; however, it is now specified as a MyClass pointer to nothing. This is great and all, but now saying char *MyString = NULL is totally invalid without typecasting the NULL. Using NULL is my preference, but sometimes NULL can cause problems without you doing anything to your code.

  4. #19
    Code Monkey Davros's Avatar
    Join Date
    Jun 2002
    Posts
    812
    Thanks for all your replies. I'm glad I asked that question (I think), especially as it relates to producing APIs to be called by code built with compilers other than my own.

    If it were a rigid requirement that NULL is zero (rather than ((void *)0) or something else), I would use it. Certainly, when I have been using NULL, I've been assuming it is zero.

    I think, however, I'll follow Stroustrup's recommendations in the future. This issue about 'readability' of NULL pointers is a trivial one to me.

  5. #20
    Confused Magos's Avatar
    Join Date
    Sep 2001
    Location
    Sweden
    Posts
    3,145
    I agree with Hunter, it makes things a lot clearer.

    Another thing.
    Nobody in their right mind would redefine NULL. If you assume that people can redefine anything, then you can't have this in your headers anymore:
    Code:
    #ifndef MYHEADER
    #define MYHEADER
    
    ...
    
    #endif
    What if I defined _WINDOWS_ then included windows.h?
    Crash bang boom, it wouldn't work. I'd get lots of 'undefined function/stuff' errors.
    It's unsafe since people 'have the potential' to define/redefine stuff.

    A NULL pointer has a value 0, period
    So why would anyone redefine it then? If someone still does it, it's his fault and his problem.

    and despite what you claim, using 0 is more clear to someone who knows the language because that is how the language defines a NULL pointer.
    A NULL pointer is defined as a pointer that does not point at anything; basically it has nothing to do with the number 0. The reason 0 was chosen is that it was the most fitting value (0 = nothing, and it interferes as little as possible with 'real' addresses), but it could just as well be 3, 12, 2^32 - 1 (all 1s), or anything.
    Who knows, future systems might operate in a different manner and another number than 0 may be more logical.
    Last edited by Magos; 12-15-2002 at 10:18 AM.
    MagosX.com

    Give a man a fish and you feed him for a day.
    Teach a man to fish and you feed him for a lifetime.

  6. #21
    Code Monkey Davros's Avatar
    Join Date
    Jun 2002
    Posts
    812
    Yes. But you seem to be insisting on two different things. First you quote:

    >A NULL pointer has a value 0, period

    Then you say:

    >A NULL pointer is defined as a pointer that does not point at anything, basically it has nothing to do with the number 0.

    If I provide a routine in a DLL and say this pointer parameter may be NULL, I really mean zero. If someone is working in a different environment (maybe I wrote my API in C++ and someone is accessing it from a pure C environment), NULL to them could mean something else (i.e. ((void *)0)). Rather, I need to say this parameter should be zero if it's not pointing at anything. If you look at the MS Win API, they tend to specify zero rather than NULL.

    As I've no intention of doing ropey things like:

    #ifdef NULL
    #undef NULL
    #define NULL 0
    #endif

    ...I'm safer sticking with zero, especially when everyone needs to have the same understanding.
    Last edited by Davros; 12-15-2002 at 10:45 AM.

  7. #22
    Confused Magos's Avatar
    Join Date
    Sep 2001
    Location
    Sweden
    Posts
    3,145
    Of course you have to define NULL as the same value everywhere, but my point was that it doesn't necessarily have to be 0 (though 0 simplifies things, don't get me wrong on that). By using NULL rather than 0, you don't care what it's defined as, just that it is a pointer not pointing at anything.

  8. #23
    Code Monkey Davros's Avatar
    Join Date
    Jun 2002
    Posts
    812
    Thanks for the reply.

    >but my point was that it doesn't necessarily have to be 0 (though it simplifies things, don't get me wrong on that)

    I understand what you are saying: NULL can be defined as anything, as long as it is universally recognised that this 'anything' means NULL. However, this becomes difficult in a world of constant change and multi-platform development environments.

    It's been a while, but I once did a bit of work on Unix. As I recall, there were all sorts of #defines, #ifdefs, and #undefs everywhere in a battle to maintain source compatibility between HP-UX and Solaris. It got an order of magnitude worse when Windows became involved. I don't like #define anymore, but sometimes it's necessary.
    OS: Windows XP
    Compilers: MinGW (Code::Blocks), BCB 5

    BigAngryDog.com
