Thread: difference between int array and char array

  1. #1
    Registered User
    Join Date
    Aug 2012
    Location
    Lagos, Nigeria
    Posts
    17

    difference between int array and char array

    What is the difference between an integer array and a char array in C? Is an integer array terminated with a sentinel character too?

  2. #2
    Registered User
    Join Date
    Sep 2006
    Posts
    8,868
    Quote Originally Posted by bluechip View Post
    What is the difference between an integer array and a char array in C? Is an integer array terminated with a sentinel character too?
Chars are, by definition, one-byte data types. Integers are larger on most systems; the exact size depends on the type and the platform. No, integer arrays are not terminated with any sentinel character.

  3. #3
    Registered User
    Join Date
    Sep 2012
    Posts
    357
    An integer array is an object capable of holding values of type int.
    A char array is an object capable of holding values of type char.

    There are no other differences between them.

    Also, a char array is not terminated with a sentinel character. Strings are terminated with a sentinel character. Strings (including the sentinel character) are saved in char arrays.

    All strings are char arrays; not all char arrays are strings.

  4. #4
    Registered User ssharish2005's Avatar
    Join Date
    Sep 2005
    Location
    Cambridge, UK
    Posts
    1,732
An int array takes more memory than a char array with the same number of elements.

An int typically takes 4 bytes on current compilers, though the exact size depends on the compiler and platform. If you say

Code:
int array[10];   <== typically allocates 40 consecutive bytes, with no sentinel at the end.

char array[10];  <== allocates exactly 10 consecutive bytes.
A char takes exactly one byte, so a char array of n elements occupies n * sizeof(char) = n bytes. The amount of memory allocated is therefore very different. Also, strings are terminated with a null character ('\0'), whereas int arrays have no terminator.
It is also important to note that bounds checking must be done explicitly in both cases.

    ssharish
    Life is like riding a bicycle. To keep your balance you must keep moving - Einstein

  5. #5
    Registered User piyush.sharma's Avatar
    Join Date
    Aug 2012
    Location
    Noida, India
    Posts
    9
I think both are the same; the only difference should be the number of bytes allocated.

  6. #6
    C++ Witch laserlight's Avatar
    Join Date
    Oct 2003
    Location
    Singapore
    Posts
    28,413
    Quote Originally Posted by Adak
Integers are larger on most systems; the exact size depends on the type and the platform.
Quote Originally Posted by ssharish2005
An int array takes more memory than a char array with the same number of elements.
Quote Originally Posted by piyush.sharma
I think both are the same; the only difference should be the number of bytes allocated.
    My experience has been too limited to have encountered such a system, but it is possible that sizeof(int) == sizeof(char).
    Quote Originally Posted by Bjarne Stroustrup (2000-10-14)
    I get maybe two dozen requests for help with some sort of programming or design problem every day. Most have more sense than to send me hundreds of lines of code. If they do, I ask them to find the smallest example that exhibits the problem and send me that. Mostly, they then find the error themselves. "Finding the smallest program that demonstrates the error" is a powerful debugging tool.
    Look up a C++ Reference and learn How To Ask Questions The Smart Way

  7. #7
    Registered User
    Join Date
    Sep 2006
    Posts
    8,868
    Quote Originally Posted by piyush.sharma View Post
I think both are the same; the only difference should be the number of bytes allocated.
I think what piyush.sharma means is that chars are like ints, but smaller in size (fewer bytes and thus a smaller range). On Turbo C/C++ ver. 1.01 for DOS, ints were twice as large as chars, but short ints were the same size as chars.

  8. #8
    Registered User
    Join Date
    May 2012
    Location
    Arizona, USA
    Posts
    948
    Quote Originally Posted by Adak View Post
    I think what piyush.sharma means, is that chars are like ints, but smaller in size (bytes and thus, range). On Turbo C/C++ ver. 1.01 for DOS, the integers were twice as large as the char's, but short integers were the same size as char's.
Were chars 16 bits wide in that compiler? The short int type is required to be at least 16 bits wide, so if what you say is true, then char must also be at least 16 bits wide (or else the compiler is completely non-standard).

I can't imagine char being anything other than 8 bits in any DOS compiler. Are you sure that sizeof(char) == sizeof(short) in that compiler? I'm almost willing to bet that sizeof(short) == sizeof(int) == 2 (and sizeof(char) == 1 by definition).

  9. #9
    Been here, done that.
    Join Date
    May 2003
    Posts
    1,164
    Quote Originally Posted by Adak View Post
    On Turbo C/C++ ver. 1.01 for DOS, the integers were twice as large as the char's, but short integers were the same size as char's.
    Not true... short and int were the same size:

    Code:
    C:\TC>tcc x
    Turbo C++  Version 1.00 Copyright (c) 1990 Borland International
    x.c:
    Turbo Link  Version 3.0 Copyright (c) 1987, 1990 Borland International
    
            Available memory 409952
    
    C:\TC>x
    2  2
    C:\TC>
Code:
#include <stdio.h>

int main(void)
{
    printf("%d  ", (int) sizeof(int));
    printf("%d\n", (int) sizeof(short));
    return 0;
}
    Definition: Politics -- Latin, from
    poly meaning many and
    tics meaning blood sucking parasites
    -- Tom Smothers

  10. #10
    Registered User
    Join Date
    Sep 2006
    Posts
    8,868
    Quote Originally Posted by christop View Post
Were chars 16 bits wide in that compiler? The short int type is required to be at least 16 bits wide, so if what you say is true, then char must also be at least 16 bits wide (or else the compiler is completely non-standard).

I can't imagine char being anything other than 8 bits in any DOS compiler. Are you sure that sizeof(char) == sizeof(short) in that compiler? I'm almost willing to bet that sizeof(short) == sizeof(int) == 2 (and sizeof(char) == 1 by definition).
This was back when you dodged the T-Rex on your way to the store!

There was no C standard at that time - there was an AT&T C standard, and that's what Borland's Turbo C followed.

Chars were 8 bits, shorts were 16 bits, and ints were 16 bits. Long ints were 32 bits. After reading WaltP's post I wanted to check it, but the only computer that still has it installed is refusing to boot up. He appears to be using the same compiler, however, so I bow to his fact check.

Couldn't get the old hardware going, but I did find a header file on another computer - WaltP is correct. Shorts were 16 bits, not 8.

Good catch, WaltP!
    Last edited by Adak; 09-11-2012 at 02:49 PM.

  11. #11
    Registered User
    Join Date
    Dec 2006
    Location
    Canada
    Posts
    3,229
I have used a few 8-bit architectures, and even there int is usually 16 bits wide, which stretches the C notion that int has the natural word size of the processor (16 bits is definitely not the natural size on an 8-bit architecture).

  12. #12
    Registered User
    Join Date
    May 2012
    Location
    Arizona, USA
    Posts
    948
    Quote Originally Posted by cyberfish View Post
    I have used a few 8-bit architectures, and even then, int is usually 16-bit, which violates the C definition that int is the natural word size of the processor (they are definitely not on 8-bit archs).
    The standards require int (and short int) to be at least 16 bits wide, which is not necessarily the processor's natural word size. On most processors int is chosen to be the easiest word size for the processor to handle for practical reasons (perhaps because int is the most-used type in C).

  13. #13
    Registered User piyush.sharma's Avatar
    Join Date
    Aug 2012
    Location
    Noida, India
    Posts
    9
    Quote Originally Posted by christop View Post
    Were char's 16 bits long in that compiler? The short int type is required to be at least 16 bits wide, so if what you say is true, then char must also be at least 16 bits wide (or else the compiler is completely non-standard).

    I can't imagine char being anything other than 8 bits in any DOS compiler. Are you sure that sizeof(char) == sizeof(short) in that compiler? I'm almost willing to bet that sizeof(short) == sizeof(int) == 2 (and sizeof(char) == 1 by definition).
I have worked with 3 or 4 compilers, on both 32-bit and 64-bit OSes. So we should keep in mind that the result on one OS and compiler can differ from another OS/compiler. This is C.

  14. #14
    Registered User
    Join Date
    May 2012
    Location
    Arizona, USA
    Posts
    948
    Quote Originally Posted by piyush.sharma View Post
I have worked with 3 or 4 compilers, on both 32-bit and 64-bit OSes. So we should keep in mind that the result on one OS and compiler can differ from another OS/compiler. This is C.
    Of course. I was referring to a single compiler on one OS.

    It really irks me that many people assume that int is capable of handling values larger than 32767. While this is true on some platforms/compilers, it's not true on others, and the standards require only that int be capable of representing the integer range [-32767,32767]. If you want to handle large values (up to +/-2 billion), use long int!

I work with a system/compiler (m68000/gcc) where int is 16 bits wide and all pointers are 32 bits wide, so things go south very quickly when functions that take or return pointers are not properly prototyped (there's also the fact that pointers are returned in a different register (a0) than integer types (d0)). For example, if I use memory allocated with malloc but forget to include stdlib.h, the system will probably fall over very quickly (there is virtually no memory protection on this particular system).

