What is the difference between an integer array and a char array in C? Is an integer array terminated with a sentinel character too?
An integer array is an object capable of holding values of type int.
A char array is an object capable of holding values of type char.
There are no other differences between them.
Also, a char array is not terminated with a sentinel character. Strings are terminated with a sentinel character. Strings (including the sentinel character) are saved in char arrays.
All strings are char arrays; not all char arrays are strings.
An int array takes more memory than a char array of the same length. An int typically takes up to 4 bytes on a modern machine (it depends on the compiler you're using; any recent compiler should use 4 bytes), while a char takes 1 byte, so n elements occupy n * sizeof(int) versus n * sizeof(char) bytes. The amounts of memory allocated are therefore very different. Also, strings are terminated with a null character, whereas int arrays are not.
Code:
int array[10];   /* typically allocates 40 consecutive bytes, with no sentinel at the end */
char array[10];  /* allocates 10 consecutive bytes */
It is also important to note that bounds checks need to be made explicitly in both cases.
ssharish
I think both are the same; the only difference should be the number of bytes allocated.
Quote:
Originally Posted by Adak
Quote:
Originally Posted by ssharish2005
My experience has been too limited to have encountered such a system, but it is possible that sizeof(int) == sizeof(char).
Quote:
Originally Posted by piyush.sharma
Were chars 16 bits long in that compiler? The short int type is required to be at least 16 bits wide, so if what you say is true, then char must also be at least 16 bits wide (or else the compiler is completely non-standard).
I can't imagine char being anything other than 8 bits in any DOS compiler. Are you sure that sizeof(char) == sizeof(short) in that compiler? I'm almost willing to bet that sizeof(short) == sizeof(int) == 2 (and sizeof(char) == 1 by definition).
Not true... short and int were the same size:
Code:
C:\TC>tcc x
Turbo C++ Version 1.00 Copyright (c) 1990 Borland International
x.c:
Turbo Link Version 3.0 Copyright (c) 1987, 1990 Borland International
Available memory 409952
C:\TC>x
2 2
C:\TC>
Code:
#include "include\stdio.h"
int main()
{
printf("%d ", sizeof(int));
printf("%d ", sizeof(short));
return 0;
}
This was back when you dodged the T-Rex, on your way to the store!
There was no C standard at that time - there was an AT&T C standard, and that's what was followed by Borland's Turbo C.
Chars were 8 bits, shorts were 16 bits, and ints were 16 bits. Long ints were 32 bits. After reading WaltP's post, I wanted to check it, but the only computer that still has that compiler installed is refusing to boot up. He appears to be using the same compiler, however, so I bow to his fact check.
Couldn't get the old hardware to go, but I did find a header file on another computer - WaltP is correct. Shorts were 16 bits, not 8.
Good catch, WaltP!
I have used a few 8-bit architectures, and even there int is usually 16 bits wide, which violates the C definition that int is the natural word size of the processor (16 bits is definitely not the natural word size on an 8-bit arch).
The standards require int (and short int) to be at least 16 bits wide, which is not necessarily the processor's natural word size. On most processors int is chosen to be the easiest word size for the processor to handle for practical reasons (perhaps because int is the most-used type in C).
Of course. I was referring to a single compiler on one OS.
It really irks me that many people assume that int is capable of handling values larger than 32767. While this is true on some platforms/compilers, it's not true on others, and the standards require only that int be capable of representing the integer range [-32767,32767]. If you want to handle large values (up to +/-2 billion), use long int!
I work with a system/compiler (m68000/gcc) where int is 16 bits wide and all pointers are 32 bits wide, so things go south very quickly when functions that take or return pointers are not properly prototyped (there's also the fact that pointers are returned in a different register (a0) than integer types (d0)). For example, if I use memory allocated with malloc but forget to include stdlib.h, the system will probably fall over very quickly (there is virtually no memory protection on this particular system :().