Thread: Something strength !!!!!!!

  1. #1
    Registered User
    Join Date
    Jun 2003
    Posts
    70

    Something strength !!!!!!!

    Please look at the following C code:

    #include <stdio.h>

    int main(void)
    {
        char c1;
        int i1;
        char c2;

        /* casting a pointer to unsigned is implementation-defined,
           but it matches the original output format */
        printf("%u %u %u\n", (unsigned)&c1, (unsigned)&i1, (unsigned)&c2);
        return 0;
    }


    The output I got from the above code is:

    65497 65498 65501


    Here the difference between the first two numbers is 1, which is fine since the size of the char type is 1. But I want to know why the difference between the 2nd and 3rd numbers is 3, while the size of the int data type is only 2. So, according to me, the output should be as follows:

    65497 65498 65500

    Can anybody please tell me why this happens ?

    Thanking You,
    Chintan R Naik

  2. #2
    Banned master5001's Avatar
    Join Date
    Aug 2001
    Location
    Visalia, CA, USA
    Posts
    3,685
    Sometimes your compiler tries to be smart and aligns data on 32-bit boundaries, though I don't believe that is the case here. To be perfectly frank, if I wanted to know exactly where c1, i1, and c2 sit relative to each other in the stack frame, I'd write the function in assembly. Trust your compiler's judgement in this area; I haven't seen many instances where the compiler's output assembly is off in a way that will cost you performance.
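
    As an aside: the compiler is free to place local variables anywhere, but struct members keep their declared order, so alignment padding is easy to observe there. A minimal sketch using the same three variables as struct members (the actual offsets will vary by compiler and target):

    ```c
    #include <stddef.h>
    #include <stdio.h>

    /* Struct members keep their declared order, so any padding the
       compiler inserts for alignment shows up in the offsets. */
    struct example {
        char c1;
        int  i1;
        char c2;
    };

    int main(void)
    {
        printf("c1 at offset %u\n", (unsigned)offsetof(struct example, c1));
        printf("i1 at offset %u\n", (unsigned)offsetof(struct example, i1));
        printf("c2 at offset %u\n", (unsigned)offsetof(struct example, c2));
        printf("total size: %u bytes\n", (unsigned)sizeof(struct example));
        return 0;
    }
    ```

    On a typical 32-bit compiler this reports offsets 0, 4, 8 and a total of 12, because i1 is padded out to a 4-byte boundary and the struct is rounded up to a multiple of its strictest alignment.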

  3. #3
    Been here, done that.
    Join Date
    May 2003
    Posts
    1,164
    16 bit compiler? DOS? Probably one or both.
    Definition: Politics -- Latin, from
    poly meaning many and
    tics meaning blood sucking parasites
    -- Tom Smothers

  4. #4
    Registered User
    Join Date
    Jun 2003
    Posts
    70
    >16 bit compiler? DOS? Probably one or both.

    No..... I ran this program fragment on a 32-bit computer running Windows 98.
    Chintan R Naik

  5. #5
    Registered User
    Join Date
    May 2003
    Posts
    1,619
    Yes, but was it COMPILED for a 32-bit computer? 16-bit code will still run.

    And it seems certain you didn't compile it for 32 bits -- an int is typically 4 bytes under most popular 32-bit implementations. We know the final character variable can't overlap the integer, so the int must have a size of 2 bytes (or 3, but I'm not aware of any compiler that uses a 3-byte integer type; certainly no x86 compiler).

    It seems obvious it was compiled as a 16-bit program. And it really doesn't matter where the compiler chooses to put things in memory -- the compiler gets to put them anywhere it feels like.
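
    One quick way to settle the 16-bit vs. 32-bit question is to ask the compiler directly. A minimal sketch:

    ```c
    #include <stdio.h>

    int main(void)
    {
        /* A 16-bit DOS compiler will typically report a 2-byte int;
           most 32-bit compilers report 4 bytes. sizeof(char) is
           always 1 by definition. */
        printf("sizeof(char) = %u\n", (unsigned)sizeof(char));
        printf("sizeof(int)  = %u\n", (unsigned)sizeof(int));
        return 0;
    }
    ```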

  6. #6
    Registered User
    Join Date
    Jun 2003
    Posts
    70
    >And it really doesn't matter where the compiler chooses to put things into memory -- the compiler gets to put them anywhere it feels like.

    Do you mean that efficiency, particularly of memory allocation, depends completely on the compiler?

    Please look at the following two code fragments. Suppose both are run on a 16-bit computer and are compiled for 16 bits.

    1.
    #include <stdio.h>

    int main(void)
    {
        int i = 0;
        char c = 'a';

        printf("%c %d\n", c, i);
        return 0;
    }

    2.
    #include <stdio.h>

    int main(void)
    {
        char c = 'a';
        int i = 0;

        printf("%c %d\n", c, i);
        return 0;
    }

    What I am thinking is that the first one will execute faster than the second one -- faster by one instruction cycle -- because, according to me, in the first fragment both variables are allocated on an access boundary.

    Now, if allocation depends completely on the compiler, then I think we don't have to look at this point..... is that ok???
    Chintan R Naik
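
    One note on the snippets above: printing an address with %u is not portable; %p with a cast to void * is the sanctioned way. A sketch of the same experiment written portably:

    ```c
    #include <stdio.h>

    int main(void)
    {
        char c = 'a';
        int  i = 0;

        /* %p with a (void *) cast is the portable way to print an
           address; passing a raw pointer to %u is undefined behavior
           on most modern platforms. */
        printf("c lives at %p\n", (void *)&c);
        printf("i lives at %p\n", (void *)&i);
        printf("%c %d\n", c, i);
        return 0;
    }
    ```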

  7. #7
    ATH0 quzah's Avatar
    Join Date
    Oct 2001
    Posts
    14,826
    There is no point in optimizing most code. If you really need to, use the compiler flags which do so. Other than that, use a profiler on it to see where your bottlenecks are and rethink your algos. It's pointless to optimize "Hello World!".

    Quzah.
    Hope is the first step on the road to disappointment.
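
    To put that advice into practice, even a crude timer beats guessing. A minimal sketch using the standard clock() function (the loop and its bound are an arbitrary example, not from this thread):

    ```c
    #include <stdio.h>
    #include <time.h>

    int main(void)
    {
        /* volatile keeps the optimizer from deleting the loop */
        volatile long long sum = 0;
        long long n;
        clock_t start, end;

        start = clock();
        for (n = 0; n < 10000000LL; n++)
            sum = sum + n;
        end = clock();

        printf("sum = %lld\n", (long long)sum);
        printf("elapsed: %.3f s\n", (double)(end - start) / CLOCKS_PER_SEC);
        return 0;
    }
    ```

    A real profiler gives per-function numbers; this only tells you how long one suspected hot spot takes, which is still enough to tell whether it is worth optimizing at all.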

  8. #8
    zsaniK Kinasz's Avatar
    Join Date
    Jan 2003
    Posts
    222
    Every cycle counts. If you loop 8,000,000,000 times at some point, then three or four extra cycles in each iteration will add up to a noticeable difference in run time.

    I'm newer to C than I am to assembly, but I don't think that just because you're programming at a higher level for faster processors you can say there is no point in optimizing code.
    "Assumptions are the mother of all **** ups!"
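
    The arithmetic behind that claim is easy to sketch. Assuming a hypothetical 1 GHz clock (that figure is an assumption, not from this thread):

    ```c
    #include <stdio.h>

    int main(void)
    {
        /* Hypothetical figures: 8e9 iterations, 3 extra cycles each,
           1 GHz clock. None of these numbers are measurements. */
        double iterations   = 8e9;
        double extra_cycles = 3.0;
        double hz           = 1e9;

        printf("extra run time: %.0f seconds\n",
               iterations * extra_cycles / hz);
        return 0;
    }
    ```

    That works out to 24 seconds of extra run time at 1 GHz; on the slower clocks common when this thread was written, it would be proportionally longer.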

  9. #9
    the hat of redundancy hat nvoigt's Avatar
    Join Date
    Aug 2001
    Location
    Hannover, Germany
    Posts
    3,130
    Quzah's point is that you should first find out where and what to optimize before starting to throw in wild optimization guesses.
    hth
    -nv

    She was so Blonde, she spent 20 minutes looking at the orange juice can because it said "Concentrate."

    When in doubt, read the FAQ.
    Then ask a smart question.

  10. #10
    Registered User
    Join Date
    Jun 2003
    Posts
    70
    Kinasz, you are right..... but I was thinking this way because I think it is required at the core level. Suppose in the future I get some microcontroller-based or embedded-systems work, which is of interest to me (though I don't have any experience in that area yet); then I think knowledge of these kinds of techniques will be very helpful.
    Chintan R Naik

