Thread: So, why does MS make the distinction between text and binary file I/O?

  1. #1
    Has a Masters in B.S.
    Join Date
    Aug 2001
    Posts
    2,263

    So, why does MS make the distinction between text and binary file I/O?

    Why does MS make the distinction between text and binary file I/O?

    Is there any logical reasoning behind it?




    BTW: this is not a bash-MS thread; it's a serious question, so please spare the irrelevant comments.
    ADVISORY: This user's posts are rated CP-MA, for Mature Audiences only.

  2. #2
    &TH of undefined behavior Fordy's Avatar
    Join Date
    Aug 2001
    Posts
    5,793

    Re: So, why does MS make the distinction between text and binary file I/O?

    Originally posted by no-one
    Why does MS make the distinction between text and binary file I/O?

    Is there any logical reasoning behind it?

    BTW: this is not a bash-MS thread; it's a serious question, so please spare the irrelevant comments.
    Maybe for Unicode implementations? That might be part of it, I guess... hmm... good question.

  3. #3
    Has a Masters in B.S.
    Join Date
    Aug 2001
    Posts
    2,263
    But doesn't *nix support Unicode without this difference?

  4. #4
    &TH of undefined behavior Fordy's Avatar
    Join Date
    Aug 2001
    Posts
    5,793
    Originally posted by no-one
    But doesn't *nix support Unicode without this difference?
    Yeah, true...

  5. #5
    Registered User
    Join Date
    Jun 2002
    Posts
    151
    Interpretation of control characters.

  6. #6
    Registered User Aran's Avatar
    Join Date
    Aug 2001
    Posts
    1,301
    Also, Java separates byte streams and char streams... which I find to be a pain in the rear.

  7. #7
    no-one
    Guest
    >Interpretation of control characters.

    This doesn't make sense to me; shouldn't interpretation be left to the application, depending on the file's data?

    This is a rhetorical question, not aimed at you.

  8. #8
    Registered User
    Join Date
    Jun 2002
    Posts
    151
    Not if you want to use a standard set of control characters to achieve something and they don't correspond to the binary values expected by an o/s. It'll need to know that you're using these characters to represent a control character rather than their actual binary value.

    Perhaps I've misunderstood your point, but at a lower level (CreateFile, ReadFile, WriteFile, etc.) Windows doesn't differentiate between text and binary file I/O anyway. It's just a C runtime library feature.

  9. #9
    Just because ygfperson's Avatar
    Join Date
    Jan 2002
    Posts
    2,490
    I thought it was because of the carriage return/linefeed thing in DOS. I don't know anything for sure, but I know I had to make some changes to my data compression program to fix some encoding problems.

  10. #10
    Has a Masters in B.S.
    Join Date
    Aug 2001
    Posts
    2,263
    >It'll need to know that you're using these characters to represent a control character rather than their actual binary value.

    Yes, but shouldn't this be done by context? (Not as in device contexts or anything like that.)

    Excuse me if I'm being a bit slow here; it just doesn't seem like enough to justify the feature. Am I mistaken, or is it a mere convenience?

  11. #11
    Registered User
    Join Date
    Jun 2002
    Posts
    151
    >yes but shouldn't this be done by context?

    It could be, but you'd have to be able to dynamically change the control characters that applications and the OS look out for. If an application has been hard-coded to look for 0xD 0xA to indicate a new line, and you write your text file in binary using just the C control char '\n' (0xA), then it can't tell whether you wanted a new line or not (0xA by itself could have a completely different native meaning).

    You could probably add some runtime code to your own apps to check for either 0xD 0xA or 0xA by itself, but why bother when you can just set a compile-time flag? Alternatively, you could write all your Windows text I/O in binary using the native control chars, but that would break source portability, since compilers for other OSes will still accept the flags for other I/O modes even if they have no effect.

  12. #12
    no-one
    Guest
    I get it, but I still don't agree with it. It looks to me like pure convenience; it might help with certain issues, but I just don't think it's a necessity. That's just my opinion...

    Thanks for the answer(s), though.

  13. #13
    Visionary Philosopher Sayeh's Avatar
    Join Date
    Aug 2002
    Posts
    212
    The difference is irrelevant. Just treat everything as binary (it is anyway). Most storage devices are not character-based devices; they are block devices. Work in big chunks and use RAM for character-based work.
    It is not the spoon that bends, it is you who bends around the spoon.

