Ha ha. But no, I deem OSes that provide C libraries that by default insist on mangling the contents of the files non-sane.
Old Mac OS (prior to X) used to have really funky files, with resources attached that you couldn't reach just by reading the data. Some of those resources were used by the UI for icons, some were used internally by the application. It works fine if you never transfer files to or from a different OS. If you did... well, the results were much funkier than you'd imagine.
Newline mangling is peanuts compared to those issues, although all the various little problems associated with the insane "binary"/"text" file handling differences are like a million papercuts. I'd understand them in a high-level language, but in the base C libraries, it's sheer idiocy.
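For what it's worth, the stdio escape hatch from that text/binary split is the "b" flag to fopen. Here's a minimal sketch (the helper name and scratch-file path are made up for illustration) showing a byte-exact round trip in binary mode:

```c
#include <stdio.h>
#include <string.h>

/* Round-trip a buffer through a file opened in binary mode and
 * report whether the bytes came back unchanged. Hypothetical demo
 * helper; "path" is just a scratch file name. */
static int binary_roundtrip_ok(const char *path,
                               const char *data, size_t n)
{
    char back[64];
    size_t got;
    FILE *fp = fopen(path, "wb");   /* "b": no newline translation */
    if (!fp)
        return 0;
    fwrite(data, 1, n, fp);
    fclose(fp);

    fp = fopen(path, "rb");         /* plain "r" on a Windows CRT
                                       would fold CRLF into LF here */
    if (!fp)
        return 0;
    got = fread(back, 1, sizeof back, fp);
    fclose(fp);
    remove(path);
    return got == n && memcmp(data, back, n) == 0;
}
```

On POSIX systems the "b" is a no-op, which is exactly the point: adding it costs nothing and protects your data everywhere else.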
(Just before the end of the millennium, I debugged a printing issue in a mixed PC-Mac lab. Certain jobs printed from Macs via a Linux AppleTalk fileserver would get corrupted, sometimes. Typically it was jobs containing bitmap graphics; text and vector stuff tended to be okay. It turned out to be a newline-convention issue -- there was an extra 10 <-> 13 conversion applied to binary data on the Mac end. Fortunately, there was a workaround.)
If I need to read text input that may have originated on non-POSIXy systems, I always resort to universal newline support (that is, \r\n \n\r \r \n are all considered newlines, and are checked in that order). Anything else is just stabbing yourself in the brain with deformed cacti.
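A minimal sketch of what I mean by universal newlines (the helper is made up, not any standard API): the two-character terminators are checked before the one-character ones, so a CRLF or LFCR pair consumes both bytes instead of producing a bogus empty line.

```c
#include <stdio.h>
#include <string.h>

/* Read one logical line from fp into buf, accepting "\r\n", "\n\r",
 * "\r", and "\n" (checked in that order) as line terminators.
 * Returns the line length, or -1 at EOF with no data read.
 * Hypothetical helper for illustration only. */
static int read_universal_line(FILE *fp, char *buf, size_t size)
{
    size_t len = 0;
    int c = fgetc(fp);

    if (c == EOF)
        return -1;
    while (c != EOF && c != '\r' && c != '\n') {
        if (len + 1 < size)          /* silently truncate long lines */
            buf[len++] = (char)c;
        c = fgetc(fp);
    }
    if (c != EOF) {
        int next = fgetc(fp);
        /* swallow the second byte of a CRLF or LFCR pair; anything
         * else belongs to the next line, so push it back */
        if (next != EOF && !((c == '\r' && next == '\n') ||
                             (c == '\n' && next == '\r')))
            ungetc(next, fp);
    }
    buf[len] = '\0';
    return (int)len;
}
```

Note the one genuine ambiguity: a bare \n immediately followed by a bare \r line is indistinguishable from an \n\r pair, which is why the pair forms have to win. In practice mixed-convention files are rare enough that this trade-off never bites.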