The problem is, they are correct. 100 billion bytes is 100 gigabytes (GB), by conventional SI unit reckoning. Now, if they claimed a capacity of 100 GiB for an actual capacity of 100 billion bytes, that would be a different matter altogether.

It's false advertising IMO, same as HDDs that advertise 500GB when in fact they only have 465, but they choose to use fuzzy math and say they have 100 billion bytes, that's 100 gigabytes, which it isn't.
Look up a C++ Reference and learn How To Ask Questions The Smart Way
Originally Posted by Bjarne Stroustrup (2000-10-14)
The term GiB was added after the fact to 'resolve' this issue. GB was always 1048576 KB (KiB by the new 'standard') when dealing with computers, just as the term calorie in nutrition is known to mean kilocalorie. The advertising retards were just taking advantage of the same jargon having a different meaning in two disciplines, in layman's terms. Naturally they chose the one that made their product look better. This was not always the case; for many years they correctly used the proper values, and only switched when they encountered a client base that was not savvy enough to know the difference.
If you say kilobyte, anyone who is a programmer automatically thinks 1024 bytes, not 1000 bytes. The term kilo when applied to computer math was borrowed from the metric system, but it is not in fact the same. Or at least it wasn't until the retards at SI kowtowed to industry and made the distinction. IEEE never made such a distinction AFAIK, which is why industry prefers SI, and engineers prefer IEEE.
Last edited by abachler; 04-10-2008 at 08:57 AM.
That's true, but the market consists of more than just programmers and other engineers ("a client base that was not savvy enough to know the difference"). Consequently, they can defend their choice of units, even though it is intentionally misleading.

Originally Posted by abachler
If you say kilobyte, anyone who is a programmer automatically thinks 1024 bytes, not 1000 bytes.
To the end user, yes, but I don't think we as programmers and engineers should encourage its use amongst ourselves. I don't use the term GiB, and I generally look with disdain upon those that do, unless it is specifically referring to the difference between the two. In my life at least, the people I socialize with are mostly either engineers themselves, or educated enough to know the difference. Granted, the drooling masses neither know nor care as long as they can surf the web for 'movies', so then why even bother having separate terms?
Last edited by abachler; 04-10-2008 at 09:44 AM.
Nah, you don't need to be a programmer to recognize the difference between 1024 bytes and 1000 bytes... any average computer user can notice it when they plug in their new 500GB HDD and see that they can only partition about 465GB according to their operating system. If this problem were limited only to programmers and engineers, the uproar wouldn't be nearly as large as it has been.
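For anyone who wants to check where that "missing" 35GB goes, the whole thing is two lines of arithmetic; a quick Python sketch:

```python
# A "500 GB" drive as the manufacturer counts it (SI/decimal bytes)...
advertised_bytes = 500 * 10**9

# ...recounted the way the OS does (binary units: 2**30 bytes per "GB")
os_reported_gb = advertised_bytes / 2**30

print(round(os_reported_gb, 1))  # 465.7
```

Same number of bytes either way; only the size of the "giga" changes.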
Sent from my iPad®
Well, I haven't seen anyone advertise MB as megabits, and if they are, that's worth complaining about. However, I remember, back when I was very young and knew very little about computers, wondering why my 56K connection only got a max of 7KB/sec on downloads. After all, we didn't even call it a 56Kb connection, it was always simply "56K".
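The math behind that 7KB/sec ceiling, for the curious (a quick Python sketch; real modems lose a little more to framing and line conditions, so treat 7 as a best case):

```python
# "56K" means 56,000 bits per second (decimal kilo, the usual
# convention for line rates), so the best-case byte rate is:
line_rate_bps = 56_000
max_bytes_per_sec = line_rate_bps / 8  # 8 bits per byte, ignoring overhead

print(max_bytes_per_sec / 1000)  # 7.0 -> the familiar "7 KB/sec" cap
```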
Last edited by SlyMaelstrom; 04-10-2008 at 10:39 AM.
Actually, I believe the convention for bits is to use lowercase, e.g., Kbps and Mbps.
*cough* Thanks, captain obvious. *sneeze* *yawn* *burp*
There was a reason I *hiccup* said "when I was very young and knew very little about computers"... *wheeze* Clearly, I understand that now... and yes, *gag* *cough* as I said in my original post, the first reply on this topic: *pant* I hope you know how to take things lightly. *pant*

Originally Posted by SlyMaelstrom
*/pant* */pant* */cough* */gag* */wheeze* */hiccup* */burp* */yawn* */sneeze* */cough*
Last edited by SlyMaelstrom; 04-10-2008 at 11:10 AM.
The problem is, as programmers we're not trying to be "liars" by stating that a K is 1024. It simply is. But introducing the "kibi" unit only solidified in the minds of laypersons the idea that we're trying to pull one over on them. "Look, they were forced to invent a new unit to make it obvious how they're screwing us! Buncha liars!"
b is bits, B is bytes by common convention. I believe the IEEE even made it a standard. Bytes are more the norm for storage, while bits are used for communications, because it typically takes more than 8 bits to transfer a byte, but exactly how many is dependent on the protocol, not the hardware.
Bits per second are more meaningful when discussing communications technology. They also happen to be bigger numbers, and companies like to put big numbers on their product because people like big numbers.
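A quick Python sketch of that point; the 10-bits-per-byte figure below is purely an illustrative assumption (one start and one stop bit around each 8-bit byte, as on a plain serial link), since as noted above the real number depends on the protocol:

```python
# Converting an advertised link rate (bits/s) into usable bytes/s.
# bits_per_byte is protocol-dependent; 10 is illustrative only,
# not a fact about any particular product.
def usable_bytes_per_sec(link_bps, bits_per_byte=10):
    return link_bps / bits_per_byte

print(usable_bytes_per_sec(56_000))      # 5600.0 with framing overhead
print(usable_bytes_per_sec(56_000, 8))   # 7000.0 raw, no overhead
```

Note how the same link sounds better quoted as "56K" than as "5.6 KB/sec".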
50% of the human race has an IQ less than 100. Pigs have an IQ of 65. 65 * 1.5 = 97.5, which means a lot of people are barely half again as smart as a pig.
I see dumb people, they walk around like everybody else, most of them don't even know they are dumb.