Yea. Data always helps. I do not want programming to become a field where emotions and platitudes rule the day. From the point of view of the person you are trying to persuade (more on that later), you are not providing sufficient evidence to back up your claims.
Originally Posted by Elysia
My contention with this statement is that the more C++ is used, the more these errors will come from C++ code as well. I can't appreciate the difference.
As for security issues... yeah, how many buffer overrun issues have you seen in software (especially big ones such as Internet Explorer and Windows)? That should be proof enough.
I mean, I will grant you that people do not have to do things the C way when you have helpful classes, but if you expect such code not to produce run-time errors, then I think you'll end up hoist by your own petard. How many C++ people know about the STL debug mode?
Funny story. I'm studying Java right now, and the book I am reading recommends using BigDecimal for money. Of course, I had a hard time believing that a serious financial application does this, because scaling integers is easy and correct. So that led to some googling, and I found a page with a most interesting fact: "A standard long value can store the current value of the United States national debt (as cents, not dollars) 6477 times without any overflow."
Also, do you know why banks do not use C in their applications? Especially ones that handle money?
And how is that a negative mark on C? My memory of the book is hazy, but Deep C Secrets states that separating the C compiler from lint, a static analysis tool, was the worst thing to happen in C's history.
Oh, and btw, there was once a time when I saw an innocent-looking piece of code. At first glance, I really couldn't find many problems with it.
But you know what? Applying static analysis on that piece of code turned up a lot of problems that I couldn't find.
This just goes to show that we make mistakes, and we are not perfect.
And there is no free lint for C++. C++ sucks then, right?
I cannot appreciate this argument since formal education has warped you into a C++ version of this.
Then there are people who just learn C in college and assume that this language is the way to go. Then there is the fact that most OS APIs (such as Linux's) are C APIs.
Then there are people who think C is cool and awesome and can do everything, and therefore it must be the best. Some colleges teach C out of ignorance, just to throw in some programming course.
C is popular, so it is bad? That is the kind of argument I expect to hear about mainstream music, Mr. Hipster.
And because C is so popular, there are tons of tutorials, courses and tools for C. That causes more people to jump on the bandwagon.
One wonders why you have so much effort to spend slaying the C dragon.
It's a vicious cycle. The reasons why C is still in use on the desktop are endless.
Did they now? I'm having trouble confirming your statement.
But you know, Microsoft purposefully did not include C as a language for their new Metro API. I wonder why...
On the one hand, Metro is nothing like what you say it is. Wikipedia leads me to believe that it is a design language that makes programs look like Windows Phone apps. That certainly doesn't seem very relevant.
But further digging helped me find a slide from Microsoft's Build conference from over a year ago. C is on it:
Microsoft to developers: Metro is your future | ZDNet
It appears you mean the WinRT API.
Which is C++ with proprietary extensions:
Windows Runtime - Wikipedia, the free encyclopedia
So I will concede that you are at least factually correct. But if I may, do you have an intelligent answer to your own question? Why does WinRT exclude C? It would seem that C can use WinRT.
And I promised a comment on something earlier. To me, it sounds like you are asking a fisherman to throw away his rod, and that this whole debate is mean-spirited in origin.