Insulted for using C++?

This is a discussion on Insulted for using C++? within the General Discussions forums, part of the Community Boards category.

  1. #61
    Registered User
    Join Date
    Sep 2004
    Location
    California
    Posts
    3,246
    Now, just write a program in Java that produces the same output! See my point?
    I don't see your point at all. All you did was use a language feature Java doesn't have, so it would take a few extra lines of code to implement the same thing in Java. That would be like me writing some Java code which uses anonymous classes, then asking someone to write the same thing in C++. What does that prove? Nothing.

    Now pretend C++ didn't have destructors. You'd basically be forced to resort to something like:
    Well, it's true that that is how you would do it in C++ without destructors. In Java, though, you would use the finally-without-catch idiom.
    Code:
    void foo()
    {
        // Locals must be initialized to null, or the null checks in the
        // finally block will not compile (definite assignment).
        A a = null;
        B b = null;
        C c = null;
        D d = null;
        try {
            a = new A();
            b = new B();
            c = new C();
            d = new D();
        }
        finally {
            if (a != null)
                a.cleanup();
            if (b != null)
                b.cleanup();
            if (c != null)
                c.cleanup();
            if (d != null)
                d.cleanup();
        }
    }
    Last edited by bithub; 09-17-2009 at 02:29 PM.
    bit∙hub [bit-huhb] n. A source and destination for information.

  2. #62
    (?<!re)tired Mario F.'s Avatar
    Join Date
    May 2006
    Location
    Portugal
    Posts
    7,487
    Indeed. Essentially idiomatic approaches being applied to the reality of each language.

    There's no clear-cut advantage of one over the other unless we dig into more specific details. But in that case, they will not "wrong" or "right" a certain language feature like deterministic destructors. They will simply expose that for a certain problem, one language may be more appropriate than another.

    What I find particularly interesting about the nature of destructors is that very rarely is this a problem. That is, during the course of a complete programming career it is possible that one can count on one hand the number of times one had to reject a method simply because it wouldn't work. And that is perhaps already saying a lot.

    EDIT: However, being dispassionate about anything is something I have trouble with. So, despite everything, I do prefer C++'s deterministic approach. I just don't see it as particularly advantageous. For the greatest chunk of my programming career I used non-deterministic destructors without even knowing it (Visual Basic). That didn't stop me from filling my dinner plate, getting a comfortable life, and producing software that is still in operation and that I'm especially proud of.
    Last edited by Mario F.; 09-17-2009 at 02:45 PM.
    The programmer’s wife tells him: “Run to the store and pick up a loaf of bread. If they have eggs, get a dozen.”
    The programmer comes home with 12 loaves of bread.


    Originally Posted by brewbuck:
    Reimplementing a large system in another language to get a 25% performance boost is nonsense. It would be cheaper to just get a computer which is 25% faster.

  3. #63
    Cat without Hat CornedBee's Avatar
    Join Date
    Apr 2003
    Posts
    8,893
    Quote Originally Posted by Sebastiani View Post
    Sorry, circular logic doesn't count.
    There's no circular reasoning here. Garbage collection was a design goal of Java, and many other things, such as the lack of stack objects or deterministic destructors, are a consequence of this.
    All the buzzt!
    CornedBee

    "There is not now, nor has there ever been, nor will there ever be, any programming language in which it is the least bit difficult to write bad code."
    - Flon's Law

  4. #64
    Guest Sebastiani's Avatar
    Join Date
    Aug 2001
    Location
    Waterloo, Texas
    Posts
    5,708
    Quote Originally Posted by CornedBee
    There's no circular reasoning here.
    Well, in the sense that my question implied "Is there a compelling reason not to provide deterministic destructors, as opposed to GC?", your logic does seem somewhat circular.

    Quote Originally Posted by CornedBee
    Garbage collection was a design goal of Java, and many other things, such as the lack of stack objects or deterministic destructors, are a consequence of this.
    More likely, the thought of handling cyclic references prevented its inclusion - I'm honestly considering sending Bill Joy a personal email to clear the matter up.

    Anyway, suit yourselves. The matter seems quite clear to me, but apparently this is not a universal sentiment. The debate rages on, I suppose.

  5. #65
    Disrupting the universe Mad_guy's Avatar
    Join Date
    Jun 2005
    Posts
    258
    Besides that (and this may be an over-generalized observation), in my experience it seems that most Java/C# programmers have a tendency to be much less proficient at the algorithmic aspect of writing code, which indicates to me that the fact that the very design of these languages discourages "fine-grain" control actually inhibits a deeper understanding of computer science. C/C++, on the other hand, really force you to consider the effects of *everything* you do, which I think results in much more robust programmers (in general, at least).
    I think your reasoning is completely misguided to be honest - the point of being abstract with a programming language is so that we can truly be precise in specifying the problem we want to solve, and thus, let our brain work on the truly bigger problems in our domain. It is the same as mathematics, and I think Alfred Whitehead said it best: "The symbols for integers illustrate the enormous importance of a good notation. By relieving the brain of all unnecessary work, a good notation sets it free to concentrate on more advanced problems, and in effect increases the mental power of the race."

    Truly this is why we have technologies like garbage collection, virtual machines and in fact different programming languages all together. The goal is to abstract away the details which are really unimportant to the problem, and let our brain focus on more important problems.

    Computer Science is not about pointers and it's not about the implications of C++ virtual methods or const-references. Edsger Dijkstra said it best: “Computer science is no more about computers than astronomy is about telescopes.” It is much larger than that in other words.

    I do not know what you speak of when you reference "algorithmic aspect of writing code." Are you referring to big O notation or something? This is not at all relevant to the programming language used (mostly, google 'Purely Functional Data Structures' if you're into that kind of thing.) How does C#/Java inhibit programmers from seeing/applying these 'algorithmic aspects' to their code which apparently become so glaringly obvious in a language like C++? Your argument is not very well defined.

    I also happen to disagree with a lot of the assertions you've made in your more technically-oriented posts, particularly concerning the using block for controlling the usage of a resource (seriously, the abstraction of 'opening a resource, doing a computation with said resource, closing resource' is incredibly useful for a large amount of things. There is no reason to repeat yourself and you should instead just abstract it generally, like the using block does) but I'm going to leave it at that for the most part.

    In short, I think the idea "low level code is just more better because of control and stuff, no exception" is a very harmful ideology, and characterizing an entire demographic of developers because you happen to hold it isn't a great way to make your argument. I implore you to look very deeply into what you've said and think seriously about why people continue to invest in areas like programming language technology and development. It is to make our lives easier, not more difficult, and leave our brain to bigger issues, and, as Whitehead said, effectively increase the mental power and ability of the race.
    operating systems: mac os 10.6, debian 5.0, windows 7
    editor: back to emacs because it's more awesomer!!
    version control: git

    website: http://0xff.ath.cx/~as/

  6. #66
    Malum in se abachler's Avatar
    Join Date
    Apr 2007
    Posts
    3,189
    Quote Originally Posted by Mad_guy View Post
    I think your reasoning is completely misguided to be honest - the point of being abstract with a programming language is so that we can truly be precise in specifying the problem we want to solve, and thus, let our brain work on the truly bigger problems in our domain. It is the same as mathematics, and I think Alfred Whitehead said it best: "The symbols for integers illustrate the enormous importance of a good notation. By relieving the brain of all unnecessary work, a good notation sets it free to concentrate on more advanced problems, and in effect increases the mental power of the race."
    1st, use proper quote tags so we can read the quoted text in context.

    2nd, your assertion is completely out of touch with real-world programming, where performance can make or break an application. By abstracting away that 'unnecessary' work you lose sight of the fact that how you do it is often almost as important as what you do. This is specifically what the poster you quoted was stating. Even if you can come up with an elegant solution that is completely abstracted away from the hardware and 100% portable, if it takes 100 years (or in some cases 100ms) to run, it is a failure.
    Last edited by abachler; 09-18-2009 at 01:44 AM.
    Until you can build a working general purpose reprogrammable computer out of basic components from radio shack, you are not fit to call yourself a programmer in my presence. This is cwhizard, signing off.

  7. #67
    Cat without Hat CornedBee's Avatar
    Join Date
    Apr 2003
    Posts
    8,893
    Quote Originally Posted by Sebastiani View Post
    Well, in the sense that my question implied "Is there a compelling reason not to provide deterministic destructors, as opposed to GC?", your logic does seem somewhat circular.
    I saw no such implication. I interpreted the question as a simple, "Why would any language choose not to have deterministic destructors?"
    And the answer is that they are simply incompatible with automatic, cycle-capable, efficient garbage collection. If you don't have cycle-capable garbage collection, you can't make it automatic; the risk of bugs is too high, and the programmer can't really do anything about it. Examples: the infamous cycle bug in IE's JavaScript garbage collector, where you would get a massive memory leak if you had a cycle between a native JS object and a DOM node. Also, the extremely naive garbage collector in Warcraft 3's map scripting language.
    Similarly, if you have inefficient garbage collection, you're in trouble too. Java 1 was a complete failure on the desktop because of its poor GC implementation; every few seconds or minutes (depending on the load), the application would actually hang for a moment while the GC ran. This (and the not very efficient interpreter) gave Java a reputation for slowness that it still hasn't fully shaken to this day.
    The problem is, nobody has yet come up with an efficient, cycle-capable, automatic garbage collector that allows deterministic destruction.

  8. #68
    Guest Sebastiani's Avatar
    Join Date
    Aug 2001
    Location
    Waterloo, Texas
    Posts
    5,708
    Quote Originally Posted by Mad_guy
    I think your reasoning is completely misguided to be honest - the point of being abstract with a programming language is so that we can truly be precise in specifying the problem we want to solve, and thus, let our brain work on the truly bigger problems in our domain. It is the same as mathematics, and I think Alfred Whitehead said it best: "The symbols for integers illustrate the enormous importance of a good notation. By relieving the brain of all unnecessary work, a good notation sets it free to concentrate on more advanced problems, and in effect increases the mental power of the race."
    Except that those are things not inherent to languages such as C# and Java, IMO. Rather, they have a tendency to weigh down the programmer with clumsy and unnecessary constructs.

    Quote Originally Posted by Mad_guy
    Truly this is why we have technologies like garbage collection, virtual machines and in fact different programming languages all together. The goal is to abstract away the details which are really unimportant to the problem, and let our brain focus on more important problems.
    No. We have GC because history has shown that the average schmoe tends to forget to release resources properly. We have virtual machines because the average operating system either isn't smart enough or simply doesn't care enough to protect its users properly. And, well, a proliferation of programming languages quite simply because everyone has their own druthers.

    Quote Originally Posted by Mad_guy
    Computer Science is not about pointers and it's not about the implications of C++ virtual methods or const-references. Edsger Dijkstra said it best: “Computer science is no more about computers than astronomy is about telescopes.” It is much larger than that in other words.
    I totally agree with you there.

    Quote Originally Posted by Mad_guy
    In short, I think the idea "low level code is just more better because of control and stuff, no exception" is a very harmful ideology, and characterizing an entire demographic of developers because you happen to hold it isn't a great way to make your argument. I implore you to look very deeply into what you've said and think seriously about why people continue to invest in areas like programming language technology and development. It is to make our lives easier, not more difficult, and leave our brain to bigger issues, and, as Whitehead said, effectively increase the mental power and ability of the race.
    As I said, it's just an observation. Years ago, when I was at a family Christmas get-together, I remember someone had gotten one of those 'Electronics Kits' that you always see at electronics stores - all sorts of impressive experiments and whatnot. But on closer inspection, I realized that it was really just a bunch of pre-assembled components with "insert pin 'X' here" type labels everywhere. He didn't seem to mind, of course - in his mind he had mastered electronics! But it made me think back to the stories that my grandfather used to tell me about how, at only 10 or 11 years old, he built working radios from scrap and clever mechanical contraptions using only spare parts from old machines. Back in those days, you see, you could barely get a hold of the crudest materials, much less some sort of integrated component. Ironically, it was probably this very *lack* of facilities that helped inspire an entire generation to be so inventive, and in some sense, propel us into the modern age. So my point is that when you know that it all depends on you to get things working, you're much more likely to gain some real insight (and expertise). That certainly isn't always the case, of course, but often enough it is.

    So if you love C#/Java - fine. As long as you understand the internals of the machine and don't isolate yourself from trying new things - more power to you. But often the sentiment I hear from C#/Java programmers is that unmanaged code is "evil" and "dangerous", and as such they won't touch a C compiler with a ten-foot pole. This, in my mind, is 1000 times more destructive than a C++ programmer (who also happens to use those languages every day) complaining about their crappy design. I might be biased by my opinions, but at least I'm not blinded by ignorance.

    That said, I do think you made some excellent points.

  9. #69
    Guest Sebastiani's Avatar
    Join Date
    Aug 2001
    Location
    Waterloo, Texas
    Posts
    5,708
    Quote Originally Posted by CornedBee View Post
    I saw no such implication. I interpreted the question as a simple, "Why would any language choose not to have deterministic destructors?"
    And the answer is that they are simply incompatible with automatic, cycle-capable, efficient garbage collection. If you don't have cycle-capable garbage collection, you can't make it automatic; the risk of bugs is too high, and the programmer can't really do anything about it. Examples: the infamous cycle bug in IE's JavaScript garbage collector, where you would get a massive memory leak if you had a cycle between a native JS object and a DOM node. Also, the extremely naive garbage collector in Warcraft 3's map scripting language.
    Similarly, if you have inefficient garbage collection, you're in trouble too. Java 1 was a complete failure on the desktop because of its poor GC implementation; every few seconds or minutes (depending on the load), the application would actually hang for a moment while the GC ran. This (and the not very efficient interpreter) gave Java a reputation for slowness that it still hasn't fully shaken to this day.
    The problem is, nobody has yet come up with an efficient, cycle-capable, automatic garbage collector that allows deterministic destruction.

    Well, sure, now that you actually qualify your statement, it makes much more sense.

    I understand your point, but I would argue that if this is the case (that the feature wasn't included because of the difficulty of implementation) then perhaps we just aren't ready for auto-magical memory management.

  10. #70
    spurious conceit MK27's Avatar
    Join Date
    Jul 2008
    Location
    segmentation fault
    Posts
    8,300


    Quote Originally Posted by Mad_guy View Post
    Edsger Dijkstra said it best: “Computer science is no more about computers than astronomy is about telescopes.”
    Pretty interesting (things that make you go "hmmm"); HOWEVER, with astronomy you can at least say that it is still really about something concrete to which a science may apply itself -- and it is not called "telescope science", so this is in fact a bad analogy. If computer science is not about computers in the same way, what is it about? Pure speculation? "Logic"? Language? Philosophy? Method? Life, the universe and everything? Whatever I want? It seems to me Mr. Dijkstra may be a victim of his own pseudo-eloquence here. Some of these people operate against the background of anglo-analytic philosophy, which sometimes is presented as the rightful inheritor of the mantle of all possible sanity, like a religion, instead of just a useful endeavour which gave us things like computer science, but no more (or less) than that. I appreciate mysticism, tho -- which is what that is.

    The Whitehead quote is a great one; I just do not think it supports your argument except in a superficial way, because there is still the question of the value of your chosen notation, which is context dependent*. So it is still exactly the same argument: anyone could turn around and say, "Clearly Whitehead's statement applies best to Fortran". But since I don't disagree with the basic gist of your argument, I won't argue.

    * I don't think there can be a best programming language in absolute terms. I do think there can be, and is, a "root"/fundamental one -- assembly -- which is easy to define and (hopefully) beyond contention. And then there can be better and worse, once you define your goals.

    Quote Originally Posted by Sebastiani
    No. We have GC because history has shown that the average shmoe tends to forget to release resources properly.
    Strongly disagree, altho it may be good for that too. The reason is that with a lot of high-level tasks, doing your own garbage collection on a modern system is just a waste of time; it might as well be automated, as the expense is minimal. I think you are suspicious of it because you almost never rely on it. This does not mean I think *all* languages need it; for one thing, that would reduce diversity -- here I'd go for a "computer science as ecology" metaphor.

    Of course, your observation about the electronics kit is interesting here: who do you think ended up with the most impressive app? vs. who do you think learned the most? I imagine once you have collected enough garbage, you may want to move on... there is no more to learn, nothing interesting and experimental left there.
    Last edited by MK27; 09-18-2009 at 06:37 AM.
    C programming resources:
    GNU C Function and Macro Index -- glibc reference manual
    The C Book -- nice online learner guide
    Current ISO draft standard
    CCAN -- new CPAN like open source library repository
    3 (different) GNU debugger tutorials: #1 -- #2 -- #3
    cpwiki -- our wiki on sourceforge

  11. #71
    (?<!re)tired Mario F.'s Avatar
    Join Date
    May 2006
    Location
    Portugal
    Posts
    7,487
    Quote Originally Posted by MK27 View Post
    If computer science is not about computers in the same way, what is it about? Pure speculation ? "Logic"? Language? Philosophy? Method? Life, the universe and everything? Whatever I want?
    The Dijkstra quote is:
    “Computer science is no more about computers than astronomy is about telescopes.”
    The interpretation I make of this is not one that negates the relevance of computers to Computer Science. He merely tries to remind us that the Computer Science field is much bigger than its tools. We also know the context in which he wrote this phrase, and can safely say that the quote is a direct defense of his belief that the act of programming is merely a tool of Computer Science, or even the application of the Computer Science field in everyday life. Programming is simply executing previously gained knowledge.

    Ironically enough, this quote is a double-edged sword. It didn't stop him -- nor could it -- from being so anal about several programming languages of his time. So a tool is, according to him, still a possible subject of analysis.

    This puts the argument "all programming languages (within reason) are created equal" that I and others defend to the test. Personally I find Dijkstra went a little overboard on this letter. Probably not his proudest moment. The human mind is thankfully a lot more adaptable than that and one can evolve their skills from bad programming languages into good programming languages over the course of a single programming career. The thought that once you learn BASIC as your first language you are forever doomed into mediocrity is a bad argument. It is in fact negating the value of an entire generation of programmers. It is also not really very compatible with the tens of thousands of years of human history in which new and better knowledge was gained on the foundation of old and bad knowledge. How could this be, if we were doomed to not adapt to new knowledge?

    But I don't entirely deny the value of his arguments (that programming languages are a subject of criticism). I personally feel languages such as Java are inferior in a few respects, some more important than others. And I still have yet to understand the insistence on creating interpreted programming languages. However, I cannot deny their ability to produce software. And this is where the debate begins.
    Last edited by Mario F.; 09-18-2009 at 07:23 AM.

  12. #72
    spurious conceit MK27's Avatar
    Join Date
    Jul 2008
    Location
    segmentation fault
    Posts
    8,300
    Quote Originally Posted by Mario F. View Post
    The interpretation I make of this is not that of negating the relevance of Computer Science on computers. He merely tries to remind us that the Computer Science field is much bigger than its tools.
    Point taken; perhaps the phrase "not about" should have been "not just about" or (most accurately) "not primarily about", but this still just looks like an inflated piece of vacuous rhetoric to me, since it is very hard to read as not implying that computers are a tool for studying _______, and thus _______ is the real focus of computer science. Literally, that is the metaphor with astronomy. My point is the ______ cannot be filled in, so this is a "pie in the sky" sentiment. Computer science is primarily and fundamentally about computers; astronomy is not primarily and fundamentally about telescopes -- this is still just a very silly analogy. And computer science is in fact no bigger than its tools -- the concerns of computer science historically have always revolved around, and are mostly derived from, problems with the tools.

    Consider the way mad_guy uses it: the _________ (higher purpose or whatever) would be a justification for the idea that language X best gets away from trivial concerns (such as the computer) because it most clearly serves this _____. This is a clever (and common enough) rhetorical method*, because both Dijkstra and mad_guy manage to avoid having to say what ____ might be, but still make it seem very convincing (hence religious or mystical). If _____ does not exist, however, it is hoodwinking, intentional or not (in the unintentional version, you hoodwink yourself).

    I would almost say the _______ is Mr. Dijkstra himself, since it is only because a prestigious and respected computer scientist said this that anyone would take it seriously. Hence, computer science is even bigger and more important than we might have thought, because Mr. Dijkstra (computer scientist) says so! What is this "bigger" place -- whatever Mr. Dijkstra concerns himself with, above and beyond computers! But just because the pope says something does not make it true, or even sensical. The cart does not go before the horse.

    * this is what the deconstructionists would call a suspicious ellipse.

    Quote Originally Posted by Mario F. View Post
    And I still have yet to understand why the insistence in creating interpreted programming languages.
    Mostly, I believe, because they work most efficiently that way. As a rule, the syntax of interpreted languages is much "higher level" than that of compiled languages, and the interpreter itself is actually written in one of the latter. It is kind of like asking "why have an image format?" when you could just write compilable code to do the same thing, and then you wouldn't need a separate image viewer to interpret the format-specific code, etc. Of course, this option would be 1) very unportable, and 2) ridiculously awkward.
    Last edited by MK27; 09-18-2009 at 08:14 AM.

  13. #73
    Cat without Hat CornedBee's Avatar
    Join Date
    Apr 2003
    Posts
    8,893
    Quote Originally Posted by Mario F. View Post
    And I still have yet to understand the insistence on creating interpreted programming languages.
    The ease of the edit-try-repeat cycle. The fact that when the interpreter is built into something else, a text editor is all you need for development. Can you imagine what the web would be like if JavaScript had to be compiled? (Hint: there would be a lot less JavaScript.)
    And of course, the ease with which you can show these languages to others in an interactive interpreter. "Type this, hit enter, and something happens." No messing around with compiling, linking and executing. I know a lot of Java beginners who were very confused by its class loading behavior.

  14. #74
    spurious conceit MK27's Avatar
    Join Date
    Jul 2008
    Location
    segmentation fault
    Posts
    8,300
    Quote Originally Posted by CornedBee View Post
    Can you imagine what the web would be like if JavaScript had to be compiled? (Hint: there would be a lot less JavaScript.)
    Actually there would be none; since js is often embedded in HTML, it would be impossible to use it "compiled".

    Even if you relegated it exclusively to .js files (the "unobtrusive" approach), using code that was compiled into asm for the browser would be an assbackward approach.

    Also, javascript would be much easier to debug if it were compiled first, so this idea that the interpreter makes "the edit-try-repeat cycle" easier is also assbackward, at least in the case of js. Altho to be fair, I think error handling is better in most interpreted languages (as opposed to compiled ones), so your point is valid.

    "Flash" code is compiled into bytecode, but witness flash lacks javascript's potential for page interaction (except to the extent that it uses external js).

    Also, the only argument against "interpreted languages" that I have ever heard is that it's expensive, but this need not be true and becomes less so all the time, as the interpreters improve. Eg, it's often said that nothing is faster than perl at string manipulation because of the underlying C code.
    Last edited by MK27; 09-18-2009 at 08:50 AM.

  15. #75
    * Death to Visual Basic * Devil Panther's Avatar
    Join Date
    Aug 2001
    Posts
    768
    Quote Originally Posted by sarah22 View Post
    I'm currently a 3rd-year computer science student. I use C++ on most of my projects and might also use it for my thesis, which is related to 3D graphics and will probably be math-intensive.

    Here in my school, they always tell me that I suck, am lowtech, or am crazy because I don't code in Java or C#. The reason I don't use them is that I prefer C++ because of OpenGL and DirectX, not because I don't know them or something. Another reason is that I heard most of the good games released today use C++.

    My question is, why do most people (students, some teachers) insult/annoy/provoke me for using C++ as my main programming language? Here in my school, I heard that most of the graduating students use Java/C# for their thesis. Am I on the wrong track for using C++? Should I stop using it and move to Java or C#?
    I call those people morons, especially the .NET users (Notice I don't use the term "developers").

    They prefer speed over control and quality. It's just as it was/is with MFC and Win32 API developers. When they got stuck with MFC's limitations, they had no idea how to solve their problem because they didn't bother to understand what the MFC library is made of: the Win32 API.

    As for Java and C++, there isn't much of a difference between them. Some people will claim (from a complete lack of knowledge and experience) that Java is portable, unlike C++. The only people who say that and actually mean it are the people who haven't tried to port their Java code to a different OS, and I'm not talking about your 5-line school-homework Hello World programs!

    But I guess we are all idiots; for example, a C++ developer will say that script development (Perl, Tcl, Ruby, Python, etc.) and web development (JavaScript, PHP, etc.) are not real software development.

    Me... I will always loathe the .NET and regular VB "users".
    "I don't suffer from insanity but enjoy every minute of it" - Edgar Allan Poe

    http://www.Bloodware.net - Developing free software for the community.

Page 5 of 8