IBM Cognitive Computing

This is a discussion on IBM Cognitive Computing within the General Discussions forums, part of the Community Boards category.

  1. #1
    Super Moderator VirtualAce's Avatar
    Join Date
    Aug 2001
    Posts
    9,590

    IBM Cognitive Computing

    I have been following this for a few years and wondered what some of you think of it. Personally, I feel it could be a huge leap forward, but it could also make the computer more imperfect than perfect, which may not be what we need.

    IBM Research: Cognitive computing

    Specifically, where would software engineers fit into this kind of hardware technology?

  2. #2
    Registered User rogster001's Avatar
    Join Date
    Aug 2006
    Location
    Liverpool UK
    Posts
    1,409
    Well, I think it is a natural evolution of all that technology has become; I personally love anything like that, the learning machine. Whether or not it's entirely desirable is another thing. I am going to read up more on that link you posted, though, and try to get a better understanding of it all. I did chuckle when I saw it was that company, though: a typo of mine in a config file ('=' instead of '=-') gave a section of their business running a certain software an hour of downtime last week. Maybe that wouldn't happen in the cognitive world ...
    Thought for the day:
    "Are you sure your sanity chip is fully screwed in sir?" (Kryten)
    FLTK: "The most fun you can have with your clothes on."

    Stroustrup:
    "If I had thought of it and had some marketing sense every computer and just about any gadget would have had a little 'C++ Inside' sticker on it."

  3. #3
    (?<!re)tired Mario F.'s Avatar
    Join Date
    May 2006
    Location
    Portugal
    Posts
    7,412
    Quote Originally Posted by VirtualAce View Post
    but could also make the computer more imperfect than perfect which may not be what we need [...] Specifically, where would software engineers fit into this kind of hardware technology?
    Indeed it would be a bit scary. One of the first lessons we learn while debugging is that the computer is always right. We depend on our computers' deterministic behavior to make effective use of them. It's the human mind that instructs the machine, either through programming or merely by moving and clicking a mouse. A system that we pretend is able to "think" and "decide" is the type of computer system we can expect to give us all the wrong answers, diminishing the tremendous value computing in general has brought to humanity over all these years. If you want a job for those software engineers, it would be a lifetime code maintenance contract.

    We should, of course, expect fields like AI and Machine Learning to evolve constantly. We require them for numerous types of computing, from games to weather forecasting and all sorts of scientific analysis. But to expect these systems to relate in any way to the cognitive abilities of a human being is rather disingenuous and, I'll say, childish.

    The SyNAPSE project reads almost like a manifesto of the Computational Theory of Mind. But for a computer system to behave like the human mind, the human mind must behave like a computer system, and that we don't know for sure. Men like John Searle have in fact produced compelling arguments that it doesn't. The human brain and a computer (and, by extension, the human mind and a program) have apparently nothing to do with each other. Cognition is as yet unobtainable except by natural means. We just don't know enough about our cognitive abilities, and how they exist and function within the human brain, to pretend we can replicate them by merely replicating neuronal functions. We may build more complex systems, but they will always exist within the framework of a deterministic device and its deterministic programming language. It's not clear that adding deterministic complexity leads to uncertainty. In fact, Quantum Theory seems to point in the opposite direction: that uncertainty is the basis of determinism.

    So I tend to look at all these systems as a serious form of entertainment that, unfortunately, often smashes against their researchers' excessive enthusiasm in the way they choose to describe their attempts at making "smarter" systems.

    Note: it's interesting, however, that we are still stuck with our current computer technology. Most research into new computer architectures seems to point towards non-deterministic machines. But despite the hoopla around quantum machines, I instead put my stock in biochemistry as the next great contributor to computer technology. And I imagine a future (which I probably won't witness) of immensely powerful chemical computers in every home, processing data through highly deterministic chemical reactions. We just like our computers that way.
    The programmer’s wife tells him: “Run to the store and pick up a loaf of bread. If they have eggs, get a dozen.”
    The programmer comes home with 12 loaves of bread.


    Originally Posted by brewbuck:
    Reimplementing a large system in another language to get a 25% performance boost is nonsense. It would be cheaper to just get a computer which is 25% faster.

  4. #4
    Registered User
    Join Date
    Oct 2006
    Posts
    2,293
    I remember, many years ago, watching a show on TV that talked about analog computing and neural networks. If a computer is ever to get close to the capabilities and characteristics of a biological brain, I think analog circuitry is the only realistic way to make it happen. There could certainly be digital aspects to it, specifically for high-speed data communication, but biological neurons are largely analog devices.
    Code:
    namespace life
    {
        const bool change = true;
    }
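    On the analog-neuron point above: the leaky integrate-and-fire model is the usual minimal way to capture those analog membrane dynamics in a simulation, even on a digital machine. The sketch below is purely illustrative; the constants (leak time constant, threshold, input current) are made-up values for demonstration, not anything biologically calibrated.

    ```c
    #include <stdio.h>

    /* Minimal leaky integrate-and-fire neuron. The membrane voltage v is a
     * continuous ("analog") quantity that decays toward rest while
     * integrating an input current; a spike is emitted when v crosses a
     * threshold, after which v resets. Constants are illustrative only. */
    static int run_lif(int steps)
    {
        double v = 0.0;           /* membrane voltage (arbitrary units) */
        const double tau  = 20.0; /* leak time constant, in time steps  */
        const double i_in = 1.5;  /* constant input current             */
        const double v_th = 10.0; /* spike threshold                    */
        int spikes = 0;

        for (int t = 0; t < steps; ++t) {
            /* Euler step of dv/dt = (-v + i_in * tau) / tau */
            v += (-v + i_in * tau) / tau;
            if (v >= v_th) {      /* threshold crossed: spike and reset */
                ++spikes;
                v = 0.0;
            }
        }
        return spikes;
    }

    int main(void)
    {
        /* With these constants v climbs to threshold every 8 steps,
         * so 1000 steps yield exactly 125 spikes. */
        printf("spikes in 1000 steps: %d\n", run_lif(1000));
        return 0;
    }
    ```

    The point of the toy model is that the "interesting" state lives in a continuous variable, which is why analog circuitry maps onto it so naturally; the digital version has to discretize both time and voltage.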
