oh... well chyrist sakes... i would have thought you all wouldn't be thinking of that 24-7, but you guys do fit that sterotype of the male humanity... that truly sickens me...
hasafraggin shizigishin oppashigger...
>>but you guys do fit that sterotype of the male humanity... that truly sickens me...
Sickens you? You must be female then.
Ramble on...
Damnit, Ray! Reveal your gender to us in a spectacular spread labeled "My Gender! At Long Last!" Now go on - get posting!
I'm a guy, if anyone cares, and I hadn't even THOUGHT of my nickname in that light.
I like having a one character name (too bad I couldn't register as just "V") -- it makes me more of an international man of mystery=]
BTW, the final printout of my code was some 30+ pages, pretty densely packed, too. Turned it in today, man, am I glad that's over. In total, that homework cost me over 36 hours, and I only had a few days when I could do it. And yes, last night I did spend 16 hours working nonstop (except for the travel time when I needed to go to a lab to test something). Worked on it from 4 PM to 8 AM.
The description of the program sounds tough. Was it hard coding?
"16 hours working nonstop "
now that's what i'm talkin about! excuse my last reply.
>>Originally posted by Garfield:
>>The description of the program sounds tough. Was it hard coding?

Not really. Some of the stuff took some time -- I created a Matrix class to handle all kinds of linear algebra, which was time-consuming. I chose this over pre-existing libraries specifically because a) I've had ........ poor luck getting performance out of existing classes, and b) I've never found Matrix classes that support all of the necessary operations I wanted.
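Just to give a flavor of what a homegrown Matrix class like that might look like -- this is only a minimal sketch I'm putting together from the description above, not the poster's actual class (which clearly supported far more operations), with method names of my own choosing:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Minimal Matrix class sketch: just enough linear algebra for an MLP.
class Matrix {
public:
    Matrix(std::size_t rows, std::size_t cols, double fill = 0.0)
        : r_(rows), c_(cols), data_(rows * cols, fill) {}

    double& at(std::size_t i, std::size_t j)       { return data_[i * c_ + j]; }
    double  at(std::size_t i, std::size_t j) const { return data_[i * c_ + j]; }
    std::size_t rows() const { return r_; }
    std::size_t cols() const { return c_; }

    // Matrix product: (r_ x c_) times (c_ x other.c_).
    Matrix operator*(const Matrix& other) const {
        assert(c_ == other.r_);
        Matrix out(r_, other.c_);
        for (std::size_t i = 0; i < r_; ++i)
            for (std::size_t k = 0; k < c_; ++k)   // i-k-j loop order is cache-friendlier
                for (std::size_t j = 0; j < other.c_; ++j)
                    out.at(i, j) += at(i, k) * other.at(k, j);
        return out;
    }

    Matrix transpose() const {
        Matrix out(c_, r_);
        for (std::size_t i = 0; i < r_; ++i)
            for (std::size_t j = 0; j < c_; ++j)
                out.at(j, i) = at(i, j);
        return out;
    }

private:
    std::size_t r_, c_;
    std::vector<double> data_;
};
```

Rolling your own like this does give you full control over the loop order and storage layout, which is one way to get the performance the poster couldn't get out of existing libraries.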
A lot of the time was in simply running the datasets, and fine-tuning. I ran many of the datasets 30 or so times, with various parameter changes, to determine the best values for maybe 4 or 5 parameters. Just changing these parameters DRASTICALLY changes the convergence of the network -- I could often double or even triple classification rates simply by playing with parameters to make it more likely to converge to a global minimum.
Plus, some of the questions wanted some strange things. For example, one problem wanted me to run the algorithm until I noticed a particular feature of the hidden node outputs (basically, these are temporary values, computed from the inputs, used to compute the final outputs). It took me 60 runs to see this feature, and so even though my network had 100% classification rate, I had to continue to run it until I saw this special feature.
>>Damnit, Ray! Reveal your gender to us in a spectacular spread labeled "My Gender! At Long Last!" Now go on - get posting!
Well there was no reply by doubleanti was there? So my last guess could've been right.
Ramble on...
hee... our good friend mr. ray, here, has had you all stumped for plenty of time.
i, on the other hand, have known all along!
bwahahahah!
Okay, The V., can you actually explain what the program is supposed to do:
> classes to do a multilayer perceptron (one type of neural network) for my ANN class.
Explain this. I don't really understand. Sounds interesting, though.
--Garfield
Well, a neural network is a "learning machine". Basically, you write a program that, when given training data, "learns" various features of the data, and uses this to make generalizations or predictions.
An example is the original data I tested this MLP with -- it is a set of measurements taken from iris flowers, with each sample taken from an iris of one of three types. The goal is to have the network learn how to distinguish among iris flowers based solely on this data.
So, the real test of the network is given testing data, can it correctly classify an iris into the proper class? And the answer, in my case, is yes, it can do it with above 90% correctness (97.3% was the best I've achieved).
The idea behind neural networks is that computers, working mathematically, can analyze data and find patterns within it that are far too complicated for people to derive themselves. For example, for the iris data, there IS a mathematical equation you could write which would give you the ability to classify 97.3% of all iris flowers correctly. But this function is very complicated, and trying to figure out what the function is on pencil and paper would take ages for a person to accomplish. Because computers can do math vastly faster than humans, the goal of a neural network program is to create a mathematical model which the computer will follow that will eventually yield the function you want -- in this example, you want the function to tell you what kind of iris you have.
So, what you do is create a neural network program. The network takes however many inputs you want to feed it, and generates the desired number of outputs. You give it training data, you give it rules as to how it is supposed to learn, and you see how well it does.
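To make that explanation concrete, here's roughly what one forward pass through a single-hidden-layer MLP looks like. This is a sketch I'm inferring from the description above, not the poster's code; the sigmoid activation and the layer sizes are my own assumptions:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Logistic activation -- a common (assumed) choice for MLP nodes.
double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

// One layer: out[j] = sigmoid( sum_i w[j][i] * in[i] + bias[j] )
std::vector<double> layer(const std::vector<double>& in,
                          const std::vector<std::vector<double> >& w,
                          const std::vector<double>& bias) {
    std::vector<double> out(w.size());
    for (std::size_t j = 0; j < w.size(); ++j) {
        double sum = bias[j];
        for (std::size_t i = 0; i < in.size(); ++i)
            sum += w[j][i] * in[i];
        out[j] = sigmoid(sum);
    }
    return out;
}

// Full forward pass: inputs -> hidden node outputs -> final outputs.
std::vector<double> forward(const std::vector<double>& inputs,
                            const std::vector<std::vector<double> >& w_hidden,
                            const std::vector<double>& b_hidden,
                            const std::vector<std::vector<double> >& w_out,
                            const std::vector<double>& b_out) {
    // These hidden values are the "temporary values" computed from the
    // inputs that the poster mentioned earlier.
    std::vector<double> hidden = layer(inputs, w_hidden, b_hidden);
    // One output per class (e.g. three for the iris data); the largest
    // output is taken as the network's classification.
    return layer(hidden, w_out, b_out);
}
```

Training then consists of comparing these outputs against the known answers in the training data and nudging the weights to reduce the error, repeated over many passes.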
I've got a class for my degree next year called Decision Support Systems. Apparently we get to do some basic (!) neural network programming. You wouldn't happen to know of some good sites or books I could read about NN?
Ramble on...
The V., that sounds awesome! Very interesting program. By that description, it sounds like one of those programs that is fun to program. Not dreading it when you have to program it. That is definitely neat, though.
--Garfield
what compiler did you use?
Oskilian
I used Borland C++ 5.02, which is what I do a lot of programs in, being as it was free, it has an IDE, and is mostly OK in most respects. It's NOT fully ANSI C++ compliant, though, which always bugs me when it causes problems...
Neural network programming IS fun stuff. It's the ONLY class that I've had where the homework problems are actually useful beyond the scope of the course.
The only book on neural networks that I've used so far is the book for the class, and unfortunately I haven't read it as much as I should -- I prefer the lectures, which give all the information needed and are explained better. The book does explain things, but sometimes you really have to pick apart the sentences to understand what they're saying, because they love to densely pack information into sentences.
The book, in case others wondered, is Neural Networks: A Comprehensive Foundation, by Simon Haykin. It's decent, although you'd better remember your calculus. I suppose that's true for any NN book, though, unless they completely leave out the explanations of how they work. There's always a lot of math.
Luckily, often it simplifies incredibly, so implementation isn't as hard as you might think.
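One classic instance of that simplification (assuming sigmoid activations, which the thread doesn't state explicitly): the logistic function's derivative can be written purely in terms of its own output, so the calculus in the book collapses into a couple of multiplications per weight at implementation time. A hypothetical sketch:

```cpp
#include <cmath>

double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

// The simplification: d/dx sigmoid(x) = sigmoid(x) * (1 - sigmoid(x)),
// so backprop never needs to re-evaluate exp() -- the layer's output
// value y is all you need.
double sigmoid_deriv_from_output(double y) { return y * (1.0 - y); }

// One gradient-descent update for a single output-layer weight
// (names and signature are illustrative, not from the thread):
// delta = (target - output) * output * (1 - output)
double updated_weight(double w, double learning_rate,
                      double target, double output, double hidden_value) {
    double delta = (target - output) * sigmoid_deriv_from_output(output);
    return w + learning_rate * delta * hidden_value;
}
```

Pages of chain-rule derivation in the book end up as those two one-liners in code, which is why the implementation is easier than the math suggests.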