ok, I get the gist... but how do you "learn" them?
Just memorize the definition? Or do you implement them in code? I know it gives various examples, I'm just curious about your methods, you seem to know what you're talking about :D
>but how do you "learn" them?
Read the algorithm, read source code implementing it, and understand the reasoning behind it; then try to implement it yourself. Once you have a working implementation, do a paper run (draw out the execution on paper) and step through the code as it runs in a debugger, comparing the results with your paper run to check that your understanding is correct. Then work on variations of the implementation (using a sort algorithm as an example): change it to sort in descending order, then ascending order, then do a partial sort, come up with an easy interface, a generic interface, an easy and generic interface :D, make it stable if it isn't already, improve the performance, make it type independent, etc... There's a bunch you can do with even the simplest of algorithms.
How far you go depends on when you consider an algorithm "learnt". If you know the basic idea behind it but don't want to memorize implementations, you can keep a reference around and just look it up when you need to write it. Most programmers just learn an algorithm well enough to write a single good library that they'll use for the rest of their career (they hope!), or well enough to use a library written by someone else wisely. If you don't care much about writing libraries then you can simply use existing ones, but it's important to recognize the advantages and disadvantages of an algorithm when you use it, even if you didn't write the implementation.