On the general subject of recursion
The algorithm you present for traversing a binary tree is a typical recursive algorithm, though it recurses along two paths. To better understand recursion, one should first review recursion along a single path.
For that I start with a simple depth counter:
Code:
#include <iostream>

void foo( int n )
{
    std::cout << "Entering at: " << n << std::endl;
    if ( n < 5 ) foo( n + 1 );
    std::cout << "Leaving at: " << n << std::endl;
}

int main()
{
    foo( 0 );
    return 0;
}
The output is this:
Entering at: 0
Entering at: 1
Entering at: 2
Entering at: 3
Entering at: 4
Entering at: 5
Leaving at: 5
Leaving at: 4
Leaving at: 3
Leaving at: 2
Leaving at: 1
Leaving at: 0
This illustrates the two phases of each call to foo. The code before the recursive call (the call to foo inside foo) is phase 1. The recursive call itself descends one level deeper; when that call returns, the code after it executes as phase 2.
Some recursive algorithms do their work in only one of these phases; others rely upon both.
What's important to understand is the operation of the stack when a function calls itself. You are probably aware of stack containers, but in the context of procedural languages (most modern languages), "the" stack is a block of memory intimately associated with the CPU (which has at least one register devoted to its operation). "The" stack operates as a LIFO (last in, first out) structure. It is like a stack of cards dealt face up, one atop the other: deal out 5 cards and, from your view, they appear, from the top card down, in the reverse of the order in which they were dealt.
Each function call pushes a new entry onto "the" stack. The first thing that entry records, among a number of things, is the position of the code that called the function. That's how the CPU knows where to return when the function concludes.
The next thing represented in that entry, often known as the stack frame, are the local variables declared in that function and the parameters passed to it.
Armed with that, review how calling foo acts upon the stack. In main, the call to foo with parameter 0 (foo at 0) pushes one entry onto the stack, including a copy of the parameter's value, 0.
When foo calls itself with foo at 1, another entry is pushed onto the stack with a copy of the parameter (1 now). This repeats until the termination test is met (in the example, once n reaches 5, n < 5 is false and no further call is made).
At that moment all phase 1 steps of the recursion have been performed, producing the first half of the output above. "The" stack now resembles the list of "Entering at" lines, but in reverse: like that stack of cards dealt face up, it shows parameter 5 on top.
Upon return from the deepest call, foo at 5, that entry is popped off the stack, much like pulling a card off the top of the stack of cards and putting it aside.
That reveals the entry with parameter 4, which is the level executing upon the return from 5. The "Leaving at" lines are now being printed. "Leaving at: 5" was already printed in the call where the recursion was refused (n < 5 failed), and upon its return the entry for 5 was removed from "the" stack.
Pause a moment to seal this image in mind. Execution is on the line printing "leaving at", where n is 4.
Execution just returned from the call to foo( n + 1 ), which ran with the incremented parameter, 5. The stack entry storing this level's n is the "stack frame" for this recursive call.
All of the local variables (including the input parameters) are allocated in stack memory. That allocation is formed automatically for you by the compiler based on the code you write, so if you declare more local variables more room is set aside by the compiler.
Each level of recursion is an entry on the stack. Each entry corresponds to a kind of structure formed by the local variables you declare. It is almost as if there is a hidden pointer to a structure defined by all the local variables (and input parameters).
Actually, it's not "almost": there is. That's the stack pointer (or, more specifically, the stack frame pointer).
Imagine you've been transported inside the machine, floating along with the data. You're standing on the stack at this moment where n is 4, your feet on the code about to print "leaving at".
Looking down, through a translucent floor below you, there is now an array of structures underneath you. If you look carefully you'll see you're on the 4th story of a building of structures: the floor below you is n at 3, below that n at 2, and so on down to the ground, where n is 0.
When you step forward through the code and foo at 4 returns, you jump down to the floor below, where n was 3 (and, because you dropped a floor, or level, n is 3 again).
This is the general notion of recursion. Although recursion is usually described in terms of depth, which views it from a reverse perspective, it is actually more natural to think of each recursive call as an upward growth of the stack, because that corresponds to real-world analogies, like a stack of cards dealt face up, more naturally to the human mind.
In truth, either view is just as valid, because inside the machine there is no absolute orientation, no gravity, no flat ground.
Now, think upon your professor's example (which, no doubt, he got years ago from our elders).
It recurses along two paths, following a left and a right node pointer, but the operation is exactly the same; there are just two paths. There is still only one stack at any moment, but the path traveled forms a kind of illustration of the tree itself.
If you followed the operation of the stack over time as the tree traversal code runs, you'd notice that each separate recursion traces out a path through the tree. The operation of the stack itself is the same as depicted above.
Instead of incrementing a value like n, the traversal follows the structure of the data, where each link points to another node. Each recursive call places a copy of the "next" pointer on the stack, and upon return the code runs against the same stack structure illustrated above.