> (1) is it the time to complete the execution of code ?
Only indirectly. Complexity is used to compare algorithms independently of any particular implementation.
Theoretically at least, if you have two algorithms, one O(n*log n) and one O(n*n), the first is better for large enough n, since its cost grows more slowly, which means it's quicker to do the same amount of work.

For a given algorithm, say quicksort which is O(nlogn), and a given machine, OS, compiler, you can work out the value of some constant 'C', such that
time = C * n * log(n)
Having timed how long it takes to sort say n=1000 integers (to determine the value of C), you can use that to predict with some reasonable accuracy how long it would take to sort say 5000 integers.

One more thing to note - the value of n has to be pretty large to get a true idea of the relationship between the complexity and the amount of real time it takes to execute.
Try comparing quicksort and bubblesort for small values of n to see what I mean.

> (2) suppose a code has time complexity O(n*log n ) . what does it mean ? what does n stands for ?
The size of the data - n is how many elements you have in your array, for example.

> (3) At last , i want to know how can i calculate the time complexity of a code ?
It pretty much boils down to examining the nature of the loops in the algorithm, like how they're nested one inside another.
Code:
for ( i = 0 ; i < n ; i++ ) 
This is O(n)

for ( i = 0 ; i < n ; i++ )
    for ( j = 0 ; j < n ; j++ )
This is O(n*n)

for ( i = 0 ; i < n ; i++ )
    for ( j = 0 ; j < n ; j++ )
for ( k = 0 ; k < n ; k++ )
Here the j loop is nested inside the i loop (n*n steps), and the k loop runs separately after it (n more steps).
Although it's tempting to write O(n*n + n) or O((n+1)*n), multiple terms and constants are
not usually shown.  For very large n, it's only the dominant term which is going to be significant.
So this is also O(n*n)
From a pragmatic point of view, you run your code with lots of values of n, and attempt to plot a graph of n vs. time.