The topic is writing a program that determines the length of a quantum (the scheduler's time slice) in the OS loaded on the computer.

I think the program should be a simple one that does nothing but integer operations for a long time, say 20 seconds. Then we run two copies of it at the same time. Provided the computer is not a dual-core system, the two processes will normally finish in about 41 seconds rather than 40; the extra second is the cost of context switching between them. From that we can estimate how many quanta each process used, and then the duration of a single quantum. The program should measure time in milliseconds for accuracy.
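
Here is a minimal sketch, in C, of the kind of measuring program described above. The ITERATIONS constant is an assumption: it would have to be tuned so that one copy takes roughly 20 seconds of pure integer work on the target machine. Timing uses clock_gettime(CLOCK_MONOTONIC) for millisecond accuracy.

```c
/* Minimal sketch of the measuring program described above.
 * ITERATIONS is an assumed value: tune it so that a single copy
 * takes roughly 20 seconds of pure integer work on your machine. */
#define _POSIX_C_SOURCE 199309L
#include <stdio.h>
#include <stdint.h>
#include <time.h>

#define ITERATIONS 4000000000ULL   /* assumption: adjust per machine */

static long long now_ms(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);       /* wall-clock time, millisecond accuracy */
    return (long long)ts.tv_sec * 1000 + ts.tv_nsec / 1000000;
}

int main(void)
{
    long long start = now_ms();

    volatile uint64_t x = 0;                   /* volatile so the loop is not optimized away */
    for (uint64_t i = 0; i < ITERATIONS; i++)
        x += i;                                /* nothing but integer operations */

    long long elapsed = now_ms() - start;
    printf("elapsed: %lld ms\n", elapsed);
    return 0;
}
```

The idea would be to run one copy alone first to record the baseline (around 20 000 ms), then start two copies at the same moment on a single-core machine; whatever the combined run exceeds twice the baseline by is the time lost to switching between the processes.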