Analog to Digital (conversion time)
I have this question, and my lab work seems to conflict with my calculations.

Given an ADC (analog-to-digital conversion) time of 25 microseconds, how many data points can be taken in one period if the signal frequency is 50 Hz? 200 Hz? 1 kHz? 1 MHz?
What I did: 1/50 = 0.02 s per oscillation,
which is 20,000 microseconds.
Then 20,000 µs / 25 µs
gives 800 data points per oscillation.
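Just to double-check that arithmetic, here's a tiny Python sketch that repeats the same division (period / conversion time) for all four frequencies; the numbers and the 25 µs conversion time are taken straight from the question:

```python
CONVERSION_TIME_US = 25  # ADC conversion time from the problem, in microseconds

for freq_hz in (50, 200, 1_000, 1_000_000):
    period_us = 1e6 / freq_hz                  # one oscillation, in microseconds
    samples = period_us / CONVERSION_TIME_US   # data points per oscillation
    print(f"{freq_hz:>9} Hz -> {samples:g} samples per period")
```

This prints 800 for 50 Hz, 200 for 200 Hz, 40 for 1 kHz, and 0.04 for 1 MHz, i.e. at 1 MHz you can't even complete one conversion per period.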
Then as I did the calculations for 200 Hz and so on, the number of data points taken decreased.
Does this make sense?
If so, can you explain it physically, in terms of how we would see a general sine wave (if that relationship applies at all)?