I'm on a quest to build a digital scope using a microcontroller connected over the serial port and a C++ program on the PC.
The microcontroller code is written (it will still need various adjustments); now comes the really hard part: implementing the PC-side software.
I'm thinking about how to send and process the data. Basically, the PC first sends two bytes: one telling the microcontroller how many pins to read, then another byte meaning "start".
Then the microcontroller sends ASCII 65 ('A', packet start), two characters for each pin's value, and ASCII 90 ('Z', packet end).
I'm thinking of enhancing the protocol. Inevitably, some bytes will be dropped, which would corrupt the stream. How exactly can I detect that bytes have been skipped, and how should the software react?
For the moment, I'm thinking of adding a "time" value to each sample, so that if a few samples are missed, the current sample is placed a bit ahead of the last good one, and the missing data can either be marked as bad or interpolated.
I'm also thinking: should I buffer all this data? A circular/ring buffer seems like the best way to do it. Another idea is a fixed-size deque as the buffer, and the standard library already provides std::deque for that.