Hello all, I have two processes. The first process reads large amounts of data off the HDD (originally captured from an SDR and written to disk over a USB 2.0 link) at a minimum rate of 40 MB/s. The second process runs the data through various DSP algorithms. I need to develop a good strategy for processing this data in real time as it enters the system. What would be the best way of streaming data between the processes at 40 MB/s? I was thinking shared memory, but I'm not quite sure about this strategy. Can anyone suggest good reference material for large-volume, high-speed real-time data processing?
I'm running Fedora 10 and writing code in C.
Thanks,
-mntgoat