I'm developing an app that should capture video from a webcam and audio from a microphone, mux them into an FLV container (or possibly another container, e.g. MP4), and then stream the result over a network. As far as I can tell from the MS SDK, I can do this with DirectShow, but the only option it offers is writing the muxed stream to a file. Is there any way to write it to a memory buffer instead, for further writing to a socket?
Of course there are tricks like continuously reading from the growing muxed file, but that looks too ugly.
Another question: can I split the incoming muxed stream into individual frames? That is, with a 24 fps live source, is it possible to capture one muxed frame (audio + video) at a time, so that I could drop some frames when latency gets really bad?
Thanks in advance!