Related to this question, does the Media SDK even generate this moov atom? In my code I call MFXMuxer_PutBitstream repeatedly and write to the file after every call; I doubt the moov atom is being generated during that time. Then I call MFXMuxer_Close, which is where I assume it would be generated, except that I never write to the file afterward. Is there a way to write it to the file using the Media SDK?
Upon looking into this further, it appears that the Media SDK library does not support this. I'm hardcoding it, but if you wanted to support streaming mp4s, it's a matter of changing this line in MFXMuxer_PutBitstream, probably as an option of your SDK, because I believe it increases the size of the generated video stream, since essentially you are writing a header for every frame:
If the packet passed to av_write_frame is NULL, it seems the muxer will automatically flush and write the information needed for fragmented mp4s: http://www.ffmpeg.org/doxygen/trunk/group__lavf__encoding.html#ga37352ed2c63493c38219d935e71db6c1
As indicated here: http://libav.org/avconv.html#MOV_002fMP4_002fISMV
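To make the NULL-packet flush concrete, here is a sketch of the muxer configuration involved. This is not the Media SDK's code; it assumes plain libavformat with an already-initialized AVFormatContext (the `oc` name and the helper functions are mine), as the ffmpeg integration code would have set up:

```c
/* Sketch only, assuming libavformat. oc is an AVFormatContext that has
 * already been created with an H.264 stream. */
#include <libavformat/avformat.h>
#include <libavutil/dict.h>

int write_fragmented_header(AVFormatContext *oc)
{
    AVDictionary *opts = NULL;

    /* frag_keyframe starts a new fragment at each keyframe; empty_moov
     * writes an initial sample-free moov up front, so the stream is
     * playable without waiting for a trailing moov atom. */
    av_dict_set(&opts, "movflags", "frag_keyframe+empty_moov", 0);

    int ret = avformat_write_header(oc, &opts);
    av_dict_free(&opts);
    return ret;
}

/* Passing a NULL packet asks the muxer to flush; in fragmented mp4 mode
 * this is what writes out the current moof/mdat fragment. */
int flush_fragment(AVFormatContext *oc)
{
    return av_write_frame(oc, NULL);
}
```

This is why the per-frame overhead mentioned above appears: each fragment carries its own moof header instead of one moov at the end.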
I'm not entirely sure this actually works, however, since the issue I'm now having is that it turns out I haven't actually been saving the frames generated by MFXMuxer_PutBitstream to a file. I had some misdirected pointers and thought it was working when it was actually just saving the original h264 frames. Could you tell me how I can access the bitstream written by MFXMuxer_PutBitstream after I've called it? I don't see anything in the documentation.
I assume it has something to do with: mux->data_io->Read ?
I'm still checking into the location of the moov atom, but the ffmpeg integration code in the Media SDK tutorial package has been used in live-streaming demos. If you output to UDP, you can display the live stream with ffplay using what is included in the tutorial.
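For example, assuming the muxer's output URL was pointed at a local UDP port (the address and port here are placeholders, not values from the tutorial):

```shell
# Play a live UDP stream produced by the muxer; adjust address/port
# to match whatever URL the output context was opened with.
ffplay udp://127.0.0.1:1234
```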
I've heard that it is sometimes possible to stream even without the moov atom, but often the video player will see that it's missing and throw an error instead. It doesn't always fail, but I hear it doesn't work in my case, where I want to stream to an HTML5 video tag.
Regarding the other issue I mentioned, which is currently more important: I am not able to read the generated mp4 frames after calling MFXMuxer_PutBitstream. I feel that call should invoke data_io->Write, but it does not, correct? It should write to the mfxMuxer's data_io object, which was set in the Init method to the data_io passed into MFXMuxer_Init as mfxDataIO *data_io, right?
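To illustrate the behavior I'm expecting: the struct layout below is my guess at the shape of mfxDataIO (the real definition is in the SDK headers), and put_bitstream is a stand-in for what MFXMuxer_PutBitstream should do internally, namely push each chunk of muxed output through the Write callback registered at MFXMuxer_Init time:

```c
#include <stdint.h>
#include <string.h>

/* Hypothetical shape of mfxDataIO: a user context plus a Write callback.
 * The real struct may differ; only the callback idea matters here. */
typedef struct {
    void *pthis;                                          /* user context */
    int (*Write)(void *pthis, uint8_t *data, uint32_t size);
} mfxDataIO;

/* A memory sink so the caller can inspect what the muxer wrote. */
typedef struct {
    uint8_t  buf[1024];
    uint32_t len;
} MemSink;

static int mem_write(void *pthis, uint8_t *data, uint32_t size)
{
    MemSink *s = (MemSink *)pthis;
    memcpy(s->buf + s->len, data, size);
    s->len += size;
    return (int)size;
}

/* Stand-in for the behavior I expect from MFXMuxer_PutBitstream:
 * every muxed chunk goes through the registered data_io->Write. */
static int put_bitstream(mfxDataIO *io, uint8_t *data, uint32_t size)
{
    return io->Write(io->pthis, data, size);
}
```

With a sink like this registered as the data_io, every byte the muxer emits (headers, fragments, and the moov written on close) would land somewhere the application can read it back.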