Hi there,
I would like to know how the H.264 encoder stores NV12 interlaced data (a field pair) in memory.
Are the two fields stored one after the other, or interleaved with each other?
Any advice appreciated.
Joyah.
Petter Larsson (Intel) replied:

Joyah wrote: Hi Petter, since I got this sentence in mediasdk-man.pdf at page 20, "Note: NV12 is the only supported native encoding and decoding format," I am confused about which format is preferred, YV12 or NV12? Thanks, Joyah

Hi Joyah,

The native color format used by Media SDK is the NV12 pixel format, as described on the FourCC webpage: http://www.fourcc.org/yuv.php#NV12
In essence, it is a Y plane followed by an interleaved U/V plane.

Note that the Media SDK sample code has file access utility functions that read and write raw YV12 files. YV12 (http://www.fourcc.org/yuv.php#YV12) is used as the storage format because it is a more common YUV format for storing raw frame data, with planar (not interleaved) U and V planes. The file access utility functions therefore convert between NV12 and YV12 when reading and writing.
Regards,
Petter
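
To make the two layouts Petter describes concrete, here is a minimal, untested sketch of an NV12-to-YV12 repack. It is not the actual utility code from the Media SDK samples, and it assumes tightly packed planes (pitch equal to width) and even frame dimensions:

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Hypothetical helper (not from the Media SDK samples): repack one NV12
 * frame into YV12, assuming tightly packed planes and even width/height.
 *
 * NV12: [Y plane: w*h bytes][interleaved chroma plane: U0 V0 U1 V1 ...]
 * YV12: [Y plane: w*h bytes][V plane: (w/2)*(h/2)][U plane: (w/2)*(h/2)]
 */
static void nv12_to_yv12(const uint8_t *nv12, uint8_t *yv12, int width, int height)
{
    const size_t y_size = (size_t)width * (size_t)height;
    const size_t c_size = y_size / 4;            /* one 4:2:0 chroma plane  */
    const uint8_t *uv   = nv12 + y_size;         /* interleaved U/V samples */
    uint8_t *v          = yv12 + y_size;         /* YV12 stores V first ... */
    uint8_t *u          = v + c_size;            /* ... then U              */

    memcpy(yv12, nv12, y_size);                  /* luma plane is identical */
    for (size_t i = 0; i < c_size; ++i) {
        u[i] = uv[2 * i];                        /* de-interleave U         */
        v[i] = uv[2 * i + 1];                    /* de-interleave V         */
    }
}
```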
camkego replied:

Joyah wrote: Hi Cameron, I'll look at these two PDFs for more information, thank you so much! Joyah

I hope this is helpful.
In both:
"C:\Program Files\Intel\Media SDK 2012 R3\doc\mediasdk-man.pdf"
"C:\Program Files\Intel\Media SDK 2012 R3\doc\Intel_Media_Developers_Guide.pdf"Search for 'interlace', and read about the options about how to setup the various structs for frame input.
Most importantly, you will see mfxExtCodingOption contains flags for controlling whether interlaced video is created.
Also, you will see the PicStruct enum contains options for informing which scan lines represent which fields when encoding.Expect to experiment and spend some time making sure you get the desired bitstream when changing these parameters.
Good luck
Cameron
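
As a rough illustration of the parameters Cameron mentions, here is an untested sketch of how PicStruct and mfxExtCodingOption could be set up for top-field-first interlaced H.264 encoding. Structure and flag names follow mediasdk-man.pdf, but the MFX_CODINGOPTION_ON value for FramePicture is only illustrative, and resolution, frame rate, bit rate, IOPattern, and the other mandatory fields are omitted:

```c
#include <string.h>
#include "mfxvideo.h"   /* Intel Media SDK API header */

/* Hypothetical setup sketch: mark the input surfaces as interlaced field
 * pairs (top field first) and attach an mfxExtCodingOption buffer.
 * Fill in resolution, rate control, IOPattern, etc. as the manual describes. */
static void setup_interlaced_avc(mfxVideoParam *par,
                                 mfxExtCodingOption *co,
                                 mfxExtBuffer *ext[1])
{
    memset(par, 0, sizeof(*par));
    memset(co, 0, sizeof(*co));

    par->mfx.CodecId                = MFX_CODEC_AVC;
    par->mfx.FrameInfo.FourCC       = MFX_FOURCC_NV12;
    par->mfx.FrameInfo.ChromaFormat = MFX_CHROMAFORMAT_YUV420;
    /* Input surface carries a field pair; TFF = top field first. */
    par->mfx.FrameInfo.PicStruct    = MFX_PICSTRUCT_FIELD_TFF;

    /* mfxExtCodingOption carries extra interlace-related controls,
     * e.g. FramePicture (frame vs. field picture coding). */
    co->Header.BufferId = MFX_EXTBUFF_CODING_OPTION;
    co->Header.BufferSz = sizeof(*co);
    co->FramePicture    = MFX_CODINGOPTION_ON;   /* illustrative choice */

    ext[0]           = &co->Header;
    par->ExtParam    = ext;
    par->NumExtParam = 1;
}
```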