
NV12 Interlaced Field Pairs in Memory

Joyah
Beginner

Hi there,

I would like to know how the H.264 encoder stores NV12 interlaced data (a field pair) in memory.

Are the two fields stored one after the other, or are their scan lines interleaved with each other?

Any advice appreciated.

Joyah.

4 Replies
Petter_L_Intel
Employee
Hi Joyah,

The native color format used by Media SDK is the NV12 pixel format, as described on the FourCC webpage: http://www.fourcc.org/yuv.php#NV12. In essence, it is a Y plane followed by an interleaved U/V plane.

Note that the Media SDK sample code has file access utility functions that read and write raw YV12 files. YV12 is used as the storage format because it is a more common YUV format for raw frame data (http://www.fourcc.org/yuv.php#YV12) and its U/V planes are NOT interleaved. To bridge the two, the NV12-to-YV12 conversion (or the reverse) is performed by the file access utility functions.

Regards,
Petter
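To make the layout concrete, here is a minimal sketch of such an NV12-to-YV12 repack for one frame. This is not the Media SDK sample utility code itself; the function name, the even dimensions, and the tightly packed (pitch-free) buffers are assumptions for illustration.

```cpp
#include <cstdint>
#include <cstring>

// Sketch: repack one tightly packed NV12 frame into planar YV12.
// NV12 = Y plane, then one interleaved U/V plane (U0 V0 U1 V1 ...).
// YV12 = Y plane, then a full V plane, then a full U plane.
void NV12ToYV12(const uint8_t* nv12, uint8_t* yv12, int width, int height)
{
    const int lumaSize   = width * height;
    const int chromaSize = lumaSize / 4;      // per chroma plane (4:2:0)

    // The Y plane is identical in both formats.
    std::memcpy(yv12, nv12, lumaSize);

    uint8_t*       dstV  = yv12 + lumaSize;   // YV12 stores V before U
    uint8_t*       dstU  = dstV + chromaSize;
    const uint8_t* srcUV = nv12 + lumaSize;   // NV12 interleaved chroma

    for (int i = 0; i < chromaSize; ++i) {
        dstU[i] = srcUV[2 * i];               // U sample
        dstV[i] = srcUV[2 * i + 1];           // V sample
    }
}
```

Real Media SDK surfaces usually have a pitch wider than the visible width, so production code would copy row by row against Data.Pitch; this sketch ignores that to keep the plane order obvious.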
celli4
New Contributor I
I hope this is helpful. In both:

"C:\Program Files\Intel\Media SDK 2012 R3\doc\mediasdk-man.pdf"
"C:\Program Files\Intel\Media SDK 2012 R3\doc\Intel_Media_Developers_Guide.pdf"

search for 'interlace' and read about the options for how to set up the various structs for frame input.

Most importantly, you will see that mfxExtCodingOption contains flags for controlling whether interlaced video is created. You will also see that the PicStruct enum contains options for indicating which scan lines represent which fields when encoding.

Expect to experiment and spend some time making sure you get the desired bitstream when changing these parameters.

Good luck,
Cameron
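As a rough illustration of Cameron's point about PicStruct, here is a minimal sketch against the Media SDK C API. The wrapper function name is my own, the ext-buffer handling for mfxExtCodingOption is omitted, and only the FourCC/ChromaFormat/PicStruct constants are taken from the SDK headers.

```cpp
#include "mfxvideo.h"   // Media SDK API (mfxVideoParam, MFX_* constants)

// Sketch: describe the input as interlaced field pairs when configuring
// an H.264 encode. Both fields of a pair occupy ONE NV12 surface with
// their scan lines interleaved; PicStruct tells the encoder how to
// interpret (and encode) them.
void ConfigureInterlacedEncode(mfxVideoParam& par)
{
    par.mfx.CodecId                = MFX_CODEC_AVC;
    par.mfx.FrameInfo.FourCC       = MFX_FOURCC_NV12;        // native format
    par.mfx.FrameInfo.ChromaFormat = MFX_CHROMAFORMAT_YUV420;

    // Top-field-first field pair; use MFX_PICSTRUCT_FIELD_BFF for
    // bottom-field-first, or MFX_PICSTRUCT_PROGRESSIVE for frames.
    par.mfx.FrameInfo.PicStruct = MFX_PICSTRUCT_FIELD_TFF;
}
```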
Joyah
Beginner
Petter Larsson (Intel) wrote:

Hi Joyah,

The native color format used by Media SDK is the NV12 pixel format, as described on the FourCC webpage: http://www.fourcc.org/yuv.php#NV12. In essence, it is a Y plane followed by an interleaved U/V plane.

Note that the Media SDK sample code has file access utility functions that read and write raw YV12 files. YV12 is used as the storage format because it is a more common YUV format for raw frame data (http://www.fourcc.org/yuv.php#YV12) and its U/V planes are NOT interleaved. To bridge the two, the NV12-to-YV12 conversion (or the reverse) is performed by the file access utility functions.

Regards,
Petter

Hi Petter,

I found this sentence in mediasdk-man.pdf on page 20: "Note: NV12 is the only supported native encoding and decoding format." It leaves me confused: between YV12 and NV12, which is the preferred format?

Thanks,
Joyah
Joyah
Beginner
camkego wrote:

I hope this is helpful,
In both:
"C:\Program Files\Intel\Media SDK 2012 R3\doc\mediasdk-man.pdf"
"C:\Program Files\Intel\Media SDK 2012 R3\doc\Intel_Media_Developers_Guide.pdf"

Search for 'interlace' and read about the options for how to set up the various structs for frame input.

Most importantly, you will see mfxExtCodingOption contains flags for controlling whether interlaced video is created.
Also, you will see that the PicStruct enum contains options for indicating which scan lines represent which fields when encoding.

Expect to experiment and spend some time making sure you get the desired bitstream when changing these parameters.

Good luck
Cameron

Hi Cameron,

I'll look through these two PDFs for more information. Thank you so much!

Joyah