Intel® Integrated Performance Primitives
Deliberate problems developing high-performance vision, signal, security, and storage applications.

MJPEG YCbCr411 decoding issue

Steve_Browne
Beginner

We recently found an issue with MJPEG decoding in the UMC sample code. It looks like the issue was introduced in the 6.1.x sample code and is still present as of 7.1.x. In the 6.0.x sample code, YCbCr411 JPEG was decoded and converted to RGB, so it didn't have this problem. Starting in 6.1.x, YCbCr411 is treated as YUV420, which is close but not the same. The problem is that the value ranges differ: YCbCr411 uses 0-255 for all components, while YUV420 uses 16-235 for Y and 16-240 for the UV data. The displayed images look close, but they are not accurate.
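To make the difference concrete, here is a small self-contained sketch (my own illustration, not UMC code) of the mapping the decoder should be applying when it stores full-range JPEG planes as limited-range YUV420:

#include <cstdio>

// Compress full-range JPEG samples [0,255] into the limited video
// range: [16,235] for Y and [16,240] for Cb/Cr.
int main()
{
    const int samples[] = { 0, 64, 128, 192, 255 };
    for (int v : samples) {
        int y  = 16 + v * (235 - 16) / 255;  // Y plane
        int uv = 16 + v * (240 - 16) / 255;  // Cb/Cr planes
        std::printf("full %3d -> limited Y %3d, CbCr %3d\n", v, y, uv);
    }
    return 0;
}

A straight copy leaves the samples at their full-range values, so every pixel ends up shifted relative to what a limited-range consumer expects.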

The main problem with this inaccuracy is that certain cameras (fisheye cameras, for example) watermark the video to indicate to their dewarping libraries that the image can be dewarped. When the proper conversion does not take place, the watermark data is distorted and the dewarping libraries no longer work.

The code in question is in jpegdec.cpp, in ProcessBuffer: the if (m_jpeg_precision <= 8) block does an ippiCopy_8u_C1R to simply copy the data, when it should be doing a YCbCr411-to-YUV420 range conversion there. As a workaround, for now I've used the following code:

// c is the plane index (0 = Y, 1 = Cb, 2 = Cr). Full-range JPEG
// samples [0,255] get compressed into the limited video range:
// [16,235] for Y and [16,240] for Cb/Cr.
int max = (c == 0 ? 235 : 240);
for (int i = 0; i < roi.height; i++)
{
    for (int j = 0; j < srcStep; j++)
    {
        pDst8u[j] = (Ipp8u)(16 + ((int)pSrc8u[j] * (max - 16) / 255));
    }
    pSrc8u += srcStep;
    pDst8u += m_dst.lineStep;
}
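For what it's worth, since the mapping depends only on the source byte value, the same workaround can be written with a small per-plane lookup table to keep the multiply and divide out of the inner loop. This is just a sketch assuming the same c, roi, srcStep, pSrc8u, pDst8u, and m_dst variables as above:

// LUT-based variant of the workaround: precompute the 256-entry
// full-range-to-limited-range mapping once per plane, then copy
// through the table.
Ipp8u lut[256];
int max = (c == 0 ? 235 : 240);
for (int v = 0; v < 256; v++)
    lut[v] = (Ipp8u)(16 + v * (max - 16) / 255);

for (int i = 0; i < roi.height; i++)
{
    for (int j = 0; j < srcStep; j++)
        pDst8u[j] = lut[pSrc8u[j]];
    pSrc8u += srcStep;
    pDst8u += m_dst.lineStep;
}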

I've attached some examples of the same JPEG image using both methods to show the difference.

Jeffrey_M_Intel1
Employee
Thanks for posting this issue. I can definitely see the difference in your example images. Just to make sure we can reproduce exactly what you're seeing, could you attach a YCbCr411 input image that shows this behavior?
Steve_Browne
Beginner
Sure, see the attached files. One is the exact image shown above (aside from the overlays added by our software), and the other is an example of the fisheye image that started this whole thing. Both came directly from the camera and have not been altered.
Steve_Browne
Beginner
On a separate but related note, we started streaming H.264 and MJPEG simultaneously from a camera and comparing how the two looked. We noticed that in 6.1.x and higher both had the same darker look on most cameras. However, with the proper YUV ranges the images should look slightly lighter, which made me wonder about the H.264 video. It looks like the H.264 decoder reads the video_full_range_flag from the SPS headers but does nothing with it, and there doesn't appear to be any way to retrieve this information either. It would be nice if the decoder used this flag to put the YUV data in the proper range, or if more color spaces were added to differentiate full-range YUV data in conversions. I modified our OpenGL pixel shader to support full-range YUV data, but without any way of knowing which version of the shader to use, that's useless. Also, Direct3D's StretchRect seems to expect limited-range YUV data. I'm not sure what the UMC::ColorSpaceConversion class expects, but I'm guessing it's expecting limited range as well.
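For reference, here is roughly the math the two shader paths implement (BT.601 coefficients). This is my own sketch in C++ rather than shader code, and not anything from UMC; it just shows why the decoder would need to expose video_full_range_flag, since the scaling applied to Y and to the chroma planes differs between the two cases:

#include <algorithm>

// Sketch: BT.601 YCbCr -> RGB for one pixel, limited vs. full range.
static unsigned char clamp8(float v)
{
    return (unsigned char)std::min(255.0f, std::max(0.0f, v + 0.5f));
}

void YCbCrToRGB(unsigned char y, unsigned char cb, unsigned char cr,
                bool fullRange, unsigned char rgb[3])
{
    // Full range: Y spans [0,255] as-is. Limited range: Y spans
    // [16,235] and must be stretched back; chroma spans [16,240].
    float Y  = fullRange ? (float)y : (y - 16) * (255.0f / 219.0f);
    float s  = fullRange ? 1.0f : 255.0f / 224.0f;  // chroma scale
    float Cb = (cb - 128) * s;
    float Cr = (cr - 128) * s;

    rgb[0] = clamp8(Y + 1.402f    * Cr);                   // R
    rgb[1] = clamp8(Y - 0.344136f * Cb - 0.714136f * Cr);  // G
    rgb[2] = clamp8(Y + 1.772f    * Cb);                   // B
}

Without the flag there is no way to pick the right value of fullRange per stream, which is exactly the problem described above.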