Hi everyone.
I am using the Intel Media SDK to decode 4K video for a live broadcast application.
The video runs at 19 fps with a 10 Mbps bit rate; its exact resolution is 4000 x 3000 pixels.
When the application decodes 12 instances of this video simultaneously, all of the video screens are blurred.
GPU-Z shows the GPU load staying below 40%, with about 1.2 GB of graphics memory in use.
When only 10 instances are played at the same time, everything is almost normal.
My processor is a Celeron 1037U with 4 GB of 1600 MHz system memory, running 64-bit Windows 7.
I have installed the newest graphics driver, released in September 2015.
I tried disabling the code that calls MFXVideoDECODE_DecodeFrameAsync() for some of the decoding channels, and when I did that, the remaining channels no longer showed blurred screens.
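For context, here is a simplified sketch of the kind of per-channel decode loop I mean (helper names such as FeedBitstream() and FindFreeSurface() are placeholders, not the real implementation):

```cpp
#include <windows.h>
#include <mfxvideo.h>

// One channel's decode loop, assuming the session, decoder, and
// surface pool were initialized elsewhere. This follows the usual
// Media SDK pattern around MFXVideoDECODE_DecodeFrameAsync().
void DecodeChannel(mfxSession session, mfxBitstream* bs,
                   mfxFrameSurface1* pool, int poolSize)
{
    mfxStatus sts = MFX_ERR_NONE;
    while (sts >= MFX_ERR_NONE || sts == MFX_ERR_MORE_DATA ||
           sts == MFX_ERR_MORE_SURFACE)
    {
        if (sts == MFX_WRN_DEVICE_BUSY)
            Sleep(1);                        // HW queue full: back off briefly

        if (sts == MFX_ERR_MORE_DATA)
            if (!FeedBitstream(bs))          // placeholder: refill bs->Data
                break;                       // end of stream

        mfxFrameSurface1* workSurf = FindFreeSurface(pool, poolSize); // placeholder
        mfxFrameSurface1* outSurf  = nullptr;
        mfxSyncPoint      syncp    = nullptr;

        sts = MFXVideoDECODE_DecodeFrameAsync(session, bs, workSurf,
                                              &outSurf, &syncp);
        if (sts >= MFX_ERR_NONE && syncp) {
            // Wait for the decoded frame, then hand it to rendering.
            MFXVideoCORE_SyncOperation(session, syncp, 60000);
            // Display(outSurf);             // placeholder
        }
    }
}
```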
Can this be categorized as a hardware performance bottleneck, or is some software component perhaps not optimized correctly?
Thanks!
I guess it is H.264 decoding? I am curious how you are able to watch 12 4Kx3K channels simultaneously; that would need a huge screen :) Do you resize them for playback, or save them to file? Each has its own implications. A mosaic view should retain the aspect ratio, while saving to file seems impossible at such bandwidth. My guess is that it is not a decoding problem but the impact of post-decoding manipulations.
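For example, if you downscale each channel before composing the mosaic, every frame goes through something like this (a rough sketch assuming a VPP session already configured for the resize; surface management omitted):

```cpp
#include <mfxvideo.h>

// Downscale one decoded frame for a mosaic tile, assuming the VPP
// session was initialized with 4000x3000 input and a tile-sized
// output (e.g. 1000x750). 'in' and 'out' come from separate pools.
mfxStatus ScaleForMosaic(mfxSession session,
                         mfxFrameSurface1* in, mfxFrameSurface1* out)
{
    mfxSyncPoint syncp = nullptr;
    mfxStatus sts = MFXVideoVPP_RunFrameVPPAsync(session, in, out,
                                                 nullptr, &syncp);
    if (sts == MFX_ERR_NONE && syncp)
        sts = MFXVideoCORE_SyncOperation(session, syncp, 60000);
    return sts;  // 'out' now holds the downscaled tile for display
}
```

Twelve of these resize operations per frame add their own GPU and memory-bandwidth load on top of the decoding itself, which is why the post-decode path is worth profiling separately.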
