I am getting strange artifacts when encoding H.264 video using the Intel hardware-accelerated encoder.
The artifacts appear under the following conditions:
- All frames are I-frames (every frame is also a keyframe).
- The bitrate is relatively high (much higher than required for encoding the specific frame content).
The issue appears with an Intel Core i7-13700, but not with an Intel Core i7-6700.
I have managed to reproduce the issue with the following configurations:
- 13th Gen Intel Core i7-13700 running Windows 11 24H2
- 11th Gen Intel Core i7-1185G7 running Ubuntu 20.04
Note that the issue is not limited to the above configurations.
The issue is easy to reproduce with FFmpeg 7.1 by executing the following command:
ffmpeg -y -f lavfi -i testsrc=s=192x128:r=25:d=2 -c:v h264_qsv -g 1 -b:v 20000000 -pix_fmt nv12 output.mp4
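To see why this bitrate counts as "much higher than required", here is a quick back-of-the-envelope calculation for the 192x128 @ 25 fps test source used in the command above (the numbers are derived from the command-line parameters, nothing else is assumed):

```python
width, height, fps = 192, 128, 25
target_bitrate = 20_000_000                          # -b:v 20000000

# Bits per pixel requested from the encoder.
bits_per_pixel = target_bitrate / (width * height * fps)

# Uncompressed NV12 is 12 bits per pixel (8-bit Y plane + half-size UV plane).
raw_nv12_rate = width * height * 3 // 2 * 8 * fps

print(f"{bits_per_pixel:.1f} bpp requested")         # ~32.6 bpp
print(f"raw NV12 rate: {raw_nv12_rate} bit/s")       # 7372800 bit/s (~7.4 Mb/s)
```

The requested 20 Mb/s is nearly three times the raw, uncompressed NV12 data rate of this stream, so the rate controller has far more bits available than the content can possibly need.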
The issue is also reproducible with the sample_encode example from Intel Media SDK (on Gen 11):
./sample_encode h264 -nv12 -f 25 -b 20000 -g 1 -hw -i in.nv12 -o output.264 -w 192 -h 128
The issue is also reproducible with the hello_encode example from oneVPL (after modifying the code to use the AVC encoder with GOP size 1 and a high bitrate).
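For reference, the hello_encode modifications amount to changing a few fields of the mfxVideoParam structure before encoder initialization. A minimal sketch, assuming the same parameters as the repro commands above (the rate-control choice and exact values here are illustrative, not taken from the actual modified sample):

```c
/* Sketch of the mfxVideoParam changes applied to oneVPL's hello_encode.
   Values mirror the repro parameters above; CBR is an assumed choice. */
mfxVideoParam encodeParams;
memset(&encodeParams, 0, sizeof(encodeParams));
encodeParams.mfx.CodecId                 = MFX_CODEC_AVC;       /* AVC instead of the sample's default */
encodeParams.mfx.GopPicSize              = 1;                   /* every frame is an I-frame */
encodeParams.mfx.RateControlMethod       = MFX_RATECONTROL_CBR;
encodeParams.mfx.TargetKbps              = 20000;               /* 20 Mb/s */
encodeParams.mfx.FrameInfo.FourCC        = MFX_FOURCC_NV12;
encodeParams.mfx.FrameInfo.FrameRateExtN = 25;
encodeParams.mfx.FrameInfo.FrameRateExtD = 1;
encodeParams.mfx.FrameInfo.Width         = 192;
encodeParams.mfx.FrameInfo.Height        = 128;
```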
The NV12 sample input can be generated with FFmpeg:
ffmpeg -y -f lavfi -i testsrc=s=192x128:r=25:d=2 -pix_fmt nv12 -f rawvideo in.nv12
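As a sanity check that the raw input was generated correctly, the expected file size follows directly from the parameters above (8-bit NV12 is 12 bits per pixel; the 2-second, 25 fps test source yields 50 frames):

```python
import os

width, height, fps, seconds = 192, 128, 25, 2
frame_size = width * height * 3 // 2     # NV12: full-size Y plane + half-size UV plane
expected = frame_size * fps * seconds    # 50 frames total

print(expected)                          # 1843200 bytes
# Compare against the generated file:
# assert os.path.getsize("in.nv12") == expected
```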
The issue appears on both Windows and Linux, on both an Intel Core Gen 11 with Iris Xe Graphics and an Intel Core Gen 13 with UHD Graphics 770, and with both Media SDK and oneVPL.
I assume the root cause is relatively low-level, and it has been present for a few years...