
AVC/H.264 10-bit decode

Stubbly3343
Beginner

The hardware encoding and decoding capabilities of the Intel Arc discrete GPUs are generally quite good, but they notably lack support for 10-bit AVC/H.264. I suspect this isn't due to a hardware limitation but rather to firmware, driver, or software issues. However, since I'm not an expert, I'd like to ask: is it feasible to add AVC/H.264 10-bit codec support (particularly for decoding) to the Intel Arc discrete GPUs? Certain camcorders and cameras record in this format, so adding this capability would greatly facilitate video editing and playback. Thank you very much.
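
For reference, this is how the gap shows up (a rough check; clip.mp4 is just a placeholder file name). ffprobe identifies such a stream as 10-bit AVC, yet no Arc hardware decoder picks it up:

ffprobe -v error -select_streams v:0 -show_entries stream=codec_name,profile,pix_fmt -of default=noprint_wrappers=1 clip.mp4

A 10-bit recording reports codec_name=h264, profile=High 10, and pix_fmt=yuv420p10le.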

MUC
Honored Contributor I

Media Capabilities Supported by Intel Hardware

 

H.264 (AVC) 10-bit decoding is not accelerated by any graphics hardware (not even Nvidia or AMD). If CPU processing is too slow for playback, I'd recommend transcoding the videos to H.265 (HEVC) or, if possible, selecting that format directly on the camera when recording.
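
As a minimal sketch of that workaround (file names are placeholders), let ffmpeg decode the 10-bit AVC source in software and hand the frames to Quick Sync for a 10-bit HEVC encode, which Arc does accelerate:

ffmpeg -i clip.mp4 -c:v hevc_qsv -global_quality 23 -pix_fmt p010le -c:a copy clip_hevc.mp4

Lower -global_quality values mean higher quality; the audio track is copied unchanged.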
Stubbly3343
Beginner

H.264 (AVC) 10-bit decoding is not accelerated by any graphics hardware (not even Nvidia or AMD).

Not true. The RTX 50 series supports it.

Nvidia's Video Encode and Decode GPU Support Matrix

Support like this is why some creators find Nvidia hardware reassuring. Given that hardware encoding and decoding are areas Intel emphasizes, I hope Intel can catch up with Team Green.
MUC
Honored Contributor I

Yes, you're right.

 

https://developer.nvidia.com/blog/nvidia-video-codec-sdk-13-0-powered-by-nvidia-blackwell/

 

Sorry, I didn't know that. Since the H.264 codec is older, I wouldn't have expected anyone (other than Apple) to bother supporting it in hardware at 10-bit. Therefore, I never checked the Nvidia Blackwell series' capabilities in this regard. Nvidia has apparently not yet updated the official NVDEC table.

 

[Screenshot: Nvidia's official NVDEC support matrix]

 

Stubbly3343
Beginner

My previous posts kept getting marked as spam, and now they’ve suddenly reappeared. Sorry for the bunch of duplicate posts.
