The hardware encoding and decoding capabilities of the Intel Arc discrete GPUs are generally quite good, but they notably lack support for 10-bit AVC/H.264. I suspect this isn't a hardware limitation but rather a firmware, driver, or software issue; however, since I'm not an expert, I'd like to ask: is it feasible to add AVC/H.264 10-bit codec support (particularly for decoding) to the Intel Arc discrete GPUs? Certain camcorders and cameras record in this format, so adding this capability would greatly help with video editing and playback. Thank you very much.
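In case it helps anyone hitting the same wall: a quick way to confirm that a clip really is 10-bit H.264 (typically the High 10 profile with a pixel format such as yuv420p10le or yuv422p10le) is to inspect it with ffprobe. This is only a minimal sketch; it assumes ffprobe from FFmpeg is on your PATH, and "clip.mp4" is just a placeholder file name.

```python
# Minimal sketch: use ffprobe to report the codec, profile, and pixel format
# of the first video stream, and flag 10-bit H.264 footage.
# Assumes ffprobe (FFmpeg) is on PATH; "clip.mp4" is a placeholder file name.
import json
import subprocess

def probe_video(path: str) -> dict:
    out = subprocess.run(
        ["ffprobe", "-v", "error",
         "-select_streams", "v:0",
         "-show_entries", "stream=codec_name,profile,pix_fmt",
         "-of", "json", path],
        capture_output=True, text=True, check=True,
    ).stdout
    return json.loads(out)["streams"][0]

info = probe_video("clip.mp4")
# 10-bit H.264 shows up as codec "h264" with a 10-bit pixel format
# (e.g. yuv420p10le) and usually the "High 10" profile.
is_10bit_h264 = info["codec_name"] == "h264" and info["pix_fmt"].endswith(("10le", "10be"))
print(info, "-> 10-bit H.264:", is_10bit_h264)
```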
Media Capabilities Supported by Intel Hardware
H.264 (AVC) 10-bit decoding is not accelerated by any graphics hardware (not even Nvidia or AMD). If CPU processing is too slow for playback, I'd recommend transcoding the videos to H.265 (HEVC), or selecting that format directly on the camera when recording, if possible.
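For reference, a transcode along those lines can be scripted with FFmpeg. This is only a sketch under a few assumptions: the FFmpeg build has Quick Sync (QSV) support, the decode of the 10-bit H.264 source still runs in software, and the file names and quality value are placeholders.

```python
# Sketch of the suggested workaround: software-decode the 10-bit H.264 source
# and hardware-encode it to 10-bit HEVC with FFmpeg's Intel Quick Sync encoder.
# Assumes an FFmpeg build with QSV enabled; names and quality are placeholders.
import subprocess

def transcode_to_hevc(src: str, dst: str, quality: int = 23) -> None:
    subprocess.run(
        ["ffmpeg", "-i", src,
         "-vf", "format=p010le",           # keep 10 bits, in the layout QSV expects
                                           # (4:2:2 sources are downsampled to 4:2:0 here)
         "-c:v", "hevc_qsv",               # Intel hardware HEVC encoder
         "-global_quality", str(quality),  # constant-quality mode; lower = better
         "-c:a", "copy",                   # pass the audio through untouched
         dst],
        check=True,
    )

transcode_to_hevc("camera_clip.mp4", "camera_clip_hevc.mp4")
```

If the FFmpeg build lacks QSV, the software encoder libx265 can be swapped in for hevc_qsv (using -crf instead of -global_quality); it should produce an equivalent file, just more slowly.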
"H.264 (AVC) 10-bit decoding is not accelerated by any graphics hardware (not even Nvidia or AMD)."
Not true. The RTX 50 series supports it.
Nvidia's Video Encode and Decode GPU Support Matrix
This is part of why some creators find Nvidia cards reassuring. Given that hardware encoding and decoding are areas Intel emphasizes, I hope Intel can catch up with Team Green.
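Rather than relying only on the published support matrices, you can also ask the driver directly what it exposes. The sketch below is Linux-only and assumes vainfo from libva-utils is installed and the Arc card is the default VA-API device; it lists the H.264 decode profiles the media driver reports, which, as discussed above, will not include a 10-bit H.264 entry on current Arc drivers.

```python
# Rough sketch (Linux): list the H.264 decode profiles the VA-API driver reports,
# to check what the GPU's media driver actually exposes.
# Assumes vainfo (libva-utils) is installed and uses the default VA-API device.
import subprocess

def h264_decode_profiles() -> list[str]:
    out = subprocess.run(["vainfo"], capture_output=True, text=True).stdout
    # vainfo prints one "VAProfile... : VAEntrypoint..." pair per line;
    # keep the H.264 lines whose entrypoint is VLD (i.e. decode).
    return [line.strip() for line in out.splitlines()
            if "H264" in line and "VAEntrypointVLD" in line]

for profile in h264_decode_profiles():
    print(profile)
```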
Yes, you're right.
https://developer.nvidia.com/blog/nvidia-video-codec-sdk-13-0-powered-by-nvidia-blackwell/
Sorry, I didn't know that. Since H.264 is an older codec, I wouldn't have expected anyone (other than Apple) to bother supporting 10-bit in hardware, so I never checked the Nvidia Blackwell series' capabilities in this regard. Nvidia apparently has not yet updated its official NVDEC support matrix.
My previous posts kept getting marked as spam, and now they’ve suddenly reappeared. Sorry for the bunch of duplicate posts.
