Hi guys,
I have an Intel Arc A770 GPU and a Core i9-14900K CPU. The GPU's specifications say it can decode HEVC 10-bit 4:2:2 files in hardware.
My Canon R5 camera records this format at two bitrates, 680 Mbps and 340 Mbps.
- The 340 Mbps files play fairly well, with some occasional judder, BUT the strange thing to me is that the CPU is also involved in decoding. So the CPU + GPU "team" can handle the files, but I wonder why the GPU isn't doing it alone.
- The 680 Mbps files are the same story: CPU + GPU both work on them, but there is no chance of smooth playback.
Any clues? Can I improve this situation somehow?
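In case it helps to narrow things down, here is a decode-only check that takes the player and renderer out of the equation entirely. It's just a minimal sketch: it assumes an ffmpeg build with Quick Sync (QSV) support is on PATH, and the file path is a placeholder for one of the R5 clips.

```python
# Decode-only benchmark: ffmpeg decodes the clip on the GPU and discards the
# frames, so no renderer is involved. If this runs faster than real time with
# low CPU use, the Arc's media engine alone can handle the bitstream and the
# bottleneck is in the player's decoder/renderer setup.
import subprocess
import time

CLIP = r"C:\clips\r5_422_10bit_680mbps.mp4"  # placeholder path

start = time.perf_counter()
subprocess.run(
    [
        "ffmpeg", "-v", "error",
        "-hwaccel", "qsv",       # request Quick Sync hardware decoding
        "-c:v", "hevc_qsv",      # force the HEVC QSV decoder
        "-i", CLIP,
        "-f", "null", "-",       # decode and discard the output
    ],
    check=True,
)
print(f"decoded in {time.perf_counter() - start:.1f} s")
```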
Thanks a lot!

This depends heavily on the video renderer used. Maybe this thread can help you:
SOLVED: Graphics - NUCWSHi5 Not HW acceleration H.265 50 fps 4k 4:2:2 10 bit L5.1-High
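Before digging into renderer settings, it may also be worth confirming what the clips actually contain, since hardware decode support hinges on the exact codec profile and pixel format. A minimal sketch, assuming ffprobe is on PATH; the file path is a placeholder:

```python
# Print the video stream's codec, profile, and pixel format. For the R5's
# HEVC 4:2:2 10-bit clips you would expect codec_name=hevc and
# pix_fmt=yuv422p10le; anything else could explain a software fallback.
import subprocess

CLIP = r"C:\clips\r5_422_10bit_340mbps.mp4"  # placeholder path

info = subprocess.run(
    [
        "ffprobe", "-v", "error",
        "-select_streams", "v:0",
        "-show_entries", "stream=codec_name,profile,pix_fmt,level",
        "-of", "default=noprint_wrappers=1",
        CLIP,
    ],
    capture_output=True, text=True, check=True,
).stdout
print(info)
```

If the stream checks out, the remaining variable is how the player picks its decoder and renderer, which is what the linked thread covers.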

The problem is completely solved! I can now play the 680 Mbps files using only the GPU.
As the guy in the post above said, it would be great if we had the chance to drink a beer together. I am a low-level C++ programmer, by the way. I'm from Romania.
Many thanks!
