I have ripped some of my 4K UHD Blu-ray discs to MKV and M2TS files (HEVC, 10-bit depth). When I play them back with my Intel NUC7i7BNH set to 3840 x 2160 at 24 Hz, my 4K-capable projector reports only 8-bit color depth. If I play the actual 4K UHD Blu-ray disc, I get 10-bit on the projector. So is the Iris Plus Graphics 650 capable of higher than 8-bit color depth? I'm attaching MediaInfo output for one of my files, which shows it is 10-bit HEVC.
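(As an aside, the bit-depth field referenced above can also be read non-interactively with the MediaInfo command-line tool, assuming it is installed; the filename here is just a placeholder:)

```shell
# Print only the video track's bit depth (e.g. "10") using MediaInfo's
# template syntax; substitute the actual MKV filename.
mediainfo --Inform="Video;%BitDepth%" "Movie.mkv"
```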
Hello WReib,
Thank you for posting on the Intel® communities.
To better assist you with your request, I will need some information about your computer for further investigation. Please generate the following reports and attach them to this thread.
1. Intel® System Support Utility (Intel® SSU)
- Intel® SSU Download link
- Open the application and click on "Scan" to see the system and device information. By default, Intel® SSU will take you to the "Summary View".
- Click on the menu where it says "Summary" to change to "Detailed View".
- To save your scan, click on "Next", then "Save".
2. DXDIAG report:
- Go to Start > Run or Windows Key + R.
- On the Run prompt, type "dxdiag" then click OK.
- On the DirectX Diagnostic Tool window, click on Save All Information.
- Browse to a folder, type in a filename, then click Save.
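(The same DxDiag report can also be generated in one step from a command prompt using the tool's documented `/t` switch; the output path below is just an example:)

```shell
rem Save the full DxDiag report as text without opening the GUI (cmd.exe).
dxdiag /t %USERPROFILE%\Desktop\dxdiag.txt
```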
Questions:
What is the model of your projector?
Have you tested any other display?
Regards,
Adrian M.
Intel Customer Support Technician
A Contingent Worker at Intel
I was asking about the specs of the video card. I can do the steps you indicated, but what I was really after is whether the video card is capable of 10-bit depth for HEVC files. I don't have any other 4K-capable display devices to connect to. My projector is an Epson 5040UB.
DxDiag attached (for some reason the forum is only letting me attach one file at a time).
I play the file back with PowerDVD 19 and have hardware acceleration enabled in PowerDVD. The NUC is set to 3840 x 2160 at 24 Hz with HDR turned on. The HDMI cable is Premium Certified for 4K, and it is the same cable with which I get 10-bit when my Sony X800 plays the same title from the actual 4K UHD disc. The projector is an Epson 5040UB. Per the Epson info screen, I get 10-bit depth when playing the 4K UHD disc on the Sony X800, but only 8-bit when playing the HEVC 10-bit MKV file from the NUC. The MediaInfo screenshot I attached shows it is an HEVC 10-bit file. I don't see anything about H.265 in MediaInfo, and I'm not sure where I would, but the file was ripped with the latest DVDFab. I see a note in your specs that hardware decode of H.264 SVC is not supported, so I believe the file is H.265.
Hello WReib,
Thank you for your response.
The processor is capable of handling 10-bit depth for hardware-accelerated video decoding (HEVC files), but the application/tool, cable, and display also have to support 10-bit depth.
For reference, check the link below: page 32, Table 2-13, Hardware Accelerated Video Decoding.
https://www.intel.com/content/www/us/en/processors/core/7th-gen-core-family-desktop-s-processor-lines-datasheet-vol-1.html (download PDF file)
Regards,
Adrian M.
Intel Customer Support Technician
A Contingent Worker at Intel
Hello WReib,
Thank you for your response.
Can you please test with this video: jellyfish-160-mbps-4k-uhd-hevc-10bit.mkv (160 Mbps) from this website: http://jell.yfish.us/ (select the option "Show Only 10-Bit").
Let me know if you get 10-bit when testing this video.
At the same time I will test it on my end.
Please provide us with a short video and screenshots of the video you are testing.
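(One way to confirm the downloaded clip really is 10-bit before testing is to read its pixel format with ffprobe — a sketch, assuming FFmpeg is installed and the file is in the current directory:)

```shell
# Prints the decoder pixel format of the first video stream;
# "yuv420p10le" indicates 10-bit 4:2:0 content.
ffprobe -v error -select_streams v:0 -show_entries stream=pix_fmt \
  -of default=noprint_wrappers=1:nokey=1 jellyfish-160-mbps-4k-uhd-hevc-10bit.mkv
```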
Regards,
Adrian M.
Intel Customer Support Technician
A Contingent Worker at Intel
I don't have a way to rip only a small part of a 4K UHD Blu-ray. I get 8-bit playback from the NUC no matter which file I try, and they are all 10-bit, so it isn't file dependent.
The jellyfish video shows 8-bit on the Epson info screen, just like every other HEVC 10-bit MKV file I've tried. Here's what the Epson shows while playing the jellyfish video with PowerDVD 19 with hardware acceleration turned on:
8 bit with dithering (vs 10 bit):
Denon driver:
Hobbs and Shaw 4K UHD HEVC 10 bit MKV rip played back on NUC to Epson 5040UB:
This is with a direct connection between the NUC and the Epson 5040UB projector, taking the Denon X6300H out of the equation: still 8-bit with dithering. When I play back a 10-bit HEVC MKV with this setup, I still get 8-bit on the Epson info screen, as above.
Here is what I get on the Epson info screen (12 bit!) when I play the actual Hobbs and Shaw 4K UHD disc on my Sony x800:
I also get 12 bit when I play the 10-bit HEVC 4K MKV file of Hobbs and Shaw on the Sony x800:
Hello WReib,
Thank you for your response.
I am still working on your case, please allow me some time to update the thread.
Thank you for sharing all the information.
Regards,
Adrian M.
Intel Customer Support Technician
A Contingent Worker at Intel
Hello WReib,
I would like to update the thread.
- Our driver supports color depths of 8-bit or 12-bit via HDMI*.
- Currently, if a 10-bit display is used, the driver will default to 8-bit with dithering, or to 12-bit if supported.
- Please refer to the Deep Color Support of Intel Graphics white paper (page 11).
- There is already a request to let users manually select the desired color depth via the Intel® Graphics Command Center (IGCC). This is a work in progress with no ETA; however, it is on our top-priority list.
- The above does not affect the encoding/decoding capabilities of the graphics controller at all: HEVC 10-bit hardware video encoding/decoding is supported by the graphics controller.
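(The stated default policy can be sketched as follows. This is a minimal illustration of the behavior as described in the post, not Intel's actual driver code; the function and names are hypothetical:)

```python
def pick_output_depth(sink_supported_depths: set[int]) -> tuple[int, bool]:
    """Return (output depth in bits, dithering enabled) for an HDMI sink.

    Per the description above, the driver only outputs 8-bit or 12-bit
    over HDMI, so a 10-bit-only display falls back to 8-bit + dithering.
    """
    if 12 in sink_supported_depths:
        return 12, False
    return 8, True

# A 12-bit-capable projector should get 12-bit; a 10-bit-only panel
# gets 8-bit with dithering.
print(pick_output_depth({8, 10, 12}))
print(pick_output_depth({8, 10}))
```

Of course, the rest of the thread shows that in practice the 12-bit-capable Epson was still receiving 8-bit with dithering, i.e. the observed behavior did not match this intended policy.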
Regards,
Adrian M.
Intel Customer Support Technician
A Contingent Worker at Intel
Adrian,
You know my projector supports 12-bit depth, because the screenshots above, taken from my Epson 5040UB info menu, show the signal it is RECEIVING: it displays 12-bit from the same Hobbs and Shaw 4K UHD HDR HEVC MKV via my Sony X800, while the same file from the NUC only gives 8-bit with dithering. I've tried other 10-bit HEVC 4K UHD HDR MKV files with the same result: the Sony X800 plays them at 12-bit, but the NUC plays them at 8-bit. So I'm not sure why you stated "12-bit if supported" when you know my projector supports 12-bit and I showed it playing 12-bit. Something is awry with the NUC if it is outputting 8-bit with dithering instead of 12-bit.
Hello @WReib
I would like to jump in and offer more information on this issue if that is ok with you.
I noticed the issue you are reporting is very similar to one we're also tracking here.
For context: this is occurring because the driver is currently designed to let the OS select the best color depth and color space based on hardware capabilities, output content, display capabilities, and available video bandwidth. Clearly the issue here is that the OS is choosing 8-bit with dithering to maintain RGB or YCbCr 4:4:4 sampling with this 4K HDR media when it could choose 12-bit with 4:2:0 subsampling instead.
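(For what it's worth, the bandwidth arithmetic behind such a depth/subsampling trade-off can be sketched. The figures below are not from this thread: they assume the standard CTA-861 timing for 2160p24, which gives a 297 MHz pixel clock, and the HDMI 2.0 limit of a 600 MHz TMDS character rate:)

```python
# Rough HDMI 2.0 bandwidth check for 3840x2160 @ 24 Hz.
PIXEL_CLOCK_MHZ = 297.0        # 2160p24: 5500 x 2250 total pixels x 24 Hz
HDMI20_MAX_TMDS_MHZ = 600.0    # HDMI 2.0 limit (18 Gbps aggregate)

DEPTH_FACTOR = {8: 1.0, 10: 1.25, 12: 1.5}    # extra bits scale the clock
CHROMA_FACTOR = {"4:4:4": 1.0, "4:2:0": 0.5}  # 4:2:0 halves the clock

def tmds_clock_mhz(bits: int, chroma: str) -> float:
    """Effective TMDS clock needed for a given depth and subsampling."""
    return PIXEL_CLOCK_MHZ * DEPTH_FACTOR[bits] * CHROMA_FACTOR[chroma]

for bits in (8, 10, 12):
    for chroma in ("4:4:4", "4:2:0"):
        clk = tmds_clock_mhz(bits, chroma)
        verdict = "fits" if clk <= HDMI20_MAX_TMDS_MHZ else "exceeds"
        print(f"{bits:>2}-bit {chroma}: {clk:6.2f} MHz -> {verdict} HDMI 2.0")
```

Under these assumptions, even 12-bit 4:4:4 at 24 Hz (445.5 MHz) fits within HDMI 2.0, which is consistent with the user's observation that other source devices deliver 12-bit to the same projector over the same cable.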
Intel already recognizes that our current driver design needs to change the way it handles color depth, especially now that more and more displays support higher color depths (10/12-bit). Our development team is prioritizing work on adding the option to manually select color depth and color space through the Intel® Graphics Command Center (IGCC), so let's wait for new updates on this front. (The ETA is the end of 2019 or early 2020; keep in mind that projections may change at any time.)
If it is OK with you, I'll temporarily close this thread, since I can confirm our dev team is working on this issue (under internal bug ID 14010239023) and all we can do for now is wait for upcoming IGCC and driver updates.
Best Regards,
Ronald M.
It sounds to me like the algorithm used to determine color depth is flawed. Why not fix the algorithm so it automatically picks the correct (highest) color depth, since apparently the only choices are 8-bit or 12-bit on the Intel graphics hardware? Why make the user go find a setting and select it manually?
The algorithm currently in use essentially transfers the decision to the OS.
The changes we are working on will shift the responsibility to the driver (i.e., setting a default for color depth). However, most of the feedback we have received from our customers over the years indicates that users want the option to manually select their preferred color depth and color space. For example, check out the thread we opened a while ago asking the community which changes/features they wanted added to the Graphics Control Panel. Needless to say, we listened to all the feedback, and most of the requested changes/features are already included (e.g., integer scaling) or about to be included (the ability to manually select color depth).
Best Regards,
Ronald M.
Any updates on the availability of this change in your Intel utility?
Hello,
No updates yet. Our driver development team is still working on it.
Thank you for your patience.
Best Regards,
Ronald M.