We need to hear directly from you, our valued customers, which color-related options (e.g. Hue, Wide Gamut) from the Intel® HD Graphics Control Panel you use the most. In an effort to better optimize our driver, some of these Control Panel features are targeted for removal on future platforms, so we need to know the different real-world use cases in which these features prove useful to you.
Your feedback can help us build a better product! Feel free to add your comments in this post.
I agree with Stefan75 regarding what needs to be added.
Some options are there already, like RGB or YCbCr output, but it is not very clear when they are actually displayed in the UI. I don't have access to my PC with Intel HD Graphics at the moment, but from memory, when I use the HDMI1 (1.4) output on my motherboard I can only select xvYCC; YCbCr is simply not shown in the UI. If I use HDMI2 (2.0), the option to select YCbCr is shown, but then xvYCC is gone. This is with the same display and the same resolution. Maybe this is a bug, maybe it's by design, but it would be better to ALWAYS show the various settings in the UI. Settings that cannot be applied in a specific setup could simply be greyed out or made non-selectable.
Other settings, like bit depth and chroma subsampling, are simply missing. Higher bit depth seems to be applied automatically if the display supports it, but there might be situations where you want to use a lower bit depth or chroma subsampling. As it stands now, that is not possible through the UI.
I suggest you have a look at one of your competitors (Nvidia) and how they manage these additional settings in their UI. It is much clearer and easier to understand.
As for the other settings in your screenshots, like brightness/contrast, I think they might be useful on certain tablets etc., but I imagine most users like me with separate/external displays would rather control these through the display's own settings.
Personally, I don't use any "Image enhancement", "Color enhancement" or "Gamut mapping", but I do need a way to select YCbCr 4:4:4 / 4:2:2 / 4:2:0 and to select the output colour depth: 8bpc / 10bpc / 12bpc...
Intel is planning on removing features from their graphics?
The color settings were very useful for me - I've tweaked the contrast, brightness and gamma adjustments for each color channel in order to calibrate a really trashy panel that came built into the laptop I was using. An ICC profile would not have done the job because those adjustments are lost in 3D applications.
Sometimes in a really dark environment these adjustments also help take the panel brightness down below what the OS adjustment allows.
There are probably other useful cases that you're not going to hear about but there you go.
I use the Saturation function of the drivers to boost my monitor's color vividness. I do not have direct control over these functions on a regular monitor, typically only TVs have these features. By keeping the options in software, you enable configuration of these settings for people like me who do not have the ability to do so otherwise.
Given the number of folks sounding off in these two threads, those on HDMI 2.0 capable devices (7th gen NUC) need to be able to:
- Set the desktop color bit depth (8, 10, 12)
- Set the chroma output (4:4:4, 4:2:2, 4:2:0)
Currently the desktop defaults to 8bpc with 4:4:4 chroma. We need a way to choose 10bpc or 12bpc at lower chroma settings when we want to watch HDR (HDR10) content in apps that don't have explicit DirectX call-outs (e.g. players that render in-window at the native desktop color bit depth).
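To make the trade-off above concrete, here is a back-of-the-envelope sketch (my own arithmetic, not from Intel; the ~14.4 Gbps figure is the commonly cited effective HDMI 2.0 data rate after 8b/10b encoding, and blanking intervals are ignored for simplicity). It shows why 4K60 at 10/12bpc needs chroma subsampling on an HDMI 2.0 link:

```python
# Approximate uncompressed pixel-data rate for a video mode.
# All numbers are illustrative back-of-the-envelope values, not spec-exact.

HDMI_2_0_EFFECTIVE_GBPS = 14.4  # ~18 Gbps TMDS minus 8b/10b encoding overhead

# Effective channels per pixel for each chroma format:
# 4:4:4 -> 3 full-resolution channels, 4:2:2 -> 2x, 4:2:0 -> 1.5x
CHROMA_FACTOR = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def data_rate_gbps(width, height, fps, bpc, chroma):
    """Pixel-data rate in Gbit/s (blanking intervals ignored)."""
    bits_per_pixel = bpc * CHROMA_FACTOR[chroma]
    return width * height * fps * bits_per_pixel / 1e9

for bpc in (8, 10, 12):
    for chroma in ("4:4:4", "4:2:2", "4:2:0"):
        rate = data_rate_gbps(3840, 2160, 60, bpc, chroma)
        fits = "fits" if rate <= HDMI_2_0_EFFECTIVE_GBPS else "does NOT fit"
        print(f"4K60 {chroma} {bpc}bpc: {rate:5.2f} Gbps -> {fits}")
```

Under these assumptions, 4K60 4:4:4 only fits at 8bpc, while 10bpc and 12bpc fit once you drop to 4:2:2 or 4:2:0, which is exactly why a manual chroma/bit-depth control matters for HDR playback.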
The other driver manufacturers have had this for quite a while (Example 1, Nvidia: http://imgur.com/xXhRZkW; Example 2, AMD: https://tweakers.net/ext/f/RI713MLnSSKhY8uYy92ipCPJ/full.jpg), and Intel needs to start supporting it to be competitive in the HDR space, especially with HDR being a focus of the Windows 10 Creators Update.
Needless to say... WDDM 2.2 drivers would be a request alongside all these color bit depth and chroma setting options, since WDDM 2.2 is needed to allow 10-bit depth in Windows 10 after the Creators Update.
@Ronald_Intel 3 years ago, you asked advanced users what needs to be improved in the Intel drivers. Many people asked you to finally implement a function to set chroma subsampling (4:4:4 / 4:2:2 / 4:2:0) and color depth per channel (12/10/8 bpc) manually, like e.g. NVIDIA and AMD have been offering for years now. Nothing has happened. Many people worldwide can't use their UHD Graphics to display 4K 60fps with HDR due to this limitation. Are you so ignorant of customer desires, or what is the reason nothing has happened on this topic for 3 years?
Thank you for your feedback. I do agree that it is important to have the option to manually configure subsampling and color depth. This is something that is already in our plans for future updates; however, I don't have details on target platforms or an ETA.
I do find it interesting that you believe this is needed for users to enable 4K@60Hz with HDR. Can you elaborate on this statement? Or better yet, if you have this issue, go ahead and create a new thread in our Community or contact Intel support staff for assistance, because even without the option to configure subsampling and color depth you should be able to use 4K@60Hz with HDR.
Personally, I have been able to set 4K@60Hz with HDR using Intel integrated graphics many times, as long as it is a 7th Generation Intel® Core™ Processor or newer (Atom, Celeron and Pentium do not support HDR) that is connected via HDMI 2.0a/b to a display that itself supports HDR. You can find more details here
EDIT: I stand corrected. In my previous rigs I believe I ended up getting HDR 4K@60Hz at 8-bit with dithering and 4:4:4. I'll have to double-check for 10/12-bit 4:2:0 with HDR.
@Ronald_Intel Hi Ronald, good to hear that we might be able to use that feature maybe in the near future. Regarding my issue, I already have a thread open for that (https://forums.intel.com/s/question/0D50P00004TQc5cSAD/strange-issue-with-uhd-630-dp-12-4k60fps-hdr1...) and what I can read in the web, many others have similar issues. If you have been able to establish the desired connection in the past, maybe you can give some more details about the adapter being used to get the HDMI signal from the DP1.2 port found on most of the boards with Intel chipsets?
I haven't tried enabling HDR via DP 1.2, only through HDMI 2.0a
The official HDR white paper does point out that only DP 1.3 supports HDR natively (without the need for any adapter or dongle), and this is where it gets tricky, as Intel cannot promise functionality with 3rd-party adapters or dongles.
Anyway, I recently purchased a couple of DP to HDMI 2.0a adapters that are advertised as supporting HDR, so I'll give them a try as soon as possible and report back my results.
I get your point, but the problem is: there is NO Intel-chipset-based board on the market with DP 1.3 or DP 1.4. And the few boards with an HDMI 2.0 port are ALL using an integrated LSPCon to make HDMI 2.0 out of the DP 1.2 signal internally (correct me if I'm wrong). So it's nice that the white paper points out that DP 1.3 is required, but it is of no practical relevance, as no board has 1.3 or 1.4. What really puzzles me is that external adapters use the same chipsets (Parade, MegaChips) as the boards with integrated LSPCon, yet they don't work properly with the Intel iGPUs; that's the tricky part.
As mentioned in my other thread, I get 4K30p with 4:4:4 12bpc, but for 4K60p with 4:2:0 I only get 8bpc instead of 12bpc, although the required bandwidth would be the same - and the adapter cable as well as the projector both support 4K60p 4:2:0 12bpc (tested with a dedicated NVIDIA card). So that's why I think being able to set the chroma and color depth manually would help a lot.
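The bandwidth claim above checks out arithmetically. A quick sanity check (my own back-of-the-envelope math, ignoring blanking intervals): halving the chroma payload from 4:4:4 (3 full channels) to 4:2:0 (1.5 effective channels) exactly offsets doubling the frame rate from 30 to 60:

```python
# Pixel-data rate in Gbit/s: width * height * fps * bpc * chroma_factor.
# chroma_factor: 4:4:4 carries 3 full-resolution channels, 4:2:0 averages 1.5.
# Illustrative numbers only; real link budgets also include blanking overhead.

def gbps(w, h, fps, bpc, chroma_factor):
    return w * h * fps * bpc * chroma_factor / 1e9

mode_4k30_444 = gbps(3840, 2160, 30, 12, 3.0)  # 4K30p 4:4:4 12bpc
mode_4k60_420 = gbps(3840, 2160, 60, 12, 1.5)  # 4K60p 4:2:0 12bpc

print(f"4K30 4:4:4 12bpc: {mode_4k30_444:.2f} Gbps")
print(f"4K60 4:2:0 12bpc: {mode_4k60_420:.2f} Gbps")
assert mode_4k30_444 == mode_4k60_420  # identical pixel-data rate
```

So if the link can carry 4K30p 4:4:4 12bpc, the raw data rate is no obstacle to 4K60p 4:2:0 12bpc, which supports the argument that the limitation is in mode selection rather than bandwidth.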
Best regards
Hi again Stefan,
"boards with a HDMI 2.0 port are ALL using an integrated LSPCon making HDMI 2.0 out of the DP1.2 signal internally" >> that is correct, however in this scenario you do get the HDMI 2.0a port and thus no other adapter is needed. This is actually the scenario that I have tested many times with success on 4K@60hz 8-bit with dithering 4:4:4 + HDR (as long as the LSPCon supports HDR metadata). I do agree with you that when doing DP 1.2 to HDMI 2.0a via external adapter is essentially the same situation, but I can't vouch for external adapters to work properly at all times though. At least with internal LSPCon the OEM (Original Equipment Manufacturer) has done plenty of testing and validation before the product goes out.
On a side note: in your other thread the issue is clearly related to the less-than-optimal video mode automatically chosen by the driver and OS (resolution + refresh rate + color depth + subsampling), but it doesn't necessarily have a relation to enabling 4K@60Hz with HDR and adapters or converters.
I do agree with you that once we get manual control of color depth and subsampling, issues like the one from your other thread shouldn't occur anymore, so let's stay tuned for upcoming news and driver releases.