Graphics
Intel® graphics drivers and software, compatibility, troubleshooting, performance, and optimization

How can I use Deep Colour (10/12bpc = 30/36-bit color depth)?

idata
Employee

First, education time, because most people still really don't get it. 

 

1. Explanation of color depths

High Color (5/6-bit colour channels x RGB = 15 or 16-bit colour depth) 

32,768 colours (15-bit depth) or 65,536 colours (16-bit colour depth)
  • Go back 15 years or so, and you might remember this as an option in Windows XP's display settings panel.
  • Nobody needs these modes anymore.

True Color (8-bit colour channels x RGB = 24-bit colour depth; '32-bit' colour is actually 24-bit colour plus an 8-bit alpha/transparency channel)

8 bits of information per colour channel (24-bit color depth) = 16.7 million colours.
  • Almost all computer displays and all current smartphone displays are 8-bit per channel. 
  • Has been the standard for many years.
  • Each pixel's three colour channels (red, green, and blue) have 256 levels of gradation.
  • Often results in unsightly 'banding', particularly noticeable in smooth gradients; tricks such as dithering have to be employed to hide it (see the toy sketch after this list).
  • Many video sources are 8-bit per RGB colour channel.
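
To make the banding point concrete, here is a toy sketch of my own (not from any spec or driver doc): quantise a smooth black-to-white ramp to 8 and 10 bits and count how many distinct shades survive across a 4K-wide gradient.

    // banding.cpp - toy illustration of gradient banding at 8 vs 10 bpc
    #include <cstdio>
    #include <cmath>
    #include <set>

    int main() {
        const int width = 3840;               // a 4K-wide grey ramp
        const int bit_depths[] = {8, 10};
        for (int bits : bit_depths) {
            const int maxv = (1 << bits) - 1; // highest code value per channel
            std::set<int> shades;             // distinct quantised values seen
            for (int x = 0; x < width; ++x)
                shades.insert((int)std::lround((double)x / (width - 1) * maxv));
            std::printf("%2d-bit: %zu shades, bands ~%d px wide\n",
                        bits, shades.size(), width / (int)shades.size());
        }
    }

At 8 bits the ramp collapses to 256 shades, i.e. visible bands roughly 15 pixels wide; at 10 bits you get 1024 much narrower steps, which is exactly why deep colour (or dithering on 8-bit paths) is needed to hide them.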

Deep Color (10/12/16-bit colour channels x RGB = 30/36/48-bit color depth) 

10 bits of information per colour channel (30-bit colour depth) = 1.07 billion colours.
12 bits of information per colour channel (36-bit colour depth) = 68.71 billion colours.
16 bits of information per colour channel (48-bit colour depth) = 281.5 trillion colours. (See the sketch below this list for the arithmetic.)
  • Many modern monitors and almost all modern TVs can actually accept a 10-bit signal.
  • 10-bit is used in video formats (see Blu-ray or HEVC Main10).
  • Deep colour's higher colour count eliminates 'banding' and various other visual artifacts
  • 12-bit color is required for the HDMI definition of 'Deep Color'
  • In 10-bit colour, each pixel's three colour channels (red, green, and blue) have 1024 levels of gradation.
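
If you want to sanity-check the colour counts above, the arithmetic is just 2^bits gradation levels per channel, cubed across the three RGB channels. A quick sketch of my own:

    // colour_counts.cpp - levels per channel and total colours per bit depth
    #include <cstdio>
    #include <cstdint>

    int main() {
        const int bit_depths[] = {8, 10, 12, 16};
        for (int b : bit_depths) {
            uint64_t levels = 1ull << b;      // 2^b gradation steps per channel
            long double total = (long double)levels * levels * levels;
            std::printf("%2d bpc: %6llu levels/channel, %.4Lg colours (%d-bit depth)\n",
                        b, (unsigned long long)levels, total, 3 * b);
        }
    }

This reproduces the figures quoted above: 16.78 million (8 bpc), 1.074 billion (10 bpc), 68.72 billion (12 bpc) and 281.5 trillion (16 bpc).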

 

2. The problem I am facing

 

  1. How can I enable 10/12-bit colour in Linux? I am already running the latest drivers on Arch Linux.
  2. What about OS X or Windows?
  3. Whoa, hang on, what the fu.. have I just found a PDF where you guys actually say there is NO OPTION despite full hardware/software support because there is 'no content'? What? I can't enable more accurate colour even though everything works, because Bluray 4K isn't out yet? Are you £$%ing serious?

     

    Page 10 > http://www.intel.co.uk/content/dam/www/public/us/en/documents/white-papers/deep-color-support-graphics-paper.pdf

    This document details the _full support_ of deep colour, but says the UI options don't exist. So how do I enable it in Linux (where the Intel drivers lack a UI anyway)?

    FFS Intel. The driver situation is appalling:

    1. The driver-versions mess is still plaguing millions of machines.

    2. You haven't implemented 'custom' resolutions/refresh rates in Windows, after five godforsaken years.

    3. And now you're advertising features that are supported in hardware BUT ARE DISABLED?

 

My stuff:

  • 10-bit Yamakasi IPS 1440p monitor, 3x Panasonic TVs supporting 30/36-bit colour (tested with a Blu-ray player's test modes)
  • Intel i7-3667U laptop (DP)
  • Intel i7-4710MQ laptop (HDMI)
  • Intel i7-6560U laptop (DP)
  • Intel Celeron N3150 (HDMI or DP)
  • Windows 10
  • OS X (i7-3667U)
  • Arch Linux (Celeron N3150)
idata
Employee

Thanks Bryce.

It's a complicated topic with little information across the web, but I think I finally understand:

  • The Windows Desktop Environment is exclusively 8-bit (24/32-bit true colour), which explains why:
    • You can't choose a 'Deep Color' 10-bit (30-bit) colour mode in Display Settings;
    • You don't see any 30/36/48-bit modes under 'List all modes' in Monitor/Adapter settings;
    • 'Deep colour' monitor modes are only exposed as driver settings (e.g. Nvidia Control Panel, AMD Catalyst);
    • It's sometimes hard to tell if deep colour is working or not.
  • Windows Applications
    • Apps are typically 8-bit (e.g. Paint, Edge, Photo viewer), so 10-bit content will be dithered and displayed as 8-bit, losing image quality.
    • Special 10-bit apps/games can render 10-bit to the display, avoiding Windows Desktop's 8-bit dithering:
      • Apps like Photoshop CS6+ render a 10-bit OpenGL back buffer 'over' the 8-bit Windows Desktop.
      • Games like Alien Isolation use DirectX 11 full screen to bypass the 8-bit Windows Desktop (see the D3D11 sketch after this list).
  • Specific Hardware:
    • Radeons
      • Typically good, allowing modern consumer GPUs to enable 10-bit output via HDMI or DisplayPort
      • Older models only allow 10-bit via DisplayPort
    • Nvidia
      • Allows 10-bit output in fullscreen DirectX (e.g. games such as Alien Isolation, or a media player).
      • Doesn't allow 10-bit desktop applications via OpenGL (e.g. Photoshop) - you are supposed to buy a 'professional-grade' (and pricey) Quadro.
      • Doesn't support 10-bit over HDMI
    • Intel
      • ???????
  • Intel's Linux driver may support 10-bit colour now that Kernel 4.7 has a new colour management system.
  • Apple's OS X only got 10-bit display support in late 2015.
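
As mentioned in the DirectX bullet above, here is a minimal sketch of how a Windows app can at least ask the driver about the 10-bit format. It only checks whether the D3D11 runtime reports DXGI_FORMAT_R10G10B10A2_UNORM as renderable/displayable; whether 10 bpc actually reaches the panel still depends on the driver and the link (DP vs HDMI). This is my own illustration, not Intel's validation tool; link against d3d11.lib.

    // check10bit.cpp - query D3D11 support for the 10-bit back-buffer format
    #include <d3d11.h>
    #include <cstdio>

    int main() {
        ID3D11Device* device = nullptr;
        HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                       nullptr, 0, D3D11_SDK_VERSION,
                                       &device, nullptr, nullptr);
        if (FAILED(hr)) { std::printf("D3D11CreateDevice failed\n"); return 1; }

        UINT support = 0;
        hr = device->CheckFormatSupport(DXGI_FORMAT_R10G10B10A2_UNORM, &support);
        std::printf("R10G10B10A2_UNORM: render target %s, display scan-out %s\n",
                    (SUCCEEDED(hr) && (support & D3D11_FORMAT_SUPPORT_RENDER_TARGET)) ? "yes" : "no",
                    (SUCCEEDED(hr) && (support & D3D11_FORMAT_SUPPORT_DISPLAY)) ? "yes" : "no");
        device->Release();
        return 0;
    }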

Also:

  • HDR = 10-bit colour.
  • HDR Display = 10-bit display that meets minimum brightness/contrast requirements.

Let's just say I have adjusted my expectations!

Intel, do this:

  • Make a Deep-color page somewhere in support. This thing is complicated and really needs explaining.
  • Adjust your CUI design paradigms. Hiding the Deep Colour option entirely created uncertainty for me; even a greyed-out, unselectable checkbox showing deep-colour support would answer 'does my driver/display support this?' at a glance.
  • Add more debug/information options to the driver settings. How do I check 10-bit support?

Bryce:

  • Could you ask your developers if they have a list of 10-bit applications that they test with? All I know of is Photoshop and Alien Isolation.

Thanks for all the help.

STurn5
Beginner

The Linux kernel's DRM core has supported 30-bit colour (10 bpc) since the 3.0 kernel - see https://lists.freedesktop.org/archives/dri-devel/2011-May/011502.html

The Intel xf86-video-intel DDX driver for Linux has supported 30-bit colour since version 2.16.0 (released 9 August 2011) - see http://www.phoronix.com/scan.php?page=news_item&px=OTc3Mw ("Intel Rounds Out Its Ivy Bridge Graphics Driver", Phoronix) and https://cgit.freedesktop.org/xorg/driver/xf86-video-intel/commit/?id=7976f5144d42a03ccd027908252a600db2631054
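
For anyone landing here via search: with a DDX that supports it, a 30-bit X screen is normally requested with DefaultDepth in xorg.conf. A minimal sketch of mine (the Identifier strings are placeholders, and it only takes effect if the kernel, DDX and display chain all accept depth 30):

    Section "Device"
        Identifier "Intel Graphics"
        Driver     "intel"
    EndSection

    Section "Screen"
        Identifier   "Screen0"
        Device       "Intel Graphics"
        DefaultDepth 30        # 10 bits per colour channel
    EndSection

After restarting X, the root window depth reported by xdpyinfo should read 30 if it worked.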

The HDMI standard has supported deep colour (30-bit, 36-bit and 48-bit) since version 1.3. Bear in mind that if you want to output 4K resolution at 60Hz you will need HDMI 2.0. The 4K modes have also been supported by Intel's DDX driver for some time - see https://lists.freedesktop.org/archives/intel-gfx/2013-August/031357.html

The Intel colour manager support added in kernel 4.7 seems to be about something else - "Also noteworthy is that the Intel DRM driver in Linux 4.7 will support pipe-level color management. The color management is handled by a set of properties attached to a CRTC. This color management support is available for Intel Broadwell, Skylake, Broxton, Kabylake, and Cherryview hardware. The color manager will work for items like de-gamma, color conversion matrix, and gamma correction." See https://lists.freedesktop.org/archives/intel-gfx/2016-April/091373.html
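
If you want to check whether your kernel exposes those CRTC colour-management properties, you can enumerate them through libdrm. A hedged sketch of mine - it assumes the Intel GPU is /dev/dri/card0; build with g++ crtc_props.cpp -I/usr/include/libdrm -ldrm:

    // crtc_props.cpp - list DRM properties on each CRTC; on a 4.7+ kernel
    // with the new colour manager you should see DEGAMMA_LUT, CTM, GAMMA_LUT
    #include <cstdio>
    #include <fcntl.h>
    #include <unistd.h>
    #include <xf86drm.h>
    #include <xf86drmMode.h>

    int main() {
        int fd = open("/dev/dri/card0", O_RDWR);
        if (fd < 0) { std::perror("open /dev/dri/card0"); return 1; }

        drmModeRes* res = drmModeGetResources(fd);
        if (!res) { std::fprintf(stderr, "drmModeGetResources failed\n"); return 1; }

        for (int i = 0; i < res->count_crtcs; ++i) {
            std::printf("CRTC %u:\n", res->crtcs[i]);
            drmModeObjectProperties* props =
                drmModeObjectGetProperties(fd, res->crtcs[i], DRM_MODE_OBJECT_CRTC);
            if (!props) continue;
            for (uint32_t j = 0; j < props->count_props; ++j) {
                drmModePropertyRes* p = drmModeGetProperty(fd, props->props[j]);
                if (!p) continue;
                std::printf("  %s\n", p->name);
                drmModeFreeProperty(p);
            }
            drmModeFreeObjectProperties(props);
        }
        drmModeFreeResources(res);
        close(fd);
        return 0;
    }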

Bryce__Intel
Employee

Hi Replete

Looking into this further, I was able to get some answers. I hope this settles the rest of your questions.

Intel

  • ???????

[Bryce] Alien Isolation and DirectX are validated and functional for 10-bit over DP. For HDMI, we only support 8-bit and 12-bit; if the game/app requests 10-bit, it will default to 8-bit. There is currently no support for OpenGL.

  • Could you ask your developers if they have a list of 10-bit applications that they test with? All I know of is Photoshop and Alien Isolation.

[Bryce] We use Alien Isolation and DirectX Sample apps from Microsoft DirectX Jun10 package for validating 10-bit color depth.

ST1
Novice

Hi Replete

Please post your wishes here:
