Graphics
Intel® graphics drivers and software, compatibility, troubleshooting, performance, and optimization

Driver Feature Regression: Interlacing Support, Deep Color Support

Dantakurai
Beginner

Hello. My machine currently runs an i7-13700K and an RTX 3090. Since my displays only support VGA, my most common configuration has been an HDMI-to-VGA adapter connected to the iGPU through the motherboard outputs, because the NVIDIA card only supports the bare minimum required to output an image. However, in recent Intel Graphics Command Center releases, several advertised features that I used extensively, because my displays work better with them, have been broken, interlacing support among them.

As a result, to keep interlaced resolutions I have to stay on graphics driver version 31.0.101.3959, or on the better-suited Graphics Command Center beta app from 2019.
A friend who owns an Intel Arc (Alchemist) card also tells me he has had trouble getting interlacing to work with any driver on his discrete card, despite it being listed as supported.
In case this feature is being overlooked in driver development, I want to report that I use and very much appreciate it: I game and watch most of my media on CRT displays.

Another, similar issue I've run into recently concerns color depth support. From what I've been able to look up, my processor's iGPU supports 10-bit and 12-bit color depths, but I haven't been able to get either working on my PC. On a friend's Radeon setup I confirmed that both my VGA display and my HDMI-to-VGA adapter handle 10-bit and 12-bit deep color, yet on my i7 + RTX setup none of these modes work at all (only the switchable RGB and YPbPr 4:4:4 modes do). Since I value these features, I've been considering swapping my current NVIDIA GPU for a Radeon or Arc card in the future. I would prefer to wait for Battlemage as a direct replacement for my RTX, but both interlacing and deep color support for VGA adapters (not just native displays) would need to be present.
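
For what it's worth, deep color also costs link bandwidth: on HDMI, the TMDS clock scales with bits per component, so a 12-bit signal needs 1.5x the clock of an 8-bit one. A rough Python sketch of that math (the 340 MHz figure is the HDMI 1.4 single-link TMDS limit; the 1080p60 timing totals are the standard CEA values):

```python
# HDMI deep color scales the TMDS clock by bpc/8 relative to 8-bit RGB.

def tmds_clock_mhz(h_total, v_total, refresh_hz, bpc):
    pixel_clock_hz = h_total * v_total * refresh_hz
    return pixel_clock_hz * (bpc / 8) / 1e6

# 1080p60 (CEA totals 2200x1125) at 8/10/12-bit vs a 340 MHz HDMI 1.4 link:
for bpc in (8, 10, 12):
    clk = tmds_clock_mhz(2200, 1125, 60, bpc)
    print(f"{bpc}-bit: {clk:.2f} MHz ({'fits' if clk <= 340 else 'exceeds'})")
```

So even when deep color is exposed by the driver, higher-bandwidth modes may force a lower resolution or refresh rate through the same adapter.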

For now I am mainly concerned with interlacing support and would like to see this previously working feature fixed.

Slay3rOne
Beginner

I absolutely second this. In fact, I have spent quite some time trying various ways to achieve the highest VGA resolutions and frequencies on both old and recent hardware, using integrated GPU RAMDACs where available, and external HDMI/DisplayPort/USB-C to VGA adapters on recent hardware without analog outputs. My main displays are 27" 2K OLED and IPS high-refresh-rate panels, but I still use a few different models of CRT monitors as well.

Interlaced resolutions have always been an issue for me with these monitors on recent hardware, but I never had the components needed to try Intel iGPUs. I recently got to test one on an i7-9700K, and I was amazed to see interlaced resolutions working perfectly through both HDMI and DisplayPort adapters with the latest drivers. I had been looking everywhere to find out whether this still works on the latest 13th and 14th Gen CPUs, as I would switch from my main Ryzen 5900X to an Intel chip just to get access to interlaced resolutions again.

Your post suggests that it does, though with issues in the recent drivers. Since I use the iGPU only for passthrough, with either an RTX 4060 or an RTX 3080 doing the rendering, reverting to an older iGPU driver shouldn't matter much. So it looks like a viable solution to me, at least temporarily, while hopefully waiting for interlaced resolution support to be fixed in later driver releases.

That said, in my testing with the 9700K I stumbled upon a really annoying limitation. In normal progressive mode (CVT timings), the horizontal resolution can go all the way up to 4096 pixels with no problem, so I am able to run really high resolutions, limited only by the maximum pixel clock of the adapter I'm using: a Sunix DPU3000 DisplayPort-to-VGA adapter, which goes up to a 540 MHz pixel clock. That adapter handles interlaced resolutions perfectly, but as soon as I switch to interlaced, the Intel driver won't let me go over 2048 pixels in width!
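
For anyone who wants to sanity-check a mode against that 540 MHz cap, the estimate is just total pixels per frame times refresh rate. A minimal Python sketch, with illustrative blanking totals for the CRT mode (assumptions, not my actual timings; the 1080p60 totals are the standard CEA values):

```python
# Pixel clock estimate: clock = h_total * v_total * refresh.
# Blanking totals for the CRT mode below are illustrative assumptions.

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz / 1e6

# Sanity check against a known mode (CEA-861 1080p60 is 148.5 MHz):
print(pixel_clock_mhz(2200, 1125, 60))    # 148.5

# 1920x1440 @ 170 Hz with tight assumed totals of 2000x1475:
print(pixel_clock_mhz(2000, 1475, 170))   # ~501.5 -> under the 540 MHz cap
```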

For normal use, my preferred resolutions would be 1920x1440 at 170 Hz on my Sony GDM-F520 CRTs, and 2304x1440 at 154 Hz on my Sony GDM-FW900. The first, being under 2048 pixels wide, works, but the second is not possible, so I'm limited to 1920x1200 instead. Since interlacing allows going much higher than progressive scan, something like 2560x1920 at 130 Hz, or even 16:9 4K at 3840x2160 and 72 Hz, should be possible and perfectly within the limits of the hardware. Why that width cap exists, I don't know.
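
To put numbers on that: an interlaced field carries half the lines, so at a given field rate the required pixel clock is roughly half that of the equivalent progressive mode. A quick sketch, again with assumed blanking totals:

```python
# Interlaced pixel clock: one frame = two fields, so the frame rate is
# half the field (refresh) rate, roughly halving the required clock.
# Blanking totals are illustrative assumptions.

def interlaced_clock_mhz(h_total, v_total, field_rate_hz):
    return h_total * v_total * (field_rate_hz / 2) / 1e6

# 2560x1920i @ 130 Hz fields, assumed totals 2720x2000:
print(interlaced_clock_mhz(2720, 2000, 130))  # ~353.6 MHz

# 3840x2160i @ 72 Hz fields, assumed totals 4000x2250:
print(interlaced_clock_mhz(4000, 2250, 72))   # ~324.0 MHz -> both well under 540 MHz
```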

While displaying 4K on one of these monitors might not seem that useful, the fact that they display it and remain incredibly sharp (especially the F520) is a feat on its own, and watching 4K content at native resolution on a CRT is quite a nice thing. I was able to display 4K at 60 Hz progressive on an old Radeon R9 380X, with its 650 MHz-capable RAMDAC, after some CVT timing tweaking. Being able to use interlacing to gain some headroom and reach higher refresh rates would be really neat, and would avoid unnecessary flicker. I understand this particular use case is more than niche, but I'm sure the limitation is software-only, and seeing it raised would be really nice.
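
The timing tweaks are what make that possible: standard CVT blanking for 4K60 pushes the pixel clock past 700 MHz, while reduced blanking (CVT-RB) brings it under the 650 MHz RAMDAC cap. A quick comparison, using commonly cited approximate totals (a timing calculator may differ slightly):

```python
# Standard CVT blanking vs reduced blanking (CVT-RB) for 3840x2160 @ 60 Hz.
# Totals are approximate, commonly cited values.

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz / 1e6

print(pixel_clock_mhz(5088, 2335, 60))  # standard CVT: ~713 MHz -> over a 650 MHz RAMDAC
print(pixel_clock_mhz(4000, 2222, 60))  # CVT-RB:       ~533 MHz -> fits with headroom
```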

That said, since interlaced support is already part of the drivers and still available to this day, I hope it won't be dropped, especially since current hardware is perfectly capable of outputting such resolutions, and various DisplayPort and HDMI to VGA adapters support them without any issue.
