I am developing an application that controls a secondary monitor in fullscreen mode with DirectX 10.
My application can change the resolution of the secondary monitor.
I have tested it on several machines with NVIDIA graphics cards and it works fine.
Now I am testing different hardware (a desktop with a GigaByte motherboard and integrated Intel HD 4000 graphics, and an industrial NANO-HM650 board with integrated Intel HD 3000) and I always hit the same problem:
I have connected the secondary monitor through an HDMI cable. The secondary monitor is a 1400x1050 projector.
Each time my application takes control of the secondary monitor, the resolution automatically changes to 1080p!!!
There is no way to change it.
I have the same problem with Multimon10.exe (a Microsoft sample that uses multiple monitors with DirectX), so it is not a problem with my application.
I have updated to the latest drivers and the problem is the same (on both the HD 4000 and HD 3000).
When you put the secondary monitor into fullscreen mode, it ALWAYS changes to 1080p automatically.
I am working with Windows 7 32-bit.
I have made several tests and I think the driver has a problem with the EDID information from the display.
I have put a converter between the graphics card and the secondary projector that changes the EDID information; the driver still changes the resolution of the monitor, but now to 1600x1200!
The same PC, the same application, the same monitor, and the driver does a different thing??
Any idea of what I can test?
Let me get this straight: you have an application that can control the resolution of your monitor(s), but when you tell it (via your application) to go to X resolution, it automatically defaults to 1080p instead. Is this correct?
But this effect only happens if you force FULLSCREEN mode in DirectX.
I have a projector connected as the secondary monitor to the Intel graphics card using an HDMI cable.
I have my application working in windowed mode at 1400x1050@60Hz (the native mode of the projector) and everything is OK. If I select fullscreen mode at 1400x1050@60Hz, the driver automatically changes to 1080p???
With the same system (PC + projector) but using an adapter to connect the projector to the graphics card (a converter from HDMI to 'Ethernet' and from Ethernet back to HDMI), the effect is different: the driver always goes to 1600x1200 automatically. The only change I have detected is that the EDID information has changed.
The resolution of your secondary is 1400x1050 - what is the resolution of your primary?
Also, if the max resolution is 1400x1050, and that is what it displays at in windowed mode (a la windowed fullscreen), is it a deal-breaker to disable fullscreen and leave it windowed (just curious)? It sounds like your app is not set to override the operating system (which is what automatically sets resolutions, and may be causing your projector to display at 1080p).
However, I'm not an expert at this kind of software. I would suggest looking over in the Visual Computing Source / Developer Zone here (specifically in their forums, but the rest of the site may be useful to you): http://software.intel.com/en-us/vcsource