I have a NUC9i7QNX Ghost Canyon. It has two Thunderbolt 3 ports. I connected two eGPUs (Aorus Gaming Boxes), but only one of them works; each one works fine when connected separately.
There is also one NVMe drive in the compute-board slot.
Is the NUC9 Extreme Kit compatible with a setup of two eGPUs on the two Thunderbolt ports?
I suspect there is only one controller behind both Thunderbolt ports, with only four PCIe lanes, so two eGPUs are impossible. Is there maybe a way to configure the controller to give x2 PCIe to each device? That would be enough for my ML computing application.
A screenshot of the Thunderbolt Control Center About page is attached.
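A minimal sketch of how that lane-sharing suspicion could be checked (assuming the nvidia-smi tool bundled with the NVIDIA driver; this is a generic diagnostic, not something from the thread):

```python
# Minimal sketch: query each NVIDIA GPU's current and maximum PCIe link
# width via nvidia-smi (bundled with the NVIDIA driver). An eGPU reporting
# a current width well below its maximum points at the Thunderbolt path
# as the bottleneck.
import subprocess

result = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=name,pcie.link.width.current,pcie.link.width.max",
     "--format=csv"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)
```

Since Thunderbolt 3 tunnels at most a PCIe x4 link, both eGPUs would be expected to report x4 or less here regardless of what the cards support natively.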
Hello, sorry for the silence.
My problem was resolved by accident: there was a power blackout in my flat, and after it both cards started up nicely in Windows.
I then tried to force the previous non-working state back with random power-cycling and reconnection of the eGPUs, and succeeded.
Then, after I emulated an artificial "blackout" by pulling the whole PC's plug, the cards started to work again.
Now I have a nicely working configuration:
- Windows 10
- NVIDIA GTX 1660 Super in the internal PCIe x16 slot
- NVIDIA Aorus Gaming Box 2070 on the rear Thunderbolt 3 port
- NVIDIA Aorus Gaming Box 2080 Ti on the second rear Thunderbolt 3 port
- NVIDIA driver 466.47
- Thunderbolt controller driver 1.41.1054.0
To reproduce the good state, follow these steps:
1. Switch the PCIe bifurcation mode to Auto in the BIOS.
2. Shut down.
3. Connect both eGPUs.
4. Boot into Windows and check that Windows and applications can work with both cards (they should; a verification sketch follows this list).
5. If the cards don't work in the previous step, shut down and disconnect the PC's power plug for 10-20 seconds, then boot again. (If the cards still don't work after that, we probably have different problems.)
6. Shut down.
7. Disconnect both eGPUs from Thunderbolt (both must stay powered from the wall).
8. Install the internal card and connect its power.
9. Boot into Windows.
10. Connect either eGPU via Thunderbolt.
11. Wait while Windows initializes the card.
12. Connect the second eGPU.
13. Profit.
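For the check in step 4, here is a minimal sketch, assuming PyTorch with CUDA support is installed (the thread itself does not name a framework):

```python
# Minimal sketch (assumes PyTorch with CUDA support; hypothetical check,
# not from the thread): confirm every card is visible and can run a
# small kernel. Expect 2 devices at step 4 (the two eGPUs) and 3 at the
# end of the procedure (1660 Super + 2070 + 2080 Ti).
import torch

count = torch.cuda.device_count()
print(f"CUDA devices visible: {count}")

for i in range(count):
    name = torch.cuda.get_device_name(i)
    # Allocate and multiply a small tensor on each device as a smoke test.
    x = torch.randn(1024, 1024, device=f"cuda:{i}")
    y = x @ x
    torch.cuda.synchronize(i)
    print(f"cuda:{i} ({name}) OK, result norm = {y.norm().item():.2f}")
```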
The PCI state of this configuration is shown in the attachment.
My hypothesis about the nature of the phenomenon involves electrical potentials (maybe static electricity) and bad grounding of my outlet. I may well be wrong.
PS: Many thanks to @Ronny_G_Intel for the schematic, which I had been searching the detailed specs for.
Hello @GennadyShtekh
Thank you for posting on the Intel® communities.
Regarding these inquiries, please allow us to review this further; we will post back in the thread as soon as more details are available.
Best regards,
Andrew G.
Intel Customer Support Technician
My initial thinking is that two eGPUs connected to the NUC9i7QNX won't work because the NUC has only one Thunderbolt controller. That is the hardware side; I would also recommend checking with the operating system manufacturer, as I don't know how Windows* or Linux* would handle this integration. My understanding is that macOS* is able to handle it, but again, only if the system has two Thunderbolt controllers available.
I hope this helps.
Regards,
Ronny G
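For what it's worth, a minimal sketch of how the controller topology could be probed from Windows (a generic check using the nvidia-smi tool bundled with the NVIDIA driver, not an Intel-provided method):

```python
# Minimal sketch: list each GPU's PCI bus ID. eGPUs tunneled through the
# same Thunderbolt controller typically enumerate on nearby bus numbers
# downstream of one PCIe bridge, while an internal card sits on a
# separate root port.
import subprocess

out = subprocess.run(
    ["nvidia-smi", "--query-gpu=name,pci.bus_id", "--format=csv"],
    capture_output=True, text=True, check=True,
).stdout
print(out)
```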
Hello GennadyShtekh
We have not heard back from you so we will proceed to close this thread now. If you need any additional information, please submit a new question as this thread will no longer be monitored.
Best regards,
Andrew G.
Intel Customer Support Technician
