I’m running Linux Mint 18 with KDE Plasma 5 on a desktop PC.
lspci | grep -i vga
00:02.0 VGA compatible controller: Intel Corporation Xeon E3-1200 v3/4th Gen Core Processor Integrated Graphics Controller (rev 06)
01:00.0 VGA compatible controller: NVIDIA Corporation GM206 [GeForce GTX 960] (rev a1)
I’m trying to get the Intel GPU to handle the display, while using the dedicated nVidia card for CUDA only (Blender Cycles rendering).
I’ve set the Intel GPU as the primary display adapter in the BIOS and plugged my monitors into the motherboard.
I’ve modified my X.org config to use the “intel” device instead of “nvidia”, leaving the “nvidia” device unused by X.
Here’s my xorg.conf file:
Section "ServerLayout"
Identifier "layout"
Screen 0 "intel"
Inactive "nvidia"
#Screen 0 "nvidia"
#Inactive "intel"
EndSection
Section "Device"
Identifier "intel"
Driver "modesetting"
BusID "PCI:0@0:2:0"
Option "AccelMethod" "None"
EndSection
Section "Screen"
Identifier "intel"
Device "intel"
EndSection
Section "Device"
Identifier "nvidia"
Driver "nvidia"
BusID "PCI:1@0:0:0"
Option "ConstrainCursor" "off"
EndSection
Section "Screen"
Identifier "nvidia"
Device "nvidia"
Option "AllowEmptyInitialConfiguration" "on"
Option "IgnoreDisplayDevices" "CRT"
EndSection
Compared to the original config, the only change is that I swapped “intel” and “nvidia” in this pair of lines (the old values are the commented-out lines above):
Screen 0 "intel"
Inactive "nvidia"
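Whenever I change the layout, I verify which device X actually ended up using with these two commands (glxinfo comes from the mesa-utils package on Mint):

xrandr --listproviders
glxinfo | grep "OpenGL renderer"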
It mostly seems to work, but a few things strike me as really weird:
When X.org starts, the Plasma Desktop warns that the GPU doesn’t support OpenGL. Blender won’t run at all: it reports that no GLX extension was found. And the nVidia Optimus panel claims I’m using the nVidia GPU, not the Intel one.
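As far as I know, the nVidia driver ships its own GLX module that replaces the stock X.org one, so I checked which GLX module actually got loaded (assuming the default log location):

grep -i glx /var/log/Xorg.0.log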
The X.org log seems to confirm that X is driving the displays with the Intel GPU:
[ 1138.958] (II) intel(0): resizing framebuffer to 3840x1080
[ 1138.987] (II) intel(0): switch to mode 1920x1080@60.0 on HDMI3 using pipe 1, position (1920, 0), rotation normal, reflection none
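For reference, those lines are just the output of a grep over the default log location:

grep '(II) intel' /var/log/Xorg.0.log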
If I switch to the Intel GPU in the nVidia Optimus panel, the Plasma Desktop and Blender work (immediately! no X.org restart needed!), but my thermal widget doesn’t report the temperature for the nVidia card anymore.
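As long as the nvidia kernel module is still loaded, nvidia-smi (which ships with the driver) should still be able to report the temperature from a terminal, e.g.:

nvidia-smi --query-gpu=temperature.gpu --format=csv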
I compared CPU and GPU rendering in Blender, and the GPU renders 3 to 4 times faster than the CPU, so I guess CUDA really is working.
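My benchmark method was crude: render the same frame from the command line and compare wall-clock times, with the compute device toggled between CPU and CUDA in Blender’s User Preferences (scene.blend is a placeholder for my test file):

time blender -b scene.blend -f 1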
If I switch the Optimus panel back to the nVidia GPU, I can’t run Blender anymore (without logging out! it’s the same X.org session!), but the thermal widget displays the GPU temperature again!
If I’ve just finished rendering with Blender using CUDA, I can see the GPU temperature falling from 60°C, so the GPU really was used, even though nvidia-settings thinks it’s disabled (it won’t show me the PowerMizer or Thermal Control panels).
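Watching the card during a render should show the same thing; the -l 1 flag makes nvidia-smi refresh every second:

nvidia-smi --query-gpu=utilization.gpu,temperature.gpu --format=csv -l 1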
Also, why is the GLX extension not reported to processes when the nVidia card is selected in the Optimus panel? And why is it magically present again when I switch to Intel, without restarting the X.org server?
I’d expect to be using the nVidia GPU and still have the GLX extension reported for the Intel-driven display device in X.org. Maybe there should be a “Dual” mode, with both GPUs enabled? I know this isn’t the primary goal of Optimus, but it’s what I’m doing, and nvidia-settings is behaving weirdly: it acts as if the nVidia GPU were non-existent while I’m actually using it for CUDA.