GTX 1050 with bumblebee is slow

I bought a new Dell XPS 15 9560 with dual graphics:

  1. Intel(R) HD Graphics 630 (Kaby Lake GT2)
  2. GeForce GTX 1050/PCIe/SSE2

I’ve installed openSUSE Tumbleweed with Bumblebee and the proprietary NVIDIA driver
375.66. Everything seems to work, but I don’t understand some of the results.

Xorg 1.19.3
kernel 4.11.2

nvidia-settings reports a resolution of 640x480.
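
For reference, this is how I point nvidia-settings at the Bumblebee X server (display :8, matching xorg.8.log below; -b none just skips the render bridge, which nvidia-settings doesn’t need):

optirun -b none nvidia-settings -c :8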

xorg.0.log is http://paste.opensuse.org/view/simple/87101674
xorg.8.log is http://paste.opensuse.org/view/raw/14081418
glxinfo - http://paste.opensuse.org/view/simple/92915670
optirun glxinfo - http://paste.opensuse.org/view/simple/73159504
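
As a quick sanity check of which GPU handles each case, I just grep the renderer string out of the two glxinfo runs above:

glxinfo | grep -i "opengl renderer"            # plain run, Intel/Mesa path
optirun glxinfo | grep -i "opengl renderer"    # through Bumblebee, should report the GTX 1050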

I started with plain glxgears:

Running synchronized to the vertical refresh. The framerate should be
approximately the same as the monitor refresh rate.

422 frames in 5.0 seconds = 84.255 FPS
300 frames in 5.0 seconds = 59.994 FPS
300 frames in 5.0 seconds = 59.999 FPS
300 frames in 5.0 seconds = 59.997 FPS
300 frames in 5.0 seconds = 59.996 FPS

optirun glxgears

11569 frames in 5.0 seconds = 2313.779 FPS
9155 frames in 5.0 seconds = 1830.960 FPS
9769 frames in 5.0 seconds = 1953.718 FPS
9915 frames in 5.0 seconds = 1983.000 FPS
9133 frames in 5.0 seconds = 1826.520 FPS

It looks reasonable. Intel runs with vsync.
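
To confirm the Intel numbers are only capped by vsync, I can also run it with vsync disabled via Mesa’s vblank_mode environment variable (this affects only the Mesa/Intel run, not optirun):

vblank_mode=0 glxgears    # uncapped framerate on the Intel GPU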

Then:

glxgears -fullscreen
Running synchronized to the vertical refresh. The framerate should be
approximately the same as the monitor refresh rate.

303 frames in 5.0 seconds = 60.412 FPS
300 frames in 5.0 seconds = 59.995 FPS
300 frames in 5.0 seconds = 59.993 FPS

optirun glxgears -fullscreen

236 frames in 5.0 seconds = 47.195 FPS
247 frames in 5.0 seconds = 49.213 FPS
243 frames in 5.0 seconds = 48.494 FPS

This is what I don’t understand: in fullscreen mode the NVIDIA card gives really
poor results. It’s the same or even worse with glxspheres.
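
One thing I want to rule out is the Bumblebee bridge itself: every frame rendered on the GTX 1050 has to be copied back to the Intel-driven display, and at the panel’s full resolution that copy is much larger than for the small default glxgears window. To compare the two bridges I can force each one explicitly (this assumes both the primus and VirtualGL packages are installed):

optirun -b primus glxgears -fullscreen      # force the primus bridge
optirun -b virtualgl glxgears -fullscreen   # force the VirtualGL bridge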

Another example is the Unigine Heaven benchmark. With the same setup on the same
hardware I get about 110 FPS on Windows 10 Home but only about 20 FPS on Linux
with optirun heaven.

It looks like the Intel graphics works well but the NVIDIA card doesn’t.
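
To check whether the GTX 1050 at least reaches its full clocks during the Heaven run, I can query it through Bumblebee while the benchmark runs in another terminal (the optirun wrapper keeps the card powered on; some fields may show N/A on a mobile GPU):

optirun nvidia-smi
optirun nvidia-smi -q -d CLOCK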

Some snippets from xorg.8.log

[155171.437] (II) xfree86: Adding drm device (/dev/dri/card1)
[155171.437] (II) xfree86: Adding drm device (/dev/dri/card0)
[155171.437] (EE) /dev/dri/card0: failed to set DRM interface version 1.4: Permission denied
[155171.439] (--) PCI:*(0:1:0:0) 10de:1c8d:1028:07be rev 161, Mem @ 0xec000000/16777216, 0xc0000000/268435456, 0xd0000000/33554432, I/O @ 0x0000e000/128, BIOS @ 0x????????/524288
[155171.439] (II) LoadModule: "glx"
[155171.439] (II) Loading /usr/lib64/nvidia/xorg/modules/extensions/libglx.so
[155171.441] (II) Module glx: vendor="NVIDIA Corporation"
[155171.441]    compiled for 4.0.2, module version = 1.0.0
[155171.441]    Module class: X.Org Server Extension
...

[155172.006] (II) NVIDIA(0): NVIDIA GPU GeForce GTX 1050 (GP107-A) at PCI:1:0:0 (GPU-0)
[155172.006] (--) NVIDIA(0): Memory: 4194304 kBytes
[155172.006] (--) NVIDIA(0): VideoBIOS: 86.07.3e.00.1c
[155172.006] (II) NVIDIA(0): Detected PCI Express Link width: 16X
[155172.006] (II) NVIDIA(0): Validated MetaModes:
[155172.006] (II) NVIDIA(0):     "NULL"
[155172.006] (II) NVIDIA(0): Virtual screen size determined to be 640 x 480
[155172.006] (WW) NVIDIA(0): Unable to get display device for DPI computation.
[155172.006] (==) NVIDIA(0): DPI set to (75, 75); computed from built-in default
[155172.006] (--) Depth 24 pixmap format is 32 bpp
[155172.006] (II) NVIDIA: Using 49152.00 MB of virtual memory for indirect memory
[155172.006] (II) NVIDIA:     access.
[155172.012] (II) NVIDIA(0): ACPI: failed to connect to the ACPI event daemon; the daemon
[155172.012] (II) NVIDIA(0):     may not be running or the "AcpidSocketPath" X
[155172.012] (II) NVIDIA(0):     configuration option may not be set correctly.  When the
[155172.012] (II) NVIDIA(0):     ACPI event daemon is available, the NVIDIA X driver will
[155172.012] (II) NVIDIA(0):     try to use it to receive ACPI event notifications.  For
[155172.012] (II) NVIDIA(0):     details, please see the "ConnectToAcpid" and
[155172.012] (II) NVIDIA(0):     "AcpidSocketPath" X configuration options in Appendix B: X
[155172.012] (II) NVIDIA(0):     Config Options in the README.
[155172.038] (II) NVIDIA(0): Setting mode "NULL"
[155172.041] (==) NVIDIA(0): Disabling shared memory pixmaps
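
The "failed to set DRM interface version 1.4: Permission denied" line for /dev/dri/card0 is something I also want to double-check. These are the basic permission checks I run (nothing distribution-specific assumed):

ls -l /dev/dri/    # ownership and group of the DRM nodes
id -nG             # my groups; video is the usual group for /dev/dri access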

Can somebody explain to me what could be wrong?

This seems like a relevant thread to start with:

[url]https://github.com/Bumblebee-Project/Bumblebee/issues/287[/url]

Another:

[url]https://github.com/Bumblebee-Project/Bumblebee/issues/478[/url]