Update! Kyle Brenneman from NVIDIA has reportedly been working on offload support for quite a while now. Recently (08-05-2019) an extension was merged into GLXVND. This extension controls which GPU GLXVND dispatches OpenGL calls to.

One thing missing from the NVIDIA driver for Linux is full support for Optimus. The Linux kernel and Xorg do have PRIME, a technology which provides Optimus-like functionality. Unfortunately, PRIME alone isn't enough; NVIDIA themselves need to make changes to their driver. Partial changes have been made, but they only work partially and the experience is far from plug-and-play. We have managed to get PRIME working on one of our laptops with an NVIDIA GeForce GTX 1050 graphics card, without any screen tearing.

Be warned that the fix for screen tearing we used does have some drawbacks. At the moment, alt-tabbing out of a fullscreen application causes Xorg to freeze for roughly 10 seconds. This is probably caused by a mode switch.
Edit: It might not actually be Xorg that freezes, given the way we have set up Optimus; further testing is required.
Edit 2: The freezing is caused by enabling kernel mode setting; on the other hand, that is also what fixes the screen tearing.


An Optimus laptop contains two GPUs (Graphics Processing Units): an Intel integrated GPU, or iGPU, and an NVIDIA discrete GPU, or dGPU. The dGPU is faster but consumes more energy, so it is common practice to have the iGPU render everything except graphics-intensive applications. In the Linux world this is called PRIME offload, and sadly it is not currently supported by the proprietary NVIDIA driver. Because of this we have to resort to the less pleasant alternative: PRIME output.

You can use PRIME output to switch between the iGPU and dGPU. However, you need to restart Xorg every time you switch.

We are currently unable to turn off the dGPU while using the iGPU. However, the dGPU seems to enter a power state which makes its power consumption negligible.

Required files

This can be achieved with two bash scripts for switching, two Xsetup shell scripts to be run by your display manager, and an Xorg configuration file. We'll be using LightDM as our display manager, but it should be fairly easy to adapt this guide to the display manager of your choice.
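With LightDM, the Xsetup script is hooked up via the display-setup-script option. A minimal sketch, assuming the files live in /path/to (substitute your own directory; very old LightDM versions use a [SeatDefaults] section instead of [Seat:*]):

```
[Seat:*]
display-setup-script=/path/to/Xsetup
```

This goes into /etc/lightdm/lightdm.conf, or into a drop-in file under /etc/lightdm/lightdm.conf.d/.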


#!/bin/bash
# intel.bash: switch to the iGPU by removing the Xorg config that selects
# the NVIDIA GPU and pointing Xsetup at the Intel variant.
sudo rm /usr/share/X11/xorg.conf.d/optimus.conf
ln -sf "$PWD/Xsetup.intel" "$PWD/Xsetup"

sudo systemctl restart lightdm

#!/bin/bash
# nvidia.bash: switch to the dGPU by installing the Xorg config and
# pointing Xsetup at the NVIDIA variant.
sudo cp "$PWD/optimus.conf" /usr/share/X11/xorg.conf.d/optimus.conf
ln -sf "$PWD/Xsetup.nvidia" "$PWD/Xsetup"

sudo systemctl restart lightdm

#!/bin/sh
# Xsetup.intel: run by the display manager at startup when the iGPU is active.
xrandr --output <intel-output> --dpi <dpi>

#!/bin/sh
# Xsetup.nvidia: run by the display manager at startup when the dGPU is active.
# Route the iGPU's outputs through the NVIDIA GPU, then enable them.
xrandr --setprovideroutputsource modesetting NVIDIA-0
xrandr --auto

xrandr --output <nvidia-output> --dpi <dpi>

Section "ServerLayout"
    Identifier "layout"
    Screen 0 "nvidia"
    Inactive "intel"
EndSection

Section "Device"
    Identifier "nvidia"
    Driver "nvidia"
    BusID "<nvidia-pci-id>"
EndSection

Section "Screen"
    Identifier "nvidia"
    Device "nvidia"
    Option "AllowEmptyInitialConfiguration"
EndSection

Section "Device"
    Identifier "intel"
    Driver "modesetting"
    BusID "<intel-pci-id>"
    Option "AccelMethod" "none"
EndSection

Section "Screen"
    Identifier "intel"
    Device "intel"
EndSection

Section "Files"
    ModulePath "/usr/lib/nvidia/xorg"
    ModulePath "/usr/lib/xorg/modules"
EndSection

The paths used in this guide are confirmed to work on Arch Linux, Debian, and Debian derivatives such as Ubuntu or Mint. If you use a different distribution, you may need to change the paths.

We intentionally omitted PCI IDs, output identifiers and DPIs. These files should be placed together in a single directory anywhere on your computer. The file Xsetup should have its permissions set to 755; run chmod 755 Xsetup from a terminal opened in that directory.


Because each system is configured differently, we cannot provide you with a concrete pair of IDs. Fortunately, they are easy to find: run lspci, which will produce output similar to:

00:00.0 Host bridge: Intel Corporation Device 3e10 (rev 07)
00:01.0 PCI bridge: Intel Corporation Xeon E3-1200 v5/E3-1500 v5/6th Gen Core Processor PCIe Controller (x16) (rev 07)
00:02.0 VGA compatible controller: Intel Corporation UHD Graphics 630 (Mobile)
00:04.0 Signal processing controller: Intel Corporation Xeon E3-1200 v5/E3-1500 v5/6th Gen Core Processor Thermal Subsystem (rev 07)
00:12.0 Signal processing controller: Intel Corporation Cannon Lake PCH Thermal Controller (rev 10)
00:14.0 USB controller: Intel Corporation Cannon Lake PCH USB 3.1 xHCI Host Controller (rev 10)
00:14.2 RAM memory: Intel Corporation Cannon Lake PCH Shared SRAM (rev 10)
00:14.3 Network controller: Intel Corporation Wireless-AC 9560 [Jefferson Peak] (rev 10)
00:16.0 Communication controller: Intel Corporation Cannon Lake PCH HECI Controller (rev 10)
00:17.0 RAID bus controller: Intel Corporation 82801 Mobile SATA Controller [RAID mode] (rev 10)
00:1d.0 PCI bridge: Intel Corporation Cannon Lake PCH PCI Express Root Port #14 (rev f0)
00:1d.7 PCI bridge: Intel Corporation Cannon Lake PCH PCI Express Root Port #16 (rev f0)
00:1f.0 ISA bridge: Intel Corporation Device a30d (rev 10)
00:1f.3 Audio device: Intel Corporation Cannon Lake PCH cAVS (rev 10)
00:1f.4 SMBus: Intel Corporation Cannon Lake PCH SMBus Controller (rev 10)
00:1f.5 Serial bus controller [0c80]: Intel Corporation Cannon Lake PCH SPI Controller (rev 10)
01:00.0 VGA compatible controller: NVIDIA Corporation GP107M [GeForce GTX 1050 Mobile] (rev a1)
02:00.0 Ethernet controller: Realtek Semiconductor Co., Ltd. RTL8111/8168/8411 PCI Express Gigabit Ethernet Controller (rev 16)
03:00.0 Unassigned class [ff00]: Realtek Semiconductor Co., Ltd. RTS522A PCI Express Card Reader (rev 01)

We're looking for VGA compatible controllers, and in the output above there are two: one has the ID 00:02.0, the other 01:00.0. The latter tends to be the dGPU, but this is not always the case, so make sure to examine the output closely. You will need to modify these IDs slightly, because Xorg expects a slightly different format from the one lspci prints: PCI:1:0:0 and PCI:0:2:0 for the IDs above. Note that lspci prints the numbers in hexadecimal while Xorg's BusID is decimal, so an ID such as 3c:00.0 would become PCI:60:0:0.
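The conversion can be scripted. Here is a small sketch (the function name lspci_to_xorg is our own, not a standard tool) that turns an lspci ID into the format Xorg expects, including the hexadecimal-to-decimal conversion:

```shell
# Convert an lspci ID such as "01:00.0" into Xorg's "PCI:1:0:0" format.
# lspci prints bus/device/function in hexadecimal; Xorg's BusID is decimal.
lspci_to_xorg() {
    bus="${1%%:*}"        # "01"
    rest="${1#*:}"        # "00.0"
    dev="${rest%%.*}"     # "00"
    fn="${rest#*.}"       # "0"
    printf 'PCI:%d:%d:%d\n' "0x$bus" "0x$dev" "0x$fn"
}

lspci_to_xorg "01:00.0"   # → PCI:1:0:0
lspci_to_xorg "00:02.0"   # → PCI:0:2:0
```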


xrandr requires the name of an output when setting the DPI. If you do not provide a correct name, the Xsetup script will fail harmlessly. These names can be found by disconnecting all external monitors and running xrandr while logged into a desktop session, which will produce output similar to:

Screen 0: minimum 8 x 8, current 1920 x 1080, maximum 32767 x 32767
HDMI-0 disconnected primary (normal left inverted right x axis y axis)
DP-0 disconnected (normal left inverted right x axis y axis)
DP-1 disconnected (normal left inverted right x axis y axis)
eDP-1-1 connected 1920x1080+0+0 (normal left inverted right x axis y axis) 344mm x 194mm
   1920x1080     60.06*+  40.04  
DP-1-1 disconnected (normal left inverted right x axis y axis)
HDMI-1-1 disconnected (normal left inverted right x axis y axis)

This output was produced while using the dGPU.

As we can see, the only output with an associated resolution is eDP-1-1, and we can deduce that this is the output name we want.
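If you prefer not to eyeball the list, the connected output can be extracted with a one-liner. The sketch below filters a captured copy of the sample output above; on a live system you would pipe xrandr itself instead:

```shell
# Sample xrandr output captured above; substitute `xrandr` on a live system.
sample='eDP-1-1 connected 1920x1080+0+0 (normal left inverted right x axis y axis) 344mm x 194mm
DP-1-1 disconnected (normal left inverted right x axis y axis)
HDMI-1-1 disconnected (normal left inverted right x axis y axis)'

# Print the name of every output whose second field is exactly "connected":
printf '%s\n' "$sample" | awk '$2 == "connected" { print $1 }'
# → eDP-1-1
```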


For whatever reason, reading the EDID of a laptop monitor tends to fail with both drivers, so the DPI needs to be set manually. We recommend first commenting out the two xrandr --dpi commands and checking whether your DPI setting needs to be changed at all. If it does, uncomment them, pick an initial DPI value, see how it displays, and adjust accordingly. You can also run the command in a terminal to try values without relogging.
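Rather than guessing, a reasonable starting DPI can be computed from the resolution and the physical size xrandr reports (344mm x 194mm in the sample above): DPI = pixels / (millimetres / 25.4). A quick sketch using those sample values:

```shell
width_px=1920   # horizontal resolution
width_mm=344    # physical panel width reported by xrandr
dpi=$(awk -v px="$width_px" -v mm="$width_mm" \
    'BEGIN { printf "%.0f", px / (mm / 25.4) }')
echo "$dpi"   # → 142
```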

Kernel command line argument

To enable the experimental support for KMS (Kernel Mode Setting) you need to add a parameter to the kernel command line. This can be done by editing the file /etc/default/grub and appending nvidia-drm.modeset=1 to the variable GRUB_CMDLINE_LINUX_DEFAULT. You will then need to regenerate the GRUB configuration.

  • On Debian and Debian-based distros, run update-grub
  • On Arch Linux, run grub-mkconfig -o /boot/grub/grub.cfg
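For illustration, the edited line in /etc/default/grub could end up looking like this (quiet splash stands in for whatever options your file already contains; keep those and only append the new parameter):

```
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash nvidia-drm.modeset=1"
```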


After placing all the files into one directory, setting the kernel command line argument, setting your display manager to run the file Xsetup on startup, and rebooting, you should be able to use the scripts nvidia.bash and intel.bash to switch GPUs.

Running these scripts will log you out of your current session! Save everything before doing so!