Manjaro Linux with Bumblebee on the Thinkpad X1 Extreme in 2019.
As I mentioned in a previous blog post, I switched from a 2017 15" MacBook Pro to a Thinkpad X1 Extreme in April of this year.
Although I’m still using Windows on this machine in order to see what all the fuss is about (BTW, WSL is amazing), the Linux itch could no longer be ignored.
What makes this laptop slightly more challenging than some of my previous Linux laptop and Optimus (dynamically switching graphics) adventures is that in this case the NVIDIA dGPU is hard-wired to the external outputs (HDMI and DisplayPort over Thunderbolt).
My requirements for Linux on the Thinkpad were as follows:
- Disconnection and reconnection to an external monitor should be supported, without me having to reboot or even log out and back in again.
- It should be possible to use the NVIDIA dGPU when required, for example for PyTorch and other CUDA-dependent applications.
- The laptop should be as battery-efficient as possible, so NVIDIA should only be active when absolutely necessary.
- Suspend and resume should also work.
In short, I should be able to use the Thinkpad with Linux like I do when it runs Windows.
To my surprise, Manjaro Linux, in contrast to Ubuntu 19.04, facilitated meeting all of the requirements listed above.
Why Manjaro?
Ubuntu is my go-to distribution, but with the Thinkpad I was not able to get very far with the latest release 19.04.
Attempting to switch between the NVIDIA GPU and integrated graphics with prime-select almost never worked. I sat staring at a frozen desktop (right after login) more times than I wish to remember. Bumblebee was even less successful. The whole adventure ended when the only desktop I could get was 640x480 or something similar.
(I have written before about some of the issues caused by Ubuntu’s gpu-manager. Too much magic.)
For years now, I have been impressed by the level of detail on the helpful Arch Linux wiki, and the great deal of effort its developers and users put into the distribution.
However, Arch does require more hands-on time from its users than I currently have available.
Fortunately, Manjaro is an Arch-based distribution that has already taken care of many of these details, which further motivated my decision to try it.
Important update on 2019-08-03: Choose KDE and choose kernel 5.2
I started with the Gnome-based Manjaro. Unfortunately, gnome-shell has the nasty habit of latching on to the /dev/nvidia* devices when the NVIDIA comes online, after which it becomes difficult to switch the NVIDIA back off without logging out and in again.
The whole point of this exercise was to avoid that inconvenience.
It took me about 30 minutes to install the necessary Manjaro packages to convert the installation to KDE (see the wiki page on the topic).
KDE never latches on to /dev/nvidia*, and the laptop switches the NVIDIA off as soon as I kill intel-virtual-output before monitor disconnection.
Furthermore, the Thunderbolt3 Workstation Dock at work gave some issues, which were all solved when I upgraded from Linux kernel 4.19 to 5.2.4.
Setup instructions.
The following two subsections explain step-by-step how to get Manjaro working with hybrid graphics on the Thinkpad X1 Extreme.
Configure graphics hardware with mhwd.
Use the useful little Manjaro Hardware Detection command, mhwd, to set up bumblebee-based hybrid graphics.
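Something along these lines should do it (assumption: video-hybrid-intel-nvidia-bumblebee is the relevant profile name for this 2019-era stack; run mhwd -l to see what is available on your system):

```shell
# Install the bumblebee-based hybrid Intel/NVIDIA driver profile.
# The profile name is an assumption; verify with `mhwd -l` first.
sudo mhwd -i pci video-hybrid-intel-nvidia-bumblebee
```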
Ensure that your user account belongs to the bumblebee group:
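```shell
# Add your user to the bumblebee group (gpasswd is one way; usermod -aG
# also works). Membership takes effect at your next login.
sudo gpasswd -a $USER bumblebee
```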
After this, it is probably a good idea to reboot.
The bumblebee daemon, group permissions and optirun.
Check that the bumblebeed service is running:
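```shell
# Confirm the bumblebee daemon is up.
systemctl status bumblebeed
```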
If it’s running, you can try the following test:
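```shell
# Classic bumblebee smoke test: render the gears demo on the NVIDIA GPU.
# Assumes glxgears is available (typically from the mesa-demos package).
optirun glxgears -info
```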
You should see the well-known 3D rotating gears example, and on stdout you should see information about the graphics card, including a really long list of GL extensions.
The first few lines should look something like this (the renderer string and driver version will vary with your hardware and driver):
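```
GL_RENDERER   = GeForce GTX 1050 Ti with Max-Q Design/PCIe/SSE2
GL_VERSION    = 4.6.0 NVIDIA 430.26
GL_VENDOR     = NVIDIA Corporation
GL_EXTENSIONS = GL_AMD_multi_draw_indirect GL_ARB_arrays_of_arrays ...
```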
If instead you see something like the following:
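```
[ERROR]You've no permission to communicate with the Bumblebee daemon.
Try adding yourself to the 'bumblebee' group
```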
… check that your user currently has the bumblebee group active by typing:
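```shell
# List the groups active in the current session.
groups
```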
If you don’t see bumblebee in the output list, “log in” to the group by typing the following:
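```shell
# Start a subshell with the bumblebee group active,
# without having to log out and back in.
newgrp bumblebee
```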
… and then try the optirun command again.
Connecting an external display.
Up to now, you will have been working on the laptop’s built-in screen.
Also, the NVIDIA should be switched off, which you can confirm as follows:
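```shell
# bbswitch reports the dGPU power state; expect OFF here.
cat /proc/acpi/bbswitch
```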
When you connect a monitor to the HDMI or Thunderbolt 3 output of the ThinkPad, it is wired directly to the NVIDIA dGPU. We need some sort of trick to let the Intel graphics driver use the NVIDIA’s output as a virtual display.
This is exactly what the Intel-developed tool intel-virtual-output does!
Fixing xorg.conf.nvidia
Before continuing, make two changes to the file /etc/bumblebee/xorg.conf.nvidia.

In the “Device” section, change the option UseEDID to true. By default this is set to false, which can prevent the bumblebee X server from correctly detecting the resolution of your external display.
Also in the Device section, comment out the line with Option "ConnectedMonitor" "DFP". Without this change, the external monitor connected via my ThinkPad Thunderbolt 3 workstation dock at work would not come on: the bumblebee X server’s log showed it trying to use DFP-0.3, whilst the monitor in this case is named DP-1.
At home, I have the monitor connected directly to the Thunderbolt 3 port of the ThinkPad. In that case, the monitor was indeed named DFP-1 (or somesuch), so commenting out the ConnectedMonitor option was not necessary.
With these edits applied, xorg.conf.nvidia looks roughly like this (a sketch based on the stock bumblebee file; your BusID and other defaults may differ):
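```
# /etc/bumblebee/xorg.conf.nvidia (sketch; exact defaults vary per install)
Section "ServerLayout"
    Identifier  "Layout0"
    Option      "AutoAddDevices" "true"
    Option      "AutoAddGPU" "false"
EndSection

Section "Device"
    Identifier  "DiscreteNvidia"
    Driver      "nvidia"
    VendorName  "NVIDIA Corporation"
    Option      "ProbeAllGpus" "false"
    Option      "NoLogo" "true"
    Option      "UseEDID" "true"
#   Option      "ConnectedMonitor" "DFP"
EndSection

Section "Screen"
    Identifier  "Screen0"
    Device      "DiscreteNvidia"
EndSection
```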
Activating the external display.
Now, with the monitor connected, do the following:
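```shell
# Start the bumblebee X server on the NVIDIA and attach its outputs
# to the primary Intel X server as virtual displays.
optirun -b none intel-virtual-output
```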
This will run an additional X server for the NVIDIA, using the proprietary NVIDIA drivers, and then run intel-virtual-output against that display. The -b none option specifies that optirun should not use any bridge mechanism (such as primus or vgl) to relay the command’s drawn output back to our main X. No bridge is needed because intel-virtual-output hooks up the NVIDIA output, which is directly connected to the external port, at a lower level as a virtual display local to the Intel (primary) X.
The upshot of this is that your currently running Intel-GPU X server now has an additional display that you can manage using all of the existing tools! Just to ram the point home: you can drag any window across from the laptop’s display to the external monitor and back.
Before you disconnect your laptop, simply kill the intel-virtual-output process, which will bring all of your windows back to the laptop’s built-in display.
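For example:

```shell
# Terminate intel-virtual-output (-f matches the full command line);
# windows return to the built-in display and the NVIDIA can power down.
pkill -f intel-virtual-output
```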
Running OpenGL apps in connected mode.
With the external monitor connected, you can run any OpenGL apps using the NVIDIA dGPU by prepending DISPLAY=:8.
For example:
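```shell
# Run the gears demo on the bumblebee X server (display :8), i.e.
# rendered by the NVIDIA and shown on the external monitor.
DISPLAY=:8 glxgears -info
```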
Running CUDA apps.
In all cases (connected and disconnected), running CUDA apps is as simple as the following PyTorch mini-demo:
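```shell
# Minimal check that CUDA is reachable from PyTorch.
# Assumes a CUDA-enabled PyTorch in the active Python environment.
optirun -b none python -c "import torch; print(torch.cuda.get_device_name(0))"
```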
Note that here we use -b none again, as we only want optirun to activate the NVIDIA (if powered down) and set up the correct library paths so that the process we are invoking can talk directly to the GPU.
Power consumption.
One of the best things about this setup is how little power the laptop consumes when the NVIDIA is powered down, which should happen automatically when you have no optirun sessions ongoing.
After disconnecting from the external monitor, and double-checking with cat /proc/acpi/bbswitch that the NVIDIA is off, TLP reports not much more than 5 W of consumption at idle, with Manjaro gnome-shell, Emacs 26.2 in graphics mode, gnome-terminal with three tabs, and Chromium with four tabs running. That level was attained with minimal configuration effort from my side.
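If you want to check this on your own machine, tlp-stat’s battery report includes the instantaneous draw (assumption: the kernel exposes the power_now field for this battery):

```shell
# The power_now line reports the current battery draw in mW.
sudo tlp-stat -b | grep power_now
```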
I get similar results with KDE instead of Gnome.
Conclusions.
I was quite pleasantly surprised by how well Manjaro runs on the Thinkpad X1 Extreme.
Although I would like a few more months of testing, this configuration could easily work as one’s daily driver.
The fact that one can have a workstation-class Linux-running laptop with low idle power consumption, yet with the ability to activate CUDA hardware when required, is compelling.