Monday, 4 May 2020

Dual-GPU support: Launch on the discrete GPU automatically

*reality TV show deep voice guy*

In 2016, we added a way to launch apps on the discrete GPU.

*swoosh effects*

In 2019, we added a way for that to work with the NVidia drivers.

*explosions*

In 2020, we're adding a way for applications to launch automatically on the discrete GPU.

*fast cuts of loads of applications being launched and quiet*

Introducing the (badly-named-but-if-you-can-come-up-with-a-better-name-you're-ready-for-computers) “PrefersNonDefaultGPU” desktop entry key.

From the specifications website:
If true, the application prefers to be run on a more powerful discrete GPU if available, which we describe as “a GPU other than the default one” in this spec to avoid the need to define what a discrete GPU is and in which cases it might be considered more powerful than the default GPU. This key is only a hint and support might not be present depending on the implementation. 
And support for that key is coming to GNOME Shell soon.

TL;DR

Add “PrefersNonDefaultGPU=true” to your application's .desktop file if it can benefit from being run on a more powerful GPU.
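
For example, a minimal .desktop file using the key could look like this (the name and Exec line are placeholders for your own application):

  [Desktop Entry]
  Type=Application
  Name=My Game
  Exec=my-game
  # Hint to run this application on the more powerful GPU when available
  PrefersNonDefaultGPU=true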

We've also added a switcherooctl command to recent versions of switcheroo-control so you can launch your apps on the right GPU from your scripts and tweaks.
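
For instance, to see which GPUs were detected and to launch a program on one of them (glxgears is only a stand-in for your application; check switcherooctl's help for the exact options shipped in your version):

  switcherooctl list                  # list the detected GPUs, and which one is the default
  switcherooctl launch glxgears       # run a command on the non-default GPU
  switcherooctl launch -g 0 glxgears  # or pick a specific GPU by its index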

13 comments:

Frédéric Boulet said...

Hi,
Do you believe that this could also be used so that gnome-shell itself uses the dedicated GPU instead of the default one?

Chromiumbook User said...

Will there be some kind of conditional option for whether the system is on battery or AC power?

Thaodan said...

Is this an extension to the Freedesktop.org spec? If not shouldn't it be X-GNOME-PrefersNonDefaultGPU?

Bastien Nocera said...

> Do you believe that this could also be used so that gnome-shell itself uses the dedicated GPU instead of the default one?

No, that's not what it's designed for.

Bastien Nocera said...

> Will there be some kind of conditional option for whether the system is on battery or AC power?

That's not planned.

Bastien Nocera said...

> Is this an extension to the Freedesktop.org spec? If not shouldn't it be X-GNOME-PrefersNonDefaultGPU?

You should consider reading the post a second time...

Ole Laursen said...

PrefersFastestGPU?

Do I win a cake?

"Non default" hardcodes the assumption that the slower GPU is default.

Bastien Nocera said...

> PrefersFastestGPU?

Unfold the threads in that merge request, and see the options that were already discussed. You can't call it "fastest GPU" because you don't know whether it will be the fastest GPU... You win nothing :)

Tomas Janousek said...

It's a bit disappointing that your post omits the important fact that this only works if your primary X server is configured to load the nvidia driver, which is a setup most people don't have, because it usually means the dedicated GPU will consume battery power all the time, rendering this whole "run only select apps on it" idea moot.

Sure, if you have the latest generation of nvidia hardware, then it might just work (http://us.download.nvidia.com/XFree86/Linux-x86_64/440.82/README/dynamicpowermanagement.html, “This feature requires a Turing or newer GPU.”). Many people, however, use bumblebee, bbswitch, nvidia-xrun or some combination of these to achieve acceptable battery life. For them, the feature you described will not work at all.

Bastien Nocera said...

> It's a bit disappointing that your post omits the important fact that this only works if your primary X server is configured to load the nvidia driver

There's a fair number of systems where it works that don't use the NVidia driver, such as Intel+Radeon or Radeon APU+Radeon GPU setups, or setups with NVidia cards using the nouveau driver.

If you think the NVidia drivers don't work well, or well enough, you should definitely bring that up with them. I have no power over that, nor do I claim that the little amount of code I wrote supports all the different use cases. Figuring out how to make NVidia cards use less power is definitely not in my remit, and downgrading to X11 is definitely not in my plans either ;)

illwieckz said...

Hi, I just discovered this recently (from here: https://gitlab.com/xonotic/xonotic/-/merge_requests/66)

I'm a bit worried about that “PrefersNonDefaultGPU” key, not about the choice of words, but because its design seems to imply a behaviour I'm not sure is good.

This key name assumes the default GPU is always the worst one for the application, something the application developer can never assume.

Let's imagine I have a powerful-enough AMD Vega-based APU I want to use for everything on my desktop, including games and CAD software, plus a discrete NVidia GPU I would only use for some proprietary CUDA jobs I don't have a proper alternative to (and nothing else). The “PrefersNonDefaultGPU” key would run games with drivers I don't want to use at all.

I don't know how the “Launch using Integrated Graphic Card” option is done, but I would think of a boolean key named something like "CanUseArbitraryGPU". When this key is set, the menu could list the available GPUs by their real names, adding a “(default)” string to the name of the default one.

This would also fit well for people who would like to run one game on a given GPU but another game on another GPU, for example to work around a driver compatibility issue, or to get some features.

In the first case, we can imagine a game that is broken with the NVidia drivers on Linux but not with the Intel or AMD ones, given the former would be the discrete GPU and the latter the integrated one.

In the second case, we have the real issue of Gallium Nine (a native Direct3D 9 implementation) only working on Gallium drivers, so it's not wrong for a casual gamer who plays simple games and is satisfied with the integrated Intel GPU in his desktop computer to plug in an old, obsolete Radeon GPU just to play the game of his childhood. You may say: wait, an old game running on Wine would not ship a .desktop file, but your game library management software can write one. Lutris knows how to create .desktop files for games and knows how to integrate with `gamemode`, so we can assume Lutris will know how to integrate such .desktop options, thereby integrating the brokenness too.

-✀--- I cut my comment here because Blogger does not allow comments larger than 4096 characters and my comment is 4086 characters long… wait, it may count bytes and not UTF-8 characters; see below for the continuation. -----

illwieckz said...

[Continuation] For games that need to be started on the gaming GPU (the game developer can't assume the discrete one is the gaming GPU), that would be a desktop preference (like a GNOME preference), somewhere near the display options in the preferences window, to set which one is the gaming GPU.

Basically the .desktop key would be something like “PrefersGPUClass” with a “Gaming” value. That makes other values (other classes) possible; the default value would be “Default”.

We can also imagine another key the application developer or packager would never use and never ship: “PrefersGPU”. This key would receive a unique identifier from the user's computer. This option would only be there to allow the user to configure an application to start with a given GPU, storing the custom .desktop in ~/.local/share/applications.

Also, as a game developer myself, I test games on multiple configurations and multiple GPUs, and I don't rule out putting a weak GPU in a computer that already has an integrated GPU just to be able to test the weak GPU. I would not appreciate seeing my games start to use the weak GPU that is only there for testing my own projects.

Note that with external GPUs (eGPUs), the latter scenario may become popular among game developers because it would allow them to hotplug/unplug various GPUs for testing their games.

No one can assume someone else's non-default GPU is the one to use for a given task.

Having more than one GPU is not only for power consumption; having more than one GPU is not only for laptops; having more than one GPU is not only for gaming; having more than one GPU is not always about using the non-primary one as a gaming GPU. Also, GPUs can be used for compute, and we can't assume a secondary GPU is for rendering.

To me that design brings more problems than it solves.

As a game developer, I'm not sure I will use this key, given it may break someone else's environment that was working before, because there the GPU to use is already the default one.

Bastien Nocera said...

illwieckz: a blog's comment section is not the place to start a discussion. I didn't read the comment(s).