Guide to using Hybrid Graphics on a Desktop PC

Hello, I am running a desktop PC with an NVIDIA RTX 3060 and an AMD Ryzen 5 7600, and I wanted to try out hybrid graphics, since it sounds like it could be helpful if I want to do machine learning, Blender, etc. on my dGPU and use my iGPU for everything else. I understand that hybrid graphics was developed with laptops in mind, so I haven't really found documentation dedicated to desktop systems on this topic. I have found that ujust offers enable-supergfxctl and configure-nvidia-optimus, but I am kind of lost as to what the difference even is (except that supergfxctl seems to be dedicated to laptops, though I have no idea if that's relevant here). Could someone give me a rundown on how best to approach this on Bazzite for my desktop PC?

You don’t need Optimus, etc.
I have an Intel Arc A380 and an NVIDIA RTX A4500. My monitors are connected to the Intel card. I use Bluefin-dx-nvidia, which, as the name says, ships the NVIDIA drivers. I use TensorFlow and CUDA to do machine learning. In a venv, I install tensorflow[and-cuda], which installs all the Python dependencies.
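
For reference, the whole setup is only a few commands; the venv path here is just an example:

python3 -m venv ~/venvs/tf             # create the virtual environment
source ~/venvs/tf/bin/activate         # activate it
pip install 'tensorflow[and-cuda]'     # TensorFlow plus its CUDA wheel dependencies

# Sanity check: an empty list means TensorFlow can't see the GPU
python -c "import tensorflow as tf; print(tf.config.list_physical_devices('GPU'))"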

If you run nvidia-smi in a terminal and see your GPU listed, you're good to go.

E.g.

> nvidia-smi
Sat May 17 22:30:19 2025       
+-----------------------------------------------------------------------------------------+
| NVIDIA-SMI 570.144                Driver Version: 570.144        CUDA Version: 12.8     |
|-----------------------------------------+------------------------+----------------------+
| GPU  Name                 Persistence-M | Bus-Id          Disp.A | Volatile Uncorr. ECC |
| Fan  Temp   Perf          Pwr:Usage/Cap |           Memory-Usage | GPU-Util  Compute M. |
|                                         |                        |               MIG M. |
|=========================================+========================+======================|
|   0  NVIDIA RTX A4500               Off |   00000000:01:00.0 Off |                  Off |
| 30%   33C    P8             16W /  200W |      13MiB /  20470MiB |      0%      Default |
|                                         |                        |                  N/A |
+-----------------------------------------+------------------------+----------------------+
                                                                                         
+-----------------------------------------------------------------------------------------+
| Processes:                                                                              |
|  GPU   GI   CI              PID   Type   Process name                        GPU Memory |
|        ID   ID                                                               Usage      |
|=========================================================================================|
|    0   N/A  N/A            5510      G   /usr/bin/gnome-shell                      3MiB |
+-----------------------------------------------------------------------------------------+

Hey John, thanks a lot for the reply! Do you know if GPU offloading (I hope that's the correct term) also works automatically like that for Blender?

Edit: Nvm, I found a thread regarding my question, so I will just roll with that one.

I'm not near my computer, but I think you can right-click the icon and select Run on GPU, or words to that effect.
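
You can also try launching it from a terminal with NVIDIA's PRIME render offload variables; a minimal sketch, assuming render offload is configured on your system:

# Ask the NVIDIA dGPU to render this one application
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia blender

Note that this only affects viewport/OpenGL rendering; as far as I know, Cycles finds CUDA/OptiX devices through the driver regardless of which GPU drives the display, so for final renders you just need to enable the dGPU under Edit > Preferences > System in Blender.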

