Ollama doesn't work on Framework laptops with ujust

I was pulling my hair out because the first thing I did after installing this OS on my FW 13 was install Ollama, and it didn't work.

Turns out the ujust recipe uses the :rocm tag for the Docker container in the systemd service file, and that container just hangs forever and never starts. I'm not sure why.

All I did was switch to the main image for now, and sure enough the service starts and everything runs smoothly afterwards. Sharing here in case someone else hits this issue; a sketch of the change is below.
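For anyone who wants to make the same change, here's a minimal sketch. I'm assuming the unit is named ollama.service and the container runs via Docker as described above; the unit name, image path, and flags on your machine may differ depending on how ujust generated the service:

```bash
# Open the generated unit for editing (unit name assumed; check with
# `systemctl list-units | grep -i ollama` first):
sudo systemctl edit --full ollama.service

# In the editor, change the image tag on the ExecStart line from the
# ROCm build to the main image, e.g.:
#   docker.io/ollama/ollama:rocm  ->  docker.io/ollama/ollama:latest

# Then restart so the new image gets pulled:
sudo systemctl daemon-reload
sudo systemctl restart ollama.service
```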


Do you have an Intel Framework?


No. I have one of the first AMD ones.

Is Ollama actually using the GPU for you?

I've found this tutorial on how to get it working, but I don't know how to apply it in the context of Bluefin: ollama/docs/tutorials/amd-igpu-780m.md at ddb6dc81c26721a08453f1db7f2727076e97dabc · ollama/ollama · GitHub

PS: On CPU it actually works decently.
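For reference, the heart of that tutorial is forcing ROCm to treat the 780M iGPU as a supported GPU via an environment override. Here's a hedged sketch of running the :rocm image that way with Docker; HSA_OVERRIDE_GFX_VERSION=11.0.2 is the value commonly cited for the 780M (gfx1103), but it may vary with your ROCm version, and I haven't verified this on Bluefin:

```bash
# Run the ROCm image with the iGPU override for the Radeon 780M.
# /dev/kfd and /dev/dri must be passed through so ROCm can see the GPU.
docker run -d --name ollama \
  --device /dev/kfd --device /dev/dri \
  -e HSA_OVERRIDE_GFX_VERSION=11.0.2 \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  docker.io/ollama/ollama:rocm
```

Once it's up, `docker exec -it ollama ollama ps` should report something like "100% GPU" in the PROCESSOR column if the override took; if it says "100% CPU", ROCm still isn't seeing the iGPU.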

Just for giggles on the Framework Laptop 16, I downloaded Alpaca from the software store.

Installed Llama3.2 (small) for some fast GPU testing.

I enabled it in Alpaca, then proceeded to ask it to tell the complete saga of Lord of the Rings.

It was a short run, but you can see where it tapped the dGPU in Mission Center.

Then, when the output from the model stopped, so did the dGPU usage.

Available models. Easy-peasy.

And here are my Flatpak settings in Flatseal for the application.
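If you're reproducing this setup from the command line instead of Flatseal, the permission that matters for GPU use is device access. A sketch of the CLI equivalent, assuming Alpaca's Flathub id is com.jeffser.Alpaca:

```bash
# Grant the flatpak access to all devices (covers /dev/dri for the GPU):
flatpak override --user --device=all com.jeffser.Alpaca

# Confirm what the app is actually permitted to use:
flatpak info --show-permissions com.jeffser.Alpaca
```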