I’m trying to get Ollama working with Bazzite (to make use of my AMD iGPU when not gaming) but am not sure where to start. H…hh…hallppp
You can follow the instructions on hub.docker.com:
https://hub.docker.com/r/ollama/ollama
Instead of `docker`, use `podman`.
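For an AMD GPU, the Docker Hub page points at the `ollama/ollama:rocm` image, and the container needs the kernel's compute and render devices passed through. A rough sketch of the podman equivalent (the volume name and the `HSA_OVERRIDE_GFX_VERSION` value are examples — many iGPUs need that override set to a nearby supported gfx target, and some don't need it at all):

```shell
# Pull and run the ROCm build of Ollama with podman.
# /dev/kfd and /dev/dri expose the AMD GPU to the container.
podman run -d \
  --device /dev/kfd \
  --device /dev/dri \
  -e HSA_OVERRIDE_GFX_VERSION=11.0.0 \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama \
  ollama/ollama:rocm

# Then pull and chat with a model inside the container
# (model name is just an example):
podman exec -it ollama ollama run llama3.2
```

Check `podman logs ollama` afterwards — it prints whether the GPU was detected or it fell back to CPU.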
I do it with `brew install ollama`
- even simpler than running a docker image. Then just follow the prompts to get it to auto-start, etc.
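If you go the brew route, the whole thing is roughly (the model name is just an example):

```shell
# Install Ollama via Homebrew
brew install ollama

# Register it as a background service so it auto-starts
brew services start ollama

# Pull and chat with a model
ollama run llama3.2
```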
That is… I used to, until I found this, which can download and run it for you in a chat UI, along with other LLMs (sidestepping the terminal entirely):
Both are strongly recommended - have fun!
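Whichever route you take, once the server is up it exposes a local REST API on port 11434, so anything can talk to it. A minimal sketch using only the standard library (the host, port, and model name are assumptions — adjust to your setup):

```python
import json
from urllib import request

def build_generate_request(model, prompt, host="http://localhost:11434"):
    """Build a POST request for Ollama's /api/generate endpoint.

    stream=False asks for a single JSON response instead of a
    stream of partial chunks.
    """
    payload = {"model": model, "prompt": prompt, "stream": False}
    return request.Request(
        f"{host}/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_generate_request("llama3.2", "Why is the sky blue?")

# With a running server you'd then do:
#   with request.urlopen(req) as resp:
#       print(json.loads(resp.read())["response"])
```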
That’s also a possibility. It depends on how you want to use it.