```
❯ ujust ollama-web
error: Justfile does not contain recipe `ollama-web`.
```

I then tried:

```
❯ ujust ollama install-open-webui
open-webui container already exists, skipping...
Ollama Web UI container started. You can access it at http://localhost:8080
```
But neither Ollama nor Open WebUI is actually running. If I run `ollama serve` and then `ollama list`, I can see that my models exist, but there's no open-webui.
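For anyone else hitting this, a quick way to see what's actually there (assuming the recipe creates Podman containers named after ollama/open-webui, which the "already exists" message suggests):

```
# List all containers, running or not, and look for the two in question
podman ps --all | grep -E 'ollama|open-webui'

# Probe the port the recipe says the UI listens on
curl -sI http://localhost:8080 | head -n1
```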
I posted the comment you quoted. That had nothing to do with Ollama; I was just trying to get PyTorch to run with acceleration on my AMD GPU (which worked).
I was not aware of `ujust ollama`, but I tried it just now with the same results you posted. There seems to be no web interface running, just regular Ollama, which I interact with from the terminal via `ollama run <model>`.
```
❯ ujust ollama
Usage: ujust ollama <option>
<option>: Specify the quick option to skip the prompt
Use 'install' to Install ollama container
Use 'install-open-webui' to Install open-webui container
Use 'remove' to remove the ollama container
Use 'remove-open-webui' to remove the open-webui container and persistent volume
```
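If I'm reading those options right, the intended flow is presumably something like this (a sketch on my part; the recipe names above are the only part I've verified):

```
ujust ollama install             # set up the ollama container
ujust ollama install-open-webui  # set up the open-webui container on top of it
```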
`systemctl start --user ollama-web.service` worked for me.
I did uninstall/reinstall though, so that might have set some things right. My models still exist, as do my open-webui login and cookies, but the chat history is gone (fine in my case).
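If the service doesn't come up for you, its status and logs would be the first place I'd look (plain systemd commands, nothing specific to this recipe):

```
systemctl --user status ollama-web.service
journalctl --user -u ollama-web.service -e   # jump to the end of the service log
```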
Personally, I had configured the brew service to start automatically, which is why it was already running in my case. Stopping it fixed things for me as well, which seems to indicate that the recipe still tries to start the service (and that was failing for me because it was already running?).
I can’t reboot right now, but I suspect that if you restart, it should start everything automatically.
I suspect Ollama uses a fair amount of RAM, so it’s not set to autostart.
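For reference, this is roughly how I found and stopped the conflicting brew service (assuming `ollama` is the formula name and 11434 is its default port, which it was for me):

```
brew services list          # shows ollama as "started" if autostart is on
brew services stop ollama   # stop it and unregister it from starting at login
ss -ltnp | grep 11434       # confirm nothing else is still holding the port
```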
I was going to suggest the standard way of enabling systemd services, but got:
```
> systemctl --user enable ollama-web.service
Failed to enable unit: Unit /run/user/1400601103/systemd/generator/ollama-web.service is transient or generated.
```
Not sure what that is about (I've never seen it before), but generally systemd will auto-start services on login if you enable them this way. Perhaps it will work for you, though?
If not, another idea is to create a .desktop file in ~/.config/autostart that runs the start command.
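Something like this, untested, with the Exec line being my guess at the right start command:

```
mkdir -p ~/.config/autostart
cat > ~/.config/autostart/ollama-web.desktop <<'EOF'
[Desktop Entry]
Type=Application
Name=Ollama Web UI
Exec=systemctl --user start ollama-web.service
EOF
```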
Or perhaps someone more knowledgeable can suggest something better…
That’s some high-quality documentation! If I could edit the page, I’d probably change the bit about checking PyTorch to read `python -c "import torch; print(torch.cuda.is_available())"`, though I suppose most people running the NGC container are going to recognize that they were supposed to start a Python interpreter to execute the commands.
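That is, the one-liner form; it should print `True` on a working GPU build (ROCm builds too, since they expose the `torch.cuda` API):

```
python -c "import torch; print(torch.cuda.is_available())"
# True
```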
Damn, any time I look at these docs, things have gotten even simpler and even better. Straight Ollama works great, but now there’s even a simple Flatpak app! Thank you for all the hard work, everyone!