Has ujust ollama-web been removed?

❯ ujust ollama-web                                                                                                                                                                                           
error: Justfile does not contain recipe `ollama-web`.

I then tried

❯ ujust ollama install-open-webui                                                                                                                                                                      
open-webui container already exists, skipping...
Ollama Web UI container started. You can access it at http://localhost:8080

But neither Ollama nor Open WebUI is actually running. If I run ollama serve and then ollama list, I can see that my models exist, but there is no open-webui.
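In case it's useful for debugging, the ujust recipe runs both pieces as podman containers, so their state can be checked directly. I'm assuming the container names ollama and ollama-web here, based on the quadlet files the recipe installs:

```shell
# Check whether the ujust-created containers exist and are running
# (names "ollama" and "ollama-web" are assumed from the quadlet files)
podman ps -a --filter name=ollama --format "{{.Names}}: {{.Status}}"
```

If the containers show up as "Created" or "Exited" rather than "Up", the install step worked but nothing actually started them.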

This thread still mentions ujust ollama-web

Hello,

I posted that comment you quote. That had nothing to do with ollama, I was just trying to get pytorch to run on my AMD GPU with acceleration (which worked).

I was not aware of ujust ollama, but I tried it just now with the same results you posted. There seems to be no web interface running, just regular ollama, which I interact with from the terminal via ollama run <model>.

It’s just ujust ollama now:

❯ ujust ollama
Usage: ujust ollama <option>
  <option>: Specify the quick option to skip the prompt
  Use 'install' to Install ollama container
  Use 'install-open-webui' to Install open-webui container
  Use 'remove' to remove the ollama container
  Use 'remove-open-webui' to remove the open-webui container and persistent volume

That page is old so I’ve hidden it. AI and Machine Learning is the docs page.

I read the ujust recipe and it seems to install a user systemd service. Check:

> ls ~/.config/containers/systemd/
ollama.container  ollama.network  ollama-web.container

> systemctl status --user ollama-web.service 
○ ollama-web.service - An Ollama WebUI container
     Loaded: loaded (/var/home/myuser/.config/containers/systemd/ollama-web.container; generated)
    Drop-In: /usr/lib/systemd/user/service.d
             └─10-timeout-abort.conf
     Active: inactive (dead)
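For what it's worth, those .container files are Quadlet units: podman ships a systemd generator that turns each of them into a .service at login, which is why the unit shows up as "generated" in the status output. If I'm reading the docs right, the generated output lands in the per-user generator directory:

```shell
# Quadlet-generated services live under the user's runtime generator dir
ls "/run/user/$(id -u)/systemd/generator/" 2>/dev/null | grep ollama
```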

I tried starting it, but it timed out and failed:

> systemctl start --user ollama-web.service

Reading the logs, it seems you need to stop the daemon. It may have worked for you but if not, try:

> brew services
Name   Status  User            File
ollama started myuser         ~/.config/systemd/user/homebrew.ollama.service

If the service is running, stop it:

> brew services stop ollama
Stopping `ollama`... (might take a while)

> systemctl start --user ollama-web.service
# should work now!
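One way to confirm this kind of conflict is to check what is holding Ollama's default port (11434) before starting the container; the brew service and the container can't both bind it:

```shell
# Print any listener on ollama's default port; reports free otherwise
ss -tlnp 2>/dev/null | grep 11434 || echo "port 11434 is free"
```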
systemctl start --user ollama-web.service worked for me.
I did uninstall and reinstall first, though, so that might have set some things right. My models still exist, as do my open-webui login and cookies, but the chat history is gone (fine in my case):

ujust ollama remove
ujust ollama remove-open-webui
ujust ollama install
ujust ollama install-open-webui
systemctl start --user ollama-web.service

Previously ujust ollama-web would also start the service. But it looks like that’s a separate step?

Personally, I had configured the brew service to start automatically, which is why it was already running in my case. Stopping it fixed things for me as well, which seems to indicate that the ujust recipe still tries to start the service (and that was failing for me because one was already running?).

I can’t reboot right now but I suspect that if you restart, it should start everything automatically.

I just tested a reboot and it doesn't start automatically (it's not enabled for me).
I don’t need it running all the time.

I’ve put that in a script for now:

 echo "systemctl start --user ollama-web.service" > start-ollama-web.sh         
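A slightly more robust version of the same idea, with a shebang and the executable bit set so it can be run directly:

```shell
# Create a small wrapper script and mark it executable
printf '#!/bin/sh\nexec systemctl start --user ollama-web.service\n' > start-ollama-web.sh
chmod +x start-ollama-web.sh
```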

Might be handy to have a ujust ollama start-open-webui.

I suspect ollama uses a fair bit of RAM, which is why it’s not set to autostart.

I was going to suggest the standard way of enabling systemd services, but got:

> systemctl --user enable ollama-web.service
Failed to enable unit: Unit /run/user/1400601103/systemd/generator/ollama-web.service is transient or generated.

Not sure what that is about (never seen this before), but generally systemd would auto-start services on login if you enabled it this way. Perhaps this will work for you though?
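If I understand Quadlet correctly (I haven’t tested this myself), generated units can’t be enabled with systemctl enable, but appending an [Install] section to the .container file itself should have the same effect after a systemctl --user daemon-reload:

```
# appended to ~/.config/containers/systemd/ollama-web.container
[Install]
WantedBy=default.target
```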

If not, another idea is to create a desktop file that runs your script in the ~/.config/autostart folder.
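For the autostart route, something like this should run the command at login (the filename and Name are just examples):

```
# ~/.config/autostart/ollama-web.desktop
[Desktop Entry]
Type=Application
Name=Start Ollama WebUI
Exec=systemctl start --user ollama-web.service
```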

Or perhaps someone more knowledgeable can suggest something better…

That’s some high-quality documentation! If I could edit the page, I’d change the bit about checking PyTorch to read python -c "import torch; print(torch.cuda.is_available())", though I suppose most people running the NGC container will recognize that they’re supposed to run those commands in a Python interpreter.

Just a heads up that we’re removing the service units and justfiles from new installs (too many support footguns).

If you’ve installed it already then it’ll still be there. Going forward we’re just going to recommend that users install Alpaca.

Damn any time I look at these docs, things have gotten even simpler and even better. Straight ollama works great, but now there’s even just a simple flatpak app! Thank you for all the hard work, everyone!