Tried that and it works like a charm!
Following the instructions on Docker Hub, I ran these two commands first, with no prior toolkit installation (I'm not sure they are strictly necessary, but I ran them just in case):

```shell
sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker
```
and then spun up an Ollama container using the following compose file:
```yaml
---
services:
  ollama:
    image: ollama/ollama
    container_name: ollama
    restart: unless-stopped
    ports:
      - 11434:11434
    volumes:
      - ./ollama_v:/root/.ollama
    deploy:
      resources:
        reservations:
          devices:
            - capabilities:
                - gpu
```
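As a side note, the Compose GPU reservation also accepts explicit `driver` and `count` fields if you ever need to pin a specific GPU setup. A sketch of the same reservation with those added (the values shown are the common "use every NVIDIA GPU" choice, adjust to your hardware):

```yaml
deploy:
  resources:
    reservations:
      devices:
        - driver: nvidia
          count: all
          capabilities:
            - gpu
```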
It exposes the Ollama API on localhost:11434 (or whatever port is mapped instead when spinning up the container). With that I can connect it both to the Alpaca GUI (by specifying it as a "remote instance" in the settings) and to Zed and JetBrains IDEs.
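For anyone wanting to talk to the exposed API directly rather than through a GUI or IDE, here is a minimal standard-library sketch that builds a non-streaming request for Ollama's `/api/generate` endpoint. The model name `llama3` is an assumption for illustration; use whatever model you have pulled into the container.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # port mapped in the compose file above


def build_generate_request(prompt, model="llama3", url=OLLAMA_URL):
    """Build a POST request for Ollama's /api/generate endpoint.

    The model name is an assumption; substitute one you have pulled.
    """
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        f"{url}/api/generate",
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# Sending it (requires the container to be running):
# with urllib.request.urlopen(build_generate_request("Hello!")) as resp:
#     print(json.loads(resp.read())["response"])
```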
I don't know how common my use case is, but if it's something more people run into, it could be worth adding Docker alongside Homebrew as a "more advanced" option in the Bluefin docs.