Should one use Distrobox to install and run local AI tools (RamaLama, llama.cpp, huggingface-cli, etc.) in isolation?
Once they're installed, I'm not sure whether I should run RamaLama inside the Distrobox container (which would mean containers-in-containers, since RamaLama itself launches Podman containers), or whether exporting RamaLama, llama.cpp, and the other tools to the host is the better approach.
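For context, the Distrobox workflow I'm weighing looks roughly like this. The container name, base image, and install method are just my guesses, not a tested setup:

```shell
# Create a dedicated container for AI tooling
# (image choice is my assumption, not a recommendation)
distrobox create --name ai-tools --image fedora:40

# Install the tools inside the container, keeping them off the host
distrobox enter ai-tools -- sudo dnf install -y python3-pip
distrobox enter ai-tools -- pip install --user ramalama huggingface_hub

# The "export to host" option: expose the container's CLI on the host PATH
# (path is where pip --user puts it in my sketch; adjust as needed)
distrobox enter ai-tools -- distrobox-export --bin ~/.local/bin/ramalama
```

My worry is that the last step is where the podman-in-podman question kicks in, since the exported `ramalama` would still try to spin up containers from inside the container.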
Ideally, I'd like to follow best practices that isolate me from malicious models. RamaLama already runs and serves models inside Podman containers.
I like the "ujust bbrew" AI list. However, that also feels a lot like polluting my host OS rather than keeping AI tooling in a distrobox. I'm new to Aurora-DX, so if I'm missing an established workflow, please elaborate.