OpenWebUI With Ollama Playbook Issues/Questions

I noticed these issues after working through this playbook:

It appears to work, but when I access Open WebUI remotely using Nvidia Sync on a MacBook, there are a few issues…

(1) If I step away for a few hours and come back, Open WebUI no longer loads in the web browser. I have to relaunch Nvidia Sync, click the “Open WebUI” entry, and wait before I can access it in the browser again.

(2) When I start downloading a new model from Ollama within Open WebUI and it takes a long time, the download appears to stop once (1) occurs. Ideally I want to be able to step away and have the model download finish on its own.

(3) How do I upgrade the Ollama bundled in the distribution so that it keeps working with Open WebUI? Some models need the newest version of Ollama, and the version in the playbook is already out of date.

In my environment I have not seen your first point, but I am on Windows. Maybe you need to allow NVIDIA Sync to run in the background in your OS settings.

For the second point I have seen exactly the same. The only stable solution I found is to use an SSH tool and pull the model via ollama inside the Open-WebUI container, like this:

docker exec -it open-webui ollama pull [MODEL_NAME]
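Note that an interactive pull like the one above dies if your SSH or Sync session drops (as in point 1). A sketch of a more disconnect-tolerant variant, assuming the same `open-webui` container name; `llama3` here is just a placeholder model name:

```shell
# Run the pull detached (-d) so it continues even if your SSH/Sync session drops.
# "open-webui" is the container name from the command above; "llama3" is a placeholder.
docker exec -d open-webui ollama pull llama3

# Later, check whether the model has finished downloading:
docker exec -it open-webui ollama list
```

With `-d`, docker starts the command and returns immediately, so there is no progress bar; `ollama list` only shows the model once the pull has completed.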

To check the Ollama version inside the container, you can run:
docker exec -it open-webui ollama --version
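For your third point, upgrading Ollama usually means pulling a newer image and recreating the container; the model and chat data survive as long as they live in volumes. A sketch, assuming the playbook used the bundled `ghcr.io/open-webui/open-webui:ollama` image with the usual volume mounts — adjust the container name, ports, and flags to match your actual deployment:

```shell
# Fetch the latest bundled image (includes a newer Ollama).
docker pull ghcr.io/open-webui/open-webui:ollama

# Remove the old container; data in the named volumes is preserved.
docker stop open-webui && docker rm open-webui

# Recreate the container. These flags follow the common bundled-image setup;
# yours may differ depending on what the playbook configured.
docker run -d --name open-webui --gpus all \
  -p 3000:8080 \
  -v ollama:/root/.ollama \
  -v open-webui:/app/backend/data \
  --restart always \
  ghcr.io/open-webui/open-webui:ollama

# Confirm the upgrade took effect:
docker exec -it open-webui ollama --version
```

If the playbook instead runs Ollama as a separate container or a host service, the same idea applies: pull the newer image (or installer) and restart that component, leaving its data directory in place.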