Trevor Sullivan
If you're using Ollama to run models locally, check out the latest update! I run it natively on Windows 11 and also on headless Linux servers with NVIDIA GPUs. I'm also running Open WebUI as a front-end for the Ollama REST API, and it works great! Really nice to have the option to keep data private and run inference on my own network. 🖥️ I highly recommend trying it out if you haven't yet. New release ➡️ github.com/ollama/ollama/releases/tag/v0.9.5
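If you want to poke at the REST API directly (the same one Open WebUI talks to), here's a minimal sketch in Python. It assumes Ollama is listening on its default port, 11434, and that you've already pulled a model — the "llama3.2" name below is just a placeholder for whatever you have installed:

```python
# Minimal sketch: one-shot generation against the Ollama REST API.
# Assumes Ollama is running on its default port (11434) and a model
# has already been pulled -- "llama3.2" here is a placeholder.
import json
import urllib.request

payload = json.dumps({
    "model": "llama3.2",   # swap in whatever model you've pulled
    "prompt": "Why run LLMs locally?",
    "stream": False,       # single JSON response instead of a token stream
}).encode("utf-8")

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```

Setting "stream" to False keeps the example simple; by default the endpoint streams the reply back one JSON chunk per token.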