![Ollama on Windows with OpenWebUI on top.](/assets/blog/2024/ollama-on-windows/ollama-windows.jpg)
# Ollama on Windows: How to Install and Use It with OpenWebUI
Ollama is one of the easiest ways to run large language models locally. Thanks to llama.cpp, it can run models on CPUs or on GPUs, even older ones like my RTX 2070 Super. It provides a CLI and an OpenAI-compatible API.
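
To give a taste of that API before we get into the Windows setup, here is a minimal sketch of talking to a locally running Ollama from Python with the official `openai` client. The model name (`llama3`) and the default port (11434) are illustrative assumptions; use whatever model you have pulled.

```python
# Minimal sketch: query Ollama's OpenAI-compatible endpoint with the
# official `openai` Python client (pip install openai). Assumes Ollama
# is running locally on its default port and a model has been pulled.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible API
    api_key="ollama",  # required by the client, but ignored by Ollama
)

response = client.chat.completions.create(
    model="llama3",  # example name; any model pulled via `ollama pull` works
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
)
print(response.choices[0].message.content)
```

Because the endpoint speaks the OpenAI protocol, existing tooling built for OpenAI's API, including OpenWebUI, can point at Ollama with nothing more than a base URL change.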