Ollama doesn't recognize my GPU

Hello guys, I have Ollama and the Web UI configured via Docker in Container Station, but I cannot get the GPU to be used for computation. I have only downloaded small models, so my Quadro P2200 should be perfectly able to handle them. I have enabled GPU passthrough, assigned the GPU to Container Station, and then assigned it to the Ollama container. From the nvidia-smi output you can see that the GPU is recognized inside the container and that the drivers and CUDA are installed correctly, but I still cannot get the Ollama models to use it.
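For reference, the check I mean looks roughly like this from the NAS shell. This is only a sketch: the container name ollama and the model tag are assumptions, so adjust them to whatever Container Station named yours:

```sh
# Verify the GPU is visible inside the Ollama container
# ("ollama" is an assumed container name; adjust to yours)
docker exec -it ollama nvidia-smi

# Run a small model and watch whether GPU memory usage rises
# ("llama3.2" is just an example model tag)
docker exec -it ollama ollama run llama3.2
```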


OK, I'll answer myself: it was enough to assign the GPU to the Web UI container as well as to the Ollama container. I hope this helps other people.
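In Compose terms, the fix amounts to giving both services a GPU reservation, not just Ollama. A minimal sketch, assuming the NVIDIA Container Toolkit is set up; the image tags, ports, and volume names are illustrative, not taken from the original setup:

```yaml
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama:/root/.ollama
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    volumes:
      - open-webui:/app/backend/data
    deploy:
      resources:
        reservations:
          devices:            # same GPU reservation on the Web UI container
            - driver: nvidia
              count: all
              capabilities: [gpu]

volumes:
  ollama:
  open-webui:
```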

I posted an image here to confirm that the Web UI uses the GPU to process the chat.

Your image link is broken.

OK, you're right, try it now.
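For anyone who can't load the image: the same confirmation can be reproduced from a shell by watching GPU utilization while a chat is generating, e.g.:

```sh
# Print GPU stats every second; utilization and memory use should
# rise while the model is generating a response
nvidia-smi -l 1
```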

Hi fraste03, could you give me some advice on how to install Ollama and Open WebUI? I generally know how to install containers, but how do I make them work as intended…

Thank you.