DevOps
•
1 Min Read
How to access local LLMs from anywhere

In this blog we set up Ollama running the latest DeepSeek models with Open WebUI, then use Tailscale to access the model from anywhere, so your private data never touches the public internet.
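To give a feel for what this enables, here is a minimal Python sketch that queries Ollama's HTTP API from any device on your tailnet. It assumes a hypothetical Tailscale MagicDNS name `my-llm-box` for the machine running Ollama and that a DeepSeek model (e.g. `deepseek-r1`) has already been pulled; adjust both to match your setup.

```python
import json
import urllib.request

# Hypothetical MagicDNS name of the machine running Ollama on your tailnet;
# replace with your own device name or its Tailscale IP.
OLLAMA_URL = "http://my-llm-box:11434/api/generate"

payload = json.dumps({
    "model": "deepseek-r1",   # any DeepSeek model pulled via `ollama pull`
    "prompt": "Explain why a local LLM keeps private data off the internet.",
    "stream": False,          # return a single JSON object instead of a stream
}).encode("utf-8")

req = urllib.request.Request(
    OLLAMA_URL,
    data=payload,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```

Because the request travels over your Tailscale network, it works from a laptop or phone anywhere in the world without exposing port 11434 to the public internet.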