DevOps
1 Min Read

How to access local LLMs from anywhere

Subhajeet Dey
February 10, 2025

In this blog, we set up Ollama running the latest DeepSeek models behind Open WebUI, and access the model from anywhere using Tailscale, so private data never touches the public internet.
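
Before diving into the details, here is a minimal sketch of the overall flow, assuming the `deepseek-r1:7b` model tag and Open WebUI's default Docker port mapping (3000 on the host to 8080 in the container); the exact tags and ports you use may differ.

```bash
# 1. Pull a DeepSeek model into Ollama (Ollama serves it locally on port 11434 by default)
ollama pull deepseek-r1:7b

# 2. Run Open WebUI in Docker and point it at the local Ollama instance
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main

# 3. Join the machine to your Tailscale network
sudo tailscale up

# From any other device on the same tailnet, open
# http://<machine-tailnet-name>:3000 to reach Open WebUI.
```

Because Tailscale creates an encrypted peer-to-peer network between your devices, the model and chat traffic stay within your tailnet rather than being exposed to the public internet.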