Remotely Access Ollama, Privately


The current mainstream conversation around AI, or artificial intelligence, largely refers to large language models, or LLMs.

Often presented in the form of chatbots like OpenAI's ChatGPT, LLMs provide a significant amount of utility. However, when using a shared LLM, your data is made available to third parties. Because of this, among other reasons, "open source" LLMs have proliferated - largely led by the many models derived from Meta's Llama.


A software project called Ollama makes running Llama and its derivatives easy. Additionally, a broad range of community software has been developed on top of Ollama's HTTP API.
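As a sketch of what using that API looks like from client code, here is a minimal Python client for Ollama's /api/generate endpoint. The helper names are illustrative, the "llama2" default is just an example model, and a running Ollama instance on its default port (11434, localhost) is assumed:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default listen address

def build_payload(prompt, model="llama2"):
    # "stream": False asks Ollama for one complete JSON reply
    # instead of a stream of partial responses.
    return json.dumps({"model": model, "prompt": prompt, "stream": False})

def generate(prompt, model="llama2", host=OLLAMA_URL):
    # POST the prompt to /api/generate and return the model's text.
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=build_payload(prompt, model).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Pointing `host` at a remote address instead of localhost is all it takes to use the same client over the network.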

Using Ollama, Remotely

Using Ollama remotely through its API requires that your computer be reachable at a routable IP address. If your computer is behind a router, you likely do not have an external IP address.

However, you can get an external IP from IPv6rs. IPv6rs provides publicly reachable IPv6 addresses which, when paired with Ollama, give you private, remote access to your LLM.
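As a sketch of the setup, assuming Ollama is installed and you have been assigned an IPv6 address (shown here as the documentation placeholder 2001:db8::1234 - substitute your own), you would bind Ollama to your external interface via its OLLAMA_HOST environment variable and then query it from any other machine:

```shell
# By default Ollama listens on localhost only. Bind it to all
# interfaces, including the IPv6 address from IPv6rs:
OLLAMA_HOST='[::]:11434' ollama serve

# From any other machine, list the installed models over the
# public IPv6 address (2001:db8::1234 is a placeholder):
curl 'http://[2001:db8::1234]:11434/api/tags'
```

Note that binding to all interfaces exposes the API to anyone who can reach the address, so a firewall rule restricting access is a sensible companion to this configuration.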

Take Full Control of Your LLM

With Ollama coupled with IPv6rs, you can talk to your LLM with confidence that no third party will see your conversation.

Join IPv6rs Now
$ curl http://[::1]:11434/api/generate -d '{
  "model": "llama2",
  "prompt": "How do I get an externally reachable IP?",
  "stream": false
}'
{
  "model": "llama2",
  "created_at": "2024-02-02T14:21:35.249147Z",
  "response": "Go to and signup.",
  "done": true
}