From aeadbeb226c67b66d426137a0391d759418088f2 Mon Sep 17 00:00:00 2001
From: byarbrough <6315292+byarbrough@users.noreply.github.com>
Date: Fri, 28 Mar 2025 09:17:56 -0600
Subject: [PATCH 1/5] Add OLLAMA_HOST blurb to README

Without this it's opaque that the library is making HTTP requests, or to
where it might be making them. The only place I could find how to change
the localhost:11434 behavior was in the Ollama FAQs. If there is a way
to set it besides OLLAMA_HOST I would love to know!
---
 README.md | 4 ++++
 1 file changed, 4 insertions(+)

diff --git a/README.md b/README.md
index 172d3252..ebd0d00a 100644
--- a/README.md
+++ b/README.md
@@ -8,6 +8,10 @@ The Ollama Python library provides the easiest way to integrate Python 3.8+ proj
 - Pull a model to use with the library: `ollama pull <model>` e.g. `ollama pull llama3.2`
   - See [Ollama.com](https://ollama.com/search) for more information on the models available.
 
+The library connects to Ollama running on `http://localhost:11434` by default.
+You can change this address by setting the `OLLAMA_HOST` environment variable.
+Refer to the [FAQs](https://github.com/ollama/ollama/blob/main/docs/faq.md#how-do-i-configure-ollama-server) for how to set environment variables on your platform.
+
 ## Install
 
 ```sh

From 755ff5f97ccb18a5050e075f4c3bfd36d686721f Mon Sep 17 00:00:00 2001
From: byarbrough <6315292+byarbrough@users.noreply.github.com>
Date: Fri, 28 Mar 2025 15:45:26 -0600
Subject: [PATCH 2/5] Tweak README to mention custom client earlier

---
 README.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index ebd0d00a..3793a1d6 100644
--- a/README.md
+++ b/README.md
@@ -9,8 +9,8 @@ The Ollama Python library provides the easiest way to integrate Python 3.8+ proj
   - See [Ollama.com](https://ollama.com/search) for more information on the models available.
 
 The library connects to Ollama running on `http://localhost:11434` by default.
-You can change this address by setting the `OLLAMA_HOST` environment variable.
-Refer to the [FAQs](https://github.com/ollama/ollama/blob/main/docs/faq.md#how-do-i-configure-ollama-server) for how to set environment variables on your platform.
+You can change this address with the [custom client](https://github.com/ollama/ollama-python?tab=readme-ov-file#custom-client)
+or by setting the [`OLLAMA_HOST`](https://github.com/ollama/ollama/blob/main/docs/faq.md#how-can-i-expose-ollama-on-my-network) environment variable.
 
 ## Install
 

From 109aeba307bb2282245bcecd8b3ef4adf0871c51 Mon Sep 17 00:00:00 2001
From: Brian Yarbrough <6315292+byarbrough@users.noreply.github.com>
Date: Wed, 30 Apr 2025 21:15:28 -0600
Subject: [PATCH 3/5] Remove callouts for changing the client

Custom client is discussed below and the OLLAMA_HOST is primarily
related to the main Ollama library.

Co-authored-by: Parth Sareen
---
 README.md | 2 --
 1 file changed, 2 deletions(-)

diff --git a/README.md b/README.md
index 3793a1d6..eda365a0 100644
--- a/README.md
+++ b/README.md
@@ -9,8 +9,6 @@ The Ollama Python library provides the easiest way to integrate Python 3.8+ proj
   - See [Ollama.com](https://ollama.com/search) for more information on the models available.
 
 The library connects to Ollama running on `http://localhost:11434` by default.
-You can change this address with the [custom client](https://github.com/ollama/ollama-python?tab=readme-ov-file#custom-client)
-or by setting the [`OLLAMA_HOST`](https://github.com/ollama/ollama/blob/main/docs/faq.md#how-can-i-expose-ollama-on-my-network) environment variable.
 
 ## Install
 

From 345b8646b0ebd2f9e5517a0abe203a784dd1291b Mon Sep 17 00:00:00 2001
From: byarbrough <6315292+byarbrough@users.noreply.github.com>
Date: Wed, 30 Apr 2025 21:17:47 -0600
Subject: [PATCH 4/5] Move localhost default up

Per @ParthSareen review
---
 README.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index eda365a0..b0c1fee6 100644
--- a/README.md
+++ b/README.md
@@ -2,14 +2,14 @@
 
 The Ollama Python library provides the easiest way to integrate Python 3.8+ projects with [Ollama](https://github.com/ollama/ollama).
 
+The library connects to Ollama running on `http://localhost:11434` by default.
+
 ## Prerequisites
 
 - [Ollama](https://ollama.com/download) should be installed and running
 - Pull a model to use with the library: `ollama pull <model>` e.g. `ollama pull llama3.2`
   - See [Ollama.com](https://ollama.com/search) for more information on the models available.
 
-The library connects to Ollama running on `http://localhost:11434` by default.
-
 ## Install
 
 ```sh

From 0f973640c2ac0427a55bc0329a2a94054fc511ac Mon Sep 17 00:00:00 2001
From: byarbrough <6315292+byarbrough@users.noreply.github.com>
Date: Fri, 30 May 2025 14:30:40 -0600
Subject: [PATCH 5/5] Add default connection and custom client subbullet to
 prereqs

---
 README.md | 3 +--
 1 file changed, 1 insertion(+), 2 deletions(-)

diff --git a/README.md b/README.md
index b0c1fee6..940b4c58 100644
--- a/README.md
+++ b/README.md
@@ -2,11 +2,10 @@
 
 The Ollama Python library provides the easiest way to integrate Python 3.8+ projects with [Ollama](https://github.com/ollama/ollama).
 
-The library connects to Ollama running on `http://localhost:11434` by default.
-
 ## Prerequisites
 
 - [Ollama](https://ollama.com/download) should be installed and running
+  - The library connects to Ollama running on `http://localhost:11434` by default. This can be changed with a [custom client](https://github.com/ollama/ollama-python?tab=readme-ov-file#custom-client).
 - Pull a model to use with the library: `ollama pull <model>` e.g. `ollama pull llama3.2`
   - See [Ollama.com](https://ollama.com/search) for more information on the models available.
 
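The patch series above documents the `OLLAMA_HOST` convention in prose without showing it in code. A minimal sketch of that resolution order — `OLLAMA_HOST` if set, else `http://localhost:11434` — might look like the following. Note `resolve_host` and the example addresses are illustrative, not part of the ollama library's API:

```python
import os

# Default address the library falls back to, per the README blurb.
DEFAULT_HOST = "http://localhost:11434"


def resolve_host() -> str:
    """Return the Ollama server address: OLLAMA_HOST if set, else the default."""
    return os.environ.get("OLLAMA_HOST", DEFAULT_HOST)


# With OLLAMA_HOST unset, the default applies.
os.environ.pop("OLLAMA_HOST", None)
print(resolve_host())  # http://localhost:11434

# Pointing at a server elsewhere on the network instead.
os.environ["OLLAMA_HOST"] = "http://192.168.1.50:11434"
print(resolve_host())  # http://192.168.1.50:11434
```

As the final patch notes, the same override is available without environment variables by constructing the library's custom client with an explicit host.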