Added option to select own locally running Ollama model when running python -m cli.main #221
base: main
Changes from all commits
@@ -152,6 +152,7 @@ def select_shallow_thinking_agent(provider) -> str:
         "ollama": [
             ("llama3.1 local", "llama3.1"),
             ("llama3.2 local", "llama3.2"),
+            ("Custom (enter model name)", "__custom__"),
         ]
     }
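Each entry is a (display label, value) pair: the new tuple shows a readable label while resolving to the `"__custom__"` sentinel. As a rough, self-contained illustration of how such pairs are typically fed to questionary (the list name and prompt text below are made up, not the PR's actual selection code):

```python
import questionary

# Hypothetical stand-in for the CLI's Ollama option list; the last entry is the
# sentinel value added by this PR.
OLLAMA_OPTIONS = [
    ("llama3.1 local", "llama3.1"),
    ("llama3.2 local", "llama3.2"),
    ("Custom (enter model name)", "__custom__"),
]

# questionary.select shows the first element of each pair and returns the second.
choice = questionary.select(
    "Select your Ollama model:",
    choices=[questionary.Choice(label, value=value) for label, value in OLLAMA_OPTIONS],
).ask()

print(f"Selected value: {choice}")  # e.g. "__custom__" when the new option is picked
```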
@@ -177,6 +178,17 @@ def select_shallow_thinking_agent(provider) -> str:
         )
         exit(1)
 
+    # If custom is selected, prompt for the model name to use with Ollama
+    if choice == "__custom__":
+        custom_model = questionary.text(
+            "Enter your Ollama model name (e.g., mistral-nemo:latest):",
+            validate=lambda x: len(x.strip()) > 0 or "Please enter a valid model name.",
+        ).ask()
+        if not custom_model:
+            console.print("\n[red]No model name provided. Exiting...[/red]")
+            exit(1)
+        return custom_model.strip()
+
     return choice

Comment on lines +181 to +190:

This block of code for handling a custom model selection is nearly identical to the one in `select_deep_thinking_agent`.
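One way that duplication could be factored out, as a rough sketch only (the helper name and the default example model are assumptions, not part of this PR; the prompt logic itself is taken from the diff above):

```python
import questionary
from rich.console import Console

console = Console()  # the CLI already has a console object; recreated here so the sketch stands alone


def prompt_custom_ollama_model(example: str = "mistral-nemo:latest") -> str:
    """Prompt for a custom Ollama model name, exiting if nothing is entered."""
    custom_model = questionary.text(
        f"Enter your Ollama model name (e.g., {example}):",
        validate=lambda x: len(x.strip()) > 0 or "Please enter a valid model name.",
    ).ask()
    if not custom_model:
        console.print("\n[red]No model name provided. Exiting...[/red]")
        exit(1)
    return custom_model.strip()
```

Both `select_shallow_thinking_agent` and `select_deep_thinking_agent` could then reduce their custom-model branch to something like `if choice == "__custom__": return prompt_custom_ollama_model()`.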
@@ -214,6 +226,7 @@ def select_deep_thinking_agent(provider) -> str:
         "ollama": [
             ("llama3.1 local", "llama3.1"),
             ("qwen3", "qwen3"),
+            ("Custom (enter model name)", "__custom__"),
         ]
     }
@@ -237,6 +250,17 @@ def select_deep_thinking_agent(provider) -> str:
         console.print("\n[red]No deep thinking llm engine selected. Exiting...[/red]")
         exit(1)
 
+    # If custom is selected, prompt for the model name to use with Ollama
+    if choice == "__custom__":
+        custom_model = questionary.text(
+            "Enter your Ollama model name (e.g., llama3.1:latest):",
+            validate=lambda x: len(x.strip()) > 0 or "Please enter a valid model name.",
+        ).ask()
+        if not custom_model:
+            console.print("\n[red]No model name provided. Exiting...[/red]")
+            exit(1)
+        return custom_model.strip()
+
     return choice
 
 
 def select_llm_provider() -> tuple[str, str]:
The string `"__custom__"` is a magic value used for the choice's value and for the comparison later. To improve readability and maintainability, it's best to define this as a constant at the module level (e.g., `CUSTOM_OLLAMA_MODEL_CHOICE = "__custom__"`) and use that constant in both places. This would also apply to its usage for the deep thinking agent.