24 changes: 24 additions & 0 deletions cli/utils.py
@@ -152,6 +152,7 @@ def select_shallow_thinking_agent(provider) -> str:
"ollama": [
("llama3.1 local", "llama3.1"),
("llama3.2 local", "llama3.2"),
("Custom (enter model name)", "__custom__"),

medium

The string "__custom__" is a magic value used for the choice's value and for comparison later. To improve readability and maintainability, it's best to define this as a constant at the module level (e.g., CUSTOM_OLLAMA_MODEL_CHOICE = "__custom__") and use that constant in both places. This would also apply to its usage for the deep thinking agent.
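For instance, a minimal sketch of that refactor (the constant name CUSTOM_OLLAMA_MODEL_CHOICE is taken from the suggestion above and is illustrative):

```python
# Module-level constant at the top of cli/utils.py, replacing the magic string.
CUSTOM_OLLAMA_MODEL_CHOICE = "__custom__"

# The choice tuple then becomes:
#     ("Custom (enter model name)", CUSTOM_OLLAMA_MODEL_CHOICE),
# and the later comparison becomes:
#     if choice == CUSTOM_OLLAMA_MODEL_CHOICE:
```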

]
}

@@ -177,6 +178,17 @@ def select_shallow_thinking_agent(provider) -> str:
)
exit(1)

# If custom is selected, prompt for the model name to use with Ollama
if choice == "__custom__":
custom_model = questionary.text(
"Enter your Ollama model name (e.g., mistral-nemo:latest):",
validate=lambda x: len(x.strip()) > 0 or "Please enter a valid model name.",
).ask()
if not custom_model:
console.print("\n[red]No model name provided. Exiting...[/red]")

high

The console object is used here but it is not defined or imported in cli/utils.py. It appears to be defined as a global in cli/main.py, which will likely cause a NameError at runtime because this module doesn't have access to main.py's globals. This is a critical correctness issue. This anti-pattern is also present in the new code in select_deep_thinking_agent (line 260). The console object should be made available to this module properly, for instance by defining it in a shared location and importing it where needed.
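One way to fix this, sketched under the assumption that the `[red]...[/red]` markup in the strings means a `rich` Console; the module name `cli/console.py` is hypothetical:

```python
# cli/console.py -- hypothetical shared module holding the single Console instance.
from rich.console import Console

console = Console()
```

Both cli/main.py and cli/utils.py would then do `from cli.console import console` instead of relying on a global defined in main.py.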

exit(1)
return custom_model.strip()
Comment on lines +181 to +190

medium

This block of code for handling a custom model selection is nearly identical to the one in select_deep_thinking_agent (lines 253-262). To improve maintainability and adhere to the DRY (Don't Repeat Yourself) principle, this logic should be extracted into a private helper function. The example model name could be passed as an argument to this new function.
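A sketch of that extraction, assuming `questionary` and a properly imported `console` are available in cli/utils.py; the helper name `_prompt_custom_ollama_model` is illustrative:

```python
def _prompt_custom_ollama_model(example: str) -> str:
    """Prompt the user for a custom Ollama model name; exit if none is given."""
    custom_model = questionary.text(
        f"Enter your Ollama model name (e.g., {example}):",
        validate=lambda x: len(x.strip()) > 0 or "Please enter a valid model name.",
    ).ask()
    if not custom_model:
        console.print("\n[red]No model name provided. Exiting...[/red]")
        exit(1)
    return custom_model.strip()
```

Each selection function would then reduce to a single call, e.g. `return _prompt_custom_ollama_model("llama3.1:latest")` when `choice` equals the custom sentinel.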


return choice


@@ -214,6 +226,7 @@ def select_deep_thinking_agent(provider) -> str:
"ollama": [
("llama3.1 local", "llama3.1"),
("qwen3", "qwen3"),
("Custom (enter model name)", "__custom__"),
]
}

@@ -237,6 +250,17 @@ def select_deep_thinking_agent(provider) -> str:
console.print("\n[red]No deep thinking llm engine selected. Exiting...[/red]")
exit(1)

# If custom is selected, prompt for the model name to use with Ollama
if choice == "__custom__":
custom_model = questionary.text(
"Enter your Ollama model name (e.g., llama3.1:latest):",
validate=lambda x: len(x.strip()) > 0 or "Please enter a valid model name.",
).ask()
if not custom_model:
console.print("\n[red]No model name provided. Exiting...[/red]")
exit(1)
return custom_model.strip()

return choice

def select_llm_provider() -> tuple[str, str]: