Status: Open
Labels: bug, help wanted
Description
The code below (`LLM-VM/src/llm_vm/completion/optimize.py`, line 261 at commit `7a5877b`)

```python
small_model_filename = kwargs.get("small_model_filename", None)
```

tries to extract `small_model_filename` from `kwargs`, which is passed down from `client.complete(...)` -> `optimizer.complete(...)` -> `optimizer.complete_delay_train(...)`.
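For context, here is a minimal, hypothetical sketch of that kwargs flow (not the actual LLM-VM code; the function names mirror the real ones but the bodies and signatures are simplified) showing how forwarding the same `**kwargs` at every level lets `small_model_filename` reach the outgoing request:

```python
# Minimal sketch of the kwargs flow; NOT the real LLM-VM implementation.
def call_big(prompt, **kwargs):
    # Stand-in for the OpenAI chat completion call: the real API rejects
    # any keyword argument it does not recognize.
    allowed = {"temperature", "max_tokens"}
    unknown = set(kwargs) - allowed
    if unknown:
        raise ValueError(f"Unrecognized request argument supplied: {', '.join(unknown)}")
    return f"completion for {prompt!r}"

def complete_delay_train(prompt, **kwargs):
    # Reads the option it needs (mirrors line 261)...
    small_model_filename = kwargs.get("small_model_filename", None)
    # ...but then forwards the SAME kwargs, which still contain the key.
    return call_big(prompt, **kwargs)

def complete(prompt, **kwargs):
    return complete_delay_train(prompt, **kwargs)

# Raises, mirroring the openai.error.InvalidRequestError described below.
complete("hello", small_model_filename="finetuned/small_model.bin")
```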
I think the main purpose of this is to let us save the finetuned model to any location via `llm_vm.client`, but it causes an error because the same `kwargs`, still containing `small_model_filename`, is also passed to the chat_gpt model (`LLM-VM/src/llm_vm/completion/optimize.py`, line 254 at commit `7a5877b`):

```python
best_completion = self.call_big(prompt, **kwargs)
```

resulting in:

```
openai.error.InvalidRequestError: Unrecognized request argument supplied: small_model_filename
```
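One possible fix (a sketch only, not a tested patch against the repo) is to consume the key with `dict.pop` instead of `dict.get`, so it is removed from `kwargs` before the forwarding call. Reusing the `call_big` stand-in from the sketch above:

```python
def complete_delay_train_fixed(prompt, **kwargs):
    # pop() both reads the option and removes it, so the forwarded kwargs
    # no longer contain small_model_filename when they reach the API call.
    small_model_filename = kwargs.pop("small_model_filename", None)
    # ...here the real code would use small_model_filename to decide where
    # to save the finetuned model...
    return call_big(prompt, **kwargs)  # only API-recognized arguments remain

# No longer raises: the internal-only key is stripped before forwarding.
complete_delay_train_fixed("hello",
                           small_model_filename="finetuned/small_model.bin",
                           temperature=0.0)
```

An alternative would be to whitelist the OpenAI parameters explicitly before calling `call_big`, but popping the internal-only key at the point where it is consumed is the smaller change.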