model = LoaderClass.from_pretrained(llm_model_dict[checkpoint.name]["local_model_path"], trust_remote_code=True) #2876
Replies: 2 comments 1 reply
-
🤖 Hey @Liwan-Chen! Nice to see you again, hope you're doing well!

Based on the provided context, the script defines a `--device` argument:

```python
parser.add_argument(
    "--device",
    type=str,
    choices=["cpu", "cuda", "mps", "xpu"],
    default="cuda",
    help="The device type",
)
```

If you want to run the model on the CPU, pass the flag when launching the script: `python llm_api_stale.py --device cpu`. Note that the line of code that actually loads the model and runs it on the specified device is not included in the context provided. If you need more help or have other questions, feel free to ask.
-
Just change it directly in the config files, because both the model config and the server config need to be updated.
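For reference, a sketch of what that config-file change might look like. The file path `configs/model_config.py` and the `LLM_DEVICE` setting are assumptions based on common Langchain-Chatchat layouts; the exact name and location vary by version, so check your own checkout (and make the matching change in the server config).

```python
# configs/model_config.py  (assumed layout; verify against your version)
# The device set here is used when models are loaded at startup.
LLM_DEVICE = "cpu"  # was "cuda"; typical values: "cpu", "cuda", "mps", "xpu"
```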
-
When running `python startup.py -a`, how can I set the `--device` argument on the command line so the model runs on the CPU?