Conversation

firecoperana (Collaborator) commented Sep 19, 2025

This PR adds a --webui parameter to llama-server so users can select a different built-in webui. This is useful for those who want to try the new llama.cpp webui (ggml-org/llama.cpp#14839) without losing access to the default one. The new llama.cpp webui is still in beta, with more features and bug fixes expected. If you encounter bugs, please check whether mainline has them; if it does, submit issues there. Feature requests should also go to mainline. I just ported it without much modification. Recent upgrades in llama.cpp include a better UI, conversation branching, and more.
Use with caution: either start the server on a new ip:port or back up your conversation history first.

To launch it, use --webui llamacpp. By default, the current webui is still used.
--webui accepts three values (see the usage sketch after this list):

- none: disable the webui
- auto: the default (current) webui
- llamacpp: the new llama.cpp webui
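
For illustration, here is a minimal sketch of launching the server with each value. The model path, host, and port are placeholders (assumptions, not part of this PR); running on a separate port follows the caution above about preserving conversation history.

```sh
# Try the new llama.cpp webui on a separate port so the default
# webui's conversation history is not touched.
./llama-server -m model.gguf --host 127.0.0.1 --port 8081 --webui llamacpp

# Keep the current default webui (same as omitting --webui):
./llama-server -m model.gguf --webui auto

# Serve the API only, with the webui disabled:
./llama-server -m model.gguf --webui none
```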

ikawrakow (Owner) commented

Unless changes 2-5 are tightly coupled with change 1 and are difficult to disentangle, I think it would be better to have them in a separate PR.

I do see quite a few bug reports in mainline related to the new webui. Perhaps it would be better to keep change 1 on a branch until most issues are fixed?

firecoperana (Collaborator, Author) commented

Yes, that is possible.

firecoperana (Collaborator, Author) commented

A separate PR has been created for changes 2-5.
