I made a PHP integration with llama.cpp server #5028
mcharytoniuk started this conversation in Show and tell
Hello! I made an async PHP framework and included a llama.cpp integration to serve output from LLMs through WebSockets or plain HTTP: https://resonance.distantmagic.com/tutorials/connect-to-llama-cpp/
Resonance is an async PHP framework based on Swoole, aimed at SaaS systems or at serving as an infrastructure communication hub.
Thanks to your great work on continuous batching in llama.cpp, I think llama.cpp will become a go-to solution for enterprise SaaS, so I decided to include it as the default solution in my framework.
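For anyone who wants to smoke-test the endpoint the tutorial connects to before wiring up Resonance: llama.cpp's built-in server exposes a `/completion` HTTP endpoint that accepts a JSON body. A minimal sketch (the host, port, model path, prompt, and token count below are placeholder assumptions, not values from the tutorial):

```shell
# Assumes a llama.cpp server is already running locally, e.g.:
#   ./server -m model.gguf --port 8080
# Request a short completion from it over plain HTTP.
curl --request POST \
  --url http://127.0.0.1:8080/completion \
  --header "Content-Type: application/json" \
  --data '{"prompt": "PHP is", "n_predict": 32}'
```

The server replies with a JSON object whose `content` field holds the generated text, which is what the framework relays over WebSockets or HTTP.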