
hi everyone, I just added Llama.cpp, Kobold.cpp, and Ollama support to my single .py chat UI, Retrochat, perfect for RP or low-end machines! (There is an .exe option as well if you don't want to bother with installing dependencies, etc.) #7902

DefamationStation started this conversation in Show and tell
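For anyone curious what talking to one of these local backends involves, here is a minimal sketch of a console chat loop against Ollama's documented /api/chat endpoint. This is not Retrochat's actual code; the model name ("llama3") and the default address (http://localhost:11434) are assumptions, and a model would need to be pulled locally first.

```python
# Minimal sketch of a console chat loop against a local Ollama backend.
# Assumes Ollama is running at its default address and a model such as
# "llama3" has already been pulled; this is not Retrochat's actual code.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's documented chat endpoint
MODEL = "llama3"  # assumption: any locally pulled model name works here


def chat_once(messages):
    """Send the running conversation to Ollama and return the assistant reply."""
    payload = json.dumps({"model": MODEL, "messages": messages, "stream": False}).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["message"]["content"]


def main():
    history = []  # keep the full message history so the model retains context
    print("Retrochat-style loop; type 'exit' to quit.")
    while True:
        user = input("you> ").strip()
        if user.lower() in {"exit", "quit"}:
            break
        history.append({"role": "user", "content": user})
        reply = chat_once(history)
        history.append({"role": "assistant", "content": reply})
        print(f"bot> {reply}")


if __name__ == "__main__":
    main()
```

Llama.cpp's built-in server and KoboldCpp expose their own HTTP APIs, so a UI like this typically just swaps the URL and request/response shapes per backend.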

