Hi, is there, or will there be, a way to make this work with llama.cpp / llama-server, for those of us who are not (or cannot be) running Linux?