Prerequisites
Please answer the following questions for yourself before submitting an issue.
- I am running the latest code. Development is very rapid so there are no tagged versions as of now.
- I carefully followed the README.md.
- I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
- I reviewed the Discussions, and have a new bug or useful enhancement to share.
(Yup, I searched for 'linux binary' and 'linux binaries' and didn't find a single issue.)
Feature Description
As the title says, it would be great if llama.cpp officially added automated Linux builds to its releases page:
https://github.com/ggerganov/llama.cpp/releases
I keep running into compile errors and I'm not a programmer, so an officially published binary would help a lot, even if it ends up around 343.2 MiB (roughly 360 MB) like kobold.cpp's.
Motivation
A lot of compile errors and Discord calls could be avoided, and kobold.cpp already ships prebuilt binaries. As for why it's 'necessary': newbies like me could use llama.cpp directly, and kobold.cpp is slow to update and can't use the latest mainline features.
Possible Implementation
Maybe this workflow from kobold.cpp would be a useful starting point:
https://github.com/Nexesenex/kobold.cpp/blob/concedo_experimental/.github/workflows/kcpp-build-release-linux.yaml
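For illustration only, here is a minimal sketch of what such a workflow could look like, assuming the standard CMake build on a GitHub-hosted Ubuntu runner; the workflow, job, and archive names are made up, and the maintainers would know the right build flags better than me:

```yaml
# Hypothetical sketch: build llama.cpp on Linux and upload the binaries
# as a downloadable artifact. All names and paths here are illustrative.
name: linux-build

on:
  push:
    tags: ['*']         # run when a tag is pushed
  workflow_dispatch:     # also allow manual runs from the Actions tab

jobs:
  build-linux-x64:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Install build dependencies
        run: sudo apt-get update && sudo apt-get install -y build-essential cmake

      - name: Configure and build with CMake
        run: |
          cmake -B build -DCMAKE_BUILD_TYPE=Release
          cmake --build build --config Release -j "$(nproc)"

      - name: Package the binaries
        run: |
          mkdir -p llama-linux-x64
          cp build/bin/* llama-linux-x64/
          tar -czf llama-linux-x64.tar.gz llama-linux-x64

      - name: Upload the archive
        uses: actions/upload-artifact@v4
        with:
          name: llama-linux-x64
          path: llama-linux-x64.tar.gz
```

A real version would presumably attach the archive to the release itself instead of (or in addition to) an artifact, and add CPU-feature or CUDA variants, but that's the general shape.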
But overall, feel free to close and ignore this issue if it isn't a priority; it would just be a real shame not to have official Linux binaries.