【BUG: CUDA error invalid configuration argument】when token length exceeds 2048 with Baichuan2-13B-Chat
When using the Baichuan2-13B-Chat cpp model with a token length greater than 2048, the process is killed with this error:
CUDA error 9 at /tmp/pip-install-alzfik5u/chatglm-cpp_304776c77dfd4f94a8265b20b0fe43e0/third_party/ggml/src/ggml-cuda.cu:6047: invalid configuration argument
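Until the default context length is raised, a caller-side workaround is to keep the prompt within the model's 2048-token window before calling the pipeline. The sketch below is a minimal illustration, not chatglm-cpp's own API: `clamp_to_context` and the token lists are hypothetical stand-ins for whatever tokenizer the caller uses; only the truncation logic is the point.

```python
def clamp_to_context(tokens, n_ctx=2048):
    """Keep at most the last n_ctx tokens so the prompt fits the
    model's context window (2048 for this Baichuan2-13B-Chat build).

    Dropping the oldest tokens preserves the most recent conversation
    turns, which is usually the least harmful truncation strategy.
    """
    if len(tokens) <= n_ctx:
        return tokens
    return tokens[-n_ctx:]


# Hypothetical usage: 'tokens' would come from the model's tokenizer.
tokens = list(range(3000))          # simulate an over-long prompt
safe = clamp_to_context(tokens)
print(len(safe))                    # never exceeds 2048
```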
trekrollercoaster changed the title from "【BUG: CUDA error invalid configuration argument】When token length more than 1000 use Baichuan2-13B-Chat" to "Is it possible to increase the Baichuan2-13b default ctx length to 4096?" on Sep 25, 2023.