@su77ungr
I have 32 cores and 64 GB of RAM.

When running a query, I am getting:
ggml_new_tensor_impl: not enough space in the context's memory pool (needed 7082923680, available 7082732800)
How can we restrict the token_length, and how can we limit the answers to the content of the ingested document file?
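On the token-length side, this memory-pool error usually means the prompt plus the requested completion no longer fits the context the model was loaded with. A minimal sketch of capping both, assuming the model is loaded through LangChain's LlamaCpp wrapper (the wrapper choice, parameter values, and model path below are assumptions, not settings taken from this repo):

```python
# Hypothetical sketch: cap the context window and the answer length at load time.
from langchain.llms import LlamaCpp

llm = LlamaCpp(
    model_path="models/ggml-model-q4_0.bin",  # assumed path, adjust to your model
    n_ctx=2048,      # context window the ggml memory pool is sized for
    max_tokens=256,  # hard cap on tokens generated per answer
    n_threads=32,    # match the 32 physical cores
)
```

A second sketch after the log below addresses keeping answers inside the ingested documents.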
> Question:
who is saying that "save democracy"
> Answer:
The speaker is calling for the Senate to pass the Freedom to Vote Act, the John Lewis Voting Rights Act, and the Disclose Act to ensure that Americans have the right to vote and to know who is funding their elections.
> Time Taken: 39.02538466453552
Enter a query: what is the date today?
llama_print_timings: load time = 227.71 ms
llama_print_timings: sample time = 0.00 ms / 1 runs ( 0.00 ms per run)
llama_print_timings: prompt eval time = 334.69 ms / 7 tokens ( 47.81 ms per token)
llama_print_timings: eval time = 0.00 ms / 1 runs ( 0.00 ms per run)
llama_print_timings: total time = 337.46 ms
gpt_tokenize: unknown token '�'
gpt_tokenize: unknown token '�'
gpt_tokenize: unknown token '�'
[the line above is repeated 18 times]
ggml_new_tensor_impl: not enough space in the context's memory pool (needed 7082923680, available 7082732800)
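Restricting answers to the ingested documents is typically done at the prompt level of the retrieval chain rather than in the model itself. A sketch assuming a LangChain RetrievalQA setup over an existing vector store (`llm` and `db` stand in for the already-constructed model and store; none of this is taken from the repo's actual code):

```python
# Hypothetical sketch: answer only from retrieved context, and retrieve fewer
# chunks so the assembled prompt stays well inside the context window.
from langchain.chains import RetrievalQA
from langchain.prompts import PromptTemplate

template = (
    "Use only the following context to answer the question. "
    "If the answer is not in the context, reply \"I don't know\".\n\n"
    "Context:\n{context}\n\nQuestion: {question}\nAnswer:"
)
prompt = PromptTemplate(template=template, input_variables=["context", "question"])

qa = RetrievalQA.from_chain_type(
    llm=llm,                     # model loaded as in the sketch above
    chain_type="stuff",
    retriever=db.as_retriever(search_kwargs={"k": 2}),  # fewer chunks -> shorter prompt
    chain_type_kwargs={"prompt": prompt},
)
```

Lowering `k` also shrinks the assembled prompt, which is the same lever that helps avoid the memory-pool error above.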