Description
Environment
- CPU: NVIDIA Grace CPU (72 threads)
- Model: Llama3-8B-1.58-100B-tokens with TL1
- Prompt: AI is going to
- N_Predict: 128
- Threads: 1, 2, 4, 8, 16, 32, 64, 72
- Context Size: 2048
- Temperature: 0.8
When I ran Llama3-8B-1.58-100B-tokens with the TL1 kernel on ARM, a malloc() error occurred.
- Error Types
  - Type 1: malloc() error and no output
  - Type 2: malloc() error but correct output (generated text)
  - Type 3: double-free error but correct output (generated text)
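Each run in the table below uses the same `llama-cli` invocation, varying only `-t`. A minimal reproduction sketch (paths assume the BitNet repo layout shown in the logs; adjust to your setup):

```shell
# Reproduction sketch: same invocation as the runs below, varying only -t.
# The model path is taken from the error logs; adjust to your environment.
MODEL=/root/BitNet/models/Llama3-8B-1.58-100B-tokens/ggml-model-tl1.gguf
for t in 1 2 4 8 16 32 64 72; do
  build/bin/llama-cli -m "$MODEL" -n 128 -t "$t" \
    -p "AI is going to" -ngl 0 -c 2048 --temp 0.8 -b 1
done
```

Every thread count aborts with SIGABRT from a glibc heap-corruption check, so the corruption itself is likely thread-independent; only which check fires (malloc, free, or double free) varies.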
| Threads | Error Message | Output | Error Type |
|---|---|---|---|
| 1 | malloc(): invalid next size (unsorted) Error occurred while running command: Command '['build/bin/llama-cli', '-m', '/root/BitNet/models/Llama3-8B-1.58-100B-tokens/ggml-model-tl1.gguf', '-n', '128', '-t', '1', '-p', 'AI is going to', '-ngl', '0', '-c', '2048', '--temp', '0.8', '-b', '1']' died with <Signals.SIGABRT: 6>. | Nothing | Type 1 |
| 2 | free(): invalid next size (normal) Error occurred while running command: Command '['build/bin/llama-cli', '-m', '/root/BitNet/models/Llama3-8B-1.58-100B-tokens/ggml-model-tl1.gguf', '-n', '128', '-t', '2', '-p', 'AI is going to', '-ngl', '0', '-c', '2048', '--temp', '0.8', '-b', '1']' died with <Signals.SIGABRT: 6>. | Generated text | Type 2 |
| 4 | double free or corruption (!prev) Error occurred while running command: Command '['build/bin/llama-cli', '-m', '/root/BitNet/models/Llama3-8B-1.58-100B-tokens/ggml-model-tl1.gguf', '-n', '128', '-t', '4', '-p', 'AI is going to', '-ngl', '0', '-c', '2048', '--temp', '0.8', '-b', '1']' died with <Signals.SIGABRT: 6>. | Generated text | Type 3 |
| 8 | double free or corruption (!prev) Error occurred while running command: Command '['build/bin/llama-cli', '-m', '/root/BitNet/models/Llama3-8B-1.58-100B-tokens/ggml-model-tl1.gguf', '-n', '128', '-t', '8', '-p', 'AI is going to', '-ngl', '0', '-c', '2048', '--temp', '0.8', '-b', '1']' died with <Signals.SIGABRT: 6>. | Generated text | Type 3 |
| 16 | double free or corruption (!prev) Error occurred while running command: Command '['build/bin/llama-cli', '-m', '/root/BitNet/models/Llama3-8B-1.58-100B-tokens/ggml-model-tl1.gguf', '-n', '128', '-t', '16', '-p', 'AI is going to', '-ngl', '0', '-c', '2048', '--temp', '0.8', '-b', '1']' died with <Signals.SIGABRT: 6>. | Generated text | Type 3 |
| 32 | double free or corruption (!prev) Error occurred while running command: Command '['build/bin/llama-cli', '-m', '/root/BitNet/models/Llama3-8B-1.58-100B-tokens/ggml-model-tl1.gguf', '-n', '128', '-t', '32', '-p', 'AI is going to', '-ngl', '0', '-c', '2048', '--temp', '0.8', '-b', '1']' died with <Signals.SIGABRT: 6>. | Generated text | Type 3 |
| 64 | free(): invalid next size (normal) Error occurred while running command: Command '['build/bin/llama-cli', '-m', '/root/BitNet/models/Llama3-8B-1.58-100B-tokens/ggml-model-tl1.gguf', '-n', '128', '-t', '64', '-p', 'AI is going to', '-ngl', '0', '-c', '2048', '--temp', '0.8', '-b', '1']' died with <Signals.SIGABRT: 6>. | Generated text | Type 2 |
| 72 | double free or corruption (!prev) Error occurred while running command: Command '['build/bin/llama-cli', '-m', '/root/BitNet/models/Llama3-8B-1.58-100B-tokens/ggml-model-tl1.gguf', '-n', '128', '-t', '72', '-p', 'AI is going to', '-ngl', '0', '-c', '2048', '--temp', '0.8', '-b', '1']' died with <Signals.SIGABRT: 6>. | Generated text | Type 3 |