
Bug: Persistent hallucination even after re-running llama.cpp #8070

@Edw590

Description


What happened?

I used the command below:

sudo ./llama-cli -m /home/edw590/llamacpp_models/Meta-Llama-3-8B-Instruct-Q4_K_M.gguf --in-suffix [3234_START] --color --interactive-first --ctx-size 0 --temp 0.2 --mlock --prompt "You are VISOR, my male personal virtual assistant. I'm Edward. I was born in 1999-11-22. It's currently the year of 2024. Address me as Sir or nothing at all. From now on, always end your answers with \"[3234_END]\"."

The output was:

[3234_START]entienda, Sir.entienda
entienda, Sir.entientienda
entienda, Sir.entienda
entienda, Sir.entienda
entienda, Sir.entienda
entienda, Sir.entienda
...

Another time the output was:

[3234_START] Cab, Sir.enti
enti
enti
enti
enti
enti
enti
...

The first time I saw it start to hallucinate was with this output:

[3234_START]Hello Sir! I'm your personal virtual assistant, VISOR. Direct your commands to me, and I will be your Caboose. I am your virtual Caboose. I is your Caboose. I am your Caboose. I am your Caboose. I am your Caboose. I am your Caboose. I am your Cab Sir. [3234_END]

Then:

[3234_START]Hello Sir! I'm your personal virtual assistant, VISOR. Direct your commands to me, and I will be your Cabot's horse. What would you like to do? [323 Pilgrim's End] [3234_END]

Or:

[3234_START]Hello Sir! I'm your personal virtual assistant, VISOR. Cab you Indicate your first command? [3234_END]

There were a few more outputs like this before the first two I mentioned.

When I tried another model (Meta-Llama-3-8B-Instruct-Q6_K.gguf), it worked normally again, and afterwards so did the original model. It doesn't happen anymore, but this isn't the first time it has occurred. I don't know whether rebooting the system also fixes it; apparently switching models does, for some reason.

I don't know how to reproduce this, and I don't know where the problem comes from. I also hope I filed the issue with the right severity; sorry if I didn't get it right.
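If it happens again, a variant of the command I could run to make a failing generation easier to replay is sketched below: the same invocation, just with the --in-suffix value quoted so the shell can't interpret the brackets, and with a fixed --seed (an arbitrary value; the flag is listed in the llama-cli help). Everything else is my original command unchanged.

sudo ./llama-cli -m /home/edw590/llamacpp_models/Meta-Llama-3-8B-Instruct-Q4_K_M.gguf --in-suffix "[3234_START]" --color --interactive-first --ctx-size 0 --temp 0.2 --mlock --seed 42 --prompt "You are VISOR, my male personal virtual assistant. I'm Edward. I was born in 1999-11-22. It's currently the year of 2024. Address me as Sir or nothing at all. From now on, always end your answers with \"[3234_END]\"."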

Name and Version

version: 3203 (b5a5f34)
built with cc (Debian 12.2.0-14) 12.2.0 for aarch64-linux-gnu

What operating system are you seeing the problem on?

Linux

Relevant log output

No response

Labels

bug-unconfirmed, high severity (used to report high severity bugs in llama.cpp: malfunctioning hinders an important workflow), stale
