low_level_api_chat_cpp.py: Fix missing antiprompt output in chat. #274

Merged: 1 commit merged into abetlen:main from dmahurin:fix-missing-antiprompt on May 26, 2023

Conversation

dmahurin

The low-level Chat.py and low_level_api_chat_cpp.py examples do not output the antiprompt when end of text is reached; see the example below. Without the fix, the antiprompt is sometimes printed, but most of the time it is not.

The fix appends the antiprompt to embd when the end-of-text token is received (a sketch follows the example below).

--

USER: Can I ask a question?
ChatLLaMa: Of course! Please go ahead with your question.
Can I ask another question?
ChatLLaMa: Yes, you can ask as many questions as you like.
USER:
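
A minimal sketch of the idea, assuming the token loop structure of low_level_api_chat_cpp.py: the names embd, first_antiprompt, and llama_token_newline follow that example's style, but the helper and its surrounding code here are simplified assumptions, not the exact patch.

```python
import llama_cpp

def append_token(id, embd, llama_token_newline, first_antiprompt):
    """Handle one sampled token (hypothetical helper for illustration).

    When the model emits end-of-text, replace it with a newline token
    and re-inject the tokenized antiprompt into embd so the prompt
    (e.g. "USER:") is actually decoded and printed.
    """
    if id == llama_cpp.llama_token_eos():
        # Swap EOS for a newline so the interactive loop keeps going.
        embd.append(llama_token_newline[0])
        # The fix: append the antiprompt tokens to embd; they are then
        # echoed to the user like any other output tokens.
        embd.extend(first_antiprompt)
    else:
        embd.append(id)
```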

dmahurin force-pushed the fix-missing-antiprompt branch from acaa5e3 to 0fa2ec4 on May 26, 2023
dmahurin changed the title from "low_level_api_chat_cpp.py: Fix antiprompt output in chat." to "low_level_api_chat_cpp.py: Fix missing antiprompt output in chat." on May 26, 2023
abetlen merged commit 34ad71f into abetlen:main on May 26, 2023
xaptronic pushed a commit to xaptronic/llama-cpp-python that referenced this pull request on Jun 13, 2023 (…etlen#274): "LLaMA doesn't support more than 2048 token context sizes, and going above that produces terrible results."