Check for reverse prompt by characters instead of tokens (#292) #330

Merged (5 commits, Mar 21, 2023)

Conversation

tjohnman
Contributor

@tjohnman tjohnman commented Mar 20, 2023

This fixes bug #292 as suggested here.

@gjmulder gjmulder added the bug Something isn't working label Mar 20, 2023
@Green-Sky
Collaborator

Green-Sky commented Mar 20, 2023

Not sure we need the string stream here.

@tjohnman
Contributor Author

tjohnman commented Mar 20, 2023

> Not sure we need the string stream here.

@Green-Sky Should we use std::string::append() or operator+ instead? What do you suggest as the most efficient alternative?

EDIT: I used addition. Please let me know if there are any other issues.

@ggerganov ggerganov merged commit 6a61295 into ggml-org:master Mar 21, 2023
glinscott pushed a commit to glinscott/llama.cpp that referenced this pull request Mar 21, 2023
… (ggml-org#330)

* Check for reverse prompt by characters instead of tokens (ggml-org#292)

* Update main.cpp

Wording.

* Cleanup.

* Remove unnecessary use of std::stringstream.

---------

Co-authored-by: Johnman <tjohnman@github>
Co-authored-by: Georgi Gerganov <[email protected]>
@tjohnman tjohnman deleted the bugfix-292 branch March 21, 2023 16:26
ggerganov added a commit that referenced this pull request Mar 23, 2023
…OS in interactive mode (#333)

* Improve interactive mode's coherence after EOS

Aims to improve coherence and ability to resume the interactive session when the user is given input back after an end of text token is reached.
Not sure what token 13 is or why it seems to help. See conversation for examples.

* Make newline token a constant

* dynamically determine newline token

* relocate previous newline token const

* cleanup whitespace

* print a new line on end of text in interactive

this may need to be looked into further when not using a reverse prompt

* only print manual newline with reverse prompt

fix formatting of reverse prompts so they don't end up at the end of the current line while not introducing unnecessary new lines otherwise

* alternate approach to replace end of text tokens

* Inject the reverse prompt again after eos in interactive mode

* tokenize reverse prompt when needed

makes this PR compatible with #330

* tokenize and inject only first reverse prompt

thanks to tjohnman

* tokenize first reverse prompt once

* add newline token

* add newline token

* tokenize/inject reverse prompt for refactor

this doesn't seem right though

* tokenize nothing for antiprompt if no reverse

* Update main.cpp

* Update main.cpp

* tokenize and inject reverse prompt as needed

this doesn't seem to work if the reverse prompt is tokenized outside earlier on

* not needed

* remove newline token

* remove newline token

* tokenize newline token

* add space to comment

* Update main.cpp

Co-authored-by: Georgi Gerganov <[email protected]>

---------

Co-authored-by: Slaren <[email protected]>
Co-authored-by: Georgi Gerganov <[email protected]>
aroidzap pushed a commit to aroidzap/llama.cpp that referenced this pull request Apr 8, 2023
Deadsg pushed a commit to Deadsg/llama.cpp that referenced this pull request Dec 19, 2023
YuMJie pushed a commit to YuMJie/powerinfer that referenced this pull request Oct 25, 2024
poulphunter pushed a commit to poulphunter/llama.cpp that referenced this pull request Feb 23, 2025
poulphunter added a commit to poulphunter/llama.cpp that referenced this pull request Feb 26, 2025
… ideas

* add translations packages

* add translations packages

* beginning of translation

* Check for reverse prompt by characters instead of tokens (ggml-org#292) (ggml-org#330)

* Check for reverse prompt by characters instead of tokens (ggml-org#292)

* Update main.cpp

Wording.

* Cleanup.

* Remove unnecessary use of std::stringstream.

---------

Co-authored-by: Johnman <tjohnman@github>
Co-authored-by: Georgi Gerganov <[email protected]>

* metal : fix `ggml_metal_log` vargs (ggml-org#4373)

* docs: utilize the forward slash (/) as the path separator for Unix-like systems (ggml-org#11770)

* close dropdown Menu function

* no export here

* automatic close sidebar when choice is made (mobile UX improvement)

* fix closeDropDownMenu

* Translation config in Header now

* Translation config in Header now

* French prompt example

* continue translation

* continue translation

* change refreshing when language is selected
will use first config in json file if present and if in the correct language

* Add languages and translations

* fix loading when no prompt from language is found

* fix click on manual settings if already checked in dropdown

* rename Sidebar to ConversationList, better understanding, refactor code to have all that concerns this component in the same place

* rename Sidebar to ConversationList, better understanding, refactor code to have all that concerns this component in the same place

* rename Sidebar to ConversationList, better understanding, refactor code to have all that concerns this component in the same place

* UI improvements
UX is easier to understand

* json reformat

* UI buttons margins

* configs and language selection in appcontext, improve UI

* continue translation

* add favicon (no more console log errors)

* start changing Setting Dialog

* fix color in light/auto theme

* fix color in light/auto theme

* UX/UI improvements, no more drawer

* code refactor

* code refactor, continue translation, UX/UI improvements

* fix key

* format

* format

* loading / save presets

* translations

* build

* embed translations

* remove log

* code refactor, main functions in app.context

* build

* fix not needed, revert

* New README.md for Web UI app.

* prompts renamed to presets

* favicon

* remove unused parameter in json preset files

* add favicon (no more console logs)

* new build

* readme and screenshots

---------

Co-authored-by: Votre Nom <[email protected]>
Co-authored-by: tjohnman <[email protected]>
Co-authored-by: Johnman <tjohnman@github>
Co-authored-by: Georgi Gerganov <[email protected]>
Co-authored-by: Finn Voorhees <[email protected]>
Co-authored-by: jason_w <[email protected]>
Co-authored-by: poulphunter <>
Labels
bug Something isn't working
4 participants