
Issue with emoji decoding in streaming mode only #57

@CyberTimon

Description


When the model wants to output an emoji, this error comes up:

Debugging middleware caught exception in streamed response at a point where response headers were already sent.

Traceback (most recent call last):
  File "C:\Users\zblac\AppData\Local\Programs\Python\Python310\lib\site-packages\werkzeug\wsgi.py", line 500, in __next__
    return self._next()
  File "C:\Users\zblac\AppData\Local\Programs\Python\Python310\lib\site-packages\werkzeug\wrappers\response.py", line 50, in _iter_encoded
    for item in iterable:
  File "C:\Users\zblac\llama.cpp\test\normal.py", line 37, in vicuna
    for line in response:
  File "C:\Users\zblac\AppData\Local\Programs\Python\Python310\lib\site-packages\llama_cpp\llama.py", line 370, in _create_completion
    "text": text[start:].decode("utf-8"),
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xf0 in position 0: unexpected end of data
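The byte 0xf0 in the traceback is the first byte of a 4-byte UTF-8 sequence, which is how emoji are encoded. In streaming mode, a token boundary can fall in the middle of such a sequence, so calling text[start:].decode("utf-8") on the partial bytes raises UnicodeDecodeError. Below is a minimal sketch of one way to handle this, using the standard library's incremental UTF-8 decoder to buffer incomplete sequences until the next chunk arrives. The stream_text helper and the example chunks are hypothetical for illustration; this is not llama-cpp-python's actual fix.

import codecs

# Incremental decoder: buffers a trailing partial multi-byte
# sequence instead of raising UnicodeDecodeError mid-character.
decoder = codecs.getincrementaldecoder("utf-8")()

def stream_text(byte_chunks):
    """Yield decoded text from chunks that may split UTF-8 characters."""
    for chunk in byte_chunks:
        # decode() returns only the complete characters; any incomplete
        # tail (e.g. the first bytes of an emoji) stays buffered.
        text = decoder.decode(chunk)
        if text:
            yield text
    # final=True flushes the buffer and raises if stray bytes remain.
    tail = decoder.decode(b"", final=True)
    if tail:
        yield tail

# Example: the 4-byte emoji U+1F600 split across two chunks.
chunks = [b"Hello \xf0\x9f", b"\x98\x80!"]
print("".join(stream_text(chunks)))  # -> "Hello 😀!"

An equivalent approach inside the completion loop would be to accumulate raw bytes and only emit text once the accumulated buffer decodes cleanly.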

Labels: bug (Something isn't working), quality (Quality of model output)
