llama-server : Improve messages bubble shape in RTL #11220

Merged: 1 commit into ggml-org:master on Jan 13, 2025

Conversation

@ebraminio ebraminio (Contributor) commented on Jan 13, 2025

I simply overlooked the message bubble's tail placement for RTL text in #11208, since I use dark mode (#2414) and the issue isn't visible there; this fixes it.

This turns:

[screenshot: before]

into:

[screenshot: after]
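The actual diff isn't shown on this page, but as a rough sketch of the general idea: a chat UI can flip the bubble's tail (the one un-rounded corner) depending on whether the message text is right-to-left. Everything below, the `isRTL` helper and the class names, is illustrative and assumed, not the real llama-server webui code.

```ts
// Hedged sketch, not the actual llama-server webui implementation.
// Detect whether a message should be laid out right-to-left by looking at
// its first strongly-directional character (same spirit as dir="auto").
const RTL_CHARS = /[\u0591-\u07FF\uFB1D-\uFDFD\uFE70-\uFEFC]/; // Hebrew, Arabic, etc.
const LTR_CHARS = /[A-Za-z]/;

export function isRTL(text: string): boolean {
  for (const ch of text) {
    if (RTL_CHARS.test(ch)) return true;  // first strong character is RTL
    if (LTR_CHARS.test(ch)) return false; // first strong character is LTR
  }
  return false; // no strongly-directional characters: default to LTR
}

// In RTL the bubble is mirrored, so the flat "tail" corner has to switch sides.
// The class names here are hypothetical placeholders for whatever the UI uses.
export function bubbleTailClass(text: string): string {
  return isRTL(text) ? "bubble-tail-right" : "bubble-tail-left";
}
```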

@ngxson ngxson merged commit 504af20 into ggml-org:master Jan 13, 2025
6 checks passed
tinglou pushed a commit to tinglou/llama.cpp that referenced this pull request Feb 13, 2025
arthw pushed a commit to arthw/llama.cpp that referenced this pull request Feb 26, 2025
mglambda pushed a commit to mglambda/llama.cpp that referenced this pull request Mar 8, 2025