Fix running LoRA with xformers #2286
Conversation
The documentation is not available anymore as the PR was closed or merged.
Great job @bddppq ! This looks exactly like it should :-)
Could we maybe add one quick test that ensures that the two implementations are identical? It should be pretty easy to add, similar to this test:
That would be amazing! Also happy to do it if you're too busy #1877
Thanks a lot!
* Fix running LoRA with xformers
* support disabling xformers
* reformat
* Add test
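The equivalence test requested above (checking that the xformers and non-xformers attention paths produce the same result) can be sketched in isolation. This is a hedged numpy illustration, not diffusers' actual test or attention code: `attention_chunked` stands in for a memory-efficient path, and `lora_proj` is a generic LoRA-augmented projection; all names are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def lora_proj(x, w, a, b, scale=1.0):
    # base projection plus low-rank LoRA update: x @ (W + scale * A @ B)
    return x @ w + scale * (x @ a) @ b

def attention_standard(q, k, v):
    # plain scaled dot-product attention
    scores = softmax(q @ k.T / np.sqrt(q.shape[-1]))
    return scores @ v

def attention_chunked(q, k, v, chunk=4):
    # process queries in chunks, as a stand-in for a memory-efficient path;
    # the output must match the standard path exactly
    out = [attention_standard(q[i:i + chunk], k, v)
           for i in range(0, len(q), chunk)]
    return np.concatenate(out)

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 16))
w_q = rng.normal(size=(16, 16))
a = rng.normal(size=(16, 4))    # LoRA down-projection
b = rng.normal(size=(4, 16))    # LoRA up-projection
w_k = rng.normal(size=(16, 16))
w_v = rng.normal(size=(16, 16))

q = lora_proj(x, w_q, a, b, scale=0.5)
k, v = x @ w_k, x @ w_v

out_ref = attention_standard(q, k, v)
out_eff = attention_chunked(q, k, v)
assert np.allclose(out_ref, out_eff, atol=1e-6)
```

The real test in diffusers would compare the pipeline output with memory-efficient attention enabled and disabled, but the structure is the same: run both paths on identical inputs and assert the outputs are numerically close.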
I found this issue while noticing that my inference times also increased after loading a LoRA. I am using a LoRA from civitai:
Is it normal that inference now takes longer? Should I load the model differently? It is not clear to me from the example whether I should first run the model without xformers and then activate it. Thank you very much!
Nevermind! I found
and this worked!
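On why loading a LoRA can slow inference at all: in the unmerged form each LoRA-adapted projection performs two extra low-rank matmuls per call, whereas folding the update into the base weight once gives mathematically identical outputs with no per-call overhead. A minimal numpy sketch of that identity, using the generic LoRA formulation (W' = W + scale·A·B) rather than diffusers' actual merging code:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=(4, 32))
w = rng.normal(size=(32, 32))
a = rng.normal(size=(32, 8))    # LoRA down-projection (rank 8)
b = rng.normal(size=(8, 32))    # LoRA up-projection
scale = 0.75

# unmerged: two extra matmuls on every forward call
y_unmerged = x @ w + scale * (x @ a) @ b

# merged: fold the low-rank update into the base weight once, up front
w_merged = w + scale * (a @ b)
y_merged = x @ w_merged

# the two forms are numerically equivalent
assert np.allclose(y_unmerged, y_merged, atol=1e-6)
```

So whether the LoRA costs anything at inference time depends on whether the weights are merged, independently of the xformers setting.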
#2247
#2124