The current batch inference code only works correctly with the patch32 model. With other models such as patch16 or patch14, it produces incorrect results: the first embedding in a batch is correct, but every subsequent embedding collapses to the same incorrect fixed value.
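
The repository's batch-inference code is not shown here, so as a rough illustration only, the sketch below uses the Hugging Face transformers CLIP implementation (an assumption, with placeholder image paths) to compare batched embeddings against embeddings computed one image at a time. With correct batching every row should match; the reported bug would show up as mismatches from the second row onward.

```python
# Hypothetical reproduction sketch (assumes Hugging Face transformers CLIP,
# not this repository's batch code): embed a batch of images and compare
# each row against the embedding obtained one image at a time.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model_name = "openai/clip-vit-base-patch16"  # also try the patch14 / patch32 variants
model = CLIPModel.from_pretrained(model_name).eval()
processor = CLIPProcessor.from_pretrained(model_name)

# Placeholder image paths for illustration.
images = [Image.open(p).convert("RGB") for p in ["a.jpg", "b.jpg", "c.jpg"]]

with torch.no_grad():
    # Batched pass: all images in one forward call.
    batch_inputs = processor(images=images, return_tensors="pt")
    batch_emb = model.get_image_features(**batch_inputs)

    # Reference pass: one image per forward call.
    single_emb = torch.cat(
        [model.get_image_features(**processor(images=[img], return_tensors="pt"))
         for img in images]
    )

# The described bug would make rows 1..N-1 of batch_emb disagree with single_emb.
for i in range(len(images)):
    ok = torch.allclose(batch_emb[i], single_emb[i], atol=1e-4)
    print(f"image {i}: {'match' if ok else 'MISMATCH'}")
```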