For the LoRA and other attention-processor related tests to run, `attn_processors` (and its corresponding setter) needs to be implemented on `UNet3DConditionModel`. The comments in `tests/models/test_models_unet_3d_condition.py` describe what is required; a rough sketch follows below.
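As a starting point, here is a minimal sketch of what such a property and setter might look like, assuming the same recursive pattern used by the 2D UNet: attention modules expose a `set_processor` method and a `processor` attribute, and processors are keyed by the fully qualified module path plus `.processor`. The class name `UNet3DConditionModelSketch` and the exact method signatures are illustrative assumptions, not the final API.

```python
# Hypothetical sketch only -- mirrors the recursive attn-processor pattern,
# but is not the actual UNet3DConditionModel implementation.
from typing import Dict, Union

import torch.nn as nn


class UNet3DConditionModelSketch(nn.Module):
    @property
    def attn_processors(self) -> Dict[str, object]:
        # Collect every attention processor in the model, keyed by the
        # fully qualified module path plus ".processor".
        processors: Dict[str, object] = {}

        def add_processors(name: str, module: nn.Module) -> None:
            if hasattr(module, "set_processor"):
                processors[f"{name}.processor"] = module.processor
            for sub_name, child in module.named_children():
                add_processors(f"{name}.{sub_name}", child)

        for name, module in self.named_children():
            add_processors(name, module)
        return processors

    def set_attn_processor(self, processor: Union[object, Dict[str, object]]) -> None:
        # Accept either a single processor (applied to every attention module)
        # or a dict keyed exactly like `attn_processors`.
        count = len(self.attn_processors)
        if isinstance(processor, dict) and len(processor) != count:
            raise ValueError(
                f"Expected a dict with {count} processors, got {len(processor)}."
            )

        def set_recursively(name: str, module: nn.Module) -> None:
            if hasattr(module, "set_processor"):
                if isinstance(processor, dict):
                    module.set_processor(processor.pop(f"{name}.processor"))
                else:
                    module.set_processor(processor)
            for sub_name, child in module.named_children():
                set_recursively(f"{name}.{sub_name}", child)

        for name, module in self.named_children():
            set_recursively(name, module)
```

With something along these lines in place, the LoRA tests could swap processors in and out (e.g. `model.set_attn_processor(new_processor)`) and compare outputs, the same way the 2D UNet tests do.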