Describe the bug
Hello! I am trying to use VersatileDiffusionPipeline. Following the instructions, I get a "get_down_block() got an unexpected keyword argument 'resnet_skip_time_act'" error.
In diffusers/pipelines/versatile_diffusion/modeling_text_unet.py, line 422 does pass resnet_skip_time_act=resnet_skip_time_act in the call to get_down_block, but the definition of get_down_block on line 27 of the same file does not accept this parameter.
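To make the mismatch concrete, here is a stripped-down sketch (the signature and arguments below are a hypothetical simplification, not the real code): a keyword argument is forwarded at the call site that the function definition does not declare, so Python raises exactly this TypeError.

# Hypothetical simplification of the situation in modeling_text_unet.py:
# the call site forwards resnet_skip_time_act, but the module's local copy
# of get_down_block does not declare that parameter.
def get_down_block(down_block_type, num_layers, in_channels):
    return None  # the real function constructs and returns a down block

get_down_block(
    "DownBlockFlat",
    num_layers=2,
    in_channels=320,
    resnet_skip_time_act=False,  # not in the signature above
)
# TypeError: get_down_block() got an unexpected keyword argument 'resnet_skip_time_act'

Presumably the fix is to accept resnet_skip_time_act (with a default) in this local get_down_block and forward it, the same way get_down_block in models/unet_2d_blocks.py does, but I have not verified that.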
Reproduction
from diffusers import VersatileDiffusionPipeline
import torch
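# Fails during model instantiation with:
# TypeError: get_down_block() got an unexpected keyword argument 'resnet_skip_time_act'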
pipe = VersatileDiffusionPipeline.from_pretrained("shi-labs/versatile-diffusion", torch_dtype=torch.float16)
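As a quick check (this assumes the module layout of diffusers 0.15.1), the installed copy of get_down_block can be inspected to confirm that it does not accept the parameter:

import inspect
from diffusers.pipelines.versatile_diffusion import modeling_text_unet

# On the affected version this prints False: the local get_down_block
# declares no resnet_skip_time_act parameter, while the call site passes one.
sig = inspect.signature(modeling_text_unet.get_down_block)
print("resnet_skip_time_act" in sig.parameters)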
Logs
╭─────────────────────────────── Traceback (most recent call last) ────────────────────────────────╮
│ in <cell line: 5>:5 │
│ │
│ /usr/local/lib/python3.9/dist-packages/diffusers/pipelines/pipeline_utils.py:1008 in │
│ from_pretrained │
│ │
│ 1005 │ │ │ │ loaded_sub_model = passed_class_obj[name] │
│ 1006 │ │ │ else: │
│ 1007 │ │ │ │ # load sub model │
│ ❱ 1008 │ │ │ │ loaded_sub_model = load_sub_model( │
│ 1009 │ │ │ │ │ library_name=library_name, │
│ 1010 │ │ │ │ │ class_name=class_name, │
│ 1011 │ │ │ │ │ importable_classes=importable_classes, │
│ │
│ /usr/local/lib/python3.9/dist-packages/diffusers/pipelines/pipeline_utils.py:444 in │
│ load_sub_model │
│ │
│ 441 │ │
│ 442 │ # check if the module is in a subdirectory │
│ 443 │ if os.path.isdir(os.path.join(cached_folder, name)): │
│ ❱ 444 │ │ loaded_sub_model = load_method(os.path.join(cached_folder, name), **loading_kwar │
│ 445 │ else: │
│ 446 │ │ # else load from the root directory │
│ 447 │ │ loaded_sub_model = load_method(cached_folder, **loading_kwargs) │
│ │
│ /usr/local/lib/python3.9/dist-packages/diffusers/models/modeling_utils.py:571 in from_pretrained │
│ │
│ 568 │ │ │ if low_cpu_mem_usage: │
│ 569 │ │ │ │ # Instantiate model with empty weights │
│ 570 │ │ │ │ with accelerate.init_empty_weights(): │
│ ❱ 571 │ │ │ │ │ model = cls.from_config(config, **unused_kwargs) │
│ 572 │ │ │ │ │
│ 573 │ │ │ │ # if device_map is None, load the state dict and move the params from me │
│ 574 │ │ │ │ if device_map is None: │
│ │
│ /usr/local/lib/python3.9/dist-packages/diffusers/configuration_utils.py:229 in from_config │
│ │
│ 226 │ │ │ │ init_dict[deprecated_kwarg] = unused_kwargs.pop(deprecated_kwarg) │
│ 227 │ │ │
│ 228 │ │ # Return model and optionally state and/or unused_kwargs │
│ ❱ 229 │ │ model = cls(**init_dict) │
│ 230 │ │ │
│ 231 │ │ # make sure to also save config parameters that might be used for compatible cla │
│ 232 │ │ model.register_to_config(**hidden_dict) │
│ │
│ /usr/local/lib/python3.9/dist-packages/diffusers/configuration_utils.py:607 in inner_init │
│ │
│ 604 │ │ ) │
│ 605 │ │ new_kwargs = {**config_init_kwargs, **new_kwargs} │
│ 606 │ │ getattr(self, "register_to_config")(**new_kwargs) │
│ ❱ 607 │ │ init(self, *args, **init_kwargs) │
│ 608 │ │
│ 609 │ return inner_init │
│ 610 │
│ │
│ /usr/local/lib/python3.9/dist-packages/diffusers/pipelines/versatile_diffusion/modeling_text_une │
│ t.py:422 in __init__ │
│ │
│ 419 │ │ │ output_channel = block_out_channels[i] │
│ 420 │ │ │ is_final_block = i == len(block_out_channels) - 1 │
│ 421 │ │ │ │
│ ❱ 422 │ │ │ down_block = get_down_block( │
│ 423 │ │ │ │ down_block_type, │
│ 424 │ │ │ │ num_layers=layers_per_block[i], │
│ 425 │ │ │ │ in_channels=input_channel, │
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
TypeError: get_down_block() got an unexpected keyword argument 'resnet_skip_time_act'
System Info
- diffusers version: 0.15.1
- Platform: Linux-5.10.147+-x86_64-with-glibc2.31
- Python version: 3.9.16
- PyTorch version (GPU?): 2.0.0+cu118 (True)
- Huggingface_hub version: 0.13.4
- Transformers version: 4.28.1
- Accelerate version: 0.18.0
- xFormers version: not installed