The demo is giving a runtime error #27

@gulcegumrukcu

Description

https://huggingface.co/spaces/ymzhang319/FoleyCrafter

The demo is giving a runtime error:

runtime error
Exit code: 1. Reason: eback (most recent call last):
  File "/home/user/app/app.py", line 310, in <module>
    gr.Examples(
  File "/usr/local/lib/python3.10/site-packages/gradio/helpers.py", line 74, in create_examples
    examples_obj.create()
  File "/usr/local/lib/python3.10/site-packages/gradio/helpers.py", line 314, in create
    self._start_caching()
  File "/usr/local/lib/python3.10/site-packages/gradio/helpers.py", line 365, in _start_caching
    client_utils.synchronize_async(self.cache)
  File "/usr/local/lib/python3.10/site-packages/gradio_client/utils.py", line 855, in synchronize_async
    return fsspec.asyn.sync(fsspec.asyn.get_loop(), func, *args, **kwargs)  # type: ignore
  File "/usr/local/lib/python3.10/site-packages/fsspec/asyn.py", line 103, in sync
    raise return_result
  File "/usr/local/lib/python3.10/site-packages/fsspec/asyn.py", line 56, in _runner
    result[0] = await coro
  File "/usr/local/lib/python3.10/site-packages/gradio/helpers.py", line 487, in cache
    prediction = await Context.root_block.process_api(
  File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1928, in process_api
    result = await self.call_function(
  File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1514, in call_function
    prediction = await anyio.to_thread.run_sync(
  File "/usr/local/lib/python3.10/site-packages/anyio/to_thread.py", line 56, in run_sync
    return await get_async_backend().run_sync_in_worker_thread(
  File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 2441, in run_sync_in_worker_thread
    return await future
  File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 943, in run
    result = context.run(func, *args)
  File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 833, in wrapper
    response = f(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/spaces/zero/wrappers.py", line 211, in gradio_handler
    raise gr.Error("GPU task aborted")
gradio.exceptions.Error: 'GPU task aborted'
Container logs:

===== Application Startup at 2024-10-31 01:31:42 =====

Start Load Models...
Start Load Models...
Removing weight norm...
=> loading checkpoint '/home/user/app/models/timestamp_detector.pth.tar'
=> loaded checkpoint '/home/user/app/models/timestamp_detector.pth.tar' (epoch 23)
Cannot initialize model with low cpu memory usage because `accelerate` was not found in the environment. Defaulting to `low_cpu_mem_usage=False`. It is strongly recommended to install `accelerate` for faster and less memory-intense model loading. You can do so with: 

pip install accelerate

.
Cannot initialize model with low cpu memory usage because `accelerate` was not found in the environment. Defaulting to `low_cpu_mem_usage=False`. It is strongly recommended to install `accelerate` for faster and less memory-intense model loading. You can do so with: 

pip install accelerate

.
The config attributes {'decay': 0.9999, 'inv_gamma': 1.0, 'min_decay': 0.0, 'optimization_step': 100000, 'power': 0.6666666666666666, 'update_after_step': 0, 'use_ema_warmup': False} were passed to UNet2DConditionModel, but are not expected and will be ignored. Please verify your config.json configuration file.
### Control Net missing keys: 0; 
### unexpected keys: 0;
Cannot initialize model with low cpu memory usage because `accelerate` was not found in the environment. Defaulting to `low_cpu_mem_usage=False`. It is strongly recommended to install `accelerate` for faster and less memory-intense model loading. You can do so with: 

pip install accelerate

.
image_encoder is not loaded since `image_encoder_folder=None` passed. You will not be able to use `ip_adapter_image` when calling the pipeline with IP-Adapter.Use `ip_adapter_image_embeds` to pass pre-generated image embedding instead.
Load Finish!
Load Finish!
Caching examples at: '/home/user/app/gradio_cached_examples/36'
Caching example 1/4
Traceback (most recent call last):
  File "/home/user/app/app.py", line 310, in <module>
    gr.Examples(
  File "/usr/local/lib/python3.10/site-packages/gradio/helpers.py", line 74, in create_examples
    examples_obj.create()
  File "/usr/local/lib/python3.10/site-packages/gradio/helpers.py", line 314, in create
    self._start_caching()
  File "/usr/local/lib/python3.10/site-packages/gradio/helpers.py", line 365, in _start_caching
    client_utils.synchronize_async(self.cache)
  File "/usr/local/lib/python3.10/site-packages/gradio_client/utils.py", line 855, in synchronize_async
    return fsspec.asyn.sync(fsspec.asyn.get_loop(), func, *args, **kwargs)  # type: ignore
  File "/usr/local/lib/python3.10/site-packages/fsspec/asyn.py", line 103, in sync
    raise return_result
  File "/usr/local/lib/python3.10/site-packages/fsspec/asyn.py", line 56, in _runner
    result[0] = await coro
  File "/usr/local/lib/python3.10/site-packages/gradio/helpers.py", line 487, in cache
    prediction = await Context.root_block.process_api(
  File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1928, in process_api
    result = await self.call_function(
  File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1514, in call_function
    prediction = await anyio.to_thread.run_sync(
  File "/usr/local/lib/python3.10/site-packages/anyio/to_thread.py", line 56, in run_sync
    return await get_async_backend().run_sync_in_worker_thread(
  File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 2441, in run_sync_in_worker_thread
    return await future
  File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 943, in run
    result = context.run(func, *args)
  File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 833, in wrapper
    response = f(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/spaces/zero/wrappers.py", line 211, in gradio_handler
    raise gr.Error("GPU task aborted")
gradio.exceptions.Error: 'GPU task aborted'
Start Load Models...
Start Load Models...
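
Both tracebacks point at the same place: the error is raised while gr.Examples(...) in app.py (line 310) caches the example outputs during startup, and the ZeroGPU wrapper in spaces/zero/wrappers.py aborts the GPU task before the UI is ever served. A possible workaround for the Space owner, assuming the examples do not need to be pre-rendered, would be to turn off example caching so the GPU is only requested when someone actually clicks an example. A minimal sketch follows; the component names, example paths, and inference function are placeholders rather than the Space's real code, and only cache_examples=False is the relevant change:

    import gradio as gr

    def generate_audio(video_path: str, prompt: str) -> str:
        """Placeholder for the Space's real video-to-audio inference function."""
        ...

    with gr.Blocks() as demo:
        video_in = gr.Video()
        prompt_in = gr.Textbox()
        result_out = gr.Video()
        gr.Examples(
            examples=[["examples/rain.mp4", "rain"]],  # placeholder example rows
            inputs=[video_in, prompt_in],
            outputs=[result_out],
            fn=generate_audio,
            cache_examples=False,  # skip the startup caching pass that triggers the aborted GPU tasks
        )

The repeated accelerate warning in the logs looks unrelated to the crash; adding accelerate to the Space's requirements.txt should silence it, but that alone would presumably not fix the "GPU task aborted" failure.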
