Commit 56f16d6: Update SD readme

1 parent 7a55ab9
1 file changed: shark/examples/shark_inference/stable_diffusion/README.md (36 additions, 24 deletions)

@@ -4,6 +4,41 @@

Follow setup instructions in the main [README.md](https://github.com/nod-ai/SHARK#readme) for regular usage.

## Using other supported Stable Diffusion variants with SHARK:

Currently we support fine-tuned versions of Stable Diffusion such as:
- [AnythingV3](https://huggingface.co/Linaqruf/anything-v3.0)
- [Analog Diffusion](https://huggingface.co/wavymulder/Analog-Diffusion)

Use the flag `--hf_model_id=` to specify the repo-id of the model to be used.

```shell
python .\shark\examples\shark_inference\stable_diffusion\main.py --hf_model_id="Linaqruf/anything-v3.0" --max_length=77 --prompt="1girl, brown hair, green eyes, colorful, autumn, cumulonimbus clouds, lighting, blue sky, falling leaves, garden"
```

## Run a custom model using a `.ckpt` file:
* Install the required dependencies:
```shell
pip install omegaconf safetensors pytorch_lightning
```
* Download a [.ckpt](https://huggingface.co/andite/anything-v4.0/resolve/main/anything-v4.0-pruned-fp32.ckpt) file if you don't already have a locally generated `.ckpt` file for Stable Diffusion; one way to do that from the command line is sketched below.
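For example, assuming `wget` is available (any downloader works; the URL is the same one linked above):
```shell
# Download the example Anything-v4.0 fp32 checkpoint into the current directory.
wget https://huggingface.co/andite/anything-v4.0/resolve/main/anything-v4.0-pruned-fp32.ckpt
```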
* Now pass the downloaded `.ckpt` file to the `ckpt_loc` command-line argument:
```shell
python3.10 main.py --precision=fp16 --device=vulkan --prompt="tajmahal, oil on canvas, sunflowers, 4k, uhd" --max_length=64 --import_mlir --ckpt_loc="/path/to/.ckpt/file" --hf_model_id="<HuggingFace repo-id>"
```
* This feature relies on a combination of three flags, `import_mlir`, `ckpt_loc`, and `hf_model_id`, of which `import_mlir` must be present. If `ckpt_loc` is not specified, a [default](https://huggingface.co/stabilityai/stable-diffusion-2-1-base) HuggingFace repo-id is run via `hf_model_id`. When you do pass a `.ckpt`, use `hf_model_id` to specify which base model the checkpoint was fine-tuned from; an illustrative invocation follows.
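As an illustration only, and assuming the checkpoint downloaded above is a fine-tune of Stable Diffusion v1.5 (an assumption made here for the example; check the model card for the actual base model), the invocation might look like this:
```shell
# Hypothetical example: the base repo-id below is an assumption, not confirmed by the model card.
python3.10 main.py --precision=fp16 --device=vulkan --import_mlir \
    --ckpt_loc="./anything-v4.0-pruned-fp32.ckpt" \
    --hf_model_id="runwayml/stable-diffusion-v1-5" \
    --max_length=64 --prompt="tajmahal, oil on canvas, sunflowers, 4k, uhd"
```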
* Custom model `.ckpt` files from [HuggingFace-StableDiffusion](https://huggingface.co/models?other=stable-diffusion) can be used to generate images. If you want to use other variants from HuggingFace, add a mapping from the variant to its base model in [variants.json](https://github.com/nod-ai/SHARK/blob/main/shark/examples/shark_inference/stable_diffusion/resources/variants.json); a rough sketch of such an entry follows.
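The exact schema of `variants.json` is not reproduced here; assuming it simply maps a variant repo-id to the repo-id of its base model, an entry might look roughly like this (both key and value are illustrative, so check the existing file for the real format):
```json
{
  "andite/anything-v4.0": "runwayml/stable-diffusion-v1-5"
}
```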

</details>

<details>
<summary>Debug Commands</summary>

## Debug commands and other advanced usage follows.

```shell
@@ -43,27 +78,4 @@ unzip ~/.local/shark_tank/<your unet>/inputs.npz
iree-benchmark-module --module_file=/path/to/output/vmfb --entry_function=forward --function_input=@arr_0.npy --function_input=1xf16 --function_input=@arr_2.npy --function_input=@arr_3.npy --function_input=@arr_4.npy
```

</details>
