
Could not resize my SDXL LoCON #3

Open · DarkAlchy opened this issue May 28, 2024 · 2 comments

@DarkAlchy commented:

D:\resize_lora>python resize_lora.py F:/stable-diffusion-webui/models/Stable-diffusion/sd_xl_base_1.0.safetensors F:/stable-diffusion-webui/models/Lora/123_XL_V1.safetensors -o .\ -v -r fro_ckpt=1,thr=-2.0
INFO:root:Processing LoRA model: F:/stable-diffusion-webui/models/Lora/Claymation_XL_V1.safetensors
INFO:root:No LoRA layer for 'model.diffusion_model.input_blocks.0.0.weight' (320, 4, 3, 3), expected LoRA key: 'lora_unet_input_blocks_0_0'
INFO:root:No LoRA layer for 'model.diffusion_model.input_blocks.1.0.emb_layers.1.weight' (320, 1280), expected LoRA key: 'lora_unet_input_blocks_1_0_emb_layers_1'
INFO:root:No LoRA layer for 'model.diffusion_model.input_blocks.1.0.in_layers.2.weight' (320, 320, 3, 3), expected LoRA key: 'lora_unet_input_blocks_1_0_in_layers_2'
INFO:root:No LoRA layer for 'model.diffusion_model.input_blocks.1.0.out_layers.3.weight' (320, 320, 3, 3), expected LoRA key: 'lora_unet_input_blocks_1_0_out_layers_3'
INFO:root:No LoRA layer for 'model.diffusion_model.input_blocks.2.0.emb_layers.1.weight' (320, 1280), expected LoRA key: 'lora_unet_input_blocks_2_0_emb_layers_1'
INFO:root:No LoRA layer for 'model.diffusion_model.input_blocks.2.0.in_layers.2.weight' (320, 320, 3, 3), expected LoRA key: 'lora_unet_input_blocks_2_0_in_layers_2'
INFO:root:No LoRA layer for 'model.diffusion_model.input_blocks.2.0.out_layers.3.weight' (320, 320, 3, 3), expected LoRA key: 'lora_unet_input_blocks_2_0_out_layers_3'
INFO:root:No LoRA layer for 'model.diffusion_model.input_blocks.3.0.op.weight' (320, 320, 3, 3), expected LoRA key: 'lora_unet_input_blocks_3_0_op'
INFO:root:No LoRA layer for 'model.diffusion_model.label_emb.0.0.weight' (1280, 2816), expected LoRA key: 'lora_unet_label_emb_0_0'
INFO:root:No LoRA layer for 'model.diffusion_model.label_emb.0.2.weight' (1280, 1280), expected LoRA key: 'lora_unet_label_emb_0_2'
INFO:root:No LoRA layer for 'model.diffusion_model.out.2.weight' (4, 320, 3, 3), expected LoRA key: 'lora_unet_out_2'
INFO:root:No LoRA layer for 'model.diffusion_model.output_blocks.6.0.emb_layers.1.weight' (320, 1280), expected LoRA key: 'lora_unet_output_blocks_6_0_emb_layers_1'
INFO:root:No LoRA layer for 'model.diffusion_model.output_blocks.6.0.in_layers.2.weight' (320, 960, 3, 3), expected LoRA key: 'lora_unet_output_blocks_6_0_in_layers_2'
INFO:root:No LoRA layer for 'model.diffusion_model.output_blocks.6.0.out_layers.3.weight' (320, 320, 3, 3), expected LoRA key: 'lora_unet_output_blocks_6_0_out_layers_3'
INFO:root:No LoRA layer for 'model.diffusion_model.output_blocks.6.0.skip_connection.weight' (320, 960, 1, 1), expected LoRA key: 'lora_unet_output_blocks_6_0_skip_connection'
INFO:root:No LoRA layer for 'model.diffusion_model.output_blocks.7.0.emb_layers.1.weight' (320, 1280), expected LoRA key: 'lora_unet_output_blocks_7_0_emb_layers_1'
INFO:root:No LoRA layer for 'model.diffusion_model.output_blocks.7.0.in_layers.2.weight' (320, 640, 3, 3), expected LoRA key: 'lora_unet_output_blocks_7_0_in_layers_2'
INFO:root:No LoRA layer for 'model.diffusion_model.output_blocks.7.0.out_layers.3.weight' (320, 320, 3, 3), expected LoRA key: 'lora_unet_output_blocks_7_0_out_layers_3'
INFO:root:No LoRA layer for 'model.diffusion_model.output_blocks.7.0.skip_connection.weight' (320, 640, 1, 1), expected LoRA key: 'lora_unet_output_blocks_7_0_skip_connection'
INFO:root:No LoRA layer for 'model.diffusion_model.output_blocks.8.0.emb_layers.1.weight' (320, 1280), expected LoRA key: 'lora_unet_output_blocks_8_0_emb_layers_1'
INFO:root:No LoRA layer for 'model.diffusion_model.output_blocks.8.0.in_layers.2.weight' (320, 640, 3, 3), expected LoRA key: 'lora_unet_output_blocks_8_0_in_layers_2'
INFO:root:No LoRA layer for 'model.diffusion_model.output_blocks.8.0.out_layers.3.weight' (320, 320, 3, 3), expected LoRA key: 'lora_unet_output_blocks_8_0_out_layers_3'
INFO:root:No LoRA layer for 'model.diffusion_model.output_blocks.8.0.skip_connection.weight' (320, 640, 1, 1), expected LoRA key: 'lora_unet_output_blocks_8_0_skip_connection'
INFO:root:No LoRA layer for 'model.diffusion_model.time_embed.0.weight' (1280, 320), expected LoRA key: 'lora_unet_time_embed_0'
INFO:root:No LoRA layer for 'model.diffusion_model.time_embed.2.weight' (1280, 1280), expected LoRA key: 'lora_unet_time_embed_2'
Traceback (most recent call last):
  File "D:\resize_lora\resize_lora.py", line 314, in <module>
    main()
  File "D:\resize_lora\resize_lora.py", line 301, in main
    paired = PairedLoraModel(lora_model_path, checkpoint)
  File "D:\resize_lora\loralib\__init__.py", line 120, in __init__
    raise ValueError(f"Target layer not found for LoRA {lora_layer_keys}")
ValueError: Target layer not found for LoRA lora_unet_input_blocks_1_0_emb_layers_1.diff
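The failing key ends in `.diff` rather than the usual `lora_down.weight`/`lora_up.weight` pair, which suggests the file contains full-rank weight-difference modules in a format the script's `PairedLoraModel` does not recognize. A minimal diagnostic sketch (not part of resize_lora; the path is a placeholder) can confirm this by tallying the key suffixes present in the file:

```python
# Hedged diagnostic sketch (not resize_lora code): tally the key suffixes in a
# LoRA/LyCORIS safetensors file. Standard LoRA layers contribute
# "lora_down.weight"/"lora_up.weight"/"alpha" suffixes; a ".diff" suffix, as
# in the traceback above, indicates a full-rank weight-difference module.
from collections import Counter

from safetensors import safe_open

PATH = "Claymation_XL_V1.safetensors"  # placeholder; substitute your own file

suffixes = Counter()
with safe_open(PATH, framework="pt", device="cpu") as f:
    for key in f.keys():
        # Keys look like "lora_unet_<module>.<suffix>"; everything after the
        # first dot identifies the tensor's role within the module.
        _module, _, suffix = key.partition(".")
        suffixes[suffix] += 1

for suffix, count in suffixes.most_common():
    print(f"{count:5d}  .{suffix}")
```

If `.diff` (or other non-standard) suffixes show up in the tally, the error stems from an unsupported module type rather than a corrupted file.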

@elias-gaeros (Owner) commented:

Thanks for reporting this!

Could you tell me more about how this LoCon was trained? Is the safetensors file available online?

@DarkAlchy (Author) commented:

WOW, I am not sure what is wrong with GitHub; some of my tickets, including this one, are not emailing me to tell me I had a response. It is not online, as it is one I had just trained using Kohya.
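
Since the file was trained with Kohya, its training configuration can often be recovered from the safetensors header itself; a hedged sketch, assuming the `ss_`-prefixed metadata that sd-scripts typically writes (the path is again a placeholder):

```python
# Hedged sketch: Kohya/sd-scripts typically embeds training settings in the
# safetensors header metadata under "ss_"-prefixed keys, which can answer
# "how was this LoCon trained?" without the original training logs.
from safetensors import safe_open

with safe_open("Claymation_XL_V1.safetensors", framework="pt", device="cpu") as f:
    meta = f.metadata() or {}

for key in ("ss_network_module", "ss_network_dim", "ss_network_alpha", "ss_network_args"):
    print(f"{key}: {meta.get(key, '<not present>')}")
```

If present, `ss_network_module` and `ss_network_args` would show the LyCORIS algorithm and settings used, which is exactly the information requested above.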
