OFT from Controlling Text-to-Image Diffusion by Orthogonal Finetuning is now working in webui after Kohaku's inference fix: AUTOMATIC1111/stable-diffusion-webui#14300

To make it work with OFT, I initially tried adding "NetworkModuleOFT": "oft_blocks" to LORAANDSOON. This runs but does not give good results, because OFT performs a multiplication instead of an addition. What really fixes the problem is to add the following at line 981:
if ltype == "NetworkModuleOFT":
    # OFT blocks act multiplicatively, so scale their effect by interpolating
    # each block toward the identity matrix rather than scaling the block itself.
    oft_parameters = torch.nn.Parameter(getattr(lora.modules[key], LORAANDSOON[ltype]))
    I = torch.eye(oft_parameters.size()[1]).to(oft_parameters.device)
    oft_parameters = oft_parameters * ratio + I * (1 - ratio)
    setattr(lora.modules[key], "oft_blocks", oft_parameters)
    set = True
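For intuition, here is a minimal self-contained sketch (toy matrices and hypothetical variable names, not code from the extension) of why the block weight has to be applied as an interpolation toward the identity for OFT, while for an additive LoRA delta a plain scale is enough:

import torch

ratio = 0.5
W = torch.randn(4, 4)

# Additive LoRA update: the block weight is just a scale on the delta.
delta = torch.randn(4, 4)
W_lora = W + ratio * delta                      # ratio = 0 -> W unchanged, ratio = 1 -> full delta

# Multiplicative OFT update: each block stores a matrix applied as R @ W,
# so "half strength" means moving R halfway toward the identity, not scaling R itself.
R = torch.linalg.qr(torch.randn(4, 4)).Q        # stand-in for one orthogonal OFT block
R_scaled = ratio * R + (1 - ratio) * torch.eye(4)
W_oft = R_scaled @ W                            # ratio = 0 -> W unchanged, ratio = 1 -> full rotation

Scaling the block directly (ratio * R @ W) would shrink the whole weight rather than just weaken the OFT effect, which is why the simple LORAANDSOON entry alone misbehaves.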
Illustration (from this OFT bundle: https://civitai.com/models/173081):

By the way, the same issue with the bundle format (i.e. an additional network with built-in embeddings) that I mentioned earlier for regional prompter in hako-mikan/sd-webui-regional-prompter#258 applies to LBW as well. This is not so critical, but it may be worth some investigation if you have some time.
Thank you for the great plugins in any case.