
Add lora-embedding bundle system #13568

Merged (6 commits) on Oct 14, 2023

Conversation

KohakuBlueleaf
Collaborator

Description

Some trainers (like hcp-diffusion and naifu) have implemented a training method called "pivotal tuning", which essentially trains an embedding and a LoRA (DreamBooth) at the same time.
But if a trainer wants to train multiple concepts within one model, this produces a lot of embedding files, which are hard for both users and trainers to manage.

So I developed this bundle system, which stores the bundled embeddings within the LoRA file and loads them with the built-in Lora extension.

The state_dict key name format:
"bundle_emb.EMBEDDING_NAME.KEY_NAME"
KEY_NAME here is the key in the embedding's state_dict, and EMBEDDING_NAME is the "trigger word" of that embedding. With this format we can store multiple embeddings within one LoRA file, which is convenient for pivotal tuning.
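To make the layout concrete, here is a minimal sketch (not the extension's actual loader code) of how the bundled embeddings could be separated from the LoRA weights after loading such a file; `split_bundled_embeddings` is a hypothetical helper name, and the trigger word is assumed to contain no extra dots.

```python
# Hedged sketch: split "bundle_emb.*" tensors out of a LoRA state dict.
# Assumed key layout: bundle_emb.<EMBEDDING_NAME>.<KEY_NAME> -> embedding tensor
from collections import defaultdict

from safetensors.torch import load_file


def split_bundled_embeddings(lora_path):
    """Return (lora_sd, {embedding_name: embedding_state_dict})."""
    state_dict = load_file(lora_path)
    lora_sd, embeddings = {}, defaultdict(dict)
    for full_key, tensor in state_dict.items():
        if full_key.startswith("bundle_emb."):
            # e.g. "bundle_emb.rosmontis.string_to_param.*"
            #   -> name "rosmontis", embedding key "string_to_param.*"
            _, emb_name, emb_key = full_key.split(".", 2)
            embeddings[emb_name][emb_key] = tensor
        else:
            lora_sd[full_key] = tensor
    return lora_sd, dict(embeddings)
```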

Thx to @narugo1992 and @cyber-meow for this idea.

Checklist:

@narugo1992

Yes, indeed. Pivotal tuning is a proven effective training method.

Check out my Civitai profile: narugo1992 on Civitai, where over 500 models were trained using LoRA+pt. User feedback indicates excellent quality, currently ranking second on the site.

Clearly, pivotal tuning has demonstrated its effectiveness and potential.

Additionally, we conducted a quantitative analysis of existing pivotal tuning, referenced in this article: Article on pivotal tuning analysis.

@cyber-meow

cyber-meow commented Oct 9, 2023

I have written a post on Civitai to explain the benefits of pivotal tuning:
https://civitai.com/articles/2494/making-better-loras-with-pivotal-tuning

Not to mention that this seems to be the go-to method for many papers:

[1] Kumari, N., Zhang, B., Zhang, R., Shechtman, E., & Zhu, J. Y. (2023). Multi-concept customization of text-to-image diffusion. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (pp. 1931-1941).
[2] Smith, J. S., Hsu, Y. C., Zhang, L., Hua, T., Kira, Z., Shen, Y., & Jin, H. (2023). Continual diffusion: Continual customization of text-to-image diffusion with c-lora. arXiv preprint arXiv:2304.06027.
[3] Gu, Y., Wang, X., Wu, J. Z., Shi, Y., Chen, Y., Fan, Z., ... & Shou, M. Z. (2023). Mix-of-Show: Decentralized Low-Rank Adaptation for Multi-Concept Customization of Diffusion Models. arXiv preprint arXiv:2305.18292.
[4] Tewel, Y., Gal, R., Chechik, G., & Atzmon, Y. (2023). Key-locked rank one editing for text-to-image personalization. In ACM SIGGRAPH 2023 Conference Proceedings (pp. 1-11).

@KohakuBlueleaf
Collaborator Author

Need to add support for string_to_param dict
draft now

@KohakuBlueleaf KohakuBlueleaf marked this pull request as draft October 10, 2023 03:54
format:
bundle_emb.EMBNAME.string_to_param.KEYNAME
@KohakuBlueleaf KohakuBlueleaf marked this pull request as ready for review October 10, 2023 04:13
Choose standalone embedding (in /embeddings folder) first
@KohakuBlueleaf KohakuBlueleaf changed the title Add lora bundle system Add lora-embedding bundle system Oct 10, 2023
@AUTOMATIC1111
Owner

Any files to test this on?

The huge copied block of code in networks.py definitely should not be there. The code in textual_inversion.py should be placed into a function and that function should be used instead. Although if there is an example file to play with I would do it myself.
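To illustrate the structure being asked for, here is a hedged sketch (not the repository's actual code) of the kind of shared helper that could live in textual_inversion.py and be called from networks.py for each bundled embedding; the `Embedding` dataclass and the `create_embedding_from_data` name are illustrative assumptions, not the final API.

```python
# Hedged sketch of a shared helper for turning an embedding state dict into
# an object, so networks.py would not need its own copied block.
from dataclasses import dataclass

import torch


@dataclass
class Embedding:               # illustrative stand-in, not webui's actual class
    name: str
    vec: torch.Tensor          # (num_vectors, embedding_dim)


def create_embedding_from_data(data: dict, name: str) -> Embedding:
    """Build an Embedding from a loaded state dict, whichever layout it uses."""
    if "string_to_param" in data:              # classic .pt-style layout
        vec = next(iter(data["string_to_param"].values()))
    elif "emb_params" in data:                 # .safetensors-style layout
        vec = data["emb_params"]
    else:
        raise ValueError(f"unknown embedding format, keys: {list(data)}")
    return Embedding(name=name, vec=vec.detach().to(torch.float32))


# networks.py side (sketch): reuse the helper for every bundled embedding
# for emb_name, emb_sd in bundled_embeddings.items():
#     embedding = create_embedding_from_data(emb_sd, emb_name)
```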

@KohakuBlueleaf
Collaborator Author

@AUTOMATIC1111 Here is the demo of the lora-emb bundle for this model:

rosmontis_arknights_bundle.zip
(GitHub doesn't allow me to upload .safetensors, so I made a zip.)
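For anyone who wants to produce a test file themselves, here is a hedged sketch of how an existing standalone embedding could be packed into a LoRA .safetensors using the bundle_emb key layout described in the PR description; the file paths and the trigger word below are placeholders, not files from this PR.

```python
# Hedged sketch: pack an existing embedding into a LoRA file as a bundle.
# Resulting keys: bundle_emb.<trigger word>.<original embedding key>
from safetensors.torch import load_file, save_file

lora_sd = load_file("rosmontis_lora.safetensors")   # plain LoRA weights (placeholder path)
emb_sd = load_file("rosmontis.safetensors")         # standalone embedding (placeholder path)

for key, tensor in emb_sd.items():
    lora_sd[f"bundle_emb.rosmontis.{key}"] = tensor

save_file(lora_sd, "rosmontis_bundle.safetensors")
```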
