optimized attention #177
base: main
The head ref may contain hidden characters: "\u0441ompvis"
Conversation
The author has previously claimed this results in a massive reduction in memory usage. If it works, it would be great to get it merged.
I suggest having only the actual changes in the PR and not changes to the style of the code (spaces, tabs, etc.), which make the PR much harder to review.
ldm/modules/attention.py
Outdated
@@ -1,9 +1,10 @@
from inspect import isfunction
please don't include style changes such as import rearrangements or whitespace changes in a pull request
FYI, you can make git fix some of that automagically. See -> https://github.com/lstein/stable-diffusion/blob/main/.gitattributes
Though, not sure how relevant in this case at second glance. >.>
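For context, a `.gitattributes` along those lines typically normalizes line endings so that CRLF/LF churn doesn't show up as whitespace-only diffs. The exact contents of the linked file are an assumption here, not a quote from it:

```gitattributes
# Assumed sketch: normalize line endings on check-in so editors on
# different platforms don't generate whitespace-only diffs.
* text=auto
*.py text eol=lf
```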
I rolled this back already
Okay, I will change it back.
I didn't implement the most consequential part (splitting the softmax in two) because the M1 Mac is not so VRAM-constrained, but I did implement the reference-freeing, and also freed x earlier.
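To illustrate, here is a minimal sketch of the two ideas mentioned in that comment, applied to a CrossAttention-style module. The layout (`to_q`/`to_k`/`to_v`, `heads`, `scale`) mirrors the usual `ldm/modules/attention.py` structure, but this is an assumed illustration, not the exact diff from this PR:

```python
import torch
import torch.nn as nn
from einops import rearrange

class MemoryFriendlyCrossAttention(nn.Module):
    # Sketch only: module layout assumed to match ldm/modules/attention.py.
    def __init__(self, query_dim, context_dim=None, heads=8, dim_head=64):
        super().__init__()
        inner_dim = heads * dim_head
        context_dim = context_dim if context_dim is not None else query_dim
        self.heads = heads
        self.scale = dim_head ** -0.5
        self.to_q = nn.Linear(query_dim, inner_dim, bias=False)
        self.to_k = nn.Linear(context_dim, inner_dim, bias=False)
        self.to_v = nn.Linear(context_dim, inner_dim, bias=False)
        self.to_out = nn.Linear(inner_dim, query_dim)

    def forward(self, x, context=None):
        h = self.heads
        q = self.to_q(x)
        context = context if context is not None else x
        k = self.to_k(context)
        v = self.to_v(context)
        # Reference-freeing: drop local names as soon as the projections
        # exist, so the large activations can be collected earlier
        # (assuming no other live references to them).
        del context, x

        q, k, v = map(lambda t: rearrange(t, 'b n (h d) -> (b h) n d', h=h),
                      (q, k, v))

        sim = torch.einsum('b i d, b j d -> b i j', q, k) * self.scale
        del q, k  # no longer needed once the similarity matrix exists

        attn = sim.softmax(dim=-1)
        del sim  # the pre-softmax matrix is the biggest intermediate

        out = torch.einsum('b i j, b j d -> b i d', attn, v)
        del attn, v

        out = rearrange(out, '(b h) n d -> b n (h d)', h=h)
        return self.to_out(out)

def softmax_in_two_halves_(sim):
    # The part the comment says was *not* implemented here: computing the
    # softmax over two halves in place, so the exp() temporary is only half
    # the size of the full similarity matrix at any moment.
    # Inference-only: in-place writes like this would break autograd.
    half = sim.shape[0] // 2
    sim[:half] = sim[:half].softmax(dim=-1)
    sim[half:] = sim[half:].softmax(dim=-1)
    return sim
```

The reason `del sim` matters is that `softmax` allocates a fresh output tensor while its input is still alive, so without the split (or the early `del`), the peak footprint is roughly two full copies of the attention matrix.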
…CompVis#177 plus some extras from me (del v, del h)
Bugfixes to image generation logic
Adapted my optimized attention module for the original fork