
k-diffusion-euler #1019

Merged 28 commits into huggingface:main on Oct 31, 2022

Conversation

hlky
Contributor

@hlky hlky commented Oct 27, 2022

k_euler and k_euler_a

related #944 #636

@HuggingFaceDocBuilderDev

HuggingFaceDocBuilderDev commented Oct 27, 2022

The documentation is not available anymore as the PR was closed or merged.

@hlky hlky mentioned this pull request Oct 27, 2022
@patil-suraj patil-suraj self-assigned this Oct 27, 2022
Contributor

@patil-suraj patil-suraj left a comment


Thanks a lot for the PR @hlky, very cool addition!

The code looks good to me; I'll try it out a bit more and comment here. Also, it looks like the new tests are failing; happy to take a look at that.

But should be good to merge for me once the tests are green!

@patil-suraj
Contributor

Hey @hlky, would it be okay if I push commits to the PR to fix the tests? The PR should be good to merge once the tests pass :)

@hlky
Contributor Author

hlky commented Oct 28, 2022

@patil-suraj Sure, no problem :)

Member

@anton-l anton-l left a comment


Thank you @hlky, excited to see these schedulers in action!

@patil-suraj
Contributor

patil-suraj commented Oct 28, 2022

Hey @hlky, it seems like I can't push to your branch. If it's okay with you, could you allow me to push commits to your branch so I can fix the tests?

Otherwise, happy to merge this and open a follow-up PR to fix the tests. Let me know. Thank you :)

@tonetechnician

tonetechnician commented Oct 28, 2022

This is great! Thanks @hlky and @patil-suraj

One thing I noticed is that in the scheduler's step function you're generating random noise rather than seeded noise. This makes the output non-deterministic even when the same seed is used.

I've made a commit here which I'd be happy to push to this branch if you give me access too. Alternatively I can open a PR against your repo, or wait for the merge and then open a new PR with the modification; it's not a big one.

One thought though: at each step new "random" noise should be generated, so maybe we could have some kind of cyclical counter that offsets the generator's seed in the step function to keep it random. I'm not sure whether doing this would affect the quality of the result, but it would affect the output.
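To illustrate the seeded-noise point, here is a minimal sketch in plain torch. The `step_noise` helper is hypothetical, standing in for the noise draw inside a scheduler's step function: drawing from a caller-supplied `torch.Generator` instead of global randomness makes each step reproducible.

```python
import torch

def step_noise(model_output, generator=None):
    # Hypothetical stand-in for the noise draw inside a scheduler step:
    # a caller-supplied generator makes the draw reproducible.
    return torch.randn(model_output.shape, dtype=model_output.dtype,
                       generator=generator)

sample = torch.zeros(2, 3)
n1 = step_noise(sample, generator=torch.Generator().manual_seed(42))
n2 = step_noise(sample, generator=torch.Generator().manual_seed(42))
assert torch.equal(n1, n2)  # same seed, same noise at this step
```

With `generator=None` the draw falls back to torch's global RNG, which is what makes the output non-deterministic across runs.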

@patil-suraj
Contributor

Hey @tonetechnician, you are right, and that's actually why the tests were failing. If we pass the generator we need to make sure it is handled generically across all schedulers, since we want all schedulers in diffusers to be interchangeable. Let's wait to see what @patrickvonplaten @anton-l @pcuenca have to say. I have proposed two ways in the comment above.

Also fine to merge this now, and open a follow-up PR to add this :)

One thought though: at each step new "random" noise should be generated, so maybe we could have some kind of cyclical counter that offsets the generator's seed in the step function to keep it random. I'm not sure whether doing this would affect the quality of the result, but it would affect the output.

Very good point! The cool thing about Generator is that each time it's used in a random function its state gets advanced, so if we pass the generator we can be sure the noise will be different each time.
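A quick check of that behaviour (plain torch, nothing diffusers-specific): successive draws from one generator differ because each draw advances its state, while re-seeding replays the exact same sequence.

```python
import torch

g = torch.Generator().manual_seed(0)
a = torch.randn(4, generator=g)  # this draw advances g's internal state
b = torch.randn(4, generator=g)  # so the next draw is fresh noise
assert not torch.equal(a, b)

# Re-seeding replays the exact same sequence of draws.
g2 = torch.Generator().manual_seed(0)
assert torch.equal(a, torch.randn(4, generator=g2))
assert torch.equal(b, torch.randn(4, generator=g2))
```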

Member

@anton-l anton-l left a comment


LGTM!

Contributor

@patrickvonplaten patrickvonplaten left a comment


Amazing work @hlky and @patil-suraj 🥳

@patrickvonplaten
Contributor

Merging as it blocks #890

@patrickvonplaten patrickvonplaten merged commit a1ea8c0 into huggingface:main Oct 31, 2022
@patil-suraj
Contributor

Thanks a lot @hlky and @tonetechnician !

@WASasquatch

WASasquatch commented Nov 2, 2022

I hope you guys know Euler_A is non-deterministic in this implementation and does not work with generator seeds. Random result every time.

@andrea-gatto

@WASasquatch I'm able to get reproducible results on GPU. You need to set up your generator
generator = torch.Generator("cuda").manual_seed(seed)
and pass it to the pipeline.
You'll get an error because of this line (and the same for Euler Ancestral):

noise = torch.randn(model_output.shape, dtype=model_output.dtype, generator=generator).to(device)

Here torch.randn expects the generator to be on CPU. This can be fixed by changing the line to
noise = torch.randn(model_output.shape, dtype=model_output.dtype, generator=generator, device=device)
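As a runnable sketch of that fix (falling back to CPU when no GPU is present so it runs anywhere): creating the generator on the target device and passing `device=` to `torch.randn` keeps the two consistent, instead of sampling on CPU and moving the tensor afterwards.

```python
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
generator = torch.Generator(device).manual_seed(1234)

# Sampling directly on the generator's device avoids the
# CPU-generator / CUDA-tensor mismatch described above.
shape = (1, 4, 8, 8)
noise = torch.randn(shape, dtype=torch.float32,
                    generator=generator, device=device)
assert noise.device.type == device
assert noise.shape == shape
```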

cc @patil-suraj

@patil-suraj
Contributor

Great catch @andrea-gatto !

@geocine

geocine commented Nov 3, 2022

Are you making a PR, @andrea-gatto? I also experienced this just now.

@patil-suraj
Contributor

PR is here #1124

@WASasquatch

WASasquatch commented Nov 4, 2022

@WASasquatch I'm able to get reproducible results on GPU. You need to set your generator generator = torch.Generator("cuda").manual_seed(seed) and pass it to the pipeline You'll get an error because here (and same for euler ancestral)

noise = torch.randn(model_output.shape, dtype=model_output.dtype, generator=generator).to(device)

torch.randn expects the generator to be on cpu. This can be fixed by changing the line to
noise = torch.randn(model_output.shape, dtype=model_output.dtype, generator=generator, device=device)
cc @patil-suraj

The issue isn't an error. It runs without error; the results for Euler_A are just random, while every other sampler works fine with the torch generator. The code is fine, it's simply that Euler_A isn't deterministic with a set seed like the rest. Though I did merge the code into a fork of Diffusers hours before it was officially merged, as I was tired of waiting. ;) But I don't see any differences between the files and the current state on my end, so I'm not sure what it'd be with that specific sampler.

@andrea-gatto

@WASasquatch I get deterministic results with Euler Ancestral on GPU following that fix. Maybe the fork you were using wasn't exactly the same (which could explain why you didn't get the error mentioned above); give diffusers==0.7.0 a try :)
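The determinism claim is easy to check with a toy ancestral-style loop (illustrative only; `fake_ancestral_sample` is not diffusers code): every step draws fresh noise, yet the whole trajectory replays exactly from the same seed.

```python
import torch

def fake_ancestral_sample(seed, steps=5):
    # Toy stand-in for an ancestral sampler: fresh noise each step,
    # all drawn from one seeded generator.
    g = torch.Generator().manual_seed(seed)
    x = torch.randn(8, generator=g)
    for _ in range(steps):
        x = 0.9 * x + 0.1 * torch.randn(8, generator=g)
    return x

assert torch.equal(fake_ancestral_sample(7), fake_ancestral_sample(7))
```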

@WASasquatch

WASasquatch commented Nov 4, 2022 via email

PhaneeshB pushed a commit to nod-ai/diffusers that referenced this pull request Mar 1, 2023
yoonseokjin pushed a commit to yoonseokjin/diffusers that referenced this pull request Dec 25, 2023
* k-diffusion-euler

* make style make quality

* make fix-copies

* fix tests for euler a

* Update src/diffusers/schedulers/scheduling_euler_ancestral_discrete.py

Co-authored-by: Anton Lozhkov <aglozhkov@gmail.com>

* Update src/diffusers/schedulers/scheduling_euler_ancestral_discrete.py

Co-authored-by: Anton Lozhkov <aglozhkov@gmail.com>

* Update src/diffusers/schedulers/scheduling_euler_discrete.py

Co-authored-by: Anton Lozhkov <aglozhkov@gmail.com>

* Update src/diffusers/schedulers/scheduling_euler_discrete.py

Co-authored-by: Anton Lozhkov <aglozhkov@gmail.com>

* remove unused arg and method

* update doc

* quality

* make flake happy

* use logger instead of warn

* raise error instead of deprication

* don't require scipy

* pass generator in step

* fix tests

* Apply suggestions from code review

Co-authored-by: Pedro Cuenca <pedro@huggingface.co>

* Update tests/test_scheduler.py

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>

* remove unused generator

* pass generator as extra_step_kwargs

* update tests

* pass generator as kwarg

* pass generator as kwarg

* quality

* fix test for lms

* fix tests

Co-authored-by: patil-suraj <surajp815@gmail.com>
Co-authored-by: Anton Lozhkov <aglozhkov@gmail.com>
Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>