resize-convolution instead of transposed-convolution to avoid checkerboard artifacts #64
Conversation
Did you also try PixelShuffle?
I am now testing a transformer-net alternative with PixelShuffle using the Fast-Neural-Style loss; CycleGAN still has some convergence issues, and the problem seems to be invariant to capacity.
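For reference, a PixelShuffle-based upsampling block (a hypothetical sketch, not code from this PR; channel counts are made-up examples) could look like this. A convolution expands the channel dimension by `scale**2`, then `nn.PixelShuffle` rearranges those channels into a `scale`-times larger spatial grid (sub-pixel convolution):

```python
import torch
import torch.nn as nn

class PixelShuffleUp(nn.Module):
    """Sketch of an upsampling block using sub-pixel convolution."""
    def __init__(self, in_ch, out_ch, scale=2):
        super().__init__()
        # Produce out_ch * scale^2 channels, to be rearranged spatially.
        self.conv = nn.Conv2d(in_ch, out_ch * scale ** 2,
                              kernel_size=3, padding=1)
        self.shuffle = nn.PixelShuffle(scale)

    def forward(self, x):
        return self.shuffle(self.conv(x))

x = torch.randn(1, 64, 32, 32)
y = PixelShuffleUp(64, 32)(x)
print(y.shape)  # torch.Size([1, 32, 64, 64])
```

Note that PixelShuffle can itself produce checkerboard-like patterns unless the kernel is initialized carefully, which may relate to the convergence issues mentioned above.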
Thanks for your PR, @botcs ! I took a brief look at the code. This PR contains much more than what you have described. You should definitely clean up stuff like normalization, video, printing, etc. Some questions:
By the way, nn.Upsample is available now. You may also want to uncomment this line and remove the now-deprecated nn.UpsamplingBilinear2d? :)
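The suggested swap is a drop-in replacement; a minimal sketch (not the repo's actual code) showing that the two modules compute the same thing:

```python
import torch
import torch.nn as nn

# nn.UpsamplingBilinear2d is deprecated; nn.Upsample with mode='bilinear'
# and align_corners=True is the equivalent replacement.
old = nn.UpsamplingBilinear2d(scale_factor=2)
new = nn.Upsample(scale_factor=2, mode='bilinear', align_corners=True)

x = torch.randn(1, 3, 8, 8)
assert torch.allclose(old(x), new(x))  # identical outputs
```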
Yeah, sorry for the inconvenience. I will clean the code up in two weeks and make another PR with all the listed details, if that is acceptable for you.
Is there any activity on this topic? I'm also interested in implementing this feature, or at least getting it to work ;-)
@beniroquai I have an implementation here: #190 (bottom of the thread)
Wow, that was quick! Thanks a lot. Do you plan to make a PR in this repo as well? So the only thing to do is exchange the resnet code part, right?
@beniroquai that's right. Hope that it helps! :)
Thanks! It's working way better now. Maybe a stupid question, but would it make sense to do the same in the unet?
@beniroquai Maybe, you could try adapting the code to unet. It'd be cool if it also helps unet. If you run some experiments, please let me know the results. :)
Thanks for getting back! Would you suggest exchanging the convTranspose2D, or do I need to do anything else? I'm not that familiar with coding in pytorch, though. I'll have a try! Thanks!
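To make the swap concrete, here is a hedged sketch of the resize-convolution idea from https://distill.pub/2016/deconv-checkerboard/ (not the exact code from #190; channel counts are illustrative): replace the stride-2 `ConvTranspose2d` with a fixed nearest-neighbor upsample followed by a stride-1 convolution.

```python
import torch
import torch.nn as nn

ngf = 64  # example base channel count, as in the generator's default

# Original: learned upsampling, prone to checkerboard artifacts because
# the stride-2 kernel overlaps unevenly across output pixels.
transposed = nn.ConvTranspose2d(ngf * 2, ngf, kernel_size=3,
                                stride=2, padding=1, output_padding=1)

# Replacement: fixed resize, then a regular stride-1 convolution.
resize_conv = nn.Sequential(
    nn.Upsample(scale_factor=2, mode='nearest'),
    nn.Conv2d(ngf * 2, ngf, kernel_size=3, stride=1, padding=1),
)

x = torch.randn(1, ngf * 2, 32, 32)
# Both paths produce the same output shape, so the swap is drop-in.
assert transposed(x).shape == resize_conv(x).shape  # (1, 64, 64, 64)
```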
It seems nearest-neighbour upsampling does not solve all issues, but brings another one: random color edges.
```diff
@@ -149,21 +163,40 @@ def __init__(self, input_nc, output_nc, ngf=64, norm_layer=nn.BatchNorm2d, use_d
         for i in range(n_downsampling):
             mult = 2**i
             model += [nn.Conv2d(ngf * mult, ngf * mult * 2, kernel_size=3,
-                                stride=2, padding=1),
+                                stride=1, padding=1),
+                      nn.MaxPool2d(2),
```
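In isolation, the downsampling change in this diff can be sketched as follows (a minimal sketch with example values; `ngf` and `mult` come from the generator's loop):

```python
import torch
import torch.nn as nn

ngf, mult = 64, 1  # example values; the loop doubles mult each iteration

# Before: downsampling via a stride-2 convolution.
strided = nn.Conv2d(ngf * mult, ngf * mult * 2, kernel_size=3,
                    stride=2, padding=1)

# After (as in the diff): stride-1 convolution, then 2x max pooling.
pooled = nn.Sequential(
    nn.Conv2d(ngf * mult, ngf * mult * 2, kernel_size=3,
              stride=1, padding=1),
    nn.MaxPool2d(2),
)

x = torch.randn(1, ngf * mult, 64, 64)
# Both variants halve the spatial resolution: 64x64 -> 32x32.
assert strided(x).shape == pooled(x).shape
```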
Hi! Why is pooling used to avoid checkerboard artifacts, given that "Max pooling was previously linked to high-frequency artifacts in [12]."? https://distill.pub/2016/deconv-checkerboard/
See the reasoning behind the reimplementation:
https://distill.pub/2016/deconv-checkerboard/
I became aware of some checkerboard effects while training the CycleGAN network:
http://i.imgur.com/uTgsC2l.png
because it resembled this:
http://i.imgur.com/M8r47Bp.png
now it looks like this (during training on Van Gogh set):
http://i.imgur.com/wf7B9sT.png