
Reshape layer #108

Closed
wants to merge 8 commits

Conversation

sguada (Contributor) commented Feb 14, 2014

Added a reshape layer that allows changing the shape of the blobs without changing their size. It is a generalization of flatten_layer.
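
As a minimal illustration of that invariant (hypothetical shapes, not taken from this PR), a reshape may relabel the four blob dimensions freely as long as the total element count is unchanged:

#include <cassert>

int main() {
  // A hypothetical 2x3x4x4 blob holds 96 values.
  const int count = 2 * 3 * 4 * 4;

  // Flatten-style reshape: (2, 3, 4, 4) -> (2, 48, 1, 1).
  assert(count == 2 * 48 * 1 * 1);

  // Merging data instances across num: (2, 3, 4, 4) -> (1, 6, 4, 4).
  assert(count == 1 * 6 * 4 * 4);

  // A reshape to (2, 3, 4, 5) would be rejected: 120 != 96 elements.
  return 0;
}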

@@ -91,6 +91,12 @@ message LayerParameter {
// point would be set as rand_skip * rand(0,1). Note that rand_skip should not
// be larger than the number of keys in the leveldb.
optional uint32 rand_skip = 53 [ default = 0 ];

// For the Reshape Layer one needs to specify the new dimensions
optional int32 num = 54 [default = 0];
Member commented:

Shall we rename them to reshape_num, reshape_channels, reshape_height and reshape_width to avoid confusion? num, channels, height and width may be too general.

sguada (Contributor Author) commented Feb 17, 2014

Done: renamed the reshape params. The new params could also be used by other layers.

@@ -91,6 +91,12 @@ message LayerParameter {
// point would be set as rand_skip * rand(0,1). Note that rand_skip should not
// be larger than the number of keys in the leveldb.
optional uint32 rand_skip = 53 [ default = 0 ];

// For the Reshape Layer one needs to specify the new dimensions
optional int32 new_num = 60 [default = 0];
Member commented:

How about reshape_ instead of new_ just to make it really obvious?

sguada (Contributor Author) replied:

I did that at first, but then I thought that other layers may want to use the same parameters to specify the new dimensions of their output. See for instance #120.

jeffdonahue (Contributor) commented:

I think it's kind of redundant to have both reshape and flatten layer -- one option is to just add the extra params to FlattenLayer, change its name to ReshapeLayer, and have it default to the flattening behavior when all 0 (default) params are specified. What do you think?

Also, what is the use case for changing "num"? It seems like it may be more natural to only allow reshaping a given data instance (channels/height/width), but maybe there's a need to merge or split data instances that I'm not considering?

sguada (Contributor Author) commented Feb 26, 2014

@jeffdonahue I think the reshape layer could replace the flatten layer if the default behavior is clear.
What about the following convention for the params:
If a dim is 0 it is left unchanged, so new_num=0 means don't change num.
If a dim is -1 it is calculated to match the total number of elements, so new_channels=-1 means create as many channels as needed to hold the data.

That makes the defaults behave like the flatten layer:

new_num = 0
new_channels = -1
new_height = 1
new_width = 1
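
A minimal sketch of the dimension-inference logic this convention implies (a hypothetical helper, not the code in this PR; it assumes equal rank and at most one -1 dim):

#include <cassert>
#include <cstddef>
#include <vector>

// 0 keeps the old dimension; -1 is inferred so the element count is preserved.
std::vector<int> InferShape(const std::vector<int>& old_shape,
                            std::vector<int> new_shape) {
  int total = 1;
  for (int d : old_shape) total *= d;

  int inferred_axis = -1;  // index of the single -1 dimension, if any
  int known = 1;           // product of all explicitly fixed dimensions
  for (std::size_t i = 0; i < new_shape.size(); ++i) {
    if (new_shape[i] == 0) new_shape[i] = old_shape[i];  // 0: keep as-is
    if (new_shape[i] == -1) {
      assert(inferred_axis == -1 && "at most one -1 dimension");
      inferred_axis = static_cast<int>(i);
    } else {
      known *= new_shape[i];
    }
  }
  if (inferred_axis >= 0) {
    assert(total % known == 0 && "fixed dims must divide the element count");
    new_shape[inferred_axis] = total / known;
  }

  int check = 1;
  for (int d : new_shape) check *= d;
  assert(check == total && "reshape must not change the element count");
  return new_shape;
}

For example, InferShape({64, 3, 4, 4}, {0, -1, 1, 1}) returns {64, 48, 1, 1}: an (N, C, H, W) blob becomes (N, C*H*W, 1, 1), which is exactly the flatten layer's behavior.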

The need to change num is the reason I created this layer: I need to be able to concatenate different images and compute features across them.

jeffdonahue (Contributor) commented:

All makes sense and sounds good!

shelhamer (Member) commented:

@sguada could you implement the flattening convention in your #108 (comment) and rebase to make this a clean merge?

Thanks!

shelhamer (Member) commented:

Let's bring in #219 first, and then this can replace the flatten layer once it is a clean merge.

shelhamer (Member) commented:

@sguada could you open a PR against dev instead so we can include this layer? Thanks.

wendlerc commented Aug 2, 2014

When will this layer be available?

shelhamer (Member) commented:

@sguada can you PR this layer to dev and include the flattening behavior in your #108 (comment)?

shelhamer added this to the 1.0 milestone Sep 19, 2014
ssafar mentioned this pull request Oct 11, 2014
shelhamer (Member) commented:

Replaced by #1263.

201power pushed a commit to 201power/caffe that referenced this pull request Feb 5, 2016
5 participants