
Threshold layer to binarize features #422

Merged: 18 commits into BVLC:dev, May 27, 2014

Conversation

@sguada (Contributor) commented May 16, 2014

It binarizes the inputs according to a predefined threshold. The default is threshold = 0, which means any input greater than 0 outputs 1; otherwise it outputs 0.

Currently there is only a CPU version, and no backward pass is available.
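The forward pass described above can be sketched as follows. This is an illustrative standalone function, not Caffe's actual ThresholdLayer code: each output is 1 if the input is strictly greater than the threshold, and 0 otherwise, with the threshold defaulting to 0.

```cpp
#include <cassert>
#include <cstddef>

// Illustrative sketch of the thresholding forward pass (not Caffe's API):
// out[i] = 1 if in[i] > threshold, else 0. Default threshold is 0, so a
// value of exactly 0 maps to 0.
void threshold_forward(const float* in, float* out, std::size_t n,
                       float threshold = 0.0f) {
  for (std::size_t i = 0; i < n; ++i) {
    out[i] = (in[i] > threshold) ? 1.0f : 0.0f;
  }
}
```

Note the strict inequality: this matches the description "any number greater than 0 would output 1, otherwise 0", and it is also why the layer has no meaningful backward pass (the derivative is zero almost everywhere).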

@sguada sguada mentioned this pull request May 21, 2014
@sergeyk (Contributor) commented May 21, 2014

@sguada care to rebase on dev and include docstring in the manner of other layers in neuron_layers.hpp?

@kloudkl (Contributor) commented May 24, 2014

Can this be implemented as an operation for the EltwiseLayer?

@sguada (Contributor, Author) commented May 24, 2014

The Threshold layer only takes one blob, while EltwiseLayer performs operations between two blobs. Although in theory one could use two blobs to compare them, I'm not sure how useful that would be.

That functionality could be added after this PR is merged.

Sergio


@shelhamer (Member)

It seems reasonable to me to add unary elementwise operations to the
EltwiseLayer. Thresholding as done here and the soft threshold [1] would be
useful. Blob scaling could be done as unary multiplication with a
coefficient too.

@jeffdonahue @sguada what do you think? I could branch off this PR, rebase,
and incorporate these operations into EltwiseLayer.

[1] http://www.simonlucey.com/soft-thresholding/
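The soft threshold mentioned in [1] is the shrinkage operator S_t(x) = sign(x) * max(|x| - t, 0), which pulls values toward zero by t and zeroes anything within t of zero. A minimal sketch (the function name is illustrative, not part of Caffe):

```cpp
#include <cassert>
#include <cmath>

// Illustrative soft-thresholding (shrinkage) operator:
// S_t(x) = sign(x) * max(|x| - t, 0).
float soft_threshold(float x, float t) {
  float mag = std::fabs(x) - t;       // shrink the magnitude by t
  if (mag < 0.0f) mag = 0.0f;         // clamp: values within t of 0 become 0
  return x > 0.0f ? mag : (x < 0.0f ? -mag : 0.0f);  // restore the sign
}
```

Unlike the hard cutoff in this PR, soft thresholding is continuous and subdifferentiable, so a backward pass would be straightforward.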


@sguada (Contributor, Author) commented May 25, 2014

I think it makes total sense to have unary elementwise operations, so go ahead: pull my PR and adapt it to EltwiseLayer.

@kloudkl (Contributor) commented May 27, 2014

Almost all the neuron layers can be refactored into very simple operations (https://github.com/antinucleon/cxxnet/blob/master/cxxnet/core/cxxnet_op.h).

The implementations of the non-unary operations, however, are too complex to be placed in the switch cases.
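To make the trade-off concrete, here is a hypothetical sketch of switch-based dispatch for unary element-wise operations; the enum, the function name, and the particular set of ops are illustrative, not Caffe's actual EltwiseLayer API. Unary ops fit naturally in a per-element switch, whereas non-unary ops need multiple input blobs and accumulation logic that does not reduce to one case label.

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>

// Hypothetical unary element-wise dispatch (names are illustrative).
enum UnaryOp { UNARY_THRESHOLD, UNARY_ABS, UNARY_SCALE };

void unary_eltwise(UnaryOp op, const float* in, float* out, std::size_t n,
                   float param) {
  for (std::size_t i = 0; i < n; ++i) {
    switch (op) {
      case UNARY_THRESHOLD: out[i] = (in[i] > param) ? 1.0f : 0.0f; break;
      case UNARY_ABS:       out[i] = std::fabs(in[i]); break;
      case UNARY_SCALE:     out[i] = param * in[i]; break;  // blob scaling
    }
  }
}
```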

@sguada (Contributor, Author) commented May 27, 2014

Agreed, neuron layers could be transformed into unary element-wise operations.

Sergio


@shelhamer (Member)

For coherence and modularity it might make sense to have

  • a neuron layer, that collects all the activation types
  • a threshold layer, for cutoff binarization (as in this PR) and soft
    thresholding
  • the eltwise layer we have now, but add unary multiplication for scaling

Threshold could be folded into Eltwise as already discussed, or kept distinct; on reconsideration, I feel thresholds deserve their own layer, even if the implementation fits the Eltwise scheme.

Sounds like proto v2 is brewing...


@shelhamer (Member)

@sguada I'm stalling on this 'til after NIPS. If you want to rebase and merge now go for it, or else I'll revisit this after the deadline and hack whatever decision we come to.

@sguada (Contributor, Author) commented May 27, 2014

Thanks, @shelhamer. I'm going to rebase and merge it now; we can refactor it later if we decide to.

sguada added a commit that referenced this pull request May 27, 2014
Threshold layer to binarize features
Added GPU code and tested
@sguada sguada merged commit 3e0c42d into BVLC:dev May 27, 2014
@sguada sguada deleted the threshold_layer branch May 27, 2014 19:07
@shelhamer shelhamer mentioned this pull request Aug 8, 2014
mitmul pushed a commit to mitmul/caffe that referenced this pull request Sep 30, 2014
Threshold layer to binarize features
Added GPU code and tested