Lrelu layer #717
Conversation
Thanks for this contribution @qipeng! Unless there is a noticeable performance impact on the existing ReLU layer, I think this would be better implemented as a simple generalization of ReLU rather than as a new layer. In particular, you could add to caffe.proto a …
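As a rough illustration of that suggestion (the comment above is cut off, so the parameter name `negative_slope` and the exact layer code are assumptions, not taken from this PR), a generalized ReLU forward pass might look like the standalone sketch below; it reduces to the standard ReLU when the slope is zero.

```cpp
// Hypothetical sketch of a ReLU forward pass generalized by a slope parameter.
// The name "negative_slope" is assumed for illustration; it is not visible in
// the truncated comment above. With negative_slope == 0 this is the plain ReLU.
#include <algorithm>
#include <cstddef>

void relu_forward(const float* bottom, float* top, std::size_t count,
                  float negative_slope) {
  for (std::size_t i = 0; i < count; ++i) {
    // max(x, 0) handles the positive part; slope * min(x, 0) the negative part.
    top[i] = std::max(bottom[i], 0.0f)
        + negative_slope * std::min(bottom[i], 0.0f);
  }
}
```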
… make nvcc unhappy.
Conflicts: src/caffe/common.cpp
More namespace cleaning.
Conflicts: Makefile.config.example docs/index.md src/caffe/proto/caffe.proto
Conflicts: docs/index.md
Hi Jeff (@jeffdonahue), I've merged the code back into ReLULayer and added a unit test. Please let me know if anything further needs to be done.
I think you need to rebase from dev.
Closing and moving to #740
Implemented the Leaky ReLU unit described in this paper:
Maas, Andrew L., Awni Y. Hannun, and Andrew Y. Ng. "Rectifier nonlinearities improve neural network acoustic models." ICML Workshop on Deep Learning for Audio, Speech, and Language Processing. 2013.
which shares similar sparse activation properties with the ReLU, but was shown to be easier to optimize.
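For reference, here is a minimal standalone sketch of the leaky rectifier and its derivative (not this PR's actual layer code; `alpha` is just an illustrative name for the small leak, which is 0.01 in the paper's leaky rectifier). The gradient stays at `alpha` rather than 0 for negative inputs, which is the property credited with making the unit easier to optimize.

```cpp
// Illustrative sketch only: the leaky rectifier f(x) = x for x > 0 and
// f(x) = alpha * x otherwise, together with its derivative.
float leaky_relu(float x, float alpha) {
  return x > 0.0f ? x : alpha * x;
}

float leaky_relu_grad(float x, float alpha) {
  return x > 0.0f ? 1.0f : alpha;
}
```

Calling `leaky_relu(x, 0.0f)` recovers the standard ReLU, which is why the functionality folds naturally into the existing ReLU layer rather than needing a separate one.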