
matcaffe interface improvements: maintain correct shape, add backward step, and get_weights function #132

Closed
wants to merge 8 commits

Conversation


@sguada sguada commented Feb 20, 2014

Reshaped the outputs of matcaffe forward to keep them parallel to the Caffe representation.
Added get_weights from Ross to make it possible to read the weights out of the network.
Added backward to matcaffe, inspired by the pycaffe code (not tested yet).
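The shape handling mentioned above can be sketched as follows. This is an illustrative Python/NumPy sketch, not the actual MEX code: Caffe stores blobs in num x channels x height x width order (row-major), while MATLAB arrays are column-major, so the same memory is naturally exposed to MATLAB in width x height x channels x num order. The function name `caffe_to_matlab_shape` is hypothetical.

```python
import numpy as np

def caffe_to_matlab_shape(blob):
    """Permute a Caffe blob (N, C, H, W) into MATLAB order (W, H, C, N).

    Hypothetical sketch of the shape convention only; the real wrapper
    reinterprets the same buffer rather than copying it.
    """
    return np.transpose(blob, (3, 2, 1, 0))

# A typical input batch: 10 images, 3 channels, 227 x 227 pixels.
blob = np.zeros((10, 3, 227, 227), dtype=np.float32)
print(caffe_to_matlab_shape(blob).shape)  # (227, 227, 3, 10)
```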


sguada commented Feb 25, 2014

Double-checked; it now works properly. Ready to be merged @rbgirshick @shelhamer

@shelhamer

Please lint and rebase on the latest dev; it no longer merges cleanly. Thanks!


sguada commented Feb 27, 2014

@shelhamer @jeffdonahue make lint now doesn't work for me on OS X.
I get the following error:

FATAL ERROR: No files were specified.
Found 1 or more lint errors; see log at build/cpp_lint.error_log
make: *** [build/cpp_lint.log] Error 1


sguada commented Feb 27, 2014

Also, I ran into another problem when copying the parameters of the network:

I0227 00:09:54.641738 227295232 net.cpp:156] Network initialization done.
I0227 00:09:55.440505 227295232 net.cpp:268] Copying source layer conv1
I0227 00:09:55.440588 227295232 net.cpp:268] Copying source layer relu1
I0227 00:09:55.440606 227295232 net.cpp:268] Copying source layer pool1
I0227 00:09:55.440611 227295232 net.cpp:268] Copying source layer norm1
I0227 00:09:55.440616 227295232 net.cpp:268] Copying source layer conv2
I0227 00:09:55.440979 227295232 net.cpp:268] Copying source layer relu2
I0227 00:09:55.440999 227295232 net.cpp:268] Copying source layer pool2
I0227 00:09:55.441004 227295232 net.cpp:268] Copying source layer norm2
I0227 00:09:55.441009 227295232 net.cpp:268] Copying source layer conv3
I0227 00:09:55.441958 227295232 net.cpp:268] Copying source layer relu3
I0227 00:09:55.441983 227295232 net.cpp:268] Copying source layer conv4
I0227 00:09:55.442700 227295232 net.cpp:268] Copying source layer relu4
I0227 00:09:55.442718 227295232 net.cpp:268] Copying source layer conv5
I0227 00:09:55.443183 227295232 net.cpp:268] Copying source layer relu5
I0227 00:09:55.443194 227295232 net.cpp:268] Copying source layer pool5
I0227 00:09:55.443199 227295232 net.cpp:268] Copying source layer fc6
F0227 00:09:55.443235 227295232 net.cpp:277] Check failed: target_blobs[j]->width() == source_layer.blobs(j).width() (6400 vs. 9216) 


sguada commented Mar 7, 2014

Ready to merge @jeffdonahue @shelhamer
Now it works properly and merges cleanly.

@shelhamer

This works around #178 by removing fillers, which is fine since the MATLAB wrapper is only used for deployment. Did you figure out the cause of the #178 bug? We might want to coordinate this with the python wrapper, although I can't think of any complications with taking out fillers at the moment.

@shelhamer shelhamer added this to the 0.99 milestone Mar 13, 2014
@shelhamer shelhamer assigned rbgirshick and unassigned sguada Mar 13, 2014
// The end.
{ "END", NULL },

@sguada the last entry is supposed to have func set to NULL to terminate the command dispatch loop down on line 354. This change doesn't look safe to me, but maybe I'm missing something?
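The sentinel pattern being discussed can be sketched in Python (the real matcaffe table is C++, ending in `{ "END", NULL }`; the command names and function bodies below are illustrative, not the actual handlers):

```python
# Minimal sketch of a sentinel-terminated command dispatch table. The scan
# stops when it reaches the entry whose handler is None, mirroring the
# func == NULL check in the C++ dispatch loop.
def forward():
    return "forward ran"

def get_weights():
    return "weights returned"

HANDLERS = [
    ("forward", forward),
    ("get_weights", get_weights),
    ("END", None),  # sentinel: a None handler terminates the scan
]

def dispatch(cmd):
    for name, func in HANDLERS:
        if func is None:
            # Reached the sentinel without a match.
            raise ValueError("unknown command: " + cmd)
        if name == cmd:
            return func()
```

If the sentinel's handler were replaced with a real function, the loop would lose its termination condition, which is the concern raised above.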


sguada commented Mar 17, 2014

@shelhamer I haven't been able to figure out #178 yet, but if models/imagenet.prototxt is intended to be used in deployment, it doesn't make sense to include fillers: they would fill the blobs with values that are then overwritten by the pre-trained model.
So I will remove the fillers from models/imagenet.prototxt.


sguada commented Mar 17, 2014

@rbgirshick thanks for the catch, I've fixed that.

@shelhamer

@sguada, I can't figure out why this PR isn't updating. I rebased it on the latest dev myself and pushed, yet the PR hasn't been updated here.

How about opening a new PR, closing this one, and commenting with the issue number? We'll continue the review at the new PR.

@sguada sguada mentioned this pull request Mar 18, 2014

sguada commented Mar 18, 2014

@shelhamer done, see #223.

@sguada sguada closed this Mar 18, 2014
wk910930 pushed a commit to wk910930/caffe that referenced this pull request Jun 21, 2017