Model Saving #159

Open
djsaunde opened this issue Jun 3, 2016 · 12 comments

@djsaunde

djsaunde commented Jun 3, 2016

I am using the file SdA.py. I am wondering if there is a way to save the model which has been generated and trained; I would like to run vectors through the network and get back the "encoding" from the trained network.

@amit4111989

Use pickle or cPickle (Python modules for serializing objects) to store the generated model.
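For example, a minimal sketch (assuming sda is your trained SdA instance; sda.pkl is just an illustrative filename, and cPickle is the Python 2 name, plain pickle in Python 3):

import cPickle  # Python 2; use "import pickle" in Python 3

# serialize the trained model to disk
with open('sda.pkl', 'wb') as f:
    cPickle.dump(sda, f, protocol=cPickle.HIGHEST_PROTOCOL)

# later, restore it
with open('sda.pkl', 'rb') as f:
    sda = cPickle.load(f)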

@djsaunde

djsaunde commented Jun 6, 2016

This allows me to save the SdA object, but I would like to input vectors and retrieve their encoding, or "h(x)" in the literature, where "x" is the input.

@amarczew

amarczew commented Sep 22, 2016

You can try something like:

def new_representation_X_train(hidden_layer):
    new_representation_train = theano.function(
        inputs=[index],
        outputs=self.sigmoid_layers[hidden_layer].output,
        givens={
            self.x: train_set_x[index: index + 1]
        }
    )
    train_lines = train_set_x.get_value(borrow=True).shape[0]
    return [new_representation_train(i) for i in xrange(train_lines)]

@ohuole233

Thanks for your help. Another question: what does [index] mean in your code? ^-^


@ohuole233

Oh, I get it and I have added it to the DBN class. Thank you again.


@amarczew

@ohuole233 remember that this value comes from a sigmoid function. If you want the linear output, you need to create a linear_output = T.dot(input, self.W) + self.b and fetch its value inside the function I wrote above.
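A minimal sketch of that idea (assuming the layer object exposes input, W, and b, as the tutorial's HiddenLayer does):

layer = self.sigmoid_layers[hidden_layer]
# pre-activation (linear) output of the chosen hidden layer
linear_output = T.dot(layer.input, layer.W) + layer.b

Then pass linear_output as the outputs of the theano.function above, instead of the layer's sigmoid output.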

@ohuole233

@amarczew Hi, professor~ After adding the method new_representation_X_train to the DBN class, I ran the following code:

import theano
import DBN
import mlp
import rbm
import numpy

pretraining_epochs = 2
pretrain_lr = 0.1
k = 1
batch_size = 2

train_set = [[1, 2, 3, 1], [2, 3, 1, 4], [3, 6, 4, 2], [3, 5, 4, 5], [4, 2, 5, 2], [4, 3, 2, 6]]
train_set_x = theano.shared(numpy.asarray(train_set, dtype=theano.config.floatX),
                            borrow=True)
n_train_batches = train_set_x.get_value(borrow=True).shape[0]
numpy_rng = numpy.random.RandomState(123)

dbn = DBN.DBN(numpy_rng=numpy_rng, n_ins=4, hidden_layers_sizes=[3, 3], n_outs=2)
pretraining_fns = dbn.pretraining_functions(train_set_x=train_set_x, batch_size=batch_size, k=k)

print dbn.new_representation_X_train(2)

An unexpected error occurred: "TypeError: new_representation_X_train() takes exactly 1 argument (2 given)".
Any idea how to cope with this?
Many thanks, and awaiting your reply.

@amarczew

@ohuole233 the problem is probably related to the self parameter, which is passed implicitly because you are calling a method on an object. Changing the signature to def new_representation_X_train(self, hidden_layer): may fix it.

@ohuole233

Grateful for your reply, but now it says "global name 'index' is not defined". I haven't figured out how to define it; my objective is to derive a new representation for each input. Awaiting your assistance.
😀


@amarczew

amarczew commented Sep 28, 2016

@ohuole233 you need to declare the tensor index = T.lscalar('index') before building this function.
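For context, a one-line sketch (T being the usual alias for theano.tensor):

import theano.tensor as T

index = T.lscalar('index')  # symbolic integer index consumed by the givens clause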

@ohuole233

@amarczew Hi, professor. I have abandoned training on GPUs with some special settings. However, after the above fixes (declaring the tensor and adding self), a new error occurred: "NameError: global name 'train_set_x' is not defined". Is there a way to fix this bug?
It would be very kind of you to send me the DBN class code including the intermediate-representation method; my email is hhq123go@gmail.com.
Awaiting your reply. Thanks~^-^

@amarczew

amarczew commented Oct 8, 2016

@ohuole233 Unfortunately, I don't have that implementation. This function works in the SdA.py implementation; you can take a look at that file to solve the problem in your DBN implementation. I declared this function inside build_finetune_functions(...), which is why index and train_set_x were already in scope there.
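For later readers, here is a consolidated sketch of the method with the fixes discussed in this thread applied (self in the signature, the index tensor declared inside the method, and train_set_x passed in explicitly instead of read from an enclosing scope); this is an illustrative rewrite, not the exact tutorial code:

import theano
import theano.tensor as T

# intended as a method of the SdA/DBN class;
# train_set_x is a Theano shared variable, hidden_layer a 0-based
# index into self.sigmoid_layers
def new_representation_X_train(self, hidden_layer, train_set_x):
    index = T.lscalar('index')  # symbolic row index into the training set
    get_representation = theano.function(
        inputs=[index],
        outputs=self.sigmoid_layers[hidden_layer].output,
        givens={
            self.x: train_set_x[index: index + 1]
        }
    )
    n_rows = train_set_x.get_value(borrow=True).shape[0]
    return [get_representation(i) for i in xrange(n_rows)]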
