Dear authors,

I'm trying to implement your framework in TensorFlow and have a few questions about how you handle relational embeddings.

I understand that in the bilinear formulation there is one relationship weight matrix "R" per relation, trained during backpropagation. But how do the relational embeddings "r" work? You say in the article that these embeddings are static: I take that to mean they don't evolve like the entity embeddings, but are they still trained during backpropagation? Do you initialize them at zero like the entities?

In the code, you fetch the last existing embedding if it exists, or the embedding parameters otherwise:

latest_subject_rel_embed = GetEmbeddingParam("rel", cfg::num_rels, e->rel, inputs, param_dict["w_rel_init"], lookup_rel_onehot, lookup_rel_init);

It is not quite clear to me what this line does. I understand that you use the one-hot vectors and the parameter w_rel_init to fetch the embedding, but what exactly do you fetch? The row of w_rel_init that embeds the current relation, selected via the one-hot layer? And is w_rel_init the W_r mentioned in Section 4?

Thanks in advance for your answer.
Thank you for your query. Yes, w_rel_init is the W_r, i.e. the parameter corresponding to the static relation embedding. The line you quoted fetches the embedded relation corresponding to the one-hot relation input, using the current (updated) parameter values. So yes, we train the relation embedding parameter w_rel_init via backpropagation; however, we do not store or update the relation embedding vectors themselves after every event, the way we do for entities via the latest_embeddings vectors.
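For anyone porting this to TensorFlow, here is a minimal sketch of the distinction described above. The sizes, the names other than w_rel_init, and the bilinear scoring helper are my own illustrative assumptions, not part of the original codebase:

import tensorflow as tf

# Hypothetical sizes, for illustration only.
num_rels = 24
num_entities = 100
embed_dim = 32

# Static relation embedding parameter (the W_r / w_rel_init analogue):
# one row per relation. It is trainable, so backprop updates it, but it
# is never overwritten with a freshly computed embedding after an event.
w_rel_init = tf.Variable(tf.random.normal([num_rels, embed_dim]), name="w_rel_init")

# Dynamic entity embeddings: a buffer the model overwrites with the
# latest computed embedding after every event (initialized at zero).
latest_entity_embeddings = tf.Variable(
    tf.zeros([num_entities, embed_dim]), trainable=False, name="latest_embeddings")

# One bilinear relationship matrix R per relation, also trained by backprop.
rel_matrices = tf.Variable(tf.random.normal([num_rels, embed_dim, embed_dim]), name="R")

def get_relation_embedding(rel_id):
    # Select the row of w_rel_init for this relation. This is equivalent
    # to multiplying a one-hot relation vector with w_rel_init, which is
    # what the GetEmbeddingParam call in the question does: it reads the
    # row using the current, backprop-updated parameter values.
    return tf.nn.embedding_lookup(w_rel_init, rel_id)

def bilinear_score(subj_id, rel_id, obj_id):
    # e_s^T R_r e_o, with entity embeddings read from the dynamic buffer.
    e_s = latest_entity_embeddings[subj_id]    # [embed_dim]
    R_r = rel_matrices[rel_id]                 # [embed_dim, embed_dim]
    e_o = latest_entity_embeddings[obj_id]     # [embed_dim]
    return tf.tensordot(e_s, tf.linalg.matvec(R_r, e_o), axes=1)

In this sketch only w_rel_init and rel_matrices receive gradients; latest_entity_embeddings is marked non-trainable because it acts as a cache that is assigned after each event, mirroring the static-vs-dynamic split the authors describe.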