
Delete DataManager #691

Merged
merged 1 commit into from
Feb 26, 2021

Conversation

zachgk
Contributor

@zachgk zachgk commented Feb 24, 2021

Also done:

  • Created a forwardInternal with labels, and implementations in the higher-level
    blocks SequentialBlock and ParallelBlock
  • Removed the duplicate testSeqToSeq test


Change-Id: I16f5343886a62538f3a247df062d28a765805cb8
@lanking520
Contributor

LGTM based on the title. Looking into the code now.

.addSingleton(
        a -> {
            try {
                return embedding.embedText(a);
Contributor

The problem with putting it here is that I would like to freeze this block and not use it for training. If we remove DataManager, I will not be able to freeze the block.

Contributor Author

This would be frozen because ModelZooTextEmbedding freezes it (it uses a separate predictor). In a different case, you should be able to just call embedding.freeze() to freeze only the embedding.

Contributor

Can you verify whether the PyTorch engine will work in this case? I am not sure the training work is handled correctly.

Contributor Author

There is no word embedding for PyTorch in the model zoo, so I can't test it. But is there a reason that PyTorch might not work?

Contributor

It should work by default. I am just concerned whether the NDArray inside will be impacted. In the DistilBert training for the Amazon dataset, we throw the embedding layer into a lambda function to avoid parameter collection.
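The lambda-wrapping approach described above can be sketched as follows. This is a minimal, hedged sketch, not code from this PR: it assumes DJL's ModelZooTextEmbedding, SequentialBlock.addSingleton, and EmbeddingException as they appear in the diff excerpt, and the class scaffolding and exception handling are illustrative assumptions.

```java
import ai.djl.modality.nlp.embedding.EmbeddingException;
import ai.djl.modality.nlp.embedding.ModelZooTextEmbedding;
import ai.djl.nn.SequentialBlock;

public class FrozenEmbeddingSketch {

    // Builds a network whose embedding step is hidden inside a lambda block,
    // so the embedding's parameters are not collected for training.
    static SequentialBlock build(ModelZooTextEmbedding embedding) {
        SequentialBlock net = new SequentialBlock();
        net.addSingleton(
                a -> {
                    try {
                        // The embedding runs via its own predictor and stays frozen.
                        return embedding.embedText(a);
                    } catch (EmbeddingException e) {
                        // Rethrow unchecked so the lambda fits the expected Function type
                        // (error-handling choice is an assumption, not from the PR).
                        throw new IllegalStateException(e);
                    }
                });
        // ... trainable blocks would be added after the frozen embedding step.
        return net;
    }
}
```

Because the embedding is invoked inside the lambda rather than added as a child block, a Trainer collecting parameters from the SequentialBlock never sees the embedding's weights, which is the effect the comment above describes.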

@lanking520 lanking520 self-requested a review February 25, 2021 21:26
@lanking520 lanking520 dismissed their stale review February 25, 2021 21:26

Not applicable

@stu1130 stu1130 merged commit 302721a into deepjavalibrary:master Feb 26, 2021
@zachgk zachgk deleted the noDataManager branch February 26, 2021 19:24
Lokiiiiii pushed a commit to Lokiiiiii/djl that referenced this pull request Oct 10, 2023