This repository has been archived by the owner on Oct 9, 2023. It is now read-only.
Lightning Trainer/PyTorch Lightning 1.7.0 support + CI Fixes (JIT Tracing and Functions to Classes conversion) #1410
Merged
Changes from 31 commits
Commits
33 commits
d22295e  Fix Flash CI: in case of converting functions to classes (krshrimali)
b725620  [pre-commit.ci] auto fixes from pre-commit.com hooks (pre-commit-ci[bot])
6d2c636  Attach Trainer to model for jit tracing (krshrimali)
f25eb1f  Merge branch 'flash-ci/functions-to-classes' of github.com:Lightning-… (krshrimali)
41fd845  on_<>_dataloader were deprecated, and were removed in v1.7 - remove (krshrimali)
df9f48b  Fix jit_script test (krshrimali)
4c8c0fb  fix (rohitgr7)
76764e7  Try upgrading torchtext to 0.13.1 to support latest pytorch (krshrimali)
05c15ba  [pre-commit.ci] auto fixes from pre-commit.com hooks (pre-commit-ci[bot])
2fc3d82  Update requirements/datatype_text.txt (krshrimali)
18e7144  dont allow 1.7.0 until issues are fixed (krshrimali)
74e7345  Requirements: go back to PL 1.7 (krshrimali)
f04a274  CI Fix: manually setattr for collate_fn after DataLoader is initialized (krshrimali; see the collate_fn sketch after this list)
cdbaefd  [pre-commit.ci] auto fixes from pre-commit.com hooks (pre-commit-ci[bot])
b410d31  install effdet from latest commit hash, fix object_detection (krshrimali)
a8f57ad  Force install effdet 0.3.0 (krshrimali)
598d978  Remove dataloader_idx from on_train_batch_end hooks (krshrimali)
02826a3  rename, attempt to support previous PL versions (krshrimali)
cf1d6e9  [pre-commit.ci] auto fixes from pre-commit.com hooks (pre-commit-ci[bot])
ea2679e  syntax error fix (krshrimali)
aa3e00e  Merge branch 'flash-ci/functions-to-classes' of github.com:Lightning-… (krshrimali)
4241ac9  Fix updating collate fn if input_transform is not None (icevision) (krshrimali)
d387308  Merge branch 'flash-ci/functions-to-classes' of github.com:Lightning-… (krshrimali)
802ab2b  pep8 fix (krshrimali)
23de0a4  Revert effdet changes, address reviews (krshrimali)
f609ef6  [pre-commit.ci] auto fixes from pre-commit.com hooks (pre-commit-ci[bot])
7a24767  Apply suggestions from code review (krshrimali)
e58b1d5  [pre-commit.ci] auto fixes from pre-commit.com hooks (pre-commit-ci[bot])
4b32904  indentation fix (krshrimali)
2c89478  Update .azure-pipelines/testing-template.yml (krshrimali)
ccc7e24  Update .azure-pipelines/testing-template.yml (krshrimali)
ef9b0c0  Add CHANGELOG entries (krshrimali)
8e03fd7  Merge branch 'flash-ci/functions-to-classes' of github.com:Lightning-… (krshrimali)
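One of the CI fixes above (commit f04a274) sets `collate_fn` on an already-constructed `DataLoader` instead of passing it to the constructor. A minimal, self-contained sketch of that pattern follows; the `ToyDataset` and `dict_collate` names are hypothetical stand-ins, not code from this PR.

```python
from torch.utils.data import DataLoader, Dataset


class ToyDataset(Dataset):
    """Hypothetical dataset, only here to make the example runnable."""

    def __len__(self):
        return 8

    def __getitem__(self, idx):
        return idx


def dict_collate(batch):
    # Hypothetical collate function: wrap the raw batch in a dict.
    return {"inputs": batch}


loader = DataLoader(ToyDataset(), batch_size=4)

# The pattern named in the commit message: set collate_fn after the
# DataLoader has been initialized, rather than via the constructor.
setattr(loader, "collate_fn", dict_collate)

print(next(iter(loader)))  # {'inputs': [0, 1, 2, 3]}
```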
The relevant diff (three hunks from the shared task-testing helpers):

```diff
@@ -102,8 +102,10 @@ def _test_jit_trace(self, tmpdir):
     path = os.path.join(tmpdir, "test.pt")
 
     model = self.instantiated_task
+    trainer = self.instantiated_trainer
     model.eval()
 
+    model.trainer = trainer
     model = torch.jit.trace(model, self.example_forward_input)
 
     torch.jit.save(model, path)
```

Review comments on the added `model.trainer = trainer` line:

> See Lightning-AI/pytorch-lightning#14036 (comment) for a workaround

> Just curious, anything wrong with the current workaround?
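Outside the test harness, the pattern exercised here is roughly: build a task and a `flash.Trainer`, attach the trainer to the model, and only then call `torch.jit.trace` (see the Lightning-AI/pytorch-lightning#14036 discussion linked above). The sketch below assumes a generic `flash.Task` wrapping a tiny `nn.Linear` is enough to trace; the layer sizes and loss function are placeholders, not part of this PR.

```python
import torch
from torch import nn
from torch.nn import functional as F

import flash

# Placeholder task: a generic flash.Task around a small linear model.
model = flash.Task(model=nn.Linear(4, 2), loss_fn=F.mse_loss)
trainer = flash.Trainer(max_epochs=1)
model.eval()

# Mirror the test helper above: attach the trainer before tracing so the
# module's trainer reference resolves under PyTorch Lightning 1.7.
model.trainer = trainer
traced = torch.jit.trace(model, torch.rand(1, 4))
torch.jit.save(traced, "test.pt")
```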
```diff
@@ -117,8 +119,10 @@ def _test_jit_script(self, tmpdir):
     path = os.path.join(tmpdir, "test.pt")
 
     model = self.instantiated_task
+    trainer = self.instantiated_trainer
     model.eval()
 
+    model.trainer = trainer
     model = torch.jit.script(model)
 
     torch.jit.save(model, path)
```
```diff
@@ -261,10 +265,17 @@ class TaskTester(metaclass=TaskTesterMeta):
         "test_cli": [pytest.mark.parametrize("extra_args", [[]])],
     }
 
+    trainer_args: Tuple = ()
+    trainer_kwargs: Dict = {}
+
     @property
     def instantiated_task(self):
         return self.task(*self.task_args, **self.task_kwargs)
 
+    @property
+    def instantiated_trainer(self):
+        return flash.Trainer(*self.trainer_args, **self.trainer_kwargs)
+
     @property
     def example_forward_input(self):
         pass
```
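With the new `trainer_args`/`trainer_kwargs` hooks, a task's test class can configure the trainer used by the JIT tests the same way it already configures the task. A hypothetical subclass might look like the sketch below; the import path for `TaskTester` and the chosen task and arguments are illustrative, and only the attribute and property names come from the diff above.

```python
import torch

from flash.image import ImageClassifier

# Illustrative import path; TaskTester lives in the repo's test helpers.
from tests.helpers.task_tester import TaskTester


class TestImageClassifier(TaskTester):
    task = ImageClassifier
    task_kwargs = {"num_classes": 2}

    # New with this PR: forwarded to flash.Trainer by instantiated_trainer,
    # which the JIT trace/script tests attach to the model before export.
    trainer_kwargs = {"fast_dev_run": True}

    @property
    def example_forward_input(self):
        return torch.rand(1, 3, 32, 32)
```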
General review comment:

> Not for this PR, but we should figure out what to do with the baal loop. It is so bound to a particular PL and Baal version that I'm not sure it makes sense to have it as part of a framework. Maybe it would be better as a tutorial or in bolts? cc @otaj