
Change compile for pipeline module torch.compile #6478

Open
wants to merge 5 commits into base: master
Conversation

NirSonnenschein
Contributor

We have encountered an issue with torch.compile and the pipeline module: modifying a member of the module (micro_offset) during the forward function causes torch.compile to restart its analysis and treat the module as dynamic.
To bypass this without significantly changing how the pipeline module works, we propose compiling only the layers of the pipeline module instead of its forward function. This avoids the issue while still retaining most of the benefit of applying torch.compile to the pipeline module.
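The idea can be sketched as follows. This is a minimal illustration, not the actual DeepSpeed code: `TinyPipelineModule` is a hypothetical stand-in for `PipelineModule`, and `backend="eager"` is used only so the sketch runs without a compiler toolchain.

```python
import torch
import torch.nn as nn

class TinyPipelineModule(nn.Module):
    # Mimics the relevant shape of PipelineModule: a list of layers plus a
    # counter (like micro_offset) that the forward pass mutates.
    def __init__(self):
        super().__init__()
        self.layers = nn.ModuleList(nn.Linear(8, 8) for _ in range(3))
        self.micro_offset = 0

    def forward(self, x):
        # Compiling this forward directly would trace this attribute
        # mutation and force torch.compile to restart its analysis.
        self.micro_offset += 1
        for layer in self.layers:
            x = layer(x)
        return x

module = TinyPipelineModule()

# Proposed approach: compile each layer individually instead of the
# module's forward, so the micro_offset mutation stays outside the
# compiled graphs.
for i, layer in enumerate(module.layers):
    module.layers[i] = torch.compile(layer, backend="eager")

out = module(torch.randn(2, 8))
```

The uncompiled forward still runs the Python-level bookkeeping, while each layer's computation goes through the compiled path.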

@tohtana
Contributor

tohtana commented Sep 3, 2024

Hi @NirSonnenschein, thank you for the great catch! Can you also add a small test case? We want to make sure that this change works for various settings.

@tjruwase tjruwase removed the request for review from duli2012 September 4, 2024 15:30
@loadams
Contributor

loadams commented Oct 25, 2024

> Hi @NirSonnenschein, thank you for the great catch! Can you also add a small test case? We want to make sure that this change works for various settings.

Hi @NirSonnenschein - following up on this ask?

@NirSonnenschein
Contributor Author

Hi @loadams,
yes, sorry for the delay. I was diverted to other urgent issues before completing the test.
I should get back to this soon.

@loadams
Contributor

loadams commented Oct 28, 2024

> Hi @loadams, yes, sorry for the delay, I have been diverted to other urgent issues before completing the test. I should get back to this soon.

No problem, thanks for the update
