[fake tensor] fix issue of no attribute tags #156689
Conversation
See artifacts and rendered test results at hud.pytorch.org/pr/156689 (links to docs will display an error until the docs builds have completed). ✅ No failures as of commit b4cbcc8 with merge base 0364db7. This comment was automatically generated by Dr. CI and updates every 15 minutes.
@@ -2774,7 +2774,7 @@ def validate(x: T) -> Union[T, FakeTensor]:
     nonlocal flat_arg_fake_tensors
     if not self.is_our_fake(x):
-        if torch.Tag.inplace_view in func.tags:
+        if hasattr(func, "tags") and torch.Tag.inplace_view in func.tags:
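For context, here is a minimal sketch of the failure mode the added `hasattr` guard avoids. The stand-in class below is hypothetical (it is not the real FlexAttention HOP); it only mimics an operator object that, unlike an `OpOverload`, exposes no `tags` attribute:

```python
import torch


class _HopWithoutTags:
    """Hypothetical stand-in for a HigherOrderOperator with no `tags` attribute."""


func = _HopWithoutTags()

# Unguarded membership test (the pre-PR code path): raises AttributeError.
try:
    torch.Tag.inplace_view in func.tags
except AttributeError as exc:
    print("unguarded check failed:", exc)

# Guarded test from this PR: degrades to False instead of raising.
print(hasattr(func, "tags") and torch.Tag.inplace_view in func.tags)
```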
Hi @Valentine233, could you please also add a unit test for this change?
Thanks @atalman, the UT has been added.
The changes in this PR look good to me. But while auditing the code, I noticed many instances where `func.tags` is used without checking whether `func` actually has a `tags` attribute. Do we need to review other parts of the code for similar issues, or should we add the `tags` attribute to `HigherOrderOperator`? Also cc @bdhirsh.
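If those call sites do need hardening before a `HigherOrderOperator.tags` attribute exists, one possible pattern is a small helper that falls back to an empty tag tuple. This is only an illustrative sketch; the helper name and placement are assumptions, not something this PR introduces:

```python
import torch


def op_tags(func):
    """Hypothetical helper: return `func.tags` if present, else an empty tuple,
    so operators without a `tags` attribute behave like untagged ops."""
    return getattr(func, "tags", ())


# Call sites could then check tags uniformly, whatever `func` is:
# if torch.Tag.inplace_view in op_tags(func): ...
```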
Thanks, good point! We can discuss more in this thread and do the follow-up in the future. But I suppose we can land this PR first, as it is urgently required for the v2.8 release.
> Thanks, good point! We can discuss more in this thread and do the follow-up in the future. But I suppose we can land this PR first, as it is urgently required for the v2.8 release.
Looks good to me, given that the changes in this PR should have no side effects. Will you work further on a formal solution?
Maybe we need some input from @bdhirsh about why `HigherOrderOperator` does not have a `tags` attribute. Is this part of the design, or should we add `tags` to `HigherOrderOperator`?
cc @drisspg @zou3519 - this looks like it's because of #151719. We now plumb FlexAttention through `FakeTensorMode.__torch_dispatch__`, but it looks like `FakeTensorMode` expects HigherOrderOperators that we plumb through to have tags.

My vote would probably just be to add a `.tags` field to FlexAttention to make it more OpOverload-like; what do you both think?
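A rough sketch of that direction, under the assumption that an empty tuple is an acceptable default and that the `HigherOrderOperator` constructor takes only a name. The class and op name below are made up for illustration; this is not the actual PyTorch change:

```python
import torch
from torch._ops import HigherOrderOperator


class MyTaggedHop(HigherOrderOperator):
    """Illustrative HOP carrying an OpOverload-like `tags` tuple."""

    def __init__(self):
        super().__init__("my_tagged_hop")
        # An empty tuple keeps checks such as
        # `torch.Tag.inplace_view in func.tags` working without hasattr guards.
        self.tags = ()


my_tagged_hop = MyTaggedHop()
print(torch.Tag.inplace_view in my_tagged_hop.tags)  # False
```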
TBH I don't have the best context on when tags are used. Do all HOPs lack tags, or is it just FlexAttention?
None of the HOPs have tags right now.
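A quick way to check this from a Python session. This is only a hedged illustration: the exact result depends on the PyTorch version, and at the time of this thread the `cond` HOP would presumably report no `tags`:

```python
import torch
import torch._higher_order_ops  # ensure the built-in HOPs are registered

cond_hop = torch.ops.higher_order.cond
print(type(cond_hop).__name__, "has tags:", hasattr(cond_hop, "tags"))
```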
Branch updated: 91c5925 to b4cbcc8.
LGTM, since the changes in this PR should have no side effects. I'm not sure, but perhaps we could consider adding the `tags` attribute to the higher-order operator as a future improvement.
lgtm
@pytorchbot merge
Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki.
Fixes #156688. Pull Request resolved: #156689. Approved by: https://github.com/leslie-fang-intel, https://github.com/atalman (cherry picked from commit 7597988)

Cherry-pick (…157519): [fake tensor] fix issue of no attribute tags (#156689). Fixes #156688. Pull Request resolved: #156689. Approved by: https://github.com/leslie-fang-intel, https://github.com/atalman (cherry picked from commit 7597988)
Fixes #156688