
Allow for more elemwise torch functions using broadcast_tensor and vmap #1032

Open · wants to merge 2 commits into main
Conversation

@Ch0ronomato (Contributor) commented Oct 12, 2024

Description

When the scalar op that Elemwise is broadcasting over has no direct torch function, we can leverage torch.vmap and torch.broadcast_tensors to replicate the ufunc machinery.
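The approach can be sketched with plain PyTorch (a minimal sketch of the idea, not the PR's actual implementation; `elemwise_fallback` and the lambda are hypothetical names): broadcast all inputs to a common shape, flatten them, vmap the scalar function over the elements, and reshape the result.

```python
import torch

def elemwise_fallback(scalar_fn, *inputs):
    # Hypothetical helper sketching the fallback path: broadcast all
    # inputs to a common shape, as NumPy's ufunc machinery would.
    broadcasted = torch.broadcast_tensors(*inputs)
    shape = broadcasted[0].shape
    # Flatten so a single vmap suffices regardless of how many dims
    # the broadcasted result has.
    flat = [t.reshape(-1) for t in broadcasted]
    out = torch.vmap(scalar_fn)(*flat)
    return out.reshape(shape)

# Usage: apply a scalar-only function elementwise with broadcasting.
x = torch.arange(6.0).reshape(2, 3)
y = torch.tensor([10.0, 20.0, 30.0])
result = elemwise_fallback(lambda a, b: a * b + 1, x, y)
```

Flattening first sidesteps the question of how many times vmap must be applied, at the cost of an extra reshape.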

Related Issue

Checklist

Type of change

  • New feature / enhancement
  • Bug fix
  • Documentation
  • Maintenance
  • Other (please specify):

📚 Documentation preview 📚: https://pytensor--1032.org.readthedocs.build/en/1032/

@Ch0ronomato (Contributor, Author) commented:

I need to add a test, but I want to get feedback on #1031 before continuing.

@Ch0ronomato Ch0ronomato changed the title Use broadcast tensor Allow for more elemwise torch functions using broadcast_tensor and vmap Oct 12, 2024
@Ch0ronomato (Contributor, Author) commented:

I'll fix the tests.

if shaped_inputs[0].dim() == 1:
    ufunc = torch.vmap(base_fn)
else:
    dims = (tuple(range(shaped_inputs[0].dim())),)
A Member commented:
I don't think this is correct. You need to apply vmap repeatedly to make it vectorize across multiple dimensions.
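The reviewer's point can be illustrated with a small stand-alone example (hypothetical, not from the PR): a single `torch.vmap` maps only the leading dimension, so a function that expects true scalars still sees 1-d slices on a 2-d input; applying `vmap` once per dimension vectorizes fully.

```python
import torch

def strict_scalar(a):
    # Succeeds only when it receives a true 0-d (scalar) tensor.
    assert a.dim() == 0, f"expected a scalar, got dim={a.dim()}"
    return a * 2

x = torch.arange(6.0).reshape(2, 3)

# A single vmap maps only the leading dimension: strict_scalar sees
# 1-d rows of length 3, and the assertion fires.
try:
    torch.vmap(strict_scalar)(x)
    single_vmap_ok = True
except AssertionError:
    single_vmap_ok = False

# Nesting vmap once per dimension vectorizes over both axes.
doubled = torch.vmap(torch.vmap(strict_scalar))(x)
```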

@Ch0ronomato (Contributor, Author) replied Oct 15, 2024:
You're correct, my bad. I misunderstood the docs and wrote a bad local test.

codecov bot commented Oct 15, 2024

Codecov Report

Attention: Patch coverage is 45.45455% with 6 lines in your changes missing coverage. Please review.

Project coverage is 81.89%. Comparing base (f277af7) to head (8b8e174).

Files with missing lines                     Patch %   Lines
pytensor/link/pytorch/dispatch/elemwise.py   45.45%    6 Missing ⚠️

@@            Coverage Diff             @@
##             main    #1032      +/-   ##
==========================================
- Coverage   81.90%   81.89%   -0.01%     
==========================================
  Files         182      182              
  Lines       47879    47887       +8     
  Branches     8620     8619       -1     
==========================================
+ Hits        39214    39216       +2     
- Misses       6492     6498       +6     
  Partials     2173     2173              
Files with missing lines                     Coverage Δ
pytensor/link/pytorch/dispatch/elemwise.py   65.38% <45.45%> (-3.37%) ⬇️

Comment on lines +28 to +29
# @todo: This will fail for anything that calls
# `.item()`
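As the `@todo` above notes, this path breaks for ops that call `.item()`, since vmap cannot extract a Python scalar from a batched element. A minimal stand-alone illustration (hypothetical example, not the PR's code):

```python
import torch

def uses_item(a):
    # .item() pulls a Python scalar out of the tensor, which vmap
    # cannot do for a batched element, so vmap raises at runtime.
    return torch.tensor(a.item() + 1.0)

try:
    torch.vmap(uses_item)(torch.ones(3))
    item_ok = True
except RuntimeError:
    item_ok = False
```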
A Member commented:
We usually use # TODO: not # @todo:

    return base_fn(*inputs)
else:

    def elemwise_fn(*inputs):
A Member commented:
We need a test for this branch. You could create a new test ScalarOp and dispatch it specifically in PyTorch (in the test suite) to exercise this path.

2 participants