PEP 646: Add explicit section on endorsements #2055

Merged (6 commits), Sep 17, 2021
Changes from 1 commit
56 changes: 54 additions & 2 deletions pep-0646.rst
@@ -1167,6 +1167,54 @@ Footnotes
shape begins with 'time × batch', then ``videos_batch[1][0]`` would select the
same frame.

Endorsements
============

The purpose of this PEP is to make life easier for the numerical computing
Contributor

While numerical computing is a key motivation, I don't think it is the only purpose of the PEP. As the numerical computing folks have pointed out, this PEP would be worth having even for non-Tensor purposes. The PEP itself links to a python/typing issue with many use cases.

Perhaps we can clarify that a bit here. Otherwise, it reads as though Tensor authors are the only, narrow set of users for this feature, and as though the feature would live or die by adoption from a few libraries, which is not true.

For example, there are plenty of real-world callbacks that expect a variadic *args:

def call_soon(callback: Callable[[*Ts], R], *args: *Ts) -> R:
    ...  # e.g. schedule or log, then invoke the callback
    return callback(*args)
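
A self-contained version of that sketch (assuming Python 3.11+, where TypeVarTuple and the star syntax proposed in this PEP are available; the send callback is purely illustrative):

from typing import Callable, TypeVar, TypeVarTuple

Ts = TypeVarTuple("Ts")
R = TypeVar("R")

def call_soon(callback: Callable[[*Ts], R], *args: *Ts) -> R:
    # The checker binds Ts to the callback's parameter types and verifies
    # that *args matches them exactly.
    return callback(*args)

def send(host: str, port: int) -> bool:  # illustrative callback
    return True

ok = call_soon(send, "localhost", 8080)  # accepted: (str, int) matches (str, int)
# call_soon(send, "localhost")           # rejected: the int argument is missing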

Contributor Author

Good point. What do you think of the revised wording?

Contributor

Nice wording!

community. How likely is it that numerical computing libraries will actually
make use of the features proposed in this PEP?
Contributor

I think it's worth clarifying that variadic tuples will be useful in Tensor library stubs, not their official implementations. In other words, even if PyTorch, NumPy, etc. never use variadic tuples in their source code, users would still benefit from having stubs with variadic tuples. For the purposes of typechecking, what matters is the stub. These libraries are often written in C++ anyway, meaning that they aren't really the target users. The main targets are people who use Tensors in their code and want to catch shape errors up front.

So, while it is nice to have endorsements from Tensor libraries, I don't see it as make-or-break for Tensor shape types. The community will be maintaining a set of stubs anyway, as we have already begun doing in pytorch_examples.
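
To make the stub idea concrete, a minimal hypothetical sketch (the file name tensor_lib.pyi, the Tensor class, and its methods are invented for illustration; Python 3.11+ syntax is assumed):

# tensor_lib.pyi: a community-maintained stub; the library's own C++/Python
# sources never need to change for users to get shape checking.
from typing import Generic, TypeVarTuple

Shape = TypeVarTuple("Shape")

class Tensor(Generic[*Shape]):
    def get_shape(self) -> tuple[*Shape]: ...
    def __add__(self, other: "Tensor[*Shape]") -> "Tensor[*Shape]": ...

# With such a stub, a checker can reject adding a Tensor[Height, Width] to a
# Tensor[Batch, Height, Width], because the declared shapes don't match.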

Contributor Author

That's true, but if the Steering Council is sceptical, I don't think this argument will sway them: it would be strange for a language to adopt a controversial language feature only to support community stubs. The weight of this section depends on big library authors themselves making use of what we propose (even if that use is in stubs they provide rather than in the actual implementation, given that the implementation is mostly not Python).


We reached out to a number of people with this question, and received the
following endorsements.

From **Stephan Hoyer**, member of the NumPy Steering Council:
[#stephan-endorsement]_

I just wanted to thank Matthew & Pradeep for writing this PEP and for
clarifications to the broader context of PEP 646 for array typing in
https://github.com/python/peps/pull/1904.

As someone who is heavily involved in the Python numerical computing
community (e.g., NumPy, JAX, Xarray), but who is not so familiar with the
details of Python's type system, it is reassuring to see that a broad range
of use-cases related to type checking of named axes & shapes have been
considered, and could build upon the infrastructure in this PEP.

Type checking for shapes is something the NumPy community is very
interested in -- there are more thumbs up on the relevant issue on NumPy's
GitHub than any others (https://github.com/numpy/numpy/issues/7370) and we
recently added a "typing" module that is under active development.

It will certainly require experimentation to figure out the best ways to
use type checking for ndarrays, but this PEP looks like an excellent
foundation for such work.

From **Dan Moldovan**, a Senior Software Engineer on the TensorFlow Dev Team
and author of the TensorFlow RFC, `TensorFlow Canonical Type System`_: [#dan-endorsement]_

I'd be interested in using the mechanisms defined in this PEP to define
rank-generic Tensor types in TensorFlow, which are important in specifying
`tf.function` signatures in a Pythonic way, using type annotations (rather than
the custom `input_signature` mechanism we have today - see this issue:
https://github.com/tensorflow/tensorflow/issues/31579). Variadic generics are
among the last few missing pieces to create an elegant set of type definitions
for tensors and shapes.
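
To illustrate the kind of rank-generic signature being described here (a hypothetical sketch only; ``Tensor`` stands in for a shape-generic tensor type rather than TensorFlow's actual ``tf.Tensor``, and Python 3.11+ syntax is assumed)::

    from typing import Generic, TypeVarTuple

    Shape = TypeVarTuple("Shape")

    class Batch: ...                    # axis label, in the style of this PEP's examples

    class Tensor(Generic[*Shape]): ...  # hypothetical shape-generic tensor type

    def relu(x: Tensor[*Shape]) -> Tensor[*Shape]:
        # Accepts a tensor of any rank; the result keeps the same shape.
        ...

    def add_batch_dim(x: Tensor[*Shape]) -> Tensor[Batch, *Shape]:
        # Also rank-generic, but tells the checker a new leading axis was added.
        ...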

(For the sake of transparency - we also reached out to folks from a third popular
numerical computing library, PyTorch, but did *not* receive a statement of
endorsement from them. Our understanding is that although they are interested
in some of the same issues - e.g. static shape inference - they are currently
focusing on enabling this through a DSL rather than the Python type system.)

Acknowledgements
================
@@ -1176,7 +1224,7 @@ Thank you to **Alfonso Castaño**, **Antoine Pitrou**, **Bas v.B.**, **David Fos
**Sergei Lebedev**, and **Vladimir Mikulik** for helpful feedback and suggestions on
drafts of this PEP.

Thank you especially to **Lucio** for suggesting the star syntax (which has made multiple aspects of this proposal much more concise and intuitive), and to **Stephan Hoyer** for his kind `endorsement`_ of the PEP on the python-dev mailing list.
Thank you especially to **Lucio** for suggesting the star syntax (which has made multiple aspects of this proposal much more concise and intuitive), and to **Stephan Hoyer** and **Dan Moldovan** for their endorsements.

Resources
=========
@@ -1229,7 +1277,11 @@ References

.. _this exercise: https://spinningup.openai.com/en/latest/spinningup/exercise2_2_soln.html

.. _endorsement: https://mail.python.org/archives/list/python-dev@python.org/message/UDM7Y6HLHQBKXQEBIBD5ZLB5XNPDZDXV/
.. _TensorFlow Canonical Type System: https://github.com/tensorflow/community/pull/208

.. [#stephan-endorsement] https://mail.python.org/archives/list/python-dev@python.org/message/UDM7Y6HLHQBKXQEBIBD5ZLB5XNPDZDXV/

.. [#dan-endorsement] https://mail.python.org/archives/list/python-dev@python.org/message/HTCARTYYCHETAMHB6OVRNR5EW5T2CP4J/

Copyright
=========