Read and install setup_requires from setup.cfg instead of hard-coding build requirements in workflow #19
Conversation
Read setup_requires from setup.cfg and automatically install before builds.

This is for example for runviewer, which has `cython` as a build-time dependency. And `setuptools_scm` as a build-time dependency should not be a special case (other than the fact that we want to use a git version of it). This patch solves that problem for now, but read onward for a discussion (this turned out to be a total rabbit hole) and alternate solutions.

(TL;DR: maybe we wait until we don't need a git version of `setuptools_scm`, and then we put build requirements in `pyproject.toml` such that pip will process them automatically.)

Set `--no-build-isolation` in the manylinux action, because it uses `pip wheel` to build the wheels, and the mere existence of a `pyproject.toml` (used here solely for configuring `black`, see psf/black#683) causes pip to build the wheels in an isolated environment, devoid of our actual build dependencies, since we are not declaring build dependencies in `pyproject.toml` as the new PEP 517/518 mechanism (which introduced `pyproject.toml`) would have us do.

Actually abandoning `setup_requires` (the contents of which there is no mechanism to automatically install without running `setup.py` first, other than parsing `setup.cfg` ourselves as is done here) and putting build dependencies in `pyproject.toml` is another option. However, this would prevent us from injecting additional build-time dependencies like we are doing with `setuptools_scm`. We could list the git version of `setuptools_scm` as a build-time dependency in `pyproject.toml`, but conda won't understand that - at the moment we're just hackily pip-installing the git version of `setuptools_scm` in the conda environment.

Once `setuptools_scm` releases with the `release-branch-semver` scheme (they released yesterday, but alas, in their own migration from `setup.py` to `setup.cfg` they didn't migrate the new scheme! pypa/setuptools-scm#441), this is probably moot and we could declare build dependencies in `pyproject.toml` - I don't imagine we'll be using git versions of build dependencies very often. Then pip could build wheels for us in accordance with PEP 517 without us having to install build dependencies ourselves at all.

However, the bootstrapping issue for `setuptools_conda` will remain - how will it know that you need to install `cython` in order to run `setup.py`, when no `setuptools_conda` code runs until you're already running `setup.py`? It will presumably require an external tool to let you say:

```bash
$ setuptools_conda . dist_conda [args]
```

instead of

```bash
$ python setup.py dist_conda [args]
```

so that it has a chance to parse some `toml` and install the build dependencies before running `setup.py`.
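For reference, the PEP 518 declaration discussed above would look something like this in `pyproject.toml` (requirement names are taken from the discussion; there are no pins, and as noted, a git URL in `requires` is exactly the thing conda won't understand):

```toml
[build-system]
requires = [
    "setuptools",
    "wheel",
    "setuptools_scm",
    "cython",
]
build-backend = "setuptools.build_meta"
```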
Switch package purity config back to `PURE: true`, `NOARCH: false`. The only reason the workflow failed in the last push is that the package is actually pure, so a tool running at the end of the manylinux wheel build failed to find any linux wheels.
```yaml
@@ -35,6 +35,18 @@ env:
  # PURE: true
  # NOARCH: true

  STEP_GET_BUILD_REQUIREMENTS: |
    SETUP_REQUIRES=$(python -c '\
    import configparser as p
```
How about using `setuptools.dist.Distribution` here, e.g.

```python
from setuptools.dist import Distribution
dist = Distribution()
dist.parse_config_files()
dist.setup_requires
```
Ah nice, that is better. There will be possible env markers in the output like `package; python_version < '3.8'`, but actually that should be handled by passing to pip with `pip install -r` instead of `pip install` - that will actually evaluate the requirements and install the appropriate packages. So instead of storing an environment variable to pass to pip, this should become:

```bash
python -c 'from setuptools.dist import Distribution
dist = Distribution()
dist.parse_config_files()
print("\n".join(dist.setup_requires))' | pip install -r /dev/stdin
```
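A self-contained sketch of the approach above, letting setuptools itself parse `setup.cfg` instead of reading it by hand with `configparser` (the `setup.cfg` contents here are a hypothetical example including an environment marker):

```python
# Sketch: read setup_requires from setup.cfg without executing setup.py,
# by asking setuptools to parse the config file for us.
import pathlib
import tempfile

from setuptools.dist import Distribution

# Hypothetical example setup.cfg, written to a temp directory for the demo.
cfg = pathlib.Path(tempfile.mkdtemp()) / "setup.cfg"
cfg.write_text(
    "[options]\n"
    "setup_requires =\n"
    "    cython\n"
    "    importlib_metadata; python_version < '3.8'\n"
)

dist = Distribution()
dist.parse_config_files(filenames=[str(cfg)])

# Environment markers survive as part of each requirement string, which is
# why piping this output to `pip install -r /dev/stdin` works: requirements
# files evaluate markers, so only the appropriate packages get installed.
print("\n".join(dist.setup_requires))
```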
Any idea why the manylinux build failed with "Repair wheels failed", @chrisjbillington? Is merging this contingent on that passing?
Yes, it failed because the package is actually pure, so none of the wheels had 'linux' in their names like the tool that analyses linux wheels expected. It's not a problem; everything succeeded right up to that step.
I've been reading more, and I was hoping the need to write code to install build dependencies would go away if we move to storing build dependencies in `pyproject.toml` as per PEP 518. And it almost does. Pip can automatically install them when it builds a wheel, and I've extended setuptools-conda to do the same. But for sdists, pip is not involved (there is no equivalent pip command for building an sdist). I complained about it on the pip bugtracker.

So we either switch to storing build dependencies in `pyproject.toml` (only build dependencies - `pyproject.toml` is not a new home for all metadata... yet) and ask pip to build wheels for us, adding a step to install the build dependencies ourselves before building an sdist, or we stick with declaring build dependencies as `setup_requires`.

I'll fix this PR up with the setuptools suggestion, but I think I'll also make the alternative PR moving build dependencies to `pyproject.toml`.
Wouldn't
I don't mind:

```python
from setuptools.build_meta import get_requires_for_build_sdist
get_requires_for_build_sdist()
```
Sounds good. I don't mind `setup.cfg` for now, and the arguments in PEP 518 against it are less compelling now that Python 2 has been retired. The arguments there could have been resolved alternatively by freezing the format used by `configparser` in Python 3 and incorporating setuptools into the standard library. However, that ship sailed long ago, and I'd be happy using workflow-sandbox to learn/experiment with `pyproject.toml`.
Nope, unfortunately that skips installing the dependencies at all.
No good, this runs `setup.py`. There is ongoing discussion about where the 'build an sdist using PEP 517 mechanics' command belongs (or the one to build a wheel for that matter, for which it's not obvious this should be pip), and the stop-gap is the pep517 package, which can build both wheels and sdists according to PEP 517/518 semantics. We can avoid parsing toml in any form if we use it, so I will make the 'modern pyproject' PR use it. (Also, I'll join the discussions and propose they should just rename it.)
😲
Will merge when done.
Unless there are objections, abandoning this in favour of #20.