
Pin doc build requirements #60

Merged: 7 commits merged into ansible:devel on Aug 4, 2023

Conversation

@oraNod (Contributor) commented Jul 6, 2023:

Partially resolves #30

The devel branch uses docs/docsite/requirements.txt.
The package docs latest branch and stable-2.15 use tests/requirements.txt.

@oraNod oraNod requested review from mattclay and felixfontein July 6, 2023 16:36
@mattclay (Member) left a comment:

@oraNod I'll leave the review of the requirements updates to @felixfontein.

@webknjaz (Member) left a comment:

@oraNod may I suggest using pip-tools for managing the env pins? It's a great tool for this purpose; Hynek also wrote a post about it years ago.

It is usually a good idea to have the entire dependency tree pinned, not just the direct dependencies.
With pip-tools, the structure would be

# tests/requirements.in <-- only the direct deps

-c constraints.in  # <-- contains known limitations

# only direct deps below:
jinja2 >= 3.0.0  # used by hacking/build_library/build_ansible/command_plugins/generate_man.py
pyyaml >= 5.1  # used by ansible-core
resolvelib  # used by ansible-core
sphinx
sphinx-notfound-page
sphinx-ansible-theme
rstcheck
antsibull-docs
# tests/constraints.in <-- known limitations

# only deps we want to limit, should mostly be indirect/transitive
resolvelib < 1.1.0
sphinx == 5.3.0
rstcheck < 6  # rstcheck 6.x has problem with rstcheck.core triggered by including files w/ sphinx directives https://github.com/rstcheck/rstcheck-core/issues/3
antsibull-docs == 2.2.0  # currently approved version

Here's how you generate pins for the entire dep tree:

$ pip-compile --resolver=backtracking --allow-unsafe --output-file=requirements.txt --strip-extras requirements.in
#
# This file is autogenerated by pip-compile with Python 3.9
# by the following command:
#
#    pip-compile --allow-unsafe --output-file=requirements.txt --resolver=backtracking --strip-extras requirements.in
#
aiofiles==23.1.0
    # via antsibull-core
aiohttp==3.8.4
    # via
    #   antsibull-core
    #   antsibull-docs
aiosignal==1.3.1
    # via aiohttp
alabaster==0.7.13
    # via sphinx
ansible-pygments==0.1.1
    # via
    #   antsibull-docs
    #   sphinx-ansible-theme
antsibull-core==2.0.0
    # via antsibull-docs
antsibull-docs==2.2.0
    # via -r requirements.in
antsibull-docs-parser==1.0.0
    # via antsibull-docs
async-timeout==4.0.2
    # via aiohttp
asyncio-pool==0.6.0
    # via antsibull-docs
attrs==23.1.0
    # via aiohttp
babel==2.12.1
    # via sphinx
build==0.10.0
    # via antsibull-core
certifi==2023.5.7
    # via requests
charset-normalizer==3.1.0
    # via
    #   aiohttp
    #   requests
docutils==0.18.1
    # via
    #   antsibull-docs
    #   rstcheck
    #   sphinx
    #   sphinx-rtd-theme
frozenlist==1.3.3
    # via
    #   aiohttp
    #   aiosignal
idna==3.4
    # via
    #   requests
    #   yarl
imagesize==1.4.1
    # via sphinx
importlib-metadata==6.8.0
    # via sphinx
jinja2==3.1.2
    # via
    #   -r requirements.in
    #   antsibull-docs
    #   sphinx
markupsafe==2.1.3
    # via jinja2
multidict==6.0.4
    # via
    #   aiohttp
    #   yarl
packaging==23.1
    # via
    #   antsibull-core
    #   antsibull-docs
    #   build
    #   sphinx
perky==0.9.1
    # via antsibull-core
pydantic==1.10.11
    # via
    #   antsibull-core
    #   antsibull-docs
pygments==2.15.1
    # via
    #   ansible-pygments
    #   sphinx
pyproject-hooks==1.0.0
    # via build
pyyaml==6.0
    # via
    #   -r requirements.in
    #   antsibull-core
    #   antsibull-docs
requests==2.31.0
    # via sphinx
resolvelib==1.0.1
    # via -r requirements.in
rstcheck==5.0.0
    # via
    #   -r requirements.in
    #   antsibull-docs
semantic-version==2.10.0
    # via
    #   antsibull-core
    #   antsibull-docs
sh==1.14.3
    # via antsibull-core
six==1.16.0
    # via twiggy
snowballstemmer==2.2.0
    # via sphinx
sphinx==5.3.0
    # via
    #   -r requirements.in
    #   antsibull-docs
    #   sphinx-ansible-theme
    #   sphinx-notfound-page
    #   sphinx-rtd-theme
    #   sphinxcontrib-jquery
sphinx-ansible-theme==0.10.2
    # via -r requirements.in
sphinx-notfound-page==0.8.3
    # via -r requirements.in
sphinx-rtd-theme==1.2.2
    # via sphinx-ansible-theme
sphinxcontrib-applehelp==1.0.4
    # via sphinx
sphinxcontrib-devhelp==1.0.2
    # via sphinx
sphinxcontrib-htmlhelp==2.0.1
    # via sphinx
sphinxcontrib-jquery==4.1
    # via sphinx-rtd-theme
sphinxcontrib-jsmath==1.0.1
    # via sphinx
sphinxcontrib-qthelp==1.0.3
    # via sphinx
sphinxcontrib-serializinghtml==1.1.5
    # via sphinx
tomli==2.0.1
    # via
    #   build
    #   pyproject-hooks
twiggy==0.5.1
    # via
    #   antsibull-core
    #   antsibull-docs
types-docutils==0.18.3
    # via rstcheck
typing-extensions==4.7.1
    # via
    #   pydantic
    #   rstcheck
urllib3==2.0.3
    # via requests
yarl==1.9.2
    # via aiohttp
zipp==3.15.0
    # via importlib-metadata

The above command can be used to update the pins whenever you decide to bump something, even if just transitive deps. Alternatively, Dependabot supports .in+.txt file pairs and can send periodic PRs, which would trigger the CI and make it very visible if any dep bumps break things.

To use the constraints file, you'll need to install the deps as follows:

pip install -r tests/requirements.in -c tests/requirements.txt

Or:

PIP_CONSTRAINT=tests/requirements.txt pip install -r tests/requirements.in

All this can make the docs build env even more stable, and updating the transitive deps in a controlled manner makes it easier to track what breaks things and what doesn't.

@felixfontein (Collaborator):

I like the idea of having the dependency tree pinned and not just some 'selected' dependencies. This is also what ansible-test does: https://github.com/ansible/ansible/blob/devel/test/lib/ansible_test/_data/requirements/sanity.validate-modules.in describes some of the top-level requirements (the ones we want to update manually), and some tool expands this to https://github.com/ansible/ansible/blob/devel/test/lib/ansible_test/_data/requirements/sanity.validate-modules.txt with all transitively dependent packages included.

I am fine both with some custom tool like https://github.com/ansible/ansible/blob/devel/hacking/update-sanity-requirements.py or with pip-compile for that. I think pip-compile will be better since we don't have to maintain it :)

@oraNod (Contributor, Author) commented Jul 10, 2023:

@webknjaz and @felixfontein Thanks for the suggestions. I've been getting into a tangle with dependencies a bit, so it's really helpful to get some pointers. Doing it more or less manually, as I was doing previously, kind of defeats the point of pinning reqs for stability (given that manual work increases the risk of human error). Seems like pip-compile is the best move.

@webknjaz Regarding your comment about using the constraints file: that's a change we should reflect in the docs for any user wanting to install the tested/pinned reqs, correct?

So instead of pip install -r tests/requirements.txt you need to do pip install -r tests/requirements.in -c tests/requirements.txt

@webknjaz (Member):

So instead of pip install -r tests/requirements.txt you need to do pip install -r tests/requirements.in -c tests/requirements.txt

Yes. Alternatively, I saw a proposal to use tox or nox — this could be embedded into that automation for simplicity and better contributor experience.

@oraNod (Contributor, Author) commented Jul 10, 2023:

@webknjaz Another question that will expose some ignorance, but in your example requirements.in lists resolvelib as a direct dependency (used by ansible-core), yet constraints.in pins it to resolvelib < 1.1.0.

If that is a direct dependency, then shouldn't we put resolvelib < 1.1.0 in requirements.in? Thanks.

Edit: To be clear, when I say "expose some ignorance" I mean my own...

@webknjaz (Member):

Constraints are not dependencies. They only have effect if they match something in the requested dependency tree.
Semantically, the direct dependencies typically declare minimum versions, usually meaning that said version has the required feature set. It is rare for the direct deps to declare upper limits. By using constraints, we can have a separate location for various kinds of things that break.

Also, I'd recommend documenting what uses each direct dependency. If nothing does, it's probably a transitive dep that needs to be dropped, or moved to the incoming constraints file so it still influences pip-compile.
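
A quick illustration of that behaviour, using packages that already appear in the pin list above (the throwaway constraints file is hypothetical, not part of this PR):

$ printf 'sphinx == 5.3.0\n' > /tmp/constraints.txt
$ pip install -c /tmp/constraints.txt sphinx-ansible-theme   # sphinx is pulled in, so the pin takes effect
$ pip install -c /tmp/constraints.txt rstcheck               # sphinx isn't requested here, so the constraint is simply ignored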

@gotmax23 (Collaborator):

I saw a proposal to use tox or nox

#54 FTR.

@oraNod oraNod force-pushed the issue-30/pin-test-requirements branch 2 times, most recently from d3d4903 to d5d0ca3 on July 11, 2023 at 13:35
@oraNod (Contributor, Author) commented Jul 11, 2023:

Constraints are not dependencies. They only have effect if they match something in the requested dependency tree. Semantically, the direct dependencies typically declare minimum versions, usually meaning that said version has the required feature set. It is rare for the direct deps to declare upper limits. By using constraints, we can have a separate location for various kinds of things that break.

Thanks for the explanation. Using constraints is a nice approach.

Also, I'd recommend documenting what uses each direct dependency. If nothing does, it's probably a transitive dep that needs to be dropped, or moved to the incoming constraints file so it still influences pip-compile.

What is the best way to determine what uses a direct dependency? I think this is what I was having doubts about in a previous comment.

@oraNod oraNod requested review from webknjaz and felixfontein July 11, 2023 14:06
@webknjaz (Member):

What is the best way to determine what uses a direct dependency? I think this is what I was having doubts about in a previous comment.

Well, that's the thing. If you start adding these things in commits that start using them, it's easy to git blame the addition. Adding comments with them makes it easier to track where they are used.
I used a bit of guessing and looking into files. So you can try grepping for imports — if there are imports in the repo, they are used directly; if not, they aren't. But with the mounting of the ansible/ansible subtree, we should probably also consider things that repo uses directly as direct deps too…

Point is — whenever you add something new, add a comment. But for the initial integration, use your best guess.
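
For example, a rough heuristic along these lines could back up the guess (jinja2 is just an illustrative package name):

$ grep -rn -E '^[[:space:]]*(import|from)[[:space:]]+jinja2' --include='*.py' .

No hits would suggest the package is only a transitive dep; hits mean something in the repo imports it directly.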

rstcheck < 6 # rstcheck 6.x has problem with rstcheck.core triggered by include files w/ sphinx directives https://github.com/rstcheck/rstcheck-core/issues/3
antsibull-docs == 2.3.0 # currently approved version
#
# This file is autogenerated by pip-compile with Python 3.11
@webknjaz (Member) commented Jul 11, 2023:

@oraNod FYI, the dependency tree may contain different dependencies on different OS/Python/arch combinations. Make sure to use the same Python version, at least, and document how and under what runtime it's generated/updated.
If you enable Dependabot for bumping this, it'll use an Ubuntu VM but probably extract the Python version from this header so it'll be mostly fine.

This is essentially why I have an entire matrix of constraints here https://github.com/cherrypy/cheroot/tree/main/requirements (the right file is picked up by a tox integration internally, in that case). You should probably decide on a single supported CPython version for the docs builds to limit the corner cases.
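
For instance, the same input file can be compiled under each targeted interpreter to produce per-runtime pin files (the file names here are purely illustrative):

$ python3.10 -m piptools compile --allow-unsafe --strip-extras --output-file=requirements-py310-linux.txt requirements.in
$ python3.11 -m piptools compile --allow-unsafe --strip-extras --output-file=requirements-py311-linux.txt requirements.in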

@oraNod (Contributor, Author) replied:

@felixfontein What are your thoughts on this one?

Looking at the support matrix, 3.10 appears to be a good option that spans all the stable versions and will also be around for core 2.16.

I'd like to enable Dependabot and it would be straightforward enough to get a dependency tree on Ubuntu. But could we run into any issues with the likes of antsibull-docs? Could there be a better option? Thanks.

@oraNod (Contributor, Author) replied:

@webknjaz Cheers for the FYI on different OS combos. I think we should stick with Python 3.11 because all the build jobs for docs.ansible.com use that version and it works fine. I also generated the dependency tree on an Ubuntu VM and got the same result as Fedora.

@felixfontein (Collaborator) replied:

Maybe we should have a script for updating this file in a reproducible way, like by using a Docker/Podman container with the python:3.11 image?
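
A minimal sketch of what such a script might run, assuming the pins live in tests/ and Podman is available (none of this is part of the PR itself):

$ podman run --rm --volume "$PWD:/repo" --workdir /repo docker.io/library/python:3.11 \
      sh -c 'pip install pip-tools && pip-compile --allow-unsafe --strip-extras --output-file=tests/requirements.txt tests/requirements.in'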

@webknjaz (Member) replied:

Maybe we should have a script for updating this file in a reproducible way, like by using a Docker/Podman container with the python:3.11 image?

I think this should be in a follow-up and also integrated into tox/nox. Additionally, it would be really cool to have a matrix of constraints and have the command wrappers pick up the right thing for the current system, but that's a non-trivial thing to achieve. In my above Cheroot experiment, I've gone for several components: a pip wrapper that is integrated into tox and auto-injects constraints for the correct env if they exist; and a GHA job generating those pins in separate envs.

A Member replied:

I think we should stick with Python 3.11 because all the build jobs for docs.ansible.com use that version and it works fine. I also generated the dependency tree on an Ubuntu VM and got the same result as Fedora.

Yeah, the distros of the same arch and Python runtime will usually give the same results, unless they have very different pip or something like that. This is because the conditional deps check for arch / OS type (linux vs darwin vs win32) / Python version. And most of those have wheels published for all/most of the combinations. But if you change just one of those factors, the chance of getting different output gets higher.

@gotmax23 (Collaborator) replied:

I don't think we need all of this complication. Having a single constraints file generated with python3.10 (i.e. the lowest version supported on ansible-core's devel branch) on Linux should accomplish what we need.

@oraNod (Contributor, Author) replied:

@gotmax23 Let's go with the python3.10 constraints. I should've stuck with that from my previous comment.

For the rest I'll create a separate issue to capture some of the different strands in this PR. I really like the suggestion from @felixfontein to have a script or something and getting this complexity out of the way of contributors. And while I highly value all the insights from @webknjaz (integration with tox/nox is spot on), having a matrix for diff build environments seems like too much cowbell for a docs build. Although, yeah, it'd be cool to have.

@oraNod (Contributor, Author) replied:

I've created #108 as a follow-up.

A Member replied:

Agreed. I wasn't saying that it's a “must have”. I was just describing what's possible in extreme cases. I also think that for docs, a small set of constraints is preferable and the contributor guidelines should stress which runtimes are expected. Plus, things like tox/nox could error out or print a warning if the runtime doesn't match the supported list.
Still, I foresee two main categories of contributors: those using macOS and those using Linux. So having decided on one CPython runtime (3.10) is helpful, but it probably warrants two separate constraints files to account for the conditional transitive deps. Having one with just Linux in mind is a good start, and the other one could be added once somebody hits a problem it would address.

@oraNod oraNod force-pushed the issue-30/pin-test-requirements branch from d5d0ca3 to 485dde0 on July 12, 2023 at 18:08
@oraNod oraNod requested a review from webknjaz July 12, 2023 18:18
# pip packages required to build the documentation.
# These requirements are as loosely defined as possible.
# For known good versions of doc build dependencies use the following command:
# pip install --user -r tests/requirements.in -c tests/requirements.txt

antsibull-docs >= 1.0.0, < 2.0.0
A Member replied:

What is the function of this separate file? Why not use the one in tests everywhere, or vice versa?
Also, these restrictions are in conflict with the test ones.

@oraNod (Contributor, Author) replied:

Yes, I noticed the conflicting restrictions too, but I didn't want to start changing things here and expand the scope of this PR.

The function, as I understand it, is to have a requirements file that users can modify as they like and experiment with different requirements etc. Although I don't see why a separate file is necessary to do that. Perhaps someone like @felixfontein or @samccann could give a little more background because this has been around for a while and I'm not sure who in the community might be using it.

@oraNod (Contributor, Author) replied:

Let's break this into a separate issue please: #107

A Collaborator replied:

Why don't we just replace this with the new requirements file? If users want the unconstrained deps, they can use requirements.in without the pins in requirements.txt. We can add a symlink from one location to another if that's needed for whatever reason. Having two sets of conflicting requirements files does not make sense to me.

@oraNod (Contributor, Author) replied:

It doesn't make sense to me either to have conflicting requirements files but I would prefer to handle it as a separate issue. Using requirements.in is a great suggestion but I wonder if that might change how @felixfontein and possibly others use docs/docsite/requirements.txt.

A Collaborator replied:

I only use the requirements files when I update them in PRs to bump the antsibull-docs/... version :) I never used them to install anything (except for looking inside to see what is installed, and installing the missing pieces manually).

A Collaborator replied:

Why don't we just replace this with the new requirements file? If users want the unconstrained deps, they can use requirements.in without the pins in requirements.txt. We can add a symlink from one location to another if that's needed for whatever reason. Having two sets of conflicting requirements files does not make sense to me.

👍 for that. I would probably have the main requirement files in docs/, and symlinks from tests/requirements.txt to docs/requirements.txt and docs/docsite/requirements.txt to docs/requirements.in.
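
A minimal sketch of that layout, assuming the canonical files end up in docs/ and the old files are replaced (paths relative to the repo root; purely illustrative):

$ git rm tests/requirements.txt docs/docsite/requirements.txt
$ ln -s ../docs/requirements.txt tests/requirements.txt
$ ln -s ../requirements.in docs/docsite/requirements.txt
$ git add tests/requirements.txt docs/docsite/requirements.txt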

@samccann (Contributor) replied:

The reason for two files:

/docs/requirements.txt was supposed to be as open as possible and used to publish to devel. The idea was to catch any problems with requirements long before we wanted to update the stable requirements (reflected in tests/requirements.txt).

tests/requirements.txt was meant to be the stable, pinned set of requirements we use to publish to latest, so we know the docs will build, and also to be identical to the set of requirements used by CI.

Every so often, we would go into the stable one, update requirements to more recent versions, test, and pin at those versions ...and use that set for another few months.

These were not intended for users to experiment with, though our docs local build instructions say they can use either set.

We could get rid of the open set of requirements I suppose, but that leaves us unable for example, to test antsibull-docs version 'live' on devel for a soak period before enabling that same version on latest.

My high-level objective is to always have stable requirements for our latest release. That means it has to be pinned.

A Member replied:

We could get rid of the open set of requirements I suppose, but that leaves us unable for example, to test antsibull-docs version 'live' on devel for a soak period before enabling that same version on latest.

@samccann that's not really true — it is possible to just not pass --constraint and use the unpinned stuff if needed. But I'd recommend getting rid of two versions of unpinned deps lists and have just one.

A Member replied:

My high-level objective is to always have stable requirements for our latest release. That means it has to be pinned.

Yes, and when the pins are updated (by pip-tools in our case), this is performed through PRs and the build runs with those, so CI is able to catch any problems and, what's more important, every dep bump is versioned in Git, making it easier to track when things start to misbehave.

@oraNod oraNod force-pushed the issue-30/pin-test-requirements branch from 485dde0 to ab708a7 on July 13, 2023 at 12:24
@oraNod oraNod requested review from webknjaz and gotmax23 July 13, 2023 14:31
# pip packages required to build the documentation.
# These requirements are as loosely defined as possible.
# For known good versions of doc build dependencies use the following command:
# pip install --user -r tests/requirements.in -c tests/requirements.txt

antsibull-docs >= 1.0.0, < 2.0.0
A Collaborator replied:

I don't particularly want to merge this with these conflicting requirements, but they were there before, so I guess...

@gotmax23 (Collaborator):

@oraNod, @webknjaz what are the remaining blockers for this first PR/step? I feel there's been a lot of back and forth and am not sure what's still valid.

@webknjaz (Member):

These individual dependabot commits for each dependency are kind of annoying, and I like having more control over how pip-compile is run.

Agreed, I also have a workflow example for bumping the deps. It can optionally bump specific deps but updates everything by default.
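
For reference, the underlying pip-compile invocations for such a bump would look roughly like this (a sketch, not the actual workflow being referred to):

$ pip-compile --upgrade --allow-unsafe --strip-extras --output-file=tests/requirements.txt tests/requirements.in            # bump everything
$ pip-compile --upgrade-package antsibull-docs --allow-unsafe --strip-extras --output-file=tests/requirements.txt tests/requirements.in   # bump a single dep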

@webknjaz (Member):

FWIW this should just be merged, with follow-ups as needed. It's a real improvement already and doesn't have a blocking effect on anything.

@oraNod oraNod requested a review from samccann July 31, 2023 11:37
@oraNod (Contributor, Author) commented Jul 31, 2023:

@gotmax23 @webknjaz Thanks for the great improvements and the approvals.

I'm unsure about enabling Dependabot straight away; maybe give it a couple of cycles to keep the "noise levels" down in the PR queue. Manual control, whether via a noxfile or GHA, seems like it fits the need. And if we plan to go with nox then I'd be in favour of adopting that approach over gha. I started poking around with nox last week and think it has a lot of potential as a replacement for the makefile.

@gotmax23 (Collaborator) commented Jul 31, 2023:

And if we plan to go with nox then I'd be in favour of adopting that approach over gha.

They're not mutually exclusive. The GitHub Action could call a pip-compile session in the noxfile and automatically submit a PR if there are changes (see the sketch after this comment).

I started poking around with nox last week and think it has a lot of potential as a replacement for the makefile.

Yeah, I like it a lot. It's quite flexible and I like its philosophy.
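
As a sketch of that flow (the session name, branch name, and use of the GitHub CLI are hypothetical):

$ nox -e pip-compile                      # hypothetical nox session that re-runs pip-compile
$ git switch -c bump-doc-pins
$ git commit -am 'Bump pinned doc build requirements'
$ git push -u origin bump-doc-pins
$ gh pr create --fill                     # only when the pins actually changed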

@webknjaz (Member) commented Aug 1, 2023:

@oraNod I agree with Maxwell. I'd rather you just merge this already as-is and add all those improvements as follow-ups.
All of nox + pip-tools + Dependabot are non-exclusive things that are additions to what's implemented here. I feel like we're pulling a lot of premature feature creep into a simple PR that would be better merged while it's as small as possible.

@oraNod oraNod removed the needs_triage label (needs a first human triage before being processed) on Aug 4, 2023
@oraNod (Contributor, Author) commented Aug 4, 2023:

@webknjaz @gotmax23 I usually prefer to never merge my own PRs but I'll go ahead with this one. Thanks again.

@oraNod oraNod merged commit e8e6f0a into ansible:devel on Aug 4, 2023
@gotmax23 (Collaborator) commented Aug 4, 2023:

🎉. Thanks @oraNod. If you ever want me to merge something that I reviewed, feel free to ping on Matrix.

@webknjaz (Member) commented Aug 4, 2023:

🎉 🎉 🎉

My reviews usually mean that I don't mind the thing being merged, hence it doesn't matter who does that as long as there are no other blockers. But I also didn't want to merge, despite having the power to do so, since I'm trying to treat the docs repo as a place where I avoid any decision-making and don't want to step on somebody else's toes… So I'll try to only merge if asked explicitly.

@webknjaz (Member) commented Aug 4, 2023:

@oraNod as a follow-up, consider adding .pip-tools.toml as it's now a thing.

@samccann samccann added the backport-2.15, backport-2.14, and backport-2.13 labels (automatically create a backport for the stable-2.15, stable-2.14, and stable-2.13 branches) on Aug 4, 2023
@patchback (bot) commented Aug 4, 2023:

Backport to stable-2.14: 💔 cherry-picking failed — target branch does not exist

❌ Failed to fetch https://github.com/ansible/ansible-documentation.git

🤖 @patchback: I'm built with octomachinery and my source is open — https://github.com/sanitizers/patchback-github-app.

@patchback (bot) commented Aug 4, 2023:

Backport to stable-2.15: 💚 backport PR created

✅ Backport PR branch: patchback/backports/stable-2.15/e8e6f0a3c4cf4eef8b1bd6db5bb7226999a88611/pr-60

Backported as #247


@patchback (bot) commented Aug 4, 2023:

Backport to stable-2.13: 💔 cherry-picking failed — conflicts found

❌ Failed to cleanly apply e8e6f0a on top of patchback/backports/stable-2.13/e8e6f0a3c4cf4eef8b1bd6db5bb7226999a88611/pr-60

Backporting merged PR #60 into devel

  1. Ensure you have a local repo clone of your fork. Unless you cloned it
    from the upstream, this would be your origin remote.
  2. Make sure you have an upstream repo added as a remote too. In these
    instructions you'll refer to it by the name upstream. If you don't
    have it, here's how you can add it:
    $ git remote add upstream https://github.com/ansible/ansible-documentation.git
  3. Ensure you have the latest copy of upstream and prepare a branch
    that will hold the backported code:
    $ git fetch upstream
    $ git checkout -b patchback/backports/stable-2.13/e8e6f0a3c4cf4eef8b1bd6db5bb7226999a88611/pr-60 upstream/stable-2.13
  4. Now, cherry-pick the contents of PR #60 (Pin doc build requirements) into that branch:
    $ git cherry-pick -x e8e6f0a3c4cf4eef8b1bd6db5bb7226999a88611
    If it'll yell at you with something like fatal: Commit e8e6f0a3c4cf4eef8b1bd6db5bb7226999a88611 is a merge but no -m option was given., add -m 1 as follows instead:
    $ git cherry-pick -m1 -x e8e6f0a3c4cf4eef8b1bd6db5bb7226999a88611
  5. At this point, you'll probably encounter some merge conflicts. You must
    resolve them to preserve the patch from PR #60 (Pin doc build requirements) as close to the
    original as possible.
  6. Push this branch to your fork on GitHub:
    $ git push origin patchback/backports/stable-2.13/e8e6f0a3c4cf4eef8b1bd6db5bb7226999a88611/pr-60
  7. Create a PR, ensure that the CI is green. If it's not — update it so that
    the tests and any other checks pass. This is it!
    Now relax and wait for the maintainers to process your pull request
    when they have some cycles to do reviews. Don't worry — they'll tell you if
    any improvements are necessary when the time comes!


patchback bot pushed a commit that referenced this pull request Aug 4, 2023
* use pip compile and add constraints file

* don't suggest `pip install --user` in requirements.txt

Co-authored-by: Maxwell G <maxwell@gtmx.me>
Co-authored-by: Sviatoslav Sydorenko <578543+webknjaz@users.noreply.github.com>
(cherry picked from commit e8e6f0a)
@felixfontein (Collaborator):

If we want to backport this to stable-2.14 and stable-2.13, we'll need to do that manually, in particular since both branches have different pins for the requirements.

@samccann (Contributor) commented Aug 7, 2023:

@oraNod - I 'think' the ability to publish docs other than devel is dead if we don't do this backport all the way back to 2.13? The Jenkins build update (sorry folks, that's still an internal widget) now only works with this patch on devel. I guess the other option is to update the Jenkins logic to ...erm... I dunno? Try to detect all possible variants on this?

Labels: backport-2.13, backport-2.14, backport-2.15
Projects
None yet
Development

Successfully merging this pull request may close these issues.

Freeze requirements in tests/requirements.txt
6 participants