Updating requirements for myst-parser #1197
Conversation
I think that the cherry-pick worked OK! Will start trying to figure out errors. BTW, @jpmckinney - I'm trying to locally run the command that the CI build is failing on, but it is giving me a different result locally. I wonder if there's an easy "gotcha" that I am not noticing:
I will dig into the docs a bit closer next time to see if there's more in there about building the translations. I don't have a ton of familiarity with translation workflows in Sphinx.
There are dependencies between commands (e.g. I think this error is because a prerequisite target hadn't been run). There are many Make targets, for example if you just want to build one language.
Gotcha, thanks. This is probably the most formidable Makefile I've ever used 😅 I am very impressed and a little fearful!
I would suggest turning on fail-on-errors when building the documentation, e.g. at line 99 in 323dee4.
Currently there are a bunch of warnings in the documentation build (even before this PR): https://github.com/open-contracting/standard/runs/1810749509
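For context, a minimal sketch of what fail-on-errors could look like: `-W` is Sphinx's standard "treat warnings as errors" flag and `--keep-going` reports all warnings before failing, but the source and output paths below are assumptions, not this repo's actual Makefile contents:

```shell
# Sketch: fail the build on any Sphinx warning, but report them all first.
# The source/ and build/html paths here are hypothetical.
sphinx-build -W --keep-going -b html source/ build/html
```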
I'm making some updates to one of our Sphinx extensions to not warn about things that are actually fine. However, some of the warnings I don't know how to quiet. For example:
When we build the documentation, we write some JSON and CSV files to the build directory that Sphinx doesn't know about, so it complains about "target not found". I figure there's some way to tell Sphinx about those files. I've seen...
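One option, sketched below, is Sphinx's `suppress_warnings` setting in `conf.py`. `ref.ref` is one of Sphinx's documented warning categories (unresolved `:ref:`-style targets); whether it actually matches these particular "target not found" warnings is an assumption:

```python
# conf.py (sketch): silence a specific category of Sphinx warnings.
# "ref.ref" is a real Sphinx warning type; broader categories like
# "ref" would suppress more than intended, so prefer the narrow one.
suppress_warnings = [
    "ref.ref",
]
```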
@jpmckinney would it be acceptable to use the `download` role?
This is used to link directly to assets that are not "part of the documentation" but are meant to be downloaded.
Aha! That sounds like the solution. I didn't know about that role (for my later reading: https://www.sphinx-doc.org/en/master/usage/restructuredtext/roles.html#referencing-downloadable-files).
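For reference, the role looks roughly like this in both syntaxes; the link text and file path are hypothetical:

```rst
reStructuredText:  See the :download:`release schema <../build/release-schema.json>`.
```

```markdown
MyST:  See the {download}`release schema <../build/release-schema.json>`.
```

Sphinx copies the referenced file into the output (under `_downloads/`) and links to it, so it stops complaining that the target is unknown.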
@choldgraf Sphinx merged my PR (sphinx-doc/sphinx#8853) for the heading level issue (executablebooks/MyST-Parser#302), so we can pin to a commit of Sphinx until the next 3.x release is made.
Wow, nice! Will add that to the next commit. Right now I am trying to figure out this weird bug that we seem to be triggering in the markdown-it-py library when doing translations.
Looking at your PR there, I wonder if that actually fixes our bug? Will investigate.
The PR makes the heading level go from 0 to 1 instead of 0 to 2, so it should fix the immediate problem.
I think the following error is fixed if you upgrade mdit-py-plugins to 0.2.5 (released for this fix: executablebooks/mdit-py-plugins#11).
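The upgrade would presumably be a one-line change in the requirements file; the file name and pin style below are assumptions, not this repo's actual layout:

```shell
# requirements.txt (sketch): require the fixed release or newer
mdit-py-plugins>=0.2.5
```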
@jpmckinney ahh, I just spent way too long trying to figure out what the heck was going on with that bug haha. I didn't realize that was the package to upgrade; I knew it had been addressed somewhere. FWIW, this commit is what I finally used as a workaround, and I have no idea why it works 😅 OK, I reverted that change and pushed another commit that updates mdit-py-plugins.
@jpmckinney any thoughts on this one? The latest error seems to be from some custom docs code in this repo: https://github.com/open-contracting/standard/pull/1197/checks?check_run_id=1884827925#step:6:64
Oh, yeah, I'll have to edit that script for the new theme (should be easy - just changing a selector). For now, you can comment that line out. It's possible some of the tests will fail, too. I'll also need to customize the theme later, because we replace the search page's JavaScript to send a query to Elasticsearch. (The deployment process creates an Elasticsearch index.)
@jpmckinney sounds good - I don't have any more cycles for this one tonight, so I'll just comment that line out and see what breaks next, but then gonna log off :-) Are you using Elasticsearch just to speed things up? Nifty...
The quality of Sphinx's search results is not great, so we use Elasticsearch, which supports quoted phrases, etc., and we're able to tune the results better.
@jpmckinney did you ever write up your workflow there in a blog post or something? That is pretty cool. A lot of folks ask for improved search in Jupyter Book, and I've been trying to figure out ways that we could recommend something, or improve the tech itself. Curious if you ever played around with using something like lunr.
Interesting - I'll have a look at Lunr. I haven't written it up, but https://ocds-index.readthedocs.io/en/latest/ is the Python package that does the work (at the Python level, it's possible to configure it for different themes).

ocds-index's CLI gets called by this script, which is run by GitHub Actions once the tests pass: https://github.com/open-contracting/deploy/blob/master/deploy-docs.sh#L8-L9

Then, the theme queries Elasticsearch: https://github.com/open-contracting/standard_theme/blob/open_contracting/standard_theme/static/js/search.js#L20-L53

We use https://readonlyrest.com to expose Elasticsearch to the web safely. (You'll notice the JavaScript in the theme uses Basic Authentication nonetheless, just so that random bots don't hit Elasticsearch.)
Very cool, thanks for sharing! Note: I'm not sure what the latest failure is, though tests seem to pass 🙂 https://github.com/open-contracting/standard/pull/1197/checks?check_run_id=1884900587#step:8:1
Oh, it's because this branch is from your fork, rather than in the repo itself, and an ssh-related GitHub Action isn't letting the build access the repo's secrets for that reason. You said you'd be logging off for the evening 😛 but if you change your origin and push a new branch to the repo directly, then it should build! |
@jpmckinney - hmmm - I don't believe that I have permission to push directly to the standard repo:
Oops, I thought I had already invited you. I've sent an invite now. |
This is a first-pass PR to get some low-hanging fruit out of the way. It does the following things:

- Converts `eval_rst` blocks to instead use `{eval-rst}` directives, which are the MyST equivalent of the same thing.
- There are links of the form `[sometext](page#header)`. Will need to find these and update them.

Supersedes #1192
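As an illustration of the first item, the conversion looks like this; the directive content is a made-up example:

````markdown
<!-- Before: recommonmark-style fenced block -->
```eval_rst
.. note:: Some reStructuredText content.
```

<!-- After: MyST directive syntax -->
```{eval-rst}
.. note:: Some reStructuredText content.
```
````

The contents stay verbatim reStructuredText either way; only the fence info string changes.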