Provide hook to run doctests as part of normal test suite #198

Closed
davidanthoff opened this issue Aug 16, 2016 · 11 comments · Fixed by #774

@davidanthoff (Contributor)

It would be great if there were a function that I could call in my normal runtests.jl script to run all the doctests, for example:

Documenter.run_doctests(packagename)

This would have two benefits, as far as I can see:

  1. Errors in the doctests would actually show up as a broken build (right now an error in a doctest does not actually break a Travis build).
  2. Doctests could be run on all PRs, branches, etc. My sense is that one would want to run doctests essentially whenever one wants to run normal tests, but of course doesn't want the docs to be built/deployed in all of these situations.
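
To make the intent concrete, here is a sketch of how such a hook might sit in a test suite. `Documenter.run_doctests` is the hypothetical function proposed above (it does not exist in Documenter), and `MyPackage` is an illustrative package name.

```julia
# test/runtests.jl — hypothetical usage of the proposed hook.
using MyPackage, Documenter

# ... normal unit tests ...

# Proposed (non-existent) hook: run every doctest and fail the
# test run, and hence the CI build, if any of them break.
Documenter.run_doctests(MyPackage)
```
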
@MichaelHatherly (Member)

The problem I've run into with this is that knowing what setup code (in @meta blocks) is needed for each doctest requires Documenter to parse all the markdown files in the docs/src folder.

I think it's probably simplest to just throw an error if any errors are reported during makedocs.

> but of course doesn't want the docs to be built/deployed in all of these situations.

deploydocs should never actually run during a PR or on a non-master branch. It only deploys due to a push to master or when a new tag is pushed. So running docs/make.jl as part of your test suite instead of as a separate step should work fine.
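
A minimal sketch of that arrangement, assuming a conventional docs/make.jl that calls makedocs (and deploydocs, which only deploys on pushes to master or tags). Paths and the package name are illustrative.

```julia
# test/runtests.jl — run the docs build (and therefore the doctests)
# as part of the normal test suite.
using MyPackage
using Test

@testset "MyPackage" begin
    # ... normal unit tests ...
end

# Build the documentation; doctest failures surface here.
include(joinpath(@__DIR__, "..", "docs", "make.jl"))
```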

@davidanthoff (Contributor, Author)

> I think it's probably simplest to just throw an error if any errors are reported during makedocs.

Yes, that would definitely be good.

> So running docs/make.jl as part of your test suite instead of as a separate step should work fine.

That should work for now. But I think it would still be nice to have a specific test function that does less than makedocs. Yes, it still has to parse everything, but it wouldn't actually have to write any output, right?

@MichaelHatherly (Member)

> Yes, it still has to parse everything, but it wouldn't actually have to write any output, right?

Yes, there's enough flexibility in the build "pipeline" to reorder and/or disable certain steps, so we could probably add a testdocs / test function that does just the bare minimum, I think.

@davidanthoff (Contributor, Author)

Actually, if you didn't just throw an exception but instead used the @test macros from Base, then one could include the make.jl file in a @testset and get a nice summary of how many tests passed, etc.
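
A sketch of that idea, assuming makedocs reported doctest failures through @test rather than its own error handling (paths are illustrative):

```julia
# test/runtests.jl — if Documenter reported doctest results with @test,
# wrapping the docs build in a @testset would fold them into the usual
# pass/fail summary.
using Test

@testset "Doctests" begin
    include(joinpath(@__DIR__, "..", "docs", "make.jl"))
end
```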

@MichaelHatherly (Member)

Hooking into the testset infrastructure might work quite nicely. I'd want to leave something like that until we've dropped Julia 0.4 support, though, since otherwise it might get kind of messy. Dropping 0.4 will likely happen for the Documenter 0.4 release; there's no set date for that just yet.

@davidanthoff (Contributor, Author)

Hm, what if you just used @test for the comparison of the expected output and the output you got? That should work on Julia 0.4 and 0.5, and I think it would be enough to hook into testsets on 0.5. But it's certainly not a high priority.
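
A bare-bones illustration of the suggestion (the helper name and strings are made up for this sketch): if the expected/actual comparison were expressed as an @test, any enclosing @testset set up by the caller would record it automatically.

```julia
using Test  # `Base.Test` on the Julia 0.4/0.5 versions discussed here

# Illustrative only: a doctest checker that reports via @test instead of
# throwing, so results roll up into whatever @testset the caller is running.
function check_doctest_output(expected::AbstractString, actual::AbstractString)
    @test expected == actual
end

@testset "doctests" begin
    check_doctest_output("4", string(2 + 2))
end
```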

@MichaelHatherly (Member)

The current output of @test for string comparisons can be difficult to read at times (not that Documenter's is much better) when there are two long strings with only very minor differences. What would be cool to have is something like Elixir's string diff (http://elixir-lang.org/blog/2016/06/21/elixir-v1-3-0-released/) for test output. That would make it really easy to see the differences.

@davidanthoff (Contributor, Author)

Now that Julia 0.5 support is dropped, maybe this could be implemented? I don't really care about the details, but it would be great if I could just add the one line from my original post to my runtests.jl file and automatically have all the doctests run as part of my normal tests, with errors in the doctests reported through the normal Julia testing infrastructure.

@davidanthoff (Contributor, Author)

Fantastic! Is there some documentation on how to include doctests in the normal runtests.jl?

I just tried to look at the dev version of the Documenter.jl docs, but ironically those don't seem to be hosted / working?

@mortenpi (Member) commented Jul 1, 2019

Well, the PR was merged 6 minutes ago. The docs are still building 😂

I can offer this as a temporary substitute though 🙂

@mortenpi (Member) commented Jul 1, 2019

By the way, the exact API can probably be improved, so any suggestions and feedback would be much appreciated. Feel free to open an issue, or add to #1051.

Regarding the docs: actually, yes, #774 did break the dev/ docs, so they're gone at the moment. Will fix tomorrow.
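
For reference, a minimal sketch of calling the entry point added by #774 from a package's test suite. The exact name and signature shown here (Documenter.doctest taking the package module) reflect later Documenter releases and should be treated as an assumption rather than a quote from this thread; `MyPackage` is illustrative.

```julia
# test/runtests.jl — run doctests through the standard test machinery.
using MyPackage
using Documenter
using Test

@testset "MyPackage" begin
    # Runs the doctests in MyPackage's docstrings and documentation and
    # reports failures like any other test.
    Documenter.doctest(MyPackage)

    # ... normal unit tests ...
end
```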
