[cargo-nextest 0.9.72] When one test fails, every test is marked as failed in the Neotest summary #460
Comments
@wvffle could you please also paste your test module?
Let me write a new one; the tested one was a private project.
I've updated my project to use nextest 0.9.72 and I'm now getting similar output.
Those are integration tests marked with
Nope. Mine are marked with
https://github.com/wvffle/rustaceanvim-460-repro using devenv.sh, so
@wvffle I've implemented a hotfix in the Would you mind giving it a test drive in your private project?
@mrcjkb strangely, it does not work in my private project, but it does work in the repro project. This is my output:
Losing output color in the neotest preview is a pity, but I guess I can't get both :/
I guess it's because of the space between the namespace and the test name. That wasn't present in the repro project's output. I'll have to revisit this another time.
ANSI escape codes would complicate the parsing logic.
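For context, stripping SGR color sequences before parsing is not too involved; here is a minimal sketch in Rust (rustaceanvim itself is a Lua plugin, and `strip_ansi` is a hypothetical helper, not its actual code):

```rust
/// Strip ANSI CSI escape sequences (e.g. "\x1b[31m") from a string.
/// A minimal sketch: assumes sequences start with ESC '[' and end at the
/// first byte in the 0x40..=0x7e range, per the usual CSI grammar.
fn strip_ansi(input: &str) -> String {
    let mut out = String::with_capacity(input.len());
    let mut chars = input.chars().peekable();
    while let Some(c) = chars.next() {
        if c == '\x1b' && chars.peek() == Some(&'[') {
            chars.next(); // consume '['
            // skip parameter bytes until the final byte ends the sequence
            while let Some(&n) = chars.peek() {
                chars.next();
                if ('\x40'..='\x7e').contains(&n) {
                    break;
                }
            }
        } else {
            out.push(c);
        }
    }
    out
}

fn main() {
    let colored = "\x1b[31mFAILED\x1b[0m tests::auth";
    println!("{}", strip_ansi(colored));
}
```

The trade-off mentioned above still stands: once the sequences are stripped for parsing, the colored original would have to be kept separately to preserve color in the preview.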
@wvffle could it be that you don't have your tests in a tests module, such as:

```rust
#[cfg(test)]
mod tests {
    use super::*;
    // test functions here
}
```

If you have all your tests at the top level, rustaceanvim will just run the tests for the whole module, because there's no such thing as a "test file" in Rust.
Maybe not, but there for sure is
Maybe it's because the repro project wasn't using cargo workspaces; when doing so, the hotfix still works in that repo. I've pushed a And the output contains the space you are talking about, I believe.
I guess I've found another bug: with two failing tests, the second failed one seems to be marked as skipped, even though the output says that zero tests were skipped and 2 have failed. Output:
Up until now, I've been running the tests from the neotest summary, for the This is strange. Output of the auth.rs module:
@mrcjkb I guess I've cracked it: I commented out all failing tests and added a basic test that failed, and it works well. In the real test, I use an assert method on a test client from the poem framework, and it fails like so:
This, I think, also means the test failure reason is not shown as a diagnostic in the file. When I did a simple
I've replicated the exact behavior in the repro project on the master branch:
The thing is, you can't tell cargo to "run a test file", which is a thing neotest tries to do when you run from the summary or with the cursor outside of a test function or module. Afaik, neotest with neotest-rust works around this by parsing the file and running all tests in it individually.
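The per-test workaround described above could be sketched like this, assuming nextest's filterset syntax (`-E 'test(=name)'` for an exact-name match, with `or` combining expressions); `nextest_args` is a hypothetical helper for illustration, not neotest's actual code:

```rust
/// Hypothetical helper: build the arguments a runner could pass to `cargo`
/// so that only the listed tests run, rather than a whole "test file".
/// Assumes nextest filtersets: `test(=name)` matches a test name exactly
/// and `or` combines filterset expressions.
fn nextest_args(tests: &[&str]) -> Vec<String> {
    let expr = tests
        .iter()
        .map(|t| format!("test(={t})"))
        .collect::<Vec<_>>()
        .join(" or ");
    vec!["nextest".to_string(), "run".to_string(), "-E".to_string(), expr]
}

fn main() {
    let args = nextest_args(&["tests::login_ok", "tests::login_fail"]);
    // e.g. cargo nextest run -E 'test(=tests::login_ok) or test(=tests::login_fail)'
    println!("cargo {}", args.join(" "));
}
```

The adapter would first parse the file (e.g. with tree-sitter) to collect the test names, then issue one filtered run covering exactly those tests.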
Could this be caused by neotest trying to run a test file too?
Yep. That error message contains no information about the location of the failure, because the assertion is in a library. There's not much I can do there. Does this relate to this issue or is it a separate one?
Thanks for the clarification.
I don't know :/
Well, it seems to be the culprit behind all tests failing in that specific module, even after supporting nextest 0.9.72.
I think the only way to solve this reliably is by using nextest's JUnit output. I'll merge #461 and close this issue for now, and I'll open a separate issue for using JUnit output.
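For reference, nextest's JUnit report is enabled through its config file; a sketch of the documented setup (the `path` is relative to the profile's output directory under `target/nextest/`):

```toml
# .config/nextest.toml — enable the JUnit report for the default profile.
# The XML report would then land at target/nextest/default/junit.xml.
[profile.default.junit]
path = "junit.xml"
```

Parsing that XML would sidestep the fragile parsing of human-readable, ANSI-colored output discussed earlier in this thread.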
Have you read the docs and searched existing issues?
Neovim version (nvim -v)
v0.10.0
Operating system/version
NixOS unstable
cargo 1.79.0 (ffa9cf99a 2024-06-03)
cargo-nextest 0.9.72
Output of :checkhealth rustaceanvim
How to reproduce the issue
Expected behaviour
Only failed tests should be marked as failed, and passed tests should be marked as passed.
Actual behaviour
All tests are marked as failed
The minimal config used to reproduce this issue.