SugarVerbose: Print more verbose testcase reports for all outcome #175

Closed
wants to merge 1 commit

Conversation

@mitzkia commented Jun 1, 2019


Signed-off-by: Andras Mitzki <mitzkia@gmail.com>

I have found that pytest-sugar can display each failed test case, which is a useful feature.
For me it would be even better if it could display this for every pytest outcome.
The original idea came from error outcomes, which can also happen in the setup or teardown phases, and I would like to know when they happened.
Maybe the option name (--sugar-verbose) is not the best; I am fine with changing it.

Update:
The functionality has been changed from the "--sugar-verbose" option to the native pytest "-ra" argument.

To try this feature, run:

$ python3 -m pytest -ra t.py 

Before

Results (0.05s):
       5 passed
       3 failed
         - t.py:4 test_B
         - t.py:7 test_C
         - t.py:10 test_D
       1 error

After

Results (0.05s):
       5 passed
         - t.py:1 test_A (call)
         - t.py:18 test_E (call)
         - t.py:23 test_F[3+5-8] (call)
         - t.py:23 test_F[2+4-6] (call)
         - t.py:23 test_F[6*9-42] (call)
       3 failed
         - t.py:4 test_B (call)
         - t.py:7 test_C (call)
         - t.py:10 test_D (call)
       1 error
         - t.py:18 test_E (teardown)
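
For context, the sketch below shows one way a plugin could collect the phase alongside each outcome, using pytest's standard pytest_runtest_logreport and pytest_terminal_summary hooks. This is a minimal illustration of the mechanism, not the actual patch in this PR; the collected_reports list and the summary format are assumed names for illustration.

# conftest.py -- minimal sketch, not the actual patch from this PR.
# Every test report passes through pytest_runtest_logreport once per
# phase, so we can remember in which phase (setup/call/teardown)
# each outcome happened.
collected_reports = []

def pytest_runtest_logreport(report):
    # report.when is "setup", "call", or "teardown";
    # report.outcome is "passed", "failed", or "skipped".
    collected_reports.append((report.nodeid, report.when, report.outcome))

def pytest_terminal_summary(terminalreporter):
    # Note: this also prints the passed setup/teardown reports,
    # which is exactly the duplication discussed further below.
    for nodeid, when, outcome in collected_reports:
        terminalreporter.write_line("%s: %s (%s)" % (outcome, nodeid, when))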

@blueyed (Collaborator) commented Jun 1, 2019

Just a quick idea: maybe this should be based on / coupled with pytest's -r option (reportchars), i.e. -ra would then also enable it here.
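
As a sketch of that coupling: a plugin could read the -r characters from the option pytest stores for them (reportchars). The _sugar_verbose flag below is a made-up name for illustration, not pytest-sugar API.

# Sketch: gate the verbose per-phase summary on pytest's own -r flag.
# pytest stores the -r characters in config.option.reportchars
# ("a" = all except passed, "A" = all).
def pytest_configure(config):
    reportchars = getattr(config.option, "reportchars", "") or ""
    # _sugar_verbose is a hypothetical attribute name, not pytest-sugar API.
    config._sugar_verbose = ("a" in reportchars) or ("A" in reportchars)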

@codecov-io commented Jun 1, 2019

Codecov Report

Merging #175 into master will decrease coverage by 3.47%.
The diff coverage is 52.94%.


@@            Coverage Diff             @@
##           master     #175      +/-   ##
==========================================
- Coverage   85.32%   81.85%   -3.48%     
==========================================
  Files           2        2              
  Lines         477      496      +19     
  Branches       84       92       +8     
==========================================
- Hits          407      406       -1     
- Misses         44       57      +13     
- Partials       26       33       +7
Impacted Files Coverage Δ
pytest_sugar.py 76.32% <52.94%> (-4.56%) ⬇️


@mitzkia (Author) commented Jun 1, 2019

Thanks @blueyed, I will check it.

@mitzkia (Author) commented Jun 1, 2019

I have checked it and found some issues:
1. It does not display where the test case error occurred: in setup or in teardown.
2. The output is very different depending on whether pytest-sugar is installed (for the same test case).

Report output when pytest-sugar is not installed:

$ python3 -m pytest -rA t.py
platform linux -- Python 3.6.7, pytest-4.5.0, py-1.6.0, pluggy-0.11.0
hypothesis profile 'default' -> database=DirectoryBasedExampleDatabase('/home/micek/na_akkor_ujra/backup/playground/python/play_with_pytest/.hypothesis/examples')
rootdir: /home/micek/na_akkor_ujra/backup/playground/python/play_with_pytest
plugins: xdist-1.28.0, repeat-0.8.0, icdiff-0.2, forked-1.0.2, hypothesis-4.23.4
collected 8 items  
...
============================================================ short test summary info =============================================================
ERROR t.py::test_E - AssertionError: assert 'aaa' == 'bbb'
FAILED t.py::test_B - AssertionError: assert equals failed
FAILED t.py::test_C - Exception: ('spam', 'eggs')
FAILED t.py::test_D - ZeroDivisionError: division by zero
PASSED t.py::test_A
PASSED t.py::test_E
PASSED t.py::test_F[3+5-8]
PASSED t.py::test_F[2+4-6]
PASSED t.py::test_F[6*9-42]

Report output when pytest-sugar is installed:

$ python3 -m pytest -rA t.py
Test session starts (platform: linux, Python 3.6.7, pytest 4.5.0, pytest-sugar 0.9.2)
hypothesis profile 'default' -> database=DirectoryBasedExampleDatabase('/home/micek/na_akkor_ujra/backup/playground/python/play_with_pytest/.hypothesis/examples')
rootdir: /home/micek/na_akkor_ujra/backup/playground/python/play_with_pytest
plugins: xdist-1.28.0, sugar-0.9.2, repeat-0.8.0, icdiff-0.2, forked-1.0.2, hypothesis-4.23.4
collecting ... 
...
============================================================ short test summary info =============================================================
FAILED t.py::test_B - AssertionError: assert equals failed
FAILED t.py::test_C - Exception: ('spam', 'eggs')
FAILED t.py::test_D - ZeroDivisionError: division by zero
FAILED t.py::test_E - AssertionError: assert 'aaa' == 'bbb'
PASSED t.py::test_A
PASSED t.py::test_A
PASSED t.py::test_A
PASSED t.py::test_B
PASSED t.py::test_B
PASSED t.py::test_C
PASSED t.py::test_C
PASSED t.py::test_D
PASSED t.py::test_D
PASSED t.py::test_E
PASSED t.py::test_E
PASSED t.py::test_F[3+5-8]
PASSED t.py::test_F[3+5-8]
PASSED t.py::test_F[3+5-8]
PASSED t.py::test_F[2+4-6]
PASSED t.py::test_F[2+4-6]
PASSED t.py::test_F[2+4-6]
PASSED t.py::test_F[6*9-42]
PASSED t.py::test_F[6*9-42]
PASSED t.py::test_F[6*9-42]

Results (0.05s):
       5 passed
       3 failed
         - t.py:4 test_B
         - t.py:7 test_C
         - t.py:10 test_D
       1 error

My example test file:

import pytest

def test_A():
    assert "aa" == "aa"

def test_B():
    assert "aa" == "bb"  # fails in the call phase

def test_C():
    raise Exception('spam', 'eggs')  # fails in the call phase

def test_D():
    a = 7
    b = 0
    a / b  # ZeroDivisionError in the call phase

def stop():
    assert "aaa" == "bbb"  # fails, producing an error in the teardown phase

def test_E(request):
    request.addfinalizer(stop)

@pytest.mark.parametrize("test_input,expected", [("3+5", 8), ("2+4", 6), ("6*9", 42)])
def test_F(test_input, expected):
    assert True

@mitzkia (Author) commented Jun 1, 2019

I can understand why there are so many test cases in the report: it counts the setup, call, and teardown phases for each test case. But the outcome change from
ERROR t.py::test_E - AssertionError: assert 'aaa' == 'bbb'
to
FAILED t.py::test_E - AssertionError: assert 'aaa' == 'bbb'
is more problematic, I think.
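
For background: the ERROR label is not a separate attribute on the report itself; pytest conventionally derives it from the phase, so a failed report whose when is not "call" is shown as an error. A sketch of that classification (the classify helper is illustrative, not pytest API):

# Sketch: how a summary label is conventionally derived from a report.
# A failure outside the "call" phase is reported as an ERROR, which is
# exactly the distinction that is lost in the output above.
def classify(report):
    if report.when != "call" and report.outcome == "failed":
        return "ERROR"  # setup/teardown failure, e.g. test_E's finalizer
    return report.outcome.upper()  # PASSED / FAILED / SKIPPED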

@mitzkia (Author) commented Jun 1, 2019

Oh, I understand you now: maybe we do not need this new "--sugar-verbose" option; we can just enable this feature when -ra is given in the arguments. I will think about this.

@mitzkia changed the title from "SugarVerbose: Add --sugar-verbose to print testcase reports for all outcome" to "SugarVerbose: Print more verbose testcase reports for all outcome" on Jun 1, 2019
@mitzkia (Author) commented Jun 1, 2019

@blueyed Thanks for the note, I have removed the option; the feature can now be enabled with the native "-ra" argument.

@mitzkia (Author) commented Jun 9, 2019

Correct me if I am wrong: without fixing the CI failures, my PR cannot be merged. As far as I saw, there are already some PRs to fix those issues. My question is: how can I help to move my PR forward?
Can I do some reviewing or testing (which PRs would fix the CI issues)?

@blueyed (Collaborator) commented Jun 9, 2019

@mitzkia we have to wait for @Frozenball here, I assume.

@mitzkia (Author) commented Jun 9, 2019

Ok, thank you for the answer.

@blueyed (Collaborator) commented Jun 9, 2019

Actually I can merge things here, and CI should be fixed after merging #156 (which was approved and got no more feedback).

In general I do not use pytest-sugar myself by default (so do not expect too much help / reviewing from me here).

@blueyed (Collaborator) commented Jun 9, 2019

... so apparently #156 was stalled for too long, and CI is still broken with it now: https://travis-ci.org/Frozenball/pytest-sugar/builds/543349387

Do you feel like fixing it? (From a quick look it appears there is an issue with xdist, and some other dependency compatibility at least.)
From my point of view we would not really have to support pytest30, for example (if that's an issue), but it would be great to get feedback / an opinion from @Frozenball as the owner here.

@mitzkia (Author) commented Jun 9, 2019

Thanks for your answer. I would say let's wait a while for @Frozenball's answer. Soon I will check the broken CI and #156.

@blueyed (Collaborator) commented Jun 9, 2019

I am trying to fix it quickly in #177 already.

@blueyed (Collaborator) commented Jun 9, 2019

Merged #177, so you should be able to rebase your PR(s), at least for now.

@mitzkia (Author) commented Jun 9, 2019

Thanks :), I will do it.

@mitzkia (Author) commented Jun 9, 2019

Thanks again, I will fix the CI failure.

SugarVerbose: Print more verbose testcase reports for all outcome
Signed-off-by: Andras Mitzki <mitzkia@gmail.com>
@Teemu (Owner) commented Nov 8, 2022

Hey 👋

Thank you for your PR! I am going through the old PRs in this repository and closing them, since a long time has passed since they were opened and they may not be that relevant anymore. If you still wish to continue with this pull request, let me know and we can re-open it.

Teemu

@Teemu closed this on Nov 8, 2022