
XFail and XPass output does not get printed to console #10618

Closed
jeffwright13 opened this issue Dec 29, 2022 · 4 comments
Labels
type: question general question, might be closed after 2 weeks of inactivity

Comments

@jeffwright13

I don't see any output on the console when I run tests marked XFail or XPass. In fact, these tests only show up in the summaries, in neither the PASSES nor the FAILURES section, where I would expect to see them. This seems like a fairly serious oversight, since testers certainly want to see the output of tests that have been marked as such. Am I missing something?

Now, if I specify 'live logs' on the command line, I do get my expected output, but only live, not captured. Given that a lot of pytest users probably don't know about this option, wouldn't it make sense to print the stdout/stderr/stdlog to console by default when processing XFail and XPass cases?

Thanks, Jeff

Example console outputs:

$ pytest -rAR -k test_single
========================================= test session starts =========================================
platform darwin -- Python 3.9.9, pytest-7.2.0, pluggy-1.0.0
rootdir: /Users/jwr003/coding/pytest-tui, configfile: pytest.ini
plugins: rerunfailures-10.3, metrics-0.4, assume-2.4.3, html-3.2.0, typhoon-xray-0.4.7,
json-report-1.5.0, Faker-13.15.0, adaptavist-5.8.0, session2file-0.1.11, allure-pytest-2.12.0,
jira-xray-0.8.2, reportportal-5.1.3, pylama-8.4.1, reportlog-0.1.2, metadata-2.0.2, tui-1.6.0

collected 78 items / 76 deselected / 2 selected

demo-tests/test_single_xpass_xfail.py xX                                                        [100%]

======================================= short test summary info =======================================
XFAIL demo-tests/test_single_xpass_xfail.py::test0_xfail
XPASS demo-tests/test_single_xpass_xfail.py::test0_xpass
============================ 76 deselected, 1 xfailed, 1 xpassed in 1.27s =============================
$ pytest --log-cli-level=DEBUG -rAR -k test_single
========================================= test session starts =========================================
platform darwin -- Python 3.9.9, pytest-7.2.0, pluggy-1.0.0
rootdir: /Users/jwr003/coding/pytest-tui, configfile: pytest.ini
plugins: rerunfailures-10.3, metrics-0.4, assume-2.4.3, html-3.2.0, typhoon-xray-0.4.7,
json-report-1.5.0, Faker-13.15.0, adaptavist-5.8.0, session2file-0.1.11, allure-pytest-2.12.0,
jira-xray-0.8.2, reportportal-5.1.3, pylama-8.4.1, reportlog-0.1.2, metadata-2.0.2, tui-1.6.0

collected 78 items / 76 deselected / 2 selected

demo-tests/test_single_xpass_xfail.py::test0_xfail
-------------------------------------------- live log call --------------------------------------------
CRITICAL root:test_single_xpass_xfail.py:10 CRITICAL
ERROR    root:test_single_xpass_xfail.py:11 ERROR
WARNING  root:test_single_xpass_xfail.py:12 WARNING
INFO     root:test_single_xpass_xfail.py:13 INFO
DEBUG    root:test_single_xpass_xfail.py:14 DEBUG
XFAIL                                                                                           [ 50%]
demo-tests/test_single_xpass_xfail.py::test0_xpass
-------------------------------------------- live log call --------------------------------------------
CRITICAL root:test_single_xpass_xfail.py:21 CRITICAL
ERROR    root:test_single_xpass_xfail.py:22 ERROR
WARNING  root:test_single_xpass_xfail.py:23 WARNING
INFO     root:test_single_xpass_xfail.py:24 INFO
DEBUG    root:test_single_xpass_xfail.py:25 DEBUG
XPASS                                                                                           [100%]

======================================= short test summary info =======================================
XFAIL demo-tests/test_single_xpass_xfail.py::test0_xfail
XPASS demo-tests/test_single_xpass_xfail.py::test0_xpass
============================ 76 deselected, 1 xfailed, 1 xpassed in 1.02s =============================

Test file:

import logging
import pytest

logger = logging.getLogger()


@pytest.mark.xfail()
def test0_xfail():
    print("Test 0 XFail")
    logger.critical("CRITICAL")
    logger.error("ERROR")
    logger.warning("WARNING")
    logger.info("INFO")
    logger.debug("DEBUG")
    assert False


@pytest.mark.xfail()
def test0_xpass():
    print("Test 0 XPass")
    logger.critical("CRITICAL")
    logger.error("ERROR")
    logger.warning("WARNING")
    logger.info("INFO")
    logger.debug("DEBUG")
    assert True

@RonnyPfannschmidt
Member

That's intentional.

With strict, XPASS gets treated as an error and should have output.

@jeffwright13
Author

Interesting, I was not aware of that option. I put it in pytest.ini, and the XPass now shows up as a Failure, as you mentioned. I still don't see anything related to XFail (which I know you didn't say would be there), but is there a way to have XFails print their output? (Also, I'm curious why the Captured stdout call section prints the log messages in triplicate.)

$ pytest -k single -rA
======================================= test session starts ========================================
platform darwin -- Python 3.9.9, pytest-7.2.0, pluggy-1.0.0
rootdir: /Users/jwr003/coding/pytest-tui, configfile: pytest.ini
plugins: rerunfailures-10.3, metrics-0.4, assume-2.4.3, html-3.2.0, typhoon-xray-0.4.7, json-report-1.5.0, Faker-13.15.0, adaptavist-5.8.0, session2file-0.1.11, allure-pytest-2.12.0, jira-xray-0.8.2, reportportal-5.1.3, pylama-8.4.1, reportlog-0.1.2, metadata-2.0.2, tui-1.6.0
collected 78 items / 76 deselected / 2 selected

demo-tests/test_single_xpass_xfail.py xF                                                     [100%]

============================================= FAILURES =============================================
___________________________________________ test0_xpass ____________________________________________
[XPASS(strict)]
--------------------------------------- Captured stdout call ---------------------------------------
Test 0 XPass
CRITICAL
CRITICAL
CRITICAL
ERROR
ERROR
ERROR
WARNING
WARNING
WARNING
INFO
INFO
INFO
DEBUG
DEBUG
DEBUG
---------------------------------------- Captured log call -----------------------------------------
CRITICAL root:test_single_xpass_xfail.py:23 CRITICAL
ERROR    root:test_single_xpass_xfail.py:24 ERROR
WARNING  root:test_single_xpass_xfail.py:25 WARNING
INFO     root:test_single_xpass_xfail.py:26 INFO
DEBUG    root:test_single_xpass_xfail.py:27 DEBUG
========================================= warnings summary =========================================
demo-tests/test_single_xpass_xfail.py::test0_xfail
  /Users/jwr003/coding/pytest-tui/demo-tests/test_single_xpass_xfail.py:16: Warning: You ave been warned!
    warnings.warn(Warning("You ave been warned!"))

demo-tests/test_single_xpass_xfail.py::test0_xpass
  /Users/jwr003/coding/pytest-tui/demo-tests/test_single_xpass_xfail.py:28: Warning: You ave been warned!
    warnings.warn(Warning("You ave been warned!"))

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
===================================== short test summary info ======================================
XFAIL demo-tests/test_single_xpass_xfail.py::test0_xfail
FAILED demo-tests/test_single_xpass_xfail.py::test0_xpass
===================== 1 failed, 76 deselected, 1 xfailed, 2 warnings in 1.13s ======================
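For reference, the ini option in question is `xfail_strict`; a minimal pytest.ini fragment (a sketch, to be merged into an existing config):

```ini
[pytest]
# Treat an unexpected pass (XPASS) as a test failure session-wide,
# so its captured output is printed in the FAILURES section.
xfail_strict = true
```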

@Zac-HD Zac-HD added the type: question general question, might be closed after 2 weeks of inactivity label Jan 6, 2023
@Alxe

Alxe commented Jan 21, 2023

While the behavior is intended, is there any way to log this information as part of the summary of the test? For example, by using report hooks.

One of my use cases is to run typing tests on a legacy code base (unstable typing) and to produce a test per error code so that I can catch regression errors once a given error code is stable.
In order to slowly stabilize the analysis of each error code, I want to see the output of XFail tests.

@Zac-HD
Member

Zac-HD commented Feb 17, 2024

I believe this was fixed by #11574, in Pytest 8.0 🙂

@Zac-HD Zac-HD closed this as completed Feb 17, 2024