pytest-dev__pytest-7490.gpt-4-0125-preview.eval.log
[pytest-dev__pytest__6.0] [pytest-dev__pytest-7490] Task Metadata:
- Instance ID: pytest-dev__pytest-7490
- Testbed: pytest-dev__pytest__6.0
- Evaluation Model: gpt-4-0125-preview
[pytest-dev__pytest__6.0] [pytest-dev__pytest-7490] Command: git -c advice.detachedHead=false checkout 7f7a36478abe7dd1fa993b115d22606aa0e35e88
[pytest-dev__pytest__6.0] [pytest-dev__pytest-7490] Subprocess args: {"cwd": "/home/swe-bench/pytest-dev__pytest", "check": true, "shell": false, "universal_newlines": true, "stdout": -1, "stderr": -2}
[pytest-dev__pytest__6.0] [pytest-dev__pytest-7490] Std. Output:
Previous HEAD position was 634cde950 Merge pull request #7745 from asottile/exec_globals_type_problem
HEAD is now at 7f7a36478 Merge pull request #7482 from nicoddemus/changelog-adjustments
[pytest-dev__pytest__6.0] [pytest-dev__pytest-7490] Return Code: 0
[pytest-dev__pytest__6.0] [pytest-dev__pytest-7490] Command: git apply -v /home/swe-bench/temp_pytest-dev__pytest-7490_pred_try.patch
[pytest-dev__pytest__6.0] [pytest-dev__pytest-7490] Subprocess args: {"cwd": "/home/swe-bench/pytest-dev__pytest", "check": false, "shell": false, "universal_newlines": true, "stdout": -1, "stderr": -2}
[pytest-dev__pytest__6.0] [pytest-dev__pytest-7490] Std. Output:
Checking patch src/_pytest/nodes.py...
Applied patch src/_pytest/nodes.py cleanly.
[pytest-dev__pytest__6.0] [pytest-dev__pytest-7490] Return Code: 0
[pytest-dev__pytest__6.0] [pytest-dev__pytest-7490] Apply patch successful (pred_try)
>>>>> Applied Patch (pred_try)
[pytest-dev__pytest__6.0] [pytest-dev__pytest-7490] Command: git apply -v -R /home/swe-bench/temp_pytest-dev__pytest-7490_pred_try.patch
[pytest-dev__pytest__6.0] [pytest-dev__pytest-7490] Subprocess args: {"cwd": "/home/swe-bench/pytest-dev__pytest", "check": false, "shell": false, "universal_newlines": true, "stdout": -1, "stderr": -2}
[pytest-dev__pytest__6.0] [pytest-dev__pytest-7490] Std. Output:
Checking patch src/_pytest/nodes.py...
Applied patch src/_pytest/nodes.py cleanly.
[pytest-dev__pytest__6.0] [pytest-dev__pytest-7490] Return Code: 0
[pytest-dev__pytest__6.0] [pytest-dev__pytest-7490] Revert patch successful (pred_try)
>>>>> Applied Patch (pred_try)
[pytest-dev__pytest__6.0] [pytest-dev__pytest-7490] Command: git apply -v /home/swe-bench/temp_pytest-dev__pytest-7490_pred.patch
[pytest-dev__pytest__6.0] [pytest-dev__pytest-7490] Subprocess args: {"cwd": "/home/swe-bench/pytest-dev__pytest", "check": false, "shell": false, "universal_newlines": true, "stdout": -1, "stderr": -2}
[pytest-dev__pytest__6.0] [pytest-dev__pytest-7490] Std. Output:
Checking patch src/_pytest/nodes.py...
Applied patch src/_pytest/nodes.py cleanly.
[pytest-dev__pytest__6.0] [pytest-dev__pytest-7490] Return Code: 0
[pytest-dev__pytest__6.0] [pytest-dev__pytest-7490] Apply patch successful (pred)
>>>>> Applied Patch (pred)
[pytest-dev__pytest__6.0] [pytest-dev__pytest-7490] Command: git restore testing/test_skipping.py
[pytest-dev__pytest__6.0] [pytest-dev__pytest-7490] Subprocess args: {"cwd": "/home/swe-bench/pytest-dev__pytest", "check": true, "shell": false, "universal_newlines": true, "stdout": -1, "stderr": -2}
[pytest-dev__pytest__6.0] [pytest-dev__pytest-7490] Std. Output:
[pytest-dev__pytest__6.0] [pytest-dev__pytest-7490] Return Code: 0
[pytest-dev__pytest__6.0] [pytest-dev__pytest-7490] Command: git apply -v /home/swe-bench/temp_pytest-dev__pytest-7490_test.patch
[pytest-dev__pytest__6.0] [pytest-dev__pytest-7490] Subprocess args: {"cwd": "/home/swe-bench/pytest-dev__pytest", "check": false, "shell": false, "universal_newlines": true, "stdout": -1, "stderr": -2}
[pytest-dev__pytest__6.0] [pytest-dev__pytest-7490] Std. Output:
Checking patch testing/test_skipping.py...
Applied patch testing/test_skipping.py cleanly.
[pytest-dev__pytest__6.0] [pytest-dev__pytest-7490] Return Code: 0
[pytest-dev__pytest__6.0] [pytest-dev__pytest-7490] Apply patch successful (test)
>>>>> Applied Patch (test)
Test Script: conda run -n pytest-dev__pytest__6.0 pytest -rA testing/test_skipping.py;
[pytest-dev__pytest__6.0] [pytest-dev__pytest-7490] Command: conda run -n pytest-dev__pytest__6.0 pytest -rA testing/test_skipping.py
[pytest-dev__pytest__6.0] [pytest-dev__pytest-7490] Subprocess args: {"cwd": "/home/swe-bench/pytest-dev__pytest", "check": false, "shell": false, "universal_newlines": true, "stdout": -1, "stderr": -2, "timeout": 1800}
[pytest-dev__pytest__6.0] [pytest-dev__pytest-7490] Std. Output:
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /home/swe-bench/pytest-dev__pytest, configfile: pyproject.toml
collected 81 items
testing/test_skipping.py ............................FF................. [ 58%]
.................................. [100%]
=================================== FAILURES ===================================
____________ TestXFail.test_dynamic_xfail_set_during_runtest_failed ____________
self = <test_skipping.TestXFail object at 0x74ccea570100>
testdir = <Testdir local('/tmp/pytest-of-swe-bench/pytest-0/test_dynamic_xfail_set_during_runtest_failed0')>
def test_dynamic_xfail_set_during_runtest_failed(self, testdir: Testdir) -> None:
# Issue #7486.
p = testdir.makepyfile(
"""
import pytest
def test_this(request):
request.node.add_marker(pytest.mark.xfail(reason="xfail"))
assert 0
"""
)
result = testdir.runpytest(p)
> result.assert_outcomes(xfailed=1)
E AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
E Omitting 4 identical items, use -vv to show
E Differing items:
E {'xfailed': 0} != {'xfailed': 1}
E {'failed': 1} != {'failed': 0}
E Use -v to get the full diff
/home/swe-bench/pytest-dev__pytest/testing/test_skipping.py:440: AssertionError
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_dynamic_xfail_set_during_runtest_failed0
collected 1 item
test_dynamic_xfail_set_during_runtest_failed.py F [100%]
=================================== FAILURES ===================================
__________________________________ test_this ___________________________________
request = <FixtureRequest for <Function test_this>>
def test_this(request):
request.node.add_marker(pytest.mark.xfail(reason="xfail"))
> assert 0
E assert 0
test_dynamic_xfail_set_during_runtest_failed.py:4: AssertionError
=========================== short test summary info ============================
FAILED test_dynamic_xfail_set_during_runtest_failed.py::test_this - assert 0
============================== 1 failed in 0.01s ===============================
________ TestXFail.test_dynamic_xfail_set_during_runtest_passed_strict _________
self = <test_skipping.TestXFail object at 0x74ccea4dfa30>
testdir = <Testdir local('/tmp/pytest-of-swe-bench/pytest-0/test_dynamic_xfail_set_during_runtest_passed_strict0')>
def test_dynamic_xfail_set_during_runtest_passed_strict(
self, testdir: Testdir
) -> None:
# Issue #7486.
p = testdir.makepyfile(
"""
import pytest
def test_this(request):
request.node.add_marker(pytest.mark.xfail(reason="xfail", strict=True))
"""
)
result = testdir.runpytest(p)
> result.assert_outcomes(failed=1)
E AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
E Omitting 4 identical items, use -vv to show
E Differing items:
E {'passed': 1} != {'passed': 0}
E {'failed': 0} != {'failed': 1}
E Use -v to get the full diff
/home/swe-bench/pytest-dev__pytest/testing/test_skipping.py:454: AssertionError
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_dynamic_xfail_set_during_runtest_passed_strict0
collected 1 item
test_dynamic_xfail_set_during_runtest_passed_strict.py . [100%]
============================== 1 passed in 0.00s ===============================
==================================== PASSES ====================================
________________________ TestEvaluation.test_no_marker _________________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_no_marker0
collected 0 items
============================ no tests ran in 0.00s =============================
___________________ TestEvaluation.test_marked_xfail_no_args ___________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_marked_xfail_no_args0
collected 0 items
============================ no tests ran in 0.00s =============================
__________________ TestEvaluation.test_marked_skipif_no_args ___________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_marked_skipif_no_args0
collected 0 items
============================ no tests ran in 0.00s =============================
______________________ TestEvaluation.test_marked_one_arg ______________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_marked_one_arg0
collected 0 items
============================ no tests ran in 0.00s =============================
________________ TestEvaluation.test_marked_one_arg_with_reason ________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_marked_one_arg_with_reason0
collected 0 items
============================ no tests ran in 0.00s =============================
___________________ TestEvaluation.test_marked_one_arg_twice ___________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_marked_one_arg_twice0
collected 0 items
============================ no tests ran in 0.00s =============================
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_marked_one_arg_twice0
collected 0 items
============================ no tests ran in 0.00s =============================
__________________ TestEvaluation.test_marked_one_arg_twice2 ___________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_marked_one_arg_twice20
collected 0 items
============================ no tests ran in 0.00s =============================
________ TestEvaluation.test_marked_skipif_with_boolean_without_reason _________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_marked_skipif_with_boolean_without_reason0
collected 0 items
============================ no tests ran in 0.00s =============================
____________ TestEvaluation.test_marked_skipif_with_invalid_boolean ____________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_marked_skipif_with_invalid_boolean0
collected 0 items
============================ no tests ran in 0.00s =============================
_______________________ TestEvaluation.test_skipif_class _______________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_skipif_class0
collected 0 items
============================ no tests ran in 0.00s =============================
______________________ TestXFail.test_xfail_simple[True] _______________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_xfail_simple0
collected 0 items
============================ no tests ran in 0.00s =============================
______________________ TestXFail.test_xfail_simple[False] ______________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_xfail_simple1
collected 0 items
============================ no tests ran in 0.00s =============================
_________________________ TestXFail.test_xfail_xpassed _________________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_xfail_xpassed0
collected 0 items
============================ no tests ran in 0.00s =============================
_____________________ TestXFail.test_xfail_using_platform ______________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_xfail_using_platform0
collected 0 items
============================ no tests ran in 0.00s =============================
_____________________ TestXFail.test_xfail_xpassed_strict ______________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_xfail_xpassed_strict0
collected 0 items
============================ no tests ran in 0.01s =============================
_______________________ TestXFail.test_xfail_run_anyway ________________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_xfail_run_anyway0
collected 2 items
test_xfail_run_anyway.py F. [100%]
=================================== FAILURES ===================================
__________________________________ test_func ___________________________________
@pytest.mark.xfail
def test_func():
> assert 0
E assert 0
test_xfail_run_anyway.py:4: AssertionError
=========================== short test summary info ============================
FAILED test_xfail_run_anyway.py::test_func - assert 0
========================= 1 failed, 1 passed in 0.02s ==========================
________ TestXFail.test_xfail_run_with_skip_mark[test_input0-expected0] ________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_xfail_run_with_skip_mark0
collected 1 item
test_sample.py s [100%]
=========================== short test summary info ============================
SKIPPED [1] test_sample.py:2: unconditional skip
============================== 1 skipped in 0.01s ==============================
________ TestXFail.test_xfail_run_with_skip_mark[test_input1-expected1] ________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_xfail_run_with_skip_mark1
collected 1 item
test_sample.py s [100%]
=========================== short test summary info ============================
SKIPPED [1] test_sample.py:2: unconditional skip
============================== 1 skipped in 0.00s ==============================
___________________ TestXFail.test_xfail_evalfalse_but_fails ___________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_xfail_evalfalse_but_fails0
collected 0 items
============================ no tests ran in 0.00s =============================
___________________ TestXFail.test_xfail_not_report_default ____________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1 -- /home/swe-bench/miniconda3/envs/pytest-dev__pytest__6.0/bin/python
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_xfail_not_report_default0
collecting ... collected 1 item
test_one.py::test_this XFAIL [100%]
============================== 1 xfailed in 0.01s ==============================
_________________ TestXFail.test_xfail_not_run_xfail_reporting _________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_xfail_not_run_xfail_reporting0
collected 3 items
test_one.py xx. [100%]
=========================== short test summary info ============================
XFAIL test_one.py::test_this
reason: [NOTRUN] noway
XFAIL test_one.py::test_this_true
reason: [NOTRUN] condition: True
========================= 1 passed, 2 xfailed in 0.07s =========================
__________________ TestXFail.test_xfail_not_run_no_setup_run ___________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_xfail_not_run_no_setup_run0
collected 1 item
test_one.py x [100%]
=========================== short test summary info ============================
XFAIL test_one.py::test_this
reason: [NOTRUN] hello
============================== 1 xfailed in 0.04s ==============================
__________________________ TestXFail.test_xfail_xpass __________________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_xfail_xpass0
collected 1 item
test_one.py X [100%]
=========================== short test summary info ============================
XPASS test_one.py::test_that
============================== 1 xpassed in 0.01s ==============================
_______________________ TestXFail.test_xfail_imperative ________________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_xfail_imperative0
collected 1 item
test_xfail_imperative.py x [100%]
============================== 1 xfailed in 0.01s ==============================
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_xfail_imperative0
collected 1 item
test_xfail_imperative.py x [100%]
=========================== short test summary info ============================
XFAIL test_xfail_imperative.py::test_this
reason: hello
============================== 1 xfailed in 0.01s ==============================
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_xfail_imperative0
collected 1 item
test_xfail_imperative.py . [100%]
============================== 1 passed in 0.01s ===============================
______________ TestXFail.test_xfail_imperative_in_setup_function _______________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_xfail_imperative_in_setup_function0
collected 1 item
test_xfail_imperative_in_setup_function.py x [100%]
============================== 1 xfailed in 0.02s ==============================
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_xfail_imperative_in_setup_function0
collected 1 item
test_xfail_imperative_in_setup_function.py x [100%]
=========================== short test summary info ============================
XFAIL test_xfail_imperative_in_setup_function.py::test_this
reason: hello
============================== 1 xfailed in 0.01s ==============================
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_xfail_imperative_in_setup_function0
collected 1 item
test_xfail_imperative_in_setup_function.py F [100%]
=================================== FAILURES ===================================
__________________________________ test_this ___________________________________
def test_this():
> assert 0
E assert 0
test_xfail_imperative_in_setup_function.py:6: AssertionError
=========================== short test summary info ============================
FAILED test_xfail_imperative_in_setup_function.py::test_this - assert 0
============================== 1 failed in 0.01s ===============================
_____________________ TestXFail.test_dynamic_xfail_no_run ______________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_dynamic_xfail_no_run0
collected 1 item
test_dynamic_xfail_no_run.py x [100%]
=========================== short test summary info ============================
XFAIL test_dynamic_xfail_no_run.py::test_this
reason: [NOTRUN]
============================== 1 xfailed in 0.04s ==============================
____________ TestXFail.test_dynamic_xfail_set_during_funcarg_setup _____________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_dynamic_xfail_set_during_funcarg_setup0
collected 1 item
test_dynamic_xfail_set_during_funcarg_setup.py x [100%]
============================== 1 xfailed in 0.01s ==============================
_________ TestXFail.test_xfail_raises[TypeError-TypeError-*1 xfailed*] _________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_xfail_raises0
collected 1 item
test_xfail_raises.py x [100%]
============================== 1 xfailed in 0.01s ==============================
_ TestXFail.test_xfail_raises[(AttributeError, TypeError)-TypeError-*1 xfailed*] _
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_xfail_raises1
collected 1 item
test_xfail_raises.py x [100%]
============================== 1 xfailed in 0.01s ==============================
_________ TestXFail.test_xfail_raises[TypeError-IndexError-*1 failed*] _________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_xfail_raises2
collected 1 item
test_xfail_raises.py F [100%]
=================================== FAILURES ===================================
_________________________________ test_raises __________________________________
@pytest.mark.xfail(raises=TypeError)
def test_raises():
> raise IndexError()
E IndexError
test_xfail_raises.py:4: IndexError
=========================== short test summary info ============================
FAILED test_xfail_raises.py::test_raises - IndexError
============================== 1 failed in 0.01s ===============================
_ TestXFail.test_xfail_raises[(AttributeError, TypeError)-IndexError-*1 failed*] _
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_xfail_raises3
collected 1 item
test_xfail_raises.py F [100%]
=================================== FAILURES ===================================
_________________________________ test_raises __________________________________
@pytest.mark.xfail(raises=(AttributeError, TypeError))
def test_raises():
> raise IndexError()
E IndexError
test_xfail_raises.py:4: IndexError
=========================== short test summary info ============================
FAILED test_xfail_raises.py::test_raises - IndexError
============================== 1 failed in 0.01s ===============================
_________________________ TestXFail.test_strict_sanity _________________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_strict_sanity0
collected 1 item
test_strict_sanity.py x [100%]
=========================== short test summary info ============================
XFAIL test_strict_sanity.py::test_foo
unsupported feature
============================== 1 xfailed in 0.01s ==============================
______________________ TestXFail.test_strict_xfail[True] _______________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_strict_xfail0
collected 1 item
test_strict_xfail.py F [100%]
=================================== FAILURES ===================================
___________________________________ test_foo ___________________________________
[XPASS(strict)] unsupported feature
============================== 1 failed in 0.01s ===============================
______________________ TestXFail.test_strict_xfail[False] ______________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_strict_xfail1
collected 1 item
test_strict_xfail.py X [100%]
=========================== short test summary info ============================
XPASS test_strict_xfail.py::test_foo unsupported feature
============================== 1 xpassed in 0.01s ==============================
_________________ TestXFail.test_strict_xfail_condition[True] __________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_strict_xfail_condition0
collected 1 item
test_strict_xfail_condition.py . [100%]
============================== 1 passed in 0.01s ===============================
_________________ TestXFail.test_strict_xfail_condition[False] _________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_strict_xfail_condition1
collected 1 item
test_strict_xfail_condition.py . [100%]
============================== 1 passed in 0.01s ===============================
_________________ TestXFail.test_xfail_condition_keyword[True] _________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_xfail_condition_keyword0
collected 1 item
test_xfail_condition_keyword.py . [100%]
============================== 1 passed in 0.01s ===============================
________________ TestXFail.test_xfail_condition_keyword[False] _________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_xfail_condition_keyword1
collected 1 item
test_xfail_condition_keyword.py . [100%]
============================== 1 passed in 0.01s ===============================
_____________ TestXFail.test_strict_xfail_default_from_file[true] ______________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_strict_xfail_default_from_file0, configfile: tox.ini
collected 1 item
test_strict_xfail_default_from_file.py F [100%]
=================================== FAILURES ===================================
___________________________________ test_foo ___________________________________
[XPASS(strict)] unsupported feature
============================== 1 failed in 0.01s ===============================
_____________ TestXFail.test_strict_xfail_default_from_file[false] _____________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_strict_xfail_default_from_file1, configfile: tox.ini
collected 1 item
test_strict_xfail_default_from_file.py X [100%]
=========================== short test summary info ============================
XPASS test_strict_xfail_default_from_file.py::test_foo unsupported feature
============================== 1 xpassed in 0.01s ==============================
_____________ TestXFailwithSetupTeardown.test_failing_setup_issue9 _____________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_failing_setup_issue90
collected 1 item
test_failing_setup_issue9.py x [100%]
============================== 1 xfailed in 0.03s ==============================
___________ TestXFailwithSetupTeardown.test_failing_teardown_issue9 ____________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_failing_teardown_issue90
collected 1 item
test_failing_teardown_issue9.py Xx [100%]
======================== 1 xfailed, 1 xpassed in 0.05s =========================
___________________________ TestSkip.test_skip_class ___________________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_skip_class0
collected 3 items
test_skip_class.py ss. [100%]
========================= 1 passed, 2 skipped in 0.01s =========================
_____________________ TestSkip.test_skips_on_false_string ______________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_skips_on_false_string0
collected 1 item
test_skips_on_false_string.py s [100%]
============================== 1 skipped in 0.01s ==============================
_________________________ TestSkip.test_arg_as_reason __________________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_arg_as_reason0
collected 1 item
test_arg_as_reason.py s [100%]
=========================== short test summary info ============================
SKIPPED [1] test_arg_as_reason.py:2: testing stuff
============================== 1 skipped in 0.01s ==============================
_________________________ TestSkip.test_skip_no_reason _________________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_skip_no_reason0
collected 1 item
test_skip_no_reason.py s [100%]
=========================== short test summary info ============================
SKIPPED [1] test_skip_no_reason.py:2: unconditional skip
============================== 1 skipped in 0.01s ==============================
________________________ TestSkip.test_skip_with_reason ________________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_skip_with_reason0
collected 1 item
test_skip_with_reason.py s [100%]
=========================== short test summary info ============================
SKIPPED [1] test_skip_with_reason.py:2: for lolz
============================== 1 skipped in 0.00s ==============================
_____________________ TestSkip.test_only_skips_marked_test _____________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_only_skips_marked_test0
collected 3 items
test_only_skips_marked_test.py ss. [100%]
=========================== short test summary info ============================
SKIPPED [1] test_only_skips_marked_test.py:2: unconditional skip
SKIPPED [1] test_only_skips_marked_test.py:5: nothing in particular
========================= 1 passed, 2 skipped in 0.01s =========================
________________________ TestSkip.test_strict_and_skip _________________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_strict_and_skip0
collected 1 item
test_strict_and_skip.py s [100%]
=========================== short test summary info ============================
SKIPPED [1] test_strict_and_skip.py:2: unconditional skip
============================== 1 skipped in 0.01s ==============================
______________________ TestSkipif.test_skipif_conditional ______________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_skipif_conditional0
collected 0 items
============================ no tests ran in 0.00s =============================
_________ TestSkipif.test_skipif_reporting["hasattr(sys, 'platform')"] _________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_skipif_reporting0
collected 1 item
test_foo.py s
=========================== short test summary info ============================
SKIPPED [1] test_foo.py:2: condition: hasattr(sys, 'platform')
============================== 1 skipped in 0.00s ==============================
______ TestSkipif.test_skipif_reporting[True, reason="invalid platform"] _______
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_skipif_reporting1
collected 1 item
test_foo.py s
=========================== short test summary info ============================
SKIPPED [1] test_foo.py:2: invalid platform
============================== 1 skipped in 0.01s ==============================
____________________ TestSkipif.test_skipif_using_platform _____________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_skipif_using_platform0
collected 0 items
============================ no tests ran in 0.00s =============================
________ TestSkipif.test_skipif_reporting_multiple[skipif-SKIP-skipped] ________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_skipif_reporting_multiple0
collected 1 item
test_foo.py s
=========================== short test summary info ============================
SKIPPED [1] test_foo.py:2: second_condition
============================== 1 skipped in 0.01s ==============================
________ TestSkipif.test_skipif_reporting_multiple[xfail-XPASS-xpassed] ________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_skipif_reporting_multiple1
collected 1 item
test_foo.py X
=========================== short test summary info ============================
XPASS test_foo.py::test_foobar second_condition
============================== 1 xpassed in 0.01s ==============================
_________________________ test_skip_not_report_default _________________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1 -- /home/swe-bench/miniconda3/envs/pytest-dev__pytest__6.0/bin/python
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_skip_not_report_default0
collecting ... collected 1 item
test_one.py::test_this SKIPPED [100%]
============================== 1 skipped in 0.01s ==============================
______________________________ test_skipif_class _______________________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_skipif_class1
collected 2 items
test_skipif_class.py ss [100%]
============================== 2 skipped in 0.01s ==============================
_______________________ test_skipped_reasons_functional ________________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_skipped_reasons_functional0
collected 3 items
test_one.py sss [100%]
=========================== short test summary info ============================
SKIPPED [2] conftest.py:4: test
SKIPPED [1] test_one.py:14: via_decorator
============================== 3 skipped in 0.01s ==============================
_____________________________ test_skipped_folding _____________________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_skipped_folding0
collected 2 items
test_one.py ss [100%]
=========================== short test summary info ============================
SKIPPED [2] test_one.py: Folding
============================== 2 skipped in 0.01s ==============================
_______________________________ test_reportchars _______________________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_reportchars0
collected 4 items
test_reportchars.py FxXs [100%]
=================================== FAILURES ===================================
____________________________________ test_1 ____________________________________
def test_1():
> assert 0
E assert 0
test_reportchars.py:3: AssertionError
=========================== short test summary info ============================
FAILED test_reportchars.py::test_1 - assert 0
XFAIL test_reportchars.py::test_2
XPASS test_reportchars.py::test_3
SKIPPED [1] test_reportchars.py:11: four
============== 1 failed, 1 skipped, 1 xfailed, 1 xpassed in 0.01s ==============
____________________________ test_reportchars_error ____________________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_reportchars_error0
collected 1 item
test_simple.py .E [100%]
==================================== ERRORS ====================================
________________________ ERROR at teardown of test_foo _________________________
def pytest_runtest_teardown():
> assert 0
E assert 0
conftest.py:2: AssertionError
=========================== short test summary info ============================
ERROR test_simple.py::test_foo - assert 0
========================== 1 passed, 1 error in 0.01s ==========================
_____________________________ test_reportchars_all _____________________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_reportchars_all0
collected 5 items
test_reportchars_all.py FxXsE [100%]
==================================== ERRORS ====================================
___________________________ ERROR at setup of test_5 ___________________________
@pytest.fixture
def fail():
> assert 0
E assert 0
test_reportchars_all.py:14: AssertionError
=================================== FAILURES ===================================
____________________________________ test_1 ____________________________________
def test_1():
> assert 0
E assert 0
test_reportchars_all.py:3: AssertionError
=========================== short test summary info ============================
SKIPPED [1] test_reportchars_all.py:11: four
XFAIL test_reportchars_all.py::test_2
XPASS test_reportchars_all.py::test_3
ERROR test_reportchars_all.py::test_5 - assert 0
FAILED test_reportchars_all.py::test_1 - assert 0
========= 1 failed, 1 skipped, 1 xfailed, 1 xpassed, 1 error in 0.01s ==========
__________________________ test_reportchars_all_error __________________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_reportchars_all_error0
collected 1 item
test_simple.py .E [100%]
==================================== ERRORS ====================================
________________________ ERROR at teardown of test_foo _________________________
def pytest_runtest_teardown():
> assert 0
E assert 0
conftest.py:2: AssertionError
=========================== short test summary info ============================
ERROR test_simple.py::test_foo - assert 0
========================== 1 passed, 1 error in 0.01s ==========================
____________________ test_errors_in_xfail_skip_expressions _____________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_errors_in_xfail_skip_expressions0
collected 3 items
test_errors_in_xfail_skip_expressions.py EE. [100%]
==================================== ERRORS ====================================
_______________________ ERROR at setup of test_nameerror _______________________
name 'asd' is not defined
During handling of the above exception, another exception occurred:
Error evaluating 'skipif' condition
asd
NameError: name 'asd' is not defined
________________________ ERROR at setup of test_syntax _________________________
unexpected EOF while parsing (<xfail condition>, line 1)
During handling of the above exception, another exception occurred:
Error evaluating 'xfail' condition
syntax error
^
SyntaxError: invalid syntax
=========================== short test summary info ============================
ERROR test_errors_in_xfail_skip_expressions.py::test_nameerror
ERROR test_errors_in_xfail_skip_expressions.py::test_syntax
========================= 1 passed, 2 errors in 0.01s ==========================
________________________ test_xfail_skipif_with_globals ________________________
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-6.0.1.dev180+g634cde950, py-1.11.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-swe-bench/pytest-0/test_xfail_skipif_with_globals0
collected 2 items
test_xfail_skipif_with_globals.py sx [100%]
=========================== short test summary info ============================
SKIPPED [1] test_xfail_skipif_with_globals.py:3: condition: x == 3
XFAIL test_xfail_skipif_with_globals.py::test_boolean
condition: x == 3
======================== 1 skipped, 1 xfailed in 0.01s =========================
_____________________________ test_default_markers _____________________________
----------------------------- Captured stdout call -----------------------------
@pytest.mark.filterwarnings(warning): add a warning filter to the given test. see https://docs.pytest.org/en/stable/warnings.html#pytest-mark-filterwarnings
@pytest.mark.skip(reason=None): skip the given test function with an optional reason. Example: skip(reason="no way of currently testing this") skips the test.
@pytest.mark.skipif(condition, ..., *, reason=...): skip the given test function if any of the conditions evaluate to True. Example: skipif(sys.platform == 'win32') skips the test if we are on the win32 platform. See https://docs.pytest.org/en/stable/reference.html#pytest-mark-skipif
@pytest.mark.xfail(condition, ..., *, reason=..., run=True, raises=None, strict=xfail_strict): mark the test function as an expected failure if any of the conditions evaluate to True. Optionally specify a reason for better reporting and run=False if you don't even want to execute the test function. If only specific exception(s) are expected, you can list them in raises, and if the test fails in other ways, it will be reported as a true failure. See https://docs.pytest.org/en/stable/reference.html#pytest-mark-xfail
@pytest.mark.parametrize(argnames, argvalues): call a test function multiple times passing in different arguments in turn. argvalues generally needs to be a list of values if argnames specifies only one name or a list of tuples of values if argnames specifies multiple names. Example: @parametrize('arg1', [1,2]) would lead to two calls of the decorated test function, one with arg1=1 and another with arg1=2.see https://docs.pytest.org/en/stable/parametrize.html for more info and examples.
@pytest.mark.usefixtures(fixturename1, fixturename2, ...): mark tests as needing all of the specified fixtures. see https://docs.pytest.org/en/stable/fixture.html#usefixtures
@pytest.mark.tryfirst: mark a hook implementation function such that the plugin machinery will try to call it first/as early as possible.