This repository has been archived by the owner on Nov 17, 2023. It is now read-only.

Flaky test: test_lstm_clip #14994

Closed

eric-haibin-lin opened this issue May 18, 2019 · 10 comments

Comments

@eric-haibin-lin
Member

test_gluon_gpu.test_lstm_clip

======================================================================
FAIL: test_gluon_gpu.test_lstm_clip
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib/python3.5/dist-packages/nose/case.py", line 198, in runTest
    self.test(*self.arg)
  File "/work/mxnet/tests/python/gpu/../unittest/common.py", line 177, in test_new
    orig_test(*args, **kwargs)
  File "/work/mxnet/tests/python/gpu/../unittest/common.py", line 110, in test_new
    orig_test(*args, **kwargs)
  File "/work/mxnet/tests/python/gpu/test_gluon_gpu.py", line 163, in test_lstm_clip
    assert (cell_states >= clip_min).all() and (cell_states <= clip_max).all()
AssertionError:
-------------------- >> begin captured logging << --------------------
common: INFO: Setting test np/mx/python random seeds, use MXNET_TEST_SEED=551104712 to reproduce.
--------------------- >> end captured logging << ---------------------


http://jenkins.mxnet-ci.amazon-ml.com/blue/organizations/jenkins/mxnet-validation%2Funix-gpu/detail/PR-14935/6/pipeline

https://github.com/apache/incubator-mxnet/pull/14935/files
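For context, the failing assertion checks that every LSTM cell state stays inside the configured clip range. Below is a minimal, self-contained sketch of that invariant using NumPy only; the array values and variable names here are illustrative assumptions, not data from the actual MXNet test run.

```python
import numpy as np

# Hypothetical clip bounds, standing in for the ones used by test_lstm_clip.
clip_min, clip_max = -5.0, 5.0

# Simulated raw cell states before clipping (illustrative values only).
raw_states = np.array([-7.2, -3.1, 0.0, 4.9, 6.3])

# State clipping bounds every value into the closed interval
# [clip_min, clip_max].
cell_states = np.clip(raw_states, clip_min, clip_max)

# The same assertion the test performs: all cell states within bounds.
assert (cell_states >= clip_min).all() and (cell_states <= clip_max).all()
```

If the clipping step were skipped (or applied only to some states), the final assertion would raise `AssertionError` exactly as in the captured traceback above.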

@mxnet-label-bot
Contributor

Hey, this is the MXNet Label Bot.
Thank you for submitting the issue! I will try to suggest some labels so that the appropriate MXNet community members can help resolve it.
Here are my recommended labels: Test, Flaky

@lanking520
Member

lanking520 commented May 24, 2019

@mxnet-label-bot add [flaky, test]

@aaronmarkham
Contributor

@leezu
Contributor

leezu commented Nov 4, 2020

The flaky test is now annotated in source via @pytest.mark.flaky. It may be easier to track flaky tests by grepping the source, so we can close this issue.

@leezu leezu closed this as completed Nov 4, 2020
9 participants