AssertMemberExposure causing flaky test failures #52368
It looks like the stack trace is incomplete, but I'm not able to find a complete stack trace in the build logs or artifacts.
Note that this means we're seeing a … Can we disable this test for now and try running it locally to see if we can get a repro?
I think this won't help much. It's not the only test failing with this exception and stack trace. See #52372
As Jared mentioned, the stack here is incomplete. Everything goes through …
The other failure is at least giving us a clue where the problem is.

That failure happened on Windows .NET Framework, whereas this bug happened on Linux .NET Core. The reporting diagnostics are just capturing more data in the .NET Framework case. Looks like we need to dig into the …
@jaredpar I think this assert is different. There is enough "----- Inner Stack Trace -----" to see that they are different.
@AlekseyTs my suspicion is that the difference in stack traces is due to the difference in how .NET Framework and .NET Core handle trace listeners, particularly on Linux. I've seen similar stack differences before when debugging the same failure across the runtimes. I'm not 100% sure this is the case though; it's just a suspicion at this point. In either case, the stack from #52372 is actionable. Hence I'm going to dig into that for a bit and see if I can get a bug fix out. If this failure stops reproducing after that fix is in, I'll feel more confident in my suspicion here.
Makes sense.
Should have a PR out for #52372 shortly.
* Fix race conditions in attribute asserts

  There are asserts in our `CustomAttributeBag` to verify we maintain the invariant that early decoding occurs before the full decoding. The asserts, though, are done in the getters and take the following form:

  ```cs
  bool earlyComplete = IsPartComplete(CustomAttributeBagCompletionPart.EarlyDecodedWellKnownAttributeData);

  // If late attributes are complete, early attributes must also be complete
  Debug.Assert(!IsPartComplete(CustomAttributeBagCompletionPart.DecodedWellKnownAttributeData) || earlyComplete);
  return earlyComplete;
  ```

  This pattern is subject to race conditions. Consider the case where the `bool earlyComplete` statement runs to completion and returns `false`. Then another thread swaps in and completes both early and full decoding. At that point the original thread resumes and the `Debug.Assert` will fail, yet no invariant has been violated.

  Moved the `Debug.Assert` into the setters where we can reliably test the state invariants.

  closes #52372
  related to #52368

* Fix

* Add back an assert
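To make the race concrete, here is a minimal sketch of the pattern the PR describes. All names here are illustrative, not the actual Roslyn `CustomAttributeBag` code, and the bit-flag layout is an assumption:

```cs
// Hypothetical reduction of the getter-side assert race. Two completion
// parts are recorded as bits in a single int, as the commit message implies.
using System.Diagnostics;
using System.Threading;

class AttributeBag
{
    private const int EarlyDecoded = 1;
    private const int FullyDecoded = 2;
    private int _state;

    public bool IsEarlyDecodingComplete()
    {
        bool earlyComplete = (Volatile.Read(ref _state) & EarlyDecoded) != 0;

        // RACY: another thread can complete both parts between the read
        // above and this second read, so "fully decoded" can be observed
        // together with a stale earlyComplete == false, firing the assert
        // even though the invariant was never actually violated.
        Debug.Assert((Volatile.Read(ref _state) & FullyDecoded) == 0 || earlyComplete);
        return earlyComplete;
    }

    public void CompleteEarlyDecoding() => Interlocked.Or(ref _state, EarlyDecoded);

    public void CompleteFullDecoding()
    {
        // Setter-side assert: checked at the single point where the state
        // transition happens, so no interleaving can make it fire spuriously.
        Debug.Assert((Volatile.Read(ref _state) & EarlyDecoded) != 0);
        Interlocked.Or(ref _state, FullyDecoded); // Interlocked.Or requires .NET 5+
    }
}
```

The key point of the fix is that the getter performs two separate reads of shared state, while the setter observes the state at the exact moment of the transition it guards.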
@jaredpar Looks like it's not fixed 😕 Another test failed with a similar exception today in #50608.
I agree it's still happening. Unfortunately all the failures have the same lack of info that makes this hard to debug. I think we may need to run this in a loop locally to see where it's failing.
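The loop approach can be sketched like this. The `run_once` function is a placeholder for the actual test invocation (which the thread doesn't specify); here it simulates a test that passes three times and then fails, so the script is self-contained:

```shell
# Sketch of "run the flaky test in a loop until the assert fires".
# run_once is a stand-in for the real test command; it passes 3 times,
# then fails, simulating a flaky test.
attempts_left=3
run_once() {
  attempts_left=$((attempts_left - 1))
  [ "$attempts_left" -ge 0 ]
}

i=0
while run_once; do
  i=$((i+1))
done
echo "failure reproduced after $i passing runs"
# prints: failure reproduced after 3 passing runs
```

In practice `run_once` would invoke the single failing test with a filter, and the loop would be left running until the `Debug.Assert` fires under a debugger.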
This hit the failure today on Windows Desktop. The stack trace was a bit better this time. Pretty clear this is the assert in:

```cs
var declared = Volatile.Read(ref _lazyDeclaredMembersAndInitializers);
Debug.Assert(declared != DeclaredMembersAndInitializers.UninitializedSentinel);
```

The test failing here is `CS0267ERR_PartialMisplaced_Delegate1`. The test is pretty straightforward. It's just asserting diagnostics on the following declaration:

```cs
partial delegate E { }
```

The original bug report occurred against `LocalFunctionStatement_04`, which also has a top level … @AlekseyTs does this ring any bells? The …
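For context, the failing assert guards a sentinel-based lazy-initialization pattern. A minimal sketch of that pattern, using illustrative names rather than the actual Roslyn types:

```cs
// Hypothetical reduction of the sentinel-guarded lazy field the assert
// protects. The invariant: readers only arrive after the computed value
// has been published over the sentinel.
using System.Diagnostics;
using System.Threading;

class TypeMembers
{
    private static readonly object UninitializedSentinel = new object();
    private object _lazyDeclaredMembersAndInitializers = UninitializedSentinel;

    public object GetDeclaredMembers()
    {
        var declared = Volatile.Read(ref _lazyDeclaredMembersAndInitializers);

        // If any code path can reach this read before publication, it still
        // sees the sentinel and the assert fires, even though the value
        // would have been computed correctly moments later.
        Debug.Assert(declared != UninitializedSentinel);
        return declared;
    }

    public void Publish(object computed)
    {
        Volatile.Write(ref _lazyDeclaredMembersAndInitializers, computed);
    }
}
```

Under this reading, a flaky failure would point at an ordering bug between the code path that computes and publishes the members and the path that asserts on the read, rather than at a wrong computed value.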
@jaredpar Can we "instrument" the …
PR #52849 added …
Duplicate of #51195 |
I marked this one as a duplicate of #51195 since that issue is the one tracked in runfo |
@AlekseyTs have a heap dump of this failure now. When you click on that link it will download a file with a number extension. You need to rename it to have a …

Containing build: https://dev.azure.com/dnceng/public/_build/results?buildId=1113392

Note: this is a core dump from Linux but that shouldn't impact the VS experience (tested locally).
https://dev.azure.com/dnceng/public/_build/results?buildId=1068163&view=ms.vss-test-web.build-test-results-tab&runId=32851380&resultId=122402&paneView=debug