i8 and u8 not correctly shown in MSVC debugger #36646
Comments
Please forget what I first wrote here. I had an outdated PDB. Still, it led me on the right track:
When I manually patch the PDB to use the following instead, it works better (it shows the value, but it interprets it as an ASCII character, which it should not ... ideally it should probably show the real Rust type names, right?):
The question remains where these type indices are coming from ...
Looks like it's coming from the translation of DWARF to CodeView debug info in LLVM, so it's really an LLVM issue: https://github.com/llvm-mirror/llvm/blob/864e0ffb0eb47f848aca60d1f08fc2962fbb5009/lib/CodeGen/AsmPrinter/CodeViewDebug.cpp#L1223-L1240
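For anyone following that link, here is a rough Rust-flavored paraphrase of what the lowering does for 1-byte integer types. The function is my illustration, not the LLVM C++ source; the 0x68/0x69 values are the CodeView simple type indices (T_INT1/T_UINT1) as I understand them from cvinfo.h:

```rust
// CodeView simple type indices for 8-bit integers (values as given in
// cvinfo.h; treat them as my reading of the format, not gospel):
const T_INT1: u32 = 0x0068;  // 8-bit signed integer
const T_UINT1: u32 = 0x0069; // 8-bit unsigned integer

// Illustrative paraphrase of the lowering: a DWARF base type with byte
// size 1 and a plain signed/unsigned integer encoding is mapped to
// T_INT1 or T_UINT1, which is what ends up in the PDB for Rust's i8/u8.
fn lower_one_byte_basic(is_signed: bool) -> u32 {
    if is_signed { T_INT1 } else { T_UINT1 }
}

fn main() {
    println!("i8 -> {:#06x}, u8 -> {:#06x}",
             lower_one_byte_basic(true),
             lower_one_byte_basic(false));
}
```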
@Boddlnagg That's some nice detective work there! From what I see in ...
OK, so MSVC seems to emit char types for those:
I'd still consider it a bug in the MSVC debugger rather than in LLVM.
I agree that the MSVC debugger behavior is unexpected, but LLVM has already changed a usage of T_INT8 to T_QUAD (both represent 64-bit integers), probably because of a similar issue, so it looks like the T_INTx types are just not supported very well by the debugger. The debugger module for using GDB directly inside VS (https://github.com/Microsoft/MIEngine) does the same thing, by the way (although coming from the opposite direction, since it uses the DWARF data directly); you can see this when using VisualRust and compiling with the GNU toolchain:
I submitted a bug with LLVM: https://llvm.org/bugs/show_bug.cgi?id=30552
The VS debugger doesn't appear to understand the 0x68 or 0x69 type indices, which were probably intended for use on a platform where a C 'int' is 8 bits. So, use the character types instead. Clang was already using the character types because '[u]int8_t' is usually defined in terms of 'char'. See the Rust issue for screenshots of what VS does: rust-lang/rust#36646 Fixes PR30552 git-svn-id: https://llvm.org/svn/llvm-project/llvm/trunk@282739 91177308-0d34-0410-b5e6-96231b3b80d8
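For anyone mapping the commit message back to type indices: my reading is that the effect is roughly the following; this is a paraphrase, not the actual LLVM diff.

```rust
// CodeView simple type indices for the character types (per cvinfo.h,
// to the best of my knowledge):
const T_CHAR: u32 = 0x0010;  // signed char
const T_UCHAR: u32 = 0x0020; // unsigned char

// After the fix, 1-byte signed/unsigned integers are lowered to the
// character type indices, which the VS debugger does understand,
// instead of T_INT1 (0x0068) / T_UINT1 (0x0069).
fn lower_one_byte_basic_after_fix(is_signed: bool) -> u32 {
    if is_signed { T_CHAR } else { T_UCHAR }
}
```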
A fix has landed in LLVM :)
I consider this fixed.
Reopening since the LLVM fix has not propagated to our fork.
Hi guys, any indication whether a fix will make it into 1.15 or not? I'm not sure what the policy is on updating Rust's mirror of LLVM. If someone can help me, I can do the rebase or merge. Or do we wait for the next official release of LLVM to come out before we do the merge?
As per information provided by @brson, the upgrade to the next version of LLVM is in progress. The status can be tracked here: #37609
#37609 has been closed. Can this issue be closed now too?
Cool, thanks @Boddlnagg!
Compiling the following program with the MSVC toolchain and then running it in the MSVC debugger shows that there is some problem with inspecting values of type `i8` and `u8` (this was originally found while investigating #36503):
The screenshot shows that the problem also appears for `&[u8]` and `&[i8]`, where the debugger can't show the pointer (`data_ptr`).
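The original code snippet and screenshot are not preserved in this text, but a minimal program in the same spirit (names are mine, not the original) is enough to reproduce the debugger view described above:

```rust
// Minimal reproduction sketch (not the original snippet from the report).
// Build with the MSVC toolchain (e.g. on x86_64-pc-windows-msvc), set a
// breakpoint on the println! line, and inspect the locals in the Visual
// Studio debugger.
fn main() {
    let signed: i8 = -42;
    let unsigned: u8 = 200;
    let signed_slice: &[i8] = &[-1, 0, 1];
    let unsigned_slice: &[u8] = &[1, 2, 3];

    // Keep the values alive so they are still visible at the breakpoint.
    println!("{} {} {:?} {:?}", signed, unsigned, signed_slice, unsigned_slice);
}
```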