
[wasm][debugger] Fix reusing buffer for debugger #59773

Merged 2 commits into dotnet:main on Sep 30, 2021

Conversation

thaystg (Member) commented Sep 29, 2021

When the wasm linear memory has to grow, it invalidates existing views into the heap. To avoid this issue, always create a new view into the heap for the debugger command immediately before use.
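A minimal sketch of the failure mode and the fix, assuming Emscripten's `Module.HEAPU8` convention; the function and variable names here are hypothetical, not the runtime's actual code:

```typescript
declare const Module: { HEAPU8: Uint8Array };

// BUGGY: the view is captured once. When wasm memory grows, the old
// ArrayBuffer is detached and this view goes stale, so later writes are
// silently lost or throw, depending on the engine.
const staleHeap = new Uint8Array(Module.HEAPU8.buffer);
function writeCommandBuggy(bytes: Uint8Array, offset: number): void {
    staleHeap.set(bytes, offset);
}

// FIXED: take a fresh view of the current buffer immediately before use.
function writeCommandFixed(bytes: Uint8Array, offset: number): void {
    const heap = new Uint8Array(Module.HEAPU8.buffer);
    heap.set(bytes, offset);
}
```

The buggy form works until the first time memory grows, which is why the breakage only shows up once the runtime outgrows its initial allocation.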

ghost commented Sep 29, 2021

Tagging subscribers to this area: @thaystg
See info in area-owners.md if you want to be subscribed.

Issue Details

Module.HEAPU8.buffer can be reallocated, so we need to reset _debugger_heap_bytes every time we are about to call its set method.

Author: thaystg
Assignees: -
Labels: arch-wasm, area-Debugger-mono

Milestone: -

lewing (Member) commented Sep 29, 2021

This fix is appropriately minimal under the circumstances, but it might make sense to inline the base64 decode here and avoid an intermediate copy.
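A rough sketch of what that inlining could look like: decode the base64 payload byte by byte into a fresh heap view instead of materializing an intermediate Uint8Array and copying it. The names are hypothetical, and `atob` still allocates a binary string:

```typescript
declare const Module: { HEAPU8: Uint8Array };

// Hypothetical sketch: decode base64 directly into wasm memory, skipping
// the intermediate Uint8Array copy.
function writeBase64ToHeap(base64: string, offset: number): number {
    const heap = new Uint8Array(Module.HEAPU8.buffer); // fresh view, as in the fix above
    const binary = atob(base64);
    for (let i = 0; i < binary.length; i++) {
        heap[offset + i] = binary.charCodeAt(i);
    }
    return binary.length;
}
```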

lewing (Member) commented Sep 29, 2021

@kg any thoughts on a way to catch these more easily in testing?

radical (Member) commented Sep 29, 2021

How was this caught, or how did it fail?

lewing (Member) commented Sep 29, 2021

> How was this caught, or how did it fail?

Caught in CTI testing. Eventually, if the runtime allocates more memory than the current allocation, debugging will stop working (in fairly non-obvious ways).

The underlying issue is emscripten-core/emscripten#12336

lambdageek (Member) commented

> The underlying issue is emscripten-core/emscripten#12336

There's a suggestion in that issue to monkeypatch wasmMemory.grow so we get our own notification about the heap. Any reason we don't want to do that, @lewing @kg?
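A sketch of that monkeypatch, assuming Emscripten exposes the `WebAssembly.Memory` instance as `wasmMemory`; the `onHeapResized` callback is hypothetical:

```typescript
declare const wasmMemory: WebAssembly.Memory;
declare function onHeapResized(): void; // hypothetical: refresh any cached views here

// Wrap grow() so the runtime gets its own notification when the heap is
// reallocated, as suggested in emscripten-core/emscripten#12336.
const originalGrow = wasmMemory.grow.bind(wasmMemory);
wasmMemory.grow = (pages: number): number => {
    const previousPages = originalGrow(pages); // grow() returns the old size in pages
    onHeapResized();
    return previousPages;
};
```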

lewing (Member) commented Sep 30, 2021

> > The underlying issue is emscripten-core/emscripten#12336
>
> There's a suggestion in that issue to monkeypatch wasmMemory.grow so we get our own notification about the heap. Any reason we don't want to do that, @lewing @kg?

It is more a question of what exactly to do with those notifications.

kg (Member) commented Sep 30, 2021

The main problem is that in order to do anything about this, we'd need to make people stop using raw array views so we could wrap the views in a type that does something when the heap is invalidated. Array views are designed to fail silently and weirdly in classic JS fashion, and because JS object indexing isn't customizable, we can't even make a fake array-like object.

We could do something like the root API: store the buffers in a container and make people call .value or .buffer every time, I suppose. I'm not sure that would help in general, though it would have in this case. We probably just need to audit all our code for it - I thought I did that in the past, but I definitely did not see this line.
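A sketch of that container idea, assuming Emscripten repoints `Module.HEAPU8` at the new buffer after growth; the `HeapView` class and its `.value` accessor are hypothetical:

```typescript
declare const Module: { HEAPU8: Uint8Array };

// Hypothetical container: callers never hold a raw view; they fetch .value
// at each use site, which revalidates against the current buffer.
class HeapView {
    private view = new Uint8Array(Module.HEAPU8.buffer);

    get value(): Uint8Array {
        // After memory.grow the old buffer is detached and Module.HEAPU8
        // points at the new one, so compare buffers and recreate the view.
        if (this.view.buffer !== Module.HEAPU8.buffer) {
            this.view = new Uint8Array(Module.HEAPU8.buffer);
        }
        return this.view;
    }
}

// Usage: debuggerHeap.value.set(bytes, offset) instead of caching a view.
const debuggerHeap = new HeapView();
```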

lewing merged commit 8e9ab9f into dotnet:main on Sep 30, 2021
thaystg added a commit to thaystg/runtime that referenced this pull request Oct 6, 2021

> But I'm not using Uint8Array to set the content on memory allocated using malloc, as @lewing suggested in the same PR.
lewing pushed a commit that referenced this pull request Oct 7, 2021

> * Creating a test case for PR #59773. But I'm not using Uint8Array to set the content on memory allocated using malloc, as @lewing suggested in the same PR.
> * Add lines at the end of the file.
> * Change the function that allocates memory.
> * Add a comment as suggested by @kg.
> * Accept @lewing's suggestion.
ghost locked as resolved and limited conversation to collaborators Nov 3, 2021