I am running the ReactiveServer sample in the following scenario: I have a test application with a loop; on each iteration a ReactiveClient is created, connects to the server, waits 200 ms, and then disconnects. The client never sends any data to the server.
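Roughly, the test loop looks like this. This is only a sketch, not my actual test application: the host, port, and iteration count are placeholders, and I am assuming the ConnectAsync/Disconnect API shape shown in the ReactiveSockets samples.

```csharp
using System;
using System.Threading.Tasks;
using ReactiveSockets;

class Program
{
    static async Task Main()
    {
        // Repeatedly connect and disconnect; no data is ever sent.
        for (var i = 0; i < 1000; i++)
        {
            var client = new ReactiveClient("127.0.0.1", 1055); // placeholder host/port
            await client.ConnectAsync();

            // Hold the connection briefly, then drop it.
            await Task.Delay(200);
            client.Disconnect();
        }
    }
}
```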
What I see in the Visual Studio memory profiler is that the server's memory usage grows linearly while my test application runs. When the test application stops, the server's memory usage does not drop; it stays at whatever level it reached.
The memory leak does not seem to occur (or at least not as clearly, with linear growth in memory usage) when I run the ReactiveServer sample without the code that instantiates the StringChannel class and subscribes to its Receiver. For that reason, I suspect the leak is related to the subscription to the protocol's Receiver not being disposed properly when the client disconnects. I have read that the subscription is disposed automatically if the observable sequence completes, but not if the sequence never completes. I tried disposing the subscription in the Disconnected event handler (see the sketch below), but that brought no noticeable improvement.
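For reference, this is roughly the server-side code in question, including the attempted fix. Again, this is a sketch based on the ReactiveServer sample rather than a verbatim copy; the port is a placeholder, and the exact Disconnected wiring is my assumption.

```csharp
using System;
using ReactiveSockets;

class Program
{
    static void Main()
    {
        var server = new ReactiveListener(1055); // placeholder port

        server.Connections.Subscribe(socket =>
        {
            // Wrap the raw socket in the string protocol from the sample.
            var protocol = new StringChannel(socket);

            // Suspected leak: Receiver never completes for an idle client,
            // so this subscription is never disposed automatically.
            IDisposable subscription = protocol.Receiver.Subscribe(
                message => Console.WriteLine(message));

            // Attempted fix: dispose the subscription when the client
            // disconnects. It did not noticeably change the memory profile.
            socket.Disconnected += (sender, args) => subscription.Dispose();
        });

        server.Start();
        Console.ReadLine();
    }
}
```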
Any ideas or suggestions are greatly appreciated.
Thanks.