This repository has been archived by the owner on Oct 28, 2022. It is now read-only.

Memory leak on repeated connection/disconnection #24

Open
crodriguezvega opened this issue Dec 16, 2015 · 0 comments

Comments

@crodriguezvega

Hi,

I am running the ReactiveServer sample in the following scenario: I have a test application with a loop; on each iteration a ReactiveClient is created, connects to the server, waits 200 ms, and then disconnects. The client does not send any data to the server.
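To give a clearer picture, the test loop is essentially the following (a minimal sketch; the host, port, and member names such as ConnectAsync, Disconnect, and Dispose are assumptions based on the ReactiveSockets client sample and may not match my actual code exactly):

```csharp
using System;
using System.Threading;
using ReactiveSockets;

class TestClientLoop
{
    static void Main()
    {
        for (var i = 0; i < 1000; i++)
        {
            // Host and port are assumptions; they should match whatever the ReactiveServer sample listens on.
            var client = new ReactiveClient("127.0.0.1", 1055);

            client.ConnectAsync().Wait(); // connect to the running ReactiveServer sample
            Thread.Sleep(200);            // stay connected for 200 ms without sending any data
            client.Disconnect();          // disconnect from the server
            client.Dispose();             // release the underlying socket
        }
    }
}
```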

What I see in the Visual Studio memory profiler is that the server's memory usage grows linearly while the test application runs. When the test application stops, the server's memory usage does not drop; it stays at whatever level it reached.

The memory leak does not seem to happen (or at least not as clearly, with linear growth of memory usage) when I run the ReactiveServer sample without the code that instantiates the StringChannel class and subscribes to the Receiver. For that reason, I suspect the leak is related to the subscription to the protocol's Receiver not being properly disposed when the client disconnects. I have read that under normal circumstances (if the observable sequence completes) the subscription is disposed automatically, but if the sequence never completes, it is not. I have tried disposing the subscription in the Disconnected event handler, but I did not see any noticeable improvement.
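For reference, the relevant server-side code and my disposal attempt look roughly like this (a sketch based on the ReactiveServer sample; member names such as ReactiveListener.Connections, StringChannel.Receiver, and the socket's Disconnected event are assumptions drawn from the library's samples):

```csharp
using System;
using ReactiveSockets;

class ServerSketch
{
    static void Main()
    {
        var server = new ReactiveListener(1055); // assumed port, matching the client sketch above
        server.Connections.Subscribe(socket =>
        {
            var protocol = new StringChannel(socket);

            // The subscription I suspect is never released when a client disconnects.
            IDisposable subscription = protocol.Receiver.Subscribe(
                message => Console.WriteLine("Received: " + message));

            // Attempted fix: dispose the subscription when the client disconnects.
            socket.Disconnected += (sender, args) => subscription.Dispose();
        });
        server.Start();

        Console.ReadLine(); // keep the server running
    }
}
```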

Any ideas or suggestions are greatly appreciated.

Thanks.
