Allow multiple simultaneous requests to be sent to OmniSharp server #902

Merged · 6 commits merged into dotnet:master on Nov 11, 2016

Conversation

DustinCampbell (Member)

Communication with the OmniSharp server happens in a very serial fashion: make a request and wait for the response. This is done to ensure that critical communication, such as updating a buffer during typing, is processed quickly. However, that means that we'll wait on extremely long running requests that don't need to happen serially, such as waiting for all diagnostics to be returned for a solution.

This change splits requests into three queues based on the type of request. High priority requests, such as buffer changes, are still handled serially. Most other requests are placed on a queue that is allowed to make up to 8 simultaneous requests at a time. Other requests (especially long-running requests) are placed on a "deferred" queue that allows a much smaller number of simultaneous requests (maximum 2).
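
Roughly, the scheme described above might look like the following sketch. This is illustrative only; names like BoundedQueue, sendRequest, and maxConcurrency are not the PR's actual identifiers.

interface Request {
    command: string;
    onSuccess(value: any): void;
    onError(err: any): void;
}

// Placeholder transport: a real client would write the request to the
// OmniSharp server and resolve once the matching response packet arrives.
async function sendRequest(request: Request): Promise<void> {
    request.onSuccess(undefined);
}

class BoundedQueue {
    private pending: Request[] = [];
    private inFlight = 0;

    constructor(private maxConcurrency: number) { }

    public enqueue(request: Request) {
        this.pending.push(request);
        this.drain();
    }

    // Dispatch as many pending requests as the concurrency cap allows.
    public drain() {
        while (this.inFlight < this.maxConcurrency && this.pending.length > 0) {
            const request = this.pending.shift()!;
            this.inFlight++;
            sendRequest(request).then(() => {
                this.inFlight--;
                this.drain(); // a slot freed up; pull the next pending request
            });
        }
    }
}

// Priority requests (e.g. buffer changes) stay strictly serial; most
// requests allow up to 8 in flight; long-running "deferred" requests
// are capped at 2, as described above.
const priorityQueue = new BoundedQueue(1);
const normalQueue = new BoundedQueue(8);
const deferredQueue = new BoundedQueue(2);

Keeping the priority queue at a concurrency of 1 preserves the ordering guarantee that buffer updates rely on, while the other two caps bound how much slow work can run concurrently.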

cc @david-driscoll

    protocol.Requests.TypeLookup
];

const prioritySet = new Set<string>(priorityCommands);
Contributor

This code looks awfully familiar. :D

let startTime: number;
let request: Request;

let promise = new Promise<TResponse>((resolve, reject) => {
    startTime = Date.now();

    request = {
        path,
        command,
        data,
        onSuccess: value => resolve(value),
Contributor

Can you not just pass in resolve and reject as reference values? Then you'd have one less closure.

Contributor

It might require the dreaded <any>, but the extra closure (promises already give you one) seems unnecessary.

Member Author

If I recall, this caused a tslint error at one point, but I'll take a look.

Member Author

Yeah, it does confuse tslint if I just pass the functions without creating lambdas -- "A Promise was found that appears to not have resolve or reject invoked on all code paths".

I suppose it's been this way for several months already. I'll leave it as is for now.
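
For illustration, the two shapes under discussion look roughly like this (a hedged sketch; the paths and values are hypothetical). At runtime the two behave identically; per the comment above, tslint can only follow the second:

interface Request {
    path: string;
    command: string;
    data: any;
    onSuccess(value: any): void;
    onError(err: any): void;
}

let request: Request;

// Direct pass: resolve/reject are stored as references. tslint cannot
// see that they are invoked later, so it reports "A Promise was found
// that appears to not have resolve or reject invoked on all code paths".
let direct = new Promise<any>((resolve, reject) => {
    request = { path: '/foo', command: 'bar', data: null, onSuccess: resolve, onError: reject };
});

// Extra lambdas: resolve/reject are invoked inside closures created in
// the executor, which satisfies the linter at the cost of one extra
// closure per callback.
let wrapped = new Promise<any>((resolve, reject) => {
    request = {
        path: '/foo',
        command: 'bar',
        data: null,
        onSuccess: value => resolve(value),
        onError: err => reject(err)
    };
});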

- if (this._getState() === ServerState.Started && !this._isProcessingQueue) {
-     this._processQueue();
+ if (this._getState() === ServerState.Started) {
+     this._requestQueue.drain();
Contributor

enqueue already calls drain under the covers, so this seems like it might not be needed.

Member Author

You're right -- thanks!
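
Restated as a hypothetical call site (queue and request are illustrative, not the PR's code):

// enqueue already ends by draining, so an explicit drain right after
// enqueueing does nothing extra.
queue.enqueue(request); // internally calls queue.drain()
// queue.drain();       // redundant, per the review comment above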

    request.onSuccess(packet.Body);
}
else {
    request.onError(packet.Message || packet.Body);
Contributor

After the request is processed, you should drain the queue here as well.

The scenario I envision: if there are buffer updates in the queue, they will be processed and finish, but any items added to the other queues will not be processed. The node client has a complete callback (which is terribly named; it's really queueDrained() or something).
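
A sketch of the suggestion, assuming a response handler shaped like the excerpt above (the function and parameter names here are hypothetical, not the PR's final code):

interface ResponsePacket {
    Success: boolean;
    Message?: string;
    Body?: any;
}

interface Request {
    onSuccess(value: any): void;
    onError(err: any): void;
}

function handleResponse(packet: ResponsePacket, request: Request,
                        requestQueue: { drain(): void }) {
    if (packet.Success) {
        request.onSuccess(packet.Body);
    }
    else {
        request.onError(packet.Message || packet.Body);
    }

    // Drain after completing the request, not only on enqueue, so that
    // items parked behind the finished request are dispatched even when
    // nothing new is enqueued afterwards.
    requestQueue.drain();
}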

Member Author

Makes sense -- thanks!

@DustinCampbell DustinCampbell merged commit d96fb0c into dotnet:master Nov 11, 2016
@DustinCampbell DustinCampbell deleted the concurrent-server branch January 31, 2017 20:45