Performance of 'http' + anonymous functions eventually slows down #3659
Might not be related, but can you run
@alexlamsl but that would not explain why it performs OK without IIFE or promises.
@SynerG Could you also log server memory usage?
@SynerG Also,
Ok, I ran it with
@YuriSolovyov, I think you could also reproduce the issue on your machine, but you will probably need to run the script for a longer period of time.
@alexlamsl, I have run
@ChALkeR, regarding memory usage:
Also, the state of the node process according to
So it looks like the implementation creates a memory leak, although the VM still has free memory even after performance has degraded.
I wonder if you can build node with this and try again, just in case.
@YuriSolovyov, I am not familiar with the process to rebuild Node.js with a different version of v8.
Is this the correct process?
@SynerG It is not the easiest process. First you can try to directly apply the patch. Download it from https://gist.github.com/targos/b057bbaf1fd1589d4dba. If you are lucky, it will apply cleanly.
Since the problem is 100% reproducible given the excellent test case @SynerG provided, any contributor could test out any patch they wish.
That patch does actually seem to prevent the large drop, but the average seems to be a bit lower right off the bat. On the master branch:
Before:
After:
@evanlucas I was blinded by this too; try to run it for at least 5-10 mins.
@targos thanks for the tip. I was already rebuilding node the hard way when I read your comment. I have rebuilt node with the latest version of v8 in the repo. It should already contain the desired commit, as there is a path between the two according to git:
On my OS X development machine, the patch doesn't solve the problem:
@evanlucas could you keep running the test for longer? It could still eventually reach the point where it drops.
Another data point: I tried out the (no Promise) test case on Mac with node v5.0.0 and got the same throughput drop-off numbers as @SynerG at the top of the ticket: 2400 requests/sec => 900 requests/sec after 5 minutes.
Then, for the heck of it, I tried running the test case on jxcore, a node port using the Mozilla JS engine (not v8):
After an hour I did not see any drop-off:
which suggests it may be a v8 issue.
@kzc good point using
It may be a v8 issue, but I have been unable to reproduce the problem using only v8 features. For example, using the following script:

```js
// An attempt to replace http.createServer and trigger continuous calls
(function triggerCall(){
  callbackValue(function(value){
    statistics();
    process.nextTick(triggerCall);
  });
})();

// Same function as before
function callbackValue(cb) {
  (function() {
    var j = 0;
    for (var i = 0, l = 300000; i < l; i++) {
      j = j + 2;
    }
    cb("Hi " + j);
  })();
}

// Print basic statistics periodically
var startTime = (new Date()).getTime();
var lastPrintTime = startTime;
var MIN_ELAPSED_TIME = 5000;
var currentTime, elapsedTime;
var callsCounter = 0;
function statistics(){
  callsCounter++;
  currentTime = (new Date()).getTime();
  elapsedTime = currentTime - lastPrintTime;
  if (elapsedTime >= MIN_ELAPSED_TIME){
    console.log(callsCounter/(MIN_ELAPSED_TIME/1000) + " calls/sec\t\t " + (currentTime-startTime)/1000 + " seconds up");
    callsCounter = 0;
    lastPrintTime = currentTime;
  }
}
```

The snippet doesn't use
This alternative to replace

```js
function mockServerF(fn){
  fn();
}

var mockServer = mockServerF.bind(null, function(){
  callbackValue(function(value){
    statistics();
    process.nextTick(mockServer);
  });
});

mockServer();
```

If we were able to come up with a snippet that triggered the problem without
Is somebody able to come up with a good script for this?
@obastemur would be the one to answer that question.
However I have tested the same scenario using
Taking @kzc's and @obastemur's tests together, it looks like the problem is not caused by
My guess is that
This is only a theory, but it is based on the following observed facts:
@SynerG I wonder how the number of iterations in the for loop affects this. Maybe at some point V8 decides that it is growing too large, but that would be weird, since it should be destroyed anyway after the IIFE call.
Let's show the performance values using optimizable and non-optimizable versions of the function, as well as the performance values with the anonymous function during the 1st phase (high performance) and 2nd phase (low performance) for different values of

This is the optimizable variant:

```js
function fNamedHighPerformance() {
  var j = 0;
  for (var i = 0, l = 30000; i < l; i++) { j = j + 2; }
  return "Hi " + j;
}

function promiseValue() {
  return Promise.resolve().then(fNamedHighPerformance);
}
```

This is the non-optimizable variant:

```js
function fNamedLowPerformance() {
  try { } catch (e) { }
  var j = 0;
  for (var i = 0, l = 30000; i < l; i++) { j = j + 2; }
  return "Hi " + j;
}

function promiseValue() {
  return Promise.resolve().then(fNamedLowPerformance);
}
```

And the results:

Legend:

Almost the same performance numbers are obtained using the optimizable function as during the 1st phase, and using the non-optimizable function as during the 2nd phase, for any chosen number of iterations. (Except for l = 3000, where the optimizable function provides substantially higher performance, but at that point it could be a matter of the performance of named vs inline-anonymous functions rather than optimized vs non-optimized.) At some point,
cc @trevnorris maybe?
Ok, no more implying an optimized/non-optimized state,
First, the implementation to check the optimization state:

```js
var http = require('http');

// Nothing has changed here
var srv = http.createServer(function (req, res) {
  promiseValue().then(function(value){
    res.writeHead(200, {'Content-Type': 'text/plain'});
    res.end(value);
  });
});

srv.listen(3000, '127.0.0.1', function() {
  "use strict";
  console.log("Server listening on 127.0.0.1:3000");
});

// Improved: query optimization status of the internal function
function promiseValue() {
  return Promise.resolve().then(function fNamed() {
    var j = 0;
    for (var i = 0, l = 300000; i < l; i++) { j = j + 2; }
    printStatus(%GetOptimizationStatus(fNamed));
    return "Hi " + j;
  });
}

// Print the optimization status
var lastPrintTime = Date.now();
var MIN_ELAPSEDTIME = 2000;
function printStatus(s){
  if ((Date.now() - lastPrintTime) >= MIN_ELAPSEDTIME){
    console.log(verbalizeStatus(s));
    lastPrintTime = Date.now();
  }
}

function verbalizeStatus(s) {
  var state = "UNKNOWN";
  switch (s) {
    case 1:
      state = "OPTIMIZED";
      break;
    case 2:
      state = "NOT optimized";
      break;
    case 3:
      state = "ALWAYS optimized";
      break;
    case 4:
      state = "NEVER optimized";
      break;
    case 6:
      state = "MAYBE deoptimized";
      break;
  }
  return "function is " + state;
}
```

And the results. When this happens in the
This happens in the server window:
So
At this point, some relevant questions:
I would like to also cc @petkaantonov, who has done awesome work with Bluebird keeping it within
The key factor may be the relative size and complexity of the closure. If the innermost anonymous function calls a regular function for the computation, then the throughput does not degrade on node 5.0.0 after an hour. Throughput remains relatively constant at around 2,300 requests/second. Keep in mind that V8 may have optimized away the for loop altogether.

```js
var http = require('http');

var srv = http.createServer(function (req, res) {
  promiseValue().then(function(value){
    res.writeHead(200, {'Content-Type': 'text/plain'});
    res.end(value);
  });
});

function promiseValue() {
  return Promise.resolve().then(function() {
    var j = compute(0);
    return "Hi " + j;
  });
}

function compute(j) {
  for (var i = 0, l = 300000; i < l; i++) { j = j + 2; }
  return j;
}

srv.listen(3000, '127.0.0.1', function() {
  "use strict";
  console.log("Server listening on 127.0.0.1:3000");
});
```
@kzc the reason could indeed be related to size and complexity. The function does not use a high number of variables (which could complicate the closure), nor does it have a high cyclomatic complexity, but we generate a huge number of calls to the function (although I don't know if
Your proposal shows a pattern that could be used to minimize the impact of unexpected deoptimization. In practice, it refactors the instructions that require more computation into an external named function that, as tested previously, does not get unpredictably deoptimized. However, the internal anonymous function becomes so light that a sudden change in the optimization strategy from optimized to deoptimized would hardly be visible.

Here is a modified version to query

```js
function promiseValue() {
  return Promise.resolve().then(function fNamed() {
    var j = compute(0);
    printStatus(%GetOptimizationStatus(fNamed));
    return "Hi " + j;
  });
}
```

When invoked with
We get this from the beginning:
So, the function becomes so simple that
Testing different values of
In the cases where the function is never optimized, or battles between optimized/non-optimized from the beginning, we would not find a sudden degradation of performance at an unexpected point in time in the future. But why does an originally optimized (complex-enough) function become non-optimized some time later? (It could probably be related to complexity/closure factors.)
@SynerG Once again: could this be reproduced with
@ChALkeR, yes it can be reproduced. Including

```js
function callbackValue(cb) {
  (function internalFunction() {
    var j = 0;
    for (var i = 0, l = 300000; i < l; i++) { j = j + 2; }
    printStatus(%GetOptimizationStatus(internalFunction));
    cb("Hi " + j);
  })();
}
```

After 115 seconds running (in this case), I found this:
It is sad that V8 has
Looks like it does not regress if you flatten the callbacks:

```js
var http = require('http');

var callbackValue = function(cb, res) {
  (function() {
    var j = 0;
    for (var i = 0, l = 300000; i < l; i++) {
      j = j + 2;
    }
    cb("Hi " + j, res);
  })();
};

var responseWriter = function(value, res) {
  res.writeHead(200, {'Content-Type': 'text/plain'});
  res.end(value);
};

var reqHandler = function (req, res) {
  callbackValue(responseWriter, res);
};

var srv = http.createServer(reqHandler);

srv.listen(3000, '127.0.0.1', function() {
  "use strict";
  console.log("Server listening on 127.0.0.1:3000");
});
```
Also, this is probably obvious, but the issue can be reproduced using 3rd party libraries supported by

Express example:

```js
var express = require('express');
var app = express();

app.get('/', function(req, res){
  callbackValue(function(value){
    res.send(value);
  });
});
```

After 35 seconds (in a first execution), and 140 seconds (in a second execution):

Loopback example:

```js
// In MyModel.js
MyModel.remoteMethod('getHello', {
  http: {path: '/getHello', verb: 'get'},
  returns: {root: true, type: 'object'}
});

MyModel.getHello = function (){
  return promiseValue();
};
```

After 25 seconds (1st execution) or 50 seconds (2nd execution):

Maybe with more complex libraries the issue shows up sooner? (This isn't clear.) I think we should start logging as much information as possible on the server during execution (for example, using
Input from @trevnorris, @petkaantonov or another connoisseur of
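One possible way to capture that kind of information (just an illustration; these are standard V8 flags, not something prescribed earlier in the thread) is to start the server with optimization tracing enabled and keep the log for later analysis:

```sh
node --trace-opt --trace-deopt server.js > optimization.log 2>&1
```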
After putting it through IRHydra, it looks like deopts occur in
node/deps/v8/src/flag-definitions.h, lines 589 to 590 (at f938ef7)
You can adjust this value and other DEFINE_ options listed in flag-definitions.h on the node command line. But in the case of
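For instance (a generic illustration of the mechanism; the flag shown below is just an example of a DEFINE_ option, not necessarily the one relevant to this issue):

```sh
node --v8-options | less                  # list every V8 DEFINE_ flag this node binary accepts
node --max_old_space_size=4096 server.js  # example of overriding one such flag at startup
```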
@evanlucas looking at
Wouldn't v8 deopt if it optimized for one type and it received another? (I haven't explored whether this should be happening.)
v8 doesn't deoptimize based on the types of arguments passed, but on how the arguments are used in the function.
Thanks for the clarification. Sorry to add noise.
@omsmith No worries. I thought the same for a while until confirming with @mraleph. He told me that the function only checks the number of arguments, not the types. Though since the function operates on the arguments (why wouldn't it), that's where the deoptimization happens. On the side, a polymorphic function that has a lot going on can benefit from having the specialized cases split into multiple functions and having the entry point simply validate the arguments and call the corresponding function. This way, if the entry point is deoptimized, it causes much less of an impact.
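A rough sketch of that pattern (the function names and the string/Buffer split below are made up for illustration, not taken from Node's code): the entry point only validates and dispatches, while the type-specialized work lives in separate functions, so a deopt of the entry point stays cheap.

```js
// Entry point: validate/dispatch only, so it stays tiny even if deoptimized.
function byteSize(chunk) {
  if (typeof chunk === 'string') return byteSizeOfString(chunk);
  if (Buffer.isBuffer(chunk)) return byteSizeOfBuffer(chunk);
  throw new TypeError('chunk must be a string or a Buffer');
}

// Each specialized function only ever sees one argument shape,
// so V8 can keep it optimized independently of the entry point.
function byteSizeOfString(str) {
  return Buffer.byteLength(str);
}

function byteSizeOfBuffer(buf) {
  return buf.length;
}
```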
I've been digging into this for a couple of days now. From what I can tell so far, the deopt in
@evanlucas noted, let me know what you find. For reference, it attaches to the object "for efficiency", so that extra objects do not need to be created. Maybe just hard-core enrolling the object to have timers fields manually upon instantiation?
This is pretty much what we should be doing. On every object instantiation we should be setting all possible future fields to at least
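A minimal sketch of that idea (the constructor and field names are invented for illustration and are not Node's actual timers code): declare every field the object may ever receive at construction time, so attaching timer state later does not change the object's shape.

```js
function TrackedSocket(socket) {
  this.socket = socket;
  // Pre-declare fields that other subsystems may set later, so the
  // object's hidden class stays stable instead of transitioning on first use.
  this._idleTimeout = -1;
  this._idleStart = null;
  this._idlePrev = null;
  this._idleNext = null;
}
```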
Ping. Any updates on this one?
I ran the test several times with v5, v6 and v7 and can only reproduce the performance drop in v5.
@targos did you measure v4?
I didn't, but it's probably still affected.
Would we be able to backport the V8 changes?
We'd have to identify them. I reopen with the
Still the case on node 8.3+?
Given that Node.js 4.x is now in maintenance mode (and has less than 5 months to go in its life) and there has been absolutely zero interest in anyone taking this on, I'm going to go ahead and close it, but feel free to reopen if you believe I've made an error.
When using anonymous functions (in IIFEs or Promises) to prepare the response of an HTTP request, the performance of the server suddenly slows down. Performance remains low until the server is restarted.
I initially found the problem using Promises. A script to replicate the problem:
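A minimal version of that script, reassembled from the snippets quoted in the comments (the exact original may have differed slightly):

```js
var http = require('http');

var srv = http.createServer(function (req, res) {
  promiseValue().then(function(value){
    res.writeHead(200, {'Content-Type': 'text/plain'});
    res.end(value);
  });
});

function promiseValue() {
  return Promise.resolve().then(function() {
    var j = 0;
    for (var i = 0, l = 300000; i < l; i++) { j = j + 2; }
    return "Hi " + j;
  });
}

srv.listen(3000, '127.0.0.1', function() {
  console.log("Server listening on 127.0.0.1:3000");
});
```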
But Promises are not required:
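A callback/IIFE variant along the same lines (again reassembled from snippets in the comments):

```js
var http = require('http');

var srv = http.createServer(function (req, res) {
  callbackValue(function(value){
    res.writeHead(200, {'Content-Type': 'text/plain'});
    res.end(value);
  });
});

function callbackValue(cb) {
  (function() {
    var j = 0;
    for (var i = 0, l = 300000; i < l; i++) { j = j + 2; }
    cb("Hi " + j);
  })();
}

srv.listen(3000, '127.0.0.1', function() {
  console.log("Server listening on 127.0.0.1:3000");
});
```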
Using an HTTP benchmarking tool to test the server, e.g. wrk:
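For example (the thread count, connection count and duration below are placeholders, not the values used originally):

```sh
wrk -t4 -c100 -d10m http://127.0.0.1:3000/
```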
The server starts out providing good performance, but at an unpredictable point in time the performance drops:
Once it drops, it doesn't matter if you stop the load testing tool and let the server recover for a while; performance remains low until the server is restarted.
Is it a bug, or am I doing something wrong?
If it is a bug, I don't know if it belongs to Node.js or V8, although I have been unable to reproduce the problem with only Promises/IIFEs, without the 'http' module.
Some additional details I have collected while hunting down the issue that may or may not be of interest to solve the problem:
- The `.then(function(){...})` of `promiseValue()` is the main cause of performance.
- If `try {} catch (e) {}` is added to the anonymous function, performance matches the low level and doesn't vary over time.
- According to `v8`, the function is always not optimized (I could copy the snippet of how I tested this if required. Basically, it follows @petkaantonov's tip). (EDIT: The function starts being optimized and, at some point, becomes deoptimized, also according to `v8`; see comment.)
- (I can also provide snippets for these points)
EDIT:
- `SpiderMonkey` (see @kzc comment)
- `v8 3.14 / 3.28` (see @obastemur comment)