used table functions of LuaJIT for better performance. #3673

Merged 1 commit on Feb 1, 2019

15 changes: 10 additions & 5 deletions rootfs/etc/nginx/lua/monitor.lua
@@ -1,12 +1,16 @@
local socket = ngx.socket.tcp
local cjson = require("cjson.safe")
local assert = assert
local new_tab = require "table.new"
local clear_tab = require "table.clear"
local clone_tab = require "table.clone"

local metrics_batch = {}
-- if an Nginx worker processes more than (MAX_BATCH_SIZE/FLUSH_INTERVAL) RPS then it will start dropping metrics
local MAX_BATCH_SIZE = 10000
local FLUSH_INTERVAL = 1 -- second

local metrics_batch = new_tab(MAX_BATCH_SIZE, 0)

Member:
What's the second argument for? Why 0? A link to docs would be sufficient too.


Contributor Author:
doc: https://github.com/openresty/luajit2/blob/v2.1-agentzh/doc/extensions.html
table.new(narray, nhash): the second argument is the number of hash-part slots to preallocate. The metrics batch only uses the array part, so it is 0 here.
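
For illustration, a minimal sketch of what the two arguments preallocate (plain OpenResty/LuaJIT, not part of this diff; the sizes are made up):

local new_tab = require "table.new"

-- preallocate 10000 array slots and 0 hash slots: the metrics batch is a
-- plain array with consecutive integer keys, so no hash part is needed
local batch = new_tab(10000, 0)

-- a table that also carries string keys would preallocate both parts
local mixed = new_tab(8, 4)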


local _M = {}

local function send(payload)
@@ -46,8 +50,8 @@ local function flush(premature)
return
end

local current_metrics_batch = metrics_batch

Member:
You are changing the behaviour here. The idea was to take the current batch as soon as the timer calls the flush function and reset the main queue, so that it can keep accepting new metrics while the current batch is being encoded and sent.

With your change we will potentially lose more metrics. Imagine the timer (the flush function) kicks in right before the current batch is full. It will start encoding the metrics, and during that time every new metric will be dropped. Given that cjson is quite fast at encoding, there probably won't be many metrics dropped, but if we can avoid that, why not.
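
A minimal sketch of the swap-then-encode pattern described above, reusing the names from this file (the log message is illustrative, not the PR's code):

local function flush()
  -- take the filled batch and immediately point the shared variable at a
  -- fresh table, so new metrics keep being accepted while we encode and send
  local current_metrics_batch = metrics_batch
  metrics_batch = new_tab(MAX_BATCH_SIZE, 0)

  local payload, err = cjson.encode(current_metrics_batch)
  if not payload then
    ngx.log(ngx.ERR, "error encoding metrics batch: ", err)
    return
  end
  send(payload)
end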


Contributor Author:
got it. I will fix that.

metrics_batch = {}
local current_metrics_batch = clone_tab(metrics_batch)
clear_tab(metrics_batch)

local payload, err = cjson.encode(current_metrics_batch)
if not payload then
@@ -66,12 +70,13 @@ function _M.init_worker()
end

function _M.call()
if #metrics_batch >= MAX_BATCH_SIZE then
local metrics_size = #metrics_batch
if metrics_size >= MAX_BATCH_SIZE then
ngx.log(ngx.WARN, "omitting metrics for the request, current batch is full")
return
end

table.insert(metrics_batch, metrics())
metrics_batch[metrics_size + 1] = metrics()

Member:
Does this custom table implementation not have insert?


Contributor Author:
It has the insert method, but table.insert is an O(n) operation because of the lj_tab_len call inside LuaJIT.
We can avoid it in hot paths for better performance.
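
A micro-example of the difference, in plain Lua (not part of this PR):

local t, n = {}, 0
for i = 1, 1000 do
  n = n + 1
  t[n] = i              -- O(1): the length is tracked explicitly
  -- table.insert(t, i) -- recomputes the table length on every append
end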

end

if _TEST then