Aeon difficulty v9 #194
Removing the sorting/cutting is likely to cause outlier timestamps to have an inordinate power to change the difficulty (timestamps in the future would increase diff a lot shortly after being mined).
I think that if you mine a block 2 hours into the future (this is the cutoff in monero, I'm assuming the same here), the amount of time between earliest and latest timestamps goes from ~2880 (720 * 4) minutes on average to 2880+120 minutes, +4%. A similar thing should happen when that block leaves the difficulty window, there'll be a quick diff drop (to 2880-120). I guess 4% is acceptable as an instant step.
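The ~4% figure can be sanity-checked with a quick back-of-the-envelope calculation (assumed parameters: 720-block window, 4-minute target, and a Monero-style 2-hour future-timestamp limit):

```python
# Assumed parameters; the 2-hour future limit is taken from Monero as above.
window_blocks = 720
target_minutes = 4
future_limit_minutes = 120

span = window_blocks * target_minutes              # ~2880 minutes across the window
step_up = (span + future_limit_minutes) / span     # span right after the future-dated block
step_down = (span - future_limit_minutes) / span   # span once that block ages out

print(f"span={span} min, up={step_up:.3f}x, down={step_down:.3f}x")
```

Since difficulty scales inversely with the apparent timespan, each step corresponds to a roughly 4% one-off jump, as described above.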
Why not change the window size?
@hsyia Take a closer look at the plot: the v9 curve starts moving earlier than v1 in response to the hashrate change, but it takes more time to converge to the new difficulty level (in other words, to come back to the target block time).
@stoffu I see how it responds earlier now. That seems like a good solution without changing a lot. But if the goal is to reduce very slow block times, wouldn't it be better to reduce the window size? Or is the goal only to have a quicker response?
@hsyia I think we want quicker response, so that block times can at least start to come down faster when the hash rate drops, but not necessarily the ability to introduce larger (potentially wild) swings faster. It's not actually that important (and maybe not that desirable) that the block times converge quickly right to the target time, just that they are able to move down from extremes more quickly.
Also, to clarify, my motivation for proposing to remove the sort and cut is not solely to decrease the delay in responding to hash rate changes, but also to simplify the algorithm, since it has not been demonstrated in any way (and in fact somewhat demonstrated to the contrary) that the sorting and outlier removal is beneficial. Outlier timestamps will indeed cause a small % change in the difficulty for one block, but that is very much within the noise of exponential-distribution randomness on the solve time of the next block, and will be reversed in the very next block. Over a series of several blocks (and certainly any significant fraction of the adjustment window) the effect is negligible. @moneromooo-monero
Perhaps we are seeing this lag because Aeon's block time is double Monero's, so the 720-block difficulty window covers twice as much real time. One option may be to align closer with Monero's ~1-day window by halving the difficulty window to 360 blocks. Although I understand the concern about wild fast swings.
@hsyia
(force-pushed from 49cd89b to 36a7c26)
I've updated the test to simulate what you mentioned. At height 12000 the timestamp is set 2 hours into the future, creating a sharp drop of difficulty at 12009. At height 12719 the timestamp is set to the median of the last 60 blocks, creating a sharp spike of difficulty at 12728. In both cases, the effect of manipulation is pretty transient.
(force-pushed from da3df1f to 089b900)
@stoffu Maybe I'm misreading the chart, but how does a 2 hour increase in block time result in a drop in difficulty from about 120k to about 105k? That seems like an excessive change to me, given the window size of 720 blocks (48 hours). Two extra hours should only decrease difficulty by about 4%, no?
(force-pushed from 089b900 to 672d306)
@iamsmooth
Ok, thanks for the clarification.
@stoffu I see the new code to calculate the difficulty, and I see the verification of the results in the test_variant... but how did you compute the block time to begin with? Is there some pretend network hashrate data you use? Just curious to understand some of this better.
(force-pushed from 672d306 to a1370cb)
@hsyia A bit tricky issue here is that block timestamps occasionally decrease (i.e. negative block times), which cannot be modeled by a Poisson process. To handle this, I collected positive and negative block times separately for the blocks in [1000k,1100k). Observing that the negative samples (after negating their values) also seem to quite roughly follow another exponential distribution, I fitted an exponential distribution to each part separately (after removing outliers) using SciPy. The fit for the positive part produced a scale parameter close to the expected 240. The fitted scale parameter for the negative part was 36.7, as used in the code.
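The fitting step described above can be sketched as follows, using synthetic data as a stand-in for the real Aeon block-time samples (the 90/10 sign split and sample size are illustrative assumptions). With the location fixed at zero, the maximum-likelihood estimate of an exponential's scale parameter is simply the sample mean, which is what e.g. scipy.stats.expon.fit(data, floc=0) computes:

```python
import random

random.seed(42)

# Synthetic stand-in for observed block-time deltas: mostly positive
# (scale ~240 s, the 4-minute target), occasionally negative (scale ~36.7 s).
deltas = [random.expovariate(1 / 240) if random.random() < 0.9
          else -random.expovariate(1 / 36.7)
          for _ in range(100_000)]

pos = [d for d in deltas if d > 0]
neg = [-d for d in deltas if d < 0]   # negate so the sample is positive

def fit_scale(sample):
    # Exponential MLE with the location fixed at 0: scale = sample mean.
    return sum(sample) / len(sample)

print(f"positive scale ~ {fit_scale(pos):.1f}, negative scale ~ {fit_scale(neg):.1f}")
```

On real data one would remove outliers before taking the mean, as described above, since a single extreme delta shifts the MLE directly.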
Ok, yeah, I was able to reproduce this, very interesting calculations! I thought this chart was a good visualization: it is a histogram of the log of the block times. This PR makes sense because the moment when the hashrate drops is the most critical time for a slow block, so responding faster makes for less extreme cases of slow blocks, with the trade-off of possibly a higher number of slower but less extreme blocks. For anyone who is not a C++ wizard, here is the spreadsheet I used (Google Sheets).
Thanks for the feedback and analysis, @hsyia.
My pleasure! An honest and transparent relationship with the community is so important when it comes to cryptocurrency, and having the devs communicate in a way that all Aeon holders understand will go a long way. I think we definitely accomplished that here. I will see if I can provide anything useful for the Aeon project, because I definitely agree with the ethos of staying lightweight and doing only what is necessary. That is something rare among cryptocurrencies but very important for longevity.
Yes, agreed. A simple moving average is the closest approximation to the Poisson parameter lambda. No sorting/cutting.
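Concretely, the "simple moving average, no sorting/cutting" idea amounts to something like the following sketch (hypothetical code, not the actual Aeon implementation; the function and variable names are made up):

```python
DIFFICULTY_TARGET = 240  # seconds (Aeon's 4-minute target)

def next_difficulty(timestamps, cumulative_difficulties, target=DIFFICULTY_TARGET):
    """Both lists cover the adjustment window, oldest first, same length."""
    timespan = timestamps[-1] - timestamps[0]
    if timespan <= 0:              # guard against non-increasing timestamps
        timespan = 1
    work = cumulative_difficulties[-1] - cumulative_difficulties[0]
    return (work * target + timespan - 1) // timespan   # integer math, round up

# If blocks arrive exactly on target, difficulty stays put:
ts = [i * 240 for i in range(721)]
cd = [i * 1000 for i in range(721)]
print(next_difficulty(ts, cd))   # -> 1000
```

A future-dated timestamp at the end of the window simply inflates the timespan for a while (slightly lowering difficulty), and the effect reverses once that block ages out, matching the transient steps discussed earlier in the thread.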
Attachment: cns-010.txt
The odd thing about the older algorithm is that it doesn't actually discard "outliers" in a meaningful sense, it discards low absolute time outliers toward the beginning of the time window and high absolute time outliers toward the end. A more interesting estimator might discard outliers by block time interval (including negative) relative to difficulty. Anyway, I think the current simple algorithm is good enough. |
A while ago @iamsmooth told me privately over IRC that Bittrex has expressed a concern about occasional very slow block times observed on the Aeon network, which are due to large hashrate swings. What he suggested as a remedy was to remove the difficulty cutoff/sorting and to reduce the lag, so that the difficulty calculation responds to sudden hashrate changes (especially sudden declines) more quickly.
This PR introduces a new difficulty adjustment algorithm (cutting/sorting removed & lag reduced from 15 to 8) with the next v9 hardfork. Testnet fork height is set to 131111 (currently 131019 since July).
To confirm the effect, I've added some code to tests/difficulty/difficulty.cpp which simulates how the difficulty and block times change over time given a fixed schedule of underlying real hashrate. Here's an interactive plot comparing the original scheme (v1) and the new one (v9): https://stoffu.github.io/diff-chart/diff-variant/diff-variant.html. The new version succeeds in reducing the slow block times at sudden hashrate declines (see the 6h average block time).
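A rough stand-alone analogue of that kind of simulation (assumed parameters and a made-up hashrate schedule, not the actual C++ test code) might look like this: draw exponential solve times from the current difficulty and hashrate, and retarget with a plain moving average over the window:

```python
import random

random.seed(1)
TARGET = 240   # seconds (Aeon's 4-minute target)
WINDOW = 720   # difficulty window in blocks

def hashrate(height):
    # Hypothetical schedule: a sudden 4x hashrate drop at height 5000.
    return 4000.0 if height < 5000 else 1000.0

diffs = [4000.0 * TARGET]   # start at the equilibrium difficulty
times = [0.0]
for h in range(1, 10_000):
    lo = max(0, h - 1 - WINDOW)          # window of up to WINDOW intervals
    span = times[-1] - times[lo]
    work = sum(diffs[lo + 1:])           # work done during those intervals
    diff = work * TARGET / span if span > 0 else diffs[-1]
    # Solve time is exponentially distributed with mean difficulty / hashrate.
    solve = random.expovariate(hashrate(h) / diff)
    diffs.append(diff)
    times.append(times[-1] + solve)

tail = [times[i] - times[i - 1] for i in range(8_000, 10_000)]
print(f"mean block time well after the drop: {sum(tail) / len(tail):.0f} s")
```

The interesting part to plot is the stretch right after the drop, where block times balloon until the window catches up; the lag reduction in this PR shortens exactly that stretch.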