Performance Optimization: Expanding binary search window #231
Optimizes the binary search so it isn't too greedy when checking the size of sections.
Previously, I relied on finding a semantic level that didn't fit in the chunk size to scope the sections we binary search over. However, it is quite likely there won't be one, and in that case the worst-case behavior is binary searching over the entire rest of the document. That can be a very large search area, and it requires tokenizing huge chunks of text, which is the real bottleneck.
Now, we increase the search area by a factor of 2 on each successful attempt, and we also keep updating the starting `low` value for the binary search whenever we already know it is too low. A rough sketch of the approach is below.
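The following is a minimal sketch of the expanding-window ("galloping") search described above, not the crate's actual internals; `find_chunk_end` and `fits` are hypothetical names, where `fits(end)` stands in for the expensive tokenization check that reports whether the chunk covering sections `0..=end` is within the size limit.

```rust
/// Finds the largest section index `end` such that `fits(end)` is true,
/// assuming `fits` is monotonic (once a chunk is too big, adding more
/// sections keeps it too big). Hypothetical sketch, not the real API.
fn find_chunk_end(num_sections: usize, fits: impl Fn(usize) -> bool) -> Option<usize> {
    // If even the first section doesn't fit, there is no valid chunk end.
    if num_sections == 0 || !fits(0) {
        return None;
    }

    // Gallop: double the window after every successful check instead of
    // immediately binary searching over all remaining sections. `low` is the
    // largest index known to fit, so the binary search never revisits it.
    let mut low = 0;
    let mut step = 1;
    let high = loop {
        let probe = (low + step).min(num_sections - 1);
        if fits(probe) {
            low = probe;
            if probe == num_sections - 1 {
                // Everything remaining fits; no binary search needed.
                return Some(probe);
            }
            step *= 2;
        } else {
            break probe;
        }
    };

    // Standard binary search inside the bounded window:
    // `low` is known to fit, `high` is known not to fit.
    let (mut low, mut high) = (low, high);
    while high - low > 1 {
        let mid = low + (high - low) / 2;
        if fits(mid) {
            low = mid;
        } else {
            high = mid;
        }
    }
    Some(low)
}

fn main() {
    // Toy usage: per-section token sizes with a chunk limit of 10 tokens.
    let sizes = [3usize, 4, 2, 5, 1];
    let limit = 10;
    let fits = |end: usize| sizes[..=end].iter().sum::<usize>() <= limit;
    // Sections 0..=2 sum to 9 <= 10; adding section 3 would exceed the limit.
    assert_eq!(find_chunk_end(sizes.len(), fits), Some(2));
    println!("chunk ends at section index 2");
}
```

The point of the galloping phase is that the number of `fits` calls (and therefore the amount of text tokenized) is proportional to the size of the chunk that is actually produced, rather than to the size of the remaining document.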