If you have very large nodes (single-node size > 16 MB), you can increase the FileBlockSize to accommodate them. Writing the data works fine, but if you restart the process and it initializes the BTree by reading the data back from disk, it fails in the TransactedCompoundFile.FileSection.Read method while validating the ActualBlocks count. The cause is that the Length written during the Write step exceeded 16 MB and therefore occupied more than 3 bytes, while the first byte of that field is reserved for the header length. When the Length is read back, only the last 3 bytes are recovered, which no longer matches the expected ActualBlocks count and triggers an error.
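A minimal sketch of the packing problem described above, assuming the 32-bit header layout where the top byte holds the header length and the low 24 bits hold the section length; the Pack/UnpackLength names are illustrative, not the library's actual API:

```csharp
using System;

class HeaderPackingSketch
{
    // Low 24 bits (max 16 MB - 1) are available for the section length;
    // the top byte is reserved for the header length.
    const uint LengthMask = 0x00FFFFFF;

    static uint Pack(byte headerSize, int length)
    {
        // A length > 16 MB - 1 spills into the header-size byte on write.
        return ((uint)headerSize << 24) | (uint)length;
    }

    static int UnpackLength(uint header)
    {
        // The read side only recovers the low 24 bits, so an oversized
        // length comes back truncated and no longer matches ActualBlocks.
        return (int)(header & LengthMask);
    }

    static void Main()
    {
        int bigLength = 20 * 1024 * 1024;   // 20 MB node, beyond the 16 MB limit
        uint header = Pack(4, bigLength);
        int roundTripped = UnpackLength(header);
        Console.WriteLine($"written: {bigLength}, read back: {roundTripped}");
        // written: 20971520, read back: 4194304 -> ActualBlocks validation fails
    }
}
```

With the length truncated on read, the computed block count disagrees with the number of blocks actually written, which is why the failure only appears when the file is reopened rather than during the original write.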