Move preview repair to query #23073
Conversation
The get directory contents can take ages on a large system. Better chunk it up.

Signed-off-by: Roeland Jago Douma <roeland@famdouma.nl>
Do you have any data on how this affects performance? While the "startup costs" should indeed be lower, I suspect that the extra overhead of getting each folder object by name will make things run slower in the end.
Well, we can also fetch a bit more.
$cursor = $qb->select($qb->func()->count('*'))
	->from('filecache')
	->where($qb->expr()->eq('parent', $qb->createNamedParameter($currentPreviewFolder->getId())))
	->orderBy('fileid', 'ASC')
if you just count, don't order
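The point of this review comment can be sketched quickly. Since the original is Nextcloud's PHP query builder and not runnable standalone, here is an illustrative Python/sqlite3 sketch with a hypothetical miniature `filecache` table: `COUNT(*)` aggregates the whole matching set into a single row, so an `ORDER BY` only adds sort work without changing the answer.

```python
import sqlite3

# Hypothetical miniature of the filecache table, for illustration only.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE filecache (fileid INTEGER PRIMARY KEY, parent INTEGER, name TEXT)")
db.executemany(
    "INSERT INTO filecache (fileid, parent, name) VALUES (?, ?, ?)",
    [(1, 10, "a"), (2, 10, "b"), (3, 11, "c")],
)

# The ORDER BY changes nothing for an aggregate: both queries return the same count.
with_order = db.execute(
    "SELECT COUNT(*) FROM filecache WHERE parent = ? ORDER BY fileid ASC", (10,)
).fetchone()[0]
without_order = db.execute(
    "SELECT COUNT(*) FROM filecache WHERE parent = ?", (10,)
).fetchone()[0]
assert with_order == without_order == 2
```

Dropping the `->orderBy('fileid', 'ASC')` call therefore loses nothing and can spare the database a sort.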
$hasFiles = false;
foreach ($listing as $node) {
	if (!($node instanceof Folder)) {
		$hasFiles = true;
Suggested change:
	$hasFiles = true;
	continue;
$section1->writeln(" Move preview/$name/$previewName to preview/$newFoldername", OutputInterface::VERBOSITY_VERBOSE);
}

$hasEntries = true;
This only triggers the while loop to continue if there are folders in the current set that get processed. But what if the current set contains only skipped folders? Then it would never reach the end. I can't come up with anything that would hold 1000 non-numeric folders, so this might be a theoretical issue.
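The failure mode described here can be demonstrated with a small sketch (in Python, with a hypothetical chunk generator standing in for the repeated query): if the continue flag is only set for *processed* entries, a chunk consisting entirely of skipped entries ends the loop early and later chunks are never reached; setting the flag for every *fetched* row fixes it.

```python
def chunks():
    # Hypothetical data: the middle chunk contains only names that get skipped.
    yield ["0", "1", "2"]
    yield ["not-numeric", "also-skipped"]
    yield ["3", "4"]

def run(flag_on_every_row):
    processed = []
    source = chunks()
    while True:
        rows = next(source, None)
        if rows is None:
            break
        has_entries = False
        for name in rows:
            if flag_on_every_row:
                has_entries = True   # fixed: any fetched row keeps the loop alive
            if not name.isdigit():
                continue             # skipped folder
            has_entries = True       # buggy: only processed rows set the flag
            processed.append(name)
        if not has_entries:
            break  # with the buggy flag, a chunk of only skipped names stops here
    return processed

buggy = run(flag_on_every_row=False)
fixed = run(flag_on_every_row=True)
assert buggy == ["0", "1", "2"]           # loop ended after the all-skipped chunk
assert fixed == ["0", "1", "2", "3", "4"]  # all chunks were visited
```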
do {
	$qb = $this->db->getQueryBuilder();
	$cursor = $qb->select('name')->from('filecache')
		->where($qb->expr()->eq('parent', $qb->createNamedParameter($currentPreviewFolder->getId())))
Wouldn't it make sense to remember the highest fileID from the previous loop run and put it in here as well?
Suggested change:
	->where($qb->expr()->eq('parent', $qb->createNamedParameter($currentPreviewFolder->getId())))
	->andWhere($qb->expr()->gt('fileid', $qb->createNamedParameter($previousHighestFileId)))
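This suggestion is keyset pagination: remember the highest fileid seen so far and only ask for rows above it, so each chunk resumes where the last one ended instead of re-reading rows that were already handled. A minimal runnable sketch, again in Python with sqlite3 (the table and column names mirror the PHP snippet, but the setup is hypothetical):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE filecache (fileid INTEGER PRIMARY KEY, parent INTEGER, name TEXT)")
db.executemany(
    "INSERT INTO filecache (fileid, parent, name) VALUES (?, ?, ?)",
    [(i, 10, f"folder-{i}") for i in range(1, 8)],
)

CHUNK = 3
parent = 10
last_fileid = 0  # the "previous highest fileid" cursor, starting below every real id
seen = []

while True:
    rows = db.execute(
        "SELECT fileid, name FROM filecache"
        " WHERE parent = ? AND fileid > ?"
        " ORDER BY fileid ASC LIMIT ?",
        (parent, last_fileid, CHUNK),
    ).fetchall()
    if not rows:
        break  # nothing above the cursor: done, even if a chunk was all skips
    for fileid, name in rows:
        last_fileid = fileid  # advance the cursor for every fetched row
        seen.append(name)

assert seen == [f"folder-{i}" for i in range(1, 8)]
```

Because the cursor advances on every fetched row, this also sidesteps the termination concern above: a chunk of skipped entries still moves `last_fileid` forward, and the loop ends only when the query returns nothing.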
core/Command/Preview/Repair.php
$newFoldername = Root::getInternalFolder($name);
$hasEntries = false;
while ($row = $cursor->fetch()) {
	$oldPreviewFolder = $currentPreviewFolder->get($row['name']);
Suggested change:
	$oldPreviewFolder = $currentPreviewFolder->get($row['name']);
	$previousHighestFileId = $oldPreviewFolder->getId();
	$progressBar->advance();
	continue;
}
$hasEntries = true;
Suggested change:
	$hasEntries = true;
	$previousHighestFileId = 0;
Signed-off-by: Morris Jobke <hey@morrisjobke.de>
🤖 beep boop beep 🤖 Here are the logs for the failed build:

Status of 34040: failure
mariadb10.1-php7.3
mysql8.0-php7.4
acceptance-app-files-sharing-link
I'm going to close this due to lack of activity.