
Error: OCP\Lock\LockedException during import #43

Closed

kusma opened this issue Dec 31, 2020 · 15 comments

Comments

@kusma

kusma commented Dec 31, 2020

I keep more or less randomly hitting a LockedException while importing my Google Photos albums. Here's an example from the log:

[core] Error: OCP\Lock\LockedException: "Google Photos/Siggraph 2008/siggraph_2008_060.jpg" is locked at <<closure>>

0. /var/www/html/lib/private/Files/View.php line 1156
   OC\Files\View->changeLock("/kusma/files/Go ... g", 2)
1. /var/www/html/lib/private/Files/View.php line 1001
   OC\Files\View->basicOperation("fopen", "/kusma/files/Go ... g", ["write"], "w")
2. /var/www/html/lib/private/Files/Node/File.php line 115
   OC\Files\View->fopen("/kusma/files/Go ... g", "w")
3. /var/www/html/custom_apps/integration_google/lib/Service/GooglePhotosAPIService.php line 364
   OC\Files\Node\File->fopen("w")
4. /var/www/html/custom_apps/integration_google/lib/Service/GooglePhotosAPIService.php line 277
   OCA\Google\Service\GooglePhotosAPIService->getPhoto("REDACTED", "kusma", {id: "AEWD2uhjex ... "}, OC\Files\Node\Folder {})
5. /var/www/html/custom_apps/integration_google/lib/Service/GooglePhotosAPIService.php line 155
   OCA\Google\Service\GooglePhotosAPIService->importPhotos("REDACTED", "kusma", "Google Photos", 500000000, 0)
6. /var/www/html/custom_apps/integration_google/lib/BackgroundJob/ImportPhotosJob.php line 36
   OCA\Google\Service\GooglePhotosAPIService->importPhotosJob("kusma")
7. /var/www/html/lib/public/BackgroundJob/Job.php line 80
   OCA\Google\BackgroundJob\ImportPhotosJob->run({user_id: "kusma"})
8. /var/www/html/lib/public/BackgroundJob/QueuedJob.php line 48
   OCP\BackgroundJob\Job->execute(OC\BackgroundJob\JobList {}, OC\Log {})
9. /var/www/html/cron.php line 127
   OCP\BackgroundJob\QueuedJob->execute(OC\BackgroundJob\JobList {}, OC\Log {})

at 2020-12-31T19:45:58+00:00

This is using Nextcloud 20.0.3, running in Docker, importing into a data directory located on an EXT4 filesystem on a software RAID5 array.

This seems to happen regardless of how the background tasks are run: I've tried both AJAX and cron, with actual cron jobs as well as by running the cron job manually to avoid issues with concurrent runs. Once it has failed, re-running the cron.php script fails again right away. If I cancel the import from the UI and retry, progress starts from the beginning again, and the problem occurs at some random point later on.

@itstueben

I have the same problem, also running 20.0.3 on Docker. The error appears in version v1.4 and v1.2.
@nextcloud-bot @aszlig Is the job still running after errors and warnings? It seems the job is canceled. The job is stuck at 37% after 24 hours.

julien-nc pushed a commit that referenced this issue Jan 5, 2021
Signed-off-by: Julien Veyssier <eneiluj@posteo.net>
@julien-nc
Member

problems occur at some random point later on

@kusma Is it really random or does it happen always on the same file?

If I cancel the import from the UI and retry, I get progress from start again

@kusma The import process skips files/photos that already exist in your storage. Even if the initial progress is 0%, it does not download everything again.

The error appears in version v1.4 and v1.2

@itstueben Do you mean it was working before or that you started using the app at version 0.1.2?

@itstueben Could you check if you get related errors in nextcloud.log?

@kusma @itstueben I have no idea why some files are locked when trying to write them. The files are written right after being created, and I don't see why a freshly created file would be locked. v0.1.5-1-nightly might solve your problem: it simply tries to unlock the files before writing their content and skips (and deletes) those that are still locked. Could you give it a try?
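
Roughly, the nightly does something like this. This is a simplified sketch, not the exact app code: the unlock attempt is reduced to catching the LockedException, and savePhoto() with its parameters are illustrative names only.

use OCP\Files\Folder;
use OCP\Lock\LockedException;

function savePhoto(Folder $albumFolder, string $fileName, string $content): bool {
    // Reuse the node if it already exists, otherwise create it.
    $file = $albumFolder->nodeExists($fileName)
        ? $albumFolder->get($fileName)
        : $albumFolder->newFile($fileName);
    try {
        // putContent() acquires its own locks; a LockedException here
        // means another process (e.g. a concurrent cron run) holds one.
        $file->putContent($content);
        return true;
    } catch (LockedException $e) {
        try {
            // Skip the file: delete the (likely empty or partial) node
            // instead of letting the whole background job crash.
            $file->delete();
        } catch (LockedException $e2) {
            // Deletion can fail too while the lock is held; a later
            // import run will retry since only existing files are skipped.
        }
        return false;
    }
}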

Thank you both for the feedback

@sarunaskas

I also have the issue, but with Google Drive - it always gets stuck at the same file. It did not get resolved with v0.1.5-1-nightly.

@julien-nc
Member

@sarunaskas Thanks for the feedback. I need to see the errors to know what happens. Do you have access to nextcloud.log?

@sarunaskas

Update: I cancelled the import, deleted the file from the trash in Google Drive, and it seems to be working now. The log from the previous error is below:

[core] Error: OCP\Lock\LockedException: "Storage Box/Drive/2020-01-07" is locked, existing lock on file: exclusive at <<closure>>

 0. /snap/nextcloud/25276/htdocs/lib/private/Files/View.php line 2074
    OC\Files\View->lockPath("/sarunas/files/ ... 7", 1, false)
 1. /snap/nextcloud/25276/htdocs/lib/private/Files/View.php line 1147
    OC\Files\View->lockFile("/sarunas/files/ ... 7", 1)
 2. /snap/nextcloud/25276/htdocs/lib/private/Files/View.php line 724
    OC\Files\View->basicOperation("unlink", "/sarunas/files/ ... 7", ["delete"])
 3. /snap/nextcloud/25276/htdocs/lib/private/Files/Node/File.php line 132
    OC\Files\View->unlink("/sarunas/files/ ... 7")
 4. /var/snap/nextcloud/25276/nextcloud/extra-apps/integration_google/lib/Service/GoogleDriveAPIService.php line 366
    OC\Files\Node\File->delete()
 5. /var/snap/nextcloud/25276/nextcloud/extra-apps/integration_google/lib/Service/GoogleDriveAPIService.php line 255
    OCA\Google\Service\GoogleDriveAPIService->getFile("ya29.a0AfH6SMBB ... N", "sarunas", {id: "1_VRSH8Jbk ... e}, {1CaXYGwdM83UxlK ... }}, OC\Files\Node\Folder {})
 6. /var/snap/nextcloud/25276/nextcloud/extra-apps/integration_google/lib/Service/GoogleDriveAPIService.php line 153
    OCA\Google\Service\GoogleDriveAPIService->importFiles("ya29.a0AfH6SMBB ... N", "sarunas", "/Storage Box/Drive", 500000000, 0)
 7. /var/snap/nextcloud/25276/nextcloud/extra-apps/integration_google/lib/BackgroundJob/ImportDriveJob.php line 36
    OCA\Google\Service\GoogleDriveAPIService->importDriveJob("sarunas")
 8. /snap/nextcloud/25276/htdocs/lib/public/BackgroundJob/Job.php line 80
    OCA\Google\BackgroundJob\ImportDriveJob->run({user_id: "sarunas"})
 9. /snap/nextcloud/25276/htdocs/lib/public/BackgroundJob/QueuedJob.php line 48
    OCP\BackgroundJob\Job->execute(OC\BackgroundJob\JobList {}, OC\Log {})
10. /snap/nextcloud/25276/htdocs/cron.php line 127
    OCP\BackgroundJob\QueuedJob->execute(OC\BackgroundJob\JobList {}, OC\Log {})

@sarunaskas

Another update: the import completed, but it only saved ~300 files (I didn't note down the exact message) instead of the 825 it detected in the Drive.

@julien-nc
Member

Thanks for the error log. I'll fix this. We still don't know why those files are locked. Apart from a concurrent run of the background job, I have no idea.

It only saved ~300 files

What happens if you launch the import again? Are more files downloaded? I'd like to know whether it always happens to the same files or whether it's random.

@sarunaskas

It stopped at 308 again (I think that was the number in the previous run as well). The file mentioned in the log, 2020-01-07, was causing the error each time I ran it with the previous versions (0.1.2 and 0.1.3 nightly).

julien-nc pushed a commit that referenced this issue Jan 6, 2021
Signed-off-by: Julien Veyssier <eneiluj@posteo.net>
@julien-nc
Member

@sarunaskas Is the file really named "2020-01-07"? Which kind of file is it?
It would be nice if I could reproduce the bug locally. Is there a problematic file that you could provide?

Is it possible, considering your server's network bandwidth, that downloading 500 MB takes more time than the delay between two cron.php runs?

v0.1.5-2-nightly is out with a few more fixes/precautions, if you have time to try it.

Thanks all for your patience.

@kusma
Author

kusma commented Jan 6, 2021

@eneiluj: Seemingly "random", yeah. I mean, I didn't purge everything and restart, so it might be reproducible, but I don't see any clear pattern. After a bunch of retries, the entire import got through, probably due to previously imported files remaining.

@julien-nc
Member

@kusma Now that the locked files don't make the job crash anymore (latest nightly build), a single import process (even though it triggers multiple background jobs) should import everything (except the locked files if they are partial or empty...).
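
The gist, as a simplified sketch: per-file errors are now caught inside the import loop instead of propagating out of the background job. downloadOne() and $logger are placeholders, not the app's real names.

use OCP\Lock\LockedException;

foreach ($filesToImport as $fileInfo) {
    try {
        downloadOne($fileInfo);
    } catch (LockedException $e) {
        // Log and move on instead of aborting the whole job; a later
        // run can retry this file because existing files are skipped.
        $logger->warning('Skipping locked file: ' . $fileInfo['name']);
    }
}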

@sarunaskas

@eneiluj

Is the file really named "2020-01-07"? Which kind of file is it?

I removed it from the trash, but I believe it was a Google Doc (I picked the OpenDocument export format). Now that the locked files are bypassed, is there a way to check where the locks are encountered? I see entire directories that are missing files in Nextcloud.

Is it possible, considering your server network bandwidth, that downloading 500 MB takes more time than the delay between 2 cron.php?

Could you guide me on how to check it?

@julien-nc
Member

@sarunaskas

Could you guide me on how to check it?

If you know the connection bandwidth on your server side, you can estimate how much data can be downloaded between two Nextcloud cron jobs. Say your bandwidth is 1 MB/s and you run Nextcloud's cron.php every 15 minutes: 1 MB/s * 60 s * 15 min = 900 MB, so you can download 900 MB in that window.
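
The same estimate in a few lines of PHP, using the example numbers above (replace them with your measured bandwidth and your actual cron interval):

// Data downloadable between two cron runs, with the example numbers above.
$bandwidthMBps   = 1;   // server download bandwidth, MB/s
$cronIntervalMin = 15;  // minutes between two cron.php runs
echo ($bandwidthMBps * 60 * $cronIntervalMin) . " MB\n"; // prints "900 MB"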

I removed it from the trash, but I believe it was a Google Doc

Unfortunately, I can't reproduce this bug. I tried to restore some files from the trash... it works fine.

is there a way to check where the locks are encountered

An error like this should appear in nextcloud.log for each locked file:

Google Drive error downloading file FILE_NAME : Impossible to unlock file
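
To list every file that hit it, a quick script like this should work. This is only a sketch: the log path depends on your installation, and it assumes the default JSON log format.

$logPath = '/var/www/html/data/nextcloud.log'; // adjust to your data directory
foreach (file($logPath) as $line) {
    $entry = json_decode($line, true);
    $msg = $entry['message'] ?? '';
    if (strpos($msg, 'Impossible to unlock file') !== false) {
        echo ($entry['time'] ?? '') . '  ' . $msg . "\n";
    }
}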

@kusma

After a bunch of retries, the entire import got through, probably due to previously imported files remaining

Do you mean all files were downloaded successfully? No missing file in the end?

@kusma
Author

kusma commented Jan 18, 2021

After a bunch of retries, the entire import got through, probably due to previously imported files remaining

Do you mean all files were downloaded successfully? No missing file in the end?

Yeah, after cancelling and retrying enough times, the import eventually passed.

@marcelklehr
Member

Closing.
