Error: OCP\Lock\LockedException during import #43
I have the same problem. Also running 20.0.3 on Docker. The error appears in versions v1.4 and v1.2.
@kusma Is it really random or does it happen always on the same file?
@kusma The import process skips files/photos that already exist in your storage. Even if the initial progress is 0%, it does not download everything again.
@itstueben Do you mean it was working before, or that you started using the app at version 0.1.2?

@itstueben Could you check if you get related errors in nextcloud.log?

@kusma @itstueben I have no idea why some files are locked when trying to write them. The files are written right after being created; I don't see why a freshly created file would be locked. v0.1.5-1-nightly might solve your problem: it tries to unlock the files before writing their content, and skips (and deletes) those that are still locked. Could you give it a try? Thank you both for the feedback.
I also have the issue, but with Google Drive: it always gets stuck at the same file. It was not resolved by v0.1.5-1-nightly.
@sarunaskas Thanks for the feedback. I need to see the errors to know what happens. Do you have access to nextcloud.log?
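For anyone unsure how to check: Nextcloud writes its log as one JSON object per line (commonly in `data/nextcloud.log`, though the path depends on your setup). Here is a minimal sketch for pulling out the lock-related entries; the path and the field names checked are assumptions about the default JSON log format:

```python
import json

def locked_entries(path):
    """Yield log entries whose message or exception mentions a LockedException."""
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            try:
                entry = json.loads(line)
            except json.JSONDecodeError:
                continue  # skip malformed or truncated lines
            message = str(entry.get("message", ""))
            exception = str(entry.get("exception", ""))
            if "LockedException" in message or "LockedException" in exception:
                yield entry

# Usage (adjust the path to your Nextcloud data directory):
# for entry in locked_entries("data/nextcloud.log"):
#     print(entry.get("time"), entry.get("message"))
```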
Update: I cancelled the import, deleted the file from Trash in Google Drive, and it seems to be working now. The log from the previous error is below.
Another update: the import completed, but it only saved ~300 files (I didn't note down the exact message) instead of the 825 it detected in the Drive.
Thanks for the error log. I'll fix this. We still don't know why those files are locked; apart from concurrent runs of the background job, I have no idea.

What happens if you launch the import again? Are more files downloaded? I'm interested to know whether it always happens to the same files or if it's random.
It stopped at 308 again (I think that was the number in the previous run as well). The file mentioned in the log
@sarunaskas Is the file really named "2020-01-07"? Which kind of file is it? Is it possible, considering your server's network bandwidth, that downloading 500 MB takes more time than the delay between two cron.php runs? v0.1.5-2-nightly is out with a few more fixes/precautions, if you have time to try it. Thanks all for your patience.
@eneiluj: Seemingly "random", yeah. I mean, I didn't purge everything and restart, so it might be reproducible, but I don't see any clear pattern. After a bunch of retries, the entire import got through, probably because previously imported files remained.
@kusma Now that the locked files no longer make the job crash (latest nightly build), a single import process (triggering multiple background jobs, but anyway) should import everything, except the locked files if they are partial or empty...
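The strategy described above (try to unlock before writing, and skip plus delete files that stay locked instead of crashing the whole job) can be sketched roughly like this. This is an illustration, not the app's actual code; `try_unlock`, `write_content`, and `delete` are hypothetical stand-ins for the real Nextcloud storage calls:

```python
class LockedError(Exception):
    """Stand-in for OCP\\Lock\\LockedException."""

def import_files(files, storage):
    """Import each file; skip (and delete) ones that stay locked."""
    imported, skipped = [], []
    for f in files:
        try:
            storage.try_unlock(f)     # attempt to release a stale lock first
            storage.write_content(f)  # write the downloaded content
            imported.append(f)
        except LockedError:
            storage.delete(f)         # drop the partial/empty file
            skipped.append(f)         # and keep going instead of crashing
    return imported, skipped
```

The key design point is that a lock on one file no longer aborts the whole background job; the job records the skip and continues with the next file.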
I removed it from the trash, but I believe it was a Google Doc (I picked the OpenDocument export format). Now that the locked files are bypassed, is there a way to check where the locks are encountered? I see entire directories in Nextcloud that are missing files.
Could you guide me on how to check it?
If you know the connection bandwidth on your server side, you can estimate how much data can be downloaded between two Nextcloud cron jobs. Let's say your bandwidth is 1 MB/s and you run Nextcloud's cron.php every 15 minutes. Then at most 1 MB/s × 900 s ≈ 900 MB can be downloaded between two runs, so a 500 MB file should fit, but a slower link or a shorter interval could mean a download is still in progress when the next job starts.
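That estimate is a one-line calculation; a minimal sketch, where the 1 MB/s bandwidth and the 15-minute interval are just the illustrative numbers from above:

```python
def max_download_mb(bandwidth_mb_per_s, cron_interval_min):
    """Upper bound on data (in MB) downloadable within one cron interval."""
    return bandwidth_mb_per_s * cron_interval_min * 60

# Illustrative numbers: 1 MB/s bandwidth, cron.php every 15 minutes.
print(max_download_mb(1, 15))  # -> 900
```

If a single file is larger than this budget, the next cron run could start while its download is still in progress, which is one way two background jobs could end up touching the same file.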
Unfortunately I can't reproduce this bug. I tried restoring some files from the trash... it works fine.
An error like this
Do you mean all files were downloaded successfully? No missing files in the end?
Yeah, after cancelling and retrying enough times, the import eventually passed.
Closing.
I keep more or less randomly hitting a LockedException while importing my Google Photos albums. Here's an example from the log:

This is using Nextcloud 20.0.3, running in Docker, importing into a data directory located on an ext4 filesystem on a software RAID5 array.
This seems to happen regardless of how the background tasks are run; I've tried both AJAX and cron, both with actual cron jobs and with running the cron job manually to avoid concurrent runs. Re-running the cron.php script just fails right away once it has failed. If I cancel the import from the UI and retry, progress starts from the beginning again, and problems occur at some random point later on.