sleap-label crashes with '[Errno 24] Too many open files' #1741
Hi @rupertoverall! Sorry you ran into this issue! Let us know if this works!
Hi all, I had the same issue, and my previously saved sessions (I had two open) were deleted. I still have the predictions and models folders for both, but the .slp files containing my many labeled frames have been deleted. Are these files lost permanently, and is there any way to build on the models from these projects? I tried to train one of these prior models in a new project, and it looks like it reverted back to the few labels in this new project. This would be quite a loss of effort for me; any recommendations?
Hi @hlanovoi, In the model folders there are some SLP files which store a cached copy of the labels used to train those models. You can try to open those in the GUI and/or import and merge them into the new project. Let us know if you need some help with that. In general, we also strongly recommend saving out new versions of your labels as often as possible by going to File --> Save as... and saving a new version (which SLEAP automatically increments in the filename).
Hello @talmo, thank you for your reply. From what it sounds like, by going to File -> Merge into Project I can select one of the files you are talking about, derived from the model with the most labels. Within the folder for this model I see: Is this the right approach, and which one of these is most appropriate to import? I'm noting your point about file hygiene, thanks so much!
You'd want to import the

Cheers,
Talmo
Bug description
Adding new video files often throws the error '[Errno 24] Too many open files', which crashes the session.
Expected behaviour
Adding new videos should add new videos and leave the session in a sane state (i.e. able to be saved). If there is insufficient memory or some other resource limit is hit, the action should fail with an error and leave the project in its previous state.
Actual behaviour
An error is thrown which leaves the session unusable. Closing the session is not clean and actually deletes(!) the previously saved session file.
Your personal set up
SLEAP: 1.3.3
TensorFlow: 2.9.2
Numpy: 1.22.4
Python: 3.9.15
OS: macOS-13.6.6-arm64-arm-64bit
Running on an Apple (M1) laptop with limited RAM, so the issue could be triggered by resource limitations.
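One detail that may be relevant here: macOS defaults to a fairly low per-process soft limit on open file descriptors (commonly 256), which 30+ open video handles plus the GUI's own files could plausibly exhaust. As a diagnostic sketch (not part of SLEAP; the helper name and target value are my own), the limit can be inspected and raised from Python's standard library before launching the GUI:

```python
import resource

def raise_nofile_limit(target: int = 4096) -> tuple[int, int]:
    """Raise the soft open-file limit toward `target`, capped at the hard limit.

    Returns the (soft, hard) limits after the attempt.
    """
    soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    # The soft limit cannot exceed the hard limit without privileges.
    new_soft = target if hard == resource.RLIM_INFINITY else min(target, hard)
    if new_soft > soft:
        resource.setrlimit(resource.RLIMIT_NOFILE, (new_soft, hard))
    return resource.getrlimit(resource.RLIMIT_NOFILE)

print(raise_nofile_limit())
```

The same effect can be had from the shell with `ulimit -n` in the terminal session that launches `sleap-label`.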
Environment packages
Logs
Screenshots
How to reproduce
I can create projects with e.g. 100 files and work with these - the error does not always occur with every session. But removing all of these videos and trying to add 30 different ones causes the crash, so it is also not just the number of videos in the project. I have not yet identified a hard-and-fast limit to the number of videos that are 'too many' (or, indeed, whether it is even the video files that are being referred to in the error).
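One way to narrow down whether the videos themselves are holding descriptors open would be to count the process's open file descriptors before and after adding videos. A hypothetical stdlib-only helper (my own sketch, not SLEAP code) that uses `/proc` on Linux and falls back to probing descriptor numbers on macOS, where `/proc` does not exist:

```python
import os

def count_open_fds() -> int:
    """Count this process's currently open file descriptors."""
    if os.path.isdir("/proc/self/fd"):
        # Linux: each entry in /proc/self/fd is an open descriptor.
        return len(os.listdir("/proc/self/fd"))
    # macOS fallback: probe descriptor numbers up to the soft limit.
    import resource
    soft, _ = resource.getrlimit(resource.RLIMIT_NOFILE)
    count = 0
    for fd in range(min(soft, 4096)):
        try:
            os.fstat(fd)  # raises OSError if fd is not open
            count += 1
        except OSError:
            pass
    return count

print(count_open_fds())
```

Calling this from the SLEAP Python console before and after "Add Videos" would show whether the count grows by one per video and never comes back down.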