Memory exceeds even if 2Gb for php background job is given #339
Hi @tobiasKaminsky The code that is failing for you is a routine that at some point I would like to optimize.. and honestly you are the second person to report this (but that user had 1 GB of RAM and used a minimum confidence of 0.6). Well, a couple of things to consider:
Without avoiding my responsibility, the last point is probably your inconvenience. Finally, note that when we set the 2 GB memory requirement (now just 1 GB), we believed it was really the absolute minimum. That limitation is mainly imposed by the analysis task, not by the clustering task, but in general, I guess you already discovered that this is a task that needs significant resources.
55k :-)
0.99, seems it was the default?
I have 10 GB, and currently 8460 MB are free.
Ohh.. Your problem is the number of photos. 🙈 Great collection of photos! 😉
You're right, I was referring to sensitivity. 😅
Ohh.. Thinking about it again, I commented on this badly too.. The problem here is exceeding the memory_limit imposed by PHP, not the RAM itself. 😅 One last doubt: did you analyze most of the photos with an earlier version, and it only fails after upgrading to the latest?
EDIT: I cannot do it. Reviewing the code, the optimization I had in mind was already done! 😞
That would be the goal, I think 👍
Sorry to hijack the conversation with a non-issue question, but reading this thread made me curious about something. I installed Face Recognition on my server a couple of weeks ago and so far it has found 210271 images, 95688 faces, and 32389 persons in my collection. It's going well. The app hangs sometimes, but I made a script to restart it, and it's working just fine. Aside from the fact that each restart takes more and more time to jump from task 6/10 to 7/10, but I know that's because it only uses one thread for all this work. That said, my question is: how many photos can the app handle? Is there any limitation or recommendation?
Hi @ftrentini Can I propose a little experiment? Let's see memory consumption with your photos..
In my case with 12,000 photos and 9000 faces:
Just 144 megabytes against the 3 GB used for processing each photo is more than acceptable.. 😅 But I would like to know how it behaves in your cases. 🤔
Oh, ok!! FYI, my server runs with 48GB of RAM, with 6GB set for php-fpm (and no limit for php-cli). I am aware that your script uses up to 4GB, I read that somewhere, so I think that answers my question after all. Just applied your patch and ran the app. The results: Found 132 faces without associated persons for user ftrentini and model 1
Wow.. According to this, you used about 20 GB of RAM in the whole process.. 😮 @tobiasKaminsky probably needs less than 3 GB to run it. I would really appreciate seeing the test. 😄 Well, I keep thinking about how to optimize it. In principle, doing it in batches seems interesting, but it can bring more problems, since the direct consequence is that the number of persons would increase dramatically, which would be disappointing. 🤔 On the other hand, I just discovered that we can change the memory limit only for the script.
Wow. I would have to adjust the documentation and suggest that, in case it is necessary, the limit be increased only for the execution of this task, leaving the server with more conservative values..
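The per-task override described in the comments above can be sketched as a one-off CLI invocation. The `occ face:background_job` command name comes from this app, but the path and the 4G value here are assumptions for illustration:

```shell
# -d overrides a php.ini directive for this single CLI run only, so the
# server-wide memory_limit can stay at a conservative value.
# (The occ path and the 4G figure are placeholders.)
php -d memory_limit=4G ./occ face:background_job
```

Run as the web server's user, this leaves php-fpm and any other CLI scripts unaffected.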
Feel free to use me as a tester!!
Look. This is the log of the CPU and memory (taken from the Webmin dashboard) for the last 12h, processing the final 20K images. It's taking about two and a half hours to cluster. And I took one screenshot during the processing: htop says I'm using 13G of RAM. (And a little correction: this VM instance uses 32GB instead of the 48GB I mentioned in my last post.)
Hi, how is this issue going? My server does not have that much memory and it cannot continue to analyse faces any more. I have ~100k pictures and after processing ~70k pictures it runs out of memory trying to cluster faces. |
I have the same issue with my collection of ~180k photos |
Basically the descriptors (which are arrays of 128 floats) are not always necessary, and splitting the query to minimize their usage produces good memory savings. In my tests, it reduces memory usage by between 33% and 39%, and as an additional improvement, there was also a reduction in time of around 19%.
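A back-of-the-envelope check of why those descriptors dominate memory, assuming 8-byte doubles and using the 95688-face collection mentioned earlier in the thread (real PHP arrays add several times this in per-element overhead):

```shell
# Raw payload of one descriptor: 128 doubles at 8 bytes each.
echo $((128 * 8))                          # 1024 bytes per descriptor
# Lower bound for 95688 faces, in MiB; PHP's per-element array
# overhead multiplies this severalfold in practice.
echo $((95688 * 128 * 8 / 1024 / 1024))   # 93 MiB
```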
Hi everyone, I don't have as many photos as you do to know how it scales, but I trust that it will improve your results. 🤔
I guess the problem remains as the photo library grows; I'm currently sitting at 650k+ photos.
Hi @rarealphacat But can you check the consumption with the memory limit disabled?
Here: 232520 KB / 1024 = 227.07 MB. I plan to implement batching for these edge cases, but I'm interested in current consumption for reference. 🤔
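The conversion quoted above can be reproduced with plain shell arithmetic (integer division, so the .07 fraction is dropped):

```shell
# Peak consumption reported in KB; divide by 1024 for MB.
kb=232520
echo $((kb / 1024))   # 227 (MB, rounded down from 227.07)
```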
Looks like it needs some time to process, I'll report back when finished. |
it gives 2884866 after 8/8 |
I reset everything and ran the command again for all users; it gives 22529270 = 20 GB+? Looks like it varies a lot depending on many factors.
Hi @rarealphacat Beyond the fact that I am personally satisfied, I am aware that there are surely people with even more photos than you, and the current quasi-linear growth of memory consumption is unmanageable. 🤔 So, I reaffirm the need to change the process to work in batches of images. I hope I can do it soon.
I have the same fatal error. How do I fix it? I found this solution: add the parameter “-d memory_limit=8096M”. Before:
#Error
After:
1/8 - Executing task CheckRequirementsTask (Check all requirements)
I have got the solution to complete clustering. |
@xwyangjshb it works, I'm creating a cron job with your solution
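A cron entry along those lines might look like the following; the schedule, web-server user, Nextcloud path, and reuse of the 8096M value are all assumptions for illustration, not taken from the thread:

```shell
# Hypothetical /etc/cron.d fragment: run the background job nightly with a
# raised memory_limit for just this invocation (server-wide limits unchanged).
0 3 * * * www-data php -d memory_limit=8096M /var/www/nextcloud/occ face:background_job
```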
PHP Fatal error: Allowed memory size of 2147483648 bytes exhausted (tried to allocate 268435464 bytes) in /srv/nextcloud/apps/facerecognition/lib/BackgroundJob/Tasks/CreateClustersTask.php on line 293
I can increase the memory, as my server is powerful enough, but ideally it should not consume that much, should it?
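For context, the two numbers in that fatal error decode cleanly as binary units (a quick shell check):

```shell
# "Allowed memory size of 2147483648 bytes" is exactly the 2G memory_limit:
echo $((2147483648 / 1024 / 1024 / 1024))   # 2 GiB
# "tried to allocate 268435464 bytes" is a ~256 MiB chunk (8 bytes over 256 MiB):
echo $((268435464 / 1024 / 1024))           # 256 MiB
```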