batch size limit in GUI #1511
Comments
Hi @WeissShahaf,

This parameter can be set through your training_config.json. While there was no explicit upper limit on the batch size (there must have been a default range of 1-99 somewhere downstream), I've now added a range of 1-512 in #1513.

Thanks,
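For editing the config programmatically, a minimal sketch in Python; the `optimization.batch_size` key path is an assumption about where the batch size lives in a SLEAP training_config.json, so check your own file before relying on it:

```python
# Hypothetical helper: set the training batch size in a SLEAP-style
# training_config.json. The "optimization" -> "batch_size" key path is
# an assumption, not confirmed by this thread.
import json


def set_batch_size(path: str, batch_size: int) -> None:
    """Load the JSON config, update the batch size, and write it back."""
    with open(path) as f:
        cfg = json.load(f)
    cfg.setdefault("optimization", {})["batch_size"] = batch_size
    with open(path, "w") as f:
        json.dump(cfg, f, indent=2)
```

Usage would be e.g. `set_batch_size("training_config.json", 512)` before launching training.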
Great, thank you @roomrys! Now training at batch size 512 with no issue, after scaling at 0.3 and a UNet with only 8 down-sampling filters to keep the receptive field around the animal. Loss is at 0.000064 after 12196 batches.

Question: is there a downside to reducing the number of down-sampling steps like that, besides losing information due to scaling?

Thanks,
Hi @WeissShahaf,

There are a few downsides if you decrease the number of down-sampling blocks by too much...

**Context**
Deeper networks can capture more contextual information, which helps with keypoint detection. Reducing down-sampling steps can limit the context the model can capture and make it more difficult to detect body parts.

**...detour to model complexity (more to do with width of network)**
On a similar note, since the number of features increases with the number of layers used, the representational capacity of the model will be decreased. Unless this is accounted for in the number of "Filters" (or kernels) used for each layer, fine-grained features will go unnoticed.

**Scalability/depth of detection**
If the size of your animal in the field of view varies significantly in scale, reducing down-sampling steps might make the model less robust to such variations. You may need to augment your data during preprocessing to handle scale differences, but this would depend on your data.

**Saves on memory/computation**
One advantage of reducing down-sampling steps is that it can reduce memory and computational requirements, which helps with large batch sizes like 512.

...but I would encourage experimentation to find those limits, which will vary on a dataset-to-dataset basis!

Thanks,
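To make the context point concrete, here is a small sketch of how the number of down-sampling blocks drives the receptive field. The architecture assumed (two 3x3 convolutions per block, followed by a stride-2 pool) is a generic UNet-style encoder, not necessarily SLEAP's exact implementation:

```python
# Sketch: receptive-field growth in a generic UNet-style encoder with
# two 3x3 convs per block and a 2x2 stride-2 pool after each block.
# This is an illustration of the idea, not SLEAP's actual architecture.
def receptive_field(n_blocks: int, convs_per_block: int = 2, kernel: int = 3) -> int:
    rf = 1    # receptive field in input pixels
    jump = 1  # distance between adjacent feature-map positions, in input pixels
    for _ in range(n_blocks):
        for _ in range(convs_per_block):
            rf += (kernel - 1) * jump  # each conv widens the field
        jump *= 2  # stride-2 pooling doubles the effective stride
    return rf


if __name__ == "__main__":
    for n in (1, 2, 3, 4, 5):
        print(f"{n} blocks -> receptive field {receptive_field(n)} px")
```

Under these assumptions the receptive field grows roughly exponentially with depth (5 px at 1 block vs 125 px at 5 blocks), which is why cutting down-sampling steps shrinks the context available around each keypoint so quickly.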
Bug description
The batch size is limited to 2 digits in the GUI, and I can't find a way to set it from the CLI.
Expected behaviour
I have a GPU with 48 GB and 720p movies, where the animal fits perfectly with no scaling, and I can run at batch size 90 using 33 GB of VRAM. I wanted to scale down and increase the batch size.
Actual behaviour
Your personal set up
WIN10, GPU: A6000
mamba install 1.3.3

Environment packages
Logs
Screenshots
How to reproduce
Try to input a batch size over 99 in the GUI.