
batch size limit in GUI #1511

Open
WeissShahaf opened this issue Sep 20, 2023 · 3 comments
Labels
bug (Something isn't working) · fixed in future release (Fix or feature is merged into develop and will be available in future release.)

Comments

@WeissShahaf

Bug description

Batch size is limited to 2 digits in the GUI, and I can't find a way to set it via the CLI.

Expected behaviour

I have a GPU with 48 GB of VRAM and 720p movies in which the animal fits perfectly with no scaling. I can run at a batch size of 90, using 33 GB of VRAM, and wanted to scale the images down so I could increase the batch size further.

Actual behaviour

Your personal set up

  • OS: Windows 10
  • GPU: A6000 (48 GB)
  • Version(s): SLEAP 1.3.3 (mamba install), Python 3.7
Environment packages
# paste output of `pip freeze` or `conda list` here
Logs
# paste relevant logs here, if any

Screenshots

How to reproduce

Try to input a batch size over 99 in the GUI.

  1. Go to '...'
  2. Click on '....'
  3. Scroll down to '....'
  4. See error
@WeissShahaf added the bug label on Sep 20, 2023
@roomrys (Collaborator) commented Sep 21, 2023

Hi @WeissShahaf,

This parameter can be set through your training_config.json using the "batch_size" key. To train through the CLI, you would then reference this training_config.json with the batch size set to whatever value you want.
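
For example, something along these lines works (a minimal sketch; it assumes "batch_size" sits under the "optimization" section of a GUI-exported config, and the file names are placeholders):

```python
# Minimal sketch: edit an exported training_config.json to bump the batch
# size past the old GUI limit, then point sleap-train at the edited config.
# Assumes "batch_size" lives under the "optimization" section -- check your
# own exported config for the exact layout.
import json

config_path = "training_config.json"  # placeholder path

with open(config_path) as f:
    cfg = json.load(f)

cfg["optimization"]["batch_size"] = 512  # no 2-digit limit here

with open(config_path, "w") as f:
    json.dump(cfg, f, indent=2)

# Then train from the CLI against the edited config, e.g.:
#   sleap-train training_config.json labels.v001.slp
```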

While there was no explicit upper limit on the batch size (there must have been some default range of 1-99 downstream somewhere), I've now added a range of 1-512 in #1513.

Thanks,
Liezl

@WeissShahaf (Author)

Great, thank you! @roomrys

Now training at batch size 512 with no issue, after scaling at 0.3 and using a UNet with only 8 downsampling filters to keep the receptive field around the animal. Loss is at 0.000064 after 12,196 batches.

Question: is there a downside to reducing the number of downsampling steps like that, besides losing information due to scaling?

Thanks,
Shahaf

@roomrys (Collaborator) commented Sep 28, 2023

Hi @WeissShahaf,

There are a few downsides if you decrease the number of down-sampling blocks by too much...


Context

Deeper networks can capture more contextual information, which helps with keypoint detection. Reducing down-sampling steps limits the context the model can capture and can make it more difficult to detect body parts.
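
To make that concrete, here is a rough back-of-the-envelope sketch (my own numbers, assuming "Max Stride" is the total down-sampling factor, i.e. 2 to the number of down-sampling blocks): a larger max stride gives a coarser bottleneck grid, so each bottleneck feature summarizes a larger patch of the image.

```python
# Back-of-the-envelope only: size of the deepest (bottleneck) feature map
# for a given input scale and total down-sampling factor ("max stride").
def bottleneck_grid(height, width, input_scale, max_stride):
    h, w = round(height * input_scale), round(width * input_scale)
    return h // max_stride, w // max_stride

# A 720p frame scaled to 0.3, as in this thread:
print(bottleneck_grid(720, 1280, 0.3, max_stride=8))   # (27, 48) -- fine grid, less context per feature
print(bottleneck_grid(720, 1280, 0.3, max_stride=32))  # (6, 12)  -- coarse grid, more context per feature
```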

... detour to Model Complexity (more to do with width of network)

On a similar note, since the number of features typically grows with the number of layers used, using fewer layers also decreases the representational capacity of the model. Unless this is accounted for by increasing the number of "Filters" (or kernels) used in each layer, fine-grained features may go unnoticed.

In the SLEAP GUI, we use "Filters" to refer to the number of kernels used in each layer to extract features. We use "Max Stride" to adjust the number of down-sampling blocks used. Similarly, "Output Stride" adjusts the number of up-sampling blocks used.
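
For reference, a hedged sketch of where those GUI fields land in a GUI-exported training_config.json (key names as I understand the UNet backbone section; values are illustrative only, so double-check against your own exported file):

```python
# Sketch of the UNet backbone block inside training_config.json,
# nested under "model" -> "backbone" -> "unet"; values are examples only.
unet_backbone = {
    "filters": 16,        # GUI "Filters": number of kernels in the first block
    "filters_rate": 2.0,  # growth factor for filters at each down-sampling block
    "max_stride": 8,      # GUI "Max Stride": total down-sampling factor
    "output_stride": 4,   # GUI "Output Stride": stride of the predicted confidence maps
}
```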

Scalability/depth of detection

If the size of your animal in the field of view varies significantly in scale, reducing down-sampling steps might make the model less robust to such variations. You may need to augment your data during preprocessing to handle scale differences, but this would depend on your data.

Saves on memory/computation

One advantage of reducing down-sampling steps is that it can reduce memory and computational requirements, which helps with large batch sizes like 512.


...but I would encourage you to experiment to find those limits, which will vary on a dataset-to-dataset basis!

Thanks,
Liezl

@roomrys added the fixed in future release label on Sep 28, 2023