Heya, awesome work rebuilding this in PyTorch! I'm curious about the method you used to adjust the IAM handwriting dataset to work with the custom dataset class, or did you work with it as it came? Also, do you have a link to the weights and the decode map?
I've attached lines.txt so you have some context for what I mean.
Hey, I didn't do much other than convert the text file to a CSV. The file needs to match the format here: dataset format
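For illustration, something like the sketch below would do the conversion. The CSV column names (`image_path`, `transcription`) and the image path layout are assumptions on my part; match them to the linked dataset format and to however you extracted the IAM archive.

```python
import csv

rows = []
with open("lines.txt") as f:
    for line in f:
        if line.startswith("#"):           # lines.txt begins with commented header lines
            continue
        parts = line.strip().split(" ")
        line_id = parts[0]                 # e.g. a01-000u-00
        text = " ".join(parts[8:]).replace("|", " ")   # IAM separates words with '|'
        # Flat image layout is just an assumption; adjust to your extracted archive.
        rows.append((f"{line_id}.png", text))

with open("lines.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["image_path", "transcription"])   # assumed column names
    writer.writerows(rows)
```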
I did, however, deslant the images using the dataset class and re-save the deslanted versions as a preprocessing step. I have added a Jupyter notebook showing how I did that here; it might give a bit more insight into the dataset class.
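If it helps, here is a generic moment-based deslant (shear correction) plus a one-off resave pass. It is a stand-in sketch rather than the exact method from the notebook, and the `lines/` / `lines_deslanted/` paths are illustrative.

```python
import glob
import os

import cv2
import numpy as np

def deslant(img):
    """Moment-based deslant for a grayscale line image (generic sketch)."""
    # Binarise with ink as foreground so the image moments describe the strokes.
    ink = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)[1]
    m = cv2.moments(ink)
    if abs(m["mu02"]) < 1e-2:
        return img                          # nothing to correct
    skew = m["mu11"] / m["mu02"]
    h, w = img.shape[:2]
    M = np.float32([[1, skew, -0.5 * h * skew], [0, 1, 0]])
    return cv2.warpAffine(img, M, (w, h),
                          flags=cv2.WARP_INVERSE_MAP | cv2.INTER_LINEAR,
                          borderMode=cv2.BORDER_CONSTANT, borderValue=255)

# One-off preprocessing pass: resave deslanted copies.
for path in glob.glob("lines/**/*.png", recursive=True):
    out_path = path.replace("lines", "lines_deslanted", 1)
    os.makedirs(os.path.dirname(out_path), exist_ok=True)
    cv2.imwrite(out_path, deslant(cv2.imread(path, cv2.IMREAD_GRAYSCALE)))
```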
I have made the weights available here. The word model requires an input size of 64x256 and the line model 64x800, so, for instance, set the output_size parameter of the Rescale transform to output_size=(64, 800) for the line model.
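Roughly, loading them looks like the following. The `build_model()` helper and the weights filename are placeholders for however the repo constructs its network; only the input sizes above are confirmed.

```python
import torch

model = build_model()                                   # hypothetical constructor, not the repo's API
model.load_state_dict(torch.load("line_model.pt", map_location="cpu"))
model.eval()

# For the transform pipeline, the only difference between the two models is the
# Rescale size: output_size=(64, 256) for the word model, (64, 800) for the line model.
```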
You can train from 'scratch' as well to see if other regularisation levels etc. make a difference. Since the conv part uses transfer learning from the ImageNet weights, it only took an hour or so to train. One thing to note: I have realised that I freeze the conv part initially, but you probably don't want to freeze the batch norm layers when you start training; this could be adjusted.
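One way to make that adjustment is sketched below: freeze the pretrained conv weights but leave the batch norm layers trainable. `model.conv` is a placeholder for wherever the conv backbone lives in the repo's model.

```python
import torch.nn as nn

def freeze_conv_keep_bn(conv_part: nn.Module):
    """Freeze pretrained conv weights while keeping batch norm layers trainable."""
    for module in conv_part.modules():
        if isinstance(module, (nn.BatchNorm1d, nn.BatchNorm2d)):
            for p in module.parameters():
                p.requires_grad = True
            module.train()                      # keep running stats updating
        else:
            for p in module.parameters(recurse=False):
                p.requires_grad = False

# freeze_conv_keep_bn(model.conv)               # placeholder attribute name
```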
I hope the code is reproducible, but I reckon it could be better, so apologies for that. You will need PyTorch 1.1 or higher, pandas, numpy, scikit-image and opencv. If you give it a go and get stuck, let me know and I will see if I can help out.