53rd place(top2%) solution for Kaggle TGS Salt Identification Challenge
This is a solid solution that reaches the top 2% without any post-processing.
According to the forum, there are many useful tricks: the binary empty vs non-empty classifier by Heng, the +0.01 LB from snapshot ensembling with a cyclic learning rate by Peter, and so on.
I padded images from 101x101 to 128x128, but I did not compare this against simply resizing. Some participants reported that resize+flip works better than pad+augment. You can check the code in transform.py.
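A minimal sketch of the 101x101 to 128x128 padding, assuming symmetric reflect padding; the exact mode and offsets in the repo's transform.py may differ:

```python
import numpy as np

def pad_to_128(img: np.ndarray) -> np.ndarray:
    """Reflect-pad a 101x101 image (optionally with channels) to 128x128."""
    h, w = img.shape[:2]
    top = (128 - h) // 2           # 13 rows on top
    bottom = 128 - h - top         # 14 rows on the bottom
    left = (128 - w) // 2
    right = 128 - w - left
    pad_width = [(top, bottom), (left, right)] + [(0, 0)] * (img.ndim - 2)
    return np.pad(img, pad_width, mode="reflect")

img = np.random.rand(101, 101)
print(pad_to_128(img).shape)  # (128, 128)
```

At inference time the predicted mask is cropped back to 101x101 with the same offsets.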
I used pretrained resnet34, se-resnext50, and se-resnext101 models as the U-Net encoder. In my experiments, the se-resnext50 pretrained model was the best encoder, although some top Kagglers reported that resnet34 gave their best model.
I used the scSE block and hypercolumns in the decoder, which raised the score slightly.
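A sketch of the scSE (concurrent spatial and channel squeeze-and-excitation) block in PyTorch; the `reduction` ratio of 16 is an assumed hyperparameter, not taken from the original code:

```python
import torch
import torch.nn as nn

class SCSEBlock(nn.Module):
    """Concurrent spatial and channel squeeze-and-excitation attention."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        # Channel SE: global pooling -> bottleneck MLP -> per-channel gate
        self.cse = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        # Spatial SE: 1x1 conv -> per-pixel gate
        self.sse = nn.Sequential(nn.Conv2d(channels, 1, kernel_size=1),
                                 nn.Sigmoid())

    def forward(self, x):
        # Recalibrate the feature map along both axes and sum the branches
        return x * self.cse(x) + x * self.sse(x)

x = torch.randn(2, 64, 16, 16)
print(SCSEBlock(64)(x).shape)  # torch.Size([2, 64, 16, 16])
```

The block is applied after each decoder stage; a hypercolumn then concatenates upsampled feature maps from several decoder depths before the final 1x1 convolution.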
I added a binary empty vs non-empty classifier head. Deep supervision helps the model converge quickly and increases the LB score.
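One way the combined loss might look, assuming the model returns a full-resolution mask, an auxiliary deep-supervision mask, and a single empty/non-empty logit; the weights 0.5 and 0.1 are illustrative, not from the original code:

```python
import torch
import torch.nn.functional as F

def deep_supervised_loss(mask_logits, aux_logits, cls_logits, target_mask):
    """Sketch: segmentation loss + deep-supervision loss + classifier loss."""
    # Image-level label: 1 if the mask contains any salt pixels
    is_nonempty = (target_mask.flatten(1).sum(dim=1) > 0).float()
    seg = F.binary_cross_entropy_with_logits(mask_logits, target_mask)
    aux = F.binary_cross_entropy_with_logits(aux_logits, target_mask)
    cls = F.binary_cross_entropy_with_logits(cls_logits.squeeze(1), is_nonempty)
    return seg + 0.5 * aux + 0.1 * cls
```

At inference, images the classifier predicts as empty can have their masks zeroed out, which avoids penalties from spurious small predictions.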
In my experiments, training with only lovasz_loss using elu+1 was better than training with BCE in stage #1 and Lovász in stage #2.
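The elu+1 trick replaces the ReLU in the standard Lovász hinge with `elu(x) + 1`, so well-classified pixels still receive gradient. A sketch based on the widely used reference implementation:

```python
import torch
import torch.nn.functional as F

def lovasz_grad(gt_sorted):
    """Gradient of the Lovász extension w.r.t. sorted errors."""
    p = len(gt_sorted)
    gts = gt_sorted.sum()
    intersection = gts - gt_sorted.cumsum(0)
    union = gts + (1.0 - gt_sorted).cumsum(0)
    jaccard = 1.0 - intersection / union
    if p > 1:
        jaccard[1:p] = jaccard[1:p] - jaccard[0:-1]
    return jaccard

def lovasz_hinge_flat(logits, labels):
    """Binary Lovász hinge on flattened logits/labels (labels in {0, 1})."""
    signs = 2.0 * labels - 1.0
    errors = 1.0 - logits * signs
    errors_sorted, perm = torch.sort(errors, dim=0, descending=True)
    grad = lovasz_grad(labels[perm])
    # elu+1 instead of relu: smooth, strictly positive surrogate
    return torch.dot(F.elu(errors_sorted) + 1.0, grad)
```

Because this loss operates on raw logits, the final layer has no sigmoid during training.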
SGDR with a cyclic learning rate.
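SGDR (stochastic gradient descent with warm restarts) anneals the learning rate along a cosine curve and then resets it, which also enables snapshot ensembling at each restart. A minimal sketch using PyTorch's built-in scheduler; the cycle length and learning-rate bounds are illustrative:

```python
import torch
from torch.optim.lr_scheduler import CosineAnnealingWarmRestarts

model = torch.nn.Linear(10, 1)  # placeholder for the U-Net
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
# Cosine-annealed LR that restarts every T_0 epochs
# (T_0=50 and eta_min=1e-4 are assumed values, not from the original code)
scheduler = CosineAnnealingWarmRestarts(optimizer, T_0=50, eta_min=1e-4)

for epoch in range(150):
    # train_one_epoch(model, optimizer)  # training loop omitted
    scheduler.step()
```

Saving a checkpoint just before each restart gives the snapshots used for ensembling.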
1st place by b.e.s.
4th place by SeuTao
5th place by AlexenderLiao
8th place by Igor Krashenyi
9th place by tugstugi
11th place by alexisrozhkov
22nd place by Vishunu
27th place by Roman Vlasov
32nd place by Oleg Yaroshevskyy
43rd place by n01z3