
WeNet 3.0.0

Released by @xingchensong on 25 Jan, 11:59 · baaa27a

❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤

New Features

  • Upgrade libtorch CPU runtime with IPEX version #1893
  • Refine ctc alignment #1966
  • Use torchrun for distributed training #2020, #2021 (see the launch sketch after this list)
  • Refine training code #2055, #2103, #2123, #2248, #2252, #2253, #2270, #2286, #2288, #2312 (!! big changes !!) 🚀
  • Move all ctc functions to ctc_utils.py #2057 (!! big changes !!) 🚀
  • Move search methods to search.py #2056 (!! big changes !!) 🚀
  • Move all k2-related functions to k2 #2058
  • Refactor and simplify decoding methods #2061, #2062
  • Unify decode results of all decoding methods #2063
  • refactor(dataset): return dict instead of tuple #2106, #2111 (see the dict-sample sketch after this list)
  • init_model API changed #2116, #2216 (!! big changes !!) 🚀
  • Move yaml saving to save_model() #2156
  • Refine tokenizer #2165, #2186 (!! big changes !!) 🚀
  • Deprecate wenetruntime #2194 (!! big changes !!) 🚀
  • Use pre-commit to auto-check and lint #2195
  • refactor(yaml): Configure ctc/cmvn/tokenizer in train.yaml #2205, #2229, #2230, #2227, #2232 (!! big changes !!) 🚀
  • Train with dict input #2242, #2243 (!! big changes !!) 🚀
  • [dataset] Keep pcm for other tasks #2268
  • Upgrade torch to 2.x #2301 (!! big changes !!) 🚀
  • Log everything to TensorBoard #2307
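
For the torchrun item above, the sketch below shows the launch pattern it enables: torchrun spawns the workers and exports RANK, LOCAL_RANK and WORLD_SIZE (plus the rendezvous address), so each process can join the group straight from the environment. The launch flags and script name in the comment are illustrative assumptions, not the literal recipe command.

```python
# Minimal sketch of a torchrun-launched worker (not WeNet's actual entry point).
# Launched e.g. as (flags are illustrative):
#   torchrun --nnodes=1 --nproc_per_node=8 train.py --config train.yaml
import os

import torch
import torch.distributed as dist


def main():
    # torchrun exports these variables for every worker it spawns.
    rank = int(os.environ.get("RANK", 0))
    local_rank = int(os.environ.get("LOCAL_RANK", 0))
    world_size = int(os.environ.get("WORLD_SIZE", 1))

    backend = "nccl" if torch.cuda.is_available() else "gloo"
    dist.init_process_group(backend=backend)  # rendezvous via the env:// defaults
    if torch.cuda.is_available():
        torch.cuda.set_device(local_rank)

    print(f"worker {rank}/{world_size} ready on local rank {local_rank}")
    # ... build the dataset/model here and wrap the model in DistributedDataParallel ...
    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```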

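The dataset and training refactors above (return dict instead of tuple, train with dict input) follow the pattern sketched below: each sample is a named dict rather than a positional tuple, so new fields can be added without breaking every call site. The field names here are hypothetical placeholders, not necessarily the exact schema used in the repo.

```python
# Illustration of the tuple-to-dict refactor; the keys are hypothetical examples.
from typing import Dict

import torch


def make_sample(key: str, feats: torch.Tensor, label: torch.Tensor) -> Dict:
    return {
        "key": key,                                     # utterance id
        "feats": feats,                                 # acoustic features (or raw pcm)
        "feats_lengths": torch.tensor(feats.size(0)),   # number of frames
        "target": label,                                # token ids
        "target_lengths": torch.tensor(label.size(0)),  # number of tokens
    }


batch = make_sample("utt1", torch.zeros(83, 80), torch.tensor([2, 7, 9]))
# With dict input, a model forward can accept the whole dict:
#   loss_dict = model(batch)   # instead of model(feats, feats_lens, target, ...)
```
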
New Bug Fixes

  • Fix NST recipe #1863
  • Fix Librispeech fst dict #1929
  • Fix bug when making shard.list for *.flac #1933
  • Fix a bug in the transducer #1940
  • Avoid problems during model averaging when there is parameter tying #2113
  • [loss] Set zero_infinity=True to ignore NaN or inf ctc_loss #2299 (see the sketch after this list)
  • Fix Android #2303
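
For context on the zero_infinity fix above: PyTorch's CTCLoss returns inf for a sample whose target cannot be aligned within its input length, and zero_infinity=True zeroes that loss (and its gradient) instead of letting inf/NaN poison the training step. A minimal standalone sketch, not the repo's actual loss module:

```python
import torch

# zero_infinity=True: impossible alignments contribute 0 instead of inf/NaN.
ctc = torch.nn.CTCLoss(blank=0, reduction="mean", zero_infinity=True)

log_probs = torch.randn(50, 4, 30).log_softmax(2)     # (time, batch, vocab)
targets = torch.randint(1, 30, (4, 10), dtype=torch.long)
input_lengths = torch.tensor([50, 50, 5, 50])          # 3rd sample is too short
target_lengths = torch.tensor([10, 10, 10, 10])        # ... to emit 10 tokens

loss = ctc(log_probs, targets, input_lengths, target_lengths)
print(loss)  # finite, even though one sample cannot be aligned
```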

❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤❤
Many thanks to all the contributors!!!!! I love you all.