The Emotion is Not One-hot Encoding: Learning with Grayscale Label for Emotion Recognition in Conversation (INTERSPEECH 2022)

  • The overall flow of our model.
  • This repository provides training/test code for RoBERTa. For the comparison systems, please use their respective open-source implementations. A brief sketch of the grayscale-label idea is given below.
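
The core idea of the paper is to supervise the model with a soft ("grayscale") label distribution rather than a one-hot target. The snippet below is a minimal illustrative sketch, not the repository's implementation: combining a hard cross-entropy term with a soft-label term weighted by alpha (the --weight argument described under Train) is an assumption based on this README.

import torch
import torch.nn.functional as F

def grayscale_loss(logits, gold_index, grayscale_probs, alpha):
    # Hard-label term: standard cross-entropy against the gold class index.
    hard = F.cross_entropy(logits, gold_index)
    # Soft-label term: cross-entropy against the grayscale distribution
    # (e.g., produced by a teacher model).
    soft = -(grayscale_probs * F.log_softmax(logits, dim=-1)).sum(dim=-1).mean()
    # alpha corresponds to the --weight argument (alpha in the paper).
    return hard + alpha * soft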

Requirements

  1. PyTorch 1.8
  2. Python 3.6
  3. Transformers 4.4.0
  4. scikit-learn
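
For example, the dependencies can be installed with pip (a sketch; the exact PyTorch build for your CUDA setup may differ):

pip install torch==1.8.0 transformers==4.4.0 scikit-learn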

Datasets

Each dataset is split into train/dev/test in the dataset folder.

  1. IEMOCAP
  2. DailyDialog
  3. MELD
  4. EmoryNLP

Train

Step 1: Training the teacher model

  • For the self-method:
cd self_teacher
python3 train.py {--argument}
  • For the future-self-method:
cd self_future_teacher
python3 train.py {--argument}

Step 2: Training the student model (final model)

cd gray
python3 train_{method}.py {--argument}
  • Method names

    • C: category method
    • W: word-embedding method
    • F: future-self method
    • S: self-method
    • SA: self-adjust method
  • Arguments (an example invocation is shown below)

    • gray: grayscale label type, one of heuristic (C), word (W), teacher (S), teacher_post (SA), teacher_future (FS), teacher_future_post (FSA)
    • pretrained: pretrained model type (roberta-large)
    • cls: label class, emotion or sentiment (default: emotion)
    • weight: weight value for the loss (denoted $\alpha$ in the paper)
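
For example, a self-method student run might look like the following (the --weight value is illustrative, not a prescribed default):

cd gray
python3 train_S.py --gray teacher --pretrained roberta-large --cls emotion --weight 0.5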

Citation

@inproceedings{lee22e_interspeech,
  author={Joosung Lee},
  title={{The Emotion is Not One-hot Encoding: Learning with Grayscale Label for Emotion Recognition in Conversation}},
  year=2022,
  booktitle={Proc. Interspeech 2022},
  pages={141--145},
  doi={10.21437/Interspeech.2022-551}
}
