
Aspect Term Extraction with History Attention and Selective Transformation (IJCAI 2018)

HAST

Aspect Term Extraction with History Attention and Selective Transformation.

Requirements

  • Python 3.6
  • DyNet 2.0.2 (to build DyNet and enable the Python bindings, please follow the instructions in the official DyNet installation documentation)
  • nltk 3.2.2
  • numpy 1.13.3

External Linguistic Resources

Preprocessing

  • Window-based input (window size of 3, as done in Pengfei's work).
  • Punctuation tokens are replaced with a single placeholder token PUNCT.
  • Only sentiment words with strong subjectivity are used to provide distant supervision.
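Assuming a straightforward reading of the two input-preparation steps above, the punctuation replacement and window construction could be sketched as follows (the function names and the padding token are illustrative, not the authors' code):

```python
import string

PAD, PUNCT = "<pad>", "PUNCT"

def normalize(tokens):
    # Replace every punctuation-only token with the same placeholder PUNCT.
    return [PUNCT if all(ch in string.punctuation for ch in t) else t
            for t in tokens]

def windows(tokens, win=3):
    # For each position, gather a window of `win` tokens centered on it
    # (the token plus its neighbours), padding at sentence boundaries.
    half = win // 2
    padded = [PAD] * half + tokens + [PAD] * half
    return [padded[i:i + win] for i in range(len(tokens))]

sent = normalize("The fish , however , is unquestionably fresh .".split())
feats = windows(sent, win=3)
```

Each token is then represented by its window of context words rather than in isolation, which matches the "window-based input" description above.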

Running

python main.py -ds_name [YOUR_DATASET_NAME] -sgd_lr [YOUR_LEARNING_RATE_FOR_SGD] -win [YOUR_WINDOW_SIZE] -optimizer [YOUR_OPTIMIZER] -rnn_type [LSTM|GRU] -attention_type [bilinear|concat]
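A concrete invocation might look like the following (the dataset name rest14 and the hyper-parameter values here are illustrative assumptions, not values prescribed by the authors):

```shell
# Example run: SGD with a window size of 3, LSTM cells, bilinear attention.
python main.py -ds_name rest14 -sgd_lr 0.07 -win 3 -optimizer sgd -rnn_type LSTM -attention_type bilinear
```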

Environment

  • OS: RHEL Server 6.4 (Santiago)
  • CPU: Intel Xeon CPU E5-2620 (yes, we do not use a GPU)

Citation

If you use this code in your research, please star this repo and cite our paper as follows:

@inproceedings{li2018aspect,
  title={Aspect Term Extraction with History Attention and Selective Transformation},
  author={Li, Xin and Bing, Lidong and Li, Piji and Lam, Wai and Yang, Zhimou},
  booktitle={IJCAI},
  pages={4194--4200},
  year={2018}
}
