Aspect Term Extraction with History Attention and Selective Transformation.
- Python 3.6
- DyNet 2.0.2 (to build DyNet and enable the Python bindings, please follow the instructions in the DyNet documentation)
- nltk 3.2.2
- numpy 1.13.3
- GloVe Word Embeddings (840B, 2.2M vocab).
- MPQA Subjectivity Lexicon
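Pre-trained GloVe files are plain text, one token per line followed by its vector components. A minimal loading sketch (the function name and the tiny in-memory sample are illustrative, not the repo's actual loader; real GloVe 840B vectors are 300-dimensional):

```python
import io
import numpy as np

def load_glove(handle):
    """Parse GloVe-style text lines into a {word: vector} dict."""
    embeddings = {}
    for line in handle:
        parts = line.rstrip().split(" ")
        word, vec = parts[0], np.asarray(parts[1:], dtype="float32")
        embeddings[word] = vec
    return embeddings

# Tiny in-memory sample with 3-d vectors, standing in for the real file
sample = io.StringIO("good 0.1 0.2 0.3\nbad -0.1 -0.2 -0.3\n")
emb = load_glove(sample)
```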
- Window-based input (window size is 3, as done in Pengfei's work).
- Replacing all punctuation with the same token PUNCT.
- Only the sentiment words with strong subjectivity are employed to provide distant supervision.
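The preprocessing steps above can be sketched roughly as follows (the PAD token, the window handling, and the tiny strong-subjectivity set are illustrative assumptions, not the repo's actual code; a real run would read the MPQA lexicon's strongsubj entries):

```python
import string

PUNCT = "PUNCT"
PAD = "PAD"  # assumed padding token for sentence boundaries

# Hypothetical stand-in for MPQA entries marked as strongly subjective
STRONG_SUBJ = {"great", "terrible"}

def preprocess(tokens, win=3):
    # Replace every punctuation-only token with the single token PUNCT
    tokens = [PUNCT if all(ch in string.punctuation for ch in t) else t
              for t in tokens]
    # Window-based input: each position sees itself plus win//2 neighbours per side
    half = win // 2
    padded = [PAD] * half + tokens + [PAD] * half
    windows = [padded[i:i + win] for i in range(len(tokens))]
    # Distant supervision only from strongly subjective sentiment words
    sentiment_tags = [1 if t.lower() in STRONG_SUBJ else 0 for t in tokens]
    return windows, sentiment_tags

windows, tags = preprocess(["The", "screen", "is", "great", "!"])
```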
python main.py -ds_name [YOUR_DATASET_NAME] -sgd_lr [YOUR_LEARNING_RATE_FOR_SGD] -win [YOUR_WINDOW_SIZE] -optimizer [YOUR_OPTIMIZER] -rnn_type [LSTM|GRU] -attention_type [bilinear|concat]
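An example invocation (the dataset name and hyperparameter values below are illustrative placeholders, not recommended settings):

```shell
python main.py -ds_name laptop14 -sgd_lr 0.07 -win 3 -optimizer sgd -rnn_type LSTM -attention_type bilinear
```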
- OS: RHEL Server 6.4 (Santiago)
- CPU: Intel Xeon CPU E5-2620 (yes, we do not use a GPU)
If you use this code in your research, please star this repo and cite our paper as follows:
@inproceedings{li2018aspect,
title={Aspect Term Extraction with History Attention and Selective Transformation},
author={Li, Xin and Bing, Lidong and Li, Piji and Lam, Wai and Yang, Zhimou},
booktitle={IJCAI},
pages={4194--4200},
year={2018}
}