Interacted grid tagging for sentiment word extraction

WANG Wei, LI Ting, GE Hongwei

(School of Artificial Intelligence and Computer Science, Jiangnan University, Wuxi 214122, China)


Abstract: As people’s quality of life improves, their expectations for service quality keep rising, and merchants now need to judge users’ reviews not simply as good or bad but with respect to specific aspects. Traditional sentiment classification can hardly complete this task, and sentiment word extraction from text is the key to it. Most existing models are trained through a pipeline of separate subtasks. To address error propagation across multiple subtasks and to learn deeper representations of the data, an aspect sentiment analysis model based on an interactive grid tagging scheme over bidirectional encoder representations from transformers (BERT), named IGTS-BERT, was proposed. Firstly, rotary position embedding was used to enhance the model’s sensitivity to position, and word pairs were labeled in a low-dimensional manner to improve learning efficiency. Then, the two grid tagging networks built on the rotary position encoding were made to learn from each other interactively, giving the model better generalization ability. Finally, to further exploit this interaction, two data augmentation methods were proposed: a splicing method and a method that randomly rematches aspect words with sentiment words, which further improves model performance. Triplet extraction and pair extraction tasks were evaluated on four standard datasets, with an average improvement of more than 4% in F1 score and a maximum improvement of 7%. Experimental results show that the IGTS-BERT model achieves superior performance on sentiment word extraction.
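The abstract touches on three mechanisms: rotary position embedding (RoPE) applied to BERT-style representations, a grid tagging head that scores every word pair, and an interaction term that makes two tagging passes agree. The following is a minimal PyTorch sketch of how such pieces could fit together, not the paper's implementation: the `GridTagger` module, the dropout rate, the tag count and the `interaction_loss` form (a symmetric KL, in the spirit of R-drop) are all illustrative assumptions.

```python
# Minimal sketch, assuming a BERT-style encoder output of shape (batch, seq_len, dim),
# RoPE applied to that output, and two stochastic passes of a word-pair tagging head
# whose tag distributions are pulled together with a symmetric KL term.
import torch
import torch.nn as nn
import torch.nn.functional as F


def rotary_position_embedding(x: torch.Tensor) -> torch.Tensor:
    """Apply RoPE to x of shape (batch, seq_len, dim); dim must be even."""
    _, seq_len, dim = x.shape
    half = dim // 2
    # Standard RoPE frequency schedule: theta_i = 10000^(-2i/dim)
    theta = 10000.0 ** (-torch.arange(0, half, dtype=x.dtype, device=x.device) / half)
    pos = torch.arange(seq_len, dtype=x.dtype, device=x.device)
    angles = torch.einsum("s,d->sd", pos, theta)          # (seq_len, half)
    sin, cos = angles.sin(), angles.cos()
    x1, x2 = x[..., :half], x[..., half:]
    # Rotate each (x1, x2) channel pair by its position-dependent angle
    return torch.cat([x1 * cos - x2 * sin, x1 * sin + x2 * cos], dim=-1)


class GridTagger(nn.Module):
    """Score every word pair (i, j) with a tag distribution (hypothetical head)."""

    def __init__(self, dim: int, num_tags: int):
        super().__init__()
        self.dropout = nn.Dropout(0.3)                    # assumed rate
        self.pair_scorer = nn.Linear(2 * dim, num_tags)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        h = self.dropout(rotary_position_embedding(h))
        seq_len = h.size(1)
        hi = h.unsqueeze(2).expand(-1, -1, seq_len, -1)   # (b, s, s, d)
        hj = h.unsqueeze(1).expand(-1, seq_len, -1, -1)   # (b, s, s, d)
        return self.pair_scorer(torch.cat([hi, hj], dim=-1))  # (b, s, s, tags)


def interaction_loss(logits_a: torch.Tensor, logits_b: torch.Tensor) -> torch.Tensor:
    """Symmetric KL between two tagging passes, encouraging them to agree."""
    log_pa = F.log_softmax(logits_a, dim=-1)
    log_pb = F.log_softmax(logits_b, dim=-1)
    kl_ab = F.kl_div(log_pa, log_pb.exp(), reduction="batchmean")
    kl_ba = F.kl_div(log_pb, log_pa.exp(), reduction="batchmean")
    return 0.5 * (kl_ab + kl_ba)


if __name__ == "__main__":
    encoder_out = torch.randn(2, 16, 64)                  # stand-in for BERT output
    tagger = GridTagger(dim=64, num_tags=6)
    logits_a = tagger(encoder_out)                        # two passes, two dropout masks
    logits_b = tagger(encoder_out)
    print(interaction_loss(logits_a, logits_b))
```

Splitting the channels in half for the rotation follows the common GPT-NeoX-style RoPE variant; the interleaved-pair formulation of RoFormer is equivalent up to a permutation of dimensions.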


Key words: sentiment word extraction; grid tagging scheme (GTS); rotary position embedding (RoPE); interactive learning; data augmentation



Aspect sentiment word extraction based on mutual-learning grid tagging

WANG Wei, LI Ting, GE Hongwei

(School of Artificial Intelligence and Computer Science, Jiangnan University, Wuxi 214122, Jiangsu, China)



Abstract: As people’s quality of life improves, their requirements for service quality keep rising, and users’ evaluation of merchants has moved from a simple good-or-bad judgment to specific aspects; traditional sentiment classification methods cannot solve this problem, and sentiment aspect extraction from text is the key. Most existing models are trained with segmented subtasks. To address error propagation across multiple subtasks and to deepen the learning of the data, an aspect sentiment analysis model based on BERT with mutual-learning grid tagging was proposed. First, rotary position embedding was used to strengthen the model’s sensitivity to position, and word pairs were labeled in a dimension-reduced manner to improve learning efficiency. Then, the two position-encoded grid tagging networks were made to learn from each other, giving the model better generalization ability. Finally, to further exploit mutual learning, two data augmentation methods were proposed: a splicing method and a “grafting” method that rematches aspect words with sentiment words, further improving model performance. Triplet extraction and pair extraction tasks were tested on four standard datasets, with an average F1 improvement of more than 4% and a maximum improvement of 7%. Experimental results show that the IGTS-BERT model exhibits superior performance on sentiment word extraction.

Key words: aspect sentiment word extraction; grid tagging; rotary position embedding; mutual learning; data augmentation

Citation format: WANG Wei, LI Ting, GE Hongwei. Interacted grid tagging for sentiment word extraction. Journal of Measurement Science and Instrumentation, 2023, 14(3): 369-378. DOI: 10.3969/j.issn.1674-8042.2023.03.014

