Knowledge-Fusion-Based Iterative Graph Structure Learning Framework for Implicit Sentiment Identification
Abstract
1. Introduction
- Defects caused by a single-view graph structure. Implicit sentiment mining is a non-trivial task, and a suitable text graph structure can accurately represent the structural relations among tokens. However, current graph structures for implicit sentiment text are built from a single view and lack the structural information needed to capture obscure sentiment expressions.
- Deficiencies arising from predefined fixed graph structures. The graph structures used by GNNs for text analysis are expected to be reliable, but they are usually extracted from human prior knowledge, such as syntactic dependency trees and co-occurrence statistics, and therefore inevitably contain uncertain, redundant, incorrect, and missing edges [13,14]. Since implicit sentiment contains no explicit sentiment words, mining it requires an accurate topology to represent the semantics, and such noisy edges can seriously impair the performance of implicit sentiment analysis.
- We propose a new implicit sentiment analysis framework (KIG) for the joint iterative learning of graph structures and multi-view knowledge fusion. KIG improves implicit sentiment analysis by fusing multi-view graph structures to obtain an integrated understanding of the knowledge each view provides.
- Higher-quality graph structures and node representations are obtained through iterative learning, and KIG terminates dynamically once the learned graph approaches the graph that is optimal for implicit sentiment analysis (a simplified sketch of this loop is given after this list). To the best of our knowledge, KIG is the first attempt to apply an iterative method to implicit sentiment analysis.
- Extensive experiments on benchmark implicit sentiment datasets validate the superiority of our framework.
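To make the contribution list concrete, here is a minimal NumPy sketch (not the authors' implementation) of joint multi-view fusion and iterative graph learning: several initial adjacency views are averaged, node representations are updated with one GCN-style propagation step, a new graph is re-estimated from those representations, and the loop stops once the fused graph stops changing. The similarity metric (cosine), the blending weight `alpha`, the stopping tolerance `tol`, and all function names are illustrative assumptions rather than details taken from the paper.

```python
import numpy as np

def normalize_adj(a):
    """Symmetric normalization D^{-1/2} (A + I) D^{-1/2}, as commonly used by GCNs."""
    a = a + np.eye(a.shape[0])
    d_inv_sqrt = np.diag(np.clip(a.sum(axis=1), 1e-12, None) ** -0.5)
    return d_inv_sqrt @ a @ d_inv_sqrt

def gcn_layer(a_norm, h, w):
    """One graph-convolution step with a ReLU non-linearity."""
    return np.maximum(a_norm @ h @ w, 0.0)

def learn_graph(h):
    """Re-estimate an adjacency matrix from node features via cosine similarity
    (a stand-in for a learned similarity metric)."""
    h = h / (np.linalg.norm(h, axis=1, keepdims=True) + 1e-12)
    return np.clip(h @ h.T, 0.0, None)

def kig_sketch(views, x, w, alpha=0.5, max_iters=10, tol=1e-3):
    """Fuse several graph views, then alternate between updating node
    representations and re-learning the graph until it stabilizes."""
    fused = sum(views) / len(views)                 # uniform fusion of the initial views
    h = x
    for _ in range(max_iters):
        h = gcn_layer(normalize_adj(fused), x, w)
        updated = alpha * fused + (1 - alpha) * learn_graph(h)
        converged = np.linalg.norm(updated - fused) < tol   # dynamic stopping criterion
        fused = updated
        if converged:
            break
    return fused, h

# Illustrative usage with three random "views" (e.g., co-occurrence, similarity, syntax):
rng = np.random.default_rng(0)
views = [rng.random((6, 6)) for _ in range(3)]
adjacency, node_repr = kig_sketch(views, x=rng.random((6, 8)), w=rng.random((8, 4)))
```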
2. Related Work
2.1. Implicit Sentiment Analysis
2.2. Graph Neural Network Applications
3. Materials and Methods
3.1. Problem Statement
3.2. Methodology
3.2.1. Text Encoder and Graph Construction
- a. Co-occurrence statistics
- b. Cosine similarity
- c. Syntactic structures (all three views are sketched in the example below)
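As a rough illustration of how the three views listed above might be assembled (an assumption-laden sketch in plain NumPy, not the paper's exact construction), the snippet below builds a positive-PMI co-occurrence graph from a sliding window, a thresholded cosine-similarity graph from token embeddings, and an unweighted syntactic graph from head–dependent index pairs supplied by any external parser; the window size, the threshold, and the function names are all illustrative.

```python
import numpy as np
from collections import Counter
from itertools import combinations

def cooccurrence_pmi_graph(tokens, window=3):
    """Edge weight = positive PMI of two tokens appearing in the same sliding window."""
    vocab = {t: i for i, t in enumerate(dict.fromkeys(tokens))}
    pair_counts, token_counts, n_windows = Counter(), Counter(), 0
    for start in range(max(1, len(tokens) - window + 1)):
        win = set(tokens[start:start + window])
        n_windows += 1
        token_counts.update(win)
        pair_counts.update(tuple(sorted(p)) for p in combinations(win, 2))
    adj = np.zeros((len(vocab), len(vocab)))
    for (a, b), c in pair_counts.items():
        pmi = np.log(c * n_windows / (token_counts[a] * token_counts[b]))
        if pmi > 0:
            i, j = vocab[a], vocab[b]
            adj[i, j] = adj[j, i] = pmi
    return adj, vocab

def cosine_similarity_graph(embeddings, threshold=0.2):
    """Edge weight = cosine similarity between token embeddings, thresholded for sparsity."""
    e = embeddings / (np.linalg.norm(embeddings, axis=1, keepdims=True) + 1e-12)
    sim = e @ e.T
    return np.where(sim > threshold, sim, 0.0)

def syntactic_graph(n_tokens, head_dependent_pairs):
    """Unweighted edges between each token and its syntactic head (indices from any parser)."""
    adj = np.zeros((n_tokens, n_tokens))
    for head, dep in head_dependent_pairs:
        adj[head, dep] = adj[dep, head] = 1.0
    return adj
```

Each function returns an adjacency matrix over the same token set, so the resulting views can be passed directly to a fusion step of the kind described in Section 3.2.3.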
3.2.2. Graph Learning
3.2.3. Graph Fusion
3.2.4. Graph Regularization
4. Experiments and Analysis of Results
4.1. Experimental Setup
4.1.1. Datasets
4.1.2. Evaluation Index
4.2. Baseline and Parameter Settings
4.3. Main Results
4.4. Ablation Experiment
4.5. Parameter Analysis
4.6. Number of Iterations
4.7. Case Study
4.8. Error Analysis
5. Conclusions and Future Work
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Hatzivassiloglou, V.; McKeown, K. Predicting the semantic orientation of adjectives. In Proceedings of the 35th Annual Meeting of the Association for Computational Linguistics and 8th Conference of the European Chapter of the Association for Computational Linguistics, Madrid, Spain, 7–12 July 1997; Association for Computational Linguistics: Stroudsburg, PA, USA, 1997; pp. 174–181. [Google Scholar]
- Liu, B. Sentiment analysis and opinion mining. Synth. Lect. Hum. Lang. Technol. 2012, 5, 1–167. [Google Scholar]
- Liao, J.; Wang, S.; Li, D. Identification of fact-implied implicit sentiment based on multi-level semantic fused representation. Knowl.-Based Syst. 2019, 165, 197–207. [Google Scholar] [CrossRef]
- Ma, Y.; Peng, H.; Khan, T.; Cambria, E.; Hussain, A. Sentic LSTM: A hybrid network for targeted aspect-based sentiment analysis. Cogn. Comput. 2018, 10, 639–650. [Google Scholar] [CrossRef]
- Xiao, L.; Hu, X.; Chen, Y.; Xue, Y.; Chen, B.; Gu, D.; Tang, B. Multi-head self-attention based gated graph convolutional networks for aspect-based sentiment classification. Multimed. Tools Appl. 2022, 81, 19051–19070. [Google Scholar] [CrossRef]
- Yuan, P.; Jiang, L.; Liu, J.; Zhou, D.; Li, P.; Gao, Y. Dual-Level Attention Based on a Heterogeneous Graph Convolution Network for Aspect-Based Sentiment Classification. Wirel. Commun. Mob. Comput. 2021, 2021, 6625899. [Google Scholar] [CrossRef]
- Li, J.; Liu, M.; Zheng, Z.; Zhang, H.; Qin, B.; Kan, M.Y.; Liu, T. Dadgraph: A discourse-aware dialogue graph neural network for multiparty dialogue machine reading comprehension. In Proceedings of the 2021 International Joint Conference on Neural Networks (IJCNN), Shenzhen, China, 18–22 July 2021; pp. 1–8. [Google Scholar]
- Song, L.; Wang, Z.; Yu, M.; Zhang, Y.; Florian, R.; Gildea, D. Exploring graph-structured passage representation for multi-hop reading comprehension with graph neural networks. arXiv 2018, arXiv:1809.02040. [Google Scholar]
- Marcheggiani, D.; Bastings, J.; Titov, I. Exploiting semantics in neural machine translation with graph convolutional networks. arXiv 2018, arXiv:1804.08313. [Google Scholar]
- Bastings, J.; Titov, I.; Aziz, W.; Marcheggiani, D.; Sima’an, K. Graph convolutional encoders for syntax-aware neural machine translation. arXiv 2017, arXiv:1704.04675. [Google Scholar]
- Yao, L.; Mao, C.; Luo, Y. Graph convolutional networks for text classification. In Proceedings of the AAAI Conference on Artificial Intelligence, Honolulu, HI, USA, 29–31 January 2019; Volume 33, pp. 7370–7377. [Google Scholar]
- Huang, L.; Ma, D.; Li, S.; Zhang, X.; Wang, H. Text level graph neural network for text classification. arXiv 2019, arXiv:1910.02356. [Google Scholar]
- Moschitti, A. Efficient convolution kernels for dependency and constituent syntactic trees. In Proceedings of the Machine Learning: ECML 2006: 17th European Conference on Machine Learning, Berlin, Germany, 18–22 September 2006; Proceedings 17. Springer: Berlin/Heidelberg, Germany, 2006; pp. 318–329. [Google Scholar]
- Dagan, I.; Lee, L.; Pereira, F.C. Similarity-based models of word cooccurrence probabilities. Mach. Learn. 1999, 34, 43–69. [Google Scholar] [CrossRef] [Green Version]
- Dettmers, T.; Minervini, P.; Stenetorp, P.; Riedel, S. Convolutional 2d knowledge graph embeddings. In Proceedings of the AAAI Conference on Artificial Intelligence, New Orleans, LA, USA, 2–7 February 2018; Volume 32. [Google Scholar]
- Hochreiter, S.; Schmidhuber, J. Long short-term memory. Neural Comput. 1997, 9, 1735–1780. [Google Scholar] [CrossRef]
- Chung, J.; Gulcehre, C.; Cho, K.; Bengio, Y. Empirical evaluation of gated recurrent neural networks on sequence modeling. arXiv 2014, arXiv:1412.3555. [Google Scholar]
- Chen, Y. Convolutional Neural Network for Sentence Classification. Master’s Thesis, University of Waterloo, Waterloo, ON, Canada, 2015. [Google Scholar]
- Yang, Z.; Yang, D.; Dyer, C.; He, X.; Smola, A.; Hovy, E. Hierarchical attention networks for document classification. In Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, San Diego, CA, USA, 12–17 June 2016; pp. 1480–1489. [Google Scholar]
- Ma, D.; Li, S.; Zhang, X.; Wang, H. Interactive attention networks for aspect-level sentiment classification. arXiv 2017, arXiv:1709.00893. [Google Scholar]
- Wei, J.; Liao, J.; Yang, Z.; Wang, S.; Zhao, Q. BiLSTM with multi-polarity orthogonal attention for implicit sentiment analysis. Neurocomputing 2020, 383, 165–173. [Google Scholar] [CrossRef]
- Jaderberg, M.; Simonyan, K.; Vedaldi, A.; Zisserman, A. Reading text in the wild with convolutional neural networks. Int. J. Comput. Vis. 2016, 116, 1–20. [Google Scholar] [CrossRef] [Green Version]
- Conneau, A.; Kiela, D. Senteval: An evaluation toolkit for universal sentence representations. arXiv 2018, arXiv:1803.05449. [Google Scholar]
- Teng, Z.; Zhang, Y. Head-lexicalized bidirectional tree lstms. Trans. Assoc. Comput. Linguist. 2017, 5, 163–177. [Google Scholar] [CrossRef] [Green Version]
- Zhang, Y.; Qi, P.; Manning, C.D. Graph convolution over pruned dependency trees improves relation extraction. arXiv 2018, arXiv:1809.10185. [Google Scholar]
- Sun, C.; Huang, L.; Qiu, X. Utilizing BERT for aspect-based sentiment analysis via constructing auxiliary sentence. arXiv 2019, arXiv:1903.09588. [Google Scholar]
- Wu, Z.; Ong, D.C. Context-guided bert for targeted aspect-based sentiment analysis. In Proceedings of the AAAI Conference on Artificial Intelligence, Washington, DC, USA, 7–14 February 2021; Volume 35, pp. 14094–14102. [Google Scholar]
- Mou, L.; Li, G.; Zhang, L.; Wang, T.; Jin, Z. Convolutional neural networks over tree structures for programming language processing. In Proceedings of the AAAI Conference on Artificial Intelligence, Kumamoto, Japan, 10–14 July 2016; Volume 30. [Google Scholar]
- Zuo, E.; Zhao, H.; Chen, B.; Chen, Q. Context-specific heterogeneous graph convolutional network for implicit sentiment analysis. IEEE Access 2020, 8, 37967–37975. [Google Scholar] [CrossRef]
- Zuo, E.; Aysa, A.; Muhammat, M.; Zhao, Y.; Ubul, K. Context aware semantic adaptation network for cross domain implicit sentiment classification. Sci. Rep. 2021, 11, 22038. [Google Scholar] [CrossRef] [PubMed]
- Scarselli, F.; Gori, M.; Tsoi, A.C.; Hagenbuchner, M.; Monfardini, G. The graph neural network model. IEEE Trans. Neural Netw. 2008, 20, 61–80. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Kipf, T.N.; Welling, M. Semi-supervised classification with graph convolutional networks. arXiv 2016, arXiv:1609.02907. [Google Scholar]
- Wu, Z.; Pan, S.; Chen, F.; Long, G.; Zhang, C.; Philip, S.Y. A comprehensive survey on graph neural networks. IEEE Trans. Neural Netw. Learn. Syst. 2020, 32, 4–24. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Bruna, J.; Zaremba, W.; Szlam, A.; LeCun, Y. Spectral networks and locally connected networks on graphs. arXiv 2013, arXiv:1312.6203. [Google Scholar]
- Defferrard, M.; Bresson, X.; Vandergheynst, P. Convolutional neural networks on graphs with fast localized spectral filtering. Adv. Neural Inf. Process. Syst. 2016, 29, 3844–3852. [Google Scholar]
- Hamilton, W.; Ying, Z.; Leskovec, J. Inductive representation learning on large graphs. Adv. Neural Inf. Process. Syst. 2017, 30, 1024–1034. [Google Scholar]
- Velickovic, P.; Cucurull, G.; Casanova, A.; Romero, A.; Lio, P.; Bengio, Y. Graph attention networks. arXiv 2017, arXiv:1710.10903. [Google Scholar]
- Xu, K.; Hu, W.; Leskovec, J.; Jegelka, S. How powerful are graph neural networks? arXiv 2018, arXiv:1810.00826. [Google Scholar]
- Wu, F.; Souza, A.; Zhang, T.; Fifty, C.; Yu, T.; Weinberger, K. Simplifying graph convolutional networks. In Proceedings of the International Conference on Machine Learning, PMLR, Long Beach, CA, USA, 9–15 June 2019; pp. 6861–6871. [Google Scholar]
- Zhang, C.; Li, Q.; Song, D. Aspect-based sentiment classification with aspect-specific graph convolutional networks. arXiv 2019, arXiv:1909.03477. [Google Scholar]
- Li, M.; Chen, X.; Li, X.; Ma, B.; Vitányi, P.M. The similarity metric. IEEE Trans. Inf. Theory 2004, 50, 3250–3264. [Google Scholar] [CrossRef]
- Church, K.; Hanks, P. Word association norms, mutual information, and lexicography. Comput. Linguist. 1990, 16, 22–29. [Google Scholar]
- Pennington, J.; Socher, R.; Manning, C.D. Glove: Global vectors for word representation. In Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), Doha, Qatar, 25–29 October 2014; pp. 1532–1543. [Google Scholar]
- Mikolov, T.; Chen, K.; Corrado, G.; Dean, J. Efficient estimation of word representations in vector space. arXiv 2013, arXiv:1301.3781. [Google Scholar]
- Yang, D.; Lavie, A.; Dyer, C.; Hovy, E. Humor recognition and humor anchor extraction. In Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing, Lisbon, Portugal, 17–21 September 2015; pp. 2367–2376. [Google Scholar]
- Weller, O.; Seppi, K. Humor detection: A transformer gets the last laugh. arXiv 2019, arXiv:1909.00252. [Google Scholar]
- Zhao, Z.; Cattle, A.; Papalexakis, E.; Ma, X. Embedding lexical features via tensor decomposition for small sample humor recognition. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), Hong Kong, China, 3–7 November 2019; Association for Computational Linguistics: Stroudsburg, PA, USA, 2019; pp. 6377–6382. [Google Scholar]
- Chen, L.; Lee, C.M. Convolutional neural network for humor recognition. arXiv 2017, arXiv:1702.02584. [Google Scholar]
- Bertero, D.; Fung, P. A long short-term memory framework for predicting humor in dialogues. In Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, San Diego, CA, USA, 12–17 June 2016; pp. 130–135. [Google Scholar]
- Fan, X.; Lin, H.; Yang, L.; Diao, Y.; Shen, C.; Chu, Y.; Zhang, T. Phonetics and ambiguity comprehension gated attention network for humor recognition. Complexity 2020, 2020, 2509018. [Google Scholar] [CrossRef]
- Chen, Y.; Wu, L.; Zaki, M. Iterative deep graph learning for graph neural networks: Better and robust node embeddings. Adv. Neural Inf. Process. Syst. 2020, 33, 19314–19326. [Google Scholar]
- Kim, Y. Convolutional neural networks for sentence classification. arXiv 2014, arXiv:1408.5882. [Google Scholar]
- Lai, S.; Xu, L.; Liu, K.; Zhao, J. Recurrent convolutional neural networks for text classification. In Proceedings of the AAAI Conference on Artificial Intelligence, Austin, TX, USA, 25–30 January 2015; Volume 29. [Google Scholar]
- Johnson, R.; Zhang, T. Deep pyramid convolutional neural networks for text categorization. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), Vancouver, BC, Canada, 30 July–4 August 2017; pp. 562–570. [Google Scholar]
Dataset | Implicit Sentiment | Non-Implicit Sentiment | Average Length | Train | Dev | Test | Total |
---|---|---|---|---|---|---|---|
Puns | 2424 | 2403 | 13.5 | 2897 | 965 | 965 | 4827 |
Reddit | 2025 | 13,884 | 17 | 14,693 | 608 | 608 | 15,909 |
Actual/Predicted | Positive | Negative |
---|---|---|
Positive | True Positive (TP) | False Negative (FN) |
Negative | False Positive (FP) | True Negative (TN) |
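For concreteness, the snippet below shows how the evaluation indices follow from the four confusion-matrix cells above; the example counts are purely hypothetical and are not taken from the paper.

```python
def classification_metrics(tp, fn, fp, tn):
    """Accuracy, precision, recall, and F1 computed from the confusion-matrix cells."""
    accuracy = (tp + tn) / (tp + fn + fp + tn)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return accuracy, precision, recall, f1

# Hypothetical counts, for illustration only:
print(classification_metrics(tp=420, fn=30, fp=50, tn=465))
```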
Model | Epoch | Batch_Size | Max_Length | Learning_Rate | Dropout_Rate |
---|---|---|---|---|---|
DPCNN | 5 | 128 | 200 | 0.001 | 0.2 |
IDGL | 30 | 16 | 50 | 0.001 | 0.5 |
HAN | 5 | 128 | 200 | 0.001 | 0.2 |
TEXTCNN | 10 | 128 | 200 | 0.001 | 0.2 |
RCNN | 10 | 128 | 200 | 0.001 | 0.2 |
KIG (ours) | 50 | 16 | 50 | 0.001 | 0.5 |
Methods | Accuracy (%) | Recall (%) | F1 (%) |
---|---|---|---|
TM | 74.50 | 72.30 | 73.70 |
SVM | 83.85 | 82.52 | 84.18 |
HCFW2V | 85.40 | 88.80 | 85.90 |
Bi-LSTM+CNN | 85.38 | 91.97 | 86.37 |
CNN | 86.10 | 86.40 | 85.70 |
Bi-GRU | 87.72 | 92.46 | 88.15 |
PACGA | 88.69 | 92.76 | 90.81 |
KIG (ours) | 89.21 | 93.72 | 91.15 |
Methods | Body (%) | Punchline (%) | Full (%) |
---|---|---|---|
Human (General) | 49.30 | 59.20 | 66.30 |
DPCNN | 56.74 | 55.42 | 57.73 |
IDGL | 60.36 | 57.56 | 61.67 |
HAN | 61.18 | 63.48 | 64.14 |
TEXTCNN | 61.84 | 66.11 | 67.10 |
RCNN | 63.98 | 63.81 | 64.63 |
KIG (ours) | 64.37 | 64.25 | 67.84 |
Method | Puns Acc (%) | Reddit-Full Acc (%) |
---|---|---|
Single graph | 86.23 | 60.19 |
w/o IL | 87.78 | 62.66 |
w/o graph reg. | 86.12 | 59.70 |
Full-model KIG | 89.21 | 67.84 |