Determining community happiness index with transformers and attention-based deep learning

Hilman Singgih Wicaksana, Retno Kusumaningrum, Rahmat Gernowo

Abstract


In the current digital era, evaluating the quality of people's lives and their happiness index is closely tied to the expressions and opinions they share on the social media platform Twitter. Measuring population welfare goes beyond monetary aspects, focusing instead on subjective well-being, and sentiment analysis helps evaluate people's perceptions of the aspects of happiness. Aspect-based sentiment analysis (ABSA) effectively identifies sentiment toward predetermined aspects. Previous studies have applied Word-to-Vector (Word2Vec) and long short-term memory (LSTM) methods, with or without an attention mechanism (AM), to ABSA tasks. However, Word2Vec has a key limitation: it cannot capture the context of a word within a sentence. This study addresses that limitation with bidirectional encoder representations from transformers (BERT), which has the advantage of bidirectional training. Bayesian optimization is used as a hyperparameter tuning technique to find the best combination of parameters during training. We show that the BERT-LSTM-AM model outperforms the Word2Vec-LSTM-AM model in predicting both aspect and sentiment, and we find that BERT is the best state-of-the-art embedding technique for representing words in a sentence. Our results demonstrate that BERT as an embedding technique can significantly improve model performance over Word2Vec.
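The model the abstract describes feeds contextual token embeddings (from BERT) through an LSTM and then pools the resulting hidden states with an attention mechanism before classification. A minimal NumPy sketch of that attention-pooling step is shown below; the shapes, weights, and function names are illustrative assumptions (random vectors stand in for BERT/LSTM outputs), not the authors' implementation.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_pool(H, w, b):
    """Feed-forward attention: score each timestep, normalize the
    scores with softmax, and return the weighted sum of hidden states."""
    scores = np.tanh(H @ w + b)   # one scalar score per timestep, shape (T,)
    alpha = softmax(scores)       # attention weights, non-negative, sum to 1
    context = alpha @ H           # weighted sum of hidden states, shape (d,)
    return context, alpha

rng = np.random.default_rng(0)
T, d = 5, 8                       # 5 timesteps, hidden size 8 (illustrative)
H = rng.standard_normal((T, d))   # stand-in for BERT/LSTM hidden states
w = rng.standard_normal(d)        # learnable scoring vector (here random)
b = 0.0                           # scoring bias

context, alpha = attention_pool(H, w, b)
```

The `context` vector, a weighted average of the per-token hidden states, would then feed a dense classification layer for each aspect and sentiment label; in training, `w` and `b` are learned jointly with the rest of the network.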

Keywords


ABSA; BERT; LSTM; Happiness Index; Twitter


References


A. Carnett, L. Neely, M.-T. Chen, K. Cantrell, E. Santos, and S. Ala’i-Rosales, “How might indices of happiness inform early intervention research and decision making?,” Advances in Neurodevelopmental Disorders, vol. 6, no. 4, pp. 567–576, Dec. 2022, doi: 10.1007/s41252-022-00288-0.

U. Suchaini, W. P. S. Nugraha, I. K. D. Dwipayana, and S. A. Lestari, "The happiness index [Indeks kebahagiaan]," Badan Pusat Statistik RI, pp. 1–185, 2021.

A. Iqbal, R. Amin, J. Iqbal, R. Alroobaea, A. Binmahfoudh, and M. Hussain, “Sentiment analysis of consumer reviews using deep learning,” Sustainability, vol. 14, no. 17, p. 10844, Aug. 2022, doi: 10.3390/su141710844.

B. Liu, Sentiment analysis and opinion mining. Cham: Springer International Publishing, 2012. doi: 10.1007/978-3-031-02145-9.

P. N. Andono, Sunardi, R. A. Nugroho, and B. Harjo, "Aspect-based sentiment analysis for hotel review using LDA, semantic similarity, and BERT," International Journal of Intelligent Engineering and Systems, vol. 15, no. 5, pp. 232–243, Oct. 2022, doi: 10.22266/ijies2022.1031.21.

R. Jayanto, R. Kusumaningrum, and A. Wibowo, “Aspect-based sentiment analysis for hotel reviews using an improved model of long short-term memory,” International Journal of Advances in Intelligent Informatics, vol. 8, no. 3, p. 391, Nov. 2022, doi: 10.26555/ijain.v8i3.691.

L. M. Cendani, R. Kusumaningrum, and S. N. Endah, "Aspect-based sentiment analysis of Indonesian-language hotel reviews using long short-term memory with an attention mechanism," 2023, pp. 106–122, doi: 10.1007/978-3-031-15191-0_11.

D. A. Ingkafi, "Aspect-based sentiment analysis in measuring the community happiness index of Semarang City on Twitter social media using bidirectional encoder representations from transformers (BERT)," Universitas Diponegoro, Semarang, pp. 1–60, 2022.

A. Vohra and R. Garg, “Deep learning based sentiment analysis of public perception of working from home through tweets,” Journal of Intelligent Information Systems, vol. 60, no. 1, pp. 255–274, Feb. 2023, doi: 10.1007/s10844-022-00736-2.

N. A. M. Roslan, N. M. Diah, Z. Ibrahim, Y. Munarko, and A. E. Minarno, “Automatic plant recognition using convolutional neural network on malaysian medicinal herbs: the value of data augmentation,” International Journal of Advances in Intelligent Informatics, vol. 9, no. 1, p. 136, Mar. 2023, doi: 10.26555/ijain.v9i1.1076.

A. H. Oliaee, S. Das, J. Liu, and M. A. Rahman, “Using bidirectional encoder representations from transformers (BERT) to classify traffic crash severity types,” Natural Language Processing Journal, vol. 3, p. 100007, Jun. 2023, doi: 10.1016/j.nlp.2023.100007.

S. Selva Birunda and R. Kanniga Devi, “A review on word embedding techniques for text classification,” Lecture Notes on Data Engineering and Communications Technologies, vol. 59, pp. 267–281, 2021, doi: 10.1007/978-981-15-9651-3_23.

M. Liu, Z. Wen, R. Zhou, and H. Su, “Bayesian optimization and ensemble learning algorithm combined method for deformation prediction of concrete dam,” Structures, vol. 54, pp. 981–993, Aug. 2023, doi: 10.1016/j.istruc.2023.05.136.

J. Devlin, M. W. Chang, K. Lee, and K. Toutanova, “BERT: Pre-training of deep bidirectional transformers for language understanding,” NAACL HLT 2019 - 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies - Proceedings of the Conference, vol. 1, pp. 4171–4186, 2019.

T. Mikolov, K. Chen, G. Corrado, and J. Dean, “Efficient estimation of word representations in vector space,” 1st International Conference on Learning Representations, ICLR 2013 - Workshop Track Proceedings, 2013.

K. Kusum and S. P. Panda, “Sentiment analysis using global vector and long short-term memory,” Indonesian Journal of Electrical Engineering and Computer Science, vol. 26, no. 1, p. 414, Apr. 2022, doi: 10.11591/ijeecs.v26.i1.pp414-422.

F. Kurniawan, Y. Romadhoni, L. Zahrona, and J. Hammad, "Comparing LSTM and CNN methods in case study on public discussion about Covid-19 in Twitter," International Journal of Advanced Computer Science and Applications, vol. 13, no. 10, pp. 402–409, 2022, doi: 10.14569/IJACSA.2022.0131048.

A. Vaswani et al., “Attention is all you need,” Jun. 2017, [Online]. Available: http://arxiv.org/abs/1706.03762

D. Bahdanau, K. Cho, and Y. Bengio, "Neural machine translation by jointly learning to align and translate," Sep. 2014, [Online]. Available: http://arxiv.org/abs/1409.0473

M.-T. Luong, H. Pham, and C. D. Manning, “Effective approaches to attention-based neural machine translation,” Aug. 2015, [Online]. Available: http://arxiv.org/abs/1508.04025

C. Raffel and D. P. W. Ellis, “Feed-forward networks with attention can solve some long-term memory problems,” Dec. 2015, [Online]. Available: http://arxiv.org/abs/1512.08756

N. Srivastava, G. Hinton, A. Krizhevsky, I. Sutskever, and R. Salakhutdinov, "Dropout: a simple way to prevent neural networks from overfitting," Journal of Machine Learning Research, vol. 15, no. 1, pp. 1929–1958, 2014.

W. Ma and J. Lu, “An equivalence of fully connected layer and convolutional layer,” Dec. 2017, [Online]. Available: http://arxiv.org/abs/1712.01252

N. Singh and H. Sabrol, “Convolutional neural networks-an extensive arena of deep learning. a comprehensive study,” Archives of Computational Methods in Engineering, vol. 28, no. 7, pp. 4755–4780, Dec. 2021, doi: 10.1007/s11831-021-09551-4.

C. Nwankpa, W. Ijomah, A. Gachagan, and S. Marshall, “Activation functions: comparison of trends in practice and research for deep learning,” Nov. 2018, [Online]. Available: http://arxiv.org/abs/1811.03378

I. Goodfellow, Y. Bengio, and A. Courville, Deep Learning (Adaptive Computation and Machine Learning). MIT Press, 2016.

A. Usha Ruby, "Binary cross entropy with deep learning technique for image classification," International Journal of Advanced Trends in Computer Science and Engineering, vol. 9, no. 4, pp. 5393–5397, Aug. 2020, doi: 10.30534/ijatcse/2020/175942020.

Y. H. Park, “Gradients in a deep neural network and their Python implementations,” Korean Journal of Mathematics, vol. 30, no. 1, pp. 131–146, 2022, doi: 10.11568/kjm.2022.30.1.131.

K. N. Alam et al., “Deep learning-based sentiment analysis of COVID-19 vaccination responses from Twitter data,” Computational and Mathematical Methods in Medicine, vol. 2021, pp. 1–15, Dec. 2021, doi: 10.1155/2021/4321131.




DOI: http://doi.org/10.11591/ijai.v13.i2.pp1753-1761



Creative Commons License
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

IAES International Journal of Artificial Intelligence (IJ-AI)
ISSN/e-ISSN 2089-4872/2252-8938 
This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).
