A METHOD FOR SENTIMENT ANALYSIS OF TEXTS USING THE T5 MODEL
Abstract
Sentiment analysis, a fundamental task in natural language processing (NLP), identifies and interprets the emotional tone conveyed by textual content. It plays a pivotal role in opinion mining, customer feedback analysis, and social media monitoring, where understanding sentiment helps organizations and individuals make informed decisions. In this study, we employ the T5 (Text-to-Text Transfer Transformer) model, a transformer-based architecture, to address the challenges of sentiment analysis. Unlike traditional models, which are often built for a single task, T5's text-to-text framework offers a unified approach to a wide range of NLP problems, including sentiment classification. We fine-tune T5 on a curated sentiment-labeled dataset and evaluate its ability to distinguish positive, negative, and neutral polarities. Our experimental methodology relies on widely accepted metrics, namely precision, recall, and F1 score, to assess the model's performance, and we propose mathematical formulations to quantify its accuracy and reliability across diverse scenarios. To validate the robustness of the approach, we compare T5 against traditional machine learning and deep learning models commonly used for sentiment analysis. The results, presented through graphical visualizations, show that T5 achieves stronger classification performance. These findings underscore the potential of T5 for advancing sentiment analysis in real-world applications where understanding human emotion is critical.
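Although the abstract does not reproduce the implementation, the text-to-text recipe it describes can be sketched as follows. This is a minimal illustration assuming the Hugging Face transformers API, a t5-small checkpoint, and a hypothetical "classify sentiment:" task prefix with three toy training pairs; none of these are the paper's actual configuration or data.

```python
# Minimal sketch of text-to-text sentiment classification with T5.
# Checkpoint, prefix, labels, and data are illustrative assumptions.
import torch
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)

# Toy sentiment-labeled pairs standing in for the curated dataset.
train_pairs = [
    ("classify sentiment: I loved this film.", "positive"),
    ("classify sentiment: A tedious, joyless slog.", "negative"),
    ("classify sentiment: The film exists.", "neutral"),
]

# One pass of seq2seq fine-tuning: each label is just a target string,
# so classification reduces to ordinary conditional text generation.
model.train()
for source, target in train_pairs:
    enc = tokenizer(source, return_tensors="pt", truncation=True)
    labels = tokenizer(target, return_tensors="pt").input_ids
    loss = model(**enc, labels=labels).loss  # cross-entropy over label tokens
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

# Inference: generate the label word for an unseen sentence.
model.eval()
inputs = tokenizer("classify sentiment: A gripping, superbly acted drama.",
                   return_tensors="pt")
with torch.no_grad():
    ids = model.generate(**inputs, max_new_tokens=4)
print(tokenizer.decode(ids[0], skip_special_tokens=True))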
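For reference, the standard per-class definitions of the reported metrics are given below; these are the conventional textbook forms, and the paper's own formulations may differ in detail (for instance, in whether macro or micro averaging is used across the positive, negative, and neutral classes).

```latex
% Standard per-class metric definitions (assumed, not taken from the paper).
% TP_c, FP_c, FN_c: true positives, false positives, false negatives for class c.
\mathrm{Precision}_c = \frac{TP_c}{TP_c + FP_c}, \qquad
\mathrm{Recall}_c = \frac{TP_c}{TP_c + FN_c}, \qquad
F1_c = \frac{2\,\mathrm{Precision}_c\,\mathrm{Recall}_c}{\mathrm{Precision}_c + \mathrm{Recall}_c}
```

A macro-averaged F1, if used, would average F1_c over the classes c in {positive, negative, neutral}.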