Authors
Ojo Olumide Ebenezer
Ta Hoang Thang
Gelbukh Alexander
Calvo Castro Francisco Hiram
Adebanji Olaronke Oluwayemisi
Sidorov Grigori
Title Transformer-Based Approaches to Sentiment Detection
Type Book
Sub-type Undefined
Description Recent Developments and the New Directions of Research, Foundations, and Applications
Abstract The use of transfer learning methods is largely responsible for the present breakthrough in Natural Language Processing (NLP) tasks across multiple domains. To address the problem of sentiment detection, we examined the performance of four well-known state-of-the-art transformer models for text classification: Bidirectional Encoder Representations from Transformers (BERT), the Robustly Optimized BERT Pre-training Approach (RoBERTa), a distilled version of BERT (DistilBERT), and a large bidirectional neural network architecture (XLNet). The four models were compared on the task of detecting disasters in text. All of them performed well, indicating that transformer-based models are suitable for disaster detection in text. The RoBERTa model performed best on the test dataset, with a score of 82.6%, and is recommended for high-quality predictions. Furthermore, we found that the learning algorithms' performance was influenced by the pre-processing techniques, the nature of the words in the vocabulary, unbalanced labeling, and the model parameters. © 2023, The Author(s), under exclusive license to Springer Nature Switzerland AG.
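The record does not include the authors' code; the following is a minimal sketch of the kind of experiment the abstract describes: fine-tuning a pre-trained RoBERTa model for binary disaster detection using the Hugging Face transformers library. The checkpoint name, toy training examples, and hyperparameters are illustrative assumptions, not values taken from the chapter.

    # Sketch only: fine-tune RoBERTa for binary disaster detection.
    # Checkpoint, examples, and hyperparameters are illustrative assumptions.
    import torch
    from torch.utils.data import Dataset
    from transformers import (
        RobertaTokenizerFast,
        RobertaForSequenceClassification,
        Trainer,
        TrainingArguments,
    )

    class DisasterDataset(Dataset):
        """Wraps raw texts and 0/1 disaster labels as tokenized tensors."""
        def __init__(self, texts, labels, tokenizer, max_len=128):
            self.enc = tokenizer(texts, truncation=True,
                                 padding="max_length", max_length=max_len)
            self.labels = labels

        def __len__(self):
            return len(self.labels)

        def __getitem__(self, i):
            item = {k: torch.tensor(v[i]) for k, v in self.enc.items()}
            item["labels"] = torch.tensor(self.labels[i])
            return item

    tokenizer = RobertaTokenizerFast.from_pretrained("roberta-base")
    model = RobertaForSequenceClassification.from_pretrained(
        "roberta-base", num_labels=2)  # 2 labels: disaster / not disaster

    # Toy examples stand in for the real train/test split.
    train_ds = DisasterDataset(
        ["Forest fire near La Ronge Sask. Canada", "I love this movie"],
        [1, 0], tokenizer)

    args = TrainingArguments(output_dir="out", num_train_epochs=3,
                             per_device_train_batch_size=16)
    Trainer(model=model, args=args, train_dataset=train_ds).train()

Swapping "roberta-base" for "bert-base-uncased", "distilbert-base-uncased", or "xlnet-base-cased" (with the matching Auto* classes) would reproduce the four-way comparison the abstract reports.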
Notes DOI: 10.1007/978-3-031-23476-7_10; Studies in Fuzziness and Soft Computing, v. 423
Place Cham
Country Switzerland
Pages 101-110
Vol. / Chap. STUDFUZZ, v. 423
Start 2023-06-27
End
ISBN/ISSN 978-3-031-23475-0