Authors
Sidorov Grigori
Balouchzahi Fazlourrahman
Butt Sabur
Gelbukh Alexander
Title Regret and Hope on Transformers: An Analysis of Transformers on Regret and Hope Speech Detection Datasets
Type Journal
Sub-type JCR
Description Applied Sciences (Switzerland)
Abstract In this paper, we analyzed the performance of different transformer models for regret and hope speech detection on two novel datasets. For the regret detection task, we compared the averaged macro F1-scores of the transformer models to the previous state-of-the-art results and found that the transformer models outperformed the previous approaches. Specifically, the RoBERTa-based model achieved the highest averaged macro F1-score of 0.83, beating the previous state-of-the-art score of 0.76. For the hope speech detection task, the BERT-based uncased model achieved the highest averaged macro F1-score of 0.72 among the transformer models. However, the performance of each model varied slightly depending on the task and dataset. Our findings highlight the effectiveness of transformer models for hope speech and regret detection tasks, and the importance of considering the effects of context, specific transformer architectures, and pre-training on their performance. © 2023 by the authors.
Notes DOI 10.3390/app13063983
Place Basel
Country Switzerland
No. of pages Article number 3983
Vol. / Chap. v. 13, no. 6
Start 2023-03-01
End
ISBN/ISSN