
Large Language Models in Targeted Sentiment Analysis for Russian.

Rusnachenko, N., Golubev, A. and Loukachevitch, N., 2024. Large Language Models in Targeted Sentiment Analysis for Russian. Lobachevskii Journal of Mathematics, 45 (7), 3148-3158.

Full text available as:

2404.12342v1.pdf - Accepted Version (PDF, 588kB)
Restricted to Repository staff only until 18 October 2025.
Available under License Creative Commons Attribution Non-commercial.

DOI: 10.1134/S1995080224603758

Abstract

In this paper, we investigate the use of decoder-based generative transformers for extracting sentiment towards named entities in Russian news articles. We study the sentiment analysis capabilities of instruction-tuned large language models (LLMs), using the RuSentNE-2023 dataset. The first group of experiments evaluates the zero-shot capabilities of closed- and open-source LLMs. The second covers the fine-tuning of Flan-T5 using the "chain-of-thought" (CoT) three-hop reasoning framework (THoR). We found that the results of the zero-shot approaches are similar to those achieved by baseline fine-tuned encoder-based transformers (BERT). Fine-tuning Flan-T5 with THoR yields an improvement even with the base-size model compared to the zero-shot results. The best results of sentiment analysis on RuSentNE-2023 were achieved by fine-tuned Flan-T5, which surpassed the previous state-of-the-art transformer-based classifiers. Our CoT application framework is publicly available: https://github.com/nicolay-r/Reasoning-for-Sentiment-Analysis-Framework
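The abstract describes three-hop chain-of-thought prompting with Flan-T5 for targeted sentiment. Below is a minimal sketch of how such a three-hop chain can be wired up with a Hugging Face Flan-T5 checkpoint; the prompt wording and the "google/flan-t5-base" model choice are illustrative assumptions, not the exact templates or checkpoints used in the paper or in the released framework.

```python
# Sketch of three-hop ("THoR"-style) chain-of-thought prompting with Flan-T5.
# Assumptions: the "google/flan-t5-base" checkpoint and the prompt texts below
# are illustrative and do not reproduce the paper's exact templates.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

MODEL_ID = "google/flan-t5-base"  # assumed base-size Flan-T5 checkpoint
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Run a single prompt through Flan-T5 and return the decoded answer."""
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

def three_hop_sentiment(sentence: str, target: str) -> str:
    """Chain three prompts: target aspect -> opinion towards it -> polarity label."""
    # Hop 1: identify what the sentence says about the target entity.
    hop1 = generate(
        f"Sentence: {sentence}\nWhich aspect of '{target}' is discussed here?")
    # Hop 2: elicit the opinion expressed towards that aspect.
    hop2 = generate(
        f"Sentence: {sentence}\nAspect of '{target}': {hop1}\n"
        f"What opinion does the author express towards '{target}'?")
    # Hop 3: map the accumulated reasoning to a sentiment label.
    return generate(
        f"Sentence: {sentence}\nReasoning: {hop1} {hop2}\n"
        f"Is the sentiment towards '{target}' positive, negative, or neutral?")

if __name__ == "__main__":
    print(three_hop_sentiment(
        "The company reported record losses this quarter.", "the company"))
```

Each hop feeds its output into the next prompt, so the final polarity decision is conditioned on intermediate reasoning rather than on the raw sentence alone.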

Item Type: Article
ISSN: 1995-0802
Uncontrolled Keywords: sentiment analysis; large language models
Group: Faculty of Science & Technology
ID Code: 41161
Deposited By: Symplectic RT2
Deposited On: 09 Jul 2025 13:34
Last Modified: 09 Jul 2025 13:34
