contributor author: Han, Yi
contributor author: Moghaddam, Mohsen
date accessioned: 2025-04-21T10:37:43Z
date available: 2025-04-21T10:37:43Z
date copyright: 12/13/2024 12:00:00 AM
date issued: 2024
identifier issn: 1530-9827
identifier other: jcise_25_2_021007.pdf
identifier uri: http://yetl.yabesh.ir/yetl1/handle/yetl/4306580
description abstract: Aspect-based sentiment analysis (ABSA) enables systematic identification of user opinions on particular aspects, thus improving the idea creation process in the initial stages of product/service design. Large language models (LLMs) such as T5 and GPT have proven powerful in ABSA tasks due to their inherent attention mechanism. However, some key limitations remain. First, existing research mainly focuses on relatively simple ABSA subtasks, while extracting aspects, opinions, and sentiments with a single unified model remains largely unaddressed. Second, current ABSA tasks overlook implicit opinions and sentiments. Third, most attention-based LLMs encode position either through linear projection or through split position relations based on word-distance schemes, which can introduce relation biases during training. This paper incorporates design domain knowledge into LLMs and addresses these gaps by (1) introducing the ACOSI (aspect, category, opinion, sentiment, implicit indicator) analysis task and developing a unified model capable of extracting all five label types simultaneously in a generative manner; (2) designing a new position encoding method for the attention-based model; and (3) introducing a new benchmark based on the ROUGE score that incorporates design domain knowledge. Numerical experiments on manually labeled data from three major e-commerce retail stores for apparel and footwear products demonstrate the performance, scalability, and potential of the domain-knowledge-informed transformer method.
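
As a rough illustration of the generative ACOSI formulation and the ROUGE-based evaluation mentioned in the abstract, the sketch below (not taken from the paper) linearizes (aspect, category, opinion, sentiment, implicit indicator) quintuples into a target string, parses them back, and scores a prediction against a reference with ROUGE-L. The separator tokens, field order, and example review data are hypothetical choices, and the sketch does not attempt the paper's domain-knowledge extensions to the benchmark.

```python
# Minimal sketch (not the authors' implementation): linearize ACOSI quintuples
# into a target string for a generative model, parse them back, and score a
# prediction against a reference with ROUGE-L (via the `rouge-score` package).
# Separator tokens, field order, and example data are hypothetical.
from rouge_score import rouge_scorer


def linearize(quintuples):
    """Serialize (aspect, category, opinion, sentiment, implicit) tuples into one string."""
    return " [SSEP] ".join(
        f"{a} | {c} | {o} | {s} | {'implicit' if imp else 'explicit'}"
        for a, c, o, s, imp in quintuples
    )


def parse(sequence):
    """Recover quintuples from a generated sequence; skip malformed chunks."""
    out = []
    for chunk in sequence.split("[SSEP]"):
        fields = [f.strip() for f in chunk.split("|")]
        if len(fields) == 5:
            out.append((*fields[:4], fields[4] == "implicit"))
    return out


# Hypothetical reference annotation and model output for one footwear review sentence.
reference = linearize([("sole", "footwear#comfort", "feels stiff", "negative", False)])
prediction = linearize([("sole", "footwear#comfort", "stiff", "negative", False)])

print(parse(prediction))
scorer = rouge_scorer.RougeScorer(["rougeL"], use_stemmer=True)
print(scorer.score(reference, prediction)["rougeL"].fmeasure)
```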
publisher: The American Society of Mechanical Engineers (ASME)
title: Design Knowledge as Attention Emphasizer in Large Language Model-Based Sentiment Analysis
type: Journal Paper
journal volume: 25
journal issue: 2
journal title: Journal of Computing and Information Science in Engineering
identifier doi: 10.1115/1.4067212
journal firstpage: 21007-1
journal lastpage: 21007-12
page: 12
tree: Journal of Computing and Information Science in Engineering; 2024; volume 25; issue 2
contenttype: Fulltext

