Sequential Recommendation Model Integrating User Interest and Fatigue Extraction
DOI:
CSTR:
Author:
Affiliation:

School of Electronic and Electrical Engineering, Shanghai University of Engineering Science, Shanghai 201620

Author Biography:

Corresponding Author:

CLC Number:

TP391

Fund Project:

Abstract:

In response to the problem of user annoyance and fatigue caused by repeatedly receiving similar content, this paper proposes a sequential recommendation model based on Transformer and fatigue extraction, named TransFESRec. TransFESRec first uses a Transformer to extract a dynamic interest representation from the user's behavior sequence. It then applies the Fast Fourier Transform (FFT) to convert the behavior sequence from the time domain to the frequency domain and extracts the user's fatigue representation from the resulting spectrum. The fatigue representation is combined with the interest representation and fed into a multilayer perceptron (MLP) to learn the nonlinear relationship between the two, forming a comprehensive representation of the user's state. Finally, this integrated representation is dot-multiplied with the embedding vector of each candidate item to obtain the user's preference score for that item. Experiments show that TransFESRec effectively reduces recommendations of the content types that cause user fatigue and outperforms other mainstream methods across multiple evaluation metrics.
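The abstract describes a four-step pipeline: a Transformer interest encoder, FFT-based fatigue extraction, MLP fusion, and dot-product scoring against candidate items. The following is a minimal PyTorch sketch of that pipeline, not the authors' published implementation; the class name TransFESRecSketch, all layer sizes, and the way the frequency spectrum is pooled into a fatigue vector are illustrative assumptions.

import torch
import torch.nn as nn

class TransFESRecSketch(nn.Module):
    """Illustrative sketch only; layer sizes and pooling choices are assumptions."""

    def __init__(self, num_items, dim=64, num_heads=2, num_layers=2, max_len=50):
        super().__init__()
        self.item_emb = nn.Embedding(num_items + 1, dim, padding_idx=0)
        self.pos_emb = nn.Embedding(max_len, dim)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=dim, nhead=num_heads, dim_feedforward=4 * dim, batch_first=True)
        # Transformer: dynamic interest representation from the behavior sequence.
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
        # MLP: learns the nonlinear relation between interest and fatigue.
        self.fusion = nn.Sequential(
            nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, seq, candidates):
        # seq: (B, L) item ids of the behavior sequence; candidates: (B, C) item ids.
        positions = torch.arange(seq.size(1), device=seq.device)
        x = self.item_emb(seq) + self.pos_emb(positions)
        # 1) Interest: last hidden state of the Transformer encoder.
        interest = self.encoder(x)[:, -1, :]                              # (B, dim)
        # 2) Fatigue: FFT along the time axis, summarised here by the mean
        #    amplitude per embedding dimension (a pooling choice assumed here).
        fatigue = torch.fft.rfft(x, dim=1).abs().mean(dim=1)              # (B, dim)
        # 3) Fuse interest and fatigue into one user-state representation.
        user_state = self.fusion(torch.cat([interest, fatigue], dim=-1))  # (B, dim)
        # 4) Dot product with each candidate item embedding -> preference scores.
        cand = self.item_emb(candidates)                                  # (B, C, dim)
        return (cand * user_state.unsqueeze(1)).sum(dim=-1)               # (B, C)

# Toy usage: preference scores for 3 candidate items per user.
model = TransFESRecSketch(num_items=1000)
seq = torch.randint(1, 1001, (4, 10))        # 4 users, behavior sequences of length 10
cands = torch.randint(1, 1001, (4, 3))       # 3 candidate items per user
print(model(seq, cands).shape)               # torch.Size([4, 3])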

History
  • Received: 2024-05-04
  • Revised: 2026-01-19
  • Accepted: 2026-01-29
  • Published online:
  • Published: