Title:
Dual-Adaptive Update Strategies-Enhanced Meta-Optimization for User Cold-Start Recommendation.
Authors:
Zhao, Xuhao (zhaoxuhao@sjtu.edu.cn), Zhu, Yanmin (yzhu@sjtu.edu.cn), Wang, Chunyang (cywang@dase.ecnu.edu.cn), Jing, Mengyuan (jingmy@sjtu.edu.cn), Ma, Wenze (mawenze991226@sjtu.edu.cn), Yu, Jiadi (jiadiyu@sjtu.edu.cn), Tang, Feilong (tang-fl@cs.sjtu.edu.cn)
Source:
ACM Transactions on Information Systems. Nov 2025, Vol. 43, Issue 6, p1-36. 36p.
Database:
Academic Search Index


Abstract:
User cold-start recommendation presents a significant challenge for recommender systems, affecting their overall effectiveness. Meta-learning-based methods have been introduced to address this issue. These methods treat the user cold-start recommendation problem as a few-shot learning task, where each user represents a distinct task. The objective is to learn shared initialization parameters that apply effectively across all cold-start users; these shared parameters are then fine-tuned into personalized parameters using each user's individual interaction data. Recent studies argue that shared parameters are unsuitable for all users when user preferences follow an implicit grouping distribution. They therefore propose adaptive-initialization-based methods, which first differentiate tasks based on user preferences and then generate task-adaptive initialization parameters from task representations. However, both the meta-learning and adaptive-initialization-based approaches overlook the adaptive capability of update strategies during the transfer of initialization parameters to personalized parameters. Instead, they rely on task-shared optimization strategies, which can cause the model to overfit or underfit. In response, we propose a dual-adaptive update strategies-enhanced meta-optimization framework (DAUS) for user cold-start recommendation. First, we integrate dual-adaptive update strategies to enhance the adaptive capability of transferring initialization parameters, incorporating both task-adaptive optimization hyperparameters and task-adaptive objectives. Second, we design a multifaceted task encoder that provides diverse task information to differentiate tasks, including explicit task features (task relevance, training signals) and other implicit task information. Extensive experiments on three real-world datasets demonstrate that DAUS outperforms state-of-the-art methods.
The source code is available at https://github.com/XuHao-bit/DAUS. [ABSTRACT FROM AUTHOR]
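The abstract's core idea, fine-tuning a shared initialization into personalized parameters with an update strategy that adapts per task, can be illustrated with a minimal MAML-style sketch. This is not the authors' implementation (see their repository for that); it is a toy linear-regression "user task" in which the inner-loop learning rate is generated from a hypothetical task representation (`task_repr`, `w_lr` are made-up names), mirroring the notion of task-adaptive optimization hyperparameters.

```python
import numpy as np

def inner_update(theta, grad, task_repr, w_lr):
    # Task-adaptive learning rate: a hypothetical linear gate on the task
    # representation, squashed to (0, 0.1) to keep the toy problem stable.
    lr = 0.1 / (1.0 + np.exp(-task_repr @ w_lr))
    return theta - lr * grad

def loss_and_grad(theta, X, y):
    # Mean-squared-error loss for a toy linear "recommendation" model.
    err = X @ theta - y
    return (err @ err) / len(y), 2.0 * X.T @ err / len(y)

rng = np.random.default_rng(0)
theta0 = rng.normal(size=3)           # shared (meta-learned) initialization
X = rng.normal(size=(20, 3))          # one cold-start user's few interactions
y = X @ np.array([1.0, -2.0, 0.5])    # this user's true preference weights
task_repr = rng.normal(size=4)        # stand-in for an encoded task embedding
w_lr = rng.normal(size=4)             # hypothetical hyperparameter generator

theta = theta0
for _ in range(50):                   # fine-tune toward personalized params
    _, g = loss_and_grad(theta, X, y)
    theta = inner_update(theta, g, task_repr, w_lr)

init_loss, _ = loss_and_grad(theta0, X, y)
final_loss, _ = loss_and_grad(theta, X, y)
print(final_loss < init_loss)
```

In the paper's framing, `w_lr` would itself be meta-learned so that each user's task representation yields its own step size (and, per the abstract, its own objective), rather than all tasks sharing one fixed optimizer.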