Incremental specialized and specialized-generalized matrix factorization models based on adaptive learning rate optimizers
Published in: Neurocomputing (Amsterdam), Vol. 552, p. 126515
Main Authors: , , ,
Format: Journal Article
Language: English
Published: Elsevier B.V., 01-10-2023
Summary:
• Incremental models are most suitable for cold-start scenarios.
• Adaptive learning rate methods are effective in data stream environments.
• Learning user-specialized parameters increases recommender model accuracy.
• The proposed adaptive models succeed on large-scale recommendation datasets.
Recommender systems suggest items that a particular user is likely to prefer based on historical behavior, actions, and feedback. In real-world applications such as e-commerce, social media, digital marketing, and content consumption, data on users and items are generated continuously and at a fast pace. Since interactions occur over time, these scenarios can be formulated as a data stream in which users’ interests are potentially dynamic, i.e., they change over time. Given that such changes are expected, a current research challenge in streaming recommender systems is that models must adapt their parameters when changes occur in order to maintain performance. Because these changes do not affect all users and items in the stream at the same time, we consider learning schemes that account for user or item identifiers and maintain individual parameters, using specialized parameters to adjust the step size for each user or item in the dataset. More specifically, this study proposes four specialized and specialized-generalized variants of four well-known adaptive learning rate optimizers and shows how they are combined with incremental matrix factorization methods. We tested the proposed optimization strategies on several datasets and showed that one of the specialized variants, InAMSGradUser, improves RECALL and NDCG by up to 11.1 and 7.5 percentage points, respectively, compared with the traditional stochastic gradient descent (SGD) optimizer.
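The abstract describes combining per-user ("specialized") adaptive learning rate state with incremental matrix factorization but gives no pseudocode. The sketch below is a rough illustration of one plausible way a user-specialized AMSGrad step could drive the user-factor update in an incremental MF model; the class name, method names, hyperparameter defaults, and the plain-SGD item update are assumptions for illustration only and are not taken from the paper, whose InAMSGradUser update may differ in detail.

```python
import numpy as np

class IncrementalMFAMSGradUser:
    """Incremental matrix factorization with a per-user AMSGrad-style
    adaptive step for the user factors (hypothetical sketch; the paper's
    InAMSGradUser update may differ in detail)."""

    def __init__(self, n_factors=10, lr=0.05, beta1=0.9, beta2=0.999,
                 eps=1e-8, reg=0.01):
        self.k, self.lr, self.reg = n_factors, lr, reg
        self.beta1, self.beta2, self.eps = beta1, beta2, eps
        self.P, self.Q = {}, {}                   # user / item latent factors
        self.m, self.v, self.v_hat = {}, {}, {}   # per-user ("specialized") AMSGrad state

    def _factors(self, store, key):
        if key not in store:
            store[key] = np.random.normal(0.0, 0.1, self.k)
        return store[key]

    def update(self, user, item, rating):
        """Consume one (user, item, rating) event from the stream."""
        p = self._factors(self.P, user)
        q = self._factors(self.Q, item)
        err = rating - p @ q
        g_p = -err * q + self.reg * p             # gradient w.r.t. user factors

        # First/second moment estimates kept separately per user: this is
        # the "specialized" state that adapts the step size to each user.
        m = self.m.setdefault(user, np.zeros(self.k))
        v = self.v.setdefault(user, np.zeros(self.k))
        v_hat = self.v_hat.setdefault(user, np.zeros(self.k))
        m[:] = self.beta1 * m + (1.0 - self.beta1) * g_p
        v[:] = self.beta2 * v + (1.0 - self.beta2) * g_p**2
        np.maximum(v_hat, v, out=v_hat)           # AMSGrad: non-decreasing v_hat

        p -= self.lr * m / (np.sqrt(v_hat) + self.eps)
        q -= self.lr * (-err * p + self.reg * q)  # item factors on plain SGD for brevity

# Process a stream of events one interaction at a time.
model = IncrementalMFAMSGradUser()
for user, item, rating in [("u1", "i3", 4.0), ("u2", "i3", 2.0)]:
    model.update(user, item, rating)
```

Keying the moment estimates by user rather than sharing them globally is what makes such an optimizer "specialized"; a "generalized" counterpart would presumably share one set of moments across all users, matching the specialized versus specialized-generalized distinction drawn in the abstract.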
ISSN: 0925-2312; 1872-8286
DOI: 10.1016/j.neucom.2023.126515