Attentional matrix factorization with context and co-invocation for service recommendation

Bibliographic Details
Published in: Expert Systems with Applications, Vol. 186, p. 115698
Main Authors: Nguyen, Mo; Yu, Jian; Nguyen, Tung; Han, Yanbo
Format: Journal Article
Language: English
Published: New York: Elsevier Ltd, 30-12-2021
Description
Summary: Easy access to data and functionality is the main advantage of building mashups from the abundant supply of Web APIs. However, this abundance also makes it difficult to choose suitable APIs for a mashup. Existing probabilistic matrix factorization (PMF) recommender systems exploit the latent features of invocations with equal weight, but not all features are equally significant and predictive, and uninformative features may introduce noise into the model. In this work, we propose the Attentional PMF Model (AMF), which leverages a neural attention network to learn the significance of latent features. We then inject the attention scores and the mashup-API context similarity into the matrix factorization structure for training. Furthermore, our model exploits the relationships between APIs, derived from both their context and their co-invocation history, as regularization terms to improve prediction performance. We evaluate our model on ProgrammableWeb data. The results show that it outperforms several state-of-the-art recommender systems in mashup service applications.
• An attention mechanism increases the precision of mashup-API recommendation.
• Document context and co-invocations of Web APIs influence recommendations.
• Combining attention with context and co-invocation yields better performance.
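
The abstract describes the model only at a high level. As a purely illustrative aid, the following is a minimal NumPy sketch of attention-weighted matrix factorization with context-similarity and co-invocation regularizers. All names (U, V, W_att, S_ctx, C_inv, the lambda coefficients) and the toy data are assumptions made for illustration, not the authors' notation, objective, or implementation; consult the full text for the actual formulation.

```python
# Hedged sketch: attention-weighted matrix factorization with
# context-similarity and co-invocation regularizers (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

n_mashups, n_apis, k = 5, 8, 4                               # toy sizes; k = latent dimension
R = (rng.random((n_mashups, n_apis)) < 0.3).astype(float)    # observed mashup-API invocations
U = 0.1 * rng.standard_normal((n_mashups, k))                # mashup latent factors
V = 0.1 * rng.standard_normal((n_apis, k))                   # API latent factors

# Toy stand-ins for API-API document-context similarity and co-invocation
# strength (in the paper these would come from API descriptions and the
# invocation history on ProgrammableWeb).
S_ctx = rng.random((n_apis, n_apis)); S_ctx = (S_ctx + S_ctx.T) / 2
C_inv = rng.random((n_apis, n_apis)); C_inv = (C_inv + C_inv.T) / 2

W_att = 0.1 * rng.standard_normal((k, k))                    # attention scoring weights (assumed)
b_att = np.zeros(k)

def attention(u, v, W, b):
    """Softmax attention over the element-wise interaction u*v,
    weighting which latent dimensions matter for this pair."""
    scores = W @ (u * v) + b
    e = np.exp(scores - scores.max())
    return e / e.sum()

def predict(i, j):
    """Attention-weighted inner product instead of a plain dot product."""
    a = attention(U[i], V[j], W_att, b_att)
    return float(np.sum(a * U[i] * V[j]))

def loss(lam=0.01, lam_ctx=0.1, lam_inv=0.1):
    """Squared error on observed entries plus L2 and two graph-style
    penalties pulling context-similar / co-invoked APIs together."""
    err = sum((R[i, j] - predict(i, j)) ** 2
              for i in range(n_mashups) for j in range(n_apis) if R[i, j] > 0)
    l2 = lam * (np.sum(U ** 2) + np.sum(V ** 2))
    reg_ctx = lam_ctx * sum(S_ctx[j, l] * np.sum((V[j] - V[l]) ** 2)
                            for j in range(n_apis) for l in range(n_apis))
    reg_inv = lam_inv * sum(C_inv[j, l] * np.sum((V[j] - V[l]) ** 2)
                            for j in range(n_apis) for l in range(n_apis))
    return err + l2 + reg_ctx + reg_inv

print("prediction for (mashup 0, API 1):", round(predict(0, 1), 4))
print("toy objective value:", round(loss(), 4))
```

In this sketch the attention softmax reweights the element-wise product of the latent factors before summation, and the two regularizers pull the factors of context-similar and frequently co-invoked APIs toward each other; the paper's actual objective, training procedure, and hyperparameters are given in the full article.
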
ISSN: 0957-4174, 1873-6793
DOI: 10.1016/j.eswa.2021.115698