factorization machine
Recently Published Documents


TOTAL DOCUMENTS: 119 (FIVE YEARS: 94)

H-INDEX: 8 (FIVE YEARS: 5)

Author(s):  
Qingren Wang ◽  
Min Zhang ◽  
Yiwen Zhang ◽  
Jinqin Zhong ◽  
Victor S. Sheng

Author(s):  
Bohui Xia ◽  
Xueting Wang ◽  
Toshihiko Yamasaki

Given the promising results obtained by deep-learning techniques in multimedia analysis, the explainability of network predictions has become important in practical applications. We present a method to generate semantic and quantitative explanations that are easily interpretable by humans. Previous work on such explanations focused on the contribution of each individual feature, taking the sum of contributions to be the prediction for a target variable; this simple additive formulation lacks discriminative power and led to low explanatory performance. Our method considers not only individual features but also their interactions, yielding a more detailed interpretation of the decisions made by networks. The algorithm is based on the factorization machine, a prediction method that calculates a factor vector for each feature. We conducted experiments on multiple datasets with different models to validate our method, achieving higher performance than previous work. We show that including interactions not only generates explanations but also makes them richer, conveying more information. We show examples of the produced explanations in a simple visual format and verify that they are easily interpretable and plausible.
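The feature-plus-interaction decomposition this abstract builds on can be sketched directly from the standard factorization-machine prediction formula, where each feature gets a linear contribution and each feature pair gets an interaction contribution through the inner product of factor vectors. This is a minimal illustrative sketch; the weights below are toy values, not ones produced by the authors' method.

```python
import numpy as np

def fm_contributions(x, w0, w, V):
    """Decompose an FM prediction into explanation terms for one sample.

    x  : (d,)   feature vector
    w0 : scalar bias
    w  : (d,)   linear weights
    V  : (d, k) factor vectors, one k-dimensional vector per feature
    """
    d = len(x)
    linear = w * x                      # contribution of each feature alone
    pairwise = {}                       # contribution of each feature pair
    for i in range(d):
        for j in range(i + 1, d):
            # interaction weight is the inner product of factor vectors
            pairwise[(i, j)] = (V[i] @ V[j]) * x[i] * x[j]
    prediction = w0 + linear.sum() + sum(pairwise.values())
    return linear, pairwise, prediction

# toy example: d = 3 features, k = 2 factor dimensions
x = np.array([1.0, 2.0, 0.5])
w0 = 0.1
w = np.array([0.3, -0.2, 0.4])
V = np.array([[1.0, 0.0], [0.5, 0.5], [0.0, 1.0]])
linear, pairwise, y = fm_contributions(x, w0, w, V)
```

The per-term values in `linear` and `pairwise` are exactly the quantities a semantic explanation can be assembled from: each one states how much a single feature, or a specific feature pair, pushed the prediction up or down.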


2021 ◽  
Vol 11 (20) ◽  
pp. 9546
Author(s):  
Huaidong Yu ◽  
Jian Yin ◽  
Yan Li

Nowadays, to handle the growing volume of user and item data and to better mine the latent relationships within it, the models used by recommendation systems have become increasingly complex. This raises an urgent problem: how can a recommendation system guarantee both prediction accuracy and operating speed? Deep neural networks address accuracy well; more network layers and more advanced feature-crossing schemes improve how thoroughly the data are exploited. Once accuracy is assured, however, speed receives little attention: effort typically goes into better machine efficiency rather than the runtime efficiency of the model itself, and models with a speed advantage, such as PNN, are slightly inferior in accuracy. In this paper, the Gate Attention Factorization Machine (GAFM) is proposed, a model built around the twin factors of accuracy and speed, in which a gate structure controls the trade-off between them. Extensive experiments conducted on datasets from various application scenarios show that GAFM outperforms existing factorization machines in both speed and accuracy.
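The abstract does not spell out GAFM's exact gate formulation, so the following is only a generic gating sketch under assumed forms: a sigmoid gate (with hypothetical parameters `g_w`, `g_b`) blends a cheap linear path with the standard FM pairwise-interaction path, which is the basic mechanism by which a gate can trade speed against accuracy.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gated_fm(x, w0, w, V, g_w, g_b):
    """Blend a fast linear path with the full FM interaction path.

    The gate form (sigmoid of a linear score) is an assumption for
    illustration, not the formulation from the GAFM paper.
    """
    linear = w0 + w @ x                  # fast path: linear terms only
    # FM pairwise term via the O(d*k) identity:
    #   sum_{i<j} <v_i, v_j> x_i x_j
    #     = 0.5 * sum_f [ (V^T x)_f^2 - ((V^2)^T x^2)_f ]
    s = V.T @ x
    interactions = 0.5 * np.sum(s**2 - (V**2).T @ x**2)
    g = sigmoid(g_w @ x + g_b)           # learned gate in [0, 1]
    return g * (linear + interactions) + (1 - g) * linear

# toy example: a wide-open gate includes interactions, a closed one does not
x = np.array([1.0, 2.0])
w0, w = 0.0, np.array([0.0, 0.0])
V = np.array([[1.0, 0.0], [1.0, 0.0]])
open_gate = gated_fm(x, w0, w, V, g_w=np.array([20.0, 20.0]), g_b=0.0)
closed_gate = gated_fm(x, w0, w, V, g_w=np.array([-20.0, -20.0]), g_b=0.0)
```

With the gate saturated open, the output includes the pairwise term; with it saturated closed, the model collapses to the cheap linear path, which is the speed/accuracy dial the abstract describes.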


Author(s):  
Jun Zhou ◽  
Longfei Li ◽  
Ziqi Liu ◽  
Chaochao Chen

Recently, the Factorization Machine (FM) has become increasingly popular in recommendation systems because of its effectiveness in finding informative interactions between features. The interaction weights are usually learned as a low-rank weight matrix, formulated as the inner product of two low-rank matrices; this low-rank structure helps improve the generalization ability of the Factorization Machine. However, choosing the rank properly usually requires running the algorithm many times with different ranks, which is clearly inefficient on large-scale datasets. To alleviate this issue, we propose an Adaptive Boosting framework for the Factorization Machine (AdaFM), which can adaptively search for a proper rank on different datasets without re-training. Instead of using a fixed rank for the FM, the proposed algorithm gradually increases its rank according to its performance, stopping when the performance no longer grows. Extensive experiments conducted on multiple large-scale datasets validate the proposed method; the results demonstrate that it is more effective than state-of-the-art Factorization Machines.
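The rank-growing stopping rule described above can be sketched as plain control flow. Here `evaluate_at_rank` is a hypothetical stand-in for training and validating an FM at a given rank (a toy score curve that plateaus at rank 4), not the authors' actual boosting procedure; only the grow-until-no-gain loop is the point.

```python
def evaluate_at_rank(rank):
    """Hypothetical validation score for an FM of the given rank.

    A toy curve that improves up to rank 4 and then plateaus, standing in
    for an actual train-and-validate run.
    """
    return min(rank, 4) * 0.1

def adaptive_rank(max_rank=16, tol=1e-6):
    """Grow the rank stage by stage; stop once the score stops improving."""
    best_score, best_rank = float("-inf"), 0
    for rank in range(1, max_rank + 1):
        score = evaluate_at_rank(rank)
        if score <= best_score + tol:   # no further gain: stop growing
            break
        best_score, best_rank = score, rank
    return best_rank, best_score

rank, score = adaptive_rank()
```

Because each stage reuses the model fitted so far and only adds capacity, the search avoids the repeated from-scratch retraining that a grid over fixed ranks would require.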

