Deep Neural Network for Multi-Pitch Estimation Using Weighted Cross Entropy Loss

Author(s):  
Samuel Stone ◽  
Evan Spector
IEEE Access ◽  
2019 ◽  
Vol 7 ◽  
pp. 146331-146341 ◽  
Author(s):  
Yangfan Zhou ◽  
Xin Wang ◽  
Mingchuan Zhang ◽  
Junlong Zhu ◽  
Ruijuan Zheng ◽  
...  

Entropy ◽  
2021 ◽  
Vol 23 (11) ◽  
pp. 1500
Author(s):  
Xiangde Zhang ◽  
Yuan Zhou ◽  
Jianping Wang ◽  
Xiaojun Lu

Session-based recommendation aims to predict a user's next click from the user's current and historical sessions, and can be applied to shopping websites and apps. Existing session-based recommendation methods cannot accurately capture the complex transitions between items. In addition, some approaches compress a session into a fixed representation vector without accounting for the user's interest preferences at the current moment, which limits recommendation accuracy. Considering the diversity of items and of users' interests, a personalized interest attention graph neural network (PIA-GNN) is proposed for session-based recommendation. The approach utilizes a personalized graph convolutional network (PGNN) to capture complex transitions between items, invoking an interest-aware mechanism to adaptively activate the user's interest in different items. In addition, a self-attention layer is used to capture long-term dependencies between items when modeling users' long-term preferences. In this paper, the cross-entropy loss is used as the objective function to train the model. We conduct rich experiments on two real datasets, and the results show that PIA-GNN outperforms existing personalized session-aware recommendation methods.
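The abstract trains PIA-GNN with a cross-entropy objective over candidate-item scores. A minimal numpy sketch of that objective for a single session follows; the function name and the toy scores are illustrative, not taken from the paper:

```python
import numpy as np

def softmax_cross_entropy(logits, target_idx):
    """Cross-entropy loss over candidate-item scores for one session.

    logits: 1-D array of unnormalized scores, one per candidate item.
    target_idx: index of the item the user actually clicked next.
    """
    # Subtract the max before exponentiating for numerical stability.
    shifted = logits - np.max(logits)
    log_probs = shifted - np.log(np.sum(np.exp(shifted)))
    return -log_probs[target_idx]

# Toy scores for 4 candidate items; item 0 is the true next click.
scores = np.array([2.0, 0.5, -1.0, 0.1])
loss = softmax_cross_entropy(scores, target_idx=0)
```

Scoring the true next item highest yields a small loss; mis-ranking it drives the loss up, which is exactly the signal used to train the network.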


2018 ◽  
Vol 7 (3.12) ◽  
pp. 1213
Author(s):  
Ram Sethuraman ◽  
Akshay Havalgi

The concept of deep learning is used in various fields such as text, speech, and vision. In the proposed work, a deep neural network is used for a recommender system, and a pair-wise objective function is employed to emphasize non-linearity and latent features. The GMF (generalized matrix factorization) and MLP techniques are used in this work. The proposed framework, named NCF, is a neural-network-based collaborative filtering model: it learns latent features by modeling the non-linearity of user-item interactions and generalizing matrix factorization. In the proposed work, a combination of pair-wise and point-wise objective functions is used and tuned via cross entropy with Adam optimization, which optimizes the gradient descent updates. The work is evaluated on the MovieLens 1K and 1M datasets and compared with deep matrix factorization (DMF).
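The abstract combines point-wise and pair-wise objectives under a cross-entropy formulation. A minimal numpy sketch of one way such a blend can look; the `alpha` mixing weight and the function names are assumptions for illustration, not the paper's exact formulation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def pointwise_bce(score, label):
    """Point-wise binary cross-entropy on a single user-item score."""
    p = sigmoid(score)
    return -(label * np.log(p) + (1 - label) * np.log(1 - p))

def pairwise_loss(pos_score, neg_score):
    """Pair-wise (BPR-style) loss: push the positive item above the negative."""
    return -np.log(sigmoid(pos_score - neg_score))

def combined_loss(pos_score, neg_score, alpha=0.5):
    """Blend of the two objectives; alpha is a hypothetical mixing weight."""
    point = pointwise_bce(pos_score, 1.0) + pointwise_bce(neg_score, 0.0)
    pair = pairwise_loss(pos_score, neg_score)
    return alpha * point + (1 - alpha) * pair

# A well-separated positive/negative pair incurs a small combined loss.
good = combined_loss(pos_score=2.0, neg_score=-1.0)
```

The point-wise term scores each interaction independently, while the pair-wise term only cares about the relative ordering of positive over negative items; blending them lets Adam optimize both signals at once.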


2020 ◽  
Vol 2020 ◽  
pp. 1-20
Author(s):  
Rong Fei ◽  
Quanzhu Yao ◽  
Yuanbo Zhu ◽  
Qingzheng Xu ◽  
Aimin Li ◽  
...  

Within the sentiment classification field, the convolutional neural network (CNN) and long short-term memory (LSTM) network are praised for their classification and prediction performance, but their accuracy, loss rate, and runtime are not ideal. To this end, a deep learning structure combining an improved cross entropy with per-word weights is proposed for cross-domain sentiment classification, which focuses on achieving better text sentiment classification by optimizing and improving the recurrent neural network (RNN) and CNN. Firstly, the ideas of the hinge loss function and the triplet loss function are used to improve the cross-entropy loss. The improved cross-entropy loss function is combined with the CNN model and the LSTM network, and both are tested on two-class classification problems. Then, the LSTM binary-optimize (LSTM-BO) model and CNN binary-optimize (CNN-BO) model are proposed, which are more effective at fitting prediction errors and preventing overfitting. Finally, considering how a recurrent neural network processes text, the influence of each input word on the final classification is analysed, yielding the importance of each word to the classification result. The experimental results show that, within the same time, the proposed weight-recurrent neural network (W-RNN) model gives higher weight to words with stronger emotional tendency, reducing the loss of emotional information and improving classification accuracy.
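The abstract borrows the hinge-loss and triplet-loss ideas to improve the cross-entropy loss. A minimal sketch of one way a hinge-style margin can be folded into a two-class cross-entropy; the `margin` value and the additive combination are illustrative assumptions, not the paper's exact loss:

```python
import numpy as np

def margin_cross_entropy(logits, label, margin=0.5):
    """Sketch of a margin-augmented two-class cross-entropy.

    Adds to the standard cross-entropy a hinge-style penalty that, like a
    triplet margin, only vanishes once the correct-class logit exceeds the
    wrong-class logit by at least `margin`.

    logits: length-2 array of scores for classes [0, 1]; label in {0, 1}.
    """
    shifted = logits - np.max(logits)
    log_probs = shifted - np.log(np.sum(np.exp(shifted)))
    ce = -log_probs[label]
    # Hinge term: zero once the decision margin is comfortably large.
    hinge = max(0.0, margin - (logits[label] - logits[1 - label]))
    return ce + hinge
```

Unlike plain cross-entropy, the hinge term keeps penalizing predictions that are correct but only barely, pushing the model toward confident separations and, per the abstract's motivation, fitting prediction errors more tightly.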


Author(s):  
David T. Wang ◽  
Brady Williamson ◽  
Thomas Eluvathingal ◽  
Bruce Mahoney ◽  
Jennifer Scheler
