Machine learning loss given default for corporate debt

2021 ◽  
Vol 64 ◽  
pp. 144-159
Author(s):  
Luke M. Olson ◽  
Min Qi ◽  
Xiaofei Zhang ◽  
Xinlei Zhao

Complexity ◽  
2021 ◽  
Vol 2021 ◽  
pp. 1-9
Author(s):  
Weiwei Hao ◽  
Hongyan Gao ◽  
Zongqing Liu

This paper proposes a nonlinear autoregressive neural network (NARNET) method for evaluating the investment performance of state-owned enterprises (SOEs). Unlike traditional machine learning methods such as linear regression, structural equation modeling, clustering, and principal component analysis, this paper uses a regression prediction method to analyze investment efficiency. We first analyze the relationship among diversified ownership reform, corporate debt leverage, and the investment efficiency of SOEs. Second, an investment efficiency evaluation index system for SOEs is constructed and verified with a nonlinear autoregressive neural network approach. Data on A-share state-owned companies listed on the Shanghai and Shenzhen stock exchanges from 2009 to 2018 are taken as the sample. The experimental results show that the NARNET output closely fits the actual data. Based on regression analysis with the neural network model, the paper conducts a descriptive statistical analysis of the main and control variables of the evaluation indicators. It verifies the direct impact of diversified ownership reform on the investment efficiency of SOEs, and its indirect impact on investment efficiency through corporate debt leverage.
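The core of a NARNET is one-step-ahead prediction of a series from its own lagged values through a nonlinear network. The sketch below illustrates that general idea only; the paper's actual architecture, lag order, and data are not given here, so the `MLPRegressor` stand-in, the lag order `p = 5`, and the synthetic series are all assumptions.

```python
# Minimal nonlinear autoregressive (NAR) sketch: predict y_t from its own
# p most recent lags using a small multilayer perceptron.
import numpy as np
from sklearn.neural_network import MLPRegressor

def make_lagged(series, p):
    """Stack p lagged values as features; target is the next observation."""
    X = np.column_stack([series[i : len(series) - p + i] for i in range(p)])
    y = series[p:]
    return X, y

rng = np.random.default_rng(0)
t = np.arange(300)
# Synthetic stand-in for an efficiency indicator series (assumption).
series = np.sin(0.1 * t) + 0.05 * rng.standard_normal(300)

p = 5  # lag order (assumption)
X, y = make_lagged(series, p)
X_train, X_test = X[:250], X[250:]
y_train, y_test = y[:250], y[250:]

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000, random_state=0)
model.fit(X_train, y_train)
r2 = model.score(X_test, y_test)  # out-of-sample fit quality
```

A high out-of-sample R² here plays the role of the abstract's "output value highly fitted to the actual data": the nonlinear map from lagged values reproduces the series' dynamics.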


CFA Digest ◽  
2011 ◽  
Vol 41 (4) ◽  
pp. 70-71
Author(s):  
Luis Garcia-Feijoo

2011 ◽  
Author(s):  
Michael Jacobs ◽  
Ahmet K Karagozoglu

2011 ◽  
Vol 21 (1) ◽  
pp. 6-20 ◽  
Author(s):  
Michael Jacobs ◽  
Ahmet K. Karagozoglu

2020 ◽  
Vol 43 ◽  
Author(s):  
Myrthe Faber

Abstract: Gilead et al. state that abstraction supports mental travel, and that mental travel critically relies on abstraction. I propose an important addition to this theoretical framework, namely that mental travel might also support abstraction. Specifically, I argue that spontaneous mental travel (mind wandering), much like data augmentation in machine learning, provides the variability in mental content and context necessary for abstraction.


2020 ◽  
Author(s):  
Mohammed J. Zaki ◽  
Wagner Meira, Jr

2020 ◽  
Author(s):  
Marc Peter Deisenroth ◽  
A. Aldo Faisal ◽  
Cheng Soon Ong

Author(s):  
Lorenza Saitta ◽  
Attilio Giordana ◽  
Antoine Cornuejols
