Modeling Credit Risk: An Application of the Rough Set Methodology

Author(s):  
Reyes Samaniego Medina ◽  
Maria Jose Vazquez Cueto

The Basel Accords encourage credit entities to implement their own models for measuring financial risk. In this paper, we focus on the use of internal ratings-based (IRB) models for the assessment of credit risk and, specifically, on the component that models the probability of default (PD). The traditional methods used for modeling credit risk, such as discriminant analysis and logit and probit models, rest on several restrictive statistical assumptions. The rough set methodology avoids these limitations and as such is an alternative to the classic statistical methods. We apply the rough set methodology to a database of 106 companies that are applicants for credit. We obtain the ratios that best discriminate between financially sound and bankrupt companies, along with a series of decision rules that help detect operations potentially in default. Finally, we compare the results obtained using the rough set methodology to those obtained using classic discriminant analysis and logit models. We conclude that the rough set methodology yields better risk classification results.
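The logit benchmark that the abstract compares against can be sketched as follows. Everything below is invented for illustration (two synthetic "financial ratios" for 200 made-up companies), not the paper's 106-firm dataset; the point is only how a logit model turns ratios into an estimated PD:

```python
import numpy as np

# Hypothetical logit PD model fitted by gradient descent on the
# cross-entropy loss. The "ratios" and default labels are synthetic.
rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 2))            # two made-up financial ratios
true_w = np.array([1.5, -2.0])
p = 1 / (1 + np.exp(-(X @ true_w)))
y = (rng.random(n) < p).astype(float)  # 1 = default, 0 = financially sound

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

w = np.zeros(2)
for _ in range(2000):
    grad = X.T @ (sigmoid(X @ w) - y) / n   # gradient of the log-loss
    w -= 0.5 * grad

pd_hat = sigmoid(X @ w)                     # estimated PD per company
acc = np.mean((pd_hat > 0.5) == (y == 1))   # in-sample classification rate
```

The rough set approach replaces this single smooth score with a set of if-then decision rules over discretized ratios, which is what lets it drop the distributional assumptions.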

2020 ◽  
Vol 30 (1) ◽  
pp. 49-58
Author(s):  
Rute Q. de Faria ◽  
Amanda R. P. dos Santos ◽  
Deoclecio J. Amorim ◽  
Renato F. Cantão ◽  
Edvaldo A. A. da Silva ◽  
...  

The prediction of seed longevity (P50) is traditionally performed with the Probit model. However, because the survival data are binary (0, 1), the fit of the model can be compromised by the non-normality of the residuals. This leads to prediction losses, even though the data are partially smoothed by the Probit and Logit models. One way to reduce the effect of non-normality would be to apply the central limit theorem, which implies that non-normal residuals tend toward normality as the sample size n increases. The Logit and Probit models differ only in their underlying distributions (logistic and normal, respectively). We therefore developed a new estimation procedure that uses a small increase in the sample size n, and tested it with the Probit and Logit functions to improve the prediction of P50. The results showed that increasing the number of replicates from 4 to 6 improved the correctness index of the P50 prediction. The Logit model performed better than the Probit model, indicating that the estimation of P50 is more adequate when the data are fitted with the Logit function.
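The mechanics of a logit-based P50 estimate can be sketched as follows. Under a logit viability curve, viability(t) = 1 / (1 + exp(-(a + b·t))), so P50, the storage time at which viability falls to 50%, is -a/b. The storage times, replicate count, and germination outcomes below are all invented; a crude grid search stands in for proper maximum-likelihood fitting to keep the sketch dependency-free:

```python
import numpy as np

# Hypothetical P50 estimation with a logit viability curve.
# Six replicate seeds per storage time, all data synthetic.
rng = np.random.default_rng(1)
times = np.repeat(np.arange(0, 100, 10), 6)    # storage times (days), 6 reps each
a_true, b_true = 4.0, -0.08                    # implies P50_true = 50 days
p_true = 1 / (1 + np.exp(-(a_true + b_true * times)))
germinated = (rng.random(times.size) < p_true).astype(float)  # 1 = viable

def nll(a, b):
    """Negative log-likelihood of the Bernoulli germination outcomes."""
    p = 1 / (1 + np.exp(-(a + b * times)))
    p = np.clip(p, 1e-9, 1 - 1e-9)
    return -np.sum(germinated * np.log(p) + (1 - germinated) * np.log(1 - p))

# crude grid search over (a, b) instead of a real optimizer
grid_a = np.linspace(1, 7, 61)
grid_b = np.linspace(-0.2, -0.01, 96)
_, a_hat, b_hat = min((nll(a, b), a, b) for a in grid_a for b in grid_b)

p50 = -a_hat / b_hat   # estimated time (days) at which viability = 0.5
```

Swapping the logistic CDF for the normal CDF in `nll` would give the Probit variant of the same procedure.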


Author(s):  
Bello Malam Sa’idu

The objective of this paper is to investigate the linkage between poverty, inequality and Millennium Development Goals (MDGs) expenditure. To achieve this objective, probit and logit models were empirically employed on a panel data series. The results revealed that a unit increase in expenditure on the MDGs would increase poverty by a single digit and income inequality by double digits. This is not to blame or discourage MDG funding; plausibly, MDG expenditure has been constrained by technical, managerial, and institutional bottlenecks, macroeconomic imbalances, and policy constraints. Government and agencies should therefore ameliorate these constraints. This work is among the first to apply logit and probit models to explore the poverty-inequality-MDG expenditure nexus.


Author(s):  
Novan Wijaya

Credit risk evaluation is an important topic in financial risk management and a major focus in the banking sector. This research discusses a credit risk evaluation system using an artificial neural network model based on the backpropagation algorithm. The system trains and tests the neural network to determine the predicted credit risk class, whether high risk or low risk. The neural network uses 14 input neurons, nine hidden neurons and one output neuron, and the data come from a bank with branches in East Jakarta. The results showed that the neural network can be used effectively in the evaluation of credit risk, with an accuracy of 88% on 100 test records.
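A backpropagation network with the reported topology (14 inputs, a hidden layer of 9 units, 1 output) can be sketched as below. The training data are synthetic stand-ins, not the East Jakarta bank records, and the labels are generated from an invented linear rule purely so the sketch has something learnable:

```python
import numpy as np

# Minimal backpropagation sketch mirroring the reported 14-9-1 topology.
# All data are synthetic; 1 = high risk, 0 = low risk.
rng = np.random.default_rng(2)
n = 300
X = rng.normal(size=(n, 14))                       # 14 made-up applicant features
w_true = rng.normal(size=14)
y = (X @ w_true > 0).astype(float).reshape(-1, 1)  # invented risk labels

sigmoid = lambda z: 1 / (1 + np.exp(-z))
W1 = rng.normal(scale=0.5, size=(14, 9)); b1 = np.zeros(9)
W2 = rng.normal(scale=0.5, size=(9, 1));  b2 = np.zeros(1)

lr = 0.5
for _ in range(3000):
    H = sigmoid(X @ W1 + b1)            # forward pass: hidden activations
    out = sigmoid(H @ W2 + b2)          # forward pass: risk score in (0, 1)
    d_out = (out - y) / n               # cross-entropy gradient at the output
    d_H = (d_out @ W2.T) * H * (1 - H)  # backpropagate through the hidden layer
    W2 -= lr * (H.T @ d_out); b2 -= lr * d_out.sum(0)
    W1 -= lr * (X.T @ d_H);  b1 -= lr * d_H.sum(0)

acc = np.mean((out > 0.5) == (y == 1))  # in-sample classification accuracy
```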


2017 ◽  
Vol 21 (5) ◽  
pp. 997-1018 ◽  
Author(s):  
Arunabha Mukhopadhyay ◽  
Samir Chatterjee ◽  
Kallol K. Bagchi ◽  
Peter J. Kirs ◽  
Girja K. Shukla

2012 ◽  
Vol 02 (09) ◽  
pp. 38-46
Author(s):  
Khalili Araghi Maryam ◽  
Makvandi Sara

Amid extensive environmental change and the rapid development of technology, which have increasingly accelerated the economy, competition among enterprises has restricted profits and made the closure of bankrupt firms more likely. It therefore seems necessary to find a model that can predict the financial crisis and bankruptcy of companies. Significant progress in other sciences, such as computing and mathematics, has drawn the attention of financial scholars toward designing and using more exact methods such as Data Envelopment Analysis (DEA). This study uses the DEA technique to predict the bankruptcy likelihood of manufacturing firms and compares its predictive ability with two methods: Logit and Probit models. The sample includes all manufacturing firms listed on the Tehran Stock Exchange from 2000 to 2010. The results showed that the accuracy of the DEA-based model is 72%, while the predictive accuracy of the Logit and Probit models is 81% and 80%, respectively. DEA proved to be an effective tool for predicting the bankruptcy likelihood of manufacturing firms, but it was less efficient than the Logit and Probit models.
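The DEA machinery behind such a study reduces to one linear program per firm. The sketch below solves the input-oriented CCR envelopment model, min θ subject to Σλⱼxⱼ ≤ θx₀ and Σλⱼyⱼ ≥ y₀, for five invented firms with two inputs and one output; the data are not from the Tehran sample:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical DEA (input-oriented CCR) efficiency scores via linear
# programming. Five invented firms, two inputs, one identical output.
X = np.array([[2.0, 3.0], [4.0, 2.0], [4.0, 6.0], [6.0, 7.0], [3.0, 3.0]])
Y = np.array([[1.0], [1.0], [1.0], [1.0], [1.0]])

def efficiency(o):
    """CCR efficiency of firm o: min theta s.t. a lambda-combination of
    all firms matches firm o's output using at most theta of its inputs."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(1 + n); c[0] = 1.0          # variables: theta, lambda_1..n
    A_ub = np.zeros((m + s, 1 + n))
    b_ub = np.zeros(m + s)
    A_ub[:m, 0] = -X[o]                      # sum_j lambda_j x_j <= theta * x_o
    A_ub[:m, 1:] = X.T
    A_ub[m:, 1:] = -Y.T                      # sum_j lambda_j y_j >= y_o
    b_ub[m:] = -Y[o]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (1 + n))
    return res.fun

scores = [round(efficiency(o), 3) for o in range(len(X))]  # 1.0 = efficient
```

A bankruptcy-prediction application then thresholds such scores: firms far below the efficient frontier are flagged as likely failures, which is the classification whose accuracy the abstract compares with Logit and Probit.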


Author(s):  
Normaizatul Akma Saidi et al.

Banks play a significant role in financing the economy and, as specialized companies with their own specificities, take on risky financial activities based on information and trust. This study set out to unravel the determinants of financial risk (liquidity risk and credit risk) for conventional and Islamic banks. Bank-level data on conventional and Islamic banks in the Middle East, Southeast Asia, and South Asia between 2006 and 2014 were collected from Bankscope, a commercial database produced by Bureau van Dijk. For conventional banks, the results exhibited a significantly positive relationship between regulatory quality and liquidity risk, and a significantly negative relationship between regulatory quality and credit risk. For Islamic banks, the relationships of government effectiveness and regulatory quality with financial risk were insignificant. Through this study, regulators and policymakers can identify specific mechanisms to improve the risk management of these banks.


2013 ◽  
pp. 1225-1251
Author(s):  
Chun-Che Huang ◽  
Tzu-Liang (Bill) Tseng ◽  
Hao-Syuan Lin

Patent infringement risk is a significant issue for corporations due to the increased appreciation of intellectual property rights. If a corporation gives insufficient protection to its patents, it may lose both product profits and industry competitiveness. Many studies on patent infringement have focused on measuring patent trend indicators and the monetary value of patents. However, very few have attempted to develop a categorization mechanism for measuring and evaluating patent infringement risk: for example, categorizing patent infringement cases, determining the significant attributes, and inducing infringement decision rules. This study applies Rough Set Theory (RST), which is well suited to processing qualitative information, to induce rules and derive the significant attributes for categorizing patent infringement risk. Moreover, the concept hierarchy and the credibility index can be integrated with RST to enhance the application of the finalized decision rules.
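The core rough-set operation both of the RST abstracts rely on is the lower/upper approximation of a decision class under an indiscernibility relation. The toy "cases" and attribute values below are invented; the computation itself is the standard one:

```python
from collections import defaultdict

# Hypothetical patent cases: (condition attributes, decision),
# decision 1 = infringing. Cases 2 and 3 are deliberately inconsistent:
# identical attributes, different decisions.
cases = [
    (("claims_overlap", "same_market"), 1),
    (("claims_overlap", "same_market"), 1),
    (("claims_overlap", "diff_market"), 1),
    (("claims_overlap", "diff_market"), 0),
    (("no_overlap",     "same_market"), 0),
]

# indiscernibility classes: cases with identical condition attributes
blocks = defaultdict(set)
for i, (cond, _) in enumerate(cases):
    blocks[cond].add(i)

target = {i for i, (_, d) in enumerate(cases) if d == 1}  # "infringing" class

# lower approximation: blocks entirely inside the target (certain rules)
lower = set().union(*(b for b in blocks.values() if b <= target))
# upper approximation: blocks that touch the target (possible rules)
upper = set().union(*(b for b in blocks.values() if b & target))
boundary = upper - lower  # cases the attributes cannot classify with certainty
```

Decision rules are then read off the blocks: lower-approximation blocks yield certain rules, boundary blocks yield rules that need extra attributes (or, in this study's terms, a credibility index) to resolve.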

