default data
Recently Published Documents

TOTAL DOCUMENTS: 41 (five years: 16)
H-INDEX: 7 (five years: 2)

2021, Vol 2021, pp. 1-13
Author(s): Hong Liu, Mingkang Yuan, Meiling Zhou

In P2P lending with information asymmetry, the text description provided by the borrower plays an important role in alleviating the information asymmetry between borrowers and lenders. To explore loan descriptions and their relationship with default behavior, this article selects loans with repayment periods from April 2014 to October 2016 and studies their default data. Five text variables are mined from the descriptions: text length, loan purpose, repayment ability, repayment willingness, and degree of loan urgency. The empirical results show that text length has a significant negative correlation with the borrower's default probability, and that different loan purposes carry different default risks. Interestingly, the more urgent a loan is, the more likely the borrower is to default. However, repayment-ability and repayment-willingness information have no significant effect on default behavior. In addition, the Nagelkerke R² improves by nearly 3% when the text variables are added to the logistic regression model. In short, fully mining loan description information helps reduce the risk of loan default.
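A minimal sketch of how a text-derived feature can enter such a default model: pure-Python logistic regression fitted by gradient descent on invented toy data (no real loan data; the negative-coefficient outcome merely mirrors the direction the abstract reports).

```python
# Hypothetical illustration: a standardized "description length" feature
# predicting default (1 = default) via logistic regression.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Fit weights (w[0] is the bias) by batch gradient descent."""
    n_feat = len(X[0])
    w = [0.0] * (n_feat + 1)
    for _ in range(epochs):
        grad = [0.0] * (n_feat + 1)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            err = sigmoid(z) - yi          # gradient of the log-loss
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj - lr * g / len(X) for wj, g in zip(w, grad)]
    return w

# Toy data: longer descriptions default less often (invented numbers).
lengths = [-1.5, -1.0, -0.5, 0.0, 0.5, 1.0, 1.5, 2.0]
default = [1, 1, 1, 0, 1, 0, 0, 0]
w = fit_logistic([[x] for x in lengths], default)
print(w[1] < 0)  # prints True: a negative text-length coefficient
```

In the paper's setting the other four text variables would simply be additional columns of `X`, and Nagelkerke R² would compare the model with and without them.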


Complexity, 2021, Vol 2021, pp. 1-13
Author(s): Ying Chen, Ruirui Zhang

Aiming at the problem that a financial institution's credit card default data are unbalanced, which leads to unsatisfactory prediction results, this paper proposes a prediction model based on k-means SMOTE and a BP neural network. In this model, the k-means SMOTE algorithm is used to rebalance the data distribution, the importance of each feature is then calculated with a random forest, and these importances are used as the initial weights of the BP neural network for prediction. The model effectively solves the problem of sample imbalance. In addition, this paper builds five common machine learning models (KNN, logistic regression, SVM, random forest, and decision tree) and compares the classification performance of all six predictors. The experimental results show that the proposed algorithm greatly improves prediction performance, raising the AUC from 0.765 to 0.929. Moreover, initializing the BP neural network with the feature importances also slightly improves prediction accuracy. Compared with the other five models, the BP neural network achieves the best overall prediction performance.
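A hand-rolled sketch of the k-means SMOTE idea described above, not the authors' implementation (which would typically use a library such as imbalanced-learn): cluster the minority class, then synthesize new points by interpolating between neighbours inside each cluster. The minority points here are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

def kmeans_smote(X_min, n_new, k=2, iters=20):
    """Oversample minority points X_min by cluster-local interpolation."""
    # --- plain k-means on the minority samples ---
    centers = X_min[rng.choice(len(X_min), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X_min[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X_min[labels == j].mean(axis=0)
    # --- SMOTE-style interpolation inside each cluster ---
    new = []
    for _ in range(n_new):
        j = rng.integers(k)
        members = X_min[labels == j]
        if len(members) < 2:
            continue  # skip clusters too small to interpolate in
        a, b = members[rng.choice(len(members), 2, replace=False)]
        new.append(a + rng.random() * (b - a))
    return np.array(new)

# Two well-separated minority clumps (toy data)
X_min = np.array([[0.0, 0.0], [0.2, 0.1], [5.0, 5.0], [5.1, 4.9]])
synth = kmeans_smote(X_min, n_new=4)
print(synth.shape)  # (4, 2); each point lies between two cluster-mates
```

Clustering first keeps the synthetic points inside dense minority regions instead of interpolating across the gap between clumps, which is the advantage the k-means variant has over plain SMOTE.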


2021
Author(s): Annamária Laborczi, Gábor Szatmári, János Mészáros, Sándor Koós, Béla Pirkó, ...

<p>‘Strategic objective 1’ of the United Nations Convention to Combat Desertification (UNCCD) aims to improve the condition of affected ecosystems, combat desertification/land degradation, promote sustainable land management, and contribute to land degradation neutrality. The indicator ‘Proportion of land that is degraded over total land area’ (SO1) is compiled from three sub-indicators: ‘Trends in land cover’ (SO1-1), ‘Trends in land productivity or functioning of the land’ (SO1-2), and ‘Trends in carbon stocks above and below ground’ (SO1-3).</p><p>Soil organic carbon (SOC) stock can be adopted as the metric of SO1-3 until globally accepted methods for estimating total terrestrial carbon stocks are elaborated. SOC is one of the most important soil properties, showing not only spatial but also temporal variability. According to our previous results on this topic, we strongly recommend replacing the UNCCD default SOC stock data for Hungary with a country-specific estimate.</p><p>SOC stock maps were compiled in the framework of the DOSoReMI.hu (Digital, Optimized, Soil Related Maps and Information in Hungary) initiative, predicted by appropriate digital soil mapping (DSM) methods. Reference soil data were derived from a countrywide monitoring system, and the environmental covariates were selected on the basis of the SCORPAN model. The elaborated SOC stock mapping methodology has two components: (1) point support modelling, where SOC stock is computed at the level of the soil profile, and (2) spatial modelling (quantile regression forest), where spatial prediction and uncertainty quantification are carried out using the computed SOC stock values.</p><p>We analyzed how SOC stock changed between 1998 and 2016. Nationwide SOC stock predictions were compiled for the years 1998, 2010, 2013, and 2016. For the intermediate years we do not recommend calculating SOC stock values, because we have no information on the dynamics of change in the intervening years. Based on the 1998 SOC stock prediction, we also compiled a SOC stock map for 2018 using only land use conversion factors, according to the default data conversion values.</p><p>According to the elaborated scheme, no significant changes can be detected during the respective period; only trend-like SOC stock changes appear. Based on our results, we recommend using spatially predicted layers for all years for which data are available, rather than calculating SOC stock change based on land use conversion factors.</p><p><strong>Acknowledgment:</strong> Our research was supported by the Hungarian National Research, Development and Innovation Office (NKFIH; K-131820) and by the Premium Postdoctoral Scholarship of the Hungarian Academy of Sciences (PREMIUM-2019-390) (Gábor Szatmári).</p>
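An illustrative sketch of the uncertainty-quantification step: a quantile regression forest keeps the full distribution of training targets in each leaf, so any quantile of SOC stock can be read off per pixel. Only this final step is shown, on synthetic per-pixel samples (the SOC values and shapes are invented, not the study's data).

```python
import numpy as np

rng = np.random.default_rng(42)
# Pretend each row holds the pooled leaf-level SOC samples (t/ha)
# that the forest collected for one map pixel.
leaf_samples = rng.normal(loc=60.0, scale=8.0, size=(5, 200))

median = np.quantile(leaf_samples, 0.5, axis=1)        # point prediction
lo, hi = np.quantile(leaf_samples, [0.05, 0.95], axis=1)
width = hi - lo                                        # 90% interval width
print(median.shape, width.shape)  # (5,) (5,): one value per pixel
```

Mapping `width` alongside `median` is what turns the prediction into a prediction-plus-uncertainty product, which is how change between two map years can be judged significant or not.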


2020, Vol 62 (5), pp. 441-447
Author(s): Michał Stopel, Piotr Aleksandrowicz, Dariusz Skibicki

Abstract The study presents an analysis of the frontal impact of a Dodge Grand Caravan colliding with different obstacles: a rigid barrier, at full and 50% overlap, and a pole of 0.5 m diameter. Simulations were performed in the V-Sim 4 program with three data sets: the program's default data, data established by experts on the basis of their experience, and data identified from an impact simulation in the LS-Dyna program. The results show that a precise simulation is attainable only when the V-Sim input data are formulated in accordance with an advanced FEM (finite element) analysis; this chiefly concerns cases without full overlap between the vehicle body and the obstacle. The results of this work may offer directions for the development of this type of simulation program. They also show users how important accurate input data are for obtaining reliable calculation results.


2020, Vol 10 (18), pp. 6212
Author(s): Piotr Aleksandrowicz

The analyses performed by experts are crucial for the settlement of court disputes, and they have legal consequences for the parties to legal proceedings; the reliability of the simulation result is therefore essential. First, in this article, an impact simulation was performed using the program's default data. Next, the impact parameters were identified from a crash test and a second simulation was presented. Owing to the difficulty of obtaining identified data, experts usually resort to simplifications, using only the default data provided by the simulation program. This article includes original conclusions on the specific reasons for simplified collision modeling in Multi Body Systems (MBS) programs and provides specific directions for developing the V-SIM4 program used in the study to enhance the models applied. The manuscript indicates a direction for crash model development in MBS programs: modeling varied stiffness across 3D body zones, related to the structure of the car body and its internal elements, instead of modeling the car body as a solid with an average stiffness. Such an approach would provide an alternative to conventional Finite Element Method (FEM) modeling.


2020, Vol 8 (1), pp. 1
Author(s): Colin Ellis

Corporate bond defaults in different sectors often increase suddenly at roughly similar times, although some sectors see default rates jump earlier than others. This could reflect contagion among sectors—specifically, defaults in one sector leading to credit stresses in other sectors of the economy that would not otherwise have seen stresses. To complicate matters, simple correlation-based tests for contagion are often biased, reflecting increased volatility in periods of stress. This paper uses sectoral default data from over 30 sectors to test for signs of contagion over the past 30 years. While jumps in sectoral default rates do often coincide, there is no consistent evidence of contagion across different periods of stress from unbiased test results. Instead, coincident jumps in sectoral default rates are likely to reflect common macroeconomic shocks.
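The volatility bias mentioned above can be sketched with the standard adjustment in the spirit of Forbes and Rigobon: a cross-sector correlation measured in a stress period rises mechanically when variance rises, so an unbiased contagion test rescales it first. The numbers below are illustrative, not from the paper.

```python
import math

def adjusted_corr(rho, var_stress, var_calm):
    """Scale a stress-period correlation down for the variance jump."""
    delta = var_stress / var_calm - 1.0          # relative rise in variance
    return rho / math.sqrt(1.0 + delta * (1.0 - rho ** 2))

rho_raw = 0.6  # raw correlation of two sectors' default rates in stress
adj = adjusted_corr(rho_raw, var_stress=4.0, var_calm=1.0)
print(round(adj, 3))  # well below the raw 0.6
```

If the adjusted correlation is no longer significantly above its calm-period level, the coincident jumps are better explained by a common shock than by contagion, which is the paper's conclusion.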


Author(s): David Erdos

Drawing on the results of an extensive questionnaire of European Data Protection Authorities (DPAs), this chapter explores these regulators' substantive orientation and detailed approach to standard-setting in the area of professional journalism under the Data Protection Directive. As regards news production, a large majority of DPAs accepted that the special expressive purposes derogation was engaged. Notwithstanding a greater emphasis on an internal balancing of rights within default data protection norms, this also remained the plurality view as regards news archives. Detailed standard-setting was explored through hypothetical scenarios relating to undercover investigative journalism and data subject access demands made of journalists. It was found that, notwithstanding conflicts in many cases with statutory transparency and sensitive data provisions, all DPAs accepted the essential legitimacy of undercover journalism, and over one-third required only that such activity conform to a permissive public interest test that did not explicitly incorporate a necessity threshold. In contrast, a much stricter approach was taken to the articulation of standards relating to subject access, with over one-third arguing that, aside from protecting information relating to sources, journalists would be obliged to comply with the default rules here in full. This difference may be linked to the divergent treatment of these issues within self-regulatory media codes: whilst almost all set down general ‘ethical’ norms applicable to undercover journalism, almost none did so as regards subject access. Despite the general tendency to ‘read down’ statutory provisions relating to undercover journalism, the severity of a DPA's approach to each scenario remained strongly correlated with the stringency of local law applicable to journalism.


Author(s): David Erdos

This chapter explores the significance of the book’s empirical and normative study of the interface between European data protection regulation and professional journalists, artists, and both academic and non-academic writers within the contemporary online media. The study has elucidated practical attempts at regulating professional journalism through a contextual rights-balancing paradigm, argued that this should be generalized to other traditional publishers, and proposed that it be systematically developed through co-regulation and strategic enforcement. It is contended that, notwithstanding the rise of new online media, an examination of the regulation of traditional publishers still has strong significance in and of itself. These actors continue to possess disproportionate information power and perform a vital role in distilling, explaining and putting new information and ideas into the public realm. The themes of the book may also contribute to thinking on new online media regulation. Whilst such media often does not orientate itself towards a public discourse, some kind of contextual balancing (even if often internal to default data protection norms) remains necessary. Co-regulation, encompassing not just platforms but also users, could also play some role in specifying that (albeit stricter) balance. Finally, not least given the severe resource constraints of Data Protection Authorities (DPAs), strategic enforcement is likely to be necessary in this context also. Through engagement with both traditional and new media, data protection is becoming a holistic regulator of the information ecosystem, thereby highlighting its importance within contemporary society.

