Preparing Simulations in Large Value Payment Systems using Historical Data

Author(s):  
Ronald Heijmans ◽  
Richard Heuver

Simulations in large value payment systems have become a common tool for stress scenario analysis, often using historical data, because actual disruptions in payment systems are rare. Simulating realistic scenarios requires adequate preparation. As part of that preparation, it is essential 1) to have a thorough understanding of the structure of the investigated market, 2) to remove, where appropriate, certain types of transactions, such as funding-related transactions (interbank loans), and 3) to understand how banks react to a shock. The financial crisis that began in the summer of 2007 caused several stressful events worldwide and provided insight into how banks behaved during these events.
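
As a rough illustration of the second preparation step, the sketch below filters funding-related transactions out of a historical transaction file before it is handed to a simulator. The file name, column names and transaction-type labels are hypothetical and not taken from the paper.

```python
import pandas as pd

# Hypothetical input: one row per settled payment, with a transaction-type label.
payments = pd.read_csv("rtgs_transactions.csv", parse_dates=["settle_time"])

# Preparation step 2: drop funding-related transactions such as interbank loans
# before running the stress scenario.
FUNDING_TYPES = {"interbank_loan", "loan_repayment"}   # placeholder labels
simulation_input = payments[~payments["tx_type"].isin(FUNDING_TYPES)].copy()

# Quick check of how much value the filter removed from the scenario input.
removed_value = payments.loc[payments["tx_type"].isin(FUNDING_TYPES), "value"].sum()
print(f"Removed {removed_value:,.0f} in funding-related value")
```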

2019 ◽  
Vol 122 (1) ◽  
pp. 681-699 ◽  
Author(s):  
E. Tattershall ◽  
G. Nenadic ◽  
R. D. Stevens

Research topics rise and fall in popularity over time, some more swiftly than others. The fastest rising topics are typically called bursts; for example “deep learning”, “internet of things” and “big data”. Being able to automatically detect and track bursty terms in the literature could give insight into how scientific thought evolves over time. In this paper, we take a trend detection algorithm from stock market analysis and apply it to over 30 years of computer science research abstracts, treating the prevalence of each term in the dataset like the price of a stock. Unlike previous work in this domain, we use the free text of abstracts and titles, resulting in a finer-grained analysis. We report a list of bursty terms, and then use historical data to build a classifier to predict whether they will rise or fall in popularity in the future, obtaining accuracy in the region of 80%. The proposed methodology can be applied to any time-ordered collection of text to yield past and present bursty terms and predict their probable fate.
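
The abstract does not spell out the trend-detection algorithm itself, but a minimal sketch of the general idea, treating each term's yearly prevalence like a stock price and scoring it with a moving-average-difference signal, might look like the following; the input DataFrame, window lengths and ranking rule are assumptions, not the authors' settings.

```python
import pandas as pd

def burst_signal(prevalence: pd.Series, fast: int = 12, slow: int = 26) -> pd.Series:
    """Stock-market-style trend signal: difference between a fast and a slow
    exponential moving average of a term's prevalence over time."""
    return (prevalence.ewm(span=fast, adjust=False).mean()
            - prevalence.ewm(span=slow, adjust=False).mean())

def top_bursts(prevalence_by_term: pd.DataFrame, k: int = 20) -> pd.Series:
    """prevalence_by_term: rows indexed by year, one column per term holding the
    fraction of abstracts containing that term (hypothetical input format)."""
    signals = prevalence_by_term.apply(burst_signal)
    return signals.max().sort_values(ascending=False).head(k)   # strongest bursts
```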


2013 ◽  
Vol 29 (02) ◽  
pp. 84-91
Author(s):  
Stefanos Koullias ◽  
Santiago Balestrini Robinson ◽  
Dimitri N. Mavris

The purpose of this study is to obtain insight into surface effect ship (SES) endurance as a function of geometry, displacement, and technology level, without reliance on historical data. First-principles models of the resistance, structures, and propulsion system are developed and integrated to predict large SES endurance and to suggest the directions that future large SESs will take. It is found that large SES designs are dominated by structural weight, which indicates the need for advanced materials and complex structures, and that advanced propulsion cycles can increase endurance by up to 33%. SES endurance is shown to be a nonlinear, discontinuous function of geometry, displacement, and technology level that cannot be predicted by simplified models or assumptions.
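
The paper's first-principles resistance, structural and propulsion models are not reproduced here; purely to fix ideas, a drastically simplified endurance relation (hours of operation until a fuel allowance is burned at the power needed to overcome total resistance) is sketched below with placeholder values.

```python
def endurance_hours(resistance_kN: float, speed_m_s: float, fuel_mass_kg: float,
                    sfc_kg_per_kWh: float = 0.20,
                    propulsive_efficiency: float = 0.65) -> float:
    """Simplified endurance estimate; every default here is a placeholder,
    not a figure from the paper."""
    power_kW = resistance_kN * speed_m_s / propulsive_efficiency   # kN * m/s = kW
    fuel_burn_kg_per_h = power_kW * sfc_kg_per_kWh
    return fuel_mass_kg / fuel_burn_kg_per_h

# Example: 900 kN of total resistance at 20 m/s with 400 t of usable fuel.
print(f"{endurance_hours(900.0, 20.0, 400_000):.0f} h")
```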


2020 ◽  
Vol 13 (9) ◽  
pp. 122
Author(s):  
Eddison T. Walters

In Eddison Walters Modern Economic Analysis Theory, the researcher called for future economic research to consider the potential effect of technological advancement on the analysis of economic data, a paradigm shift in economic analysis intended to significantly reduce the potential for error due to data distortion. The foundation of the world's economy is the sharing of information, yet very little attention has been given to the effect of technological advancement on the analysis of data. The researcher of the current study highlighted the critical role of information sharing in the development of the world's economy in the past, as well as its critical role in the world's economy today. Advancement in technology has drastically improved the sharing of information and has led to the globalized economy. The lack of evidence supporting the widely accepted explanation of the Global Financial Crisis of 2007 and 2008 prompted the current researcher's investigation, which aimed at gaining insight into the economic factors responsible for the conditions contributing to the crisis. Walters (2018) presented evidence suggesting that no financial bubble existed before the Global Financial Crisis of 2007 and 2008. That study resulted in the development of the “Eddison Walters Risk Expectation Theory of The Global Financial Crisis of 2007 and 2008”, which presented an alternative explanation for the financial crisis, and the researcher called for additional investigation into the nature of its causes. Further investigation in Walters (2019) provided evidence supporting the idea that technological advancement led to the rapid growth in home prices before the Global Financial Crisis of 2007 and 2008. The nonlinear regression analysis in Walters (2019) yielded an adjusted R-squared of 0.989, a mean of the dependent variable of 194.041, a standard error of the regression of 5.908, and a sum of squared residuals of 488.726. The dependent variable in that study was “home purchase price” and the independent variable was “advancement in technology”. The current study continues the investigation into the factors described in the literature as setting the conditions that led to the Global Financial Crisis of 2007 and 2008. Gaining insight into the effect of technological advancement on the significant increase in consumer debt and consumer lending prior to the Global Financial Crisis of 2007 and 2008 will significantly contribute to the understanding of the economic environment before the crisis.
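
Neither the data nor the functional form of the Walters (2019) regression are given in the abstract; as a hedged illustration only, the sketch below fits an arbitrary nonlinear (exponential) specification of price on a technology proxy with SciPy and computes the same kinds of diagnostics quoted above.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical data: a technology-advancement proxy (x) and home purchase prices (y).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 40)
y = 80.0 * np.exp(0.12 * x) + rng.normal(0.0, 5.0, x.size)

def model(x, a, b):
    return a * np.exp(b * x)        # illustrative specification, not Walters (2019)

params, _ = curve_fit(model, x, y, p0=(80.0, 0.1))
resid = y - model(x, *params)
n, k = y.size, len(params)
ss_res = float(np.sum(resid ** 2))                  # sum of squared residuals
ss_tot = float(np.sum((y - y.mean()) ** 2))
r2 = 1.0 - ss_res / ss_tot
adj_r2 = 1.0 - (1.0 - r2) * (n - 1) / (n - k - 1)   # adjusted R-squared
se_reg = np.sqrt(ss_res / (n - k))                  # standard error of the regression
print(f"adj R^2={adj_r2:.3f}  S.E.={se_reg:.3f}  SSR={ss_res:.3f}  mean(y)={y.mean():.3f}")
```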


Author(s):  
Derek Peterson ◽  
Caroline Howard

As e-commerce is increasingly critical to organizational survival in the 21st century global marketplace, business organizations are challenged with selecting the payment alternatives that best meet both their requirements and the needs of their customers. This paper develops and validates a performance-based tool, the Electronic Payment Efficacy Quotient (EPEQ), designed to assist merchants in selecting the appropriate electronic payment system (EPS) and measuring its effectiveness. The research addresses the need for EPS research that aids merchants in selecting and using EPS. The paper presents the case study of a single source Internet Service Provider (ISP), which was analyzed to determine the merchant's needs regarding EPS and to develop measures. Historical data were then used to determine and test the validity of the most effective alternative measures. The paper concludes with recommendations for future research to assist in optimizing merchant use of EPS.
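
The abstract does not define the EPEQ formula, so the sketch below is a purely hypothetical stand-in: it scores each payment alternative on historical transaction aggregates by completion rate weighted by the share of revenue the merchant retains after fees, and picks the highest-scoring EPS.

```python
from dataclasses import dataclass

@dataclass
class PaymentHistory:
    """Hypothetical per-EPS aggregates drawn from a merchant's historical data."""
    attempted: int          # transactions attempted
    completed: int          # transactions successfully settled
    gross_revenue: float    # revenue from completed transactions
    fees: float             # processing fees and chargeback costs

def efficacy_score(h: PaymentHistory) -> float:
    """Placeholder score (not the paper's EPEQ): completion rate times the
    fraction of gross revenue retained after fees."""
    return (h.completed / h.attempted) * ((h.gross_revenue - h.fees) / h.gross_revenue)

candidates = {
    "card_gateway": PaymentHistory(10_000, 9_400, 250_000.0, 7_500.0),
    "e_wallet": PaymentHistory(8_000, 7_900, 198_000.0, 9_900.0),
}
print(max(candidates, key=lambda name: efficacy_score(candidates[name])))
```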


2017 ◽  
Vol 85 (2) ◽  
pp. 228-246
Author(s):  
Jan Boon ◽  
Koen Verhoest ◽  
Bruno De Borger

This study contributes to our understanding of the characteristics of public organizations that are more likely to outsource administrative overhead. Despite the climate of ongoing crisis that urges public organizations to focus their resources on core tasks, little is known about the characteristics of organizations that hive off the delivery of non-essential administrative overhead processes to the private sector. This study estimates a panel data Tobit model to test whether structural, institutional and political characteristics differ in their effects on the probability of outsourcing and on the degree of outsourcing of administrative overhead. We find that organizational size, formal autonomy, inertia and time matter for understanding the outsourcing of public organizations.

Points for practitioners
Across the globe, governments have turned to a rationalization of administrative overhead in response to austerity demands posed by the global financial crisis. The present study shows that large differences exist between organizations in terms of their propensity to turn to the private sector – one of the classic recipes for achieving efficiency gains – for the delivery of administrative overhead, and helps practitioners gain insight into the determinants of administrative overhead outsourcing.
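
The authors' panel specification is not reproduced here, but a minimal pooled sketch of the Tobit building block mentioned above (a dependent variable left-censored at zero, fitted by maximum likelihood) could look like the following; the censoring point, regressors and optimizer settings are assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def tobit_negloglik(params: np.ndarray, X: np.ndarray, y: np.ndarray) -> float:
    """Negative log-likelihood of a Tobit model left-censored at zero."""
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)
    xb = X @ beta
    cens = y <= 0
    ll_cens = norm.logcdf(-xb[cens] / sigma).sum()                        # P(y* <= 0)
    ll_unc = (norm.logpdf((y[~cens] - xb[~cens]) / sigma) - np.log(sigma)).sum()
    return -(ll_cens + ll_unc)

def fit_tobit(X: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Maximum-likelihood fit; returns the stacked vector [beta..., log sigma]."""
    start = np.append(np.linalg.lstsq(X, y, rcond=None)[0], 0.0)          # OLS start
    return minimize(tobit_negloglik, start, args=(X, y), method="BFGS").x
```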


Focaal ◽  
2017 ◽  
Vol 2017 (78) ◽  
pp. 1-8 ◽  
Author(s):  
Marguerite van den Berg ◽  
Bruce O’Neill

Nearly a decade after the global financial crisis of 2008, this thematic section investigates one way in which marginalization and precarization appear: boredom. An increasingly competitive global economy has fundamentally changed the coordinates of work and class in ways that have led to a changing engagement with boredom. Long thought of as an affliction of prosperity, boredom has recently emerged as an ethnographically observed plight of the most economically vulnerable. Drawing on fieldwork from postsocialist Europe and postcolonial Africa, this thematic section explores the intersection of boredom and precarity in order to gain new insight into the workings of advanced capitalism. It experiments with ways of theorizing the changing relationship between status, production, consumption, and the experience of excess free time. These efforts are rooted in a desire to make sense of the precarious forms of living that proliferated in the aftermath of the global financial crisis and that continue to endure a decade later.


2010 ◽  
Vol 51 (2) ◽  
pp. 171-194 ◽  
Author(s):  
Oane Visser ◽  
Don Kalb

Looking for new ways to interpret the failings of the neo-liberal economy, this article argues that financialised capitalism at the eve of the 2008 financial crisis showed striking analogies with the characteristic combination of oligopoly and informality of the Soviet economy at the eve of its collapse. State capture by oligopolists, a large "virtual economy", the inability of agencies to obtain insight into economic and financial operations, the short term orientations of managers not coinciding with enterprise viability, and a "mystification of risk" by high science are some of the analogies to be discussed. It is argued that not only the origins but also the aftermath of the crisis may show significant analogies.


2000 ◽  
Vol 03 (03) ◽  
pp. 443-450 ◽  
Author(s):  
NEIL F. JOHNSON ◽  
MICHAEL HART ◽  
PAK MING HUI ◽  
DAFANG ZHENG

We explore various extensions of Challet and Zhang's Minority Game in an attempt to gain insight into the dynamics underlying financial markets. First we consider a heterogeneous population where individual traders employ differing "time horizons" when making predictions based on historical data. The resulting average winnings per trader is a highly non-linear function of the population's composition. Second, we introduce a threshold confidence level among traders below which they will not trade. This can give rise to large fluctuations in the "volume" of market participants and the resulting market "price".
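
The extensions studied in the paper (heterogeneous time horizons and a confidence threshold) are not reproduced here; as a baseline, a minimal sketch of the underlying Challet-Zhang Minority Game, which those extensions modify, is given below with placeholder population, memory and strategy sizes.

```python
import numpy as np

def minority_game(n_agents=301, memory=3, n_strategies=2, n_steps=2000, seed=0):
    """Basic Minority Game: each agent plays its best-scoring strategy and the
    minority side wins each round. Returns the attendance (net action) series."""
    rng = np.random.default_rng(seed)
    n_hist = 2 ** memory
    # A strategy maps each of the 2^m possible recent histories to an action in {-1, +1}.
    strategies = rng.choice([-1, 1], size=(n_agents, n_strategies, n_hist))
    scores = np.zeros((n_agents, n_strategies))
    history = int(rng.integers(n_hist))          # bit-encoded recent outcomes
    attendance = np.empty(n_steps)

    for t in range(n_steps):
        best = scores.argmax(axis=1)                                  # best strategy per agent
        actions = strategies[np.arange(n_agents), best, history]
        a_t = int(actions.sum())
        attendance[t] = a_t
        winning = -np.sign(a_t) if a_t != 0 else rng.choice([-1, 1])  # minority action wins
        scores += (strategies[:, :, history] == winning)              # reward correct predictions
        history = ((history << 1) | int(winning == 1)) % n_hist       # roll the history window
    return attendance
```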


Author(s):  
Edgars Pudzis ◽  
Sanda Geipele ◽  
Ineta Geipele

The research provides insight into village development planning and considers village planning from the perspective of the national planning framework, taking local settings of village development into account. The research provides information about possible approaches for involving local communities in development decision-making. The article aims to assess the current situation of local community involvement in the advancement of local territories and to present proposals for public involvement models. Analysis, logical and historical data access methods, induction and deduction have been used in the present research.


Author(s):  
Xiyue Li ◽  
Gary Yohe

This chapter offers results from an artificial simulation exercise that was designed to answer three fundamental questions that lie at the heart of anticipatory adaptation. First, how can confidence in projected vulnerabilities and impacts be greater than the confidence in attributing what has heretofore been observed? Second, are there characteristics of recent historical data series that do or do not portend our achieving high confidence in attribution to climate change in support of framing adaptation decisions in an uncertain future? And finally, what can analysis of confidence in attribution tell us about ranges of “not-implausible” extreme futures vis-à-vis projections based at least implicitly on an assumption that the climate system is static? An extension of the IPCC method of assessing our confidence in attribution to anthropogenic sources of detected warming presents an answer to the first question. It is also possible to identify characteristics that support an affirmative answer to the second. Finally, this chapter offers some insight into the significance of our attribution methodology in informing attempts to frame considerations of potential extremes and how to respond.
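
The chapter's extension of the IPCC attribution method is not reproduced here; purely as an illustration of asking whether a historical series already shows a trend that stands out from its own variability, a simple signal-to-noise check is sketched below.

```python
import numpy as np
from scipy import stats

def trend_signal_to_noise(series: np.ndarray) -> tuple[float, float]:
    """Illustrative detection check (not the chapter's method): fit a linear
    trend and compare the total change it implies with residual variability."""
    t = np.arange(series.size)
    fit = stats.linregress(t, series)
    residual_sd = np.std(series - (fit.intercept + fit.slope * t), ddof=2)
    return fit.slope, fit.slope * series.size / residual_sd
```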

