Long-Run Growth of Financial Data Technology

2020 ◽  
Vol 110 (8) ◽  
pp. 2485-2523
Author(s):  
Maryam Farboodi ◽  
Laura Veldkamp

“Big data” financial technology raises concerns about market inefficiency. A common concern is that the technology might induce traders to extract others’ information, rather than to produce information themselves. We allow agents to choose how much they learn about future asset values or about others’ demands, and we explore how improvements in data processing shape these information choices, trading strategies and market outcomes. Our main insight is that unbiased technological change can explain a market-wide shift in data collection and trading strategies. However, in the long run, as data processing technology becomes increasingly advanced, both types of data continue to be processed. Two competing forces keep the data economy in balance: data resolve investment risk, but future data create risk. The efficiency results that follow from these competing forces upend two pieces of common wisdom: our results offer a new take on what makes prices informative and whether trades typically deemed liquidity-providing actually make markets more resilient. (JEL C55, D83, G12, G14, O33)
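The information-choice trade-off lends itself to a small numerical illustration. The sketch below is a toy objective of our own, not the authors' model: a mean-variance investor splits a fixed data-processing capacity between a signal about the asset payoff ("fundamental data") and a signal about other traders' demand ("demand data"), and the optimal split stays interior as capacity grows, loosely echoing the result that both types of data continue to be processed in the long run.

```python
# A minimal sketch (not the authors' model) of the information-choice
# trade-off: split a fixed data-processing budget between fundamental
# and demand signals and see where the optimum lies.
import numpy as np

def posterior_payoff_variance(kappa_f, var_payoff=1.0):
    """Residual payoff uncertainty after observing a fundamental
    signal with precision kappa_f (Bayesian normal updating)."""
    return 1.0 / (1.0 / var_payoff + kappa_f)

def expected_utility(split, capacity, var_payoff=1.0, var_demand=1.0,
                     demand_weight=0.5):
    """Stylized objective: demand data reduce uncertainty about others'
    trades; fundamental data reduce payoff risk. `split` is the share
    of capacity spent on fundamental data."""
    kappa_f = split * capacity
    kappa_d = (1.0 - split) * capacity
    residual_risk = (posterior_payoff_variance(kappa_f, var_payoff)
                     + demand_weight / (1.0 / var_demand + kappa_d))
    return -residual_risk  # a mean-variance investor dislikes residual risk

# Trace the optimal split as processing capacity grows.
for capacity in [1.0, 4.0, 16.0, 64.0]:
    splits = np.linspace(0.0, 1.0, 201)
    utils = [expected_utility(s, capacity) for s in splits]
    best = splits[int(np.argmax(utils))]
    print(f"capacity={capacity:5.1f}  optimal fundamental share={best:.2f}")
```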

Author(s):  
Yacine Aït-Sahalia ◽  
Jean Jacod

High-frequency trading is an algorithm-based computerized trading practice that allows firms to trade stocks in milliseconds. Over the last fifteen years, the use of statistical and econometric methods for analyzing high-frequency financial data has grown exponentially. This growth has been driven by the increasing availability of such data, the technological advancements that make high-frequency trading strategies possible, and practitioners' need to analyze these data. This comprehensive book introduces readers to these emerging methods and tools of analysis. The book covers the mathematical foundations of stochastic processes, describes the primary characteristics of high-frequency financial data, and presents the asymptotic concepts that their analysis relies on. It also deals with the estimation of the volatility portion of the model, including methods that are robust to market microstructure noise, and addresses estimation and testing questions involving the jump part of the model. As the book demonstrates, the practical importance and relevance of jumps in financial data are universally recognized, but only recently have econometric methods become available to rigorously analyze jump processes. The book approaches high-frequency econometrics with a distinct focus on the financial side of matters while maintaining technical rigor, which makes it invaluable to researchers and practitioners alike.
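One technique in this literature can be sketched in a few lines. Below is a minimal, simulated illustration (not taken from the book) of separating the continuous and jump parts of high-frequency returns by comparing realized variance, which picks up both, with bipower variation, which is robust to jumps.

```python
# A minimal sketch: realized variance (RV) vs. bipower variation (BV)
# on simulated one-minute returns; their difference estimates the jump
# contribution. Real use would plug in actual intraday returns.
import numpy as np

rng = np.random.default_rng(0)
n = 390                        # one-minute returns over a trading day
sigma = 0.01 / np.sqrt(n)      # diffusive volatility per interval
r = rng.normal(0.0, sigma, n)  # continuous part
r[200] += 0.005                # inject one jump

rv = np.sum(r**2)                                  # -> integrated var + jumps
bv = (np.pi / 2) * np.sum(np.abs(r[1:] * r[:-1]))  # -> integrated var only
jump_part = max(rv - bv, 0.0)                      # estimated jump variation

print(f"RV={rv:.3e}  BV={bv:.3e}  estimated jump variation={jump_part:.3e}")
```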


2021 ◽  
Vol 14 (7) ◽  
pp. 1167-1174
Author(s):  
Zsolt István ◽  
Soujanya Ponnapalli ◽  
Vijay Chidambaram

Most modern data processing pipelines run on top of a distributed storage layer, and securing the whole system, and the storage layer in particular, against accidental or malicious misuse is crucial to ensuring compliance with rules and regulations. Enforcing data protection and privacy rules, however, stands at odds with the requirement to achieve ever higher access bandwidths and processing rates in large data processing pipelines. In this work we describe our proposal for a path forward that reconciles the two goals. We call our approach "Software-Defined Data Protection" (SDP). Its premise is simple, yet powerful: decoupling often-changing policies from request-level enforcement allows distributed smart storage nodes to implement the latter at line rate. Existing and future data protection frameworks can be translated to the same hardware interface, which allows storage nodes to offload enforcement efficiently, both for company-specific rules and for regulations such as GDPR or CCPA. While SDP is a promising approach, several challenges remain in making this vision a reality. As we explain in the paper, overcoming them will require collaboration across several domains, including security, databases, and specialized hardware design.
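A minimal sketch of the SDP premise, with a hypothetical policy table and storage-node interface of our own invention rather than the authors' design, might look as follows: policies are compiled in a control plane, and per-request enforcement reduces to a cheap lookup-and-filter step that a smart storage node can run at line rate.

```python
# A hypothetical sketch of decoupling policy from enforcement:
# often-changing policies are compiled into a compact per-(object, role)
# rule table; the data plane applies it to every request.
from dataclasses import dataclass

@dataclass(frozen=True)
class CompiledRule:
    # Precompiled decision: which fields a role may read for an object class.
    allowed_fields: frozenset

# Control plane: translate a high-level policy (e.g., a GDPR-style
# purpose-limitation rule) into per-(object_class, role) entries.
POLICY_TABLE = {
    ("customer_record", "analytics"): CompiledRule(frozenset({"country", "age"})),
    ("customer_record", "support"):   CompiledRule(frozenset({"name", "email"})),
}

def storage_node_read(object_class, role, record):
    """Data plane: per-request enforcement is a dictionary lookup plus
    field filtering -- cheap enough for a smart storage node."""
    rule = POLICY_TABLE.get((object_class, role))
    if rule is None:
        raise PermissionError("no policy entry: deny by default")
    return {k: v for k, v in record.items() if k in rule.allowed_fields}

record = {"name": "Ada", "email": "ada@example.com", "country": "DK", "age": 36}
print(storage_node_read("customer_record", "analytics", record))
```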


Demography ◽  
2017 ◽  
Vol 54 (5) ◽  
pp. 1773-1793 ◽  
Author(s):  
Sara Cools ◽  
Simen Markussen ◽  
Marte Strøm

2002 ◽  
Vol 1 (2) ◽  
Author(s):  
Timothy J. Tardiff

This paper addresses the fundamental question of what costs and prices would look like under competitive conditions and how closely the FCC's total element long-run incremental cost (TELRIC) pricing rules allow one to approximate such competitive outcomes. We consider what types of firms would enter competitive network industries, what effect new entry would have on the asset values and prices of incumbent firms, and what impact competition would have on (1) the types and vintages of capital equipment, (2) prices for that equipment, and (3) conditions in the operating environment. The paper concludes by highlighting alternative pricing proposals offered by contending parties and identifying the major drivers that explain what have proven to be large differences among competing proposals.
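The forward-looking cost arithmetic behind TELRIC-style prices can be illustrated with a toy annuity calculation; the parameter values below are illustrative, not the FCC's rules: annualize the investment in a network element with a capital recovery factor, add operating expense, and divide by demand.

```python
# A toy illustration of forward-looking, long-run incremental cost
# pricing: annualized capital cost plus opex, spread over demand.
def capital_recovery_factor(rate, years):
    """Annuity factor converting an up-front investment into equal
    annual payments over the asset's economic life."""
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

investment = 1_000_000.0   # forward-looking cost of the element ($)
life_years = 10            # economic life
cost_of_capital = 0.10     # weighted average cost of capital
annual_opex = 50_000.0     # operating expense ($/year)
annual_demand = 20_000     # units (e.g., loops) served per year

annual_cost = (investment * capital_recovery_factor(cost_of_capital, life_years)
               + annual_opex)
print(f"per-unit monthly price: ${annual_cost / annual_demand / 12:.2f}")
```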


2021 ◽  
Vol 3 (2) ◽  
pp. 183-198
Author(s):  
Henrik Kleven ◽  
Camille Landais ◽  
Jakob Egholt Søgaard

This paper investigates whether the impact of children on the labor market outcomes of women relative to men—child penalties—can be explained by the biological links between mother and child. We estimate child penalties in biological and adoptive families using event studies around the arrival of children and almost 40 years of adoption data from Denmark. Short-run child penalties are slightly larger for biological mothers than for adoptive mothers, but their long-run child penalties are virtually identical and precisely estimated. This suggests that biology is not a key driver of child-related gender gaps. (JEL J12, J13, J16)
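The event-study design can be sketched with simulated data (the paper itself uses Danish register and adoption data): regress earnings on event-time dummies around the arrival of the first child, omitting the year before birth as the reference period, and read the penalty off the post-event coefficients.

```python
# A minimal event-study sketch on simulated panel data; the long-run
# penalty of 20% is an assumption built into the simulation.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
rows = []
for i in range(500):                       # 500 simulated mothers
    for t in range(-5, 11):                # event time: years around birth
        penalty = -0.2 if t >= 0 else 0.0  # assumed child penalty
        rows.append({"id": i, "event_time": t,
                     "earnings": 1.0 + penalty + rng.normal(0, 0.05)})
df = pd.DataFrame(rows)

# Event-time dummies with t = -1 as the omitted reference period.
dummies = pd.get_dummies(df["event_time"], prefix="t").drop(columns=["t_-1"])
X = sm.add_constant(dummies.astype(float))
res = sm.OLS(df["earnings"], X).fit()
print(res.params[["t_0", "t_10"]])  # short- and long-run penalty, both ~ -0.2
```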


2020 ◽  
Vol 240 (2-3) ◽  
pp. 351-386 ◽  
Author(s):  
Helge Braun ◽  
Roland Döhrn ◽  
Michael Krause ◽  
Martin Micheli ◽  
Torsten Schmidt

This paper analyzes the introduction of the German minimum wage in 2015 in a structural model geared to quantitatively assess its long-run economic effects. We first employ a simple neoclassical model in which wages equal their marginal product, then extend this model to a two-sector economy, and finally introduce search and matching frictions. Even though all model variants remain highly stylized, they yield quantitative insights into the importance of the different mechanisms and channels through which minimum wages affect outcomes in the long run. In this framework, the minimum wage has a strong negative effect on employment. When sectors are differently affected by the minimum wage, sectoral relative price changes play an important quantitative role. Other labor market policies and institutions are important for the transmission of minimum wage policy to labor market outcomes.
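The simplest model variant can be sketched directly. With Cobb-Douglas production, the competitive wage equals the marginal product of labor, so a binding minimum wage pins down employment; the parameter values below are illustrative, not the paper's calibration.

```python
# A minimal neoclassical sketch: a minimum wage above the market wage
# reduces employment along the labor demand curve w = MPL(L).
ALPHA = 0.3      # capital share (illustrative)
A = 1.0          # total factor productivity
K = 1.0          # fixed capital stock
L_SUPPLY = 1.0   # inelastic labor supply

def marginal_product_of_labor(L):
    # MPL = (1 - alpha) * A * K^alpha * L^(-alpha)
    return (1 - ALPHA) * A * K**ALPHA * L**(-ALPHA)

def employment_under_minimum_wage(w_min):
    """Invert w = MPL(L): L = ((1-alpha) A K^alpha / w)^(1/alpha),
    capped at labor supply when the minimum wage does not bind."""
    L_demand = ((1 - ALPHA) * A * K**ALPHA / w_min) ** (1 / ALPHA)
    return min(L_demand, L_SUPPLY)

w_market = marginal_product_of_labor(L_SUPPLY)
for markup in [1.0, 1.1, 1.25, 1.5]:
    L = employment_under_minimum_wage(markup * w_market)
    print(f"min wage = {markup:.2f} x market wage -> employment = {L:.3f}")
```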


1989 ◽  
Vol 21 (2) ◽  
pp. 139-153 ◽  
Author(s):  
Charles B. Moss ◽  
J.S. Shonkwiler ◽  
John E. Reynolds

This study determines the effect of government payments on real agricultural asset values using Bayesian vector autoregression. In developing the empirical model, special attention is focused on the informational content of government payments. The results indicate that government payments to farmers have little effect on real asset values in the long run. In the short run, an increase in government payments to farmers may be associated with a decline in asset values.
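The empirical design can be sketched with simulated data. The paper estimates a Bayesian VAR; the toy example below fits an ordinary VAR with statsmodels instead and traces the impulse response of asset values to a payments shock, with the short-run negative and small long-run responses built into the simulation.

```python
# A minimal two-variable VAR sketch: government payments and real asset
# values, with impulse responses read off the fitted model.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(2)
T = 200
payments = np.zeros(T)
assets = np.zeros(T)
for t in range(1, T):
    payments[t] = 0.5 * payments[t - 1] + rng.normal(0, 1)
    # assumed short-run negative response of asset values, fading over time
    assets[t] = 0.8 * assets[t - 1] - 0.3 * payments[t - 1] + rng.normal(0, 1)

data = pd.DataFrame({"gov_payments": payments, "asset_values": assets})
res = VAR(data).fit(maxlags=2)
irf = res.irf(10)                 # impulse responses over 10 periods
print(irf.irfs[:, 1, 0])          # asset_values response to a payments shock
```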


2020 ◽  
Vol 34 (02) ◽  
pp. 2128-2135
Author(s):  
Yang Liu ◽  
Qi Liu ◽  
Hongke Zhao ◽  
Zhen Pan ◽  
Chuanren Liu

In recent years, considerable efforts have been devoted to developing AI techniques for finance research and applications. For instance, AI techniques (e.g., machine learning) can help traders in quantitative trading (QT) by automating two tasks: market condition recognition and trading strategy execution. However, existing methods in QT face challenges such as representing noisy high-frequency financial data and finding the balance between exploration and exploitation of the trading agent with AI techniques. To address these challenges, we propose an adaptive trading model, namely iRDPG, to automatically develop QT strategies with an intelligent trading agent. Our model is enhanced by deep reinforcement learning (DRL) and imitation learning techniques. Specifically, considering the noisy financial data, we formulate the QT process as a Partially Observable Markov Decision Process (POMDP). Also, we introduce imitation learning to leverage classical trading strategies, which help balance exploration and exploitation. For better simulation, we train our trading agent in the real financial market using minute-frequency data. Experimental results demonstrate that our model can extract robust market features and adapt to different markets.
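Two ingredients of this approach lend themselves to a highly simplified sketch (this is not the iRDPG architecture): a recurrent encoder that summarizes noisy partial observations, the standard device for POMDPs, and an imitation signal from a classical rule, here a hypothetical moving-average crossover rather than the strategies used in the paper, applied via behavior cloning only.

```python
# A simplified sketch: a GRU policy over noisy simulated returns,
# trained by behavior cloning toward a classical teacher strategy.
# The full method would add DRL rewards on top of this imitation phase.
import numpy as np
import torch
import torch.nn as nn

rng = np.random.default_rng(3)
prices = np.cumsum(rng.normal(0, 1, 500)) + 100        # simulated minute bars
returns = np.diff(prices, prepend=prices[0])

def demo_action(t, fast=5, slow=20):
    """Classical teacher: long when the fast MA is above the slow MA."""
    if t < slow:
        return 0.0
    return 1.0 if prices[t-fast:t].mean() > prices[t-slow:t].mean() else -1.0

class RecurrentPolicy(nn.Module):
    """GRU encoder summarizes the partially observed history; a linear
    head maps the belief state to a position in [-1, 1]."""
    def __init__(self, hidden=32):
        super().__init__()
        self.gru = nn.GRU(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, obs):              # obs: (batch, time, 1)
        h, _ = self.gru(obs)
        return torch.tanh(self.head(h))  # positions over time

policy = RecurrentPolicy()
opt = torch.optim.Adam(policy.parameters(), lr=1e-3)
obs = torch.tensor(returns, dtype=torch.float32).view(1, -1, 1)
teacher = torch.tensor([demo_action(t) for t in range(len(returns))],
                       dtype=torch.float32).view(1, -1, 1)

for step in range(200):                  # behavior-cloning loop
    opt.zero_grad()
    loss = ((policy(obs) - teacher) ** 2).mean()
    loss.backward()
    opt.step()
print(f"final imitation loss: {loss.item():.4f}")
```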


2016 ◽  
Vol 6 (2) ◽  
pp. 103
Author(s):  
Susan Carol Christoffersen ◽  
Elizabeth Harman Granitz

Firms have a responsibility to their shareholders to maximize financial performance; however, they are increasingly scrutinized for their environmental practices as well. These two objectives are often thought to be in conflict: it can be costly to be a good steward of the environment, but it may be more costly in the long run to ignore societal pressures and environmental impacts. While various studies provide ambiguous and sometimes contradictory results, we conduct a rigorous analysis of the health care sector using Trucost's Environmental Impact Score and financial data. The study uses regression analysis to identify the extent to which reducing a firm's environmental impact is financially beneficial. In the health care sector, an increase in the environmental impact score of 1 unit is correlated with an increase of 4% in earnings per share. Improving the environmental bottom line improves the financial bottom line.
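The kind of regression the study describes can be sketched with simulated data; the slope below is seeded to mimic the reported direction and magnitude (about 4% of EPS per unit of score), not estimated from Trucost data.

```python
# A minimal sketch: log earnings per share regressed on an
# environmental impact score, so the slope reads as a percent effect.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 120                                  # hypothetical firm-year observations
env_score = rng.uniform(0, 100, n)       # environmental impact score
# assumed relation: ~4% higher EPS per unit of score, plus noise
eps = np.exp(0.7 + 0.04 * env_score + rng.normal(0, 0.2, n))

X = sm.add_constant(env_score)
res = sm.OLS(np.log(eps), X).fit()
print(res.summary().tables[1])           # slope should be near 0.04
```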

