ARE TECHNOLOGY SHOCKS NONLINEAR?

1999 ◽  
Vol 3 (4) ◽  
pp. 506-533 ◽  
Author(s):  
Sumru Altuğ ◽  
Richard A. Ashley ◽  
Douglas M. Patterson

We examine the behavior of postwar real U.S. GNP, the inputs to an aggregate production function, and several formulations of the associated Solow residuals for the presence of nonlinearities in their generating mechanisms. Three different statistical tests for nonlinearity are implemented: the McLeod-Li test, the BDS test, and the Hinich bicovariance test. We find substantial evidence of nonlinearity in the generating mechanism of real GNP growth but no evidence of nonlinearity in the Solow residuals. We further find that the generating mechanism of the labor input series is nonlinear, whereas that of the capital services input appears to be linear. We therefore conclude that the observed nonlinearity in real output arises from nonlinearities in the labor markets, not from nonlinearities in the technology shocks driving the system. Finally, we investigate the source of the nonlinearities in the labor markets by examining simulated data from a model of the Dutch economy with asymmetric adjustment costs.
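
A minimal sketch of how two of the tests named above are typically applied, not the authors' exact procedure: the series is first prewhitened with an AR model so that any remaining dependence in the residuals is nonlinear, then the McLeod-Li test (a Ljung-Box test on squared residuals) and the BDS test are run on those residuals. The placeholder data, AR lag order, and test lags below are assumptions; the Hinich bicovariance test has no standard library implementation and is omitted.

```python
import numpy as np
from statsmodels.tsa.ar_model import AutoReg
from statsmodels.stats.diagnostic import acorr_ljungbox
from statsmodels.tsa.stattools import bds

rng = np.random.default_rng(0)
growth = rng.standard_normal(200)        # placeholder for real GNP growth rates

# Step 1: remove linear (AR) structure so remaining dependence is nonlinear.
resid = AutoReg(growth, lags=4).fit().resid

# Step 2: McLeod-Li test -- serial correlation in squared residuals signals
# nonlinearity in the generating mechanism (e.g., ARCH-type effects).
print(acorr_ljungbox(resid**2, lags=[10]))

# Step 3: BDS test -- rejects i.i.d. residuals against general (possibly
# nonlinear) dependence, reported here for embedding dimensions 2 and 3.
bds_stat, bds_pvalue = bds(resid, max_dim=3)
print(bds_stat, bds_pvalue)
```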

2009 ◽  
Vol 99 (5) ◽  
pp. 2258-2266 ◽  
Author(s):  
Christian Bayer

This comment addresses a point raised in Russell Cooper and Jonathan Willis (2003, 2004), who discuss whether the “gap approach” is appropriate for describing the adjustment of production factors. They show that this approach to labor adjustment, as applied in Ricardo J. Caballero, Eduardo Engel, and John C. Haltiwanger (1997) and Caballero and Engel (1993), can falsely generate evidence in favor of nonconvex adjustment costs even when costs are quadratic. Simulating a dynamic model of firm-level employment decisions with quadratic adjustment costs and estimating a gap model from the simulated data, they identify two factors producing this spurious evidence: approximating dynamic adjustment targets by static ones, and estimating the static targets themselves. This comment reassesses whether the first factor indeed leads to spurious evidence in favor of fixed adjustment costs. We show that the numerical approximation of the productivity process is pivotal for Cooper and Willis's finding: with more precise approximations of the productivity process, it becomes rare to falsely reject the quadratic adjustment cost model because dynamic targets are approximated by static ones. (JEL E24, J3)
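
A hedged illustration of the kind of numerical approximation at issue, not code from Bayer or Cooper and Willis: one standard way to discretize an AR(1) productivity process is Tauchen's method, and the coarseness of the grid is exactly the sort of approximation choice the comment argues is pivotal. The persistence, volatility, and grid sizes below are placeholder values, and whether the papers used this particular discretization is not stated in the abstract.

```python
import numpy as np
from scipy.stats import norm

def tauchen(rho, sigma, n_states, m=3.0):
    """Discretize ln(A_t) = rho*ln(A_{t-1}) + eps_t, eps ~ N(0, sigma^2)."""
    std_lr = sigma / np.sqrt(1.0 - rho**2)        # long-run std of the process
    grid = np.linspace(-m * std_lr, m * std_lr, n_states)
    step = grid[1] - grid[0]
    P = np.empty((n_states, n_states))
    for i, a in enumerate(grid):
        # Mass of moving from state i into the bin centered on each grid point.
        upper = norm.cdf((grid + step / 2 - rho * a) / sigma)
        lower = norm.cdf((grid - step / 2 - rho * a) / sigma)
        P[i, :] = upper - lower
        P[i, 0] = norm.cdf((grid[0] + step / 2 - rho * a) / sigma)
        P[i, -1] = 1.0 - norm.cdf((grid[-1] - step / 2 - rho * a) / sigma)
    return grid, P

# Coarse vs. fine approximations of the same productivity process.
coarse_grid, coarse_P = tauchen(rho=0.95, sigma=0.1, n_states=5)
fine_grid, fine_P = tauchen(rho=0.95, sigma=0.1, n_states=51)
print(coarse_grid.shape, fine_grid.shape)
```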


2020 ◽  
Vol 14 (2) ◽  
pp. 129-163
Author(s):  
Sevgi Coskun

We test a standard dynamic stochastic general equilibrium (DSGE) model against the impulse responses of hours worked and real GDP to technology and non-technology shocks in emerging market economies (EMEs). Most dynamic macroeconomic models assume that hours worked are stationary; in the data, however, we observe marked changes in hours worked in these economies between 1970 and 2013. Motivated by this fact, we first estimate a structural vector autoregression (SVAR) model with hours specified in differences (DSVAR) and then set up a DSGE model that incorporates permanent labour supply (LS) shocks, which can generate a unit root in hours worked while preserving the property of a balanced growth path. These LS shocks can be associated with the dramatic and apparently permanent shifts in labour supply observed in these economies. Hence, the identification restriction in our models is that both technology and LS shocks have a permanent effect on GDP, whereas only LS shocks have a long-run impact on hours worked. For inference, we compare empirical impulse responses based on the EMEs data to impulse responses from DSVARs run on data simulated from the model. The results show that a DSGE model with permanent LS shocks generating a unit root in hours worked is required to properly evaluate the DSVAR in EMEs, as this model is able to indirectly replicate the impulse responses obtained from a DSVAR on the actual data. JEL Classification: C32, E32
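
A minimal sketch of the long-run identification scheme described above, in the spirit of Blanchard-Quah, not the paper's full DSVAR/DSGE exercise: estimate a reduced-form VAR on differenced hours and differenced log GDP, compute the long-run impact matrix, and impose a lower-triangular structure so that only the first (labour-supply) shock moves hours in the long run while both shocks can move GDP permanently. The data, variable ordering, and lag settings below are placeholders.

```python
import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(1)
# Placeholder series standing in for [d(hours), d(log GDP)] for one EME.
data = rng.standard_normal((200, 2))

var_res = VAR(data).fit(maxlags=4, ic="aic")
A = var_res.coefs                      # reduced-form coefficients, (lags, 2, 2)
Sigma = var_res.sigma_u                # reduced-form residual covariance

# Cumulative (long-run) impact of reduced-form shocks: (I - A(1))^{-1}.
A1 = np.eye(2) - A.sum(axis=0)
C1 = np.linalg.inv(A1)

# Lower-triangular long-run structural impact matrix: with ordering
# [hours, GDP], only the first shock has a long-run effect on hours,
# while both shocks may have permanent effects on GDP.
LR = np.linalg.cholesky(C1 @ Sigma @ C1.T)
B0 = A1 @ LR                           # contemporaneous structural impact matrix
print(B0)
```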


2021 ◽  
Vol 71 ◽  
pp. 77-96
Author(s):  
Huanhuan Li ◽  
Ying Ji ◽  
Zaiwu Gong ◽  
Shaojian Qu

2021 ◽  
Author(s):  
Yu Zhao ◽  
Yurui Gao ◽  
Muwei Li ◽  
Adam W. Anderson ◽  
Zhaohua Ding ◽  
...  

The analysis of connectivity between parcellated regions of cortex provides insights into the functional architecture of the brain at a systems level. However, there has been less progress in the derivation of functional structures from voxel-wise analyses at finer scales. We propose a novel method, called localized topo-connectivity mapping with singular-value-decomposition-informed filtering (or filtered LTM), to identify and characterize voxel-wise functional structures in the human brain using resting-state fMRI data. Here we describe its mathematical background and provide a proof-of-concept using simulated data that allow an intuitive interpretation of the results of filtered LTM. The algorithm has also been applied to 7T fMRI data as part of the Human Connectome Project to generate group-average LTM images. Functional structures revealed by this approach agree moderately well with anatomical structures identified by T1-weighted images and fractional anisotropy maps derived from diffusion MRI. Moreover, the LTM images also reveal subtle functional variations that are not apparent in the anatomical structures. To assess the performance of LTM images, the subcortical region and occipital white matter were separately parcellated. Statistical tests were performed to demonstrate that the synchronies of fMRI signals in LTM-informed parcellations are significantly larger than those of random parcellations. Overall, the filtered LTM approach can serve as a tool to investigate the functional organization of the brain at the scale of individual voxels as measured in fMRI.
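
A generic illustration only, not the authors' filtered-LTM algorithm: one common SVD-based filtering step for a voxels-by-timepoints fMRI matrix keeps the leading singular components before any voxel-wise connectivity is computed. The matrix dimensions and rank cutoff are placeholder assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
n_voxels, n_timepoints, rank = 500, 300, 20
X = rng.standard_normal((n_voxels, n_timepoints))   # placeholder BOLD signals

# Center each voxel's time series, then take a truncated SVD as a crude filter.
Xc = X - X.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
X_filtered = U[:, :rank] @ np.diag(s[:rank]) @ Vt[:rank, :]

# Voxel-wise functional similarity (correlation) on the filtered signals.
corr = np.corrcoef(X_filtered)
print(corr.shape)
```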

