A job ladder model with stochastic employment opportunities

2021 ◽  
Vol 12 (4) ◽  
pp. 1399-1430 ◽  
Author(s):  
Jake Bradley ◽  
Axel Gottfries

We set up a model with on‐the‐job search in which firms infrequently post vacancies for which workers occasionally apply. The model nests the standard job ladder and stock‐flow models as special cases, while remaining analytically tractable and easy to estimate from standard panel data sets. The parameters of the model, structurally estimated on US data, differ significantly from the restrictions imposed by either a stock‐flow or a job ladder model. Imposing these restrictions significantly understates the search option associated with employment and is, unlike our model, inconsistent with recent survey evidence and with the decline in job finding rates and starting wages over the duration of unemployment, both of which are present in the data.
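
As a purely illustrative companion to the mechanism described above (vacancies that are posted infrequently and applied to only occasionally), the Python sketch below simulates a single worker on such a ladder. It is not the authors' structural model or estimator; the arrival rates, wage-offer distribution and function names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical rates, not the paper's estimates.
LAM_POST = 0.3    # rate at which new vacancies enter the worker's choice set
RHO_APPLY = 0.5   # rate at which the worker applies to each known vacancy
DELTA = 0.02      # exogenous job destruction rate
HORIZON = 200.0   # simulated time span

def simulate_worker():
    """Continuous-time sketch: a worker climbs a wage ladder by occasionally
    applying to a stock of known vacancies. The standard job ladder and
    stock-flow models arise as limiting cases of this sampling scheme, as
    described in the abstract."""
    t, wage, stock = 0.0, 0.0, []          # wage 0.0 means unemployed
    while t < HORIZON:
        rates = np.array([LAM_POST,
                          RHO_APPLY * len(stock),
                          DELTA if wage > 0.0 else 0.0])
        t += rng.exponential(1.0 / rates.sum())
        event = rng.choice(3, p=rates / rates.sum())
        if event == 0:                      # a new vacancy is posted/observed
            stock.append(rng.lognormal(0.0, 0.5))
        elif event == 1:                    # apply to one known vacancy
            offer = stock.pop(rng.integers(len(stock)))
            wage = max(wage, offer)         # accept only if it beats the current job
        else:                               # job destruction
            wage = 0.0
    return wage

print("wage at the end of the simulation:", round(simulate_worker(), 3))
```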

1986 ◽  
Vol 51 (11) ◽  
pp. 2489-2501
Author(s):  
Benitto Mayrhofer ◽  
Jana Mayrhoferová ◽  
Lubomír Neužil ◽  
Jaroslav Nývlt

A model is derived for multi-stage crystallization with cross-current flows of the solution and the crystals being purified. The purity of the product is compared with that achieved in the countercurrent arrangement. A suitable function has been set up which allows the cross-current and countercurrent flow models to be compared and which substantially reduces the computational labour for the countercurrent arrangement. Using the recrystallization of KAl(SO4)2·12H2O as an example, it is shown that, when the cross-current and countercurrent processes are operated at the same output, the countercurrent arrangement is more advantageous because its solvent consumption is lower.


2003 ◽  
Vol 15 (9) ◽  
pp. 2227-2254 ◽  
Author(s):  
Wei Chu ◽  
S. Sathiya Keerthi ◽  
Chong Jin Ong

This letter describes Bayesian techniques for support vector classification. In particular, we propose a novel differentiable loss function, called the trigonometric loss function, which has the desirable characteristic of natural normalization in the likelihood function, and we then follow standard Gaussian process techniques to set up a Bayesian framework. In this framework, Bayesian inference is used to implement model adaptation, while keeping the merits of the support vector classifier, such as sparseness and convex programming. This differs from standard Gaussian processes for classification. Moreover, we put forward class probabilities in making predictions. Experimental results on benchmark data sets indicate the usefulness of this approach.
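
For readers who want to see the shape of the loss, here is a short sketch of the trigonometric loss on the margin z = y·f(x) in the piecewise log-secant form usually quoted for this paper: zero for z ≥ 1, infinite for z ≤ −1, and a smooth penalty in between. Treat the exact constants as something to verify against the paper; the check at the end illustrates the "natural normalization" property mentioned in the abstract.

```python
import numpy as np

def trigonometric_loss(z):
    """Trigonometric loss on the margin z = y * f(x): 0 for z >= 1,
    +inf for z <= -1, and 2*ln(sec(pi/4 * (1 - z))) in between."""
    z = np.asarray(z, dtype=float)
    out = np.full_like(z, np.inf)                     # z <= -1
    out[z >= 1.0] = 0.0                               # z >= +1
    inside = (z > -1.0) & (z < 1.0)
    out[inside] = 2.0 * np.log(1.0 / np.cos(np.pi / 4.0 * (1.0 - z[inside])))
    return out

f = 0.4  # a latent function value
# exp(-loss) summed over the two labels equals one, i.e. the likelihood is
# naturally normalized without an extra partition function:
print(np.exp(-trigonometric_loss([f])) + np.exp(-trigonometric_loss([-f])))  # ~[1.]
```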


Author(s):  
Bart Jacobs ◽  
Aleks Kissinger ◽  
Fabio Zanasi

Extracting causal relationships from observed correlations is a growing area in probabilistic reasoning, originating with the seminal work of Pearl and others from the early 1990s. This paper develops a new, categorically oriented view based on a clear distinction between syntax (string diagrams) and semantics (stochastic matrices), connected via interpretations as structure-preserving functors. A key notion in the identification of causal effects is that of an intervention, whereby a variable is forcefully set to a particular value independent of any prior propensities. We represent the effect of such an intervention as an endo-functor which performs ‘string diagram surgery’ within the syntactic category of string diagrams. This diagram surgery in turn yields a new, interventional distribution via the interpretation functor. While in general there is no way to compute interventional distributions purely from observed data, we show that this is possible in certain special cases using a calculational tool called comb disintegration. We demonstrate the use of this technique on two well-known toy examples. In the first, we predict the causal effect of smoking on cancer in the presence of a confounding common cause, and we show that the technique provides simple sufficient conditions for computing interventions which apply to a wide variety of situations considered in the causal inference literature. The second is an illustration of counterfactual reasoning, where the same interventional techniques are used, but now in a ‘twinned’ set-up with two versions of the world – one factual and one counterfactual – joined together via exogenous variables that capture the uncertainties at hand.
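
The categorical machinery (string diagram surgery, comb disintegration) is beyond a short snippet, but the interventional distribution it computes in the confounded smoking-cancer example can be illustrated numerically. The sketch below uses made-up probability tables and plain back-door adjustment; it shows what "cutting the arrow into the intervened variable" does to the semantics, not the paper's categorical construction.

```python
import numpy as np

# Toy confounded model (all probabilities hypothetical, for illustration only):
# hidden common cause U, treatment S (smoking), outcome C (cancer).
p_u = np.array([0.7, 0.3])                      # P(U)
p_s_given_u = np.array([[0.8, 0.2],             # P(S | U): rows indexed by U
                        [0.3, 0.7]])
p_c_given_su = np.array([[[0.9, 0.1],           # P(C | S, U): indexed [u, s, c]
                          [0.6, 0.4]],
                         [[0.8, 0.2],
                          [0.3, 0.7]]])

# Observational conditional P(C | S) mixes in the confounder through P(U | S):
p_su = p_u[:, None] * p_s_given_u               # joint P(U, S)
p_c_given_s_obs = np.einsum('us,usc->sc', p_su, p_c_given_su) / p_su.sum(0)[:, None]

# Interventional P(C | do(S = s)): the "surgery" cuts the U -> S arrow, so U
# keeps its prior and S is clamped -- the back-door sum over u of P(C|s,u) P(u).
p_c_do_s = np.einsum('u,usc->sc', p_u, p_c_given_su)

print("observational P(C=1 | S):   ", p_c_given_s_obs[:, 1])
print("interventional P(C=1 | do(S)):", p_c_do_s[:, 1])
```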


Author(s):  
Hilal Bahlawan ◽  
Mirko Morini ◽  
Michele Pinelli ◽  
Pier Ruggero Spina ◽  
Mauro Venturini

This paper documents the set-up and validation of nonlinear autoregressive exogenous (NARX) models of a heavy-duty single-shaft gas turbine. The considered gas turbine is a General Electric PG 9351FA located in Italy. The data used for model training are time series data sets of several different maneuvers acquired experimentally during the start-up procedure and refer to cold, warm and hot start-ups. The trained NARX models are used to predict other experimental data sets, and comparisons are made between the outputs of the models and the corresponding measured data. The paper thus addresses the challenge of setting up robust and reliable NARX models by means of a sound selection of training data sets and a sensitivity analysis on the number of neurons. Moreover, a new performance function for the training process is defined to give greater weight to the most rapid transients. The final aim of this paper is the set-up of a powerful, easy-to-build and very accurate simulation tool which can be used for both control logic tuning and gas turbine diagnostics, characterized by good generalization capability.
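
As a rough illustration of the two ingredients highlighted above (NARX-style regressors built from past inputs and outputs, and a performance function that weights rapid transients more heavily), here is a minimal sketch. It uses a linear ARX fit on synthetic data as a stand-in for the authors' neural NARX models; all signals, lags and weights are hypothetical.

```python
import numpy as np

def build_regressors(u, y, nu=2, ny=2):
    """Assemble NARX-style regressors: past outputs and past exogenous inputs."""
    rows, targets = [], []
    for t in range(max(nu, ny), len(y)):
        rows.append(np.concatenate([y[t - ny:t], u[t - nu:t]]))
        targets.append(y[t])
    return np.array(rows), np.array(targets)

# Synthetic start-up-like signals (hypothetical data): a fuel-command ramp u
# and a lagged, noisy speed response y.
rng = np.random.default_rng(1)
t = np.arange(500)
u = np.clip((t - 50) / 100.0, 0.0, 1.0)
y = np.zeros_like(u)
for k in range(1, len(t)):
    y[k] = 0.9 * y[k - 1] + 0.1 * u[k - 1] + 0.005 * rng.standard_normal()

X, Y = build_regressors(u, y)

# Transient-weighted fit: weight each sample by the local rate of change so the
# most rapid transients dominate the objective (the role of the custom
# performance function described above, here in weighted-least-squares form).
w = 1.0 + 50.0 * np.abs(np.gradient(Y))
theta, *_ = np.linalg.lstsq(X * np.sqrt(w)[:, None], Y * np.sqrt(w), rcond=None)

pred = X @ theta
print("transient-weighted RMSE:", np.sqrt(np.average((Y - pred) ** 2, weights=w)))
```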


Author(s):  
J.-F. Hullo

We propose a complete methodology for the fine registration and referencing of kilo-station networks of terrestrial laser scanner data currently used for many valuable purposes such as 3D as-built reconstruction of Building Information Models (BIM) or industrial as-built mock-ups. This comprehensive target-based process aims to achieve a global tolerance below a few centimetres across a 3D network including more than 1,000 laser stations spread over 10 floors. This procedure is particularly valuable for 3D networks of indoor congested environments. In situ, the use of terrestrial laser scanners, the layout of the targets and the set-up of a topographic control network should comply with the expert methods specific to surveyors. Using parametric and reduced Gauss-Helmert models, the network is expressed as a set of functional constraints with a related stochastic model. During the post-processing phase, inspired by geodesy methods, a robust cost function is minimised. At the scale of such a data set, the complexity of the 3D network is beyond comprehension, and the surveyor, even an expert, must be supported in the analysis by digital and visual indicators. In addition to the standard indicators used for the adjustment methods, including Baarda’s reliability, we introduce spectral analysis tools from graph theory for identifying different types of errors or a lack of robustness of the system, as well as for ultimately documenting the quality of the registration.
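
The spectral graph-theory indicators are not spelled out in the abstract, so the following is only a minimal sketch of the general idea: build the station/target connectivity graph of the network and inspect the Laplacian spectrum, whose second eigenvalue (algebraic connectivity) flags sub-networks tied together by too few observations. The station count and edge list are hypothetical.

```python
import numpy as np

def laplacian_spectrum(n_stations, edges):
    """Spectrum of the graph Laplacian of the station connectivity graph.
    A near-zero second eigenvalue reveals a network that is barely held
    together and whose registration therefore lacks redundancy."""
    A = np.zeros((n_stations, n_stations))
    for i, j in edges:
        A[i, j] = A[j, i] = 1.0
    L = np.diag(A.sum(axis=1)) - A
    return np.sort(np.linalg.eigvalsh(L))

# Hypothetical 6-station example: two well-tied cliques joined by a single
# shared observation -- the small second eigenvalue exposes the weak bridge.
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
eig = laplacian_spectrum(6, edges)
print("algebraic connectivity:", round(eig[1], 3))
```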


2020 ◽  
Author(s):  
Gijs de Boer ◽  
Sean Waugh ◽  
Alexander Erwin ◽  
Steven Borenstein ◽  
Cory Dixon ◽  
...  

Between 14 and 20 July 2018, small unmanned aircraft systems (sUAS) were deployed to the San Luis Valley of Colorado (USA) alongside surface-based remote and in-situ sensors and radiosonde systems as part of the Lower Atmospheric Profiling Studies at Elevation – a Remotely-piloted Aircraft Team Experiment (LAPSE-RATE). The measurements collected as part of LAPSE-RATE targeted quantities related to enhancing our understanding of boundary layer structure, cloud and aerosol properties and surface-atmosphere exchange, and they provide detailed information to support model evaluation and improvement work. Additionally, intensive intercomparison between the different unmanned aircraft platforms was completed. The current manuscript describes the observations obtained using three different types of surface-based mobile observing vehicles: the University of Colorado Mobile UAS Research Collaboratory (MURC), the National Oceanic and Atmospheric Administration National Severe Storms Laboratory Mobile Mesonet, and two University of Nebraska Combined Mesonet and Tracker (CoMeT) vehicles. Over the one-week campaign, a total of 143 hours of data were collected using this combination of vehicles. The data from these coordinated activities provide detailed perspectives on the spatial variability of atmospheric state parameters (air temperature, humidity, pressure, and wind) throughout the northern half of the San Luis Valley. These data sets have been checked for quality and published to the Zenodo data archive under a community set up specifically for LAPSE-RATE (https://zenodo.org/communities/lapse-rate/) and are accessible at no cost to all registered users. The primary dataset DOIs are https://doi.org/10.5281/zenodo.3814765 (CU MURC measurements; de Boer et al., 2020d), https://doi.org/10.5281/zenodo.3738175 (NSSL MM measurements; Waugh, 2020) and https://doi.org/10.5281/zenodo.3838724 (UNL CoMeT measurements; Houston and Erwin, 2020).
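
As a convenience for readers, the snippet below pulls the record metadata and file links for the three primary datasets through the generic Zenodo REST API, with record IDs taken from the DOIs above. The endpoint layout and JSON field names reflect the public Zenodo API at the time of writing and should be checked against the current documentation.

```python
import requests

# LAPSE-RATE surface vehicle records on Zenodo (IDs taken from the DOIs above).
RECORD_IDS = {"CU MURC": 3814765, "NSSL MM": 3738175, "UNL CoMeT": 3838724}

for name, rec_id in RECORD_IDS.items():
    # Generic Zenodo records endpoint; field names may differ in future API versions.
    r = requests.get(f"https://zenodo.org/api/records/{rec_id}", timeout=30)
    r.raise_for_status()
    record = r.json()
    meta = record["metadata"]
    print(f"{name}: {meta['title']} ({meta.get('publication_date', 'n/a')})")
    for f in record.get("files", []):
        print("   file:", f["key"], f["links"]["self"])
```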


1999 ◽  
Vol 61 (2) ◽  
pp. 237-254 ◽  
Author(s):  
Tom Engsted ◽  
Niels Haldrup

Do special considerations apply to valuation in the case of large global chemical distributors? This study seeks to identify whether an income-based Discounted Cash Flow (DCF) method based on projected future income is suitable for valuing international chemical distributors. Two- and three-stage Discounted Cash Flow models are used. The expected companies’ enterprise and equity values are compared with the existing companies’ valuations. Base, bear and bull case scenarios are set up to establish the range of each company’s value for comparison with its existing valuation. The study adopts a multiple-case study approach in which actual financial data from three of the world’s largest chemical distributors were used to establish the existing companies’ valuations and to demonstrate the validity and applicability of the Discounted Cash Flow method for sensitivity analysis.
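
To make the valuation mechanics concrete, here is a minimal two-stage DCF sketch with bear/base/bull growth scenarios. The cash-flow level, growth rates and WACC are hypothetical placeholders, not figures from the study; a three-stage variant would simply insert an intermediate fading-growth stage before the terminal value.

```python
def two_stage_dcf(fcf0, growth_stage1, years_stage1, terminal_growth, wacc):
    """Two-stage DCF: explicit free-cash-flow forecasts for the first stage,
    then a Gordon-growth terminal value, all discounted at the WACC."""
    value, fcf = 0.0, fcf0
    for year in range(1, years_stage1 + 1):
        fcf *= 1.0 + growth_stage1
        value += fcf / (1.0 + wacc) ** year
    terminal = fcf * (1.0 + terminal_growth) / (wacc - terminal_growth)
    return value + terminal / (1.0 + wacc) ** years_stage1

# Hypothetical bear / base / bull first-stage growth rates; not data from the study.
scenarios = {"bear": 0.01, "base": 0.04, "bull": 0.07}
for name, g1 in scenarios.items():
    ev = two_stage_dcf(fcf0=300.0, growth_stage1=g1, years_stage1=5,
                       terminal_growth=0.02, wacc=0.08)
    print(f"{name:>4}: enterprise value = {ev:,.0f}")
```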

