Lévy Process
Recently Published Documents


TOTAL DOCUMENTS: 484 (FIVE YEARS: 82)
H-INDEX: 27 (FIVE YEARS: 3)

Complexity, 2021, Vol 2021, pp. 1-18
Author(s): Wissem Jedidi

We consider statistical experiments associated with a Lévy process X = (X_t)_{t ≥ 0} observed along a deterministic scheme (i u_n, 1 ≤ i ≤ n). We assume that under a probability ℙ_θ, the r.v. X_t, t > 0, has a strictly positive probability density function which is regular enough relative to a parameter θ ∈ (0, ∞). We prove that the sequence of associated statistical models has the LAN property at each θ, and we investigate the case where X is the product of an unknown parameter θ and another Lévy process Y with known characteristics. We illustrate the last results with the case where Y is attracted by a stable process.


2021, Vol 32 (1)
Author(s): Simon Godsill, Yaman Kındap

Abstract: In this paper, novel simulation methods are provided for the generalised inverse Gaussian (GIG) Lévy process. Such processes are intractable for simulation except in certain special edge cases, since the Lévy density associated with the GIG process is expressed as an integral involving certain Bessel functions, known as the Jaeger integral in diffusive transport applications. We show here for the first time how to solve the problem indirectly, using generalised shot-noise methods to simulate the underlying point processes and constructing an auxiliary variables approach that avoids any direct calculation of the integrals involved. The resulting augmented bivariate process is still intractable, and so we propose a novel thinning method based on upper bounds on the intractable integrand. Moreover, our approach leads to lower and upper bounds on the Jaeger integral itself, which may be compared with other approximation methods. The shot-noise method involves a truncated infinite series of decreasing random variables, and as such is approximate, although the series are found to be rapidly convergent in most cases. We note that the GIG process is the required Brownian motion subordinator for the generalised hyperbolic (GH) Lévy process, and so our simulation approach will straightforwardly extend also to the simulation of these intractable processes. Our new methods will find application in forward simulation of processes of GIG and GH type, in financial and engineering data, for example, as well as in inference for states and parameters of stochastic processes driven by GIG and GH Lévy processes.
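The truncated series of decreasing random variables mentioned in the abstract can be illustrated on a case where, unlike the GIG process, the inverse tail Lévy measure is available in closed form: an α-stable subordinator. The sketch below uses the classical inverse-Lévy-measure (Ferguson–Klass) shot-noise representation; the normalisation of the Lévy density and the truncation level N are illustrative assumptions, not the paper's method.

```python
import numpy as np

def stable_subordinator_jumps(alpha, T=1.0, N=1000, rng=None):
    """Truncated shot-noise series for an alpha-stable subordinator on [0, T].

    The Levy density nu(x) = alpha * x**(-alpha - 1) has upper tail mass
    nubar(x) = x**(-alpha), whose inverse is x = g**(-1/alpha). Jump sizes
    are therefore (Gamma_i / T)**(-1/alpha), with Gamma_i the arrival times
    of a unit-rate Poisson process, paired with uniform jump locations.
    """
    rng = np.random.default_rng(rng)
    gammas = np.cumsum(rng.exponential(size=N))   # Poisson arrival times
    jumps = (gammas / T) ** (-1.0 / alpha)        # decreasing jump sizes
    times = rng.uniform(0.0, T, size=N)           # jump locations in [0, T]
    return times, jumps

times, jumps = stable_subordinator_jumps(alpha=0.5, N=2000, rng=1)
```

Because `gammas` is increasing, the generated jump sizes are decreasing, exactly the structure of the truncated series the paper relies on; the GIG case replaces the closed-form inversion with thinning under upper bounds.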


2021, Vol 105 (0), pp. 79-91
Author(s): F. Kühn, R. Schilling

Let X = (X_t)_{t ≥ 0} be a one-dimensional Lévy process such that each X_t has a C_b^1-density w.r.t. Lebesgue measure and certain polynomial or exponential moments. We characterize all polynomially bounded functions f: ℝ → ℝ, and exponentially bounded functions g: ℝ → (0, ∞), such that f(X_t) − 𝔼f(X_t), resp. g(X_t)/𝔼g(X_t), are martingales.
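A minimal instance of the centred construction f(X_t) − 𝔼f(X_t): for Brownian motion (itself a Lévy process) and f(x) = x², we have 𝔼f(X_t) = t, and M_t = X_t² − t is the classical quadratic martingale. The Monte Carlo sanity check below (sample size and seed are arbitrary choices) verifies that the increment M_t − M_s has mean zero and is uncorrelated with M_s:

```python
import numpy as np

# Brownian motion with f(x) = x**2: E f(X_t) = t, so M_t = X_t**2 - t
# should be a martingale. Check E[M_t - M_s] = 0 and that the increment
# is uncorrelated with the past value M_s.
rng = np.random.default_rng(0)
n, s, t = 200_000, 0.5, 1.0

X_s = rng.normal(0.0, np.sqrt(s), size=n)            # X_s ~ N(0, s)
X_t = X_s + rng.normal(0.0, np.sqrt(t - s), size=n)  # independent increment

M_s = X_s**2 - s
M_t = X_t**2 - t
increment = M_t - M_s

print(increment.mean())                       # close to 0
print(np.corrcoef(increment, M_s)[0, 1])      # close to 0
```

The paper's result identifies exactly which polynomially bounded f (and exponentially bounded g, in multiplicative form) admit this property for a general Lévy process with suitable densities and moments.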


2021, Vol 58 (4), pp. 868-879
Author(s): Boris Buchmann, Kevin W. Lu

Abstract: Consider the strong subordination of a multivariate Lévy process with a multivariate subordinator. If the subordinate is a stack of independent Lévy processes and the components of the subordinator are indistinguishable within each stack, then strong subordination produces a Lévy process; otherwise it may not. Weak subordination was introduced to extend strong subordination, always producing a Lévy process even when strong subordination does not. Here we prove that strong and weak subordination are equal in law under the aforementioned condition. In addition, we prove that if strong subordination is a Lévy process then it is necessarily equal in law to weak subordination in two cases: firstly when the subordinator is deterministic, and secondly when it is pure-jump with finite activity.
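The favourable case in the abstract, indistinguishable subordinator components within a stack, can be sketched by running a two-dimensional stack of independent Brownian motions on a single shared gamma clock (componentwise this gives variance-gamma marginals). The parameter `nu` and the discretisation are illustrative assumptions:

```python
import numpy as np

def strong_subordination_stack(n_steps=1000, T=1.0, nu=0.2, rng=None):
    """Strong subordination of a 2-d stack of independent Brownian motions
    by a subordinator whose components are identical within the stack:
    a single gamma clock drives both coordinates, the case in which
    strong subordination is known to produce a Levy process.
    """
    rng = np.random.default_rng(rng)
    dt = T / n_steps
    # Gamma subordinator increments with E[d tau] = dt (shape dt/nu, scale nu)
    d_tau = rng.gamma(shape=dt / nu, scale=nu, size=n_steps)
    # Both Brownian coordinates evolve on the SAME random clock
    dX = rng.normal(0.0, np.sqrt(d_tau), size=(2, n_steps))
    return dX.cumsum(axis=1)  # paths of the subordinated stack

paths = strong_subordination_stack(rng=42)
```

If the two coordinates were instead run on distinct, dependent clocks, the resulting process need not be Lévy, which is the gap that weak subordination closes.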


Bernoulli, 2021, Vol 27 (4)
Author(s): Jorge González Cázares, Jevgenijs Ivanovs

Author(s): Valentin Courgeau, Almut E. D. Veraart

Abstract: We consider the problem of modelling restricted interactions between continuously observed time series as given by a known static graph (or network) structure. For this purpose, we define a parametric multivariate Graph Ornstein-Uhlenbeck (GrOU) process driven by a general Lévy process to study the momentum and network effects amongst nodes, effects that quantify the impact of a node on itself and on its neighbours, respectively. We derive the maximum likelihood estimators (MLEs) and their usual properties (existence, uniqueness and efficiency) along with their asymptotic normality and consistency. Additionally, an Adaptive Lasso approach, i.e. a penalised likelihood scheme, infers the graph structure and the GrOU parameters concurrently and is shown to satisfy similar properties. Finally, we show that the asymptotic theory extends to the case where stochastic volatility modulation of the driving Lévy process is considered.
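The momentum and network effects can be sketched with an Euler-Maruyama simulation of a graph OU process. Here the general Lévy driver of the paper is replaced by Brownian motion for simplicity, and the drift matrix −(θ₂ I + θ₁ Ā), with Ā the row-normalised adjacency, is one common OU-on-a-graph parametrisation assumed for illustration, not necessarily the paper's exact form:

```python
import numpy as np

def simulate_grou(A, theta1, theta2, T=1.0, n_steps=1000, rng=None):
    """Euler-Maruyama sketch of a graph Ornstein-Uhlenbeck process.

    theta2 scales the momentum effect (a node's pull towards zero from
    its own value); theta1 scales the network effect (pull from the
    average of its neighbours, via the row-normalised adjacency Abar).
    """
    rng = np.random.default_rng(rng)
    d = A.shape[0]
    deg = np.maximum(A.sum(axis=1, keepdims=True), 1.0)
    Abar = A / deg                            # row-normalised adjacency
    Q = theta2 * np.eye(d) + theta1 * Abar    # drift matrix
    dt = T / n_steps
    Y = np.zeros((n_steps + 1, d))
    for k in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt), size=d)   # Brownian driver
        Y[k + 1] = Y[k] - Q @ Y[k] * dt + dW
    return Y

A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)  # path graph
Y = simulate_grou(A, theta1=0.5, theta2=1.0, rng=7)
```

In the paper's setting, the MLEs for (θ₁, θ₂) are derived from continuously observed paths of such a system, with an Adaptive Lasso penalty recovering which edges of the graph are active.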


Author(s): Vladimir Fomichov, Jorge González Cázares, Jevgenijs Ivanovs

Author(s): Lukas Gonon, Christoph Schwab

Abstract: We study the expression rates of deep neural networks (DNNs for short) for option prices written on baskets of d risky assets whose log-returns are modelled by a multivariate Lévy process with general correlation structure of jumps. We establish sufficient conditions on the characteristic triplet of the Lévy process X that ensure an ε error of DNN-expressed option prices with DNNs of size that grows polynomially with respect to O(ε^{-1}), and with constants implied in O(·) which grow polynomially in d, thereby overcoming the curse of dimensionality (CoD) and justifying the use of DNNs in financial modelling of large baskets in markets with jumps. In addition, we exploit parabolic smoothing of Kolmogorov partial integro-differential equations for certain multivariate Lévy processes to present alternative architectures of ReLU ("rectified linear unit") DNNs that provide an ε expression error in DNN size O(|log(ε)|^a) with exponent a proportional to d, but with constants implied in O(·) growing exponentially with respect to d. Under stronger, dimension-uniform non-degeneracy conditions on the Lévy symbol, we obtain algebraic expression rates of option prices in exponential Lévy models which are free from the curse of dimensionality. In this case, the ReLU DNN expression rates of prices depend on certain sparsity conditions on the characteristic Lévy triplet. We indicate several consequences and possible extensions of the presented results.
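A small observation that makes ReLU networks natural here: the basket call payoff max(w·S − K, 0) is itself exactly one ReLU neuron with weights w and bias −K (the weights and strike below are illustrative values). What the paper's results concern is the far harder object, the option price, i.e. the discounted expectation of this payoff under the multivariate Lévy model:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

w = np.array([0.5, 0.3, 0.2])   # illustrative basket weights
K = 100.0                        # illustrative strike

def payoff(S):
    """Basket call payoff max(w . S - K, 0)."""
    return max(float(np.dot(w, S)) - K, 0.0)

def relu_net(S):
    """The same payoff as a single-neuron ReLU network."""
    return float(relu(np.dot(w, S) - K))

S = np.array([120.0, 95.0, 80.0])
assert relu_net(S) == payoff(S)   # exact, not approximate
```

The expression-rate theorems then bound how large a ReLU DNN must be so that the price, as a function of d-dimensional inputs, is approximated to accuracy ε, with size polynomial in ε^{-1} and in d under the stated conditions.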

