ALGORITHMIC DIFFERENTIATION FOR DISCONTINUOUS PAYOFFS

2018 ◽  
Vol 21 (04) ◽  
pp. 1850019 ◽  
Author(s):  
ROBERTO DALUISO ◽  
GIORGIO FACCHINETTI

We present a general technique to compute the sensitivities of Monte Carlo prices of discontinuous financial products. It is a natural extension of the pathwise adjoint method, which would otherwise require an almost surely differentiable payoff, and it preserves that method's efficiency when many sensitivities must be computed. We show empirically that the new algorithm is competitive in accuracy and execution time with benchmarks obtained by smoothing the payoff, which are biased and require a nonobvious tuning of their smoothing parameters.
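The smoothing benchmark the abstract compares against can be illustrated with a minimal sketch (not the authors' algorithm): the payoff of a digital call, 1{S_T > K}, has an almost-surely zero pathwise derivative, so the indicator is replaced by a sigmoid of width `eps` before differentiating. The model, parameter values, and function name below are illustrative assumptions.

```python
import numpy as np

def digital_delta_smoothed(S0, K, r, sigma, T, eps=0.5,
                           n_paths=200_000, seed=0):
    """Monte Carlo delta of a digital call via payoff smoothing.

    The true payoff 1{S_T > K} is flat almost everywhere, so its
    pathwise derivative is zero; smoothing with a sigmoid of width
    `eps` restores a nonzero derivative at the cost of a bias that
    depends on the (nonobvious) choice of `eps`.
    """
    rng = np.random.default_rng(seed)
    Z = rng.standard_normal(n_paths)
    # terminal value of geometric Brownian motion
    ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)
    # smoothed payoff sigmoid((ST - K)/eps), differentiated by chain rule
    s = 1.0 / (1.0 + np.exp(-(ST - K) / eps))
    dpayoff_dST = s * (1.0 - s) / eps
    dST_dS0 = ST / S0   # pathwise derivative of the GBM terminal value
    return np.exp(-r * T) * np.mean(dpayoff_dST * dST_dS0)
```

Shrinking `eps` reduces the smoothing bias but inflates the Monte Carlo variance, which is exactly the tuning trade-off the abstract refers to.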

Algorithms ◽  
2021 ◽  
Vol 14 (10) ◽  
pp. 296
Author(s):  
Lucy Blondell ◽  
Mark Z. Kos ◽  
John Blangero ◽  
Harald H. H. Göring

Statistical analysis of multinomial data in complex datasets often requires estimation of the multivariate normal (MVN) distribution for models whose dimensionality can easily reach 10–1000 and beyond. Few algorithms for estimating the MVN distribution offer robust and efficient performance over such a range of dimensions. We report a simulation-based comparison of two MVN algorithms widely used in statistical genetic applications. The venerable Mendell-Elston approximation is fast, but its execution time increases rapidly with the number of dimensions, its estimates are generally biased, and no error bound is available. The correlation between variables significantly affects its absolute error but not its overall execution time. The Monte Carlo approach described by Genz returns unbiased, error-bounded estimates, but its execution time is more sensitive to the correlation between variables. For ultra-high-dimensional problems, however, the Genz algorithm exhibits better scaling characteristics and greater time-weighted efficiency of estimation.
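The Genz algorithm the abstract evaluates is what SciPy wraps for the MVN CDF, so a minimal sketch of an MVN probability computation looks as follows; the function name and the `maxpts` default are illustrative choices, not from the paper.

```python
import numpy as np
from scipy.stats import multivariate_normal

def mvn_prob(corr, upper, maxpts=1_000_000):
    """P(X_1 <= u_1, ..., X_d <= u_d) for X ~ N(0, corr), computed with
    the Genz randomized quasi-Monte Carlo algorithm that SciPy wraps.

    `maxpts` caps the number of integration points, trading execution
    time against accuracy -- the trade-off discussed in the abstract.
    """
    d = len(upper)
    return multivariate_normal.cdf(np.asarray(upper, dtype=float),
                                   mean=np.zeros(d), cov=corr,
                                   maxpts=maxpts)
```

With an identity correlation matrix the dimensions are independent, so the probability factorizes and gives a quick sanity check (e.g. three standard normals below zero should give 0.5³ = 0.125).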


2015 ◽  
Vol 2015 ◽  
pp. 1-21 ◽  
Author(s):  
Daniel Bonetti ◽  
Dorival Leão ◽  
Alberto Ohashi ◽  
Vinícius Siqueira

We propose a feasible and constructive methodology for computing pure hedging strategies with respect to arbitrary square-integrable claims in incomplete markets. In contrast to previous works based on PDE and BSDE methods, the main merit of our approach is the flexibility of quadratic hedging in full generality, without a priori smoothness assumptions on the payoff. In particular, the methodology applies to multidimensional quadratic hedging-type strategies for fully path-dependent options with stochastic volatility and discontinuous payoffs. To demonstrate that the methodology is indeed applicable, we provide a Monte Carlo study of generalized Föllmer-Schweizer decompositions and of locally risk-minimizing and mean-variance hedging strategies for vanilla and path-dependent options written on local volatility and stochastic volatility models.
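The quadratic-hedging idea can be illustrated in its simplest one-period form, which is only a toy instance and not the paper's constructive Föllmer-Schweizer methodology: minimizing the expected squared hedging error over a static position gives the familiar covariance-over-variance hedge ratio, estimated here by Monte Carlo under assumed GBM dynamics.

```python
import numpy as np

def quadratic_hedge_ratio(S0, K, r, sigma, T, n_paths=100_000, seed=0):
    """One-period variance-minimizing hedge of a call payoff H.

    Minimizing E[(H - a - b * S_T)^2] over (a, b) gives
    b = Cov(H, S_T) / Var(S_T): the simplest quadratic-hedging
    criterion, estimated by plain Monte Carlo.
    """
    rng = np.random.default_rng(seed)
    Z = rng.standard_normal(n_paths)
    ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)
    H = np.maximum(ST - K, 0.0)            # square-integrable claim
    b = np.cov(H, ST, ddof=0)[0, 1] / np.var(ST)
    return b
```

Dynamic quadratic hedging, as in the paper, refines this by re-solving such a least-squares problem at every rebalancing date along each path.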


Author(s):  
EMMANUEL M. TADJOUDDINE

We consider sequential auctions in which seller and bidder agents need to price goods on sale at the 'right' market price. We propose algorithms based on a binomial model for both the seller and the buyer. We then consider the problem of calibrating pricing models to market data. To this end, we study a stochastic volatility model used for option pricing and derive and analyze Monte Carlo estimators for computing the gradient of a certain payoff function using finite differencing and algorithmic differentiation. We then assess the accuracy and efficiency of both methods as well as their impact on the optimization algorithm. Numerical results are presented and discussed. This work can benefit those engaged in electronic trading, and investors in financial products who need fast and more precise predictions of future market data.
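The contrast between the two gradient estimators can be sketched with a toy forward-mode algorithmic-differentiation class (dual numbers) applied to a single simulated payoff; the payoff function and parameter values are illustrative assumptions, not the model calibrated in the paper.

```python
import math

class Dual:
    """Minimal forward-mode AD value: val + eps * dot (toy sketch)."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val,
                    self.dot * o.val + self.val * o.dot)
    __rmul__ = __mul__

def exp(x):
    e = math.exp(x.val)
    return Dual(e, e * x.dot)   # d/dx exp(x) = exp(x)

def payoff(S0, sigma, Z):
    # toy payoff: GBM terminal value for T = 1, r = 0, one draw Z
    return S0 * exp(-0.5 * sigma * sigma + sigma * Z)

# gradient w.r.t. sigma: AD (exact per path) vs central finite difference
Z = 0.3
ad = payoff(100.0, Dual(0.2, 1.0), Z).dot
h = 1e-4
fd = (payoff(100.0, Dual(0.2 + h), Z).val
      - payoff(100.0, Dual(0.2 - h), Z).val) / (2 * h)
```

On a common random draw the AD derivative is exact to machine precision, while the finite-difference estimate carries an O(h²) truncation error and forces a bias/variance choice of the bump size h, which is the trade-off the abstract assesses.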


2018 ◽  
Vol 7 (2.4) ◽  
pp. 66
Author(s):  
N Rajkumar ◽  
K Kishore Kumar ◽  
J Vivek

Duplicate detection is the process of identifying multiple representations of the same real-world entity. Today, duplicate detection methods must process ever larger datasets in ever shorter time, and maintaining data quality under these constraints becomes increasingly difficult. We present progressive duplicate detection algorithms based on the Progressive Sorted Neighbourhood Method and Progressive Blocking extensions, which find most duplicates early in the detection process. If the execution time is limited, the overall approach makes the best use of the time available and delivers results considerably faster than conventional systems. Extensive experiments show that our progressive algorithms can double the efficiency over time of traditional duplicate detection and substantially improve upon related work.
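The windowing idea behind the Progressive Sorted Neighbourhood Method can be sketched as follows; the record list, blocking key, and match predicate are illustrative placeholders, not the datasets or implementation evaluated in the paper.

```python
def progressive_snm(records, key, max_window=5):
    """Progressive Sorted Neighbourhood (illustrative sketch).

    After sorting by a blocking key, record pairs at sort distance 1
    are compared first, then distance 2, and so on, so the pairs most
    likely to be duplicates are emitted early and the scan can simply
    be cut off when the time budget runs out.
    """
    order = sorted(range(len(records)), key=lambda i: key(records[i]))
    for dist in range(1, max_window):          # increasing rank distance
        for pos in range(len(order) - dist):
            i, j = order[pos], order[pos + dist]
            yield (i, j)                       # candidate pair to match

# usage: nearby sort positions are compared before distant ones
records = ["anna", "Anna", "bob", "bobby"]
pairs = list(progressive_snm(records, key=lambda s: s.lower()[:3],
                             max_window=3))
duplicates = [(i, j) for i, j in pairs
              if records[i].lower() == records[j].lower()]
```

Because the generator yields candidates in order of increasing sort distance, truncating the iteration early still returns the highest-confidence matches first, which is the "progressive" property the abstract describes.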

