Numerical Valuation of American Basket Options via Partial Differential Complementarity Problems

Mathematics ◽  
2021 ◽  
Vol 9 (13) ◽  
pp. 1498
Author(s):  
Karel J. in’t Hout ◽  
Jacob Snoeijer

We study the principal component analysis based approach introduced by Reisinger and Wittum (2007) and the comonotonic approach considered by Hanbali and Linders (2019) for the approximation of American basket option values via multidimensional partial differential complementarity problems (PDCPs). Both approximation approaches require the solution of just a limited number of low-dimensional PDCPs. It is demonstrated by ample numerical experiments that they define approximations that lie close to each other. Next, an efficient discretisation of the pertinent PDCPs is presented that leads to a favourable convergence behaviour.
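
As a rough illustration of why the PCA-based approach needs only a few low-dimensional problems: for a basket with strong uniform correlation, the first principal component of the correlation matrix already carries most of the variance. Below is a minimal numpy sketch with a hypothetical five-asset basket and pairwise correlation 0.8 (values not taken from the paper):

```python
import numpy as np

# Hypothetical basket of 5 assets with uniform pairwise correlation rho = 0.8.
n, rho = 5, 0.8
corr = np.full((n, n), rho) + (1 - rho) * np.eye(n)

# Eigen-decomposition: the PCA-based approach exploits the fact that
# most of the variance is concentrated in the leading components.
eigvals = np.linalg.eigvalsh(corr)[::-1]      # sorted descending
explained = eigvals / eigvals.sum()           # variance share per component
```

With n = 5 and rho = 0.8 the leading eigenvalue is 1 + (n − 1)rho = 4.2, i.e. 84% of the total variance, which is what motivates solving only a small number of low-dimensional PDCPs built on the dominant components.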

2018 ◽  
Vol 37 (10) ◽  
pp. 1233-1252 ◽  
Author(s):  
Jonathan Hoff ◽  
Alireza Ramezani ◽  
Soon-Jo Chung ◽  
Seth Hutchinson

In this article, we present methods to optimize the design and flight characteristics of a biologically inspired bat-like robot. In previous work, we designed the topological structure for the wing kinematics of this robot; here we present methods to optimize the geometry of this structure and to compute actuator trajectories such that its wingbeat pattern closely matches biological counterparts. Our approach is motivated by recent studies on biological bat flight that have shown that the salient aspects of wing motion can be accurately represented in a low-dimensional space. Although bats have over 40 degrees of freedom (DoFs), our robot possesses several biologically meaningful morphing specializations. We use principal component analysis (PCA) to characterize the two most dominant modes of biological bat flight kinematics, and we optimize our robot’s parametric kinematics to mimic these. The method yields a robot that is reduced from five degrees of actuation (DoAs) to just three, and that actively folds its wings within a wingbeat period. As a result of mimicking synergies, the robot produces an average net lift improvement of 89% over the same robot when its wings cannot fold.


2011 ◽  
Vol 341-342 ◽  
pp. 790-797 ◽  
Author(s):  
Zhi Yan Xiang ◽  
Tie Yong Cao ◽  
Peng Zhang ◽  
Tao Zhu ◽  
Jing Feng Pan

In this paper, an object tracking approach is introduced for color video sequences. The approach integrates color distributions and probabilistic principal component analysis (PPCA) into a particle filtering framework. Color distributions are robust to partial occlusion, are rotation and scale invariant, and can be calculated efficiently. Principal component analysis (PCA) is used to update the eigenbasis and the mean, which reflect the appearance changes of the tracked object, and the low-dimensional subspace representation of PPCA efficiently adapts to these appearance changes. At the same time, a forgetting factor is incorporated into the updating process, which economizes on processing time and enhances the efficiency of object tracking. Computer simulation experiments demonstrate the effectiveness and robustness of the proposed tracking algorithm when the target object undergoes pose and scale changes, occlusion, and complex backgrounds.
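
The paper's exact PPCA update rule is not reproduced here; the sketch below shows only the common exponential-forgetting form for the appearance mean, in which a factor f down-weights old observations so the model tracks appearance changes:

```python
import numpy as np

# Sketch of an appearance-mean update with a forgetting factor f, as used
# when adapting a subspace model to a tracked object's changing appearance.
# This is the generic exponential-forgetting form, not the paper's full update.
def update_mean(mean, new_obs, f=0.95):
    """Blend in the new observation; history is down-weighted by factor f."""
    return f * mean + (1.0 - f) * new_obs

mean = np.zeros(4)                  # initial appearance mean (toy 4-D feature)
for _ in range(200):
    mean = update_mean(mean, np.ones(4), f=0.9)
# After many frames the mean converges toward the (constant) new appearance.
```

Smaller f forgets faster, which is the trade-off the abstract alludes to: quicker adaptation versus stability of the learned appearance model.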


2014 ◽  
Vol 571-572 ◽  
pp. 753-756
Author(s):  
Wei Li Li ◽  
Xiao Qing Yin ◽  
Bin Wang ◽  
Mao Jun Zhang ◽  
Ke Tan

Denoising is an important issue for laser active images. This paper attempts to process laser active images in a low-dimensional sub-space. We adopt the principal component analysis with local pixel grouping (LPG-PCA) denoising method proposed by Zhang [1] and compare it with conventional denoising methods for laser active images, such as wavelet filtering, wavelet soft-threshold filtering and median filtering. Experimental results show that the image denoised by LPG-PCA has a higher BIQI value than the other images: most of the speckle noise is removed and detailed structure information is well preserved. The low-dimensional sub-space idea is a new direction for laser active image denoising.
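
A minimal sketch of the PCA step behind such patch-based denoising (the local pixel grouping and threshold selection of LPG-PCA are omitted, and the "patch group" here is synthetic rather than taken from a laser active image):

```python
import numpy as np

# Group of 64 similar 16-pixel patches: a rank-1 clean structure plus noise.
rng = np.random.default_rng(1)
clean = np.outer(np.sin(np.linspace(0, np.pi, 64)), np.ones(16))
noisy = clean + 0.1 * rng.normal(size=clean.shape)

# PCA denoising: project the group onto its dominant principal component(s);
# the noise spreads over all components, the structure concentrates in a few.
mean = noisy.mean(axis=0)
Xc = noisy - mean
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 1                                          # keep the dominant component
denoised = (U[:, :k] * s[:k]) @ Vt[:k] + mean

err_noisy = np.mean((noisy - clean) ** 2)
err_denoised = np.mean((denoised - clean) ** 2)
# Truncating to the low-dimensional sub-space reduces the error.
```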


Author(s):  
Stephanie Hare ◽  
Lars Bratholm ◽  
David Glowacki ◽  
Barry Carpenter

Low-dimensional representations along reaction pathways were produced using newly created Python software that uses principal component analysis (PCA) for dimensionality reduction. Plots of these pathways in the reduced-dimensional space, as well as the physical meaning of the reduced-dimensional axes, are discussed.
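
The reduction itself can be sketched in a few lines of numpy; the path below is a hypothetical nine-dimensional "geometry" evolving along a reaction coordinate, not output of the cited software:

```python
import numpy as np

# Hypothetical pathway: 50 geometries described by 9 internal coordinates.
t = np.linspace(0, 1, 50)
path = np.column_stack([t, t**2, np.sin(3 * t)] + [0.01 * t] * 6)

# PCA: centre the data, then project onto the two leading right singular
# vectors to obtain the low-dimensional representation of the pathway.
Xc = path - path.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
proj2d = Xc @ Vt[:2].T                         # (50 points, 2 PCA axes)
```

The rows of `Vt[:2]` give the loadings of each original coordinate on the two axes, which is what lends the reduced axes their physical interpretation.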


2020 ◽  
Vol 2020 ◽  
pp. 1-12
Author(s):  
Pei Heng Li ◽  
Taeho Lee ◽  
Hee Yong Youn

Various dimensionality reduction (DR) schemes have been developed for projecting high-dimensional data into a low-dimensional representation. Existing schemes usually preserve either only the global structure or only the local structure of the original data, but not both. To resolve this issue, a scheme called sparse locality for principal component analysis (SLPCA) is proposed. To balance the trade-off between complexity and efficiency, a robust L2,p-norm-based principal component analysis (R2P-PCA) is introduced for global DR, while sparse representation-based locality preserving projection (SR-LPP) is used for local DR. Sparse representation is also employed to construct the weighted matrix of the samples; being parameter-free, it allows the construction of an intrinsic graph that is more robust against noise. In addition, the projection matrix and the sparse similarity matrix can be learned simultaneously. Experimental results demonstrate that the proposed scheme consistently outperforms existing schemes in terms of clustering accuracy and data reconstruction error.
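
As a generic illustration of the locality-preserving ingredient (not the paper's sparse-representation weights): the conventional, parameter-dependent way to build the sample-similarity graph that SR-LPP replaces is a Gaussian-weighted k-nearest-neighbour graph, sketched here on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(30, 5))                   # 30 samples in 5 dimensions

# Gaussian similarities from pairwise squared distances.
d2 = ((X[:, None] - X[None]) ** 2).sum(-1)
W = np.exp(-d2)
np.fill_diagonal(W, 0.0)

# Sparsify: keep only each sample's k strongest similarities (k is the
# hand-tuned parameter that a sparse-representation graph avoids).
k = 4
for i in range(len(X)):
    weakest = np.argsort(W[i])[:-k]
    W[i, weakest] = 0.0
W = np.maximum(W, W.T)                         # symmetrise the graph
```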


Sensors ◽  
2018 ◽  
Vol 18 (12) ◽  
pp. 4391 ◽  
Author(s):  
Aimin Miao ◽  
Jiajun Zhuang ◽  
Yu Tang ◽  
Yong He ◽  
Xuan Chu ◽  
...  

Variety classification is an important step in seed quality testing. This study introduces t-distributed stochastic neighbourhood embedding (t-SNE), a manifold learning algorithm, into the field of hyperspectral imaging (HSI) and proposes a method for classifying seed varieties. A total of 800 maize kernels of eight varieties (100 kernels per variety, 50 kernels for each side of the seed) were imaged in the visible–near-infrared (386.7–1016.7 nm) wavelength range. The images were pre-processed by Procrustes analysis (PA) to improve the classification accuracy, and the data were then reduced to a low-dimensional space using t-SNE. Finally, Fisher’s discriminant analysis (FDA) was used for classification of the low-dimensional data. To assess the effect of t-SNE, principal component analysis (PCA), kernel principal component analysis (KPCA) and locally linear embedding (LLE) were used as comparative methods, and the results demonstrated that the t-SNE model with PA pre-processing obtained better classification results. The highest classification accuracy of the t-SNE model reached 97.5%, which was much more satisfactory than the results of the other models (up to 75% for PCA, 85% for KPCA, and 76.25% for LLE). The overall results indicate that the t-SNE model with PA pre-processing can be used for variety classification of waxy maize seeds and can be considered a new method for hyperspectral image analysis.
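
The t-SNE reduction step alone can be sketched with scikit-learn (assumed available here); the PA pre-processing and FDA classification stages are omitted, and the "spectra" below are synthetic stand-ins for the hyperspectral measurements:

```python
import numpy as np
from sklearn.manifold import TSNE

# 60 hypothetical "kernels" with 100 spectral bands, two artificial varieties
# separated by a mean shift (not real maize data).
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0, 1, (30, 100)),
               rng.normal(3, 1, (30, 100))])

# Embed the high-dimensional spectra into 2-D with t-SNE; perplexity must be
# smaller than the number of samples.
emb = TSNE(n_components=2, perplexity=10, random_state=0).fit_transform(X)
# A classifier such as FDA would then be trained on `emb`.
```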


Computation ◽  
2021 ◽  
Vol 9 (7) ◽  
pp. 78
Author(s):  
Shengkun Xie

Feature extraction plays an important role in machine learning for signal processing, particularly for low-dimensional data visualization and predictive analytics. Data from real-world complex systems are often high-dimensional, multi-scale, and non-stationary, and extracting key features of this type of data is challenging. This work proposes a novel approach to analyzing epileptic EEG signals using both wavelet power spectra and functional principal component analysis. We focus on how the feature extraction method can help improve the separation of signals in a low-dimensional feature subspace. Transforming EEG signals into wavelet power spectra significantly enhances their functional structure, and this transformation makes functional principal component analysis suitable for extracting key signal features. We therefore refer to this approach as a double feature extraction method, since both the wavelet transform and functional PCA act as feature extractors. To demonstrate the applicability of the proposed method, we tested it on a set of publicly available epileptic EEGs and on patient-specific, multi-channel EEG signals, for both ictal and pre-ictal signals. The results demonstrate that combining wavelet power spectra and functional principal component analysis is promising for feature extraction of epileptic EEGs, and the approach can therefore be useful in computer-based medical systems for epilepsy diagnosis and epileptic seizure detection.
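
A hedged sketch of the double-feature-extraction idea, with a Fourier periodogram standing in for the wavelet power spectrum and plain PCA on the discretised spectra standing in for functional PCA (both are substitutions for illustration, not the paper's method):

```python
import numpy as np

# Two hypothetical signal classes with different dominant rhythms,
# loosely mimicking distinct EEG states (synthetic, not real EEG).
rng = np.random.default_rng(4)
t = np.linspace(0, 1, 256, endpoint=False)
class_a = np.sin(2 * np.pi * 5 * t) + 0.3 * rng.normal(size=(20, 256))
class_b = np.sin(2 * np.pi * 20 * t) + 0.3 * rng.normal(size=(20, 256))
signals = np.vstack([class_a, class_b])

# First feature extractor: transform each signal to a power spectrum.
power = np.abs(np.fft.rfft(signals, axis=1)) ** 2

# Second feature extractor: PCA on the spectra; the leading components
# give a low-dimensional feature subspace in which the classes separate.
Xc = power - power.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T
```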


2019 ◽  
Vol 8 (S3) ◽  
pp. 66-71
Author(s):  
T. Sudha ◽  
P. Nagendra Kumar

Data mining is one of the major areas of research, and clustering is one of its main functionalities. High dimensionality is one of the main issues in clustering, and dimensionality reduction can be used as a solution to this problem. The present work makes a comparative study of dimensionality reduction techniques, namely t-distributed stochastic neighbour embedding (t-SNE) and probabilistic principal component analysis (PPCA), in the context of clustering. High-dimensional data were reduced to low-dimensional data using each of these techniques, and cluster analysis was performed on the high-dimensional data as well as on the low-dimensional data sets obtained through t-SNE and PPCA, with a varying number of clusters. Mean squared error, time, and space were considered as parameters for comparison. The results show that the time taken to convert the high-dimensional data into low-dimensional data using PPCA is higher than the time taken using t-SNE, whereas the storage space required by the data set reduced through PPCA is less than that required by the data set reduced through t-SNE.
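
One of the comparison metrics can be sketched directly: the reconstruction mean-squared error after reducing to k components. Note that this metric applies only to the (P)PCA side of such a comparison, since standard t-SNE provides no inverse map back to the original space; the data below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(size=(100, 20))                 # 100 samples, 20 dimensions

# Reduce to k components via PCA (the deterministic core of PPCA),
# then map back and measure the reconstruction MSE.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 5
X_rec = (Xc @ Vt[:k].T) @ Vt[:k]               # reduce, then reconstruct
mse = np.mean((Xc - X_rec) ** 2)               # error from discarded components
```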


Author(s):  
Katsuhiro Honda ◽  
Yoshihito Nakamura ◽  
Hidetomo Ichihashi

This paper proposes the simultaneous application of homogeneity analysis and fuzzy clustering with incomplete data. Taking into account the similarity between the loss function for homogeneity analysis and the least squares criterion for principal component analysis, we define the new objective function in a formulation similar to linear fuzzy clustering with missing values. Numerical experiments demonstrate the feasibility of the proposed method.
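
One standard ingredient of such a formulation is the fuzzy membership update; the sketch below uses the classical fuzzy-c-means form with fuzzifier m, not the paper's full objective with missing values:

```python
import numpy as np

def fuzzy_memberships(X, centers, m=2.0):
    """Memberships u[i, c] of sample i in cluster c (rows sum to one)."""
    d = np.linalg.norm(X[:, None] - centers[None], axis=2) + 1e-12
    inv = d ** (-2.0 / (m - 1.0))              # closer centre -> larger weight
    return inv / inv.sum(axis=1, keepdims=True)

# Toy data: two points near cluster 0, one point at cluster 1.
X = np.array([[0.0, 0.0], [1.0, 0.0], [10.0, 0.0]])
centers = np.array([[0.0, 0.0], [10.0, 0.0]])
U = fuzzy_memberships(X, centers)
# The point at the origin belongs almost entirely to cluster 0.
```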

