Simplified plasma models based on reduced kinetics

2017 ◽  
Author(s):  
Aurélie Bellemans

Performing high-fidelity plasma simulations remains computationally expensive because of their large dimension and complex chemistry. Atmospheric re-entry plasmas, for instance, involve hundreds of species in thousands of reactions used in detailed physical models. These models are very complex as they describe the non-equilibrium phenomena due to finite-rate processes in the flow. Chemical non-equilibrium arises because of the many dissociation, ionization and excitation reactions occurring at various time scales. Vibrational, rotational, electronic and translational temperatures characterize the flow and exchange energy between species, which leads to thermal non-equilibrium. With the current computational resources, detailed three-dimensional simulations are still out of reach. Calculations using the full dynamics are often restricted to a zero- or one-dimensional description. A trade-off has to be made between the level of accuracy of the model and its computational cost. This thesis presents various methods to develop accurate reduced kinetic models for plasma flows. Starting from detailed chemistry, high-fidelity reductions are achieved through the application of either physics-based techniques, such as binning methods and time-scale-based reductions, or empirical techniques such as principal component analysis. As an original contribution to the existing methods, the physics-based techniques are combined with principal component analysis, uniting both communities. The different techniques are trained on a 34-species collisional-radiative model for argon plasma by comparing shock relaxation simulations. The best performing method is applied to the large N-N2 mechanism containing 9391 species and 23 million reactions calculated by the NASA Ames Research Center. As a preliminary step, the system dynamics are analyzed to improve our understanding of the various processes occurring in plasma flows. The reactions are analyzed and classified according to their importance. A thorough investigation of the kinetics identifies the main variables and parameters characterizing the plasma, which can thereafter be used to develop or improve existing reductions. As a result, a novel coarse-grain model has been developed for argon by binning the electronically excited levels and the ionized species into two Boltzmann-averaged energy bins. The ground state is solved individually together with the free electrons, reducing the species mass conservation equations from 34 to 4. Principal component analysis has been transferred from the combustion community to plasma flows by investigating the Manifold-Generated and Score-PCA techniques. PCA identifies low-dimensional manifolds empirically, projecting the full kinetics onto its basis of principal components. A novel approach combines the binning techniques with PCA, yielding an optimized model for reducing the N3 rovibrational collisional model.
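To illustrate the empirical reduction step mentioned above, the sketch below projects detailed state vectors onto a truncated basis of principal components, in the spirit of Score-PCA; the 34-variable snapshots, the number of retained components and all variable names are illustrative placeholders, not values taken from the thesis.

import numpy as np

def build_pca_basis(states, n_components):
    """states: (n_samples, n_variables) matrix of training snapshots."""
    mean = states.mean(axis=0)
    std = states.std(axis=0) + 1e-12          # avoid division by zero
    X = (states - mean) / std                 # center and scale
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    A = Vt[:n_components].T                   # retained principal directions
    return mean, std, A

def to_scores(state, mean, std, A):
    return ((state - mean) / std) @ A         # reduced variables (scores)

def from_scores(scores, mean, std, A):
    return scores @ A.T * std + mean          # approximate reconstruction

# Synthetic snapshots standing in for 0-D shock relaxation data
# (e.g. the 34 variables of a collisional-radiative argon model).
rng = np.random.default_rng(0)
snapshots = rng.random((200, 34))
mean, std, A = build_pca_basis(snapshots, n_components=4)
z = to_scores(snapshots[0], mean, std, A)
print(z, from_scores(z, mean, std, A)[:5])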

2018 ◽  
Vol 8 (8) ◽  
pp. 1321 ◽  
Author(s):  
Minseo Kim ◽  
Soohwan Yu ◽  
Seonhee Park ◽  
Sangkeun Lee ◽  
Joonki Paik

This paper presents computationally efficient haze removal and image enhancement methods. The major contribution of the proposed research is two-fold: (i) accurate atmospheric light estimation using principal component analysis, and (ii) learning-based transmission estimation. To reduce the computational cost, we impose a constraint on the candidate pixels used to estimate the haze components in the sub-image. In addition, the proposed method extracts modified haze-relevant features to estimate an accurate transmission using a random forest. Experimental results show that the proposed method can provide high-quality results with a significantly reduced computational load compared with existing methods. In addition, we demonstrate that the proposed method can significantly enhance the contrast of low-light images, based on the assumed visual similarity between inverted low-light images and haze images.
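As a rough illustration of the two contributions named in this abstract, the sketch below estimates the atmospheric light from a small set of bright candidate pixels with PCA and trains a random forest to regress transmission from patch features; the candidate selection rule, the feature set and all parameter values are assumptions for illustration, not the authors' exact design.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestRegressor

def estimate_atmospheric_light(img, top_fraction=0.001):
    """img: HxWx3 float array in [0, 1]."""
    pixels = img.reshape(-1, 3)
    dark = pixels.min(axis=1)                     # dark-channel-like haze cue
    n = max(1, int(top_fraction * len(pixels)))
    candidates = pixels[np.argsort(dark)[-n:]]    # most haze-opaque pixels
    direction = PCA(n_components=1).fit(candidates).components_[0]
    scores = candidates @ direction               # project on first principal axis
    return candidates[np.argmax(np.abs(scores))]  # extreme candidate as A

def train_transmission_model(features, transmissions):
    """features: (n_patches, n_features); transmissions: targets in (0, 1]."""
    return RandomForestRegressor(n_estimators=100, random_state=0).fit(
        features, transmissions)

rng = np.random.default_rng(0)
hazy = rng.random((120, 160, 3))                  # stand-in for a hazy image
A = estimate_atmospheric_light(hazy)
model = train_transmission_model(rng.random((500, 5)), rng.random(500))
print(A, model.predict(rng.random((1, 5))))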


2020 ◽  
Vol 82 (12) ◽  
pp. 2711-2724 ◽  
Author(s):  
Pezhman Kazemi ◽  
Jaume Giralt ◽  
Christophe Bengoa ◽  
Armin Masoumian ◽  
Jean-Philippe Steyer

Because of the static nature of conventional principal component analysis (PCA), natural process variations may be interpreted as faults when it is applied to processes with time-varying behavior. In this paper, therefore, we propose a complete adaptive process monitoring framework based on incremental principal component analysis (IPCA). This framework updates the eigenspace by incorporating new data into the PCA at a low computational cost. Moreover, the contribution of variables is recursively provided using complete decomposition contribution (CDC). To impute missing values, the empirical best linear unbiased prediction (EBLUP) method is incorporated into the framework. The effectiveness of this framework is evaluated using benchmark simulation model No. 2 (BSM2). Our simulation results show the ability of the proposed approach to distinguish between time-varying behavior and faulty events while correctly isolating sensor faults, even when these faults are relatively small.
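A minimal sketch of the adaptive monitoring loop, assuming scikit-learn's IncrementalPCA as the eigenspace updater and a Q (squared prediction error) statistic as the monitoring index; the CDC contributions and EBLUP imputation described in the paper are not reproduced, and all dimensions and limits are placeholders.

import numpy as np
from sklearn.decomposition import IncrementalPCA

n_vars, n_components = 10, 3
ipca = IncrementalPCA(n_components=n_components)

rng = np.random.default_rng(0)
for batch_index in range(20):
    batch = rng.normal(size=(50, n_vars))        # stand-in for process data
    ipca.partial_fit(batch)                      # low-cost eigenspace update

    scores = ipca.transform(batch)
    reconstruction = ipca.inverse_transform(scores)
    q_stat = ((batch - reconstruction) ** 2).sum(axis=1)  # SPE / Q statistic
    alarms = q_stat > np.percentile(q_stat, 99)           # placeholder control limit
    print(batch_index, int(alarms.sum()))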


2021 ◽  
Vol 2015 (1) ◽  
pp. 012047
Author(s):  
Giorgio Gnecco ◽  
Andrea Bacigalupo ◽  
Francesca Fantoni ◽  
Daniela Selvi

A promising technique for the spectral design of acoustic metamaterials is based on the formulation of suitable constrained nonlinear optimization problems. Unfortunately, the straightforward application of classical gradient-based iterative optimization algorithms to the numerical solution of such problems is typically highly demanding, due to the complexity of the underlying physical models. Nevertheless, supervised machine learning techniques can reduce this computational effort, e.g., by replacing the original objective functions of such optimization problems with more easily computable approximations. In this framework, the present article describes the application of a related unsupervised machine learning technique, namely principal component analysis, to approximate the gradient of the objective function of a band gap optimization problem for an acoustic metamaterial, with the aim of making the successive application of a gradient-based iterative optimization algorithm faster. Numerical results show the effectiveness of the proposed method.
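The sketch below illustrates one way to realize the idea described in this abstract: gradients of an expensive objective collected at earlier design points are summarized by PCA, and later gradients are approximated from a few directional finite differences along the dominant directions. The cheap placeholder objective and all sizes are assumptions, not the article's band gap model.

import numpy as np

def objective(x):                      # placeholder for the metamaterial objective
    return np.sum(np.sin(x) ** 2) + 0.1 * x @ x

def fd_gradient(f, x, h=1e-6):         # full finite-difference gradient (expensive)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

rng = np.random.default_rng(0)
dim, n_samples, k = 20, 30, 3
samples = rng.normal(size=(n_samples, dim))
gradients = np.array([fd_gradient(objective, x) for x in samples])

# PCA of the gradient snapshots (centered SVD); rows of `basis` are orthonormal.
g_mean = gradients.mean(axis=0)
_, _, Vt = np.linalg.svd(gradients - g_mean, full_matrices=False)
basis = Vt[:k]

def approx_gradient(f, x, h=1e-6):
    """PCA reconstruction of the gradient from k directional finite differences."""
    coeffs = [(f(x + h * d) - f(x - h * d)) / (2 * h) - g_mean @ d for d in basis]
    return g_mean + np.array(coeffs) @ basis

x_new = rng.normal(size=dim)
print(np.linalg.norm(fd_gradient(objective, x_new) - approx_gradient(objective, x_new)))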


2017 ◽  
Vol 40 (7) ◽  
pp. 2387-2395 ◽  
Author(s):  
Yi Ji ◽  
Hong-Bo Xie

Time-frequency representation has been intensively employed for the analysis of biomedical signals. In order to extract discriminative information, the time-frequency matrix is often transformed into a 1D vector followed by principal component analysis (PCA). This study contributes a two-directional two-dimensional principal component analysis (2D2PCA)-based technique for time-frequency feature extraction. The S transform, integrating the strengths of the short-time Fourier transform and the wavelet transform, is applied to perform the time-frequency decomposition. Then, 2D2PCA is conducted directly on the time-frequency matrix rather than on 1D vectors for feature extraction. The proposed method can significantly reduce the computational cost while capturing the directions of maximal time-frequency matrix variance. The efficiency and effectiveness of the proposed method are demonstrated by classifying eight hand motions using 4-channel myoelectric signals recorded from healthy subjects and amputees.
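A hedged sketch of the 2D2PCA feature-extraction step: row- and column-direction projection matrices are learned from a set of training time-frequency matrices and applied as U^T M V. A scipy spectrogram stands in for the S transform here, and all sizes are illustrative.

import numpy as np
from scipy.signal import spectrogram

def tf_matrix(signal, fs=1000):
    _, _, S = spectrogram(signal, fs=fs, nperseg=64)
    return S

def fit_2d2pca(matrices, k_rows, k_cols):
    """matrices: list of equally sized time-frequency matrices."""
    M = np.stack(matrices)
    mean = M.mean(axis=0)
    Xc = M - mean
    G_col = np.einsum('nij,nik->jk', Xc, Xc)   # column-direction covariance: sum X^T X
    G_row = np.einsum('nij,nkj->ik', Xc, Xc)   # row-direction covariance: sum X X^T
    _, V = np.linalg.eigh(G_col)
    V = V[:, ::-1][:, :k_cols]                 # leading right projection vectors
    _, U = np.linalg.eigh(G_row)
    U = U[:, ::-1][:, :k_rows]                 # leading left projection vectors
    return mean, U, V

def extract_features(matrix, mean, U, V):
    return U.T @ (matrix - mean) @ V           # small k_rows x k_cols feature matrix

rng = np.random.default_rng(0)
signals = rng.normal(size=(40, 1024))          # stand-in for myoelectric epochs
mats = [tf_matrix(s) for s in signals]
mean, U, V = fit_2d2pca(mats, k_rows=5, k_cols=5)
print(extract_features(mats[0], mean, U, V).shape)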


2015 ◽  
Vol 22 (6) ◽  
pp. 062108 ◽  
Author(s):  
A. Bellemans ◽  
A. Munafò ◽  
T. E. Magin ◽  
G. Degrez ◽  
A. Parente

2020 ◽  
pp. 147592172097739
Author(s):  
Luis Eduardo Mujica ◽  
Magda Ruiz ◽  
Rodolfo Villamizar

The hydrocarbon industry in Colombia is one of the principal pillars of the Colombian economy, representing around 5% of its gross domestic product. Since petroleum reserves have decreased, gas has become one of the main alternatives for economic growth. However, current gas pipelines have been in service for over 30 years, many of them are buried, and phenomena such as metal loss, corrosion, mechanical stress, strikes by excavation machinery, and other types of damage occur. The maintenance program of these structures is typically corrective and very expensive. To overcome this situation, the national Research Institute of Corrosion (Corporación para la Investigación de la Corrosión) recently developed an in-line inspection tool to be operated in Colombian gas pipelines in order to obtain valuable information on their current state along thousands of kilometers. A huge quantity of data is recorded (including tool movement, magnet, magnetic flux leakage, and caliper signals), which demands a high computational cost and adequate analysis tools to establish the current structural health condition of the pipeline. In this sense, the authors have shown in several works that principal component analysis is an effective tool to detect and locate abnormal operational and structural conditions from multidimensional data. In a previous analysis, multidimensional data were used to locate possible damage along the pipeline; however, most of the activated points corresponded to weld points. In this article, it is therefore proposed to use the root mean square value of the magnetic flux leakage signals to separate these weld points and to obtain sets of signals per section with the welds removed; multiway principal component analysis is then applied to each set of signals of each gas pipeline section. The maximum values of the damage indices (Q and T²-statistics) of each section are retained to flag the sections of the gas pipeline with the highest probability of damage, which must then be evaluated by experts.
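As a rough sketch of the per-section damage indexing, the code below builds a PCA model from baseline (healthy) signal segments and computes the Q and T² statistics of new segments; the unfolding, segment sizes and synthetic data are placeholders rather than the authors' multiway implementation on real magnetic flux leakage records.

import numpy as np
from sklearn.decomposition import PCA

def damage_indices(baseline, new_segments, n_components=3):
    """baseline, new_segments: (n_segments, n_features) unfolded signal windows."""
    mean, std = baseline.mean(axis=0), baseline.std(axis=0) + 1e-12
    pca = PCA(n_components=n_components).fit((baseline - mean) / std)

    X = (new_segments - mean) / std
    scores = pca.transform(X)
    residual = X - pca.inverse_transform(scores)
    q = (residual ** 2).sum(axis=1)                              # Q / SPE index
    t2 = ((scores ** 2) / pca.explained_variance_).sum(axis=1)   # Hotelling T² index
    return q, t2

rng = np.random.default_rng(0)
baseline = rng.normal(size=(200, 120))                   # healthy pipeline sections
test = np.vstack([rng.normal(size=(20, 120)),
                  rng.normal(loc=1.5, size=(5, 120))])   # last rows mimic damage
q, t2 = damage_indices(baseline, test)
print(np.argsort(q)[-5:])                                # sections with the largest Q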


2022 ◽  
Vol 355 ◽  
pp. 02024
Author(s):  
Haojing Wang ◽  
Yingjie Tian ◽  
An Li ◽  
Jihai Wu ◽  
Gaiping Sun

In view of the limitation of the "hard assignment" of clusters in traditional clustering methods and the difficulty of simultaneously meeting the requirements of clustering efficiency and clustering accuracy on massive data sets, a load classification method based on a Gaussian mixture model combining clustering and principal component analysis is proposed. The load data are fed into a Gaussian mixture model clustering algorithm after dimensionality reduction by principal component analysis to achieve classification of large-scale load datasets. The method in this paper is used to classify loads in the Canadian AMPds2 public dataset and is compared with K-Means, Gaussian mixture model clustering and other methods. The results show that the proposed method can not only achieve load classification more effectively and at a finer granularity, but can also save computational cost and improve computational efficiency.
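A minimal sketch of the pipeline named in this abstract, assuming scikit-learn: load profiles are reduced with PCA and then soft-clustered with a Gaussian mixture model. The AMPds2 data are replaced by random profiles, and the numbers of components are placeholders.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
profiles = rng.random((1000, 96))                   # e.g. 96 readings per daily profile

X = StandardScaler().fit_transform(profiles)
scores = PCA(n_components=5).fit_transform(X)       # dimensionality reduction

gmm = GaussianMixture(n_components=4, covariance_type='full', random_state=0)
labels = gmm.fit_predict(scores)                    # hard cluster labels ...
posteriors = gmm.predict_proba(scores)              # ... and soft assignments
print(np.bincount(labels), posteriors[0].round(3))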


2015 ◽  
Vol 24 (2) ◽  
pp. 025004 ◽  
Author(s):  
Kim Peerenboom ◽  
Alessandro Parente ◽  
Tomáš Kozák ◽  
Annemie Bogaerts ◽  
Gérard Degrez

VASA ◽  
2012 ◽  
Vol 41 (5) ◽  
pp. 333-342 ◽  
Author(s):  
Kirchberger ◽  
Finger ◽  
Müller-Bühl

Background: The Intermittent Claudication Questionnaire (ICQ) is a short questionnaire for the assessment of health-related quality of life (HRQOL) in patients with intermittent claudication (IC). The objective of this study was to translate the ICQ into German and to investigate the psychometric properties of the German ICQ version in patients with IC. Patients and methods: The original English version was translated using a forward-backward method. The resulting German version was reviewed by the author of the original version and an experienced clinician. Finally, it was tested for clarity with 5 German patients with IC. A sample of 81 patients was administered the German ICQ. The sample consisted of 58.0 % male patients with a median age of 71 years and a median IC duration of 36 months. Tests of feasibility included completeness of questionnaires, completion time, and ratings of clarity, length and relevance. Reliability was assessed through a retest in 13 patients after 14 days, and analysis of Cronbach's alpha for internal consistency. Construct validity was investigated using principal component analysis. Concurrent validity was assessed by correlating the ICQ scores with the Short Form 36 Health Survey (SF-36) as well as clinical measures. Results: The ICQ was completely filled in by 73 subjects (90.1 %) with an average completion time of 6.3 minutes. Cronbach's alpha coefficient reached 0.75. The intra-class correlation for test-retest reliability was r = 0.88. Principal component analysis resulted in a 3-factor solution. The first factor explained 51.5 % of the total variation, and all items had loadings of at least 0.65 on it. The ICQ was significantly associated with the SF-36 and treadmill walking distances, whereas no association was found for the resting ABPI. Conclusions: The German version of the ICQ demonstrated good feasibility, satisfactory reliability and good validity. Responsiveness should be investigated in further validation studies.
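For readers interested in the reliability and structure analyses mentioned above, the sketch below computes Cronbach's alpha and the share of variance explained by the first principal component on a hypothetical item-response matrix; the data are simulated and do not reproduce the study's results.

import numpy as np

def cronbach_alpha(items):
    """items: (n_subjects, n_items) matrix of item scores."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(81, 1))                     # 81 simulated respondents
responses = latent + 0.8 * rng.normal(size=(81, 16))  # 16 correlated items

alpha = cronbach_alpha(responses)
cov = np.cov(responses, rowvar=False)
eigvals = np.linalg.eigvalsh(cov)[::-1]               # descending eigenvalues
first_factor_share = eigvals[0] / eigvals.sum()
print(round(alpha, 2), round(100 * first_factor_share, 1))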

