Model Order Identification of Combustion Instability Using Lipschitz Indices

Author(s):  
Salil Harris ◽  
Aniruddha Sinha ◽  
Sudarshan Kumar

Abstract Gas turbine combustors employing lean premixed combustion are prone to combustion instability. Combustion instability, if left unchecked, has deleterious effects on the combustor and hence needs to be controlled. Active control methods are preferred because they offer better off-design performance. The effectiveness of active control depends on the quality of the controller, which in turn depends on the quality of the model. In the present work, an input-output model structure is chosen in which the output of the system at the current instant is modelled as a nonlinear function of delayed inputs and outputs. As there are infinitely many possible representations of nonlinear functions, the parameters of the model structure, such as the time delay between input and output, the number of delayed input and output terms, and the appropriate form of the nonlinear function, can only be obtained iteratively. However, prior knowledge of the delay and of the number of delayed inputs and outputs reduces the computational burden. To this end, the present work utilizes the method of Lipschitz indices to obtain the number of delayed inputs and outputs.
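The Lipschitz-index method referred to in this abstract (in the style of He and Asada) can be illustrated with a short sketch. This is a generic illustration, not the authors' implementation: the heuristic choice of `p`, the toy system, and all variable names are assumptions. For each candidate regressor set, the index is a scaled geometric mean of the largest input-output Lipschitz quotients; it stays large while a needed delayed term is missing and levels off once all are included.

```python
import numpy as np

def lipschitz_index(X, y, p=None):
    """Lipschitz index for one candidate regressor set.

    X : (N, n) matrix whose rows are candidate regressor vectors
        (delayed inputs and outputs); y : (N,) corresponding outputs.
    Returns the geometric mean of the p largest Lipschitz quotients,
    scaled by sqrt(n) so indices are comparable across orders n.
    """
    N, n = X.shape
    if p is None:
        p = max(1, int(0.02 * N))        # heuristic: ~2% of the samples
    q = np.empty(N)
    for i in range(N):
        d = np.linalg.norm(X - X[i], axis=1)   # distances between regressors
        dy = np.abs(y - y[i])                  # corresponding output gaps
        valid = d > 1e-12                      # skip the point itself
        q[i] = np.max(dy[valid] / d[valid])    # worst local Lipschitz quotient
    top = np.sort(q)[-p:]
    # geometric mean via logs to avoid overflow for large quotients
    return float(np.exp(np.mean(np.log(np.sqrt(n) * top))))
```

Computed for increasing candidate orders on data from a simple system, the index drops sharply once the regressor set contains every delayed term the true dynamics depend on.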

Entropy ◽  
2020 ◽  
Vol 22 (10) ◽  
pp. 1079
Author(s):  
Vladimir Kazakov ◽  
Mauro A. Enciso ◽  
Francisco Mendoza

Based on the application of the conditional mean rule, a sampling-recovery algorithm is studied for a Gaussian two-dimensional process. The components of such a process are the input and output processes of an arbitrary linear system, which are characterized by their statistical relationships. Realizations are sampled in both processes, and the number and location of the samples are, in the general case, arbitrary for each component. As a result, general expressions are found that determine the optimal structure of the recovery devices and evaluate the recovery quality for each component of the two-dimensional process. The main feature of the obtained algorithm is that the realizations of both components, or of either one, are recovered from two sets of samples related to the input and output processes. This means that the recovery involves not only the samples of the restored realization itself, but also the samples of the realization of the other component, statistically related to the first one. This type of general algorithm achieves significantly improved recovery quality, as evidenced by the results of six non-trivial examples with different versions of the algorithm. The research method used and the proposed general algorithm for the reconstruction of multidimensional Gaussian processes have not previously been discussed in the literature.
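The conditional mean rule underlying this abstract is standard Gaussian conditioning: for jointly Gaussian values, the minimum-mean-square-error estimate from any set of samples (including samples of a statistically related component stacked into the same observation vector) is a linear combination of those samples. A minimal sketch, with all covariances and sample values assumed for illustration:

```python
import numpy as np

def conditional_mean_recovery(K_tt, K_ts, K_ss, samples):
    """MMSE recovery of zero-mean jointly Gaussian values from samples.

    K_tt : covariance of the values to recover
    K_ts : cross-covariance between recovery points and samples
    K_ss : covariance among the samples (the sample vector may stack
           samples of the input AND the output process)
    Returns (recovered mean, per-point error variance).
    """
    W = K_ts @ np.linalg.inv(K_ss)       # optimal linear weights
    mean = W @ samples
    err = np.diag(K_tt - W @ K_ts.T)     # recovery error variance
    return mean, err
```

A basic property this sketch exhibits, and which drives the abstract's improved recovery quality, is that enlarging the sample set with statistically related samples never increases the error variance.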


2021 ◽  
Vol 2021 (8) ◽  
Author(s):  
Alex May

Abstract Quantum tasks are quantum computations with inputs and outputs occurring at specified spacetime locations. Considering such tasks in the context of AdS/CFT has led to novel constraints relating bulk geometry and boundary entanglement. In this article we consider tasks where inputs and outputs are encoded into extended spacetime regions, rather than the points previously considered. We show that this leads to stronger constraints than have been derived in the point-based setting. In particular we improve the connected wedge theorem, appearing earlier in arXiv:1912.05649, by finding a larger bulk region whose existence implies large boundary correlation. We also show how considering extended input and output regions leads to non-trivial statements in Poincaré-AdS2+1, a setting where the point-based connected wedge theorem is always trivial.


2014 ◽  
Vol 608-609 ◽  
pp. 19-22
Author(s):  
Ping Xu ◽  
Jian Gang Yi

A hydraulic descaling system is the key device for ensuring the surface quality of billets. However, traditional control methods lead to stability problems in hydraulic descaling systems. To solve this problem, the construction of a computer-controlled hydraulic descaling system is studied, the working principle of the system is analyzed, and a high-pressure water bench for hydraulic descaling is designed. On this basis, the corresponding computer control software is developed. Application shows that the designed system is stable in practice, which is helpful for enterprise production.


2018 ◽  
Vol 1037 ◽  
pp. 032042 ◽  
Author(s):  
Dimitris I. Manolas ◽  
Giannis P. Serafeim ◽  
Panagiotis K. Chaviaropoulos ◽  
Vasilis A. Riziotis ◽  
Spyros G. Voutsinas

2020 ◽  
Vol 3 (348) ◽  
pp. 7-24
Author(s):  
Michał Pietrzak

The aim of this article is to analyse the possibility of applying selected perturbative masking methods of Statistical Disclosure Control to microdata, i.e. unit‑level data from the Labour Force Survey. In the first step, the author assessed to what extent the confidentiality of information was protected in the original dataset. In the second step, after applying selected methods implemented in the sdcMicro package in R, the impact of those methods on the disclosure risk, the loss of information and the quality of estimation of population quantities was assessed. The conclusion highlights some problematic aspects of the use of Statistical Disclosure Control methods which were observed during the conducted analysis.
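sdcMicro is an R package; as a language-neutral illustration of what one perturbative masking method does, the sketch below implements correlated additive noise and a crude information-loss measure in Python. This is not the article's code, and the noise fraction, the loss measure, and all names are assumptions for illustration only.

```python
import numpy as np

def add_correlated_noise(X, noise_frac=0.1, rng=None):
    """Perturbative masking by correlated additive noise: the noise
    covariance is proportional to the empirical data covariance, so the
    masked microdata roughly preserve means and correlations."""
    rng = np.random.default_rng(rng)
    cov = np.cov(X, rowvar=False)
    noise = rng.multivariate_normal(
        np.zeros(X.shape[1]), noise_frac * cov, size=X.shape[0])
    return X + noise

def information_loss(X, X_masked):
    """Crude utility-loss measure: relative change of the covariance
    matrix (0 = no loss; larger = more damage to the data)."""
    c0 = np.cov(X, rowvar=False)
    c1 = np.cov(X_masked, rowvar=False)
    return float(np.linalg.norm(c1 - c0) / np.linalg.norm(c0))
```

The trade-off the article evaluates is visible even here: a larger `noise_frac` lowers re-identification risk (records move further from their true values) but raises the information loss, degrading estimates of population quantities.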


2007 ◽  
Vol 56 (6) ◽  
pp. 95-103 ◽  
Author(s):  
I. Nopens ◽  
N. Nere ◽  
P.A. Vanrolleghem ◽  
D. Ramkrishna

Many systems contain populations of individuals. Often, they are regarded as a lumped phase, which might, for some applications, lead to inadequate model predictive power. An alternative framework, Population Balance Models, has been used here to describe such a system, activated sludge flocculation, in which particle size is the property one wants to model. An important problem to solve in population balance modelling is to determine the model structure that adequately describes experimentally obtained data on, for instance, the time evolution of the floc size distribution. In this contribution, an alternative method based on solving the inverse problem is used to recover the model structure from the data. In this respect, the presence of similarity in the data simplifies the problem significantly. Similarity was found and the inverse problem could be solved. A forward simulation then confirmed the ability of the model structure to describe the experimental data.
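The forward simulation mentioned at the end of this abstract can be sketched for the simplest special case: a discrete population balance for pure aggregation with an assumed size-independent kernel (the paper recovers the kernel from data; the constant kernel, step sizes, and bin count here are illustrative assumptions, not the authors' model).

```python
import numpy as np

def aggregate_pbe(n0, kernel=1.0, dt=1e-3, steps=1000):
    """Explicit-Euler forward simulation of a discrete population balance
    for pure aggregation with a size-independent kernel.
    n0[k] = number density of flocs containing k+1 primary particles."""
    n = np.asarray(n0, dtype=float).copy()
    K = len(n)
    for _ in range(steps):
        birth = np.zeros(K)
        for k in range(1, K):
            # floc of size k+1 formed from pairs of sizes (i+1) and (k-i)
            birth[k] = 0.5 * kernel * np.dot(n[:k], n[k-1::-1])
        death = kernel * n * n.sum()     # loss of flocs to further aggregation
        n = n + dt * (birth - death)
    return n
```

For a monodisperse start, this case has the classical analytic solution for the total number density, N(t) = N0 / (1 + N0*kernel*t/2), which gives a quick consistency check on the discretization; total particle mass should also be conserved while aggregates stay within the tracked size range.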


Author(s):  
Satya Swesty Widiyana ◽  
Rus Indiyanto

ABSTRACT This study addresses problems at the Heaven Store, ranging from turnover that does not reach its target and product displays that differ between branches, to the small number of customers recommending the store, indicating problems with customer satisfaction. Because the input and output values obtained from each branch differ, customers demand that Heaven Store correct weaknesses in the efficiency of its customer service. We responded to this challenge with the study "Analysis of Service Efficiency Measurement Using Data Envelopment Analysis (DEA) at Heaven Store in West Surabaya". This study assists the management of Heaven Store in measuring efficiency so that its five branches can improve service quality using Data Envelopment Analysis (DEA), a method that determines the relative efficiency of similar organizational units, where efficiency is not determined by the unit concerned alone. The analysis is intended to help management attract customers to buy the products sold at Heaven Store. After calculating the DEA CRS mathematical model, an efficiency of 0.8479688 was obtained for the fifth branch of Heaven Store; after improving inputs and outputs according to the target references of the DEA CRS model, the relative efficiency of DMU 5 increased from 0.8479688 (inefficient) to 1.000000 (efficient). Keywords: Data Envelopment Analysis, customer satisfaction, efficiency
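The DEA CRS model used in this abstract is conventionally formulated as a linear program (the input-oriented CCR envelopment form). The sketch below is a generic illustration of that formulation, not the study's calculation: the branch data here are hypothetical, and the abstract's efficiency value of 0.8479688 comes from its own survey data, which is not reproduced.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y, j0):
    """Input-oriented CCR (constant-returns-to-scale) DEA efficiency.

    X : (m, n) input matrix, Y : (s, n) output matrix; columns are DMUs.
    Solves: min theta  s.t.  X @ lam <= theta * x_j0,  Y @ lam >= y_j0,
    lam >= 0.  Returns theta in (0, 1]; theta = 1 means efficient."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                  # decision vars: [theta, lam]
    A_in = np.hstack([-X[:, [j0]], X])           # X @ lam - theta * x_j0 <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])    # -Y @ lam <= -y_j0
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[:, j0]],
                  bounds=[(0, None)] * (n + 1))
    return float(res.x[0])
```

An inefficient DMU's targets follow from the optimal solution: scaling its inputs by theta (and meeting the reference outputs `Y @ lam`) projects it onto the efficient frontier, which mirrors the abstract's improvement of DMU 5 from 0.8479688 to 1.000000.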

