A SYSTEMATIC APPROACH TO SMB PROCESS MODEL IDENTIFICATION FROM SMB PROCESS DATA

2005 ◽ Vol 38 (1) ◽ pp. 189-194
Author(s): V. Grosfils, C. Levrie, M. Kinnaert, A. Vande Wouwer
DOI: 10.14311/816 ◽ 2006 ◽ Vol 46 (2)
Author(s): P. Pecherková, I. Nagy

The success or failure of adaptive control algorithms – especially those designed using the Linear Quadratic Gaussian criterion – depends on the quality of the process data used for model identification. One of the most harmful types of process data corruption is outliers, i.e. ‘wrong data’ lying far outside the range of the real data. The presence of outliers in the data negatively affects the estimation of the system dynamics, and this effect is magnified when the outliers are grouped into blocks. In this paper, we propose an algorithm for outlier detection and removal. It is based on modelling the corrupted data with a two-component probabilistic mixture: the first component models uncorrupted process data, while the second models outliers. When the outlier component is detected to be active, a prediction from the uncorrupted-data component is computed and used as a reconstruction of the observed data. The resulting reconstruction filter is compared to standard methods on simulated and real data, and it exhibits excellent properties, especially in the case of blocks of outliers.
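The abstract does not spell out the filter equations; a minimal sketch of such a two-component reconstruction filter, assuming (for illustration only) an AR(1) model for the uncorrupted data and a broad Gaussian for the outlier component, could look like:

```python
import numpy as np

def reconstruct(y, a=0.9, sigma=1.0, outlier_scale=10.0):
    """Two-component reconstruction filter (illustrative sketch).

    The 'uncorrupted' component is an AR(1) predictor with noise
    standard deviation `sigma`; the 'outlier' component is a broad
    Gaussian (standard deviation `outlier_scale * sigma`).  When the
    outlier component is the more likely explanation of a sample,
    the sample is replaced by the one-step prediction."""
    z = np.empty(len(y))
    z[0] = y[0]
    for k in range(1, len(y)):
        pred = a * z[k - 1]          # prediction from reconstructed data
        r = y[k] - pred              # prediction error
        # Gaussian log-likelihoods of the residual under each component
        ll_good = -0.5 * (r / sigma) ** 2 - np.log(sigma)
        ll_out = (-0.5 * (r / (outlier_scale * sigma)) ** 2
                  - np.log(outlier_scale * sigma))
        z[k] = pred if ll_out > ll_good else y[k]
    return z
```

Because the prediction is computed from previously reconstructed samples rather than from the raw measurements, a whole block of consecutive outliers is bridged by the model's own forecasts, which is why this style of filter copes with blocks of outliers.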


2007 ◽ Vol 62 (15) ◽ pp. 3894-3908
Author(s): V. Grosfils, C. Levrie, M. Kinnaert, A. Vande Wouwer

2012 ◽ Vol 2012 ◽ pp. 1-21
Author(s): Shen Yin, Xuebo Yang, Hamid Reza Karimi

This paper presents an approach to the data-driven design of a fault diagnosis system. The proposed scheme consists of an adaptive residual generator and a bank of isolation observers, whose parameters are identified directly from the process data without identifying a complete process model. To deal with normal variations in the process, the parameters of the residual generator are updated online by a standard adaptive technique to achieve reliable fault detection performance. After a fault is successfully detected, the isolation scheme is activated, in which each isolation observer serves as an indicator for the occurrence of a particular type of fault in the process. The detection thresholds can be determined analytically or by estimating the probability density function of the related variables. The performance of the proposed approach is illustrated on a laboratory-scale three-tank system. The results show that the proposed data-driven scheme is effective for applications whose analytical process models are unavailable. In particular, for large-scale plants, whose physical models are generally difficult to establish, the proposed approach may offer an effective alternative for process monitoring.
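The paper's exact residual generator is not reproduced here; the following sketch substitutes a simple recursive-least-squares ARX(1,1) identifier (the model order, forgetting factor, and threshold are illustrative assumptions) to show how an online-adapted residual with a fixed threshold can flag a fault:

```python
import numpy as np

def detect_faults(u, y, threshold, lam=0.99):
    """Fault detection sketch: an ARX(1,1) model y[k] ~ a*y[k-1] + b*u[k-1]
    is identified online by recursive least squares with forgetting factor
    `lam`; the prediction error is the residual, and an alarm is raised
    whenever its magnitude exceeds `threshold`."""
    theta = np.zeros(2)       # parameter estimate [a, b]
    P = 1000.0 * np.eye(2)    # parameter covariance (large: uninformed start)
    alarms = []
    for k in range(1, len(y)):
        phi = np.array([y[k - 1], u[k - 1]])
        residual = y[k] - phi @ theta
        alarms.append(abs(residual) > threshold)
        # RLS update keeps the residual generator adapted to normal variations
        K = P @ phi / (lam + phi @ P @ phi)
        theta = theta + K * residual
        P = (P - np.outer(K, phi) @ P) / lam
    return np.array(alarms)
```

Isolation, as described in the abstract, would add one such observer per fault hypothesis; only the detection stage is sketched here.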


2004 ◽ Vol 37 (9) ◽ pp. 23-28
Author(s): V.C. Machado, J.O. Trierweiler, A.R. Secchi

Author(s): Yi Deng, Jiacun Wang, Xudong He, Jeffrey J. P. Tsai

System assembly is one of the major issues in engineering complex component-based systems. This is especially true when heterogeneous, COTS, and GOTS distributed systems, typical in industrial applications, are involved. The goal of system assembly is not only to make the constituent components work together, but also to ensure that the components as a whole behave consistently and guarantee certain end-to-end properties. Despite recent advances, there is a lack of understanding of software composability, as well as of theory and techniques for checking and verifying component-based systems. A theory of software system constraints – about components, their environment, and the system as a whole – is the necessary foundation for a solid understanding of the composability of component-based systems. In this paper, we present a systematic approach to constraint specification and constraint propagation in concert with design refinement, together with a novel technique to ensure consistency between system-wide and component constraints in the design composition process of component-based systems. Consistent constraint propagation is used in our approach to drive progressive verification of the design: it allows us to verify the overall design composition without interference from the internal details of component designs. Verification is done separately at the architectural and component levels, without having to compose the results of component analyses. A component can be safely replaced with an alternative design without re-verifying the overall system composition, so long as the replacement conforms to the corresponding interface and component constraints.
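As a toy illustration of the substitution rule in the abstract's last sentence, a replacement is safe when every behaviour its component constraint allows is also allowed by the interface constraint. The latency constraints and the sampling-based check below are invented for this sketch; the paper's approach is a formal one, not sampling:

```python
def conforms(interface_constraint, component_constraint, samples):
    """Sampling-based stand-in for the conformance check:
    component_constraint(x) must imply interface_constraint(x)
    over the sampled behaviours."""
    return all(interface_constraint(x)
               for x in samples if component_constraint(x))

# Interface constraint: this connector promises end-to-end latency <= 100 ms.
iface = lambda latency_ms: latency_ms <= 100
# Candidate replacement component guarantees latency <= 80 ms.
comp = lambda latency_ms: latency_ms <= 80
ok = conforms(iface, comp, range(200))  # comp implies iface, so the swap is safe
```

A replacement whose constraint allowed, say, latencies up to 120 ms would fail this check, signalling that the overall composition would have to be re-verified.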


Author(s): Jaime García, José Posada, Pedro Villalba, Marco Sanjuan

Biofuels production faces new challenges every day related to better process control and quality monitoring. For the sustainability of these processes, it is very important to implement strategies and alternatives that achieve continuous production and control the significant variables involved in the reaction. One of the most difficult variables to measure is the actual biodiesel concentration inside the reactor. Neural networks have become a useful strategy for solving complex problems; their industrial application is growing rapidly because the inherently nonlinear behavior of these processes is easily modeled by this computational tool. The capacity to map complex behavior through input and output process data, without a complicated and hard-to-obtain mathematical model, makes neural networks an attractive strategy for most industries, whether as a soft sensor or in a process-model scheme. This investigation addresses the need to predict the concentrations of esters (biodiesel) when different triglycerides react with alcohol. Concentration was estimated with a soft sensor that captures the dynamics of these variables through offline laboratory experiments. The soft sensor is a Random Activation Weight Neural Net (RAWN), a backpropagation-type neural network with a fast training algorithm that does not require any iteration. In addition, to reduce the complexity of the soft sensor, an optimization procedure was carried out to determine the optimum number of neurons in the hidden layer. In this research, biodiesel was produced by transesterification of palm oil with ethanol, using KOH as catalyst. During the transesterification reaction, the concentrations are determined offline by laboratory analysis; these variables are very important for controlling the continuous operation of a biodiesel plant.

