A two-dimensional backward heat problem with statistical discrete data

2018 ◽  
Vol 26 (1) ◽  
pp. 13-31 ◽  
Author(s):  
Nguyen Dang Minh ◽  
Khanh To Duc ◽  
Nguyen Huy Tuan ◽  
Dang Duc Trong

Abstract
We focus on the nonhomogeneous backward heat problem of finding the initial temperature θ = θ(x, y) = u(x, y, 0) such that

$$\left\{\begin{aligned}
u_{t}-a(t)(u_{xx}+u_{yy})&=f(x,y,t), & (x,y,t)&\in\Omega\times(0,T),\\
u(x,y,t)&=0, & (x,y)&\in\partial\Omega\times(0,T),\\
u(x,y,T)&=h(x,y), & (x,y)&\in\overline{\Omega},
\end{aligned}\right.$$

where Ω = (0, π) × (0, π). In the problem, the source f = f(x, y, t) and the final data h = h(x, y) are determined through random noise data g_{ij}(t) and d_{ij} satisfying the regression models

$$g_{ij}(t)=f(X_{i},Y_{j},t)+\vartheta\,\xi_{ij}(t),\qquad d_{ij}=h(X_{i},Y_{j})+\sigma_{ij}\,\varepsilon_{ij},$$

where (X_i, Y_j) are grid points of Ω. The problem is severely ill-posed. To regularize the unstable solution of the problem, we use the trigonometric least squares method in nonparametric regression combined with the projection method. In addition, the convergence rate is investigated numerically.
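The projection step can be illustrated with a minimal 1D analogue (constant coefficient a(t) = 1, zero source, domain (0, π)); the function name and setup below are illustrative, not taken from the paper:

```python
import numpy as np

def backward_heat_truncated(h_coeffs, T, K):
    """Recover the sine coefficients of the initial temperature from the
    final-time coefficients h_k of u(x, T) = sum_k h_k sin(kx).

    The exact backward map theta_k = h_k * exp(k^2 * T) amplifies noise
    exponentially; projecting onto the first K modes keeps it stable."""
    k = np.arange(1, len(h_coeffs) + 1)
    theta = h_coeffs * np.exp(k**2 * T)   # exact mode-wise backward solution
    theta[k > K] = 0.0                    # projection: drop unstable high modes
    return theta
```

With noisy data, the cutoff K trades off bias (dropped modes) against variance (amplified noise), mirroring the role of the regularization parameter in the paper.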

Complexity ◽  
2020 ◽  
Vol 2020 ◽  
pp. 1-15
Author(s):  
Vu Ho ◽  
Donal O’Regan ◽  
Hoa Ngo Van

In this paper, we consider the nonlinear inverse-time heat problem with a conformable derivative with respect to the time variable. This problem is severely ill-posed. A new method, a modified integral equation based on two regularization parameters, is proposed to regularize this problem. Numerical results are presented to illustrate the efficiency of the proposed method.


2019 ◽  
Vol 27 (1) ◽  
pp. 103-115
Author(s):  
Triet Minh Le ◽  
Quan Hoang Pham ◽  
Phong Hong Luu

Abstract In this article, we investigate the backward heat problem (BHP), a classical ill-posed problem. Although many papers treat the BHP in various domains, studies of this problem in polar coordinates remain scarce. We therefore deal with this problem, associated with a space- and time-dependent heat source, in polar coordinates. By modifying the quasi-boundary value method, we propose a stable solution to the problem. Furthermore, under some initial assumptions, we obtain Hölder-type error estimates between the exact solution and the approximate solution. Finally, a numerical experiment is provided to demonstrate the effectiveness and feasibility of our method.
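As a hedged 1D Cartesian sketch (not the paper's polar-coordinate setting, and with names invented here), the quasi-boundary value idea replaces the final condition u(T) = h by u(T) + αu(0) = h, which stabilizes the mode-wise backward solution:

```python
import numpy as np

def quasi_boundary_backward(h_coeffs, T, alpha):
    """Quasi-boundary value method, 1D analogue: perturbing the final
    condition to u(T) + alpha*u(0) = h gives the stable mode-wise solution
    theta_k = h_k / (exp(-k^2 T) + alpha) instead of h_k * exp(k^2 T)."""
    k = np.arange(1, len(h_coeffs) + 1)
    return h_coeffs / (np.exp(-k**2 * T) + alpha)
```

As α → 0 this recovers the exact (unstable) backward solution; a positive α bounds the amplification of every mode by 1/α.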


2014 ◽  
Vol 2014 ◽  
pp. 1-7 ◽  
Author(s):  
Tao Min ◽  
Weimin Fu ◽  
Qiang Huang

We investigate the inverse problem in the nonhomogeneous heat equation involving the recovery of the initial temperature from measurements of the final temperature. This problem is known as the backward heat problem and is severely ill-posed. We show that the problem can be converted into a Fredholm integral equation of the first kind, and an inversion algorithm based on Tikhonov regularization is given. A genetic algorithm for obtaining the regularization parameter is presented. We also present numerical computations that verify the accuracy of our approximation.
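The conversion to a first-kind Fredholm equation and the Tikhonov step can be sketched as follows, using a 1D heat kernel on (0, π) with illustrative names; the paper's genetic-algorithm search for the regularization parameter is replaced here by a fixed α:

```python
import numpy as np

def heat_kernel_matrix(x, T, modes=50):
    """Discretized kernel K(x, xi) = (2/pi) * sum_k exp(-k^2 T) sin(kx) sin(k xi),
    so that the final temperature is u(x, T) = integral K(x, xi) theta(xi) dxi."""
    k = np.arange(1, modes + 1)
    S = np.sin(np.outer(x, k))                    # N x modes sine basis
    dx = x[1] - x[0]                              # quadrature weight
    return (2.0 / np.pi) * (S * np.exp(-k**2 * T)) @ S.T * dx

def tikhonov_solve(A, b, alpha):
    """Tikhonov-regularized least squares: argmin ||Ax - b||^2 + alpha ||x||^2,
    via the normal equations (A^T A + alpha I) x = A^T b."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ b)
```

In the paper's scheme, α itself would be chosen by the genetic algorithm rather than fixed in advance.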


2020 ◽  
Vol 42 (2) ◽  
Author(s):  
Édipo Menezes da Silva ◽  
Maraísa Hellen Tadeu ◽  
Victor Ferreira da Silva ◽  
Rafael Pio ◽  
Tales Jesus Fernandes ◽  
...  

Abstract Blackberry is a small fruit with several properties beneficial to human health, and its cultivation is an attractive option for small producers because of its fast and high financial return. Studying fruit growth over time is extremely important for understanding development and guiding crop management, helping to avoid post-harvest losses, one of the main difficulties of blackberry cultivation given the fruit's short shelf life. Growth curves are therefore central to this type of study, and statistical modeling helps explain how such growth occurs. Data for this study were obtained from an experiment conducted at the Federal University of Lavras in 2015. The aim of this study was to fit nonlinear double Logistic and double Gompertz models to describe the diameter growth of four blackberry cultivars ('Brazos', 'Choctaw', 'Guarani' and 'Tupy'). Parameter estimates were obtained by the least squares method with the Gauss-Newton algorithm, using the "nls" and "gnls" functions of the R statistical software. Fits were compared by the corrected Akaike information criterion (AICc), the residual standard deviation (RSD) and the adjusted coefficient of determination (R²adj). The models described the data satisfactorily, with the double Logistic model selected for the 'Brazos' and 'Guarani' cultivars and the double Gompertz model for the 'Tupy' and 'Choctaw' cultivars.
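A minimal Python analogue of the fitting step (the study uses R's nls/gnls; the double Logistic form, parameter values and function names below are illustrative assumptions, not the study's estimates):

```python
import numpy as np

def double_logistic(t, p):
    """Double Logistic curve: two stacked sigmoidal growth phases."""
    a1, b1, c1, a2, b2, c2 = p
    return a1 / (1 + np.exp(-b1 * (t - c1))) + a2 / (1 + np.exp(-b2 * (t - c2)))

def gauss_newton(f, t, y, p0, iters=100, h=1e-6):
    """Plain Gauss-Newton least squares with a forward-difference Jacobian."""
    p = np.asarray(p0, dtype=float)
    for _ in range(iters):
        r = y - f(t, p)                              # residual vector
        J = np.empty((t.size, p.size))
        for j in range(p.size):
            dp = np.zeros_like(p)
            dp[j] = h
            J[:, j] = (f(t, p + dp) - f(t, p)) / h   # d f / d p_j
        step, *_ = np.linalg.lstsq(J, r, rcond=None)
        p = p + step
        if np.linalg.norm(step) < 1e-12:
            break
    return p
```

The double Gompertz fit works the same way with the two logistic terms replaced by Gompertz terms a·exp(-exp(-b(t-c))).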


Mathematics ◽  
2019 ◽  
Vol 7 (5) ◽  
pp. 422
Author(s):  
Nguyen Anh Triet ◽  
Nguyen Duc Phuong ◽  
Van Thinh Nguyen ◽  
Can Nguyen-Huu

In this work, we focus on the Cauchy problem for the Poisson equation in a two-dimensional domain, where the initial data are disturbed by random noise. In general, the problem is severely ill-posed in the sense of Hadamard, i.e., the solution does not depend continuously on the data. To regularize the unstable solution of the problem, we apply nonparametric regression associated with the truncation method. Finally, a numerical example is carried out; the results show that our regularization method converges and that the error improves as the number of observation points increases.


2019 ◽  
Vol 11 (9) ◽  
pp. 168781401987323 ◽  
Author(s):  
Marwa Chaabane ◽  
Majdi Mansouri ◽  
Kamaleldin Abodayeh ◽  
Ahmed Ben Hamida ◽  
Hazem Nounou ◽  
...  

A new fault detection technique is considered in this article. It is based on kernel partial least squares, the exponentially weighted moving average, and the generalized likelihood ratio test. The developed approach aims to improve the monitoring of structural systems. It consists of computing an optimal statistic that merges the current information with the previous information, giving more weight to the most recent data. To further improve the performance of the developed kernel partial least squares model, a multiscale representation of the data is used to develop a multiscale extension of this method. Multiscale representation is a powerful data analysis tool that efficiently separates deterministic features from random noise. Thus, a multiscale kernel partial least squares method that combines the advantages of the kernel partial least squares method with those of multiscale representation is developed to enhance the structural modeling performance. The effectiveness of the proposed approach is assessed using two examples: synthetic data and a benchmark structure. The simulation study demonstrates the efficiency of the developed technique over classical detection approaches in terms of false alarm rate, missed detection rate, and detection speed.
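The EWMA ingredient of the monitoring statistic can be sketched on its own (a generic recursion, not the authors' full KPLS-GLR pipeline; names are illustrative):

```python
import numpy as np

def ewma(x, lam=0.2):
    """EWMA recursion z_t = lam * x_t + (1 - lam) * z_{t-1}: merges the
    current sample with the accumulated past, weighting recent data most."""
    z = np.empty(len(x), dtype=float)
    z[0] = x[0]
    for t in range(1, len(x)):
        z[t] = lam * x[t] + (1 - lam) * z[t - 1]
    return z
```

In a monitoring chart, an alarm is raised when z_t leaves control limits derived from the in-control variance; smaller λ smooths more and favors detection of small, persistent shifts.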


2000 ◽  
Vol 33 (2) ◽  
pp. 259-266 ◽  
Author(s):  
F. Sánchez-Bajo ◽  
F. L. Cumbrera

The deconvolution of X-ray diffraction profiles is a basic step in obtaining reliable results on the microstructure of crystalline powders (crystallite size, lattice microstrain, etc.). A procedure for unfolding the linear integral equation h = g ∗ f involved in the kinematical theory of X-ray diffraction is proposed. This technique is based on the series expansion of the `pure' profile, f. The method has been tested with a simulated instrument-broadened profile overlaid with random noise by using Hermite polynomials and Fourier series, and applied to the deconvolution of the (111) peak of a sample of 9-YSZ. In both cases, the effects of the `ill-posed' nature of this deconvolution problem were minimized, especially when the zero-order regularization was combined with the series expansion.
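The role of zero-order regularization in unfolding h = g ∗ f can be illustrated with a standard Tikhonov filter in the Fourier domain (a simpler stand-in for the paper's series-expansion approach; all names here are invented):

```python
import numpy as np

def deconvolve_zero_order(h, g, alpha):
    """Zero-order (Tikhonov) regularized deconvolution of h = g * f
    (circular convolution), solved as the Fourier-domain filter
    F = conj(G) * H / (|G|^2 + alpha); alpha damps the noise-dominated
    frequencies where |G| is small."""
    H, G = np.fft.fft(h), np.fft.fft(g)
    F = np.conj(G) * H / (np.abs(G) ** 2 + alpha)
    return np.real(np.fft.ifft(F))
```

With α = 0 this is naive inverse filtering, which blows up wherever the instrument profile G nearly vanishes; a small positive α suppresses exactly those ill-posed components.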


Geophysics ◽  
1984 ◽  
Vol 49 (5) ◽  
pp. 521-524 ◽  
Author(s):  
John Halpenny

Data from automatic recording systems often require editing and filtering before they are suitable for computer analysis. The procedure described in this paper produces edited values at regular intervals from input data containing random noise, data gaps, and sudden steps or resets. It uses a Kalman filter with a fixed delay time to estimate the most probable data value at any time, based on information both before and after the time point. Isolated portions of a bad record can be recognized and removed, and steps or offsets are identified and measured. An example is shown of clean output produced from input which suffers from a variety of instrumental problems.
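A minimal scalar sketch of the editing idea (random-walk state model, NaN-marked gaps; this is a plain causal filter rather than the paper's fixed-delay smoother, and all names are illustrative):

```python
import numpy as np

def kalman_edit(y, q=1e-3, r=1.0):
    """Scalar random-walk Kalman filter that skips NaN gaps in y.
    q: process noise variance, r: measurement noise variance.
    Returns the filtered estimate at each time step; the estimate is
    simply propagated (held) across data gaps."""
    n = len(y)
    xhat = np.empty(n)
    x, P = 0.0, 1e6                        # diffuse prior on the state
    for t in range(n):
        P = P + q                          # predict (random-walk model)
        if not np.isnan(y[t]):             # update only where data exist
            K = P / (P + r)                # Kalman gain
            x = x + K * (y[t] - x)
            P = (1 - K) * P
        xhat[t] = x
    return xhat
```

The paper's fixed-delay variant would additionally run a backward pass over a sliding window, so each estimate uses data both before and after the time point, which is what allows steps and isolated bad sections to be identified.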

