Comparison of Spectrum Estimation Methods for the Accurate Evaluation of Sea State Parameters

Sensors ◽  
2020 ◽  
Vol 20 (5) ◽  
pp. 1416
Author(s):  
Giovanni Battista Rossi ◽  
Francesco Crenna ◽  
Vincenzo Piscopo ◽  
Antonio Scamardella

The monitoring of sea state conditions, either for weather forecasting or for ship seakeeping analysis, requires the reliable assessment of the sea spectra encountered by the ship, either as a final result or as an intermediate step towards the measurement of the relevant wave-motion parameters. In the current analysis, different spectrum estimation methods, namely the Welch and Thomson methods and ARMA models, were applied and compared on a set of random wave signals, with different durations, representative of several sea state conditions. Subsequently, two sea spectrum reconstruction techniques were described and applied in order to detect the main sea state parameters, namely the significant wave height, the mean wave period and the spectrum peak enhancement factor. The performances of both the spectral analysis and the sea state reconstruction methods are discussed in order to provide some preliminary guidelines for practical applications. In this respect, based on the current results, the Welch and Thomson methods, combined with the nonlinear least-squares reconstruction technique, seem to be the most promising.
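
As a concrete illustration of the first step above, the following sketch estimates a wave spectrum with Welch's method and derives the significant wave height from the zeroth spectral moment. The synthetic wave record, sampling rate, and segment length are assumptions for illustration, not the paper's data.

```python
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

# Synthetic 600 s wave-elevation record: a narrow-band sum of cosines
# around a 0.1 Hz spectral peak with random phases (illustrative only)
fs = 4.0                        # sampling frequency [Hz], assumed
t = np.arange(0, 600, 1 / fs)
rng = np.random.default_rng(0)
freqs = np.linspace(0.05, 0.3, 200)
amps = np.exp(-((freqs - 0.1) / 0.02) ** 2)
phases = rng.uniform(0, 2 * np.pi, freqs.size)
eta = (amps[:, None] * np.cos(2 * np.pi * freqs[:, None] * t
                              + phases[:, None])).sum(axis=0)

# Welch estimate: averaged periodograms over overlapping Hann segments
f, S = welch(eta, fs=fs, nperseg=1024)

# Zeroth spectral moment -> significant wave height Hs = 4 * sqrt(m0)
m0 = trapezoid(S, f)
Hs = 4.0 * np.sqrt(m0)
```

The same record could be passed to a multitaper (Thomson) estimator for comparison; only the PSD call changes.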

Sensors ◽  
2021 ◽  
Vol 21 (9) ◽  
pp. 2995
Author(s):  
Giovanni Battista Rossi ◽  
Francesco Crenna ◽  
Marta Berardengo ◽  
Vincenzo Piscopo ◽  
Antonio Scamardella

The reliable monitoring of sea state parameters is a key factor for weather forecasting, as well as for ensuring the safety and navigation of ships. In the current analysis, two spectrum estimation techniques, based on the Welch and Thomson methods, were applied to a set of random wave signals generated from a theoretical wave spectrum obtained by combining wind sea and swell components with the same prevailing direction but different combinations of significant wave heights, peak periods, and peak enhancement factors. A wide benchmark study was performed to systematically apply and compare the two spectrum estimation methods. In this respect, different combinations of wind sea spectra, corresponding to four grades of the Douglas Scale, were combined with three swell spectra corresponding to different swell categories. The main aim of the benchmark study was to systematically investigate the effectiveness of the Welch and Thomson methods in terms of spectrum restitution and the assessment of sea state parameters. The spectrum estimation methods were applied to random wave signals with different durations, namely 600 s (short) and 3600 s (long), to investigate how the record length affected the assembled sea state parameters, which, in turn, were assessed by the nonlinear least square method. Finally, based on the main outcomes of the benchmark study, some suggestions are provided to select the most suitable spectrum reconstruction method and increase the effectiveness of the assembled sea state parameters.
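
The nonlinear least-squares reconstruction step can be sketched as follows: fit a parametric spectrum to the estimated one and read off the significant wave height, peak period, and peak enhancement factor. The simplified JONSWAP form and all numeric values below are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

# Simplified JONSWAP-type spectrum parametrised by Hs, Tp and gamma
def jonswap(f, hs, tp, gamma):
    fp = 1.0 / tp
    sigma = np.where(f <= fp, 0.07, 0.09)
    r = np.exp(-((f - fp) ** 2) / (2 * sigma ** 2 * fp ** 2))
    return ((5.0 / 16.0) * hs ** 2 * fp ** 4 * f ** -5
            * np.exp(-1.25 * (fp / f) ** 4) * gamma ** r)

f = np.linspace(0.05, 0.4, 200)
true = (3.0, 9.0, 3.3)                        # Hs [m], Tp [s], gamma
rng = np.random.default_rng(1)
# Mimic estimator noise with a chi-square multiplicative factor
S_est = jonswap(f, *true) * rng.chisquare(4, f.size) / 4

popt, _ = curve_fit(jonswap, f, S_est, p0=(2.0, 8.0, 2.0),
                    bounds=([0.1, 1.0, 1.0], [15.0, 25.0, 7.0]))
hs_fit, tp_fit, gamma_fit = popt
```

In practice `S_est` would be the Welch or Thomson output rather than a simulated spectrum.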


2012 ◽  
Vol 529 ◽  
pp. 139-143
Author(s):  
Zong Feng Ma

A low-cost, novel, and robust heterodyne laser Doppler radar based on an Er-doped fiber laser is presented in this paper. Reliable optical fiber components and instruments from optical communication were used to build the system. All devices in the optical circuit are connected by single-mode fibers, making the system reliable and the setup arrangement flexible. A spectrum estimation method based on an efficient digital signal processing technique, the fast Fourier transform (FFT), was utilized to determine the location of the spectral peak. Experiments were performed on a moving hard target with the developed prototype. The results show that the minimum measurable velocity is below 0.5 mm/s and that the nonlinearity of the measured curve, calculated by the least-squares method, is below 100 ppm.
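
A hedged sketch of the FFT peak-location step described above: locate the heterodyne beat frequency in the periodogram and convert it to a target velocity. The sampling rate, beat frequency, and 1550 nm wavelength are illustrative assumptions.

```python
import numpy as np

fs = 100_000.0                  # sampling rate [Hz], assumed
n = 1 << 14
t = np.arange(n) / fs
f_doppler = 1290.0              # simulated heterodyne beat frequency [Hz]
rng = np.random.default_rng(2)
sig = np.cos(2 * np.pi * f_doppler * t) + 0.5 * rng.standard_normal(n)

# Windowed FFT periodogram; the peak bin gives the Doppler frequency
spec = np.abs(np.fft.rfft(sig * np.hanning(n))) ** 2
freqs = np.fft.rfftfreq(n, 1 / fs)
f_peak = freqs[np.argmax(spec[1:]) + 1]   # skip the DC bin

# Heterodyne Doppler relation f = 2 v / lambda -> velocity
wavelength = 1.55e-6            # Er-doped fiber laser, ~1550 nm (assumed)
v = f_peak * wavelength / 2.0   # [m/s]
```

With these assumed numbers the recovered velocity lands near 1 mm/s, the same order as the sub-0.5 mm/s floor the abstract reports.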


2021 ◽  
Vol 10 (3) ◽  
pp. 157
Author(s):  
Paul-Mark DiFrancesco ◽  
David A. Bonneau ◽  
D. Jean Hutchinson

Key to the quantification of rockfall hazard is an understanding of its magnitude-frequency behaviour. Remote sensing has allowed for the accurate observation of rockfall activity, with methods being developed for digitally assembling the monitored occurrences into a rockfall database. A prevalent challenge is the quantification of rockfall volume, whilst fully considering the 3D information stored in each of the extracted rockfall point clouds. Surface reconstruction is utilized to construct a 3D digital surface representation, allowing for an estimation of the volume of space that a point cloud occupies. Given various point cloud imperfections, it is difficult for methods to generate digital surface representations of rockfall with detailed geometry and correct topology. In this study, we tested four different computational geometry-based surface reconstruction methods on a database comprising 3668 rockfalls. The database was derived from a 5-year LiDAR monitoring campaign of an active rock slope in interior British Columbia, Canada. Each method resulted in a different magnitude-frequency distribution of rockfall. The implications of 3D volume estimation were demonstrated utilizing surface mesh visualization, cumulative magnitude-frequency plots, power-law fitting, and projected annual frequencies of rockfall occurrence. The 3D volume estimation methods caused a notable shift in the magnitude-frequency relations, while the power-law scaling parameters remained relatively similar. We determined that the optimal 3D volume calculation approach is a hybrid methodology comprising the Power Crust reconstruction and the Alpha Solid reconstruction. The Alpha Solid approach is to be used on small-scale point clouds, characterized by high curvatures relative to their sampling density, which challenge the Power Crust sampling assumptions.
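
Power Crust and Alpha Solid are substantial algorithms in their own right; as a minimal, hedged stand-in, the sketch below estimates a rockfall volume from the convex hull of its point cloud. A convex hull systematically over-estimates the volume of concave blocks, which is precisely why the alpha-shape family of reconstructions matters for small, irregular rockfalls.

```python
import numpy as np
from scipy.spatial import ConvexHull

# Synthetic "rockfall" point cloud: 500 points in a 2 m cube (invented
# data; a real cloud would come from the LiDAR change detection step)
rng = np.random.default_rng(3)
cloud = rng.uniform(-1.0, 1.0, size=(500, 3))

hull = ConvexHull(cloud)
volume = hull.volume          # [m^3] if coordinates are in metres
```

For this convex test cloud the hull volume approaches the true 8 m³ of the cube; for a concave block the discrepancy against an alpha-shape volume would expose the bias discussed above.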


2013 ◽  
Vol 278-280 ◽  
pp. 1323-1326
Author(s):  
Yan Hua Yu ◽  
Li Xia Song ◽  
Kun Lun Zhang

Fuzzy linear regression has been extensively studied since its inception, symbolized by the work of Tanaka et al. in 1982. As one of the main estimation methods, the fuzzy least-squares approach is appealing because it corresponds, to some extent, to well-known statistical regression analysis. In this article, a restricted least-squares method is proposed to fit fuzzy linear models with crisp inputs and symmetric fuzzy output. The paper puts forward a fuzzy linear regression model based on the structured element, with precise input data and fuzzy output data; it gives a least-squares procedure for determining the regression coefficients and the fuzzy degree function, and studies the degree of fit between the observed and predicted values.
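
One common simplification of fuzzy least squares for crisp inputs and symmetric triangular fuzzy outputs is a two-stage fit: ordinary least squares on the output centers, then a separate fit on the spreads kept nonnegative. The sketch below shows that simplification, not the paper's structured-element method; all data are invented.

```python
import numpy as np

# Crisp inputs x; symmetric triangular fuzzy outputs (center m, spread s)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
m = np.array([2.1, 3.9, 6.2, 7.8, 10.1])   # output centers (invented)
s = np.array([0.3, 0.4, 0.5, 0.6, 0.7])    # output half-widths (invented)

X = np.column_stack([np.ones_like(x), x])   # design matrix [1, x]

# Stage 1: ordinary least squares on the centers
beta, *_ = np.linalg.lstsq(X, m, rcond=None)

# Stage 2: least squares on the spreads, clipped to stay nonnegative
gamma, *_ = np.linalg.lstsq(X, s, rcond=None)
pred_center = X @ beta
pred_spread = np.clip(X @ gamma, 0.0, None)
```

The "restricted" element of the paper's method constrains the spread coefficients during the fit itself rather than clipping afterwards.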


2021 ◽  
Vol 8 (1) ◽  
Author(s):  
Mozamel Musa Saeed ◽  
Mohammed Alsharidah

Both software-defined networking and big data have gained acceptance and preference in both industry and academia. These two important realms have conventionally been addressed independently in wireless cellular networks. This study analyzes wireless cellular technologies with a view to efficient, enhanced spectral densities at reduced cost. To accomplish this goal, Welch's method is taken as the core subject. Drawing on previous research and classical techniques, the study finds that spectral densities can be enhanced at reduced cost with the help of power spectral estimation methods. Welch's method estimates the power spectral density of a signal while reducing the effect of noise. Because it yields excellent results in power spectral density estimation as the data length increases, Welch's method is concluded to be the best choice.
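
The noise-reduction point made above can be demonstrated directly: Welch's segment averaging leaves the noise floor far less jittery than a single raw periodogram. The signal parameters below are illustrative assumptions.

```python
import numpy as np
from scipy.signal import welch, periodogram

fs = 1000.0
n = 8192
t = np.arange(n) / fs
rng = np.random.default_rng(4)
sig = np.sin(2 * np.pi * 100.0 * t) + rng.standard_normal(n)

f_raw, p_raw = periodogram(sig, fs=fs)        # single periodogram
f_w, p_w = welch(sig, fs=fs, nperseg=512)     # ~30 averaged segments

# Compare fluctuation of the flat noise floor well away from the tone
floor_raw = p_raw[f_raw > 300.0]
floor_w = p_w[f_w > 300.0]
var_ratio = np.var(floor_raw) / np.var(floor_w)
```

Longer records allow more segments to be averaged, which is exactly the data-length effect the abstract highlights.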


2018 ◽  
Vol 8 (1) ◽  
pp. 44
Author(s):  
Lutfiah Ismail Al turk

In this paper, a Nonhomogeneous Poisson Process (NHPP) reliability model based on the two-parameter Log-Logistic (LL) distribution is considered. The model's essential characteristics are derived and represented graphically. The parameters of the model are estimated by the Maximum Likelihood (ML) and Nonlinear Least Squares (NLS) estimation methods for the case of time-domain data. An application showing the flexibility of the considered model is conducted on five real data sets using three evaluation criteria. We hope this model will serve as an alternative to other useful reliability models for describing real data in the reliability engineering area.
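
A hedged sketch of the NLS estimation route for an NHPP model: fit a mean value function to cumulative failure counts by nonlinear least squares. The Log-Logistic parametrisation m(t) = a(λt)^κ / (1 + (λt)^κ) and the data below are assumptions, not the paper's exact formulation.

```python
import numpy as np
from scipy.optimize import curve_fit

# Mean value function of a Log-Logistic NHPP (assumed parametrisation):
# a = expected total number of failures, lam = scale, k = shape
def mean_value(t, a, lam, k):
    u = (lam * t) ** k
    return a * u / (1.0 + u)

# Invented cumulative failure counts at observation times t
t = np.array([5, 10, 20, 40, 60, 90, 120, 160, 200], dtype=float)
true = (100.0, 0.02, 1.8)
rng = np.random.default_rng(5)
counts = mean_value(t, *true) + rng.normal(0.0, 2.0, t.size)

popt, _ = curve_fit(mean_value, t, counts, p0=(80.0, 0.05, 1.0),
                    bounds=([1.0, 1e-4, 0.1], [1000.0, 1.0, 10.0]))
a_hat, lam_hat, k_hat = popt
```

The ML route instead maximises the NHPP log-likelihood built from the intensity function m′(t).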


In this paper, we define a new two-parameter Lindley half-Cauchy (NLHC) distribution using the Lindley-G family of distributions, which accommodates increasing, decreasing, and a variety of monotone failure rates. The statistical properties of the proposed distribution, such as the probability density function, cumulative distribution function, quantiles, and measures of skewness and kurtosis, are presented. We briefly describe three well-known estimation methods, namely the maximum likelihood (MLE), least-squares (LSE), and Cramér-von Mises (CVM) methods. All computations are performed in R. Using the maximum likelihood method, we construct asymptotic confidence intervals for the model parameters. We empirically verify the potential of the new distribution in modeling a real data set.
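
The NLHC density itself is involved, so as a hedged stand-in the sketch below shows the generic maximum-likelihood recipe referred to above, applied to the plain half-Cauchy parent distribution: minimise the negative log-likelihood numerically. The data and true scale are assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import halfcauchy

# Simulated sample from a half-Cauchy with scale 2 (invented data)
rng = np.random.default_rng(6)
data = halfcauchy.rvs(scale=2.0, size=2000, random_state=rng)

# Negative log-likelihood of the half-Cauchy scale parameter
def nll(params):
    (scale,) = params
    if scale <= 0:
        return np.inf
    return -halfcauchy.logpdf(data, scale=scale).sum()

res = minimize(nll, x0=[1.0], method="Nelder-Mead")
scale_hat = res.x[0]
```

The LSE and CVM estimators mentioned in the abstract instead minimise distances between the empirical and model CDFs; only the objective function changes.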


2019 ◽  
Vol 5 (3) ◽  
pp. 6 ◽  
Author(s):  
Neha Dubey ◽  
Ankit Pandit

In wireless communication, orthogonal frequency division multiplexing (OFDM) plays a major role because of its high transmission rate. Many different techniques are available for channel estimation and tracking in OFDM systems. Among them, the most important are least squares (LS) and minimum mean square error (MMSE). The LS channel estimation method is simple, but its major drawback is a very high mean square error. The performance of MMSE is superior to LS at low SNR, but its main problem is high computational complexity. If the error is reduced to a very low value, the signal can be recovered accurately. This paper presents an extensive review of channel estimation methods used in MIMO-OFDM, including pilot-based, least squares (LS), minimum mean square error (MMSE), and linear minimum mean square error (LMMSE) methods, as well as other channel estimation methods.
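
The LS-versus-MMSE trade-off described above can be sketched on pilot subcarriers: LS divides out the pilots and passes noise straight through, while a simplified diagonal MMSE shrinks the LS estimate using the (assumed known) channel power and noise variance. The SNR, pilot values, and channel statistics are illustrative assumptions, not a full MIMO-OFDM chain.

```python
import numpy as np

rng = np.random.default_rng(7)
n_pilots = 256
snr_db = 5.0
noise_var = 10 ** (-snr_db / 10)

# Rayleigh-fading channel taps with unit average power (assumed model)
h = (rng.standard_normal(n_pilots)
     + 1j * rng.standard_normal(n_pilots)) / np.sqrt(2)
x = np.ones(n_pilots, dtype=complex)          # unit-power pilots
noise = np.sqrt(noise_var / 2) * (rng.standard_normal(n_pilots)
                                  + 1j * rng.standard_normal(n_pilots))
y = h * x + noise

# LS: per-subcarrier division; noise is not suppressed at all
h_ls = y / x

# Diagonal MMSE with E|h|^2 = 1 assumed known: shrink LS towards zero
h_mmse = h_ls * (1.0 / (1.0 + noise_var))

mse_ls = np.mean(np.abs(h_ls - h) ** 2)
mse_mmse = np.mean(np.abs(h_mmse - h) ** 2)
```

The full MMSE estimator uses the channel correlation matrix across subcarriers, which is where the computational cost the abstract mentions comes from.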


Author(s):  
Huanan Zhang ◽  
Stefanus Jasin

Problem definition: We consider the problem of joint learning and optimization of cyclic pricing policies in the presence of patient customers. In our problem, some customers are patient, and they are willing to wait in the system for several periods to make a purchase until the price is lower than their valuation. The seller does not know the joint distribution of customers’ valuation and patience level a priori and can only learn this from the realized total sales in every period. Academic/practical relevance: The revenue management problem with patient customers has been studied in the literature as an optimization problem, and cyclic policy has been shown to be optimal in some cases. We contribute to the literature by studying this problem from the joint learning and optimization perspective. Indeed, to the best of our knowledge, our paper is the first work that studies online learning and optimization for multiperiod pricing with patient customers. Methodology: We introduce new dynamic programming formulations for this problem, and we develop two nontrivial upper confidence bound–based learning algorithms. Results: We analyze both decreasing cyclic policies and so-called threshold-regulated policies, which contain both the decreasing cyclic policies and the nested decreasing cyclic policies. We show that our learning algorithms for these policies converge to the optimal clairvoyant decreasing cyclic policy and threshold-regulated policy at a near-optimal rate. Managerial implications: Our proposed algorithms perform significantly better than benchmark algorithms that either ignore the patient customer characteristic or simply use the standard estimate-then-optimize framework, which does not encourage enough exploration; this highlights the importance of “smart learning” in the context of data-driven decision making. 
In addition, our numerical results also show that combining our algorithms with smart estimation methods, such as linear interpolation or least square estimation, can significantly improve their empirical performance; this highlights the benefit of combining smart learning with smart estimation, which further increases the practical viability of the algorithms.
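
As a toy illustration of the upper-confidence-bound ingredient, the sketch below runs standard UCB over a small set of candidate prices with Bernoulli purchases. It deliberately strips out the patient-customer dynamics and cyclic structure the paper actually handles; the demand model and prices are assumptions.

```python
import numpy as np

rng = np.random.default_rng(8)
prices = np.array([4.0, 6.0, 8.0])
buy_prob = np.array([0.9, 0.7, 0.3])   # unknown to the seller (invented)
T = 5000

counts = np.zeros(prices.size)         # times each price was offered
rev_sum = np.zeros(prices.size)        # revenue collected at each price
for t in range(1, T + 1):
    if t <= prices.size:
        arm = t - 1                    # offer each price once to start
    else:
        # Optimism under uncertainty: empirical mean + exploration bonus
        ucb = rev_sum / counts + np.sqrt(2 * np.log(t) / counts)
        arm = int(np.argmax(ucb))
    sale = rng.random() < buy_prob[arm]
    counts[arm] += 1
    rev_sum[arm] += prices[arm] * sale

best_arm = int(np.argmax(rev_sum / counts))
```

With patient customers, the reward of a price depends on the whole cycle offered, which is why the paper needs dynamic-programming formulations rather than this independent-arm view.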

