smoothing parameters
Recently Published Documents


TOTAL DOCUMENTS

107
(FIVE YEARS 26)

H-INDEX

16
(FIVE YEARS 2)

2021 ◽  
Vol 15 (1) ◽  
pp. 280-288
Author(s):  
Mahdi Rezapour ◽  
Khaled Ksaibati

Background: Kernel-based methods have gained popularity because the distribution of a model's residuals may not be captured by any classical parametric distribution. Kernel-based methods have been extended to estimate conditional densities rather than conditional distributions when the data contain both discrete and continuous attributes. The method relies on smoothing parameters, with an optimal value chosen for each attribute. Thus, when an explanatory variable is independent of the dependent variable, the nonparametric method effectively drops that attribute by assigning it a large smoothing parameter, giving it a uniform weight so that its contribution to the model's variance is minimal. Objectives: The objective of this study was to identify factors contributing to the severity of pedestrian crashes using an unbiased method. In particular, this study evaluated the applicability of semi- and nonparametric kernel-based techniques to the crash dataset by means of confusion matrices. Methods: In this study, two kernel-based methods, one nonparametric and one semiparametric, were implemented to model the severity of pedestrian crashes. Estimation of the semiparametric densities is based on adaptive local smoothing and maximization of a quasi-likelihood function, which is somewhat similar to the likelihood of the binary logit model. The nonparametric method, on the other hand, is based on selecting optimal smoothing parameters for the conditional probability density function so as to minimize the mean integrated squared error (MISE). The performance of these models is evaluated by their predictive power. As a benchmark for comparison, standard logistic regression was also employed. Although these methods have been applied in other fields, this is one of the earliest studies to employ them in the context of traffic safety. Results: The results highlighted that the nonparametric kernel-based method outperforms the semiparametric (single-index) model and the standard logit model based on the confusion matrices. To illustrate how bandwidth selection removes irrelevant attributes in the nonparametric approach, we added noisy predictors to the models and compared the results. The methodological approach of the models is discussed extensively in this study. Conclusion: To summarize, alcohol and drug involvement, driving on a non-level grade, and poor lighting conditions are some of the factors that increase the likelihood of severe pedestrian crashes. This is one of the earliest studies to apply these methods to transportation problems. The nonparametric method is especially recommended in the field of traffic safety when there is uncertainty about the importance of predictors, as the technique automatically drops unimportant predictors.
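For readers who want to experiment with the bandwidth-based screening of irrelevant predictors described above, the following is a minimal sketch using statsmodels' conditional kernel density estimator with likelihood cross-validation; the simulated variables (alcohol, lighting, noise) merely stand in for the crash dataset and this is not the authors' data or code.

```python
# Minimal sketch: likelihood cross-validated bandwidth selection for a
# conditional density with mixed discrete/continuous predictors.
import numpy as np
from statsmodels.nonparametric.kernel_density import KDEMultivariateConditional

rng = np.random.default_rng(0)
n = 300
alcohol = rng.integers(0, 2, n)      # relevant unordered discrete predictor
lighting = rng.integers(0, 3, n)     # relevant ordered discrete predictor
noise = rng.normal(size=n)           # irrelevant continuous predictor
severity = (0.8 * alcohol + 0.4 * lighting + rng.normal(size=n) > 1.0).astype(int)

kde = KDEMultivariateConditional(
    endog=severity,
    exog=np.column_stack([alcohol, lighting, noise]),
    dep_type='u', indep_type='uoc',
    bw='cv_ml')                      # likelihood cross-validation

# The bandwidth selected for the irrelevant `noise` column should approach its
# upper bound (relative to the variable's scale), effectively smoothing that
# predictor toward a uniform weight and dropping it from the model.
print(kde.bw)
```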


Author(s):  
Евгений Николаевич Коровин ◽  
Виктория Николаевна Белоусова

The article presents the analysis and forecasting of the main statistical indicators characterizing the prevalence of various nosological forms of disease among the children of the Kamensky district. To assess the quality of medical care provided in the children's polyclinic, a survey was conducted among residents of the district; it revealed how often this outpatient clinic is visited both for illness and for preventive purposes, assessed the level of care provided according to various criteria, identified both positive and negative aspects of its activity, and led to proposed methods for improving the polyclinic's efficiency. To anticipate the main morbidity indicators, a forecast was constructed, using morbidity indicators for the child population from previous years as input data. Forecasting is carried out by the exponential smoothing method with a linear trend and a choice of optimal smoothing parameters. Exponential smoothing is an intuitive method that weights the observations of a time series unequally: recent observations are weighted more heavily than distant ones. The main purpose of the analysis and forecasting is to identify the main trends in the changing structure of morbidity, as well as to determine the impact of the quality and availability of the polyclinic's medical services on the health of the child population of the Kamensky district.
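As a rough illustration of the forecasting procedure described above (exponential smoothing with a linear trend and optimised smoothing parameters), here is a minimal sketch using statsmodels' Holt implementation; the short series is made up and is not the Kamensky district morbidity data.

```python
# Minimal sketch: Holt's linear-trend exponential smoothing with smoothing
# parameters estimated by minimising the in-sample squared errors.
import numpy as np
from statsmodels.tsa.holtwinters import Holt

incidence = np.array([152.0, 148.3, 155.1, 161.7, 158.9, 166.2, 171.4, 169.8])

fit = Holt(incidence).fit(optimized=True)  # level and trend parameters chosen automatically
print(fit.params)                          # fitted smoothing parameters
print(fit.forecast(3))                     # three-period-ahead forecast
```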


Forecasting ◽  
2021 ◽  
Vol 3 (4) ◽  
pp. 839-850
Author(s):  
Eren Bas ◽  
Erol Egrioglu ◽  
Ufuk Yolcu

Exponential smoothing methods are among the classical time series forecasting methods, and they are well known to be powerful. In these methods, the exponential smoothing parameters are fixed over time and should be estimated with efficient optimization algorithms. A suitable exponential smoothing method should be chosen according to the components of the time series; the Holt method can produce successful forecasts for time series that have a trend. In this study, the Holt method is modified by using time-varying smoothing parameters instead of parameters that are fixed over time. The smoothing parameters are obtained for each observation from first-order autoregressive models. The parameters of the autoregressive models are estimated with a harmony search algorithm, and the forecasts are obtained with a subsampling bootstrap approach. The main contribution of the paper is to model the time-varying smoothing parameters with autoregressive equations and to use the bootstrap method within an exponential smoothing method. Real-world time series are used to demonstrate the forecasting performance of the proposed method.
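A minimal sketch of the idea follows, assuming a logistic transform keeps the AR(1)-generated parameters in (0, 1); the harmony search estimation of the AR coefficients and the subsampling bootstrap are omitted, so this illustrates the recursion rather than the authors' implementation.

```python
# Minimal sketch: Holt recursion with time-varying smoothing parameters driven
# by first-order autoregressive equations. phi_a and phi_b are fixed here for
# illustration; in the paper they would be estimated by harmony search.
import numpy as np

def tv_holt(y, phi_a=(0.1, 0.8), phi_b=(0.1, 0.8)):
    sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
    level, trend = y[0], y[1] - y[0]
    a_prev, b_prev = 0.5, 0.5
    fitted = []
    for t in range(1, len(y)):
        # AR(1)-type updates for the smoothing parameters at time t
        alpha = sigmoid(phi_a[0] + phi_a[1] * a_prev)
        beta = sigmoid(phi_b[0] + phi_b[1] * b_prev)
        fitted.append(level + trend)                       # one-step-ahead forecast of y[t]
        new_level = alpha * y[t] + (1 - alpha) * (level + trend)
        trend = beta * (new_level - level) + (1 - beta) * trend
        level, a_prev, b_prev = new_level, alpha, beta
    return np.array(fitted), level + trend                 # in-sample fits, next forecast

y = np.array([10.2, 11.1, 12.0, 12.8, 13.9, 15.1, 16.0])
fits, forecast = tv_holt(y)
print(forecast)
```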


2021 ◽  
Vol 9 (4) ◽  
pp. 325-337
Author(s):  
Robert Z. Selden ◽  
Lauren N. Butaric ◽  
Kersten Bergstrom ◽  
Dennis Van Gerven

ABSTRACT The production of three-dimensional (3D) digital meshes of surface and computed tomographic (CT) data has become widespread in morphometric analyses of anthropological and archaeological data. Given that processing methods are not standardized, questions remain regarding the comparability of processed and digitally curated 3D datasets. The goal of this study was to identify the processing parameters that result in the most consistent fit between CT-derived meshes and a 3D surface model of the same human mandible. Eight meshes, each produced with unique thresholding and smoothing parameters, were compared to assess whole-object deviations, deviations along curves, and deviations between specific anatomical features on the surface model and the CT scans using a suite of comparison points. Based on calculated gap distances, the mesh thresholded at "0" with an applied smoothing technique deviated least from the surface model, although it is not the most biologically accurate. The results have implications for aggregated studies that employ multimodal 3D datasets, and caution is recommended for studies that enlist 3D data from websites and digital repositories, particularly if the processing parameters are unknown or were derived for studies with different research foci.
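To make the gap-distance comparison concrete, here is a minimal sketch using the trimesh library: comparison points are sampled on the reference surface model and their closest-point distances to one CT-derived mesh are measured. The file names are placeholders, and this is not the authors' processing pipeline.

```python
# Minimal sketch: unsigned closest-point (gap) distances from comparison points
# on a reference surface model to a CT-derived mesh.
import trimesh

surface = trimesh.load('surface_model.ply')            # placeholder: 3D surface scan (reference)
ct_mesh = trimesh.load('ct_threshold0_smoothed.ply')   # placeholder: one of the eight CT meshes

# Sample comparison points on the reference surface
points, _ = trimesh.sample.sample_surface(surface, 500)

# Distance from each comparison point to the closest point on the CT mesh
_, gap, _ = trimesh.proximity.closest_point(ct_mesh, points)
print(gap.mean(), gap.max())
```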


2021 ◽  
Vol 2123 (1) ◽  
pp. 012035
Author(s):  
Andi Tenri Ampa ◽  
I Nyoman Budiantara ◽  
Ismaini Zain

Abstract In this article, we propose a new method of selecting smoothing parameters in semiparametric regression. The method is used in semiparametric regression estimation where the nonparametric component is approximated partly by a multivariable Fourier series and partly by a multivariable kernel. The smoothing parameters are selected with Generalized Cross-Validation (GCV). To assess the performance of this method, it is applied to drinking water quality data from the Regional Drinking Water Company (PDAM) of Surabaya, using a Fourier series with trend and a Gaussian kernel. The results show that the method performs well in selecting the optimal smoothing parameters.
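As a rough illustration of GCV-based smoothing parameter selection, here is a minimal sketch for a univariate Gaussian-kernel smoother; the paper's estimator, which mixes a multivariable Fourier series with a multivariable kernel, is not reproduced.

```python
# Minimal sketch: pick a kernel bandwidth by minimising the GCV criterion
#   GCV(h) = (RSS / n) / (1 - tr(S_h) / n)^2,
# where S_h is the smoother (hat) matrix of a Nadaraya-Watson estimator.
import numpy as np

def gcv_gaussian_kernel(x, y, bandwidths):
    n = len(x)
    scores = {}
    for h in bandwidths:
        K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
        S = K / K.sum(axis=1, keepdims=True)          # smoother (hat) matrix
        resid = y - S @ y
        scores[h] = (resid @ resid / n) / (1 - np.trace(S) / n) ** 2
    return min(scores, key=scores.get), scores

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 10, 80))
y = np.sin(x) + rng.normal(scale=0.3, size=80)
best_h, _ = gcv_gaussian_kernel(x, y, bandwidths=[0.1, 0.3, 0.5, 1.0, 2.0])
print(best_h)
```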


Author(s):  
Hairi Septiyanor ◽  
Syaripuddin Syaripuddin ◽  
Rito Goejantoro

Exponential smoothing is a forecasting method used to predict future values. Lazarus is open-source software based on the Free Pascal compiler. In this research, a Lazarus program was designed to apply the exponential smoothing method to electricity consumption data for Samarinda City from September to November 2018. The purpose of this research is to establish the procedure for building an exponential smoothing forecasting application and to obtain forecasting results with the built application. The procedure for building the application consists of designing the interface, setting the properties, and writing the code. The optimal smoothing parameters were obtained with the golden section method. Based on the analysis, the electricity consumption data for Samarinda City show a trend pattern, so forecasting used the double exponential smoothing (DES) methods of Brown and Holt. The best forecasting method in this research is DES Holt, which produced a MAPE of 0.0659%, lower than the 0.0843% MAPE produced by DES Brown.
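A minimal sketch of the parameter search, written in Python rather than Lazarus/Free Pascal: the smoothing parameter of Brown's DES is chosen by golden section search on the in-sample MAPE. The series is illustrative, not the Samarinda electricity data.

```python
# Minimal sketch: golden section search for the smoothing parameter of
# Brown's double exponential smoothing, minimising the in-sample MAPE.
import numpy as np

def brown_des_mape(alpha, y):
    s1 = s2 = y[0]
    errors = []
    for t in range(1, len(y)):
        a, b = 2 * s1 - s2, alpha / (1 - alpha) * (s1 - s2)
        forecast = a + b                        # one-step-ahead forecast of y[t]
        errors.append(abs((y[t] - forecast) / y[t]))
        s1 = alpha * y[t] + (1 - alpha) * s1    # first smoothing
        s2 = alpha * s1 + (1 - alpha) * s2      # second smoothing
    return 100 * np.mean(errors)

def golden_section(f, lo=0.01, hi=0.99, tol=1e-4):
    g = (np.sqrt(5) - 1) / 2
    a, b = lo, hi
    while b - a > tol:
        c, d = b - g * (b - a), a + g * (b - a)
        if f(c) < f(d):
            b = d
        else:
            a = c
    return (a + b) / 2

y = np.array([410.0, 415.2, 421.1, 426.0, 433.4, 440.8, 447.9, 455.3])
alpha = golden_section(lambda a: brown_des_mape(a, y))
print(alpha, brown_des_mape(alpha, y))
```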


2021 ◽  
Author(s):  
Robert Z. Selden ◽  
Lauren N. Butaric ◽  
Kersten Bergstrom ◽  
Dennis Van Gerven

The production of three-dimensional (3-D) digital meshes of surface and computed tomographic (CT) data has become widespread in morphometric analyses of anthropological and archaeological data. Given that processing methods are not standardised, questions remain regarding the comparability of processed and digitally curated 3-D datasets. The goal of this study was to identify the processing parameters that result in the most consistent fit between CT-derived meshes and a 3-D surface model of the same human mandible. Eight meshes, each produced with unique thresholding and smoothing parameters, were compared to assess whole-object deviations, deviations along curves, and deviations between specific anatomical features on the surface model and the CT scans using a suite of comparison points. Based on calculated gap distances, the mesh thresholded at "0" with an applied smoothing technique deviated least from the surface model, although it is not the most biologically accurate. The results have implications for aggregated studies that employ multi-modal 3-D datasets, and caution is recommended for studies that enlist 3-D data from websites and digital repositories, particularly if the processing parameters are unknown or were derived for studies with different research foci.


2021 ◽  
pp. 1-35
Author(s):  
Hiroshi Yamada

The Hodrick–Prescott (HP) filter has been a popular method of trend extraction from economic time series. However, it is impractical without modification if some observations are not available. This paper improves the HP filter so that it can be applied in such situations. More precisely, this paper introduces two alternative generalized HP filters that are applicable for this purpose. We provide their properties and a way of specifying those smoothing parameters that are required for their application. In addition, we numerically examine their performance. Finally, based on our analysis, we recommend one of them for applied studies.
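One natural way to handle missing observations, sketched below, is to fit the HP trend only to the observed entries while penalising second differences over the full sample; this illustrates the general idea and is not necessarily either of the paper's two generalized filters.

```python
# Minimal sketch: HP-style trend extraction with missing observations.
# The trend tau minimises  sum_{t observed} (y_t - tau_t)^2 + lam * sum_t (d^2 tau_t)^2.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

def hp_filter_missing(y, lam=1600.0):
    n = len(y)
    idx = np.flatnonzero(~np.isnan(y))
    S = sp.eye(n, format='csr')[idx]                 # selection matrix: keep observed rows
    D = sp.diags([1.0, -2.0, 1.0], [0, 1, 2],
                 shape=(n - 2, n), format='csr')     # second-difference operator
    A = S.T @ S + lam * (D.T @ D)
    b = S.T @ y[idx]
    return spsolve(A.tocsc(), b)                     # trend defined on all n dates

y = np.array([2.1, 2.3, np.nan, 2.8, 3.0, np.nan, np.nan, 3.6, 3.9, 4.1])
print(hp_filter_missing(y, lam=100.0))
```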


2021 ◽  
Author(s):  
Donato Talone ◽  
Rita de Nardis ◽  
Giusy Lavecchia ◽  
Luca De Siena

Seismic tomography can be applied at different scales. Over the last two decades, monitoring systems, technical innovations, and methodologies have improved substantially, resulting in accurate tomographic images at the global and local scales. Nowadays it is easy to perform travel-time tomography with local seismicity thanks to the increasing density of seismic stations. Nevertheless, it is unlikely that earthquakes properly cover the whole study area at the depths of interest; for this reason, many tomographic images are obtained with teleseisms or with a combination of distant and local earthquakes.

Here, we carried out a Local Earthquake Tomography (LET) in an area of high seismic hazard in central-southern Italy, extending from L'Aquila to Benevento, to benchmark the iterative non-linear fast-marching code FMTOMO (Rawlinson and Sambridge, 2004) at the intracontinental scale. The primary aim is to analyse and discuss the influence of both the inversion parameters and the grid sizes on the inversion results. Special attention was devoted to setting the damping factors and smoothing parameters and to studying how they affect the tomographic images and their reliability.

We used 5712 local events (0.2 < ML < 5.1) recorded by 38 stations of the Italian Seismic Network and jointly inverted 71221 P and S arrival times to obtain Vp and Vs models. We selected earthquakes having: (1) a root-mean-square (RMS) residual of less than 0.5 s, (2) more than 10 phases (P and S), (3) an azimuthal gap of less than 180°, (4) a residual of less than 0.5 s for each phase, and (5) a depth between 0.5 and 30 km. We used a single layer 35 km deep and a grid extending 162 km in latitude and 245 km in longitude with a node spacing of about 5 km in each direction. As a starting velocity model, we chose the one-dimensional model of Trionfera et al. (2020).

Using these well-localized earthquakes, we observed low variability of the residuals despite a full investigation of the damping and smoothing parameters. Furthermore, the regularization parameters we obtained are one to two orders of magnitude lower than those estimated at wider scales.

Because of the uncertainties in event depths, the fast-marching code needs several nodes above and below the grid defined for the earthquakes in order to move sources during each hypocentral inversion. As a consequence, when inverting for both velocity and hypocentral location, FMTOMO performs the calculation even over a wide boundary area without earthquakes, which causes a loss of computational speed.

After properly tuning the inversion parameters, FMTOMO gives reliable and high-resolution tomographic images. We found good agreement with surface geology and regional tectonic structures, demonstrating that the code works well in areas with such complex geology.
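To illustrate how damping and smoothing regularisation enter such an inversion, here is a minimal synthetic sketch of a damped and smoothed linearised least-squares problem; the matrices and model grid are made up, and this is not the FMTOMO implementation.

```python
# Minimal sketch: the model perturbation m minimises
#   ||G m - d||^2 + eps^2 ||m||^2 + eta^2 ||L m||^2,
# where eps is the damping factor, eta the smoothing factor, and L a
# second-difference roughness operator on a 1-D model grid (all synthetic).
import numpy as np

rng = np.random.default_rng(2)
n_rays, n_nodes = 60, 20
G = rng.random((n_rays, n_nodes))                      # synthetic ray-path sensitivities
m_true = np.sin(np.linspace(0, np.pi, n_nodes)) * 0.05
d = G @ m_true + rng.normal(scale=0.01, size=n_rays)   # synthetic travel-time residuals

L = np.diff(np.eye(n_nodes), n=2, axis=0)              # roughness (second-difference) operator

def invert(eps, eta):
    A = np.vstack([G, eps * np.eye(n_nodes), eta * L])
    b = np.concatenate([d, np.zeros(n_nodes), np.zeros(n_nodes - 2)])
    m, *_ = np.linalg.lstsq(A, b, rcond=None)
    return m, np.linalg.norm(G @ m - d)

# Trade-off exploration: stronger damping/smoothing gives a smoother model and a larger misfit
for eps, eta in [(0.01, 0.01), (0.1, 0.1), (1.0, 1.0)]:
    _, misfit = invert(eps, eta)
    print(eps, eta, round(float(misfit), 4))
```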


Author(s):  
Axel Böhm ◽  
Stephen J. Wright

Abstract We study the minimization of a structured objective function that is the sum of a smooth function and the composition of a weakly convex function with a linear operator. Applications include image reconstruction problems with regularizers that introduce less bias than the standard convex regularizers. We develop a variable smoothing algorithm, based on the Moreau envelope with a decreasing sequence of smoothing parameters, and prove a complexity of $\mathcal{O}(\epsilon^{-3})$ to achieve an $\epsilon$-approximate solution. This bound interpolates between the $\mathcal{O}(\epsilon^{-2})$ bound for the smooth case and the $\mathcal{O}(\epsilon^{-4})$ bound for the subgradient method. Our complexity bound is in line with other works that deal with structured nonsmoothness of weakly convex functions.
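A minimal sketch of the variable smoothing idea with a decreasing sequence of smoothing parameters $\mu_k \propto k^{-1/3}$: for simplicity the nonsmooth term is taken as the (convex, hence weakly convex) $\ell_1$ norm composed with a difference operator, so its proximal map is soft-thresholding; this is not the authors' algorithm verbatim.

```python
# Minimal sketch: at iteration k, replace g(Ax) by its Moreau envelope with
# parameter mu_k ~ k^(-1/3) and take a gradient step on the smoothed objective
#   F_k(x) = 0.5 ||x - b||^2 + g_{mu_k}(A x),   grad g_mu(z) = (z - prox_{mu g}(z)) / mu.
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def variable_smoothing(b, A, lam=1.0, iters=500):
    x = np.zeros(A.shape[1])
    op_norm_sq = np.linalg.norm(A, 2) ** 2
    for k in range(1, iters + 1):
        mu = k ** (-1.0 / 3.0)                          # decreasing smoothing parameter
        z = A @ x
        grad_env = (z - soft_threshold(z, lam * mu)) / mu   # gradient of the Moreau envelope
        grad = (x - b) + A.T @ grad_env
        step = mu / (1.0 + op_norm_sq)                  # conservative step, below 1/Lipschitz
        x -= step * grad
    return x

b = np.concatenate([np.ones(20), 3 * np.ones(20)]) + 0.1 * np.random.default_rng(3).normal(size=40)
A = np.diff(np.eye(40), axis=0)                          # first-difference operator
print(np.round(variable_smoothing(b, A), 2))
```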

