smoothing kernel
Recently Published Documents

TOTAL DOCUMENTS: 40 (FIVE YEARS: 20)
H-INDEX: 8 (FIVE YEARS: 1)

2021 ◽  
pp. 096228022110616
Author(s):  
İsmail Yenilmez ◽  
Ersin Yılmaz ◽  
Yeliz Mert Kantar ◽  
Dursun Aydın

In this study, parametric and semi-parametric regression models are examined under random right censoring. The components of these regression models are estimated with weights based on Cox and Kaplan–Meier estimates, the semi-parametric and nonparametric methods of survival analysis, respectively. The Tobit model, with weights obtained from a Cox regression, serves as the parametric model in place of alternatives that require distributional assumptions such as the exponential, Weibull, or gamma. In addition, semi-parametric smoothing spline and smoothing kernel estimators based on Kaplan–Meier weights are used, so that estimates are obtained from two models with flexible approaches. To show how the shape of the models adapts to the weights, Monte Carlo simulations are conducted, and all results are presented and discussed. Two empirical datasets are used to assess the performance of the estimators. Although the three approaches gave similar results, the semi-parametric approach was slightly superior to the parametric one; the parametric approach, on the other hand, performs well at medium and large sample sizes and at high censoring levels. All remaining findings are reported and interpreted.
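Kaplan–Meier weights of the kind used here (Stute-type weights) downweight censored observations so that otherwise standard estimators remain consistent under random right censoring. A minimal sketch in Python follows; it is not the authors' code, and the final weighted least-squares step with a design matrix X is purely illustrative:

```python
import numpy as np

def kaplan_meier_weights(y, delta):
    """Stute-type Kaplan-Meier weights for randomly right-censored data.

    y     : observed responses, sorted in ascending order
    delta : censoring indicators (1 = event observed, 0 = censored)
    Censored observations receive zero weight; uncensored ones receive
    the corresponding Kaplan-Meier jump sizes.
    """
    n = len(y)
    w = np.zeros(n)
    surv = 1.0                                  # running survival factor
    for i in range(n):
        w[i] = delta[i] * surv / (n - i)
        surv *= ((n - i - 1) / (n - i)) ** delta[i]
    return w

# Weighted least squares with KM weights (X is an illustrative design matrix):
# beta = np.linalg.lstsq(np.sqrt(w)[:, None] * X, np.sqrt(w) * y, rcond=None)[0]
```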


PLoS ONE ◽  
2021 ◽  
Vol 16 (12) ◽  
pp. e0260245
Author(s):  
Douglas D. Burman

Studies of the hippocampus use smaller voxel sizes and smoothing kernels than cortical activation studies, typically using a multivoxel seed with a specified radius for connectivity analysis. This study identified optimal processing parameters for evaluating hippocampal connectivity with sensorimotor cortex (SMC), comparing effectiveness by varying parameters during both activation and connectivity analysis. Using both 3 mm and 4 mm isovoxels, smoothing kernels of 0-10 mm were evaluated for their effects on the amplitude and extent of motor activation and hippocampal connectivity with SMC. Psychophysiological interactions (PPI) identified hippocampal connectivity with SMC during volitional movements, and connectivity effects from multivoxel seeds were compared with alternate methods; a structural seed represented the mean connectivity map from all voxels within a region, whereas a functional seed represented the regional voxel with maximal SMC connectivity. With few exceptions, the same parameters were optimal for activation and connectivity. Larger isovoxels showed larger activation volumes in both SMC and the hippocampus; connectivity volumes from structural seeds were also larger, except from the posterior hippocampus. Regardless of voxel size, the 10 mm smoothing kernel generated larger activation and connectivity volumes from structural seeds, as well as larger beta estimates at connectivity maxima; structural seeds also produced larger connectivity volumes than multivoxel seeds. Functional seeds were less affected by voxel size and smoothing kernel. Optimal parameters revealed topography in structural seed connectivity along both the longitudinal and mediolateral axes of the hippocampus. These results indicate that larger voxels and smoothing kernels can improve sensitivity for detecting both cortical activation and hippocampal connectivity.
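Sweeping a range of Gaussian smoothing kernels of this kind is straightforward with standard neuroimaging tooling. A minimal sketch using nilearn (the file name is hypothetical, and the study itself used SPM-based preprocessing rather than this code):

```python
from nilearn import image

# Hypothetical input file; the study compared 0-10 mm kernels
# on data resampled to 3 mm and 4 mm isovoxels.
func_img = image.load_img("sub-01_task-motor_bold.nii.gz")

# fwhm=None leaves the data unsmoothed (the 0 mm condition).
smoothed = {fwhm: image.smooth_img(func_img, fwhm=fwhm)
            for fwhm in (None, 4, 6, 8, 10)}
```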


2021 ◽  
Author(s):  
Douglas D. Burman

Protocol for task-specific, effective connectivity analysis of the hippocampus using SPM12 batch files and PPI analysis. This protocol requires input from an SPM analysis on smoothed, normalized, slice-time corrected data (files with the swa prefix) using 4 mm isovoxels, with the first contrast specifying effects of interest (an F-contrast, excluding any movement covariates but specifying every experimental and control condition). The batch files in this protocol generate connectivity maps from each voxel in both the left and right hippocampus, then generate averaged connectivity maps from specified regions of the hippocampus, eliminating the need to identify seed regions a priori from methods (such as activation) not directly related to connectivity analysis. This procedure can be repeated for as many as 30 subjects. The final step creates a random-effects group analysis for each of 9 bilateral structural seeds, as described elsewhere (e.g., Burman, 2021, "Topography of hippocampal connectivity with sensorimotor cortex revealed by optimizing smoothing kernel and voxel size", doi: https://doi.org/10.1101/2020.05.14.096339); however, the connectivity maps created by the protocol allow an investigator to run a group analysis on any region of interest.
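The averaging step of such a protocol reduces many per-voxel seed maps to one structural-seed map. A minimal re-expression of that reduction in Python with nibabel (file names and the glob pattern are hypothetical; the actual protocol is implemented as SPM12 batch files in MATLAB):

```python
import glob
import nibabel as nib
import numpy as np

# Hypothetical file names: one PPI connectivity map per hippocampal seed voxel.
maps = [nib.load(f) for f in sorted(glob.glob("ppi_seedvox_*.nii"))]
data = np.stack([m.get_fdata() for m in maps], axis=-1)

# Structural-seed map = voxelwise mean over all seed-voxel maps in the region.
mean_map = nib.Nifti1Image(data.mean(axis=-1), maps[0].affine)
nib.save(mean_map, "ppi_structseed_mean.nii")
```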


2021 ◽  
Author(s):  
Sina Mansour L. ◽  
Caio Seguin ◽  
Robert E Smith ◽  
Andrew Zalesky

Structural connectomes are increasingly mapped at high spatial resolutions comprising many hundreds, if not thousands, of network nodes. However, high-resolution connectomes are particularly susceptible to image registration misalignment, tractography artifacts, and noise, all of which can reduce connectome accuracy and test-retest reliability. We investigate a network analogue of image smoothing to address these key challenges. Connectome-Based Smoothing (CBS) involves jointly applying a carefully chosen smoothing kernel to the two endpoints of each tractography streamline, yielding a spatially smoothed connectivity matrix. We develop computationally efficient methods to perform CBS using a matrix congruence transformation and evaluate the effect of a range of smoothing kernel choices on CBS performance. We find that smoothing substantially improves the identifiability, sensitivity, and test-retest reliability of high-resolution connectivity maps, though at the cost of an increased storage burden. For atlas-based connectomes (i.e., low-resolution connectivity maps), we show that CBS marginally improves the statistical power to detect associations between connectivity and cognitive performance, particularly for connectomes mapped using probabilistic tractography. CBS was also found to enable more reliable statistical inference compared to connectomes without any smoothing. We provide recommendations on optimal smoothing kernel parameters for connectomes mapped using both deterministic and probabilistic tractography. We conclude that spatial smoothing is particularly important for the reliability of high-resolution connectomes, but can also provide benefits at lower parcellation resolutions. We hope that our work enables computationally efficient integration of spatial smoothing into established structural connectome mapping pipelines.
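The congruence-transform formulation means the smoothed connectome can be written as K A K^T for a kernel matrix K acting on node space. A minimal dense sketch under simplifying assumptions: Euclidean rather than surface-based distances, and an un-tuned Gaussian kernel (the method as published is more careful about both):

```python
import numpy as np

def connectome_based_smoothing(A, coords, fwhm=6.0):
    """Sketch of CBS as a matrix congruence transform: A_s = K @ A @ K.T.

    A      : (n, n) high-resolution structural connectivity matrix
    coords : (n, 3) node coordinates; Euclidean distance is used here,
             a simplification relative to distances along the surface
    fwhm   : Gaussian kernel width, in the same units as coords
    """
    sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))   # FWHM -> sigma
    d2 = ((coords[:, None, :] - coords[None, :, :]) ** 2).sum(axis=-1)
    K = np.exp(-d2 / (2.0 * sigma ** 2))
    K /= K.sum(axis=1, keepdims=True)   # each node's smoothing weights sum to 1
    return K @ A @ K.T                  # smooths both streamline endpoints jointly
```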


Geophysics ◽  
2021 ◽  
pp. 1-56
Author(s):  
Aaron Davis

Airborne geophysical surveys routinely collect data along traverse lines at sample spacings two or more orders of magnitude smaller than the separation between lines. Grids and maps interpolated from such surveys can suffer from aliasing: features that cross flight lines can exhibit boudinage or string-of-beads artefacts. Boudinage effects can be addressed by novel gridding methods. Following developments in geostatistics, a non-stationary nested anisotropic gridding scheme is proposed that accommodates local anisotropy in survey data. Computation is reduced by placing anchor points throughout the interpolation region; these carry localised anisotropy information that is propagated across the survey area with a smoothing kernel. Additional anisotropy may be required at certain locations in the region to be gridded, so a model selection scheme is proposed that employs Laplace approximations to determine whether increased model complexity is supported by the surrounding data. The efficacy of the method is shown using a synthetic data set obtained from satellite imagery: a pseudo-geophysical survey is created from the image and reconstructed with the method above. Two case histories from airborne geophysical surveys conducted in Western Australia are selected for further elucidation. The first example illustrates improved gridding of the depth of palaeochannels interpreted from along-line conductivity-depth models of a regional airborne electromagnetic survey in the Mid-West. The second example shows how improved grids of aeromagnetic data and inverted electrical conductivity can be produced from an airborne electromagnetic survey conducted in the Pilbara. In both case histories, nested anisotropic kriging reduces the expression of boudinage patterns and sharpens cross-line features in the final gridded products, permitting increased confidence in interpretations based on those products.
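The anchor-point idea amounts to a kernel-weighted spread of locally estimated anisotropy parameters over the survey area. A hedged sketch of that propagation step only (function and variable names are illustrative, not the published scheme; the orientation angle is encoded as (cos 2θ, sin 2θ) so that averaging does not suffer from angle wraparound):

```python
import numpy as np

def propagate_anisotropy(anchor_xy, anchor_params, grid_xy, bandwidth):
    """Spread anchor-point anisotropy across a survey area with a
    Gaussian smoothing kernel (illustrative sketch only).

    anchor_xy     : (m, 2) anchor-point locations
    anchor_params : (m, p) local anisotropy per anchor, e.g.
                    (cos 2*theta, sin 2*theta, anisotropy ratio)
    grid_xy       : (n, 2) target grid locations
    bandwidth     : kernel length scale in map units
    """
    d2 = ((grid_xy[:, None, :] - anchor_xy[None, :, :]) ** 2).sum(axis=-1)
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))
    w /= w.sum(axis=1, keepdims=True)   # normalise weights per grid node
    return w @ anchor_params            # (n, p) interpolated anisotropy
```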


2021 ◽  
Vol 0 (0) ◽  
Author(s):  
Henrik Schlüter ◽  
Florian Gayk ◽  
Heinz-Jürgen Schmidt ◽  
Andreas Honecker ◽  
Jürgen Schnack

Abstract: Trace estimators allow us to approximate thermodynamic equilibrium observables with astonishing accuracy. A prominent representative is the finite-temperature Lanczos method (FTLM), which relies on a Krylov-space expansion of the exponential describing the Boltzmann weights. Here we report investigations of an alternative approach that employs Chebyshev polynomials. This method also turns out to be very accurate in general, but shows systematic inaccuracies at low temperatures that can be traced back to improper behavior of the approximated density of states, with and without a smoothing kernel. Applications to archetypal quantum spin systems are discussed as examples.
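A stochastic trace estimator combined with a Chebyshev expansion of the Boltzmann factor looks roughly as follows. This is a minimal dense-matrix sketch, not the authors' implementation; it assumes the Hamiltonian's spectrum has already been rescaled to [-1, 1], where the Chebyshev coefficients of exp(-beta*x) are modified Bessel functions:

```python
import numpy as np
from scipy.special import iv   # modified Bessel functions of the first kind

def chebyshev_trace_exp(H, beta, order=80, samples=20, seed=0):
    """Hutchinson estimate of the partition function Z = Tr exp(-beta*H)
    via a Chebyshev expansion. H: dense symmetric, spectrum in [-1, 1]."""
    rng = np.random.default_rng(seed)
    n = H.shape[0]
    # exp(-beta*x) = I_0(beta)*T_0(x) + 2*sum_k (-1)^k I_k(beta)*T_k(x)
    c = np.array([(2 - (k == 0)) * (-1) ** k * iv(k, beta)
                  for k in range(order)])
    Z = 0.0
    for _ in range(samples):
        v = rng.choice([-1.0, 1.0], size=n)   # Rademacher probe vector
        t_prev, t_curr = v, H @ v             # T_0(H) v and T_1(H) v
        acc = c[0] * (v @ t_prev) + c[1] * (v @ t_curr)
        for k in range(2, order):             # T_k = 2 H T_{k-1} - T_{k-2}
            t_prev, t_curr = t_curr, 2.0 * (H @ t_curr) - t_prev
            acc += c[k] * (v @ t_curr)
        Z += acc / samples
    return Z
```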


Author(s):  
Игорь Иванович Потапов ◽  
Ольга Владимировна Решетникова

In this work, the smoothed particle hydrodynamics (SPH) method is used to model the motion of a granular medium. A new composite kernel of low connectivity is proposed for approximating the sought functions. The kernel design is driven by the requirement that the density of a single SPH particle be conserved; satisfying this condition makes it possible to model the density field correctly at the boundaries of the computational domain, as well as in cases of structural changes in the framework of the granular particles. From an SPH solution of the hydrostatics problem, an estimate of the kernel's smoothing-length scale for the two-dimensional case is obtained. The collapse of a granular "column" is computed, and the numerical results are compared with experimental data. The purpose of the study is to improve the practice of the SPH methodology applied to modelling motion in various media. The SPH approximation of function fields is built from the forms of the smoothing kernel and its derivatives, and popular kernel forms suffer from significant approximation errors when modelling granular media. Methodology. The state of the granular medium is described by the classical equations of motion and mass conservation. Each granule of the medium corresponds to a separate SPH particle. To approximate the density and pressure fields at an SPH particle, a new combination of the smoothing kernel and its first derivative is proposed. Results. The proposed composite kernel fulfills the conditions of mass conservation and density recovery at the particle during SPH modeling. It is shown that the new composite kernel has a minimal pressure-gradient approximation error of about 2%. A new estimate of the elastic wave propagation speed in the medium, sufficient to obtain a correct numerical solution, is proposed. A comparative analysis of the obtained solutions against experimental data is made. Findings. The proposed composite smoothing-kernel shape allows correct simulation of the motion of a granular medium by the SPH method. Its compactness (unit smoothing radius and unit smoothing length) makes it possible to reconstruct the density field correctly at the boundaries of the computational domain and in cases of structural changes in the framework of the granular medium. The numerical solution of the collapse of a column of granules obtained with the proposed composite kernel shows good agreement with experimental data.
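For reference, the SPH density summation that such a kernel design feeds into has the standard form rho_i = sum_j m_j W(|x_i - x_j|, h). A minimal 2-D sketch with the common cubic-spline kernel follows; the authors' composite kernel itself is not fully specified in the abstract, so the standard kernel stands in for it here:

```python
import numpy as np

def cubic_spline_w(r, h):
    """Standard 2-D cubic-spline SPH kernel (Monaghan form); this is
    NOT the composite kernel proposed in the paper."""
    q = r / h
    sigma = 10.0 / (7.0 * np.pi * h ** 2)   # 2-D normalization constant
    w = np.where(q < 1.0, 1.0 - 1.5 * q ** 2 + 0.75 * q ** 3,
        np.where(q < 2.0, 0.25 * (2.0 - q) ** 3, 0.0))
    return sigma * w

def sph_density(x, m, h):
    """Summation density: rho_i = sum_j m_j W(|x_i - x_j|, h).

    x : (n, 2) particle positions, m : (n,) particle masses,
    h : smoothing length."""
    r = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    return (m[None, :] * cubic_spline_w(r, h)).sum(axis=1)
```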


Author(s):  
Santosh Kumar ◽  
Nitendra Kumar ◽  
Khursheed Alam

Background: Deblurring and denoising are among the most challenging problems in image processing; in particular, deblurring an image degraded by a spatially invariant kernel is a frequent task. Methods: For deblurring and denoising, the total variation (TV norm) and nonlinear anisotropic diffusion models are powerful tools. In this paper, nonlinear anisotropic diffusion models for image denoising and deblurring are proposed. The models are developed as follows: first, the diffusion term in the anisotropic diffusion model is multiplied by the magnitude of the gradient, and then a priori smoothness is imposed on the solution image through a Gaussian smoothing kernel. Results: The finite difference method is used to discretize the anisotropic diffusion models with forward-backward diffusivities. Conclusion: The results of the proposed models are presented in terms of the improvement achieved.
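A regularized anisotropic diffusion step of this general family, with the diffusivity computed from a Gaussian-presmoothed gradient, can be sketched as follows. This is a generic Catte/Perona-Malik-style scheme for illustration, not the authors' exact model, and all parameter values are placeholders:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def anisotropic_diffusion(u, n_iter=50, kappa=0.1, dt=0.2, sigma=1.0):
    """Explicit finite-difference anisotropic diffusion on a 2-D image u.

    The edge-stopping diffusivity g is evaluated on a Gaussian-presmoothed
    gradient, echoing the a priori smoothness step described above."""
    u = u.astype(float).copy()
    for _ in range(n_iter):
        us = gaussian_filter(u, sigma)                 # Gaussian smoothing kernel
        gx, gy = np.gradient(us)
        g = 1.0 / (1.0 + (gx ** 2 + gy ** 2) / kappa ** 2)  # diffusivity
        # explicit update: u <- u + dt * div(g * grad u)
        ux, uy = np.gradient(u)
        div = np.gradient(g * ux, axis=0) + np.gradient(g * uy, axis=1)
        u += dt * div
    return u
```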

