dense features
Recently Published Documents


TOTAL DOCUMENTS

27
(FIVE YEARS 12)

H-INDEX

4
(FIVE YEARS 2)

2021 ◽  
Author(s):  
Yu Zhang ◽  
Lihuo He ◽  
Wen Lu ◽  
Jie Li ◽  
Xinbo Gao

Entropy ◽  
2020 ◽  
Vol 22 (11) ◽  
pp. 1299
Author(s):  
Simona Moldovanu ◽  
Lenuta Pană Toporaș ◽  
Anjan Biswas ◽  
Luminita Moraru

A new solution to overcome the constraints of multimodal intra-subject medical image registration is proposed, using the mutual information (MI) of image histograms of oriented gradients as a new matching criterion. We present a rigid, multimodal image registration algorithm based on linear transformation and oriented gradients for the alignment of T2-weighted (T2w) images (as a fixed reference) and diffusion tensor imaging (DTI) volumes (b-values of 500 and 1250 s/mm2) as floating images of three patients, compensating for motion during the acquisition process. Diffusion MRI is very sensitive to motion, especially as the intensity and duration of the gradient pulses (characterized by the b-value) increase. The proposed method relies on the whole brain surface and addresses the variability of anatomical features across an image stack. The sparse features are corners detected with the Harris corner detector, while the dense features use all image pixels through the histogram of oriented gradients (HOG) as a measure of the degree of statistical dependence between a pair of registered images. As a dense feature, HOG captures structure by extracting the oriented gradients of the image in the x and y directions. MI serves as the objective function for the optimization process; the marginal and joint entropies are computed from the HOG data. The best image transformation is selected using the fiducial registration error (FRE). We compare the results against intensity-based MI results computed from the statistical intensity relationship between corresponding pixels in the source and target images. Our whole-brain approach shows improved registration accuracy, robustness, and computational cost compared with registration algorithms that rely on anatomical features or regions of interest with specific neuroanatomy. Despite the supplementary HOG computation, the computation time is comparable for the intensity-based and HOG-based MI methods.
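The matching criterion described above can be sketched in a few lines: quantize gradient orientations (a simplified stand-in for the full HOG descriptor), then measure the mutual information between the two orientation maps via their joint histogram. The function names and parameter choices below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def orientation_histogram(img, bins=9):
    """Quantize the gradient orientation of each pixel into one of `bins`
    labels over [0, pi) -- a simplified stand-in for HOG descriptors."""
    gy, gx = np.gradient(img.astype(float))
    ang = np.mod(np.arctan2(gy, gx), np.pi)
    edges = np.linspace(0.0, np.pi, bins + 1)[1:-1]   # inner bin edges
    return np.digitize(ang, edges)                    # labels 0 .. bins-1

def mutual_information(a, b, bins=9):
    """MI between two label maps, estimated from their joint histogram."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)               # marginal of a
    py = pxy.sum(axis=0, keepdims=True)               # marginal of b
    nz = pxy > 0                                      # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())
```

In a registration loop, an optimizer would adjust the rigid transform of the floating image to maximize `mutual_information` between the fixed and transformed orientation maps; MI is largest when the two maps are statistically most dependent, e.g. when an image is compared with itself.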


2020 ◽  
Vol 495 (2) ◽  
pp. 1672-1691
Author(s):  
Thomas J R Bending ◽  
Clare L Dobbs ◽  
Matthew R Bate

ABSTRACT We present simulations of a 500 pc² region, containing gas of mass 4 × 10⁶ M⊙, extracted from an entire spiral galaxy simulation, scaled up in resolution, and including photoionizing feedback from stars of mass >18 M⊙. Our region is evolved for 10 Myr and shows clustered star formation along the arm, generating ≈5000 cluster sink particles, ≈5 per cent of which contain at least one of the ≈4000 stars of mass >18 M⊙. Photoionization has a noticeable effect on the gas in the region, producing ionized cavities and leading to dense features at the edges of the H ii regions. Compared to the no-feedback case, photoionization produces a larger total mass of clouds and clumps, with around twice as many such objects, which are individually smaller and more broken up. After this we see a rapid decrease in the total mass in clouds and in the number of clouds. Unlike studies of isolated clouds, our simulations follow the long-range effects of ionization, with some already dense gas becoming compressed from multiple sides by neighbouring H ii regions. This causes star formation that is both accelerated and partially displaced throughout the spiral arm, with up to 30 per cent of our cluster sink particle mass forming at distances >5 pc from the sites of sink formation in the absence of feedback. At later times, the star formation rate decreases to below that of the no-feedback case.


Author(s):  
Han Zhang ◽  
Lin Lei ◽  
Weiping Ni ◽  
Tao Tang ◽  
Junzheng Wu ◽  
...  

Author(s):  
Xiaowang Zhang ◽  
Qiang Gao ◽  
Zhiyong Feng

In this paper, we present a neural network (InteractionNN) for sparse predictive analysis, in which hidden features of sparse data are learned through multilevel feature interaction. To characterize the multilevel interaction of features, InteractionNN consists of three modules: nonlinear interaction pooling, layer-lossing, and embedding. Nonlinear interaction pooling (NI pooling) is a hierarchical structure that, via shortcut connections, constructs low-level feature interactions from basic dense features up to elementary features. Layer-lossing is a feed-forward neural network in which high-level feature interactions are learned from low-level ones by correlating all layers with the target. Finally, the embedding module extracts basic dense features from the sparse features of the data, which helps reduce the computational complexity of the proposed model. We evaluate InteractionNN on two benchmark datasets, and the experimental results show that it outperforms most state-of-the-art models in sparse regression.
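The embedding and interaction steps described above can be sketched with NumPy. The paper's exact module definitions are not reproduced here, so this is a sketch under stated assumptions: the embedding module is a lookup table mapping sparse feature ids to dense vectors, and pairwise elementwise products of those vectors stand in for one level of NI pooling; all sizes and names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative sizes (assumptions, not from the paper)
n_sparse, dim = 1000, 8          # sparse vocabulary size, embedding width
fields = [3, 17, 256]            # active sparse feature ids for one sample

# Embedding module: map sparse feature ids to basic dense features
emb_table = rng.normal(scale=0.1, size=(n_sparse, dim))
dense = emb_table[fields]        # shape: (n_fields, dim)

# Simplified stand-in for NI pooling: pairwise elementwise products
# of the dense features, summed into one low-level interaction vector.
i, j = np.triu_indices(len(fields), k=1)
interactions = (dense[i] * dense[j]).sum(axis=0)    # shape: (dim,)

# Shortcut connection: concatenate raw dense features with interactions
# as input to the subsequent feed-forward (layer-lossing) network.
hidden_in = np.concatenate([dense.ravel(), interactions])
```

The key design point this illustrates is why the embedding module reduces cost: interactions are computed on a few short dense vectors rather than on the full sparse input dimension.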

