Bayesian Gene Selection Based on Pathway Information and Network-Constrained Regularization

2021 ◽  
Vol 2021 ◽  
pp. 1-9
Author(s):  
Ming Cao ◽  
Yue Fan ◽  
Qinke Peng

High-throughput data make it possible to study the expression levels of thousands of genes simultaneously under a particular condition. However, only a few of these genes are discriminatively expressed, and identifying these biomarkers precisely is important for disease diagnosis, prognosis, and therapy. Many studies have utilized pathway information to identify biomarkers, but most incorporate only the group information while the pathway structural information is ignored. In this paper, we propose a Bayesian gene selection method with network-constrained regularization, which incorporates pathway structural information as a prior to perform gene selection. All the priors are conjugate, so the parameters can be estimated efficiently through Gibbs sampling. We present the application of our method on six microarray datasets, comparing it with Bayesian Lasso, Bayesian Elastic Net, and Bayesian Fused Lasso. The results show that our method outperforms the other Bayesian methods and that pathway structural information improves the results.
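The network-constrained penalty underlying such pathway priors can be illustrated with the graph Laplacian of the pathway network: the penalty is small when coefficients of neighbouring genes are close. A minimal NumPy sketch with a hypothetical three-gene chain pathway (not the paper's actual model):

```python
import numpy as np

def laplacian_penalty(adj, beta):
    """Network-constrained penalty beta' L beta, where L = D - A is the
    Laplacian of the pathway graph. It equals the sum of (beta_i - beta_j)^2
    over pathway edges, so it favours smooth coefficients over the network."""
    adj = np.asarray(adj, dtype=float)
    lap = np.diag(adj.sum(axis=1)) - adj
    beta = np.asarray(beta, dtype=float)
    return float(beta @ lap @ beta)

# Hypothetical chain pathway gene1 - gene2 - gene3.
chain = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]
smooth = laplacian_penalty(chain, [1.0, 1.0, 1.0])  # equal coefficients: 0.0
rough = laplacian_penalty(chain, [1.0, 0.0, 1.0])   # (1-0)^2 + (0-1)^2 = 2.0
```

In the Bayesian formulation, this quadratic form enters the prior on the coefficients, so the pathway topology shapes the posterior directly.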

2014 ◽  
Author(s):  
Hong-Dong Li ◽  
Qing-Song Xu ◽  
Yi-Zeng Liang

Identifying a small subset of discriminative genes is important for predicting clinical outcomes and facilitating disease diagnosis. Based on the model population analysis framework, we present a method, called PHADIA, which outputs a phase diagram displaying the predictive ability of each variable, providing an intuitive way to select informative variables. Using two publicly available microarray datasets, we demonstrate that our method selects only a few informative genes and achieves significantly better or comparable classification accuracy compared with results reported in the literature. The source codes are freely available at: www.libpls.net.


2021 ◽  
Vol 49 (1) ◽  
pp. 030006052098284
Author(s):  
Tingting Qiao ◽  
Simin Liu ◽  
Zhijun Cui ◽  
Xiaqing Yu ◽  
Haidong Cai ◽  
...  

Objective To construct deep learning (DL) models to improve the accuracy and efficiency of thyroid disease diagnosis by thyroid scintigraphy. Methods We constructed DL models with AlexNet, VGGNet, and ResNet. The models were trained separately with transfer learning. We measured each model’s performance with six indicators: recall, precision, negative predictive value (NPV), specificity, accuracy, and F1-score. We also compared the diagnostic performances of first- and third-year nuclear medicine (NM) residents with assistance from the best-performing DL-based model. The Kappa coefficient and average classification time of each model were compared with those of two NM residents. Results The recall, precision, NPV, specificity, accuracy, and F1-score of the three models ranged from 73.33% to 97.00%. The Kappa coefficient of all three models was >0.710. All models performed better than the first-year NM resident but not as well as the third-year NM resident in terms of diagnostic ability. However, the ResNet model provided “diagnostic assistance” to the NM residents. The models provided results at speeds 400 to 600 times faster than the NM residents. Conclusion DL-based models perform well in diagnostic assessment by thyroid scintigraphy. These models may serve as tools for NM residents in the diagnosis of Graves’ disease and subacute thyroiditis.
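The six indicators used above follow directly from the four counts of a binary confusion matrix; a minimal sketch (the counts are hypothetical, not the study's data):

```python
def binary_metrics(tp, fp, tn, fn):
    """Recall, precision, NPV, specificity, accuracy and F1-score
    computed from the confusion-matrix counts of a binary classifier."""
    recall = tp / (tp + fn)                      # sensitivity
    precision = tp / (tp + fp)                   # positive predictive value
    npv = tn / (tn + fn)                         # negative predictive value
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return {"recall": recall, "precision": precision, "NPV": npv,
            "specificity": specificity, "accuracy": accuracy, "F1": f1}

m = binary_metrics(tp=90, fp=10, tn=80, fn=20)
```

For multi-class tasks such as distinguishing Graves' disease from subacute thyroiditis, these are typically computed per class in one-vs-rest fashion and then averaged.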


2020 ◽  
Vol 2020 ◽  
pp. 1-10
Author(s):  
Dandan Yang

This paper investigates three-way clustering involving fuzzy covering, threshold acquisition, and boundary-region processing. First, a valid fuzzy covering of the universe is constructed on the basis of an appropriate fuzzy similarity relation, which helps capture the structural information and internal connections of the dataset from a global perspective. Owing to these advantages, we use the valid fuzzy covering instead of the raw dataset for RFCM-based three-way clustering. Subsequently, from the perspective of balancing uncertainty changes in fuzzy sets, a method of partition-threshold acquisition combining linear and nonlinear fuzzy entropy theory is proposed. Furthermore, boundary regions in three-way clustering correspond to abstaining decisions and generate uncertain rules. To improve classification accuracy, the k-nearest neighbor (kNN) algorithm is utilized to reduce the number of objects in the boundary regions. The experimental results show that the proposed three-way clustering based on fuzzy covering and the kNN-FRFCM algorithm outperforms the compared algorithms in most cases.
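The boundary-region step can be illustrated in isolation: objects left in the abstaining (boundary) region are reassigned to the majority cluster among their k nearest core-region neighbours. A minimal NumPy sketch with hypothetical points (the paper's actual algorithm operates on the fuzzy covering, not raw coordinates):

```python
import numpy as np

def assign_boundary_knn(core_pts, core_labels, boundary_pts, k=3):
    """Reassign each boundary object to the majority cluster label among
    its k nearest core-region neighbours (Euclidean distance)."""
    core_pts = np.asarray(core_pts, dtype=float)
    core_labels = np.asarray(core_labels)
    out = []
    for x in np.asarray(boundary_pts, dtype=float):
        dist = np.linalg.norm(core_pts - x, axis=1)
        votes = core_labels[np.argsort(dist)[:k]]   # labels of k nearest cores
        vals, counts = np.unique(votes, return_counts=True)
        out.append(vals[np.argmax(counts)])         # majority vote
    return np.array(out)

core = [[0, 0], [0, 1], [1, 0], [10, 10], [10, 11], [11, 10]]
labels = [0, 0, 0, 1, 1, 1]
assigned = assign_boundary_knn(core, labels, [[1, 1], [9, 9]], k=3)
```

Each boundary object is thereby moved into a core region, shrinking the uncertain zone that would otherwise generate abstaining decisions.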


2018 ◽  
Vol 14 (6) ◽  
pp. 868-880 ◽  
Author(s):  
Shilan S. Hameed ◽  
Fahmi F. Muhammad ◽  
Rohayanti Hassan ◽  
Faisal Saeed

2018 ◽  
Vol 8 (9) ◽  
pp. 1569 ◽  
Author(s):  
Shengbing Wu ◽  
Hongkun Jiang ◽  
Haiwei Shen ◽  
Ziyi Yang

In recent years, gene selection for cancer classification based on the expression of a small number of gene biomarkers has been the subject of much research in genetics and molecular biology. The successful identification of gene biomarkers will help in the classification of different types of cancer and improve prediction accuracy. Recently, regularized logistic regression with the L1 penalty has been successfully applied to high-dimensional cancer classification, tackling both the estimation of gene coefficients and the simultaneous performance of gene selection. However, the L1 penalty yields biased gene selection and does not have the oracle property. To address these problems, we investigate L1/2-regularized logistic regression for gene selection in cancer classification. Experimental results on three DNA microarray datasets demonstrate that our proposed method outperforms other commonly used sparse methods (L1 and LEN) in terms of classification performance.
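As a point of reference for the L1 baseline, sparse logistic regression can be fitted by proximal gradient descent with the soft-thresholding operator; the L1/2 penalty replaces soft-thresholding with a half-thresholding operator. A minimal NumPy sketch of the L1 case on hypothetical toy data (not the paper's method or datasets):

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of the L1 penalty: shrink toward zero by t."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def l1_logistic(X, y, lam=0.05, lr=0.1, n_iter=2000):
    """L1-penalized logistic regression fitted by proximal gradient (ISTA)."""
    X, y = np.asarray(X, dtype=float), np.asarray(y, dtype=float)
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ w))      # predicted probabilities
        grad = X.T @ (p - y) / len(y)         # logistic-loss gradient
        w = soft_threshold(w - lr * grad, lr * lam)
    return w

# Hypothetical data: column 0 ("biomarker") determines the label, column 1 is noise.
X = [[1.0, 0.1], [2.0, -0.2], [1.5, 0.0], [-1.0, 0.3], [-2.0, -0.1], [-1.5, 0.2]]
y = [1, 1, 1, 0, 0, 0]
w = l1_logistic(X, y)
```

The thresholding step is exactly what drives uninformative gene coefficients to zero; the non-convex L1/2 penalty shrinks large coefficients less, which is what reduces the selection bias noted above.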


2021 ◽  
Vol 2021 ◽  
pp. 1-8
Author(s):  
Zhuxiang Shen ◽  
Wei Li ◽  
Hui Han

To explore the use of the convolutional neural network (CNN) and the wavelet transform in ultrasonic image denoising, and the influence of an optimized wavelet threshold function (WTF) algorithm on image denoising, we first study the imaging principle of ultrasound images. Owing to the physics of ultrasound imaging, inherent speckle noise seriously degrades image quality. We analyze the denoising principle of the WTF based on the wavelet transform and, building on the traditional threshold function, propose an optimized WTF algorithm and apply it in simulation experiments on ultrasound images. Quantitative and qualitative comparison with the traditional threshold function shows the advantages of the optimized WTF. For images denoised by the optimized WTF, the mean square error (MSE), peak signal-to-noise ratio (PSNR), and structural similarity index (SSIM) are 20.796, 34.294 dB, and 0.672, respectively. The denoising effect is better than that of the traditional threshold function: the image is denoised to the maximum extent without losing image information. In addition, the optimized function is applied to actual medical image processing, denoising ultrasound images of arteries and kidneys separately. The quality of the denoised images is better than that of the originals, and the extraction of effective information is more accurate. In summary, the optimized WTF algorithm not only removes substantial noise but also yields better visual quality. It has value in assisting doctors with disease diagnosis, so it can be widely applied in clinics.
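Two building blocks of this pipeline are easy to state concretely: the classic soft wavelet-threshold function that the optimized WTF refines, and the PSNR metric used for the quantitative comparison. A minimal NumPy sketch (the values below are illustrative, not the paper's results):

```python
import numpy as np

def soft_threshold(coeffs, t):
    """Classic soft wavelet-threshold function: coefficients below the
    threshold t are set to zero, larger ones are shrunk toward zero by t."""
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - t, 0.0)

def psnr(ref, img, peak=255.0):
    """Peak signal-to-noise ratio in dB between a reference and a test image."""
    mse = np.mean((np.asarray(ref, dtype=float) - np.asarray(img, dtype=float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

shrunk = soft_threshold(np.array([0.5, -2.0]), 1.0)  # -> [0.0, -1.0]

ref = np.zeros((8, 8))
noisy = ref + 1.0           # every pixel off by one grey level, so MSE = 1
value = psnr(ref, noisy)    # 10 * log10(255^2) ≈ 48.13 dB
```

In a full denoiser, the image is decomposed by a wavelet transform, the detail coefficients are thresholded as above, and the image is reconstructed by the inverse transform; the "optimized" WTF modifies how the shrinkage behaves around the threshold.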


2020 ◽  
Vol 34 (04) ◽  
pp. 6704-6711
Author(s):  
Zheng Yu ◽  
Xuhui Fan ◽  
Marcin Pietrasik ◽  
Marek Z. Reformat

The Mixed-Membership Stochastic Blockmodel (MMSB) is one of the state-of-the-art Bayesian relational methods for learning the complex hidden structure underlying network data. However, the current formulation of MMSB suffers from two issues: (1) prior information (e.g., entities' community structure) cannot be readily embedded in the modelling, and (2) community evolution is not well described in the literature. We therefore propose a non-parametric fragmentation-coagulation-based Mixed-Membership Stochastic Blockmodel (fcMMSB). Our model simultaneously performs entity-based clustering to capture community information for entities and linkage-based clustering to derive group information for links. In addition, the proposed model infers the network structure and models community evolution, manifested by the appearance and disappearance of communities, using the discrete fragmentation-coagulation process (DFCP). By integrating the community structure with the group-compatibility matrix, we derive a generalized version of MMSB. An efficient Gibbs sampling scheme with the Polya-Gamma (PG) approach is implemented for posterior inference. We validate our model on synthetic and real-world data.
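The MMSB generative process that fcMMSB generalizes can be sketched directly; a minimal NumPy illustration with hypothetical membership vectors and role-compatibility matrix (fcMMSB itself adds the fragmentation-coagulation machinery on top of this):

```python
import numpy as np

def sample_mmsb(pi, B, rng):
    """One draw from the MMSB generative process: for each ordered pair
    (p, q), sender p draws a role from its membership vector pi[p], receiver
    q draws a role from pi[q], and a directed link appears with probability
    B[role_p, role_q] given by the role-compatibility matrix B."""
    pi, B = np.asarray(pi, dtype=float), np.asarray(B, dtype=float)
    n, k = pi.shape
    adj = np.zeros((n, n), dtype=int)
    for p in range(n):
        for q in range(n):
            if p == q:
                continue
            zp = rng.choice(k, p=pi[p])              # sender's role for this pair
            zq = rng.choice(k, p=pi[q])              # receiver's role
            adj[p, q] = int(rng.random() < B[zp, zq])
    return adj

rng = np.random.default_rng(0)
pi = [[1, 0], [1, 0], [0, 1]]   # pure (non-mixed) memberships for clarity
B = [[1, 0], [0, 1]]            # links occur only within a community
A = sample_mmsb(pi, B, rng)
```

With pure memberships and a 0/1 compatibility matrix the draw is deterministic: nodes 0 and 1 link to each other, and neither links to node 2. Posterior inference reverses this process, recovering pi and B from an observed adjacency matrix.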


2000 ◽  
Vol 6 (S2) ◽  
pp. 1192-1193 ◽  
Author(s):  
Michael A. O'Keefe

Transmission electron microscopy at a resolution of 0.89 Å has been achieved at the National Center for Electron Microscopy and is available to electron microscopists who require this level of resolution. Development of this capability commenced in 1993, when the National Center for Electron Microscopy agreed to fund a proposal for a unique facility, a one-Ångstrom microscope (OÅM). The OÅM project provides materials scientists with transmission electron microscopy at a resolution better than one Ångstrom by exploiting the significantly higher information limit of a FEG-TEM relative to its Scherzer resolution limit. To turn the misphased information beyond the Scherzer limit into useful resolution, the OÅM requires extensive image reconstruction. One method chosen was reconstruction from off-axis holograms; another was reconstruction from focal series of underfocused images. The OÅM is thus properly a combination of a FEG-TEM (a CM300FEG-UT) and computer software able to generate sub-Ångstrom images from experimental images obtained on the FEG-TEM. Before the advent of the OÅM, NCEM microscopists relied on image simulation to obtain structural information beyond the TEM resolution limit.

