Computational Methods for High-Dimensional Rotations in Data Visualization

Author(s): Andreas Buja, Dianne Cook, Daniel Asimov, Catherine Hurley

1996 ◽ Vol 5 (1) ◽ pp. 78
Author(s): Andreas Buja, Dianne Cook, Deborah F. Swayne

2015 ◽ Vol 150 ◽ pp. 570-582
Author(s): Hannah Kim, Jaegul Choo, Chandan K. Reddy, Haesun Park

2021 ◽ Vol 12
Author(s): Jianping Zhao, Na Wang, Haiyun Wang, Chunhou Zheng, Yansen Su

Dimensionality reduction of high-dimensional data is crucial for single-cell RNA sequencing (scRNA-seq) visualization and clustering. A prominent challenge in scRNA-seq studies is dropout events, which produce zero-inflated data. To address this issue, we propose a scRNA-seq dimensionality-reduction algorithm based on a hierarchical autoencoder, termed SCDRHA. SCDRHA consists of two core modules: a deep count autoencoder (DCA) that denoises the data, followed by a graph autoencoder that projects the data into a low-dimensional space. Experimental results on five real scRNA-seq datasets demonstrate that SCDRHA outperforms existing state-of-the-art algorithms at dimensionality reduction and denoising. Moreover, SCDRHA dramatically improves data visualization and cell clustering.
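The two-stage structure described in the abstract (denoise first, then embed) can be sketched with simple stand-ins. This is a minimal illustration only, not SCDRHA itself: kNN-based zero imputation stands in for the deep count autoencoder, and power-iteration PCA stands in for the graph autoencoder; the function names (`denoise`, `pca_embed`, `scdrha_like`) are hypothetical.

```python
import math

def denoise(counts, k=3):
    """Stage-1 stand-in for the DCA: impute zeros (candidate dropouts)
    with the mean of each cell's k nearest cells."""
    n = len(counts)
    out = []
    for i, cell in enumerate(counts):
        nbrs = sorted((j for j in range(n) if j != i),
                      key=lambda j: math.dist(counts[i], counts[j]))[:k]
        out.append([v if v > 0 else sum(counts[j][g] for j in nbrs) / k
                    for g, v in enumerate(cell)])
    return out

def _cov_matvec(Xc, v):
    """Return (X^T X) v without forming the covariance matrix."""
    n, d = len(Xc), len(Xc[0])
    t = [sum(row[g] * v[g] for g in range(d)) for row in Xc]   # X v
    return [sum(Xc[i][g] * t[i] for i in range(n)) for g in range(d)]  # X^T t

def pca_embed(X, n_comp=2, iters=200):
    """Stage-2 stand-in for the graph autoencoder: project the denoised
    matrix onto its top principal components via power iteration."""
    n, d = len(X), len(X[0])
    means = [sum(row[g] for row in X) / n for g in range(d)]
    Xc = [[row[g] - means[g] for g in range(d)] for row in X]
    comps = []
    for _ in range(n_comp):
        v = [float(g + 1) for g in range(d)]   # deterministic non-uniform start
        for _ in range(iters):
            w = _cov_matvec(Xc, v)
            for c in comps:                     # deflate found components
                proj = sum(w[g] * c[g] for g in range(d))
                w = [w[g] - proj * c[g] for g in range(d)]
            norm = math.sqrt(sum(x * x for x in w)) or 1.0
            v = [x / norm for x in w]
        comps.append(v)
    return [[sum(Xc[i][g] * c[g] for g in range(d)) for c in comps]
            for i in range(n)]

def scdrha_like(counts, k=3, n_comp=2):
    """Two-stage pipeline in the spirit of SCDRHA: denoise, then embed."""
    return pca_embed(denoise(counts, k), n_comp)
```

Running `scdrha_like` on a toy matrix with two cell types and scattered zeros pulls the two groups apart along the first embedding coordinate, which is the qualitative behavior the hierarchical design aims for.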


2020 ◽ Vol 3 (1) ◽ pp. 339-364
Author(s): Brian Hie, Joshua Peters, Sarah K. Nyquist, Alex K. Shalek, Bonnie Berger, ...

Single-cell RNA sequencing (scRNA-seq) has provided a high-dimensional catalog of millions of cells across species and diseases. These data have spurred the development of hundreds of computational tools to derive novel biological insights. Here, we outline the components of scRNA-seq analytical pipelines and the computational methods that underlie these steps. We describe available methods, highlight well-executed benchmarking studies, and identify opportunities for additional benchmarking studies and computational methods. As the biochemical approaches for single-cell omics advance, we propose coupled development of robust analytical pipelines suited for the challenges that new data present and principled selection of analytical methods that are suited for the biological questions to be addressed.
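The pipeline components the review outlines can be made concrete with a toy skeleton. This is a hedged sketch of the standard stage ordering (quality control, normalization, feature selection), not any specific tool's API; all function names and thresholds here are illustrative assumptions.

```python
import math

def filter_cells(counts, min_total=20):
    """Quality control: drop cells whose total count is implausibly low."""
    return [cell for cell in counts if sum(cell) >= min_total]

def normalize_log(counts, scale=10_000):
    """Depth-normalise each cell to a fixed library size, then log1p-transform
    (a common normalisation scheme in scRNA-seq pipelines)."""
    out = []
    for cell in counts:
        total = sum(cell) or 1
        out.append([math.log1p(v * scale / total) for v in cell])
    return out

def select_variable_genes(X, n_top=2):
    """Feature selection: keep the n_top genes with the highest variance."""
    n, d = len(X), len(X[0])
    def var(g):
        m = sum(row[g] for row in X) / n
        return sum((row[g] - m) ** 2 for row in X) / n
    keep = sorted(sorted(range(d), key=var, reverse=True)[:n_top])
    return [[row[g] for g in keep] for row in X]

def pipeline(counts, min_total=20, n_top=2):
    """Minimal pipeline skeleton: QC -> normalise -> feature selection.
    Downstream steps (dimensionality reduction, clustering, annotation)
    would consume the returned matrix."""
    return select_variable_genes(
        normalize_log(filter_cells(counts, min_total)), n_top)
```

Benchmarking studies of the kind the review highlights typically compare alternative choices at each of these stages while holding the rest of the pipeline fixed.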


Algorithms ◽ 2020 ◽ Vol 13 (5) ◽ pp. 109
Author(s): Marian B. Gorzałczany, Filip Rudziński

In this paper, we briefly present several modifications and generalizations of the concept of self-organizing neural networks, usually referred to as self-organizing maps (SOMs), to illustrate their advantages in applications ranging from high-dimensional data visualization to complex data clustering. Starting from conventional SOMs, we discuss Growing SOMs (GSOMs), Growing Grid Networks (GGNs), the Incremental Grid Growing (IGG) approach, and the Growing Neural Gas (GNG) method, as well as our two original solutions: Generalized SOMs with 1-Dimensional Neighborhood (GeSOMs with 1DN, also referred to as Dynamic SOMs (DSOMs)) and Generalized SOMs with Tree-Like Structures (GeSOMs with T-LSs). These are characterized in terms of (i) the modification mechanisms used, (ii) the range of network modifications introduced, (iii) the structure regularity, and (iv) data-visualization/data-clustering effectiveness. The performance of the particular solutions is illustrated and compared on selected data sets. We also show that the proposed original solutions, i.e., GeSOMs with 1DN (DSOMs) and GeSOMs with T-LSs, outperform alternative approaches in various complex clustering tasks, providing up to a 20% increase in clustering accuracy. The contribution of this work is threefold. First, algorithm-oriented original computer implementations of the particular SOM generalizations are developed. Second, their detailed simulation results are presented and discussed. Third, the advantages of our aforementioned original solutions are demonstrated.
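To ground the family of methods the abstract surveys, here is a minimal conventional SOM with a 1-D chain of units, the starting point from which the growing and generalized variants depart. This is an illustrative sketch only, not the paper's GeSOM/DSOM algorithms; the names `train_som` and `quantization_error` and the decay schedules are assumptions.

```python
import math
import random

def train_som(data, n_units=10, epochs=50, lr0=0.5, radius0=3.0, seed=0):
    """Train a minimal SOM whose units form a 1-D chain.

    For each sample we find the best-matching unit (BMU) and pull the BMU
    and its chain neighbours toward the sample; the learning rate and the
    neighbourhood radius both decay over epochs."""
    rng = random.Random(seed)
    # Initialise unit weights to randomly chosen data points.
    weights = [list(rng.choice(data)) for _ in range(n_units)]
    for epoch in range(epochs):
        frac = epoch / epochs
        lr = lr0 * (1.0 - frac)                  # linear learning-rate decay
        radius = max(radius0 * (1.0 - frac), 0.5)
        for x in data:
            # BMU: unit with the smallest squared distance to x.
            bmu = min(range(n_units),
                      key=lambda i: sum((w - v) ** 2
                                        for w, v in zip(weights[i], x)))
            for i in range(n_units):
                d = abs(i - bmu)                 # distance along the chain
                h = math.exp(-(d * d) / (2.0 * radius * radius))
                weights[i] = [w + lr * h * (v - w)
                              for w, v in zip(weights[i], x)]
    return weights

def quantization_error(data, weights):
    """Mean distance from each sample to its best-matching unit."""
    return sum(min(math.dist(w, x) for w in weights) for x in data) / len(data)
```

The generalizations discussed in the paper modify exactly the fixed ingredients visible here: the growing variants add or remove units during training instead of fixing `n_units`, and the tree-structured variant replaces the 1-D chain distance `abs(i - bmu)` with a distance over an evolving tree topology.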


Author(s): Qiaolian Liu, Jianfei Zhao, Naiwang Guo, Ding Xiao, Chuan Shi
