Bi-stochastic Matrix Approximation Framework for Data Co-clustering

Author(s):  
Lazhar Labiod ◽  
Mohamed Nadif
2019 ◽  
Vol 37 (4) ◽  
pp. 1-34 ◽  
Author(s):  
Huafeng Liu ◽  
Liping Jing ◽  
Yuhua Qian ◽  
Jian Yu

2020 ◽  
Vol 18 (1) ◽  
pp. 653-661 ◽  
Author(s):  
Hongxing Wang ◽  
Xiaoyan Zhang

Abstract In this article, we study the constrained matrix approximation problem in the Frobenius norm by using the core inverse: $\|Mx-b\|_{F}=\min$ subject to $x\in \mathcal{R}(M)$, where $M\in \mathbb{C}_{n}^{\mathrm{CM}}$. We obtain the unique solution to the problem, provide two Cramer's rules for the unique solution, and establish two new expressions for the core inverse.
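
A rough numerical illustration (not the authors' construction): substituting x = Mz reduces the constrained problem to an ordinary least-squares problem in z, and under the index-1 (core-invertibility) assumption on M this reproduces the unique solution the abstract characterizes via the core inverse. The function name `constrained_lsq` is illustrative.

```python
import numpy as np

def constrained_lsq(M: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Minimize ||M x - b|| subject to x in R(M).

    Substituting x = M z turns the constraint into a change of variables:
    minimize ||M^2 z - b|| over z, then map back with x = M z.
    For an index-1 (core-invertible) M this yields the unique solution.
    """
    z, *_ = np.linalg.lstsq(M @ M, b, rcond=None)
    return M @ z

# Small sanity check on a singular but index-1 matrix.
M = np.array([[2.0, 0.0], [0.0, 0.0]])   # rank 1, R(M) = span{e1}
b = np.array([1.0, 3.0])
x = constrained_lsq(M, b)
print(x)            # lies in R(M): second component is 0
print(M @ x - b)    # residual is orthogonal to R(M)
```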


1991 ◽  
Vol 28 (1) ◽  
pp. 96-103 ◽  
Author(s):  
Daniel P. Heyman

We are given a Markov chain with states 0, 1, 2, ···. We want to compute a numerical approximation to the solution of the steady-state balance equations. To do this, we truncate the chain, keeping the first n states, make the resulting matrix stochastic in some convenient way, and solve the finite system. The purpose of this paper is to provide some sufficient conditions that imply that as n tends to infinity, the stationary distributions of the truncated chains converge to the stationary distribution of the given chain. Our approach is completely probabilistic, and our conditions are given in probabilistic terms. We illustrate how to verify these conditions with five examples.
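
A minimal sketch of the truncate-and-augment procedure the abstract describes, assuming a simple reflecting birth-death chain as the example (not one of the paper's five): keep the first n states, return the lost probability mass to the last retained state so rows still sum to 1, and solve the finite balance equations.

```python
import numpy as np

def truncated_stationary(p: float, n: int) -> np.ndarray:
    """Truncate a reflecting random walk (up prob p, down prob 1-p) to n states,
    augment the last row so it stays stochastic, and solve pi P = pi."""
    P = np.zeros((n, n))
    P[0, 0], P[0, 1] = 1 - p, p
    for i in range(1, n - 1):
        P[i, i - 1], P[i, i + 1] = 1 - p, p
    P[n - 1, n - 2] = 1 - p
    P[n - 1, n - 1] = p          # mass that would leave the truncation is kept here
    # Solve pi (P - I) = 0 together with the normalization sum(pi) = 1.
    A = np.vstack([(P - np.eye(n)).T, np.ones(n)])
    rhs = np.zeros(n + 1)
    rhs[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return pi

p = 0.3
rho = p / (1 - p)
exact = (1 - rho) * rho ** np.arange(8)   # geometric stationary law of the full chain
for n in (5, 10, 20):
    approx = truncated_stationary(p, n)
    k = min(n, 8)
    print(n, np.abs(approx[:k] - exact[:k]).max())   # error shrinks as n grows
```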


1979 ◽  
Vol 62 (3) ◽  
pp. 595-607 ◽  
Author(s):  
H. Fukuyama ◽  
A. Sakurai

2018 ◽  
Vol 14 (1) ◽  
pp. 7540-7559
Author(s):  
Miłosława Sokół

Virtually every biological model utilising a random number generator is a Markov stochastic process. Numerical simulations of such processes are performed using stochastic or intensity matrices or kernels. Biologists, however, define stochastic processes in a slightly different way from how mathematicians typically do. A discrete-time, discrete-value stochastic process may be defined by a function p : X0 × X → {f : Υ → [0, 1]}, where X is a set of states, X0 is a bounded subset of X, and Υ is a subset of the integers (here associated with discrete time), and where p satisfies 0 ≤ p(x, y)(t) ≤ 1 and Σy∈X p(x, y)(t) = 1. This definition generalizes a stochastic matrix. Although X0 is bounded, X may include every possible state and is often infinite. By interrupting the process whenever the state transitions into the set X − X0, Markov stochastic processes defined this way may have non-square stochastic matrices. A similar principle applies to the intensity matrices and the stochastic and intensity kernels that arise when many biological models are treated as Markov stochastic processes. The class of such processes has important properties from the point of view of theoretical mathematics. In particular, every process in this class can be simulated (hence they all exist in a physical sense) and has a well-defined probability space associated with it.
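
A minimal simulation sketch, assuming a toy birth-death example (the states, probabilities, and names below are illustrative, not taken from the paper): the transition function is defined only for current states in X0, it returns a distribution over the larger state set X, and the run is interrupted as soon as the chain steps into X − X0, which is what makes the effective transition matrix non-square (|X0| rows, |X| columns).

```python
import random

X0 = range(5)                 # states the process may occupy (bounded subset of X)
X = range(7)                  # reachable states; stepping outside X0 interrupts the run

def p(x: int, y: int, t: int) -> float:
    """Illustrative transition function p : X0 x X -> {f : Y -> [0, 1]}.
    Birth-death style: step up with a slowly decaying probability, else step down/stay."""
    up = 0.6 / (1 + 0.05 * t)
    if y == x + 1:
        return up
    if y == max(x - 1, 0):
        return 1.0 - up
    return 0.0

def simulate(x0: int = 0, max_steps: int = 100, seed: int = 1) -> list[int]:
    """Run the chain from x0, interrupting once the state falls outside X0."""
    rng = random.Random(seed)
    path = [x0]
    for t in range(max_steps):
        x = path[-1]
        if x not in X0:
            break             # transitioned into X - X0: the process is interrupted
        weights = [p(x, y, t) for y in X]
        path.append(rng.choices(list(X), weights=weights)[0])
    return path

print(simulate())
```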

