Gaussian Vector: Recently Published Documents

Total documents: 140 (five years: 8)
H-index: 17 (five years: 0)
Author(s): Oren Yakir

Abstract: Given a $d$-dimensional Euclidean lattice, we consider the random set obtained by adding an independent Gaussian vector to each of the lattice points. In this note we provide a simple procedure that recovers the lattice from a single realization of the random set.
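The random set described in this abstract is straightforward to simulate. The sketch below, a minimal illustration assuming the lattice is given by a basis matrix (the function name and parameters are illustrative, not from the paper), generates one realization over a finite window of lattice points:

```python
import numpy as np

rng = np.random.default_rng(0)

def perturbed_lattice(basis, radius, sigma=1.0, rng=rng):
    """One realization of the random set: every lattice point whose integer
    coordinates lie in [-radius, radius]^d is shifted by an independent
    Gaussian vector with standard deviation sigma (illustrative sketch)."""
    d = basis.shape[0]
    # All integer coordinate vectors in the box [-radius, radius]^d.
    grid = np.stack(np.meshgrid(*[np.arange(-radius, radius + 1)] * d,
                                indexing="ij"), axis=-1).reshape(-1, d)
    points = grid @ basis                      # lattice points {B n : n in Z^d}
    noise = sigma * rng.standard_normal(points.shape)
    return points + noise

# Example: the square lattice Z^2, each point perturbed by a unit Gaussian.
B = np.eye(2)
sample = perturbed_lattice(B, radius=10, sigma=1.0)
```

Recovering the basis `B` from `sample` alone is the content of the note; the simulation only produces the input to such a recovery procedure.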


Entropy, 2020, Vol. 22 (12), pp. 1378
Author(s): Jesús Gutiérrez-Gutiérrez, Marta Zárraga-Rodríguez, Xabier Insausti

In this paper, we study the asymptotic optimality of a low-complexity coding strategy for Gaussian vector sources. Specifically, we study the convergence speed of the rate of this coding strategy when it is used to encode the most relevant vector sources, namely wide sense stationary (WSS), moving average (MA), and autoregressive (AR) vector sources. We also study how the coding strategy performs when it is used to encode perturbed versions of those sources. More precisely, we give a sufficient condition on such perturbations under which the convergence speed of the rate remains unaltered.
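As a concrete instance of one of the source classes named above, the following sketch generates a Gaussian AR(1) vector source. The function name, the choice of AR order, and the example coefficient matrix are assumptions for illustration only; they are not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def ar1_vector_source(A, n, sigma=1.0, rng=rng):
    """Generate n samples of the Gaussian AR(1) vector source
    x_t = A x_{t-1} + w_t, with w_t i.i.d. N(0, sigma^2 I).
    A must have spectral radius < 1 for the source to be stable."""
    d = A.shape[0]
    x = np.zeros(d)
    out = np.empty((n, d))
    for t in range(n):
        x = A @ x + sigma * rng.standard_normal(d)
        out[t] = x
    return out

# A stable 2-dimensional AR(1) source (spectral radius 0.5, chosen for the example).
A = np.array([[0.4, 0.1],
              [0.0, 0.5]])
xs = ar1_vector_source(A, n=10_000)
```

For a stable coefficient matrix the process is asymptotically stationary, which is what makes such sources natural test cases for asymptotic rate results.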


Entropy, 2019, Vol. 21 (10), pp. 965
Author(s): Marta Zárraga-Rodríguez, Jesús Gutiérrez-Gutiérrez, Xabier Insausti

In this paper, we present a low-complexity coding strategy to encode (compress) finite-length data blocks of Gaussian vector sources. We show that for large enough data blocks of a Gaussian asymptotically wide sense stationary (AWSS) vector source, the rate of the coding strategy tends to the lowest possible rate. Besides being low-complexity, the strategy does not require knowledge of the correlation matrix of such data blocks. We also show that this coding strategy is appropriate for encoding the most relevant Gaussian vector sources, namely wide sense stationary (WSS), moving average (MA), autoregressive (AR), and ARMA vector sources.
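For a Gaussian vector under mean-squared distortion, the lowest possible rate is given by the classical reverse water-filling solution of the rate-distortion function. The sketch below computes that benchmark from the eigenvalues of the covariance matrix; it is a standard textbook computation, not the coding strategy of the paper:

```python
import numpy as np

def gaussian_rate(K, D):
    """Rate-distortion function (in nats), via reverse water-filling, of a
    Gaussian vector with covariance K under total mean-squared distortion D.
    Standard benchmark; not the paper's coding strategy."""
    lam = np.linalg.eigvalsh(K)
    lam = lam[lam > 1e-12]          # keep strictly positive eigenvalues
    assert 0 < D <= lam.sum(), "D must lie in (0, trace(K)]"
    # Bisection on the water level theta: sum_i min(theta, lambda_i) = D.
    lo, hi = 0.0, lam.max()
    for _ in range(100):
        theta = 0.5 * (lo + hi)
        if np.minimum(theta, lam).sum() < D:
            lo = theta
        else:
            hi = theta
    theta = 0.5 * (lo + hi)
    Di = np.minimum(theta, lam)     # per-component distortions
    return 0.5 * np.sum(np.log(lam / Di))

# Example: a 2x2 covariance, allowing half the total variance as distortion.
K = np.array([[2.0, 0.5],
              [0.5, 1.0]])
R = gaussian_rate(K, D=1.5)
```

A coding strategy is asymptotically optimal in the sense of the abstract when its rate approaches this benchmark as the block length grows.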
