multipoint statistics
Recently Published Documents


TOTAL DOCUMENTS: 23 (last five years: 7)
H-INDEX: 5 (last five years: 1)

eLife, 2021, Vol. 10
Author(s): Riccardo Caramellino, Eugenio Piasini, Andrea Buccellato, Anna Carboncino, Vijay Balasubramanian, ...

Efficient processing of sensory data requires adapting the neuronal encoding strategy to the statistics of natural stimuli. Previously, in Hermundstad et al., 2014, we showed that the local multipoint correlation patterns that are most variable in natural images are also the most perceptually salient for human observers, in a way that is compatible with the efficient coding principle. Understanding the neuronal mechanisms underlying such adaptation to image statistics will require invasive experiments that are impossible in humans. It is therefore important to establish whether a similar phenomenon can be detected in animal species that allow powerful experimental manipulations, such as rodents. Here we selected four image statistics (from single- to four-point correlations) and trained four groups of rats to discriminate between white noise patterns and binary textures containing variable intensity levels of one such statistic. We interpreted the resulting psychometric data with an ideal observer model, finding a sharp decrease in sensitivity from two- to four-point correlations and a further decrease from four- to three-point. This ranking fully reproduces the trend we previously observed in humans, thus extending a direct demonstration of efficient coding to a species in which neuronal and developmental processes can be interrogated and causally manipulated.
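The abstract does not spell out how textures with a prescribed multipoint statistic are generated. One standard construction for binary textures with a tunable four-point ("even") correlation, in the spirit of the maximum-entropy texture framework used in this line of work, can be sketched as follows; the function name four_point_texture and the parameter alpha are illustrative, not taken from the study.

```python
import numpy as np

def four_point_texture(n, alpha, rng=None):
    """Sample an n-by-n binary (+1/-1) texture whose 2x2 block parity
    products average to alpha, i.e. a texture with four-point
    correlation alpha and no imposed lower-order structure.

    Illustrative sketch; not the stimulus code used in the study.
    """
    rng = np.random.default_rng() if rng is None else rng
    t = np.empty((n, n), dtype=np.int8)
    # Seed the first row and first column with unbiased coin flips.
    t[0, :] = rng.choice([-1, 1], size=n)
    t[1:, 0] = rng.choice([-1, 1], size=n - 1)
    # Draw the parity of each remaining 2x2 block: +1 with
    # probability (1 + alpha) / 2, so that E[parity] = alpha.
    s = np.where(rng.random((n - 1, n - 1)) < (1 + alpha) / 2, 1, -1)
    # Each pixel is then fixed by the other three pixels of its block.
    for i in range(1, n):
        for j in range(1, n):
            t[i, j] = s[i - 1, j - 1] * t[i - 1, j] * t[i, j - 1] * t[i - 1, j - 1]
    return t

# alpha = 0 gives white noise; alpha = +/-1 gives fully structured textures.
texture = four_point_texture(64, 0.8)
```

Varying alpha between 0 (white noise) and ±1 (fully structured) yields the graded intensity levels against which a psychometric curve can be measured.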


2021
Author(s): Riccardo Caramellino, Eugenio Piasini, Andrea Buccellato, Anna Carboncino, Vijay Balasubramanian, ...

Efficient processing of sensory data requires adapting the neuronal encoding strategy to the statistics of natural stimuli. Humans, for instance, are most sensitive to multipoint correlations that vary the most across natural images. Here we show that rats possess the same sensitivity ranking to multipoint statistics as humans, thus extending a classic demonstration of efficient coding to a species where neuronal and developmental processes can be interrogated and causally manipulated.


2021, Vol. 14 (1), pp. 377-389
Author(s): Rostislav Kouznetsov

Abstract. Lossy compression of scientific data arrays is a powerful tool for saving network bandwidth and storage space. Properly applied, lossy compression can reduce the size of a dataset by orders of magnitude while keeping all essential information, whereas a wrong choice of lossy compression parameters leads to the loss of valuable data. An important class of lossy compression methods is so-called precision-preserving compression, which guarantees that a certain precision of each number will be kept. This paper considers the statistical properties of several precision-preserving compression methods implemented in NetCDF Operators (NCO), a popular tool for handling and transforming numerical data in NetCDF format. We compare the artifacts that result from applying precision-preserving compression to floating-point data arrays. In particular, we show that the popular Bit Grooming algorithm (until recently the default in NCO) has suboptimal accuracy and produces substantial artifacts in multipoint statistics. We suggest a simple implementation of two algorithms that are free from these artifacts and have double the precision. One of them can be used to rectify data already processed with Bit Grooming. We compare precision trimming for relative and absolute precision against the popular linear packing (LP) method and find that LP has no advantage over precision trimming at a given maximum absolute error. We give examples in which LP leads to an unconstrained error in the integral characteristic of a field or to unphysical values. We analyze compression efficiency as a function of target precision for two synthetic datasets and discuss the precision needed in several atmospheric fields. Mantissa rounding has been contributed to the NCO mainstream as a replacement for Bit Grooming. The Appendix contains code samples implementing precision trimming in Python 3 and Fortran 95.
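The appendix code itself is not reproduced here, but the core of mantissa rounding fits in a few lines of NumPy: reinterpret the floats as integers, add half of the last retained mantissa bit so the integer carry performs round-to-nearest, and zero the discarded bits. The function name round_mantissa and the keepbits parameter are illustrative; this is a sketch of the technique, not the NCO implementation.

```python
import numpy as np

def round_mantissa(a, keepbits):
    """Keep `keepbits` of the 23 mantissa bits of a float32 array,
    rounding to nearest via integer carry and zero-filling the rest.

    Sketch only: NaN/Inf payloads are not handled specially here.
    """
    drop = 23 - keepbits
    if drop <= 0:
        return np.asarray(a, dtype=np.float32).copy()
    b = np.asarray(a, dtype=np.float32).view(np.uint32)
    half = np.uint32(1 << (drop - 1))                    # half of the last kept bit
    mask = np.uint32((0xFFFFFFFF << drop) & 0xFFFFFFFF)  # kept-bit mask
    # Adding `half` lets the carry ripple into the exponent when needed,
    # so values round correctly across powers of two.
    return ((b + half) & mask).view(np.float32)
```

The zero-filled tail bits are what make the result compress well under NetCDF's shuffle and deflate filters, and round-to-nearest bounds the error by half of the last retained bit, which is the factor-of-two precision gain over Bit Grooming described above.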


2020
Author(s): Rostislav Kouznetsov

Abstract. Lossy compression of scientific data arrays is a powerful tool to save network bandwidth and storage space. Properly applied, lossy compression can reduce the size of a dataset by orders of magnitude while keeping all essential information, whereas a wrong choice of lossy compression parameters leads to the loss of valuable data. This paper considers the statistical properties of several lossy compression methods implemented in NetCDF Operators (NCO), a popular tool for handling and transforming numerical data in NetCDF format. We compare the imprecisions and artifacts resulting from the use of lossy compression of floating-point data arrays. In particular, we show that the popular Bit Grooming algorithm (the default in NCO) has suboptimal accuracy and produces substantial artifacts in multipoint statistics. We suggest a simple implementation of two algorithms that are free from these artifacts and have twice the precision. We also suggest a way to rectify data already processed with Bit Grooming. The algorithm has been contributed to the NCO mainstream. The supplementary material contains an implementation of the algorithm in Python 3.
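The nature of the Bit Grooming artifact is easy to demonstrate with a toy version of the algorithm, which alternately zero-fills ("shaves") and one-fills ("sets") the discarded mantissa bits of consecutive array elements. The sketch below illustrates that principle only and is not NCO's actual implementation:

```python
import numpy as np

def bit_groom_toy(a, keepbits):
    """Toy Bit Grooming: alternately clear and set the discarded
    mantissa bits of consecutive float32 values (illustration only)."""
    drop = 23 - keepbits
    b = np.asarray(a, dtype=np.float32).view(np.uint32).copy()
    b[0::2] &= np.uint32((0xFFFFFFFF << drop) & 0xFFFFFFFF)  # shave down
    b[1::2] |= np.uint32((1 << drop) - 1)                    # set up
    return b.view(np.float32)

rng = np.random.default_rng(0)
x = np.float32(1) + rng.random(100_000, dtype=np.float32)   # values in [1, 2)
err = bit_groom_toy(x, 8) - x
print(err[0::2].mean(), err[1::2].mean())  # opposite-signed, position-locked biases
```

The two biases roughly cancel in a global mean, which is the point of the alternation, but any statistic that relates neighbouring values, i.e. a multipoint statistic, sees the systematic even/odd pattern; moreover the worst-case error is a full unit of the last kept bit rather than the half unit achieved by rounding.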


2019, Vol. 10 (1), pp. 107-132
Author(s): J. Peinke, M.R.R. Tabar, M. Wächter

When a complete understanding of a complex system is unavailable, as is typical for systems encountered in the real world, we need a top-down approach to complexity. In this approach, one may aim to understand general multipoint statistics. Here, such a general approach is presented and discussed on the basis of examples from turbulence and sea waves. Our main idea rests on the cascade picture of turbulence, which entangles fluctuations from large to small scales. Inspired by this picture, we express the general multipoint statistics through the statistics of scale-dependent fluctuations of variables and relate them to a scale-dependent process, which is ultimately a stochastic cascade process. We show how to extract from empirical data a Fokker–Planck equation for this cascade process, which allows one to generate surrogate data, forecast extreme events, and develop a nonequilibrium thermodynamics for such complex systems. For each cascade event, an entropy production can be determined. These entropies accurately fulfill a rigorous law, namely the integral fluctuation theorem.
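For a Markov process, the drift and diffusion coefficients of the Fokker–Planck equation are the first two Kramers–Moyal coefficients, D_k(x) = lim_{τ→0} ⟨[x(t+τ) − x(t)]^k | x(t) = x⟩ / (k! τ). A minimal sketch of their estimation from data follows (finite-τ conditional moments with plain histogram binning; names are illustrative, and this time-series simplification stands in for the authors' full scale-dependent cascade analysis):

```python
import numpy as np

def km_coefficients(x, dt, nbins=40, min_count=20):
    """Estimate the drift D1(x) and diffusion D2(x) of a Langevin-type
    series from conditional moments of its increments.

    Finite-dt sketch with histogram binning; illustration only.
    """
    dx, xc = np.diff(x), x[:-1]
    edges = np.linspace(xc.min(), xc.max(), nbins + 1)
    idx = np.clip(np.digitize(xc, edges) - 1, 0, nbins - 1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    d1 = np.full(nbins, np.nan)
    d2 = np.full(nbins, np.nan)
    for b in range(nbins):
        inc = dx[idx == b]
        if inc.size >= min_count:                 # skip poorly sampled bins
            d1[b] = inc.mean() / dt               # D1 ~ <dx | x> / dt
            d2[b] = (inc ** 2).mean() / (2 * dt)  # D2 ~ <dx^2 | x> / (2 dt)
    return centers, d1, d2

# Check on an Ornstein-Uhlenbeck process dx = -x dt + sqrt(2) dW,
# for which D1(x) = -x and D2(x) = 1.
rng = np.random.default_rng(1)
dt, n = 1e-3, 100_000
x = np.empty(n)
x[0] = 0.0
for t in range(n - 1):
    x[t + 1] = x[t] - x[t] * dt + np.sqrt(2 * dt) * rng.standard_normal()
centers, d1, d2 = km_coefficients(x, dt)
```

In the cascade setting the same conditional-moment estimates run over the scale variable r rather than time, yielding the scale-dependent Fokker–Planck equation from which surrogate cascade trajectories and their entropy productions ΔS can be generated; the integral fluctuation theorem then requires ⟨e^{−ΔS}⟩ = 1.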

