STIPA Method in Public Address Sound Systems and Voice Alarm Systems Part 1: The Theoretical Basis and the Reference Speaker

2014 ◽  
Vol 1082 ◽  
pp. 570-573
Author(s):  
René Drtina ◽  
Jaroslav Lokvenc ◽  
Josef Šedivý

The transmission channel mediates the flow of information (information transfer) between the source and the receiver. For examining the technical characteristics of the transmission channel, the most widely used model is probably the Shannon-Weaver model of communication, which decomposes the process into technical blocks. From the perspective of media communications, the overall effect of information transfer is generally assessed via the Lasswell model of communication.

2019 ◽  
Vol 24 (1) ◽  
pp. 54-70
Author(s):  
Alexander Sigman ◽  
Nicolas Misdariis

An ongoing international arts-research-industry collaborative project focusing on the design and implementation of innovative car alarm systems, alarm/will/sound has a firm theoretical basis in Pierre Schaeffer’s theories of sound perception and classification and in the acousmatic tradition. In turn, the timbre perception, modelling and design components of this project have had a significant influence on a range of fixed media, electroacoustic and media installation works realised in parallel to the experimental research. An examination of the multiple points of contact and cross-influence between auditory warning research and artistic practice forms the backbone of this article, with an eye towards continued development in both the research and the artistic domains of the project.


1990 ◽  
Vol 01 (04) ◽  
pp. 355-422 ◽  
Author(s):  
JACOB D. BEKENSTEIN ◽  
MARCELO SCHIFFER

Information must take up space, must weigh, and its flux must be limited. Quantum limits on communication and information storage leading to these conclusions are described here. Quantum channel capacity theory is reviewed for both steady state and burst communication. An analytic approximation is given for the maximum signal information possible with occupation number signal states as a function of mean signal energy. A theorem guaranteeing that these states are optimal for communication is proved. A heuristic "proof" of the linear bound on communication is given, followed by rigorous proofs for signals with specified mean energy, and for signals with given energy budget. And systems of many parallel quantum channels are shown to obey the linear bound for a natural channel architecture. The time-energy uncertainty principle is reformulated in information language by means of the linear bound. The quantum bound on information storage capacity of quantum mechanical and quantum field devices is reviewed. A simplified version of the analytic proof for the bound is given for the latter case. Solitons as information caches are discussed, as is information storage in one-dimensional systems. The influence of signal self-gravitation on communication is considered. Finally, it is shown that acceleration of a receiver acts to block information transfer.
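For orientation, the two bounds reviewed here are usually written as follows (a sketch in standard notation, with E the signal or device energy and R the device radius; the prefactors are the commonly quoted forms and should be checked against the text itself):

```latex
% Linear bound on communication: the peak information flux
% a signal of mean energy E can carry.
\dot{I}_{\max} \;\le\; \frac{\pi E}{\hbar \ln 2} \quad \text{(bits per unit time)}

% Bound on information storage: a device of energy E confined
% to radius R can hold at most
I_{\max} \;\le\; \frac{2\pi E R}{\hbar c \ln 2} \quad \text{(bits)}
```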


2021 ◽  
Vol 13 (1) ◽  
pp. 38-48
Author(s):  
Olga V. Galtseva

Introduction. In the context of cultural memory, the article analyzes the rooting mechanism and subsequent functioning of local religious holidays in the cultural space of Russian rural communities of the XIX–XX centuries. Materials and Methods. The article studies and generalizes materials found in prior research that give an idea of the problem under consideration. It also employs the author’s field materials collected during ethnographic trips to the Nizhny Novgorod region. The theoretical basis of the research is the problem-chronological and comparative-historical methods; such field ethnography methods as direct observation and interviewing were used during the collection of field materials. Results and Discussion. In modern humanities, the representative properties of holidays are interpreted in the context of cultural memory, where the holiday acts as its element or primary form. In this approach, the holiday can be considered as a “general text” which, according to Yuri Lotman, can be stored and updated in the shared memory of the community, and as the mechanism of this updating process, which acts from generation to generation and allows members of the community to exercise their cultural identity. The author considers local religious holidays of the Russian and Finno-Ugric population (Mordovians and Mari people) in the Nizhny Novgorod region as one of the traditional forms of preservation, actualization and intergenerational transmission of information important for the formation of cultural identity. Conclusion. Local religious holidays were the mechanism by which the collective memory of individual rural communities was comprehended, preserved and transferred through the unified traditional forms of all-Russian spiritual culture.
In the cultural memory of the Finno-Ugric peoples, the local religious holidays of the Russian neighbors became the key to the perception of Christian religion and played an important role in the processes of acculturation.


2020 ◽  
Vol 23 (05) ◽  
pp. 2050014
Author(s):  
JINGLAN ZHENG ◽  
CHUN-XIAO NIE

This study examines the information flow between prices and transaction volumes in the cryptocurrency market, using transfer entropy for measurement. We selected four cryptocurrencies with large market values (Bitcoin, Ethereum, Litecoin and XRP), plus Bitcoin and BCH (Bitcoin Cash) for hard-fork analysis; a hard fork occurs when a single cryptocurrency splits in two. Examining the real price data, we show that the long-term time series contains too much noise, obscuring the local information flow; thus, a dynamic calculation is needed. The long-term and short-term sliding transfer entropy (TE) values and the corresponding p-values, based on daily data, indicate that there is a dynamic information flow, the dominant direction of which is [Formula: see text]. In addition, an example based on minute-level Bitcoin data also shows a dynamic flow of information between price and transaction volume. The price–volume dynamics at multiple time scales help to analyze the price mechanism in the cryptocurrency market.
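As a rough illustration of the measurement used here (a minimal sketch, not the authors' implementation), transfer entropy between two series can be estimated by discretizing each series and counting symbol transitions; the history length of 1 and the equal-frequency binning below are simplifying assumptions.

```python
import numpy as np
from collections import Counter

def transfer_entropy(source, target, bins=2):
    """Plug-in transfer entropy TE(source -> target), history length 1, in bits.

    TE = sum over (x_{t+1}, x_t, y_t) of
         p(x1, x0, y0) * log2[ p(x1 | x0, y0) / p(x1 | x0) ].
    """
    def discretize(s, k):
        # Equal-frequency binning via ranks (an assumption of this sketch).
        ranks = np.argsort(np.argsort(s))
        return (ranks * k // len(s)).astype(int)

    y = discretize(np.asarray(source, float), bins)
    x = discretize(np.asarray(target, float), bins)

    triples = Counter(zip(x[1:], x[:-1], y[:-1]))   # (x_{t+1}, x_t, y_t)
    pairs_xy = Counter(zip(x[:-1], y[:-1]))         # (x_t, y_t)
    pairs_xx = Counter(zip(x[1:], x[:-1]))          # (x_{t+1}, x_t)
    singles = Counter(x[:-1])                       # x_t
    n = len(x) - 1

    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_xy[(x0, y0)]        # p(x1 | x0, y0)
        p_cond_self = pairs_xx[(x1, x0)] / singles[x0]  # p(x1 | x0)
        te += p_joint * np.log2(p_cond_full / p_cond_self)
    return te
```

On price and volume series, one would compare TE(volume → price) against TE(price → volume) within sliding windows, assessing significance by shuffling one of the series.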


2020 ◽  
Vol 10 (1) ◽  
Author(s):  
Madhavun Candadai ◽  
Eduardo J. Izquierdo

Behavior involves the ongoing interaction between an organism and its environment. One of the prevailing theories of adaptive behavior is that organisms are constantly making predictions about their future environmental stimuli. However, how they acquire that predictive information is still poorly understood. Two complementary mechanisms have been proposed: predictions are generated from an agent’s internal model of the world, or predictions are extracted directly from the environmental stimulus. In this work, we demonstrate that predictive information, measured using bivariate mutual information, cannot distinguish between these two kinds of systems. Furthermore, we show that predictive information cannot distinguish between organisms that are adapted to their environments and random dynamical systems exposed to the same environment. To understand the role of predictive information in adaptive behavior, we need to be able to identify where it is generated. To do this, we decompose information transfer across the different components of the organism-environment system and track the flow of information in the system over time. To validate the proposed framework, we applied it to a set of computational models of idealized agent-environment systems. Analysis of the systems revealed three key insights. First, predictive information, when sourced from the environment, can be reflected in any agent irrespective of its ability to perform a task. Second, predictive information, when sourced from the nervous system, requires special dynamics acquired during the process of adapting to the environment. Third, the magnitude of predictive information in a system can be different for the same task if the environmental structure changes.
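The quantity at issue, bivariate mutual information between the present and future of a stimulus stream, can be made concrete with a small plug-in estimator; this is an illustrative sketch assuming discrete stimuli, not the authors' code.

```python
import numpy as np
from collections import Counter

def mutual_information(a, b):
    """Plug-in mutual information I(A; B) in bits for paired discrete sequences."""
    n = len(a)
    pa = Counter(a)
    pb = Counter(b)
    pab = Counter(zip(a, b))
    return sum(
        (c / n) * np.log2((c / n) / ((pa[x] / n) * (pb[y] / n)))
        for (x, y), c in pab.items()
    )

def predictive_information(stimulus, lag=1):
    """I(s_t ; s_{t+lag}): how much the present value predicts the future one."""
    s = list(stimulus)
    return mutual_information(s[:-lag], s[lag:])
```

A perfectly periodic binary stimulus yields about one bit of predictive information, while an i.i.d. one yields nearly zero; crucially, the number alone says nothing about whether the regularity was generated by the agent's dynamics or simply inherited from the environment, which is the paper's point.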


2014 ◽  
Vol 1082 ◽  
pp. 574-580
Author(s):  
René Drtina ◽  
Jaroslav Lokvenc ◽  
Josef Šedivý ◽  
Lukáš Čákora ◽  
Jan Konvalina ◽  
...  

Classrooms, lecture halls, conference and convention halls can be considered spaces in which there is sometimes one-way, but more often two-way, communication through the transmission channel. Equally important is the transfer of key information in voice alarm systems.


2021 ◽  
Vol 9 ◽  
Author(s):  
Winnie Poel ◽  
Claudia Winklmayr ◽  
Pawel Romanczuk

In human and animal groups, social interactions often rely on the transmission of information via visual observation of the behavior of others. These visual interactions are governed by the laws of physics and sensory limits. Individuals appear smaller when far away and thus become harder to detect visually, while close by neighbors tend to occlude large areas of the visual field and block out interactions with individuals behind them. Here, we systematically study the effect of a group’s spatial structure, its density as well as polarization and aspect ratio of the physical bodies, on the properties of static visual interaction networks. In such a network individuals are connected if they can see each other as opposed to other interaction models such as metric or topological networks that omit these limitations due to the individual’s physical bodies. We find that structural parameters of the visual networks and especially their dependence on spatial group density are fundamentally different from the two other types. This results in characteristic deviations in information spreading which we study via the dynamics of two generic SIR-type models of social contagion on static visual and metric networks. We expect our work to have implications for the study of animal groups, where it could inform the study of functional benefits of different macroscopic states. It may also be applicable to the construction of robotic swarms communicating via vision or for understanding the spread of panics in human crowds.
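The generic SIR-type contagion dynamics mentioned above can be sketched on any static interaction network (visual or metric) given as an adjacency matrix; the discrete-time update and the parameter names `beta` and `gamma` below are conventional choices, not the paper's exact model.

```python
import numpy as np

def simulate_sir(adj, beta, gamma, seed_node=0, steps=50, rng=None):
    """Discrete-time SIR contagion on a static network.

    Each step, a susceptible node is infected independently with
    probability beta by each infected neighbor; each infected node
    recovers with probability gamma. Returns the fraction of nodes
    ever infected (infected or recovered at the end)."""
    if rng is None:
        rng = np.random.default_rng(0)
    n = adj.shape[0]
    state = np.zeros(n, dtype=int)      # 0 = S, 1 = I, 2 = R
    state[seed_node] = 1
    for _ in range(steps):
        infected = state == 1
        if not infected.any():
            break
        pressure = adj @ infected        # infected neighbors per node
        p_inf = 1.0 - (1.0 - beta) ** pressure
        new_inf = (state == 0) & (rng.random(n) < p_inf)
        recover = infected & (rng.random(n) < gamma)
        state[new_inf] = 1
        state[recover] = 2
    return np.mean(state > 0)
```

Running this on a visual network versus a metric network built over the same spatial configuration would expose the kind of differences in spreading that the paper studies.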


2021 ◽  
Author(s):  
Yuri Evgenievich Polak

The century before last saw revolutionary changes in the transmission of information. The optical telegraph, which appeared at the end of the 18th century, required cumbersome towers to keep the semaphore signals within line of sight. One hundred years later, telegraph lines ran for hundreds of thousands of kilometers, and at the turn of the century the first experiments with wireless telegraphy began. This is reflected in numerous brochures, books and periodicals of the time. A hundred years later, many of these materials became publicly available thanks to the development of the Internet and electronic libraries, which made this work possible. Its goal is to trace the evolution of information-transfer technologies and processes in the 19th century using a wide variety of electronic libraries, from the grandiose projects of the Library of Congress and Google Books, with their millions of digitized books, to modest private collections devoted to local topics. Materials from more than 20 electronic libraries were used.


2019 ◽  
Author(s):  
Madhavun Candadai ◽  
Eduardo J. Izquierdo

Behavior involves the ongoing interaction between an organism and its environment. One of the prevailing theories of adaptive behavior is that organisms are constantly making predictions about their future environmental stimuli. However, how they acquire that predictive information is still poorly understood. Two complementary mechanisms have been proposed: predictions are generated from an agent’s internal model of the world, or predictions are extracted directly from the environmental stimulus. In this work, we demonstrate that predictive information, measured using mutual information, cannot distinguish between these two kinds of systems. Furthermore, we show that predictive information cannot distinguish between organisms that are adapted to their environments and random dynamical systems exposed to the same environment. To understand the role of predictive information in adaptive behavior, we need to be able to identify where it is generated. To do this, we decompose information transfer across the different components of the organism-environment system and track the flow of information in the system over time. To validate the proposed framework, we applied it to a set of computational models of idealized agent-environment systems. Analysis of the systems revealed three key insights. First, predictive information, when sourced from the environment, can be reflected in any agent irrespective of its ability to perform a task. Second, predictive information, when sourced from the nervous system, requires special dynamics acquired during the process of adapting to the environment. Third, the magnitude of predictive information in a system can be different for the same task if the environmental structure changes.
Significance Statement: An organism’s ability to predict the consequences of its actions on future stimuli is considered a strong indicator of its environmental adaptation.
However, in highly structured natural environments, to what extent does an agent have to develop specialized mechanisms to generate predictions? To study this, we present an information theoretic framework to infer the source of predictive information in an organism: extrinsically from the environment or intrinsically from the agent. We find that predictive information extracted from the environment can be reflected in any agent and is therefore not a good indicator of behavioral performance. Studying the flow of predictive information over time across the organism-environment system enables us to better understand its role in behavior.


2019 ◽  
Author(s):  
Jan Bím ◽  
Vito De Feo ◽  
Daniel Chicharro ◽  
Malte Bieler ◽  
Ileana L. Hanganu-Opatz ◽  
...  

Quantifying both the amount and the content of the information transferred between neuronal populations is crucial to understanding brain functions. Traditional data-driven methods based on Wiener-Granger causality quantify the information transferred between neuronal signals, but do not reveal whether the transmission refers to one specific feature of external stimuli or another. Here, we developed a new measure called Feature-specific Information Transfer (FIT) that quantifies the amount of information transferred between neuronal signals about specific stimulus features. The FIT quantifies the feature-related information carried by a receiver that was previously carried by a sender, but that was never carried by the receiver earlier. We tested the FIT on simulated data in various scenarios. We found that, unlike previous measures, the FIT successfully disambiguated genuine feature-specific communication from non-feature-specific communication, external confounding inputs and synergistic interactions. Moreover, the FIT had enhanced temporal sensitivity that facilitates the estimation of the directionality of transfer and of the communication delay between neuronal signals. We validated the FIT’s ability to track feature-specific information flow using neurophysiological data. In human electroencephalographic data acquired during a face detection task, the FIT demonstrated that information about the eye in face pictures flowed from the hemisphere contralateral to the eye to the ipsilateral one. In multi-unit activity recorded from thalamic nuclei and primary sensory cortices of rats during multimodal stimulation, the FIT, unlike Wiener-Granger methods, credibly detected both the direction of information flow and the sensory features about which information was transmitted.
In human cortical high-gamma activity recorded with magnetoencephalography during visuomotor mapping, the FIT showed that visuomotor-related information flowed from superior parietal to premotor areas. Our work suggests that the FIT measure has the potential to uncover previously hidden feature-specific information transfer in neuronal recordings and to provide a better understanding of brain communication.
Author summary: The emergence of coherent percepts and behavior relies on the processing and flow of information about sensory features, such as the color or shape of an object, across different areas of the brain. To understand how computations within the brain lead to the emergence of these functions, we need to map the flow of information about each specific feature. Traditional methods, such as those based on Wiener-Granger causality, quantify whether information is transmitted from one brain area to another, but do not reveal whether the information being transmitted concerns one feature or another. Here, we develop a new mathematical technique for the analysis of brain activity recordings, called Feature-specific Information Transfer (FIT), that can reveal not only whether any information is being transmitted across areas, but also whether such transmitted information concerns a certain sensory feature. We validate the method with both simulated and real neuronal data, showing its power in detecting the presence of feature-specific information transmission, as well as the timing and directionality of this transfer. This work provides a tool of high potential significance for mapping sensory information processing in the brain.

