Noise resistance in communication: Quantifying uniformity and optimality

2020 ◽  
Author(s):  
Christine Cuskley ◽  
Joel Wallenberg

Over the past decade and a half, several lines of research have investigated aspects of the smooth signalling redundancy hypothesis. This hypothesis proposes that speakers distribute the information in linguistic utterances as evenly as possible, in order to make the utterance more robust against noise for the hearer. Several studies have shown evidence for this hypothesis in limited linguistic domains, showing that speakers manipulate acoustic and syntactic features to avoid drastic spikes or troughs in information content. In theory, the mechanism behind this is that such spikes would make utterances more vulnerable to noise events and, thus, to communicative failure. However, this previous work does not consider information density across entire utterances, and this mechanism has only rarely been directly explored. Here, we introduce a new descriptive statistic that quantifies the uniformity of information across an entire utterance, alongside an algorithm that can measure the uniformity of actual utterances against an optimized distribution. Using a simple simulation, we show that utterances optimized for more uniform distributions of information are, in fact, more robust against noise.
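A minimal sketch of the idea, assuming per-word surprisal values are already available. The peakiness measure, noise model, and loss threshold below are illustrative stand-ins for the paper's statistic and simulation, not its actual definitions.

```python
import random
from statistics import pstdev

def peakiness(profile):
    """Illustrative (non-)uniformity measure: the standard deviation of the
    per-word information content; 0 means perfectly even information."""
    return pstdev(profile)

def success_rate(profile, noise_p=0.1, loss_threshold=8.0, trials=10_000):
    """Toy noise model: each word is independently hit by noise with
    probability noise_p; the utterance fails if any hit word carried more
    than loss_threshold bits (too much information to recover from context)."""
    failures = 0
    for _ in range(trials):
        if any(random.random() < noise_p and s > loss_threshold for s in profile):
            failures += 1
    return 1.0 - failures / trials

# Two utterances carrying the same total information (20 bits over 5 words):
peaky = [1.0, 1.0, 16.0, 1.0, 1.0]   # one large information spike
flat  = [4.0, 4.0, 4.0, 4.0, 4.0]    # information spread evenly

for name, profile in (("peaky", peaky), ("flat", flat)):
    print(name, round(peakiness(profile), 2), round(success_rate(profile), 3))
```

Under this toy model the flatter profile survives every noise trial, while the spiky one fails roughly whenever its high-information word is hit, which is the qualitative effect the abstract describes.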

2004 ◽  
Vol 18 (1) ◽  
pp. 1-12 ◽  
Author(s):  
Feng Gu ◽  
Baruch Lev

The rise of intangible assets in size and contribution to corporate growth over the past quarter century was accompanied by a steep increase in the rate and scope of patenting. Consequently, many patent-rich companies, particularly in the science-based and high-tech industries, are extensively engaged in the licensing and sale of patents. We examine various valuation and disclosure aspects of the outcome of patent licensing—royalty income. Our findings indicate the following: (1) royalty income is highly relevant to securities valuation, (2) the intensity of royalty income provides investors with an important signal about the quality and prospects of firms' R&D expenditures, and (3) a substantial number of companies engaged in patent licensing do not disclose royalty income in financial reports.


Author(s):  
Sadegh Samadi ◽  
Mohammad Reza Khosravi ◽  
Jafar A. Alzubi ◽  
Omar A. Alzubi ◽  
Varun G. Menon

In this paper, we determine an optimal range for angle tracking radars (ATRs) based on evaluating the standard deviation of all kinds of errors in a tracking system. In the past, this optimal range has often been computed by simulating the total error components; however, we introduce a closed form for this computation which allows the optimal range to be obtained directly. For this purpose, we first solve an optimization problem to obtain the closed form of the optimal range (Ropt) and then compute it through a simple simulation. The results show that the theoretical and simulation-based computations closely agree.
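As a rough illustration of how such an optimal range can be found both analytically and by simulation, the sketch below assumes just two generic angle-error components: a thermal-noise term that grows with range and a glint-like term that shrinks with it. The coefficients and the error model are hypothetical, not taken from the paper.

```python
import numpy as np

# Hypothetical error model: thermal-noise angle error grows with range
# (SNR falls roughly as R**-4, so sigma ~ a*R**2), while glint angle error
# shrinks with range (a fixed linear wander at the target, so sigma ~ b/R).
a, b = 1e-8, 2e3   # illustrative coefficients, arbitrary units

def total_error_std(R):
    """Standard deviation of the combined angle error at range R."""
    return np.sqrt((a * R**2) ** 2 + (b / R) ** 2)

# Closed form: minimising a^2*R^4 + b^2*R^-2 gives R_opt = (b^2 / (2*a^2))**(1/6).
R_opt_analytic = (b**2 / (2 * a**2)) ** (1 / 6)

# Simulation-style check: evaluate the error over a grid of ranges and pick the minimum.
R_grid = np.linspace(1e3, 1e6, 200_000)
R_opt_numeric = R_grid[np.argmin(total_error_std(R_grid))]

print(R_opt_analytic, R_opt_numeric)   # the two estimates should agree closely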


1989 ◽  
Vol 2 (1) ◽  
pp. 14-16 ◽  
Author(s):  
H. Richard Friman

Decision making under conditions of crisis is an integral part of international relations. Yet in most introductory IR texts, crisis decision making consists of Graham Allison's models, the Cuban missile crisis, and updated examples, discussed in five pages or less. In supplementing such texts, instructors of international politics at the introductory level may find themselves skirting Scylla and Charybdis. In the cliffs lies the extensive simulation exercise requiring additional readings and valuable class time to establish the game. Ahead lies the whirlpool of detailed historical case studies, all vying for attention as the necessary cases for relating to the student experience. Over the past few semesters, I have sought to resolve this dilemma with a simple simulation exercise that integrates current events with the basics of decision making under crisis. The demands on class time are minimal. The simulation adds anywhere from 30 minutes to one hour to the regular classroom time spent on discussing foreign policy decision making. Instructor preparation merely requires a scan of reference materials and some creativity. The results, in terms of student interaction, awakening the shy student, and promoting learning instead of regurgitation of simple facts, have all been extremely positive for classes as small as 25 students and as large as 65 students.


2019 ◽  
Vol 25 (06) ◽  
pp. 677-692
Author(s):  
Ralph Grishman

Information extraction is the process of converting unstructured text into a structured database containing selected information from the text. It is an essential step in making the information content of the text usable for further processing. In this paper, we describe how information extraction has changed over the past 25 years, moving from hand-coded rules to neural networks, with a few stops along the way. We connect these changes to research advances in NLP and to the evaluations organized by the US Government.


2011 ◽  
Vol 15 (4) ◽  
pp. 1
Author(s):  
Hamid Baghestani ◽  
Woo Jung

The ASA-NBER multiperiod survey forecasts of business investment are compared with univariate forecasts to assess predictive information content. In general, the survey forecasts fail to be unbiased, and none fully exploits the information in the past history of business investment. Interestingly, however, they contain predictive information on other relevant (quantitative or qualitative) variables. Combined forecasts from the survey and univariate models score significant improvements over either alone, suggesting their potential usefulness in policy-making.
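A minimal sketch of a forecast-combination regression of the kind alluded to here, run on synthetic data. The data-generating process, the training split, and the two component forecasts are illustrative, not the ASA-NBER series or the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Synthetic target and two imperfect forecasts standing in for the survey
# and a univariate (e.g. autoregressive) model; all values are illustrative.
actual = rng.normal(size=n)
survey = actual + rng.normal(scale=0.8, size=n) + 0.3      # biased, noisy
univar = 0.6 * actual + rng.normal(scale=0.6, size=n)      # misses some signal

# Combination regression: actual_t = b0 + b1*survey_t + b2*univar_t + e_t,
# estimated by OLS on a training window and evaluated out of sample.
train, test = slice(0, 150), slice(150, None)
X = np.column_stack([np.ones(n), survey, univar])
beta, *_ = np.linalg.lstsq(X[train], actual[train], rcond=None)
combined = X[test] @ beta

def rmse(forecast):
    return np.sqrt(np.mean((actual[test] - forecast) ** 2))

print(rmse(survey[test]), rmse(univar[test]), rmse(combined))
# the combined forecast typically has the lowest out-of-sample RMSE,
# because the two component errors are largely independent
```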


The main purpose of the paper is to investigate the effect of spherical aberration on the information content of low-contrast photographic images with specified spread and noise characteristics. In § 1 an outline is given of the basic ideas and the relevant formalism. In § 2 the response functions of monochromats with spherical aberration and defocusing are considered, and their computed values in selected special cases are displayed in figures 3 to 5. A digression is made in § 3 in order to discuss, with the help of these values, a point of topical interest, namely, the variation of best focus with the line frequency of a sinusoidal test object. In § 4 the notion of the equivalent receiving surface of a low-contrast photographic process is introduced and two equivalent receiving surfaces are defined, with the help of some experimental results of Higgins & Jones (1952), which correspond to model photographic processes used with low-contrast objects. In § 5 the mean information density in the images of an aberration-free monochromat is calculated for two model emulsions, at selected noise levels, over a range of focal settings. In both models, the correctly focused images of a random object set are found to contain about one bit per Airy disk when the signal-to-noise ratio is 100, and the effects of defocusing are similar in the two cases. The effect of spherical aberration on information density at different focal settings is then examined in the second model. It appears that, for amounts of fourth-power aberration up to two fringes, the informationally best focus is approximately midway between paraxial and marginal foci, and that the acceptance of a 20% drop in information content corresponds to a focal tolerance of approximately ± ½ fringe.


2020 ◽  
Vol 2020 (10) ◽  
Author(s):  
Igor Broeckel ◽  
Michele Cicoli ◽  
Anshuman Maharana ◽  
Kajal Singh ◽  
Kuver Sinha

The statistics of the supersymmetry breaking scale in the string landscape have been extensively studied in the past, finding either a power-law behaviour induced by uniform distributions of F-terms or a logarithmic distribution motivated by dynamical supersymmetry breaking. These studies focused mainly on type IIB flux compactifications but did not systematically incorporate the Kähler moduli. In this paper we point out that the inclusion of the Kähler moduli is crucial to understand the distribution of the supersymmetry breaking scale in the landscape, since in general one obtains unstable vacua when the F-terms of the dilaton and the complex structure moduli are larger than the F-terms of the Kähler moduli. After taking Kähler moduli stabilisation into account, we find that the distribution of the gravitino mass and the soft terms is power-law only in KKLT and perturbatively stabilised vacua, which therefore favour high-scale supersymmetry. On the other hand, LVS vacua feature a logarithmic distribution of soft terms and thus a preference for lower scales of supersymmetry breaking. Whether the landscape of type IIB flux vacua predicts a logarithmic or power-law distribution of the supersymmetry breaking scale thus depends on the relative preponderance of LVS and KKLT vacua.
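A toy illustration of the first statistical point: if the overall scale is set by the magnitude of a handful of F-terms drawn from a uniform prior, the cumulative count of vacua grows as a power of the scale (favouring high scales), whereas a log-flat prior on the scale does not. The number of F-terms and both priors are illustrative choices, not the paper's landscape ensembles.

```python
import numpy as np

rng = np.random.default_rng(1)
n_f, samples = 3, 100_000   # hypothetical number of F-terms and sample size

# Uniform prior: real and imaginary parts of each F-term drawn uniformly.
F_uniform = rng.uniform(-1, 1, size=(samples, 2 * n_f))
scale_uniform = np.sqrt((F_uniform ** 2).sum(axis=1))   # |F| sets the breaking scale

# Log-flat prior on the overall scale, over a comparable range, for contrast.
scale_log = np.exp(rng.uniform(np.log(1e-3), np.log(scale_uniform.max()), samples))

# Fraction of samples below a given scale: roughly ~ scale**(2*n_f) for the
# uniform F-term prior versus logarithmic growth for the log-flat prior.
for s in (0.1, 0.3, 1.0):
    print(s, (scale_uniform < s).mean(), (scale_log < s).mean())
```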


1990 ◽  
Vol 22 (1-2) ◽  
pp. 171-192 ◽  
Author(s):  
E. Arvin ◽  
P. Harremoës

This paper reviews the state of knowledge of the basic mechanisms governing transformation of pollutants and the present approaches with which to predict the performance of biofilm reactors. The design of biofilm reactors is based mainly on empirical loading criteria or empirical design formulae. The introduction of more stringent effluent requirements, new types of biofilm reactors, and the application of biofilm reactors to more untraditional types of waste materials call for new design procedures with higher degrees of confidence. Most new attempts to model biofilm reactors are based on fundamental principles for mass transport to and through biofilms, coupled with kinetic expressions for pollutant transformations in the biofilms. A simple simulation model based on half-order kinetics is able to describe the removal of soluble substrates, mineralisation of organic matter, nitrification and denitrification. Simple first-order kinetics is able to predict the degradation of some xenobiotics. Advanced simulation models appearing in the past few years show strong promise for detailed analysis of the effect of variation in influent waste characteristics, population dynamics, reactor configuration, etc. However, none of the models is able to properly predict the removal of particulate matter and mixtures of several groups of industrial organic chemicals. Again, insight into the basic removal mechanisms is required.
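A minimal sketch contrasting half-order and first-order removal kinetics of the kind mentioned above, integrated over time in a simple batch (or plug-flow, reading t as residence time) setting. The rate constants and concentrations are hypothetical, not calibrated to any of the reviewed reactors.

```python
import numpy as np

C0 = 20.0      # initial substrate concentration, g/m^3 (illustrative)
k_half = 1.5   # half-order rate constant, g^0.5 m^-1.5 h^-1 (illustrative)
k_first = 0.3  # first-order rate constant, h^-1 (illustrative)

t = np.linspace(0, 6, 61)

# Half-order kinetics dC/dt = -k*sqrt(C) integrates to
# C(t) = (sqrt(C0) - k*t/2)^2 until the substrate is exhausted.
C_half = np.maximum(np.sqrt(C0) - k_half * t / 2, 0.0) ** 2

# First-order kinetics dC/dt = -k*C gives the familiar exponential decay.
C_first = C0 * np.exp(-k_first * t)

for ti, ch, cf in zip(t[::10], C_half[::10], C_first[::10]):
    print(f"t={ti:3.1f} h  half-order C={ch:5.2f}  first-order C={cf:5.2f}")
```

The half-order profile drops roughly linearly in sqrt(C) and reaches complete removal in finite time, while the first-order profile decays exponentially, which is why the two forms suit different substrates in the review's discussion.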


2017 ◽  
Vol 18 (1) ◽  
pp. 45-67 ◽  
Author(s):  
Lydia Byrne ◽  
Daniel Angus ◽  
Janet Wiles

Critical analyses provide information visualization practitioners with insight into the range and suitability of different techniques for visualization. Theory provides the necessary models and vocabulary to deconstruct, explain and classify visualizations, allowing the analysis and comparison of alternate designs, and evaluation of their success. While the critical vocabulary for information visualization in general is well developed, the same cannot be said for ‘hybrid’ information visualizations which combine abstract representation of data with figurative elements such as illustrations. Figurative elements are widely used in information visualization in practice and are increasingly recognized as beneficial for memorability. However, the information encoded by a figurative image and how that information contributes to the overall content of the visualization lacks robust definition within visualization theory. To support critical analysis of hybrid visualization, we provide a model of the information content of a figurative image, which we call the figurative frame model. We use the model to classify hybrid visualizations along two dimensions: information density in the images (defined as the number of features and preserved measurements) and integration of figurative and abstract forms of representation. The new vocabulary for analysing hybrid visualizations reveals how the figurative images expand the expressiveness of information visualization by integrating descriptive and abstract information and allows the formulation of new measures of visualization quality which can be applied to hybrid visualizations.

