CIS-LDRD Project 218313 Final Technical Report. Parsimonious Inference: Information-Theoretic Foundations for a Complete Theory of Machine Learning.

2020 ◽  
Author(s):  
Jed Duersch ◽  
Thomas Catanach ◽  
Ming Gu


2011 ◽  
Vol 11 (2-3) ◽  
pp. 263-296 ◽  
Author(s):  
SHAY B. COHEN ◽  
ROBERT J. SIMMONS ◽  
NOAH A. SMITH

Weighted logic programming, a generalization of bottom-up logic programming, is a well-suited framework for specifying dynamic programming algorithms. In this setting, proofs correspond to the algorithm's output space, such as a path through a graph or a grammatical derivation, and are given a real-valued score (often interpreted as a probability) that depends on the real weights of the base axioms used in the proof. The desired output is a function over all possible proofs, such as a sum of scores or an optimal score. We describe the product transformation, which can merge two weighted logic programs into a new one. The resulting program optimizes a product of proof scores from the original programs, constituting a scoring function known in machine learning as a “product of experts.” Through the addition of intuitive constraining side conditions, we show that several important dynamic programming algorithms can be derived by applying the product transformation to simpler weighted logic programs. In addition, we show how the computation of Kullback–Leibler divergence, an information-theoretic measure, can be interpreted using the product transformation.
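A minimal sketch of the product-of-experts idea described above, assuming a toy DAG whose paths stand in for proofs: each of two experts weights every edge, and a Viterbi-style dynamic program optimizes the product of both experts' path scores. The graph, weights, and function names are illustrative assumptions, not the paper's formalism.

```python
# Two "experts" score the same proof space -- here, paths through a small
# DAG -- and the product program maximizes the product of their per-edge
# weights via a Viterbi-style dynamic program.
from collections import defaultdict

edges = [("s", "a"), ("s", "b"), ("a", "t"), ("b", "t")]  # toy DAG

# Each expert assigns a weight (e.g., a probability) to every edge.
expert1 = {("s", "a"): 0.9, ("s", "b"): 0.4, ("a", "t"): 0.5, ("b", "t"): 0.8}
expert2 = {("s", "a"): 0.3, ("s", "b"): 0.7, ("a", "t"): 0.6, ("b", "t"): 0.9}

def best_product_score(start, goal):
    """Max over paths of the product of both experts' edge weights."""
    best = defaultdict(float)
    best[start] = 1.0
    for node in ["s", "a", "b"]:          # topological order of the toy DAG
        for (u, v) in edges:
            if u == node:
                score = best[u] * expert1[(u, v)] * expert2[(u, v)]
                best[v] = max(best[v], score)
    return best[goal]

# Best path is s -> b -> t with product score 0.28 * 0.72 ~ 0.2016.
print(best_product_score("s", "t"))
```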


2009 ◽  
Vol 19 (05) ◽  
pp. 389-414 ◽  
Author(s):  
FRANK NIELSEN ◽  
RICHARD NOCK

In this paper, we first survey prior work on computing, exactly or approximately, the smallest enclosing balls of point or ball sets in Euclidean spaces. We classify previous work into three categories: (1) purely combinatorial, (2) purely numerical, and (3) recent mixed hybrid algorithms based on coresets. We then describe two novel tailored algorithms for computing arbitrarily close approximations of the smallest enclosing Euclidean ball. These deterministic heuristics are based on solving relaxed decision problems using a primal-dual method. The primal-dual method is interpreted geometrically as solving for a minimum covering set, or dually as seeking a minimum piercing set. Finally, we present some machine-learning applications of the exact and approximate smallest enclosing ball procedures, and discuss their extension to non-Euclidean information-theoretic spaces.
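For context, a minimal sketch of a simple coreset-style (1+ε)-approximation for the smallest enclosing ball, in the spirit of the Bădoiu–Clarkson iteration rather than the primal-dual algorithms described in the abstract; the point set and iteration count are illustrative assumptions.

```python
# Iterative approximation of the smallest enclosing Euclidean ball:
# repeatedly move the center a shrinking step toward the farthest point.
import numpy as np

def approx_min_enclosing_ball(points, iterations=1000):
    points = np.asarray(points, dtype=float)
    center = points[0].copy()
    for t in range(1, iterations + 1):
        dists = np.linalg.norm(points - center, axis=1)
        farthest = points[np.argmax(dists)]
        center += (farthest - center) / (t + 1)   # step size 1/(t+1)
    radius = np.max(np.linalg.norm(points - center, axis=1))
    return center, radius

rng = np.random.default_rng(0)
pts = rng.normal(size=(200, 3))                   # illustrative point set
c, r = approx_min_enclosing_ball(pts)
print(c, r)
```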


2021 ◽  
Author(s):  
Fabian Schlebusch ◽  
Frederic Kehrein ◽  
Rainer Röhrig ◽  
Barbara Namer ◽  
Ekaterina Kutafina

openMNGlab is an open-source software framework for data analysis, tailored to the specific needs of microneurography, an electrophysiological technique particularly important for research on the coding of peripheral neural fibers. Currently, openMNGlab loads data from Spike2 and Dapsys, two major data acquisition solutions. By building on top of the Neo software, openMNGlab can be easily extended to handle the most common electrophysiological data formats. Furthermore, it provides methods for data visualization, fiber tracking, and a modular feature database for extracting features for data analysis and machine learning.
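As an illustration of the Neo layer that openMNGlab builds on, the sketch below loads a Spike2 recording with plain Neo calls; the file name is a placeholder, and no openMNGlab-specific API is shown.

```python
# Illustrative use of Neo (the library openMNGlab builds on) to read a
# Spike2 recording; "recording.smr" is a placeholder path.
from neo.io import Spike2IO

reader = Spike2IO(filename="recording.smr")
block = reader.read_block()          # a Neo Block containing segments

for segment in block.segments:
    for signal in segment.analogsignals:
        # Each AnalogSignal carries its name, sampling rate, and shape.
        print(signal.name, signal.sampling_rate, signal.shape)
```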


Entropy ◽  
2020 ◽  
Vol 22 (5) ◽  
pp. 499
Author(s):  
Martin Hilbert ◽  
David Darmon

The machine-learning paradigm promises traders that uncertainty can be reduced through better predictions made by ever more complex algorithms. We ask whether the effects of both uncertainty and complexity are detectable at the aggregated market level. We analyzed almost one billion trades of eight currency pairs (2007–2017) and show that increased algorithmic trading is associated with more complex subsequences and more predictable structures in bid-ask spreads. However, algorithmic involvement is also associated with more future uncertainty, which seems contradictory at first sight. On the micro level, traders employ algorithms to reduce their local uncertainty by creating more complex algorithmic patterns. This entails more predictable structure and more complexity. On the macro level, the increased overall complexity implies more combinatorial possibilities, and therefore more uncertainty about the future. The chain rule of entropy reveals that uncertainty has been reduced when trading at the level of the fourth digit behind the dollar, while new uncertainty started to arise at the fifth digit behind the dollar (also known as ‘pip’ trading). In short, our information-theoretic analysis clarifies that the seeming contradiction between decreased uncertainty on the micro level and increased uncertainty on the macro level is the result of the inherent relationship between complexity and uncertainty.
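A minimal sketch of the chain-rule decomposition mentioned above: the conditional (next-symbol) entropy of a discretized spread sequence estimated as the difference of two block entropies. The toy binary sequence is an illustrative stand-in for real bid-ask spread data.

```python
# Chain rule of entropy: H(X_{k+1} | X_1..X_k) = H(blocks of k+1) - H(blocks of k).
import math
from collections import Counter

def block_entropy(symbols, k):
    """Empirical Shannon entropy (bits) of length-k blocks."""
    blocks = [tuple(symbols[i:i + k]) for i in range(len(symbols) - k + 1)]
    counts = Counter(blocks)
    n = len(blocks)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def conditional_entropy(symbols, k):
    """Next-symbol uncertainty given a context of length k."""
    return block_entropy(symbols, k + 1) - block_entropy(symbols, k)

spread = [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0, 1, 1, 0, 0] * 50  # toy data
print(conditional_entropy(spread, k=3))
```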


2020 ◽  
Vol 17 (2–3) ◽  
pp. 149-401
Author(s):  
Jonathan Scarlett ◽  
Albert Guillén i Fàbregas ◽  
Anelia Somekh-Baruch ◽  
Alfonso Martinez


Author(s):  
Melda Yuksel ◽  
Elza Erkip

This chapter provides an overview of the information-theoretic foundations of cooperative communications. Earlier information-theoretic achievements, as well as more recent developments, are discussed. The analysis accounts for full- and half-duplex nodes and for multiple relays. Various channel models, such as discrete memoryless, additive white Gaussian noise (AWGN), and fading channels, are considered. Cooperative communication protocols are investigated using capacity, diversity, and the diversity-multiplexing tradeoff (DMT) as performance metrics. Overall, this chapter provides a comprehensive view of the foundations of, and the state of the art reached in, the theory of cooperative communications.
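A minimal numerical sketch of one classical quantity from this literature, the cut-set upper bound for the full-duplex Gaussian relay channel, assuming unit noise variance and real channel gains; the powers, gains, and grid resolution below are illustrative assumptions, not values from the chapter.

```python
# Cut-set upper bound for the full-duplex Gaussian relay channel,
# maximized over the correlation rho between source and relay inputs.
import numpy as np

def awgn_capacity(snr):
    return 0.5 * np.log2(1.0 + snr)

def cutset_bound(P1, P2, h_sd, h_sr, h_rd, grid=1001):
    best = 0.0
    for rho in np.linspace(0.0, 1.0, grid):
        # Broadcast cut: source to (relay, destination).
        broadcast_cut = awgn_capacity((1 - rho**2) * P1 * (h_sd**2 + h_sr**2))
        # Multiple-access cut: (source, relay) to destination.
        mac_cut = awgn_capacity(P1 * h_sd**2 + P2 * h_rd**2
                                + 2 * rho * np.sqrt(P1 * P2) * h_sd * h_rd)
        best = max(best, min(broadcast_cut, mac_cut))
    return best

# Illustrative powers and channel gains (not from the chapter).
print(cutset_bound(P1=1.0, P2=1.0, h_sd=1.0, h_sr=2.0, h_rd=2.0))
```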

