Real World Metamer Sets: Or How we Came to Love Noise

2021 ◽  
Vol 2021 (29) ◽  
pp. 111-117
Author(s):  
Peter Morovič ◽  
Ján Morovič

It is well known that color formation acts as a noise-reducing lossy compression mechanism that results in ambiguity, known as metamerism. Surfaces that match under one set of conditions (an illuminant and observer or capture device) can mismatch under others. The phenomenon has been studied extensively in the past, leading to important results such as metamer mismatch volumes, color correction, reflectance estimation and the computation of metamer sets: the sets of all possible reflectances that could result in a given sensor response. However, most of these approaches have three limitations: first, they simplify the problem and make assumptions about what reflectances can look like (e.g., that they are smooth, natural, or reside in a subspace based on some measured data); second, they deal with strict mathematical metamerism and overlook noise or precision; and third, they consider only isolated responses, without taking the context of a response into account. In this paper we address these limitations by outlining an approach that allows for the robust computation of approximate unconstrained metamer sets and exact unconstrained paramer sets. The notion of spatial or relational paramer sets that take neighboring responses into account, and applications to illuminant estimation and color constancy, are also briefly discussed.
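The color-formation map and the noise-tolerant "paramer" relaxation described above can be sketched in a few lines. Everything below is an invented toy setup (a 10-band spectral sampling, random sensor sensitivities, a flat illuminant), not the paper's actual data or method; it only illustrates that distinct reflectances along a null-space direction of the color-formation matrix yield the same response within a tolerance.

```python
import numpy as np

def responses(reflectance, illuminant, sensitivities):
    """Color formation: integrate reflectance * illuminant against each
    sensor channel. Returns one response per channel (e.g. R, G, B)."""
    signal = reflectance * illuminant      # light reaching the sensor, per band
    return sensitivities @ signal          # (k, n) @ (n,) -> (k,)

def are_paramers(refl_a, refl_b, illuminant, sensitivities, tol=1e-2):
    """Exact metamers give identical responses; paramers match only up to
    a noise/precision tolerance `tol` (the relaxation discussed above)."""
    ra = responses(refl_a, illuminant, sensitivities)
    rb = responses(refl_b, illuminant, sensitivities)
    return np.linalg.norm(ra - rb) <= tol

rng = np.random.default_rng(0)
S = rng.uniform(0.0, 1.0, size=(3, 10))    # toy 3-channel sensor, 10 bands
l = np.ones(10)                            # flat illuminant
A = S * l                                  # effective color-formation matrix
refl_a = rng.uniform(0.2, 0.8, size=10)

# Any null-space direction of A leaves the response unchanged, producing
# a metamer (and hence a paramer) of refl_a under this illuminant/sensor.
null_dir = np.linalg.svd(A)[2][-1]         # right-singular vector, A @ v ~ 0
refl_b = refl_a + 0.05 * null_dir
```

Here `are_paramers(refl_a, refl_b, l, S)` is `True`, while a uniform offset to the reflectance (e.g. `refl_a + 0.1`) changes the response enough to fail the tolerance.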

2020 ◽  
Vol 16 (4) ◽  
pp. 291-300
Author(s):  
Zhenyu Gao ◽  
Yixing Li ◽  
Zhengxin Wang

Abstract The recently concluded 2019 World Swimming Championships was another major swimming competition that witnessed great progress by human athletes in many events. However, some world records set 10 years ago, back in the era of high-tech swimsuits, remain untouched. Given the advances in technical skills and training methods over the past decade, the inability to break those records is a strong indication that records carrying the swimsuit bonus do not reflect the real progress achieved by human athletes. Many swimming professionals and enthusiasts are eager to know what the world records would have been had high-tech swimsuits never been allowed. This paper attempts to restore the real world records in Men's swimming without high-tech swimsuits by integrating various advanced methods in probabilistic modeling and optimization. By modeling and separating swimsuit bias, natural improvement, and athletes' intrinsic performance, the paper provides optimal estimates and 95% confidence intervals for the real world records. The proposed methodology can also be applied to a variety of similar studies with multi-factor considerations.
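The paper's full probabilistic model is not reproduced here, but the core idea of separating a swimsuit bias from a natural improvement trend can be illustrated with a deliberately simple linear decomposition on synthetic data. All numbers below are invented; the design (intercept + trend + suit-era indicator) is our toy stand-in, not the authors' model.

```python
import numpy as np

# Synthetic yearly best times (seconds) for a hypothetical event.
years = np.arange(2000, 2020)
suit = ((years >= 2008) & (years <= 2009)).astype(float)  # high-tech suit era
rng = np.random.default_rng(1)
true_times = 48.0 - 0.05 * (years - 2000) - 1.2 * suit    # trend + suit bonus
times = true_times + rng.normal(0.0, 0.08, years.size)    # measurement noise

# Design matrix: intercept, linear improvement trend, suit-era indicator.
X = np.column_stack([np.ones(years.size), years - 2000.0, suit])
beta, *_ = np.linalg.lstsq(X, times, rcond=None)
intercept, trend, suit_bonus = beta

# "Real" record estimate for a suit-era swim: subtract the estimated bonus.
observed_2009 = times[years == 2009][0]
restored_2009 = observed_2009 - suit_bonus
```

With this setup the fitted `suit_bonus` lands near the planted -1.2 s, and the restored 2009 time is correspondingly slower than the observed one. The paper's confidence intervals would come from a proper probabilistic treatment rather than this point estimate.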


2002 ◽  
Vol 2 (4-5) ◽  
pp. 423-424 ◽  
Author(s):  
MAURICE BRUYNOOGHE ◽  
KUNG-KIU LAU

This special issue marks the tenth anniversary of the LOPSTR workshop. LOPSTR started in 1991 as a workshop on Logic Program Synthesis and Transformation, but later broadened its scope to logic-based Program Development in general. The motivating force behind LOPSTR has been the belief that declarative paradigms such as logic programming are better suited to program development tasks than traditional non-declarative ones such as the imperative paradigm. Specification, synthesis, transformation or specialisation, analysis, verification and debugging can all be given logical foundations, thus providing a unifying framework for the whole development process. In the past ten years or so, such a theoretical framework has indeed begun to emerge, and tools have been implemented for analysis, verification and specialisation. However, it is fair to say that so far the focus has largely been on programming-in-the-small. The future challenge is therefore to apply or extend these techniques to programming-in-the-large, in order to tackle software engineering in the real world.


1976 ◽  
Vol 50 (4) ◽  
pp. 503-513 ◽  
Author(s):  
Robert Craig West

Students of the origins and accomplishments of government regulation of economic activity have often suspected that the laws on which regulation is based were addressed to problems and conditions of the past that no longer prevailed, or, what is worse, rested on highly unrealistic assumptions about the “real world.” This is Professor West's main conclusion about the Federal Reserve Act of 1913, especially as regards its discount rate and international exchange policies.


2021 ◽  
pp. 026638212110619
Author(s):  
Sharon Richardson

During the past two decades, there have been a number of breakthroughs in the fields of data science and artificial intelligence, made possible by advanced machine learning algorithms trained through access to massive volumes of data. However, their adoption and use in real-world applications remains a challenge. This paper posits that a key limitation in making AI applicable has been a failure to modernise the theoretical frameworks needed to evaluate and adopt outcomes. Such a need was anticipated with the arrival of the digital computer in the 1950s but has remained unrealised. This paper reviews how the field of data science emerged and led to rapid breakthroughs in algorithms underpinning research into artificial intelligence. It then discusses the contextual framework now needed to advance the use of AI in real-world decisions that impact human lives and livelihoods.


Author(s):  
Darrel Moellendorf

This chapter notes that normative International Political Theory (IPT) developed over the past several decades in response to political, social, and economic events. These included the globalization of trade and finance, the increasing credibility of human-rights norms in foreign policy, and a growing awareness of a global ecological crisis. The emergence of normative IPT was not simply an effort to understand these events, but an attempt to offer accounts of what the responses to them should be. Normative IPT, then, was originally doubly responsive to the real world. Additionally, this chapter argues that there is a plausible account of global egalitarianism, which takes the justification of principles of egalitarian justice to depend crucially on features of the social and economic world. The account of global egalitarianism applies to the current circumstances in part because of features of those circumstances.


2019 ◽  
Vol 2019 (3) ◽  
pp. 170-190
Author(s):  
Archita Agarwal ◽  
Maurice Herlihy ◽  
Seny Kamara ◽  
Tarik Moataz

Abstract The problem of privatizing statistical databases is a well-studied topic that has culminated in the notion of differential privacy. The complementary problem of securing these differentially private databases, however, has, as far as we know, not been considered in the past. While the security of private databases is in theory orthogonal to the problem of private statistical analysis (e.g., in the central model of differential privacy the curator is trusted), the recent real-world deployments of differentially-private systems suggest that it will become a problem of increasing importance. In this work, we consider the problem of designing encrypted databases (EDB) that support differentially-private statistical queries. More precisely, these EDBs should support a set of encrypted operations with which a curator can securely query and manage its data, and a set of private operations with which an analyst can privately analyze the data. Using such an EDB, a curator can securely outsource its database to an untrusted server (e.g., on-premise or in the cloud) while still allowing an analyst to privately query it. We show how to design an EDB that supports private histogram queries. As a building block, we introduce a differentially-private encrypted counter based on the binary mechanism of Chan et al. (ICALP, 2010). We then carefully combine multiple instances of this counter with a standard encrypted database scheme to support differentially-private histogram queries.
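The binary mechanism that the abstract builds on maintains Laplace-noised partial sums over dyadic blocks of the stream, so any prefix count is assembled from O(log t) noisy blocks. The sketch below shows only that plaintext counting mechanism; the encryption layer that is the paper's actual contribution is omitted entirely, and the class and parameter names are our own.

```python
import numpy as np

class BinaryMechanismCounter:
    """Streaming counter in the style of the binary mechanism
    (Chan et al., ICALP 2010): each dyadic block of the stream gets its
    own Laplace-noised partial sum; a prefix count at time t sums the
    noisy blocks corresponding to the set bits of t."""

    def __init__(self, T, epsilon, rng=None):
        self.levels = int(np.ceil(np.log2(T))) + 1
        self.eps_per_level = epsilon / self.levels  # budget split per level
        self.alpha = [0.0] * self.levels            # exact running block sums
        self.noisy = [0.0] * self.levels            # released noisy block sums
        self.t = 0
        self.rng = rng if rng is not None else np.random.default_rng()

    def update(self, x):
        """Feed the next stream element x in {0, 1}."""
        self.t += 1
        # Index of the lowest set bit of t: the level whose block closes now.
        i = (self.t & -self.t).bit_length() - 1
        # That block absorbs all lower-level sums plus the new element.
        self.alpha[i] = x + sum(self.alpha[:i])
        for j in range(i):
            self.alpha[j] = 0.0
            self.noisy[j] = 0.0
        self.noisy[i] = self.alpha[i] + self.rng.laplace(
            0.0, 1.0 / self.eps_per_level)

    def count(self):
        """Noisy prefix count at the current time t."""
        return sum(self.noisy[i]
                   for i in range(self.levels) if self.t & (1 << i))
```

For example, with a generous budget (`epsilon=50`, `T=128`) and a stream of 100 ones, `count()` returns a value close to 100, since only three noisy blocks (for the set bits of 100) contribute error.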


Author(s):  
Sean Stevens ◽  
Lee Jussim ◽  
Nathan Honeycutt

This paper explores the suppression of ideas within academic scholarship by academics, either by self-suppression or because of the efforts of other academics. Legal, moral, and social issues distinguishing freedom of speech, freedom of inquiry, and academic freedom are reviewed. How these freedoms and protections can come into tension is then explored by a sociological analysis of denunciation mobs who exercise their legal free speech rights to call for punishing scholars who express ideas they disapprove of and condemn. When successful, these efforts, which constitute legally protected speech, will suppress certain ideas. Real-world examples over the past five years of academics who have been sanctioned or terminated for scholarship targeted by a denunciation mob are then explored.


2014 ◽  
pp. 8-20
Author(s):  
Kurosh Madani

In a large number of real-world problems and related applications, the modeling of complex behavior is the central issue. Over the past decades, new approaches based on Artificial Neural Networks (ANN) have been proposed to solve problems related to optimization, modeling, decision making, classification, data mining or nonlinear function (behavior) approximation. Inspired by biological nervous systems and brain structure, Artificial Neural Networks can be seen as information processing systems that enable many original techniques covering a large field of applications. Among their most appealing properties are their learning and generalization capabilities. The main goal of this paper is to present, through some of the main ANN models and the techniques based on them, their applicability to real-world industrial problems. Several examples drawn from industrial and real-world applications are presented and discussed.


2020 ◽  
Vol 68 ◽  
pp. 311-364
Author(s):  
Francesco Trovo ◽  
Stefano Paladino ◽  
Marcello Restelli ◽  
Nicola Gatti

Multi-Armed Bandit (MAB) techniques have been successfully applied to many classes of sequential decision problems in the past decades. However, non-stationary settings, although very common in real-world applications, have received little attention so far, and theoretical guarantees on the regret are known only for some frequentist algorithms. In this paper, we propose an algorithm, namely Sliding-Window Thompson Sampling (SW-TS), for non-stationary stochastic MAB settings. Our algorithm is based on Thompson Sampling and exploits a sliding-window approach to tackle, in a unified fashion, two different forms of non-stationarity studied separately so far: abruptly changing and smoothly changing environments. In the former, the reward distributions are constant during sequences of rounds, and their changes may be arbitrary and happen at unknown rounds, while in the latter, the reward distributions smoothly evolve over rounds according to unknown dynamics. Under mild assumptions, we provide upper bounds on the dynamic pseudo-regret of SW-TS for the abruptly changing environment, for the smoothly changing one, and for the setting in which both forms of non-stationarity are present. Furthermore, we show empirically that SW-TS dramatically outperforms state-of-the-art algorithms even when the forms of non-stationarity are taken separately, as previously studied in the literature.
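The sliding-window idea can be sketched for Bernoulli arms: Beta posteriors are built only from the last `window` rounds, so stale rewards are forgotten and the policy can track a change in which arm is best. This is a toy illustration under our own simplifications (Bernoulli rewards, a fixed window, an environment passed as a function of the round), not the paper's algorithm or analysis.

```python
from collections import deque

import numpy as np

def sw_thompson_sampling(arm_means, horizon, window, rng=None):
    """Sliding-Window Thompson Sampling sketch for Bernoulli arms.
    `arm_means` maps a round index to the current array of success
    probabilities (a stand-in for the abruptly/smoothly changing
    environments discussed above). Returns the total collected reward."""
    rng = rng if rng is not None else np.random.default_rng()
    k = len(arm_means(0))
    history = deque()                    # (arm, reward) pairs inside window
    succ = np.zeros(k)                   # windowed success counts per arm
    pulls = np.zeros(k)                  # windowed pull counts per arm
    total_reward = 0.0
    for t in range(horizon):
        # Sample one index per arm from its windowed Beta posterior.
        theta = rng.beta(succ + 1.0, pulls - succ + 1.0)
        arm = int(np.argmax(theta))
        reward = float(rng.random() < arm_means(t)[arm])
        total_reward += reward
        history.append((arm, reward))
        succ[arm] += reward
        pulls[arm] += 1
        if len(history) > window:        # slide: forget the oldest round
            old_arm, old_reward = history.popleft()
            succ[old_arm] -= old_reward
            pulls[old_arm] -= 1
    return total_reward
```

In an abruptly changing toy environment where arm 0 is best for the first 1000 rounds and arm 1 for the next 1000, this collects clearly more reward than any fixed-arm policy (whose expected total is about 1000), because the window lets it re-identify the best arm after the change.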


2021 ◽  
Vol 59 (2) ◽  
pp. 66-80
Author(s):  
Catherine Belling

Abstract The ambivalent attraction of feeling horror might explain some paradoxes regarding the consumption of representations of atrocities committed in the real world, in the past, on actual other people. How do horror fictions work in the transmission or exploitation of historical trauma? How might they function as prosthetic memories, at once disturbing and informative to readers who might otherwise not be exposed to those histories at all? What are the ethical implications of horror elicited by fictional representations of historical suffering? This article engages these questions through the reading of Mo Hayder’s 2004 novel The Devil of Nanking. Hayder exploits horror’s appeal and also—by foregrounding the acts of representation, reading, and spectatorship that generate this response—opens that process to critique. The novel may productively be understood as a work of posttraumatic fiction, both containing and exposing the concentric layers of our representational engagement with records of past atrocity. Through such a reading, a spherical rather than linear topology emerges for history itself, a structure of haunted and embodied consumption.

