Core-Guided Minimal Correction Set and Core Enumeration

Author(s):  
Nina Narodytska ◽  
Nikolaj Bjørner ◽  
Maria-Cristina Marinescu ◽  
Mooly Sagiv

A set of constraints is unsatisfiable if no solution satisfies all of them. To analyse an unsatisfiable problem, the user needs to understand where the inconsistencies come from and how they can be repaired. Minimal unsatisfiable cores and minimal correction sets are important subsets of constraints that enable such analysis. In this work, we propose a new algorithm for extracting minimal unsatisfiable cores and correction sets simultaneously. Building on the relaxation and strengthening framework, we introduce novel techniques for extracting these sets. Our new solver significantly outperforms several state-of-the-art algorithms on common benchmarks when it comes to extracting correction sets, and compares favorably on core extraction.
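The duality between unsatisfiable cores and correction sets can be illustrated with a toy brute-force sketch. The clause encoding, the `satisfiable` check, and the deletion/grow loops below are illustrative stand-ins, not the algorithm proposed in the paper:

```python
from itertools import product

def satisfiable(clauses, n_vars):
    """Brute-force SAT check: each clause is a list of signed ints (DIMACS style)."""
    for bits in product([False, True], repeat=n_vars):
        if all(any(bits[abs(l) - 1] == (l > 0) for l in c) for c in clauses):
            return True
    return False

def deletion_mus(clauses, n_vars):
    """Shrink an unsatisfiable clause set to a minimal unsatisfiable core."""
    core = list(clauses)
    for c in list(core):
        rest = [d for d in core if d is not c]
        if not satisfiable(rest, n_vars):  # still unsat without c -> c is redundant
            core = rest
    return core

def grow_mcs(clauses, n_vars):
    """Grow a maximal satisfiable subset; its complement is a minimal correction set."""
    mss, mcs = [], []
    for c in clauses:
        if satisfiable(mss + [c], n_vars):
            mss.append(c)
        else:
            mcs.append(c)
    return mcs

# x1, (not x1 or x2), (not x2), (x1 or x2) over two variables -- unsatisfiable
clauses = [[1], [-1, 2], [-2], [1, 2]]
print(deletion_mus(clauses, 2))  # [[-1, 2], [-2], [1, 2]] -- a minimal core
print(grow_mcs(clauses, 2))      # [[-2]] -- removing it restores satisfiability
```

Removing the single correction-set clause `[-2]` leaves a satisfiable set, while the three core clauses form an inconsistency that cannot be shrunk further.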

2020 ◽  
Vol 34 (07) ◽  
pp. 11890-11898
Author(s):  
Zhongang Qi ◽  
Saeed Khorram ◽  
Li Fuxin

Understanding and interpreting the decisions made by deep learning models is valuable in many domains. In computer vision, computing heatmaps from a deep network is a popular approach for visualizing and understanding deep networks. However, heatmaps that do not correlate with the network may mislead humans, hence the faithfulness of a heatmap as an explanation of the underlying deep network is crucial. In this paper, we propose I-GOS, which optimizes for a heatmap such that the classification scores on the masked image maximally decrease. The main novelty of the approach is to compute descent directions based on the integrated gradients instead of the normal gradient, which avoids local optima and speeds up convergence. Compared with previous approaches, our method can flexibly compute heatmaps at any resolution for different user needs. Extensive experiments on several benchmark datasets show that the heatmaps produced by our approach are more correlated with the decisions of the underlying deep network than those of other state-of-the-art approaches.
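A minimal sketch of the integrated-gradients computation on which the method bases its descent directions, using pure-Python numerical gradients on a toy function. This is not the I-GOS mask optimization itself, only the underlying attribution idea:

```python
def grad(f, x, eps=1e-5):
    """Central-difference gradient of f at point x."""
    g = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += eps
        xm[i] -= eps
        g.append((f(xp) - f(xm)) / (2 * eps))
    return g

def integrated_gradients(f, x, baseline, steps=100):
    """Riemann-sum approximation of integrated gradients along the
    straight-line path from the baseline to the input."""
    total = [0.0] * len(x)
    for k in range(1, steps + 1):
        point = [b + (k / steps) * (xi - b) for b, xi in zip(baseline, x)]
        total = [t + gi for t, gi in zip(total, grad(f, point))]
    return [(xi - b) * t / steps for xi, b, t in zip(x, baseline, total)]

# For f(x) = x0^2 + 3*x1, attributions from a zero baseline should sum
# to f(x) - f(baseline) (the completeness axiom).
f = lambda x: x[0] ** 2 + 3 * x[1]
attr = integrated_gradients(f, [2.0, 1.0], [0.0, 0.0])
print(attr)  # approximately [4.0, 3.0]
```

Averaging gradients along the whole path, rather than taking the gradient at the input alone, is what gives the smoother, less locally trapped descent direction the abstract refers to.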


Author(s):  
Wong Wei Syuen ◽  
Umar Nirmal ◽  
M. N. Ervina Efzan ◽  
Ammar Al Shalabi

In the 21st century, the kitchen sink is no longer a luxury for a house but has become a necessity. Consumers now seek advanced kitchen sinks that offer multiple functions and are ergonomic for the user. This work is a dedicated review of the design evolution of sinks from 1973 to 2016. The patents reviewed from 1973 onward show that kitchen sinks and faucets with multiple functions and ergonomic features have been invented and their designs improved over time. The patents discuss the stand-alone sink, the portable basin that can be packed away, the use of flexible braided metal hose, the sensor-operated faucet, the foldable kitchen sink, and other design features crucial to the user. As part of an initiative to anticipate future user needs, this work also includes directions for future research on the state-of-the-art design and development of a smart, ergonomic sink for home applications.


Author(s):  
Jeremias Berg ◽  
Fahiem Bacchus ◽  
Alex Poole

Maximum satisfiability (MaxSat) solving is an active area of research motivated by numerous successful applications to NP-hard combinatorial optimization problems. One of the most successful approaches for solving MaxSat instances from real-world domains is the family of so-called implicit hitting set (IHS) solvers. IHS solvers decouple MaxSat solving into separate core-extraction (i.e. reasoning) and optimization steps, which are tackled by a Boolean satisfiability (SAT) solver and an integer linear programming (IP) solver, respectively. While the approach shows state-of-the-art performance on many industrial instances, it is known that there exist instances on which IHS solvers need to extract an exponential number of cores before terminating. Motivated by the simplest of these problematic instances, we propose abstract cores, a compact representation for a potentially exponential number of regular cores. We demonstrate how to incorporate abstract core reasoning into the IHS algorithm and report on an empirical evaluation demonstrating that including abstract cores in a state-of-the-art IHS solver improves its performance enough to surpass the best-performing solvers of the 2019 MaxSat Evaluation.
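The basic IHS loop that the paper extends can be sketched with brute-force stand-ins for both solvers. The helper names and the tiny clause encoding below are illustrative, not the authors' implementation:

```python
from itertools import product, combinations

def satisfiable(clauses, n_vars):
    """Brute-force SAT check: each clause is a list of signed ints."""
    return any(
        all(any(bits[abs(l) - 1] == (l > 0) for l in c) for c in clauses)
        for bits in product([False, True], repeat=n_vars)
    )

def min_hitting_set(cores, universe):
    """Smallest set of clause indices intersecting every core
    (brute force, standing in for the IP solver)."""
    for size in range(len(universe) + 1):
        for hs in combinations(universe, size):
            if all(set(hs) & core for core in cores):
                return set(hs)

def ihs_maxsat(soft, n_vars):
    """Implicit-hitting-set loop: alternate core extraction (SAT side)
    with minimum hitting set computation (IP side)."""
    cores, universe = [], set(range(len(soft)))
    while True:
        hs = min_hitting_set(cores, universe)
        remaining = [c for i, c in enumerate(soft) if i not in hs]
        if satisfiable(remaining, n_vars):
            return hs  # optimal (smallest) set of soft clauses to drop
        # extract a core: greedily shrink the remaining (unsat) indices
        core = [i for i in universe if i not in hs]
        for i in list(core):
            rest = [j for j in core if j != i]
            if not satisfiable([soft[j] for j in rest], n_vars):
                core = rest
        cores.append(set(core))

# Soft clauses: x1, not x1, x2 -- one of the first two must be dropped
print(ihs_maxsat([[1], [-1], [2]], 2))  # {0} or {1}
```

The abstract's exponential-core pathology arises when this loop must enumerate very many distinct cores before the hitting set stabilizes; abstract cores compress such families into a single representation.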


2015 ◽  
Vol 63 (10) ◽  
Author(s):  
Volker Paelke ◽  
Carsten Röcker ◽  
Nils Koch ◽  
Holger Flatt ◽  
Sebastian Büttner

Abstract In this paper, we analyze the specific requirements of interacting with cyber-physical systems and propose a design approach that is driven by user needs and makes use of an expanded toolbox that contains state-of-the-art interaction technologies including Smart Glasses and Wearables. We present several examples of assistance systems in industrial production that use these interaction technologies and discuss the corresponding usability and implementation aspects.


Author(s):  
Roger Seitz ◽  
Mark Freshley ◽  
Mark Williamson ◽  
Paul Dixon ◽  
Kurt Gerdes ◽  
...  

The U.S. Department of Energy (US DOE) Office of Environmental Management, Technology Innovation and Development is supporting a multi-National Laboratory effort to develop the Advanced Simulation Capability for Environmental Management (ASCEM). ASCEM is an emerging state-of-the-art scientific approach and software infrastructure for understanding and predicting contaminant fate and transport in natural and engineered systems. These modular and open-source high performance computing tools and user interfaces will facilitate integrated approaches that enable standardized assessments of performance and risk for EM cleanup and closure decisions. The ASCEM team recognized that engaging end-users in the ASCEM development process would lead to enhanced development and implementation of the ASCEM toolsets in the user community. End-user involvement in ASCEM covers a broad spectrum of perspectives, including: performance assessment (PA) and risk assessment practitioners, research scientists, decision-makers, oversight personnel, and regulators engaged in the US DOE cleanup mission. End-users are primarily engaged in ASCEM via the ASCEM User Steering Committee (USC) and the ‘user needs interface’ task. Future plans also include user involvement in demonstrations of the ASCEM tools. This paper will describe the details of how end users have been engaged in the ASCEM program and will demonstrate how this involvement has strengthened both the tool development and community confidence. ASCEM tools requested by end-users specifically target modeling challenges associated with US DOE cleanup activities. The demonstration activities involve application of ASCEM tools and capabilities to representative problems at DOE sites. Selected results from the ASCEM Phase 1 demonstrations are discussed to illustrate how capabilities requested by end-users were implemented in prototype versions of the ASCEM tool.


Author(s):  
Chris. J. Hughes

Abstract This article focuses on building a prototype for immersive captioning following a user-centric approach. This methodology is characterised by a bottom-up approach in which usability and user needs are at the heart of development. Recent research on user requirements for captioning in immersive environments has shown both a need for improvement and a wealth of research opportunities. The final aim is to identify how to display captions for an optimal viewing experience. This work began four years ago with some partial findings. We build on the lessons learnt, focussing on the cornerstone of user-centric design requirements: prototyping. Our prototype framework integrates methods used in existing solutions, aiming at instant contrast-and-compare functionality. The first part of the article presents the state of the art in user requirements, identifying the reasons behind the development of the prototyping framework. The second part describes the two-stage framework development. The initial framework concept responded to the challenges arising from the previous research. As soon as the first framework was developed, it became obvious that a second, improved solution was required, almost as a showcase of how ideas can quickly be implemented for user testing, and for users to elicit requirements and creative solutions. The article finishes with a list of functionalities, resulting in new caption modes and the opportunity to become a comprehensive immersive-captions testbed, where tools such as eye-tracking or physiological testing devices could be used to test captions on any device with a web browser.


Author(s):  
Yasushi Kawase ◽  
Yuko Kuroki ◽  
Atsushi Miyauchi

Aggregating responses from crowd workers is a fundamental task in crowdsourcing. In cases where a few experts are overwhelmed by a large number of non-experts, most answer aggregation algorithms, such as majority voting, fail to identify the correct answers. Therefore, it is crucial to extract reliable experts from the crowd workers. In this study, we introduce the notion of the "expert core", a set of workers that is very unlikely to contain a non-expert. We design an efficient graph-mining-based algorithm that exactly computes the expert core. For the answer aggregation task, we propose two types of algorithms. The first incorporates the expert core into existing answer aggregation algorithms such as majority voting, whereas the second utilizes information about the reliability of workers provided by the expert core extraction algorithm. We then give a theoretical justification for the first type of algorithm. Computational experiments using synthetic and real-world datasets demonstrate that our proposed answer aggregation algorithms outperform state-of-the-art algorithms.
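The failure mode of plain majority voting, and the benefit of weighting votes by estimated worker reliability (a crude stand-in for the paper's expert-core extraction, with hypothetical worker and task names), can be sketched as:

```python
from collections import Counter

def majority_vote(answers):
    """answers: {worker: {task: label}}. Plain majority voting per task."""
    tasks = {t for a in answers.values() for t in a}
    return {t: Counter(a[t] for a in answers.values() if t in a).most_common(1)[0][0]
            for t in tasks}

def weighted_vote(answers, weights):
    """Weight each vote by the worker's estimated reliability, as one might
    do after identifying a reliable subset of workers."""
    tasks = {t for a in answers.values() for t in a}
    result = {}
    for t in tasks:
        scores = {}
        for w, a in answers.items():
            if t in a:
                scores[a[t]] = scores.get(a[t], 0.0) + weights.get(w, 0.0)
        result[t] = max(scores, key=scores.get)
    return result

answers = {
    "expert1": {"q1": "A", "q2": "B"},
    "expert2": {"q1": "A", "q2": "B"},
    "novice1": {"q1": "B", "q2": "A"},
    "novice2": {"q1": "B", "q2": "A"},
    "novice3": {"q1": "B", "q2": "A"},
}
print(majority_vote(answers))  # the three novices outvote the two experts
weights = {"expert1": 1.0, "expert2": 1.0,
           "novice1": 0.2, "novice2": 0.2, "novice3": 0.2}
print(weighted_vote(answers, weights))  # reliability weighting recovers the expert answers
```

When non-experts outnumber experts, the unweighted vote picks the wrong label on every task, while down-weighting unreliable workers flips both tasks back to the expert answers.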


Author(s):  
T. A. Welton

Various authors have emphasized the spatial information resident in an electron micrograph taken with adequately coherent radiation. In view of the completion of at least one such instrument, this opportunity is taken to summarize the state of the art in processing such micrographs. We use the usual symbols for the aberration coefficients, and supplement these with ξ and δ for the transverse coherence length and the fractional energy spread, respectively. We also assume a weak, biologically interesting sample, with principal interest lying in the molecular skeleton remaining after obvious hydrogen loss and other radiation damage has occurred.


Author(s):  
Carl E. Henderson

Over the past few years it has become apparent in our multi-user facility that the computer system and software supplied in 1985 with our CAMECA CAMEBAX-MICRO electron microprobe analyzer has the greatest potential for improvement and updating of any component of the instrument. While the standard CAMECA software running on a DEC PDP-11/23+ computer under the RSX-11M operating system can perform almost any task required of the instrument, the commands are not always intuitive and can be difficult to remember for the casual user (of which our laboratory has many). Given the widespread and growing use of other microcomputers (such as PCs and Macintoshes) by users of the microprobe, the PDP has become the "oddball" and has also fallen behind the state of the art in terms of processing speed and disk storage capabilities. Upgrade paths within products available from DEC are considered too expensive for the benefits received. After using a Macintosh for other tasks in the laboratory, such as instrument use and billing records, word processing, and graphics display, its unique and "friendly" user interface suggested an easier-to-use system for computer control of the electron microprobe automation. Specifically, a Macintosh IIx was chosen for its capacity for the third-party add-on cards used in instrument control.


2010 ◽  
Vol 20 (1) ◽  
pp. 9-13 ◽  
Author(s):  
Glenn Tellis ◽  
Lori Cimino ◽  
Jennifer Alberti

Abstract The purpose of this article is to provide clinical supervisors with information pertaining to state-of-the-art clinic observation technology. We use a novel video-capture technology, the Landro Play Analyzer, to supervise clinical sessions as well as to train students to improve their clinical skills. We can observe four clinical sessions simultaneously from a central observation center. In addition, speech samples can be analyzed in real-time; saved on a CD, DVD, or flash/jump drive; viewed in slow motion; paused; and analyzed with Microsoft Excel. Procedures for applying the technology for clinical training and supervision will be discussed.

