New Canonical Representations by Augmenting OBDDs with Conjunctive Decomposition

2017 ◽  
Vol 58 ◽  
pp. 453-521 ◽  
Author(s):  
Yong Lai ◽  
Dayou Liu ◽  
Minghao Yin

We identify two families of canonical knowledge compilation languages. Both families augment ROBDD with conjunctive decomposition bounded by an integer i ranging from 0 to ∞. In the first family, the decomposition is finest and the decision respects a chain C of variables; in the second family, both the decomposition and the decision respect a tree T of variables. Together, these two families cover the three existing languages ROBDD, ROBDD with as many implied literals as possible, and AND/OR BDD. We demonstrate that each language in the first family is complete, while each language in the second family is incomplete, with expressivity that does not decrease as i increases. We also demonstrate that succinctness does not decrease from the i-th language in the second family to the i-th language in the first family, and then to the (i+1)-th language in the first family. Regarding operating efficiency, we show, on the one hand, that the two families of languages support a rich class of tractable logical operations, and in particular that each language in the second family is at least as tractable as ROBDD; on the other hand, we introduce a new time-efficiency criterion called rapidity, which reflects the idea that exponential operations may be preferable if the language can be exponentially more succinct, and we demonstrate that the rapidity of each operation does not decrease from the i-th language in the second family to the i-th language in the first family, and then to the (i+1)-th language in the first family. Furthermore, we develop a compiler for the last language in the first family (i = ∞). Empirical results show that this compiler significantly advances the compiling efficiency of canonical representations; in fact, its compiling efficiency is comparable with that of the state-of-the-art compilers for non-canonical representations. We also provide a compiler for the i-th language in the first family (i < ∞) by translating the last language in the first family into the i-th language. Empirical results show that the i-th language can sometimes be used instead of the last language without any obvious loss of space efficiency.
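To make the structural idea concrete, here is a minimal Python sketch of the node types involved: ordinary ROBDD decision nodes plus AND nodes (conjunctive decomposition) whose children are defined on pairwise-disjoint variable sets. The class names and the evaluation routine are illustrative assumptions, not the authors' data structures.

```python
# A minimal sketch of OBDD nodes augmented with conjunctive decomposition:
# decision nodes as in ROBDD, plus AND nodes over disjoint variable sets.

from dataclasses import dataclass
from typing import Union

@dataclass(frozen=True)
class Leaf:
    value: bool                # terminal 0/1 node

@dataclass(frozen=True)
class Decision:
    var: int                   # decision variable (respects a fixed order)
    lo: "Node"                 # child for var = False
    hi: "Node"                 # child for var = True

@dataclass(frozen=True)
class Conj:
    children: tuple            # conjuncts over pairwise-disjoint variables

Node = Union[Leaf, Decision, Conj]

def evaluate(node: Node, assignment: dict) -> bool:
    """Evaluate the represented formula under a full assignment."""
    if isinstance(node, Leaf):
        return node.value
    if isinstance(node, Decision):
        return evaluate(node.hi if assignment[node.var] else node.lo, assignment)
    return all(evaluate(c, assignment) for c in node.children)
```

Canonicity in the paper additionally depends on reduction rules and on the chain C or tree T restricting where decisions and decompositions may occur, which this sketch omits.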

Author(s):  
Chaotao Chen ◽  
Jinhua Peng ◽  
Fan Wang ◽  
Jun Xu ◽  
Hua Wu

In human conversation, an input post is open to multiple potential responses, which is typically regarded as a one-to-many problem. Promising approaches mainly incorporate multiple latent mechanisms to build the one-to-many relationship. However, without accurate selection of the latent mechanism corresponding to the target response during training, these methods suffer from rough optimization of the latent mechanisms. In this paper, we propose a multi-mapping mechanism to better capture the one-to-many relationship, where multiple mapping modules are employed as latent mechanisms to model the semantic mappings from an input post to its diverse responses. For accurate optimization of the latent mechanisms, a posterior mapping selection module is designed to select the mapping module corresponding to the target response for further optimization. We also introduce an auxiliary matching loss to facilitate the optimization of posterior mapping selection. Empirical results demonstrate the superiority of our model over state-of-the-art methods in generating multiple diverse and informative responses.
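As a rough illustration of the multi-mapping idea, the following PyTorch sketch applies K mapping modules to an encoded post and selects the one that best matches the encoded target response during training. The encoder outputs, the linear form of the mappings, and the matching loss below are all assumptions for illustration, not the authors' architecture.

```python
# Illustrative sketch: K mapping modules as latent mechanisms, with
# posterior selection of the mapping that best matches the target response.

import torch
import torch.nn as nn

class MultiMapping(nn.Module):
    def __init__(self, dim: int, num_mappings: int):
        super().__init__()
        # Each mapping module is one latent mechanism (here: a linear map).
        self.mappings = nn.ModuleList(
            nn.Linear(dim, dim) for _ in range(num_mappings)
        )

    def forward(self, post_vec: torch.Tensor, response_vec: torch.Tensor):
        # Candidate representations, one per mapping module: (B, K, dim).
        candidates = torch.stack([m(post_vec) for m in self.mappings], dim=1)
        # Posterior mapping selection: score each candidate against the
        # encoded target response and pick the best-matching module.
        scores = (candidates * response_vec.unsqueeze(1)).sum(-1)   # (B, K)
        chosen = scores.argmax(dim=1)
        selected = candidates[torch.arange(post_vec.size(0)), chosen]
        # Auxiliary matching loss nudges selection probabilities toward the
        # chosen mapping (a stand-in for the paper's matching loss).
        match_loss = -torch.log_softmax(scores, dim=1).gather(
            1, chosen.unsqueeze(1)).mean()
        return selected, match_loss
```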


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Naotomo Takemura ◽  
Kenta Takata ◽  
Masato Takiguchi ◽  
Masaya Notomi

The Kuramoto model is a mathematical model for describing the collective synchronization phenomena of coupled oscillators. We theoretically demonstrate that an array of coupled photonic crystal lasers emulates the Kuramoto model with non-delayed nearest-neighbor coupling (the local Kuramoto model). Our novel strategy employs indirect coupling between lasers via additional cold cavities. By installing cold cavities between laser cavities, we avoid strong direct coupling between lasers and realize ideal mutual injection-locking with effective non-delayed dissipative coupling. First, after discussing the limit-cycle interpretation of laser oscillation, we demonstrate the synchronization of two indirectly coupled lasers by numerically simulating coupled-mode equations. Second, by performing a phase reduction analysis, we show that the laser dynamics in the proposed device can be mapped to the local Kuramoto model. Finally, we briefly demonstrate that a chain of indirectly coupled photonic crystal lasers indeed emulates the one-dimensional local Kuramoto chain. We also argue that our proposed structure, which consists of periodically aligned cold cavities and laser cavities, will best be realized by using state-of-the-art buried multiple-quantum-well photonic crystals.
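For reference, the local (non-delayed, nearest-neighbor) Kuramoto model mentioned above takes the standard textbook form, where θ_i is the phase of oscillator i, ω_i its natural frequency, and K a generic coupling constant (the symbols follow convention and are not values from the paper):

```latex
\frac{d\theta_i}{dt} = \omega_i + K\,\bigl[\sin(\theta_{i+1} - \theta_i) + \sin(\theta_{i-1} - \theta_i)\bigr]
```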


2021 ◽  
Vol 16 (1) ◽  
Author(s):  
Jens Zentgraf ◽  
Sven Rahmann

Motivation: With an increasing number of patient-derived xenograft (PDX) models being created and subsequently sequenced to study tumor heterogeneity and to guide therapy decisions, there is a similarly increasing need for methods to separate reads originating from the graft (human) tumor from reads originating from the host species' (mouse) surrounding tissue. Two kinds of methods are in use: on the one hand, alignment-based tools require that reads first be mapped and aligned (by an external mapper/aligner) to the host and graft genomes separately; the tool itself then processes the resulting alignments and quality metrics (typically BAM files) to assign each read or read pair. On the other hand, alignment-free tools work directly on the raw read data (typically FASTQ files). Recent studies compare different approaches and tools, with varying results.
Results: We show that alignment-free methods for xenograft sorting are superior in CPU time usage and equivalent in accuracy. We improve upon the state of the art in xenograft sorting by presenting a fast, lightweight approach based on three-way bucketed quotiented Cuckoo hashing. Our hash table requires memory comparable to an FM index typically used for read alignment, and less than other alignment-free approaches. It allows extremely fast lookups and uses less CPU time than other alignment-free methods, and than alignment-based methods at similar accuracy. Several engineering steps (e.g., shortcuts for unsuccessful lookups, software prefetching) improve the performance even further.
Availability: Our software xengsort is available under the MIT license at http://gitlab.com/genomeinformatics/xengsort. It is written in numba-compiled Python and comes with sample Snakemake workflows for hash table construction and dataset processing.
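To illustrate why lookups are fast in such a table, here is a minimal Python sketch of a three-way bucketed Cuckoo hash table: every key has three candidate buckets, so a lookup probes at most three of them. Quotienting (storing only part of each key) and xengsort's eviction-based insertion are omitted, and the hash construction below is an illustrative assumption, not xengsort's implementation.

```python
# Minimal three-way bucketed Cuckoo hash table (illustrative only).

import hashlib

NUM_BUCKETS = 1 << 16
BUCKET_SIZE = 4          # slots per bucket

table = [[] for _ in range(NUM_BUCKETS)]   # each bucket: list of (key, value)

def bucket_choices(key: bytes):
    """Three candidate buckets derived from independently salted hashes."""
    for salt in (b"h1", b"h2", b"h3"):
        digest = hashlib.blake2b(key, salt=salt).digest()
        yield int.from_bytes(digest[:8], "little") % NUM_BUCKETS

def lookup(key: bytes):
    # At most three buckets are probed, successful or not.
    for b in bucket_choices(key):
        for k, v in table[b]:
            if k == key:
                return v
    return None

def insert(key: bytes, value) -> bool:
    # Simplified: place in the first non-full candidate bucket; a real
    # Cuckoo table would evict and relocate an occupant instead of failing.
    for b in bucket_choices(key):
        if len(table[b]) < BUCKET_SIZE:
            table[b].append((key, value))
            return True
    return False
```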


2021 ◽  
Vol 52 (1) ◽  
Author(s):  
Fabienne Archer ◽  
Alexandra Bobet-Erny ◽  
Maryline Gomes

The number and severity of diseases affecting lung development and adult respiratory function have stimulated great interest in developing new in vitro models to study the lung in different species. Recent breakthroughs in 3-dimensional (3D) organoid cultures have led to new physiological in vitro models that mimic the lung better than conventional 2D cultures. Lung organoids simulate multiple aspects of the real organ, making them promising and useful models for studying organ development, function and disease (infection, cancer, genetic disease). Because they remain dynamic in culture, they can serve as a sustainable source of functional cells (biobanking) and can be manipulated genetically. Given the differences between species in developmental kinetics, lung maturation at birth, the distribution of cell populations along the respiratory tract, and species barriers to infectious diseases, there is a need for species-specific lung models capable of mimicking mammalian lungs; such models are of great interest for animal health and production, in line with the One Health approach. This paper reviews the latest developments in the growing field of lung organoids.


Database ◽  
2021 ◽  
Vol 2021 ◽  
Author(s):  
Yifan Shao ◽  
Haoru Li ◽  
Jinghang Gu ◽  
Longhua Qian ◽  
Guodong Zhou

Extraction of causal relations between biomedical entities in the form of the Biological Expression Language (BEL) poses a new challenge to the biomedical text mining community due to the complexity of BEL statements. We propose a simplified form of BEL statements [Simplified Biological Expression Language (SBEL)] to facilitate BEL extraction, and employ BERT (Bidirectional Encoder Representations from Transformers) to improve the performance of causal relation extraction (RE). On the one hand, BEL statement extraction is transformed into the extraction of an intermediate form, the SBEL statement, which is further decomposed into two subtasks: entity RE and entity function detection. On the other hand, we use a powerful pretrained BERT model both to extract entity relations and to detect entity functions, aiming to improve the performance of both subtasks. Entity relations and functions are then combined into SBEL statements and finally merged into BEL statements. Experimental results on the BioCreative-V Track 4 corpus demonstrate that our method achieves state-of-the-art performance in BEL statement extraction, with F1 scores of 54.8% in Stage 2 evaluation and 30.1% in Stage 1 evaluation. Database URL: https://github.com/grapeff/SBEL_datasets
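As a toy illustration of the final assembly step, the sketch below combines the outputs of the two subtasks (an entity relation plus a detected function for each entity) into a BEL-like statement. The tuple format and example identifiers are hypothetical; real BEL/SBEL syntax follows the paper and the BEL specification.

```python
# Toy sketch: assembling subtask outputs into a BEL-like statement.
# The tuple format and identifiers below are hypothetical.

def to_bel(subj: str, subj_func: str, relation: str,
           obj: str, obj_func: str) -> str:
    """Wrap each entity in its detected function and join with the relation."""
    return f"{subj_func}({subj}) {relation} {obj_func}({obj})"

# Hypothetical predictions: protein abundance ("p") for both entities,
# related by "increases".
print(to_bel("HGNC:AKT1", "p", "increases", "HGNC:MTOR", "p"))
# -> p(HGNC:AKT1) increases p(HGNC:MTOR)
```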


1998 ◽  
Vol 08 (01) ◽  
pp. 21-66 ◽  
Author(s):  
W. M. P. VAN DER AALST

Workflow management promises a new solution to an age-old problem: controlling, monitoring, optimizing and supporting business processes. What is new about workflow management is the explicit representation of the business process logic, which allows for computerized support. This paper discusses the use of Petri nets in the context of workflow management. Petri nets are an established tool for modeling and analyzing processes. On the one hand, Petri nets can be used as a design language for the specification of complex workflows. On the other hand, Petri net theory provides powerful analysis techniques which can be used to verify the correctness of workflow procedures. This paper introduces workflow management as an application domain for Petri nets, presents state-of-the-art results with respect to the verification of workflows, and highlights some Petri-net-based workflow tools.
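For readers unfamiliar with Petri nets, the sketch below shows the firing rule that such workflow analysis builds on: a transition is enabled when each of its input places holds a token, and firing it moves tokens from input to output places. The two-step workflow is a made-up example, not one from the paper.

```python
# Generic Petri net firing rule, with a hypothetical two-step workflow.

marking = {"start": 1, "approved": 0, "archived": 0}   # tokens per place

# Transitions: name -> (input places, output places).
transitions = {
    "approve": (["start"], ["approved"]),
    "archive": (["approved"], ["archived"]),
}

def enabled(t: str) -> bool:
    """A transition is enabled iff every input place holds a token."""
    inputs, _ = transitions[t]
    return all(marking[p] >= 1 for p in inputs)

def fire(t: str) -> None:
    """Consume one token per input place, produce one per output place."""
    assert enabled(t), f"transition {t} is not enabled"
    inputs, outputs = transitions[t]
    for p in inputs:
        marking[p] -= 1
    for p in outputs:
        marking[p] += 1

fire("approve")   # start -> approved
fire("archive")   # approved -> archived
print(marking)    # {'start': 0, 'approved': 0, 'archived': 1}
```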


2021 ◽  
Vol 7 (4) ◽  
pp. 1-24
Author(s):  
Douglas Do Couto Teixeira ◽  
Aline Carneiro Viana ◽  
Jussara M. Almeida ◽  
Mário S. Alvim

Predicting mobility-related behavior is an important yet challenging task. On the one hand, factors such as one's routine or preferences for a few favorite locations may help in predicting their mobility. On the other hand, several contextual factors, such as variations in individual preferences, weather, traffic, or even a person's social contacts, can affect mobility patterns and make their modeling significantly more challenging. A fundamental approach to studying mobility-related behavior is to assess how predictable such behavior is, deriving theoretical limits on the accuracy that a prediction model can achieve given a specific dataset. This approach focuses on the inherent nature and fundamental patterns of human behavior captured in that dataset, filtering out factors that depend on the specificities of the prediction method adopted. However, the current state-of-the-art method to estimate predictability in human mobility suffers from two major limitations: low interpretability and difficulty in incorporating external factors that are known to help mobility prediction (i.e., contextual information). In this article, we revisit this state-of-the-art method, aiming to tackle these limitations. Specifically, we conduct a thorough analysis of how this widely used method works by looking into two different metrics that are easier to understand and, at the same time, capture reasonably well the effects of the original technique. We evaluate these metrics in the context of two different mobility prediction tasks, namely next-cell and next-distinct-cell prediction, which have different degrees of difficulty. Additionally, we propose alternative strategies to incorporate different types of contextual information into the existing technique. Our evaluation of these strategies offers quantitative measures of the impact of adding context to the predictability estimate, revealing the challenges associated with doing so in practical scenarios.
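The predictability-limit computation this line of work commonly builds on (an assumption here, since the abstract does not spell out the formula) solves Fano's inequality at equality: given the entropy S of a user's location sequence over N distinct locations, the maximum achievable prediction accuracy π_max satisfies S = H(π_max) + (1 − π_max) log₂(N − 1), where H is the binary entropy. A minimal numeric sketch:

```python
# Solve Fano's equality for the maximum predictability pi_max by bisection.

import math

def binary_entropy(p: float) -> float:
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def max_predictability(S: float, N: int, tol: float = 1e-9) -> float:
    """Largest pi with H(pi) + (1 - pi) * log2(N - 1) = S, pi in [1/N, 1]."""
    lo, hi = 1.0 / N, 1.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        value = binary_entropy(mid) + (1 - mid) * math.log2(N - 1)
        # The left-hand side decreases in pi on this interval, so move
        # toward the side where it still exceeds the observed entropy S.
        if value > S:
            lo = mid
        else:
            hi = mid
    return lo

# Example: entropy of 1.5 bits over 50 distinct locations.
print(round(max_predictability(S=1.5, N=50), 3))
```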


1878 ◽  
Vol 28 (2) ◽  
pp. 633-671 ◽  
Author(s):  
Alexander Macfarlane

The experiments to which I shall refer were carried out in the physical laboratory of the University during the late summer session. I was ably assisted in conducting the experiments by three students of the laboratory,—Messrs H. A. Salvesen, G. M. Connor, and D. E. Stewart. The method which was used of measuring the difference of potential required to produce a disruptive discharge of electricity under given conditions, is that described in a paper communicated to the Royal Society of Edinburgh in 1876 in the names of Mr J. A. Paton, M.A., and myself, and was suggested to me by Professor Tait as a means of attacking the experimental problems mentioned below. The above sketch which I took of the apparatus in situ may facilitate the description of the method. The receiver of an air-pump, having a rod capable of being moved air-tight up and down through the neck, was attached to one of the conductors of a Holtz machine in such a manner that the conductor of the machine and the rod formed one conducting system. Projecting from the bottom of the receiver was a short metallic rod, forming one conductor with the metallic parts of the air-pump, and by means of a chain with the uninsulated conductor of the Holtz machine. Brass balls and discs of various sizes were made to order, capable of being screwed on to the ends of the rods. On the table, and at a distance of about six feet from the receiver, was a stand supporting two insulated brass balls, the one fixed, the other having one degree of freedom, viz., of moving in a straight line in the plane of the table. The fixed insulated ball A was made one conductor with the insulated conductor of the Holtz and the rod of the receiver, by means of a copper wire insulated with gutta percha, having one end stuck firmly into a hole in the collar of the receiver, and having the other fitted in between the glass stem and the hollow in the ball, by which it fitted on to the stem tightly. A thin wire similarly fitted in between the ball B and its insulating stem connected the ball with the insulated half ring of a divided ring reflecting electrometer.


2020 ◽  
Vol 20 (9&10) ◽  
pp. 747-765
Author(s):  
F. Orts ◽  
G. Ortega ◽  
E. M. Garzon

Despite the great interest that the scientific community has in quantum computing, the scarcity and high cost of resources hinder progress in this field. Specifically, qubits are very expensive to build, so the few available quantum computers are tremendously limited in their number of qubits, which delays progress. This work presents new reversible circuits that optimize the resources necessary to convert a signed binary number into its two's complement representation of N digits. The benefits of our work are twofold: on the one hand, the proposed two's complement converters are fault-tolerant circuits and are also more efficient in terms of resources (essentially, quantum cost, number of qubits, and T-count) than those described in the literature. On the other hand, valuable information about available converters and, moreover, quantum adders is summarized in tables for interested researchers. The converters have been measured using robust metrics and compared with state-of-the-art circuits. The code to build them on a real quantum computer is given.
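For context, the arithmetic the converters implement is the classical sign-magnitude to two's complement conversion: non-negative numbers are unchanged, while for negative numbers the magnitude bits are inverted and one is added. A plain (non-reversible, non-quantum) Python sketch:

```python
# Classical sign-magnitude to two's complement conversion (illustrative;
# the paper implements this with fault-tolerant reversible circuits).

def sign_magnitude_to_twos_complement(bits: str) -> str:
    """bits[0] is the sign bit; the rest is the magnitude (N digits total).
    Negative zero is left out of scope for simplicity."""
    sign, magnitude = bits[0], bits[1:]
    if sign == "0":
        return bits                                   # non-negative: unchanged
    n = len(magnitude)
    inverted = int(magnitude, 2) ^ ((1 << n) - 1)     # bitwise NOT of magnitude
    return "1" + format((inverted + 1) % (1 << n), f"0{n}b")

print(sign_magnitude_to_twos_complement("0101"))  # +5 -> 0101
print(sign_magnitude_to_twos_complement("1101"))  # -5 -> 1011
```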


2018 ◽  
Vol 6 (3) ◽  
pp. 67 ◽  
Author(s):  
Laxmi Koju ◽  
Ram Koju ◽  
Shouyang Wang

This study investigated the impact of banking management on credit risk using a sample of Indian commercial banks. The study employed dynamic panel estimations to evaluate the link between banking management variables and credit risk. The empirical results show that an increase in the share of loans in total assets does not necessarily increase problem loans. The findings suggest that high capital requirements and large bank size do not reduce default risk, whereas high profitability and strong income diversification policies lower the likelihood of default risk. The overall empirical results support the “operating efficiency”, “diversification” and “too big to fail” hypotheses, confirming that credit quality in the banking industry is mainly driven by profitability, banking supervision, high credit standards and strong investment strategies. The findings are relevant to bank managers, investors and bank regulators in formulating effective credit policies and investment strategies.

