Linear constraint systems as high-level nets

Author(s):
Eike Best
Catuscia Palamidessi

2010, Vol. 19 (01), pp. 65-99
Author(s):
MARC POULY

Computing inference from a given knowledge base is one of the key competences of computer science. Therefore, numerous formalisms and specialized inference routines have been introduced and implemented for this task. Typical examples are Bayesian networks, constraint systems or different kinds of logic. It is known today that these formalisms can be unified under a common algebraic roof called a valuation algebra. Based on this framework, generic inference algorithms for the processing of arbitrary valuation algebras can be defined. Researchers benefit from this high level of abstraction to address open problems independently of the underlying formalism. It is therefore all the more astonishing that this theory has not found its way into concrete software projects. Indeed, all modern programming languages provide generic sorting procedures, for example, but generic inference algorithms are still mythical creatures. NENOK breaks new ground and offers an extensive library of generic inference tools based on the valuation algebra framework. All methods are implemented as distributed algorithms that process local and remote knowledge bases in a transparent manner. Besides its main purpose as a software library, NENOK also provides a sophisticated graphical user interface for inspecting the inference process and the graphical structures involved. This can be used for educational purposes, but also as a fast-prototyping architecture for inference formalisms.
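As a rough illustration of the abstraction described above, the sketch below models a valuation algebra by its two basic operations, combination and projection, and answers a query by combining all factors of a knowledge base and projecting onto the query variables. The names (Valuation, combine, project, answer_query) are purely illustrative and are not NENOK's actual API; real generic inference algorithms (fusion, join-tree propagation) compute the same result without materializing the full joint valuation.

    from abc import ABC, abstractmethod
    from functools import reduce

    class Valuation(ABC):
        """A piece of knowledge over a set of variables (its domain/label)."""

        @abstractmethod
        def domain(self) -> frozenset:
            """Variables this valuation refers to."""

        @abstractmethod
        def combine(self, other: "Valuation") -> "Valuation":
            """Combination: aggregate two pieces of knowledge."""

        @abstractmethod
        def project(self, variables: frozenset) -> "Valuation":
            """Projection: focus the knowledge onto a subset of its domain."""

    def answer_query(factors, query):
        # Naive reference semantics: combine everything, then project to the query.
        # Local-computation schemes give the same answer while keeping the
        # intermediate domains small (e.g. by propagating on a join tree).
        joint = reduce(lambda a, b: a.combine(b), factors)
        return joint.project(frozenset(query))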


This chapter introduces Integer Linear Programming (ILP) approaches for efficiently solving a financial portfolio design problem. In Chapter 3, the authors proposed a matricial model, which is a quadratic mathematical model; a linearization step is therefore necessary before linear programming techniques can be applied. The matricial model clearly shows that the problem is strongly symmetric, and the row and column symmetries are easily handled by adding a negligible number of new constraints. The authors propose two linear models, which are presented in detail and proved correct. These models represent the problem as linear constraint systems with 0-1 variables, which are then implemented in an ILP solver. Experimental results on non-trivial instances of the portfolio design problem are given.
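The chapter's own models are not reproduced here, but the kind of 0-1 linearization it relies on can be illustrated with the standard reformulation of a product of binary variables: z = x·y is replaced by the linear constraints z ≤ x, z ≤ y and z ≥ x + y − 1. The toy sketch below uses the PuLP modelling library purely as an assumed, illustrative ILP front end; it is not the authors' portfolio model or solver setup.

    from pulp import LpProblem, LpVariable, LpMaximize, LpBinary

    # Toy linearization of a quadratic 0-1 term: z stands for the product x*y.
    prob = LpProblem("binary_product_linearization", LpMaximize)
    x = LpVariable("x", cat=LpBinary)
    y = LpVariable("y", cat=LpBinary)
    z = LpVariable("z", cat=LpBinary)

    prob += z                # objective: maximize z
    prob += z <= x           # z can be 1 only if x is 1
    prob += z <= y           # z can be 1 only if y is 1
    prob += z >= x + y - 1   # z must be 1 when both x and y are 1

    prob.solve()
    print(x.value(), y.value(), z.value())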


Author(s):  
C. Argáez
M.J. Cánovas
J. Parra

We are concerned with finite linear constraint systems in a parametric framework where the right-hand side is an affine function of the perturbation parameter. Such structured perturbations provide a unified framework for different parametric models in the literature, such as block, directional, and/or partial perturbations of both inequalities and equalities. We extend some recent results on the calmness of the feasible set mapping and provide an application to the convergence of a certain path-following algorithmic scheme. We underline the fact that our formula for the calmness modulus depends only on the nominal data, which makes it computable in practice.
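For orientation, the standard notion referred to in this abstract can be written as follows; the parametrization below is only an illustrative way of expressing "right-hand side affine in the parameter" (inequalities only, for brevity) and is not the paper's exact notation.

    % Feasible set mapping with an affine right-hand side (illustrative notation)
    \mathcal{F}(p) = \{\, x \in \mathbb{R}^n : a_t^{\top} x \le b_t + c_t^{\top} p,\ t = 1,\dots,m \,\}.

    % Calmness of \mathcal{F} at (\bar p, \bar x) \in \operatorname{gph}\mathcal{F}:
    % there exist \kappa \ge 0 and neighborhoods U of \bar p and V of \bar x with
    d\big(x, \mathcal{F}(\bar p)\big) \le \kappa\, \|p - \bar p\|
        \quad \text{for all } p \in U,\ x \in \mathcal{F}(p) \cap V.

    % The calmness modulus is the infimum of the constants \kappa for which this holds.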


Author(s):  
David P. Bazett-Jones
Mark L. Brown

A multisubunit RNA polymerase enzyme is ultimately responsible for transcription initiation and elongation of RNA, but recognition of the proper start site by the enzyme is regulated by general, temporal and gene-specific trans-factors interacting at promoter and enhancer DNA sequences. To understand the molecular mechanisms which precisely regulate the transcription initiation event, it is crucial to elucidate the structure of the transcription factor/DNA complexes involved. Electron spectroscopic imaging (ESI) provides the opportunity to visualize individual DNA molecules. Enhancement of DNA contrast with ESI is accomplished by imaging with electrons that have interacted with inner-shell electrons of phosphorus in the DNA backbone. Phosphorus detection at this intermediately high level of resolution (≈1 nm) permits selective imaging of the DNA, to determine whether the protein factors compact, bend or wrap the DNA. Simultaneously, mass and phosphorus content can be measured quantitatively, using adjacent DNA or tobacco mosaic virus (TMV) as mass and phosphorus standards. These two parameters provide stoichiometric information relating the ratios of protein to DNA content.


Author(s):  
J. S. Wall

The forte of the Scanning Transmission Electron Microscope (STEM) is high-resolution imaging with high contrast on thin specimens, as demonstrated by visualization of single heavy atoms. Of equal importance for biology is the efficient utilization of all available signals, permitting low-dose imaging of unstained single molecules such as DNA. Our work at Brookhaven has concentrated on: 1) design and construction of instruments optimized for a narrow range of biological applications and 2) use of such instruments in a very active user/collaborator program. Our program is therefore highly interactive, with a strong emphasis on producing results which are interpretable with a high level of confidence. The major challenge we face at the moment is specimen preparation. The resolution of the STEM is better than 2.5 Å, but measurements of resolution vs. dose level off at a resolution of 20 Å at a dose of 10 el/Å² on a well-behaved biological specimen such as TMV (tobacco mosaic virus). To track down this problem we are examining all aspects of specimen preparation: purification of biological material, deposition on the thin-film substrate, washing, fast freezing and freeze-drying. As we attempt to improve our equipment and technique, we use image analysis of TMV internal controls included in all STEM samples as a monitor sensitive enough to detect even a few percent improvement. For delicate specimens, carbon films can be very harsh, leading to disruption of the sample. Therefore we are developing conducting polymer films as alternative substrates, as described elsewhere in these Proceedings. For specimen preparation studies, we have identified (from our user/collaborator program) a variety of “canary” specimens, each uniquely sensitive to one particular aspect of sample preparation, so we can attempt to separate the variables involved.

