Direct Methods and Powder Data: State of the Art and Perspectives

Author(s):  
C. Giacovazzo
Author(s):  
Sébastien Cayrols ◽  
Iain S Duff ◽  
Florent Lopez

We describe the parallelization of the solve phase in the sparse Cholesky solver SpLLT when using a sequential task flow model. In the context of direct methods, the solution of a sparse linear system proceeds through three main phases: analyse, factorization and solve. Of the last two, which involve numerical computation, the factorization is the most computationally costly, so it is crucial to parallelize it in order to reduce the time-to-solution on modern architectures. As a consequence, the solve phase is often less optimized than the factorization in state-of-the-art solvers, and opportunities for parallelism in this phase are often left unexploited. However, in some applications the time spent in the solve phase is comparable to, or even greater than, the factorization time, and the user would benefit dramatically from a faster solve routine. This is the case, for example, for a conjugate gradient (CG) solver using a block Jacobi preconditioner: the diagonal blocks are factorized only once, but their factors are used to solve subsystems at each CG iteration. In this study, we design and implement a parallel version of a task-based solve routine for an OpenMP version of the SpLLT solver. We show that we obtain good scalability on a multicore architecture, enabling a dramatic reduction of the overall time-to-solution in some applications.
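The block Jacobi use case mentioned above can be sketched in a few lines: the diagonal blocks are Cholesky-factorized once, and only triangular solves with the stored factors are repeated at every CG iteration. This is an illustrative dense-block sketch (sizes, SciPy routines and the test matrix are our own choices, not SpLLT's implementation):

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve
from scipy.sparse.linalg import cg, LinearOperator

rng = np.random.default_rng(0)
n, nb = 12, 4                       # matrix size and block size (assumed)
A = rng.standard_normal((n, n))
A = A @ A.T + n * np.eye(n)         # symmetric positive definite test matrix
b = rng.standard_normal(n)

# "Factorization" phase: factorize each diagonal block exactly once.
blocks = [cho_factor(A[i:i + nb, i:i + nb]) for i in range(0, n, nb)]

def apply_precond(r):
    # "Solve" phase: one pair of triangular solves per block,
    # repeated at every CG iteration with the stored factors.
    z = np.empty_like(r)
    for k, fac in enumerate(blocks):
        i = k * nb
        z[i:i + nb] = cho_solve(fac, r[i:i + nb])
    return z

M = LinearOperator((n, n), matvec=apply_precond)
x, info = cg(A, b, M=M)
res = np.linalg.norm(A @ x - b) / np.linalg.norm(b)
```

Because the block solves in `apply_precond` are independent, this is exactly the loop a task-based runtime can parallelize, which is the point the abstract makes.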


Sensors ◽  
2019 ◽  
Vol 19 (5) ◽  
pp. 1032 ◽  
Author(s):  
Mingliang Fu ◽  
Weijia Zhou

In recent years, estimating the 6D pose of object instances with convolutional neural networks (CNNs) has received considerable attention. Depending on whether intermediate cues are used, the relevant literature can be roughly divided into two broad categories: direct methods and two-stage pipelines. In the latter, intermediate cues such as 3D object coordinates, semantic keypoints, or virtual control points, rather than pose parameters, are regressed by a CNN in the first stage. Object pose can then be solved from correspondence constraints constructed with these intermediate cues. In this paper, we focus on the postprocessing of a two-stage pipeline and propose to combine two learning concepts for estimating object pose in challenging scenes: projection grouping on one side, and correspondence learning on the other. We first employ a local-patch-based method to predict projection heatmaps that denote the confidence distribution of the projections of the 3D bounding box's corners. A projection grouping module is then proposed to remove redundant local maxima from each layer of heatmaps. Instead of directly feeding 2D–3D correspondences to the perspective-n-point (PnP) algorithm, multiple correspondence hypotheses are sampled from the local maxima and their corresponding neighborhoods and ranked by a correspondence-evaluation network. Finally, the correspondences with higher confidence are selected to determine the object pose. Extensive experiments on three public datasets demonstrate that the proposed framework outperforms several state-of-the-art methods.
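The projection-grouping step can be sketched as a per-layer non-maximum suppression over the corner heatmaps. This is our own minimal illustration of the idea, not the paper's module: the heatmap shape, threshold and suppression radius are assumptions.

```python
import numpy as np

def group_projections(heatmaps, thresh=0.5, radius=2):
    """heatmaps: (8, H, W) array, one layer per 3D bounding-box corner.
    Returns, per layer, the (row, col, score) peaks that survive
    greedy non-maximum suppression above a confidence threshold."""
    peaks_per_layer = []
    for layer in heatmaps:
        layer = layer.copy()
        peaks = []
        while True:
            idx = np.unravel_index(np.argmax(layer), layer.shape)
            score = layer[idx]
            if score < thresh:
                break
            peaks.append((idx[0], idx[1], float(score)))
            # Suppress the neighborhood so redundant nearby maxima
            # (the "redundant local maxima" of the abstract) are removed.
            r0, c0 = max(idx[0] - radius, 0), max(idx[1] - radius, 0)
            layer[r0:idx[0] + radius + 1, c0:idx[1] + radius + 1] = -np.inf
        peaks_per_layer.append(peaks)
    return peaks_per_layer

# Toy usage: each layer has one sharp peak plus a redundant neighbor.
hm = np.zeros((8, 16, 16))
hm[:, 5, 5] = 0.9
hm[:, 5, 6] = 0.8    # redundant local maximum, removed by suppression
result = group_projections(hm)
```

The surviving peaks per layer are what would then seed the 2D–3D correspondence hypotheses ranked by the evaluation network.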


2020 ◽  
Vol 2020 ◽  
pp. 1-16
Author(s):  
Ester Scotto di Perta ◽  
Nunzio Fiorentino ◽  
Marco Carozzi ◽  
Elena Cervelli ◽  
Stefania Pindozzi

Agriculture is mainly responsible for ammonia (NH3) volatilisation. A common effort to produce reliable quantifications, national emission inventories, and policies is needed to reduce the health and environmental issues related to this emission. Sources of NH3 are locally distributed and mainly depend on farm building characteristics, management of excreta, and the field application of mineral fertilisers. To date, appropriate measurements related to the application of fertilisers to the field are still scarce in the literature. Proper quantification of NH3 must consider the nature of the fertiliser, the environmental variables that influence the dynamics of the emission, and a reliable measurement method. This paper presents the state of the art of the most commonly used direct methods to measure NH3 volatilisation following field application of fertilisers, focusing mainly on chamber methods. The characteristics and the associated measurement uncertainty of the most widespread chamber types are discussed and compared with micrometeorological methods.
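As a minimal illustration of how a closed static-chamber measurement is commonly turned into an emission flux, the concentration build-up inside the chamber is fitted over time and scaled by the volume-to-area ratio, F = (V / A) · dC/dt. All numbers below are invented for illustration; they are not data from the paper.

```python
import numpy as np

# Illustrative closed static-chamber flux calculation (assumed values).
V = 0.015          # chamber volume, m^3
A = 0.05           # soil surface area covered by the chamber, m^2
t = np.array([0.0, 300.0, 600.0, 900.0])        # seconds since closure
c = np.array([0.0, 20.0, 41.0, 59.0]) * 1e-6    # NH3-N concentration, g m^-3

# Linear fit of the accumulation, assuming the emission has not yet
# been damped by the rising concentration inside the chamber.
dCdt = np.polyfit(t, c, 1)[0]      # g m^-3 s^-1
flux = (V / A) * dCdt              # g NH3-N m^-2 s^-1
flux_per_hour = flux * 3600.0
```

The linearity assumption is one of the uncertainty sources chamber methods carry: as NH3 accumulates, the chamber headspace suppresses further volatilisation, which is why comparison against micrometeorological methods matters.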


2008 ◽  
Vol 41 (3) ◽  
pp. 548-553 ◽  
Author(s):  
Rocco Caliandro ◽  
Benedetta Carrozzini ◽  
Giovanni Luca Cascarano ◽  
Liberato De Caro ◽  
Carmelo Giacovazzo ◽  
...  

The success of the ab initio phasing process mainly depends on two parameters: data resolution and structural complexity. In agreement with the Sheldrick rule, the presence of heavy atoms can also play a nonnegligible role in the success of direct methods. The increased efficiency of the Patterson methods and the advent of new phasing techniques based on extrapolated reflections have changed the state of the art. In particular, it is not clear how much the resolution limit and the structural complexity may be pushed in the presence of heavy atoms. In this paper, it is shown that the limits fixed by the Sheldrick rule may be relaxed if the structure contains heavy atoms and that ab initio techniques can succeed even when the data resolution is about 2 Å, a limit unthinkable a few years ago. The method is successful in solving a structure with 7890 non-H atoms in the asymmetric unit at a resolution of 1.65 Å, a considerable advance on the previous record of 6319 atoms at atomic resolution.


Author(s):  
T. A. Welton

Various authors have emphasized the spatial information resident in an electron micrograph taken with adequately coherent radiation. In view of the completion of at least one such instrument, this opportunity is taken to summarize the state of the art of processing such micrographs. We use the usual symbols for the aberration coefficients, and supplement these with ξ and δ for the transverse coherence length and the fractional energy spread respectively. We also assume a weak, biologically interesting sample, with principal interest lying in the molecular skeleton remaining after obvious hydrogen loss and other radiation damage have occurred.


Author(s):  
James F. Hainfeld

Lipids are an important class of molecules, being found in membranes, HDL, LDL, and other natural structures, serving essential roles in structure and with varied functions such as compartmentalization and transport. Synthetic liposomes are also widely used as delivery and release vehicles for drugs, cosmetics, and other chemicals; soap is made from lipids. Lipids may form bilayer or multilamellar vesicles, micelles, sheets, tubes, and other structures. Lipid molecules may be linked to proteins, carbohydrates, or other moieties. EM study of this essential ingredient of life has lagged, due to the lack of direct methods to visualize lipids without extensive alteration. OsO4 reacts with double bonds in membrane phospholipids, forming crossbridges. This has been the method of choice to both fix and stain membranes, thus far. An earlier work described the use of tungstate clusters (W11) attached to lipid moieties to form lipid structures and lipid probes.


Author(s):  
G. W. Hacker ◽  
I. Zehbe ◽  
J. Hainfeld ◽  
A.-H. Graf ◽  
C. Hauser-Kronberger ◽  
...  

In situ hybridization (ISH) with biotin-labeled probes is increasingly used in histology, histopathology and molecular biology to detect genetic nucleic acid sequences of interest, such as viruses, genetic alterations and peptide-/protein-encoding messenger RNA (mRNA). In situ polymerase chain reaction (PCR) (PCR in situ hybridization = PISH) and the new in situ self-sustained sequence replication-based amplification (3SR) method even allow the detection of single copies of DNA or RNA in cytological and histological material. However, there are a number of considerable problems with the in situ PCR methods available today: false positives due to mis-priming on DNA breakdown products contained in several types of cells, causing non-specific incorporation of label in direct methods, and re-diffusion artefacts of amplicons into previously negative cells have been observed. To avoid these problems, super-sensitive ISH procedures can be used, and it is well known that the sensitivity and outcome of these methods partially depend on the detection system used.


Author(s):  
Carl E. Henderson

Over the past few years it has become apparent in our multi-user facility that the computer system and software supplied in 1985 with our CAMECA CAMEBAX-MICRO electron microprobe analyzer has the greatest potential for improvement and updating of any component of the instrument. While the standard CAMECA software running on a DEC PDP-11/23+ computer under the RSX-11M operating system can perform almost any task required of the instrument, the commands are not always intuitive and can be difficult to remember for the casual user (of which our laboratory has many). Given the widespread and growing use of other microcomputers (such as PCs and Macintoshes) by users of the microprobe, the PDP has become the "oddball" and has also fallen behind the state of the art in terms of processing speed and disk storage capabilities. Upgrade paths within products available from DEC are considered to be too expensive for the benefits received. After using a Macintosh for other tasks in the laboratory, such as instrument use and billing records, word processing, and graphics display, its unique and "friendly" user interface suggested an easier-to-use system for computer control of the electron microprobe automation. Specifically, a Macintosh IIx was chosen for its capacity for third-party add-on cards used in instrument control.


2010 ◽  
Vol 20 (1) ◽  
pp. 9-13 ◽  
Author(s):  
Glenn Tellis ◽  
Lori Cimino ◽  
Jennifer Alberti

The purpose of this article is to provide clinical supervisors with information pertaining to state-of-the-art clinic observation technology. We use a novel video-capture technology, the Landro Play Analyzer, to supervise clinical sessions as well as to train students to improve their clinical skills. We can observe four clinical sessions simultaneously from a central observation center. In addition, speech samples can be analyzed in real time; saved on a CD, DVD, or flash/jump drive; viewed in slow motion; paused; and analyzed with Microsoft Excel. Procedures for applying the technology for clinical training and supervision will be discussed.


1995 ◽  
Vol 38 (5) ◽  
pp. 1126-1142 ◽  
Author(s):  
Jeffrey W. Gilger

This paper is an introduction to behavioral genetics for researchers and practitioners in language development and disorders. The specific aims are to illustrate some essential concepts and to show how behavioral genetic research can be applied to the language sciences. Past genetic research on language-related traits has tended to focus on simple etiology (i.e., the heritability or familiality of language skills). The current state of the art, however, suggests that great promise lies in addressing more complex questions through behavioral genetic paradigms. In terms of future goals it is suggested that: (a) more behavioral genetic work of all types should be done, including replications and expansions of preliminary studies already in print; (b) work should focus on fine-grained, theory-based phenotypes with research designs that can address complex questions in language development; and (c) work in this area should utilize a variety of samples and methods (e.g., twin and family samples, heritability and segregation analyses, linkage and association tests, etc.).
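The "simple etiology" estimates the paper refers to can be made concrete with the classical twin design: Falconer's formula estimates heritability from monozygotic and dizygotic twin correlations as h² = 2(r_MZ − r_DZ). The correlations below are invented for illustration, not results from the paper.

```python
# Illustrative sketch of a classical twin-design heritability estimate
# (Falconer's formula); the input correlations are invented.
def falconer_ace(r_mz, r_dz):
    """Decompose trait variance into additive-genetic (A), shared-
    environment (C) and unique-environment (E) components from
    monozygotic and dizygotic twin correlations."""
    a2 = 2.0 * (r_mz - r_dz)   # heritability h^2
    c2 = r_mz - a2             # shared environment, i.e. 2*r_dz - r_mz
    e2 = 1.0 - r_mz            # unique environment plus measurement error
    return a2, c2, e2

a2, c2, e2 = falconer_ace(r_mz=0.80, r_dz=0.50)
# a2 ≈ 0.6, c2 ≈ 0.2, e2 ≈ 0.2
```

This is exactly the kind of single-number summary the paper argues should be a starting point rather than an endpoint, to be followed by segregation, linkage and association designs.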

