Direct measurement of electron-diffraction-pattern intensities using an energy loss spectrometer

Author(s):  
A. G. Jackson ◽  
M. Rowe

Diffraction intensities from intermetallic compounds are, in the kinematic approximation, proportional to the scattering amplitude of the element doing the scattering. More detailed calculations have shown that site symmetry and occupation by various atom species also affect the intensity in a diffracted beam. [1] Hence, by measuring the intensities of beams, or their ratios, the occupancy can be estimated. Measurement of the intensity values also allows structure calculations to be made to determine the spatial distribution of the potentials doing the scattering. Thermal effects are also present as a background contribution. Inelastic effects such as loss or absorption/excitation complicate the intensity behavior, and dynamical theory is required to estimate the intensity value. The dynamic range of currents in diffracted beams can be 10^4 or 10^5:1. Hence, detection of such information requires a means for collecting the intensity over a signal-to-noise range beyond that obtainable with a single film plate, which has a S/N of about 10^3:1. Although such a collection system is not available currently, a simple system consisting of instrumentation on an existing STEM can be used as a proof of concept; it has a S/N of about 255:1, limited by the 8-bit pixel attributes used in the electronics. Use of 24-bit pixel attributes would easily allow the desired noise range to be attained in the processing instrumentation. The S/N of the scintillator used by the photoelectron sensor is about 10^6:1, well beyond the S/N goal. The trade-off that must be made is the time for acquiring the signal, since the pattern can be obtained in seconds using film plates, compared to 10 to 20 minutes for a pattern acquired using the digital scan. Parallel acquisition would, of course, speed up this process immensely.
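
The bit-depth figures quoted above follow from simple arithmetic: an n-bit pixel resolves 2^n - 1 levels. A minimal Python sketch (our illustration, not part of the original instrumentation):

# Illustrative arithmetic only: the dynamic range afforded by n-bit pixels.
def pixel_dynamic_range(bits: int) -> int:
    """Largest signal-to-noise ratio representable as (2^n - 1):1."""
    return 2 ** bits - 1

for bits in (8, 16, 24):
    print(f"{bits:2d}-bit pixels -> {pixel_dynamic_range(bits):>10,}:1")
# 8-bit  ->        255:1  (the proof-of-concept limit quoted above)
# 24-bit -> 16,777,215:1  (well beyond the 10^4-10^5:1 beam dynamic range)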

Author(s):  
Holger Gruen ◽  
Carsten Benthin ◽  
Sven Woop

We propose a simple, easy-to-integrate approach to accelerate ray tracing of alpha-tested transparent geometry, with a focus on the Microsoft® DirectX® and Vulkan® ray tracing extensions. Pre-computed bit masks are used to quickly determine fully transparent and fully opaque regions of triangles, thereby skipping the more expensive alpha-test operation. These bit masks allow us to skip up to 86% of all transparency tests, yielding up to 40% speed-up in a proof-of-concept software-only DirectX® implementation.
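
A rough Python sketch of the idea (the 8x8 mask granularity, the 0.5 alpha threshold, and all names here are our assumptions; the paper targets DirectX®/Vulkan® any-hit shaders): each triangle's alpha footprint is pre-baked into an opaque mask and a transparent mask, so most hits are resolved by one bit test instead of an alpha-texture fetch.

# Sketch of precomputed alpha bit masks (8x8 grid per triangle; illustrative only).
OPAQUE, TRANSPARENT, MIXED = "opaque", "transparent", "mixed"

def build_masks(alpha_grid):
    """alpha_grid: 64 cells, each holding (min, max) alpha over its texels."""
    opaque_mask = 0
    transparent_mask = 0
    for i, (a_min, a_max) in enumerate(alpha_grid):
        if a_min >= 0.5:          # every texel in the cell passes the alpha test
            opaque_mask |= 1 << i
        elif a_max < 0.5:         # every texel fails: the cell is fully see-through
            transparent_mask |= 1 << i
    return opaque_mask, transparent_mask

def classify_hit(opaque_mask, transparent_mask, cell_index):
    """Resolve a ray hit without the expensive alpha-texture fetch when possible."""
    bit = 1 << cell_index
    if opaque_mask & bit:
        return OPAQUE        # accept the hit immediately
    if transparent_mask & bit:
        return TRANSPARENT   # ignore the hit immediately
    return MIXED             # fall back to the full alpha test

# Usage: only hits in mixed cells need the full alpha test.
masks = build_masks([(1.0, 1.0)] * 32 + [(0.0, 0.0)] * 32)
print(classify_hit(*masks, cell_index=3))    # 'opaque'
print(classify_hit(*masks, cell_index=40))   # 'transparent'

Only hits landing in mixed cells fall back to the texture lookup, which is how the masks can resolve up to 86% of the tests reported above.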


Author(s):  
Christian Rauch ◽  
Thomas Hörmann ◽  
Sebastian Jagsch ◽  
Raimund Almbauer

Much attention has been paid recently by research and development engineers to performing multi-physics calculations. One way to do this is to couple commercial tools for examining complex systems. Since the proposal of a software architecture for coupling programs, published in a previous paper, significant changes have led to improved performance for large-scale industrial applications. This architecture is described, and as a proof of concept a simulation is conducted by coupling two commercial solvers. The speed-up of the new system is presented. The simulation results are then compared with measurements of surface temperatures of the exhaust system of an actual sport utility vehicle (SUV), and conclusions are drawn. The proposed architecture is easily adaptable to various programs, as it is implemented in C++ and changes for a specific code can be restricted to a few classes.
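
The abstract gives no API details; purely as a generic illustration of the "few classes per code" idea, a coupling layer can isolate each solver behind a small adapter interface (Python sketch with invented names):

# Generic adapter sketch: each solver is wrapped in one small class, so
# supporting a new code means writing one adapter, not touching the coupler.
from abc import ABC, abstractmethod

class SolverAdapter(ABC):
    @abstractmethod
    def advance(self, dt: float) -> None: ...          # run one coupling step
    @abstractmethod
    def get_field(self, name: str): ...                # e.g. surface temperature
    @abstractmethod
    def set_field(self, name: str, values) -> None: ...

class Coupler:
    def __init__(self, thermal: SolverAdapter, flow: SolverAdapter):
        self.thermal, self.flow = thermal, flow

    def step(self, dt: float) -> None:
        # Exchange boundary data, then advance both solvers by one step.
        self.thermal.set_field("heat_flux", self.flow.get_field("heat_flux"))
        self.flow.set_field("wall_temperature",
                            self.thermal.get_field("wall_temperature"))
        self.thermal.advance(dt)
        self.flow.advance(dt)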


2020 ◽  
Vol 29 (11) ◽  
pp. 2030008
Author(s):  
Raj Kumar ◽  
Ritesh Kumar Jaiswal ◽  
Ram Awadh Mishra

Modulo multipliers have been attracting considerable attention as one of the essential components of residue number system (RNS)-based computational circuits. This paper contributes, for the first time, a comprehensive review of the design of modulo [Formula: see text] multipliers. The modulo multipliers can be implemented using ROM (look-up table) as well as VLSI components (memoryless); the former is preferable for smaller word-lengths and the latter for larger ones. The modularity and parallelism properties of RNS are used to improve the performance of memoryless multipliers. Moreover, a Booth-encoding algorithm is used to speed up the multipliers. Also, an advanced modulo [Formula: see text] multiplier based on redundant RNS (RRNS) could be further chosen for very high dynamic range. These perspectives of modulo [Formula: see text] multipliers have been extensively studied against the recent state of the art and analyzed using the Synopsys Design Compiler tool.
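
RNS multipliers are most often built for moduli of the form 2^n ± 1 (which the formula placeholders above presumably denote). As a purely illustrative sketch, not one of the designs reviewed in the paper, a modulo (2^n - 1) product can be reduced with the end-around-carry identity 2^n ≡ 1 (mod 2^n - 1):

# Illustrative only: multiply modulo (2^n - 1) via end-around-carry reduction,
# the identity hardware modulo multipliers exploit.
def mul_mod_2n_minus_1(a: int, b: int, n: int) -> int:
    mod = (1 << n) - 1
    p = a * b                       # up-to-2n-bit product
    while p > mod:                  # fold high bits back: 2^n == 1 (mod 2^n - 1)
        p = (p & mod) + (p >> n)
    return 0 if p == mod else p     # 2^n - 1 == 0 in this residue system

assert mul_mod_2n_minus_1(200, 123, 8) == (200 * 123) % 255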


Author(s):  
D. Ye ◽  
L. Veen ◽  
A. Nikishova ◽  
J. Lakhlili ◽  
W. Edeling ◽  
...  

Uncertainty quantification (UQ) is a key component when using computational models that involve uncertainties, e.g. in decision-making scenarios. In this work, we present uncertainty quantification patterns (UQPs) that are designed to support the analysis of uncertainty in coupled multi-scale and multi-domain applications. UQPs provide the basic building blocks to create tailored UQ for multiscale models. The UQPs are implemented as generic templates, which can then be customized and aggregated to create a dedicated UQ procedure for multiscale applications. We present the implementation of the UQPs with the multiscale coupling toolkit Multiscale Coupling Library and Environment 3 (MUSCLE3). Potential speed-up for UQPs has been derived as well. As a proof of concept, two examples of multiscale applications using UQPs are presented. This article is part of the theme issue 'Reliability and reproducibility in computational science: implementing verification, validation and uncertainty quantification in silico'.
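
The simplest such pattern is a non-intrusive sampling wrapper around a black-box model. A toy Python sketch (our illustration; the paper's templates and their MUSCLE3 integration are more general):

# Toy non-intrusive Monte Carlo wrapper around a black-box model.
import random, statistics

def monte_carlo_uqp(model, draw_input, n_samples=100):
    """Propagate input uncertainty through `model` and summarise the output."""
    outputs = [model(draw_input()) for _ in range(n_samples)]
    return statistics.mean(outputs), statistics.stdev(outputs)

# Usage with a toy single-scale model and a normally distributed input:
toy_model = lambda x: x ** 2 + 1.0
draw_input = lambda: random.gauss(mu=1.0, sigma=0.1)
mean, std = monte_carlo_uqp(toy_model, draw_input)

Because the samples are independent, they can be dispatched concurrently, which is one source of the potential speed-up mentioned above.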


Author(s):  
J.A. Eades ◽  
S. Moore ◽  
T. Pfullmann ◽  
John Hangas

In the fall 1990 issue of the EMSA Bulletin, one of us launched an appeal, asking to be sent output from programs that simulate HOLZ lines. This unusual request was prompted by the discovery that a number of such programs (including some that are well respected and widely distributed) gave different results. Several people responded, and as a result we have been able to establish the reasons for the discrepancies - at least enough of them that we are confident in giving a sample result. All the programs we have considered calculate the positions of HOLZ lines in the kinematic approximation. However, it is generally assumed that a kinematic simulation will give a result that agrees with the full dynamical theory and with experiment, provided that the kinematic calculation is carried out using an adjusted (deliberately 'incorrect') value of the accelerating voltage. This will be true for lines close enough to the zone axis, and provided that a different voltage is used for each Laue zone.
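
For reference, the accelerating voltage enters a kinematic HOLZ simulation through the relativistic electron wavelength, so a small change in the assumed voltage rigidly shifts the computed line positions. A minimal Python helper (standard formula; our illustration, not one of the surveyed programs):

# Relativistic electron wavelength: the quantity through which the
# accelerating voltage enters a kinematic HOLZ-line calculation.
import math

H = 6.62607015e-34      # Planck constant, J s
M0 = 9.1093837015e-31   # electron rest mass, kg
E = 1.602176634e-19     # elementary charge, C
C = 2.99792458e8        # speed of light, m/s

def electron_wavelength(voltage: float) -> float:
    """Wavelength in metres for an accelerating voltage in volts."""
    ev = E * voltage
    return H / math.sqrt(2 * M0 * ev * (1 + ev / (2 * M0 * C ** 2)))

# At a nominal 100 kV, lambda is about 3.70 pm; tuning the voltage slightly
# shifts every line, which is why an 'incorrect' voltage can mimic dynamical shifts.
print(electron_wavelength(100e3))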


2013 ◽  
Vol 2013 ◽  
pp. 1-12 ◽  
Author(s):  
Darío Guerrero-Fernández ◽  
Juan Falgueras ◽  
M. Gonzalo Claros

Current genomic analyses often require managing and comparing big data using desktop bioinformatic software that was not developed with multicore distribution in mind. The task-farm SCBI_MAPREDUCE is intended to simplify the trivial parallelisation and distribution of new and legacy software and scripts for biologists who are interested in using computers but are not skilled programmers. In the case of legacy applications, there is no need to modify or rewrite the source code. It can be used from multicore workstations to heterogeneous grids. Tests have demonstrated that speed-up scales almost linearly and that distribution in small chunks increases it. It is also shown that SCBI_MAPREDUCE takes advantage of shared storage when necessary, is fault-tolerant, allows for resuming aborted jobs, does not need special hardware or virtual machine support, and provides the same results as the parallelised legacy software. The same is true for interrupted and relaunched jobs. As proof of concept, distribution of a compiled version of BLAST+ in the SCBI_DISTRIBUTED_BLAST gem is given, indicating that other BLAST binaries can be used while maintaining the same SCBI_DISTRIBUTED_BLAST code. Therefore, SCBI_MAPREDUCE suits most parallelisation and distribution needs in, for example, gene and genome studies.
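
The task-farm pattern itself is compact. A Python sketch of the manager/worker chunking (illustrative only: SCBI_MAPREDUCE is a Ruby gem, and its fault tolerance, job resumption, and shared-storage handling are not reproduced here):

# Minimal task farm: distribute small chunks over worker processes.
from multiprocessing import Pool

def process_chunk(chunk):
    # Stand-in for the real payload, e.g. invoking a BLAST+ binary per sequence.
    return [seq.upper() for seq in chunk]

def chunks(items, size):
    for i in range(0, len(items), size):
        yield items[i:i + size]

def task_farm(items, chunk_size=100, processes=4):
    """Farm chunks out to workers and merge the per-chunk results in order."""
    with Pool(processes) as pool:
        per_chunk = pool.map(process_chunk, chunks(items, chunk_size))
    return [result for chunk in per_chunk for result in chunk]

if __name__ == "__main__":
    sequences = ["acgt"] * 1000
    assert task_farm(sequences) == [s.upper() for s in sequences]

Small chunks keep all workers busy even when individual items take uneven time, which matches the observation above that distribution in small chunks increases the speed-up.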


F1000Research ◽  
2016 ◽  
Vol 4 ◽  
pp. 798 ◽  
Author(s):  
Edward P. Randviir ◽  
Samuel M. Illingworth ◽  
Matthew J. Baker ◽  
Matthew Cude ◽  
Craig E. Banks

The Royal Society of Chemistry held, to our knowledge, the world's first Twitter conference at 9 a.m. on February 5th, 2015. The conference was a Twitter-only conference, allowing researchers to upload academic posters as tweets, replacing a physical meeting. This paper reports the details of the event and discusses the outcomes, such as the potential for the use of social media to enhance scientific communication at conferences. In particular, the present work argues that social media outlets such as Twitter broaden audiences, speed up communication, and force clearer and more concise descriptions of a researcher's work. The benefits of poster presentations are also discussed in terms of potential knowledge exchange and networking. This paper serves as a proof-of-concept approach for improving both the public opinion of the poster and the enhancement of the poster through an innovative online format that some may feel more comfortable with, compared to face-to-face communication.


Author(s):  
Raul Jimenez Rosenberg ◽  
Raul Sierra-Alcocer

The work involved in checking millions of records by hand is hard and requires thousands of human hours. At the increasing rate at which we are collecting new data from different sources with a wide range of 'quality', the problem is getting worse. An institution like CONABIO (National Commission for the Knowledge and Use of Biodiversity, Mexico) dedicates a large amount of human resources to reviewing species records to ensure that the data published by the institution have high quality. At CONABIO we are designing a system to help us direct our attention to the most problematic data. Our methodology (Stephens et al. 2019) scores a species record according to the features of its location and labels it as suspicious if it has a low score. A low score means that the features of the location are unusual for that species. The features of a location are its abiotic characteristics, such as climate and topography, together with the occurrences of other species at that location. Although this does not mean that a record is wrong, it may be an indicator that the record needs to be assessed. The system we are designing works in two scenarios: in one, it scores new data based on parameters adjusted from validated data; in the second, it checks the database for consistency, that is, it flags records of a species that look like outliers with respect to the predominant distribution of records for that species. Our initial tests show that we could speed up the detection process for some problematic records. In one test, using data that had previously been labeled by hand, the method flagged 624 records, of which 70 were confirmed as incorrect. Judged on precision alone this might seem like poor performance; however, judged on the amount of work it might save us, it looks promising, because finding the same number of inaccurate records without assistance would have required reviewing almost 5,000 records. This talk is a proof of concept for this system and details our initial results, reviewing both strengths and weaknesses.
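
A toy Python sketch of the flagging step (our illustration; the actual scoring follows Stephens et al. 2019 and combines abiotic features with co-occurring species):

# Toy version of score-and-flag: a record is suspicious when the features of
# its location are rare among validated records of the same species.
from dataclasses import dataclass

@dataclass
class Record:
    species: str
    features: tuple  # e.g. (temperature_band, elevation_band)

def score(record, reference_counts):
    """Fraction of validated records of this species sharing these features."""
    seen = reference_counts.get(record.species, {})
    total = sum(seen.values()) or 1
    return seen.get(record.features, 0) / total

def flag_suspicious(records, reference_counts, threshold=0.05):
    return [r for r in records if score(r, reference_counts) < threshold]

# Usage: a species almost always seen in warm lowlands makes a cold-highland
# record suspicious and worth a manual check.
reference = {"sp1": {("warm", "lowland"): 95, ("cold", "highland"): 5}}
odd = Record("sp1", ("cold", "highland"))
print(flag_suspicious([odd], reference, threshold=0.10))  # [odd]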


Foods ◽  
2021 ◽  
Vol 10 (11) ◽  
pp. 2670
Author(s):  
Antoon Lievens ◽  
Valentina Paracchini ◽  
Danilo Pietretti ◽  
Linda Garlant ◽  
Alain Maquet ◽  
...  

The EU General Food Law aims not only to ensure food safety but also to 'prevent fraudulent or deceptive practices; the adulteration of food; and any other practices which may mislead the consumer'. Of particular concern is the deliberate, intentional substitution, partial or complete, of valuable ingredients (e.g., saffron) with less valuable ones. Due to the variety of products on the market, an approach to detecting food adulteration that works well for one species may not be easily applicable to another. Here we present a broadly applicable approach for detecting the substitution of biological materials, based on digital PCR. By simultaneously measuring and forecasting the number of genome copies in a sample, fraud is detectable as a discrepancy between these two values. Apart from the choice of target gene, the procedure is identical across all species. It is scalable, rapid, and has a high dynamic range. We provide proof of concept by presenting the analysis of 141 samples of saffron (Crocus sativus) from across the European market by DNA accounting, and the verification of these results by NGS analysis.
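
A schematic of the "DNA accounting" comparison in Python (illustrative only: the genome mass and the 50% tolerance below are invented placeholders, not values from the paper):

# Forecast genome copies from extracted DNA mass, compare with the dPCR count;
# a large shortfall of target copies suggests substitution by another species.
ASSUMED_GENOME_PG = 3.5  # placeholder 1C genome mass in picograms (assumption)

def expected_copies(dna_ng: float, genome_pg: float = ASSUMED_GENOME_PG) -> float:
    """Genome copies forecast from the mass of extracted DNA (1 ng = 1000 pg)."""
    return dna_ng * 1000.0 / genome_pg

def adulteration_flag(measured_copies: float, dna_ng: float,
                      tolerance: float = 0.5) -> bool:
    """Fraud shows up as measured target copies far below the forecast."""
    return measured_copies < tolerance * expected_copies(dna_ng)

# e.g. 10 ng of extracted DNA forecasts ~2857 genome copies; a dPCR count of
# only 900 target copies would flag likely substitution.
print(adulteration_flag(measured_copies=900, dna_ng=10))  # True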

