The Computing Power of Determinism and Reversibility in Chemical Reaction Automata

Author(s):
Fumiya Okubo
Takashi Yokomori


Author(s):
S.J.B. Reed

Characteristic fluorescence

The theory of characteristic fluorescence corrections was first developed by Castaing. The same approach, with an improved expression for the relative primary x-ray intensities of the exciting and excited elements, was used by Reed, who also introduced some simplifications, which may be summarized as follows (with reference to K-K fluorescence, i.e. K radiation of element ‘B’ exciting K radiation of ‘A’):

1. The exciting radiation is assumed to be monochromatic, consisting of the Kα line only (the Kβ line is neglected).
2. Various parameters are lumped together in a single tabulated function J(A), which is assumed to be independent of B.
3. For calculating the absorption of the emerging fluorescent radiation, the depth distribution of the primary radiation B is represented by a simple exponential.

These approximations may no longer be justifiable given the much greater computing power now available. For example, the contribution of the Kβ line can easily be calculated separately.
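The point about handling the Kβ line separately can be illustrated with a short sketch. This is not Castaing's or Reed's formulation: it only shows the structure of summing a per-line contribution over the exciting lines of element B instead of assuming monochromatic Kα excitation. The function line_contribution is a placeholder, and the Fe line energies and relative weights are approximate values used purely for the demo.

```python
# Schematic sketch only: sum the fluorescence contribution over each
# exciting line of element B, weighted by its relative intensity,
# instead of treating the exciting radiation as monochromatic K-alpha.

def line_contribution(energy_keV, weight):
    """Placeholder for a per-line fluorescence kernel.

    In a real correction this would involve absorption coefficients,
    fluorescence yields, ionization cross-sections, etc.; here it is a
    dummy expression used only to show the summation structure.
    """
    return weight * energy_keV

def fluorescence_correction(lines_of_B):
    """Sum contributions from all exciting lines of element B.

    lines_of_B: list of (energy_keV, relative_weight) tuples.
    """
    return sum(line_contribution(e, w) for e, w in lines_of_B)

# Monochromatic K-alpha approximation vs. K-alpha plus K-beta,
# using approximate Fe line energies (~6.40 and ~7.06 keV) and a
# roughly 0.88 : 0.12 intensity split for illustration.
fe_kalpha_only = fluorescence_correction([(6.40, 1.00)])
fe_with_kbeta  = fluorescence_correction([(6.40, 0.88), (7.06, 0.12)])
print(fe_kalpha_only, fe_with_kbeta)
```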


Author(s):  
Stuart McKernan

For many years, the notion of a quantitative diffraction contrast experiment amounted to determining dislocation Burgers vectors using a g·b = 0 criterion from several different two-beam images. Since the advent of the personal computer revolution, the computing power available for image-processing and image-simulation calculations has become enormous and ubiquitous. Several programs now exist to simulate diffraction contrast images using various approximations. The most common approximations are the use of only two beams or a single systematic row to calculate the image contrast, and the use of a column approximation. The growing body of literature comparing experimental and simulated images shows that it is possible to obtain very close agreement between the two, although the choice of parameters used and the assumptions made in performing the calculation must be dealt with properly. The simulation of images of defects in materials has therefore, in many cases, become a tractable problem.
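As an illustration of the kind of calculation such programs perform, the sketch below integrates the two-beam Howie-Whelan equations down a single crystal column, i.e. the two-beam plus column approximation mentioned above, with absorption neglected. It is not taken from any of the simulation packages discussed, and the parameter values are purely illustrative.

```python
# Minimal two-beam, column-approximation sketch (Howie-Whelan equations,
# absorption neglected).  A defect displacement field R(z) would enter
# through the dg_dR term as g . dR/dz; the default of zero gives a
# perfect crystal.
import numpy as np

def two_beam_column(thickness_nm, xi_g_nm, s_per_nm,
                    dg_dR=lambda z: 0.0, steps=2000):
    """Return (bright-field, dark-field) intensities for one column."""
    dz = thickness_nm / steps

    def deriv(p0, pg, s_local):
        # Coupled amplitude equations for the direct (p0) and diffracted (pg) beams.
        d0 = 1j * np.pi / xi_g_nm * pg
        dg = 1j * np.pi / xi_g_nm * p0 + 2j * np.pi * s_local * pg
        return d0, dg

    phi0, phig = 1.0 + 0j, 0.0 + 0j          # all intensity starts in the direct beam
    for i in range(steps):
        z = (i + 0.5) * dz
        s_local = s_per_nm + dg_dR(z)        # deviation parameter plus defect term
        # Midpoint (RK2) integration step.
        k0a, kga = deriv(phi0, phig, s_local)
        k0b, kgb = deriv(phi0 + 0.5 * dz * k0a, phig + 0.5 * dz * kga, s_local)
        phi0 += dz * k0b
        phig += dz * kgb
    return abs(phi0) ** 2, abs(phig) ** 2

# Perfect-crystal check: thickness fringes with period ~ xi_g at s = 0.
for t in (25.0, 50.0, 75.0, 100.0):
    bf, df = two_beam_column(t, xi_g_nm=50.0, s_per_nm=0.0)
    print(f"t = {t:5.1f} nm   BF = {bf:.3f}   DF = {df:.3f}")
```

At s = 0 the result reproduces the expected pendellösung behaviour (intensity oscillating between the direct and diffracted beams with depth period ξg), which is a convenient sanity check before adding a displacement field.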


Author(s):  
Dai Dalin
Guo Jianmin

Lipid cytochemistry has not yet advanced far at the EM level. A major problem has been the loss of lipid during dehydration and embedding. Although fixation with glutaraldehyde and osmium tetroxide accelerates the chemical reaction with lipid, and osmium tetroxide can react with the double bonds of unsaturated lipid to form osmium black, osmium tetroxide is not reduced by saturated lipid, and some unsaturated lipid is subsequently lost during dehydration. To reduce the loss of lipid that occurs with the traditional method, some researchers have adopted new approaches, such as changes to the embedding procedure and the adoption of new embedding media. In a sense, these new methods are effective; however, they usually require a long preparation period. In this paper, we investigate the floral nectary structure of Lauraceae by a rapid-embedding method with PEG under the electron microscope and attempt to find a better method to solve the problem mentioned above.


Author(s):  
Jose-Maria Carazo
I. Benavides
S. Marco
J.L. Carrascosa
E.L. Zapata

Obtaining the three-dimensional (3D) structure of negatively stained biological specimens at a resolution of, typically, 2-4 nm is becoming relatively common practice in an increasing number of laboratories. A combination of new conceptual approaches, new software tools, and faster computers has made this situation possible. However, all these 3D reconstruction processes are quite computer-intensive, and the medium-term future is full of proposals entailing an even greater need for computing power. Up to now, all published 3D reconstructions in this field have been performed on conventional (sequential) computers, but new parallel computer architectures offer the potential for order-of-magnitude increases in computing power and should therefore be considered for possible application to the most computing-intensive tasks.

We have studied both shared-memory computer architectures, such as the BBN Butterfly, and local-memory architectures, mainly hypercubes implemented on transputers, where we have used the algorithmic mapping method proposed by Zapata et al. In this work we have developed the basic software tools needed to obtain a 3D reconstruction from non-crystalline specimens (“single particles”) using the so-called Random Conical Tilt Series Method. We start from a pair of images of the same field, first tilted (by ≃55°) and then untilted. It is then assumed that we can supply the system with an image of the particle we are looking for (ideally, a 2D average from a previous study) and with a matrix describing the geometrical relationship between the tilted and untilted fields (this step is currently accomplished by interactively marking a few pairs of corresponding features in the two fields). From there on, the 3D reconstruction process may be run automatically.
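One preparatory step mentioned above, relating coordinates in the tilted and untilted fields from a few interactively marked pairs of corresponding features, can be sketched as a least-squares fit. The snippet below is an illustrative assumption about how such a matrix might be estimated, not the authors' implementation, and the point coordinates are invented for the demo.

```python
# Estimate an affine mapping tilted ~= A @ untilted + b from a few
# marked point pairs, using ordinary least squares.
import numpy as np

def fit_affine(untilted_xy, tilted_xy):
    """Fit tilted ~= A @ untilted + b from N >= 3 point pairs.

    untilted_xy, tilted_xy: arrays of shape (N, 2).
    Returns the 2x2 matrix A and the translation vector b.
    """
    untilted_xy = np.asarray(untilted_xy, dtype=float)
    tilted_xy = np.asarray(tilted_xy, dtype=float)
    n = untilted_xy.shape[0]
    # Design matrix with a constant column for the translation term.
    X = np.hstack([untilted_xy, np.ones((n, 1))])
    coeffs, *_ = np.linalg.lstsq(X, tilted_xy, rcond=None)   # shape (3, 2)
    A = coeffs[:2].T
    b = coeffs[2]
    return A, b

# Toy usage: three marked pairs (pixel coordinates made up for the demo;
# the tilted field is roughly foreshortened along x, as for a ~55° tilt).
untilted = [(100, 120), (400, 130), (250, 380)]
tilted   = [( 60, 118), (235, 131), (150, 379)]
A, b = fit_affine(untilted, tilted)
print("A =\n", A, "\nb =", b)
```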


2019
Vol 12 (1)
pp. 47-60
Author(s):  
László Kota

Artificial intelligence has undergone enormous development since its appearance in the fifties. Computing power has grown exponentially since then, enabling the use of artificial intelligence applications in different areas. Artificial intelligence applications are no longer confined to industry; they have slowly conquered households as well. Their use in logistics is becoming more and more widespread; just think of self-driving cars and trucks. In this paper, the author attempts to summarize and present artificial intelligence applications in logistics, their development, and their impact on the field.

