100 Years of Progress in Applied Meteorology. Part I: Basic Applications

2018 ◽  
Vol 59 ◽  
pp. 22.1-22.33 ◽  
Author(s):  
Sue Ellen Haupt ◽  
Robert M. Rauber ◽  
Bruce Carmichael ◽  
Jason C. Knievel ◽  
James L. Cogan

Abstract The field of atmospheric science has been enhanced by its long-standing collaboration with entities with specific needs. This chapter and the two subsequent ones describe how applications have worked to advance the science at the same time that the science has served the needs of society. This chapter briefly reviews the synergy between the applications and advancing the science. It specifically describes progress in weather modification, aviation weather, and applications for security. Each of these applications has resulted in enhanced understanding of the physics and dynamics of the atmosphere, new and improved observing equipment, better models, and a push for greater computing power.

Author(s):  
S.J.B. Reed

Characteristic fluorescence

The theory of characteristic fluorescence corrections was first developed by Castaing. The same approach, with an improved expression for the relative primary x-ray intensities of the exciting and excited elements, was used by Reed, who also introduced some simplifications, which may be summarized as follows (with reference to K-K fluorescence, i.e. the K radiation of element ‘B’ exciting the K radiation of ‘A’):

1. The exciting radiation is assumed to be monochromatic, consisting of the Kα line only (neglecting the Kβ line).
2. Various parameters are lumped together in a single tabulated function J(A), which is assumed to be independent of B.
3. For calculating the absorption of the emerging fluorescent radiation, the depth distribution of the primary radiation B is represented by a simple exponential.

These approximations may no longer be justifiable given the much greater computing power now available. For example, the contribution of the Kβ line can easily be calculated separately.
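Simplification 3 is what makes the absorption of the emerging fluorescent radiation tractable in closed form. A minimal numerical sketch (the function names and parameter values are ours, purely illustrative, not Reed's notation): if the primary depth distribution is taken as σ·exp(−σz) and the emerging radiation is attenuated as exp(−χz), where χ stands for the mass absorption coefficient times the cosecant of the take-off angle, the escaping fraction reduces to σ/(σ + χ):

```python
import math

def escape_fraction(sigma, chi):
    """Closed-form escaping fraction when the primary depth distribution
    is phi(z) = sigma * exp(-sigma * z) (simplification 3) and the
    fluorescent radiation is attenuated as exp(-chi * z) on the way out:
    integral_0^inf sigma*exp(-sigma*z)*exp(-chi*z) dz = sigma/(sigma+chi)."""
    return sigma / (sigma + chi)

def escape_fraction_numeric(sigma, chi, n=200_000):
    """Brute-force midpoint integration of the same integral, as a check."""
    zmax = 30.0 / (sigma + chi)  # integrand is negligible beyond this depth
    dz = zmax / n
    total = 0.0
    for i in range(n):
        z = (i + 0.5) * dz
        total += sigma * math.exp(-(sigma + chi) * z) * dz
    return total
```

With greater computing power, the numeric route generalizes directly to non-exponential depth distributions, which is exactly why the simplification is no longer forced.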


Author(s):  
Stuart McKernan

For many years, quantitative diffraction contrast experiments consisted largely of determining dislocation Burgers vectors using the g·b = 0 criterion from several different two-beam images. Since the advent of the personal computer revolution, the computing power available for image-processing and image-simulation calculations has become enormous and ubiquitous. Several programs now exist to simulate diffraction contrast images using various approximations. The most common approximations are the use of only two beams or a single systematic row to calculate the image contrast, or the use of a column approximation. The growing body of literature comparing experimental and simulated images shows that very close agreement between the two can be obtained, provided that the choice of parameters used, and the assumptions made, in performing the calculation are properly dealt with. The simulation of images of defects in materials has, in many cases, therefore become a tractable problem.
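The two-beam, column-approximation calculation mentioned above can be sketched in a few lines. This is our own illustrative integration of the Howie-Whelan equations for a perfect crystal, without absorption, and is not taken from any of the simulation programs discussed; xi_g is the extinction distance and s the excitation error, in the same length units as the thickness:

```python
import math

def two_beam_column(thickness, xi_g, s, steps=20000):
    """Integrate the two-beam Howie-Whelan equations (no absorption)
    down a single column, i.e. the column approximation:
      d(phi_0)/dz = (i*pi/xi_g) * phi_g
      d(phi_g)/dz = (i*pi/xi_g) * phi_0 + 2*pi*i*s * phi_g
    Returns (bright-field, dark-field) intensities at the exit surface."""
    phi0, phig = 1.0 + 0j, 0.0 + 0j
    dz = thickness / steps
    for _ in range(steps):
        d0 = (1j * math.pi / xi_g) * phig
        dg = (1j * math.pi / xi_g) * phi0 + 2j * math.pi * s * phig
        phi0 += d0 * dz
        phig += dg * dz
    return abs(phi0) ** 2, abs(phig) ** 2
```

For a defect, s would become a function of depth through the local lattice displacement, which is what the image-simulation programs evaluate column by column.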


Author(s):  
Jose-Maria Carazo ◽  
I. Benavides ◽  
S. Marco ◽  
J.L. Carrascosa ◽  
E.L. Zapata

Obtaining the three-dimensional (3D) structure of negatively stained biological specimens at a resolution of, typically, 2-4 nm is becoming a relatively common practice in an increasing number of laboratories. A combination of new conceptual approaches, new software tools, and faster computers has made this possible. However, all these 3D reconstruction processes are quite computer-intensive, and the medium-term future is full of proposals entailing an even greater need for computing power. Up to now, all published 3D reconstructions in this field have been performed on conventional (sequential) computers, but new parallel computer architectures offer potential order-of-magnitude increases in computing power and should therefore be considered for the most computing-intensive tasks.

We have studied both shared-memory computer architectures, like the BBN Butterfly, and local-memory architectures, mainly hypercubes implemented on transputers, where we have used the algorithmic mapping method proposed by Zapata et al. In this work we have developed the basic software tools needed to obtain a 3D reconstruction from non-crystalline specimens (“single particles”) using the so-called Random Conical Tilt Series Method. We start from a pair of images presenting the same field, first tilted (by ≃55°) and then untilted. It is then assumed that we can supply the system with the image of the particle we are looking for (ideally, a 2D average from a previous study) and with a matrix describing the geometrical relationships between the tilted and untilted fields (this step is now accomplished by interactively marking a few pairs of corresponding features in the two fields). From here on, the 3D reconstruction process may be run automatically.
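The matrix relating the tilted and untilted fields can be estimated from the interactively marked feature pairs by a least-squares fit. A minimal sketch, under the simplifying assumption that a 2D affine transform suffices (the function names are ours, and this is not the authors' actual implementation):

```python
import numpy as np

def fit_affine(untilted_pts, tilted_pts):
    """Least-squares 2D affine transform mapping untilted coordinates to
    tilted ones, from a few manually marked pairs of corresponding
    features. Returns a 2x3 matrix [M | t]."""
    src = np.asarray(untilted_pts, float)
    dst = np.asarray(tilted_pts, float)
    ones = np.ones((len(src), 1))
    A = np.hstack([src, ones])                         # n x 3 design matrix
    coeffs, *_ = np.linalg.lstsq(A, dst, rcond=None)   # 3 x 2 solution
    return coeffs.T                                    # 2 x 3: [M | t]

def apply_affine(T, pts):
    """Apply the fitted transform to an array of 2D points."""
    pts = np.asarray(pts, float)
    return pts @ T[:, :2].T + T[:, 2]
```

Three non-collinear pairs determine the transform exactly; marking a few extra pairs over-determines it and averages out marking error, which is consistent with the interactive step described above.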


2019 ◽  
Vol 12 (1) ◽  
pp. 47-60
Author(s):  
László Kota

Artificial intelligence has undergone enormous development since its appearance in the 1950s. Computing power has grown exponentially since then, enabling the use of artificial intelligence applications in different areas. Artificial intelligence applications are no longer confined to industry; they have slowly conquered households as well. Their use in logistics is becoming more and more widespread; self-driving cars and trucks are obvious examples. In this paper, the author attempts to summarize and present artificial intelligence applications in logistics, their development, and their impact on logistics.


2020 ◽  
Vol 2 ◽  
pp. 58-61 ◽  
Author(s):  
Syed Junaid ◽  
Asad Saeed ◽  
Zeili Yang ◽  
Thomas Micic ◽  
Rajesh Botchu

Advances in deep learning algorithms, exponential growth in computing power, and the unprecedented availability of digital patient data have led to a wave of interest and investment in artificial intelligence in health care. No radiology conference is complete without a substantial dedication to AI. Many radiology departments are keen to get involved but are unsure of where and how to begin. This short article provides a simple road map to help departments get involved with the technology, demystify key concepts, and pique an interest in the field. We have broken the journey down into seven steps: problem, team, data, kit, neural network, validation, and governance.


2019 ◽  
Vol 12 (3) ◽  
pp. 202-211
Author(s):  
Yuancheng Li ◽  
Rong Huang ◽  
Xiangqian Nie

Background: With the rapid development of the Internet, the amount of web spam has increased dramatically in recent years, wasting search engine storage and computing power on a massive scale. To identify web spam effectively, the content features, link features, hidden features and quality features of web pages are integrated to establish a corresponding web spam identification index system. However, the dimensions of this index system are highly correlated. Methods: A stacked autoencoder neural network (SAE), an improved form of autoencoder, is used to reduce the dimensionality of the web spam identification index system. Results: The experimental results show that our method effectively reduces the dimensionality of the web spam indexes and significantly improves the recognition rate in subsequent work. Conclusion: An autoencoder-based web spam index reduction method is proposed in this paper. The experimental results show that it greatly reduces the temporal and spatial complexity of future web spam detection models.
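A single autoencoder layer, the building block that is stacked in the SAE described above, can be sketched as follows. This is a minimal illustration of ours, not the paper's implementation; the sigmoid encoder, linear decoder, and hyperparameters are illustrative choices:

```python
import numpy as np

def train_autoencoder(X, n_hidden=4, lr=0.1, epochs=500, seed=0):
    """Train one autoencoder layer by plain gradient descent.
    The hidden activations are the reduced representation; stacking
    means training another layer on those activations.
    Returns (codes, reconstruction mse)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W1 = rng.normal(0.0, 0.1, (d, n_hidden))  # encoder weights
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(0.0, 0.1, (n_hidden, d))  # decoder weights
    b2 = np.zeros(d)
    for _ in range(epochs):
        H = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))  # sigmoid encoder
        Xhat = H @ W2 + b2                        # linear decoder
        err = Xhat - X
        # backpropagation of the squared reconstruction error
        gW2 = H.T @ err / n
        gb2 = err.mean(0)
        dH = err @ W2.T * H * (1.0 - H)
        gW1 = X.T @ dH / n
        gb1 = dH.mean(0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    H = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))
    return H, ((H @ W2 + b2 - X) ** 2).mean()
```

Correlated index dimensions are exactly the case where such a layer pays off: the hidden code captures the shared variation, so a lower-dimensional input can be handed to the downstream spam classifier.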


2017 ◽  
Author(s):  
Solomon Bililign ◽  
Keith A. Schimmel ◽  
Ademe Mekonnen ◽  
Yuh-Lang Lin ◽  
...  