Automatically Correcting for Specimen Displacement Error During XRD Search/Match Identification

1981 ◽  
Vol 25 ◽  
pp. 231-236 ◽  
Author(s):  
Walter N. Schreiner ◽  
Ronald Jenkins

Over the past several years there has been considerable interest in computer search/match programs for qualitative analysis of powder diffraction patterns. This interest has been stimulated by the availability of modern minicomputers supported by relatively inexpensive mass storage devices capable of holding the entire JCPDS (1) data base on line. As the traditional search/match algorithms have been reviewed for possible implementation on the slower and memory-restricted minicomputers supplied with today's automated diffractometers, new ideas for such algorithms have emerged. One very extensive set of new algorithms has been developed by our group; these are contained in the SANDMAN search/match/identify program, which was described at this conference last year (2). Experience has shown those algorithms to be extremely effective, particularly in handling cases where the presence of systematic errors in the data has precluded correct analysis by other computerised search/match systems.
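To make the idea concrete, here is a minimal sketch (not the SANDMAN algorithm itself) of how a search/match step can scan for the specimen displacement that best reconciles an observed pattern with a reference pattern. The displacement aberration Δ(2θ) = -2s·cos θ/R is the standard Bragg-Brentano expression; the wavelength, goniometer radius, tolerance, and all function names are illustrative assumptions.

```python
import numpy as np

WAVELENGTH = 1.5406   # Cu K-alpha1, in angstroms (assumed for illustration)

def correct_two_theta(two_theta_deg, s_mm, radius_mm=185.0):
    """Remove the specimen-displacement shift delta(2theta) = -2*s*cos(theta)/R
    (radians), the standard Bragg-Brentano displacement aberration."""
    theta = np.radians(two_theta_deg) / 2.0
    shift_deg = np.degrees(-2.0 * (s_mm / radius_mm) * np.cos(theta))
    return two_theta_deg - shift_deg

def d_spacings(two_theta_deg, wavelength=WAVELENGTH):
    """Bragg's law: d = lambda / (2 sin theta)."""
    theta = np.radians(two_theta_deg) / 2.0
    return wavelength / (2.0 * np.sin(theta))

def match_score(observed_d, reference_d, tol=0.01):
    """Fraction of reference lines matched by some observed line within tol (A)."""
    observed_d = np.asarray(observed_d)
    hits = sum(np.any(np.abs(observed_d - d) < tol) for d in reference_d)
    return hits / len(reference_d)

def best_displacement(two_theta_obs, reference_d,
                      s_grid=np.linspace(-0.5, 0.5, 101)):
    """Scan trial displacements (mm); keep the one whose corrected pattern
    matches the reference best."""
    scores = [match_score(d_spacings(correct_two_theta(np.asarray(two_theta_obs), s)),
                          reference_d)
              for s in s_grid]
    i = int(np.argmax(scores))
    return s_grid[i], scores[i]
```

A real search/match program would also weight lines by intensity and repeat this against every candidate in the reference file; the grid scan over s is simply the plainest way to show displacement correction folded into the matching loop.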

1983 ◽  
Vol 27 ◽  
pp. 3-18
Author(s):  
L. K. Frevel

A condensed chronology of computer SEARCH/MATCH programs for the analysis of multiphase crystalline powders is presented, covering the years 1965 to 1981. For this period the various algorithms for searching a data base of standard powder diffraction patterns, {d_s, (I/I_1)_s}, are predicated on empirical “fingerprint” matching of the experimental powder data, {d_ν, I_ν}, with one or more of the standard patterns. Within the past year an interactive computer SEARCH/MATCH program has been developed based on a structure-sensitive SEARCH procedure. The evolution of computer SEARCH/MATCH procedures undoubtedly will continue and will be influenced by marked improvements in the quality of digitized, machine-readable powder diffraction data. Three conjectures are offered on future developments in x-ray diffractometry.


Author(s):  
William Krakow

In the past few years, on-line digital television frame-store devices coupled to computers have been employed in attempts to measure the microscope parameters of defocus and astigmatism. The ultimate goal of such work is to fully adjust the operating parameters of the microscope and obtain an optimum image for viewing in terms of its information content. The initial approach to this problem, for high-resolution TEM imaging, was to obtain the power spectrum from the Fourier transform of an image, find the contrast transfer function oscillation maxima, and subsequently correct the image. This technique requires a fast computer, a direct memory access device, and even an array processor to accomplish these tasks on limited-size arrays in a few seconds per image. It is not clear that the power spectrum could be used for more than defocus correction, since the correction of astigmatism is a formidable pattern-recognition problem.
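As a rough illustration of the power-spectrum approach described above, the sketch below computes a radially averaged power spectrum of a digitized image and picks out its local maxima, which is the raw information one would fit against a contrast-transfer-function model to estimate defocus. This is a generic sketch, not Krakow's implementation; the Hanning window and all names are assumptions.

```python
import numpy as np

def radial_power_spectrum(image, n_bins=128):
    """Radially averaged power spectrum |FFT|^2 of a 2-D image.
    A separable Hanning window suppresses edge artifacts."""
    win = np.hanning(image.shape[0])[:, None] * np.hanning(image.shape[1])[None, :]
    power = np.abs(np.fft.fftshift(np.fft.fft2(image * win))) ** 2
    ny, nx = power.shape
    y, x = np.indices(power.shape)
    r = np.hypot(y - ny // 2, x - nx // 2)   # distance from the spectrum centre
    bins = np.linspace(0.0, min(ny, nx) / 2.0, n_bins + 1)
    sums, _ = np.histogram(r, bins=bins, weights=power)
    counts, _ = np.histogram(r, bins=bins)
    return bins[:-1], sums / np.maximum(counts, 1)

def local_maxima(profile):
    """Indices of interior local maxima: candidate CTF oscillation rings."""
    p = np.asarray(profile)
    return np.where((p[1:-1] > p[:-2]) & (p[1:-1] > p[2:]))[0] + 1
```

Note that a purely radial average deliberately discards azimuthal information, which is exactly why astigmatism (an azimuthal variation of defocus) becomes the harder pattern-recognition problem the passage mentions.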


Author(s):  
W.J. de Ruijter ◽  
Sharma Renu

Established methods for measuring lattice spacings and angles of crystalline materials include x-ray diffraction, microdiffraction and HREM imaging. Structural information from HREM images is normally obtained off-line with the traveling-table microscope or by the optical diffractogram technique. We present a new method for precise measurement of lattice vectors from HREM images using an on-line computer connected to the electron microscope. It has already been established that an image of crystalline material can be represented by a finite number of sinusoids. The amplitude and phase of these sinusoids are affected by the microscope transfer characteristics, which are strongly influenced by the settings of defocus, astigmatism and beam alignment. However, the frequency of each sinusoid is solely a function of overall magnification and the periodicities present in the specimen. After proper calibration of the overall magnification, lattice vectors can therefore be measured unambiguously from HREM images.

Measurement of lattice vectors is a statistical parameter estimation problem similar to amplitude, phase and frequency estimation of sinusoids in one-dimensional signals, as encountered, for example, in radar, sonar and telecommunications. It is important to properly model the observations, the systematic errors and the non-systematic errors. The observations are modelled as a sum of (two-dimensional) sinusoids. In the present study the components of the frequency vector of the sinusoids are the only parameters of interest. Non-systematic errors in recorded electron images are described as white Gaussian noise; the most important systematic error is geometric distortion. Lattice vectors are measured using a two-step procedure. First, a coarse search is performed using a fast Fourier transform on an image section of interest. Prior to Fourier transformation the image section is multiplied by a window, which falls off gradually to zero at the edges. The user interactively indicates the periodicities of interest by selecting spots in the digital diffractogram. A fine search for each selected frequency is then implemented using a bilinear interpolation that depends on the window function. The estimate can be refined even further using a non-linear least-squares estimation; the first two steps provide the proper starting values for the numerical minimization (e.g. Gauss-Newton). This third step increases the precision by 30%, to the highest theoretically attainable (the Cramér-Rao lower bound).

In the present studies we use a Gatan 622 TV camera attached to the JEM 4000EX electron microscope. Image analysis is implemented on a MicroVAX II computer equipped with a powerful array processor and real-time image processing hardware. The typical precision, defined as the standard deviation of the distribution of measurement errors, is found to be <0.003 Å measured on single-crystal silicon and <0.02 Å measured on small (10-30 Å) specimen areas. These values are about ten times larger than predicted by theory. Furthermore, the measured precision is observed to be independent of signal-to-noise ratio (determined by the number of averaged TV frames). Evidently the precision is limited mainly by the geometric distortion of the TV camera. For this reason, we are replacing the Gatan 622 TV camera with a modern high-grade CCD-based camera system. Such a system not only has negligible geometric distortion, but also high dynamic range (>10,000) and high resolution (1024 × 1024 pixels). The geometric distortion of the projector lenses can be measured and corrected through re-sampling of the digitized image.
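The two-step procedure described above can be sketched compactly. The coarse search below is the windowed FFT peak pick; for the fine search, a three-point parabolic fit stands in for the authors' window-dependent bilinear interpolation, and the non-linear least-squares refinement is omitted. All names and the Hanning window choice are assumptions for illustration.

```python
import numpy as np

def windowed_spectrum(image):
    """Windowed 2-D FFT magnitude; the window tapers the image to zero at the edges."""
    win = np.hanning(image.shape[0])[:, None] * np.hanning(image.shape[1])[None, :]
    return np.abs(np.fft.fftshift(np.fft.fft2(image * win)))

def coarse_peak(spec, exclude=4):
    """Step 1: integer-pixel position of the strongest non-central spot."""
    s = spec.copy()
    cy, cx = s.shape[0] // 2, s.shape[1] // 2
    s[cy - exclude:cy + exclude + 1, cx - exclude:cx + exclude + 1] = 0.0
    return np.unravel_index(np.argmax(s), s.shape)

def refine_parabolic(spec, iy, ix):
    """Step 2 (stand-in): sub-pixel refinement by a 3-point parabolic fit along
    each axis, instead of the paper's window-dependent bilinear interpolation."""
    def offset(a, b, c):
        denom = a - 2.0 * b + c
        return 0.0 if denom == 0.0 else 0.5 * (a - c) / denom
    dy = offset(spec[iy - 1, ix], spec[iy, ix], spec[iy + 1, ix])
    dx = offset(spec[iy, ix - 1], spec[iy, ix], spec[iy, ix + 1])
    return iy + dy, ix + dx

def lattice_frequency(image):
    """Frequency vector (cycles/pixel) of the dominant lattice periodicity."""
    spec = windowed_spectrum(image)
    iy, ix = coarse_peak(spec)
    fy, fx = refine_parabolic(spec, iy, ix)
    ny, nx = image.shape
    return (fy - ny // 2) / ny, (fx - nx // 2) / nx
```

After magnification calibration, the returned frequency vector converts directly to a lattice vector; in the paper, the starting values from these two steps feed a Gauss-Newton minimization that reaches the Cramér-Rao bound.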


Author(s):  
Joseph Mazur

While all of us regularly use basic mathematical symbols such as those for plus, minus, and equals, few of us know that many of these symbols weren't available before the sixteenth century. What did mathematicians rely on for their work before then? And how did mathematical notations evolve into what we know today? This book explains the fascinating history behind the development of our mathematical notation system. It shows how symbols were used initially, how one symbol replaced another over time, and how written math was conveyed before and after symbols became widely adopted. Traversing mathematical history and the foundations of numerals in different cultures, the book looks at how historians have disagreed over the origins of the number system for the past two centuries. It follows the transfigurations of algebra from a rhetorical style to a symbolic one, demonstrating that most algebra before the sixteenth century was written in prose or in verse employing the written names of numerals. It also investigates the subconscious and psychological effects that mathematical symbols have had on mathematical thought, moods, meaning, communication, and comprehension. It considers how these symbols influence us (through similarity, association, identity, resemblance, and repeated imagery), how they lead to new ideas by subconscious associations, how they make connections between experience and the unknown, and how they contribute to the communication of basic mathematics. From words to abbreviations to symbols, this book shows how math evolved to the familiar forms we use today.


2019 ◽  
Author(s):  
Adib Rifqi Setiawan

Lisa Randall is a theoretical physicist working in particle physics and cosmology. She was born in Queens, New York City, on June 18, 1962. She is an alumna of the Hampshire College Summer Studies in Mathematics and graduated from Stuyvesant High School in 1980, winning first place in the 1980 Westinghouse Science Talent Search at the age of 18. At Harvard University she earned both a BA in physics (1983) and a PhD in theoretical particle physics (1987) under advisor Howard Mason Georgi III, a theoretical physicist. She is currently the Frank B. Baird, Jr. Professor of Science on the physics faculty of Harvard University, where she has been for the past decade. Her work concerns elementary particles and fundamental forces and has involved the study of a wide variety of models, the most recent involving extra dimensions. She has also worked on supersymmetry, Standard Model observables, cosmological inflation, baryogenesis, grand unified theories, and general relativity. Her studies have made her among the most cited and influential theoretical physicists, and she has received numerous awards and honors for her scientific endeavors. Since December 27, 2010 at 00:42 (GMT+7), Lisa Randall has been on Twitter as @lirarandall. Her first tweet read: “Thanks to new followers. Interesting how different it feels broadcasting on line vs. via book or article. Explanations? Pithiness? Rapidity?”


1954 ◽  
Vol 14 (3) ◽  
pp. 301-312
Author(s):  
George C. A. Boehrer

The leaders of the independence movements in the several Ibero-American areas frequently turned their attention to the problems of the aborigine. Usually the liberators' concern was restricted to the pious hope that the Indian would be incorporated into creole society; detailed programs to this end were not presented. The chief exception was José Bonifácio de Andrada e Silva's Apontamentos para a civilização dos índios bravos do Império do Brasil. Drawing mostly upon the experience of the past, blended with the new ideas of the Enlightenment, the Apontamentos present little that is new. They are important today because, along with José Bonifácio's more celebrated treatise on the Negro slave, they show an interest in social problems to which few of his contemporaries gave more than a passing glance. In the present century they have become the guidepost for Brazil's Indian program, and as such they have frequently been hailed by Brazilians when the Indian problem is discussed.


Author(s):  
Sivakumar Dhandapani ◽  
Madara M. Ogot

Abstract A key consideration in the design of walking machines is the minimization of energy consumption. Two main avenues of research have been pursued in the past: (a) finding an optimal gait that reduces energy consumption, or (b) employing energy storage devices to recover energy from one step to the next. This study follows the latter approach, which has hitherto concentrated on hopping machines. Several practical design considerations for energy recovery in multi-legged walking machines, where a stance phase prevents the immediate recovery of energy from one step to the next, are investigated. Two schemes, passive and active locking, are introduced. The simplified models presented serve to illustrate the feasibility of these schemes.
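As a toy illustration of the latched-spring idea, the sketch below does simple energy bookkeeping over successive steps: a spring captures part of the touchdown energy, a lock holds it through the stance phase (with some leakage), and it is released at the next lift-off. The parameter names and values are assumptions, not figures from the paper.

```python
import numpy as np

def actuator_energy_per_step(n_steps=20, e_step=10.0, eta_store=0.8,
                             eta_return=0.9, leak_per_stance=0.02):
    """Energy (J) the actuators must supply at each step with a latched spring.

    e_step          : energy otherwise dissipated at each touchdown (assumed)
    eta_store       : fraction of e_step captured by the spring at touchdown
    eta_return      : fraction of the stored energy returned at lift-off
    leak_per_stance : fractional loss while the lock holds through stance
    """
    supplied = []
    stored = 0.0
    for _ in range(n_steps):
        recovered = stored * eta_return           # released at lift-off
        supplied.append(max(e_step - recovered, 0.0))
        stored = e_step * eta_store               # captured at touchdown
        stored *= 1.0 - leak_per_stance           # held (locked) through stance
    return np.array(supplied)
```

Even this crude model shows the qualitative point: because the stance phase separates capture from release, the usefulness of the scheme hinges on how little energy the locking mechanism leaks while it holds.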


2003 ◽  
Vol 183 (1) ◽  
pp. 1-2 ◽  
Author(s):  
Peter Tyrer

The challenges for scientific journals at the beginning of the 21st century are exciting but formidable. In addition to reporting new knowledge and new ideas faithfully, each journal, or at least every journal aiming for a general readership, has to cater for a potentially huge lay readership waiting at the internet portals, a hungry press eager for juicy titbits, and core readers who, while impressed to some extent by weighty contributions to knowledge, are also looking for lighter material that is both informative and entertaining. In the past this type of content was frowned upon as mere journalism: fluff of short-term appeal but no real substance. The lighter approach was pioneered by Michael O'Donnell as editor of World Medicine in the 1970s, who introduced a brand of racy articles, debates and controversial issues in a tone of amusing and irreverent iconoclasm. At the time it was dismissed as a comic by some of the learned journals, but its popularity ensured that in subsequent years its critics quietly followed suit, as any current reader of the British Medical Journal and the Lancet will testify.


1983 ◽  
Vol 61 (2) ◽  
pp. 301-304 ◽  
Author(s):  
Jacques Bures ◽  
François Leonard ◽  
Jean-Pierre Monchalin

A self-scanned photodiode array has been used as a multiplex sensor for laboratory detection and measurement, by dispersive spectroscopy, of trace quantities of the atmospheric pollutant NO2. The on-line data acquisition and numerical analysis system makes it possible, in particular, to eliminate certain systematic errors and drifts (Taylor filtering) and the noise associated with high spatial frequencies (low-pass filtering). We have thus been able to show that an absorption spectrum corresponding to low absorber concentrations carries sufficient information for the characterization of the pollutant and the measurement of its concentration (ppm m), even when noise and drifts are present. The proposed system compares favorably with the commercially used systems based on a single photoelectric detector.
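A minimal sketch of the processing chain described above: slow drifts are removed by subtracting a low-order polynomial baseline (a stand-in for the Taylor filtering mentioned), high-spatial-frequency noise is suppressed with a simple low-pass filter, and the concentration follows from a least-squares scale factor against a reference absorption spectrum (Beer-Lambert law). All names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def remove_drift(spectrum, order=2):
    """Subtract a low-order polynomial baseline (stand-in for Taylor filtering)."""
    x = np.arange(len(spectrum))
    return spectrum - np.polyval(np.polyfit(x, spectrum, order), x)

def low_pass(spectrum, width=5):
    """Moving-average filter against high-spatial-frequency noise."""
    kernel = np.ones(width) / width
    return np.convolve(spectrum, kernel, mode="same")

def concentration_path_product(absorbance, reference):
    """Least-squares scale of the measured spectrum onto a reference spectrum
    recorded at a known concentration-path product; by the Beer-Lambert law,
    absorbance scales linearly with concentration x path length (ppm m)."""
    reference = np.asarray(reference)
    return float(reference @ np.asarray(absorbance)) / float(reference @ reference)
```

The returned scale factor, multiplied by the reference's known ppm·m value, gives the measured concentration-path product even in the presence of residual noise and drift.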

