Noise and Detection Limits in Signal-Integrating Analytical Methods

Author(s):  
H. C. Smit ◽  
H. Steigstra
1990 ◽  
Vol 36 (8) ◽  
pp. 1408-1427 ◽  
Author(s):  
James P Gosling

Abstract Immunoassays are now very widely used in the clinical laboratory, either because no other type of assay system is feasible or because they are often the most effective and suitable of the possible analytical methods. The last decade has seen the development and refinement of many new immunoassay reagents and systems. The major trend has been away from liquid-phase assays involving radioisotopic labels, towards fast homogeneous or solid-phase assays capable of operation anywhere; and towards precise and reliable nonisotopic, automated or semi-automated laboratory assays, often with detection limits measured in pico- or attomoles. The use of monoclonal antibodies is now widespread, and the methodologies of labels and of solid-phase components are much more sophisticated. New assay formulations, novel homogeneous systems, immunosensors, free-analyte assays, the importance of thorough validation and of interfering substances, and future trends are discussed.


1982 ◽  
Vol 65 (3) ◽  
pp. 531-534
Author(s):  
James T Tanner

Abstract With each passing decade new problems arise for the regulatory analytical chemist. The push for lower detection limits, from percent to parts per million to parts per billion, brought the need for new and improved analytical instrumentation, followed by questions of reliability at such low values. Each question has been met by new instruments or techniques and critical studies. The question for the 1980s is not how to achieve low detection limits but how to perform analyses at low values reliably and rapidly. During the 1960s the emphasis was on single component/element techniques. We seem now to be entering the computer-controlled era. In each analytical specialty, computer-controlled instruments are offered which greatly aid the analyst in producing an accurate, reliable analysis in a shorter time. The advantage of larger numbers of analyses per unit of time with, in some cases, reduced personnel is not to be overlooked in this age of economy. To the AOAC collaborative study this means a reduction in the number of laboratories that can participate. It also means greater standardization of methodology, and the chemist’s laboratory ability becomes less of a factor in producing reliable analyses. Specific analytical examples are discussed to illustrate the trend for the 1980s.


Author(s):  
Gianluigi Botton ◽  
Gilles L'espérance

As interest in parallel EELS spectrum imaging grows in laboratories equipped with commercial spectrometers, different approaches have been used in recent years by a few research groups to develop the technique of spectrum imaging, as reported in the literature. Spectrum images can now be obtained either by controlling both the microscope and the spectrometer with a personal computer, or by using more powerful workstations interfaced to conventional multichannel analysers, with commercially available programs to control the microscope and the spectrometer. Work on the limits of the technique in terms of quantitative performance was, however, reported by the present author, where a systematic study was carried out of artifacts, detection limits, statistical errors as a function of desired spatial resolution, and the range of chemical elements that can be studied in a map. The aim of the present paper is to show an application of quantitative parallel EELS spectrum imaging in which statistical analysis is performed at each pixel, interpretation is carried out using criteria established from the statistical analysis, and variations in composition are analyzed with the help of information retrieved from t/λ maps so that artifacts are avoided.
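The per-pixel statistical test described above can be sketched as follows. This is a minimal illustration, not the authors' actual procedure: it assumes that integrated edge counts and fitted background counts are available per pixel, that Poisson counting statistics apply, and that a pixel is flagged as a genuine detection only when the net signal exceeds k standard deviations of the combined noise.

```python
import numpy as np

# Hypothetical per-pixel detection test for an EELS spectrum image.
# Assumed inputs: integrated edge counts and fitted background counts
# per pixel; under Poisson statistics the variance of (signal - background)
# is approximately signal + background.

def detection_map(signal, background, k=3.0):
    """Return a boolean map of pixels whose net edge signal is significant."""
    net = signal - background
    sigma = np.sqrt(signal + background)
    return net > k * sigma

# Illustrative data: a synthetic 4x4 map with a real edge well above noise.
rng = np.random.default_rng(0)
bg = rng.poisson(1000.0, size=(4, 4)).astype(float)
sig = bg + 500.0

assert detection_map(sig, bg).all()       # real signal flagged everywhere
assert not detection_map(bg, bg).any()    # pure background flagged nowhere
```

Thresholding on a per-pixel significance criterion like this is one way to keep noise fluctuations in low-count regions from being misread as compositional variations in the final map.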


Author(s):  
J.R. McIntosh ◽  
D.L. Stemple ◽  
William Bishop ◽  
G.W. Hannaway

EM specimens often contain 3-dimensional information that is lost during micrography on a single photographic film. Two images of one specimen at appropriate orientations give a stereo view, but complex structures composed of multiple objects of graded density that superimpose in each projection are often difficult to decipher in stereo. Several analytical methods for 3-D reconstruction from multiple images of a serially tilted specimen are available, but they are all time-consuming and computationally intense.


Author(s):  
R. Packwood ◽  
M.W. Phaneuf ◽  
V. Weatherall ◽  
I. Bassignana

The development of specialized analytical instruments such as SIMS, XPS, ISS, etc., all with truly impressive capabilities in certain areas, has given rise to the notion that electron probe microanalysis (EPMA) is an old-fashioned and rather inadequate technique, one of little or no use in such high-technology fields as the semiconductor industry. Whilst it is true that the microprobe does not possess parts-per-billion (ppb) sensitivity or monolayer depth resolution, it is also true that these extremes of performance are often not essential, and that a few tens of parts-per-million (ppm) and a few tens of nanometers of depth resolution are all that is required. In fact, the microprobe may well be the second-choice method for a wide range of analytical problems and even the method of choice for a few.

The literature is replete with remarks suggesting that the writer is confusing an SEM-EDXS combination with an instrument such as the Cameca SX-50. Even where this confusion does not exist, the literature discusses microprobe detection limits that are seldom stated to be as low as 100 ppm, whereas there are numerous element combinations for which 10-20 ppm is routinely attainable.
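The tens-of-ppm detection limits mentioned above follow from counting statistics: the smallest peak distinguishable from background at roughly the 3-sigma level is about three times the square root of the background counts, scaled by the sensitivity measured on a standard of known concentration. The sketch below illustrates this classic estimate; the count values and function name are illustrative assumptions, not measured data.

```python
import math

# Hedged sketch of the classic 3*sqrt(background) detection-limit
# estimate for EPMA. All numbers below are illustrative, not measured.

def detection_limit_ppm(peak_counts, bkg_counts, std_concentration_ppm):
    """Estimate the minimum detectable concentration in ppm.

    Sensitivity (counts per ppm) is calibrated from a standard of known
    concentration; the detection threshold is 3*sqrt(background counts).
    """
    counts_per_ppm = (peak_counts - bkg_counts) / std_concentration_ppm
    return 3.0 * math.sqrt(bkg_counts) / counts_per_ppm

# Pure-element-style standard (1e6 ppm), strong peak, modest background:
dl = detection_limit_ppm(peak_counts=2_000_000, bkg_counts=2_000,
                         std_concentration_ppm=1_000_000)
# dl comes out in the tens of ppm, consistent with the range quoted above.
```

Under these assumed counts the estimate lands around 70 ppm; longer counting times or higher beam currents shrink it roughly as the square root of the accumulated counts, which is how favourable element combinations reach 10-20 ppm.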

