Open Tools for Storage and Management of Quantitative Image Data

Author(s):  
Joshua Moore ◽  
Chris Allan ◽  
Jean‐Marie Burel ◽  
Brian Loranger ◽  
Donald MacDonald ◽  
...  
2021 ◽  
pp. 1-12

Author(s):  
Julia Fuchs ◽  
Olivia Nonn ◽  
Christine Daxboeck ◽  
Silvia Groiss ◽  
Gerit Moser ◽  
...  

Immunostaining in clinical routine and research depends heavily on standardized staining methods and quantitative image analysis. We qualitatively and quantitatively compared antigen retrieval methods (no pretreatment, pretreatment with pepsin, and heat-induced pretreatment at pH 6 or pH 9) for 17 antibodies relevant to placenta and implantation diagnostics and research. Using our newly established, comprehensive automated quantitative image analysis approach, fluorescent signal intensities were evaluated. Automated quantitative image analysis found that 9 out of 17 antibodies needed antigen retrieval to show positive staining. Heat induction proved to be the most efficient form of antigen retrieval. Eight markers stained positive after pepsin digestion, with β-hCG and vWF showing enhanced staining intensities. To avoid misinterpretation of quantitative image data, the qualitative aspect should always be considered. Results from native placental tissue were compared with sections of a placental invasion model based on thermo-sensitive scaffolds. Because pregnant women are justifiably excluded from clinical studies, immunostaining of placental tissue in vitro offers new insights into fetal development and maternal pathophysiological pathways. Thus, there is a clear need for the assessment of reliable immunofluorescent staining and pretreatment methods. Our evaluation offers a powerful tool for antibody and pretreatment selection in placental research, providing objective and precise results.
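The automated readout described above boils down to measuring fluorescent signal intensity in stained regions of each image. A minimal sketch of that kind of measurement with scikit-image; the file name and the Otsu thresholding step are assumptions for illustration, not the authors' published pipeline:

```python
import numpy as np
from skimage import io, filters

# Hypothetical input: a single-channel immunofluorescence image.
img = io.imread("placenta_vWF_pH9.tif").astype(float)

# Separate stained foreground from background with Otsu's threshold
# (an assumption; the published pipeline may use a different criterion).
thresh = filters.threshold_otsu(img)
mask = img > thresh

# Simple per-image intensity statistics for the stained area.
print("positive area fraction:", mask.mean())
print("mean signal intensity :", img[mask].mean())
print("background intensity  :", img[~mask].mean())
```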


2014 ◽  
Vol 07 (06) ◽  
pp. 343-350 ◽  
Author(s):  
Nancy L. Ford ◽  
Angjelina Protik ◽  
Paul Babyn ◽  
Karen Thomas

Author(s):  
Bruce D. Newell

Advances in computers and related digital hardware, coupled with sophisticated software techniques, have resulted in microscopy migrating from its historical roots as a subjective, qualitative science towards a more robust position as a truly quantitative technique. Granted, we will probably never totally remove the microscopist from the process of image interpretation (at least those at this conference hope not), but we will certainly continue to progress from describing our image data in qualitative terms (e.g., many/few, large/small, equiaxed/elongated, ordered/random, rough/smooth) to quantitative measurements of number, size, shape, location, texture, and so on.

Moving along the path toward quantitative image interpretation requires an understanding of image processing and analysis (IP/A) fundamentals to ensure that the data obtained are of the required accuracy and precision. A generalized model of the critical steps in the image processing and analysis chain is given in Figure 1. This tutorial will examine the fundamental issues in each step that impact the quality of the final result and provide a broad overview of techniques that may be applicable.
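The quantitative end of that chain (segment the image, then measure each feature) can be illustrated in a few lines of scikit-image; the filter and threshold choices below are assumptions, not steps prescribed by the tutorial:

```python
import numpy as np
from skimage import filters, measure

def measure_image(img: np.ndarray) -> dict:
    """Minimal IP/A chain: preprocess -> segment -> identify features -> measure."""
    smoothed = filters.gaussian(img, sigma=1)            # suppress noise before segmentation
    mask = smoothed > filters.threshold_otsu(smoothed)   # separate features from background
    labels = measure.label(mask)                         # assign an ID to each feature
    regions = measure.regionprops(labels)
    sizes = [r.area for r in regions]
    ratios = [r.major_axis_length / max(r.minor_axis_length, 1e-9) for r in regions]
    return {
        "count": len(regions),                                          # "many/few" becomes a number
        "mean_size_px": float(np.mean(sizes)) if sizes else 0.0,        # "large/small" becomes a size
        "mean_aspect_ratio": float(np.mean(ratios)) if ratios else 0.0, # "equiaxed/elongated" becomes a ratio
    }
```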


2009 ◽  
Vol 14 (8) ◽  
pp. 944-955 ◽  
Author(s):  
Daniel F. Gilbert ◽  
Till Meinhof ◽  
Rainer Pepperkok ◽  
Heiko Runz

In this article, the authors describe the image analysis software DetecTiff©, which allows fully automated object recognition and quantification from digital images. The core module of the LabView©-based routine is an algorithm for structure recognition that employs intensity thresholding and size-dependent particle filtering of microscopic images in an iterative manner. Detected structures are converted into templates, which are used for quantitative image analysis. DetecTiff© enables processing of multiple detection channels and provides functions for template organization and fast interpretation of acquired data. The authors demonstrate the applicability of DetecTiff© for automated analysis of cellular uptake of fluorescence-labeled low-density lipoproteins as well as diverse other image data sets from a variety of biomedical applications. Moreover, the performance of DetecTiff© is compared with preexisting image analysis tools. The results show that DetecTiff© can be applied with high consistency for automated quantitative analysis of image data (e.g., from large-scale functional RNAi screening projects). (Journal of Biomolecular Screening 2009:944-955)
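The core idea named in the abstract, iterative intensity thresholding combined with size-dependent particle filtering, can be sketched as follows. This is a Python approximation of that general scheme, not the published LabView© routine; the thresholds and size limits are placeholder parameters:

```python
import numpy as np
from skimage import measure

def iterative_detect(img, thresholds, min_size, max_size):
    """Sketch of iterative intensity thresholding with size-dependent
    particle filtering: objects accepted at a higher threshold are
    removed before the next, lower threshold is applied."""
    remaining = img.astype(float).copy()
    template = np.zeros(img.shape, dtype=int)          # label image used as a template
    next_label = 1
    for t in sorted(thresholds, reverse=True):
        labels = measure.label(remaining > t)
        for region in measure.regionprops(labels):
            if min_size <= region.area <= max_size:    # size-dependent particle filter
                rr, cc = region.coords[:, 0], region.coords[:, 1]
                template[rr, cc] = next_label
                next_label += 1
                remaining[rr, cc] = 0                  # exclude from later iterations
    return template
```

The resulting label image plays the role of a template that could then be applied to other detection channels for quantification, in the spirit of the workflow the abstract describes.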


2011 ◽  
Vol 5 (2A) ◽  
pp. 894-923 ◽  
Author(s):  
Jeffrey S. Morris ◽  
Veerabhadran Baladandayuthapani ◽  
Richard C. Herrick ◽  
Pietro Sanna ◽  
Howard Gutstein

Biometrics ◽  
2012 ◽  
Vol 68 (4) ◽  
pp. 1260-1268 ◽  
Author(s):  
Hongxiao Zhu ◽  
Philip J. Brown ◽  
Jeffrey S. Morris

Author(s):  
J.P. Fallon ◽  
P.J. Gregory ◽  
C.J. Taylor

Quantitative image analysis systems have been used for several years in research and quality control applications in various fields, including metallurgy and medicine. The technique has been applied as an extension of subjective microscopy to problems that require quantitative results and are amenable to automatic methods of interpretation.

Feature extraction. In the most general sense, a feature can be defined as a portion of the image which differs in some consistent way from the background. A feature may be characterized by the density difference between itself and the background, by an edge gradient, or by the spatial frequency content (texture) within its boundaries. The task of feature extraction includes recognition of features and encoding of the associated information for quantitative analysis.

Quantitative analysis. Quantitative analysis is the determination of one or more physical measurements of each feature. These measurements may be straightforward ones such as area, length, or perimeter, or more complex stereological measurements such as convex perimeter or Feret's diameter.
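The measurements listed above are standard morphometric quantities. A minimal sketch of computing them for already-extracted features, using scikit-image as a stand-in for the dedicated analyzers of the period (the binary test image is invented for illustration):

```python
import numpy as np
from skimage import measure

# Hypothetical binary image in which features have already been extracted
# (True = feature pixels, False = background).
features = np.zeros((64, 64), dtype=bool)
features[20:40, 10:45] = True

for region in measure.regionprops(measure.label(features)):
    convex_perimeter = measure.perimeter(region.convex_image)
    print(f"area: {region.area} px,",
          f"perimeter: {region.perimeter:.1f} px,",
          f"convex perimeter: {convex_perimeter:.1f} px,",
          f"Feret diameter (max): {region.feret_diameter_max:.1f} px")
```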


Author(s):  
H.P. Rohr

Today, in image analysis the broadest possible rationalization and economization have become desirable. Basically, there are two approaches to image analysis: scanning methods, which are usually performed without the human eye, and optical semiautomatic systems that rely entirely on the human eye.

The new MOP AM 01 opto-manual system (fig.) represents one of the very promising approaches in this field. The instrument consists of an electronic counting and storing unit, which incorporates a microprocessor and a keyboard for the choice of measuring parameters, well designed for ease of use.

Using the MOP AM 01, there are three possibilities for image analysis: manual point counting, opto-manual point counting, and the measurement of absolute areas and/or lengths (size distribution analysis included). To determine a point density for the calculation of the corresponding volume density, the intercepts lying within the structure are scanned with the light pen.
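The point-counting readout mentioned above is the standard stereological estimator: the volume density of a structure equals the fraction of test points that land on it (V_V = P_P). A worked example with made-up counts:

```python
# Stereological point counting: the volume density V_V of a structure is
# estimated by the fraction of test points falling on it (V_V = P_P).
# The counts below are illustrative, not taken from the article.
points_on_structure = 137   # test points hitting the structure of interest
points_total = 500          # total test points overlaid on the section

volume_density = points_on_structure / points_total
print(f"estimated volume density V_V = {volume_density:.3f}")  # 0.274
```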


Author(s):  
Robert M. Glaeser ◽  
Bing K. Jap

The dynamical scattering effect, which can be described as the failure of the first Born approximation, is perhaps the most important factor that has prevented the widespread use of electron diffraction intensities for crystallographic structure determination. It would seem to be quite certain that dynamical effects will also interfere with structure analysis based upon electron microscope image data, whenever the dynamical effect seriously perturbs the diffracted wave. While it is normally taken for granted that the dynamical effect must be taken into consideration in materials science applications of electron microscopy, very little attention has been given to this problem in the biological sciences.


Author(s):  
Richard S. Chemock

One of the most common tasks in a typical analysis lab is the recording of images. Many analytical techniques (TEM, SEM, and metallography, for example) produce images as their primary output. Until recently, the most common method of recording images was by using film. Current PS/2® systems offer very large capacity data storage devices and high-resolution displays, making it practical to work with analytical images on PS/2s, thereby sidestepping the traditional film and darkroom steps. This change in operational mode offers many benefits: cost savings, throughput, archiving and searching capabilities, as well as direct incorporation of the image data into reports.

The conventional way to record images involves film, either sheet film (with its associated wet chemistry) for TEM or Polaroid® film for SEM and light microscopy. Although film is inconvenient, it does have the highest quality of all available image recording techniques. The fine-grained film used for TEM has a resolution that would exceed a 4096x4096x16-bit digital image.
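For scale, the uncompressed storage implied by the digital image size quoted above works out as follows (a back-of-the-envelope calculation, not a figure from the article):

```python
# Uncompressed size of one 4096x4096 image at 16 bits per pixel
# (file-format overhead ignored).
width, height, bits_per_pixel = 4096, 4096, 16
size_bytes = width * height * bits_per_pixel // 8
print(f"{size_bytes / 2**20:.0f} MiB per image")  # 32 MiB
```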

