Research on Optimal Modeling of Food System Based on Data Potential Distribution Mining and Quantitative Analysis

2021, Vol 1952 (4), pp. 042084
Author(s):  
Yanqian Chen ◽  
Hongtu Qu ◽  
Chunming Kong
Author(s):  
Maya Fitri Faoziah ◽  
Untung Yuwono

Since food is a necessity for human life, there have been many innovations aimed at speeding up food production. However, these innovations can have negative effects on the environment and, thus, the overall food system. Greenpeace, a non-governmental organization, creates food campaigns that include online materials touting a better food system, naming bad corporations, and asking readers or supporters to join the campaigns. This study analyzes Greenpeace’s attitude in evaluating the environment using Halliday and Matthiessen’s transitivity system and Martin and White’s appraisal framework. The research was conducted using UAM CorpusTool software to perform a quantitative analysis of the data in terms of transitivity and appraisal. The results show that Greenpeace’s food campaigns contain judgments as the most frequent appraisal in material clauses and relational clauses. These judgments concern how entities, processes, and innovations affect the environment.
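The frequency analysis described above can be illustrated with a short sketch. This is not the UAM CorpusTool output; the clause annotations below are invented solely to show how appraisal categories can be cross-tabulated against transitivity process types.

```python
from collections import Counter

# Hypothetical annotated clauses: (process type, appraisal category).
# The labels mirror the transitivity/appraisal scheme in the abstract;
# the data themselves are invented for illustration.
clauses = [
    ("material", "judgement"),
    ("material", "judgement"),
    ("relational", "judgement"),
    ("relational", "appreciation"),
    ("mental", "affect"),
]

# Cross-tabulate appraisal frequency per clause type.
counts = Counter(clauses)
for (process, appraisal), n in sorted(counts.items()):
    print(f"{process:10s} {appraisal:12s} {n}")
```

With real corpus annotations in place of the toy list, the same tabulation would surface the pattern reported in the study: judgement as the most frequent appraisal in material and relational clauses.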


Author(s):  
J.P. Fallon ◽  
P.J. Gregory ◽  
C.J. Taylor

Quantitative image analysis systems have been used for several years in research and quality control applications in various fields, including metallurgy and medicine. The technique has been applied as an extension of subjective microscopy to problems requiring quantitative results and which are amenable to automatic methods of interpretation.

Feature extraction. In the most general sense, a feature can be defined as a portion of the image which differs in some consistent way from the background. A feature may be characterized by the density difference between itself and the background, by an edge gradient, or by the spatial frequency content (texture) within its boundaries. The task of feature extraction includes recognition of features and encoding of the associated information for quantitative analysis.

Quantitative analysis. Quantitative analysis is the determination of one or more physical measurements of each feature. These measurements may be straightforward ones such as area, length, or perimeter, or more complex stereological measurements such as convex perimeter or Feret's diameter.
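The extract-then-measure pipeline can be sketched minimally: threshold an image, group feature pixels by connectivity, then take simple measurements of each feature. The image, threshold, and measurements below are invented for illustration and are not the authors' system.

```python
from collections import deque

# Toy 5x8 "image"; pixels above THRESH belong to features. Both the image
# and the threshold are invented to illustrate extraction plus measurement.
THRESH = 100
image = [
    [0,   0,   0, 0, 0,   0, 0, 0],
    [0, 200, 210, 0, 0,   0, 0, 0],
    [0, 205, 220, 0, 0, 180, 0, 0],
    [0,   0,   0, 0, 0, 190, 0, 0],
    [0,   0,   0, 0, 0,   0, 0, 0],
]

def extract_features(img, thresh):
    """Label 4-connected regions above thresh; return per-feature measurements."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    feats = []
    for y in range(h):
        for x in range(w):
            if img[y][x] > thresh and not seen[y][x]:
                # Breadth-first search collects one connected feature.
                pixels, q = [], deque([(y, x)])
                seen[y][x] = True
                while q:
                    cy, cx = q.popleft()
                    pixels.append((cy, cx))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = cy + dy, cx + dx
                        if 0 <= ny < h and 0 <= nx < w \
                                and img[ny][nx] > thresh and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((ny, nx))
                # Perimeter: count pixel edges that border the background.
                perim = sum(
                    1
                    for (cy, cx) in pixels
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1))
                    if not (0 <= cy + dy < h and 0 <= cx + dx < w
                            and img[cy + dy][cx + dx] > thresh)
                )
                feats.append({"area": len(pixels), "perimeter": perim})
    return feats

features = extract_features(image, THRESH)
```

More complex stereological measurements such as convex perimeter or Feret's diameter would be further functions of each feature's pixel list.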


Author(s):  
V. V. Damiano ◽  
R. P. Daniele ◽  
H. T. Tucker ◽  
J. H. Dauber

An important example of intracellular particles is encountered in silicosis where alveolar macrophages ingest inspired silica particles. The quantitation of the silica uptake by these cells may be a potentially useful method for monitoring silica exposure. Accurate quantitative analysis of ingested silica by phagocytic cells is difficult because the particles are frequently small, irregularly shaped and cannot be visualized within the cells. Semiquantitative methods which make use of particles of known size, shape and composition as calibration standards may be the most direct and simplest approach to undertake. The present paper describes an empirical method in which glass microspheres were used as a model to show how the ratio of the silicon Kα peak X-ray intensity from the microspheres to that of a bulk sample of the same composition correlated to the mass of the microsphere contained within the cell. Irregular shaped silica particles were also analyzed and a calibration curve was generated from these data.
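The calibration idea described above, correlating an intensity ratio to particle mass, can be sketched as a least-squares line fit. The ratio and mass values below are invented, not the paper's data.

```python
# Hedged sketch: fit a calibration line relating the Si Kα intensity ratio
# (microsphere / bulk standard of the same composition) to microsphere mass.
# All numbers are hypothetical; real data come from measured spectra.
ratios = [0.05, 0.10, 0.20, 0.40]   # I_sphere / I_bulk (dimensionless)
masses = [1.0, 2.1, 3.9, 8.2]       # picograms, invented

# Ordinary least-squares fit of mass against intensity ratio.
n = len(ratios)
mean_x = sum(ratios) / n
mean_y = sum(masses) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(ratios, masses)) / \
        sum((x - mean_x) ** 2 for x in ratios)
intercept = mean_y - slope * mean_x

def mass_from_ratio(r):
    """Estimate ingested silica mass from a measured intensity ratio."""
    return slope * r + intercept
```

In practice the calibration curve generated from irregular silica particles need not be linear; the linear fit here only illustrates the ratio-to-mass mapping.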


Author(s):  
H.J. Dudek

The chemical inhomogeneities in modern materials, such as fibers, phases, and inclusions, often have diameters in the region of one micrometer. When using electron microbeam analysis to determine element concentrations, one has to know the smallest possible diameter of such regions for a given accuracy of the quantitative analysis.

In this paper the correction procedure for quantitative electron microbeam analysis is extended to a spatial problem: determining the smallest possible dimensions of a cylindrical particle P, of height D (depth resolution) and diameter L (lateral resolution), embedded in a matrix M, that can be analyzed quantitatively with accuracy q. The mathematical treatment leads to the following form of the characteristic x-ray intensity of the element i of a particle P embedded in the matrix M, in relation to the intensity of a standard S.
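The resolution question can be illustrated with a deliberately crude model. This is not the paper's correction procedure: assume only that the signal fraction from a particle of diameter L scales with the cross-section it covers of an interaction volume of diameter d_int, and require the matrix contribution to stay below the accuracy q.

```python
import math

# Toy volume-fraction model (NOT the paper's correction procedure): the
# particle's share of the x-ray signal is taken as (L / d_int)**2, so the
# matrix contributes 1 - (L / d_int)**2. Demanding that this stay below q
# gives a lower bound on the particle diameter L.
def min_particle_diameter(d_int_um, q):
    """Smallest L (µm) so that the matrix contributes at most a fraction q."""
    return d_int_um * math.sqrt(1.0 - q)

# Example: 1 µm interaction volume, 5 % accuracy target.
L_min = min_particle_diameter(1.0, 0.05)
```

Even this toy model shows the qualitative point: for tight accuracy targets, the smallest analyzable particle approaches the full interaction-volume diameter.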


Author(s):  
John A. Hunt

Spectrum-imaging is a useful technique for comparing different processing methods on very large data sets which are identical for each method. This paper is concerned with comparing methods of electron energy-loss spectroscopy (EELS) quantitative analysis on the Al-Li system. The spectrum-image analyzed here was obtained from an Al-10 at% Li foil aged to produce δ' precipitates that can span the foil thickness. Two 1024-channel EELS spectra offset in energy by 1 eV were recorded and stored at each pixel in the 80×80 spectrum-image (25 Mbytes). An energy range of 39-89 eV (20 channels/eV) is represented. During processing the spectra are either subtracted to create an artifact-corrected difference spectrum, or the energy offset is numerically removed and the spectra are added to create a normal spectrum. The spectrum-images are processed into 2D floating-point images using methods and software described in [1].
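The two per-pixel combinations just described can be sketched on toy 1-D spectra. With 20 channels/eV, a 1 eV energy offset is a 20-channel shift; the spectrum values below are invented and stand in for one pixel's pair of recorded spectra.

```python
# Sketch of the two processing modes: difference spectrum vs. normal spectrum.
CH_PER_EV = 20
shift = 1 * CH_PER_EV  # 1 eV offset expressed in channels

spec_a = [float(i % 50) for i in range(1024)]            # spectrum at offset 0
spec_b = [float((i + shift) % 50) for i in range(1024)]  # same spectrum, +1 eV

# Difference spectrum: subtract channel-by-channel WITHOUT realigning, so
# fixed-pattern artifacts (identical in both recordings) cancel out.
difference = [a - b for a, b in zip(spec_a, spec_b)]

# Normal spectrum: numerically remove the energy offset, then add the
# overlapping channels of the two aligned spectra.
aligned_b = spec_b[:len(spec_b) - shift]
normal = [a + b for a, b in zip(spec_a[shift:], aligned_b)]
```

In the real data set this pair of operations is repeated at each of the 80×80 pixels to build the two processed spectrum-images.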


Author(s):  
M. Pan ◽  
J.M. Cowley

Electron microdiffraction patterns, obtained when a small electron probe 10-15 Å in diameter is directed to run parallel to and just outside a flat crystal surface, are sensitive to the surface nature of the crystals. Dynamical diffraction calculations have shown that most of the experimental observations for a flat (100) face of a MgO crystal, such as the streaking of the central spot in the surface-normal direction and (100)-type forbidden reflections, could be explained satisfactorily by assuming a modified image potential field outside the crystal surface. However, the origin of this extended surface potential remains uncertain. A theoretical analysis by Howie et al. suggests that the surface image potential should differ in form from the above-mentioned image potential and also be smaller by several orders of magnitude. Nevertheless, the surface potential distribution may in practice be modified in various ways, such as by the adsorption of a monolayer of gas molecules.


Author(s):  
Delbert E. Philpott ◽  
David Leaffer

There are certain advantages for electron probe analysis if the sample can be tilted directly towards the detector: the count rate is higher, the geometry is optimized since only one angle need be taken into account for quantitative analysis, and the signal-to-background ratio is improved. A smaller required tilt angle may also be an advantage, because the grid bars are not moved as close to each other, leaving a little more open area for observation. Our present detector (EDAX) and microscope (Philips 300) combination precludes moving the detector behind the microscope, where it would point directly at the grid. Therefore, the angle of the specimen was changed in order to optimize the geometry between the specimen and the detector.
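Why the tilt geometry matters can be illustrated with a standard absorption-path argument. This is a generic textbook-style sketch, not the authors' calculation, and the depths and angles below are invented.

```python
import math

# X rays generated at depth d inside the specimen escape toward the detector
# along a path of length d / sin(takeoff_angle). Tilting the specimen toward
# the detector raises the effective take-off angle, shortening the absorption
# path and so improving the detected signal. Numbers are illustrative only.
def path_length(depth_um, takeoff_deg):
    """Absorption path (µm) for x rays escaping at the given take-off angle."""
    return depth_um / math.sin(math.radians(takeoff_deg))

shallow = path_length(1.0, 20.0)  # poor geometry: long path through specimen
steep = path_length(1.0, 50.0)    # specimen tilted toward the detector
```

The shorter path at the steeper angle is one contribution to the higher count rate and better signal-to-background ratio noted above.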


Author(s):  
Conly L. Rieder

The behavior of many cellular components, and their dynamic interactions, can be characterized in the living cell with considerable spatial and temporal resolution by video-enhanced light microscopy (video-LM). Indeed, under the appropriate conditions video-LM can be used to determine the real-time behavior of organelles ≤ 25 nm in diameter (e.g., individual microtubules). However, when pushed to its limit, the structures and components observed within the cell by video-LM cannot be resolved, nor necessarily even identified, only detected. Positive identification and quantitative analysis often require the corresponding electron microscopy (EM).


Author(s):  
John T. Armstrong

One of the most cited papers in the geological sciences has been that of Albee and Bence on the use of empirical "α-factors" to correct quantitative electron microprobe data. During the past 25 years this method has remained the most commonly used correction for geological samples, despite the facts that few investigators have actually determined empirical α-factors, but instead employ tables of α-factors calculated using one of the conventional "ZAF" correction programs; that a number of investigators have shown the assumption of a constant α-factor to be incorrect in binary systems where there are large matrix corrections (e.g., [2-3]); and that the procedure's desirability in terms of program size and computational speed is much less important today because of developments in computing capabilities. The question thus exists whether it is time to honorably retire the Bence-Albee procedure and turn to more modern, robust correction methods. This paper proposes that, although it is perhaps time to retire the original Bence-Albee procedure, it should be replaced by a similar method based on composition-dependent polynomial α-factor expressions.
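The shape of such a correction can be sketched for a binary system. In the Bence-Albee form for a binary A-B, C/k = C + α(1 - C), where k is the measured intensity ratio and C the concentration; the proposal above replaces the constant α with a polynomial in C. The polynomial coefficients below are invented for illustration, not values from the paper.

```python
# Hedged sketch: composition-dependent polynomial α-factor for a binary.
# Coefficients are hypothetical; a real correction would use fitted values.
def alpha(c, coeffs=(1.20, -0.15, 0.05)):
    """Polynomial α-factor: α(C) = a0 + a1*C + a2*C²."""
    a0, a1, a2 = coeffs
    return a0 + a1 * c + a2 * c * c

def correct(k, iterations=20):
    """Iteratively solve C = k * (C + α(C) * (1 - C)) for the concentration C."""
    c = k  # first guess: the raw k-ratio
    for _ in range(iterations):
        c = k * (c + alpha(c) * (1.0 - c))
    return c

c_est = correct(0.40)
```

Because α depends on C, the fixed-point iteration replaces the single multiplication of the constant-α scheme; convergence is fast when α varies smoothly with composition.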

