Formation and quantitative analysis of internal structure of Si nanoparticles developed via bead-milling

AIP Advances ◽  
2021 ◽  
Vol 11 (7) ◽  
pp. 075101
Author(s):  
Mingcai Zhao ◽  
Juan Zhang ◽  
Wei Wang ◽  
Qi Zhang

2020 ◽ 
Vol 27 (2) ◽  
pp. 24-35
Author(s):  
Kevan Lamm ◽  
Alexa Lamm ◽  
Don Edgar

The importance of valid and reliable data, and of its collection, is fundamental to empirical research; however, approaches to creating robust scales capable of capturing valid and reliable data remain inconsistent, particularly within international agricultural and extension education contexts. Robust scale development consists of five areas of validation: content, response process, internal structure, external structure, and consequential. The purpose of this guide was to provide methodological recommendations to improve the rigor and adoption of scale development and to offer a set of functional principles to aid researchers and practitioners interested in capturing data through developed, or adapted, scales. Additionally, the information summarized provides a benchmark against which to evaluate the rigor and validity of reported scale results. A consistent framework should provide a common lexicon with which to examine scales and associated results. Proper scale development and validation will help ensure research findings accurately describe the intended underlying concepts, particularly within an international agricultural and extension education context.

Keywords: scale development, validity, quantitative analysis
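As a minimal illustration of the internal-structure (reliability) check mentioned above, the sketch below computes Cronbach's alpha for a small Likert-type item matrix. The function name and the response data are invented for the example and are not taken from the guide; this is a generic sketch, not the authors' procedure.

import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses (rows = respondents, columns = items).
responses = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 5],
])
print(f"Cronbach's alpha: {cronbach_alpha(responses):.2f}")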


Diachronica ◽  
2016 ◽  
Vol 33 (3) ◽  
pp. 297-329 ◽  
Author(s):  
Joseph Bauman

Studies of grammaticalization have identified a tendency for verbs of possession to develop modal meanings (Bybee et al. 1994, Heine & Kuteva 2002). I present evidence of the mechanisms contributing to both semantic and structural change in one such instance, the Modern Spanish deontic modal construction [tener que + Inf] “to have to”. Quantitative analysis of a corpus of written texts confirms that this process is gradual and layered, exhibiting semantic changes measurable in the ratio of lexical infinitive types to total tokens of the construction, changing tendencies in the construction’s internal structure, and the presence of highly frequent, lexically particular instances of tener que. This study presents quantifiable manifestations of grammaticalization processes that do not adhere to a linear, uniform cline and are consistently variable, even on a small scale.
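The type-to-token ratio mentioned above can be illustrated with a short sketch. The list of infinitives below is invented and the counting is a simplification of the corpus methodology, offered only to show the measure itself.

from collections import Counter

# Hypothetical infinitives extracted from [tener que + Inf] tokens in a corpus slice.
infinitives = ["hacer", "decir", "ir", "hacer", "pagar", "decir", "hacer", "ver"]

types = Counter(infinitives)
type_token_ratio = len(types) / len(infinitives)  # lower values suggest routinized, lexically particular use

print(f"{len(types)} infinitive types over {len(infinitives)} tokens "
      f"(type/token ratio = {type_token_ratio:.2f})")
print("Most frequent infinitives:", types.most_common(3))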


Nanomaterials ◽  
2021 ◽  
Vol 11 (3) ◽  
pp. 594
Author(s):  
Mingcai Zhao ◽  
Juan Zhang ◽  
Wei Wang ◽  
Qi Zhang

This work aims to prepare silicon nanoparticles with a nanocrystal-embedded amorphous structure through spark erosion followed by bead milling. Spark erosion breaks up monocrystalline silicon ingots into micro/nanoparticles, refines the crystal grains, randomly disorders the crystal orientation, and increases the isotropic character. Bead milling further refines the crystal grains to a few nanometers and increases the amorphous fraction of the structure, eventually forming an amorphous structure with nanocrystals embedded in it. Spark erosion substantially reduces the time and energy required for the subsequent bead milling. The crystallite size and the amount of amorphous phase can be controlled by varying the pulse duration of the spark discharge and the bead-milling time. The final particles contain nanocrystals as small as 4 nm and an amorphous-phase content as high as 84%, and can be regarded as amorphous-like Si nanoparticles. This processing route greatly reduces production time and energy consumption and, more importantly, offers structural control and is scalable for the mass production of higher-purity products.
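The abstract does not state how the crystallite size was measured; a common approach for nanocrystalline Si is Scherrer analysis of X-ray diffraction peak broadening. The sketch below is a generic illustration of that approach with invented peak parameters, not the authors' procedure.

import numpy as np

def scherrer_size(fwhm_deg: float, two_theta_deg: float,
                  wavelength_nm: float = 0.15406, k: float = 0.9) -> float:
    """Estimate crystallite size (nm) from XRD peak broadening via the Scherrer equation.

    fwhm_deg      : full width at half maximum of the peak, in degrees 2-theta
    two_theta_deg : peak position, in degrees 2-theta
    wavelength_nm : X-ray wavelength (Cu K-alpha by default)
    k             : shape factor (~0.9 for roughly equiaxed crystallites)
    """
    beta = np.radians(fwhm_deg)            # FWHM in radians
    theta = np.radians(two_theta_deg / 2)  # Bragg angle
    return k * wavelength_nm / (beta * np.cos(theta))

# Invented example: a strongly broadened Si (111) reflection near 28.4 deg 2-theta.
print(f"Crystallite size ≈ {scherrer_size(fwhm_deg=2.2, two_theta_deg=28.4):.1f} nm")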


Author(s):  
J.P. Fallon ◽  
P.J. Gregory ◽  
C.J. Taylor

Quantitative image analysis systems have been used for several years in research and quality control applications in various fields, including metallurgy and medicine. The technique has been applied as an extension of subjective microscopy to problems that require quantitative results and are amenable to automatic methods of interpretation.

Feature extraction. In the most general sense, a feature can be defined as a portion of the image which differs in some consistent way from the background. A feature may be characterized by the density difference between itself and the background, by an edge gradient, or by the spatial frequency content (texture) within its boundaries. The task of feature extraction includes recognition of features and encoding of the associated information for quantitative analysis.

Quantitative analysis. Quantitative analysis is the determination of one or more physical measurements of each feature. These measurements may be straightforward ones such as area, length, or perimeter, or more complex stereological measurements such as convex perimeter or Feret's diameter.
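As a minimal sketch of the feature-extraction and measurement steps described above, the example below labels bright features in a synthetic image and reports per-feature measurements with scikit-image. The image and thresholds are invented; this is a generic illustration, not the authors' system.

import numpy as np
from skimage import measure

# Synthetic image: two bright rectangular "features" on a dark background.
image = np.zeros((100, 100), dtype=float)
image[10:30, 10:30] = 1.0
image[50:90, 40:70] = 1.0

# Feature extraction: label connected regions that differ from the background.
labels = measure.label(image > 0.5)

# Quantitative analysis: physical measurements of each feature.
for region in measure.regionprops(labels):
    print(f"feature {region.label}: area={region.area}, "
          f"perimeter={region.perimeter:.1f}, "
          f"max Feret diameter={region.feret_diameter_max:.1f}")  # requires scikit-image >= 0.18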


Author(s):  
V. V. Damiano ◽  
R. P. Daniele ◽  
H. T. Tucker ◽  
J. H. Dauber

An important example of intracellular particles is encountered in silicosis, where alveolar macrophages ingest inspired silica particles. Quantitation of the silica uptake by these cells is a potentially useful method for monitoring silica exposure. Accurate quantitative analysis of ingested silica by phagocytic cells is difficult because the particles are frequently small, irregularly shaped, and cannot be visualized within the cells. Semiquantitative methods that make use of particles of known size, shape, and composition as calibration standards may be the most direct and simplest approach. The present paper describes an empirical method in which glass microspheres were used as a model to show how the ratio of the silicon Kα peak X-ray intensity from the microspheres to that of a bulk sample of the same composition correlated with the mass of the microsphere contained within the cell. Irregularly shaped silica particles were also analyzed, and a calibration curve was generated from these data.
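The calibration idea described above (intensity ratio versus particle mass) can be sketched as a simple least-squares fit. The ratio and mass values below are invented placeholders, not the paper's data, and the linear form is an assumption for illustration only.

import numpy as np

# Hypothetical calibration data: Si K-alpha intensity ratio (microsphere / bulk standard)
# versus known microsphere mass in picograms.  Values are invented for illustration.
ratio = np.array([0.05, 0.11, 0.22, 0.41, 0.78])
mass_pg = np.array([0.9, 2.1, 4.0, 7.9, 15.2])

# Fit a straight-line calibration curve, mass = a * ratio + b.
a, b = np.polyfit(ratio, mass_pg, deg=1)

def mass_from_ratio(r: float) -> float:
    """Estimate particle mass (pg) inside a cell from a measured intensity ratio."""
    return a * r + b

print(f"calibration: mass ≈ {a:.1f} * ratio + {b:.2f}")
print(f"ratio 0.30 -> estimated mass {mass_from_ratio(0.30):.1f} pg")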


Author(s):  
H.J. Dudek

The chemical inhomogeneities in modern materials, such as fibers, phases, and inclusions, often have diameters on the order of one micrometer. When electron microbeam analysis is used to determine element concentrations, one has to know the smallest possible diameter of such regions for a given accuracy of the quantitative analysis.

In this paper the correction procedure for quantitative electron microbeam analysis is extended to a spatial problem: determining the smallest possible dimensions of a cylindrical particle P, of height D (depth resolution) and diameter L (lateral resolution), embedded in a matrix M, that can be analyzed quantitatively with accuracy q. The mathematical treatment leads to the following form for the characteristic X-ray intensity of element i of a particle P embedded in the matrix M, relative to the intensity of a standard S.
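The announced expression does not appear in the excerpt above and is not reconstructed here. As a point of reference only, the textbook first-approximation relation of quantitative electron microprobe analysis, which relates the measured intensity ratio to a concentration ratio with matrix corrections, can be written (in LaTeX notation) as

    k_i = \frac{I_i^{P,M}}{I_i^{S}} \approx \frac{c_i^{P}}{c_i^{S}} \, [\mathrm{ZAF}]_i

where c_i denotes the mass fraction of element i and [ZAF]_i collects the atomic-number, absorption, and fluorescence corrections. This is a standard form given here as an assumption, not necessarily the specific expression derived by Dudek.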


Author(s):  
H.W. Deckman ◽  
B.F. Flannery ◽  
J.H. Dunsmuir ◽  
K. D'Amico

We have developed a new X-ray microscope which produces complete three-dimensional images of samples. The microscope operates by performing X-ray tomography with unprecedented resolution. Tomography is a non-invasive imaging technique that creates maps of the internal structure of samples from measurements of the attenuation of penetrating radiation. As conventionally practiced in medical Computed Tomography (CT), radiologists produce maps of bone and tissue structure in several planar sections that reveal features with 1 mm resolution and 1% contrast. Microtomography extends the capability of CT in several ways. First, the resolution, which approaches one micron, is one thousand times higher than that of medical CT. Second, our approach acquires and analyzes the data in a panoramic imaging format that directly produces three-dimensional maps in a series of contiguous stacked planes. Typical maps available today consist of three hundred planar sections, each containing 512x512 pixels. Finally, and perhaps of most scientific import, microtomography using a synchrotron X-ray source allows us to generate maps of individual elements.
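The reconstruction of internal structure from attenuation measurements described above can be illustrated with a minimal filtered-back-projection sketch using scikit-image. The phantom, scan geometry, and angular sampling below are generic assumptions unrelated to the authors' instrument.

import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, rescale

# Generic attenuation phantom standing in for one planar section of a sample.
section = rescale(shepp_logan_phantom(), 0.5)

# Simulate attenuation measurements (projections) over 180 degrees...
angles = np.linspace(0.0, 180.0, max(section.shape), endpoint=False)
sinogram = radon(section, theta=angles)

# ...and reconstruct the internal structure of the section by filtered back-projection.
reconstruction = iradon(sinogram, theta=angles)
print("reconstructed section shape:", reconstruction.shape)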


Author(s):  
John A. Hunt

Spectrum-imaging is a useful technique for comparing different processing methods on very large data sets which are identical for each method. This paper is concerned with comparing methods of electron energy-loss spectroscopy (EELS) quantitative analysis on the Al-Li system. The spectrum-image analyzed here was obtained from an Al-10at%Li foil aged to produce δ' precipitates that can span the foil thickness. Two 1024-channel EELS spectra, offset in energy by 1 eV, were recorded and stored at each pixel in the 80x80 spectrum-image (25 Mbytes). An energy range of 39-89 eV (20 channels/eV) is represented. During processing the spectra are either subtracted to create an artifact-corrected difference spectrum, or the energy offset is numerically removed and the spectra are added to create a normal spectrum. The spectrum-images are processed into 2D floating-point images using methods and software described in [1].
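The two-spectrum processing described above (subtract to form an artifact-corrected difference spectrum, or remove the 1 eV offset and add) can be sketched in NumPy. The array names and synthetic Poisson data below are invented for illustration and do not reproduce the actual spectrum-image.

import numpy as np

CHANNELS_PER_EV = 20                 # 20 channels/eV, as stated in the abstract
OFFSET_CH = 1 * CHANNELS_PER_EV      # 1 eV energy offset between the two acquisitions

# Hypothetical spectrum-image: 80x80 pixels, two 1024-channel spectra per pixel.
rng = np.random.default_rng(0)
spectra_a = rng.poisson(100, size=(80, 80, 1024)).astype(float)
spectra_b = rng.poisson(100, size=(80, 80, 1024)).astype(float)

# Difference spectrum: subtracting the offset acquisitions channel by channel
# cancels fixed-pattern (channel-to-channel gain) artifacts.
difference = spectra_a - spectra_b

# Normal spectrum: numerically remove the energy offset, then add the two acquisitions.
normal = spectra_a[..., :-OFFSET_CH] + spectra_b[..., OFFSET_CH:]

print(difference.shape, normal.shape)  # (80, 80, 1024) and (80, 80, 1004)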


Author(s):  
Leo Barish

Although most of the wool used today consists of fine, unmedullated down-type fibers, a great deal of coarse wool is used for carpets, tweeds, industrial fabrics, etc. Besides the obvious diameter difference, coarse wool fibers are often medullated.

Medullation may be easily observed using bright-field light microscopy. Fig. 1A shows a typical fine-diameter, nonmedullated wool fiber; Fig. 1B illustrates a coarse fiber with a large medulla. The opacity of the medulla is due to the inability of the mounting media to penetrate to the center of the fiber, leaving air pockets. Fig. 1C shows an even thicker fiber with a very large medulla and a very thin skin. This type of wool is called “Kemp”; it is shed annually or more often, and corresponds to the guard hair of fur-bearing animals.

