Recent Developments In Laboratory Data Sets For Determination Of Miscibility Limits

10.2118/91-4 ◽  
1991 ◽  
Author(s):  
F.B. Thomas ◽  
D.B. Bennion ◽  
D.W. Bennion


2019 ◽  
Vol 75 (9) ◽  
pp. 782-791 ◽  
Author(s):  
Max E. Wilkinson ◽  
Ananthanarayanan Kumar ◽  
Ana Casañal

Recent developments have resulted in electron cryo-microscopy (cryo-EM) becoming a useful tool for the structure determination of biological macromolecules. For samples containing inherent flexibility, heterogeneity or preferred orientation, the collection of extensive cryo-EM data using several conditions and microscopes is often required. In such a scenario, merging cryo-EM data sets is advantageous because it allows improved three-dimensional reconstructions to be obtained. Since data sets are not always collected with the same pixel size, merging data can be challenging. Here, two methods to combine cryo-EM data are described. Both involve the calculation of a rescaling factor from independent data sets. The effects of errors in the scaling factor on the results of data merging are also estimated. The methods described here provide a guideline for cryo-EM users who wish to combine data sets from the same type of microscope and detector.
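A minimal sketch of the kind of rescaling-factor estimation described above, assuming two reconstructions of the same particle are available as cubic NumPy arrays on grids of nominally similar pixel size. This is an illustrative grid search over candidate scale factors, not the authors' procedure; the scan range, step and helper names are placeholders.

```python
import numpy as np
from scipy.ndimage import zoom


def center_box(vol, n):
    """Center-crop or zero-pad a cubic volume to an n x n x n box."""
    out = np.zeros((n, n, n), dtype=vol.dtype)
    m = vol.shape[0]
    half = min(n, m) // 2
    so = slice(n // 2 - half, n // 2 + half)
    si = slice(m // 2 - half, m // 2 + half)
    out[so, so, so] = vol[si, si, si]
    return out


def cross_correlation(a, b):
    """Real-space correlation coefficient of two equally sized maps."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float((a * b).mean())


def estimate_rescaling_factor(map_a, map_b, scales=np.linspace(0.97, 1.03, 61)):
    """Return the scale factor that best matches map_b to map_a.

    If the best scale is s, the pixel size of data set B is approximately
    s times the pixel size of data set A.
    """
    n = map_a.shape[0]
    best_scale, best_cc = 1.0, -np.inf
    for s in scales:
        # Resample map_b by factor s, then crop/pad back to map_a's box size.
        resampled = center_box(zoom(map_b, s, order=1), n)
        cc = cross_correlation(map_a, resampled)
        if cc > best_cc:
            best_scale, best_cc = s, cc
    return best_scale, best_cc
```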


Author(s):  
Douglas L. Dorset

The quantitative use of electron diffraction intensity data for the determination of crystal structures represents the pioneering achievement in the electron crystallography of organic molecules, an effort largely begun by B. K. Vainshtein and his co-workers. However, despite numerous representative structure analyses yielding results consistent with X-ray determinations, this entire effort was viewed with considerable mistrust by many crystallographers. This was no doubt due to the rather high crystallographic R-factors reported for some structures and, more importantly, the failure to convince many skeptics that the measured intensity data were adequate for ab initio structure determinations. We have recently demonstrated the utility of these data sets for structure analyses by direct phase determination based on the probabilistic estimate of three- and four-phase structure invariant sums. Examples include the structure of diketopiperazine using Vainshtein's 3D data, a similar 3D analysis of the room-temperature structure of thiourea, and a zonal determination of the urea structure, the latter also based on data collected by the Moscow group.
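As an illustration of the direct-methods step mentioned above, the sketch below enumerates three-phase structure invariants (triplets with h + k + l = 0) and ranks them by the Cochran reliability parameter computed from normalised structure-factor magnitudes |E|. The reflection indices, |E| values and atom count are hypothetical, and this is not the actual procedure used in the cited analyses.

```python
from itertools import combinations


def rank_triplets(e_mags, n_atoms):
    """Enumerate triplets h + k + l = 0 and estimate the Cochran parameter
    kappa = (2 / sqrt(N)) * |E_h E_k E_l|.  A large kappa means the invariant
    sum phi_h + phi_k + phi_l is probably close to zero."""
    found, seen = [], set()
    for (h, eh), (k, ek) in combinations(e_mags.items(), 2):
        l = tuple(-(a + b) for a, b in zip(h, k))
        if l not in e_mags:
            continue
        key = tuple(sorted((h, k, l)))
        if key in seen:                      # skip re-ordered duplicates
            continue
        seen.add(key)
        kappa = 2.0 / n_atoms ** 0.5 * eh * ek * e_mags[l]
        found.append((kappa, h, k, l))
    return sorted(found, reverse=True)


# Hypothetical zonal data: (h, k, 0) reflections with large |E| values.
e_mags = {(2, 0, 0): 2.1, (1, 1, 0): 1.8, (-3, -1, 0): 1.5, (3, 1, 0): 1.5}
for kappa, h, k, l in rank_triplets(e_mags, n_atoms=16)[:5]:
    print(f"{h} + {k} + {l}: kappa = {kappa:.2f}")
```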


Author(s):  
William Krakow ◽  
David A. Smith

Recent developments in specimen preparation, imaging and image analysis together permit the experimental determination of the atomic structure of certain simple grain boundaries in metals such as gold. Single-crystal, ∼125 Å thick, (110)-oriented gold films are vapor deposited onto ∼3000 Å of epitaxial silver on (110)-oriented, cut and polished rock salt substrates. Bicrystal gold films are then made by first removing the silver-coated substrate and then placing two suitably misoriented pieces of the gold film in contact on a gold grid. Controlled heating in a hot stage first produces twist boundaries, which then migrate, reducing the grain boundary area, to give mixed boundaries and finally tilt boundaries perpendicular to the foil. These specimens are well suited to investigation by high-resolution transmission electron microscopy.


Information ◽  
2021 ◽  
Vol 12 (5) ◽  
pp. 202
Author(s):  
Louai Alarabi ◽  
Saleh Basalamah ◽  
Abdeltawab Hendawi ◽  
Mohammed Abdalla

The rapid spread of infectious diseases is a major public health problem. Recent developments in fighting these diseases have heightened the need for a contact tracing process. Contact tracing can be considered an ideal method for controlling the transmission of infectious diseases. The outcome of contact tracing is diagnostic testing, treatment or self-isolation of suspected cases, and treatment of infected persons, which ultimately limits the spread of disease. This paper proposes a technique named TraceAll that traces all contacts exposed to an infected patient and produces a list of these contacts as potentially infected persons. Initially, it considers the infected patient as the querying user and starts to fetch the contacts exposed to that user. Secondly, it obtains all the trajectories of objects that moved near the querying user. Next, it examines these trajectories with respect to social distance and exposure period to identify whether these objects have become infected. Experimental evaluation of the proposed technique on real data sets illustrates the effectiveness of this solution. Comparative experiments confirm that TraceAll outperforms baseline methods by 40% in the efficiency of answering contact tracing queries.
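The core query can be illustrated with a short sketch. It assumes trajectories are stored as timestamped (t, x, y) samples on a common clock, and flags an object as a potential contact when it stays within a social-distance threshold of the querying user for at least the exposure period. This is an illustrative brute-force version, not the paper's TraceAll implementation, and all parameter values are placeholders.

```python
from math import hypot


def potential_contacts(query_traj, trajectories, social_distance=2.0,
                       exposure_period=900, sample_interval=60):
    """Return the IDs of objects exposed to the querying (infected) user.

    query_traj:   list of (timestamp, x, y) samples for the infected user.
    trajectories: dict mapping object ID -> list of (timestamp, x, y) samples.
    """
    query = {t: (x, y) for t, x, y in query_traj}
    exposed = []
    for obj_id, traj in trajectories.items():
        close_time = 0
        for t, x, y in traj:
            if t in query:
                qx, qy = query[t]
                if hypot(x - qx, y - qy) <= social_distance:
                    close_time += sample_interval   # seconds spent nearby
        if close_time >= exposure_period:
            exposed.append(obj_id)
    return exposed
```

A real implementation would replace this linear scan with a spatio-temporal index over the trajectories in order to reach the query efficiency reported in the paper.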


Author(s):  
Hernâni Marques ◽  
Pedro Cruz-Vicente ◽  
Tiago Rosado ◽  
Mário Barroso ◽  
Luís A. Passarinha ◽  
...  

Exposure to environmental tobacco smoke (ETS) and smoking have been described as the most prevalent factors in the development of certain diseases worldwide. According to the World Health Organization, more than 8 million people die every year because of tobacco, around 7 million from direct use and the remainder from exposure to second-hand smoke. Both active and second-hand exposure can be measured and controlled using specific biomarkers of tobacco and its derivatives, allowing the development of more efficient public health policies. Exposure to these compounds can be measured using different methods (involving, for instance, liquid- or gas-chromatographic procedures) in a wide range of biological specimens to estimate the type and degree of tobacco exposure. In recent years, considerable research has been carried out using different extraction methods and analytical equipment: liquid–liquid extraction, solid-phase extraction and even miniaturized procedures have been used, followed by chromatographic analysis coupled mainly to mass spectrometric detection. Through such methodologies, second-hand smokers can be distinguished from active smokers, and the same holds for e-cigarette users and vapers, among others, on the basis of their specific biomarkers. This review focuses on recent developments in the determination of tobacco smoke biomarkers, including nicotine and other tobacco alkaloids, specific nitrosamines and polycyclic aromatic hydrocarbons, among others. The methods for their detection are discussed in detail, as well as the potential use of threshold values to distinguish between types of exposure.


Sensors ◽  
2020 ◽  
Vol 21 (1) ◽  
pp. 189
Author(s):  
Susana Campuzano ◽  
Paloma Yáñez-Sedeño ◽  
José Manuel Pingarrón

The multifaceted key roles of cytokines in immunity and inflammatory processes have generated strong clinical interest in determining these biomolecules as a tool for the diagnosis, prognosis, monitoring and treatment of several diseases of great current relevance (autoimmune, neurodegenerative, cardiac, viral and cancer diseases, hypercholesterolemia and diabetes). Therefore, the rapid and accurate determination of cytokine biomarkers in body fluids, cells and tissues has attracted considerable attention. However, many of the techniques currently used for this purpose, although sensitive and selective, require expensive equipment and advanced operator skills and do not meet the demands of today's clinics in terms of test time, simplicity and point-of-care applicability. In the ongoing pursuit of new analytical methodologies, electrochemical biosensing is steadily gaining ground as a strategy for developing simple, low-cost methods capable of multiplexed and multiomic determinations in a short time and with a small amount of sample. This review surveys the electrochemical biosensing methods reported in the last five years for the determination of cytokines, summarizes recent developments and trends through a comprehensive discussion of selected strategies, and highlights the challenges that remain in this field. Considering the key role that different materials (nano- or micrometre-sized, with or without magnetic properties) have played in recent years in the design of electrochemical biosensing strategies with enhanced analytical performance, special attention is paid to methods exploiting these approaches.


Minerals ◽  
2020 ◽  
Vol 11 (1) ◽  
pp. 33
Author(s):  
Valérie Laperche ◽  
Bruno Lemière

Portable X-ray fluorescence (pXRF) spectroscopy is now widely used in almost every field of geoscience. Handheld XRF analysers are easy to use, and results are available almost in real time, anywhere. However, the results do not always match laboratory analyses, and this may deter users. Rather than analytical issues, the bias often results from differences in sample preparation. Instrument setup and analysis conditions need to be fully understood to avoid reporting erroneous results, and the technique's limitations must be kept in mind. We describe a number of issues and potential pitfalls observed in our experience and described in the literature. These include the analytical mode and parameters; protective films; sample geometry and density, especially for light elements; analytical interferences between elements; physical effects of the matrix and sample condition; and more. Nevertheless, pXRF results gathered with sufficient care by experienced users are both precise and reliable, if not fully accurate, and they can constitute robust data sets. Rather than being a substitute for laboratory analyses, pXRF measurements are a valuable complement to them and improve the quality and relevance of laboratory data sets.
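One common way to quantify and partly correct the field-versus-laboratory bias discussed above is a site-specific linear calibration of pXRF readings against laboratory assays of the same samples. The sketch below is illustrative only, not a procedure from the paper; the element and the paired concentrations are hypothetical placeholders.

```python
import numpy as np


def fit_correction(pxrf, lab):
    """Ordinary least-squares fit: lab ~ slope * pxrf + intercept."""
    pxrf = np.asarray(pxrf, dtype=float)
    lab = np.asarray(lab, dtype=float)
    slope, intercept = np.polyfit(pxrf, lab, 1)
    residuals = lab - (slope * pxrf + intercept)
    rmse = float(np.sqrt(np.mean(residuals ** 2)))
    return slope, intercept, rmse


# Hypothetical Zn concentrations (ppm) from paired field and laboratory measurements.
pxrf_ppm = [120, 340, 510, 890, 1500]
lab_ppm = [140, 360, 560, 930, 1620]

slope, intercept, rmse = fit_correction(pxrf_ppm, lab_ppm)
corrected = [slope * x + intercept for x in pxrf_ppm]
print(f"lab ~= {slope:.2f} * pXRF + {intercept:.1f}  (RMSE {rmse:.0f} ppm)")
```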


2021 ◽  
Vol 2021 (2) ◽  
Author(s):  
DianYu Liu ◽  
ChuanLe Sun ◽  
Jun Gao

Abstract The possible non-standard interactions (NSIs) of neutrinos with matter play an important role in the global determination of neutrino properties. In this study we select various data sets from LHC measurements at 13 TeV with integrated luminosities of 35–139 fb⁻¹, including production of a single jet, photon, W/Z boson or charged lepton accompanied by large missing transverse momentum. We derive constraints on neutral-current NSIs with quarks imposed by the different data sets in a framework of either effective operators or simplified Z′ models. We use theoretical predictions of the NSI-induced production at next-to-leading order in QCD matched with parton showering, which stabilizes the theoretical predictions and results in more robust constraints. In a simplified Z′ model we obtain 95% CLs upper limits on the conventional NSI strength ϵ of 0.042 and 0.0028 for Z′ masses of 0.2 and 2 TeV, respectively. We also discuss possible improvements from future LHC runs with higher luminosities.
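To illustrate how such a bound arises, the sketch below derives a 95% CL upper limit on an NSI strength ϵ from a single counting measurement, assuming the new-physics contribution has the usual linear (interference) plus quadratic dependence on ϵ. It uses a simple Δχ² scan rather than the paper's full CLs machinery, and every number (SM expectation, observed counts, coefficients, uncertainty) is a hypothetical placeholder.

```python
import numpy as np


def chi2(eps, n_obs, n_sm, lin, quad, sigma):
    """Gaussian chi-square of the prediction n_sm + lin*eps + quad*eps**2."""
    pred = n_sm + lin * eps + quad * eps ** 2
    return ((n_obs - pred) / sigma) ** 2


def upper_limit(n_obs, n_sm, lin, quad, sigma, grid=np.linspace(0.0, 1.0, 10001)):
    """Smallest epsilon whose chi-square exceeds the best fit by 3.84
    (the 95% CL threshold for one parameter)."""
    values = np.array([chi2(e, n_obs, n_sm, lin, quad, sigma) for e in grid])
    values -= values.min()                   # delta(chi^2) relative to best fit
    above = grid[values > 3.84]
    return float(above[0]) if above.size else None


# Hypothetical counting experiment.
print(upper_limit(n_obs=1020.0, n_sm=1000.0, lin=150.0, quad=4000.0, sigma=50.0))
```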


2004 ◽  
Vol 67 (9) ◽  
pp. 2024-2032 ◽  
Author(s):  
FUMIKO KASUGA ◽  
MASAMITSU HIROTA ◽  
MASAMICHI WADA ◽  
TOSHIHIKO YUNOKAWA ◽  
HAJIME TOYOFUKU ◽  
...  

The Ministry of Health, Labour and Welfare (former MHW) of Japan issued a Directive in 1997 advising restaurants and caterers to freeze portions of both raw foods and cooked dishes for at least 2 weeks. This system has been useful for determining vehicle foods in outbreaks, and enumeration of bacteria in samples of the stored food provides data on pathogen concentrations in the implicated food. Data on Salmonella concentrations in vehicle foods associated with salmonellosis outbreaks were collected in Japan between 1989 and 1998. The 39 outbreaks that occurred during this period were categorized by the settings in which they took place, and epidemiological data from each outbreak were summarized. The characteristics of the outbreak groups were analyzed and compared, and the effect of the new food-storage system on the determination of bacterial concentrations was evaluated. Dose-response relationships obtained under freezing and non-freezing storage conditions prior to microbial examination were compared. Data from outbreaks in which the implicated foods had been kept frozen suggested an apparent correlation between the ingested Salmonella dose and the disease rate. Combined with the results of epidemiological investigations, such quantitative data on the ingested pathogen could provide complete dose-response data sets.
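As an illustration of how outbreak data of this kind feed into dose-response modelling, the sketch below fits a simple exponential model, P(ill) = 1 − exp(−r · dose), to paired dose and attack-rate summaries. The dose and attack-rate values are hypothetical placeholders, not figures from the study, and the exponential form is just one common choice of model.

```python
import numpy as np
from scipy.optimize import curve_fit


def exponential_model(dose, r):
    """Probability of illness for an ingested dose (CFU) and infectivity parameter r."""
    return 1.0 - np.exp(-r * dose)


# Hypothetical outbreak summaries: mean ingested dose (CFU) and observed attack rate.
doses = np.array([1e2, 1e3, 1e4, 1e5, 1e6])
attack_rates = np.array([0.02, 0.10, 0.35, 0.70, 0.95])

popt, _ = curve_fit(exponential_model, doses, attack_rates,
                    p0=[1e-4], bounds=(0.0, np.inf))
print(f"estimated r = {popt[0]:.2e}")
```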


2008 ◽  
Vol 44-46 ◽  
pp. 871-878 ◽  
Author(s):  
Chu Yang Luo ◽  
Jun Jiang Xiong ◽  
R.A. Shenoi

This paper outlines a new technique to address the paucity of data in determining fatigue life and performance based on reliability concepts. Two new randomized models are presented for estimating the safe life and the p-S-N curve, using standard statistical procedures to deal with small samples of incomplete data. Confidence-level formulations for the safe life and the p-S-N curve are also given. The concepts are then applied to the determination of the safe life and the p-S-N curve. Two sets of fatigue tests for the safe life and the p-S-N curve are conducted to validate the presented method, demonstrating the practical use of the proposed technique.
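For context, a conventional small-sample approach to a p-S-N curve is sketched below: fit log N = a + b·log S, then shift the fit downward by an approximate one-sided tolerance factor so that, with confidence γ, at least a fraction p of specimens survive longer than the curve. This is not the paper's randomized models, and the stress/life data are hypothetical.

```python
import numpy as np
from scipy.stats import norm, nct


def psn_curve(stress, cycles, p=0.95, gamma=0.95):
    """Return (a, b, offset) so that the p-S-N curve is
    log10(N_p) = a + b*log10(S) - offset."""
    x, y = np.log10(stress), np.log10(cycles)
    b, a = np.polyfit(x, y, 1)                  # mean S-N fit in log-log space
    n = len(stress)
    s = (y - (a + b * x)).std(ddof=2)           # residual standard deviation
    # Approximate one-sided tolerance factor via the noncentral t distribution.
    k = nct.ppf(gamma, df=n - 2, nc=norm.ppf(p) * np.sqrt(n)) / np.sqrt(n)
    return a, b, k * s


# Hypothetical small-sample data: stress amplitude (MPa) and cycles to failure.
stress = np.array([300, 300, 250, 250, 200, 200])
cycles = np.array([8.1e4, 1.2e5, 3.0e5, 4.4e5, 1.1e6, 1.6e6])

a, b, offset = psn_curve(stress, cycles)
print(f"log10(N_p) = {a:.2f} {b:+.2f}*log10(S) - {offset:.2f}")
```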

