A modified Pinkerton-type helium gas-flow system for high-accuracy data collection at the X3 SUNY synchrotron beamline at NSLS

2001 ◽  
Vol 34 (1) ◽  
pp. 76-79 ◽  
Author(s):  
Lynn Ribaud ◽  
Guang Wu ◽  
Yuegang Zhang ◽  
Philip Coppens

As the combination of high-intensity synchrotron sources and area detectors allows collection of large data sets in a much shorter time span than previously possible, the use of open helium gas-flow systems is much facilitated. A flow system installed at the SUNY X3 synchrotron beamline at the National Synchrotron Light Source has been used for collection of a number of large data sets at a temperature of ∼16 K. Instability problems encountered when using a helium cryostat for three-dimensional data collection are eliminated. Details of the equipment, its temperature calibration and a typical result are described.

2017 ◽  
Vol 73 (2) ◽  
pp. 79-92 ◽  
Author(s):  
Ilka Müller

With continuous technical improvements at synchrotron facilities, data-collection rates have increased dramatically. This makes it possible to collect diffraction data for hundreds of protein–ligand complexes within a day, provided that a suitable crystal system is at hand. However, developing a suitable crystal system can prove challenging, exceeding the timescale of data collection by several orders of magnitude. Firstly, a useful crystallization construct of the protein of interest needs to be chosen and its expression and purification optimized, before screening for suitable crystallization and soaking conditions can start. This article reviews recent publications analysing large data sets of crystallization trials, with the aim of identifying factors that do or do not make a good crystallization construct, and gives guidance in the design of an expression construct. It provides an overview of common protein-expression systems, addresses how ligand binding can be both a help and a hindrance for protein purification, and describes ligand co-crystallization and soaking, with an emphasis on troubleshooting.


Geophysics ◽  
1990 ◽  
Vol 55 (10) ◽  
pp. 1321-1326 ◽  
Author(s):  
X. Wang ◽  
R. O. Hansen

Two-dimensional (profile) inversion techniques for magnetic anomalies are widely used in exploration geophysics; but, until now, the three-dimensional (3-D) methods available have been restricted in their geologic applicability, dependent upon good initial values or limited by the capabilities of existing computers. We have developed a fully 3-D inversion algorithm intended for routine application to large data sets. The algorithm, based on a Fourier transform expression for the magnetic field of homogeneous polyhedral bodies (Hansen and Wang, 1988), is a 3-D generalization of CompuDepth (O’Brien, 1972). Like CompuDepth, the new inversion algorithm employs the spatial equivalent of frequency-domain autoregression to determine a series of coefficients from which the depths and locations of polyhedral vertices are calculated by solving complex polynomials. These vertices are used to build a 3-D geologic model. Application to the Medicine Lake Volcano aeromagnetic anomaly resulted in a geologically reasonable model of the source.
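The core idea, autoregression across the Fourier-transformed anomaly so that depths and horizontal locations emerge as roots of a complex polynomial, can be illustrated with a drastically reduced single-source, profile-style sketch in Python. The synthetic spectrum, the single-coefficient fit and all variable names below are illustrative assumptions, not the published Hansen and Wang formulation or CompuDepth itself:

```python
import numpy as np

# Synthetic profile anomaly: one source modeled as a decaying complex
# exponential in the wavenumber domain (depth z0, horizontal position x0).
n, dx = 256, 0.1
z0, x0 = 1.5, 12.0
k = np.fft.rfftfreq(n, d=dx) * 2 * np.pi            # wavenumbers
spectrum = np.exp(-k * z0) * np.exp(-1j * k * x0)    # idealized source spectrum

# Frequency-domain autoregression: each spectral sample is (approximately) a
# fixed complex multiple of the previous one; fit that multiplier by least squares.
num_sources = 1
rows = np.column_stack([spectrum[i:len(spectrum) - num_sources + i]
                        for i in range(num_sources)])
rhs = spectrum[num_sources:]
coeffs, *_ = np.linalg.lstsq(rows, rhs, rcond=None)

# Roots of the prediction polynomial encode depth (magnitude) and horizontal
# location (phase) per wavenumber step dk.
dk = k[1] - k[0]
roots = np.roots(np.r_[1.0, -coeffs[::-1]])
depth_est = -np.log(np.abs(roots)) / dk
x_est = -np.angle(roots) / dk
print(f"estimated depth {depth_est[0]:.2f}, position {x_est[0]:.2f}")
```

With the synthetic parameters above the fit recovers the assumed depth (1.5) and position (12.0); the published algorithm extends this idea to polyhedral vertices in three dimensions.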


2006 ◽  
Vol 21 (2) ◽  
pp. 102-104 ◽  
Author(s):  
Colleen S. Frazer ◽  
Mark A. Rodriguez ◽  
Ralph G. Tissot

The Interactive Data Language has been used to produce a software program capable of advanced three-dimensional visualizations of pole figure and θ-2θ data. The data can also be used to calculate quantitative properties such as strain level and to minimize the peak-height texture effects in individual θ-2θ scans. The collection of the large data sets necessary for the analyses is facilitated by use of a position sensitive detector or area detector.
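The original program is written in the Interactive Data Language; a rough Python/matplotlib sketch of the same idea, rendering pole-figure intensity over the projection hemisphere in three dimensions, might look as follows. The (chi, phi) grid and the synthetic texture component are invented for illustration and are not the authors' software:

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical pole-figure grid: intensity sampled on a (chi, phi) mesh,
# i.e. tilt 0-90 degrees and azimuth 0-360 degrees.
chi = np.radians(np.linspace(0, 90, 91))
phi = np.radians(np.linspace(0, 360, 181))
CHI, PHI = np.meshgrid(chi, phi, indexing="ij")

# Synthetic texture component: a broad intensity maximum near chi = 35 degrees.
intensity = np.exp(-((CHI - np.radians(35)) ** 2) / 0.05) * (1 + 0.3 * np.cos(2 * PHI))

# Map the hemisphere to Cartesian coordinates and colour the surface by intensity.
X = np.sin(CHI) * np.cos(PHI)
Y = np.sin(CHI) * np.sin(PHI)
Z = np.cos(CHI)

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.plot_surface(X, Y, Z, facecolors=plt.cm.viridis(intensity / intensity.max()),
                rstride=2, cstride=2, linewidth=0, antialiased=False)
ax.set_title("Pole-figure intensity on the projection hemisphere")
plt.show()
```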


Psychology ◽  
2020 ◽  
Author(s):  
Jeffrey Stanton

The term “data science” refers to an emerging field of research and practice that focuses on obtaining, processing, visualizing, analyzing, preserving, and re-using large collections of information. A related term, “big data,” has been used to refer to one of the important challenges faced by data scientists in many applied environments: the need to analyze large data sources, in certain cases using high-speed, real-time data analysis techniques. Data science encompasses much more than big data, however, as a result of many advancements in cognate fields such as computer science and statistics. Data science has also benefited from the widespread availability of inexpensive computing hardware—a development that has enabled “cloud-based” services for the storage and analysis of large data sets. The techniques and tools of data science have broad applicability in the sciences. Within the field of psychology, data science offers new opportunities for data collection and data analysis that have begun to streamline and augment efforts to investigate the brain and behavior. The tools of data science also enable new areas of research, such as computational neuroscience. As an example of the impact of data science, psychologists frequently use predictive analysis as an investigative tool to probe the relationships between a set of independent variables and one or more dependent variables. While predictive analysis has traditionally been accomplished with techniques such as multiple regression, recent developments in the area of machine learning have put new predictive tools in the hands of psychologists. These machine learning tools relax distributional assumptions and facilitate exploration of non-linear relationships among variables. These tools also enable the analysis of large data sets by opening options for parallel processing. In this article, a range of relevant areas from data science is reviewed for applicability to key research problems in psychology including large-scale data collection, exploratory data analysis, confirmatory data analysis, and visualization. This bibliography covers data mining, machine learning, deep learning, natural language processing, Bayesian data analysis, visualization, crowdsourcing, web scraping, open source software, application programming interfaces, and research resources such as journals and textbooks.
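As a minimal illustration of the predictive-analysis point, the following Python sketch contrasts multiple (linear) regression with a random forest on simulated data containing a non-linear, interactive effect; the simulated variables and forest settings are assumptions chosen only for the example:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

# Simulated study: two predictors with a non-linear, interactive effect on the outcome.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))
y = np.sin(X[:, 0]) * X[:, 1] + rng.normal(scale=0.3, size=500)

# Traditional predictive analysis: multiple (linear) regression.
linear_r2 = cross_val_score(LinearRegression(), X, y, cv=5, scoring="r2").mean()

# Machine-learning alternative: a random forest, which relaxes linearity
# assumptions and can exploit parallel processing (n_jobs=-1).
forest = RandomForestRegressor(n_estimators=200, n_jobs=-1, random_state=0)
forest_r2 = cross_val_score(forest, X, y, cv=5, scoring="r2").mean()

print(f"linear regression R^2: {linear_r2:.2f}")
print(f"random forest     R^2: {forest_r2:.2f}")
```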


2005 ◽  
Vol 11 (1) ◽  
pp. 9-17 ◽  
Author(s):  
H. Narfi Stefansson ◽  
Kevin W. Eliceiri ◽  
Charles F. Thomas ◽  
Amos Ron ◽  
Ron DeVore ◽  
...  

The use of multifocal-plane, time-lapse recordings of living specimens has allowed investigators to visualize dynamic events both within ensembles of cells and within individual cells. Recordings of such four-dimensional (4D) data from digital optical sectioning microscopy produce very large data sets. We describe a wavelet-based data compression algorithm that capitalizes on the inherent redundancies within multidimensional data to achieve higher compression levels than can be obtained from single images. The algorithm will permit remote users to roam at high speed through large 4D data sets using communication channels of modest bandwidth. This will allow animation to be used as a powerful aid to visualizing dynamic changes in three-dimensional structures.
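A greatly simplified sketch of the general approach, a multidimensional wavelet transform followed by coefficient thresholding, is shown below using PyWavelets; the synthetic stack, wavelet choice and threshold are illustrative assumptions and this is not the authors' algorithm:

```python
import numpy as np
import pywt

# Hypothetical 4D recording: a (time, z, y, x) stack from optical sectioning microscopy.
rng = np.random.default_rng(1)
stack = rng.normal(size=(16, 16, 64, 64)).astype(np.float32)

# Multidimensional wavelet transform over all four axes at once, so redundancy
# between neighbouring time points and focal planes is exploited, not just
# redundancy within single 2D images.
coeffs = pywt.wavedecn(stack, wavelet="db2", level=2)
arr, slices = pywt.coeffs_to_array(coeffs)

# Crude compression: keep only the largest 5% of coefficients.
threshold = np.quantile(np.abs(arr), 0.95)
arr[np.abs(arr) < threshold] = 0.0

# Reconstruct the (lossy) stack from the thresholded coefficients.
restored = pywt.waverecn(pywt.array_to_coeffs(arr, slices, output_format="wavedecn"),
                         wavelet="db2")
print("kept coefficients:", np.count_nonzero(arr), "of", arr.size)
```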


1987 ◽  
Vol 20 (6) ◽  
pp. 507-511
Author(s):  
J. B. Weinrach ◽  
D. W. Bennett

An algorithm for the optimization of data-collection time has been written and the resulting computer program tested for diffractometer systems. The program, which utilizes a global statistical approach to the traveling salesman problem, yields reasonable solutions in a relatively short time. The algorithm has been successful in treating very large data sets (up to 4000 points) in three dimensions, with subsequent time savings of ca. 30%.
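The paper's own global statistical method is not reproduced here, but the flavour of ordering diffractometer settings as a traveling-salesman-style problem can be sketched with a simple simulated-annealing pass; the angular cost function, point list and annealing schedule below are assumptions for illustration only:

```python
import numpy as np

# Hypothetical reflection list: angular settings for three diffractometer axes (degrees).
rng = np.random.default_rng(2)
points = rng.uniform(0, 360, size=(400, 3))

def path_length(order):
    """Total angular travel when points are visited in the given order."""
    diffs = np.diff(points[order], axis=0)
    return np.abs(diffs).sum()

# Simulated annealing over the visiting order, using 2-opt style segment reversals.
order = rng.permutation(len(points))
current = path_length(order)
temperature = current / len(points)
for step in range(20000):
    i, j = sorted(rng.integers(0, len(points), size=2))
    if i == j:
        continue
    trial = order.copy()
    trial[i:j + 1] = trial[i:j + 1][::-1]   # reverse one segment of the tour
    cost = path_length(trial)
    # Accept improvements always, and worse tours with a temperature-dependent probability.
    if cost < current or rng.random() < np.exp((current - cost) / temperature):
        order, current = trial, cost
    temperature *= 0.9997                    # cool slowly

print(f"angular travel after optimization: {current:.0f} degrees")
```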


Author(s):  
John A. Hunt

Spectrum-imaging is a useful technique for comparing different processing methods on very large data sets which are identical for each method. This paper is concerned with comparing methods of electron energy-loss spectroscopy (EELS) quantitative analysis on the Al-Li system. The spectrum-image analyzed here was obtained from an Al-10 at% Li foil aged to produce δ' precipitates that can span the foil thickness. Two 1024-channel EELS spectra offset in energy by 1 eV were recorded and stored at each pixel in the 80 × 80 spectrum-image (25 Mbytes). An energy range of 39–89 eV (20 channels/eV) is represented. During processing the spectra are either subtracted to create an artifact-corrected difference spectrum, or the energy offset is numerically removed and the spectra are added to create a normal spectrum. The spectrum-images are processed into 2D floating-point images using methods and software described in [1].
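The two processing paths described above, an artifact-corrected difference spectrum versus a summed "normal" spectrum with the offset removed, can be sketched on synthetic data as follows; the array sizes follow the abstract, but the data themselves and the variable names are invented for illustration:

```python
import numpy as np

# Hypothetical spectrum-image: at each pixel of an 80 x 80 map, two 1024-channel
# EELS spectra recorded with a 1 eV energy offset (20 channels/eV -> 20-channel shift).
rng = np.random.default_rng(3)
spec_a = rng.poisson(100, size=(80, 80, 1024)).astype(np.float32)
spec_b = np.roll(spec_a, 20, axis=-1) + rng.normal(scale=5, size=(80, 80, 1024))

channels_per_ev = 20
shift = channels_per_ev  # 1 eV offset expressed in channels

# Option 1: artifact-corrected difference spectrum (channel-to-channel gain
# artifacts common to both acquisitions cancel in the subtraction).
difference = spec_a - spec_b

# Option 2: remove the energy offset numerically and add the spectra,
# giving a "normal" spectrum with improved counting statistics.
normal = spec_a + np.roll(spec_b, -shift, axis=-1)

print(difference.shape, normal.shape)
```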


Author(s):  
Thomas W. Shattuck ◽  
James R. Anderson ◽  
Neil W. Tindale ◽  
Peter R. Buseck

Individual particle analysis involves the study of tens of thousands of particles using automated scanning electron microscopy and elemental analysis by energy-dispersive, x-ray emission spectroscopy (EDS). EDS produces large data sets that must be analyzed using multi-variate statistical techniques. A complete study uses cluster analysis, discriminant analysis, and factor or principal components analysis (PCA). The three techniques are used in the study of particles sampled during the FeLine cruise to the mid-Pacific ocean in the summer of 1990. The mid-Pacific aerosol provides information on long range particle transport, iron deposition, sea salt ageing, and halogen chemistry.

Aerosol particle data sets suffer from a number of difficulties for pattern recognition using cluster analysis. There is a great disparity in the number of observations per cluster and the range of the variables in each cluster. The variables are not normally distributed, they are subject to considerable experimental error, and many values are zero, because of finite detection limits. Many of the clusters show considerable overlap, because of natural variability, agglomeration, and chemical reactivity.
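A minimal sketch of the PCA-plus-clustering part of such a workflow, assuming a hypothetical particle-by-element table rather than the actual FeLine data, might look like this in Python:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical particle table: rows are particles, columns are EDS element fractions
# (e.g. Na, Cl, S, Fe); a large sea-salt class and a small iron-rich class, to mimic
# the disparity in observations per cluster noted above.
rng = np.random.default_rng(4)
sea_salt = rng.dirichlet([8, 1, 1, 0.2], size=3000)
iron_rich = rng.dirichlet([1, 0.5, 2, 6], size=150)
particles = np.vstack([sea_salt, iron_rich])

# Standardize so that minority elements with small ranges still contribute,
# then reduce dimensionality before clustering.
scaled = StandardScaler().fit_transform(particles)
scores = PCA(n_components=2).fit_transform(scaled)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)

print("particles per cluster:", np.bincount(labels))
```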

