Characterisation of molecular motions in cryo-EM single-particle data by multi-body refinement in RELION

eLife ◽  
2018 ◽  
Vol 7 ◽  
Author(s):  
Takanori Nakane ◽  
Dari Kimanius ◽  
Erik Lindahl ◽  
Sjors HW Scheres

Macromolecular complexes that exhibit continuous forms of structural flexibility pose a challenge for many existing tools in cryo-EM single-particle analysis. We describe a new tool, called multi-body refinement, which models flexible complexes as a user-defined number of rigid bodies that move independently from each other. Using separate focused refinements with iteratively improved partial signal subtraction, the new tool generates improved reconstructions for each of the defined bodies in a fully automated manner. Moreover, using principal component analysis on the relative orientations of the bodies over all particle images in the data set, we generate movies that describe the most important motions in the data. Our results on two test cases, a cytoplasmic ribosome from Plasmodium falciparum, and the spliceosomal B-complex from yeast, illustrate how multi-body refinement can be useful to gain unique insights into the structure and dynamics of large and flexible macromolecular complexes.
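The PCA step described above can be sketched in a few lines. This is a minimal illustration, not the RELION implementation: it assumes the per-particle body orientations (Euler angles and translations) have already been extracted from the multi-body run's metadata, and uses synthetic numbers with one dominant latent motion in their place.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Hypothetical per-particle parameters: for each of 1000 particles,
# 3 Euler angles + 3 translations for each of 2 bodies (12 columns).
# In a real analysis these would come from the multi-body metadata.
n_particles = 1000
latent = rng.normal(size=(n_particles, 1))  # one dominant motion
params = latent @ rng.normal(size=(1, 12)) + 0.1 * rng.normal(size=(n_particles, 12))

pca = PCA(n_components=3)
coords = pca.fit_transform(params)

# The first component captures most of the variance, mirroring how
# the most important motion is ranked by its eigenvalue.
print(pca.explained_variance_ratio_)
```

Sorting particles along the first component and reconstructing bins of them is what yields the movies of the dominant motion.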

Author(s):  
Ruijie Yao ◽  
Jiaqiang Qian ◽  
Qiang Huang

Abstract Motivation Single-particle cryo-electron microscopy (cryo-EM) has become a powerful technique for determining 3D structures of biological macromolecules at near-atomic resolution. However, this approach requires picking huge numbers of macromolecular particle images from thousands of low-contrast, noisy electron micrographs. Although machine-learning methods have been developed to remove this bottleneck, universal methods that can automatically pick noisy cryo-EM particles of various macromolecules are still lacking. Results Here, we present a deep-learning segmentation model that employs fully convolutional networks trained with synthetic data of known 3D structures, called PARSED (PARticle SEgmentation Detector). Without using any experimental information, PARSED can automatically segment the cryo-EM particles in a whole micrograph at a time, enabling faster particle picking than previous template/feature-matching and particle-classification methods. Applications to six large public cryo-EM datasets clearly validated its universal ability to pick macromolecular particles of various sizes. Thus, our deep-learning method could break the particle-picking bottleneck in single-particle analysis, and thereby accelerate high-resolution structure determination by cryo-EM. Availability and implementation The PARSED package and user manual for noncommercial use are available as Supplementary Material (in the compressed file: parsed_v1.zip). Supplementary information Supplementary data are available at Bioinformatics online.
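A segmentation-based picker of this kind typically ends with a simple post-processing step: threshold the per-pixel particle-probability map and take the centroid of each connected blob as a pick. The sketch below illustrates only that final step, with a synthetic probability map standing in for a network's output (the network itself is not reproduced here).

```python
import numpy as np
from scipy import ndimage

# Synthetic stand-in for a per-pixel particle-probability map.
rng = np.random.default_rng(1)
prob_map = rng.random((256, 256)) * 0.2               # background noise
for y, x in [(50, 60), (150, 200), (200, 40)]:        # three seeded "particles"
    prob_map[y - 5:y + 5, x - 5:x + 5] = 0.9

# Picking = threshold, label connected components, take centroids.
mask = prob_map > 0.5
labels, n_found = ndimage.label(mask)
centroids = ndimage.center_of_mass(mask, labels, range(1, n_found + 1))

print(n_found, centroids)  # 3 blobs, centred near the seeded coordinates
```

Real pickers add refinements (size filtering, non-maximum suppression near micrograph edges), but the reduction from map to coordinates follows this pattern.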


Author(s):  
J. Bernard Heymann

In single-particle analysis (SPA), the aim is to obtain a 3D reconstruction of a biological molecule from 2D electron micrographs at the highest level of detail, or resolution, possible. Current practice is to collect large volumes of data, hoping to reach high-resolution maps through sheer numbers. However, adding more particles from a specific data set eventually leads to diminishing improvements in resolution. Understanding what these resolution limits are and how to deal with them is important for the optimization and automation of SPA. This study revisits the theory of 3D reconstruction and demonstrates how the associated statistics can provide a diagnostic tool to improve SPA. Small numbers of images already give sufficient information on micrograph quality and the amount of data required to reach high resolution. Such feedback allows the microscopist to improve sample-preparation and imaging parameters before committing to extensive data collection. Once a larger data set is available, a B factor can be determined describing the suppression of the signal owing to one or more causes, such as specimen movement, radiation damage, alignment inaccuracy and structural variation. Insight into the causes of signal suppression can then guide the user to consider appropriate actions to obtain better reconstructions.
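The B factor mentioned above models an exponential falloff of signal with resolution, amplitude ∝ exp(−B / (4d²)), so fitting ln(amplitude) against 1/d² (a Guinier-style plot) recovers B from the slope. A minimal sketch with synthetic amplitudes:

```python
import numpy as np

# Synthetic signal falloff with a known B factor (in Å^2).
B_true = 120.0
inv_d2 = np.linspace(0.002, 0.08, 40)            # 1/d^2 in Å^-2
amplitude = 5.0 * np.exp(-B_true * inv_d2 / 4.0)

# Linear fit of ln(amplitude) vs 1/d^2: slope = -B/4.
slope, intercept = np.polyfit(inv_d2, np.log(amplitude), 1)
B_est = -4.0 * slope
print(round(B_est, 1))  # recovers 120.0
```

In practice the amplitudes are noisy and the fit is restricted to a resolution range where the exponential model holds, but the estimate has the same form.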


2018 ◽  
Author(s):  
M. Kazemi ◽  
C. O. S. Sorzano ◽  
A. Des Georges ◽  
J. M. Carazo ◽  
J. Vargas

Abstract Cryo-electron microscopy using single-particle analysis requires the computational averaging of thousands of projection images captured from identical macromolecules. However, macromolecules usually present some degree of flexibility, showing different conformations. Computational approaches are then required to classify heterogeneous single-particle images into homogeneous sets corresponding to different structural states. Nonetheless, the attainable resolution of reconstructions obtained from these smaller homogeneous sets is sometimes compromised by the reduced number of particles or the lack of images at certain macromolecular orientations. In these situations, the current solution to improve map resolution is to return to the electron microscope and collect more data. In this work, we present a fast approach to partially overcome this limitation for heterogeneous data sets. Our method is based on deforming and then moving particles between different conformations using an optical flow approach. Particles are then merged into a single conformation, yielding reconstructions with improved resolution, contrast and signal-to-noise ratio, partially circumventing many issues that hinder obtaining high-quality reconstructions from small data sets. We present experimental results that show clear improvements in the quality of the obtained 3D maps; however, there are also limits to this approach, which we discuss in the manuscript.
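The payoff from merging conformations comes from averaging statistics: combining N aligned particle images raises the signal-to-noise ratio roughly N-fold (amplitude SNR by √N). The toy sketch below illustrates only that payoff; the hard part the paper addresses, deforming heterogeneous particles into one conformation with optical flow, is replaced here by already-aligned synthetic 1D "images".

```python
import numpy as np

rng = np.random.default_rng(2)

# One clean signal, observed 100 times under unit-variance noise,
# standing in for 100 particle images of the same conformation.
signal = np.sin(np.linspace(0, 2 * np.pi, 128))
n = 100
stack = signal + rng.normal(scale=1.0, size=(n, 128))

snr_single = np.var(signal) / 1.0          # one image: noise variance 1
avg = stack.mean(axis=0)
noise_after = avg - signal
snr_merged = np.var(signal) / np.var(noise_after)

print(snr_merged / snr_single)  # close to N = 100
```

Merging more particles into one conformation buys exactly this kind of gain, which is why moving particles between conformations can substitute for collecting more data.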


2015 ◽  
Vol 22 (6) ◽  
pp. 1345-1352 ◽  
Author(s):  
S. A. Bobkov ◽  
A. B. Teslyuk ◽  
R. P. Kurta ◽  
O. Yu. Gorobtsov ◽  
O. M. Yefanov ◽  
...  

Modern X-ray free-electron lasers (XFELs) operating at high repetition rates produce a tremendous amount of data. It is a great challenge to classify this information and reduce the initial data set to a manageable size for further analysis. Here an approach for classification of diffraction patterns measured in prototypical diffract-and-destroy single-particle imaging experiments at XFELs is presented. It is proposed that the data are classified on the basis of a set of parameters that take into account the underlying diffraction physics and specific relations between the real-space structure of a particle and its reciprocal-space intensity distribution. The approach is demonstrated by applying principal component analysis and support vector machine algorithms to the simulated and measured X-ray data sets.
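The PCA-plus-SVM pipeline described above can be sketched compactly with scikit-learn. This is a hedged illustration, not the authors' code: synthetic Gaussian vectors stand in for flattened diffraction patterns, and the two classes mimic "hit" versus "empty" shots.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)

# Synthetic stand-ins for flattened diffraction patterns (64 "pixels").
hits = rng.normal(loc=1.0, size=(200, 64))    # shots containing a particle
misses = rng.normal(loc=0.0, size=(200, 64))  # empty shots
X = np.vstack([hits, misses])
y = np.array([1] * 200 + [0] * 200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Reduce with PCA, then classify with an SVM.
clf = make_pipeline(PCA(n_components=10), SVC(kernel="rbf"))
clf.fit(X_tr, y_tr)
print(clf.score(X_te, y_te))
```

With physics-motivated parameters replacing raw pixels, as the paper proposes, the same pipeline applies with a much smaller feature vector.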


2015 ◽  
Vol 14 (4) ◽  
pp. 165-181 ◽  
Author(s):  
Sarah Dudenhöffer ◽  
Christian Dormann

Abstract. The purpose of this study was to replicate the dimensions of the customer-related social stressors (CSS) concept across service jobs, to investigate their consequences for service providers' well-being, and to examine emotional dissonance as a mediator. Data from 20 studies comprising different service jobs (N = 4,199) were integrated into a single data set and meta-analyzed. Confirmatory factor analyses and exploratory principal component analysis confirmed four CSS scales: disproportionate expectations, verbal aggression, ambiguous expectations, and disliked customers. These CSS scales were associated with burnout and job satisfaction. Most of the effects were partially mediated by emotional dissonance. Further analyses revealed that differences among jobs exist with regard to the factor solution. However, associations between CSS and outcomes are mainly invariant across service jobs.


2018 ◽  
Author(s):  
Peter De Wolf ◽  
Zhuangqun Huang ◽  
Bede Pittenger

Abstract Methods are available to measure conductivity, charge, surface potential, carrier density, piezoelectric and other electrical properties with nanometer-scale resolution. One of these methods, scanning microwave impedance microscopy (sMIM), has gained interest due to its capability to measure the full impedance (capacitive and resistive parts) with high sensitivity and high spatial resolution. This paper introduces a novel data-cube approach that combines sMIM imaging and sMIM point spectroscopy, producing an integrated and complete 3D data set. This approach replaces the subjective practice of guessing locations of interest (for single-point spectroscopy) with a big-data approach, resulting in higher-dimensional data that can be sliced along any axis or plane and is conducive to principal component analysis or other machine-learning approaches to data reduction. The data-cube approach is also applicable to other AFM-based electrical characterization modes.
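The data-cube idea is straightforward to sketch: an acquisition of this kind yields a spectrum at every pixel, i.e. an array of shape (ny, nx, n_bias), which can be sliced along any axis or flattened for PCA. The example below uses a synthetic cube with one dominant spectral signature standing in for real sMIM data.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)

# Synthetic data cube: a spectrum of 50 points at every pixel of a 32x32 scan.
ny, nx, n_bias = 32, 32, 50
base = np.sin(np.linspace(0, np.pi, n_bias))       # one spectral signature
weights = rng.random((ny, nx, 1))                  # its spatial map
cube = weights * base + 0.05 * rng.normal(size=(ny, nx, n_bias))

# Slicing along any axis or plane:
image_slice = cube[:, :, 10]   # an image at one bias point
spectrum = cube[5, 7, :]       # a point spectrum at one pixel

# PCA-based reduction: flatten pixels, keep a few components.
pca = PCA(n_components=3).fit(cube.reshape(-1, n_bias))
print(pca.explained_variance_ratio_)
```

Because the cube here is nearly rank-one plus noise, the first component carries almost all the variance; real data would distribute variance over a few physically meaningful components.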


2020 ◽  
Vol 16 (8) ◽  
pp. 1088-1105
Author(s):  
Nafiseh Vahedi ◽  
Majid Mohammadhosseini ◽  
Mehdi Nekoei

Background: The poly(ADP-ribose) polymerases (PARPs) are a nuclear enzyme superfamily present in eukaryotes. Methods: In the present report, some efficient linear and non-linear methods, including multiple linear regression (MLR), support vector machine (SVM) and artificial neural networks (ANN), were successfully used to develop and establish quantitative structure-activity relationship (QSAR) models capable of predicting pEC50 values of tetrahydropyridopyridazinone derivatives as effective PARP inhibitors. Principal component analysis (PCA) was used for a rational division of the whole data set and selection of the training and test sets. A genetic algorithm (GA) variable selection method was employed to select, from the large pool of calculated descriptors, the optimal subset of descriptors that make the most significant contributions to the overall inhibitory activity. Results: The accuracy and predictability of the proposed models were further confirmed using cross-validation, validation through an external test set and Y-randomization (chance correlation) approaches. Moreover, an exhaustive statistical comparison was performed on the outputs of the proposed models. The results revealed that non-linear modeling approaches, including SVM and ANN, could provide much better predictive capability. Conclusion: Among the constructed models, and in terms of root mean square error of prediction (RMSEP), cross-validation coefficients (Q2LOO and Q2LGO), as well as the R2 and F-statistic values for the training set, the predictive power of the GA-SVM approach was better. However, compared with MLR and SVM, the statistical parameters for the test set were better for the GA-ANN model.
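The descriptor-selection-plus-SVM workflow can be sketched with off-the-shelf pieces. This is a hedged stand-in, not the paper's pipeline: univariate selection (SelectKBest) substitutes for the genetic algorithm, synthetic descriptors and activities replace the tetrahydropyridopyridazinone data, and cross-validated R² plays the role of Q².

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score
from sklearn.feature_selection import SelectKBest, f_regression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(5)

# Synthetic QSAR table: 200 "molecules" x 50 descriptors, where only the
# first two descriptors actually drive the simulated pEC50.
n_mol, n_desc = 200, 50
X = rng.normal(size=(n_mol, n_desc))
pEC50 = X[:, 0] - 0.5 * X[:, 1] + 0.1 * rng.normal(size=n_mol)

# Descriptor selection (GA in the paper, univariate F-test here)
# feeding an SVM regressor, scored by 5-fold cross-validation.
model = make_pipeline(SelectKBest(f_regression, k=5), SVR(kernel="rbf", C=10.0))
q2 = cross_val_score(model, X, pEC50, cv=5, scoring="r2").mean()
print(q2)
```

Selection inside the pipeline is refit on each training fold, which is what makes the cross-validated score an honest analogue of the Q² statistics the abstract reports.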

