Denoising-based Image Compression for Connectomics

2021 ◽  
Author(s):  
David Minnen ◽  
Michał Januszewski ◽  
Alexander Shapson-Coe ◽  
Richard L. Schalek ◽  
Johannes Ballé ◽  
...  

Connectomic reconstruction of neural circuits relies on nanometer resolution microscopy which produces on the order of a petabyte of imagery for each cubic millimeter of brain tissue. The cost of storing such data is a significant barrier to broadening the use of connectomic approaches and scaling to even larger volumes. We present an image compression approach that uses machine learning-based denoising and standard image codecs to compress raw electron microscopy imagery of neuropil up to 17-fold with negligible loss of reconstruction accuracy.
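The abstract does not give implementation details; as a minimal sketch of the general denoise-then-compress idea, assuming a pretrained denoising model `denoiser` (placeholder) and using JPEG via Pillow as a stand-in for whichever standard codec is actually used, the pipeline could look like:

```python
import io
import numpy as np
from PIL import Image

def compress_denoised(raw_em_tile: np.ndarray, denoiser, quality: int = 40) -> bytes:
    """Denoise a raw EM tile, then compress it with a standard lossy codec.

    `denoiser` is a placeholder for a pretrained ML denoising model that maps
    a noisy uint8 image to a cleaner one; JPEG here stands in for the codec.
    """
    clean = denoiser(raw_em_tile)                      # ML-based denoising step
    clean = np.clip(clean, 0, 255).astype(np.uint8)
    buf = io.BytesIO()
    Image.fromarray(clean).save(buf, format="JPEG", quality=quality)
    return buf.getvalue()                              # compressed bytes to store

def decompress(payload: bytes) -> np.ndarray:
    """Decode the stored bytes back to an image for downstream reconstruction."""
    return np.asarray(Image.open(io.BytesIO(payload)))
```

The reported compression factor corresponds to the ratio of raw to compressed bytes at a quality setting where reconstruction accuracy is preserved.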

Polymers ◽  
2021 ◽  
Vol 13 (3) ◽  
pp. 353
Author(s):  
Kun-Cheng Ke ◽  
Ming-Shyan Huang

Conventional methods for assessing the quality of components mass-produced using injection molding are expensive and time-consuming or involve imprecise statistical process control parameters. A suitable alternative would be to employ machine learning to classify the quality of parts by using quality indices and quality grading. In this study, we used a multilayer perceptron (MLP) neural network along with a few quality indices to accurately classify the geometric quality of finished products as “qualified” or “unqualified”. These quality indices, which exhibited a strong correlation with part quality, were extracted from pressure curves and input into the MLP model for learning and prediction. By filtering outliers from the input data and converting the measured quality into quality grades used as output data, we increased the prediction accuracy of the MLP model and classified the quality of finished parts into various quality levels. The MLP model may misjudge datapoints in the “to-be-confirmed” area, which lies between the “qualified” and “unqualified” areas. We therefore classified the “to-be-confirmed” area separately, and only the quality of products in this area was evaluated further, which reduced the cost of quality control considerably. An integrated circuit tray was manufactured to experimentally demonstrate the feasibility of the proposed method.
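The paper's exact network configuration and quality indices are not given here; a minimal sketch of the described workflow, assuming hypothetical pressure-curve indices as inputs and a binary qualified/unqualified label, could be:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# X: quality indices extracted from injection-pressure curves (hypothetical
# columns, e.g. peak pressure, pressure integral, holding-phase gradient).
# y: 1 = qualified geometry, 0 = unqualified (from offline measurement).
X = np.load("pressure_curve_indices.npy")   # placeholder data files
y = np.load("part_quality_labels.npy")

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0),
)
model.fit(X_train, y_train)
print("hold-out accuracy:", model.score(X_test, y_test))
```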


2021 ◽  
Vol 3 (1) ◽  
Author(s):  
Zhikuan Zhao ◽  
Jack K. Fitzsimons ◽  
Patrick Rebentrost ◽  
Vedran Dunjko ◽  
Joseph F. Fitzsimons

Machine learning has recently emerged as a fruitful area for finding potential quantum computational advantage. Many of the quantum-enhanced machine learning algorithms critically hinge upon the ability to efficiently produce states proportional to high-dimensional data points stored in a quantum accessible memory. Even given query access to exponentially many entries stored in a database, the construction of which is considered a one-off overhead, it has been argued that the cost of preparing such amplitude-encoded states may offset any exponential quantum advantage. Here we prove using smoothed analysis that if the data analysis algorithm is robust against small entry-wise input perturbations, state preparation can always be achieved with a constant number of queries. This criterion is typically satisfied in realistic machine learning applications, where input data is subject to moderate noise. Our results are equally applicable to the recent seminal progress in quantum-inspired algorithms, where specially constructed databases suffice for polylogarithmic classical algorithms in low-rank cases. The consequence of our finding is that, for the purpose of practical machine learning, polylogarithmic processing time is possible under a general and flexible input model, with quantum algorithms or with quantum-inspired classical algorithms in the low-rank cases.
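For reference, the amplitude-encoded state the abstract refers to is, by the standard convention (stated here for context, not as a result of the paper), the normalized superposition built from a data vector x:

```latex
\[
  |x\rangle \;=\; \frac{1}{\lVert x \rVert_2} \sum_{i=1}^{N} x_i \, |i\rangle .
\]
% The paper's claim: if the downstream algorithm tolerates a small entry-wise
% perturbation of x, then a state close enough to |x> can be prepared with
% only a constant number of queries to the memory holding the entries x_i.
```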


2020 ◽  
Vol 7 (1) ◽  
Author(s):  
Miles L. Timpe ◽  
Maria Han Veiga ◽  
Mischa Knabenhans ◽  
Joachim Stadel ◽  
Stefano Marelli

In the late stages of terrestrial planet formation, pairwise collisions between planetary-sized bodies act as the fundamental agent of planet growth. These collisions can lead to either growth or disruption of the bodies involved and are largely responsible for shaping the final characteristics of the planets. Despite their critical role in planet formation, an accurate treatment of collisions has yet to be realized. While semi-analytic methods have been proposed, they remain limited to a narrow set of post-impact properties and have only achieved relatively low accuracies. However, the rise of machine learning and access to increased computing power have enabled novel data-driven approaches. In this work, we show that data-driven emulation techniques are capable of classifying and predicting the outcome of collisions with high accuracy and are generalizable to any quantifiable post-impact quantity. In particular, we focus on the dataset requirements, training pipeline, and classification and regression performance for four distinct data-driven techniques from machine learning (ensemble methods and neural networks) and uncertainty quantification (Gaussian processes and polynomial chaos expansion). We compare these methods to existing analytic and semi-analytic methods. Such data-driven emulators are poised to replace the methods currently used in N-body simulations, while avoiding the cost of direct simulation. This work is based on a new set of 14,856 SPH simulations of pairwise collisions between rotating, differentiated bodies at all possible mutual orientations.
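The paper's dataset and trained emulators are not reproduced here; as one illustration of the general emulation idea, assuming hypothetical pre-impact features (mass ratio, impact velocity, impact angle, spins) and a post-impact target such as the largest-remnant mass fraction, a gradient-boosted ensemble surrogate could be fitted as:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

# Hypothetical emulator inputs: pre-impact parameters of each SPH run
# (e.g. mass ratio, impact velocity, impact angle, spin orientations).
X = np.load("preimpact_parameters.npy")          # placeholder files
y = np.load("largest_remnant_mass_fraction.npy")

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

emulator = GradientBoostingRegressor(n_estimators=500, learning_rate=0.05, max_depth=4)
emulator.fit(X_train, y_train)

# Once trained, the emulator can stand in for a full SPH run inside an N-body
# code: a single prediction costs microseconds instead of CPU-hours.
print("R^2 on held-out collisions:", emulator.score(X_test, y_test))
```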


2020 ◽  
Vol 14 ◽  
Author(s):  
Hung Tri Tran ◽  
Esther H. R. Tsai ◽  
Amanda J. Lewis ◽  
Tim Moors ◽  
J. G. J. M. Bol ◽  
...  

Gaining insight into pathologically relevant processes in continuous volumes of unstained brain tissue is important for a better understanding of neurological diseases. Many pathological processes in neurodegenerative disorders affect myelinated axons, which are a critical part of the neuronal circuitry. Cryo ptychographic X-ray computed tomography in the multi-keV energy range is an emerging technology providing phase contrast at high sensitivity, allowing label-free and non-destructive three-dimensional imaging of large continuous volumes of tissue, currently spanning up to 400,000 μm³. This aspect makes the technique especially attractive for imaging complex biological material, particularly neuronal tissues, in combination with downstream optical or electron microscopy techniques. A further advantage is that dehydration, additional contrast staining, and destructive sectioning/milling are not required for imaging. We have developed a pipeline for cryo ptychographic X-ray tomography of relatively large, hydrated and unstained biological tissue volumes beyond what is typical for X-ray imaging, using human brain tissue and combining the technique with complementary methods. We present four imaged volumes of a human brain with Parkinson’s disease and five volumes from a non-diseased control human brain using cryo ptychographic X-ray tomography. In both cases, we distinguish neuromelanin-containing neurons, lipid and melanic pigment, blood vessels and red blood cells, and nuclei of other brain cells. In the diseased sample, we observed several swellings containing dense granular material resembling clustered vesicles between the myelin sheaths, arising from the cytoplasm of the parent oligodendrocyte rather than the axoplasm. We further investigated the pathological relevance of such swollen axons in adjacent tissue sections by immunofluorescence microscopy for phosphorylated alpha-synuclein combined with multispectral imaging. Since cryo ptychographic X-ray tomography is non-destructive, the large dataset volumes were used to guide further investigation of such swollen axons by correlative electron microscopy and immunogold labeling after X-ray imaging, a possibility demonstrated here for the first time. Interestingly, we find that protein antigenicity and the ultrastructure of the tissue are preserved after the X-ray measurement. As many pathological processes in neurodegeneration affect myelinated axons, our work sets an unprecedented foundation for studies addressing axonal integrity and disease-related changes in unstained brain tissues.


2021 ◽  
Vol 6 (11) ◽  
pp. 157
Author(s):  
Gonçalo Pereira ◽  
Manuel Parente ◽  
João Moutinho ◽  
Manuel Sampaio

Decision support and optimization tools used in construction often require an accurate estimation of cost variables to maximize their benefit. Heavy machinery is traditionally one of the greatest costs to consider, mainly due to fuel consumption. These typically diesel-powered machines exhibit great variability in fuel consumption depending on the scenario of utilization. This paper describes the creation of a framework for estimating the fuel consumption of construction trucks as a function of the carried load, the slope, the distance, and the pavement type. A more accurate estimation will increase the benefit of these optimization tools. The fuel consumption estimation model was developed using Machine Learning (ML) algorithms supported by data gathered through several sensors, in a specially designed datalogger with wireless communication and opportunistic synchronization, in a real-context experiment. The results demonstrated the viability of the method, providing important insight into the advantages of combining sensorization and machine learning models in a real-world construction setting. Ultimately, this study represents a significant step towards IoT implementation from a Construction 4.0 viewpoint, especially considering its potential for real-time and digital twin applications.
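The specific ML algorithm is not stated in the abstract; as one plausible illustration, a tree-based regressor over the four reported predictors (all file and column names below are hypothetical) could be fitted as:

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# Hypothetical datalogger export: one row per truck cycle.
df = pd.read_csv("truck_cycles.csv")           # placeholder file/column names
X = pd.get_dummies(df[["load_t", "slope_pct", "distance_km", "pavement_type"]],
                   columns=["pavement_type"])  # one-hot encode the categorical predictor
y = df["fuel_l"]                               # measured fuel consumption per cycle

model = RandomForestRegressor(n_estimators=300, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="neg_mean_absolute_error")
print("cross-validated MAE (litres):", -scores.mean())
```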


2021 ◽  
Vol 3 (2) ◽  
pp. 43-50
Author(s):  
Safa SEN ◽  
Sara Almeida de Figueiredo

Predicting bank failures has been an essential subject in the literature due to the significance of banks for the economic prosperity of a country. Acting as an intermediary player of the economy, banks channel funds between creditors and debtors. In that respect, banks are considered the backbone of economies; hence, it is important to create early warning systems that distinguish insolvent banks from solvent ones, so that insolvent banks can apply for assistance and avoid bankruptcy in financially turbulent times. In this paper, we focus on two different machine learning approaches for predicting bank failures: boosting and cost-sensitive methods. Boosting methods are widely used in the literature due to their better prediction capability. However, Cost-Sensitive Forest is relatively new to the literature and was originally invented to solve imbalance problems in software defect detection. Our results show that, compared to the boosting methods, Cost-Sensitive Forest classifies failed banks in particular more accurately. We therefore suggest using Cost-Sensitive Forest when predicting bank failures with imbalanced datasets.
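Cost-Sensitive Forest itself is not part of standard libraries; a simple class-weighted random forest, shown here purely as an illustration of the cost-sensitive idea rather than the authors' exact method, could be applied to an imbalanced failed/solvent dataset (hypothetical file and column names) as follows:

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Hypothetical bank-level dataset: financial ratios plus a rare "failed" label.
df = pd.read_csv("bank_financials.csv")        # placeholder file/column names
X, y = df.drop(columns=["failed"]), df["failed"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

# Misclassifying a failed bank is costlier than flagging a solvent one,
# so the minority (failed) class is given a larger weight.
clf = RandomForestClassifier(n_estimators=400, class_weight={0: 1, 1: 10}, random_state=0)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```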


Polymers ◽  
2021 ◽  
Vol 13 (18) ◽  
pp. 3100
Author(s):  
Anusha Mairpady ◽  
Abdel-Hamid I. Mourad ◽  
Mohammad Sayem Mozumder

The selection of nanofillers and compatibilizing agents, and their size and concentration, are always considered crucial in the design of durable nanobiocomposites with maximized mechanical properties (i.e., fracture strength (FS), yield strength (YS), Young’s modulus (YM), etc.). Therefore, the statistical optimization of the key design factors has become extremely important to minimize the experimental runs and the cost involved. In this study, both statistical techniques (i.e., analysis of variance (ANOVA) and response surface methodology (RSM)) and artificial intelligence-based machine learning techniques (i.e., artificial neural network (ANN) and genetic algorithm (GA)) were used to optimize the concentrations of nanofillers and compatibilizing agents of the injection-molded HDPE nanocomposites. Initially, through ANOVA, the concentrations of TiO2 and cellulose nanocrystals (CNCs) and their combinations were found to be the major factors in improving the durability of the HDPE nanocomposites. Further, the data were modeled and predicted using RSM, ANN, and their combinations with a genetic algorithm (i.e., RSM-GA and ANN-GA). Later, to minimize the risk of converging to a local optimum, an ANN-GA hybrid technique was implemented in this study to optimize multiple responses by developing the nonlinear relationship between the factors (i.e., the concentrations of TiO2 and CNCs) and the responses (i.e., FS, YS, and YM), with minimum error and with regression values above 95%.
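Neither the trained network nor the experimental data are given here; a minimal sketch of the ANN-GA idea, in which a small neural-network surrogate maps the two factor concentrations to the mechanical responses and a simplified (mutation-only) genetic algorithm then searches the factor space, could look like the following. All file names, concentration ranges, and the equal-weight fitness are illustrative assumptions, not the authors' settings.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical design-of-experiments data: factor concentrations (TiO2 wt%, CNC wt%)
# and measured responses (FS, YS, YM).
X = np.load("factor_concentrations.npy")    # shape (n_runs, 2), placeholder file
Y = np.load("mechanical_responses.npy")     # shape (n_runs, 3), placeholder file

# ANN surrogate relating factors to responses.
ann = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
ann.fit(X, Y)

def fitness(pop):
    """Scalarize the three predicted responses into one score to maximize."""
    pred = ann.predict(pop)                 # columns: FS, YS, YM
    return pred.sum(axis=1)                 # equal weights, purely illustrative

# Simplified genetic search over assumed admissible concentration ranges.
bounds = np.array([[0.0, 5.0], [0.0, 5.0]])  # wt% ranges, illustrative
pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(50, 2))
for generation in range(100):
    scores = fitness(pop)
    parents = pop[np.argsort(scores)[-25:]]              # selection: keep best half
    children = parents[rng.integers(0, 25, 25)].copy()   # reproduction (no crossover, for brevity)
    children += rng.normal(0.0, 0.1, children.shape)     # mutation
    pop = np.clip(np.vstack([parents, children]), bounds[:, 0], bounds[:, 1])

best = pop[np.argmax(fitness(pop))]
print("GA-optimized concentrations (TiO2 wt%, CNC wt%):", best)
```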

