How to Use Boltzmann Machines and Neural Networks for Covering Array Generation

Author(s):  
Ludwig Kampel ◽  
Michael Wagner ◽  
Ilias S. Kotsireas ◽  
Dimitris E. Simos

Author(s):  
Chuan Luo ◽  
Jinkun Lin ◽  
Shaowei Cai ◽  
Xin Chen ◽  
Bing He ◽  
...  

Author(s):  
Yan Wang ◽  
Huayao Wu ◽  
Xintao Niu ◽  
Changhai Nie ◽  
Jiaxi Xu

2020 ◽  
Vol 36 (12) ◽  
pp. 3781-3787
Author(s):  
Pavel Sulimov ◽  
Anastasia Voronkova ◽  
Attila Kertész-Farkas

Abstract

Motivation: The ability of score functions to discriminate correct from incorrect peptide-spectrum matches in database-search-based spectrum identification is hindered by the many superfluous peaks belonging to unexpected fragmentation ions and by the missing peaks of anticipated fragmentation ions.

Results: Here, we present a new method, called BoltzMatch, which learns score functions using a particular type of stochastic neural network, the restricted Boltzmann machine, in order to enhance their discrimination ability. BoltzMatch learns chemically explainable patterns among peak pairs in the spectrum data, and during its internal scoring it can augment peaks depending on their semantic context or even reconstruct missing peaks of expected ions. As a result, BoltzMatch achieved 50% and 33% more annotations on high- and low-resolution MS2 data than XCorr at a 0.1% false discovery rate (FDR) in our benchmark; conversely, XCorr yielded the same number of spectrum annotations as BoltzMatch, albeit with 4–6 times more errors. In addition, BoltzMatch alone yields 14% more annotations than Prosit (which runs with Percolator), and BoltzMatch with Percolator yields 32% more annotations than Prosit at the 0.1% FDR level in our benchmark.

Availability and implementation: BoltzMatch is freely available at: https://github.com/kfattila/BoltzMatch.

Contact: [email protected]

Supporting information: Supplementary data are available at Bioinformatics online.
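The abstract's core ingredient, a restricted Boltzmann machine trained and then used for scoring, can be illustrated generically. The sketch below is not the authors' BoltzMatch implementation; it is a minimal RBM trained with one-step contrastive divergence (CD-1) on hypothetical binary vectors, using the negative free energy as an unnormalized score, the same role a learned score function plays in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical binarized input vectors (stand-ins for discretized spectra).
data = rng.integers(0, 2, size=(32, 6)).astype(float)

n_visible, n_hidden = 6, 4
W = 0.01 * rng.standard_normal((n_visible, n_hidden))
b = np.zeros(n_visible)   # visible biases
c = np.zeros(n_hidden)    # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

lr = 0.1
for epoch in range(200):
    # Positive phase: hidden-unit probabilities given the data.
    ph = sigmoid(data @ W + c)
    h = (rng.random(ph.shape) < ph).astype(float)
    # Negative phase (CD-1): one Gibbs step back to visible, then hidden.
    pv = sigmoid(h @ W.T + b)
    v = (rng.random(pv.shape) < pv).astype(float)
    ph2 = sigmoid(v @ W + c)
    # Contrastive-divergence parameter updates.
    W += lr * (data.T @ ph - v.T @ ph2) / len(data)
    b += lr * (data - v).mean(axis=0)
    c += lr * (ph - ph2).mean(axis=0)

def score(v):
    """Negative free energy: higher means the RBM finds v more plausible."""
    return v @ b + np.logaddexp(0, v @ W + c).sum()

s = score(data[0])
```

In a matching setting, one would compare such scores across candidate peptide-spectrum pairs; the learned weights W capture co-occurrence patterns among input dimensions, loosely analogous to the peak-pair patterns described in the abstract.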


Author(s):  
Florian Marquardt

These brief lecture notes cover the basics of neural networks and deep learning as well as their applications in the quantum domain, for physicists without prior knowledge. In the first part, we describe training using backpropagation, image classification, convolutional networks and autoencoders. The second part is about advanced techniques like reinforcement learning (for discovering control strategies), recurrent neural networks (for analyzing time traces), and Boltzmann machines (for learning probability distributions). In the third lecture, we discuss first recent applications to quantum physics, with an emphasis on quantum information processing machines. Finally, the fourth lecture is devoted to the promise of using quantum effects to accelerate machine learning.
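The first topic these notes name, training by backpropagation, can be sketched in a few lines. The example below is a generic illustration, not code from the lecture notes: a tiny two-layer sigmoid network fitted to the XOR problem by explicit forward and backward passes, with the chain-rule gradients written out by hand.

```python
import numpy as np

rng = np.random.default_rng(1)

# XOR: the classic toy problem that a single linear layer cannot solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.standard_normal((2, 8)); b1 = np.zeros(8)
W2 = rng.standard_normal((8, 1)); b2 = np.zeros(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

lr = 1.0
losses = []
for step in range(3000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(((out - y) ** 2).mean()))
    # Backward pass: chain rule through the squared error and sigmoids.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient-descent updates.
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)
```

The same forward/backward structure, applied layer by layer, is what automatic differentiation frameworks perform at scale in the convolutional networks and autoencoders mentioned above.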

