Bootstrapping Bloch bands

Author(s):  
Serguei Tchoumakov ◽  
Serge Florens

Abstract Bootstrap methods, initially developed for solving statistical and quantum field theories, have recently been shown to capture the discrete spectrum of quantum mechanical problems, such as the single-particle Schrödinger equation with an anharmonic potential. The core of bootstrap methods builds on exact recursion relations for arbitrary moments of some quantum operator, together with an adequate set of positivity criteria. We extend this methodology to models with continuous Bloch band spectra by considering a single quantum particle in a periodic cosine potential. We find that the band structure can be obtained accurately provided the bootstrap uses moments involving both position and momentum variables. We also introduce several new techniques that can apply generally to other bootstrap studies. First, we devise a trick to reduce by one unit the dimensionality of the search space for the variables parametrizing the bootstrap. Second, we employ statistical techniques to reconstruct the probability distribution, allowing us to compute observables that are analytic functions of the canonical variables. This method is used to extract the Bloch momentum, a quantity that is not readily available from the bootstrap recursion itself.
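The moment-recursion-plus-positivity mechanism described in the abstract can be made concrete on a toy model. The sketch below assumes the textbook harmonic oscillator H = p² + x² (ħ = 1), not the paper's cosine potential: the eigenstate conditions ⟨[H, O]⟩ = 0 and ⟨HO⟩ = E⟨O⟩ for O = xⁿ and O = xⁿp yield an exact recursion for the moments mₙ = ⟨xⁿ⟩, and a trial energy E survives only if the Hankel matrix of moments is positive semi-definite.

```python
# Toy bootstrap for the harmonic oscillator H = p^2 + x^2 (hbar = 1).
# For an energy eigenstate, the operator identities above give the recursion
#   m_{n+1} = (2 n E m_{n-1} + n(n-1)(n-2)/2 * m_{n-3}) / (2n + 2).
import numpy as np

def moments(E, kmax):
    """Generate m_0 .. m_kmax from the bootstrap recursion at trial energy E."""
    m = [0.0] * (kmax + 1)
    m[0] = 1.0                      # normalisation <1> = 1
    for n in range(1, kmax):
        acc = 2.0 * n * E * m[n - 1]
        if n >= 3:
            acc += 0.5 * n * (n - 1) * (n - 2) * m[n - 3]
        m[n + 1] = acc / (2.0 * n + 2.0)
    return m

def is_allowed(E, K=3, tol=1e-9):
    """Positivity criterion: the Hankel matrix M_ij = m_{i+j} (i, j = 0..K)
    must be positive semi-definite for E to survive the bootstrap."""
    m = moments(E, 2 * K)
    M = np.array([[m[i + j] for j in range(K + 1)] for i in range(K + 1)])
    return np.linalg.eigvalsh(M).min() >= -tol

# The exact spectrum of p^2 + x^2 is E_n = 2n + 1; E = 0.5 is excluded.
print(is_allowed(1.0), is_allowed(0.5))  # → True False
```

Scanning E over a grid and keeping the values where `is_allowed` holds recovers narrow windows around the exact eigenvalues; the paper's extension replaces these single-variable moments with mixed position–momentum moments to resolve continuous bands.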

2004 ◽  
Vol 19 (supp02) ◽  
pp. 34-49 ◽  
Author(s):  
H. BABUJIAN ◽  
M. KAROWSKI

The purpose of the "bootstrap program" for integrable quantum field theories in 1+1 dimensions is to construct a model explicitly in terms of its Wightman functions. In this article, the program is mainly illustrated in terms of the sine-Gordon and sinh-Gordon models and (as an exercise) the scaling Ising model. We review some previous results on sine-Gordon breather form factors and quantum operator equations. The problem of summing over intermediate states is attacked in the short-distance limit of the two-point Wightman function for the sinh-Gordon and the scaling Ising model.


2011 ◽  
Vol 19 (3) ◽  
pp. 405-428 ◽  
Author(s):  
Jingpeng Li ◽  
Andrew J. Parkes ◽  
Edmund K. Burke

Squeaky wheel optimization (SWO) is a relatively new metaheuristic that has been shown to be effective for many real-world problems. At each iteration, SWO performs a complete construction of a solution, starting from the empty assignment. Although the construction uses information from previous iterations, the complete rebuilding means that SWO is generally effective at diversification but can suffer from relatively weak intensification. Evolutionary SWO (ESWO) is a recent extension to SWO designed to improve intensification by keeping the good components of solutions and using SWO only to reconstruct the poorer components. In such algorithms, a standard challenge is to understand how the various parameters affect the search process. To support the future study of such issues, we propose a formal framework for the analysis of ESWO. The framework is based on Markov chains, and the main novelty arises because ESWO moves through the space of partial assignments. This makes it significantly different from the analyses used for local search methods (such as simulated annealing), which move only through complete assignments. Generally, the exact details of ESWO depend on various heuristics, so we focus our approach on a variant of ESWO that we call ESWO-II, which has probabilistic rather than heuristic selection and construction operators. For ESWO-II, we study a simple problem instance and explicitly compute the stationary probability distribution over the states of the search space. We find interesting properties of the distribution. In particular, we find that the probabilities of states generally, but not always, increase with their fitness. This non-monotonicity is quite different from the monotonicity expected in algorithms such as simulated annealing.
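The core computation in such an analysis, finding the stationary distribution π satisfying π = πP of a Markov chain over search states, can be sketched on a toy chain. The three-state chain below is purely illustrative (it is not the paper's ESWO-II chain or problem instance); states are ordered by fitness, and transitions are biased toward fitter states while allowing falls back, as construction operators do.

```python
# Toy illustration: stationary distribution pi = pi P of a small Markov
# chain, computed by power iteration on a row-stochastic matrix P.
def stationary(P, iters=10_000):
    """Power-iterate the distribution; converges for irreducible, aperiodic P."""
    n = len(P)
    pi = [1.0 / n] * n                      # start from the uniform distribution
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# Three states with fitness increasing from s0 to s2; the chain prefers
# moves toward fitter states but can also regress.
P = [[0.1, 0.6, 0.3],
     [0.2, 0.2, 0.6],
     [0.1, 0.3, 0.6]]
pi = stationary(P)
print([round(p, 3) for p in pi])  # → [0.131, 0.308, 0.561]
```

Here the stationary probabilities do increase with fitness; the paper's point is that for ESWO-II over partial assignments this monotonicity can fail, which is exactly the kind of property one checks once π is in hand.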


2020 ◽  
Author(s):  
Andrew Lensen ◽  
Bing Xue ◽  
Mengjie Zhang

© 2016 IEEE. Clustering, the process of grouping unlabelled data, is an important task in data analysis. It is regarded as one of the most difficult such tasks due to the large search space that must be explored. Feature selection is commonly used to reduce the size of a search space, and evolutionary computation (EC) is a family of techniques known to give good solutions to difficult problems such as clustering and feature selection. However, relatively little work has been done on simultaneous clustering and feature selection using EC methods. In this paper, we compare medoid and centroid representations that allow particle swarm optimisation (PSO) to perform simultaneous clustering and feature selection. We propose several new techniques that improve clustering performance and ensure valid solutions are generated. Experiments are conducted on a variety of real-world and synthetic datasets to analyse the effectiveness of the PSO representations across several different criteria. We show that a medoid representation can achieve superior results compared to the widely used centroid representation.
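A minimal sketch of the two representations being compared, with illustrative data (the PSO update rules and the feature selection machinery are omitted): a medoid particle stores indices into the dataset, so its centres are guaranteed to be valid data points, while a centroid particle stores free coordinates anywhere in feature space; both decode into a clustering by nearest-centre assignment.

```python
# Hedged sketch (data and names illustrative): decoding medoid vs centroid
# particles into a cluster assignment.
def assign(points, centres):
    """Assign each point to the index of its nearest centre."""
    def d2(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    return [min(range(len(centres)), key=lambda c: d2(p, centres[c]))
            for p in points]

points = [(0.0, 0.0), (0.1, 0.2), (5.0, 5.0), (5.2, 4.9)]

# Medoid representation: indices into the dataset; always a valid solution.
medoid_idx = [0, 2]
medoids = [points[i] for i in medoid_idx]
print(assign(points, medoids))        # → [0, 0, 1, 1]

# Centroid representation: free coordinates, which PSO can move anywhere
# (including regions with no data, one source of invalid solutions).
centroids = [(0.05, 0.1), (5.1, 4.95)]
print(assign(points, centroids))      # → [0, 0, 1, 1]
```

The comment on validity hints at why medoids can be attractive in a PSO setting: every decoded particle is a legal clustering by construction.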


Author(s):  
Guozhen Liu ◽  
Weidong Qiu ◽  
Yi Tu

Keccak-f is the permutation used in the NIST SHA-3 hash function standard. Inspired by the previous exhaustive differential trail search methods of Mella et al. at ToSC 2017, we introduce in this paper new algorithms to cover 3-round trail cores with propagation weight at least 53, up from the previous best weight of 45. To achieve this goal, the concept of the ideal improvement assumption is proposed to construct theoretical representatives of subspaces, so as to efficiently cover the search space of 3-round trail cores with at least one out-kernel α state. Of particular note, the exhaustiveness of the 3-round trail core search with at least one out-kernel α state is only experimentally verified. With the knowledge of all 3-round trail cores of weight up to 53, the lower bounds on 4/5/6-round trails are tightened to 56/58/108, from the previous 48/50/92, respectively.



Author(s):  
Jean Zinn-Justin

Chapter 16 deals with the important problem of quantization with symmetries, that is, how to implement symmetries of the classical action in the corresponding quantum theory. The proposed solutions are based on methods such as regularization by the addition of higher-order derivatives or regulator fields, or lattice regularization. Difficulties encountered in the case of chiral theories are emphasized. These may lead to obstructions to symmetric quantization, called anomalies. Examples can be found in the case of chiral gauge theories. Their origin can be traced to the problem of quantum operator ordering in products. A non-perturbative regularization, also useful for numerical simulations, is based on introducing a space lattice. Difficulties appear for lattice Dirac fermions, leading to the fermion doubling problem. Wilson's fermions provide a solution that is not chirally invariant. Chirally invariant solutions have been found, called overlap fermions or domain-wall fermions.
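The fermion doubling problem mentioned above can be seen in one dimension in a few lines. A minimal sketch (1D, lattice spacing a = 1, Wilson parameter r = 1, conventions assumed): the naive lattice dispersion |sin k| has a spurious zero at the Brillouin zone edge k = π (the "doubler"), which the Wilson term r(1 − cos k) lifts, at the price of breaking chiral symmetry.

```python
# 1D illustration of fermion doubling and Wilson's fix.
import math

def naive(k):
    """Naive lattice fermion dispersion |sin k|: zeros at k = 0 AND k = pi."""
    return abs(math.sin(k))

def wilson(k, r=1.0):
    """Wilson dispersion: the momentum-dependent mass r(1 - cos k) gives the
    doubler at k = pi a mass of order the cutoff, removing it."""
    return math.sqrt(math.sin(k) ** 2 + (r * (1.0 - math.cos(k))) ** 2)

# Both vanish at k = 0 (the physical fermion), but only the naive dispersion
# also vanishes at the zone edge k = pi.
print(round(naive(math.pi), 6), round(wilson(math.pi), 6))  # → 0.0 2.0
```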


1962 ◽  
Vol 11 (02) ◽  
pp. 137-143
Author(s):  
M. Schwarzschild

It is perhaps one of the most important characteristics of the past decade in astronomy that the evolution of some major classes of astronomical objects has become accessible to detailed research. The theory of the evolution of individual stars has developed into a substantial body of quantitative investigations. The evolution of galaxies, particularly of our own, has clearly become a subject for serious research. Even the history of the solar system, this close-by intriguing puzzle, may soon make the transition from being a subject of speculation to being a subject of detailed study in view of the fast flow of new data obtained with new techniques, including spacecraft.


Author(s):  
M.A. Parker ◽  
K.E. Johnson ◽  
C. Hwang ◽  
A. Bermea

We have reported the dependence of the magnetic and recording properties of CoPtCr recording media on the thickness of the Cr underlayer. It was inferred from XRD data that grain-to-grain epitaxy of the Cr with the CoPtCr was responsible for the interaction observed between these layers. However, no cross-sectional TEM (XTEM) work was performed to confirm this inference. In this paper, we report the application of new techniques for preparing XTEM specimens from actual magnetic recording disks, and for layer-by-layer micro-diffraction with an electron probe elongated parallel to the surface of the deposited structure, which elucidate the effect of the crystallographic structure of the Cr on that of the CoPtCr.

XTEM specimens were prepared from magnetic recording disks by modifying a technique used to prepare semiconductor specimens. After 3 mm disks were prepared per the standard XTEM procedure, they were lapped using a tripod polishing device. A grid with a single 1 mm × 2 mm hole was then glued with M-Bond 610 to the polished side of the disk.


Author(s):  
P. Pradère ◽  
J.F. Revol ◽  
R. St. John Manley

Although radiation damage is the limiting factor in HREM of polymers, new techniques based on low-dose imaging at low magnification have permitted lattice images to be obtained from very radiation-sensitive polymers such as polyethylene (PE). This paper describes the computer averaging of P4MP1 lattice images. P4MP1 is even more sensitive than PE (total end point dose of 27 C m⁻², as compared to 100 C m⁻² for PE at 120 kV). It does, however, have the advantage of forming flat crystals from dilute solution, and no change in d-spacings is observed during irradiation.

Crystals of P4MP1 were grown at 60°C in xylene (polymer concentration 0.05%). Electron microscopy was performed with a Philips EM 400 T microscope equipped with a Low Dose Unit and operated at 120 kV. Imaging conditions were the same as those already described elsewhere. Enlarged micrographs were digitized and processed with the Spider image processing system.
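A hedged sketch of the idea behind computer averaging of noisy lattice images (not the Spider pipeline itself, and with a synthetic lattice, noise level, and known reference in place of real micrographs): low-dose frames are aligned by cross-correlation and then averaged, so the periodic lattice signal adds coherently while the random noise is suppressed by roughly the square root of the number of frames.

```python
# Synthetic demonstration of cross-correlation alignment + averaging.
import numpy as np

rng = np.random.default_rng(0)
N = 64
x = np.arange(N)
# A simple 2D "lattice image": two perpendicular cosine fringes, period 8.
lattice = np.cos(2 * np.pi * x[:, None] / 8) + np.cos(2 * np.pi * x[None, :] / 8)

def align(ref, img):
    """Circularly shift img to maximise its cross-correlation with ref,
    computed via FFT (correlation theorem)."""
    corr = np.fft.ifft2(np.fft.fft2(ref) * np.conj(np.fft.fft2(img))).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    return np.roll(img, (int(dy), int(dx)), axis=(0, 1))

# Simulate low-dose frames: random translations plus heavy noise
# (noise std = 2 against a lattice std of about 1).
frames = [np.roll(lattice, tuple(rng.integers(0, N, 2)), axis=(0, 1))
          + rng.normal(0.0, 2.0, (N, N)) for _ in range(50)]
avg = np.mean([align(lattice, f) for f in frames], axis=0)

# The average should correlate strongly with the clean lattice.
print(np.corrcoef(avg.ravel(), lattice.ravel())[0, 1] > 0.9)
```

In a real averaging workflow the clean reference is of course not available; alignment is done against a running average or a filtered frame, but the correlation step is the same.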

