Statistical Concepts in Particle Physics

1976 ◽  
Vol 29 (6) ◽  
pp. 363
Author(s):  
D. C. Peaslee

A review is given of the level density formula in hadron physics, which has reached a completely developed state on the basis of the bootstrap equation. It is shown how the approximate formula


1974 ◽  
Vol 53 ◽  
pp. 77-91
Author(s):  
J. Craig Wheeler

The statistical bootstrap theory of hadrons predicts a particle level density which increases with mass like ϱ(m) = c m^a exp(bm). The motivation for this level density is explored and then it is used to derive an equation of state for zero-temperature ultra-dense (> 10^17 g cm^−3) matter. The nature, uses, and limitations of the equation of state are discussed.



2018 ◽  
Vol 194 ◽  
pp. 01001 ◽  
Author(s):  
Vladimir Zelevinsky ◽  
Sofia Karampagia

Knowledge of the level density as a function of excitation energy and nuclear spin is necessary for the description of nuclear reactions and in many applied areas. We discuss the level density problem as a part of the general understanding of mesoscopic systems with strong interactions. The underlying physics is that of quantum chaos and thermalization, which allows one to use statistical methods avoiding full diagonalization. The resulting level density is well described by the constant temperature model, in agreement with experimental data. We discuss the effective temperature parameter and show that it is not related to the pairing phase transition but is instead analogous to the limiting temperature in particle physics. Other aspects of the underlying physics include the collective enhancement of the level density, random coupling of individual spins, and the role of incoherent collision-like interactions.
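For reference, the constant temperature model mentioned above is conventionally parameterized as follows (this is the standard form of the model, not a formula quoted from the article itself; T and E_0 are the fit parameters):

```latex
\rho(E) = \frac{1}{T}\,\exp\!\left(\frac{E - E_0}{T}\right)
```

Here the "temperature" T sets the (constant) logarithmic slope of the level density with excitation energy E, which is what makes it analogous to a limiting temperature.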



Author(s):  
E.D. Wolf

Most microelectronics devices and circuits operate faster, consume less power, execute more functions, and cost less per circuit function when the feature sizes internal to the devices and circuits are made smaller. This is part of the stimulus for the Very High-Speed Integrated Circuits (VHSIC) program. There is also a need for smaller, more sensitive sensors in a wide range of disciplines that includes electrochemistry, neurophysiology, and ultra-high pressure solid state research. There is often fundamental new science (and sometimes new technology) to be revealed (and used) when a basic parameter such as size is extended to new dimensions, as is evident at the two extremes of smallness and largeness: high energy particle physics and cosmology, respectively. However, there is also a very important intermediate domain of size, spanning from the diameter of a small cluster of atoms up to near one micrometer, which may have just as profound an effect on society as "big" physics.



Author(s):  
Sterling P. Newberry

At the 1958 meeting of our society, then known as EMSA, the author introduced the concept of microspace and suggested its use to provide adequate information storage space, with electron microscope techniques providing storage and retrieval access. At this current meeting of MSA, he wishes to suggest an additional use of the power of the electron microscope. The author has been contemplating this new use for some time and would have suggested it in the EMSA fiftieth year commemorative volume, but for page limitations. There is compelling reason to put forth this suggestion today because problems have arisen in the "Standard Model" of particle physics, and funds are being greatly reduced just as we need higher energy machines to resolve these problems. Therefore, any techniques which complement or augment what we can accomplish during this austerity period with the machines at hand are worth exploring.



2013 ◽  
Vol 221 (3) ◽  
pp. 190-200 ◽  
Author(s):  
Jörg-Tobias Kuhn ◽  
Thomas Kiefer

Several techniques have been developed in recent years to generate optimal large-scale assessments (LSAs) of student achievement. These techniques often represent a blend of procedures from such diverse fields as experimental design, combinatorial optimization, particle physics, or neural networks. However, despite the theoretical advances in the field, there still exists a surprising scarcity of well-documented test designs in which all factors that have guided design decisions are explicitly and clearly communicated. This paper therefore has two goals. First, a brief summary of relevant key terms, as well as experimental designs and automated test assembly routines in LSA, is given. Second, conceptual and methodological steps in designing the assessment of the Austrian educational standards in mathematics are described in detail. The test design was generated using a two-step procedure, starting at the item block level and continuing at the item level. Initially, a partially balanced incomplete item block design was generated using simulated annealing, whereas in a second step, items were assigned to the item blocks using mixed-integer linear optimization in combination with a shadow-test approach.
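To illustrate the first step of the two-step procedure described above, the following is a minimal sketch of simulated annealing applied to an incomplete block design: item blocks are assigned to booklets, and the objective (a simplifying assumption here, not the authors' actual criterion) is to balance how often each pair of blocks co-occurs in a booklet. All names and parameters are illustrative.

```python
import itertools
import math
import random

def pair_cost(booklets, n_blocks):
    """Cost = variance of pairwise block co-occurrence counts.
    A perfectly balanced design makes every pair appear equally often."""
    counts = {p: 0 for p in itertools.combinations(range(n_blocks), 2)}
    for booklet in booklets:
        for p in itertools.combinations(sorted(booklet), 2):
            counts[p] += 1
    vals = list(counts.values())
    mean = sum(vals) / len(vals)
    return sum((v - mean) ** 2 for v in vals)

def anneal_design(n_blocks=7, n_booklets=7, booklet_size=3,
                  steps=20000, t0=2.0, cooling=0.9995, seed=0):
    """Simulated annealing: propose swapping one block inside a random
    booklet; accept worse designs with probability exp(-delta/T)."""
    rng = random.Random(seed)
    booklets = [rng.sample(range(n_blocks), booklet_size)
                for _ in range(n_booklets)]
    cost = pair_cost(booklets, n_blocks)
    t = t0
    for _ in range(steps):
        i = rng.randrange(n_booklets)
        j = rng.randrange(booklet_size)
        new_block = rng.randrange(n_blocks)
        if new_block in booklets[i]:
            continue  # keep booklets duplicate-free
        old_block = booklets[i][j]
        booklets[i][j] = new_block
        new_cost = pair_cost(booklets, n_blocks)
        if new_cost <= cost or rng.random() < math.exp((cost - new_cost) / t):
            cost = new_cost          # accept the move
        else:
            booklets[i][j] = old_block  # reject and restore
        t *= cooling                 # geometric cooling schedule
    return booklets, cost

booklets, final_cost = anneal_design()
print(f"final cost: {final_cost}")
for b in booklets:
    print(sorted(b))
```

The second step of the reported procedure, assigning items to blocks via mixed-integer linear optimization with a shadow-test approach, would typically use a dedicated solver and is not sketched here.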



Nature China ◽  
2007 ◽  
Author(s):  
Tim Reid




Nature ◽  
2004 ◽  
Author(s):  
Michael Hopkin





