Effective Theories from Nuclear to Particle Physics

Author(s):  
Antal Jakovác ◽  
András Patkós
Universe ◽  
2021 ◽  
Vol 7 (8) ◽  
pp. 273
Author(s):  
Mariana Graña ◽  
Alvaro Herráez

The swampland is the set of seemingly consistent low-energy effective field theories that cannot be consistently coupled to quantum gravity. In this review we cover some of the conjectural properties that effective theories should possess in order not to fall into the swampland, and we give an overview of their main applications to particle physics. The latter include predictions on neutrino masses, bounds on the cosmological constant, the electroweak and QCD scales, the photon mass, the Higgs potential, and some insights about supersymmetry.


Author(s):  
Subhaditya Bhattacharya ◽  
José Wudka

The Standard Model (SM) of particle physics has achieved enormous success in describing the interactions among the known fundamental constituents of nature. Yet it fails to describe phenomena for which there is very strong experimental evidence, such as the existence of dark matter, and which point to the existence of new physics not included in that model; beyond its existence, however, experimental data have not provided clear indications as to the nature of that new physics. The effective field theory (EFT) approach, the subject of this review, is designed for this type of situation: it provides a consistent and unbiased framework within which to study new-physics effects whose existence is expected but whose detailed nature is known only imperfectly. We provide a description of this approach together with a discussion of some of its basic theoretical aspects. We then consider applications to high-energy phenomenology and conclude with a discussion of the application of EFT techniques to the study of dark matter physics and its possible interactions with the SM. In several of the applications we also briefly discuss specific ultraviolet-complete models that may realize the effects described by the EFT.


2006 ◽  
Vol 21 (13) ◽  
pp. 999-1016 ◽  
Author(s):  
RALF HOFMANN

We sketch the development of effective theories for SU(2) and SU(3) Yang–Mills thermodynamics. The most important results are quoted and some implications for particle physics and cosmology are discussed.


Author(s):  
E.D. Wolf

Most microelectronics devices and circuits operate faster, consume less power, execute more functions and cost less per circuit function when the feature-sizes internal to the devices and circuits are made smaller. This is part of the stimulus for the Very High-Speed Integrated Circuits (VHSIC) program. There is also a need for smaller, more sensitive sensors in a wide range of disciplines that includes electrochemistry, neurophysiology and ultra-high pressure solid state research. There is often fundamental new science (and sometimes new technology) to be revealed (and used) when a basic parameter such as size is extended to new dimensions, as is evident at the two extremes of smallness and largeness, high energy particle physics and cosmology, respectively. However, there is also a very important intermediate domain of size that spans from the diameter of a small cluster of atoms up to near one micrometer which may also have just as profound effects on society as “big” physics.


Author(s):  
Sterling P. Newberry

At the 1958 meeting of our society, then known as EMSA, the author introduced the concept of microspace and suggested its use to provide adequate information-storage space, with electron microscope techniques providing storage and retrieval access. At this current meeting of MSA, he wishes to suggest an additional use of the power of the electron microscope. The author has been contemplating this new use for some time and would have suggested it in the EMSA fiftieth-year commemorative volume, but for page limitations. There is compelling reason to put forth this suggestion today, because problems have arisen in the “Standard Model” of particle physics and funds are being greatly reduced just as we need higher-energy machines to resolve these problems. Therefore, any techniques which complement or augment what we can accomplish during this austerity period with the machines at hand are worth exploring.


2013 ◽  
Vol 221 (3) ◽  
pp. 190-200 ◽  
Author(s):  
Jörg-Tobias Kuhn ◽  
Thomas Kiefer

Several techniques have been developed in recent years to generate optimal large-scale assessments (LSAs) of student achievement. These techniques often represent a blend of procedures from such diverse fields as experimental design, combinatorial optimization, particle physics, or neural networks. However, despite the theoretical advances in the field, there still exists a surprising scarcity of well-documented test designs in which all factors that have guided design decisions are explicitly and clearly communicated. This paper therefore has two goals. First, a brief summary of relevant key terms, as well as experimental designs and automated test assembly routines in LSA, is given. Second, conceptual and methodological steps in designing the assessment of the Austrian educational standards in mathematics are described in detail. The test design was generated using a two-step procedure, starting at the item block level and continuing at the item level. Initially, a partially balanced incomplete item block design was generated using simulated annealing, whereas in a second step, items were assigned to the item blocks using mixed-integer linear optimization in combination with a shadow-test approach.
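The first step of the two-step procedure described above, generating an item-block design by simulated annealing, can be illustrated with a minimal sketch. This is not the authors' implementation: the toy cost function (variance of pairwise block co-occurrence across booklets, which is zero for a balanced incomplete block design) and all parameters are illustrative assumptions.

```python
import itertools
import math
import random

def pair_cost(booklets, n_blocks):
    # Count how often each pair of item blocks co-occurs in a booklet;
    # a balanced design makes these counts equal, so the variance is 0.
    counts = {p: 0 for p in itertools.combinations(range(n_blocks), 2)}
    for booklet in booklets:
        for p in itertools.combinations(sorted(booklet), 2):
            counts[p] += 1
    vals = list(counts.values())
    mean = sum(vals) / len(vals)
    return sum((v - mean) ** 2 for v in vals)

def anneal(n_blocks=7, n_booklets=7, k=3, steps=20000, t0=2.0, seed=1):
    """Assign k distinct item blocks to each booklet, annealing toward balance."""
    rng = random.Random(seed)
    booklets = [rng.sample(range(n_blocks), k) for _ in range(n_booklets)]
    cost = pair_cost(booklets, n_blocks)
    for step in range(steps):
        t = t0 * (1 - step / steps) + 1e-9  # linear cooling schedule
        i = rng.randrange(n_booklets)
        old = booklets[i][:]
        # Propose a move: swap one block in booklet i for one not yet in it.
        candidate = rng.choice([b for b in range(n_blocks) if b not in old])
        booklets[i][rng.randrange(k)] = candidate
        new_cost = pair_cost(booklets, n_blocks)
        # Metropolis acceptance: always take improvements, sometimes worsenings.
        if new_cost <= cost or rng.random() < math.exp((cost - new_cost) / t):
            cost = new_cost
        else:
            booklets[i] = old  # reject the move
    return booklets, cost

booklets, cost = anneal()
```

The second step in the abstract, assigning items to blocks via mixed-integer linear optimization with a shadow-test approach, would typically be handed to a dedicated solver and is not sketched here.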

