Emergent gauge symmetries: making symmetry as well as breaking it

Author(s):  
Steven D. Bass

Gauge symmetries play an essential role in determining the interactions of particle physics. Where do they come from? Might the gauge symmetries of the Standard Model unify in the ultraviolet, or might they be emergent in the infrared, below some large scale close to the Planck scale? Emergent gauge symmetries are important in quantum many-body systems in quantum phases associated with long-range entanglement and topological order; e.g. they arise in high-temperature superconductors, with string-net condensation and in the A-phase of superfluid ³He. String-nets and superfluid ³He exhibit emergent properties similar to the building blocks of particle physics. Emergent gauge symmetries also play an important role in simulations of quantum field theories. This article discusses recent thinking on possible emergent gauge symmetries in particle physics, commenting also on Higgs phenomena and the vacuum energy or cosmological constant puzzle in emergent gauge systems. This article is part of the theme issue ‘Quantum technologies in particle physics’.

2019
Author(s):
Johannes Balkenhol
Juan Prada
Hannelore Ehrenreich
Johannes Grohmann
Jóakim v. Kistowski
...

Short abstract: The brain's world representation emerges not from philosophy but from integrating simple and then progressively more complex actions (driven by drives and instincts) with sensory feedback and inputs such as rewards. Our simulation provides this world representation holistically, with identical information encoded as holographic wave patterns across all associative cortex regions. Circular activation observed in cell-culture experiments provides building blocks from which such an integrative circuit can evolve simply through the transfer of excitation and inhibition to neighbouring neurons. Large-scale grid computing of the simulation brought no new emergent phenomena, but rather linear gains and losses in performance. The circuit integrates perceptions and actions. The resulting simulation compares well with data from electrophysiology, visual perception tasks, and oscillations in cortical areas. Non-local, wave-like information processing in the cortex agrees well with EEG observations such as cortical alpha, beta, and gamma oscillations. Non-local information processing has powerful emergent properties, including key features of conscious information processing.
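The circulating activation described above can be illustrated in miniature. The ring network below is our own toy construction (not the authors' simulation): each active unit excites its clockwise neighbour while self-inhibition switches it off, so a single pulse travels around the ring.

```python
# Toy sketch: circular activation from purely local excitation/inhibition.
# One pulse circulates on a ring of N threshold units.
N = 8
state = [0] * N
state[0] = 1  # a single initial activation

def step(s):
    nxt = [0] * len(s)
    for i, active in enumerate(s):
        if active:
            nxt[(i + 1) % len(s)] = 1  # excite the clockwise neighbour
            # self-inhibition: unit i itself is left inactive next step
    return nxt

history = [state]
for _ in range(N):
    state = step(state)
    history.append(state)
# after N steps the pulse has travelled once around the ring
```

After `N` steps the activation returns to its starting unit, with exactly one unit active at every step, mirroring the circular activity reported in the cell-culture experiments.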


2013
Vol 221 (3)
pp. 190-200
Author(s):
Jörg-Tobias Kuhn
Thomas Kiefer

Several techniques have been developed in recent years to generate optimal large-scale assessments (LSAs) of student achievement. These techniques often represent a blend of procedures from fields as diverse as experimental design, combinatorial optimization, particle physics, and neural networks. However, despite the theoretical advances in the field, there is still a surprising scarcity of well-documented test designs in which all factors that have guided design decisions are explicitly and clearly communicated. This paper therefore has two goals. First, a brief summary of relevant key terms, as well as experimental designs and automated test assembly routines in LSA, is given. Second, the conceptual and methodological steps in designing the assessment of the Austrian educational standards in mathematics are described in detail. The test design was generated using a two-step procedure, starting at the item-block level and continuing at the item level. Initially, a partially balanced incomplete item-block design was generated using simulated annealing; in a second step, items were assigned to the item blocks using mixed-integer linear optimization in combination with a shadow-test approach.
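The first step of this procedure can be illustrated with a toy simulated-annealing search. The design sizes and the cost function below are illustrative stand-ins, not those of the Austrian assessment: forms receive item blocks, and annealing drives the pairwise co-occurrence of blocks toward balance.

```python
# Toy simulated annealing for a balanced incomplete block design:
# assign item blocks to test forms so that every pair of blocks
# co-occurs in roughly the same number of forms.
import itertools
import math
import random

random.seed(0)

N_BLOCKS, BLOCKS_PER_FORM, N_FORMS = 12, 4, 18

def imbalance(forms):
    """Spread (max - min) of pairwise block co-occurrence counts;
    smaller means the design is closer to balanced."""
    pairs = {p: 0 for p in itertools.combinations(range(N_BLOCKS), 2)}
    for form in forms:
        for p in itertools.combinations(sorted(form), 2):
            pairs[p] += 1
    counts = pairs.values()
    return max(counts) - min(counts)

current = [random.sample(range(N_BLOCKS), BLOCKS_PER_FORM)
           for _ in range(N_FORMS)]
cur_cost = imbalance(current)
temp = 2.0
for _ in range(20000):
    f = random.randrange(N_FORMS)
    i = random.randrange(BLOCKS_PER_FORM)
    old = current[f][i]
    candidates = [b for b in range(N_BLOCKS) if b not in current[f]]
    current[f][i] = random.choice(candidates)  # swap one block in one form
    new_cost = imbalance(current)
    # Metropolis rule: keep improvements, sometimes accept worse moves
    if new_cost <= cur_cost or random.random() < math.exp((cur_cost - new_cost) / temp):
        cur_cost = new_cost
    else:
        current[f][i] = old  # reject: undo the swap
    temp *= 0.9997  # cool down
```

The occasional acceptance of worse moves, controlled by the falling temperature, is what lets the search escape local optima; the real study then fills the chosen blocks with items via mixed-integer linear optimization.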


Effective field theory (EFT) is a general method for describing quantum systems with multiple length scales in a tractable fashion. It allows us to perform precise calculations in established models (such as the standard models of particle physics and cosmology), as well as to concisely parametrize possible effects from physics beyond the standard models. EFTs have become key tools in the theoretical analysis of particle physics experiments and cosmological observations, despite being absent from many textbooks. This volume aims to provide a comprehensive introduction to many of the EFTs in use today, and covers topics that include large-scale structure, WIMPs, dark matter, heavy quark effective theory, flavour physics, soft-collinear effective theory, and more.


Author(s):  
Daniel Canarutto

This monograph addresses the need to clarify basic mathematical concepts at the crossroads between gravitation and quantum physics. Selected mathematical and theoretical topics are presented in a reasonably self-contained, integrated approach that exploits standard and non-standard notions in natural geometric language. The role of structure groups can be regarded as secondary even in the treatment of the gauge fields themselves. Two-spinors yield a partly original ‘minimal geometric data’ approach to Einstein-Cartan-Maxwell-Dirac fields. The gravitational field is jointly represented by a spinor connection and by a soldering form (a ‘tetrad’) valued in a vector bundle naturally constructed from the assumed 2-spinor bundle. We give a presentation of electroweak theory that dispenses with group-related notions, and we introduce a non-standard, natural extension of it. Also within the 2-spinor approach we present: a non-standard view of gauge freedom; a first-order Lagrangian theory of fields with arbitrary spin; an original treatment of Lie derivatives of spinors and spinor connections. Furthermore, we introduce an original formulation of Lagrangian field theories based on covariant differentials, which works in classical and quantum field theories alike and simplifies calculations. We offer a precise mathematical approach to quantum bundles and quantum fields, including ghosts, BRST symmetry and anti-fields, treating the geometry of quantum bundles and their jet prolongations in terms of Frölicher's notion of smoothness. We propose an approach to quantum particle physics based on the notion of detector, and illustrate the basic scattering computations in that context.


2021
Vol 22 (11)
pp. 5793
Author(s):
Brianna M. Quinville
Natalie M. Deschenes
Alex E. Ryckman
Jagdeep S. Walia

Sphingolipids are a specialized group of lipids essential to the composition of the plasma membrane of many cell types; however, they are primarily localized within the nervous system. The amphipathic properties of sphingolipids enable their participation in a variety of intricate metabolic pathways. Sphingoid bases are the building blocks for all sphingolipid derivatives, comprising a complex class of lipids. The biosynthesis and catabolism of these lipids play an integral role in small- and large-scale body functions, including participation in membrane domains and signalling; cell proliferation, death, migration, and invasiveness; inflammation; and central nervous system development. Recently, sphingolipids have become the focus of several fields of research in the medical and biological sciences, as these bioactive lipids have been identified as potent signalling and messenger molecules. Sphingolipids are now being exploited as therapeutic targets for several pathologies. Here we present a comprehensive review of the structure and metabolism of sphingolipids and their many functional roles within the cell. In addition, we highlight the role of sphingolipids in several pathologies, including inflammatory disease, cystic fibrosis, cancer, Alzheimer’s and Parkinson’s disease, and lysosomal storage disorders.


2011
Vol 18 (5)
pp. 563-572
Author(s):
G. Balasis
C. Papadimitriou
I. A. Daglis
A. Anastasiadis
I. Sandberg
...

Abstract. The dynamics of complex systems are founded on universal principles that can be used to describe disparate problems, ranging from particle physics to the economies of societies. A corollary is that transferring ideas and results between investigators in hitherto disparate areas will cross-fertilize both fields and lead to important new results. In this contribution, we investigate the existence of universal behaviour, if any, in solar flares, magnetic storms, earthquakes and pre-seismic electromagnetic (EM) emissions, extending the work recently published by Balasis et al. (2011a). A common characteristic in the dynamics of the above-mentioned phenomena is that their energy release is basically fragmentary, i.e. the associated events are composed of elementary building blocks. By analogy with earthquakes, the magnitude of magnetic storms, solar flares and pre-seismic EM emissions can be appropriately defined. The key question we can then ask in the framework of complexity is whether the magnitude distributions of earthquakes, magnetic storms, solar flares and pre-fracture EM emissions obey the same law. We show that these apparently different extreme events, which occur in the solar-terrestrial system, follow the same energy distribution function, originally derived for earthquake dynamics in the framework of nonextensive Tsallis statistics.
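For readers unfamiliar with the distribution in question, a minimal sketch follows. It assumes the fragment-asperity q-exponential form of the cumulative magnitude distribution (Sotolongo-Costa and Posadas, in the revision by Silva et al.) associated with this line of work; the entropic index q and especially the constant a below are purely illustrative values, not fitted parameters from the paper.

```python
# Hedged sketch: nonextensive (Tsallis) cumulative magnitude distribution.
# q is the entropic index; a links released energy to fragment size.
import math

def log10_cumulative_fraction(M, q, a):
    """log10 of N(>M)/N: the fraction of events with magnitude above M,
    in the fragment-asperity q-exponential form (illustrative constants)."""
    return ((2.0 - q) / (1.0 - q)) * math.log10(
        1.0 - ((1.0 - q) / (2.0 - q)) * a * 10.0 ** (2.0 * M))

# The fraction of events above magnitude M falls off monotonically, and
# for large M the curve approaches a Gutenberg-Richter-like power law.
fractions = [log10_cumulative_fraction(M, q=1.8, a=1e-6) for M in range(7)]
```

Fitting q for earthquakes, storms, flares and EM emissions separately and finding compatible values is, in essence, the universality test the abstract describes.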


Author(s):
Alan Gray
Kevin Stratford

Leading high performance computing systems achieve their status through the use of highly parallel devices such as NVIDIA graphics processing units or Intel Xeon Phi many-core processors. The concept of performance portability across such architectures, as well as traditional CPUs, is vital for the application programmer. In this paper we describe targetDP, a lightweight abstraction layer which allows grid-based applications to target data-parallel hardware in a platform-agnostic manner. We demonstrate the effectiveness of our pragmatic approach by presenting performance results for a complex fluid application (with which the model was co-designed), plus a separate lattice quantum chromodynamics particle physics code. For each application, a single source code base is seen to achieve portable performance, as assessed within the context of the Roofline model. TargetDP can be combined with the Message Passing Interface (MPI) to allow use on systems containing multiple nodes: we demonstrate this through scaling results on traditional and graphics-processing-unit-accelerated large-scale supercomputers.
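The Roofline assessment mentioned above reduces to a one-line bound: attainable throughput is capped either by peak compute or by memory bandwidth times arithmetic intensity. The peak and bandwidth figures in the sketch below are hypothetical, not those of any machine in the paper.

```python
# Toy Roofline model: performance bound for a kernel of given
# arithmetic intensity (FLOPs per byte moved from memory).
def roofline_gflops(intensity_flops_per_byte, peak_gflops, bandwidth_gbytes_per_s):
    """Attainable GFLOP/s: compute-bound or bandwidth-bound, whichever is lower."""
    return min(peak_gflops, bandwidth_gbytes_per_s * intensity_flops_per_byte)

# A stencil-like kernel at 0.5 FLOP/byte is bandwidth-bound on this
# hypothetical device; at 10 FLOP/byte the same device is compute-bound.
low = roofline_gflops(0.5, peak_gflops=1000.0, bandwidth_gbytes_per_s=200.0)
high = roofline_gflops(10.0, peak_gflops=1000.0, bandwidth_gbytes_per_s=200.0)
```

Plotting measured kernel performance against this bound is how "portable performance" claims like the one above are typically judged: a code is doing well if it sits near the roof on each architecture.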


2021
Author(s):
Kor de Jong
Marc van Kreveld
Debabrata Panja
Oliver Schmitz
Derek Karssenberg

Data availability at global scale is increasing exponentially. Although considerable challenges remain regarding the identification of model structure and parameters of continental-scale hydrological models, we will soon reach the situation in which global-scale models can be defined at very high resolutions, close to 100 m or less. One of the key challenges is how to make simulations of these ultra-high-resolution models tractable ([1]).

Our research contributes by developing a model building framework that is specifically designed to distribute calculations over multiple cluster nodes. This framework enables domain experts such as hydrologists to develop their own large-scale models, using a scripting language like Python, without having to acquire the skills to write low-level computer code for parallel and distributed computing.

We present the design and implementation of this software framework and illustrate its use with a prototype 100 m, 1 h continental-scale hydrological model. Our modelling framework ensures that any model built with it is parallelized. This is made possible by providing the model builder with a set of model building blocks, which are coded in such a manner that parallelization of calculations occurs within and across these building blocks, for any combination of building blocks. There is thus full flexibility on the side of the modeller, without losing performance.

This breakthrough is made possible by applying a novel approach to the implementation of the model building framework, called asynchronous many-tasks, provided by the HPX C++ software library ([3]). The code in the model building framework expresses spatial operations as large collections of interdependent tasks that can be executed efficiently on individual laptops as well as computer clusters ([2]).

Our framework currently includes the most essential operations for building large-scale hydrological models, including those for simulating transport of material through a flow direction network. By combining these operations, we rebuilt an existing 100 m, 1 h resolution model, thus far used for simulations of small catchments; this required limited coding, as we only had to replace the computational back end of the existing model. Runs at continental scale on a computer cluster show acceptable strong and weak scaling, providing a strong indication that global simulations at this resolution will soon be possible, technically speaking.

Future work will focus on extending the set of modelling operations and adding scalable I/O, after which existing models that are currently limited in their ability to use the available computational resources can be ported to this new environment.

More information about our modelling framework is at https://lue.computationalgeography.org.

References
[1] M. Bierkens. Global hydrology 2015: State, trends, and directions. Water Resources Research, 51(7):4923-4947, 2015.
[2] K. de Jong, et al. An environmental modelling framework based on asynchronous many-tasks: scalability and usability. Submitted.
[3] H. Kaiser, et al. HPX - The C++ standard library for parallelism and concurrency. Journal of Open Source Software, 5(53):2352, 2020.
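The asynchronous many-task idea can be sketched in miniature. The sketch below is our own illustration, not LUE's or HPX's actual API: every operation on every raster partition becomes a task whose inputs are futures, so independent partitions run concurrently and dependent operations overlap.

```python
# Minimal asynchronous many-task sketch using Python futures.
# Names and the runoff example are illustrative only.
from concurrent.futures import ThreadPoolExecutor

pool = ThreadPoolExecutor(max_workers=4)

def partitioned(values, n_parts):
    """Split a 1-D 'raster' into partitions, each wrapped in a future."""
    size = len(values) // n_parts
    return [pool.submit(lambda v=values[i * size:(i + 1) * size]: v)
            for i in range(n_parts)]

def local_op(f, partitions):
    """Apply a cell-local function per partition; each partition is an
    independent task that waits only on its own input future."""
    return [pool.submit(lambda p=part: [f(x) for x in p.result()])
            for part in partitions]

precip = partitioned(list(range(16)), 4)
runoff = local_op(lambda x: 0.6 * x, precip)  # toy runoff coefficient
total = pool.submit(lambda: sum(sum(f.result()) for f in runoff))
```

In the real framework HPX plays the role of the executor, schedules the task graph across cluster nodes rather than threads, and adds the cross-partition dependencies needed for operations such as flow accumulation.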


2021
Vol 75 (1)
Author(s):
Diego O. Serra
Regine Hengge

Biofilms are a widespread multicellular form of bacterial life. The spatial structure and emergent properties of these communities depend on a polymeric extracellular matrix architecture that is orders of magnitude larger than the cells that build it. Using as a model the wrinkly macrocolony biofilms of Escherichia coli, which contain amyloid curli fibers and phosphoethanolamine (pEtN)-modified cellulose as matrix components, we summarize here the structure, building, and function of this large-scale matrix architecture. Based on different sigma and other transcription factors as well as second messengers, the underlying regulatory network reflects the fundamental trade-off between growth and survival. It controls matrix production spatially in response to long-range chemical gradients, but it also generates distinct patterns of short-range matrix heterogeneity that are crucial for tissue-like elasticity and macroscopic morphogenesis. Overall, these biofilms confer protection and a potential for homeostasis, thereby reducing maintenance energy, which makes multicellularity an emergent property of life itself. Expected final online publication date for the Annual Review of Microbiology, Volume 75 is October 2021. Please see http://www.annualreviews.org/page/journal/pubdates for revised estimates.

