Heterotic String Model Building with Monad Bundles and Reinforcement Learning

2022 ◽  
pp. 2100186
Author(s):  
Andrei Constantin ◽  
Thomas R. Harvey ◽  
Andre Lukas

1996 ◽  
Vol 11 (05) ◽  
pp. 903-920 ◽  
Author(s):  
RICHARD ALTENDORFER ◽  
TATSUO KOBAYASHI

We study the gauge coupling unification of the minimal supersymmetric standard model with nonuniversal soft scalar and gaugino masses. The unification scale of the gauge couplings is estimated for the nonuniversal cases and turns out to be sensitive to the nonuniversality. These cases can be combined with the assumption of string unification, which leads to a prediction of sin²θ_W(M_Z) and of k_1, the normalization of the U(1)_Y generator. String unification predicts k_1 = 1.3–1.4. These values have nontrivial implications for string model building. Two-loop corrections are also calculated. Some of these cases exhibit a large discrepancy between experiment and string unification, and we calculate string threshold corrections to explain the discrepancy.
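For orientation, the standard one-loop running behind such unification estimates can be sketched as follows. This is an illustrative textbook calculation with universal MSSM beta coefficients and the conventional GUT normalization k_1 = 5/3, not the nonuniversal analysis of the paper; the input coupling values are approximate experimental numbers.

```python
import math

# One-loop running: 1/alpha_i(mu) = 1/alpha_i(MZ) - (b_i / 2pi) * ln(mu / MZ)
MZ = 91.19  # GeV

# Approximate inverse couplings at MZ (GUT-normalized U(1)_Y, k_1 = 5/3):
inv_alpha = {1: 59.0, 2: 29.6, 3: 8.5}
# One-loop MSSM beta coefficients:
b = {1: 33.0 / 5.0, 2: 1.0, 3: -3.0}

def inv_alpha_at(i, mu):
    """Run 1/alpha_i from MZ up to scale mu at one loop."""
    return inv_alpha[i] - b[i] / (2.0 * math.pi) * math.log(mu / MZ)

# Scale where alpha_1 and alpha_2 meet:
# inv1 - (b1/2pi) t = inv2 - (b2/2pi) t  =>  t = 2pi (inv1 - inv2) / (b1 - b2)
t = 2.0 * math.pi * (inv_alpha[1] - inv_alpha[2]) / (b[1] - b[2])
M_GUT = MZ * math.exp(t)

print(f"M_GUT ~ {M_GUT:.2e} GeV")
print(f"1/alpha_3(M_GUT) = {inv_alpha_at(3, M_GUT):.1f}, "
      f"1/alpha_1(M_GUT) = {inv_alpha_at(1, M_GUT):.1f}")
```

With these inputs the unification scale comes out near 2 × 10^16 GeV and the strong coupling meets the other two to within a few percent; nonuniversal soft masses and k_1 ≠ 1 shift these numbers, which is the effect the paper quantifies.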


2007 ◽  
Vol 2007 (03) ◽  
pp. 035-035 ◽  
Author(s):  
Stefan Groot Nibbelink ◽  
Michele Trapletti ◽  
Martin G.A. Walter

2011 ◽  
Vol 2011 ◽  
pp. 1-18 ◽  
Author(s):  
Rhys Davies

This is a short review of recent constructions of new Calabi-Yau threefolds with small Hodge numbers and/or nontrivial fundamental group, which are of particular interest for model building in the context of heterotic string theory. The two main tools are topological transitions and taking quotients by actions of discrete groups. Both of these techniques can produce new manifolds from existing ones, and they have been used to bring many new specimens to the previously sparse corner of the Calabi-Yau zoo where both Hodge numbers are small. Two new manifolds are also obtained here from hyperconifold transitions, including the first example with fundamental group S₃, the smallest non-Abelian group.
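The arithmetic underlying the quotient construction is simple to illustrate: for a free action of a finite group G on a Calabi-Yau threefold X, the Euler characteristic divides, χ(X/G) = χ(X)/|G|, with χ = 2(h¹¹ − h²¹). The numbers below use the standard quintic example with its free ℤ₅ × ℤ₅ action, not a manifold constructed in this review.

```python
def euler(h11, h21):
    """Euler characteristic of a Calabi-Yau threefold from its Hodge numbers."""
    return 2 * (h11 - h21)

# The quintic threefold: (h11, h21) = (1, 101), so chi = -200.
chi_quintic = euler(1, 101)

# A free Z5 x Z5 action has order 25; the quotient's Euler
# characteristic is chi / |G|.
order = 25
chi_quotient = chi_quintic // order

print(chi_quintic, chi_quotient)  # -200 -8
```

Smaller |χ| on the quotient goes hand in hand with smaller Hodge numbers, which is why quotients populate the sparse small-Hodge-number corner the review describes.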


2012 ◽  
pp. 120-146 ◽  
Author(s):  
Janusz A. Starzyk

This chapter describes a motivated learning (ML) method that advances the model building and learning techniques required for intelligent systems. Motivated learning addresses critical limitations of reinforcement learning (RL), the more common approach to coordinating a machine's interaction with an unknown environment. RL maximizes the external reward by approximating multidimensional value functions; however, it does not work well in dynamically changing environments. The ML method overcomes these RL problems by triggering internal motivations and creating abstract goals and internal reward systems to stimulate learning. The chapter addresses the important question of how to motivate an agent to learn and enhance its own complexity. A mechanism is presented that extends low-level sensory-motor interactions towards advanced perception and motor skills, resulting in the emergence of desired cognitive properties. ML is compared to RL using a rapidly changing environment in which the agent needs to manage its motivations as well as choose and implement goals in order to succeed.
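The external-reward RL baseline that the chapter contrasts with motivated learning can be sketched with tabular Q-learning, the simplest form of value-function approximation. The five-state chain environment below is invented for illustration and is not taken from the chapter.

```python
import random

# Tabular Q-learning on a 5-state chain: the agent starts at state 0
# and receives external reward 1.0 only upon reaching the goal state.
N_STATES, GOAL = 5, 4
alpha, gamma, eps = 0.5, 0.9, 0.1
Q = [[0.0, 0.0] for _ in range(N_STATES)]  # actions: 0 = left, 1 = right

def step(s, a):
    """Move along the chain; reward only at the goal."""
    s2 = max(0, min(N_STATES - 1, s + (1 if a == 1 else -1)))
    return s2, (1.0 if s2 == GOAL else 0.0)

random.seed(0)
for _ in range(500):  # episodes
    s = 0
    while s != GOAL:
        # epsilon-greedy action selection
        if random.random() < eps:
            a = random.randrange(2)
        else:
            a = 1 if Q[s][1] >= Q[s][0] else 0
        s2, r = step(s, a)
        # Q-learning update toward the bootstrapped target
        Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
        s = s2

policy = [1 if Q[s][1] >= Q[s][0] else 0 for s in range(N_STATES - 1)]
print(policy)  # greedy policy points right along the chain
```

Because the reward is purely external and fixed, this learner has no mechanism for generating its own goals when the environment changes, which is exactly the limitation the motivated-learning approach targets.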


2011 ◽  
Vol 26 (32) ◽  
pp. 2411-2426 ◽  
Author(s):  
D. MOORE ◽  
J. GREENWALD ◽  
T. RENNER ◽  
M. ROBINSON ◽  
C. BUESCHER ◽  
...  

Using software under development at Baylor University, we explicitly construct all layer 1 gauge, weakly coupled free fermionic heterotic string models up to order 22 in four large spacetime dimensions. The gauge models consist primarily of gauge content, making a systematic construction process efficient. We present an overview of the model building procedure, the redundancies in the process, the methods used to reduce such redundancies, and statistics on the occurrence of various combinations of gauge group factors and GUT groups. Statistics for both [Formula: see text] and [Formula: see text] models are presented.


2009 ◽  
Vol 24 (33) ◽  
pp. 2703-2715 ◽  
Author(s):  
MATTHEW B. ROBINSON ◽  
GERALD B. CLEAVER ◽  
MARKUS HUNZIKER

We consider an alternative derivation of the GSO projection in the free fermionic construction of the weakly coupled heterotic string in terms of root systems, as well as the interpretation of the GSO projection in this picture. We then present an algorithm to systematically and efficiently generate input sets (i.e., basis vectors) in order to study landscape statistics with minimal computational cost. For example, the improvement at order 6 is ≈10^-13 over a traditional brute force approach, and the improvement increases with order. We then consider an example of statistics on a relatively simple class of models.

