Searches for Prompt R-Parity-Violating Supersymmetry at the LHC

2015 ◽  
Vol 2015 ◽  
pp. 1-24 ◽  
Author(s):  
Andreas Redelbach

Searches for supersymmetry (SUSY) at the LHC frequently assume the conservation of R-parity in their design, optimization, and interpretation. If R-parity is not conserved, constraints on SUSY particle masses tend to be weakened with respect to R-parity-conserving models. We review the current status of searches for R-parity-violating (RPV) supersymmetry models at the ATLAS and CMS experiments, limited to 8 TeV search results published or submitted for publication as of the end of March 2015. All forms of renormalisable RPV terms leading to prompt signatures are considered in the set of analyses under review. The discussion of results from searches for prompt R-parity-violating SUSY signatures summarizes the main constraints on various RPV models from LHC Run I and also defines a basis for promising signal regions to be optimized for Run II. In addition to identifying highly constrained regions probed by existing searches, gaps in the coverage of the RPV SUSY parameter space are outlined.
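For orientation, the renormalisable RPV terms referred to in this abstract are conventionally written in the superpotential as follows (standard MSSM notation from the general literature, not an excerpt from the review itself):

```latex
W_{\mathrm{RPV}} = \tfrac{1}{2}\,\lambda_{ijk}\, L_i L_j \bar{e}_k
                 + \lambda'_{ijk}\, L_i Q_j \bar{d}_k
                 + \tfrac{1}{2}\,\lambda''_{ijk}\, \bar{u}_i \bar{d}_j \bar{d}_k
                 + \kappa_i\, L_i H_u ,
```

with generation indices $i,j,k$. The $\lambda$, $\lambda'$, and $\kappa$ terms violate lepton number, while the $\lambda''$ term violates baryon number, which is why prompt signatures (multileptons versus multijets) differ substantially between the two classes of couplings.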

Author(s):  
Marta Losada

In this paper we present the current status of searches for neutral long-lived particles. The basic formalism that allows the determination of the number of expected long-lived particle decays is presented. Heavy neutral leptons are one type of long-lived particle, and the main observational motivations for their existence are covered as well. A summary of the main results from both collider searches and fixed-target/beam-dump experiments is presented. The outlook for next-generation experiments and their impact on the parameter space of coupling strength and mass of heavy neutral leptons is also discussed.
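The basic formalism mentioned above can be sketched as follows (a textbook expression, not quoted from the paper): the expected number of decays inside a detector volume spanning distances $L_1$ to $L_2$ along the particle's flight direction is

```latex
N_{\mathrm{dec}} = N_{\mathrm{prod}}\,\mathrm{BR}
  \left[\, e^{-L_1/(\beta\gamma c\tau)} - e^{-L_2/(\beta\gamma c\tau)} \,\right],
```

where $\beta\gamma c\tau$ is the boosted proper decay length and the bracketed factor is the probability that the particle decays within the instrumented region.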


2015 ◽  
Vol 30 (15) ◽  
pp. 1540017 ◽  
Author(s):  
Greg Landsberg

The success of the first three years of operations of the CERN Large Hadron Collider (LHC) at center-of-mass energies of 7 TeV and 8 TeV radically changed the landscape of searches for new physics beyond the Standard Model (BSM) and our very way of thinking about its possible origin and hiding place. Among the paradigms of new physics that have been probed quite extensively at the LHC are various models that predict the existence of extra spatial dimensions. In this review, the current status of searches for extra dimensions with the Compact Muon Solenoid (CMS) detector is presented, along with prospects for future searches at the full energy of the LHC, expected to be reached in the next few years.


2020 ◽  
Vol 59 (10) ◽  
pp. 3189-3205
Author(s):  
Ijaz Ahmed ◽  
Murad Badshah ◽  
Nadia Kausar

Buildings ◽  
2022 ◽  
Vol 12 (1) ◽  
pp. 69
Author(s):  
István Kistelegdi ◽  
Kristóf Roland Horváth ◽  
Tamás Storcz ◽  
Zsolt Ercsey

Due to the negative environmental impacts caused by the building industry, sustainable buildings have recently become one of the most investigated fields in research. As the design technique itself is mainly responsible for building performance, building energy design optimization is of particular interest. Several studies concentrate on systems, operation, and control optimization, complemented by passive strategies, specifically related to the envelope. In building physics, architectural considerations, in particular the building's shape, are essential variables, as they greatly influence the performance of a building. Most scientific work that takes building geometry into consideration either explores spaces without any energy optimization or optimizes only a few basic variables of simplified space geometries. Review studies mainly discuss the historical development of optimization algorithms, building domains, and the performance of algorithm and software frameworks, together with coupling issues. By providing a systemized clustering of different levels of shape-integration intensity, space-creation principles, and algorithms, this review explores the current status of sustainability-related shape optimization. The review shows that geometry design variable modifications and, specifically, shape generation techniques offer promising optimization potential; however, the findings also indicate that building shape optimization is still in its infancy.
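A toy illustration of the shape-optimization idea described above: for a fixed enclosed volume and storey height, search the floor-plan aspect ratio that minimizes the envelope (walls plus roof) area, a crude proxy for the heat-loss surface. All numbers and function names here are hypothetical illustration values, not taken from the review.

```python
# Toy building-shape optimization sketch (hypothetical example values).
# Minimizes the envelope area of a rectangular box at fixed volume and
# storey height by grid-searching the floor-plan aspect ratio.

def envelope_area(width, depth, height):
    """Walls plus flat roof of a rectangular box (ground slab ignored)."""
    return 2 * (width + depth) * height + width * depth

def best_aspect_ratio(volume=600.0, height=3.0, steps=200):
    """Search width/depth ratios in [0.2, 5.0] at fixed volume and height."""
    floor_area = volume / height
    best = None
    for i in range(steps + 1):
        ratio = 0.2 + (5.0 - 0.2) * i / steps
        width = (floor_area * ratio) ** 0.5
        depth = floor_area / width
        area = envelope_area(width, depth, height)
        if best is None or area < best[0]:
            best = (area, ratio)
    return best

area, ratio = best_aspect_ratio()
print(f"best aspect ratio ~ {ratio:.2f}, envelope area ~ {area:.1f} m^2")
```

As expected for a box at fixed floor area, the search converges on a roughly square plan (ratio near 1), since that minimizes wall length; real shape-optimization studies add orientation, glazing, and solar gains on top of such geometric terms.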


2018 ◽  
Vol 33 (05n06) ◽  
pp. 1842002 ◽  
Author(s):  
M. Drewes ◽  
B. Garbrecht ◽  
P. Hernández ◽  
M. Kekic ◽  
J. Lopez-Pavon ◽  
...  

We review the current status of the leptogenesis scenario originally proposed by Akhmedov, Rubakov and Smirnov (ARS). It takes place in the parametric regime where the right-handed neutrinos are at the electroweak scale or below and the CP-violating effects are induced by the coherent superposition of different right-handed mass eigenstates. Two main theoretical approaches to deriving quantum kinetic equations, the Hamiltonian time evolution and the closed-time-path technique, are presented, and we discuss their relation. For scenarios with two right-handed neutrinos, we chart the viable parameter space. Both a Bayesian analysis, which determines the most likely configurations for viable leptogenesis given different variants of flat priors, and a determination of the maximally allowed mixing between the light, mostly left-handed, and heavy, mostly right-handed, neutrino states are discussed. Rephasing invariants are shown to be a useful tool for classifying and understanding the various distinct contributions to ARS leptogenesis that can dominate in different parametric regimes. While these analyses are carried out for the parametric regime where initial asymmetries are generated predominantly by lepton-number-conserving but flavor-violating effects, we also review the contributions from lepton-number-violating operators and identify the regions of parameter space where these are relevant.


2019 ◽  
Vol 79 (11) ◽  
Author(s):  
Prasenjit Sanyal

Abstract The latest CMS results on the upper limits on $$\sigma_{H^\pm}$$ BR($$H^\pm \rightarrow \tau^\pm \nu$$) and $$\sigma_{H^\pm}$$ BR($$H^+ \rightarrow t\bar{b}$$) at $$\sqrt{s}=13$$ TeV with an integrated luminosity of 35.9 $$\hbox{fb}^{-1}$$ are used to impose constraints on the charged-Higgs $$H^\pm$$ parameters within the Two Higgs Doublet Model (2HDM). The 2HDM is the simplest extension of the Standard Model (SM) under the same gauge symmetry that contains a charged Higgs boson, and it is relatively weakly constrained compared with the Minimal Supersymmetric Standard Model (MSSM). The latest results lead to much more stringent constraints on the charged-Higgs parameter space than the earlier 8 TeV results. The CMS collaboration also studied the exotic bosonic decays $$H^\pm \rightarrow W^\pm A$$ and $$A \rightarrow \mu^+ \mu^-$$ for the first time and put upper limits on BR($$t \rightarrow H^+ b$$) for a light charged Higgs boson. These constraints exclude parameter space that is not excluded by the $$\tau\nu$$ channel. For comparison, the exclusion regions from flavor-physics constraints are also discussed.


2017 ◽  
Vol 32 (02n03) ◽  
pp. 1750014
Author(s):  
Ran Ding ◽  
Li Huang ◽  
Tianjun Li ◽  
Bin Zhu

We propose a supersymmetric explanation of the diphoton excess in the Minimal Supersymmetric Standard Model (MSSM) with leptonic R-parity violation. In our model, a sneutrino serves as the 750 GeV resonance and is produced through quark–antiquark annihilation. By introducing appropriate trilinear soft parameters, we show that the diphoton branching ratio is significantly enhanced compared with the conventional MSSM. Given current dijet and W-pair LHC constraints, we can successfully fit the observed diphoton signal rate in sizeable parameter regions; the resulting parameter space strongly favors light smuon and stau masses in the range 375–500 GeV, depending on the choice of electroweakino masses and soft trilinear terms. After taking into account the compatibility of the diphoton excess between the 8 TeV and 13 TeV LHC, only the coupling involving the second-generation quarks survives. In this case, the corresponding parameter space favors a narrow mass range of smuon and stau with [Formula: see text]. Even if the 750 GeV diphoton excess is not confirmed by the ATLAS and CMS experiments, we point out that our proposal can still be used to explain current and future tentative diphoton excesses.


2014 ◽  
Vol 29 (23) ◽  
pp. 1430041 ◽  
Author(s):  
Andrew Askew ◽  
Sushil Chauhan ◽  
Björn Penning ◽  
William Shepherd ◽  
Mani Tripathi

Theoretical and experimental techniques employed in dedicated searches for dark matter at hadron colliders are reviewed. Bounds from the 7 TeV and 8 TeV proton–proton collisions at the Large Hadron Collider (LHC) on dark matter interactions have been collected and the results interpreted. We review the current status of the Effective Field Theory picture of dark matter interactions with the Standard Model. Currently, LHC experiments have stronger bounds on operators leading to spin-dependent scattering than direct detection experiments, while direct detection probes are more constraining for spin-independent scattering for WIMP masses above a few GeV.
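The spin-dependent/spin-independent distinction drawn above can be illustrated with two effective operators commonly used in the collider dark-matter EFT literature (standard examples, not specific results of this review):

```latex
\mathcal{O}_V = \frac{(\bar{\chi}\gamma^\mu \chi)(\bar{q}\gamma_\mu q)}{\Lambda^2}
\ \text{(vector: spin-independent scattering)},
\qquad
\mathcal{O}_A = \frac{(\bar{\chi}\gamma^\mu\gamma^5 \chi)(\bar{q}\gamma_\mu\gamma^5 q)}{\Lambda^2}
\ \text{(axial-vector: spin-dependent scattering)},
```

where $\chi$ is the dark-matter field, $q$ a quark field, and $\Lambda$ the suppression scale. Collider limits on $\Lambda$ from mono-X searches can be translated into bounds on either scattering cross-section, which is what enables the comparison with direct detection quoted above.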


Author(s):  
Sudha Ram

We are fortunate to be experiencing explosive growth and advancement in the Internet and the World Wide Web (WWW). In 1999, the global online population was estimated to be 250 million WWW users worldwide, while the number of pages on the Web was estimated at 800 million (http://www.internetindicators.com/facts.html). The bright side of this kind of growth is that information is available to almost anyone with access to a computer and a phone line. However, the dark side of this explosion is that we are now squarely in the midst of the "Age of Information Overload"! The staggering amount of information has made it extremely difficult for users to locate and retrieve information that is actually relevant to the task at hand. Given the bewildering array of resources being generated and posted on the WWW, the task of finding exactly what a user wants is rather daunting. Although many search engines currently exist to assist in information retrieval, much of the burden of searching falls on the end-user. A typical search results in millions of hits, many of which are outdated, irrelevant, or duplicated. One promising approach to managing the information overload problem is to use "intelligent agents" for search and retrieval. This editorial explores the current status of intelligent agents and points out some challenges in the development of intelligent agent-based systems.

