Sampling the $$\mu \nu \mathrm{SSM}$$ for displaced decays of the tau left sneutrino LSP at the LHC

2019
Vol 79 (11)
Author(s):  
Essodjolo Kpatcha ◽  
Iñaki Lara ◽  
Daniel E. López-Fogliani ◽  
Carlos Muñoz ◽  
Natsumi Nagata ◽  
...  

Abstract: Within the framework of the $$\mu \nu \mathrm{SSM}$$, a displaced dilepton signal is expected at the LHC from the decay of a tau left sneutrino as the lightest supersymmetric particle (LSP) with a mass in the range 45–100 GeV. We compare the predictions of this scenario with the ATLAS search for long-lived particles using displaced lepton pairs in pp collisions, considering an optimization of the trigger requirements by means of a high level trigger that exploits tracker information. The analysis is carried out in the general case of three families of right-handed neutrino superfields, where all the neutrinos get contributions to their masses at tree level. To analyze the parameter space, we sample the $$\mu \nu \mathrm{SSM}$$ for a tau left sneutrino LSP with proper decay length $$c\tau > 0.1\,\mathrm{mm}$$ using a likelihood data-driven method, paying special attention to reproducing the current experimental data on neutrino and Higgs physics, as well as flavor observables. The sneutrino is special in the $$\mu \nu \mathrm{SSM}$$ since its couplings have to be chosen so that the neutrino oscillation data are reproduced. We find that important regions of the parameter space can be probed at LHC Run 3.
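As a rough illustration of the sampling strategy described above, the minimal Python sketch below (not the authors' code; the parameters, the likelihood and the decay-length map are placeholders) runs a likelihood-driven Metropolis scan and retains only parameter points whose sneutrino proper decay length exceeds 0.1 mm.

```python
# Minimal sketch of a likelihood-driven scan keeping points with c*tau > 0.1 mm.
# The model parameters, likelihood and decay-length map are illustrative
# placeholders, NOT the actual MuNuSSM spectrum calculation.
import numpy as np

rng = np.random.default_rng(42)

def log_likelihood(theta):
    """Toy stand-in for the combined neutrino/Higgs/flavor likelihood."""
    return -0.5 * np.sum(((theta - 1.0) / 0.3) ** 2)

def proper_decay_length_mm(theta):
    """Hypothetical map from parameters to the sneutrino c*tau in mm."""
    return 0.05 + abs(theta[0]) * 0.2

theta = np.array([1.0, 1.0])
accepted = []
for _ in range(10_000):
    # Symmetric Gaussian proposal + Metropolis accept/reject step.
    proposal = theta + 0.1 * rng.standard_normal(theta.size)
    if np.log(rng.uniform()) < log_likelihood(proposal) - log_likelihood(theta):
        theta = proposal
    # Keep only points satisfying the displaced-decay requirement.
    if proper_decay_length_mm(theta) > 0.1:
        accepted.append(theta.copy())

print(f"points with c*tau > 0.1 mm: {len(accepted)}")
```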

2021
Vol 81 (2)
Author(s):  
Essodjolo Kpatcha ◽  
Iñaki Lara ◽  
Daniel E. López-Fogliani ◽  
Carlos Muñoz ◽  
Natsumi Nagata

Abstract: We analyze the anomalous magnetic moment of the muon, $$g-2$$, in the $$\mu \nu \mathrm{SSM}$$. This R-parity violating model solves the $$\mu$$ problem while simultaneously reproducing neutrino data, only with the addition of right-handed neutrinos. In the framework of the $$\mu \nu \mathrm{SSM}$$, light left muon-sneutrino and wino masses can be obtained naturally, driven by neutrino physics. This enhances the dominant chargino–sneutrino loop contribution to the muon $$g-2$$, closing the gap between the theoretical computation and the experimental data. To analyze the parameter space, we sample the $$\mu \nu \mathrm{SSM}$$ using a likelihood data-driven method, paying special attention to reproducing the current experimental data on neutrino and Higgs physics, as well as flavor observables such as B and $$\mu$$ decays. We then apply the constraints from LHC searches for multilepton + MET events to the viable regions found; these searches can probe them through chargino–chargino, chargino–neutralino and neutralino–neutralino pair production. We conclude that significant regions of the parameter space of the $$\mu \nu \mathrm{SSM}$$ can explain the muon $$g-2$$ data.
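For orientation, the sketch below shows how such a contribution could be confronted with the measured discrepancy. The reference value $$\Delta a_\mu \simeq (251 \pm 59) \times 10^{-11}$$ is the one quoted around the 2021 Fermilab measurement, and the chargino–sneutrino scaling is only a rough illustrative parametrisation, not the full one-loop formula used in the paper.

```python
# Minimal sketch (assumed numbers, not taken from the paper): check whether an
# illustrative chargino-sneutrino contribution to a_mu lies within 2 sigma of
# the experiment-minus-SM difference, Delta a_mu ~ (251 +/- 59) x 10^-11.
DELTA_AMU = 251e-11
SIGMA = 59e-11

def within_band(amu_susy, n_sigma=2.0):
    """True if the SUSY contribution closes the gap to within n_sigma."""
    return abs(amu_susy - DELTA_AMU) < n_sigma * SIGMA

def amu_chargino_sneutrino(tan_beta, m_chargino_gev, m_sneutrino_gev):
    """Rough order-of-magnitude scaling only: grows with tan(beta) and falls
    with the chargino and sneutrino masses; NOT the full one-loop result."""
    return 1.5e-9 * tan_beta * (100.0 / m_chargino_gev) * (100.0 / m_sneutrino_gev)

# Illustrative point: light muon sneutrino and wino-like chargino.
print(within_band(amu_chargino_sneutrino(tan_beta=10, m_chargino_gev=300, m_sneutrino_gev=200)))
```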


Author(s):  
Xiaoling Luo ◽  
Adrian Cottam ◽  
Yao-Jan Wu ◽  
Yangsheng Jiang

Trip purpose information plays a significant role in transportation systems. It has traditionally been collected through human observation, a manual process that requires many personnel and a large amount of resources. Because of this high cost, automated trip purpose estimation is attractive from a data-driven perspective, as it can improve process efficiency and save time. Therefore, a hybrid-data approach using taxi operations data and point-of-interest (POI) data to estimate trip purposes was developed in this research. POI data, an emerging data source, was incorporated because it provides a wealth of additional information for trip purpose estimation; as an open dataset, it has the added benefit of being readily accessible from online platforms. Several techniques for incorporating the POI data into the hybrid-data approach were developed and compared to achieve a high level of accuracy. To evaluate the performance of the approach, data from Chengdu, China, were used. The results show that incorporating POI information increases the average accuracy of trip purpose estimation by 28% compared with estimation that does not use POI data. These results indicate that the additional trip attributes provided by POI data can increase the accuracy of trip purpose estimation.
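A minimal sketch of such a hybrid-data classifier is given below; the feature names, synthetic data and random-forest choice are assumptions for illustration and do not reproduce the authors' model.

```python
# Minimal sketch (synthetic data, assumed features) of a hybrid-data
# trip-purpose classifier: taxi trip attributes are combined with counts of
# POI categories near the drop-off point and fed to a standard classifier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
trip_features = np.column_stack([
    rng.uniform(0, 24, n),          # drop-off hour of day
    rng.exponential(5.0, n),        # trip distance (km)
])
poi_features = rng.poisson(3.0, (n, 4))   # nearby POI counts: work/shop/home/leisure
X = np.hstack([trip_features, poi_features])
# Synthetic labels loosely tied to the dominant nearby POI category.
y = poi_features.argmax(axis=1)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"held-out accuracy: {model.score(X_te, y_te):.2f}")
```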


2019
Vol 214
pp. 05010
Author(s):  
Giulio Eulisse ◽  
Piotr Konopka ◽  
Mikolaj Krzewicki ◽  
Matthias Richter ◽  
David Rohr ◽  
...  

ALICE is one of the four major LHC experiments at CERN. When the accelerator enters the Run 3 data-taking period, starting in 2021, ALICE expects almost 100 times more central Pb-Pb collisions than now, resulting in a large increase in data throughput. To cope with this new challenge, the collaboration has had to extensively rethink the whole data processing chain, with a tighter integration between the online and offline computing worlds. This system, code-named ALICE O2, is being developed in collaboration with the FAIR experiments at GSI. It is based on the ALFA framework, which provides a generalized implementation of the ALICE High Level Trigger approach, designed around distributed software entities coordinating and communicating via message passing. We will highlight our efforts to integrate ALFA within the ALICE O2 environment. We analyze the challenges arising from the different running environments for production and development, and conclude on the requirements for a flexible and modular software framework. In particular, we will present the ALICE O2 Data Processing Layer, which deals with ALICE-specific requirements in terms of the data model. The main goal is to reduce the complexity of developing algorithms and managing a distributed system, thereby leading to a significant simplification for the large majority of ALICE users.
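The message-passing pattern described above can be illustrated with a few lines of plain Python; this is not the ALFA or O2 API, only the general idea of independent processing entities connected by queues.

```python
# Minimal sketch (plain Python, not the ALFA/O2 API) of distributed processing
# entities coordinating and communicating via message passing.
from multiprocessing import Process, Queue

def producer(out_q):
    for i in range(5):
        out_q.put({"event": i, "payload": list(range(i))})  # fake timeframe
    out_q.put(None)                                          # end-of-stream marker

def processor(in_q, out_q):
    while (msg := in_q.get()) is not None:
        msg["n_tracks"] = len(msg["payload"])                # toy "reconstruction"
        out_q.put(msg)
    out_q.put(None)

def sink(in_q):
    while (msg := in_q.get()) is not None:
        print("processed", msg)

if __name__ == "__main__":
    q1, q2 = Queue(), Queue()
    stages = [Process(target=producer, args=(q1,)),
              Process(target=processor, args=(q1, q2)),
              Process(target=sink, args=(q2,))]
    for p in stages:
        p.start()
    for p in stages:
        p.join()
```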


2009
Author(s):  
R. Covarelli ◽  
Marvin L. Marshak

Author(s):  
Ioannis T. Georgiou

Local damage at the tip of a composite propeller is diagnosed by properly comparing its impact-induced free coupled dynamics to that of a pristine wooden propeller of the same size and shape. This is accomplished by indirectly creating, via collocated measurements, distributed information on the coupled acceleration field of the propellers. The powerful data-driven modal expansion analysis delivered by the Proper Orthogonal Decomposition (POD) transform reveals that ensembles of impact-induced collocated coupled experimental acceleration signals are underlined by a high level of spatio-temporal coherence. They therefore furnish a valuable spatio-temporal sample of the coupled response induced by a point impulse. In view of this fact, a tri-axial sensor was placed on the propeller hub to collect collocated coupled acceleration signals induced by nondestructive modal-hammer impacts, and a reduced-order characterization of the coupled free dynamics was thus obtained. This experimental data-driven analysis reveals that the in-plane unit components of the POD modes for both propellers have similar, nearly identical shapes. For the damaged propeller, however, this POD shape difference is quite pronounced. The shapes of the POD modes are used to compute difference indices that reflect damage directly. At the first POD energy level, the shape-difference indices of the damaged composite propeller are considerably larger than those of the pristine wooden propeller.
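The POD comparison can be sketched as follows; the synthetic signals and the particular shape-difference index (one minus the absolute cosine similarity of corresponding unit-norm modes) are assumptions for illustration, not necessarily the authors' exact definition.

```python
# Minimal sketch (synthetic signals, assumed index definition) of a POD-based
# shape comparison: spatial POD modes of collocated acceleration ensembles are
# obtained from an SVD and compared between pristine and damaged structures.
import numpy as np

rng = np.random.default_rng(1)

def spatial_pod_modes(X):
    """POD of a (time x channels) ensemble: rows of vt are unit spatial modes."""
    _, _, vt = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)
    return vt

# Synthetic stand-ins for collocated tri-axial acceleration ensembles.
t = np.linspace(0.0, 1.0, 2000)
pristine = np.column_stack([np.sin(2*np.pi*40*t), 0.6*np.sin(2*np.pi*40*t), 0.1*np.sin(2*np.pi*90*t)])
damaged  = np.column_stack([np.sin(2*np.pi*38*t), 0.4*np.sin(2*np.pi*38*t), 0.3*np.sin(2*np.pi*85*t)])
pristine += 0.05 * rng.standard_normal(pristine.shape)
damaged  += 0.05 * rng.standard_normal(damaged.shape)

modes_p = spatial_pod_modes(pristine)
modes_d = spatial_pod_modes(damaged)

# Shape-difference index at the first POD energy level:
# one minus the absolute cosine similarity of the corresponding unit modes.
index = 1.0 - abs(modes_p[0] @ modes_d[0])
print(f"first-level shape-difference index: {index:.3f}")
```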


2020
Vol 245
pp. 07044
Author(s):  
Frank Berghaus ◽  
Franco Brasolin ◽  
Alessandro Di Girolamo ◽  
Marcus Ebert ◽  
Colin Roy Leavett-Brown ◽  
...  

The Simulation at Point1 (Sim@P1) project was built in 2013 to take advantage of the ATLAS Trigger and Data Acquisition High Level Trigger (HLT) farm. The HLT farm provides around 100,000 cores, which are critical to ATLAS during data taking. When ATLAS is not recording data, such as during the long shutdowns of the LHC, this large compute resource is used to generate and process simulation data for the experiment. At the beginning of the second long shutdown of the LHC, the HLT farm, including the Sim@P1 infrastructure, was upgraded. Previous papers emphasised the need for simple, reliable, and efficient tools and assessed various options for quickly switching between data-acquisition operation and offline processing. In this contribution, we describe the new mechanisms put in place for the opportunistic exploitation of the HLT farm for offline processing and give results from the first months of operation.


2020
Vol 245
pp. 01031
Author(s):  
Thiago Rafael Fernandez Perez Tomei

The CMS experiment has been designed with a two-level trigger system: the Level-1 Trigger, implemented on custom-designed electronics, and the High Level Trigger, a streamlined version of the CMS offline reconstruction software running on a computer farm. During its second phase, the LHC will reach a luminosity of 7.5 × 10³⁴ cm⁻² s⁻¹ with a pileup of 200 collisions, producing an integrated luminosity greater than 3000 fb⁻¹ over the full experimental run. To fully exploit the higher luminosity, the CMS experiment will introduce a more advanced Level-1 Trigger and increase the full readout rate from 100 kHz to 750 kHz. CMS is designing an efficient data-processing hardware trigger that will include tracking information and high-granularity calorimeter information. The current Level-1 conceptual design is expected to take full advantage of advances in FPGA and link technologies over the coming years, providing a high-performance, low-latency system with large throughput and sophisticated data correlation across diverse sources. The higher luminosity, event complexity and input rate present an unprecedented challenge to the High Level Trigger, which aims to achieve a similar efficiency and rejection factor as today despite the higher pileup and purer preselection. In this presentation we will discuss the ongoing studies and prospects for the online reconstruction and selection algorithms for the high-luminosity era.
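As a back-of-the-envelope illustration of what the readout upgrade implies for the High Level Trigger input, the sketch below scales the Level-1 rates quoted in the text by an assumed average event size; the 2 MB figure is illustrative only and not a CMS specification.

```python
# Back-of-envelope sketch (rates from the text, assumed event size) of how the
# Level-1 readout upgrade from 100 kHz to 750 kHz scales the data volume the
# High Level Trigger farm must absorb.
EVENT_SIZE_MB = 2.0   # assumed average raw event size, illustrative only

def hlt_input_rate_gb_s(l1_rate_khz, event_size_mb=EVENT_SIZE_MB):
    """HLT input bandwidth in GB/s for a given Level-1 accept rate."""
    return l1_rate_khz * 1e3 * event_size_mb / 1e3   # kHz -> Hz, MB -> GB

for rate in (100, 750):
    print(f"L1 rate {rate} kHz -> HLT input ~ {hlt_input_rate_gb_s(rate):.0f} GB/s")
```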

