background events
Recently Published Documents


TOTAL DOCUMENTS: 86 (FIVE YEARS: 35)

H-INDEX: 10 (FIVE YEARS: 3)

2021 ◽  
Vol 16 (12) ◽  
pp. P12001
Author(s):  
D. Dobur ◽  
J. Knolle ◽  
G. Mestdach ◽  
K. Skovpen

Abstract Kinematic reconstruction of top quarks makes it possible to define a set of kinematic observables relevant to various physics processes that involve top quarks, and provides an additional handle for the suppression of background events. Radiation of photons in association with the top quarks alters the kinematics and the topology of the event, leading to visible systematic effects in measurable observables. The present study introduces an improved reconstruction of the top quark kinematics in the presence of photon radiation. Results are presented for top quark pair production as well as for singly produced top quarks.
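
The central ambiguity the abstract refers to is whether a photon belongs to the top quark's decay products or to the production stage. A minimal sketch of that assignment logic is shown below; the four-momenta, the leptonic decay channel, and the mass-proximity criterion are illustrative assumptions, not the authors' algorithm.

```python
import math

def invariant_mass(*p4s):
    """Invariant mass of a set of four-momenta given as (E, px, py, pz) in GeV."""
    E, px, py, pz = (sum(c) for c in zip(*p4s))
    return math.sqrt(max(E**2 - px**2 - py**2 - pz**2, 0.0))

# Illustrative four-momenta (GeV): a b-jet, leptonic W decay products, a photon.
b_jet    = (70.0, 30.0, 40.0, 45.0)
lepton   = (45.0, 20.0, -25.0, 30.0)
neutrino = (40.0, -15.0, 20.0, 28.0)
photon   = (25.0, 10.0, 15.0, 15.0)

# Hypothesis 1: photon radiated in the top decay (include it in the candidate).
m_with = invariant_mass(b_jet, lepton, neutrino, photon)
# Hypothesis 2: photon radiated in production (leave it out).
m_without = invariant_mass(b_jet, lepton, neutrino)

# Pick the hypothesis whose candidate mass is closer to the top quark mass.
M_TOP = 172.5
best = min((m_with, "decay radiation"), (m_without, "production radiation"),
           key=lambda pair: abs(pair[0] - M_TOP))
print(f"m(with gamma) = {m_with:.1f} GeV, m(without) = {m_without:.1f} GeV; "
      f"chosen hypothesis: {best[1]}")
```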


Author(s):  
José Manuel Clavijo Columbié ◽  
Paul Glaysher ◽  
Jenia Jitsev ◽  
Judith Maria Katzy

Abstract We apply adversarial domain adaptation to reduce sample bias in a classification machine learning algorithm. We add a gradient reversal layer to a neural network so that it simultaneously classifies signal versus background events while minimising the difference in the classifier response between the nominal background sample and one generated with an alternative MC model. We demonstrate this on simulated events at the LHC with $t\bar{t}H$ signal versus $t\bar{t}b\bar{b}$ background classification.
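
A gradient reversal layer is compact to implement: it is the identity in the forward pass and flips (and scales) the gradient in the backward pass. The PyTorch sketch below is a minimal illustration; the layer sizes, two-head layout, and the fixed reversal weight are assumptions, not the authors' exact architecture.

```python
import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; negates and scales gradients in backward."""
    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None

class DomainAdversarialNet(nn.Module):
    def __init__(self, n_features, lambd=1.0):
        super().__init__()
        self.lambd = lambd
        self.features = nn.Sequential(nn.Linear(n_features, 64), nn.ReLU())
        # Signal-vs-background head, trained normally.
        self.classifier = nn.Linear(64, 1)
        # MC-model discriminator head, fed through the gradient reversal layer
        # so the shared features learn to *confuse* it.
        self.domain = nn.Linear(64, 1)

    def forward(self, x):
        h = self.features(x)
        y_class = self.classifier(h)
        y_domain = self.domain(GradReverse.apply(h, self.lambd))
        return y_class, y_domain
```

In training, the classification loss is evaluated on labelled signal and background events, while the domain loss compares the two background MC samples; the reversed gradient pushes the shared features toward invariance under the choice of MC model.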


2021 ◽  
Vol 16 (11) ◽  
pp. C11008
Author(s):  
V.Y. Dik ◽  
V.A. Allakhverdyan ◽  
A.D. Avrorin ◽  
A.V. Avrorin ◽  
V.M. Aynutdinov ◽  
...  

Abstract Since September 2020, Baikal-GVD has followed up, in a fast quasi-online mode, the high-energy muon neutrino events of the IceCube telescope that are issued as neutrino alerts in one of two ranks of astrophysical-origin probability, "gold" and "bronze". A search for correlations between the alerts and GVD events, reconstructed in two modes, muon tracks and cascades (electromagnetic or hadronic showers), within time windows of ±1 h and ±12 h shows no statistically significant excess of measured events over the expected number of background events. Upper limits on the neutrino fluence are presented for each alert.
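
The counting statistics behind such a follow-up reduces to a Poisson test of the events in the time window against the expected background. The sketch below shows that logic only; the background rate and window length are made-up numbers, not Baikal-GVD values.

```python
import math

def poisson_pvalue(n_obs, mu_bkg):
    """P(N >= n_obs) for a Poisson background expectation mu_bkg."""
    cdf = sum(math.exp(-mu_bkg) * mu_bkg**k / math.factorial(k)
              for k in range(n_obs))
    return 1.0 - cdf

def upper_limit_zero_obs(cl=0.90):
    """Classical Poisson upper limit on the signal mean with zero observed
    events: mu such that exp(-mu) = 1 - CL."""
    return -math.log(1.0 - cl)

# Illustrative: background rate of 0.05 events/hour in a +/-1 h window.
mu_bkg = 0.05 * 2.0
print(f"p-value for 1 coincident event: {poisson_pvalue(1, mu_bkg):.3f}")
print(f"90% CL upper limit (0 observed): {upper_limit_zero_obs():.2f} events")
```

A limit on the event count is then converted to a fluence limit by dividing by the detector's effective area integrated over the assumed spectrum, which is experiment-specific and not sketched here.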


Author(s):  
Christian Grimm ◽  
Martin Käser ◽  
Sebastian Hainzl ◽  
Marco Pagani ◽  
Helmut Küchenhoff

ABSTRACT Earthquake sequences add a substantial hazard beyond the solely declustered perspective of common probabilistic seismic hazard analysis. A particularly strong driver of both social and economic losses is the so-called earthquake doublet (more generally, multiplet), that is, a sequence of two (or more) comparatively large events in spatial and temporal proximity. Without differentiating between foreshocks and aftershocks, we hypothesize three main factors influencing doublet occurrence: (1) the number of direct and secondary aftershocks triggered by an earthquake; (2) the occurrence of independent clusters and seismic background events in the same time–space window; and (3) the magnitude size distribution of triggered events (in contrast to independent events). We tested synthetic catalogs simulated by a standard epidemic-type aftershock sequence (ETAS) model for both Japan and southern California. Our findings show that the common ETAS approach significantly underestimates doublet frequencies compared with observations in historical catalogs. In addition, the simulated catalogs show smoother spatiotemporal clustering than their observed counterparts. Focusing on the impact on direct aftershock productivity and total cluster sizes, we propose two modifications of the ETAS spatial kernel to improve doublet rate predictions: (a) a restriction of the spatial function to a maximum distance of 2.5 estimated rupture lengths and (b) an anisotropic function whose contour lines are constructed as a box with two semicircular ends around the estimated rupture segment. These modifications shift the triggering potential from weaker to stronger events and consequently improve doublet rate predictions for larger events, although they still underestimate historical doublet occurrence rates. The results for the restricted spatial functions also better fulfill the empirical Båth's law for the maximum aftershock magnitude. The tested clustering properties of strong events are not sufficiently captured by typically used global catalog-scale measures, such as log-likelihood values, which would favor the conventional, unrestricted models.
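
Modification (a), the distance restriction, can be sketched as a truncated spatial kernel. The power-law kernel form, its parameters, and the magnitude-length scaling coefficients below are illustrative stand-ins (Wells-Coppersmith-type scaling is shown only as an example), not the values fitted in the paper.

```python
import numpy as np

def rupture_length_km(mag, a=-2.44, b=0.59):
    """Magnitude-length scaling log10(L) = a + b*M; coefficients illustrative."""
    return 10.0 ** (a + b * mag)

def sample_aftershock_distances(mag, n, d=1.0, q=1.5, rng=None):
    """Draw aftershock distances from an isotropic power-law ETAS-style kernel
    f(r) ~ (1 + r/d)^(-1-q), truncated at 2.5 estimated rupture lengths
    (modification (a) in the abstract)."""
    if rng is None:
        rng = np.random.default_rng()
    r_max = 2.5 * rupture_length_km(mag)
    # Inverse-CDF sampling of the untruncated kernel, then rejection at r_max.
    u = rng.uniform(size=4 * n)
    r = d * ((1.0 - u) ** (-1.0 / q) - 1.0)
    r = r[r <= r_max][:n]
    return r, r_max

dists, r_max = sample_aftershock_distances(mag=7.0, n=1000)
print(f"truncation radius: {r_max:.1f} km, max sampled: {dists.max():.1f} km")
```

Truncating the kernel removes far-field triggering probability mass, which (per the abstract) effectively shifts triggering potential toward the near field of stronger events.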


PLoS ONE ◽  
2021 ◽  
Vol 16 (8) ◽  
pp. e0250753 ◽  
Author(s):  
David Ross

Flow cytometry is commonly used to evaluate the performance of engineered bacteria. With increasing use of high-throughput experimental methods, there is a need for automated analysis methods for flow cytometry data. Here, we describe FlowGateNIST, a Python package for automated analysis of bacterial flow cytometry data. The main components of FlowGateNIST perform automatic gating to differentiate between cells and background events and then between singlet and multiplet events. FlowGateNIST also includes a method for automatic calibration of fluorescence signals using fluorescence calibration beads. FlowGateNIST is open source and freely available with tutorials and example data to facilitate adoption by users with minimal programming experience.
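
FlowGateNIST's own API is not reproduced here; the sketch below shows only the general idea of automated mixture-model gating (separating a cell population from background events in scatter space), using scikit-learn as a stand-in and a densest-component heuristic that is an assumption of this example.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def gate_cells(fsc, ssc, n_components=2):
    """Cluster events in log-scatter space and keep the most compact
    component; a generic stand-in for cell-vs-background gating."""
    X = np.log10(np.column_stack([fsc, ssc]) + 1.0)
    gmm = GaussianMixture(n_components=n_components, random_state=0).fit(X)
    labels = gmm.predict(X)
    # Treat the component with the smallest covariance determinant
    # (the tightest cluster) as the cell population.
    dets = [np.linalg.det(c) for c in gmm.covariances_]
    cell_label = int(np.argmin(dets))
    return labels == cell_label

# Synthetic example: a tight "cell" cluster plus diffuse background.
rng = np.random.default_rng(0)
cells = rng.lognormal(mean=8.0, sigma=0.2, size=(5000, 2))
bkg = rng.lognormal(mean=6.0, sigma=1.0, size=(2000, 2))
data = np.vstack([cells, bkg])
mask = gate_cells(data[:, 0], data[:, 1])
print(f"{mask.sum()} of {len(mask)} events gated as cells")
```

Singlet-vs-multiplet gating follows the same pattern on a different projection (for example, signal height versus area), which is why a mixture-model approach extends naturally to both steps.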


Author(s):  
Md. Shahinur Rahman ◽  
Wayne D. Hutchison ◽  
Lindsey Bignell ◽  
Gregory Lane ◽  
Lei Wang ◽  
...  

Abstract The SABRE (Sodium-iodide with Active Background Rejection) experiment consists of 50 kg of ultrapure NaI(Tl) crystal contained within a 10.5 ton liquid scintillator (LS) veto detector, and will search for dark matter interactions in the inner NaI(Tl) detector. The relative scintillation light yield of NaI(Tl) is not constant across incident particle energies and is important for characterizing the detector response. It was measured in two different NaI(Tl) scintillators with a 10 µCi 137Cs radioactive source using the Compton coincidence technique (CCT) at scattering angles of 30° to 135°, corresponding to electron energies from 60 to 500 keVee, and the measurements are compared with previously published results. The light yield was proportional to within 3.5% at energies between 60 and 500 keVee, but the non-proportionality increased drastically below 60 keVee, which might be due to non-uniform ionization density and multiple-Compton-scattering background events in the scintillator. An improved experimental setup with ultrapure NaI(Tl) scintillator and proper coincidence timing of radioactive events could allow scintillation light yield measurements at lower electron recoil energies. The obtained light-yield non-proportionality results will be useful for the SABRE dark matter detector experiment.
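
The mapping from scattering angle to electron energy follows directly from Compton kinematics; the snippet below just evaluates the standard formula for the 661.7 keV 137Cs line, which is why the quoted angular range covers electron energies of order 100-450 keV.

```python
import math

M_E = 511.0      # electron rest energy, keV
E_GAMMA = 661.7  # 137Cs gamma line, keV

def electron_recoil_energy(theta_deg, e_gamma=E_GAMMA):
    """Energy transferred to the electron in Compton scattering at angle theta:
    T = E - E', with E' = E / (1 + (E / m_e c^2) * (1 - cos(theta)))."""
    one_minus_cos = 1.0 - math.cos(math.radians(theta_deg))
    e_scattered = e_gamma / (1.0 + (e_gamma / M_E) * one_minus_cos)
    return e_gamma - e_scattered

for theta in (30, 60, 90, 135):
    print(f"theta = {theta:3d} deg -> T_e = "
          f"{electron_recoil_energy(theta):5.1f} keV")
```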


2021 ◽  
Vol 2021 (4) ◽  
Author(s):  
S. H. Seo ◽  
Y. D. Kim

Abstract Dark photons are well-motivated hypothetical dark-sector particles that could account for observations that cannot be explained by the standard model of particle physics. A search is proposed for dark photons that are produced by an electron beam striking a thick tungsten target and subsequently interact in a 3 kiloton-scale neutrino detector at Yemilab, a new underground laboratory in Korea. Dark photons can be produced by "darkstrahlung" or by oscillation of ordinary photons created in the target, and detected via their visible decays, via absorption, or via oscillation back to ordinary photons. By detecting the absorption process or the oscillation-produced photons, a world-leading sensitivity to the dark-photon kinetic mixing parameter of $\epsilon^2 > 1.5 \times 10^{-13}$ ($6.1 \times 10^{-13}$) at the 95% confidence level (C.L.) could be obtained for dark-photon masses between 80 eV and 1 MeV in a year-long exposure to a 100 MeV, 100 kW electron beam with zero ($10^3$) background events. In parallel, the detection of $e^+e^-$ pairs from decays of dark photons with masses between 1 MeV and ∼86 MeV would reach sensitivities of $\epsilon^2 > \mathcal{O}(10^{-17})$ ($\mathcal{O}(10^{-16})$) at the 95% C.L. with zero ($10^3$) background events, comparable to the sensitivity of the Super-K experiment under the same zero-background assumption.
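
The role of the background count in such quoted sensitivities follows standard Poisson counting logic, sketched below. The yield normalisation N0 is a made-up placeholder, and the paper's own statistical treatment may differ; only the scaling argument is generic.

```python
import math

def n_signal_95(n_bkg):
    """Approximate signal events needed for a 95% CL limit: ~3 events for a
    background-free search, ~1.64*sqrt(B) in the Gaussian regime for large B."""
    return 3.0 if n_bkg == 0 else 1.64 * math.sqrt(n_bkg)

def epsilon2_sensitivity(n_events_at_eps2_one, n_bkg=0):
    """The signal yield scales as eps^2 * N0, so the reachable kinetic-mixing
    sensitivity is eps^2 = N_95 / N0. N0 is a placeholder normalisation."""
    return n_signal_95(n_bkg) / n_events_at_eps2_one

# Placeholder: suppose a year of beam gives N0 = 1e13 events at eps^2 = 1.
for bkg in (0, 1000):
    print(f"B = {bkg:4d}: sensitivity eps^2 ~ "
          f"{epsilon2_sensitivity(1e13, bkg):.1e}")
```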


2021 ◽  
Author(s):  
Ilya Zaliapin ◽  
Yehuda Ben-Zion

We present results aimed at understanding the preparation processes of large earthquakes by tracking the progressive localization of earthquake deformation with three complementary analyses: (i) estimated production of rock damage by background events, (ii) spatial localization of background seismicity within damaged areas, and (iii) progressive coalescence of individual earthquakes into clusters. Techniques (i) and (ii) employ declustered catalogs to avoid the occasional strong fluctuations associated with aftershock sequences, while technique (iii) examines developing clusters in the entire catalog data. The different techniques provide information on different time scales and on the spatial extent of weakened damaged regions. The analyses reveal generation of earthquake-induced rock damage on a decadal timescale around eventual rupture zones, and progressive localization of background seismicity on a 2-3 yr timescale before several M > 7 earthquakes in southern and Baja California and M7.9 events in Alaska. This is followed by coalescence of earthquakes into growing clusters that precede the mainshocks. Corresponding analysis around the 2004 M6 Parkfield earthquake in the creeping section of the San Andreas fault shows tendencies contrasting with those associated with the large seismogenic faults. The results are consistent with observations from laboratory experiments and physics-based models with heterogeneous materials not dominated by a pre-existing failure zone. Continuing studies with these techniques, combined with analysis of geodetic data and insights from laboratory experiments and model simulations, may allow the development of an integrated multi-signal procedure to estimate the approaching time and size of large earthquakes.
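
Declustered catalogs of the kind used in analyses (i) and (ii) are commonly built with the authors' nearest-neighbor proximity metric (Zaliapin and Ben-Zion, 2013). A minimal sketch follows; the b-value, fractal dimension, and distance floor are illustrative, since they are fitted per catalog.

```python
import numpy as np

def nearest_neighbor_proximity(t, x, y, m, b=1.0, d_f=1.6):
    """Nearest-neighbor proximity eta_ij = t_ij * r_ij**d_f * 10**(-b * m_i)
    between each event j and its closest earlier 'parent' i; large eta marks
    background events. Assumes events sorted in time; t in years, (x, y) in
    km, m magnitudes. Parameter values are illustrative."""
    n = len(t)
    eta = np.full(n, np.inf)
    for j in range(1, n):
        dt = t[j] - t[:j]                         # inter-event times, > 0
        r = np.hypot(x[j] - x[:j], y[j] - y[:j])
        # Small distance floor avoids zero proximity for co-located events.
        prox = dt * np.maximum(r, 0.1) ** d_f * 10.0 ** (-b * m[:j])
        eta[j] = prox.min()
    return eta

# Tiny synthetic demo: the third event is close in time/space to the second,
# so it gets a very small proximity (clustered); the second does not.
t = np.array([0.0, 1.0, 1.001])
x = np.array([0.0, 50.0, 50.5])
y = np.zeros(3)
m = np.array([5.0, 6.0, 3.0])
print(nearest_neighbor_proximity(t, x, y, m))
```

In practice a threshold separating the bimodal eta distribution splits the catalog into background and triggered events; only the background part feeds the damage-production and localization analyses.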

