background rejection
Recently Published Documents


TOTAL DOCUMENTS

150
(FIVE YEARS 33)

H-INDEX

19
(FIVE YEARS 1)

2021 ◽  
Vol 81 (12) ◽  
Author(s):  
L. Olivera-Nieto ◽  
A. M. W. Mitchell ◽  
K. Bernlöhr ◽  
J. A. Hinton

Abstract: The presence of muons in air showers initiated by cosmic-ray protons and nuclei is well established as a powerful tool to separate such showers from those initiated by gamma rays. However, so far this approach has been fully exploited only for ground-level particle-detecting arrays. We explore the feasibility of using Cherenkov light from muons as a background rejection tool for imaging atmospheric Cherenkov telescope arrays at the highest energies. We adopt an analytical model of the Cherenkov light from individual muons to allow rapid simulation of a large number of showers in a hybrid mode. This allows us to explore the very high background rejection power regime at acceptable cost in terms of computing time. We show that for very large (≳20 m mirror diameter) telescopes, efficient identification of muon light can potentially lead to background rejection levels up to 10⁻⁵ whilst retaining high efficiency for gamma rays. While many challenges remain in the effective exploitation of muon Cherenkov light in the data analysis for imaging Cherenkov telescope arrays, our study indicates that for arrays containing at least one large telescope, this is a very worthwhile endeavor.
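
As a back-of-the-envelope illustration of how per-muon identification compounds into a strong rejection (a toy model, not the authors' analytical simulation; the mean muon count and per-muon identification probability below are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)
n_showers = 1_000_000
mean_muons = 20.0   # assumed mean number of detectable muons per proton shower
p_detect = 0.6      # assumed probability of identifying each muon's Cherenkov light

# Number of muons in each simulated proton shower
n_mu = rng.poisson(mean_muons, n_showers)

# A proton shower survives the cut only if none of its muons is identified;
# analytically E[(1 - p)^N] = exp(-mean * p) ~ 6e-6 for these assumed numbers.
p_survive = (1.0 - p_detect) ** n_mu
rejection = p_survive.mean()
print(f"proton survival fraction ~ {rejection:.1e}")
```

Gamma-ray showers produce essentially no muons, so they pass such a cut almost unaffected, which is why rejection levels of order 10⁻⁵ can coexist with high gamma-ray efficiency in this simple picture.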


2021 ◽  
Vol 7 (12) ◽  
pp. 253
Author(s):  
Luigi Cimmino

Radiographic imaging with muons, also called muography, is based on measuring the absorption in matter of muons generated by the interaction of cosmic rays with the Earth's atmosphere. Muons are elementary particles with high penetrating power, a characteristic that makes them capable of crossing bodies hundreds of meters across. The interior of bodies the size of a pyramid or a volcano can be imaged directly with this technique, which relies on highly segmented muon trackers. Since the muon flux is distributed in energy over a wide spectrum that depends on the direction of incidence, the main difference from X-ray radiography lies in the source: the muon source is not tunable, in energy or in direction. To improve the signal-to-noise ratio, muography therefore requires large instrumentation, long data-acquisition times, and high background rejection capability. Here we present the principles of muography, illustrating how radiographic images can be obtained starting from the measurement of the attenuation of the muon flux through an object. We then discuss how recent artificial-intelligence technologies can advance this methodology and improve its results.
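
The attenuation-to-image step can be sketched in a few lines: for each angular bin, the transmission is the ratio of muon counts through the target to the free-sky counts, and its negative logarithm is a proxy for the opacity (integrated density) along that line of sight. All counts below are made up for illustration; a real analysis would also fold in the direction-dependent energy spectrum:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy muon counts per angular bin: a free-sky calibration run and a run
# through a target with a dense core (transmission pattern is invented).
free_sky = rng.poisson(1000.0, size=(4, 4)).astype(float)
target = rng.poisson(1000.0 * np.array([[1.0, 0.8, 0.8, 1.0],
                                        [0.8, 0.3, 0.3, 0.8],
                                        [0.8, 0.3, 0.3, 0.8],
                                        [1.0, 0.8, 0.8, 1.0]])).astype(float)

transmission = target / free_sky   # fraction of muons surviving the material
opacity = -np.log(transmission)    # proxy for integrated density per line of sight
print(np.round(opacity, 2))
```

The dense central bins show up as high-opacity pixels, which is exactly the "radiographic image" the abstract describes.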


Universe ◽  
2021 ◽  
Vol 7 (11) ◽  
pp. 421
Author(s):  
Mathieu de Naurois

Thirty years after the discovery of the first very-high-energy γ-ray source by the Whipple telescope, the field experienced a revolution mainly driven by the third generation of imaging atmospheric Cherenkov telescopes (IACTs). The combination of large mirrors with the imaging technique invented at the Whipple telescope, stereoscopic observations developed by the HEGRA array, and the fine-grained camera pioneered by the CAT telescope led to a more than tenfold jump in sensitivity. The advent of advanced analysis techniques brought vast improvements in background rejection as well as in angular and energy resolution. Recent instruments already have to deal with very large amounts of data (petabytes) containing many sources that are often very extended (at least within the Galactic plane) and overlap one another, and the situation will become even more dramatic with future instruments. The first large catalogues of sources emerged during the last decade; they required numerous dedicated observations and developments, but also made the first population studies possible. This paper attempts to summarize the evolution of the field towards the building of these source catalogues, to describe the first population studies they have enabled, and to offer some perspectives in the context of the upcoming new generation of instruments.


2021 ◽  
Vol 1 (4) ◽  
Author(s):  
Gihan Ahmed Sobhy ◽  
◽  
Hazem Mohamed Zakaria ◽  
Haidy Mohammed Zakaria ◽  
◽  
...  

Background: Rejection is an important adverse event after pediatric liver transplantation (LT). Aim: We aimed to study the incidence of and risk factors for post-transplant rejection in pediatric patients. Methods: The study included 40 pediatric patients who underwent LT. All patients' records were reviewed, and a wide range of potential risk factors for rejection were recorded. Results: Rejection occurred in 13/40 (32.5%) of recipients. In these 13 patients, a total of 24 rejection attacks occurred, 25% of which occurred during the first month post-LT. Acute rejection accounted for 54% of the rejection attacks, while chronic rejection accounted for 46%. LT for biliary atresia (BA) was a significant risk factor for rejection. During the rejection attacks, mean transaminase levels were 268 ± 141 IU/L (AST) and 221 ± 119 IU/L (ALT), biliary enzymes were 962 ± 687 IU/L (ALKP) and 485 ± 347 IU/L (GGT), total bilirubin was 6.5 ± 7.1 mg/dl, and FK trough levels were 10.4 ± 5.6 ng/ml. Chronic rejection contributed to the death of only one of the cases. Conclusion: BA was a significant risk factor for rejection. Elevated transaminases and biliary enzymes, but not the FK trough level, are alarming signs of rejection. Keywords: liver transplantation; pediatrics; rejection.


2021 ◽  
Vol 7 (4) ◽  
pp. 188-193
Author(s):  
Dr. Vani Krishnamurthy ◽  
◽  
Rubiya Ahmad ◽  

Background: Rejection of hemolysed samples for coagulation tests is the standard practice. However, when clinicians deal with extremely sick patients for whom repeat sampling is difficult to obtain, rejection of the sample is a lost opportunity for the lab physician to assist in patient care. Proceeding with the test and providing a clinically helpful interpretation of the results will ensure the active participation of the laboratory physician. Different principles of coagulation testing handle hemolysed samples differently. It is essential to know the best principle with which to proceed with a hemolysed sample if need be. This study set out to estimate the predictive values of post-hemolysis coagulation test results with various coagulation test principles. Methods: This is a prospective experimental study in which non-hemolysed samples were processed for coagulation tests. Part of each sample was deliberately hemolysed, and the coagulation tests were repeated. Results: Two hundred and forty-eight samples were studied. A median of 11% hemolysis was achieved experimentally. The mean difference in prothrombin time between pre- and post-hemolysis samples with normal PT was 0.9 seconds; with abnormal PT, it was 1.1 seconds. The same differences for APTT were 4.9 and 1.1 seconds, respectively. The majority of the samples showed prolonged coagulation post-hemolysis. Positive (PPV) and negative (NPV) predictive values for prothrombin time were 97.3% and 73.4%, respectively. Similarly, PPV and NPV for APTT were 97.4% and 47.1%, respectively. Conclusions: Samples with normal values after hemolysis are more likely to be truly normal.
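
The predictive values quoted above follow from the standard 2×2 definitions; the sketch below uses hypothetical counts chosen to roughly reproduce the reported PT figures, not the study's actual data:

```python
# Hypothetical 2x2 outcome counts: post-hemolysis result (test) versus
# pre-hemolysis result (truth). Illustrative numbers only.
tp, fp = 110, 3   # flagged abnormal after hemolysis: truly abnormal / truly normal
fn, tn = 36, 99   # normal after hemolysis:           truly abnormal / truly normal

ppv = tp / (tp + fp)   # P(truly abnormal | abnormal post-hemolysis)
npv = tn / (tn + fn)   # P(truly normal   | normal post-hemolysis)
print(f"PPV = {ppv:.1%}, NPV = {npv:.1%}")
```

The high PPV and modest NPV pattern matches the study's conclusion: an abnormal post-hemolysis result is trustworthy, while a "normal" result still misses a fair share of truly abnormal samples.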


2021 ◽  
Author(s):  
Laura Olivera-Nieto ◽  
Alison Mitchell ◽  
Konrad Bernlöhr ◽  
James Hinton

2021 ◽  
Vol 2021 (7) ◽  
Author(s):  
◽  
A. Simón ◽  
Y. Ifergan ◽  
A. B. Redwine ◽  
R. Weiss-Babai ◽  
...  

Abstract: Next-generation neutrinoless double beta decay experiments aim for half-life sensitivities of ∼10²⁷ yr, requiring backgrounds to be suppressed to < 1 count/tonne/yr. For this, any extra background rejection handle, beyond excellent energy resolution and the use of extremely radiopure materials, is of utmost importance. The NEXT experiment exploits differences in the spatial ionization patterns of double beta decay and single-electron events to discriminate signal from background. While the former display two dense ionization regions (Bragg peaks) at the opposite ends of the track, the latter typically have only one such feature. Thus, comparing the energies at the track extremes provides an additional rejection tool. The unique combination of topology-based background discrimination and excellent energy resolution (1% FWHM at the Q-value of the decay) is the distinguishing feature of NEXT. Previous studies demonstrated a topological background rejection factor of ∼5 when reconstructing electron-positron pairs in the ²⁰⁸Tl 1.6 MeV double escape peak (with Compton events as background), recorded in the NEXT-White demonstrator at the Laboratorio Subterráneo de Canfranc, with 72% signal efficiency. This was recently improved through the use of a deep convolutional neural network to yield a background rejection factor of ∼10 with 65% signal efficiency. Here, we present a new reconstruction method, based on the Richardson-Lucy deconvolution algorithm, which allows reversing the blurring induced by electron diffusion and electroluminescence light production in the NEXT TPC. The new method yields highly refined 3D images of reconstructed events and, as a result, significantly improves the topological background discrimination. When applied to real-data 1.6 MeV e−e+ pairs, it leads to a background rejection factor of 27 at 57% signal efficiency.
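
The core of Richardson-Lucy deconvolution is an iterative multiplicative update. The 1-D sketch below (plain NumPy, with an invented Gaussian blur standing in for diffusion; this is not the NEXT reconstruction code) shows how the algorithm re-sharpens blurred energy deposits:

```python
import numpy as np

def richardson_lucy(blurred, psf, iterations=50):
    """Basic 1-D Richardson-Lucy deconvolution (illustrative sketch)."""
    psf = psf / psf.sum()
    psf_mirror = psf[::-1]
    # Flat, positive initial estimate
    estimate = np.full_like(blurred, blurred.mean())
    for _ in range(iterations):
        reblurred = np.convolve(estimate, psf, mode="same")
        ratio = blurred / np.maximum(reblurred, 1e-12)  # avoid divide-by-zero
        estimate *= np.convolve(ratio, psf_mirror, mode="same")
    return estimate

# Two sharp "energy deposits" blurred by a Gaussian kernel mimicking diffusion
truth = np.zeros(100)
truth[30] = 1.0
truth[70] = 0.6
kernel = np.exp(-0.5 * (np.arange(-10, 11) / 4.0) ** 2)
blurred = np.convolve(truth, kernel / kernel.sum(), mode="same")

restored = richardson_lucy(blurred, kernel, iterations=200)
print(int(restored.argmax()))  # sharpest peak should sit near the deposit at x = 30
```

The multiplicative update preserves positivity and progressively concentrates intensity back at the true deposit positions, which is the property that sharpens the Bragg-peak "blobs" used for the topological cut.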


2021 ◽  
Vol 15 (6) ◽  
Author(s):  
P. Abratenko ◽  
M. Alrashed ◽  
R. An ◽  
J. Anthony ◽  
J. Asaadi ◽  
...  

Author(s):  
Md. Shahinur Rahman ◽  
Wayne D. Hutchison ◽  
Lindsey Bignell ◽  
Gregory Lane ◽  
Lei Wang ◽  
...  

Abstract: The SABRE (Sodium-iodide with Active Background Rejection) experiment consists of 50 kg of ultrapure NaI(Tl) crystal contained within a 10.5 ton liquid scintillator (LS) veto detector, and will search for dark matter interactions in the inner NaI(Tl) detector. The relative scintillation light yield in NaI(Tl) scintillator for different incident particle energies is not constant and is important for characterizing the detector response. The relative scintillation light yield in two different NaI(Tl) scintillators was measured with a 10 µCi ¹³⁷Cs radioactive source using the Compton coincidence technique (CCT) at scattering angles of 30°–135°, corresponding to electron energies ranging from 60 to 500 keVee, and these measurements are compared to previously published results. The light yield was proportional to within 3.5% at energies between 60 and 500 keVee, but the non-proportionality increases drastically below 60 keVee, which might be due to non-uniform ionization density and multiple-Compton-scattering background events in the scintillator. An improved experimental setup with ultrapure NaI(Tl) scintillator and proper coincidence timing of radioactive events could allow scintillation light yield measurements at lower electron recoil energies. The obtained light yield non-proportionality results will be useful for the SABRE dark matter detector experiment.
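
In the Compton coincidence technique, the tagged scattering angle fixes the electron recoil energy through Compton kinematics, so the quoted 60–500 keVee window follows from the 30°–135° angular range for the 662 keV ¹³⁷Cs line. A quick check of the standard formula:

```python
import math

E_GAMMA = 661.7   # keV, 137Cs gamma line
ME_C2 = 511.0     # keV, electron rest energy

def electron_energy(theta_deg):
    """Recoil-electron energy for a photon Compton-scattered at angle theta."""
    theta = math.radians(theta_deg)
    scattered = E_GAMMA / (1 + (E_GAMMA / ME_C2) * (1 - math.cos(theta)))
    return E_GAMMA - scattered

for angle in (30, 60, 90, 135):
    print(f"{angle:>3} deg -> {electron_energy(angle):6.1f} keV")
```

The 30° and 135° endpoints give roughly 100 keV and 460 keV, consistent with the electron-energy range the measurement covers.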


2021 ◽  
Author(s):  
Sheng Xiao ◽  
Eric Lowet ◽  
Howard Gritton ◽  
Pierre Fabris ◽  
Yangyang Wang ◽  
...  

Recent improvements in genetically encoded voltage indicators have enabled high-precision imaging of single neurons' action potentials and subthreshold membrane voltage dynamics in the mammalian brain. For high-speed voltage imaging, widefield microscopy remains an essential tool to record activity from many neurons simultaneously over a large anatomical area. However, the lack of optical sectioning makes widefield microscopy prone to background signal contamination. We implemented a simple, low-cost, targeted illumination strategy based on a digital micromirror device (DMD) to restrict illumination to the cells of interest and thereby improve background rejection, and quantified the optical voltage signal improvement in neurons expressing the fully genetically encoded voltage indicator SomArchon. We found that targeted illumination, in comparison to widefield illumination, increased SomArchon signal contrast and reduced background cross-contamination in the brains of awake mice. This improvement permitted a reduction in illumination intensity, which reduced fluorescence photobleaching and prolonged imaging duration. When coupled with a high-speed sCMOS camera, we routinely imaged tens of spiking neurons simultaneously over several minutes in the brain. Thus, the DMD-based targeted illumination strategy described here offers a simple solution for high-speed voltage imaging of large-scale networks at the millisecond timescale with single-cell resolution in the brains of behaving animals.
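
A targeted-illumination pattern of this kind can be sketched as a binary DMD mask built from cell positions; the centroids, sensor size, and disc radius below are hypothetical, not the paper's parameters:

```python
import numpy as np

# Hypothetical cell centroids (row, col in pixels) found in a reference image
cells = [(40, 55), (120, 80), (200, 150)]
H, W = 256, 256   # assumed mask resolution
radius = 12       # assumed illumination-disc radius around each cell, in pixels

yy, xx = np.mgrid[0:H, 0:W]
mask = np.zeros((H, W), dtype=bool)
for cy, cx in cells:
    # Turn on a small disc of mirrors around each cell of interest
    mask |= (yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2

# Only a small fraction of the field is lit; background fluorescence from
# everywhere else is never excited, which is the source of the rejection gain.
print(f"illuminated fraction: {mask.mean():.3f}")
```

Keeping the illuminated fraction small is what allows lower total excitation power, and hence the reduced photobleaching reported above.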

