full simulation
Recently Published Documents

TOTAL DOCUMENTS: 76 (five years: 17)
H-INDEX: 13 (five years: 1)

2021 · Author(s): Ilona Bass, Kevin Smith, Elizabeth Bonawitz, Tomer David Ullman

People can reason intuitively, efficiently, and accurately about everyday physical events. Recent accounts suggest that people use mental simulation to make such intuitive physical judgments. But mental simulation models are computationally expensive; how does physical reasoning stay relatively accurate while remaining computationally tractable? We suggest that people make use of partial simulation, mentally moving forward in time only those parts of the world deemed relevant. We propose a novel partial simulation model and test it on the physical conjunction fallacy, a recently observed phenomenon (Ludwin-Peery, Bramley, Davis, & Gureckis, 2020) that poses a challenge for full simulation models. We find an excellent fit between our model's predictions and human performance on a set of scenarios that build on and extend those used by Ludwin-Peery et al. (2020), quantitatively and qualitatively accounting for a deviation from optimal performance. More generally, our results suggest how people allocate cognitive resources to efficiently represent and simulate physical scenes.
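
The core idea lends itself to a compact illustration. Below is a minimal, hypothetical sketch of partial simulation (not the authors' actual model): only objects judged relevant to the query are stepped forward in time, while the rest of the scene is left untouched.

```python
import random

# Minimal sketch of partial simulation: only objects deemed relevant to the
# query are stepped forward in time; everything else is held static.
# All names and dynamics here are illustrative, not the authors' model.

GRAVITY = -9.8
DT = 0.1

def step(obj):
    """Advance one object by a single (noisy) physics step."""
    obj["vy"] += GRAVITY * DT
    obj["y"] += obj["vy"] * DT + random.gauss(0, 0.01)  # perceptual/dynamics noise
    obj["y"] = max(obj["y"], 0.0)                        # crude floor collision

def partial_simulate(world, relevant, steps):
    """Simulate forward only the subset of objects deemed relevant."""
    for _ in range(steps):
        for name in relevant:
            step(world[name])
    return world

world = {
    "ball":  {"y": 2.0, "vy": 0.0},
    "block": {"y": 5.0, "vy": 0.0},  # irrelevant to the query: never simulated
}
# The query only concerns the ball, so the block is left untouched,
# saving the cost of stepping the full scene.
result = partial_simulate(world, relevant={"ball"}, steps=20)
print(round(result["ball"]["y"], 2), result["block"]["y"])
```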


Author(s): Massimiliano Antonello, Massimo Caccia, Romualdo Santoro, Roberto Ferrari, Gabriella Gaudio, ...

Dual-readout calorimetry is now a mature and well-known technology which guarantees excellent electromagnetic and hadronic resolution in the same detector. It has recently been proposed in the framework of IDEA (Innovative Detector for Electron–Positron Accelerators) for both the Future Circular Collider (FCC-ee) and the Circular Electron–Positron Collider (CEPC). After extensive tests on prototypes, dual-readout calorimetry is now moving toward a technology design study in order to be realistically available for an experiment. In this context, a full simulation of the calorimeter has been developed and used to estimate the expected performance of the detector. At the same time, the development of a novel, cost-effective technique for mass production of the detector modules is ongoing. As a first step, an electromagnetic-size prototype is under construction for a test-beam data taking originally foreseen for November 2020 and moved to spring 2021 due to the Covid-19 pandemic.
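
For context, the dual-readout technique combines the scintillation signal S and the Cherenkov signal C to cancel event-by-event fluctuations of the electromagnetic shower fraction, which dominate the hadronic resolution. The sketch below shows the standard combination formula; the h/e response values are illustrative placeholders, not IDEA's measured constants.

```python
# Standard dual-readout combination (DREAM/RD52-style), not the IDEA
# collaboration's actual code. S and C are the scintillation and Cherenkov
# signals, both calibrated with electrons; (h/e)_S and (h/e)_C are
# detector-specific constants (illustrative values below).

def dual_readout_energy(S, C, he_S=0.7, he_C=0.2):
    """Combine scintillation and Cherenkov signals into a corrected energy.

    chi cancels the event-by-event fluctuations of the electromagnetic
    fraction f_em, independently of its value.
    """
    chi = (1.0 - he_S) / (1.0 - he_C)
    return (S - chi * C) / (1.0 - chi)

# Example: a 100 GeV hadron with electromagnetic fraction f_em = 0.5
f_em, E = 0.5, 100.0
S = E * (f_em + 0.7 * (1.0 - f_em))   # scintillation response
C = E * (f_em + 0.2 * (1.0 - f_em))   # Cherenkov response
print(dual_readout_energy(S, C))      # recovers 100 GeV for any f_em
```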


Author(s): Anna Schroder, Tim Lawrence, Natalie Voets, Daniel Garcia-Gonzalez, Mike Jones, ...

Resting-state functional magnetic resonance imaging (rsfMRI), and the underlying brain networks identified with it, have recently emerged as a promising avenue for the evaluation of functional deficits without the need for active patient participation. We hypothesize here that such alterations can be inferred from tissue damage within the network. From an engineering perspective, the numerical prediction of tissue mechanical damage following an impact remains computationally expensive. We therefore propose a numerical framework aimed at predicting resting-state network disruption for an arbitrary head impact, as described by the head velocity, location and angle of impact, and impactor shape. The proposed method uses a library of precalculated cases leveraged by a machine learning layer for efficient and rapid prediction. The accuracy of the machine learning layer is illustrated with a dummy fall case, in which the machine learning prediction is shown to closely match the full simulation results. The resulting framework is then tested against the rsfMRI data of nine traumatic brain injury (TBI) patients scanned within 24 h of injury, for whom paramedical information was used to reconstruct the accident in silico. While more clinical data are required for full validation, this approach opens the door to (i) on-the-fly prediction of rsfMRI alterations, readily measurable on clinical premises from paramedical data, and (ii) reverse-engineered accident reconstruction from rsfMRI measurements.
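
A minimal sketch of the library-plus-learning idea follows, assuming (hypothetically) an inverse-distance-weighted nearest-neighbour interpolation over the precalculated cases; the features, scores, and regressor choice are illustrative stand-ins for the paper's actual machine learning layer.

```python
import numpy as np

# Hypothetical "library + machine learning layer": full simulations are
# precomputed over a grid of impact parameters, and a new impact is answered
# by interpolating over the nearest stored cases.

library_params = np.array([      # [velocity (m/s), impact angle (deg)]
    [3.0, 10.0],
    [5.0, 45.0],
    [7.0, 80.0],
])
library_disruption = np.array([  # precomputed disruption score per network
    [0.05, 0.01],
    [0.30, 0.12],
    [0.65, 0.40],
])

def predict_disruption(params, k=2):
    """Interpolate network-disruption scores from the k closest library cases."""
    d = np.linalg.norm(library_params - params, axis=1)
    nearest = np.argsort(d)[:k]
    w = 1.0 / (d[nearest] + 1e-9)                 # inverse-distance weights
    return (w[:, None] * library_disruption[nearest]).sum(axis=0) / w.sum()

# Query an unseen impact: 4 m/s at 30 degrees
print(predict_disruption(np.array([4.0, 30.0])))
```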


2021 · Vol 251 · pp. 03022 · Author(s): Stefano Carrazza, Juan Cruz-Martinez, Marco Rossi, Marco Zaro

In these proceedings we present MadFlow, a new framework for the automation of Monte Carlo (MC) simulation on graphics processing units (GPU) for particle physics processes. In order to automate MC simulation for a generic number of processes, we design a program that lets the user simulate custom processes through the MadGraph5_aMC@NLO framework. The pipeline includes a first stage where the analytic expressions for matrix elements and phase space are generated and exported in a GPU-like format. The simulation is then performed using the VegasFlow and PDFFlow libraries, which automatically deploy the full simulation on systems with different hardware acceleration capabilities, such as multi-threaded CPU, single-GPU, and multi-GPU setups. We show some preliminary results for leading-order simulations on different hardware configurations.
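
To make the pipeline concrete, here is a minimal flat Monte Carlo integration sketch in plain NumPy; it illustrates the kind of computation that VegasFlow deploys across CPUs and GPUs, with a toy integrand standing in for the exported matrix elements.

```python
import numpy as np

# Minimal flat Monte Carlo integration, illustrating what the
# VegasFlow/PDFFlow pipeline automates and accelerates on GPUs.
# The toy "matrix element" below is purely illustrative.

rng = np.random.default_rng(42)

def toy_matrix_element(x):
    """Stand-in for |M|^2 evaluated over unit-hypercube phase-space points."""
    return np.prod(np.sin(np.pi * x), axis=1)

def mc_integrate(f, n_dim, n_events):
    x = rng.random((n_events, n_dim))        # phase-space sample
    w = f(x)                                 # event weights
    integral = w.mean()
    error = w.std(ddof=1) / np.sqrt(n_events)
    return integral, error

val, err = mc_integrate(toy_matrix_element, n_dim=4, n_events=1_000_000)
print(f"I = {val:.5f} +- {err:.5f}")   # exact value: (2/pi)^4 ~ 0.16423
```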


2021 · Vol 251 · pp. 03016 · Author(s): Vladimir Ivanchenko, Sunanda Banerjee, Gabrielle Hugo, Sergio Lo Meo, Ianna Osborne, ...

We report the status of the CMS full simulation for Run 3. During the long shutdown of the LHC, a significant update was introduced into the CMS simulation code. The CMS geometry description has been reviewed and several important modifications were needed; the detector description software has been migrated to DD4hep, a community-developed tool, and we report on the experience gained during this migration. Geant4 10.7 is the CMS choice for Run 3 simulation productions; we discuss the arguments for this choice and the strategy for adopting a new Geant4 version, and report on the physics performance of the CMS simulation. A special Geant4 physics-list configuration, FTFP_BERT_EMM, is described, which provides a compromise between simulation accuracy and CPU performance. A significant fraction of the time to simulate CMS events is spent tracking charged particles in a magnetic field; the CMS simulation therefore implements a dynamic choice of Geant4 parameters for tracking in field. A new method is introduced for simulating the electromagnetic components of hadronic showers in the CMS electromagnetic calorimeter: for low-energy electrons and positrons, a parametrization of GFlash type is applied, and results of tests of this method are discussed. In summary, we expect about a 25% speedup of CMS simulation production for Run 3 compared to the Run 2 simulations.
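
As a rough illustration of what a GFlash-type parametrization replaces, the sketch below evaluates the classic gamma-distribution longitudinal shower profile that such parametrizations are built on; the shape parameters are illustrative, not the values tuned for the CMS calorimeter.

```python
import math

# Schematic GFlash-type longitudinal shower parametrization (a
# gamma-distribution profile in depth), illustrating the kind of fast
# replacement applied to low-energy e+/e- instead of tracking every
# secondary. Parameter values are illustrative only.

def longitudinal_profile(t, alpha, beta):
    """dE/dt of an EM shower at depth t (radiation lengths), normalized to 1."""
    return beta * (beta * t) ** (alpha - 1) * math.exp(-beta * t) / math.gamma(alpha)

alpha, beta = 4.0, 0.5   # shape parameters; shower max at t = (alpha - 1) / beta
for t in range(0, 25, 4):
    bar = "#" * int(200 * longitudinal_profile(t, alpha, beta))
    print(f"t={t:2d} X0 {bar}")
```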


Author(s): B. Abi, R. Acciarri, M. A. Acero, G. Adamov, D. Adams, ...

The sensitivity of the Deep Underground Neutrino Experiment (DUNE) to neutrino oscillation is determined, based on a full simulation, reconstruction, and event selection of the far detector and a full simulation and parameterized analysis of the near detector. Detailed uncertainties due to the flux prediction, neutrino interaction model, and detector effects are included. DUNE will resolve the neutrino mass ordering to a precision of 5σ, for all δ_CP values, after 2 years of running with the nominal detector design and beam configuration. It has the potential to observe charge-parity violation in the neutrino sector to a precision of 3σ (5σ) after an exposure of 5 (10) years, for 50% of all δ_CP values. It will also make precise measurements of other parameters governing long-baseline neutrino oscillation, and after an exposure of 15 years will achieve a sensitivity to sin²2θ₁₃ similar to that of current reactor experiments.
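
For orientation, the sketch below evaluates the textbook two-flavor vacuum appearance probability underlying these parameters; the actual DUNE analysis is a full three-flavor treatment with matter effects and δ_CP, and the numbers used here are merely typical magnitudes.

```python
import math

# Two-flavor vacuum approximation of the nu_mu -> nu_e appearance
# probability, to illustrate the parameters DUNE constrains. Baseline,
# energy, and mixing values below are illustrative, typical magnitudes.

def appearance_probability(L_km, E_GeV, sin2_2theta, dm2_eV2):
    """P(nu_mu -> nu_e) = sin^2(2 theta) * sin^2(1.267 * dm^2 * L / E)."""
    return sin2_2theta * math.sin(1.267 * dm2_eV2 * L_km / E_GeV) ** 2

# DUNE-like baseline and beam energy, near the first oscillation maximum
print(appearance_probability(L_km=1300.0, E_GeV=2.5,
                             sin2_2theta=0.085, dm2_eV2=2.5e-3))
```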


2020 · Vol 35 (15n16) · pp. 2041009 · Author(s): G. Voutsinas, K. Elsener, P. Janot, D. El Khechen, A. Kolano, ...

The FCC-ee machine-induced backgrounds on the two proposed detectors (CLD and IDEA) have been studied in detail. Synchrotron radiation (SR) considerations dictate the interaction region (IR) optimization. An asymmetric IR design limits the critical energy of the final bend to 100 keV. Masks placed before the final-focus quadrupole protect the detector from direct hits, and a shield placed around the beam pipe protects it from secondary particles, keeping the effect of SR on the detector to negligible levels. The most important source of background is expected to be incoherent pair creation (IPC). Its effect has been studied in full simulation and reconstruction, and it was shown not to pose a problem for the detector, even under conservative assumptions for the time resolution of the detector sensors. Moreover, the [Formula: see text], radiative-Bhabha, and beam-gas-induced backgrounds were studied; all were found to have a small to negligible effect on the detector. Overall, the FCC-ee interaction region backgrounds are not expected to compromise the detector performance.
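
The quoted 100 keV figure can be sanity-checked with the standard practical formula for the SR critical energy of an electron beam, E_c[keV] ≈ 2.218 E³[GeV]/ρ[m]; the beam energy and bending radius in the sketch below are illustrative choices picked only to show the scale, not the actual FCC-ee lattice values.

```python
# Standard practical formula for the synchrotron-radiation critical energy
# of an electron beam. Inputs below are illustrative, not FCC-ee lattice values.

def critical_energy_keV(e_beam_GeV, bend_radius_m):
    """Critical photon energy (keV) of SR emitted on a bend of radius rho."""
    return 2.218 * e_beam_GeV**3 / bend_radius_m

# e.g. a 45.6 GeV beam (Z pole) on a very gentle final bend
print(critical_energy_keV(45.6, 2100.0))   # ~100 keV
```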


2020 · Vol 24 (2) · pp. 14-27 · Author(s): Byungheon Han, Yoonsik Park, K. Gnanaprakash, Jaeyong Yoo, Jai-ick Yoh

2020 · Vol 15 (1) · Author(s): Angelina M. Bacala

Background: In Monte Carlo simulations, the fine-tuning of linac beam parameters to produce a good match between simulated and measured dose profiles is a lengthy, time-consuming, and resource-intensive process. The objective of this study is to use the results of the gamma-index analysis toolkit embedded in the Windows-based PRIMO software package to shorten the linac photon-beam fine-tuning process.

Methods: Using PRIMO version 0.1.5.1307, a Varian Clinac 2100 is simulated at two nominal energy configurations, 6 MV and 10 MV, for numbers of histories varying from 10⁶ to more than 10⁸. The dose is tallied on a homogeneous water phantom of dimensions 16.2 × 16.2 × 31.0 cm³ at a source-to-surface distance of 100.0 cm. For each nominal energy setting, two initial electron-beam energies are configured to reproduce the measured percent depth dose (PDD) distribution. Once the initial beam energy is fixed, several beam configurations are simulated sequentially to determine the parameters yielding good agreement with the measured lateral dose profiles. The simulated dose profiles are compared with the Varian Golden Beam Data Set (GBDS) using the gamma-index analysis method, which incorporates dose-difference and distance-to-agreement criteria. The simulations are run on Pentium-type computers, while the tuned 10 MV beam configuration is simulated at more than 10⁸ histories using a virtual server in the Amazon.com Elastic Compute Cloud.

Results: The initial electron-beam energy configuration likely to reproduce the measured PDD is determined by directly comparing the gamma-index results of two different beam configurations. A configuration is indicated to agree well with the data if its gamma-index passing rates under the 1%/1 mm criteria generally increase as the number of histories increases; additionally, at the highest number of histories, the matching configuration gives a much higher passing rate at the 1%/1 mm acceptance criteria than the competing configuration. With the matching initial electron-beam energy known, this input to the subsequent simulations allows the fine-tuning of the lateral beam profiles to proceed at a fixed, lower number of histories. In a three-stage serial optimization procedure, the first remaining beam parameter is varied and the value giving the highest passing rate under the 1%/1 mm criteria is determined; this optimum value is input to the second stage, and the procedure is repeated until all remaining beam parameters are optimized. The final tuned beam configuration is then simulated at a much higher number of histories, and good agreement with the measured dose distributions is verified.

Conclusions: Physical nature is not stingy: what it reveals at low statistics anticipates what holds at high statistics. In fine-tuning a linac to conform with measurements, the PRIMO software package exploits this characteristic directly. PRIMO is an automated, self-contained, full Monte Carlo linac simulator and dose calculator. It embeds the gamma-index analysis toolkit, which can be used to determine all the parameters of the initial electron-beam configuration at a relatively low number of histories before the full simulation is run at very high statistics. For running the full simulation, the Amazon.com compute cloud proves to be a very cost-effective and reliable platform. These results are significant because of the time required to run full-blown simulations, especially for resource-limited communities where a single computer may be the sole workhorse.
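
For readers unfamiliar with the metric, the sketch below computes a simplified one-dimensional gamma index in the spirit of the toolkit PRIMO embeds (a combined dose-difference/distance-to-agreement test); the profiles, grid spacing, and criteria values are illustrative.

```python
import numpy as np

# Simplified 1-D gamma-index evaluation: each evaluated point passes if some
# reference point lies within the combined dose-difference (1% global) and
# distance-to-agreement (1 mm) ellipse, i.e. gamma <= 1.

def gamma_index(ref_pos, ref_dose, eval_pos, eval_dose,
                dose_crit=0.01, dta_crit=1.0):
    """Return gamma for each evaluated point (positions in mm)."""
    norm = dose_crit * ref_dose.max()            # global 1% dose criterion
    gammas = []
    for xe, de in zip(eval_pos, eval_dose):
        dist = (ref_pos - xe) / dta_crit
        diff = (ref_dose - de) / norm
        gammas.append(np.min(np.sqrt(dist**2 + diff**2)))
    return np.array(gammas)

x = np.linspace(-5.0, 5.0, 101)                   # positions in mm
measured = np.exp(-x**2 / 8.0)                    # reference (e.g. GBDS) profile
simulated = np.exp(-(x - 0.2)**2 / 8.0) * 1.005   # slightly shifted/scaled sim
g = gamma_index(x, measured, x, simulated)
print(f"1%/1mm passing rate: {100.0 * np.mean(g <= 1.0):.1f}%")
```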

