Variance Reduction: Recently Published Documents

Total documents: 1042 (five years: 237)
H-index: 38 (five years: 5)

2022 · Vol 41 (1) · pp. 1-15
Author(s): Shilin Zhu, Zexiang Xu, Tiancheng Sun, Alexandr Kuznetsov, Mark Meyer, ...

Although Monte Carlo path tracing is a simple and effective algorithm to synthesize photo-realistic images, it is often very slow to converge to noise-free results when involving complex global illumination. One of the most successful variance-reduction techniques is path guiding, which can learn better distributions for importance sampling to reduce pixel noise. However, previous methods require a large number of path samples to achieve reliable path guiding. We present a novel neural path guiding approach that can reconstruct high-quality sampling distributions for path guiding from a sparse set of samples, using an offline trained neural network. We leverage photons traced from light sources as the primary input for sampling density reconstruction, which is effective for challenging scenes with strong global illumination. To fully make use of our deep neural network, we partition the scene space into an adaptive hierarchical grid, in which we apply our network to reconstruct high-quality sampling distributions for any local region in the scene. This allows for effective path guiding for arbitrary path bounce at any location in path tracing. We demonstrate that our photon-driven neural path guiding approach can generalize to diverse testing scenes, often achieving better rendering results than previous path guiding approaches and opening up interesting future directions.
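
As a rough illustration of the guiding idea only (not the authors' neural reconstruction), the toy Python sketch below bins photon hits into a per-cell directional histogram and importance-samples it. The grid resolution, class names, and equal-solid-angle bin approximation are all assumptions made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)
N_THETA, N_PHI = 8, 16                              # directional bins (assumed)
BIN_SOLID_ANGLE = 4.0 * np.pi / (N_THETA * N_PHI)   # crude equal-area approx.

class GuidingCell:
    """One cell of a spatial grid holding a directional photon histogram."""
    def __init__(self):
        self.hist = np.ones((N_THETA, N_PHI))       # +1 prior keeps pdf > 0

    def record_photon(self, theta, phi, energy=1.0):
        i = min(int(theta / np.pi * N_THETA), N_THETA - 1)
        j = min(int(phi / (2.0 * np.pi) * N_PHI), N_PHI - 1)
        self.hist[i, j] += energy

    def sample_direction(self):
        """Draw (theta, phi) ~ photon density; return the sample and its pdf."""
        p = self.hist.ravel() / self.hist.sum()
        k = rng.choice(p.size, p=p)
        i, j = divmod(k, N_PHI)
        theta = (i + rng.random()) * np.pi / N_THETA    # jitter inside the bin
        phi = (j + rng.random()) * 2.0 * np.pi / N_PHI
        return theta, phi, p[k] / BIN_SOLID_ANGLE

cell = GuidingCell()
for _ in range(2000):                  # fake photons from one bright direction
    cell.record_photon(rng.normal(1.0, 0.1), rng.normal(3.0, 0.2))
theta, phi, pdf = cell.sample_direction()
print(f"guided sample: theta={theta:.2f} phi={phi:.2f} pdf={pdf:.2f}")
```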


2022 · Vol 32 (1) · pp. 1-28
Author(s): Ran Xin, Usman A. Khan, Soummya Kar

2021
Author(s): Shiyao Guo, Yuxia Sheng, Shenpeng Li, Li Chai, Jingxin Zhang

Represented by the kernelized expectation maximization (KEM) algorithm, kernelized maximum-likelihood (ML) expectation maximization (EM) methods have recently gained prominence in PET image reconstruction, outperforming many previous state-of-the-art methods. However, they are not immune to the problems of non-kernelized MLEM methods: potentially large reconstruction variance and high sensitivity to the iteration number. It is also generally difficult to reduce image variance and preserve image detail simultaneously using kernels. To solve these problems, this paper presents a novel regularized KEM (RKEM) method with a kernel-space composite regularizer for PET image reconstruction. The composite regularizer consists of a convex kernel-space graph regularizer that smooths the kernel coefficients, a non-convex kernel-space energy regularizer that enhances the coefficients' energy, and a composition constant that guarantees the convexity of the composite regularizer. These kernel-space regularizers are based on the theory of data manifolds and graph regularization and can be constructed from different prior image data to simultaneously reduce image variance and preserve image detail. Using this composite regularizer and the technique of optimization transfer, a globally convergent iterative algorithm is derived for RKEM reconstruction. Tests and comparisons on simulated and in vivo data validate and evaluate the proposed algorithm, demonstrating its better performance and advantages over KEM and other conventional methods.
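
For context, here is a minimal sketch of the un-regularized KEM baseline that RKEM extends: the image is parameterized as x = K @ alpha with a kernel matrix K built from prior-image features, and the MLEM update is applied to the coefficients. The paper's composite regularizer and optimization-transfer step are deliberately omitted; the system matrix, kernel construction, and problem sizes are toy assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n_pix, n_bins = 64, 128
A = rng.random((n_bins, n_pix))             # toy projection (system) matrix
x_true = rng.random(n_pix) * 5.0
y = rng.poisson(A @ x_true)                 # noisy sinogram counts

feats = rng.random((n_pix, 3))              # stand-in for prior-image features
d2 = ((feats[:, None, :] - feats[None, :, :]) ** 2).sum(-1)
K = np.exp(-d2 / 0.1)                       # Gaussian kernel matrix
K /= K.sum(axis=1, keepdims=True)           # row-normalize, a common choice

alpha = np.ones(n_pix)
sens = K.T @ (A.T @ np.ones(n_bins))        # sensitivity in coefficient space
for _ in range(50):
    ybar = A @ (K @ alpha) + 1e-12          # forward-project current image
    alpha *= (K.T @ (A.T @ (y / ybar))) / sens   # MLEM update on coefficients
x = K @ alpha                               # reconstructed image
print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```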


2021
Author(s): Bangti Jin, Zehui Zhou, Jun Zou

Stochastic variance reduced gradient (SVRG) is a popular variance reduction technique for stochastic gradient descent (SGD). We provide a first analysis of the method for solving a class of linear inverse problems through the lens of classical regularization theory. We prove that, for a suitable constant step-size schedule, the method can achieve an optimal convergence rate in terms of the noise level (under a suitable regularity condition), and that the variance of the SVRG iterate error is smaller than that of SGD. These theoretical findings are corroborated by a set of numerical experiments.
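
A minimal sketch of SVRG on a toy linear inverse problem min_x (1/2n) Σ_i (a_i·x − b_i)², with a constant step size as in the regime the paper analyzes; the problem sizes, step size, and data below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(2)
n, d = 200, 50
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.01 * rng.standard_normal(n)   # noisy measurements

def full_grad(x):
    """Gradient of the full least-squares objective."""
    return A.T @ (A @ x - b) / n

eta, epochs, m = 0.005, 30, n        # constant step size, inner-loop length
x_ref = np.zeros(d)
for _ in range(epochs):
    g_ref = full_grad(x_ref)         # full gradient at the anchor point
    x = x_ref.copy()
    for _ in range(m):
        i = rng.integers(n)
        # Variance-reduced gradient: grad_i(x) - grad_i(x_ref) + g_ref.
        g = (A[i] @ x - b[i]) * A[i] - (A[i] @ x_ref - b[i]) * A[i] + g_ref
        x -= eta * g
    x_ref = x                        # start the next epoch from the last iterate
print("relative error:", np.linalg.norm(x_ref - x_true) / np.linalg.norm(x_true))
```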


Author(s): Liangsheng Huang, Liqun Hu, Luying Niu, Mengjie Zhou, Bing Hong, ...

The Local Monte Carlo (LMC) method is used to solve the problems of deep penetration and long computation times in the neutronics calculations of the Radial Neutron Camera (RNC) diagnostic system on the Experimental Advanced Superconducting Tokamak (EAST), yielding the radiation distribution of the RNC and the neutron flux at the detector positions of each channel. Comparison with results calculated by the Global Variance Reduction (GVR) method shows that the LMC calculation is reliable within a reasonable error range. The LMC calculation process is analyzed in detail; the transport of radiation particles is simulated in two steps. In the first step, an integrated neutronics model accounting for the complex window environment and a neutron source model based on the EAST plasma shape support the calculation. The particle information on the equivalent surface is analyzed to evaluate the suitability of the equivalent surface source and boundary settings. Because only a local geometric model is needed in the second step, the LMC proves to be an advantageous calculation method for the nuclear shielding design of tokamak diagnostic systems.
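
The two-step structure can be illustrated with a deliberately crude 1-D toy (not a real transport code): step 1 runs the global model and banks every particle reaching the equivalent surface; step 2 restarts transport from that bank inside the detailed local model. The cross-sections and geometry below are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(3)
SIG_GLOBAL, SIG_LOCAL = 0.3, 0.6      # absorption coefficients (1/cm), assumed
X_SURF, X_DET = 5.0, 8.0              # equivalent surface / detector planes (cm)
N_SRC = 1_000_000

# Step 1: global run; bank the particles crossing the equivalent surface.
flights = rng.exponential(1.0 / SIG_GLOBAL, N_SRC)
banked = int(np.count_nonzero(flights > X_SURF))

# Step 2: local run restarted from the banked surface source only.
flights_local = rng.exponential(1.0 / SIG_LOCAL, banked)
hits = int(np.count_nonzero(flights_local > (X_DET - X_SURF)))

mc = hits / N_SRC
exact = np.exp(-SIG_GLOBAL * X_SURF) * np.exp(-SIG_LOCAL * (X_DET - X_SURF))
print(f"two-step MC estimate: {mc:.3e}   analytic: {exact:.3e}")
```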


2021 · Vol 13 (24) · pp. 13551
Author(s): Mohamed Hussein, Abdelrahman E. E. Eltoukhy, Amos Darko, Amr Eltawil

Off-site construction is a modern construction method that brings many sustainability merits to the built environment. However, sub-optimal planning decisions (e.g., resource allocation, logistics, and overtime planning decisions) for off-site construction projects can easily wipe away these merits. Therefore, simulation modelling, an efficient tool for capturing the complexity and uncertainty of these projects, is integrated with metaheuristics into a simulation-optimization model to find the best possible planning decisions. Recent swarm intelligence metaheuristics have been used to solve various complex optimization problems, but their potential for solving the simulation-optimization problems of construction projects has not been investigated. This research contributes by investigating the status quo of simulation-optimization models in the construction field and comparing the performance of five recent swarm intelligence metaheuristics on the stochastic time–cost trade-off problem, with the aid of parallel computing and a variance reduction technique to reduce computation time. The five metaheuristics are the firefly algorithm, grey wolf optimization, the whale optimization algorithm, the salp swarm algorithm, and an improved version of the well-known bat algorithm. The literature analysis of simulation-optimization models in the construction field shows that: (1) discrete-event simulation is the most-used simulation method in these models; (2) most studies applied genetic algorithms; and (3) very few studies used computation-time reduction techniques, although simulation-optimization models are computationally expensive. The five selected swarm intelligence metaheuristics were applied to a case study of a bridge deck construction project using the off-site construction method. The results further show that grey wolf optimization and the improved bat algorithm are superior to the firefly, whale optimization, and salp swarm algorithms in terms of solution quality and convergence behaviour. Finally, the use of parallel computing and a variance reduction technique reduces the average computation time of the simulation-optimization models by about 87.0%. This study is a step towards the optimum planning of off-site construction projects in order to maintain their sustainability advantages.
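
The abstract does not spell out which variance reduction technique is used; common random numbers (CRN) is a standard choice in simulation-optimization, and the hedged sketch below pairs it with process-based parallel replications. The "project simulation" is a toy placeholder, not the authors' discrete-event model.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

N_REPS = 200
SEEDS = range(N_REPS)            # identical seeds for every candidate -> CRN

def simulate_duration(args):
    """Toy stochastic project model: duration for a crew-size decision
    under shared random activity times (placeholder, assumed)."""
    crew_size, seed = args
    rng = np.random.default_rng(seed)
    activity = rng.lognormal(mean=3.0, sigma=0.4, size=20)   # activity times
    return activity.sum() / crew_size + 2.0 * crew_size      # time/cost proxy

def evaluate(crew_size):
    """Mean simulated duration over all replications, run in parallel."""
    with ProcessPoolExecutor() as pool:
        durations = list(pool.map(simulate_duration,
                                  [(crew_size, s) for s in SEEDS]))
    return float(np.mean(durations))

if __name__ == "__main__":
    # With CRN, both candidates see the same activity-time draws, so the
    # estimated difference between them has far lower variance.
    print("crew=4:", evaluate(4), "crew=5:", evaluate(5))
```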


2021 · Vol 2021 · pp. 1-7
Author(s): Chen Jicheng, Chen Hongchang, Li Hanchao

Link prediction is a concept from network theory that aims to infer a link between two separate network entities. In the present world of social media this concept has taken root, and its application is seen throughout numerous social networks. A typical example is Facebook, launched on 4 February 2004 as "Thefacebook", which uses link prediction algorithms to recommend friends; shopping and e-commerce sites do the same. Notwithstanding all the merits link prediction presents, they are mostly enjoyed by large networks. In sparse networks there is a wide disparity between the links that could potentially form and the ones that actually do. A barrage of literature has approached this problem, but mostly from the angle of unsupervised learning (UL). While that may seem appropriate for some datasets, it does not provide accurate predictions for sparse networks; supervised learning is more reasonable in such cases. This research is aimed at finding the most appropriate link-based link prediction methods in the context of big data based on supervised learning. Although much has been written on the same topic, core issues critical to understanding link prediction are often left unaddressed in these studies. This research explicitly looks at these problems and analyzes them with a supervised approach to devise a full-fledged, holistic link-based link prediction method. Specifically, the network issues we delve into are the lack of specificity in existing techniques, observational periods, variance reduction, sampling approaches, and topological causes of class imbalance. In the subsequent sections of the paper, we explain the prediction algorithms, in particular the flow-based methods, and specifically address the problems on sparse networks that other prediction methods never discuss. The resolutions made by addressing the above issues place our framework above the unsupervised approaches of the previous literature.
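
As a hedged sketch of the supervised setup described here (features from an observation-period snapshot, labels from links appearing in a later snapshot, class imbalance handled explicitly), the toy example below uses generic topological features and logistic regression; the graphs, feature set, and classifier choice are illustrative assumptions, not the paper's method.

```python
import itertools
import networkx as nx
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
g_old = nx.gnm_random_graph(100, 300, seed=4)        # observation snapshot
g_new = g_old.copy()                                 # later snapshot, with a
g_new.add_edges_from(rng.integers(0, 100, size=(40, 2)).tolist())  # few new links

def features(g, u, v):
    """Classic link-based features for a candidate pair (u, v)."""
    cn = len(list(nx.common_neighbors(g, u, v)))
    jac = next(nx.jaccard_coefficient(g, [(u, v)]))[2]
    pa = g.degree[u] * g.degree[v]                   # preferential attachment
    return [cn, jac, pa]

pairs = [(u, v) for u, v in itertools.combinations(g_old, 2)
         if not g_old.has_edge(u, v)]                # only non-links qualify
X = np.array([features(g_old, u, v) for u, v in pairs])
y = np.array([g_new.has_edge(u, v) for u, v in pairs], dtype=int)

# class_weight="balanced" is one simple answer to the imbalance problem
# that sparse networks make severe.
clf = LogisticRegression(class_weight="balanced").fit(X, y)
print(f"positives: {y.sum()} of {len(y)} pairs, "
      f"max predicted score: {clf.predict_proba(X)[:, 1].max():.3f}")
```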


2021
Author(s): Peng Lu, Qiuran Wu, Hua Du, Yu Zheng, Xiaokang Zhang, ...

The neutron-induced radiation field is a key problem in fusion reactors, relating to nuclear responses, shielding design, nuclear safety, and thermo-hydraulic analysis. To support the system design of the China Fusion Engineering Test Reactor (CFETR), a comprehensive analysis of the radiation field has been conducted with the support of several newly developed advanced tools. The paper first summarizes recent progress in related neutronics code development, including the geometry conversion tool cosVMPT and the 'on-the-fly' global variance reduction (GVR) Monte Carlo technique. These tools have been fully validated and applied to the CFETR nuclear analysis. The neutron irradiation has been evaluated for the CFETR Water Cooled Ceramic Breeder (WCCB) blanket, the divertor, the vacuum vessel, the superconducting coils, and four heating systems: Electron Cyclotron Resonance Heating (ECRH), Ion Cyclotron Resonance Heating (ICRH), Lower Hybrid Wave (LHW), and Neutral Beam Injection (NBI). Nuclear responses including the tritium breeding ratio (TBR), nuclear heating, irradiation damage, and hydrogen/helium (H/He) production rates of materials have been analyzed. To limit neutron damage and heat deposition on the superconducting coils and the vacuum vessel (VV), the interface and shielding design among the heating systems, the blanket, and other systems has been initiated. The results show that, after several iterations of neutronics calculations, the shielding design meets the requirements of the coils and the VV.
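
As background on the GVR idea that 'on-the-fly' schemes iterate: flux-based global variance reduction sets mesh weight windows proportional to a coarse forward flux estimate, so the Monte Carlo particle population is roughly flat over the whole model. The sketch below is a generic illustration with an invented flux map, normalization, and window ratio, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(5)
flux = np.maximum(rng.lognormal(0.0, 2.0, size=(10, 10, 10)), 1e-12)  # toy flux

ww_low = 0.5 * flux / flux[0, 0, 0]   # normalize so the source cell gets 0.5
ww_up = 5.0 * ww_low                  # fixed upper/lower ratio (assumed)

def apply_weight_window(weight, cell):
    """Split or roulette one particle of the given weight in a mesh cell."""
    lo, hi = ww_low[cell], ww_up[cell]
    if weight > hi:                            # too heavy: split into n copies
        n = int(np.ceil(weight / hi))
        return [(weight / n, cell)] * n
    if weight < lo:                            # too light: Russian roulette
        return [(lo, cell)] if rng.random() < weight / lo else []
    return [(weight, cell)]                    # inside the window: unchanged

print(apply_weight_window(3.0, (0, 0, 0)))     # -> two copies of weight 1.5
```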

