realistic result
Recently Published Documents

TOTAL DOCUMENTS: 14 (FIVE YEARS: 6)
H-INDEX: 2 (FIVE YEARS: 1)

2021 ◽  
Author(s):  
Cengiz Görkem Dengiz ◽  
Kemal Yildizli

Abstract In this study, different techniques used for modeling bimetallic sheets by the finite element method have been compared. Sheets modeled with 5 different assumptions were compared with each other and with experimental data to determine the FE model that gives the most realistic result. FE models were created with the assumption that the adhesion was either perfect or separable, and with the solidified adhesive in the intermediate layer either modeled or not modeled. As a result, the values closest to the experimental results were obtained with the model that assumes a solidified adhesive layer in the middle and an adhesion interface between this layer and the metallic layers. On the other hand, when the adhesive in the intermediate layer is not modeled, the CPU time is halved while the results change only slightly.
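As a rough intuition for why the interlayer assumption matters, the sketch below treats the laminate as axial springs in series, with and without an explicit adhesive layer. This is not the authors' FE model; the cross-section, layer thicknesses, and moduli are all invented for illustration.

```python
# Toy 1-D illustration (assumed values throughout, not the paper's FE model):
# a bimetal strip approximated as axial springs in series, with and without
# an explicit solidified-adhesive interlayer.

def axial_stiffness(E, A, L):
    """Axial stiffness k = E*A/L of a uniform bar segment."""
    return E * A / L

def series(*ks):
    """Springs in series: 1/k = sum(1/k_i)."""
    return 1.0 / sum(1.0 / k for k in ks)

A = 25e-6                            # cross-section, m^2 (assumed)
E_steel, L_steel = 210e9, 1.0e-3     # steel layer (assumed)
E_alu,   L_alu   = 70e9,  1.0e-3     # aluminium layer (assumed)
E_adh,   L_adh   = 3e9,   0.1e-3     # solidified adhesive interlayer (assumed)

k_perfect = series(axial_stiffness(E_steel, A, L_steel),
                   axial_stiffness(E_alu, A, L_alu))           # adhesive ignored
k_layered = series(axial_stiffness(E_steel, A, L_steel),
                   axial_stiffness(E_adh, A, L_adh),
                   axial_stiffness(E_alu, A, L_alu))           # adhesive modeled

print(f"perfect-bond stiffness: {k_perfect:.3e} N/m")
print(f"with adhesive layer   : {k_layered:.3e} N/m")
print(f"relative difference   : {(k_perfect - k_layered) / k_layered:+.1%}")
```

Even this toy model shows the compliant interlayer measurably softening the stack, which is the kind of difference the five FE assumptions trade off against CPU time.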


2021 ◽  
Vol 253 ◽  
pp. 07011
Author(s):  
Pavlína Haroková ◽  
Martin Lovecký

One of the methodologies used in criticality safety analysis is the burnup credit method, which allows fuel burnup to be considered in models with spent fuel. This removes excessive conservatism from the analysis, but it also brings new uncertainties originating from the computational prediction of spent fuel composition. The burnup credit method offers several ways to deal with this problem, e.g. a bounding approach with correction factors on nuclide concentrations, which is simple but still very conservative. Another option is Monte Carlo sampling, which aims at as realistic a result as possible but is very computationally demanding. In this work, we have analyzed correction factors for selected nuclides and compared the results of both methods on a model of a spent fuel storage pool. The results show how conservative the bounding approach is: in this case, the multiplication factor was higher by almost 0.03 than in Monte Carlo sampling, exceeding the standard deviation by more than 5.4 times.
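A minimal sketch of the contrast between the two approaches, using toy sensitivities and uncertainties rather than the paper's data or a real criticality code:

```python
import random

random.seed(0)

# Assumed dk/dc sensitivities of k-eff to relative concentration changes:
# positive for fissile nuclides, negative for absorbers (toy values).
sensitivities = {"U-235": +0.20, "Pu-239": +0.15, "Sm-149": -0.05, "Gd-155": -0.04}
rel_sigma = 0.05      # 5% (1-sigma) relative uncertainty per nuclide (assumed)
k_nominal = 0.92      # nominal multiplication factor (assumed)

# Bounding approach: shift every nuclide 2 sigma in whichever direction
# raises k-eff, all at once (simple but very conservative).
k_bounding = k_nominal + sum(abs(s) * 2 * rel_sigma for s in sensitivities.values())

# Monte Carlo sampling: draw each concentration independently and average.
samples = []
for _ in range(10_000):
    dk = sum(s * random.gauss(0.0, rel_sigma) for s in sensitivities.values())
    samples.append(k_nominal + dk)

mean = sum(samples) / len(samples)
std = (sum((x - mean) ** 2 for x in samples) / (len(samples) - 1)) ** 0.5

print(f"bounding k-eff    : {k_bounding:.4f}")
print(f"Monte Carlo k-eff : {mean:.4f} +/- {std:.4f}")
print(f"conservatism      : {k_bounding - mean:.4f} ({(k_bounding - mean) / std:.1f} sigma)")
```

In this toy setup the bounding estimate exceeds the sampled mean by several standard deviations, mirroring the qualitative gap the abstract reports.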


Geophysics ◽  
2020 ◽  
Vol 85 (5) ◽  
pp. G109-G113
Author(s):  
G. R. J. Cooper

Although the boundaries between geologic units with different physical properties are usually quite distinct, the potential-field anomalies associated with them are relatively smooth, particularly for deeper bodies. The terracing filter has been introduced to sharpen anomaly edges and to produce regions of constant amplitude between them, mimicking geologic units on a geologic map. The boundaries between the pseudogeologic units are defined by the zero contour of the Laplacian function. Unfortunately, this can result in the domains of terraced anomalies extending far from the original location of the causative body, producing an image that poorly represents the geology. I have determined that the use of the mathematical shape index of the anomalies, rather than their Laplacian, produces a much more geologically realistic result. The effect can be controlled as desired using a threshold parameter. I evaluate the benefits of the method on gravity and magnetic data from southern Africa.
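A sketch of the idea, not Cooper's implementation: compute the Koenderink shape index of a gridded field from its Hessian and threshold it into terraced domains. The synthetic anomaly and the threshold value are assumptions.

```python
import numpy as np

def shape_index(f, d=1.0):
    """Koenderink shape index in [-1, 1] of a gridded field, via its Hessian."""
    fy = np.gradient(f, d, axis=0)
    fx = np.gradient(f, d, axis=1)
    fxx = np.gradient(fx, d, axis=1)
    fyy = np.gradient(fy, d, axis=0)
    fxy = np.gradient(fx, d, axis=0)
    # Principal curvatures k1 >= k2 of the Hessian [[fxx, fxy], [fxy, fyy]].
    half_tr = 0.5 * (fxx + fyy)
    disc = np.sqrt(np.maximum(half_tr**2 - (fxx * fyy - fxy**2), 0.0))
    k1, k2 = half_tr + disc, half_tr - disc
    return (2.0 / np.pi) * np.arctan2(k1 + k2, k1 - k2)

# Smooth synthetic "anomaly": two Gaussian bodies on a 128 x 128 grid (assumed).
y, x = np.mgrid[0:128, 0:128]
g = np.exp(-((x - 44) ** 2 + (y - 64) ** 2) / 180.0) \
  + 0.7 * np.exp(-((x - 90) ** 2 + (y - 60) ** 2) / 90.0)

si = shape_index(g)
threshold = 0.3   # tuning parameter controlling how far domains spread (assumed)
units = np.where(si > threshold, 1, np.where(si < -threshold, -1, 0))
print("pseudo-geologic unit labels present:", np.unique(units))
```

Because the shape index is bounded and tied to local curvature rather than to the Laplacian's zero contour, thresholding it keeps the terraced domains closer to the causative bodies, which is the property the abstract exploits.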


2019 ◽  
Vol 147 (11) ◽  
pp. 4177-4198
Author(s):  
Casey E. Davenport ◽  
Conrad L. Ziegler ◽  
Michael I. Biggerstaff

Abstract Convective environments are known to be heterogeneous in both time and space, yet idealized models use fixed base-state environments to simulate storm evolution. Recently, the base-state substitution (BSS) technique was devised to account for environmental variability in a controlled manner while maintaining horizontal homogeneity; BSS involves updating the background environment to reflect a new storm-relative proximity sounding at a prescribed time interval. The study herein sought to assess the ability of BSS to more realistically represent the structure and evolution of an observed supercell thunderstorm in comparison to simulations with fixed base-state environments. An extended dual-Doppler dataset of an intensifying supercell thunderstorm in a varying inflow environment was compared to idealized simulations of the same storm; simulations included those with fixed background environments, as well as a BSS simulation that incorporated environmental variability continuously via tendencies to the base-state variables based on changes in a series of observed soundings. While the simulated supercells were generally more intense than what was measured in the observations, broad trends in reflectivity, vertical velocity, and vertical vorticity were more similar between the observed and BSS-simulated supercell; with a fixed environment, the supercell either shrank in size and weakened over time, or grew upscale into a larger convective system. Quantitative comparisons examining distributions, areas, and volumes of vertical velocity and vorticity further confirm these differences. Overall, BSS provides a more realistic result, supporting the idea that a series of proximity soundings can sufficiently represent the effects of environmental variability, enhancing accuracy over fixed environments.
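A minimal sketch of the BSS update described in the abstract, with invented sounding values and time interval (this is not the authors' model code):

```python
import numpy as np

# Observed storm-relative proximity soundings: potential temperature (K)
# on a shared height grid at t = 0 s and t = 3600 s (all values assumed).
theta_obs = [
    np.array([300.0, 305.0, 312.0, 320.0]),   # sounding at t0
    np.array([301.5, 306.0, 314.0, 321.0]),   # sounding at t1
]
t_sounding = [0.0, 3600.0]
dt = 60.0                                      # model timestep, s (assumed)

# Tendency that carries the base state linearly from one sounding to the next.
tendency = (theta_obs[1] - theta_obs[0]) / (t_sounding[1] - t_sounding[0])

theta_base = theta_obs[0].copy()
for step in range(int((t_sounding[1] - t_sounding[0]) / dt)):
    theta_base += tendency * dt               # continuous base-state update

# After integrating across the interval, the base state matches sounding 2.
assert np.allclose(theta_base, theta_obs[1])
print("final base-state theta:", theta_base)
```

Integrating the tendency across the interval carries the horizontally homogeneous base state exactly onto the next observed sounding, which is how a series of proximity soundings can be fed in continuously.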


2019 ◽  
Vol 966 ◽  
pp. 483-488 ◽  
Author(s):  
Budi Adiperdana ◽  
Risdiana

A possible method to reconstruct μSR spectra using a Monte Carlo approach is presented. Three issues are carefully addressed in the simulations: automatic estimation of muon sites, movement of the muon due to the gradient of the electrostatic potential, and thermal fluctuation. All minima within the unit cell need to be included for more realistic theoretical μSR spectra. The optimum scale of the potential gradient and thermal fluctuation is needed to achieve a realistic result. Additional μSR spectra can be revealed in comparison with the simulation at lower thermal fluctuation.
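A hedged sketch of such a Monte Carlo reconstruction: muons are distributed over candidate stopping sites (the potential minima), the local field at each site is jittered to mimic thermal fluctuation, and the ensemble-averaged precession gives the spectrum. The site fields, weights, and jitter scale are assumed values, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(1)

GAMMA_MU = 2 * np.pi * 135.53e6      # muon gyromagnetic ratio, rad s^-1 T^-1

# Local magnetic fields (T) at candidate muon stopping sites, i.e. the
# minima of the electrostatic potential inside the unit cell (assumed),
# with assumed occupation probabilities.
site_fields = np.array([0.010, 0.015, 0.022])
site_weights = np.array([0.5, 0.3, 0.2])

t = np.linspace(0.0, 2e-6, 400)                # 0-2 microseconds
n_mc = 5_000                                   # Monte Carlo muons
sigma_B = 0.001                                # thermal field jitter, T (assumed)

# Each simulated muon picks a site, then precesses in a thermally jittered
# local field; the spectrum is the ensemble-averaged polarization P(t).
sites = rng.choice(len(site_fields), size=n_mc, p=site_weights)
B = site_fields[sites] + rng.normal(0.0, sigma_B, size=n_mc)
polarization = np.cos(GAMMA_MU * B[:, None] * t[None, :]).mean(axis=0)

print("P(t=0)   =", polarization[0])           # full polarization at t = 0
print("P(t_end) =", polarization[-1])          # dephased by the field spread
```

Raising sigma_B washes out the oscillations faster, which is one way a simulation at lower thermal fluctuation can reveal spectral features that are otherwise hidden.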


Author(s):  
Naif Aljabri ◽  
Osama Abulnaja

Reducing application power consumption is one of the main challenges for the HPC community. Code power profilers are very important for researchers to identify the performance bottlenecks and power consumption of their code. Most modern CPUs are equipped with built-in sensors that allow researchers and HPC engineers to estimate the power consumption of running applications. To estimate the power consumption of any piece of code running on a CPU, you need to eliminate confounding factors as much as possible and run the code many times until the average converges. The reason is the environment: the OS and other processes and services run at the same time as your code and may distort the power readings. In this paper, we build a power profiler tool that saves the researcher's time by running and profiling different pieces of code with different types of workloads, and that keeps running until the average converges. Furthermore, we identify and eliminate environmental confounding factors, which saves the researcher's time and gives a realistic result for power consumption experiments.
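A minimal sketch of the repeat-until-convergence idea, assuming the standard Linux powercap RAPL interface (reading it may require elevated privileges); the workload, tolerance, and run limits are placeholders, and this is not the authors' tool.

```python
import statistics
import time

RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"   # package-0 energy counter

def read_energy_uj():
    """Read the cumulative package energy counter in microjoules."""
    with open(RAPL) as f:
        return int(f.read())

def workload():
    # Placeholder compute kernel standing in for the code under test.
    return sum(i * i for i in range(1_000_000))

def profile(tolerance=0.01, min_runs=5, max_runs=100):
    """Re-run the workload until the mean energy estimate converges.

    Counter wrap-around at the RAPL maximum is ignored for brevity.
    """
    energies = []
    for run in range(max_runs):
        e0 = read_energy_uj()
        workload()
        e1 = read_energy_uj()
        energies.append((e1 - e0) / 1e6)               # joules per run
        if run + 1 >= min_runs:
            mean = statistics.mean(energies)
            sem = statistics.stdev(energies) / len(energies) ** 0.5
            if sem / mean < tolerance:                  # average has converged
                return mean, run + 1
    return statistics.mean(energies), max_runs

if __name__ == "__main__":
    joules, runs = profile()
    print(f"mean energy per run: {joules:.3f} J after {runs} runs")
```

Pinning the process to a core and quiescing background services would further reduce the environmental confounding the paper targets; both are omitted here for brevity.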


10.29007/fl91 ◽  
2018 ◽  
Author(s):  
Divyesh Patel ◽  
Rena Shukla

This paper reviews the traffic congestion problem and its solutions. BRTS has become an attractive and effective solution adopted in many countries. The aim of the present paper is to describe the effect of BRTS as a sustainable solution for public transport services in urban areas. Various studies have been carried out to develop guidelines for the implementation of BRTS in different countries. Each country has different BRTS features, and their impacts differ as well. The impact can be measured by micro-simulation before implementation of BRTS; simulation gives a realistic result under real-world conditions.


Author(s):  
M. Skamantzari ◽  
A. Georgopoulos

The interest in the development of virtual museums is nowadays rising rapidly. During the last decades there have been numerous efforts concerning the 3D digitization of cultural heritage and the development of virtual museums, digital libraries and serious games. The realistic result has always been the main concern and a real challenge when it comes to 3D modelling of monuments, artifacts and especially sculptures. This paper implements, investigates and evaluates the results of the photogrammetric methods and 3D surveys that were used for the development of a virtual museum. Moreover, the decisions, the actions, the methodology and the main elements that this kind of application should include and take into consideration are described and analysed. It is believed that the outcomes of this application will be useful to researchers who are planning to develop and further improve the attempts made on virtual museums and mass production of 3D models.

