fast evaluation
Recently Published Documents

TOTAL DOCUMENTS: 571 (five years: 126)
H-INDEX: 36 (five years: 5)

Author(s):  
Jonas Klingwort ◽  
Sofie Myriam Marcel Gabrielle De Broe ◽  
Sven Alexander Brocker

Introduction: To combat and mitigate the transmission of the SARS-CoV-2 virus, reducing the number of social contacts within a population is highly effective. Non-pharmaceutical policy interventions, e.g. stay-at-home orders and the closure of schools, universities, and (non-essential) businesses, are expected to decrease pedestrian flows in public areas, leading to reduced social contacts. The extent to which such interventions have the intended effect is often measured retrospectively by surveying behavioural changes. Approaches that use data generated through mobile phones are hindered by data confidentiality and privacy regulations, complicated by selection effects, and limited by restricted access to such sensitive data. A complex pandemic situation, however, requires a fast evaluation of the effectiveness of interventions aimed at reducing social contacts. Location-based sensor systems installed in cities, which provide objective measurements of spatial mobility in the form of pedestrian flows, are well suited for this purpose. These devices record changes in a population’s behaviour in real time, raise no privacy concerns because they do not identify individuals, and avoid the selection effects tied to device ownership.
Objective: This work analysed location-based sensor measurements of pedestrian flows at 100 locations across 49 metropolitan areas in Germany to study whether such technology is suitable for the real-time assessment of behavioural changes during a phase of several different pandemic-related policy interventions.
Methods: Spatial mobility data on pedestrian flows was linked with policy intervention data using the date as a unique linkage key. The data was visualised to observe potential changes in pedestrian flows before and after interventions. Furthermore, differences in the time series of pedestrian counts between the pandemic year and the pre-pandemic year were analysed.
Results: The sensors detected changes in mobility patterns even before policy interventions were enacted. Compared to the pre-pandemic year, pedestrian counts were 85% lower.
Conclusions: The study illustrates the practical value of sensor-based real-time measurements when linked with non-pharmaceutical policy intervention data. Its core contribution is showing that the sensors detected behavioural changes before non-pharmaceutical policy interventions were enacted or loosened. Policymakers should therefore consider such technologies for future crisis management and policy evaluation.
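The described workflow, a date-keyed linkage plus a year-over-year comparison of count time series, is straightforward to reproduce. Below is a minimal sketch assuming hypothetical file and column names, not the authors' actual data:

```python
import pandas as pd

# Hypothetical inputs: per-day pedestrian counts and intervention dates.
counts = pd.read_csv("pedestrian_counts.csv", parse_dates=["date"])     # date, location, count
interventions = pd.read_csv("interventions.csv", parse_dates=["date"])  # date, measure

# Link sensor measurements with policy interventions on the date key.
linked = counts.merge(interventions, on="date", how="left")

# Year-over-year comparison: each 2020 calendar day against the same day in 2019.
daily = (counts.groupby([counts["date"].dt.dayofyear, counts["date"].dt.year])["count"]
               .sum().unstack())
relative_change = (daily[2020] - daily[2019]) / daily[2019]
print(relative_change.median())  # a value near -0.85 would match the reported 85% drop
```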


Author(s):  
Bingqian Lu ◽  
Jianyi Yang ◽  
Weiwen Jiang ◽  
Yiyu Shi ◽  
Shaolei Ren

Convolutional neural networks (CNNs) are used in numerous real-world applications such as vision-based autonomous driving and video content analysis. To run CNN inference on various target devices, hardware-aware neural architecture search (NAS) is crucial. A key requirement of efficient hardware-aware NAS is the fast evaluation of inference latencies in order to rank different architectures. While building a latency predictor for each target device is common in the state of the art, it is a very time-consuming process that lacks scalability in the presence of extremely diverse devices. In this work, we address the scalability challenge by exploiting latency monotonicity: the architecture latency rankings on different devices are often correlated. When strong latency monotonicity exists, we can re-use architectures searched for one proxy device on new target devices without losing optimality. In the absence of strong latency monotonicity, we propose an efficient proxy adaptation technique to significantly boost it. Finally, we validate our approach through experiments with devices from different platforms on multiple mainstream search spaces, including MobileNet-V2, MobileNet-V3, NAS-Bench-201, ProxylessNAS and FBNet. Our results highlight that, by using just one proxy device, we can find almost the same Pareto-optimal architectures as existing per-device NAS, while avoiding the prohibitive cost of building a latency predictor for each device.
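Latency monotonicity is typically quantified as a rank correlation between per-architecture latencies on two devices. A minimal sketch of that check, with synthetic latencies standing in for real measurements and a hypothetical decision threshold:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
proxy_latency = rng.uniform(5, 50, size=200)                  # ms per architecture on the proxy
target_latency = 1.7 * proxy_latency + rng.normal(0, 2, 200)  # a correlated target device

rho, _ = spearmanr(proxy_latency, target_latency)
print(f"Spearman rank correlation: {rho:.3f}")
if rho > 0.9:  # the threshold here is an assumption, not taken from the paper
    print("Strong monotonicity: architectures found on the proxy should transfer.")
else:
    print("Weak monotonicity: adapt the proxy before re-using its results.")
```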


Author(s):  
Vesa Kaarnioja ◽  
Yoshihito Kazashi ◽  
Frances Y. Kuo ◽  
Fabio Nobile ◽  
Ian H. Sloan

Abstract This paper deals with the kernel-based approximation of a multivariate periodic function by interpolation at the points of an integration lattice—a setting that, as pointed out by Zeng et al. (Monte Carlo and Quasi-Monte Carlo Methods 2004, Springer, New York, 2006) and Zeng et al. (Constr. Approx. 30: 529–555, 2009), allows fast evaluation by fast Fourier transform, so avoiding the need for a linear solver. The main contribution of the paper is the application to the approximation problem for uncertainty quantification of elliptic partial differential equations, with the diffusion coefficient given by a random field that is periodic in the stochastic variables, in the model proposed recently by Kaarnioja et al. (SIAM J Numer Anal 58(2): 1068–1091, 2020). The paper gives a full error analysis, and full details of the construction of lattices needed to ensure a good (but inevitably not optimal) rate of convergence and an error bound independent of dimension. Numerical experiments support the theory.
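The reason no linear solver is needed: on a rank-1 lattice, a periodic shift-invariant kernel yields a circulant interpolation matrix, which a single FFT diagonalises. A minimal sketch of this effect (my illustration with a toy generating vector and a simplified, unweighted Korobov-type kernel, not the paper's weighted construction):

```python
import numpy as np

n, z = 128, np.array([1, 47])                 # lattice size and a toy generating vector
i = np.arange(n)
pts = (i[:, None] * z[None, :] / n) % 1.0     # rank-1 lattice points in [0,1)^2

def kernel(x):
    # Unweighted Korobov-type kernel (smoothness alpha = 1) built from the
    # Bernoulli polynomial B2; all Fourier coefficients are positive, so the
    # circulant interpolation matrix is nonsingular.
    b2 = x * x - x + 1.0 / 6.0
    return np.prod(1.0 + 2.0 * np.pi**2 * b2, axis=-1)

f = np.sin(2 * np.pi * pts[:, 0]) * np.cos(2 * np.pi * pts[:, 1])  # data at lattice points

# The interpolation matrix A[i, j] = kernel(t_i - t_j) is circulant in i - j,
# so the linear system is solved with one FFT instead of a solver.
col = kernel(pts)                             # first column: kernel at lattice differences
coeffs = np.fft.ifft(np.fft.fft(f) / np.fft.fft(col)).real

A = kernel((((i[:, None] - i[None, :])[..., None] * z) / n) % 1.0)
print(np.allclose(A @ coeffs, f))             # True: the interpolant matches the data
```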


Algorithms ◽  
2021 ◽  
Vol 14 (12) ◽  
pp. 343
Author(s):  
Mikhail Lavrentiev ◽  
Konstantin Lysakov ◽  
Andrey Marchuk ◽  
Konstantin Oblaukhov ◽  
Mikhail Shadrin

Seismic events followed by catastrophic flooding caused by tsunami waves (the incidence of which has increased in recent decades) have a major impact on the populations of littoral regions. On the coasts of Japan and Kamchatka, it takes nearly 20 min for tsunami waves to reach the nearest dry land after an offshore seismic event. This paper addresses the important question of fast simulation of tsunami wave propagation by mapping the algorithms in use onto field-programmable gate arrays (FPGAs) with the help of high-level synthesis (HLS). Wave propagation is described by the shallow water system, and the MacCormack scheme is used for the numerical treatment. The MacCormack algorithm is a direct difference scheme on a three-point “cross” stencil, which makes it well suited to FPGA-based parallel implementation. A specialized calculator was designed, and the developed software was tested for precision and performance. Numerical tests computing wave fronts show very good agreement with the available exact solutions (for two particular cases of sea bed topography) and with the reference code. As a result, it takes just 17.06 s to simulate 1600 s (3200 time steps) of wave propagation on a 3000 × 3200 computation grid with a VC709 board. The step length of the computational grid was chosen to display the simulation results in sufficient detail along the coastline, while keeping the data arrays small enough to fit freely in the memory of the FPGA chips. The rather high performance achieved shows that tsunami danger could be correctly evaluated within a few minutes after a seismic event.
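The MacCormack predictor-corrector structure mentioned above is compact enough to show directly. A minimal sketch of one step for the 1D shallow-water equations (flat bottom, periodic boundaries, no source terms or limiters; this illustrates the stencil, not the authors' FPGA code):

```python
import numpy as np

g = 9.81  # gravitational acceleration, m/s^2

def flux(h, hu):
    # Conservative fluxes of the shallow-water system.
    u = hu / h
    return hu, hu * u + 0.5 * g * h * h

def maccormack_step(h, hu, dx, dt):
    fh, fhu = flux(h, hu)
    # Predictor: forward differences.
    hp = h - dt / dx * (np.roll(fh, -1) - fh)
    hup = hu - dt / dx * (np.roll(fhu, -1) - fhu)
    fhp, fhup = flux(hp, hup)
    # Corrector: backward differences, averaged with the predictor.
    hn = 0.5 * (h + hp - dt / dx * (fhp - np.roll(fhp, 1)))
    hun = 0.5 * (hu + hup - dt / dx * (fhup - np.roll(fhup, 1)))
    return hn, hun

# Toy setup: a Gaussian hump of water relaxing into outgoing waves.
x = np.linspace(0.0, 1000.0, 2000)
h = 10.0 + np.exp(-((x - 500.0) / 50.0) ** 2)
hu = np.zeros_like(x)
for _ in range(100):
    h, hu = maccormack_step(h, hu, dx=x[1] - x[0], dt=0.02)  # CFL ~ 0.4 here
```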


Author(s):  
Raphaël Jolivet ◽  
Julie Clavreul ◽  
Raphaël Brière ◽  
Romain Besseau ◽  
Anne Prieur Vernat ◽  
...  

Abstract
Purpose: In this paper, we present new tools to ease the analysis of the effect of variability and uncertainty on life cycle assessment (LCA) results.
Methods: The tools consist of a standard protocol and an open-source library, lca_algebraic. This library, written in Python and based on the framework Brightway2 (Mutel in J Open Source Softw 2(12):236, 2017), provides functions that support sensitivity analysis by bringing symbolic calculus to LCA. The use of symbolic calculus eases the definition of parametric inventories and enables a very fast evaluation of impacts by factorizing the background activities. Thanks to this processing speed, a large number of Monte Carlo simulations can be generated to evaluate the variation of the impacts and to apply advanced statistical tools such as Sobol indices, which quantify the contribution of each parameter to the final variance (Sobol in Math Comput Simul 55(1–3):271–280, 2001). An additional algorithm uses the key parameters, identified by their high Sobol indices, to generate simplified arithmetic models for fast estimates of LCA results.
Results and discussion: The protocol and library were validated through their application to the assessment of impacts of monocrystalline photovoltaic (PV) systems. A comprehensive sensitivity analysis was performed based on the protocol and the complementary functions provided by lca_algebraic. The proposed tools helped build a detailed parametric reference LCA model of the PV system, identify the range of variation of multi-criterion LCA results, and single out the key foreground-related parameters explaining these variations. Based on these key parameters, we generated simplified arithmetic models for quick and simple multi-criteria environmental assessments by non-expert LCA users. The resulting models are both compact and aligned with the reference parametric LCA model of crystalline silicon PV systems.
Conclusion: This work brings powerful and practical tools to the LCA community to better understand, identify, and quantify the sources of variation of environmental impacts, and to produce simplified models that spread the use of LCA among non-experts. The library mainly explores the uncertainties of foreground activities. Further work could also integrate the uncertainty of background activities, described, for example, by pedigree matrices.
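The core mechanism, expressing impacts as a symbolic function of foreground parameters, compiling it once, and then running cheap Monte Carlo with a Sobol decomposition, can be illustrated without the library itself. A minimal sketch using plain sympy and a toy, hypothetical impact model (this is not lca_algebraic's actual API):

```python
import numpy as np
import sympy as sp

# Hypothetical parametric impact model with two foreground parameters.
lifetime, irradiance = sp.symbols("lifetime irradiance", positive=True)
impact = 2500 / (lifetime * irradiance) + 0.04 * lifetime  # toy gCO2e/kWh expression
f = sp.lambdify((lifetime, irradiance), impact, "numpy")   # compiled once, cheap to evaluate

rng = np.random.default_rng(1)
N = 100_000
A = np.column_stack([rng.uniform(20, 35, N), rng.uniform(1000, 1900, N)])
B = np.column_stack([rng.uniform(20, 35, N), rng.uniform(1000, 1900, N)])
yA, yB = f(*A.T), f(*B.T)
var = np.var(np.concatenate([yA, yB]))

for idx, name in enumerate(["lifetime", "irradiance"]):
    AB = A.copy()
    AB[:, idx] = B[:, idx]                    # pick-freeze sample matrix
    S1 = np.mean(yB * (f(*AB.T) - yA)) / var  # Saltelli (2010) first-order estimator
    print(f"First-order Sobol index of {name}: {S1:.2f}")
```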


2021 ◽  
Vol 1 (3) ◽  
pp. 557-573
Author(s):  
Emilia Oleandro ◽  
Simonetta Grilli ◽  
Romina Rega ◽  
Martina Mugnano ◽  
Vittorio Bianco ◽  
...  

The development of more sensitive methodologies, capable of quickly detecting and monitoring a microbial population in a specific biological matrix and of following all its metabolic changes (e.g., during biofilm formation), is an essential requirement for both human well-being and the food industry. Two techniques in particular have gained the attention of scientists. The first is “biospeckle”, an optical technique that represents an innovative tool for applications in food quality, food safety, and nutraceuticals. With this technique, the presence of bacteria (or their proliferation) in a solid or liquid biological matrix can be evaluated and monitored quickly. The technique is also helpful for quantifying and optimizing the correct storage time of probiotics entrapped in matrices such as alginate, and for following their survival rate under simulated gastro-intestinal conditions. The second, highly promising technique is the “biofilm electrostatic test” (BET). BET represents a fast, simple, and highly reproducible tool for evaluating the in vitro capacity of bacteria to adhere, through electrostatic interaction, to a pyro-electrified carrier after only 2 h of incubation. BET could thus pave the way for a quick and standardized evaluation of bacterial resistance among biofilm-producing microorganisms through a fast evaluation of the potential presence of biofilm.
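Biospeckle activity is commonly quantified from the temporal fluctuations of a stack of speckle images. A minimal sketch of one standard measure from the biospeckle literature, the generalized differences (GD) map (my illustration on synthetic data, not code from the reviewed paper):

```python
import numpy as np

def generalized_differences(stack):
    # GD(x, y) = sum over all frame pairs of |I_k(x, y) - I_l(x, y)|;
    # brighter regions in the map indicate more temporal activity,
    # e.g. bacterial motion in the sample.
    frames = stack.astype(float)
    gd = np.zeros(frames.shape[1:])
    for k in range(len(frames)):
        gd += np.abs(frames[k] - frames[k + 1:]).sum(axis=0)
    return gd

# Synthetic stand-in for a recorded speckle sequence: (frames, rows, cols).
stack = np.random.default_rng(2).integers(0, 256, (64, 128, 128))
activity_map = generalized_differences(stack)
print(activity_map.mean())
```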


Author(s):  
Stavros Efthymiou ◽  
Sergi Ramos-Calderer ◽  
Carlos Bravo-Prieto ◽  
Adrián Pérez-Salinas ◽  
Diego García-Martín ◽  
...  

Abstract We present Qibo, a new open-source software framework for the fast evaluation of quantum circuits and adiabatic evolution that takes full advantage of hardware accelerators. The growing interest in quantum computing and the recent developments of quantum hardware devices motivate the development of new advanced computational tools focused on performance and usage simplicity. In this work we introduce a new quantum simulation framework that enables developers to delegate all complicated aspects of hardware or platform implementation to the library, so they can focus on the problem and quantum algorithms at hand. This software is designed from scratch with simulation performance, code simplicity and a user-friendly interface as target goals. It takes advantage of hardware acceleration such as multi-threaded CPU execution and single- and multi-GPU devices.
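A minimal usage sketch, based on Qibo's documented interface around the time of this paper (import paths and result methods may differ in later versions): building and simulating a Bell-state circuit, with the backend transparently using the available hardware:

```python
from qibo import models, gates

circuit = models.Circuit(2)      # two-qubit circuit
circuit.add(gates.H(0))          # Hadamard on qubit 0
circuit.add(gates.CNOT(0, 1))    # entangle qubits 0 and 1
circuit.add(gates.M(0, 1))       # measure both qubits
result = circuit(nshots=1000)    # run the simulation
print(result.frequencies())      # expect roughly equal '00' and '11' counts
```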


Author(s):  
Caroline Marks ◽  
Jörn Viell

Abstract The production of biofuels and biochemicals requires a pretreatment to cleave the composite-like structure of lignocellulosic biomass and thus facilitate further conversion. In the case of liquid-based pretreatment, it is important to know which pretreatment liquids allow for an effective conversion of biomass. For the development of effective pretreatment strategies, simple criteria for a fast evaluation of pretreatment results are advantageous. In this study, we use the example of acetosolv pretreatment of beech wood to explore the influence of the composition of the employed acetosolv liquids. To this end, we investigate pretreatment phenomena on different scales, including macroscopic disintegration, overall mass balances, and compositional changes of beech wood. We relate the investigated phenomena to the type and amount of catalyst acid as well as the water content of the employed acetosolv liquids. The results show that disintegration increases with both a higher concentration and acidity of the catalyst acid, while excessive disintegration can be balanced by an increased water content up to equimolar ratios of water and acetic acid. Furthermore, an increasing disintegration correlates with an increasing non-recovered fraction up to a maximum of 40 wt%. The non-recovered fraction in turn linearly depends on the amount of removed hemicellulose and lignin. Overall, a low lignin content together with complete disintegration after pretreatment in acetosolv liquids with a high water content allows for increased sugar yields in subsequent enzymatic hydrolysis. Thus, disintegration and the non-recovered fraction serve as simple indicators for a first assessment of pretreatment effectiveness.
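The "non-recovered fraction" criterion is simple mass-balance bookkeeping, sketched below with hypothetical numbers rather than the paper's measurements:

```python
# Mass balance of one pretreatment run; all values in grams, chosen for illustration.
biomass_in = 100.0          # beech wood fed into the acetosolv pretreatment
pulp_recovered = 55.0       # solid fraction recovered after treatment
dissolved_recovered = 5.0   # solids recovered from the pretreatment liquor

non_recovered = (biomass_in - pulp_recovered - dissolved_recovered) / biomass_in
print(f"Non-recovered fraction: {non_recovered:.0%}")  # 40% equals the reported maximum
```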


Water ◽  
2021 ◽  
Vol 13 (21) ◽  
pp. 3122
Author(s):  
Leonardo Primavera ◽  
Emilia Florio

The possibility of a flood wave forming in a river network depends on the geometric properties of the river basin. Among the models that try to forecast the Instantaneous Unit Hydrograph (IUH) of rainfall precipitation, the so-called Multifractal Instantaneous Unit Hydrograph (MIUH) by De Bartolo et al. (2003) rather successfully connects the multifractal properties of the river basin to the observed IUH. Such properties can be assessed through different types of analysis (fixed-size algorithm, correlation integral, fixed-mass algorithm, sandbox algorithm, and so on). The fixed-mass algorithm produces the most precise estimate of the properties of the multifractal spectrum that are relevant for the MIUH model, but it requires very long computational times to produce the best possible results. In a previous work, we proposed a parallel version of the fixed-mass algorithm using the Message Passing Interface (MPI), a standard for distributed-memory clusters, which reduced the computational times almost proportionally to the number of Central Processing Unit (CPU) cores available on the computational machine. In the present work, we further improved the code by adding the Open Multi-Processing (OpenMP) paradigm to facilitate execution and improve the speed-up on single-processor, multi-core workstations, which are much more common than multi-node clusters. Moreover, the assessment of the multifractal spectrum has also been improved through a direct computation method. To the best of our knowledge, this code currently represents the state of the art for a fast evaluation of the multifractal properties of a river basin, and it opens up a new scenario for effective flood forecasts in reasonable computational times.
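The fixed-mass measurement at the heart of the algorithm inverts the usual box-counting question: instead of counting mass within a fixed radius, it finds the radius that encloses a fixed mass (number of points). A minimal serial sketch of that step on synthetic data (the paper's code is a hybrid MPI/OpenMP implementation; this only illustrates the measurement, reduced to a single dimension estimate):

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(3)
points = rng.random((20_000, 2))      # synthetic stand-in for river-network points
tree = cKDTree(points)

masses = np.array([8, 16, 32, 64, 128, 256])
mean_log_r = []
for m in masses:
    # Distance to the m-th nearest neighbour = radius enclosing mass m.
    dists, _ = tree.query(points, k=m)
    mean_log_r.append(np.log(dists[:, -1]).mean())

# Slope of log(mass) vs mean log(radius) estimates a mass dimension;
# for uniform points in the plane it comes out near 2.
slope = np.polyfit(mean_log_r, np.log(masses), 1)[0]
print(f"Estimated dimension: {slope:.2f}")
```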

