minimal data
Recently Published Documents

TOTAL DOCUMENTS: 265 (last five years: 92)
H-INDEX: 23 (last five years: 3)
2022 ◽  
Vol 15 (2) ◽  
pp. 1-31
Author(s):  
Joel Mandebi Mbongue ◽  
Danielle Tchuinkou Kwadjo ◽  
Alex Shuping ◽  
Christophe Bobda

Cloud deployments now increasingly exploit Field-Programmable Gate Array (FPGA) accelerators as part of virtual instances. While cloud FPGAs are still essentially single-tenant, the growing demand for efficient hardware acceleration paves the way to FPGA multi-tenancy. It then becomes necessary to explore architectures, design flows, and resource management features that aim at exposing multi-tenant FPGAs to cloud users. In this article, we discuss a hardware/software architecture that supports provisioning space-shared FPGAs in Kernel-based Virtual Machine (KVM) clouds. The proposed architecture introduces an FPGA organization that improves hardware consolidation and supports hardware elasticity with minimal data movement overhead. It also relies on VirtIO to decrease communication latency between the hardware and software domains. Prototyping the proposed architecture on a Virtex UltraScale+ FPGA demonstrated near-specification maximum frequency for on-chip data movement and high throughput for virtual-instance access to hardware accelerators. We demonstrate performance similar to single-tenant deployment while increasing FPGA utilization, which is one of the goals of virtualization. Overall, our FPGA design achieved about 2× higher maximum frequency than the state of the art and a bandwidth of up to 28 Gbps on a 32-bit data width.
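As a back-of-the-envelope sanity check on the reported bandwidth figure (a sketch assuming one 32-bit word transferred per clock cycle, which is an assumption, not a detail from the paper):

```python
# Peak streaming throughput for a datapath moving one word per clock cycle:
# throughput = clock_frequency * data_width.

def stream_throughput_gbps(clock_mhz: float, width_bits: int) -> float:
    """Peak throughput in Gbps, assuming one word transferred per cycle."""
    return clock_mhz * 1e6 * width_bits / 1e9

# The reported 28 Gbps on a 32-bit datapath would correspond to a clock
# near 875 MHz: 875 MHz * 32 bit = 28 Gbit/s.
print(stream_throughput_gbps(875, 32))  # -> 28.0
```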


Atmosphere ◽  
2022 ◽  
Vol 13 (1) ◽  
pp. 101
Author(s):  
Renee Zbizika ◽  
Paulina Pakszys ◽  
Tymon Zielinski

Aerosol Optical Depth (AOD) is a measure of the extinction of solar radiation by aerosols in the atmosphere. Understanding variations in global AOD is necessary for precisely determining the role of aerosols. Arctic warming is partially caused by aerosols transported over vast distances, including those released during biomass burning events (BBEs). However, measuring AOD is challenging: it typically requires active LIDAR systems or passive sun photometers, both of which are limited to cloud-free conditions, and sun photometers provide only point measurements, leaving spatial coverage sparse. Machine learning offers a more viable route to accurate AOD estimates. This study uses deep neural networks (DNNs) to estimate AOD over Svalbard from a minimal set of meteorological parameters (temperature, air mass, water vapor, wind speed, latitude, longitude, and time of year). The mean absolute error (MAE) between predicted and true values was 0.00401 for the entire set and 0.0079 for the validation set. Including BBE data was then shown to improve predictions by 42.167%. This demonstrates that AOD can be estimated accurately without expensive instrumentation, using machine learning and minimal data. Similar models may be developed for other regions, allowing immediate improvement of current meteorological models.
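A minimal sketch of this kind of setup, assuming the seven predictors named in the abstract; the network size, training configuration, and synthetic data are illustrative assumptions, not the study's actual pipeline:

```python
# Toy regression of AOD on seven meteorological predictors with a small
# feed-forward network, evaluated by mean absolute error (MAE).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
n = 2000
# Columns (assumed): temperature, air mass, water vapor, wind speed,
# latitude, longitude, day of year.
X = rng.normal(size=(n, 7))
# Synthetic "AOD" target for illustration only.
y = 0.1 + 0.02 * X[:, 2] - 0.01 * X[:, 3] + rng.normal(scale=0.005, size=n)

X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.2, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
model.fit(X_tr, y_tr)
print("validation MAE:", mean_absolute_error(y_va, model.predict(X_va)))
```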


2021 ◽  
Vol 14 (1) ◽  
pp. 28
Author(s):  
Francesco Ioli ◽  
Alberto Bianchi ◽  
Alberto Cina ◽  
Carlo De Michele ◽  
Paolo Maschio ◽  
...  

Recently, Unmanned Aerial Vehicles (UAVs) have opened up unparalleled opportunities for alpine glacier monitoring, as they allow for reconstructing extensive, high-resolution 3D models. To evaluate annual ice flow velocities and volume variations, six yearly measurements were carried out between 2015 and 2020 on the debris-covered Belvedere Glacier (Anzasca Valley, Italian Alps) with low-cost fixed-wing UAVs and quadcopters. Every year, ground control points and check points were measured with GNSS. Images acquired from the UAVs were processed with Structure-from-Motion and Multi-View Stereo algorithms to build photogrammetric models, orthophotos, and digital surface models with decimetric accuracy. Annual glacier velocities were derived by combining manually tracked features on orthophotos with GNSS measurements. Velocities ranging between 17 m y⁻¹ and 22 m y⁻¹ were found in the central part of the glacier, whereas values between 2 m y⁻¹ and 7 m y⁻¹ were found in the accumulation area and at the glacier terminus. Between 2 × 10⁶ m³ and 3.5 × 10⁶ m³ of ice volume was lost every year. A pair of intra-year measurements (October 2017–July 2018) highlighted that winter and spring volume reduction was ∼1/4 of the average annual ice loss. The Belvedere monitoring activity proved that decimetric-accuracy glacier models can be derived with low-cost UAVs and photogrammetry, limiting in-situ operations. Moreover, UAVs entail minimal data acquisition costs and allow for great surveying flexibility compared to traditional techniques. Information about annual flow velocities and ice volume variations of the Belvedere Glacier is valuable for further understanding glacier dynamics, computing mass balances, or as input for glacier flow modelling.
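A minimal sketch of the velocity computation described above, assuming two positions of one manually tracked feature read from georeferenced orthophotos; the coordinates and survey dates are invented for illustration:

```python
# Annual flow velocity from a feature tracked on two dated orthophotos:
# horizontal displacement divided by the elapsed time in years.
from datetime import date
import math

def annual_velocity(p_t0, p_t1, d0: date, d1: date) -> float:
    """Horizontal displacement (m) between surveys, scaled to m/yr."""
    dx, dy = p_t1[0] - p_t0[0], p_t1[1] - p_t0[1]
    years = (d1 - d0).days / 365.25
    return math.hypot(dx, dy) / years

# Example: a surface boulder moved ~19 m between two summer surveys.
v = annual_velocity((416210.0, 5093340.0), (416222.5, 5093354.8),
                    date(2018, 7, 25), date(2019, 7, 28))
print(f"{v:.1f} m/yr")  # ~19 m/yr, within the range reported for the central glacier
```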


2021 ◽  
Vol 163 (1) ◽  
pp. 6
Author(s):  
Sahil Agarwal ◽  
J. S. Wettlaufer

Abstract: Technological advances in instrumentation have led to an exponential increase in exoplanet detection and in the scrutiny of stellar features such as spots and faculae. While spots and faculae enable us to understand stellar dynamics, exoplanets provide a glimpse into stellar evolution. The ubiquity of noise (e.g., telluric, instrumental, or photonic) is unavoidable, and increased spectrographic resolution compounds the technological challenges. To account for these noise sources and resolution issues, we use a temporal multifractal framework to study data from the Spot Oscillation And Planet 2.0 tool, which simulates a stellar spectrum in the presence of a spot, a facula, or a planet. Given these controlled simulations, we vary the resolution as well as the signal-to-noise ratio (S/N) to obtain a lower limit on the resolution and S/N required to robustly detect features. We show that a spot and a facula covering 1% of the stellar disk can be robustly detected at a S/N (per pixel) of 35 and 60, respectively, for any spectral resolution above 20,000, while a planet with a radial velocity of 10 m s⁻¹ can be detected at a S/N (per pixel) of 600. Rather than viewing noise as an impediment, our approach uses noise as a source of information.
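The multifractal framework mentioned here builds on scaling analyses of time series. As a toy illustration of that family of methods (not the authors' pipeline), the sketch below estimates a single scaling exponent with a basic detrended fluctuation analysis; the scales and the white-noise test signal are assumptions:

```python
# Basic detrended fluctuation analysis (DFA): integrate the series, detrend
# it in windows of increasing size, and read the scaling exponent off the
# log-log slope of the RMS fluctuation versus window size.
import numpy as np

def dfa_exponent(x: np.ndarray, scales=(16, 32, 64, 128, 256)) -> float:
    y = np.cumsum(x - x.mean())                    # integrated profile
    logF, logS = [], []
    for s in scales:
        n = len(y) // s
        segs = y[: n * s].reshape(n, s)
        t = np.arange(s)
        # Detrend each window with a linear fit; collect RMS fluctuations.
        F = [np.sqrt(np.mean((seg - np.polyval(np.polyfit(t, seg, 1), t)) ** 2))
             for seg in segs]
        logF.append(np.log(np.mean(F)))
        logS.append(np.log(s))
    return np.polyfit(logS, logF, 1)[0]            # slope = scaling exponent

rng = np.random.default_rng(1)
print(dfa_exponent(rng.normal(size=4096)))          # ~0.5 for white noise
```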


2021 ◽  
Vol 11 (23) ◽  
pp. 11248
Author(s):  
Brecht De Beelde ◽  
David Plets ◽  
Wout Joseph

With the deployment of data-driven assembly and production factories, challenges arise in sensor data acquisition and gathering. Different wireless technologies are currently used for transferring data, each with its own advantages and constraints. In this paper, we present a hybrid network architecture for providing Quality of Service (QoS) in an industrial environment where guaranteed minimal data rates and maximal latency are of utmost importance for controlling devices and processes. The locations of the access points (APs) are determined during the initial network-planning stage, together with physical parameters such as frequency, transmit power, and modulation and coding schemes. Instead of performing network planning just once before the network rollout, the network is monitored continuously by adding telemetry data to the frame header of all data streams, and it is automatically reconfigured in real time if the requirements are not met. Because maximum transmit powers are not used during the initial rollout, more APs are needed, but coverage remains guaranteed when new obstructions such as metallic racks or machinery are added. We find that decreasing the transmit power by 6 dB gives the best trade-off between the number of required APs and network robustness. The proposed architecture is validated via simulations and a proof-of-concept setup.
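A minimal sketch of the transmit-power trade-off, assuming a log-distance path-loss model; the path-loss exponent, reference loss, and receiver sensitivity below are illustrative assumptions, not parameters from the paper:

```python
# Coverage radius under a log-distance path-loss model:
# PL(d) = PL0 + 10 * n * log10(d), with PL0 the loss at 1 m.
import math

def coverage_radius_m(tx_dbm: float, sensitivity_dbm: float,
                      pl0_db: float = 40.0, n: float = 3.0) -> float:
    """Distance at which received power falls to the sensitivity threshold."""
    budget = tx_dbm - sensitivity_dbm - pl0_db
    return 10 ** (budget / (10 * n))

full = coverage_radius_m(20, -70)      # full transmit power
backoff = coverage_radius_m(14, -70)   # 6 dB back-off
print(f"full power: {full:.0f} m, 6 dB back-off: {backoff:.0f} m")
# With n = 3, a 6 dB back-off shrinks the radius by 10**(6/30) ~ 1.6x,
# so more APs are needed, but each link keeps a margin for new obstructions.
```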


2021 ◽  
Vol 13 (4) ◽  
pp. 172-217
Author(s):  
Wallace P. Mullin ◽  
Christopher M. Snyder

We propose a simple method, requiring only minimal data, for bounding demand elasticities in growing, homogeneous-product markets. Since growing demand curves cannot cross, shifts in market equilibrium over time can be used to “funnel” the demand curve into a narrow region, bounding its slope. Our featured application assesses the antitrust remedy in the 1952 DuPont decision, ordering incumbents to license patents for commercial plastics. We bound the demand elasticity significantly below 1 in many post-remedy years, inconsistent with monopoly, supporting the remedy’s effectiveness. A second application investigates whether the 1911 dissolution of American Tobacco fostered competition in the cigarette market. (JEL K21, L24, L65, L66, N41, N42, O34)
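A worked toy illustration of the bounding logic with invented numbers (not the paper's data): if demand only grows, the chord between two observed equilibria constrains the slope of the earlier demand curve, so an arc elasticity along that chord serves as a bound.

```python
# Midpoint (arc) elasticity along the chord between two market equilibria.
def arc_elasticity(p0: float, q0: float, p1: float, q1: float) -> float:
    dq = (q1 - q0) / ((q1 + q0) / 2)   # relative quantity change
    dp = (p1 - p0) / ((p1 + p0) / 2)   # relative price change
    return dq / dp

# Illustrative post-remedy observations: price falls, quantity rises.
e = arc_elasticity(p0=1.00, q0=100, p1=0.90, q1=104)
print(abs(e))  # ~0.37 < 1: inelastic demand, inconsistent with monopoly
# (a profit-maximizing monopolist prices on the elastic portion of demand).
```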


2021 ◽  
Author(s):  
Samer Alkarkoukly ◽  
Abdul-Mateen Rajput

openEHR is an open-source e-health technology that aims to build data models for interoperable Electronic Health Records (EHRs) and to enhance semantic interoperability. The openEHR architecture consists of different building blocks, among them the "template", which combines different archetypes and collects the data for a specific use case. In this paper, we created a generic data model for a virtual pancreatic cancer patient, using the openEHR approach and tools, to be used in testing and virtual environments. The data elements for this template were derived from the "Oncology minimal data set" of the HiGHmed project. In addition, we generated virtual data profiles for 10 patients using the template. The objective of this exercise is to provide a data model and virtual data profiles for testing and experimentation scenarios within the openEHR environment. Both the template and the 10 virtual patient profiles are publicly available.
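A minimal sketch of what generating such virtual profiles could look like; the field names below loosely echo an oncology minimal data set but are hypothetical placeholders, not the actual openEHR template or HiGHmed element names:

```python
# Deterministically generate synthetic pancreatic-cancer patient profiles.
import random

def virtual_pancreatic_cancer_patient(patient_id: int) -> dict:
    rng = random.Random(patient_id)  # seeded so each profile is reproducible
    return {
        "patient_id": f"VP-{patient_id:03d}",          # hypothetical identifier
        "diagnosis_icd10": "C25.0",                    # pancreatic head tumour
        "age_at_diagnosis": rng.randint(45, 85),
        "sex": rng.choice(["male", "female"]),
        "tnm_stage": rng.choice(["IIA", "IIB", "III", "IV"]),
        "ca19_9_u_per_ml": round(rng.uniform(10, 1200), 1),
    }

profiles = [virtual_pancreatic_cancer_patient(i) for i in range(1, 11)]
print(profiles[0])
```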


2021 ◽  
Author(s):  
Seth Bryant ◽  
Heather McGrath ◽  
Mathieu Boudreault

Abstract. Canada's RADARSAT missions improve the potential to study past flood events; however, existing tools for deriving flood depths from these remote-sensing data do not correct for errors, leading to poor estimates. To provide more accurate gridded depth estimates of historical flooding, a new tool is proposed that integrates Height Above Nearest Drainage and Cost Allocation algorithms. The tool is tested against two trusted, hydraulically derived, gridded depth datasets of recent floods in Canada. This validation shows that the proposed tool outperforms existing tools and can provide more accurate estimates from minimal data, without the need for complex physics-based models or expert judgement. With improvements in remote-sensing data, the tool proposed here can provide flood researchers and emergency managers with accurate depths in near-real time.
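A minimal sketch of the Height Above Nearest Drainage (HAND) idea underlying the tool (not the tool itself, and omitting the Cost Allocation step): given a HAND raster and an estimated water level above the drainage network, depth at each cell is the water level minus the cell's HAND value, floored at zero. The grid and water level below are invented:

```python
# Gridded flood depth from a HAND raster and a water level above drainage.
import numpy as np

hand = np.array([[0.0, 0.5, 1.2],
                 [0.2, 0.9, 2.5],
                 [0.4, 1.8, 3.0]])   # metres above nearest drainage (toy grid)
water_level = 1.0                    # metres above drainage, e.g. inferred
                                     # from a remotely sensed flood extent

depth = np.clip(water_level - hand, 0.0, None)
print(depth)
# Cells whose HAND exceeds the water level stay dry; the rest get a depth.
```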


Author(s):  
J Víctor Moreno-Mayar

Abstract: Present-day and ancient population genomic studies of different organisms have rapidly become accessible to diverse research groups worldwide. Unfortunately, as datasets and analyses become more complex, researchers with less computational experience often miss the chance to analyse their own data. We introduce FrAnTK, a user-friendly toolkit for the computation and visualisation of allele-frequency-based statistics in ancient and present-day genome variation datasets. We provide fast, memory-efficient tools that allow the user to go from sequencing data to complex exploratory analyses and visual representations with minimal data manipulation. Its simple usage and low computational requirements make FrAnTK ideal for users less familiar with computer programming who are carrying out large-scale population studies.
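A minimal sketch of one classic allele-frequency-based statistic of the kind such toolkits compute, the Patterson D ("ABBA-BABA"), written directly in terms of per-site frequencies; this illustrates the statistic only and is not FrAnTK's actual interface:

```python
# Patterson D from derived-allele frequencies p1..p4 of four populations:
# D = sum((p2 - p1) * (p3 - p4))
#   / sum((p1 + p2 - 2*p1*p2) * (p3 + p4 - 2*p3*p4))
import numpy as np

def d_statistic(p1, p2, p3, p4) -> float:
    p1, p2, p3, p4 = map(np.asarray, (p1, p2, p3, p4))
    num = np.sum((p2 - p1) * (p3 - p4))
    den = np.sum((p1 + p2 - 2 * p1 * p2) * (p3 + p4 - 2 * p3 * p4))
    return float(num / den)

rng = np.random.default_rng(2)
freqs = rng.uniform(0.0, 1.0, size=(4, 10_000))  # toy frequencies at 10k sites
print(d_statistic(*freqs))  # ~0 when there is no excess allele sharing
```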

