cloud models
Recently Published Documents


TOTAL DOCUMENTS

355
(FIVE YEARS 108)

H-INDEX

35
(FIVE YEARS 5)

2022 ◽  
Vol 15 (2) ◽  
pp. 1-27
Author(s):  
Andrea Damiani ◽  
Giorgia Fiscaletti ◽  
Marco Bacis ◽  
Rolando Brondolin ◽  
Marco D. Santambrogio

“Cloud-native” is the umbrella adjective describing the standard approach for developing applications that exploit cloud infrastructures’ scalability and elasticity at their best. As application complexity and user bases grow, designing for performance becomes a first-class engineering concern. In answer to these needs, heterogeneous computing platforms have gained widespread attention as powerful tools for continuing to meet SLAs for compute-intensive cloud-native workloads. We propose BlastFunction, an FPGA-as-a-Service full-stack framework that eases the adoption of FPGAs for cloud-native workloads by integrating with the vast spectrum of fundamental cloud models. At the IaaS level, BlastFunction time-shares FPGA-based accelerators to provide multi-tenant access to accelerated resources without any code rewriting. At the PaaS level, BlastFunction accelerates functionalities leveraging the serverless model and scales functions proactively, depending on the workload’s performance. Further lowering the adoption barrier, an accelerator registry hosts accelerated functions ready to be used within cloud-native applications, bringing the simplicity of a SaaS-like approach to developers. In an extensive experimental campaign on state-of-the-art cloud scenarios, we show that BlastFunction achieves higher utilization and throughput than native execution, with minimal differences in latency and overhead. Moreover, the proposed scaling scheme outperforms the main serverless autoscaling algorithms in both workload performance and the number of scaling operations.
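Proactive, performance-driven scaling of the kind described can be illustrated with a minimal sketch: size the replica pool from the observed load rather than reacting to saturation. The control law, function names, and parameters here are assumptions for illustration, not BlastFunction's actual algorithm.

```python
import math

def proactive_replicas(request_rate, per_replica_throughput, headroom=1.2,
                       min_replicas=1, max_replicas=16):
    """Replica count needed to sustain the offered load with some headroom.

    request_rate and per_replica_throughput share the same unit (req/s);
    headroom > 1 over-provisions slightly so bursts do not violate the SLA.
    """
    needed = math.ceil(request_rate * headroom / per_replica_throughput)
    return max(min_replicas, min(max_replicas, needed))
```

For example, at 950 req/s with replicas sustaining 200 req/s each and 20% headroom, the sketch requests 6 replicas instead of the reactive minimum of 5.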


2022 ◽  
Vol 8 ◽  
Author(s):  
Chao Cao ◽  
Feng Cai ◽  
Hongshuai Qi ◽  
Jianhui Liu ◽  
Gang Lei ◽  
...  

Global climate change-induced sea-level rise and intensifying storm waves, together with the large population densities and high-intensity development activities in coastal areas, have placed a serious burden on China’s coasts, driving the rapid growth of artificial shoreline development and forming a “new Great Wall” of reinforced concrete that works against the laws of nature. Marine transgression after the last ice age shaped the varied features of China’s coast. Based on geological and geomorphological types, the coast is divided into 36 evaluation units, and 10 indicators are selected covering natural aspects (tectonics, geomorphology, sediment, and storms) and socioeconomic aspects (population and gross domestic product, GDP); cloud model theory is then used to build a coastal erosion vulnerability evaluation index system for China. The results show that high (V), high-middle (IV), middle (III), low-middle (II), and low (I) coastal erosion vulnerability grades account for 5.56, 13.89, 41.67, 33.33, and 5.56% of the Chinese coastline, respectively. The coastal erosion vulnerability of the subsidence zone is significantly higher than that of the uplift zone. Calculations with the reverse cloud model and the analytic hierarchy process show that the main factors controlling coastal erosion vulnerability since the post-glacial transgression are geological structure, topography, and lithology, joined in recent years by decreasing marine sediment loads and increasing reclamation engineering. Mainland China must live with coastal erosion as a basic condition, and this study shows that the proposed index system and cloud modeling method are suitable for evaluating the coastal erosion vulnerability of the Chinese mainland. The study provides a scientific basis for the adaptive management of coastal erosion, coastal disaster assessment, and the overall planning of land and sea.
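Cloud model theory of the kind used here is commonly built on the forward normal cloud generator, parameterized by expectation Ex, entropy En, and hyper-entropy He. A minimal sketch of that standard generator follows; it illustrates the general technique, not the paper's specific implementation or parameter values.

```python
import math
import random

def forward_cloud(Ex, En, He, n, seed=0):
    """Forward normal cloud generator: n cloud drops (value, membership degree).

    Each drop's dispersion En_i is itself drawn around En with spread He,
    which is what gives cloud models their characteristic fuzzy thickness.
    """
    rng = random.Random(seed)
    drops = []
    for _ in range(n):
        En_i = rng.gauss(En, He)                          # entropy perturbed by hyper-entropy
        x = rng.gauss(Ex, abs(En_i))                      # drop drawn around the expectation
        mu = math.exp(-(x - Ex) ** 2 / (2 * En_i ** 2))   # membership degree in (0, 1]
        drops.append((x, mu))
    return drops
```

A reverse cloud generator, as mentioned in the abstract, does the inverse: it estimates (Ex, En, He) back from a sample of drops.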


2022 ◽  
Vol 259 ◽  
pp. 107238
Author(s):  
Ying Guo ◽  
Xiaoling Lu ◽  
Jiquan Zhang ◽  
Kaiwei Li ◽  
Rui Wang ◽  
...  

2021 ◽  
Author(s):  
Huan Yu ◽  
Claudia Emde ◽  
Arve Kylling ◽  
Ben Veihelmann ◽  
Bernhard Mayer ◽  
...  

Abstract. Operational retrievals of tropospheric trace gases from space-borne spectrometers are based on one-dimensional radiative transfer models. To minimize cloud effects, trace gas retrievals generally implement Lambertian cloud models based on radiometric cloud fraction estimates and photon path length corrections. The latter rely on measurements of the oxygen collision pair (O2-O2) absorption at 477 nm or on the oxygen A-band around 760 nm. In reality, however, the impact of clouds is much more complex, involving unresolved sub-pixel clouds, scattering from clouds in neighboring pixels, and cloud shadow effects, such that unresolved three-dimensional cloud effects may introduce significant biases in trace gas retrievals. To quantify this impact, we study NO2 as an example trace gas and apply standard retrieval methods, including approximate cloud corrections, to synthetic data generated by the state-of-the-art three-dimensional Monte Carlo radiative transfer model MYSTIC. A sensitivity study is performed for simulations including a box cloud, and the dependency on various parameters is investigated. The most significant bias is found for cloud shadow effects under polluted conditions. Biases depend strongly on cloud shadow fraction, NO2 profile, cloud optical thickness, solar zenith angle, and surface albedo. Several approaches to correct NO2 retrievals under cloud shadow conditions are explored. We find that air mass factors calculated using a fitted surface albedo or corrected using the O2-O2 slant column density can partly mitigate cloud shadow effects. However, these approaches are limited to cloud-free pixels affected by surrounding clouds. A parameterization approach is presented based on relationships derived from the sensitivity study. This allows measurements for which the standard NO2 retrieval produces a significant bias to be identified, and therefore provides a way to improve the current data flagging approach.
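The Lambertian cloud correction mentioned above is conventionally applied through an independent-pixel blend of clear-sky and cloudy air mass factors (AMFs), weighted by the cloud radiance fraction. A hedged sketch of that textbook form follows; it shows the general idea, not this paper's exact correction scheme or any real product's coefficients.

```python
def vertical_column(scd, amf_clear, amf_cloudy, cloud_radiance_fraction):
    """Trace gas vertical column from a slant column density (SCD).

    The effective AMF is an independent-pixel blend of clear and cloudy
    AMFs; cloud_radiance_fraction in [0, 1] is the share of measured
    radiance attributed to the cloudy part of the pixel.
    """
    amf = ((1.0 - cloud_radiance_fraction) * amf_clear
           + cloud_radiance_fraction * amf_cloudy)
    return scd / amf
```

The 3D biases studied in the paper arise precisely because this 1D blend has no term for shadowing by clouds outside the pixel.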


2021 ◽  
Vol 11 (23) ◽  
pp. 11522
Author(s):  
Quoc-Trung Do ◽  
Wen-Yang Chang ◽  
Li-Wei Chen

In an era of rapid industrial development, automated production lines are a fundamental and crucial goal for robotic pick-and-place. However, most picking and placing of workpieces in the stamping industry is still performed manually. An intelligent, fully automatic system with robotic pick-and-place in place of human labor therefore needs to be developed. This study proposes dynamic workpiece modeling integrated with a robotic arm, based on two stereo vision scans using the fast point-feature histogram algorithm, for the stamping industry. The point cloud models of workpieces are acquired by leveraging two depth cameras (Microsoft Azure Kinect) after stereo calibration. The 6D poses of workpieces, comprising three translations and three rotations, are estimated by applying point cloud processing algorithms. After the workpiece is modeled, a microcontroller-driven conveyor delivers the moving workpiece to the robot. To accomplish this dynamic task, a formula relating the conveyor velocity to the robot’s moving speed is implemented. The average error in the 6D pose between our system and practical measurements is below 7%. The performance of the proposed method and algorithm was appraised in real experiments with a specified stamping workpiece.
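One simple way to realize a conveyor–robot timing formula like the one described is to predict where the workpiece will be once the robot's motion completes, and command the grasp there. The sketch below assumes a constant belt velocity and a known robot travel time; all names are illustrative, and this is not the paper's actual formula.

```python
import numpy as np

def predicted_grasp_position(position_now, conveyor_velocity, robot_travel_time):
    """Predict the workpiece position at the moment the robot arrives.

    position_now: (x, y, z) of the workpiece at trigger time, in meters.
    conveyor_velocity: (vx, vy, vz) of the belt, in m/s (orientation on a
    flat belt is unchanged, so only translation needs prediction).
    """
    p = np.asarray(position_now, float)
    v = np.asarray(conveyor_velocity, float)
    return p + v * robot_travel_time
```

In practice the robot travel time itself depends on the target, so this prediction is typically iterated once or twice until it converges.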


Author(s):  
Natalia Kouremeti ◽  
William Kitchin ◽  
Taras Plakhotnik

Abstract A detailed description is given of how the liquid water content (LWC) and the ice water content (IWC) can be determined accurately and absolutely from the measured water Raman spectra of clouds. All instrumental and spectroscopic parameters that affect the accuracy of the water-content measurement are discussed and quantified; specifically, these are the effective absolute differential Raman backscattering cross section of water vapor dσ(π)/dΩ, and the molecular Raman backscattering efficiencies ηliq and ηice of liquid and frozen microparticles, respectively. The latter two are determined following rigorous theoretical approaches combined with RAMSES measurements. For ηice, this includes a new experimental method which assumes continuity of the number of water molecules across the vertical extent of the melting layer. Examples of water-content measurements are presented, including supercooled liquid-water clouds and melting layers. Error sources are discussed; one effect that stands out is interference from aerosol fluorescence. Aerosol effects and calibration issues are the main reasons why spectral Raman measurements are required for quantitative measurements of LWC and IWC. The presented study lays the foundation for cloud microphysical investigations and for the evaluation of cloud models or the cloud data products of other instruments. As a first application, IWC retrieval methods based on either lidar extinction or radar reflectivity measurements are evaluated. While the lidar-based retrievals show unsatisfactory agreement with the RAMSES IWC measurements, the radar-based IWC retrieval used in the Cloudnet project performs reasonably well. On average, retrieved IWC agrees within 20% to 30% (dry bias) with measured IWC.
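Radar-reflectivity-based IWC retrievals like the one evaluated here are often expressed as a power law, IWC = a·Z^b, with coefficients fitted empirically (in Cloudnet, they additionally depend on temperature). The sketch below uses purely illustrative placeholder coefficients, not the Cloudnet values, just to show the shape of such a retrieval.

```python
def iwc_from_reflectivity(z_dbz, a=0.1, b=0.6):
    """Hypothetical power-law IWC(Z) retrieval.

    z_dbz: radar reflectivity in dBZ, converted to linear units before the
    power law is applied. Coefficients a and b are illustrative only.
    """
    z_linear = 10.0 ** (z_dbz / 10.0)   # dBZ -> linear reflectivity (mm^6 m^-3)
    return a * z_linear ** b            # ice water content on an arbitrary g m^-3 scale
```

The 20–30% dry bias reported above would correspond to such fitted coefficients systematically under-predicting the directly measured IWC.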


Sensors ◽  
2021 ◽  
Vol 21 (22) ◽  
pp. 7558
Author(s):  
Linyan Cui ◽  
Guolong Zhang ◽  
Jinshen Wang

For the engineering application of manipulator grasping, mechanical-arm occlusion and a limited imaging angle produce various holes in the reconstructed 3D point clouds of objects. Acquiring a complete point cloud model of the grasped object plays a very important role in the manipulator’s subsequent task planning. This paper proposes a method to automatically detect and repair the holes in the 3D point cloud model of symmetrical objects grasped by a manipulator. Using an established virtual camera coordinate system together with boundary detection, the closed boundaries of the nested holes were detected and classified into two kinds: mechanical-claw holes caused by arm occlusion, and missing surface produced by the limited imaging angle. These two kinds of holes were then repaired based on surface reconstruction and object symmetry. Experiments on simulated and real point cloud models demonstrate that our approach outperforms other state-of-the-art 3D point cloud hole repair algorithms.
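Symmetry-based hole repair ultimately relies on reflecting observed points across the object's symmetry plane so that the visible side fills in the occluded side. A minimal sketch of that reflection step follows; the symmetry plane is assumed already known here, whereas in practice (and in this paper's pipeline) it must first be estimated.

```python
import numpy as np

def mirror_across_plane(points, plane_point, plane_normal):
    """Reflect an (N, 3) point cloud across a plane given by a point and normal.

    The mirrored copy of the observed surface can be merged with the
    original cloud to fill holes on the occluded side of a symmetric object.
    """
    pts = np.asarray(points, float)
    n = np.asarray(plane_normal, float)
    n = n / np.linalg.norm(n)                            # unit normal
    dist = (pts - np.asarray(plane_point, float)) @ n    # signed distances to the plane
    return pts - 2.0 * dist[:, None] * n                 # reflect each point
```

Mechanical-claw holes, by contrast, sit on the contact surface itself and need the surface-reconstruction repair path instead.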


2021 ◽  
Author(s):  
Adam Erickson ◽  
Nicholas Coops

Reliable estimates of canopy light transmission are critical to understanding the structure and function of vegetation communities but are difficult and costly to attain by traditional field inventory methods. Airborne laser scanning (ALS) data uniquely provide a multi-angular, vertically resolved representation of canopy geometry across large geographic areas. While previous studies have proposed ALS indices of canopy light transmission, new algorithms based on theoretical advancements may improve on existing models. Herein, we propose two new models of canopy light transmission (i.e., gap fraction, or Po, the inverse of angular canopy closure). We evaluate the models against a suite of existing models and ancillary metrics, validated with convex spherical densiometer measurements for 950 field plots in the foothills of Alberta, Canada. We also test the effects of synthetic hemispherical lens models on the performance of the proposed hemispherical Voronoi gap fraction (Phv) index. While vertical canopy cover metrics showed the best overall fit to field measurements, one new metric, point-density-normalized gap fraction (Ppdn), outperformed all other gap fraction metrics by two-fold. We provide suggestions for further algorithm enhancements based on improvements to the validation data. We argue that traditional field measurements are no longer appropriate for ‘ground-truthing’ modern LiDAR or SfM point cloud models, as the latter provide orders of magnitude greater sampling and coverage. We discuss the implications of this finding for LiDAR applications in forestry.
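For orientation, the simplest ALS gap fraction proxies are cover-based: the share of returns that penetrate below a canopy height threshold. The sketch below shows that baseline idea only; it is not the paper's Phv or Ppdn metric, and the threshold value is an illustrative assumption.

```python
import numpy as np

def gap_fraction(return_heights_m, canopy_threshold_m=1.37):
    """Cover-based gap fraction proxy from ALS first-return heights.

    Po is approximated as the fraction of pulses reaching below the canopy
    threshold (breast height is a common, though arbitrary, choice).
    """
    h = np.asarray(return_heights_m, float)
    return float(np.mean(h < canopy_threshold_m))
```

The paper's point is precisely that such vertical-cover proxies ignore the angular and point-density structure that its new metrics exploit.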


2021 ◽  
Vol 21 (19) ◽  
pp. 15213-15220
Author(s):  
Bernd Kärcher ◽  
Claudia Marcolli

Abstract. The homogeneous nucleation of ice in supercooled liquid-water clouds is characterized by time-dependent freezing rates. By contrast, water phase transitions induced heterogeneously by ice-nucleating particles (INPs) are described by time-independent ice-active fractions depending on ice supersaturation (s). Laboratory studies report ice-active particle number fractions (AFs) that are cumulative in s. Cloud models budget INP and ice crystal numbers to conserve total particle number during water phase transitions. Here, we show that ice formation from INPs with time-independent nucleation behavior is overpredicted when models budget particle numbers and at the same time derive ice crystal numbers from s-cumulative AFs. This causes a bias towards heterogeneous ice formation in situations where INPs compete with homogeneous droplet freezing during cloud formation. We resolve this issue by introducing differential AFs, thereby moving us one step closer to more robust simulations of aerosol–cloud interactions.
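The proposed fix can be sketched directly: differential ice-active fractions are obtained by differencing the s-cumulative AFs, so that each supersaturation interval contributes only the particles newly activated in it. A minimal sketch, assuming AFs sampled on an ascending s grid (the clipping step is an assumption to guard against measurement noise, not part of the paper's formalism):

```python
import numpy as np

def differential_af(cumulative_af):
    """Differential ice-active fractions from s-cumulative AFs.

    cumulative_af: AF values on an ascending ice-supersaturation grid.
    Returns the fraction of INPs newly activated in each s interval.
    """
    caf = np.asarray(cumulative_af, float)
    daf = np.diff(caf, prepend=0.0)     # newly activated per interval
    return np.clip(daf, 0.0, None)      # suppress noise-induced negative dips
```

Budgeting INP numbers against these differential fractions avoids counting the same particles again at every higher supersaturation.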


Author(s):  
Mustapha Adewusi

The persistent demand for larger bandwidth at lower cost drives satellite signal transmission toward ever higher frequencies, which is accompanied by proportionally greater hydrometeor attenuation. Hence, there is a need to periodically evaluate the impact of cloud attenuation in every climatic region. This report is one of the outcomes of experimental communication research carried out at the tropical Ota (6.7°N, 3.23°E) station in southwest Nigeria. The station’s spectrum analyzer measures the total attenuation of the received beacons at 12.245 GHz, at an elevation angle of 59.9° to the Astra satellites located at 28.2°E. Daily maximum, minimum, and mean temperatures, rain amount, wind speed and direction, and the time of occurrence of each of these weather parameters were also measured. The radiometric data, together with acquired radiosonde data, were then analysed under rainy and non-rainy conditions to extract the cloud attenuation contribution from the total attenuation measured per minute. The various datasets span measurement periods of between four and fifty-eight years. The outputs were used to compute the station’s cumulative distributions for the existing cloud models and for the integrated station data. Statistical analysis comparing the two cumulative distributions shows a large difference between the measured data and the existing models’ predicted values. Hence, a cloud attenuation computation algorithm and its simulation program were developed and used to derive a new tropical cloud attenuation model, which the climatic data and analysis results corroborate.
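The cumulative distributions compared in such propagation studies are exceedance statistics: the percentage of time a given attenuation level is exceeded. A minimal sketch of that computation over a per-minute attenuation record (names and thresholds are illustrative):

```python
import numpy as np

def exceedance_percentages(attenuation_db, thresholds_db):
    """Percentage of time each attenuation threshold (dB) is exceeded.

    attenuation_db: time series of measured attenuation, one sample per
    minute; thresholds_db: attenuation levels of interest.
    """
    a = np.asarray(attenuation_db, float)
    return [100.0 * float(np.mean(a > t)) for t in thresholds_db]
```

Plotting these percentages against the thresholds yields the familiar attenuation-exceedance curves that model predictions are tested against.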

