Real-Time Large-Scale Fusion of High Resolution 3D Scans with Details Preservation

Author(s):  
Hicham Sekkati ◽  
Jonathan Boisvert ◽  
Guy Godin ◽  
Louis Borgeat
2020 ◽  
Author(s):  
Vera Thiemig ◽  
Peter Salamon ◽  
Goncalo N. Gomes ◽  
Jon O. Skøien ◽  
Markus Ziese ◽  
...  

We present EMO-5, a pan-European high-resolution (5 km), (sub-)daily, multi-variable meteorological data set developed specifically to meet the needs of an operational, pan-European hydrological service (EFAS; European Flood Awareness System). The data set is built on historic and real-time observations from 18,964 meteorological in-situ stations, collected from 24 data providers, and 10,632 virtual stations from four high-resolution regional observational grids (CombiPrecip, ZAMG - INCA, EURO4M-APGD and CarpatClim) as well as one global reanalysis product (ERA-Interim/Land). This multi-variable data set covers precipitation, temperature (average, min and max), wind speed, solar radiation and vapor pressure, all at daily resolution, with 6-hourly resolution additionally available for precipitation and average temperature. The original observations were thoroughly quality controlled before the Spheremap interpolation method was used to estimate the variable values for each 5 x 5 km grid cell and their affiliated uncertainty. The EMO-5 v1 grids covering the period from 1990 to 2019 will be released as a free and open Copernicus product in mid-2020 (with near-real-time release of the latest gridded observations to follow). We would like to present the great potential EMO-5 holds for the hydrological modelling community.

Footnote: EMO = European Meteorological Observations
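As a rough illustration of the gridding step: Spheremap is a Shepard-type interpolation scheme adapted to the sphere, and the sketch below reduces it to plain inverse-distance weighting on great-circle distances. The directional and gradient corrections of the full method are omitted, and the search radius and power below are hypothetical choices, not EMO-5's settings.

```python
# Minimal sketch of a Spheremap-style gridding step: Shepard-type
# inverse-distance weighting using great-circle distances. The full Spheremap
# scheme additionally applies directional and gradient corrections, which are
# omitted here; the search radius and power are illustrative only.
import numpy as np

EARTH_RADIUS_KM = 6371.0

def great_circle_km(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance in kilometres."""
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = (np.sin((lat2 - lat1) / 2) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * np.arcsin(np.sqrt(a))

def idw_cell_value(cell_lat, cell_lon, st_lat, st_lon, st_val,
                   search_radius_km=150.0, power=2.0):
    """Estimate one 5 x 5 km grid-cell value from nearby station values."""
    d = great_circle_km(cell_lat, cell_lon, st_lat, st_lon)
    near = d < search_radius_km
    if not near.any():
        return np.nan                      # no station within reach
    w = 1.0 / np.maximum(d[near], 1e-6) ** power
    return float(np.sum(w * st_val[near]) / np.sum(w))
```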


Ocean Science ◽  
2013 ◽  
Vol 9 (5) ◽  
pp. 901-915 ◽  
Author(s):  
P. Y. Le Traon

Abstract. The launch of the French/US mission Topex/Poseidon (T/P) (CNES/NASA) in August 1992 was the start of a revolution in oceanography. For the first time, a very precise altimeter system optimized for large-scale sea level and ocean circulation observations was flying. T/P alone could not observe the mesoscale circulation. In the 1990s, the ESA satellites ERS-1/2 were flying simultaneously with T/P. Together with my CLS colleagues, we demonstrated that we could use T/P as a reference mission for ERS-1/2 and bring the ERS-1/2 data to an accuracy level comparable to T/P. Near-real-time high-resolution global sea level anomaly maps were then derived. These maps have been operationally produced as part of the SSALTO/DUACS system for the last 15 years. They are now widely used by the oceanographic community and have contributed to a much better understanding and recognition of the role and importance of mesoscale dynamics. Altimetry needs to be complemented with global in situ observations. At the end of the 90s, a major international initiative was launched to develop Argo, the global array of profiling floats. This has been an outstanding success. Argo floats now provide the most important in situ observations for monitoring and understanding the role of the ocean in the Earth's climate and for operational oceanography. This is a second revolution in oceanography. The unique capability of satellite altimetry to observe the global ocean in near-real-time at high resolution and the development of Argo were essential for the development of global operational oceanography, the third revolution in oceanography. The Global Ocean Data Assimilation Experiment (GODAE) was instrumental in the development of the required capabilities. This paper provides a historical perspective on the development of these three revolutions in oceanography, which are very much interlinked. This is not an exhaustive review, and I will mainly focus on the contributions we made together with many colleagues and friends.


2009 ◽  
Vol 9 (2) ◽  
pp. 303-314 ◽  
Author(s):  
S. Martinis ◽  
A. Twele ◽  
S. Voigt

Abstract. In this paper, an automatic near-real-time (NRT) flood detection approach is presented which combines histogram thresholding and segmentation-based classification, specifically oriented to the analysis of single-polarized very high resolution Synthetic Aperture Radar (SAR) satellite data. The challenge of SAR-based flood detection is addressed in a completely unsupervised way, assuming that no training data, and therefore no prior information about the class statistics of the area of investigation, are available. This is usually the case in NRT disaster management, where the collection of ground-truth information is not feasible due to time constraints. In most cases, a simple thresholding algorithm can be used to distinguish between "flood" and "non-flood" pixels in a high-resolution SAR image and to detect the largest part of an inundation area. Because local gray-level changes may not be captured by global thresholding techniques in large satellite scenes, the thresholding algorithm is integrated into a split-based approach, in which a global threshold is derived by analysing and combining the information inherent in the individual splits. The derived global threshold is then integrated into a multi-scale segmentation step combining the advantages of small-, medium- and large-scale per-parcel segmentation. Experimental investigations performed on a TerraSAR-X StripMap scene from southwest England during large-scale flooding in the summer of 2007 show high classification accuracies of the proposed split-based approach in combination with image segmentation and optional integration of digital elevation models.
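The split-based idea can be pictured with a short sketch: tile the scene, keep the tiles most likely to contain both water and land, threshold each of those tiles, and fuse the per-tile thresholds into one global value. Otsu's method below is a generic stand-in for the parametric thresholding used in the paper, and the tile size, selection proxy and fusion rule are illustrative assumptions.

```python
# Sketch of a split-based global threshold for SAR flood mapping. Tiles with
# high local standard deviation are used as a simple bimodality proxy (both
# dark water and brighter land present); Otsu's method is a generic stand-in
# for the paper's parametric thresholding, and all parameters are illustrative.
import numpy as np
from skimage.filters import threshold_otsu

def split_based_threshold(sar_db, tile=512, n_best=5):
    h, w = sar_db.shape
    candidates = []
    for i in range(0, h - tile + 1, tile):
        for j in range(0, w - tile + 1, tile):
            t = sar_db[i:i + tile, j:j + tile]
            candidates.append((t.std(), t))
    if not candidates:
        raise ValueError("scene smaller than one tile")
    # Tiles with the highest standard deviation are most likely bimodal.
    candidates.sort(key=lambda s: s[0], reverse=True)
    thresholds = [threshold_otsu(t) for _, t in candidates[:n_best]]
    return float(np.mean(thresholds))

# flood_mask = sar_db < split_based_threshold(sar_db)  # water is dark in SAR
```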


Water ◽  
2021 ◽  
Vol 13 (3) ◽  
pp. 259
Author(s):  
Guohan Zhao ◽  
Thomas Balstrøm ◽  
Ole Mark ◽  
Marina B. Jensen

The accuracy of two-dimensional hydrodynamic models (2D models) is improved when high-resolution Digital Elevation Models (DEMs) are used. However, the entailed high spatial discretisation results in excessive computational expenses, thus prohibiting their implementation in real-time forecasting, especially at a large scale. This paper presents a sub-model approach that adapts 1D static models to tailor high-resolution 2D model grids to specified targets, such that, within a GIS-based multi-scale simulation framework, the tailor-made 2D hydrodynamic sub-models yield fast processing without significant loss of accuracy. To validate the proposed approach, model experiments were first designed to separately test the impact of the two outcomes (i.e., the reduced computational domains and the optimised boundary conditions) on the final 2D prediction results. Then, the robustness of the sub-model approach was evaluated on four focus areas with distinct catchment terrain morphologies as well as distinct rainfall return periods of 1–100 years. The sub-model approach resulted in 45–553 times faster processing with a 99% reduction in the number of computational cells for all four cases; the F2 goodness-of-fit index for the predicted flood extents was above 0.88, flood depths yielded Root Mean Square Errors (RMSE) below 1.5 cm, and the discrepancies of the u- and v-directional velocities at selected points were less than 0.015 m s−1. As such, this approach reduces the 2D models' computing expenses significantly, thus paving the way for large-scale high-resolution 2D real-time forecasting.
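The extent and depth scores quoted above can be computed with standard map-comparison measures. A minimal sketch follows, assuming F2 denotes the common flood-extent fit index |observed ∩ modelled| / |observed ∪ modelled|; the paper's exact definition may differ, and the input rasters are hypothetical.

```python
# Minimal validation sketch for 2D flood predictions. F2 is assumed here to
# be the common flood-extent fit index |obs & mod| / |obs | mod|; the paper's
# exact definition may differ. Inputs are hypothetical rasters on one grid.
import numpy as np

def flood_fit_f2(obs_wet, mod_wet):
    """Extent agreement between observed and modelled boolean flood masks."""
    inter = np.logical_and(obs_wet, mod_wet).sum()
    union = np.logical_or(obs_wet, mod_wet).sum()
    return inter / union if union else 1.0

def depth_rmse(obs_depth, mod_depth, wet_mask):
    """RMSE of water depths over jointly wet cells, in the depth units used."""
    d = (obs_depth - mod_depth)[wet_mask]
    return float(np.sqrt(np.mean(d ** 2)))
```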


2020 ◽  
Author(s):  
Guohan Zhao ◽  
Thomas Balstrøm ◽  
Ole Mark ◽  
Marina B. Jensen

Abstract. The accuracy of two-dimensional urban flood models (2D models) is improved when high-resolution Digital Elevation Models (DEMs) are used, but the entailed high spatial discretisation results in excessive computational expenses, prohibiting the use of 2D models in real-time forecasting at a large scale. This paper presents a sub-model approach to tailoring high-resolution 2D model grids to specified targets, such that the tailor-made sub-models yield fast processing without significant loss of accuracy. Among the numerous sinks detected in full-basin high-resolution DEMs, the computationally important ones are determined using a proposed Volume Ratio Sink Screening method, and the drainage basin is discretised into a collection of sub-impact zones according to those sinks' spatial configuration. When full-basin distributed static rainfall is added, the drainage basin's flow conditions are modelled as a 1D static flow using a fast inundation-spreading algorithm. Next, the sub-impact zones relevant to the targets' local inundation process are identified by tracing the 1D flow continuity, and the critical computational cells are then selected from the high-resolution model grids by spatial intersection. In MIKE FLOOD's 2D simulations, those screened cells configure the reduced computational domains as well as the optimised boundary conditions, which ultimately enables fast 2D prediction in the tailor-made sub-model. To validate the method, model experiments were designed to test the impact of the reduced computational domains and the optimised boundary conditions separately. Further, the general applicability and robustness of the sub-model approach were evaluated by targeting four focus areas representing different catchment terrain morphologies as well as different rainfall return periods of 1–100 years. The sub-model approach resulted in 45–553 times faster processing with a 99 % reduction in the number of computational cells for all four cases; the predicted flood extents, depths and flow velocities showed only marginal discrepancies, with Root Mean Square Errors (RMSE) below 1.5 cm. As such, this approach reduces the 2D models' computing expenses significantly, thus paving the way for large-scale high-resolution 2D real-time forecasting.
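The sink-screening idea can be sketched as follows: each DEM sink's storage volume is compared with the runoff volume its local catchment would receive under a design rainfall, and only sinks that would overflow are kept as computationally important. The ratio definition and the cutoff below are assumptions for illustration, not the paper's exact criterion.

```python
# Illustrative sketch of a volume-ratio sink screening. For each detected
# sink, the ratio of inflowing runoff volume (design rainfall depth times
# contributing area) to the sink's storage volume is computed; sinks whose
# ratio exceeds a cutoff would overflow and route water downstream, so they
# are kept. Ratio definition and cutoff are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Sink:
    sink_id: int
    storage_m3: float          # volume the depression can hold
    catchment_area_m2: float   # area draining to the sink

def screen_sinks(sinks, rain_depth_m, ratio_cutoff=1.0):
    important = []
    for s in sinks:
        runoff_m3 = rain_depth_m * s.catchment_area_m2
        if runoff_m3 / s.storage_m3 > ratio_cutoff:
            important.append(s.sink_id)
    return important

# Example: a 20 mm event over a 5 ha sub-catchment overflows an 800 m3 sink:
# screen_sinks([Sink(1, 800.0, 50_000.0)], rain_depth_m=0.02)  -> [1]
```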


2012 ◽  
Vol 27 (3) ◽  
pp. 784-795 ◽  
Author(s):  
Dan Bikos ◽  
Daniel T. Lindsey ◽  
Jason Otkin ◽  
Justin Sieglaff ◽  
Louie Grasso ◽  
...  

Abstract Output from a real-time high-resolution numerical model is used to generate synthetic infrared satellite imagery. It is shown that this imagery helps to characterize model-simulated large-scale precursors to the formation of deep-convective storms as well as the subsequent development of storm systems. A strategy for using this imagery in the forecasting of severe convective weather is presented. This strategy involves comparing model-simulated precursors to their observed counterparts to help anticipate model errors in the timing and location of storm formation, while using the simulated storm evolution as guidance.
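Generating synthetic infrared imagery from model fields ultimately comes down to converting a simulated top-of-atmosphere radiance into an equivalent blackbody brightness temperature for the chosen channel. A minimal sketch of that final conversion follows, using the inverse Planck function; the actual system described relies on a full forward radiative transfer model, and the 10.7 µm window channel here is illustrative.

```python
# Sketch of the last step of synthetic satellite imagery: inverting Planck's
# law, B(T) = c1 / (exp(c2/T) - 1), to turn a simulated monochromatic
# radiance into an equivalent blackbody brightness temperature. A real system
# obtains the radiance from a forward radiative transfer model; the 10.7 um
# channel used here is illustrative.
import numpy as np

H = 6.626e-34   # Planck constant, J s
C = 2.998e8     # speed of light, m s^-1
KB = 1.381e-23  # Boltzmann constant, J K^-1

def brightness_temperature(radiance, wavelength_m=10.7e-6):
    """Invert Planck's law; radiance in W m^-2 sr^-1 m^-1, result in K."""
    c1 = 2.0 * H * C ** 2 / wavelength_m ** 5
    c2 = H * C / (KB * wavelength_m)
    return c2 / np.log1p(c1 / radiance)
```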


2021 ◽  
Author(s):  
Ying Xu ◽  
Tingting Song ◽  
Xiaozhou Wang ◽  
Jiao Zheng ◽  
Yu Li ◽  
...  

Abstract Background: Spinal muscular atrophy (SMA) is a common neuromuscular disorder caused by the absence of both copies of the survival motor neuron 1 (SMN1) gene. Population-wide SMA screening quantifying the copy number of SMN1 is recommended in multiple regions, and an SMN1 diagnostic assay with a simplified procedure, high sensitivity and high throughput is still needed. Methods: A Real-Time PCR with High-Resolution Melting assay for quantifying SMN1 exon 7 and exon 8 copy numbers was established and confirmed by multiplex ligation-dependent probe amplification (MLPA). Samples from 2563 individuals, including SMA patients, suspected cases and the general population, were analyzed by the real-time PCR assay, and the results were compared with the gold-standard MLPA test. Results: Homozygous and heterozygous deletions were identified by the Real-Time PCR with High-Resolution Melting method with incidences of 10.18% and 2.42%, respectively. In addition, the R value distribution (P>0.05) among the 8 replicates and the coefficient of variation (CV<0.003) suggested that the qPCR screening test had high reproducibility. High concordance was obtained between Real-Time PCR with High-Resolution Melting and MLPA. Conclusions: The qPCR assay based on High-Resolution Melting provides a sensitive, high-throughput approach to large-scale SMA carrier screening with low cost and labor.
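Copy-number calls from such a qPCR assay are typically derived by relative quantification against a reference gene and a calibrator of known copy number. A minimal sketch of the generic delta-delta-Cq calculation follows; the paper's own normalization statistic (its R value) is not reproduced here, and the example Cq values are hypothetical.

```python
# Sketch of SMN1 copy-number estimation from qPCR Cq values by the standard
# delta-delta-Cq method: normalize the target (e.g. SMN1 exon 7) against a
# reference gene, compare with a calibrator known to carry two copies, and
# round to an integer copy number. This is the generic textbook scheme, not
# the paper's exact "R value" normalization.

def copy_number(cq_target, cq_reference, cq_cal_target, cq_cal_reference,
                calibrator_copies=2):
    d_cq_sample = cq_target - cq_reference        # sample normalization
    d_cq_cal = cq_cal_target - cq_cal_reference   # calibrator normalization
    ddcq = d_cq_sample - d_cq_cal
    return round(calibrator_copies * 2.0 ** (-ddcq))

# A sample whose normalized Cq sits ~1 cycle above the two-copy calibrator
# has roughly half the template, i.e. one SMN1 copy (a carrier):
# copy_number(25.0, 22.0, 24.0, 22.0)  -> 1
```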


2021 ◽  
Vol 22 (1) ◽  
Author(s):  
Matteo Perini ◽  
Gherard Batisti Biffignandi ◽  
Domenico Di Carlo ◽  
Ajay Ratan Pasala ◽  
Aurora Piazza ◽  
...  

Abstract Background The rapid identification of pathogen clones is pivotal for effective epidemiological control strategies in hospital settings. High Resolution Melting (HRM) is a molecular biology technique suitable for fast and inexpensive pathogen typing protocols. Unfortunately, the mathematical and informatics skills required to analyse HRM data for pathogen typing likely limit the application of this promising technique in hospital settings. Results MeltingPlot is the first tool specifically designed for epidemiological investigations using HRM data, easing the application of HRM typing to large-scale real-time surveillance and rapid outbreak reconstructions. MeltingPlot implements a graph-based algorithm designed to discriminate pathogen clones on the basis of HRM data, producing portable typing results. The tool also merges typing information with isolate and patient metadata to create graphical and tabular outputs useful in epidemiological investigations, and it runs in a few seconds even with hundreds of isolates. Availability: https://skynet.unimi.it/index.php/tools/meltingplot/. Conclusions Analysing and interpreting the results of HRM typing protocols can be non-trivial, which has likely limited the technique's application in hospital settings. MeltingPlot is a web tool designed to help users reconstruct epidemiological events by combining HRM-based clustering methods with isolate and patient metadata. The tool can be used to implement HRM-based real-time, large-scale surveillance programs in hospital settings.
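The graph-based clone discrimination can be pictured simply: isolates become nodes, an edge joins two isolates whose melting profiles agree within a tolerance, and connected components are reported as putative clones. The sketch below uses a single melting temperature per isolate and an arbitrary 0.5 °C tolerance as simplifications of MeltingPlot's actual multi-amplicon algorithm.

```python
# Illustrative sketch of graph-based clone calling from HRM data: isolates
# are nodes, an edge connects isolates whose melting temperatures agree
# within a tolerance, and each connected component is a putative clone. The
# single-Tm distance and 0.5 C tolerance are simplifications of
# MeltingPlot's actual multi-amplicon algorithm.
import networkx as nx

def hrm_clones(melting_temps, tol_c=0.5):
    """melting_temps: {isolate_id: Tm in deg C} -> list of clone sets."""
    g = nx.Graph()
    g.add_nodes_from(melting_temps)
    ids = list(melting_temps)
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            if abs(melting_temps[a] - melting_temps[b]) <= tol_c:
                g.add_edge(a, b)
    return [set(c) for c in nx.connected_components(g)]

# hrm_clones({"iso1": 84.2, "iso2": 84.4, "iso3": 86.1})
# -> [{"iso1", "iso2"}, {"iso3"}]
```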

