A Novel Approach of the Modelling of Dynamics of the Ice Cover Applying Microsatellites Data

Author(s):  
Magdalena Łukosz ◽  
Wojciech Witkowski

Keywords: ice cover; glacier dynamics; microsatellites; offset-tracking; climate change

Radar images acquired by SAR satellites allow scientists to monitor the movement of glaciers in polar regions. Observation of these areas is significant because it provides information on the process of global warming. It also makes it possible to assess the amount of ice mass that is melting and, as a result, raising the global mean sea level. Because of the high velocities and loss of coherence in glacial areas, the optimal technique for estimating glacier velocity is offset-tracking. Its accuracy depends on the ground pixel size, so using high-resolution images can improve the accuracy of the results. Microsatellites open up new possibilities through high-resolution imagery and short revisit times.

The study uses ICEYE products. The aim of the research was to investigate the influence of SAR image resolution on the accuracy of movements calculated with the offset-tracking method. Additionally, a comparison of the obtained results with previous studies allowed us to analyze changes in the dynamics of the chosen areas. The research was carried out for two glaciers: Jakobshavn in Greenland and Thwaites in Antarctica. This made it possible to compare the quality of results in areas located in different parts of the world and moving with different dynamics. Calculations were also made for Sentinel-1 SAR images for comparative analysis.

As a result of the research, glacier velocities and flow directions over periods of several days were obtained. For Thwaites Glacier, daily changes in dynamics were also analyzed. Moreover, comparing the results with earlier studies carried out in these areas made it possible to estimate changes in the ice cover over longer timespans. In the last step, the quality and accuracy of products obtained from the ICEYE and Sentinel-1 satellites were compared.

This research assesses the utility of microsatellite images for monitoring glacier movements and shows possibilities for their use in future research.
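The core matching step of offset-tracking can be illustrated with a minimal sketch: a reference patch from one SAR acquisition is located inside a search window from a later acquisition by maximizing the normalized cross-correlation, and the recovered pixel offset, scaled by the ground pixel size and the time between acquisitions, gives the velocity. This is a generic illustration of the technique, not the authors' processing chain.

```python
import numpy as np

def offset_track(ref_patch, search_win):
    """Find the (row, col) position of ref_patch inside a larger search
    window by brute-force normalized cross-correlation (NCC)."""
    ph, pw = ref_patch.shape
    sh, sw = search_win.shape
    r = ref_patch - ref_patch.mean()
    rn = np.sqrt((r * r).sum())
    best, best_off = -2.0, (0, 0)
    for i in range(sh - ph + 1):
        for j in range(sw - pw + 1):
            c = search_win[i:i + ph, j:j + pw]
            cc = c - c.mean()
            denom = rn * np.sqrt((cc * cc).sum())
            if denom == 0:
                continue  # flat patch: NCC undefined
            score = (r * cc).sum() / denom
            if score > best:
                best, best_off = score, (i, j)
    return best_off, best

# Synthetic check: embed a patch at a known position and recover it.
rng = np.random.default_rng(0)
scene = rng.random((40, 40))
patch = scene[12:20, 15:23].copy()
off, score = offset_track(patch, scene)   # expect (12, 15), score near 1
```

In practice the offset between two dates is converted to a velocity as `offset_pixels * ground_pixel_size / time_baseline`, which is why a finer ground pixel (as in ICEYE imagery) directly tightens the velocity estimate.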

2020 ◽  
Author(s):  
Silvan Leinss ◽  
Shiyi Li ◽  
Philipp Bernhard ◽  
Othmar Frey

The velocity of glaciers is commonly derived by offset tracking, using pairwise cross-correlation or feature matching of either optical or synthetic aperture radar (SAR) images. SAR images, however, are inherently affected by noise-like radar speckle and therefore require much larger image patches for successful tracking than the patch sizes used with optical data. As a consequence, glacier velocity maps based on SAR offset tracking have a relatively low resolution compared to the nominal resolution of SAR sensors. Moreover, tracking may fail because small features on the glacier surface cannot be detected through the radar speckle. Although speckle can be reduced by applying spatial low-pass filters (e.g. a 5x5 boxcar), the spatial smoothing reduces the image resolution roughly by an order of magnitude, which strongly reduces the tracking precision. Furthermore, it blurs out small features on the glacier surface, so tracking can still fail unless clear features such as large crevasses are visible.

To create high-resolution velocity maps from SAR images and to generate speckle-free radar images of glaciers, we present a new method that derives the glacier surface velocity field by correlating temporally averaged sub-stacks of a series of SAR images. The key feature of the method is to warp every pixel in each SAR image according to its temporally increasing offset with respect to a reference date. The offset is determined by the glacier velocity, which is obtained by maximizing the cross-correlation between the averages of two sub-stacks. Currently, we need to assume that the surface velocity is constant during the acquisition period of the image series, but this assumption can be relaxed to a certain extent.

As the method combines the information of multiple images, radar speckle is strongly suppressed by temporal multi-looking, and the signal-to-noise ratio of the cross-correlation is therefore significantly improved. We found that the method outperforms the pairwise cross-correlation method for velocity estimation in terms of both the coverage and the resolution of the velocity field. At the same time, very high resolution radar images are obtained and reveal features that are otherwise hidden in radar speckle.

As the reference date, to which the sub-stacks are averaged, can be chosen arbitrarily, a smooth flow animation of the glacier surface can be generated from a limited number of SAR images. The presented method could form the basis of a new generation of tracking methods, as it is excellently suited to exploiting the large number of emerging, freely and globally available high-resolution SAR image time series.
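The speckle-suppression effect of temporal multi-looking can be sketched numerically: averaging N images corrupted by fully developed (multiplicative, unit-mean exponential) speckle reduces the coefficient of variation roughly as 1/sqrt(N), which is what makes the stack-averaged cross-correlation so much more reliable. The per-pixel warping to a common reference date described above is omitted here for brevity; this is an illustration of the statistics, not the authors' full method.

```python
import numpy as np

rng = np.random.default_rng(1)
truth = np.ones((64, 64))   # flat backscatter scene
N = 32                      # number of images in the sub-stack

# Fully developed speckle: multiplicative, unit-mean exponential intensity.
stack = truth * rng.exponential(1.0, size=(N, 64, 64))

single_cv = stack[0].std() / stack[0].mean()   # ~1 for a single look
multi = stack.mean(axis=0)                     # temporal multi-look
multi_cv = multi.std() / multi.mean()          # ~1/sqrt(N) ~ 0.18
```

The roughly fivefold drop in the coefficient of variation for N = 32 is why patches drawn from the averaged sub-stacks correlate far more sharply than single-look patches of the same size.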


Author(s):  
H.S. von Harrach ◽  
D.E. Jesson ◽  
S.J. Pennycook

Phase-contrast TEM has been the leading technique for high-resolution imaging of materials for many years, whilst STEM has been the principal method for high-resolution microanalysis. However, it was demonstrated many years ago that low-angle dark-field STEM imaging is a priori capable of almost 50% higher point resolution than coherent bright-field imaging (i.e. phase-contrast TEM or STEM). This advantage was not exploited until Pennycook developed the high-angle annular dark-field (ADF) technique, which can provide an incoherent image showing both high image resolution and atomic-number contrast. This paper describes the design and first results of a 300 kV field-emission STEM (VG Microscopes HB603U) which has improved ADF STEM image resolution towards the 1 Å target. The instrument uses a cold field-emission gun, generating a 300 kV beam of up to 1 μA from an 11-stage accelerator. The beam is focussed onto the specimen by two condensers and a condenser-objective lens with a spherical aberration coefficient of 1.0 mm.


Author(s):  
Abdallah Naser ◽  
Ahmad Lotfi ◽  
Joni Zhong

Human distance estimation is essential in many vital applications, particularly in human-localisation-based systems such as independent-living applications for older adults, and in making places safe by preventing the transmission of contagious diseases through social-distancing alert systems. Previous approaches to estimating the distance between a reference sensing device and a human subject relied on visual or high-resolution thermal cameras. However, regular visual cameras raise serious concerns about people's privacy in indoor environments, and high-resolution thermal cameras are costly. This paper proposes a novel approach to estimating distance for indoor human-centred applications using a low-resolution thermal sensor array. The proposed system presents a discrete distance estimator and an adaptive-sensor-placement continuous distance estimator, using classification techniques and an artificial neural network, respectively. It also proposes a real-time distance-based field-of-view classification through a novel image-based feature. In addition, the paper proposes a transfer application of the proposed continuous distance estimator to measure human height. The proposed approach is evaluated in different indoor environments and sensor placements with different participants. This paper shows a median overall error of ±0.2 m in continuous estimation and 96.8% accuracy in discrete distance estimation.
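The discrete-estimation idea can be sketched with a toy classifier: a warm subject occupies fewer pixels of a low-resolution thermal frame as it moves away, so a simple feature such as the count of above-ambient pixels, matched to per-class centroids, yields a distance class. The threshold, centroids, and 8x8 frame size below are all hypothetical illustrations, not the paper's trained model or sensor parameters.

```python
import numpy as np

AMBIENT_C = 24.0   # hypothetical ambient-temperature threshold, degrees C

def warm_area(frame):
    """Count pixels warmer than ambient -- a crude subject-size feature."""
    return int((frame > AMBIENT_C).sum())

# Hypothetical training centroids: mean warm-pixel count per distance class.
CENTROIDS = {1.0: 20.0, 2.0: 9.0, 3.0: 4.0}   # metres -> mean warm area

def classify_distance(frame):
    """Nearest-centroid discrete distance estimate from one thermal frame."""
    a = warm_area(frame)
    return min(CENTROIDS, key=lambda d: abs(CENTROIDS[d] - a))

# An 8x8 frame with a 3x3 warm blob (9 pixels) -> nearest the 2 m class.
frame = np.full((8, 8), 22.0)
frame[2:5, 3:6] = 30.0
```

A continuous estimator, as in the paper, would replace the nearest-centroid rule with a regression model (e.g. a small neural network) over richer image-based features.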


Machines ◽  
2021 ◽  
Vol 9 (1) ◽  
pp. 13
Author(s):  
Yuhang Yang ◽  
Zhiqiao Dong ◽  
Yuquan Meng ◽  
Chenhui Shao

High-fidelity characterization and effective monitoring of spatial and spatiotemporal processes are crucial for high-performance quality control of many manufacturing processes and systems in the era of smart manufacturing. Although recent developments in measurement technologies have made it possible to acquire high-resolution three-dimensional (3D) surface measurement data, it is generally expensive and time-consuming to use such technologies in real-world production settings. Data-driven approaches that stem from statistics and machine learning can potentially enable intelligent, cost-effective surface measurement and thus allow manufacturers to use high-resolution surface data for better decision-making without introducing the substantial production cost induced by data acquisition. Among these methods, spatial and spatiotemporal interpolation techniques can draw inferences about unmeasured locations on a surface from the measurements at other locations, thus decreasing the measurement cost and time. However, interpolation methods are very sensitive to the availability of measurement data, and their performance largely depends on the measurement scheme or sampling design, i.e., how measurement efforts are allocated. As such, sampling design is considered another important field that enables intelligent surface measurement. This paper reviews and summarizes the state-of-the-art research in interpolation and sampling design for surface measurement in varied manufacturing applications. Research gaps and future research directions are also identified and can serve as a fundamental guideline for industrial practitioners and researchers in future studies in these areas.
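The interpolation idea can be made concrete with one of the simplest members of the family the review covers: inverse-distance weighting (IDW), which infers the surface height at an unmeasured location as a distance-weighted average of nearby measured points. The review discusses far richer interpolators (e.g. kriging); IDW merely illustrates the principle, and the sample points below are invented.

```python
import numpy as np

def idw(xy_known, z_known, xy_query, power=2.0):
    """Inverse-distance-weighted estimate of z at xy_query."""
    d = np.linalg.norm(xy_known - xy_query, axis=1)
    if np.any(d == 0):                    # query coincides with a sample
        return float(z_known[np.argmin(d)])
    w = 1.0 / d ** power                  # closer points weigh more
    return float(np.sum(w * z_known) / np.sum(w))

# Four measured surface heights at the corners of a unit square.
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
z = np.array([1.0, 2.0, 2.0, 3.0])

# The center is equidistant from all samples, so IDW returns their mean.
center = idw(pts, z, np.array([0.5, 0.5]))   # -> 2.0
```

Sampling design then asks the complementary question: given such an interpolator, where should the limited measurement budget be spent so that estimates like `center` are most accurate?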


Coatings ◽  
2021 ◽  
Vol 11 (7) ◽  
pp. 758
Author(s):  
Cibi Pranav ◽  
Minh-Tan Do ◽  
Yi-Chang Tsai

High Friction Surfaces (HFS) are applied to increase friction capacity on critical roadway sections, such as horizontal curves. HFS friction deterioration on these sections is a safety concern. This study deals with the characterization of aggregate loss, one of the main failure mechanisms of HFS, using texture parameters to study its relationship with friction. Tests are conducted on selected HFS spots with different aggregate-loss severity levels at the National Center for Asphalt Technology (NCAT) Test Track. Friction tests are performed using a Dynamic Friction Tester (DFT). The surface texture is measured by means of a high-resolution 3D pavement scanning system (0.025 mm vertical resolution). Texture data are processed and analyzed with the MountainsMap software. The correlations between the DFT friction coefficient and the texture parameters confirm the impact of changes in the aggregates' characteristics (including height, shape, and material volume) on friction. A novel approach to detecting the HFS friction-coefficient transition based on aggregate loss, inspired by previous work on the tribology of coatings, is proposed. Preliminary outcomes with the proposed approach show that it is possible to observe the rapid friction-coefficient transition, similar to the observations at NCAT. Perspectives for future research are presented and discussed.
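The correlation analysis between texture parameters and friction amounts to computing, for each texture parameter, its Pearson correlation with the DFT friction coefficient across test spots. The values below are illustrative (a mean aggregate height declining with aggregate-loss severity), not NCAT data.

```python
import numpy as np

# Hypothetical values for five HFS spots of increasing aggregate-loss severity.
height = np.array([1.8, 1.5, 1.2, 0.9, 0.6])      # mean aggregate height, mm
mu_dft = np.array([0.95, 0.90, 0.80, 0.65, 0.45])  # DFT friction coefficient

# Pearson correlation: a strong positive r means friction drops as the
# texture parameter drops, i.e. as aggregate loss progresses.
r = float(np.corrcoef(height, mu_dft)[0, 1])
```

Repeating this over parameters describing aggregate height, shape, and material volume identifies which texture descriptors track the friction transition most closely.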


Coatings ◽  
2021 ◽  
Vol 11 (2) ◽  
pp. 204
Author(s):  
Yuhao Zhou ◽  
Bowen Ji ◽  
Minghao Wang ◽  
Kai Zhang ◽  
Shuaiqi Huangfu ◽  
...  

Remarkable progress has been made in the resolution, biocompatibility, durability and stretchability of implantable brain-computer interfaces (BCIs) in recent decades. Because of the inevitable damage to brain tissue caused by traditional rigid devices, thin-film devices are developing rapidly and attracting considerable attention, with continuous progress in flexible materials and non-silicon micro/nano fabrication methods. It is therefore necessary to systematically summarize the recent development of implantable thin-film devices for acquiring brain information. This brief review subdivides flexible thin-film devices into four categories: planar, open-mesh, probe, and micro-wire layouts. An overview of the fabrication approaches is also presented. Traditional lithography and state-of-the-art processing methods are discussed with respect to the key issue of high resolution. Special substrates and interconnects are also highlighted, with their varied materials and fabrication routines. In conclusion, a discussion of the remaining obstacles and directions for future research is provided.


Risks ◽  
2021 ◽  
Vol 9 (6) ◽  
pp. 114
Author(s):  
Paritosh Navinchandra Jha ◽  
Marco Cucculelli

The paper introduces a novel approach to ensemble modeling as a weighted model average technique. The proposed idea is prudent, simple to understand, and easy to implement compared to the Bayesian and frequentist approaches. The paper provides both theoretical and empirical contributions to assessing credit risk (probability of default) effectively in a new way, by creating an ensemble model as a weighted linear combination of machine learning models. The idea can be generalized to any classification problem in other domains where ensemble-type modeling is of interest, and is not limited to unbalanced datasets or credit risk assessment. The results suggest better forecasting performance than the single best of the well-known parametric, non-parametric, and other ensemble machine learning models. As a future research direction, the scope of our approach can be extended to estimating the weights differently, which may further improve the performance of the model average.
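The weighted-model-average idea can be sketched in a few lines: each model's predicted probability of default enters a linear combination whose weights are normalized model scores. Weighting by validation accuracy, as below, is one simple choice for illustration; the paper's actual weighting scheme, models, and numbers are not reproduced here.

```python
import numpy as np

def ensemble_pd(probas, weights):
    """Weighted linear combination of per-model default probabilities.

    probas  : list of per-model probability vectors, one vector per model
    weights : one non-negative score per model (normalized internally)
    """
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                       # weights sum to 1
    return np.asarray(probas).T @ w       # per-borrower weighted average

# Three models' predicted probabilities of default for two borrowers.
probas = [[0.10, 0.80],    # model A
          [0.20, 0.60],    # model B
          [0.15, 0.90]]    # model C
acc = [0.70, 0.80, 0.90]   # hypothetical validation accuracies as weights
pd_hat = ensemble_pd(probas, acc)
```

Because the combination is linear and the weights sum to one, each ensembled probability stays within the range spanned by the individual models, which keeps the averaged output interpretable as a probability.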

