IMPROVED LARGE-SCALE SLOPE ANALYSIS ON MARS BASED ON CORRELATION OF SLOPES DERIVED WITH DIFFERENT BASELINES

Author(s):  
Y. Wang ◽  
B. Wu

The surface slopes of planetary bodies are important factors for exploration missions, such as landing site selection and rover manoeuvres. Generally, high-resolution digital elevation models (DEMs), such as those generated from HiRISE images of Mars, are preferred for deriving detailed slopes with better fidelity to terrain features. Unfortunately, high-resolution datasets normally cover only small areas and are not always available, whereas lower-resolution datasets, such as MOLA, provide global coverage of the Martian surface. Slopes generated from a low-resolution DEM are based on a large baseline and are smoother than the actual terrain. To enable large-scale slope analysis of the Martian surface based on low-resolution data such as MOLA, while alleviating the smoothing caused by the low resolution, this paper presents an amplifying function for slopes derived from low-resolution DEMs, based on the relationship between DEM resolution and slope. First, slope maps are derived from the HiRISE DEM (a metre-level-resolution DEM generated from HiRISE images) and from a series of down-sampled HiRISE DEMs; the latter are used to simulate low-resolution DEMs. The high-resolution slope map is then down-sampled to the same resolution as the slope maps from the lower-resolution DEMs, so that a pixel-wise comparison can be conducted. Each pixel of the slope map derived from the lower-resolution DEM can be brought to the same value as the down-sampled HiRISE slope by multiplying it by an amplifying factor. Seven sets of HiRISE images covering representative terrain types are used for the correlation analysis, which shows that the relationship between the amplifying factors and the original MOLA slopes can be described by an exponential function. Verifications using other datasets show that, after applying the proposed amplifying function, the updated slope maps give better representations of slopes on the Martian surface than the original slopes.
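
As a rough illustration of the workflow described in this abstract (not the authors' code), the sketch below assumes numpy DEM arrays and a hypothetical exponential model for the amplifying factor; the fitted coefficients reported in the paper are not reproduced here.

```python
import numpy as np
from scipy.optimize import curve_fit

def slope_deg(dem, cell_size):
    """Slope in degrees from a gridded DEM, using central differences
    at the given grid spacing (i.e. the slope baseline)."""
    dzdy, dzdx = np.gradient(dem, cell_size)
    return np.degrees(np.arctan(np.hypot(dzdx, dzdy)))

def amplifying_model(slope_lowres, a, b):
    """Assumed exponential form: amplifying factor as a function of the
    slope derived from the low-resolution DEM (illustrative only)."""
    return a * np.exp(b * slope_lowres)

# slope_lo : slope map computed from the low-resolution (e.g. MOLA-like) DEM
# slope_hi : HiRISE-derived slope map down-sampled to the same grid
# factor   : per-pixel ratio slope_hi / slope_lo
# (a, b), _ = curve_fit(amplifying_model, slope_lo.ravel(), factor.ravel())
# slope_corrected = amplifying_model(slope_lo, a, b) * slope_lo
```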

2021 ◽  
Vol 13 (15) ◽  
pp. 2877
Author(s):  
Yu Tao ◽  
Siting Xiong ◽  
Susan J. Conway ◽  
Jan-Peter Muller ◽  
Anthony Guimpier ◽  
...  

The lack of adequate stereo coverage and, where available, lengthy processing times, various artefacts, unsatisfactory quality, and the complexity of automating the selection of the best set of processing parameters have long been major barriers to large-area planetary 3D mapping. In this paper, we propose a deep learning-based solution, called MADNet (Multi-scale generative Adversarial u-net with Dense convolutional and up-projection blocks), that avoids or resolves all of the above issues. We demonstrate the wide applicability of this technique with the ExoMars Trace Gas Orbiter Colour and Stereo Surface Imaging System (CaSSIS) 4.6 m/pixel images of Mars. Only a single input image and a coarse global 3D reference are required, without knowledge of any camera models or imaging parameters, to produce high-quality and high-resolution full-strip Digital Terrain Models (DTMs) in a few seconds. In this paper, we discuss the technical details of the MADNet system and provide detailed comparisons and assessments of the results. The resultant MADNet 8 m/pixel CaSSIS DTMs are qualitatively very similar to the 1 m/pixel HiRISE DTMs. The MADNet CaSSIS DTMs show excellent agreement with nested Mars Reconnaissance Orbiter Context Camera (CTX), Mars Express High-Resolution Stereo Camera (HRSC), and Mars Orbiter Laser Altimeter (MOLA) DTMs at large scales, while showing fairly good correlation with the High-Resolution Imaging Science Experiment (HiRISE) DTMs for fine-scale details. In addition, we show how MADNet outperforms traditional photogrammetric methods, in both speed and quality, for other datasets such as HRSC, CTX, and HiRISE, without any parameter tuning or re-training of the model. We demonstrate results for Oxia Planum (the landing site of the European Space Agency's Rosalind Franklin ExoMars rover 2023) and a couple of sites of high scientific interest.
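
The abstract does not state how the single-image height prediction is tied to the coarse global 3D reference; purely as an illustration, one simple way to perform such a tie is a linear scale-and-offset fit to co-located reference heights, as sketched below (a hypothetical step, not the published MADNet pipeline).

```python
import numpy as np

def tie_to_reference(pred_height, ref_height, valid):
    """Fit height = a * pred + b to coarse reference heights over valid pixels
    by least squares, then apply the fit to the whole prediction.
    Illustrative assumption only."""
    a, b = np.polyfit(pred_height[valid].ravel(), ref_height[valid].ravel(), 1)
    return a * pred_height + b
```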


2017 ◽  
Vol 10 (3) ◽  
pp. 1383-1402 ◽  
Author(s):  
Paolo Davini ◽  
Jost von Hardenberg ◽  
Susanna Corti ◽  
Hannah M. Christensen ◽  
Stephan Juricke ◽  
...  

Abstract. The Climate SPHINX (Stochastic Physics HIgh resolutioN eXperiments) project is a comprehensive set of ensemble simulations aimed at evaluating the sensitivity of present and future climate to model resolution and stochastic parameterisation. The EC-Earth Earth system model is used to explore the impact of stochastic physics in a large ensemble of 30-year climate integrations at five different atmospheric horizontal resolutions (from 125 to 16 km). The project includes more than 120 simulations in both a historical scenario (1979–2008) and a climate change projection (2039–2068), together with coupled transient runs (1850–2100). A total of 20.4 million core hours have been used, made available through a single-year grant from PRACE (the Partnership for Advanced Computing in Europe), and close to 1.5 PB of output data have been produced on the SuperMUC IBM Petascale System at the Leibniz Supercomputing Centre (LRZ) in Garching, Germany. About 140 TB of post-processed data are stored on the CINECA supercomputing centre archives and are freely accessible to the community thanks to an EUDAT data pilot project. This paper presents the technical and scientific set-up of the experiments, including details of the forcing used for the simulations performed, defining the SPHINX v1.0 protocol. In addition, an overview of preliminary results is given. An improvement in the simulation of Euro-Atlantic atmospheric blocking with increasing resolution is observed. It is also shown that including stochastic parameterisation in the low-resolution runs helps to improve some aspects of the tropical climate – specifically the Madden–Julian Oscillation and tropical rainfall variability. These findings show the importance of representing the impact of small-scale processes on large-scale climate variability either explicitly (with high-resolution simulations) or stochastically (in low-resolution simulations).


Author(s):  
Yan Leng ◽  
Nakash Ali Babwany ◽  
Alex Pentland

Abstract. Diversity has tremendous value in modern society. Economic theories suggest that cultural and ethnic diversity may contribute to economic development and prosperity. To date, however, the correspondence between diversity measures and economic indicators, such as the Consumer Price Index (CPI), has not been quantified, primarily because of the difficulty of obtaining data on micro-level behaviors together with macroeconomic indicators. In this paper, we explore the relationship between diversity measures extracted from large-scale, high-resolution mobile phone data and the CPIs of different sectors in a tourism-dependent country. Using phone records from Andorra, we show that the diversity measures are strongly associated with the general and sectoral CPIs. Based on these strong predictive relationships, we construct daily and spatial maps that monitor CPI measures at high resolution, complementing the existing CPI measures from the statistical office. The case study on Andorra contributes to two growing bodies of literature: linking diversity with economic outcomes, and macroeconomic monitoring with large-scale data. Future work is required to examine the relationship between the two measures in other countries.
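
The abstract does not specify which diversity measures are used; as one hedged illustration, behavioural diversity from mobile phone records is often quantified with the Shannon entropy of a visit (or contact) distribution, which can then be related to a CPI series with a simple regression.

```python
import numpy as np

def shannon_diversity(visit_counts):
    """Shannon entropy of a visit-count vector: one common way to quantify
    behavioural diversity from mobile phone records (an assumption here,
    not necessarily the paper's exact measure)."""
    p = np.asarray(visit_counts, dtype=float)
    p = p[p > 0] / p.sum()
    return -np.sum(p * np.log(p))

# diversity : array of daily or regional diversity values
# cpi       : matching CPI values
# slope, intercept = np.polyfit(diversity, cpi, 1)  # simple linear association
```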


2016 ◽  
Author(s):  
Paolo Davini ◽  
Jost von Hardenberg ◽  
Susanna Corti ◽  
Hannah M. Christensen ◽  
Stephan Juricke ◽  
...  

Abstract. The Climate SPHINX (Stochastic Physics HIgh resolutioN eXperiments) project is a comprehensive set of ensemble simulations aimed at evaluating the sensitivity of present and future climate to model resolution and stochastic parameterisation. The EC-Earth Earth-System Model is used to explore the impact of stochastic physics in a large ensemble of 30-year climate integrations at five different atmospheric horizontal resolutions (from 125 km to 16 km). The project includes more than 120 simulations in both a historical scenario (1979–2008) and a climate change projection (2039–2068), together with coupled transient runs (1850–2100). A total of 20.4 million core hours have been used, made available through a single-year grant from PRACE (the Partnership for Advanced Computing in Europe), and close to 1.5 PB of output data have been produced on the SuperMUC IBM Petascale System at the Leibniz Supercomputing Centre (LRZ) in Garching, Germany. About 140 TB of post-processed data are stored on the CINECA supercomputing centre archives and are freely accessible to the community thanks to an EUDAT Data Pilot project. This paper presents the technical and scientific set-up of the experiments, including details of the forcing used for the simulations performed, defining the SPHINX v1.0 protocol. In addition, an overview of preliminary results is given: an improvement in the simulation of Euro-Atlantic atmospheric blocking with increasing resolution is observed. It is also shown that including stochastic parameterisation in the low-resolution runs helps to improve some aspects of the tropical climate – specifically the Madden–Julian Oscillation and tropical rainfall variability. These findings show the importance of representing the impact of small-scale processes on large-scale climate variability either explicitly (with high-resolution simulations) or stochastically (in low-resolution simulations).


2006 ◽  
Vol 7 (1) ◽  
pp. 61-80 ◽  
Author(s):  
B. Decharme ◽  
H. Douville ◽  
A. Boone ◽  
F. Habets ◽  
J. Noilhan

Abstract This study focuses on the influence of an exponential profile of saturated hydraulic conductivity, ksat, with soil depth on the water budget simulated by the Interaction Soil Biosphere Atmosphere (ISBA) land surface model over the French Rhône River basin. With this exponential profile, the saturated hydraulic conductivity at the surface increases by approximately a factor of 10, and its mean value increases in the root zone and decreases in the deeper region of the soil in comparison with the values given by Clapp and Hornberger. This new version of ISBA is compared to the original version in offline simulations using the Rhône-Aggregation high-resolution database. Low-resolution simulations, where all atmospheric data and surface parameters have been aggregated, are also performed to test the impact of the modified ksat profile at the typical scale of a climate model. The simulated discharges are compared to observations from a dense network consisting of 88 gauging stations. Results of the high-resolution experiments show that the exponential profile of ksat globally improves the simulated discharges and that the assumption of an increase in saturated hydraulic conductivity from the soil surface to a depth close to the rooting depth in comparison with values given by Clapp and Hornberger is reasonable. Results of the scaling experiments indicate that this parameterization is also suitable for large-scale hydrological applications. Nevertheless, low-resolution simulations with both model versions overestimate evapotranspiration (especially from the plant transpiration and the wet fraction of the canopy) to the detriment of total runoff, which emphasizes the need for implementing subgrid distribution of precipitation and land surface properties in large-scale hydrological applications.
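
For reference, the exponential ksat profile discussed above has the generic form below; the notation and the value of the decay factor are illustrative and not taken from the paper.

```latex
% k_sat decreases exponentially with depth z below a reference depth d_c
% (taken close to the rooting depth) and increases above it; f is a decay factor.
% Per the abstract, the surface value is roughly 10 times the reference value.
k_{\mathrm{sat}}(z) = k_{\mathrm{sat},c}\,\exp\!\left[-f\,(z - d_c)\right]
```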


2019 ◽  
Vol 219 (Supplement_1) ◽  
pp. S137-S151 ◽  
Author(s):  
Julien Aubert

SUMMARY The geodynamo features a broad separation between the large scale at which Earth's magnetic field is sustained against ohmic dissipation and the small scales of the turbulent and electrically conducting underlying fluid flow in the outer core. Here, the properties of this scale separation are analysed using high-resolution numerical simulations that approach closer to Earth's core conditions than earlier models. The new simulations are obtained by increasing the resolution and gradually relaxing the hyperdiffusive approximation of previously published low-resolution cases. This upsizing process does not perturb the previously obtained large-scale, leading-order quasi-geostrophic (QG) and first-order magneto-Archimedes-Coriolis (MAC) force balances. As a result, upsizing causes only weak transients typically lasting a fraction of a convective overturn time, thereby demonstrating the efficiency of this approach in reaching extreme conditions at reduced computational cost. As Earth's core conditions are approached in the upsized simulations, ohmic losses dissipate up to 97 per cent of the injected convective power. Kinetic energy spectra feature a gradually broadening, self-similar, power-law spectral range extending over more than a decade in length scale. In this range, the spectral energy density profile of vorticity is shown to be approximately flat between the large scale at which the magnetic field draws its energy from convection through the QG-MAC force balance and the small scale at which this energy is dissipated. The resulting velocity and density anomaly planforms in physical space consist of large-scale columnar sheets and plumes, respectively, co-existing with small-scale vorticity filaments and density anomaly ramifications. In contrast, magnetic field planforms keep their large-scale structure after upsizing. The small-scale vorticity filaments are aligned with the large-scale magnetic field lines, thereby minimizing the dynamical influence of the Lorentz force. The diagnostic outputs of the upsized simulations are more consistent with the asymptotic QG-MAC theory than those of the low-resolution cases that they originate from, but they still feature small residual deviations that may call for further theoretical refinements to account for the structuring constraints of the magnetic field on the flow.


Author(s):  
Yiran Wang ◽  
Bo Wu

Images from two sensors, the High-Resolution Imaging Science Experiment (HiRISE) and the Context Camera (CTX), both on board the Mars Reconnaissance Orbiter (MRO), were used to generate high-quality DEMs (Digital Elevation Models) of the Martian surface. However, there were discrepancies between the DEMs generated from the images acquired by these two sensors, for various reasons such as variations in the boresight alignment between the two sensors during flight in the complex environment. This paper presents a systematic investigation of the discrepancies between the DEMs generated from the HiRISE and CTX images. A combined adjustment algorithm is presented for the co-registration of HiRISE and CTX DEMs. Experimental analysis was carried out using HiRISE and CTX images collected at the Mars Rover landing site and several other typical regions. The results indicated that there were systematic offsets between the HiRISE and CTX DEMs in the longitude and latitude directions, whereas the offset in altitude was less pronounced. After the combined adjustment, the offsets were eliminated and the HiRISE and CTX DEMs were co-registered to each other. The presented research is of significance for the synergistic use of HiRISE and CTX images for precision Mars topographic mapping.
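
As a hedged illustration of the kind of offset estimation involved (not the paper's combined adjustment algorithm, which also adjusts the sensors' orientation data), a small horizontal and vertical shift between two overlapping DEMs on a common grid can be estimated by modelling their height difference as a linear function of the reference terrain gradient plus a vertical offset.

```python
import numpy as np

def estimate_shift(dem_ref, dem_src, cell_size):
    """Estimate a small (tx, ty, tz) shift between two co-gridded DEMs by
    least squares, using the terrain gradient of the reference DEM as the
    design matrix. Illustrative sketch only."""
    dzdy, dzdx = np.gradient(dem_ref, cell_size)
    dh = (dem_src - dem_ref).ravel()
    A = np.column_stack([dzdx.ravel(), dzdy.ravel(), np.ones_like(dh)])
    (tx, ty, tz), *_ = np.linalg.lstsq(A, dh, rcond=None)
    return tx, ty, tz
```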


2021 ◽  
Vol 13 (21) ◽  
pp. 4220
Author(s):  
Yu Tao ◽  
Jan-Peter Muller ◽  
Siting Xiong ◽  
Susan J. Conway

The High-Resolution Imaging Science Experiment (HiRISE) onboard the Mars Reconnaissance Orbiter provides remotely sensed imagery of the surface of Mars at the highest spatial resolution, 25–50 cm/pixel. However, because the spatial resolution is so high, the total area covered by HiRISE targeted stereo acquisitions is very limited. This results in limited availability of high-resolution digital terrain models (DTMs) better than 1 m/pixel. Such high-resolution DTMs have always been considered desirable by the international community of planetary scientists for carrying out fine-scale geological analysis of the Martian surface. Recently, new deep learning-based techniques that retrieve DTMs from single optical orbital images have been developed and applied to single HiRISE observations. In this paper, we improve upon a previously developed single-image DTM estimation system called MADNet (1.0). We propose a set of optimisations, collectively called MADNet 2.0, based on a supervised image-to-height estimation network, multi-scale DTM reconstruction, and 3D co-alignment processes. In particular, we employ optimised single-scale inference and multi-scale reconstruction (in MADNet 2.0), instead of multi-scale inference and single-scale reconstruction (in MADNet 1.0), to produce more accurate large-scale topographic retrieval with boosted fine-scale resolution. We demonstrate the improvements of the MADNet 2.0 DTMs produced using HiRISE images, in comparison to the MADNet 1.0 DTMs and the published Planetary Data System (PDS) DTMs, over the ExoMars Rosalind Franklin rover's landing site at Oxia Planum. Qualitative and quantitative assessments suggest that the proposed MADNet 2.0 system is capable of pixel-scale DTM retrieval at the same spatial resolution (25 cm/pixel) as the input HiRISE images.
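
The multi-scale reconstruction step is not detailed in the abstract; the sketch below only illustrates the generic idea of coarse-to-fine blending, i.e. keeping the low-frequency topography of a coarser, already co-aligned DTM and adding the high-frequency detail of a finer-scale prediction (a hypothetical stand-in, not the published MADNet 2.0 implementation).

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def coarse_to_fine_merge(coarse_on_fine_grid, fine_dtm, sigma=8.0):
    """Blend a coarser DTM (already co-aligned and resampled onto the fine grid)
    with a finer-scale prediction: low frequencies come from the coarse DTM,
    high frequencies from the fine one. Illustrative only."""
    low = gaussian_filter(coarse_on_fine_grid, sigma)    # large-scale topography
    high = fine_dtm - gaussian_filter(fine_dtm, sigma)   # fine-scale detail
    return low + high
```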


1966 ◽  
Vol 24 ◽  
pp. 141-154
Author(s):  
D. H. P. Jones

1. The relationship between spectral classification and multi-colour photometry is that between high resolution with low photometric accuracy and low resolution with high photometric accuracy. Observations with a spectrum scanner are one method of bridging the gap. The continuum can be measured in regions comparatively free from lines, especially in the red; moreover the stronger spectral features can be measured quantitatively.


Author(s):  
T. Kramm ◽  
D. Hoffmeister

Abstract. The resolution and accuracy of digital elevation models (DEMs) have direct influence on further geoscientific computations like landform classifications and hydrologic modelling results. Thus, it is crucial to analyse the accuracy of DEMs to select the most suitable elevation model regarding the aim, accuracy and scale of the study. Nowadays several worldwide DEMs are available, as well as DEMs covering regional or local extents. In this study a variety of globally available elevation models were evaluated for an area of about 190,000 km². Data from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) 30 m, Shuttle Radar Topography Mission (SRTM) 30 m and 90 m, Advanced Land Observing Satellite (ALOS) World 3D 30 m, and TanDEM-X WorldDEM™ 12 m and 90 m resolution datasets were obtained. Additionally, several very high-resolution DEMs were derived from stereo satellite imagery from SPOT 6/7 and Pléiades for smaller areas of about 100–400 km² per dataset. All datasets were evaluated with height points of the Geoscience Laser Altimeter System (GLAS) instrument aboard the NASA Ice, Cloud, and land Elevation (ICESat) satellite on a regional scale, and with nine very high-resolution elevation models from UAV-based photogrammetry on a very large scale. For all datasets the root mean square error (RMSE) and normalized median absolute deviation (NMAD) were calculated. Furthermore, the association of errors with specific terrain was examined by assigning these errors to landforms from the topographic position index (TPI), topographic roughness index (TRI) and slope. Among the globally available datasets, the results show the highest overall accuracies for the TanDEM-X 12 m (RMSE: 2.3 m, NMAD: 0.8 m). The lowest accuracies were detected for the 30 m ASTER GDEM v3 (RMSE: 8.9 m, NMAD: 7.1 m). Depending on the landscape, the accuracies are higher for all DEMs in flat terrain, and the errors rise significantly in rougher terrain. Local-scale DEMs derived from stereo satellite imagery show a varying overall accuracy, mainly depending on the topography covered by the scene.
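
A minimal sketch of the two accuracy measures used in this study, assuming a one-dimensional array of DEM-minus-reference height differences (e.g. against the ICESat GLAS points):

```python
import numpy as np

def rmse(dh):
    """Root mean square error of the height differences."""
    dh = np.asarray(dh, dtype=float)
    return np.sqrt(np.mean(dh**2))

def nmad(dh):
    """Normalized median absolute deviation: 1.4826 * median(|dh - median(dh)|),
    a robust spread estimate that is less sensitive to outliers than the RMSE."""
    dh = np.asarray(dh, dtype=float)
    return 1.4826 * np.median(np.abs(dh - np.median(dh)))
```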

