Erosion and deposition vulnerability of small (<5,000 km2) tropical islands

PLoS ONE ◽ 2021 ◽ Vol 16 (9) ◽ pp. e0253080
Author(s): Trevor N. Browning, Derek E. Sawyer

The tropics are naturally vulnerable to watershed erosion. The region's population is growing rapidly (projected to reach 50% of the global population by 2050), and the accompanying land use change exacerbates erosion. The issue is of particular interest on the world's many (~45,000) small (<5,000 km2) tropical islands, home to >115M residents, where ecotourism and sediment-intolerant ecosystems such as coral reefs drive the economy. However, vulnerability to erosion and deposition is poorly quantified in these regions because small islands are misclassified or excluded in coarse global analyses. We use the only vulnerability assessment method that connects watershed erosion to coastal deposition to compare locally sourced, high-resolution datasets (5 x 5 m) with satellite-collected, remotely sensed low-resolution datasets (463 x 463 m). We find that at the island scale (~52 km2) the difference in vulnerability calculated by the two methods is minor. At the watershed scale, however, low-resolution datasets fail to accurately capture watershed and coastal deposition vulnerability when compared with the high-resolution analysis. Specifically, we find that anthropogenic development (roads and buildings) is poorly constrained at the global scale: structures and roads are difficult to identify in heavily forested regions using satellite algorithms, and the rapid, ongoing pace of development aggravates the problem. We recommend that end-users of this method obtain locally sourced anthropogenic development datasets for the best results while using low-resolution datasets for the other variables. Fortunately, anthropogenic development data can be collected through community-based research or identified from satellite imagery by users at any level of expertise. Using the high-resolution results, we identify a development trend across St. John and regions that are both high risk and likely targets for future development. Previously published modeled and measured sedimentation rates demonstrate that the method is accurate with either low- or high-resolution data, but the anthropogenic development, watershed slope, and earthquake probability datasets should be of the highest resolution available for the region of interest.
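The abstract's central methodological point, that coarsening the input grids flattens the terrain variables feeding a vulnerability index, can be illustrated with a short sketch. This is not the authors' code: the synthetic DEM, the grid sizes, and the block-averaging scheme below are illustrative assumptions, chosen only to show how a ~5 m grid and a ~463 m grid yield different mean watershed slopes.

```python
# Sketch (not the authors' method): block-averaging a high-resolution DEM
# to a coarse grid systematically flattens the slopes that feed an
# erosion-vulnerability index. All values here are synthetic placeholders.
import numpy as np

def block_average(dem: np.ndarray, factor: int) -> np.ndarray:
    """Downsample a square DEM by averaging factor x factor blocks."""
    n = (dem.shape[0] // factor) * factor
    d = dem[:n, :n]
    return d.reshape(n // factor, factor, n // factor, factor).mean(axis=(1, 3))

def mean_slope(dem: np.ndarray, cell: float) -> float:
    """Mean slope magnitude (rise/run) from central differences."""
    gy, gx = np.gradient(dem, cell)
    return float(np.mean(np.hypot(gx, gy)))

rng = np.random.default_rng(0)
# Synthetic 5 m DEM: a smooth ridge plus rough small-scale relief.
x = np.linspace(0, 1, 512)
dem_hi = 200 * np.exp(-((x[None, :] - 0.5) ** 2) / 0.02) + rng.normal(0, 3, (512, 512))

dem_lo = block_average(dem_hi, 93)  # ~5 m cells -> ~465 m cells
print("high-res mean slope:", mean_slope(dem_hi, 5.0))
print("low-res  mean slope:", mean_slope(dem_lo, 465.0))
```

The coarse grid averages away the short-wavelength relief, so its mean slope is markedly lower, which is one way low-resolution inputs can understate watershed-scale vulnerability.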

2016 ◽ Vol 4 (3) ◽ pp. T387-T394
Author(s): Ankur Roy, Atilla Aydin, Tapan Mukerji

It is common practice to analyze fracture spacing data collected from scanlines and wells at various resolutions for the purposes of aquifer and reservoir characterization. However, the influence of resolution on such analyses is not well studied. Lacunarity is a parameter used for multiscale analysis of spatial data. In quantitative terms, at any given scale, it is a function of the mean and variance of the distribution of masses captured by a gliding window of that scale (size) moved across the pattern of interest. We have described the application of lacunarity for delineating differences between the scale-dependent clustering attributes of data collected at different resolutions along a scanline. Specifically, we considered data collected at different resolutions from two outcrop exposures, a pavement and a cliff section, of the Cretaceous turbiditic sandstones of the Chatsworth Formation, widely exposed in southern California. For each scanline, we analyzed data from low-resolution aerial or ground photographs and from high-resolution ground measurements for scale-dependent clustering attributes. High-resolution data show larger values of scale-dependent lacunarity than their low-resolution counterparts. We further performed a bootstrap analysis on each data set to test the significance of these clustering differences: we generated 300 randomized realizations of each data set and ran the lacunarity analysis on them. The lacunarity of the higher-resolution data lay significantly above the 90th-percentile values of the realizations, demonstrating that the higher-resolution data are distinctly different from random and that the fractures are clustered. We have therefore postulated that lower-resolution data capture fracture zones with relatively uniform spacing, whereas higher-resolution data capture the thin, short splay joints and sheared joints that contribute to fracture clustering. Such findings have important implications for understanding the organization of fractures in fracture corridors, which in turn is critical for modeling and upscaling exercises.
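A minimal sketch of the gliding-box lacunarity statistic and the bootstrap test outlined above, assuming a 1-D binary fracture-occurrence sequence; the scanline, window size, and cluster positions below are hypothetical, not the paper's measurements.

```python
# Sketch: gliding-box lacunarity L(r) = <M^2> / <M>^2 for a binary
# scanline, plus a shuffle-based bootstrap significance test.
import numpy as np

def lacunarity(seq: np.ndarray, r: int) -> float:
    """Gliding-box lacunarity: mean-square box mass over squared-mean box mass."""
    masses = np.convolve(seq, np.ones(r), mode="valid")  # mass in each window
    m = masses.mean()
    return float((masses ** 2).mean() / m ** 2) if m > 0 else np.nan

rng = np.random.default_rng(1)
scanline = np.zeros(2000, dtype=float)
# Hypothetical clustered fractures: a few dense zones of closely spaced joints.
for centre in (200, 650, 1400):
    scanline[centre + rng.integers(-40, 40, size=25)] = 1.0

r = 64
observed = lacunarity(scanline, r)
# 300 shuffled realizations keep the fracture count but destroy clustering.
boot = [lacunarity(rng.permutation(scanline), r) for _ in range(300)]
print(f"observed  L({r}) = {observed:.3f}")
print(f"90th pct  L({r}) = {np.percentile(boot, 90):.3f}")
```

Shuffling preserves the number of fractures but randomizes their positions, so an observed L(r) above the 90th percentile of the shuffled realizations indicates clustering beyond what spacing statistics alone would produce.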


2020
Author(s): Michael C. Dimmick, Leo J. Lee, Brendan J. Frey

Abstract
Motivation: Hi-C data have enabled the genome-wide study of chromatin folding and architecture and have led to important discoveries about the structure and function of chromatin conformation. High-resolution data play a particularly important role here, as many chromatin substructures, such as topologically associating domains (TADs) and chromatin loops, cannot be adequately studied with low-resolution contact maps. However, the high sequencing costs associated with generating high-resolution Hi-C data have become an experimental barrier. Data-driven machine learning models, which allow low-resolution Hi-C data to be computationally enhanced, offer a promising avenue to address this challenge.
Results: By carefully examining the properties of Hi-C maps and integrating various recent advances in deep learning, we developed a Hi-C Super-Resolution (HiCSR) framework capable of accurately recovering the fine details, textures, and substructures found in high-resolution contact maps. This was achieved using a novel loss function tailored to the Hi-C enhancement problem, which optimizes an adversarial loss from a Generative Adversarial Network (GAN), a feature reconstruction loss derived from the latent representation of a denoising autoencoder, and a pixel-wise loss. Not only does the resulting framework generate enhanced Hi-C maps that are more visually similar to the original high-resolution maps, it also excels on a suite of reproducibility metrics produced by members of the ENCODE Consortium when compared with existing approaches, including HiCPlus, HiCNN, hicGAN, and DeepHiC. Finally, we demonstrate that HiCSR is capable of enhancing Hi-C data across sequencing depths, cell types, and species, recovering biologically significant contact domain boundaries.
Availability: We make our implementation available for download at https://github.com/PSI-Lab/HiCSR
Contact: [email protected]
Supplementary information: Available online
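The composite objective described above combines three terms. Below is a minimal PyTorch sketch of such a loss, assuming placeholder loss types and weights (`w_adv`, `w_feat`, `w_pix` are hypothetical); the actual HiCSR architecture and weighting are defined in the authors' repository.

```python
# Sketch of a HiCSR-style composite loss: adversarial + feature
# reconstruction + pixel-wise terms. Weights and loss choices are
# illustrative assumptions, not the published configuration.
import torch
import torch.nn.functional as F

def composite_loss(fake, real, disc_fake_logits, feat_fake, feat_real,
                   w_adv=0.0025, w_feat=1.0, w_pix=1.0):
    # Generator-side GAN loss: push discriminator logits on enhanced
    # maps toward the "real" label.
    adv = F.binary_cross_entropy_with_logits(
        disc_fake_logits, torch.ones_like(disc_fake_logits))
    # Feature reconstruction: match latent features of a (pretrained)
    # denoising autoencoder on enhanced vs. true high-resolution maps.
    feat = F.mse_loss(feat_fake, feat_real)
    # Pixel-wise fidelity between enhanced and true high-resolution maps.
    pix = F.mse_loss(fake, real)
    return w_adv * adv + w_feat * feat + w_pix * pix
```

The feature term is what distinguishes this style of objective from a plain GAN-plus-MSE setup: it rewards reconstructions that preserve the textures and substructures the autoencoder has learned to encode, not just per-pixel agreement.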


2020 ◽ Vol 493 (2) ◽ pp. 2215-2228
Author(s): Neale P Gibson, Stephanie Merritt, Stevanus K Nugroho, Patricio E Cubillos, Ernst J W de Mooij, ...

ABSTRACT High-resolution Doppler-resolved spectroscopy has opened up a new window into the atmospheres of both transiting and non-transiting exoplanets. Here, we present VLT/UVES observations of a transit of WASP-121b, an ‘ultra-hot’ Jupiter previously found to exhibit a temperature inversion and detections of multiple species at optical wavelengths. We present initial results using the blue arm of UVES (≈3700–5000 Å), recovering a clear signal of neutral Fe in the planet’s atmosphere at >8σ, which could contribute to (or even fully explain) the temperature inversion in the stratosphere. However, using standard cross-correlation methods, it is difficult to extract physical parameters such as temperature and abundances. Recent pioneering efforts have sought to develop likelihood ‘mappings’ that can be used to directly fit models to high-resolution data sets. We introduce a new framework that directly computes the likelihood of the model fit to the data and can be used to explore the posterior distribution of parametrised model atmospheres via MCMC techniques. Our method also recovers the physical extent of the atmosphere and accounts for time- and wavelength-dependent uncertainties. We measure a temperature of $3710^{+490}_{-510}$ K, indicating a higher temperature in the upper atmosphere when compared to low-resolution observations. We also show that the Fe I signal is physically separated from the exospheric Fe II. However, the temperature measurements are highly degenerate with aerosol properties; detection of additional species, use of more sophisticated atmospheric models, or combining these methods with low-resolution spectra should help break these degeneracies.
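A schematic of the direct-likelihood idea described above, assuming a plain Gaussian log-likelihood with a noise-scaling parameter beta; this is a simplification for illustration, not the paper's full framework, which also treats model scaling and time- and wavelength-dependent noise.

```python
# Sketch: direct Gaussian log-likelihood of an atmospheric model given
# Doppler-resolved spectra, with per-pixel uncertainties inflated by a
# free noise-scaling parameter beta (an assumed simplification).
import numpy as np

def log_likelihood(data, model, sigma, beta=1.0):
    """ln L of `model` given `data` with uncertainties beta * sigma."""
    resid = data - model
    s2 = (beta * sigma) ** 2
    return -0.5 * np.sum(resid ** 2 / s2 + np.log(2 * np.pi * s2))
```

Because this returns a proper likelihood rather than a cross-correlation strength, an MCMC sampler can explore the atmospheric parameters and beta jointly and marginalize over the noise scaling, which is what makes posterior estimates of temperature and abundances possible.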


1982 ◽ Vol 70 ◽ pp. 145-146
Author(s): G.B. Baratta, A. Altamore, A. Cassatella, M. Friedjung, D. Ponz, ...

High- and low-resolution IUE spectra of CI Cyg were obtained at VILSPA during 1979-81, allowing an analysis of the spectral variations related to the decreasing activity of the star and to the eclipse (June 1980). In the high-resolution spectra the emission lines have a width slightly larger than the instrumental one. This is particularly evident in the HeII 1640 Å line and could be related to the peculiar behaviour of this line at low resolution, as reported by Michalitsianos et al. (this volume). A systematic radial velocity difference between permitted and intercombination lines was found; this difference should be connected with the structure of the emitting region(s). “Secular” and eclipse variation was found, in particular in the intercombination line intensities (Viotti et al. 1980). An electron density of ∼0.3–1.5×10¹⁰ cm⁻³ was derived from the intensity ratios of the NIII] emission lines. No significant difference in these ratios, nor in the CIII]/NIII] ratio, was found during 1979, 1980 (eclipse), and 1981, suggesting no large Ne variation with either the activity phase or the eclipse of the star. This result implies a low density gradient in the partially eclipsed NIII] and CIII] regions. A more detailed analysis of the high-resolution data is in progress to clarify these points and their implications for possible models of CI Cyg.
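As an illustration of how such a line-ratio density diagnostic works, the observed ratio can be inverted against a theoretical ratio-versus-density relation. The curve below uses placeholder values, not the published NIII] atomic data, and the observed ratio is likewise hypothetical.

```python
# Sketch: inverting an intercombination-line intensity ratio to an
# electron density via a theoretical diagnostic curve. All numbers are
# schematic placeholders for illustration only.
import numpy as np

# Hypothetical diagnostic curve: the ratio falls with density as
# collisional de-excitation quenches the more density-sensitive line.
log_ne = np.array([8.0, 9.0, 10.0, 11.0, 12.0])      # log10 n_e [cm^-3]
theory_ratio = np.array([1.8, 1.6, 1.1, 0.5, 0.2])   # placeholder values

observed_ratio = 1.0                                  # placeholder
# np.interp requires increasing x, so interpolate on the reversed arrays.
ne = 10 ** np.interp(observed_ratio, theory_ratio[::-1], log_ne[::-1])
print(f"n_e ~ {ne:.2e} cm^-3")
```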


2019 ◽ Vol 36 (5) ◽ pp. 745-760
Author(s): Lia Siegelman, Fabien Roquet, Vigan Mensah, Pascal Rivière, Etienne Pauthenet, ...

Abstract: Most available CTD Satellite Relay Data Logger (CTD-SRDL) profiles are heavily compressed before satellite transmission. High-resolution profiles recorded at the sampling frequency of 0.5 Hz are, however, available upon physical retrieval of the logger. Between 2014 and 2018, several loggers deployed on elephant seals in the Southern Ocean were set in continuous recording mode, capturing both the ascent and the descent for over 60 profiles per day during several months, opening new horizons for the physical oceanography community. Taking advantage of a new dataset comprising seven such loggers, a postprocessing procedure is proposed and validated to improve the quality of all CTD-SRDL data, that is, both high-resolution profiles and compressed low-resolution ones. First, temperature and conductivity are corrected for a thermal mass effect. Then salinity spikes and density inversions are removed by adjusting salinity while leaving temperature unchanged. This method, applied here to more than 50 000 profiles, yields significant and systematic improvements in both temperature and salinity, particularly in regions of rapid temperature variation. The continuous high-resolution dataset is then used to provide updated accuracy estimates for CTD-SRDL data. For high-resolution data, accuracies are estimated at ±0.02°C for temperature and ±0.03 g kg−1 for salinity. For low-resolution data, transmitted data points have similar accuracies; however, reconstructed temperature profiles have a reduced accuracy of ±0.04°C, associated with the vertical interpolation, and a nearly unchanged salinity accuracy of ±0.03 g kg−1.
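A minimal sketch of the density-inversion step described above, assuming a simplified linear equation of state rather than the full seawater equation of state used in practice; the profile values are hypothetical. Salinity is adjusted just enough to restore a density profile that does not decrease with depth, while temperature is left unchanged.

```python
# Sketch (not the authors' procedure): remove density inversions by
# adjusting salinity, leaving temperature untouched. A linear equation
# of state stands in for the full seawater EOS; coefficients are
# approximate illustrative values.
import numpy as np

RHO0, ALPHA, BETA = 1027.0, 1.7e-4, 7.6e-4  # reference density, thermal/haline coeffs

def density(T, S):
    """Linear equation of state about (T, S) = (0 degC, 35 g/kg)."""
    return RHO0 * (1.0 - ALPHA * T + BETA * (S - 35.0))

def fix_inversions(T, S):
    """Adjust salinity so density never decreases with depth."""
    S = S.copy()
    for i in range(1, len(S)):
        if density(T[i], S[i]) < density(T[i - 1], S[i - 1]):
            # Salinity giving rho[i] == rho[i-1] with temperature fixed.
            S[i] = S[i - 1] + (ALPHA / BETA) * (T[i] - T[i - 1])
    return S

# Hypothetical profile (surface to depth) with a spike causing an inversion.
T = np.array([4.0, 3.8, 3.7, 3.6])
S = np.array([34.2, 34.3, 34.1, 34.35])
print(fix_inversions(T, S))
```

Adjusting salinity rather than temperature matches the rationale in the abstract: conductivity-derived salinity is the quantity most affected by sensor-response artifacts, so it is the natural variable to correct.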

