high resolution data
Recently Published Documents

TOTAL DOCUMENTS: 920 (FIVE YEARS: 382)
H-INDEX: 44 (FIVE YEARS: 12)

Atmosphere ◽  
2022 ◽  
Vol 13 (1) ◽  
pp. 126
Author(s):  
Shaowu Bao ◽  
Zhan Zhang ◽  
Evan Kalina ◽  
Bin Liu

The HAFS model is an effort under the NGGPS and UFS initiatives to create the next-generation hurricane prediction and analysis system based on FV3-GFS. It has been validated extensively using traditional verification indicators such as track errors and biases, intensity errors and biases, and the radii of gale-, damaging-, and hurricane-strength winds. While satellite images have been used to verify hurricane model forecasts, they have not yet been applied to HAFS. The Community Radiative Transfer Model (CRTM) is used to generate model synthetic satellite images from HAFS forecast state variables. Twenty-four forecast snapshots from the mature stage of Hurricane Dorian in 2019 are used to generate a composite model synthetic GOES-R infrared brightness image. The composite synthetic image is compared to the corresponding composite image generated from the observed GOES-R data to evaluate the forecast TC vortex intensity, size, and asymmetric structure. Results show that the HAFS forecast of TC Dorian agrees reasonably well with the observations, although the forecast intensity is weaker, the overall vortex is smaller, and the radii of the eye and of maximum winds are larger than observed. These evaluation results can be used to further improve the model. While they are consistent with results obtained by traditional verification methods, evaluations based on composite satellite images provide an additional benefit and richer information because they rely on near-real-time, spatially and temporally continuous high-resolution data with global coverage. Composite satellite infrared images could therefore be used routinely to supplement traditional verification methods in HAFS and other hurricane model evaluations. Note that because this study evaluated only one hurricane, the above conclusions apply only to the model's behavior during the mature stage of Hurricane Dorian in 2019, and caution is needed before extending them to expected model biases in predicting other TCs. Nevertheless, the consistency between the composite-satellite-image evaluation and the traditional metrics for Hurricane Dorian shows that this method has the potential to be applied to other storms in future studies.
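As a rough illustration of the compositing step described above (a minimal sketch, not the authors' code; the grid spacing, array names, and the azimuthal-profile comparison are assumptions), storm-centred brightness-temperature snapshots could be averaged and compared radially as follows:

```python
# Illustrative sketch (not the authors' code): composite storm-centred
# brightness-temperature fields from N forecast snapshots and compare the
# azimuthal-mean profile against an observed composite.
import numpy as np

def composite(snapshots):
    """Average a list of storm-centred 2-D brightness-temperature arrays (K)."""
    return np.mean(np.stack(snapshots, axis=0), axis=0)

def azimuthal_mean(field, dx_km=2.0, n_bins=60):
    """Azimuthally averaged profile about the array centre, in radius bins."""
    ny, nx = field.shape
    y, x = np.mgrid[0:ny, 0:nx]
    r = np.hypot((x - nx / 2) * dx_km, (y - ny / 2) * dx_km)
    bins = np.linspace(0.0, r.max(), n_bins + 1)
    sums, _ = np.histogram(r, bins=bins, weights=field)
    counts, _ = np.histogram(r, bins=bins)
    profile = sums / np.maximum(counts, 1)
    return 0.5 * (bins[:-1] + bins[1:]), profile

# Hypothetical usage with 24 model and 24 observed snapshots:
# radii, model_prof = azimuthal_mean(composite(model_snapshots))
# _,     obs_prof   = azimuthal_mean(composite(obs_snapshots))
# bias = model_prof - obs_prof  # radial brightness bias, model minus observed
```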


2022 ◽  
Author(s):  
Matthias Sinnesael ◽  
Alfredo Loi ◽  
Marie-Pierre Dabard ◽  
Thijs R. A. Vandenbroucke ◽  
Philippe Claeys

Abstract. To expand traditional cyclostratigraphic numerical methods beyond their common technical limitations and apply them to truly deep-time archives, we need to reflect on the development of new approaches to sedimentary archives that are traditionally not targeted for cyclostratigraphic analysis but that frequently occur in the impoverished deep-time record. Siliciclastic storm-dominated shelf environments are a good example of such records. Our case study focusses on the Middle to Upper Ordovician siliciclastic successions of the Armorican Massif (western France), which are well studied in terms of sedimentology and sequence stratigraphy. In addition, these sections are protected geological heritage due to the extraordinary quality of the outcrops. We therefore tested the performance of non-destructive, high-resolution (cm-scale) portable X-ray fluorescence and natural gamma-ray analyses on outcrop to obtain major and trace element compositions. Despite the challenging outcrop conditions in the tidal beach zone, our geochemical analyses provide useful information on general lithology and on several specific sedimentary features, such as the detection of paleoplacers or the discrimination between different types of diagenetic concretions such as nodules. Secondly, these new high-resolution data are used to experiment with the application of commonly used numerical cyclostratigraphic techniques in this siliciclastic storm-dominated shelf environment, a non-traditional sedimentological setting for cyclostratigraphic analysis. In the lithologically relatively homogeneous parts of the section, spectral power analyses and bandpass filtering hint at a potential astronomical imprint on some sedimentary cycles, but in the absence of more robust independent age constraints this needs further confirmation.
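For context, the spectral power analysis and bandpass filtering mentioned above are typically applied to an evenly resampled depth series of a geochemical proxy. A minimal sketch (assuming a hypothetical element-ratio series at 2 cm spacing; not the authors' workflow):

```python
# Illustrative sketch (not the authors' workflow): periodogram and band-pass
# filter of an evenly resampled pXRF proxy series measured along depth.
import numpy as np
from scipy.signal import periodogram, butter, filtfilt

def power_spectrum(values, dx):
    """Periodogram of a linearly detrended, evenly spaced depth series.
    dx is the sample spacing in metres; returns frequencies (cycles/m), power."""
    idx = np.arange(len(values))
    detrended = values - np.polyval(np.polyfit(idx, values, 1), idx)
    return periodogram(detrended, fs=1.0 / dx)

def bandpass(values, dx, f_low, f_high, order=3):
    """Zero-phase Butterworth band-pass between f_low and f_high (cycles/m)."""
    nyq = 0.5 / dx
    b, a = butter(order, [f_low / nyq, f_high / nyq], btype="band")
    return filtfilt(b, a, values)

# Hypothetical usage: isolate roughly metre-scale cycles in a 2 cm series.
# freqs, power = power_spectrum(k_ti_ratio, dx=0.02)
# filtered = bandpass(k_ti_ratio, dx=0.02, f_low=0.8, f_high=1.2)
```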


2022 ◽  
Author(s):  
Richard Nair ◽  
Martin Strube ◽  
Martin Hertel ◽  
Olaf Kolle ◽  
Markus Reichstein ◽  
...  

Minirhizotrons (paired camera systems and buried observatories) are the best current method for making repeatable measurements of fine roots in the field. Automating the technique is also the only way to gather the high-resolution data necessary for comparison with phenology-relevant above-ground remote sensing and, when appropriately validated, to assess belowground biomass at high temporal resolution, which can support carbon budget estimates. Minirhizotron technology has been available for half a century, but there are many challenges to automating the technique for global change experiments. Instruments must be cheap enough to replicate at field scales given their shallow field of view, and automated analysis must be robust to changeable soil and root conditions because, ultimately, image properties extracted from minirhizotrons must have biological meaning. Both digital photography and computer technology are rapidly evolving, with huge potential for generating belowground data from images using modern technological advantages. Here we demonstrate a homemade automatic minirhizotron scheme, built with off-the-shelf parts and sampling every two hours, which we paired with a neural network-based image analysis method in a proof-of-concept mesocosm study. We show that we are able to produce a robust daily time series of root cover dynamics. The same model is applied across multiple instruments, demonstrating good reproducibility of the measurements and good agreement with an above-ground vegetation index and with root biomass recovered through time. We found that the extracted root cover was sensitive to soil moisture conditions and time of day (potentially also related to soil moisture), an issue that may arise only with high-resolution automated imagery and that is not commonly reported when neural networks are trained on traditional, time-distinct minirhizotron studies. We discuss potential avenues for dealing with such issues in future field applications of such devices. If these issues can be dealt with satisfactorily in the field, automated time series of root biomass and traits from replicated instruments could add a new dimension to ecosystem-level phenology by resolving the dynamics of root properties and traits.
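To illustrate the kind of post-processing implied here (a minimal sketch under assumed names; the segmentation model, mask format, and daily-median aggregation are illustrative, not the authors' pipeline), per-image root probability masks could be reduced to a daily root-cover series like this:

```python
# Illustrative sketch (not the authors' pipeline): turn per-image root
# segmentation masks into a daily root-cover time series from 2-hourly images.
import numpy as np
import pandas as pd

def root_cover(mask, threshold=0.5):
    """Fraction of pixels classified as root in one probability mask."""
    return float((mask >= threshold).mean())

def daily_series(timestamps, masks):
    """Median daily root cover from sub-daily masks (robust to diel artefacts,
    e.g. soil-moisture or time-of-day effects on image appearance)."""
    cover = pd.Series([root_cover(m) for m in masks],
                      index=pd.DatetimeIndex(timestamps))
    return cover.resample("1D").median()

# Hypothetical usage with masks predicted by a trained segmentation CNN:
# masks = [model.predict(img) for img in images]   # 'model' is assumed
# cover_ts = daily_series(capture_times, masks)
```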


2022 ◽  
Author(s):  
Andrea Mazzeo ◽  
Michael Burrow ◽  
Andrew Quinn ◽  
Eloise A. Marais ◽  
Ajit Singh ◽  
...  

Abstract. Urban conurbations in East Africa are affected by harmful levels of air pollution. The paucity of local air quality networks and the absence of capacity to forecast air quality make it difficult to quantify the real level of air pollution in this area. The chemistry-transport model CHIMERE has been coupled with the meteorological model WRF and used to simulate hourly concentrations of fine particulate matter (PM2.5) for three East African urban conurbations: Addis Ababa in Ethiopia, Nairobi in Kenya, and Kampala in Uganda. Two existing emission inventories were combined to test the performance of CHIMERE as an air quality tool for a target monthly period in 2017, and the results were compared against observed data from urban and rural sites. The results show that the model is able to reproduce the hourly and daily temporal variability of aerosol concentrations close to the observations in both urban and rural environments. CHIMERE's performance as a tool for managing air quality was also assessed. The analysis demonstrated that, despite the absence of high-resolution data and up-to-date biogenic and anthropogenic emissions, the model was able to reproduce 66–99 % of the daily PM2.5 exceedances above the WHO 24-hour mean PM2.5 guideline (25 µg m−3) in the three cities. An analysis of the 24-hour mean levels of PM2.5 was also carried out for 17 constituencies in the vicinity of Nairobi. This showed that 47 % of the constituencies in the area exhibited a PM2.5 air quality index in the unhealthy category for human health, exposing between 10,000 and 30,000 people/km2 to harmful levels of air contamination.
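The exceedance statistic quoted above is straightforward to compute from hourly series; a minimal sketch (illustrative only, with assumed variable names; the study's own post-processing is not described here):

```python
# Illustrative sketch (not the CHIMERE post-processing): daily-mean PM2.5,
# exceedances of the WHO 24-hour guideline used in the study (25 ug/m3), and
# the fraction of observed exceedance days that the model reproduces.
import pandas as pd

WHO_24H_PM25 = 25.0  # ug/m3, guideline value quoted in the abstract

def daily_means(hourly):
    """hourly: pd.Series of PM2.5 with a DatetimeIndex -> daily means."""
    return hourly.resample("1D").mean()

def exceedance_days(hourly):
    """Boolean series: True on days whose mean exceeds the guideline."""
    return daily_means(hourly) > WHO_24H_PM25

def captured_fraction(obs_hourly, model_hourly):
    """Share of observed exceedance days also flagged by the model."""
    obs = exceedance_days(obs_hourly)
    mod = exceedance_days(model_hourly).reindex(obs.index, fill_value=False)
    n_obs = int(obs.sum())
    return float((obs & mod).sum()) / n_obs if n_obs else float("nan")
```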


2022 ◽  
Author(s):  
Lenneke M. Jong ◽  
Christopher T. Plummer ◽  
Jason L. Roberts ◽  
Andrew D. Moy ◽  
Mark A. J. Curran ◽  
...  

Abstract. Ice core records from Law Dome in East Antarctica, collected over the last three decades, provide high-resolution data for studies of the climate of Antarctica, Australia, and the Southern and Indo-Pacific Oceans. Here we present a set of annually dated records of trace chemistry, stable water isotopes, and snow accumulation from Law Dome covering the period from −11 to 2017 CE (1961 to −66 years BP, relative to 1950), as well as the level 1 chemistry data from which the annual chemistry records are derived. This dataset provides an update to, and extensions both forward and backward in time of, previously published subsets of the data, bringing them together into a coherent set with improved dating. The data are available for download from the Australian Antarctic Data Centre at https://doi.org/10.26179/5zm0-v192.


2022 ◽  
Vol 306 ◽  
pp. 117996
Author(s):  
Mingquan Li ◽  
Edgar Virguez ◽  
Rui Shan ◽  
Jialin Tian ◽  
Shuo Gao ◽  
...  

2021 ◽  
pp. 4529-4536
Author(s):  
Huda M. Hamid ◽  
Fadia W. Al-Azawi

Many satellite systems, such as QuickBird, capture images covering the terrain, and these images can be used to construct 3D models such as Triangulated Irregular Networks (TIN) and Digital Elevation Models (DEM). This paper presents the production of a 3D TIN for Al-Karkh University of Science in Baghdad, Iraq, using QuickBird image data with a pixel resolution of 0.6 m. Recent generations of high-resolution satellite imaging systems open a new era of digital mapping and Earth observation, providing not only multi-spectral and high-resolution data but also the capability for stereo mapping. The result of this study is the production of 3D satellite imagery of the university by merging a 1 m DEM with the satellite image for the region of interest (ROI) using the ArcGIS package, Version 10.3.
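Outside of ArcGIS, the generic operation behind TIN construction is a Delaunay triangulation over scattered elevation samples; a minimal sketch of that step (assumed inputs and coordinates; the study itself used ArcGIS 10.3, not this code):

```python
# Illustrative sketch (the study used ArcGIS 10.3, not this code): build a
# simple TIN from scattered (x, y, z) elevation samples and query elevations.
import numpy as np
from scipy.spatial import Delaunay
from scipy.interpolate import LinearNDInterpolator

def build_tin(points_xy, elevations):
    """Delaunay triangulation over (x, y) points plus a linear interpolator
    that evaluates elevation anywhere inside the triangulated surface."""
    tri = Delaunay(points_xy)
    interp = LinearNDInterpolator(tri, elevations)
    return tri, interp

# Hypothetical usage with DEM posts sampled at roughly 1 m spacing:
# xy = np.column_stack([easting, northing])
# tin, z_of = build_tin(xy, elevation)
# z = z_of(448250.0, 3686120.0)   # interpolated elevation at one location
```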


2021 ◽  
Author(s):  
Josh Weyrens ◽  
Rene Germain

Abstract Beech bark disease is a pathogenic complex that has been spreading throughout the American beech's range since the 1800s. A litany of negative consequences has manifested from the infestation of this disease, many of which deteriorate the ecological functions of forestland. This case study sought to analyze the cost structure of removing a recalcitrant beech understory via mechanized shelterwood harvesting. High-resolution data regarding the day-to-day operation of harvesting equipment were collected using daily production journals. Interviews were conducted with the logging company owner and maintenance supervisor to gather additional information required to calculate machine costs, overhead, job-specific costs, and trucking costs. The yield from this harvest was 527 metric tonnes of sawtimber and 4,893 tonnes of clean chips. The total harvesting cost equated to $4,651/ha, with the cost attributed to removing beech at $204/ha. Despite the additional cost of beech removal, the logger generated a total profit of $5,965 and a return on investment of 7.5%, allowing us to conclude that mechanized harvesting can be a viable beech removal strategy given the forest stocking and market conditions that are in place. Study Implications: This study breaks down the various costs associated with cutting, skidding, landing, and transporting wood products from a mechanized harvesting operation designed to remediate the effects of beech bark disease. The beech remediation harvest was economically viable for both the landowner and the logger because the timber sale included some valuable hardwood sawtimber, and the harvest system was capable of generating clean chips for a pulp mill from the low-grade hardwood. Furthermore, the landowner's willingness to accept lower sawtimber stumpage revenues allowed the logger to make a profit and a return on investment on the job. Had the timber sale been limited to only hardwood pulpwood or fuel chips, the operation would not have been economically viable without the landowner paying for it, which, based on our analysis, would be approximately $200/ha.


2021 ◽  
Vol 12 ◽  
Author(s):  
Gabriel Keeble-Gagnère ◽  
Raj Pasam ◽  
Kerrie L. Forrest ◽  
Debbie Wong ◽  
Hannah Robinson ◽  
...  

Array-based single nucleotide polymorphism (SNP) genotyping platforms have low genotype error and missing data rates compared to genotyping-by-sequencing technologies. However, the design decisions used to create array-based SNP genotyping assays for both research and breeding applications are critical to their success. We describe a novel approach, applicable to any animal or plant species, for the design of cost-effective, imputation-enabled SNP genotyping arrays with broad utility, and demonstrate its application through the development of the Illumina Infinium Wheat Barley 40K SNP array Version 1.0. We show that the approach delivers high-quality and high-resolution data for wheat and barley, including when samples are jointly hybridised. The new array aims to maximally capture haplotypic diversity in globally diverse wheat and barley germplasm while minimizing ascertainment bias. Comprising mostly biallelic markers designed to be species-specific and single-copy, the array permits highly accurate imputation in diverse germplasm to improve the statistical power of genome-wide association studies (GWAS) and genomic selection. The SNP content captures tetraploid wheat (A- and B-genome) and Aegilops tauschii Coss. (D-genome) diversity and delineates synthetic and tetraploid wheat from other wheat, as well as tetraploid species and subgroups. The content includes SNPs tagging key trait loci in wheat and barley, as well as direct connections to other genotyping platforms and legacy datasets. The utility of the array is enhanced through the web-based tool Pretzel (https://plantinformatics.io/), which enables the content of the array to be visualized and interrogated interactively in the context of numerous genetic and genomic resources, connecting it more seamlessly to research and breeding. The array is available for use by the international wheat and barley community.


2021 ◽  
Vol 13 (24) ◽  
pp. 5138
Author(s):  
Seyd Teymoor Seydi ◽  
Mahdi Hasanlou ◽  
Jocelyn Chanussot

Wildfires are among the most destructive natural disasters that can affect our environment, with significant effects also on wildlife. Recently, climate change and human activities have resulted in higher frequencies of wildfires throughout the world. Timely and accurate detection of burned areas can help inform decisions for their management. Remote sensing satellite imagery can have a key role in mapping burned areas due to its wide coverage, high-resolution data collection, and low capture times. However, although many studies have reported on burned area mapping based on remote sensing imagery in recent decades, accurate burned area mapping remains a major challenge due to the complexity of the background and the diversity of the burned areas. This paper presents a novel framework for burned area mapping based on a Deep Siamese Morphological Neural Network (DSMNN-Net) and heterogeneous datasets. The DSMNN-Net framework is based on change detection, using a pre/post-fire method that is compatible with heterogeneous remote sensing datasets. The proposed network combines multiscale convolution layers and morphological layers (erosion and dilation) to generate deep features. To evaluate the performance of the proposed method, two case study areas in Australian forests were selected. The framework detects burned areas better than other state-of-the-art burned area mapping procedures, with an overall accuracy of >98% and a kappa coefficient of >0.9 on multispectral Sentinel-2 and hyperspectral PRISMA image datasets. The analyses of the two datasets illustrate that the DSMNN-Net is valid and robust for burned area mapping, especially for complex areas.
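The general structure described here, a Siamese encoder whose blocks mix multiscale convolutions with morphological (dilation/erosion) operations and whose pre/post-fire feature difference drives a per-pixel burned/unburned score, can be sketched roughly as follows (a simplified, hypothetical stand-in in PyTorch, not the published DSMNN-Net; the layer widths, shared encoder, and difference head are assumptions):

```python
# Illustrative sketch (not the published DSMNN-Net): a Siamese encoder mixing
# multiscale convolutions with grayscale morphological dilation/erosion,
# applied to pre- and post-fire patches; change features are their difference.
import torch
import torch.nn as nn
import torch.nn.functional as F

def dilate(x, k=3):
    """Grayscale morphological dilation as a max filter."""
    return F.max_pool2d(x, kernel_size=k, stride=1, padding=k // 2)

def erode(x, k=3):
    """Grayscale morphological erosion as a min filter (negated max filter)."""
    return -F.max_pool2d(-x, kernel_size=k, stride=1, padding=k // 2)

class MultiscaleMorphBlock(nn.Module):
    """Parallel 3x3 and 5x5 convolutions fused with dilated/eroded features."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.conv3 = nn.Conv2d(in_ch, out_ch, 3, padding=1)
        self.conv5 = nn.Conv2d(in_ch, out_ch, 5, padding=2)
        self.fuse = nn.Conv2d(4 * out_ch, out_ch, 1)

    def forward(self, x):
        c3, c5 = F.relu(self.conv3(x)), F.relu(self.conv5(x))
        feats = torch.cat([c3, c5, dilate(c3), erode(c5)], dim=1)
        return F.relu(self.fuse(feats))

class SiameseChangeNet(nn.Module):
    """Shared encoder for pre- and post-fire images; a small head maps the
    feature difference to a per-pixel burned/unburned probability."""
    def __init__(self, in_ch=4, width=32):
        super().__init__()
        self.encoder = nn.Sequential(MultiscaleMorphBlock(in_ch, width),
                                     MultiscaleMorphBlock(width, width))
        self.head = nn.Conv2d(width, 1, 1)

    def forward(self, pre, post):
        diff = torch.abs(self.encoder(pre) - self.encoder(post))
        return torch.sigmoid(self.head(diff))

# Hypothetical usage on 4-band patch pairs:
# net = SiameseChangeNet(in_ch=4)
# prob = net(torch.rand(1, 4, 128, 128), torch.rand(1, 4, 128, 128))
```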

