temporal frequency
Recently Published Documents


TOTAL DOCUMENTS

856
(FIVE YEARS 214)

H-INDEX

57
(FIVE YEARS 5)

2022
Author(s):
Laique Merlin Djeutchouang
Nicolette Chang
Luke Gregor
Marcello Vichi
Pedro Manuel Scheel Monteiro

Abstract. The Southern Ocean is a complex system, yet it is sparsely sampled in both space and time. These factors raise questions about the confidence in present sampling strategies and the associated machine learning (ML) reconstructions. Previous studies have not yielded a clear understanding of the origin of uncertainties and biases in reconstructions of the partial pressure of carbon dioxide (pCO2) at the ocean surface (pCO2ocean). Here, we examine these questions by investigating the sensitivity of pCO2ocean reconstruction uncertainties and biases across a series of semi-idealized observing system simulation experiments (OSSEs) that simulate the spatio-temporal sampling scales of surface ocean pCO2 in ways comparable to existing ocean CO2 observing platforms (ship, Waveglider, carbon float, Saildrone). These experiments sampled a high-spatial-resolution (±10 km) coupled physical and biogeochemical model (NEMO-PISCES) within a sub-domain representative of the Sub-Antarctic and Polar Frontal Zones in the Southern Ocean. The reconstructions used a two-member ensemble approach (ML2) consisting of (1) a feed-forward neural network and (2) gradient boosting machines. Taking as a baseline the simulated ship observations, which mimic those in the Surface Ocean CO2 Atlas (SOCAT), we applied ML2 to each scale-sampling scenario to reconstruct the full sub-domain pCO2ocean, and we assessed the reconstruction skill through a statistical comparison of the reconstructed pCO2ocean with the model domain mean. The analysis shows that uncertainties and biases in pCO2ocean reconstructions are very sensitive to both the spatial and the temporal scales of pCO2 sampling in the model domain.
The four key findings of our investigation are as follows: (1) improving ML-based pCO2 reconstructions in the Southern Ocean requires simultaneous high-resolution observations of the meridional gradients and the seasonal cycle (< 3 days) of pCO2ocean; (2) Saildrones stand out as the optimal platforms to simultaneously address these requirements; (3) Wavegliders with hourly/daily resolution in pseudo-mooring mode improve on carbon floats (10-day cycling period), which suggests that aliasing from the floats' low temporal sampling frequency has a greater negative impact on their uncertainties, biases and reconstructed means; and (4) the present summer bias in seasonal sampling in SOCAT data for the Southern Ocean may be behind a significant winter bias in the reconstructed seasonal cycle of pCO2ocean.
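A minimal sketch of the two-member ensemble (ML2) idea described above: a feed-forward neural network and a gradient boosting machine are trained on sampled data and their predictions averaged, then compared against held-out "domain" values via bias and RMSE. The predictors and target here are synthetic stand-ins, not the NEMO-PISCES fields or the study's actual configuration.

```python
# Hedged sketch of a two-member ML ensemble (FFNN + GBM); data are synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))                        # stand-in predictors
y = X @ np.array([2.0, -1.0, 0.5, 0.0]) + rng.normal(scale=0.1, size=500)

members = [
    MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0),
    GradientBoostingRegressor(random_state=0),
]
for m in members:
    m.fit(X[:400], y[:400])                          # train on "sampled" data

# Ensemble reconstruction = mean of the two members' predictions.
pred = np.mean([m.predict(X[400:]) for m in members], axis=0)
bias = float(np.mean(pred - y[400:]))                # domain-mean bias
rmse = float(np.sqrt(np.mean((pred - y[400:]) ** 2)))
print(round(bias, 3), round(rmse, 3))
```

The averaging step is the essential point: reconstruction skill is assessed on the ensemble mean, not on either member alone.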


2022
Author(s):
Adel Daoud
Felipe Jordan
Makkunda Sharma
Fredrik Johansson
Devdatt Dubhashi
...  

The application of deep learning methods to survey human development in remote areas using satellite imagery at high temporal frequency can significantly enhance our understanding of spatial and temporal patterns in human development. Current applications have focused on predicting a narrow set of asset-based measures of human well-being within a limited group of African countries. Here, we leverage georeferenced village-level census data from across 30 percent of the landmass of India to train a deep neural network that predicts 16 variables representing material conditions from annual composites of Landsat 7 imagery. The census-based model is then used as a feature extractor to train another network that predicts an even larger set of developmental variables (over 90) included in two rounds of the National Family Health Survey (NFHS). The census-based model outperforms the current standard in the literature, night-time-luminosity-based models, as a feature extractor for several variables in this larger set. To extend the temporal scope of the models, we propose a distribution-transformation procedure to estimate outcomes over time and space in India. Our procedure achieves R-squared values of 0.92 to 0.60 for 21 development outcomes, 0.59 to 0.30 for 25 outcomes, and 0.29 to 0.00 for 28 outcomes; the remaining 19 outcomes had negative R-squared values. Overall, the results show that combining satellite data with Indian census data unlocks rich information for training deep learning models that track human development at an unprecedented geographical and temporal resolution.
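The two-stage setup above can be sketched schematically: a first model is fitted to census-style targets, its outputs are reused as features for a survey-style outcome, and each outcome is then binned by its R-squared value as in the text. Everything here (data, the use of ridge regression in place of deep networks, the tier thresholds' application) is an illustrative simplification, not the paper's actual pipeline.

```python
# Hedged sketch of feature extraction via a census-trained first-stage model.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)
X_img = rng.normal(size=(300, 8))                    # stand-in image features
W = rng.normal(size=(8, 16))
Y_census = X_img @ W + rng.normal(scale=0.1, size=(300, 16))

stage1 = Ridge().fit(X_img[:200], Y_census[:200])    # "census-based" model
feats = stage1.predict(X_img)                        # reused as features

y_survey = 0.7 * feats[:, 0] + rng.normal(scale=0.5, size=300)
stage2 = Ridge().fit(feats[:200], y_survey[:200])
r2 = r2_score(y_survey[200:], stage2.predict(feats[200:]))

# Bin the outcome by R^2, mirroring the tiers reported in the abstract.
tier = ("high" if r2 >= 0.60 else "medium" if r2 >= 0.30 else
        "low" if r2 >= 0.0 else "negative")
print(round(r2, 2), tier)
```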


2021
pp. 1-12
Author(s):
Bilal Tahir
Muhammad Amir Mehmood

The confluence of high-performance computing algorithms and large-scale, high-quality data has led to the availability of cutting-edge tools in computational linguistics. However, these state-of-the-art tools are available only for the major languages of the world. Preparing large-scale, high-quality corpora for a low-resource language such as Urdu is a challenging task, as it requires substantial computational and human resources. In this paper, we build and analyze Anbar, a large-scale Urdu-language Twitter corpus. For this purpose, we collect 106.9 million Urdu tweets posted by 1.69 million users over one year (September 2018 to August 2019). Our corpus consists of tweets with a rich vocabulary of 3.8 million unique tokens, along with 58K hashtags and 62K URLs. Moreover, it contains 75.9 million (71.0%) retweets and 847K geotagged tweets. Furthermore, we examine Anbar using a variety of metrics, including the temporal frequency of tweets, vocabulary size, geo-location, user characteristics, and entity distribution. To the best of our knowledge, this is the largest repository of Urdu-language tweets available to the NLP research community; it can be used for natural language understanding (NLU), social analytics, and fake news detection.
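The corpus metrics listed above (temporal frequency, vocabulary size, hashtag and URL counts) can be computed with straightforward aggregation. A minimal sketch, using invented example tweets rather than records from Anbar:

```python
# Illustrative corpus metrics on made-up tweets; not data from the Anbar corpus.
from collections import Counter
import re

tweets = [
    {"text": "#Urdu corpus release https://example.org", "month": "2018-09"},
    {"text": "temporal frequency of tweets #NLP", "month": "2018-09"},
    {"text": "fake news detection #NLP", "month": "2018-10"},
]

per_month = Counter(t["month"] for t in tweets)            # temporal frequency
tokens = [w for t in tweets for w in t["text"].split()]
vocab = set(tokens)                                        # unique tokens
hashtags = Counter(w for w in tokens if w.startswith("#"))
urls = [w for w in tokens if re.match(r"https?://", w)]

print(per_month["2018-09"], len(vocab), hashtags["#NLP"], len(urls))
# → 2 12 2 1
```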


2021
Vol 8
Author(s):
Fatima Gianella
Michael T. Burrows
Sarah C. Swan
Andrew D. Turner
Keith Davidson

Consistent patterns in Harmful Algal Bloom (HAB) events are not evident across the scientific literature, suggesting that local or regional variability is likely to be important in modulating any overall trend. This study summarizes Scotland-wide temporal and spatial patterns in a robust 15-year, high-temporal-frequency time series (2006–2020) of the incidence of HABs and shellfish biotoxins in blue mussels (Mytilus edulis), collected as part of the Food Standards Scotland (FSS) regulatory monitoring program. The relationship between the countrywide annual incidence of HAB events and biotoxins and environmental variables was also explored. Temporal patterns exhibited interannual variability, with no year-on-year increase, nor any correlation between annual occurrences. Within years, there was a summer increase in bloom frequency, peaking in July for Dinophysis spp. and Pseudo-nitzschia spp., and a plateau from May to July for Alexandrium spp. Temporal-spatial patterns were analyzed with multivariate statistics on data from monitoring sites aggregated monthly into 50-km grid cells, using Principal Component Analysis (PCA) and K-means cluster analysis. The PCA showed correlations between areas with similar temporal dynamics, identifying seasonality as one of the main elements of HAB variability, with temporal-spatial patterns being explained by the first and second principal components. Similar patterns among regions in the timing and magnitude of blooms were evaluated using the K-means clusters. The analysis confirmed that the highest risk from HABs generally occurred during summer, but demonstrated that areas that respond in a similar manner (high or low risk) are not always geographically close.
For example, the occurrence of the most prevalent HAB genus, Dinophysis spp., is similar countrywide, but there is a regional trend in risk level, with "very-high" and "high" clusters located primarily on the southwest coast, the islands of the central and northern west coast, and the Shetland Islands. "Early" and "late" blooms were also associated with certain areas and levels of risk. Overall, high-risk areas mainly face southwest, whilst low-risk locations face south or southeast. We found relatively few countrywide relationships between environmental variables and HABs, confirming the need for regional analysis to support HAB early warning.
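The PCA-plus-K-means workflow described above can be sketched on a toy version of the gridded data: monthly bloom-frequency series per grid cell, reduced to principal components, then clustered into risk groups. The cell count, seasonality shape, and cluster number here are illustrative, not the study's values.

```python
# Sketch of PCA + K-means on per-cell monthly HAB time series (synthetic data).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
months = np.arange(12)
summer = np.exp(-0.5 * ((months - 6) / 1.5) ** 2)    # July-peaking seasonality

# 20 grid cells: ten follow the summer peak strongly (high risk), ten weakly.
cells = np.vstack([a * summer + rng.normal(scale=0.05, size=12)
                   for a in ([1.0] * 10 + [0.2] * 10)])

pca = PCA(n_components=2).fit(cells)
scores = pca.transform(cells)                        # per-cell PC scores
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)

# Seasonality dominates PC1; clusters separate high- and low-risk cells.
print(pca.explained_variance_ratio_[0] > 0.8, len(set(labels)))
```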


2021
Author(s):
Noureen Ali
Akhtar Alam
M Sultan Bhat
Bilquis Shah

Abstract Disasters not only cause high mortality and suffering, but also thwart development activities and damage nascent local economies. Part of the NW Himalayas, the Kashmir Valley is highly distinctive with respect to its location, topography, climate, socioeconomic structure, and strategic geopolitical setting, owing to which it has witnessed a multitude of disasters, ranging from local rockfall incidents to catastrophic earthquakes, and has often paid heavily in loss of life and property. However, information on most of these events is partially reported, exaggerated, or sometimes not recorded at all, and it is largely scattered. An organized and reliable record of past hazards and disasters is essential for tackling risks and mitigating future disasters. In this context, the present study addresses this lack of data by developing a dependable hazard and disaster catalogue of the Kashmir Valley from the existing literature and available secondary data sources. A record of the natural hazards and disasters most prevalent in the valley, viz. earthquakes, floods, landslides and snow avalanches, has been compiled for the period 1900 to 2020 from various secondary sources. The catalogue comprises 1854 events with a range of triggers and impacts reported in the valley, and it provides insight into the spatial and temporal (frequency and distribution) trends of different hazard types over the selected period. Such a catalogue can help in building hazard and disaster scenarios, serves as a reliable information source, and is of great value for regional design, planning, and policy responses to promote disaster risk reduction.
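The frequency-and-distribution analysis a catalogue supports amounts to aggregating events by type and period. A minimal sketch, with invented records rather than entries from the actual Kashmir Valley catalogue:

```python
# Counting catalogue events by hazard type and decade; records are invented.
from collections import Counter

events = [
    {"year": 1905, "type": "earthquake"},
    {"year": 1959, "type": "flood"},
    {"year": 1992, "type": "flood"},
    {"year": 2005, "type": "earthquake"},
    {"year": 2014, "type": "flood"},
]

by_type = Counter(e["type"] for e in events)           # hazard distribution
by_decade = Counter(10 * (e["year"] // 10) for e in events)  # temporal trend
print(by_type["flood"], by_decade[2000])
# → 3 1
```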


Author(s):  
Xiaogu Zhong
Jiancheng Wang

Abstract We revisit the Seyfert 1.5 galaxy ESO 362-G18 to explore the origin of its soft X-ray excess. The warm-corona and relativistic-reflection models are currently the two main scenarios used to interpret the soft X-ray excess in AGNs. We use simultaneous X-ray observations taken by XMM-Newton and NuSTAR on 24 September 2016 to perform the spectral analysis in two steps. First, we analyze the time-averaged spectra using the warm-corona and relativistic-reflection models; we also explore a hybrid model, a double-reflection model, and a double-warm-corona model. We find that both the warm-corona and the relativistic-reflection model can fit the time-averaged spectra well, and they cannot easily be distinguished on the basis of the fit statistics alone. Second, we add the RMS and covariance spectra to the spectral analysis alongside the time-averaged spectra. The results show that the warm corona can reproduce all of these spectra well. The hot, optically thin corona and the neutral distant reflection increase their contributions with temporal frequency, implying that the corona responsible for the X-ray continuum comes from the inner compact X-ray region and that the neutral distant reflection arises from moderate-scale neutral clumps.
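The variability spectra mentioned above rest on a standard ingredient: the fractional RMS of a light curve, estimated from the excess variance (observed variance minus measurement-error variance). A hedged sketch on synthetic light curves, not the actual XMM-Newton/NuSTAR data:

```python
# Fractional RMS via the excess-variance estimator; light curves are synthetic.
import numpy as np

rng = np.random.default_rng(5)
t = np.arange(200)
# Two energy bands: the soft band is made more variable than the hard band.
soft = 10 + 2.0 * np.sin(2 * np.pi * t / 50) + rng.normal(scale=0.2, size=200)
hard = 10 + 0.5 * np.sin(2 * np.pi * t / 50) + rng.normal(scale=0.2, size=200)

def frac_rms(x, err=0.2):
    """Fractional RMS: sqrt(max(var - err^2, 0)) / mean."""
    excess = np.var(x) - err ** 2
    return np.sqrt(max(excess, 0.0)) / np.mean(x)

# The more variable band yields the larger fractional RMS.
print(frac_rms(soft) > frac_rms(hard))
```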


2021
Vol 2
Author(s):
Christopher Small

The Visible Infrared Imaging Radiometer Suite (VIIRS) Day/Night Band (DNB) on board the Suomi NPP satellite now provides almost a decade of daily observations of night light. The temporal frequency of this sampling, without the degree of temporal averaging of the annual composites, makes it necessary to distinguish between apparent temporal changes of night light related to the imaging process and actual changes in the underlying light sources being imaged. The most common approach to night light change detection involves direct attribution of observed changes to the phenomenon of interest. Implicit in this approach is the assumption that other forms of actual and apparent change in the light source are negligible or non-existent. An alternative approach is to characterize the spatiotemporal variability prior to deductive attribution of causation, so that the attribution can be made in the context of the full range of spatial and temporal variation. The primary objective of this study is to characterize night light variability over a range of spatial and temporal scales to provide a context for interpreting night light changes observed on both subannual and interannual time scales. The analysis is based on a combination of temporal moments, spatial correlation, and Empirical Orthogonal Function (EOF) analysis. A key result of this study is the pervasive heteroskedasticity of VIIRS monthly mean night light: specifically, a monotonic decrease of variability with increasing mean brightness. Anthropogenic night light is remarkably stable on subannual time scales, while background luminance varies considerably. The variance partition from the eigenvalues of the spatiotemporal covariance matrix is 88%, 2%, and 2% for spatial, seasonal, and interannual variance, respectively, in the most diverse region on Earth (Eurasia).
Heteroskedasticity is pervasive in the monthly composites, present in all areas for all months of the year, suggesting that much, if not most, of the month-to-month variability may be related to the luminance of otherwise stable sources subjected to multiple time-varying aspects of the imaging process. Given the skewed distribution of night light arising from radial peripheral dimming of bright sources subject to atmospheric scattering, even aggregate metrics based on thresholds must be interpreted with care: large numbers of more variable low-luminance pixels may statistically overwhelm smaller numbers of stable, higher-luminance pixels, causing apparent changes related to the imaging process to be interpreted as actual changes in the light sources.
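An EOF analysis of the kind used above can be sketched as an SVD of a space-by-time matrix, with the variance partition read off the eigenvalue (squared singular value) spectrum. The field below is synthetic, and the resulting percentages are illustrative, not the study's 88/2/2% figures.

```python
# EOF analysis via SVD of a pixels-by-months matrix; the field is synthetic.
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(96)                                    # 8 years, monthly
seasonal = np.sin(2 * np.pi * t / 12)
trend = t / 96.0

# 50 pixels: strong seasonal mode, weak interannual trend, small noise.
X = (np.outer(rng.normal(size=50), seasonal)
     + 0.1 * np.outer(rng.normal(size=50), trend)
     + 0.05 * rng.normal(size=(50, 96)))

Xc = X - X.mean(axis=1, keepdims=True)               # remove temporal mean
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)    # EOFs in U, PCs in Vt
var_pct = 100 * s ** 2 / np.sum(s ** 2)              # variance partition

# The seasonal mode dominates the eigenvalue spectrum in this toy field.
print(var_pct[0] > 80)
```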


2021
Vol 8 (12)
Author(s):
Adam Bannister
Federico Botta

Measuring socio-economic indicators is a crucial task for policy makers who need to develop and implement policies aimed at reducing inequalities and improving the quality of life. However, traditionally this is a time-consuming and expensive task, which therefore cannot be carried out with high temporal frequency. Here, we investigate whether secondary data generated from our grocery shopping habits can be used to generate rapid estimates of deprivation in the city of London in the UK. We show the existence of a relationship between our grocery shopping data and the deprivation of different areas in London, and how we can use grocery shopping data to generate quick estimates of deprivation, albeit with some limitations. Crucially, our estimates can be generated very rapidly with the data used in our analysis, thus opening up the opportunity of having early access to estimates of deprivation. Our findings provide further evidence that new data streams contain accurate information about our collective behaviour and the current state of our society.
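The core of the approach above is a statistical relationship between area-level grocery features and a deprivation score, which can then be used for rapid estimates. A hedged sketch with synthetic features and weights; the variable names and the use of plain linear regression are illustrative assumptions, not the paper's actual data or model.

```python
# Regressing a deprivation score on grocery-derived features (synthetic data).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(4)
# Per-area grocery features, e.g. shares of spend on different product groups.
features = rng.normal(size=(200, 3))
deprivation = (2.0 * features[:, 0] - 1.0 * features[:, 1]
               + rng.normal(scale=0.3, size=200))

model = LinearRegression().fit(features[:150], deprivation[:150])
r2 = r2_score(deprivation[150:], model.predict(features[150:]))

# A strong fit here would support rapid, data-driven deprivation estimates.
print(r2 > 0.8)
```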


Author(s):
Pascal Monnin
Anaïs Viry
Jérôme Damet
Marie Nowak
Veronika Vitzthum
...  

Abstract Objectives. The planar formulation of the noise equivalent quanta (NEQ) and detective quantum efficiency (DQE) used to assess the image quality of projection images does not account for the influence of temporal resolution on signal blurring and image noise. These metrics therefore require correction factors based on temporal resolution when applied to dynamic imaging systems such as fluoroscopy. Additionally, the standard NEQ and detector DQE are determined on pre-processed images in scatter-free conditions, for effective energies produced by additional aluminium or copper filters that are not representative of clinical fluoroscopic procedures. In this work, we developed a method to measure a "frame NEQ" and "frame system DQE" which include the temporal frequency bandwidth and account for the anti-scatter grid, the detector, and the image processing procedures, for beam qualities with scatter fractions representative of clinical use. Approach. We used a solid water phantom to simulate a patient and a thin copper disc to measure the spatial resolution. The copper disc, set in uniform rectilinear motion in the image plane, was used to assess the temporal resolution. These new metrics were tested on two fluoroscopy systems, a C-arm and a floor-mounted cardiology system, across multiple parameters: phantom thicknesses from 5 to 20 cm, frame rates from 3 to 30 fps, and spatial and temporal image processing of different weights. Main results. The frame NEQ correctly described image quality across different scatter conditions, temporal resolutions, and image processing techniques. The frame system DQE varied between 0.38 and 0.65 across the different beam and scatter conditions, and correctly mitigated the influence of spatial and temporal image processing. Significance. This study introduces and validates an unbiased formulation of the in-plane NEQ and system DQE for assessing the spatiotemporal image quality of fluoroscopy systems.
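For orientation, the standard planar metrics the abstract starts from are NEQ(f) = S² · MTF(f)² / NPS(f) and DQE(f) = NEQ(f) / q, where S is the mean signal and q the incident quanta per unit area. The numeric sketch below uses assumed MTF, NPS, and fluence values (and a purely notional temporal rescaling factor) to illustrate the relationship; it does not reproduce the paper's frame-NEQ formulation or measured values.

```python
# Numeric illustration of planar NEQ and DQE; all values are assumed.
import numpy as np

f = np.linspace(0.05, 2.0, 40)          # spatial frequency (mm^-1)
S = 1000.0                              # mean signal (assumed)
mtf = np.exp(-1.2 * f)                  # assumed MTF shape
nps = 50.0 * np.ones_like(f)            # assumed white noise power spectrum
q = 2.5e4                               # assumed incident quanta per unit area

neq = S ** 2 * mtf ** 2 / nps           # NEQ(f) = S^2 MTF^2 / NPS
dqe = neq / q                           # DQE(f) = NEQ(f) / q, bounded by 1

# A temporal bandwidth factor (notional here) would rescale NEQ for
# dynamic imaging, reducing it when temporal blurring is significant.
t_factor = 0.8
frame_neq = t_factor * neq
print(round(float(dqe[0]), 3))
```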


Author(s):
Niels F. Lake
Núria Martínez-Carreras
Peter J. Shaw
Adrian L. Collins

Abstract Purpose This study tests the feasibility of using a submersible spectrophotometer as a novel method to trace and apportion suspended sediment sources in situ and at high temporal frequency. Methods Laboratory experiments were designed to identify how absorbance at different wavelengths can be used to un-mix artificial mixtures of soil samples (i.e. sediment sources). The experiment consisted of a tank containing 40 L of water, to which the soil samples and soil mixtures of known proportions were added in suspension. Absorbance measurements made using the submersible spectrophotometer were used to elucidate: (i) the effects of concentration on absorbance, (ii) the relationship between absorbance and particle size, and (iii) the linear additivity of absorbance as a prerequisite for un-mixing. Results The observed relationships between soil sample concentrations and absorbance in the ultraviolet-visible (UV-VIS) wavelength range (200–730 nm) indicated that differences in absorbance patterns are caused by soil-specific properties and particle size. Absorbance was found to be linearly additive and could be used to predict the known soil sample proportions in mixtures using the MixSIAR Bayesian tracer mixing model. The model results indicate that dominant contributions to mixtures containing two and three soil samples could be predicted well, whilst accuracy for four-soil-sample mixtures was lower (with respective mean absolute errors of 15.4%, 12.9% and 17.0%). Conclusion The results demonstrate the potential of in situ submersible spectrophotometer sensors for tracing suspended sediment sources at high temporal frequency.
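The linear-additivity premise above is what makes un-mixing possible: if a mixture's absorbance spectrum is a weighted sum of the source spectra, the source proportions can be recovered. The study itself uses the MixSIAR Bayesian mixing model; the sketch below substitutes a plain least-squares solver on synthetic Gaussian-shaped spectra purely to illustrate the principle.

```python
# Least-squares un-mixing of a noise-free synthetic absorbance mixture.
import numpy as np

wavelengths = np.linspace(200, 730, 54)              # UV-VIS range (nm)
# Three synthetic source absorbance spectra, one per column.
A = np.column_stack([
    np.exp(-((wavelengths - 250) / 60.0) ** 2),
    np.exp(-((wavelengths - 400) / 80.0) ** 2),
    np.exp(-((wavelengths - 600) / 50.0) ** 2),
])
true_p = np.array([0.5, 0.3, 0.2])
mixture = A @ true_p                                 # linear additivity

p, *_ = np.linalg.lstsq(A, mixture, rcond=None)      # recover the weights
p = p / p.sum()                                      # normalise to proportions
print(np.round(p, 2))
# → [0.5 0.3 0.2]
```

In the noise-free case the proportions are recovered exactly; measurement noise and similar source spectra are what make the real problem harder and motivate the Bayesian treatment.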

