Analysis of Stable Targets in High-Resolution Polarimetric SAR Data Stacks

2016 ◽  
Vol 7 (1) ◽  
pp. 10-19 ◽  
Author(s):  
B. Scheuchl

Polarimetric SAR data provide information about the scattering behaviour of the observed area. The availability of data stacks allows the identification of stable targets and a subsequent scattering analysis with a high degree of confidence at full resolution. A novel approach to finding and evaluating polarimetric persistent targets is presented, extending well-established analysis methods for single scenes. The Cloude-Pottier distributed-target decomposition applied to temporal averages (as opposed to spatial averages), combined with a Cameron point-target analysis applied to the selected pixels of each layer separately, provides an efficient scattering classification of polarimetric persistent point targets in the stack. The method can also be used to analyze targets identified through other means, albeit with a lower degree of confidence. The approach retains the full resolution of the data set, though temporal changes between acquisitions add complexity; result interpretation is therefore performed under a set of boundary conditions. Results from the analysis of two polarimetric data stacks acquired by RADARSAT-2 are shown.
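The Cloude-Pottier step described above rests on an eigen-decomposition of the (here temporally averaged) 3×3 coherency matrix T, from which entropy H and the mean alpha angle are derived. A minimal sketch, assuming T has already been formed from the stack (the function name and numpy formulation are illustrative, not the authors' implementation):

```python
import numpy as np

def cloude_pottier(T):
    """Entropy (H) and mean alpha angle from a 3x3 coherency matrix T.

    Here T is assumed to be a temporal average of per-acquisition
    coherency matrices, as in the stack-based analysis described above.
    """
    eigval, eigvec = np.linalg.eigh(T)      # real eigenvalues for Hermitian T
    eigval = np.clip(eigval, 1e-12, None)   # guard against round-off negatives
    p = eigval / eigval.sum()               # pseudo-probabilities
    H = -np.sum(p * np.log(p)) / np.log(3)  # entropy, normalised to [0, 1]
    # alpha_i is the angle whose cosine is the magnitude of the first
    # component of eigenvector i; average with the pseudo-probabilities.
    alphas = np.arccos(np.abs(eigvec[0, :]))
    alpha_mean = np.degrees(np.sum(p * alphas))
    return H, alpha_mean
```

For a fully random (identity) coherency matrix this yields H = 1 and a mean alpha of 60 degrees, the expected values for azimuthally symmetric random scattering.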

2006 ◽  
Vol 53 (12) ◽  
pp. 247-256 ◽  
Author(s):  
A. Marquot ◽  
A.-E. Stricker ◽  
Y. Racault

Activated sludge models, and ASM1 in particular, are well-recognised and useful mathematical representations of the macroscopic processes involved in the biological degradation of the pollution carried by wastewater. Nevertheless, using these models through simulation software requires a careful methodology for their calibration (determination of the model parameter values) and validation (verification against an independent data set). This paper presents the methodology and results of dynamic calibration and validation tasks, carried out as preliminary work for a modelling project aimed at defining a reference guideline for French designers and operators. To reach these goals, a biological nutrient removal (BNR) wastewater treatment plant (WWTP) with intermittent aeration was selected and monitored for two years. Two sets of calibrated parameters are given and discussed. The results of the long-term validation task are presented through a two-month simulation featuring numerous operational changes. Finally, it is concluded that, even though calibrating ASM1 with a high degree of confidence using a single set of parameters was not possible, the calibration results are sufficient to obtain satisfactory results over a long-term dynamic simulation. However, simulating long periods reveals specific calibration issues, such as variation of the nitrification capacity due to external events.


2018 ◽  
Vol 9 (3) ◽  
pp. 611-623
Author(s):  
Dingmu Xiao ◽  
Xiaomei Huang ◽  
Ningsheng Qin

Abstract Tree-ring width standard chronologies were created from Juniperus przewalskii Kom. samples collected in the southern Three-River Headwaters (TRH) region. Statistical analysis showed a high correlation between the first principal component (PC1) of the four chronologies and instrumental precipitation records over the annual September–August interval. Precipitation in the region was reconstructed for the past 461 years, and split-sample calibration-verification statistics confirmed that the reconstruction model was stable. The reconstruction series revealed 22 extremely dry years and 9 extremely wet years. Relatively dry periods occurred during 1567–1597, 1604–1614, 1641–1656, 1684–1700, 1734–1755, 1817–1830, 1913–1932, 1953–1971, and 1990–2005; relatively wet periods occurred during 1615–1630, 1657–1683, 1701–1733, 1756–1786, 1798–1816, 1844–1855, 1864–1875, 1885–1912, 1933–1952, and 1977–1989. Comparison with tree-ring-based precipitation reconstructions and chronologies from surrounding areas provided a high degree of confidence in our reconstruction, which also correlated well with the Monsoon Asia Drought Atlas (MADA) dataset in the corresponding grid cells. Empirical mode decomposition analysis suggests significant periodicities in the 2–5, 6–10, 11–18, and 28–60 year bands. This research contributes to a better understanding of historical precipitation variability and will aid future plans to address climate change in the TRH region.
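Split-sample calibration-verification of the kind used to test the reconstruction model can be sketched as follows: fit a transfer function on one half of the overlap with the instrumental record, then score the withheld half with Pearson's r and the reduction-of-error (RE) statistic, where RE > 0 indicates skill beyond the calibration-period mean. An illustrative sketch with synthetic data, not the authors' exact procedure:

```python
import numpy as np

def split_sample_verify(proxy, climate):
    """Split-sample calibration-verification in the dendroclimatic sense.

    Calibrate a linear proxy -> climate transfer function on the first
    half of the overlap period, then score the withheld second half with
    Pearson's r and the reduction-of-error (RE) statistic.
    """
    n = len(proxy) // 2
    slope, intercept = np.polyfit(proxy[:n], climate[:n], 1)  # calibration half
    pred = slope * proxy[n:] + intercept                      # verification half
    obs = climate[n:]
    r = np.corrcoef(pred, obs)[0, 1]
    # RE compares squared errors against predicting the calibration mean.
    re = 1 - np.sum((obs - pred) ** 2) / np.sum((obs - climate[:n].mean()) ** 2)
    return r, re
```

In practice the two halves are also swapped (early calibration / late verification and vice versa) to confirm the model is stable in both directions.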


2014 ◽  
Vol 47 (5) ◽  
pp. 1953-1967 ◽  
Author(s):  
Giorgio Licciardi ◽  
Ruggero Giuseppe Avezzano ◽  
Fabio Del Frate ◽  
Giovanni Schiavon ◽  
Jocelyn Chanussot

Author(s):  
S. Rouabah ◽  
M. Ouarzeddine ◽  
B. Azmedroub

Due to the increasing volume of available SAR data, powerful classification methods are needed to interpret the images. The Gaussian mixture model (GMM) is widely used to model distributions. In most applications, the GMM algorithm is applied directly to raw SAR data; its disadvantage is that forest and urban areas are classified with the same label, which causes problems in interpretation. In this paper, a combination of the improved Freeman decomposition and GMM classification is proposed: the improved Freeman decomposition powers are used as feature vectors for the GMM classification. The E-SAR polarimetric image acquired over Oberpfaffenhofen, Germany, is used as the data set. The results show that the proposed combination can solve the standard GMM classification problem.
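The classification step pairs decomposition powers with a Gaussian mixture model. A minimal, self-contained sketch of the idea (a hand-rolled two-component, diagonal-covariance EM fit on synthetic stand-in features; the real feature vectors would be the per-pixel improved Freeman decomposition powers):

```python
import numpy as np

def gmm2_fit_predict(X, iters=50):
    """Two-component, diagonal-covariance Gaussian mixture fitted by EM.

    Minimal illustrative stand-in for a library GMM. In the setting
    above, each row of X would hold the improved Freeman decomposition
    powers (surface, double-bounce, volume) of one pixel.
    """
    n, d = X.shape
    # Deterministic far-apart initialisation: the two most extreme
    # samples along the highest-variance feature.
    j = int(X.var(axis=0).argmax())
    mu = X[[int(X[:, j].argmin()), int(X[:, j].argmax())]].astype(float)
    var = np.tile(X.var(axis=0) + 1e-6, (2, 1))
    pi = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibilities from per-component Gaussian log-densities
        log_p = (-0.5 * (((X[:, None, :] - mu) ** 2) / var
                         + np.log(2 * np.pi * var)).sum(axis=2) + np.log(pi))
        log_p -= log_p.max(axis=1, keepdims=True)
        resp = np.exp(log_p)
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate mixing weights, means, and variances
        nk = resp.sum(axis=0) + 1e-12
        pi = nk / n
        mu = (resp.T @ X) / nk[:, None]
        var = (resp.T @ X**2) / nk[:, None] - mu**2 + 1e-6
    return resp.argmax(axis=1)

# Synthetic stand-in features: volume-dominated (forest-like) versus
# double-bounce-dominated (urban-like) pixels.
rng = np.random.default_rng(1)
forest = rng.normal([1.0, 1.0, 8.0], 0.3, size=(100, 3))
urban = rng.normal([1.0, 8.0, 1.0], 0.3, size=(100, 3))
labels = gmm2_fit_predict(np.vstack([forest, urban]))
```

On decomposition powers the two classes separate cleanly, which is exactly the distinction that is lost when the mixture is fitted to raw SAR amplitudes.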


2011 ◽  
Vol 68 (11) ◽  
pp. 1952-1969 ◽  
Author(s):  
Patrick W. DeHaan ◽  
Shana R. Bernall ◽  
Joseph M. DosSantos ◽  
Lawrence L. Lockard ◽  
William R. Ardren

Dams and other barriers fragment important migratory corridors for bull trout (Salvelinus confluentus) across the species' range. Three dams constructed without fish passage facilities prevented migratory bull trout in the Lake Pend Oreille and Clark Fork River system in Idaho and Montana, USA, from returning to their natal spawning tributaries for nearly 100 years. We genotyped bull trout from 39 spawning tributaries to assemble a baseline data set, which we used to develop a real-time genotyping and analysis protocol to assist with upstream fish-transport decisions. Self-assignment tests and analysis of blind samples indicated that unknown individuals could be assigned to their region of origin with a high degree of confidence. From 2004 to 2010, genetic assignments were conducted for 259 adult bull trout collected below mainstem dams; based on these assignments, 203 fish were transported upstream above one or more dams. This protocol has helped re-establish connectivity in a fragmented system, providing increased numbers of spawning adults for numerically depressed populations above the dams. We discuss the utility of genetic data for assisting with upstream passage decisions.
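The genetic-assignment idea (score an unknown genotype against baseline allele frequencies from each candidate region and pick the most likely source) can be illustrated with a toy likelihood calculation under Hardy-Weinberg proportions. The population names and frequencies below are made up, and the authors used dedicated assignment software rather than this sketch:

```python
import math

def assign_population(genotype, baseline_freqs):
    """Rank candidate source populations for one multilocus genotype.

    baseline_freqs maps population name -> list of per-locus dicts of
    allele -> frequency (estimated from a baseline data set); genotype
    is a list of (allele1, allele2) pairs, one per locus. Log-likelihoods
    assume Hardy-Weinberg proportions, with a small floor for alleles
    unseen in the baseline.
    """
    scores = {}
    for pop, loci in baseline_freqs.items():
        ll = 0.0
        for (a1, a2), freqs in zip(genotype, loci):
            p = freqs.get(a1, 1e-3)
            q = freqs.get(a2, 1e-3)
            # Homozygote probability p*p; heterozygote 2*p*q.
            ll += math.log(p * q) if a1 == a2 else math.log(2 * p * q)
        scores[pop] = ll
    return max(scores, key=scores.get), scores
```

With many loci the log-likelihood gap between candidate regions grows, which is what makes assignment to region of origin possible with high confidence.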


Sensors ◽  
2021 ◽  
Vol 21 (11) ◽  
pp. 3673
Author(s):  
Stefan Grushko ◽  
Aleš Vysocký ◽  
Petr Oščádal ◽  
Michal Vocetka ◽  
Petr Novák ◽  
...  

In a collaborative scenario, communication between humans and robots is fundamental to achieving good efficiency and ergonomics in task execution. A great deal of research has addressed enabling a robot system to understand and predict human behaviour, allowing the robot to adapt its motion to avoid collisions with human workers. When the production task has a high degree of variability, the robot's movements can be difficult to predict, leading to a feeling of anxiety when the robot changes its trajectory and approaches, since the worker has no information about the robot's planned movement. Additionally, without such information the human worker cannot effectively plan their own activity without forcing the robot to constantly replan its movement. We propose a novel approach to communicating the robot's intentions to a human worker: collaboration is improved by introducing haptic feedback devices that notify the human worker about the robot's currently planned trajectory and changes in its status. To verify the effectiveness of the developed human-machine interface in a shared collaborative workspace, a user study was designed and conducted among 16 participants, whose objective was to accurately recognise the goal position of the robot during its movement. Data collected during the experiment included both objective and subjective parameters. Statistically significant results indicated that all participants improved their task completion time by over 45% and were generally more subjectively satisfied when completing the task with the haptic feedback devices. The results also suggest the usefulness of the developed notification system, since it improved users' awareness of the robot's motion plan.
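For a 16-participant within-subject design like this, a natural significance check on completion times is a paired t-test. A sketch with synthetic timings (illustrative numbers, not the study's data; the abstract does not specify which statistical test the authors used):

```python
import numpy as np

def paired_t(slow, fast):
    """Paired t statistic and degrees of freedom for within-subject times."""
    d = np.asarray(slow) - np.asarray(fast)   # per-participant differences
    n = len(d)
    t = d.mean() / (d.std(ddof=1) / np.sqrt(n))
    return t, n - 1

# Synthetic completion times for 16 participants, with the haptic
# condition roughly 45% faster (made-up values for illustration only).
rng = np.random.default_rng(2)
without_haptics = rng.normal(60.0, 5.0, size=16)
with_haptics = without_haptics * 0.55
t_stat, df = paired_t(without_haptics, with_haptics)
```

The resulting t is compared against the two-tailed 5% critical value for df = 15, which is about 2.131; values beyond it indicate a statistically significant improvement.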


Axioms ◽  
2021 ◽  
Vol 10 (1) ◽  
pp. 36
Author(s):  
Norma P. Rodríguez-Cándido ◽  
Rafael A. Espin-Andrade ◽  
Efrain Solares ◽  
Witold Pedrycz

This work presents a novel approach to the prediction of financial asset prices. Its main contribution is the combination of compensatory fuzzy logic and classical technical analysis to build an efficient prediction model. The interpretability of the model allows its users to incorporate virtually any set of rules from technical analysis, in addition to the investors' knowledge of actual market conditions. This knowledge can be incorporated into the model as subjective assessments made by investors, obtained, for example, from the graphical analysis commonly performed by traders. The effectiveness of the model was assessed through its systematic application in the stock and cryptocurrency markets. From the results, we conclude that when the model expresses a high degree of recommendation, acting on that recommendation proves highly effective.
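A defining feature of compensatory fuzzy logic is that conjunction is compensatory: a strongly satisfied rule can partly offset a weakly satisfied one, unlike the min-based AND of classical fuzzy logic. One common compensatory conjunction is the geometric mean of the rule truth values, sketched here; the abstract does not specify the paper's exact operator family, so this is illustrative:

```python
def compensatory_and(truth_values):
    """Compensatory conjunction as the geometric mean of rule truth values.

    A high truth value on one rule partly offsets a low value on another,
    unlike min-based conjunction; a zero still acts as a veto. Inputs
    would be technical-analysis rule evaluations and investor assessments
    mapped to [0, 1].
    """
    product = 1.0
    for v in truth_values:
        product *= v
    return product ** (1.0 / len(truth_values))
```

For rule values 0.9 and 0.4 the compensatory conjunction is 0.6, above the min-based result of 0.4, while any rule at 0 still forces the conjunction to 0.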


Author(s):  
Behnam Jahangiri ◽  
Punyaslok Rath ◽  
Hamed Majidifard ◽  
William G. Buttlar

Various agencies have begun to research and introduce performance-related specifications (PRS) for the design of modern asphalt paving mixtures. Most recent studies have focused on the development and evaluation of simplified cracking tests. In some cases, PRS have been developed and validated building on these new tests, often by comparing test values to accelerated pavement test studies and/or limited field data. This study describes the findings of a comprehensive research project conducted for the Illinois Tollway, leading to a PRS for the design of mainline and shoulder asphalt mixtures. A novel approach was developed, involving the systematic establishment of specification requirements based on: 1) selection of baseline values based on minimally acceptable field performance thresholds; 2) elevation of thresholds to account for differences between short-term lab aging and expected long-term field aging; 3) further elevation of thresholds to account for variability in lab testing, plus variability in the testing of field cores; and 4) final adjustment and rounding of thresholds based on a consensus process. After a thorough evaluation of different candidate cracking tests in the course of the project, the Disk-shaped Compact Tension, DC(T), test was chosen to be retained in the Illinois Tollway PRS and is presented in this study for the design of crack-resistant mixtures. The DC(T) test was selected because of its high degree of correlation with field results and its excellent repeatability. Tailored Hamburg rut depth and stripping inflection point thresholds were also established for mainline and shoulder mixes.


Genetics ◽  
2004 ◽  
Vol 166 (4) ◽  
pp. 1923-1933 ◽  
Author(s):  
Lorinda K Anderson ◽  
Naser Salameh ◽  
Hank W Bass ◽  
Lisa C Harper ◽  
W Z Cande ◽  
...  

Abstract Genetic linkage maps reveal the order of markers based on the frequency of recombination between markers during meiosis. Because the rate of recombination varies along chromosomes, it has been difficult to relate linkage maps to chromosome structure. Here we use cytological maps of crossing over based on recombination nodules (RNs) to predict the physical position of genetic markers on each of the 10 chromosomes of maize. This is possible because (1) all 10 maize chromosomes can be individually identified from spreads of synaptonemal complexes, (2) each RN corresponds to one crossover, and (3) the frequency of RNs on defined chromosomal segments can be converted to centimorgan values. We tested our predictions for chromosome 9 using seven genetically mapped, single-copy markers that were independently mapped on pachytene chromosomes using in situ hybridization. The correlation between predicted and observed locations was very strong (r2 = 0.996), indicating a virtual 1:1 correspondence. Thus, this new, high-resolution, cytogenetic map enables one to predict the chromosomal location of any genetically mapped marker in maize with a high degree of accuracy. This novel approach can be applied to other organisms as well.
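The cytological-to-genetic conversion at the heart of this approach follows from points (2) and (3): each RN marks one crossover, and each crossover produces recombinant chromatids in half of the resulting gametes, so a segment's expected map length in centimorgans is 50 times its mean RN count. As a minimal sketch (the per-segment counts below are hypothetical):

```python
def rn_to_centimorgans(mean_rn_count):
    """Map distance implied by recombination-nodule (RN) counts.

    One RN corresponds to one crossover in the bivalent, and each
    crossover yields recombinant chromatids in half of the gametes,
    so mean RN count per segment * 50 gives the segment's length in cM.
    """
    return 50.0 * mean_rn_count

# Hypothetical mean RN counts for three segments along one chromosome:
segment_cm = [rn_to_centimorgans(c) for c in (0.12, 0.30, 0.55)]
```

Summing such per-segment centimorgan values along a chromosome gives the cytologically predicted positions against which the in-situ-hybridization-mapped markers were compared.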

