temporal sample
Recently Published Documents


TOTAL DOCUMENTS: 11 (FIVE YEARS: 4)
H-INDEX: 3 (FIVE YEARS: 1)

2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Adam J. Andrews ◽  
Gregory N. Puncher ◽  
Darío Bernal-Casasola ◽  
Antonio Di Natale ◽  
Francesco Massari ◽  
...  

Abstract. Atlantic bluefin tuna (Thunnus thynnus; BFT) abundance was depleted in the late 20th and early 21st century due to overfishing. Historical catch records further indicate that the abundance of BFT in the Mediterranean has been fluctuating since at least the 16th century. Here we build upon previous work on ancient DNA of BFT in the Mediterranean by comparing contemporary (2009–2012) specimens with archival (1911–1926) and archaeological (2nd century BCE–15th century CE) specimens that represent population states prior to these two major periods of exploitation, respectively. We successfully genotyped and analysed 259 contemporary and 123 historical (91 archival and 32 archaeological) specimens at 92 SNP loci that were selected for their ability to differentiate contemporary populations or their association with core biological functions. We found no evidence of genetic bottlenecks, inbreeding or population restructuring between temporal sample groups that might explain what has driven catch fluctuations since the 16th century. We also detected a putative adaptive response involving the cytoskeletal protein synemin, which may be related to muscle stress. However, these results require further investigation with more extensive genome-wide data to rule out demographic changes driven by overfishing and other natural and anthropogenic factors, and to elucidate the underlying adaptive drivers.
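As an illustration of the kind of population-genetic comparison described above, the sketch below estimates per-locus differentiation (a Hudson-style FST) and expected heterozygosity between two temporal sample groups from biallelic SNP genotypes. The genotype matrices, group sizes and estimator choice are illustrative assumptions, not the authors' actual pipeline.

```python
# Minimal sketch (not the authors' code): compare two temporal sample groups at biallelic
# SNP loci using a Hudson-style per-locus FST and expected heterozygosity as a coarse
# bottleneck indicator. The simulated genotypes below stand in for real data.
import numpy as np

def allele_freq(genotypes):
    """genotypes: array (n_individuals, n_loci) with 0/1/2 copies of the alternate allele
    (NaN for missing). Returns per-locus alternate-allele frequencies."""
    return np.nanmean(genotypes, axis=0) / 2.0

def fst_per_locus(geno_a, geno_b):
    """Hudson-style per-locus FST estimate for two groups of genotypes."""
    pa, pb = allele_freq(geno_a), allele_freq(geno_b)
    na = np.sum(~np.isnan(geno_a), axis=0) * 2   # allele sample sizes
    nb = np.sum(~np.isnan(geno_b), axis=0) * 2
    num = (pa - pb) ** 2 - pa * (1 - pa) / (na - 1) - pb * (1 - pb) / (nb - 1)
    den = pa * (1 - pb) + pb * (1 - pa)
    return num / den

def expected_heterozygosity(genotypes):
    p = allele_freq(genotypes)
    return float(np.mean(2 * p * (1 - p)))

# Hypothetical genotype matrices for two temporal groups at 92 biallelic SNP loci.
rng = np.random.default_rng(3)
p_true = rng.uniform(0.1, 0.9, 92)
contemporary = rng.binomial(2, p_true, size=(259, 92)).astype(float)
historical = rng.binomial(2, p_true, size=(123, 92)).astype(float)

print("mean per-locus FST:", round(float(np.nanmean(fst_per_locus(contemporary, historical))), 4))
print("He contemporary vs historical:",
      round(expected_heterozygosity(contemporary), 3),
      round(expected_heterozygosity(historical), 3))
```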


Author(s):  
Zahra Mousavi ◽  
Mohammad Mahdi Kiani ◽  
Hamid Aghajan

Abstract. The brain is constantly anticipating the future of sensory inputs based on past experiences. When new sensory data differ from predictions shaped by recent trends, neural signals are generated to report this surprise. Existing models for quantifying surprise are based on an ideal-observer assumption operating under one of three definitions of surprise: the Shannon, Bayesian, and confidence-corrected surprise. In this paper, we analyze visual and auditory EEG and auditory MEG signals recorded during oddball tasks to examine which temporal components of these signals are sufficient to decode the brain's surprise under each of these three definitions. We found that for both recording systems the Shannon surprise is always significantly better decoded than the Bayesian surprise, regardless of the sensory modality and the temporal features selected for decoding.

Author summary. A regression model is proposed for decoding the level of the brain's surprise in response to sensory sequences using selected temporal components of recorded EEG and MEG data. Three surprise quantification definitions (Shannon, Bayesian, and confidence-corrected surprise) are compared in terms of decoding power. Four different regimes for selecting temporal samples of EEG and MEG data are used to evaluate which part of the recorded data contains signatures of the brain's surprise, in the sense of offering high decoding power. We found that both the middle and late components of the EEG response offer strong decoding power for surprise, while the early components are significantly weaker. In the MEG response, the middle components have the highest decoding power, while the late components offer moderate decoding power. When a single temporal sample is used for decoding surprise, samples from the middle segment possess the highest decoding power. Shannon surprise is always better decoded than the other definitions of surprise for all four temporal feature selection regimes, and this superiority holds for both the EEG and MEG data across the entire range of temporal sample regimes used in our analysis.
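As a rough illustration of the decoding setup described in the author summary, the sketch below computes trial-by-trial Shannon surprise for a simulated binary oddball sequence and decodes it from temporal samples of (here simulated) EEG responses with cross-validated ridge regression. The learner, window boundaries and simulated data are assumptions for illustration only, not the authors' analysis code.

```python
# Minimal sketch: Shannon surprise for an oddball sequence, decoded from EEG temporal
# samples with ridge regression. Array shapes and parameter values are assumptions.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Simulated oddball sequence: 1 = deviant (20%), 0 = standard (80%).
n_trials = 400
stimuli = rng.choice([0, 1], size=n_trials, p=[0.8, 0.2])

def shannon_surprise(seq, decay=0.95, prior=1.0):
    """Surprise -log p(x_t) under a leaky-count observer starting from a flat prior."""
    counts = np.array([prior, prior], dtype=float)
    s = np.empty(len(seq))
    for t, x in enumerate(seq):
        s[t] = -np.log(counts[x] / counts.sum())  # surprise of the observed stimulus
        counts *= decay                           # forget old evidence
        counts[x] += 1.0                          # update with the new observation
    return s

surprise = shannon_surprise(stimuli)

# Placeholder EEG features: trials x time samples (e.g. 0-600 ms post-stimulus).
n_samples = 60
eeg = rng.standard_normal((n_trials, n_samples))
eeg[:, 25:40] += 0.5 * surprise[:, None]          # inject a "middle component" effect

# Compare temporal feature windows by their cross-validated decoding power (R^2).
for name, window in [("early", slice(0, 20)), ("middle", slice(20, 40)), ("late", slice(40, 60))]:
    r2 = cross_val_score(Ridge(alpha=1.0), eeg[:, window], surprise, cv=5, scoring="r2")
    print(f"{name:6s} window: mean R^2 = {r2.mean():.2f}")
```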


Forests ◽  
2019 ◽  
Vol 10 (7) ◽  
pp. 542
Author(s):  
Jarosław Socha ◽  
Luiza Tymińska-Czabańska

Knowledge of the potential productivity of forest sites is fundamental for making strategic decisions in forest management. Site productivity is usually evaluated using the site index, and therefore the development of site index models is one of the crucial tasks in forest research and forest management. This research aims to develop an effective method for building top height growth and site index models using data from temporary sample plots (TSPs). Exploiting the advantages of the generalised algebraic difference approach (GADA), the proposed method overcomes the limitations of the guide curve method, which has to date been used for site index modelling with TSP data and yields only a set of anamorphic site index curves. The proposed approach enables the construction of dynamic site index models with polymorphism and variable asymptotes. Such models better reflect local, site-specific height growth trajectories and therefore allow more appropriate site index estimation. We tested the proposed method using data collected from 5105 temporary sample plots in Poland. Our results indicate that growth trend estimates based on height–age measurements from TSPs can provide valuable data for modelling top height growth. For these reasons, the proposed method can be very useful in forest management.
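To make the GADA idea concrete, the sketch below fits one well-known GADA dynamic equation, the Hossfeld-based formulation of Cieszewski (2002), to simulated height–age pairs and then projects an observed height to a base age to obtain a site index. The parameterisation, simulated data and base age are assumptions; the model form actually developed by the authors may differ.

```python
# Minimal sketch: fit a GADA-type dynamic height-growth equation (Hossfeld-based form)
# and use it for site index estimation. Data below are simulated for illustration.
import numpy as np
from scipy.optimize import curve_fit

def gada_hossfeld(X, b1, b2, b3):
    """Predict height h2 at age t2 from height h1 observed at age t1."""
    t1, h1, t2 = X
    # Site-specific parameter X0 solved from the initial condition (t1, h1).
    x0 = 0.5 * (h1 - b1 + np.sqrt((h1 - b1) ** 2 + 4.0 * b2 * h1 * t1 ** (-b3)))
    return (b1 + x0) / (1.0 + (b2 / x0) * t2 ** (-b3))

# Hypothetical calibration data: (t1, h1) -> (t2, h2) pairs; real growth-trend data
# reconstructed from TSPs would replace this simulation.
rng = np.random.default_rng(1)
t1 = rng.uniform(15, 60, 300)
t2 = t1 + rng.uniform(5, 30, 300)
site = rng.uniform(20, 34, 300)                       # "true" site index at base age 100
true = (28.0, 180.0, 1.35)
h1 = gada_hossfeld((np.full_like(t1, 100.0), site, t1), *true)
h2 = gada_hossfeld((t1, h1, t2), *true) + rng.normal(0, 0.4, 300)

params, _ = curve_fit(gada_hossfeld, (t1, h1, t2), h2, p0=(25.0, 150.0, 1.3))
print("fitted b1, b2, b3:", np.round(params, 2))

# Site index estimation: project an observed (age, height) pair to the base age of 100 years.
si = gada_hossfeld((np.array([40.0]), np.array([18.0]), np.array([100.0])), *params)
print("estimated site index (base age 100):", np.round(si, 1))
```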


2018 ◽  
Vol 9 (10) ◽  
pp. 952-961 ◽  
Author(s):  
Pengyu Hao ◽  
Huajun Tang ◽  
Zhongxin Chen ◽  
Le Yu ◽  
Mingquan Wu

2018 ◽  
Vol 11 (2) ◽  
pp. 925-938 ◽  
Author(s):  
Timo H. Virtanen ◽  
Pekka Kolmonen ◽  
Larisa Sogacheva ◽  
Edith Rodríguez ◽  
Giulia Saponaro ◽  
...  

Abstract. Satellite-based aerosol products are routinely validated against ground-based reference data, usually obtained from sun photometer networks such as AERONET (AEROsol RObotic NETwork). In a typical validation exercise a spatial sample of the instantaneous satellite data is compared against a temporal sample of the point-like ground-based data. The observations do not correspond to exactly the same column of the atmosphere at the same time, and the representativeness of the reference data depends on the spatiotemporal variability of the aerosol properties in the samples. The associated uncertainty is known as the collocation mismatch uncertainty (CMU). The validation results depend on the sampling parameters: while small samples involve less variability, they are more sensitive to the inevitable noise in the measurement data. In this paper we systematically study the effect of the sampling parameters in the validation of the AATSR (Advanced Along-Track Scanning Radiometer) aerosol optical depth (AOD) product against AERONET data, and the associated collocation mismatch uncertainty. To this end, we study the spatial AOD variability in the satellite data, compare it against the corresponding values obtained from densely located AERONET sites, and assess the possible reasons for observed differences. We find that the spatial AOD variability in the satellite data is approximately two times larger than in the ground-based data, and that the spatial variability correlates only weakly with that of AERONET for short distances. We interpret that only half of the variability in the satellite data is due to natural variability in the AOD, and the rest is noise due to retrieval errors. However, for larger distances (∼0.5°) the correlation improves as the noise is averaged out, and the day-to-day changes in regional AOD variability are well captured. Furthermore, we assess the usefulness of the spatial variability of the satellite AOD data as an estimate of CMU by comparing the retrieval errors to the total uncertainty estimates that include the CMU in the validation. We find that accounting for CMU increases the fraction of consistent observations.
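A minimal sketch of the collocation logic described above: average a spatial sample of satellite pixels and a temporal sample of ground observations for each match, take the spatial standard deviation of the satellite AOD as a proxy for the CMU, and count the collocations whose AOD difference falls within the combined uncertainty. Column names, thresholds and the quadrature combination are illustrative assumptions, not the paper's exact procedure.

```python
# Minimal sketch of satellite-vs-AERONET collocation with a CMU estimate.
import numpy as np
import pandas as pd

def collocate(sat, ground):
    """Build one record per collocation.
    sat: one row per satellite pixel (columns: match_id, aod, aod_unc), already restricted
    to, e.g., a 50 km radius around the site.
    ground: one row per sun-photometer observation (columns: match_id, aod), already
    restricted to, e.g., +/- 30 min around the overpass."""
    records = []
    for mid, s in sat.groupby("match_id"):
        g = ground[ground["match_id"] == mid]
        if len(s) < 2 or len(g) < 1:
            continue
        records.append({
            "match_id": mid,
            "aod_sat": s["aod"].mean(),            # spatial sample mean
            "aod_ground": g["aod"].mean(),         # temporal sample mean
            "cmu": s["aod"].std(ddof=1),           # spatial variability as a CMU proxy
            "retrieval_unc": s["aod_unc"].mean(),  # mean per-pixel retrieval uncertainty
        })
    return pd.DataFrame(records)

def fraction_consistent(colloc):
    """Fraction of collocations whose |AOD difference| lies within the total uncertainty,
    with retrieval uncertainty and CMU combined in quadrature (an assumed combination rule)."""
    total_unc = np.sqrt(colloc["retrieval_unc"] ** 2 + colloc["cmu"] ** 2)
    return float((np.abs(colloc["aod_sat"] - colloc["aod_ground"]) <= total_unc).mean())

# Example with tiny synthetic frames.
sat = pd.DataFrame({"match_id": [1, 1, 1], "aod": [0.20, 0.25, 0.22], "aod_unc": [0.03, 0.03, 0.03]})
ground = pd.DataFrame({"match_id": [1, 1], "aod": [0.21, 0.23]})
print(fraction_consistent(collocate(sat, ground)))
```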


2017 ◽  
Author(s):  
Timo H. Virtanen ◽  
Pekka Kolmonen ◽  
Larisa Sogacheva ◽  
Edith Rodríguez ◽  
Giulia Saponaro ◽  
...  

Abstract. Satellite-based aerosol products are routinely validated against ground-based reference data, usually obtained from sun photometer networks such as AERONET (AEROsol RObotic NETwork). In a typical validation exercise a spatial sample of the instantaneous satellite data is compared against a temporal sample of the point-like ground-based data. The observations do not correspond to exactly the same column of the atmosphere at the same time, and the representativeness of the reference data depends on the spatiotemporal variability of the aerosol properties in the samples. The associated uncertainty is known as the collocation mismatch uncertainty (CMU). The validation results depend on the sampling parameters: while small samples involve less variability, they are more sensitive to the inevitable noise in the measurement data. In this paper we systematically study the effect of the sampling parameters in the validation of the AATSR (Advanced Along-Track Scanning Radiometer) aerosol optical depth (AOD) product against AERONET data, and the associated collocation mismatch uncertainty. To this end, we study the spatial AOD variability in the satellite data, compare it against the corresponding values obtained from densely located AERONET sites, and assess the possible reasons for observed differences. We find that the spatial AOD variability in the satellite data is approximately two times larger than in the ground-based data, and that the local AOD variability values correlate only weakly for short distances. We interpret that only half of the variability in the satellite data is due to natural variability in the AOD, and the rest is noise due to retrieval errors. However, for larger distances (∼0.5°) the correlation improves as the noise is averaged out, and the day-to-day changes in regional AOD variability are well captured. Furthermore, we assess the usefulness of the spatial variability of the satellite AOD data as an estimate of CMU by comparing the retrieval errors to the total uncertainty estimates in the validation. We find that accounting for CMU increases the fraction of consistent observations.


PLoS ONE ◽  
2015 ◽  
Vol 10 (11) ◽  
pp. e0140757 ◽  
Author(s):  
Joshua F. Goldberg ◽  
Tshering Tempa ◽  
Nawang Norbu ◽  
Mark Hebblewhite ◽  
L. Scott Mills ◽  
...  

Author(s):  
Ioannis T. Georgiou

Local damage at the tip of a composite propeller is diagnosed by comparing its impact-induced free coupled dynamics to that of a pristine wooden propeller of the same size and shape. This is accomplished by indirectly creating, via collocated measurements, distributed information on the coupled acceleration field of the propellers. The powerful data-driven modal expansion analysis delivered by the Proper Orthogonal Decomposition (POD) transform reveals that ensembles of impact-induced collocated coupled experimental acceleration signals exhibit a high level of spatio-temporal coherence; they therefore furnish a valuable spatio-temporal sample of the coupled response induced by a point impulse. In view of this fact, a tri-axial sensor was placed on the propeller hub to collect collocated coupled acceleration signals induced by nondestructive modal hammer impacts, thereby obtaining a reduced-order characterization of the coupled free dynamics. This experimental data-driven analysis reveals that the in-plane unit components of the POD modes for both propellers have similar, nearly identical shapes. For the damaged propeller, however, the POD shape difference is quite pronounced. The shapes of the POD modes are used to compute difference indices that directly reflect damage. At the first POD energy level, the shape-difference indices of the damaged composite propeller are considerably larger than those of the pristine wooden propeller.
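The sketch below illustrates the POD-based comparison in a generic way: it extracts POD modes from an ensemble of impact responses via a singular value decomposition and computes a simple cosine-based shape-difference index between the dominant modes of two propellers. The data layout and the index definition are assumptions for illustration, not the author's exact procedure.

```python
# Minimal sketch: POD modes of impact-response ensembles via SVD and a shape-difference index.
import numpy as np

def pod_modes(ensemble):
    """ensemble: array (n_impacts, n_channels * n_samples) of impact-induced responses.
    Returns the POD modes (rows of vt) and the fraction of signal energy each mode captures."""
    u, s, vt = np.linalg.svd(ensemble, full_matrices=False)
    energy = s ** 2 / np.sum(s ** 2)
    return vt, energy

def shape_difference_index(mode_a, mode_b):
    """1 - |cosine similarity| between two POD mode shapes (0 means identical up to sign)."""
    mode_a = mode_a / np.linalg.norm(mode_a)
    mode_b = mode_b / np.linalg.norm(mode_b)
    return 1.0 - abs(float(np.dot(mode_a, mode_b)))

# Usage with hypothetical data: rows are repeated hammer impacts, columns are the
# concatenated x/y/z hub-acceleration samples.
rng = np.random.default_rng(2)
pristine = rng.standard_normal((20, 3 * 512))
damaged = pristine + 0.3 * rng.standard_normal((20, 3 * 512))

modes_p, energy_p = pod_modes(pristine)
modes_d, energy_d = pod_modes(damaged)
print("first-mode energy (pristine, damaged):", round(energy_p[0], 3), round(energy_d[0], 3))
print("shape-difference index, mode 1:", round(shape_difference_index(modes_p[0], modes_d[0]), 3))
```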

