Comparison Between Lotka-Volterra and Multivariate Autoregressive Models of Ecological Interaction Systems

2021 ◽  
Author(s):  
Daniel V. Olivença ◽  
Jacob D. Davis ◽  
Eberhard O. Voit

Abstract
Lotka-Volterra (LV) and Multivariate Autoregressive (MAR) models are computational frameworks with different mathematical structures that have both been proposed for the same purpose: extracting the governing features of dynamic interactions among coexisting populations of different species from observed time series data. We systematically compare the feasibility of the two modeling approaches, using four synthetically generated datasets and seven ecological datasets from the literature. The overarching result is that LV models outperform MAR models in most cases and are generally superior for representing cases where the dependent variables deviate greatly from their steady states. A large dynamic range is particularly prevalent when the populations are highly abundant, change considerably over time, and exhibit a large signal-to-noise ratio. By contrast, MAR models are better suited for analyses of populations with low abundances and for investigations where the quantification of noise is important. We conclude that the choice between the two modeling frameworks should be guided by the specific goals of the analysis and the dynamic features of the data.

Availability of algorithms used: https://github.com/LBSA-VoitLab/Comparison-Between-LV-and-MAR-Models-of-Ecological-Interaction-Systems
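To make the structural contrast concrete, the following minimal Python sketch (not the authors' code; all parameter values are arbitrary examples) simulates a two-species generalized Lotka-Volterra system and then fits a first-order MAR model to the log-abundances by least squares:

```python
# Illustrative sketch (not the authors' code): the two model structures being
# compared. Parameter values are arbitrary examples, not estimates from the paper.
import numpy as np
from scipy.integrate import solve_ivp

# --- Generalized Lotka-Volterra: dx_i/dt = x_i * (r_i + sum_j a_ij * x_j) ---
r = np.array([0.8, -0.4])          # intrinsic growth rates (assumed)
A = np.array([[-0.5, -0.3],
              [ 0.4, -0.2]])       # interaction matrix (assumed)

def glv(t, x):
    return x * (r + A @ x)

sol = solve_ivp(glv, (0.0, 50.0), y0=[1.0, 0.5], t_eval=np.linspace(0, 50, 200))
abund = sol.y.T                    # simulated abundances, shape (200, 2)

# --- MAR(1): ln x_t = c + B ln x_{t-1} + noise, estimated by least squares ---
logx = np.log(abund + 1e-9)
X, Y = logx[:-1], logx[1:]
design = np.hstack([np.ones((len(X), 1)), X])
coef, *_ = np.linalg.lstsq(design, Y, rcond=None)
c_hat, B_hat = coef[0], coef[1:].T # intercepts and MAR interaction matrix
print("Estimated MAR interaction matrix B:\n", B_hat)
```

In this toy setting the sign pattern of B_hat can be compared against the assumed LV interaction matrix A, which is the kind of correspondence the paper examines on real and synthetic datasets.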

Malware analysis can be classified as static or dynamic. Static analysis inspects the malicious code itself, observing features such as file signatures and embedded strings. Code obfuscation techniques such as string encryption and class encryption can easily be applied to defeat static code analysis. Dynamic, or behavioural, data is more difficult to obfuscate, since the malicious payload may already have been executed before it is detected. In this paper, samples obtained from repositories such as VirusShare are run in the Cuckoo Sandbox with the help of agent.py. Dynamic features are extracted from the generated Cuckoo logs in HTML and JSON format, and a recurrent neural network is used to determine whether each sample is malicious. Recurrent neural networks are able to capture time-series structure in the behavioural data and are therefore capable of predicting whether an executable is malicious.
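A rough sketch of such a pipeline is shown below (not the paper's implementation). It assumes Cuckoo's usual behavior → processes → calls JSON report layout, extracts the ordered API-call names, and scores the sequence with a small LSTM classifier; the vocabulary handling and the network sizes are illustrative choices:

```python
# Minimal sketch of the described pipeline (not the paper's implementation).
# Assumes Cuckoo's usual behavior -> processes -> calls JSON layout and uses
# arbitrary network sizes; both are illustrative choices.
import json
import torch
import torch.nn as nn

def api_sequence(report_path, vocab):
    """Extract the ordered API-call names from a Cuckoo JSON report."""
    with open(report_path) as f:
        report = json.load(f)
    calls = [c["api"]
             for proc in report.get("behavior", {}).get("processes", [])
             for c in proc.get("calls", [])]
    return torch.tensor([[vocab.setdefault(a, len(vocab) + 1) for a in calls]])

class MalwareLSTM(nn.Module):
    def __init__(self, vocab_size=5000, emb=64, hidden=128):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb, padding_idx=0)
        self.lstm = nn.LSTM(emb, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, seq):                          # seq: (batch, time) integer ids
        out, _ = self.lstm(self.emb(seq))
        return torch.sigmoid(self.head(out[:, -1]))  # P(malicious)

model = MalwareLSTM()
dummy = torch.randint(1, 5000, (1, 300))             # stand-in for a real API sequence
print("malicious probability:", model(dummy).item())
```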


Author(s):  
Aris Spanos

The current discontent with the dominant macroeconomic theory paradigm, known as Dynamic Stochastic General Equilibrium (DSGE) models, calls for an appraisal of the methods and strategies employed in studying and modeling macroeconomic phenomena using aggregate time series data. The appraisal pertains to the effectiveness of these methods and strategies in accomplishing the primary objective of empirical modeling: to learn from data about phenomena of interest. The co-occurring developments in macroeconomics and econometrics since the 1930s provide the backdrop for the appraisal, with the Keynes vs. Tinbergen controversy at center stage. The overall appraisal is that the DSGE paradigm gives rise to estimated structural models that are both statistically and substantively misspecified, yielding untrustworthy evidence that contributes very little, if anything, to real learning from data about macroeconomic phenomena. A primary contributor to the untrustworthiness of evidence is the traditional econometric perspective of viewing empirical modeling as curve-fitting (structural models), guided by impromptu error-term assumptions and evaluated on goodness-of-fit grounds. Regrettably, excellent fit is neither necessary nor sufficient for the reliability of inference and the trustworthiness of the ensuing evidence. Recommendations on how to improve the trustworthiness of empirical evidence revolve around a broader, model-based (non-curve-fitting) modeling framework that attributes cardinal roles to both theory and data without undermining the credibility of either source of information. Two crucial distinctions hold the key to securing the trustworthiness of evidence. The first distinguishes between modeling (specification, misspecification testing, and respecification) and inference, and the second between a substantive (structural) model and a statistical model (the probabilistic assumptions imposed on the particular data). This enables one to establish statistical adequacy (the validity of these assumptions) before relating the statistical model to the structural model and posing questions of interest to the data. The greatest enemy of learning from data about macroeconomic phenomena is not the absence of an alternative and more coherent empirical modeling framework, but the illusion that foisting highly formal structural models on the data can give rise to such learning just because their construction and curve-fitting rely on seemingly sophisticated tools. Regrettably, applying sophisticated tools to a statistically and substantively misspecified DSGE model does nothing to restore the trustworthiness of the evidence stemming from it.
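As a highly simplified illustration of the "statistical adequacy" step described above, the sketch below fits a simple autoregressive statistical model to a simulated series and tests its probabilistic assumptions before any substantive interpretation; the AR(1) model, the simulated data, and the particular tests are stand-ins, not those discussed in the article:

```python
# Highly simplified illustration of checking statistical adequacy: test the
# probabilistic assumptions of the statistical model before interpreting any
# structural model. Model, data, and tests are stand-ins for exposition only.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.stats.diagnostic import acorr_ljungbox, het_arch
from scipy import stats

rng = np.random.default_rng(0)
y = np.zeros(400)
for t in range(1, 400):                    # simulate a simple AR(1) series
    y[t] = 0.6 * y[t - 1] + rng.standard_normal()

resid = ARIMA(y, order=(1, 0, 0)).fit().resid

# Assumption checks: independence, constant variance, normality of the errors.
print(acorr_ljungbox(resid, lags=[10]))    # Ljung-Box test for autocorrelation
print(het_arch(resid)[:2])                 # ARCH LM statistic and p-value
print(stats.jarque_bera(resid))            # Jarque-Bera normality test
```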


2021 ◽  
Author(s):  
Andrew Pretorius ◽  
Emma Smith ◽  
Adam Booth ◽  
Poul Christofferson ◽  
Andy Nowacki ◽  
...  

Seismic surveys are widely used to study the properties of glaciers, basal material and conditions, ice temperature and crystal orientation fabric. The emerging technology of Distributed Acoustic Sensing (DAS) uses fibre optic cables as pseudo-seismic receivers, reconstructing seismic measurements at a higher spatial and temporal resolution than is possible using traditional geophone deployments. DAS generates large volumes of data, especially in passive mode, which can be costly in time and cumbersome to analyse. Machine learning tools provide an effective means of automatically identifying events within these records, avoiding a bottleneck in the data analysis process. Here we present initial trials of machine learning for a borehole-deployed DAS system on Store Glacier, West Greenland. Data were acquired in July 2019, using a Silixa iDAS interrogator and a BRUsens fibre optic cable installed in a 1043 m-deep borehole. The interrogator sampled at 4000 Hz, recording both controlled-source Vertical Seismic Profiles (VSPs), made with a hammer-and-plate source, and a 3-day passive record of cryoseismicity.

We used a Convolutional Neural Network (CNN) to identify seismic events within the seismic record. A CNN is a deep learning algorithm that uses a series of convolutional filters to extract features from a 2-dimensional matrix of values. These features are then used to train a model that can recognise objects or patterns within the dataset. CNNs are a powerful classification tool, widely applied to the analysis of both images and time series data. Previous research has demonstrated the ability of CNNs to recognise seismic phases in time series data for long-range earthquake detection, even when the phases are masked by a low signal-to-noise ratio. For the Store Glacier data, initial results were obtained using a CNN trained on hand-labelled, uniformly sized windows. At present, these windows have been targeted around high signal-to-noise ratio seismic events in the controlled-source VSPs only. Once trained, the CNN achieved 90% accuracy in recognising whether new windows contained coherent seismic energy.

The next phase of analysis will be to assess the performance of the CNN when trained and tested on large passive DAS datasets. The method will then be used to identify and flag seismic events within the passive record for interpretation and event location. The identified signals will be used to provide information on the glacier’s seismic velocity structure, ice temperature and ice crystal orientation fabric and anisotropy. Basal reflections were identified and will be used to provide information on the subglacial material properties and conditions of Store Glacier. The efficiency of the CNN allows detailed insight into the origins and style of glacier seismicity, demonstrating further advantages of passive DAS instrumentation.
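The general shape of such a window classifier is sketched below (an assumed architecture for illustration only, not the network used in this study): a small PyTorch CNN that maps a fixed-size DAS window of fibre channels by time samples to a noise/event score, with arbitrary window and layer sizes:

```python
# Assumed architecture for illustration only (not the network used in this study):
# a small CNN that labels a fixed-size DAS window (fibre channels x time samples)
# as noise vs. coherent seismic energy. Window and layer sizes are arbitrary.
import torch
import torch.nn as nn

class WindowCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, 2)
        )

    def forward(self, x):                  # x: (batch, 1, channels, samples)
        return self.classifier(self.features(x))

window = torch.randn(1, 1, 64, 512)        # e.g. a window cut from the VSP record
logits = WindowCNN()(window)
print("class scores (noise, event):", logits.detach().numpy())
```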


2009 ◽  
Vol 26 (11) ◽  
pp. 2474-2487 ◽  
Author(s):  
Igor R. Ivić ◽  
Dušan S. Zrnić ◽  
Tian-You Yu

Abstract Currently, signal detection and censoring in operational weather radars are performed by thresholding the estimated signal-to-noise ratio (SNR) and/or the magnitude of the autocorrelation coefficient at the first temporal lag. The growing popularity of polarimetric radars prompts the quest for improved detection schemes that take advantage of the signals from the two orthogonally polarized electric fields. A hybrid approach is developed based on the sum of the cross-correlation estimates as well as the powers and autocorrelations from each of the dual-polarization returns. The hypothesis that “signal is present” is accepted if the sum exceeds a predetermined threshold; otherwise, the data are considered to represent noise and are censored. The threshold is chosen so that the rate of false detections is less than or equal to a preset acceptable value. The scheme is evaluated both in simulations and through implementation on time series data collected by the research weather surveillance radar (KOUN) in Norman, Oklahoma.
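A simplified sketch of a detection statistic of this kind is given below: it sums power, lag-1 autocorrelation magnitude, and H-V cross-correlation magnitude estimates from the dual-polarization I/Q samples of one range gate and compares the result against a threshold. The exact estimator combination and the false-alarm-rate calibration of the threshold used in the paper are not reproduced here:

```python
# Simplified sketch of a hybrid dual-polarization detection statistic; the exact
# estimator combination and threshold calibration of the paper are not reproduced.
import numpy as np

def detection_statistic(h, v):
    """h, v: complex I/Q sample sequences (H and V channels) from one range gate."""
    p_h = np.mean(np.abs(h) ** 2)                    # power, H channel
    p_v = np.mean(np.abs(v) ** 2)                    # power, V channel
    r1_h = np.abs(np.mean(h[1:] * np.conj(h[:-1])))  # |lag-1 autocorrelation|, H
    r1_v = np.abs(np.mean(v[1:] * np.conj(v[:-1])))  # |lag-1 autocorrelation|, V
    c_hv = np.abs(np.mean(h * np.conj(v)))           # |H-V cross-correlation|
    return p_h + p_v + r1_h + r1_v + c_hv

rng = np.random.default_rng(1)
gate_h = (rng.standard_normal(64) + 1j * rng.standard_normal(64)) / np.sqrt(2)
gate_v = (rng.standard_normal(64) + 1j * rng.standard_normal(64)) / np.sqrt(2)

threshold = 2.5   # in practice set from the acceptable false-detection rate
print("signal present" if detection_statistic(gate_h, gate_v) > threshold else "censored")
```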


2020 ◽  
Vol 12 (19) ◽  
pp. 3219 ◽  
Author(s):  
Yaping Xu ◽  
Nicholas R. Vaughn ◽  
David E. Knapp ◽  
Roberta E. Martin ◽  
Christopher Balzotti ◽  
...  

We present a new method for the detection of coral bleaching using satellite time-series data. While the detection of coral bleaching from satellite imagery is difficult due to the low signal-to-noise ratio of benthic reflectance, we overcame this difficulty using three approaches: 1) specialized pre-processing developed for Planet Dove satellites, 2) a time-series approach for determining baseline reflectance statistics, and 3) a regional filter based on a preexisting map of live coral. The time series was divided into a baseline period (April-July 2019), when no coral bleaching was known to have taken place, and a bleaching period (August 2019-present), when bleaching was known to have occurred based on field data. Identifying these periods allowed the computation of a Standardized Bottom Reflectance (SBR) for each region. The SBR transforms the weekly bottom reflectance into a value relative to the baseline reflectance distribution statistics, increasing the sensitivity of bleaching detection. We tested three scales of temporal smoothing of the SBR (weekly, cumulative average, and three-week moving average). Our field verification of coral bleaching throughout the main Hawaiian Islands showed that the cumulative average and three-week moving average smoothing detected the highest proportion of coral bleaching locations, correctly identifying 11 and 10 out of 18 locations, respectively. However, the three-week moving average provided better sensitivity in detecting coral bleaching, with a performance increase of at least one standard deviation, which helps define the confidence level of a detected bleaching event.
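The core SBR computation can be sketched as follows, with synthetic weekly reflectance values; the 2-sigma threshold and the direction of the flagged anomaly are illustrative assumptions rather than the paper's calibrated settings:

```python
# Minimal sketch of the SBR idea with synthetic weekly reflectance values.
# Threshold value and anomaly direction are illustrative assumptions.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
weeks = pd.date_range("2019-04-01", periods=40, freq="W")
reflectance = pd.Series(0.10 + 0.01 * rng.standard_normal(40), index=weeks)
reflectance.loc["2019-09-01":] += 0.03        # brighter bottom after bleaching onset

baseline = reflectance.loc["2019-04-01":"2019-07-31"]
sbr = (reflectance - baseline.mean()) / baseline.std()   # standardized bottom reflectance
sbr_3wk = sbr.rolling(window=3).mean()                   # three-week moving average

flagged = sbr_3wk[sbr_3wk > 2.0]              # weeks exceeding the assumed threshold
print(flagged)
```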


Author(s):  
Siddharth Sonti ◽  
Joseph Horn ◽  
Eric Keller ◽  
Asok Ray

This short paper presents a data-driven method for identifying the stability margin in rotorcraft system dynamics; the underlying concept is built upon the principles of symbolic dynamics. The algorithm involves wavelet-packet-based pre-processing to remove spurious disturbances and to improve the signal-to-noise ratio (SNR) of sensor time series. A quantified measure, called Instability Measure, is constructed from the processed time series data to obtain an estimate of the relative instability of the dynamic modes of interest on the rotorcraft system. The proposed method has been tested with numerical simulations, and correlations between the Instability Measure and the damping parameters of selected dynamic modes of the rotor blade have been established.
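The style of the processing chain can be sketched as follows: wavelet-packet soft thresholding of a noisy sensor series, followed by a symbolic-dynamics statistic. The entropy of the symbol distribution used here is only a stand-in; the paper's Instability Measure is constructed differently:

```python
# Illustrative sketch of the processing chain only: wavelet-packet soft
# thresholding, then a symbolic-dynamics statistic (entropy of the symbol
# distribution) as a stand-in for the paper's Instability Measure.
import numpy as np
import pywt

def symbolize(x, n_symbols=8):
    """Coarse-grain a signal into symbols using equal-probability (quantile) bins."""
    edges = np.quantile(x, np.linspace(0, 1, n_symbols + 1)[1:-1])
    return np.digitize(x, edges)

def instability_proxy(x, wavelet="db4", level=4, n_symbols=8):
    # Wavelet-packet soft thresholding to suppress spurious disturbances.
    wp = pywt.WaveletPacket(data=x, wavelet=wavelet, mode="symmetric", maxlevel=level)
    sigma = np.median(np.abs(x - np.median(x))) / 0.6745   # crude noise estimate
    for node in wp.get_level(level, order="natural"):
        node.data = pywt.threshold(node.data, value=sigma, mode="soft")
    denoised = wp.reconstruct(update=False)

    # Symbolic statistic: Shannon entropy of the symbol distribution.
    symbols = symbolize(denoised, n_symbols)
    p = np.bincount(symbols, minlength=n_symbols) / len(symbols)
    return -np.sum(p[p > 0] * np.log(p[p > 0]))

t = np.linspace(0, 10, 2000)
blade_signal = (np.exp(-0.05 * t) * np.sin(12 * t)
                + 0.05 * np.random.default_rng(0).standard_normal(len(t)))
print("instability proxy:", instability_proxy(blade_signal))
```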


Forecasting ◽  
2018 ◽  
Vol 1 (1) ◽  
pp. 47-58 ◽  
Author(s):  
Akram Farhadi ◽  
Joshua Chern ◽  
Daniel Hirsh ◽  
Tod Davis ◽  
Mingyoung Jo ◽  
...  

Increased Intracranial Pressure (ICP) is a serious and often life-threatening condition. If the increased pressure pushes on critical brain structures and blood vessels, it can lead to serious permanent problems or even death. In this study, we propose a novel regression model to forecast ICP episodes in children 30 min in advance, using the dynamic characteristics of continuous intracranial pressure, vitals and medications during the last two hours. The correlation between physiological parameters (blood pressure, respiratory rate and heart rate) and the ICP is analyzed. Linear regression, Lasso regression, support vector machine and random forest algorithms are used to forecast the next 30 min of the recorded ICP. Finally, dynamic features are created based on vitals, medications and the ICP. A weak correlation (0.2) between blood pressure and the ICP is reported. The Root-Mean-Square Error (RMSE) of the random forest model decreased from 1.6 to 0.89% when the medication variables from the last two hours were included. The random forest regression gave an accurate model for the ICP forecast, with a correlation of 0.99 between forecast and experimental values.
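The forecasting setup can be sketched as follows, using scikit-learn and synthetic minute-resolution data; the sampling rate, the summary features, and the data itself are illustrative assumptions rather than the study's actual feature engineering:

```python
# Sketch of the forecasting setup with synthetic minute-resolution data;
# sampling rate, features, and data are illustrative assumptions only.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
n = 3000                                     # minutes of monitoring data
df = pd.DataFrame({
    "icp": 12 + np.cumsum(rng.normal(0, 0.2, n)),   # intracranial pressure
    "hr": 90 + rng.normal(0, 5, n),                 # heart rate
    "bp": 75 + rng.normal(0, 4, n),                 # blood pressure
    "rr": 20 + rng.normal(0, 2, n),                 # respiratory rate
})

window, horizon = 120, 30                    # 2 h of history, forecast 30 min ahead
rows, targets = [], []
for t in range(window, n - horizon):
    hist = df.iloc[t - window:t]
    rows.append(hist.mean().tolist() + hist.std().tolist())
    targets.append(df["icp"].iloc[t + horizon])

X, y = np.array(rows), np.array(targets)
split = int(0.8 * len(X))
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X[:split], y[:split])
rmse = np.sqrt(mean_squared_error(y[split:], model.predict(X[split:])))
print(f"test RMSE: {rmse:.2f}")
```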


2021 ◽  
Author(s):  
Christopher Endemann ◽  
Bryan M. Krause ◽  
Kirill V. Nourski ◽  
Matthew I. Banks ◽  
Barry Van Veen

Abstract
Fundamental to elucidating the functional organization of the brain is the assessment of causal interactions between different brain regions. Multivariate autoregressive (MVAR) modeling techniques applied to multisite electrophysiological recordings are a promising avenue for identifying such causal links. They estimate the degree to which past activity in one or more brain regions is predictive of another region’s present activity, while simultaneously accounting for the mediating effects of other regions. Including in the model as many mediating variables as possible has the benefit of drastically reducing the odds of detecting spurious causal connectivity. However, effective bounds on the number of MVAR model coefficients that can be estimated reliably from limited data make exploiting the potential of MVAR models challenging. Here, we utilize well-established dimensionality-reduction techniques to fit MVAR models to human intracranial data from ∼100–200 recording sites spanning dozens of anatomically and functionally distinct cortical regions. First, we show that high dimensional MVAR models can be successfully estimated from long segments of data and yield plausible connectivity profiles. Next, we use these models to generate synthetic data with known ground-truth connectivity to explore the utility of applying principal component analysis and group least absolute shrinkage and selection operator (LASSO) to reduce the number of parameters (connections) during model fitting to shorter data segments. We show that group LASSO is highly effective for recovering ground truth connectivity in the limited data regime, capturing important features of connectivity for high-dimensional models with as little as 10 s of data. The methods presented here have broad applicability to the analysis of high-dimensional time series data in neuroscience, facilitating the elucidation of the neural basis of sensation, cognition, and arousal.
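The sketch below conveys the spirit of sparse MVAR estimation on simulated data with known connectivity. For brevity it uses an element-wise LASSO penalty on a first-order model, so each coefficient is its own group; the paper applies group LASSO, which selects all coefficients belonging to a connection jointly:

```python
# Sketch of sparse MVAR estimation on simulated data with known connectivity.
# Element-wise LASSO is used here for brevity; the paper uses group LASSO.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n_ch, n_t = 20, 2000                          # channels, time points (illustrative)
A_true = np.zeros((n_ch, n_ch))
A_true[rng.integers(0, n_ch, 40), rng.integers(0, n_ch, 40)] = 0.15   # sparse connections
np.fill_diagonal(A_true, 0.5)                 # self-connections

x = np.zeros((n_t, n_ch))
for t in range(1, n_t):                       # simulate the MVAR(1) process
    x[t] = A_true @ x[t - 1] + rng.normal(0, 1, n_ch)

X, Y = x[:-1], x[1:]
A_hat = np.vstack([Lasso(alpha=0.02, fit_intercept=False).fit(X, Y[:, i]).coef_
                   for i in range(n_ch)])

recovered = (np.abs(A_hat) > 1e-3) & (A_true != 0)
print("true connections recovered:", recovered.sum(), "of", (A_true != 0).sum())
```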


2018 ◽  
Author(s):  
César Capinha

Abstract
Spatiotemporal forecasts of ecological phenomena are highly useful in scientific and socio-economic applications. Nevertheless, the development of the correlative models needed to make these forecasts is often hampered by the limited availability of ecological time-series data. In contrast, considerable amounts of temporally discrete biological records are being stored in public databases, often including the site and date of the observation. While these data are reasonably suitable for the development of spatiotemporal forecast models, this possibility remains largely untested.

In this paper, we test an approach to developing spatiotemporal forecasts based on the dates and locations found in species occurrence records. This approach draws on ‘time-series classification’, a field of machine learning, and involves applying a machine-learning algorithm to distinguish between time series representing the environmental conditions that precede the occurrence records and time series representing other environmental conditions, such as those that generally occur at the sites of the records. We employed this framework to predict the timing of emergence of fruiting bodies of two mushroom species (Boletus edulis and Macrolepiota procera) in countries of Europe, from 2009 to 2015. We compared the predictions from this approach with those from a ‘null’ model based on the calendar dates of the records.

Forecasts made with the environment-based approach were consistently superior to those drawn from the date-based approach, averaging an area under the receiver operating characteristic curve (AUC) of 0.9 for B. edulis and 0.88 for M. procera, compared to an average AUC of 0.83 achieved by the null models for both species. Prediction errors were distributed across the study area and over the years, lending support to the spatiotemporal representativeness of the accuracy values measured.

Our approach, based on species occurrence records, was able to provide useful forecasts of the timing of emergence of two mushroom species across Europe. Given the increasing availability of, and the information contained in, this type of record, particularly those supplemented with photographs, the range of events that it may become possible to forecast is vast.
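A minimal sketch of the framework is given below; synthetic weather windows and a plain random forest on the flattened series stand in for the real climate data and for the time-series classification algorithm used in the study:

```python
# Minimal sketch of the framework with synthetic data: a plain random forest on
# flattened 60-day environmental windows stands in for the real climate series
# and for the time-series classification algorithm used in the study.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
days, n_records = 60, 300                    # 60-day window preceding each record

def window(emergence):
    """Temperature and rainfall for the 60 days preceding a (pseudo-)record."""
    temp = 15 + 5 * rng.standard_normal(days)
    rain = rng.gamma(2.0, 2.0, days)
    if emergence:                            # assumed signal: a recent wet, mild spell
        rain[-14:] += 4.0
        temp[-14:] -= 2.0
    return np.concatenate([temp, rain])

X = np.array([window(True) for _ in range(n_records)] +
             [window(False) for _ in range(n_records)])
y = np.array([1] * n_records + [0] * n_records)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```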


Author(s):  
Siddharth Sonti ◽  
Eric Keller ◽  
Joseph Horn ◽  
Asok Ray

This brief paper proposes a dynamic data-driven method for stability monitoring of rotorcraft systems, where the underlying concept is built upon the principles of symbolic dynamics. The stability monitoring algorithm involves wavelet-packet-based preprocessing to remove spurious disturbances and to improve the signal-to-noise ratio (SNR) of the sensor time series. A quantified measure, called Instability Measure, is constructed from the processed time series data to obtain an estimate of the relative instability of the dynamic modes of interest on the rotorcraft system. The efficacy of the proposed method has been established with numerical simulations where correlations between the instability measure and the damping parameter(s) of selected dynamic mode(s) of the rotor blade are established.

