Deep Learning in Multi-step Forecasting of Chaotic Dynamics

Author(s):  
Matteo Sangiorgio

The prediction of chaotic dynamical systems’ future evolution is widely debated and represents a hot topic in nonlinear time series analysis. Recent advances in the field have shown that machine learning techniques, and in particular artificial neural networks, are well suited to this problem. The current state of the art primarily focuses on noise-free time series, an ideal situation that never occurs in real-world applications. This chapter provides a comprehensive analysis aimed at bridging the gap between the deterministic dynamics generated by archetypal chaotic systems and real-world time series. We also explore in depth the role of different types of noise, namely observation and structural noise. Artificial intelligence techniques turned out to provide robust predictions and potentially represent an effective, flexible alternative to the traditional physically based approach for real-world applications. Beyond forecasting accuracy, the domain-adaptation analysis attested to the high generalization capability of the neural predictors across a relatively heterogeneous spatial domain.
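As a minimal illustration of the multi-step setting described above (not the chapter's actual neural predictors), the sketch below learns a one-step model of a noisy logistic map, with a degree-2 polynomial standing in for the network, and iterates it recursively; chaotic sensitivity then amplifies the one-step error over the horizon.

```python
import numpy as np

rng = np.random.default_rng(0)

def logistic_series(n, r=3.9, x0=0.4):
    """Archetypal chaotic system: the logistic map x_{t+1} = r x_t (1 - x_t)."""
    x = np.empty(n)
    x[0] = x0
    for t in range(n - 1):
        x[t + 1] = r * x[t] * (1.0 - x[t])
    return x

clean = logistic_series(1200)
noisy = clean + rng.normal(0.0, 0.01, clean.shape)   # observation noise

# One-step predictor fitted on the noisy series (a quadratic suffices here;
# it stands in for the neural one-step model).
one_step = np.poly1d(np.polyfit(noisy[:-1], noisy[1:], deg=2))

def multi_step_forecast(x_start, horizon):
    """Recursive strategy: feed each prediction back as the next input."""
    preds = []
    x = x_start
    for _ in range(horizon):
        x = one_step(x)
        preds.append(x)
    return np.array(preds)

horizon = 10
preds = multi_step_forecast(clean[1000], horizon)
errors = np.abs(preds - clean[1001:1001 + horizon])  # tends to grow with the horizon
```

The recursive strategy used here is only one multi-step scheme; direct and hybrid strategies fit separate models per horizon step.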

Author(s):  
Prakhar Mehrotra

The objective of this chapter is to discuss the integration of advances in artificial intelligence into existing business intelligence tools. Specifically, it discusses how a business intelligence tool can integrate time series analysis, supervised and unsupervised machine learning, and natural language processing to unlock deeper insights, make predictions, and execute strategic business actions from within the tool itself. The chapter also provides a high-level overview of current state-of-the-art AI techniques, with examples from the realm of business intelligence. Its eventual goal is to leave readers thinking about what the future of business intelligence might look like and how enterprises can benefit by integrating AI into it.



2021
Author(s):  
Hugo Abreu Mendes
João Fausto Lorenzato Oliveira
Paulo Salgado Gomes Mattos Neto
Alex Coutinho Pereira
Eduardo Boudoux Jatoba
...  

Within the context of clean energy generation, solar radiation forecasting is applied at photovoltaic plants to increase maintainability and reliability. Statistical time series models such as ARIMA, together with machine learning techniques, help to improve the results. Hybrid statistical + ML models are found in all sorts of time series forecasting applications. This work presents a new way to automate SARIMAX modeling by nesting PSO and ACO optimization algorithms; unlike R's AutoARIMA, it searches for the optimal seasonality parameter and the combination of the available exogenous variables. This work also presents two distinct hybrid models that have MLPs as their main elements, with architectures optimized by a Genetic Algorithm. A methodology was used to obtain the results, which were compared to the LSTM, CLSTM, MMFF and NARNN-ARMAX topologies found in recent works. The results obtained for the presented models are promising for use in automatic radiation forecasting systems, since they outperformed the compared models on at least two metrics.
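A toy sketch of the nested-optimization idea, under heavy simplifying assumptions (this is not the paper's SARIMAX pipeline): a small PSO searches for the seasonality parameter by minimizing the error of a seasonal-naive forecast on a synthetic series whose true period is 12.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic series: trend + seasonal cycle of (hypothetical) period 12 + noise.
t = np.arange(400)
series = 10 + 0.02 * t + 2 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.3, t.size)

def fitness(period):
    """Error of a seasonal-naive forecast x[t] ~ x[t - period]."""
    p = int(round(period))
    if p < 2 or p > 50:
        return np.inf
    return float(np.mean((series[p:] - series[:-p]) ** 2))

def pso(n_particles=20, iters=60, lo=2.0, hi=50.0):
    """Minimal particle swarm over the continuously relaxed period."""
    pos = rng.uniform(lo, hi, n_particles)
    vel = rng.uniform(-1.0, 1.0, n_particles)
    pbest, pbest_val = pos.copy(), np.array([fitness(x) for x in pos])
    gbest = pbest[np.argmin(pbest_val)]
    for _ in range(iters):
        r1, r2 = rng.random(n_particles), rng.random(n_particles)
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        val = np.array([fitness(x) for x in pos])
        better = val < pbest_val
        pbest[better], pbest_val[better] = pos[better], val[better]
        gbest = pbest[np.argmin(pbest_val)]
    return int(round(gbest))

best_period = pso()   # the swarm settles on the true seasonal period
```

In the paper's full setting, the fitness function would instead be the validation error of a fitted SARIMAX model, and a second (ACO) search would select the exogenous-variable subset.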


2014
Vol 10 (2)
pp. 18-38
Author(s):  
Kung-Jiuan Yang
Tzung-Pei Hong
Yuh-Min Chen
Guo-Cheng Lan

Partial periodic patterns are commonly seen in real-world applications. The major problem in mining partial periodic patterns is efficiency, due to the huge set of partial periodic candidates. Although some efficient algorithms have been developed to tackle the problem, their performance drops significantly when the mining parameters are set low. In the past, the authors adopted a projection-based approach to discover partial periodic patterns in single-event time series. In this paper, the authors extend it to mine partial periodic patterns from a sequence of event sets, in which multiple events occur concurrently at the same time stamp. In addition, an efficient pruning and filtering strategy is proposed to speed up the mining process. Finally, experimental results on a synthetic dataset and a real oil-price dataset show the good performance of the proposed approach.
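The counting step behind partial periodic pattern mining can be sketched as follows for single (offset, event) patterns over a sequence of event sets; the support threshold plays the role of the mining parameter discussed above. This is an illustrative simplification, not the authors' projection-based algorithm.

```python
from collections import Counter

def partial_periodic_events(sequence, period, min_support):
    """sequence: list of event sets. Returns {(offset, event): support} for
    events that recur at the same within-period offset often enough."""
    counts = Counter()
    n_periods = len(sequence) // period          # complete periods only
    for i in range(n_periods * period):
        for event in sequence[i]:
            counts[(i % period, event)] += 1
    return {k: c / n_periods for k, c in counts.items()
            if c / n_periods >= min_support}

# Toy event-set series: 'a' recurs at offset 0 of period 3; 'b' is noise.
seq = [{'a'}, {'b'}, set(), {'a', 'b'}, set(), set(), {'a'}, set(), {'b'},
       {'a'}, set(), set()]
patterns = partial_periodic_events(seq, period=3, min_support=0.75)
```

Here `patterns` keeps only `(0, 'a')` with support 1.0; a lower `min_support` would admit far more candidates, which is exactly why pruning strategies matter.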


Author(s):  
Wen Xu
Jing He
Yanfeng Shu

Transfer learning is an emerging technique in machine learning by which a new task can be solved with knowledge obtained from an old task, addressing the lack of labeled data. In particular, deep domain adaptation (a branch of transfer learning) has received the most attention in recently published articles. The intuition behind this is that deep neural networks usually have a large capacity to learn representations from one dataset, and part of that information can be reused for a new task. In this research, we first present the complete scenarios of transfer learning according to domains and tasks. Second, we conduct a comprehensive survey of deep domain adaptation and categorize the recent advances into three types based on the implementing approach: fine-tuning networks, adversarial domain adaptation, and sample-reconstruction approaches. Third, we discuss the details of these methods and introduce some typical real-world applications. Finally, we conclude our work and explore some potential issues to be further addressed.


Author(s):  
Reza Mazloom
Hongmin Li
Doina Caragea
Cornelia Caragea
Muhammad Imran

Huge amounts of data generated on social media during emergency situations are regarded as a trove of critical information. The use of supervised machine learning techniques in the early stages of a crisis is challenged by the lack of labeled data for that event. Furthermore, supervised models trained on labeled data from a prior crisis may not produce accurate results, due to inherent crisis variations. To address these challenges, the authors propose a hybrid feature-instance-parameter adaptation approach based on matrix factorization, k-nearest neighbors, and self-training. The proposed feature-instance adaptation selects a subset of the source crisis data that is representative of the target crisis data. The selected labeled source data, together with unlabeled target data, are used to learn self-training domain adaptation classifiers for the target crisis. Experimental results have shown that, overall, the hybrid domain adaptation classifiers perform better than supervised classifiers learned from the original source data.
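The self-training component can be sketched as follows, with a nearest-centroid classifier standing in for the paper's models and synthetic two-dimensional "crisis" data (all names and shifts here are assumptions): confident pseudo-labeled target samples are added to the training pool over a few rounds.

```python
import numpy as np

rng = np.random.default_rng(3)

def centroids(X, y):
    return np.stack([X[y == c].mean(axis=0) for c in (0, 1)])

def predict(X, C):
    """Nearest-centroid label plus a confidence (distance margin)."""
    d = np.linalg.norm(X[:, None, :] - C[None, :, :], axis=2)
    return d.argmin(axis=1), np.abs(d[:, 0] - d[:, 1])

# Source crisis: labeled; target crisis: same classes under a domain shift.
Xs = np.vstack([rng.normal(-2, 1, (100, 2)), rng.normal(2, 1, (100, 2))])
ys = np.array([0] * 100 + [1] * 100)
shift = np.array([1.0, 1.0])
Xt = np.vstack([rng.normal(-2, 1, (100, 2)), rng.normal(2, 1, (100, 2))]) + shift
yt = np.array([0] * 100 + [1] * 100)          # held out, evaluation only

C = centroids(Xs, ys)
for _ in range(3):                             # self-training rounds
    pseudo, conf = predict(Xt, C)
    keep = conf >= np.quantile(conf, 0.5)      # most confident half
    C = centroids(np.vstack([Xs, Xt[keep]]),
                  np.concatenate([ys, pseudo[keep]]))

adapted_pred, _ = predict(Xt, C)
acc = float(np.mean(adapted_pred == yt))
```

The confidence gate is the crucial design choice: adding low-confidence pseudo-labels tends to reinforce the source model's mistakes rather than adapt it.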


2014
Vol 998-999
pp. 1138-1145
Author(s):  
Ke Ren Wang
Wen Xiang Li

Video steganalysis is effective when videos corrupted by the target steganography method are available; otherwise, classical classifiers deteriorate. This paper presents a method to cope with the problem of steganography-method mismatch in the detection of motion vector (MV) based steganography. First, the Adding-or-Subtracting-One (AoSO) feature against MV-based steganography and Transfer Component Analysis (TCA) for domain adaptation are revisited. Distributions of the AoSO feature against various MV-based steganography methods are illustrated, followed by the potential effect of the TCA-based AoSO feature. Finally, experiments are carried out on various cases of steganography-method mismatch. The results demonstrate that the TCA+AoSO feature significantly outperforms the AoSO feature alone and is more favorable for real-world applications.
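A sketch of TCA itself (linear kernel, synthetic shifted domains standing in for the mismatch between steganography methods; this is not the paper's AoSO feature extraction): the learned components align the source and target means while retaining variance.

```python
import numpy as np

rng = np.random.default_rng(4)

def tca(Xs, Xt, dim=2, mu=1.0):
    """Transfer Component Analysis, linear-kernel sketch."""
    X = np.vstack([Xs, Xt])
    ns, nt = len(Xs), len(Xt)
    n = ns + nt
    K = X @ X.T                                    # linear kernel matrix
    # MMD coefficient vector: penalizes the source/target mean discrepancy.
    e = np.concatenate([np.full(ns, 1.0 / ns), np.full(nt, -1.0 / nt)])
    L = np.outer(e, e)
    H = np.eye(n) - np.full((n, n), 1.0 / n)       # centering matrix
    # Leading eigenvectors trade off variance (KHK) against MMD (KLK).
    A = np.linalg.solve(K @ L @ K + mu * np.eye(n), K @ H @ K)
    vals, vecs = np.linalg.eig(A)
    W = vecs[:, np.argsort(-vals.real)[:dim]].real
    Z = K @ W                                      # transfer components
    return Z[:ns], Z[ns:]

# Two feature domains differing by a shift.
Xs = rng.normal(0, 1, (60, 5))
Xt = rng.normal(0, 1, (60, 5)) + 2.0
Zs, Zt = tca(Xs, Xt)

# Mean discrepancy relative to spread, before and after TCA.
rel_before = np.linalg.norm(Xs.mean(0) - Xt.mean(0)) / np.vstack([Xs, Xt]).std()
rel_after = np.linalg.norm(Zs.mean(0) - Zt.mean(0)) / np.vstack([Zs, Zt]).std()
```

A classifier trained on `Zs` can then be applied to `Zt`, which is how TCA lets an AoSO-based detector trained on one steganography method transfer to another.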


Electronics
2021
Vol 10 (15)
pp. 1834
Author(s):  
Abdullah Aljumah

Since the end of 2019, the world has been facing the threat of COVID-19. It is predicted that, before herd immunity is achieved globally via vaccination, people around the world will have to tackle the COVID-19 pandemic using precautionary measures. This paper proposes a COVID-19 identification and control system that operates in real time. The proposed system utilizes an Internet of Things (IoT) platform to capture users’ time-sensitive symptom information, in order to detect potential coronavirus cases early, to track the clinical measures adopted by survivors, and to gather and examine appropriate data to verify the presence of the virus. There are five key components in the framework: symptom data collection and uploading (via communication technology), a quarantine/isolation center, an information processing core (using artificial intelligence techniques), cloud computing, and visualization for healthcare doctors. This research utilizes eight machine/deep learning techniques to detect coronavirus cases from time-sensitive information: Neural Network, Decision Table, Support Vector Machine (SVM), Naive Bayes, OneR, K-Nearest Neighbor (K-NN), Dense Neural Network (DNN), and Long Short-Term Memory (LSTM). A simulation was performed to verify the eight algorithms, after selecting the relevant symptoms, on real-world COVID-19 data. The results showed that five of the eight algorithms obtained an accuracy of over 90%, indicating that real-world symptomatic information would enable these algorithms to identify potential COVID-19 cases effectively with enhanced accuracy. Additionally, the framework presents responses to treatment for COVID-19 patients.
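As a hedged illustration of one of the eight techniques, a from-scratch k-NN screens synthetic binary symptom vectors; the symptom encoding and probabilities below are hypothetical, not from the paper's dataset.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical binary symptom vectors (e.g. fever, cough, fatigue, ...):
# positive cases show each symptom with higher probability.
n, n_symptoms = 400, 6
y = rng.integers(0, 2, n)
p = np.where(y[:, None] == 1, 0.8, 0.2)            # assumed symptom rates
X = (rng.random((n, n_symptoms)) < p).astype(float)

def knn_predict(X_train, y_train, X_new, k=5):
    """Majority vote among the k nearest neighbours (Hamming distance)."""
    d = np.abs(X_new[:, None, :] - X_train[None, :, :]).sum(axis=2)
    nearest = np.argsort(d, axis=1)[:, :k]
    return (y_train[nearest].mean(axis=1) > 0.5).astype(int)

# Hold out a test split to estimate screening accuracy.
X_tr, y_tr, X_te, y_te = X[:300], y[:300], X[300:], y[300:]
acc = float(np.mean(knn_predict(X_tr, y_tr, X_te) == y_te))
```

An odd `k` avoids voting ties; in the framework described above, such a classifier would sit in the information processing core, behind the IoT data-collection layer.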


2021
Author(s):  
Fabian Fernando Jurado Lasso
Letizia Marchegiani
Jesus Fabian Jurado
Adnan Abu Mahfouz
Xenofon Fafoutis

This paper presents a comprehensive survey of relevant research over the period 2012-2021 on Software-Defined Wireless Sensor Network (SDWSN) proposals and Machine Learning (ML) techniques used to perform network management, policy enforcement, and network configuration functions. The survey provides helpful information and insights to the scientific and industrial communities and to professional organisations interested in SDWSNs, covering the current state of the art, machine learning techniques, and open issues.

