A Deep Learning Model for Forecasting Velocity Structures of the Loop Current System in the Gulf of Mexico

Forecasting ◽  
2021 ◽  
Vol 3 (4) ◽  
pp. 934-953
Author(s):  
Ali Muhamed Ali ◽  
Hanqi Zhuang ◽  
James VanZwieten ◽  
Ali K. Ibrahim ◽  
Laurent Chérubin

Despite the large efforts made by the ocean modeling community, such as the GODAE (Global Ocean Data Assimilation Experiment), which started in 1997 and was renamed as OceanPredict in 2019, the prediction of ocean currents has remained a challenge until the present day—particularly in ocean regions that are characterized by rapid changes in their circulation due to changes in atmospheric forcing or due to the release of available potential energy through the development of instabilities. Ocean numerical models’ useful forecast window is no longer than two days over a given area with the best initialization possible. Predictions quickly diverge from the observational field throughout the water and become unreliable, despite the fact that they can simulate the observed dynamics through other variables such as temperature, salinity and sea surface height. Numerical methods such as harmonic analysis are used to predict both short- and long-term tidal currents with significant accuracy. However, they are limited to the areas where the tide was measured. In this study, a new approach to ocean current prediction based on deep learning is proposed. This method is evaluated on the measured energetic currents of the Gulf of Mexico circulation dominated by the Loop Current (LC) at multiple spatial and temporal scales. The approach taken herein consists of dividing the velocity tensor into planes perpendicular to each of the three Cartesian coordinate system directions. A Long Short-Term Memory Recurrent Neural Network, which is best suited to handling long-term dependencies in the data, was thus used to predict the evolution of the velocity field in each plane, along each of the three directions. The predicted tensors, made of the planes perpendicular to each Cartesian direction, revealed that the model’s prediction skills were best for the flow field in the planes perpendicular to the direction of prediction. 
Furthermore, the fusion of all three predicted tensors significantly increased the overall skills of the flow prediction over the individual model’s predictions. The useful forecast period of this new model was greater than 4 days with a root mean square error less than 0.05 cm·s−1 and a correlation coefficient of 0.6.
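The decompose-predict-fuse procedure described in this abstract can be sketched as follows. The identity "predictor" and the simple averaging fusion below are placeholders for the trained per-plane LSTMs and the paper's fusion step, so this is a structural illustration under assumed choices, not the authors' model.

```python
import numpy as np

def planes_along(axis, tensor):
    """Yield the 2-D planes perpendicular to the given Cartesian axis."""
    return [np.take(tensor, i, axis=axis) for i in range(tensor.shape[axis])]

def predict_tensor(tensor, axis, plane_model):
    """Rebuild a full tensor from per-plane predictions along one axis."""
    preds = [plane_model(p) for p in planes_along(axis, tensor)]
    return np.stack(preds, axis=axis)

def fuse(tensors):
    """Fuse the three axis-wise predicted tensors by simple averaging."""
    return np.mean(np.stack(tensors), axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    u = rng.standard_normal((8, 6, 4))      # toy velocity component field
    identity = lambda plane: plane          # stand-in for a per-plane LSTM
    preds = [predict_tensor(u, ax, identity) for ax in range(3)]
    fused = fuse(preds)
    print(fused.shape)                      # (8, 6, 4)
```

With a real per-plane model, each of the three predicted tensors would differ, and the fusion step is what the abstract reports as raising the overall skill above any single direction's prediction.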

2021 ◽  
Author(s):  
Neha Groves ◽  
Ashwanth Srinivasan ◽  
Leonid Ivanov ◽  
Jill Storie ◽  
Drew Gustafson ◽  
...  

Abstract The Gulf of Mexico's unique circulation characteristics pose a particular threat to marine operations and play a significant role in driving the criteria used for design and life-extension analyses of offshore infrastructure. Estimates from the existing reanalysis datasets used by operators in the GOM show less-than-ideal correlation with in situ measurements and have a limited resolution that prevents them from capturing ocean features of interest. In this paper, we introduce a new high-resolution long-term reanalysis dataset, Multi-resolution Advanced Current Reanalysis for the Ocean – Gulf of Mexico (MACRO-GOM), based on a state-of-the-science hydrodynamic model configured specifically for ocean current forecasting and hindcasting services for the offshore industry that assimilates extensive non-conventional observational data. The underlying hydrodynamic model is the Woods Hole Group – Tendral Ocean Prediction System (WHG-TOPS). MACRO-GOM is being developed at the native resolution of the TOPS-GOM domain, i.e., a 1/32° (~3 km) hourly grid for the 1994-2019 time period (25 years). A three-level downscaling methodology is used wherein observation-based estimates are first dynamically interpolated using a 1/4° model before being downscaled to the 1/16° Inter-American Seas (IAS) domain, which in turn is used to generate time-consistent boundary conditions for the 1/32° reanalysis. A multiscale data assimilation technique is used to constrain the model at synoptic and longer time scales. For this paper, a shorter, 5-year reanalysis run was conducted for the 2015-2019 time period for verification against assimilated and unassimilated observations, WHG's proprietary frontal analyses, and other reanalyses. Both the frontal analyses and Notice to Lessees (NTL) rig-mounted ADCP data were withheld from assimilation for comparison.
Offshore operations in the GOM can benefit from an improved reanalysis dataset capable of assimilating existing non-conventional observational datasets. Existing hindcast and reanalysis model datasets are limited in their ability to comprehensively and reliably quantify the 3D circulation and kinematic properties of the main features partly because of limited assimilation of observational data. MACRO-GOM incorporates all the advantages of available HYCOM-based reanalyses and further enhances the resolution, accuracy, and reliability by the assimilation of over three decades of WHG's proprietary datasets and frontal analyses for continuous model correction and ground-truthing. The final 25-year high resolution dataset will provide highly reliable design and operational criteria for new and existing infrastructure in GOM.


PeerJ ◽  
2021 ◽  
Vol 9 ◽  
pp. e11262
Author(s):  
Guobin Li ◽  
Xiuquan Du ◽  
Xinlu Li ◽  
Le Zou ◽  
Guanhong Zhang ◽  
...  

DNA-binding proteins (DBPs) play pivotal roles in many biological functions such as alternative splicing, RNA editing, and methylation. Many traditional machine learning (ML) and deep learning (DL) methods have been proposed to predict DBPs. However, these methods either rely on manual feature extraction or fail to capture long-term dependencies in the DNA sequence. In this paper, we propose a method called PDBP-Fusion to identify DBPs based on the fusion of local features and long-term dependencies from primary sequences alone. We use a convolutional neural network (CNN) to learn local features and a bidirectional long short-term memory network (Bi-LSTM) to capture critical long-term dependencies in context. In addition, feature extraction, model training, and model prediction are performed jointly. PDBP-Fusion predicts DBPs with 86.45% sensitivity, 79.13% specificity, 82.81% accuracy, and 0.661 MCC on the PDB14189 benchmark dataset; its MCC is at least 9.1% higher than those of other advanced prediction models. Moreover, PDBP-Fusion also achieves superior performance and model robustness on the PDB2272 independent dataset. These results demonstrate that PDBP-Fusion can predict DBPs from sequences accurately and effectively; the online server is at http://119.45.144.26:8080/PDBP-Fusion/.
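A minimal numpy sketch of the fusion idea described above, under assumed simplifications: a one-hot protein-sequence encoding, a single hand-rolled 1-D convolution filter standing in for the CNN's local-motif detectors, and a toy tanh recurrence standing in for the Bi-LSTM. None of this is the authors' architecture; it only shows how local and sequence-level features can be concatenated into one fused representation.

```python
import numpy as np

AMINO = "ACDEFGHIKLMNPQRSTVWY"  # standard 20 amino acids

def one_hot(seq):
    """One-hot encode an amino-acid sequence as a (length, 20) matrix."""
    idx = {a: i for i, a in enumerate(AMINO)}
    x = np.zeros((len(seq), len(AMINO)))
    for t, a in enumerate(seq):
        x[t, idx[a]] = 1.0
    return x

def conv1d_features(x, kernel):
    """Valid 1-D convolution with one (k, 20) filter: local motif responses."""
    k = kernel.shape[0]
    windows = np.stack([x[t:t + k].ravel() for t in range(len(x) - k + 1)])
    return windows @ kernel.ravel()

def simple_recurrent(x, w):
    """Toy tanh recurrence over time, a stand-in for a Bi-LSTM pass."""
    h = np.zeros(w.shape[0])
    for t in range(len(x)):
        h = np.tanh(x[t] @ w.T + h)
    return h

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    x = one_hot("MKVLAAGICK")                               # toy sequence
    local = conv1d_features(x, rng.standard_normal((3, 20))).max()
    context = simple_recurrent(x, rng.standard_normal((4, 20)))
    feats = np.concatenate([[local], context])              # fused features
    print(feats.shape)                                      # (5,)
```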


2020 ◽  
pp. 158-161
Author(s):  
Chandraprabha S ◽  
Pradeepkumar G ◽  
Dineshkumar Ponnusamy ◽  
Saranya M D ◽  
Satheesh Kumar S ◽  
...  

This paper presents an artificial-intelligence-based system for forecasting real-time light-dependent resistor (LDR) data, with applications in indoor lighting, environments that produce large amounts of heat, agriculture (to increase crop yield), and solar plants (for solar-irradiance tracking). The system measures light intensity with an LDR sensor. The acquired readings are posted to an Adafruit cloud at two-second intervals via a NodeMCU ESP8266 module and are also shown on the Adafruit dashboard for monitoring the sensor variables. A long short-term memory (LSTM) network provides the deep learning component: it uses the historical data recorded in the Adafruit cloud, paired with the NodeMCU, to model the real-time long-term time series of measured light intensity. The data are extracted from the cloud for analysis, and the trained deep learning model then predicts future light-intensity values.
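The forecasting step in this abstract rests on the standard preprocessing for an LSTM: turning the logged light-intensity series into supervised (window, next value) pairs. The sketch below shows that windowing with synthetic readings; the persistence baseline is an illustrative placeholder, not the paper's trained network.

```python
import numpy as np

def make_windows(series, width):
    """Split a 1-D series into (input window, next value) training pairs."""
    xs = np.stack([series[i:i + width] for i in range(len(series) - width)])
    ys = series[width:]
    return xs, ys

if __name__ == "__main__":
    # Synthetic two-second light-intensity readings (arbitrary lux values)
    lux = np.array([120., 125., 131., 140., 138., 150., 161., 158.])
    X, y = make_windows(lux, width=3)
    print(X.shape, y.shape)            # (5, 3) (5,)
    baseline = X[:, -1]                # persistence forecast: repeat last value
    print(np.abs(baseline - y).mean()) # mean absolute error of the baseline
```

An LSTM trained on such pairs learns to beat this persistence baseline by exploiting longer-range structure in the window.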


2019 ◽  
Vol 36 (2) ◽  
pp. 231-247 ◽  
Author(s):  
Brian Emery ◽  
Libe Washburn

Abstract HF radars typically produce maps of surface current velocities without estimates of the measurement uncertainties. Many users of HF radar data, including spill response and search and rescue operations, incorporate these observations into models and would thus benefit from quantified uncertainties. Using both simulations and coincident observations from the baseline between two operational SeaSonde HF radars, we demonstrate the utility of expressions for estimating the uncertainty in the direction obtained with the Multiple Signal Classification (MUSIC) algorithm. Simulations of radar backscatter using surface currents from the Regional Ocean Modeling System show a close correspondence between direction of arrival (DOA) errors and estimated uncertainties, with mean values of 15° at 10 dB, falling to less than 3° at 30 dB. Observations from two operational SeaSondes have average DOA uncertainties of 2.7° and 3.8°, with a fraction of the observations (10.5% and 7.1%, respectively) having uncertainties of >10°. Using DOA uncertainties for data quality control improves time series comparison statistics between the two radars, with r2=0.6 increasing to r2=0.75 and RMS difference decreasing from 15 to 12 cm s−1. The analysis illustrates the major sources of error in oceanographic HF radars and suggests that the DOA uncertainties are suitable for assimilation into numerical models.
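The MUSIC algorithm at the center of this abstract can be illustrated compactly. The sketch below uses a uniform linear array for the steering vectors; a SeaSonde's crossed-loop/monopole antenna has different steering vectors, so this only demonstrates the algorithm's structure (eigendecomposition of the covariance, noise-subspace projection, pseudospectrum peak search), not the SeaSonde processing chain.

```python
import numpy as np

def music_spectrum(R, n_sources, n_elems, spacing, angles_deg):
    """MUSIC pseudospectrum from a sample covariance R (ULA steering)."""
    _, v = np.linalg.eigh(R)                  # eigenvalues ascending
    En = v[:, : n_elems - n_sources]          # noise-subspace eigenvectors
    spec = []
    for th in np.deg2rad(angles_deg):
        a = np.exp(2j * np.pi * spacing * np.arange(n_elems) * np.sin(th))
        spec.append(1.0 / (np.linalg.norm(En.conj().T @ a) ** 2))
    return np.array(spec)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    n, d, true_doa = 6, 0.5, 20.0             # elements, spacing (wavelengths)
    a = np.exp(2j * np.pi * d * np.arange(n) * np.sin(np.deg2rad(true_doa)))
    snaps = a[:, None] * rng.standard_normal(200)   # one source, 200 snapshots
    snaps += 0.05 * (rng.standard_normal((n, 200))
                     + 1j * rng.standard_normal((n, 200)))
    R = snaps @ snaps.conj().T / 200
    grid = np.arange(-90, 91)
    est = grid[int(np.argmax(music_spectrum(R, 1, n, d, grid)))]
    print(est)                                # close to 20 at this SNR
```

The DOA uncertainty expressions evaluated in the paper quantify how the sharpness of this pseudospectrum peak degrades as SNR falls, which is why the reported errors grow from under 3° at 30 dB to about 15° at 10 dB.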


2021 ◽  
Vol 2 (4) ◽  
pp. 1-18
Author(s):  
Zhaohong Sun ◽  
Wei Dong ◽  
Jinlong Shi ◽  
Kunlun He ◽  
Zhengxing Huang

Survival analysis has profound effects on health service management. Traditional approaches to survival analysis presuppose a time-to-event probability distribution and seldom consider patients' sequential visits to medical facilities. Although recent studies leverage deep learning techniques to capture non-linear features and long-term dependencies across multiple visits for survival analysis, the lack of interpretability prevents deep learning models from being applied in clinical practice. To address this challenge, this article proposes a novel attention-based deep recurrent model, named AttenSurv, for clinical survival analysis. Specifically, a global attention mechanism is proposed to extract essential/critical risk factors for improved interpretability. A bidirectional long short-term memory network is then employed to capture long-term dependencies in data from a series of patient visits. To further improve both the prediction performance and the interpretability of the proposed model, we propose another model, named GNNAttenSurv, which incorporates a graph neural network into AttenSurv to extract latent correlations between risk factors. We validated our solution on three public follow-up datasets and two electronic health record datasets. The results demonstrate that the proposed models yield consistent improvement over state-of-the-art baselines in survival analysis.
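A minimal sketch of the global attention step, in an assumed standard form: score each visit's hidden state against a learned query, softmax the scores into weights, and pool a patient-level context vector. The weights are what provide the interpretability this abstract emphasizes, since they rank visits (or risk factors) by their contribution to the prediction.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(H, q):
    """H: (visits, hidden) recurrent states; q: (hidden,) query vector.
    Returns the attention-pooled context and the per-visit weights."""
    weights = softmax(H @ q)          # one non-negative weight per visit
    context = weights @ H             # weighted sum of hidden states
    return context, weights

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    H = rng.standard_normal((5, 8))   # e.g. Bi-LSTM states for 5 visits
    q = rng.standard_normal(8)
    ctx, w = attention_pool(H, q)
    print(ctx.shape, round(float(w.sum()), 6))   # (8,) 1.0
```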


Telecom ◽  
2021 ◽  
Vol 2 (4) ◽  
pp. 446-471
Author(s):  
Percy Kapadia ◽  
Boon-Chong Seet

This paper proposes a potential enhancement of handover for the next-generation multi-tier cellular network, utilizing two fifth-generation (5G) enabling technologies: multi-access edge computing (MEC) and machine learning (ML). MEC and ML techniques are the primary enablers for enhanced mobile broadband (eMBB) and ultra-reliable and low latency communication (URLLC). The subset of ML chosen for this research is deep learning (DL), as it is adept at learning long-term dependencies. A variant of artificial neural networks called a long short-term memory (LSTM) network is used in conjunction with a look-up table (LUT) as part of the proposed solution. Subsequently, edge computing virtualization methods are utilized to reduce handover latency and increase the overall throughput of the network. A realistic simulation of the proposed solution in a multi-tier 5G radio access network (RAN) showed a 40–60% improvement in overall throughput. Although the proposed scheme may increase the number of handovers, it is effective in reducing the handover failure (HOF) and ping-pong rates by 30% and 86%, respectively, compared to the current 3GPP scheme.
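One way to picture the LSTM-plus-LUT pairing described above is sketched below. The moving-average trend forecast stands in for the LSTM, and the tier names, handover parameters, and threshold are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Illustrative LUT: tier -> (time-to-trigger in ms, hysteresis in dB).
# These entries are hypothetical, chosen only to show the mechanism.
LUT = {
    "macro": (320, 3.0),
    "micro": (160, 2.0),
    "pico": (80, 1.0),
}

def forecast_rsrp(samples, k=3):
    """Placeholder trend forecast: mean of the last k RSRP samples (dBm)."""
    return float(np.mean(samples[-k:]))

def handover_decision(serving_rsrp, target_tier, threshold=-100.0):
    """Trigger a handover when the forecast drops below threshold - hysteresis;
    the LUT supplies the per-tier handover parameters."""
    ttt, hyst = LUT[target_tier]
    trigger = forecast_rsrp(serving_rsrp) < threshold - hyst
    return trigger, ttt

if __name__ == "__main__":
    rsrp = [-92.0, -96.0, -101.0, -107.0, -110.0]   # degrading serving cell
    go, ttt = handover_decision(rsrp, "pico")
    print(go, ttt)    # True 80
```

The point of forecasting, rather than reacting to the latest measurement, is to start the handover before the link fails, which is consistent with the reduced handover-failure rate reported in the abstract.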


2021 ◽  
Vol 29 (3) ◽  
Author(s):  
Bennilo Fernandes ◽  
Kasiprasad Mannepalli

Designing the interaction between human language and a registered emotional database lets us explore how such systems perform, and there are multiple approaches to emotion detection in patient services. Until now, clustering techniques have been the primary tools in many prominent areas, including emotional speech recognition. Although they show good results, this paper takes a new approach focused on long short-term memory (LSTM), bidirectional LSTM (BiLSTM), and gated recurrent unit (GRU) networks as estimation methods for an emotional Tamil dataset. A deep hierarchical LSTM/BiLSTM/GRU architecture is designed to obtain the best results on a long-term speech dataset. Different combinations of two-layer deep hierarchical architectures, namely LSTM & GRU (DHLG), BiLSTM & GRU (DHBG), GRU & LSTM (DHGL), GRU & BiLSTM (DHGB), and dual GRU (DHGG), are designed with a dropout layer to overcome learning problems and vanishing-gradient issues in emotional speech recognition. Moreover, various feature-extraction combinations are utilized to improve the outcome for each emotional speech signal. In the analysis, the proposed DHGB model achieves an average classification accuracy of 82.86%, slightly higher than the other models: DHGL (82.58%), DHBG (82%), DHLG (81.14%), and DHGG (80%). Thus, comparing all the models, DHGB gives the most prominent outcome, with minimum training time and a small dataset.


2020 ◽  
Vol 11 (1) ◽  
pp. 316
Author(s):  
Namrye Son ◽  
Mina Jung

Solar power generation is an increasingly popular renewable energy topic. Photovoltaic (PV) systems are installed on buildings to manage energy production and consumption efficiently. Because of its physical properties, electrical energy is produced and consumed simultaneously; therefore, solar energy output must be predicted accurately to maintain a stable power supply. To develop an efficient energy management system (EMS), 22 multivariate numerical models were constructed by combining solar radiation, sunlight, humidity, temperature, cloud cover, and wind speed. The performance of the models was compared by applying a modified version of the traditional long short-term memory (LSTM) approach. The experimental results showed that the six meteorological factors influence the solar power forecast regardless of the season; from most to least important, they are solar radiation, sunlight, wind speed, temperature, cloud cover, and humidity. The models are rated for their suitability for medium- and long-term solar power forecasts, and the modified LSTM demonstrates better performance than the traditional LSTM.
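The multivariate setup described above amounts to assembling the meteorological factors into the (samples, timesteps, features) tensor an LSTM expects. The sketch below shows that assembly with synthetic values; the factor names follow the abstract, and treating solar radiation as the forecast target is an assumption for illustration.

```python
import numpy as np

FACTORS = ["radiation", "sunlight", "humidity", "temperature", "cloud", "wind"]

def build_tensor(history, steps):
    """history: (T, F) matrix of hourly factor values.
    Returns a (T - steps, steps, F) windowed input tensor and the
    next-step solar radiation (column 0) as the forecast target."""
    X = np.stack([history[i:i + steps] for i in range(len(history) - steps)])
    y = history[steps:, 0]
    return X, y

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    hourly = rng.standard_normal((48, len(FACTORS)))   # two synthetic days
    X, y = build_tensor(hourly, steps=24)
    print(X.shape, y.shape)        # (24, 24, 6) (24,)
```

The 22 models compared in the paper would correspond to different subsets of the feature columns fed into this tensor.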

