Transformer for Sub-Seasonal Extreme High Temperature Probabilistic Forecasting Over Eastern China

Author(s):  
Wei Jin ◽  
Wei Zhang ◽  
Jie Hu ◽  
Jiazhen Chen ◽  
Bin Weng ◽  
...  

Abstract Sub-seasonal high temperature forecasting is significant for early warning of extreme heat weather. Currently, deep learning methods, especially the Transformer, have been successfully applied in the meteorological field. Given its excellent global feature extraction capability demonstrated in natural language processing, the Transformer may help improve forecasting skill at extended ranges. To explore this, we introduce the Transformer and propose a Transformer-based model, called Transformer to High Temperature (T2T). In designing the model, we successively examine the use of the Transformer and of positional encoding in T2T, optimizing the model structure experimentally. For the dataset, a multi-version data fusion method is proposed to further improve the model's predictions by reasonably expanding the training data. The performance of the well-designed model (T2T) is verified against the European Centre for Medium-Range Weather Forecasts (ECMWF) and a Multi-Layer Perceptron (MLP) at each grid point of the 100.5°E to 138°E, 21°N to 54°N domain for April to October of 2016-2019. For a case study initialized on 2 June 2018, the results indicate that T2T is significantly better than ECMWF and MLP, with smaller absolute error and a more reliable probabilistic forecast for the extreme high temperature event that occurred during the third week. Overall, the deterministic forecast of T2T is superior to MLP and ECMWF thanks to its ability to utilize spatial information across grid points. T2T also provides a better-calibrated probability of high temperature and a sharper predictive probability density function than MLP and ECMWF. In summary, T2T can meet the operational requirements for extended-period forecasting of extreme high temperature. Furthermore, our research can provide experience for the development of deep learning in this field and support the continued progress of seamless forecasting systems.
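
As a rough illustration of the kind of architecture described above (a Transformer encoder with positional encoding applied to per-grid predictors), a minimal PyTorch sketch is given below. The layer sizes, feature count, and single-output regression head are illustrative assumptions, not the authors' T2T configuration.

# Minimal sketch of a Transformer encoder for per-grid temperature forecasting.
# Hyperparameters and the regression head are illustrative assumptions only.
import math
import torch
import torch.nn as nn

class PositionalEncoding(nn.Module):
    def __init__(self, d_model, max_len=64):
        super().__init__()
        pe = torch.zeros(max_len, d_model)
        pos = torch.arange(max_len, dtype=torch.float).unsqueeze(1)
        div = torch.exp(torch.arange(0, d_model, 2).float() * (-math.log(10000.0) / d_model))
        pe[:, 0::2] = torch.sin(pos * div)
        pe[:, 1::2] = torch.cos(pos * div)
        self.register_buffer("pe", pe.unsqueeze(0))   # (1, max_len, d_model)

    def forward(self, x):                  # x: (batch, seq_len, d_model)
        return x + self.pe[:, : x.size(1)]

class TemperatureTransformer(nn.Module):
    def __init__(self, n_features, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)
        self.pos = PositionalEncoding(d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, 1)  # one temperature value per lead time

    def forward(self, x):                  # x: (batch, lead_days, n_features)
        h = self.encoder(self.pos(self.embed(x)))
        return self.head(h).squeeze(-1)    # (batch, lead_days)

# Example: 32 grid samples, 14 forecast days, 8 predictor variables.
model = TemperatureTransformer(n_features=8)
y = model(torch.randn(32, 14, 8))          # -> (32, 14)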

2022 ◽  
Vol 9 ◽  
Author(s):  
Wei Jin ◽  
Wei Zhang ◽  
Jie Hu ◽  
Bin Weng ◽  
Tianqiang Huang ◽  
...  

Sub-seasonal high temperature forecasting is a severe challenge. The residual structure has achieved good results in computer vision owing to its excellent feature extraction ability, but it has not yet been introduced into sub-seasonal forecasting. Here, we develop multi-module daily deterministic and probabilistic forecast models based on the residual structure and establish a complete sub-seasonal high temperature forecasting system for eastern China. The experimental results indicate that our method is effective and outperforms the European hindcast results in all aspects: the absolute error, anomaly correlation coefficient, and other indicators improve by 8–50%, and the equitable threat score improves by up to 400%. We conclude that, compared to traditional methods and convolutional networks, the residual network has sharper insight into high temperatures in sub-seasonal forecasting, thus enabling more effective early warning of extreme high temperature weather.
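
The abstract does not detail the residual architecture; the PyTorch sketch below only illustrates the general residual (skip-connection) structure it refers to, with assumed channel counts and grid size, and should not be read as the paper's model.

# Minimal residual block sketch (illustrative only; not the paper's architecture).
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.relu = nn.ReLU()

    def forward(self, x):
        # The skip connection lets the block learn a residual correction,
        # which eases optimization of deep forecasting networks.
        return self.relu(x + self.conv2(self.relu(self.conv1(x))))

block = ResidualBlock(channels=16)
out = block(torch.randn(1, 16, 34, 38))   # e.g. features on a coarse lat-lon grid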


Water ◽  
2019 ◽  
Vol 11 (11) ◽  
pp. 2291
Author(s):  
Zhou ◽  
Pei ◽  
Xia ◽  
Wu ◽  
Zhong ◽  
...  

Extreme climate events frequently exert serious effects on terrestrial vegetation activity. However, these effects remain uncertain across widely distributed areas with different climate zones. Transect analysis is important for understanding how terrestrial vegetation responds to climate change, especially extreme climate events, by substituting space for time. In this paper, seven extreme climate indices and the Normalized Difference Vegetation Index (NDVI) are employed to examine changes in extreme climate events and vegetation activity. To reduce the uncertainty of the NDVI, two satellite-derived NDVI datasets, the third generation Global Inventory Monitoring and Modeling System (GIMMS-3g) NDVI dataset and the NDVI from the National Oceanic and Atmospheric Administration (NOAA) satellites on Star Web Servers (SWS), were employed to capture changes in vegetation activity. The impacts of climate extremes on vegetation activity were then assessed over the period 1982–2012 using the North–South Transect of Eastern China (NSTEC) as a case. The results show that vegetation activity strengthened overall from 1982 to 2012 in the NSTEC. In addition, extreme high temperature events showed an increasing trend of approximately 5.15 days per decade, while a weakening (not significant) trend was found in extreme cold temperature events. The strengthened vegetation activity could be associated with enhanced extreme high temperature events and weakened extreme cold temperature events over the past decades in most of the NSTEC. Nevertheless, inverse changes between vegetation activity and extreme climate events were also found locally (e.g., in the Northeast Plain). These phenomena could be associated with differences in vegetation type and human activity, as well as the combined effects of the frequency and intensity of extreme climate events. This study highlights the importance of accounting for the vital role of climate extremes in vegetation activity.
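
The quoted trend figures (e.g., about 5.15 days per decade) come from the authors' analysis of the extreme climate indices; purely to illustrate how such a per-decade trend can be estimated from an annual index series with ordinary least squares, a short sketch with synthetic data is given below.

# Sketch: estimating a linear trend (days per decade) for an annual
# extreme-temperature index with ordinary least squares. Data are synthetic.
import numpy as np
from scipy import stats

years = np.arange(1982, 2013)                       # 1982-2012 study period
warm_days = 20 + 0.5 * (years - 1982) + np.random.normal(0, 3, years.size)

slope, intercept, r, p_value, stderr = stats.linregress(years, warm_days)
print(f"trend: {10 * slope:.2f} days per decade (p = {p_value:.3f})")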


Author(s):  
Sumit Kaur

Abstract- Deep learning is an emerging research area in the machine learning and pattern recognition field, introduced with the goal of moving machine learning closer to one of its original objectives: Artificial Intelligence. It tries to mimic the human brain, which is capable of processing and learning from complex input data and solving many kinds of complicated tasks well. Deep learning (DL) is based on a set of supervised and unsupervised algorithms that attempt to model higher-level abstractions in data and learn hierarchical representations for classification. In recent years, it has attracted much attention due to its state-of-the-art performance in diverse areas such as object perception, speech recognition, computer vision, collaborative filtering, and natural language processing. This paper presents a survey of different deep learning techniques for remote sensing image classification.


Aerospace ◽  
2021 ◽  
Vol 8 (6) ◽  
pp. 152
Author(s):  
Micha Zoutendijk ◽  
Mihaela Mitici

The problem of flight delay prediction is approached most often by predicting a delay class or value. However, the aviation industry can benefit greatly from probabilistic delay predictions on an individual flight basis, as these give insight into the uncertainty of the delay predictions. Therefore, in this study, two probabilistic forecasting algorithms, Mixture Density Networks and Random Forest regression, are applied to predict flight delays at a European airport. The algorithms estimate the distribution of arrival and departure flight delays well, with a mean absolute error of less than 15 min. To illustrate the utility of the estimated delay distributions, we integrate these probabilistic predictions into a probabilistic flight-to-gate assignment problem. The objective of this problem is to increase the robustness of flight-to-gate assignments. Considering probabilistic delay predictions, our proposed flight-to-gate assignment model reduces the number of conflicted aircraft by up to 74% when compared to a deterministic flight-to-gate assignment model. In general, the results illustrate the utility of considering probabilistic forecasting for robust airport operations optimization.
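
One way to obtain such per-flight delay distributions with one of the two methods mentioned, Random Forest regression, is to treat the individual tree predictions as an empirical sample; the sketch below illustrates this idea with synthetic features and is not the paper's exact pipeline.

# Sketch: a per-flight empirical delay distribution from the individual
# tree predictions of a random forest. Features and data are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 5))                    # e.g. schedule, weather, traffic features
y = 10 + 5 * X[:, 0] + rng.gamma(2.0, 5.0, 2000)  # delay in minutes

forest = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

x_new = rng.normal(size=(1, 5))                   # one new flight
per_tree = np.array([t.predict(x_new)[0] for t in forest.estimators_])
print("mean delay:", per_tree.mean())
print("80% interval:", np.percentile(per_tree, [10, 90]))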


Sensors ◽  
2021 ◽  
Vol 21 (11) ◽  
pp. 3719
Author(s):  
Aoxin Ni ◽  
Arian Azarang ◽  
Nasser Kehtarnavaz

The interest in contactless or remote heart rate measurement has been steadily growing in healthcare and sports applications. Contactless methods involve the use of a video camera and image processing algorithms. Recently, deep learning methods have been used to improve the performance of conventional contactless methods for heart rate measurement. After providing a review of the related literature, this paper compares the deep learning methods whose codes are publicly available. The public domain UBFC dataset is used to compare the performance of these deep learning methods for heart rate measurement. The results show that the deep learning method PhysNet generates the best heart rate measurement outcome among these methods, with a mean absolute error value of 2.57 beats per minute and a mean square error value of 7.56 beats per minute.
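
The reported error metrics are straightforward to reproduce once per-video heart rate estimates and ground-truth values are available; a minimal sketch with placeholder numbers (not UBFC results) follows.

# Sketch: mean absolute error and mean squared error between estimated
# and reference heart rates. Values are placeholders, not UBFC results.
import numpy as np

hr_reference = np.array([72.0, 85.0, 66.0, 90.0])   # ground-truth BPM per video
hr_estimated = np.array([74.5, 83.0, 67.5, 93.0])   # model output BPM per video

mae = np.mean(np.abs(hr_estimated - hr_reference))
mse = np.mean((hr_estimated - hr_reference) ** 2)
print(f"MAE = {mae:.2f} BPM, MSE = {mse:.2f}")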


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Mu Sook Lee ◽  
Yong Soo Kim ◽  
Minki Kim ◽  
Muhammad Usman ◽  
Shi Sub Byon ◽  
...  

Abstract We examined the feasibility of explainable computer-aided detection of cardiomegaly in routine clinical practice using segmentation-based methods. Overall, 793 retrospectively acquired posterior–anterior (PA) chest X-ray images (CXRs) of 793 patients were used to train deep learning (DL) models for lung and heart segmentation. The training dataset included PA CXRs from two public datasets and in-house PA CXRs. Two fully automated segmentation-based methods using state-of-the-art DL models for lung and heart segmentation were developed. The diagnostic performance was assessed and the reliability of the automatic cardiothoracic ratio (CTR) calculation was determined using the mean absolute error and paired t-test. The effects of thoracic pathological conditions on performance were assessed using subgroup analysis. One thousand PA CXRs of 1000 patients (480 men, 520 women; mean age 63 ± 23 years) were included. The CTR values derived from the DL models and diagnostic performance exhibited excellent agreement with reference standards for the whole test dataset. Performance of segmentation-based methods differed based on thoracic conditions. When tested using CXRs with lesions obscuring heart borders, the performance was lower than that for other thoracic pathological findings. Thus, segmentation-based methods using DL could detect cardiomegaly; however, the feasibility of computer-aided detection of cardiomegaly without human intervention was limited.
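
The cardiothoracic ratio is commonly computed as the maximal horizontal heart width divided by the maximal thoracic width; the sketch below shows this simplified calculation from binary segmentation masks, together with the mean absolute error and paired t-test used to assess agreement. The masks and values are synthetic, and the sketch is not the authors' pipeline.

# Sketch: cardiothoracic ratio (CTR) from binary heart/lung masks, plus MAE
# and a paired t-test against reference CTRs. Masks and values are synthetic.
import numpy as np
from scipy import stats

def width(mask):
    # Maximal horizontal extent of the mask, in pixels.
    cols = np.where(mask.any(axis=0))[0]
    return cols.max() - cols.min() + 1

def cardiothoracic_ratio(heart_mask, lung_mask):
    # Common simplification: widest heart span over widest thoracic span.
    return width(heart_mask) / width(lung_mask)

# Synthetic example masks (a real pipeline would use DL segmentation outputs).
heart = np.zeros((256, 256), dtype=bool); heart[100:160, 90:170] = True
lungs = np.zeros((256, 256), dtype=bool); lungs[60:200, 40:220] = True
print("CTR:", round(cardiothoracic_ratio(heart, lungs), 3))

# Agreement between automatic and reference CTRs over a small test set.
ctr_auto = np.array([0.48, 0.55, 0.51, 0.62])
ctr_ref = np.array([0.50, 0.54, 0.49, 0.60])
mae = np.mean(np.abs(ctr_auto - ctr_ref))
t_stat, p_value = stats.ttest_rel(ctr_auto, ctr_ref)
print(f"MAE = {mae:.3f}, paired t-test p = {p_value:.3f}")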


2020 ◽  
Vol 114 ◽  
pp. 242-245
Author(s):  
Jootaek Lee

The term Artificial Intelligence (AI) has changed since it was first coined by John McCarthy in 1956. AI, believed to have originated with Kurt Gödel's unprovable computational statements in 1931, is now often called deep learning or machine learning. AI is defined as a computer machine with the ability to make predictions about the future and solve complex tasks using algorithms. AI algorithms are enhanced and become effective with big data capturing the present and the past, while still necessarily reflecting human biases in their models and equations. AI is also capable of making choices like humans, mirroring human reasoning. AI can help robots to efficiently repeat the same labor-intensive procedures in factories and can analyze historic and present data efficiently through deep learning, natural language processing, and anomaly detection. Thus, AI covers a spectrum of augmented intelligence relating to prediction, autonomous intelligence relating to decision making, automated intelligence for labor robots, and assisted intelligence for data analysis.

