Big Data Assimilation: Real-time Demonstration Experiment of 30-second-update Forecasting in Tokyo in August 2020

Author(s):  
Takemasa Miyoshi ◽  
Takumi Honda ◽  
Arata Amemiya ◽  
Shigenori Otsuka ◽  
Yasumitsu Maejima ◽  
...  

Japan’s Big Data Assimilation (BDA) project started in October 2013 and completed its 5.5-year term in March 2019. In this project, we developed a novel numerical weather prediction (NWP) system at 100-m resolution, updated every 30 seconds, for precise prediction of individual convective clouds. The system was designed to take full advantage of the phased array weather radar (PAWR), which observes reflectivity and Doppler velocity every 30 seconds at 100 elevation angles with 100-m range resolution. By the end of the 5.5-year project period, we achieved a computational time of less than 30 seconds using Japan’s flagship K computer, whose 10-petaflops performance ranked #1 on the TOP500 list in 2011, for past cases in which all input data, such as boundary conditions and observations, were ready to use. The direct follow-on project started in April 2019 under the Japan Science and Technology Agency (JST) AIP (Advanced Intelligence Project) Acceleration Research. We continued development toward real-time operation of this novel 30-second-update NWP system, to be demonstrated at the time of the Tokyo 2020 Olympic and Paralympic Games. Although the Games were postponed, the project achieved a real-time demonstration of the 30-second-update NWP system at 500-m resolution using a powerful supercomputer called Oakforest-PACS, operated jointly by the University of Tsukuba and the University of Tokyo. The additional developments include parameter tuning for more accurate prediction and a complete workflow that prepares all input data in real time, i.e., fast data transfer from the novel dual-polarization PAWR (MP-PAWR) at Saitama University, and real-time nested-domain forecasts at 18-km, 6-km, and 1.5-km resolution that provide lateral boundary conditions for the innermost 500-m-mesh domain. A real-time test performed from July 31 to August 7, 2020 achieved an actual lead time of more than 27 minutes for 30-minute predictions, with very few exceptions of extended delay; in other words, the end-to-end latency from observation to forecast delivery was typically under about three minutes. Past-case experiments showed that this system could capture the rapid intensification and decay of convective rain occurring on time scales of less than 10 minutes, whereas JMA nowcasting does not predict such rapid changes by design. This presentation summarizes the real-time demonstration from August 25 to September 7, when the Tokyo 2020 Paralympic Games were originally scheduled to take place.

2016 ◽  
Vol 11 (2) ◽  
pp. 164-174 ◽  
Author(s):  
Shunichi Koshimura ◽  

A project titled “Establishing the advanced disaster reduction management system by fusion of real-time disaster simulation and big data assimilation” was launched as Core Research for Evolutional Science and Technology (CREST) by the Japan Science and Technology Agency (JST). Intended to save as many lives as possible in future national crises involving earthquake and tsunami disasters, the project works on a disaster mitigation system for the big data era, based on the cooperation of large-scale, high-resolution, real-time numerical simulations and the assimilation of real-time observation data. The world’s most advanced specialists in disaster simulation, disaster management, mathematical science, and information science work together to create the world’s first analysis platform for real-time simulation and big data that effectively processes, analyzes, and assimilates data obtained through various observations. Based on quantitative data, the platform designs proactive measures and supports disaster operations immediately after a disaster occurs. The project was launched in 2014 and is currently working on the following issues:

- Sophistication and fusion of simulations and damage prediction models using observational big data: development of a real-time simulation core system that predicts the time evolution of disaster effects by assimilating location information, fire information, and building collapse information obtained from mobile terminals, satellite images, aerial images, and other new observation data, in addition to sensing data from the undersea high-density seismic observation network.
- Latent structure analysis and major disaster scenario creation based on a huge amount of simulation results: development of a method for analyzing and extracting the latent structure of the huge number of disaster scenarios generated by simulation, and creation of severe scenarios with minimum “unexpectedness” by controlling disaster scenario explosion (an explosive increase in the number of predicted scenarios).
- Establishment of an earthquake and tsunami disaster mitigation big data analysis platform: development of a platform that enables the analysis of a huge number of disaster scenarios and faster data assimilation, and clarification of the requirements for operating the platform as a disaster mitigation system.

The project was launched in 2014 as a 5-year project. It consists of element technology development and system fusion; a feasibility study as a next-generation disaster mitigation system (validation with and without the developed real-time simulation and big data analysis platform) in the areas affected by the Great East Japan Earthquake; and test operations in areas that would be affected by a Tokyo metropolitan earthquake or a Nankai Trough earthquake.


2020 ◽  
Author(s):  
Takemasa Miyoshi ◽  
Takumi Honda ◽  
Shigenori Otsuka ◽  
Arata Amemiya ◽  
Yasumitsu Maejima ◽  
...  

Japan’s Big Data Assimilation (BDA) project started in October 2013 and completed its 5.5-year term in March 2019. The direct follow-on project was accepted and started in April 2019 under the Japan Science and Technology Agency (JST) AIP (Advanced Intelligence Project) Acceleration Research, with emphasis on the connection with AI technologies, in particular an integration of DA and AI with high-performance computing (HPC). The BDA project aimed to take full advantage of “big data” from advanced sensors such as the phased array weather radar (PAWR) and the Himawari-8 geostationary satellite, which provide two orders of magnitude more data than previous sensors. We achieved successful case studies with a newly developed 30-second-update, 100-m-mesh numerical weather prediction (NWP) system, based on RIKEN’s SCALE model and a local ensemble transform Kalman filter (LETKF), that assimilates PAWR observations in Osaka and Kobe. We have been actively developing the workflow for real-time weather forecasting in Tokyo in summer 2020. In addition, we developed two precipitation nowcasting systems using the 30-second PAWR data: one based on optical flow, the other on deep learning. We chose the convolutional long short-term memory (Conv-LSTM) network as the deep learning algorithm and found it effective for precipitation nowcasting. The use of Conv-LSTM would lead to an integration of DA and AI with HPC. This presentation will include an overview of the BDA project toward DA-AI-HPC integration under the new AIP Acceleration Research scheme, as well as recent progress of the project.
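As an illustration of the deep-learning nowcasting approach mentioned above, the sketch below builds a minimal Conv-LSTM frame predictor with Keras’ built-in ConvLSTM2D layer. This is not the project’s actual system; the layer widths, sequence length, and grid size are arbitrary assumptions.

```python
# Minimal Conv-LSTM precipitation nowcasting sketch (illustrative only).
# Input: a sequence of 2-D radar reflectivity frames; output: the next frame.
import tensorflow as tf

def build_nowcast_model(seq_len=10, height=64, width=64):
    # (time, y, x, channel) sequence of single-channel reflectivity frames
    inputs = tf.keras.Input(shape=(seq_len, height, width, 1))
    x = tf.keras.layers.ConvLSTM2D(32, kernel_size=3, padding="same",
                                   return_sequences=True)(inputs)
    x = tf.keras.layers.ConvLSTM2D(32, kernel_size=3, padding="same",
                                   return_sequences=False)(x)
    # Map the final hidden state to a predicted reflectivity frame
    outputs = tf.keras.layers.Conv2D(1, kernel_size=1)(x)
    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="mse")
    return model
```

Trained on sequences of past 30-second radar frames, such a model maps the most recent frames to the next frame; applying it iteratively yields a short nowcast.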


Healthcare ◽  
2020 ◽  
Vol 8 (3) ◽  
pp. 234 ◽  
Author(s):  
Hyun Yoo ◽  
Soyoung Han ◽  
Kyungyong Chung

Recently, massive amounts of bioinformation big data have been collected by sensor-based IoT devices, and the collected data are classified into different types of health big data using various techniques. A personalized analysis technique is the basis for judging the risk factors of personal cardiovascular disorders in real time. The objective of this paper is to provide a model for personalized heart-condition classification that combines a fast and effective preprocessing technique with a deep neural network in order to process biosensor input data accumulated in real time. The model learns the input data and develops an approximation function, and it can help users recognize risk situations. For the analysis of pulse frequency, a fast Fourier transform is applied during preprocessing, and data reduction is performed using the frequency-by-frequency ratios of the extracted power spectrum. To interpret the preprocessed data, a neural network algorithm is applied; in particular, a deep neural network is used to analyze and evaluate the linear data. A deep neural network stacks multiple layers of nodes and trains their weights using gradient descent. The completed model was trained by classifying previously collected ECG signals into normal, control, and noise groups; thereafter, ECG signals input in real time through the trained deep neural network were classified into the same three groups. To evaluate the performance of the proposed model, this study used the reduction ratio of the data operation cost and the F-measure. Using the fast Fourier transform and cumulative frequency percentages, the ECG data were reduced at a ratio of 1:32, and the F-measure analysis showed that the deep neural network model achieved 83.83% accuracy. Given these results, the modified deep neural network technique reduces the computational size of big data and is an effective system for reducing operation time.
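To make the preprocessing step concrete, here is a minimal sketch of FFT-based feature reduction in the spirit described above. It is not the authors’ exact pipeline; the function name and the fixed 1:32 keep ratio are illustrative assumptions.

```python
# Illustrative FFT-based reduction of an ECG segment (not the paper's exact code).
import numpy as np

def reduce_ecg_segment(signal, reduction=32):
    """Compress an ECG segment to ~1/32 of its length via its power spectrum."""
    power = np.abs(np.fft.rfft(signal)) ** 2     # power spectrum of the segment
    ratios = power / power.sum()                 # frequency-by-frequency ratios
    n_keep = max(1, len(signal) // reduction)    # e.g. 1:32 size reduction
    return ratios[:n_keep]                       # low-frequency features for the DNN

# Example: a 1024-sample segment becomes a 32-value feature vector
features = reduce_ecg_segment(np.random.randn(1024))
```

The reduced feature vector, rather than the raw waveform, is what the downstream classifier consumes, which is where the reported savings in operation cost come from.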


2018 ◽  
Vol 33 (2) ◽  
pp. 599-607 ◽  
Author(s):  
John R. Lawson ◽  
John S. Kain ◽  
Nusrat Yussouf ◽  
David C. Dowell ◽  
Dustan M. Wheatley ◽  
...  

Abstract The Warn-on-Forecast (WoF) program, driven by advanced data assimilation and ensemble design of numerical weather prediction (NWP) systems, seeks to advance 0–3-h NWP to aid National Weather Service warnings for thunderstorm-induced hazards. An early prototype of the WoF prediction system is the National Severe Storms Laboratory (NSSL) Experimental WoF System for ensembles (NEWSe), which comprises 36 ensemble members with varied initial conditions and parameterization suites. In the present study, real-time 3-h quantitative precipitation forecasts (QPFs) during spring 2016 from NEWSe members are compared against those from two real-time deterministic systems: the operational High Resolution Rapid Refresh (HRRR, version 1) and an upgraded, experimental configuration of the HRRR. All three model systems were run at 3-km horizontal grid spacing and differ in initialization, particularly in the radar data assimilation methods. It is the impact of this difference that is evaluated herein using both traditional and scale-aware verification schemes. NEWSe, evaluated deterministically for each member, shows marked improvement over the two HRRR versions for 0–3-h QPFs, especially at higher thresholds and smaller spatial scales. This improvement diminishes with forecast lead time. The experimental HRRR model, which became operational as HRRR version 2 in August 2016, also provides added skill over HRRR version 1.
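The scale-aware verification mentioned above can be illustrated with the fractions skill score (FSS), a widely used neighborhood-based QPF metric; the sketch below is a generic implementation, not necessarily the study’s exact configuration.

```python
# Fractions skill score (FSS): a common scale-aware QPF verification metric.
import numpy as np
from scipy.ndimage import uniform_filter

def fss(forecast, observed, threshold, window):
    """FSS over square neighborhoods of `window` grid points at a rain threshold."""
    f_frac = uniform_filter((forecast >= threshold).astype(float), size=window)
    o_frac = uniform_filter((observed >= threshold).astype(float), size=window)
    mse = np.mean((f_frac - o_frac) ** 2)
    mse_ref = np.mean(f_frac ** 2) + np.mean(o_frac ** 2)
    return 1.0 - mse / mse_ref if mse_ref > 0 else np.nan
```

FSS ranges from 0 (no skill) to 1 (perfect); sweeping the neighborhood size and threshold reveals how forecast skill varies with spatial scale and rain intensity, which is how improvements at higher thresholds and smaller scales are quantified.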


2016 ◽  
Vol 3 (1) ◽  
pp. 67
Author(s):  
Sangeeta Maharjan ◽  
Ram P. Regmi

As part of ongoing research activities at the National Atmospheric Resource and Environmental Research Laboratory (NARERL) to realize high-spatial- and temporal-resolution weather forecasts for Nepal, the performance of the Weather Research and Forecasting (WRF) modeling system has been examined with initialization from the National Centers for Environmental Prediction (NCEP) and National Centre for Medium Range Weather Forecasting (NCMRWF) global meteorological data sets, together with the effect of surface observation data assimilation. The study shows that the WRF modeling system predicts the diurnal variation of upcoming weather events reasonably well with both data sets. Assimilating observation data from weather stations distributed over the entire country may lead to significant improvement in the accuracy and reliability of extended forecasts. However, upper-air observation data assimilation would be necessary to achieve the desired precision and reliability of extended weather forecasts.


2010 ◽  
Vol 27 (7) ◽  
pp. 1140-1152 ◽  
Author(s):  
Eunha Lim ◽  
Juanzhen Sun

Abstract A Doppler velocity dealiasing algorithm is developed within the storm-scale four-dimensional radar data assimilation system known as the Variational Doppler Radar Analysis System (VDRAS). The innovative aspect of the algorithm is that it dealiases Doppler velocity at each grid point independently by using three-dimensional wind fields obtained either from an objective analysis using conventional observations and mesoscale model output or from a rapidly updated analysis of VDRAS that assimilates radar data. The algorithm consists of three steps: preserving horizontal shear, global dealiasing using reference winds from the objective analysis or the VDRAS analysis, and local dealiasing. It is automated and intended for operational use in radar data assimilation with numerical weather prediction models. The algorithm was tested with 384 volumes of radar data observed by the Next Generation Weather Radar (NEXRAD) for a severe thunderstorm that occurred on 15 June 2002. The results showed that the algorithm was effective in dealiasing large areas of aliased velocities when the wind from the objective analysis was used as the reference, and that more accurate dealiasing was achieved by using the continuously cycled VDRAS analysis.
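The core unfolding step of such gridpoint-by-gridpoint dealiasing can be sketched as follows. This is a generic illustration of velocity unfolding against a reference wind, not VDRAS’s full three-step algorithm.

```python
# Unfold aliased Doppler velocities toward a reference radial wind (sketch).
import numpy as np

def unfold(v_obs, v_ref, v_nyq):
    """An aliased velocity differs from the truth by an unknown multiple of
    2 * v_nyq; pick the multiple that brings it closest to the reference."""
    k = np.round((v_ref - v_obs) / (2.0 * v_nyq))
    return v_obs + 2.0 * k * v_nyq
```

In the algorithm described above, the reference velocity at each grid point would come from the objective analysis or the cycled VDRAS analysis, projected onto the radar beam direction.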


Geosciences ◽  
2018 ◽  
Vol 8 (12) ◽  
pp. 489 ◽  
Author(s):  
Jürgen Helmert ◽  
Aynur Şensoy Şorman ◽  
Rodolfo Alvarado Montero ◽  
Carlo De Michele ◽  
Patricia de Rosnay ◽  
...  

The European Cooperation in Science and Technology (COST) Action ES1404 “HarmoSnow,” entitled “A European network for a harmonized monitoring of snow for the benefit of climate change scenarios, hydrology and numerical weather prediction” (2014-2018), aims to coordinate efforts in Europe to harmonize approaches to validation and methodologies of snow measurement practices, instrumentation, algorithms, and data assimilation (DA) techniques. One of the key objectives of the Action was to “Advance the application of snow DA in numerical weather prediction (NWP) and hydrological models and show its benefit for weather and hydrological forecasting as well as other applications.” This paper reviews approaches used for the assimilation of snow measurements, such as remotely sensed and in situ observations, into hydrological, land surface, meteorological, and climate models, based on a COST HarmoSnow survey exploring common practices in the use of snow observation data in different modeling environments. The aim is to assess the current situation and understand the diversity of usage of snow observations in DA, forcing, monitoring, validation, or verification within NWP, hydrology, snow, and climate models. Based on the community’s responses to the questionnaire and on a literature review, the status of and requirements for the future evolution of conventional snow observations from national networks and satellite products for data assimilation and model validation are derived, and suggestions are formulated toward standardized and improved usage of snow observation data in snow DA. Results of the survey showed a good fit between the snow macro-physical variables required for snow DA and those provided by the measurement networks, instruments, and techniques. Data availability and the resources needed to integrate the data into the model environment are identified as the current barriers to and limitations on the use of new or upcoming snow data sources. Broadening the resources to integrate enhanced snow data would promote future plans to make use of them in all model environments.


2010 ◽  
Vol 25 (6) ◽  
pp. 1816-1825 ◽  
Author(s):  
Fuqing Zhang ◽  
Yonghui Weng ◽  
Ying-Hwa Kuo ◽  
Jeffery S. Whitaker ◽  
Baoguo Xie

Abstract This study examines the prediction and predictability of the recent catastrophic rainfall and flooding event over Taiwan induced by Typhoon Morakot (2009) with a state-of-the-art numerical weather prediction model. A high-resolution, convection-permitting mesoscale ensemble, initialized with analysis and flow-dependent perturbations obtained from a real-time global ensemble data assimilation system, is found to be able to predict this record-breaking rainfall event, producing probability forecasts potentially valuable to emergency management decision-makers and the general public. Since all of the advanced modeling and data assimilation techniques used here are readily available for real-time operational implementation, provided sufficient computing resources are made available, this study demonstrates the potential of, and need for, using ensemble-based analysis and forecasting, along with enhanced computing, to predict extreme weather events like Typhoon Morakot at operational centers.
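Ensemble probability forecasts of the kind mentioned above are commonly derived as exceedance frequencies across members; a minimal generic sketch (not the study’s exact product):

```python
# Ensemble exceedance probability for, e.g., accumulated rainfall (sketch).
import numpy as np

def exceedance_probability(ensemble, threshold):
    """Fraction of members exceeding `threshold` at each grid point.

    ensemble: array of shape (n_members, ny, nx)."""
    return (ensemble >= threshold).mean(axis=0)
```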


1993 ◽  
Vol 18 ◽  
pp. 65-71 ◽  
Author(s):  
Y. Durand ◽  
E. Brun ◽  
L. Merindol ◽  
G. Guyomarc'h ◽  
B. Lesaffre ◽  
...  

Relevant meteorological parameters have been analyzed to provide boundary conditions in real time for an energy, mass and stratigraphical model of snow cover at locations surrounded by meteorological observation points. From the available observation data, this analysis provides hourly meteorological information on every Alpine massif for six different aspects at 300 m elevation intervals. A numerical snow model has been run with these estimated meteorological data for numerous locations in the French Alps during the last ten years. Comparisons with observed snow characteristics (e.g., depth and stratigraphy) have proved the potential of the method.


2020 ◽  
Vol 35 (4) ◽  
pp. 1345-1362 ◽  
Author(s):  
Paula Maldonado ◽  
Juan Ruiz ◽  
Celeste Saulo

Abstract Specification of suitable initial conditions to accurately forecast high-impact weather events associated with intense thunderstorms still poses a significant challenge for convective-scale forecasting. Radar data assimilation has been showing encouraging results in producing an accurate estimate of the state of the atmosphere at the mesoscale, as it combines high-spatiotemporal-resolution observations with convection-permitting numerical weather prediction models. However, many open questions remain regarding the configuration of state-of-the-art data assimilation systems at the mesoscale and their potential impact on short-range weather forecasts. In this work, several observing system simulation experiments of a mesoscale convective system were performed to assess the sensitivity of the local ensemble transform Kalman filter to both relaxation-to-prior-spread (RTPS) inflation and horizontal localization of the error covariance matrix. Realistic large-scale forcing and model errors were taken into account in the simulation of reflectivity and Doppler velocity observations. Overall, the most accurate analyses in terms of RMSE were produced with a relatively small horizontal localization cutoff radius (~3.6–7.3 km) and a large RTPS inflation parameter (~0.9–0.95). Additionally, the impact of horizontal localization on short-range ensemble forecasts was larger than that of inflation, almost doubling the lead times up to which the effect of using a more accurate initial state persisted.
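The RTPS inflation examined here follows Whitaker and Hamill (2012): posterior ensemble perturbations are rescaled so the analysis spread relaxes toward the prior spread by a factor alpha. A minimal sketch, with an assumed (n_members, n_grid) array layout:

```python
# Relaxation-to-prior-spread (RTPS) inflation, after Whitaker and Hamill (2012).
import numpy as np

def rtps_inflate(prior_perts, post_perts, alpha):
    """Rescale posterior perturbations toward the prior spread.

    prior_perts, post_perts: shape (n_members, n_grid), each row a member's
    deviation from the ensemble mean; alpha ~0.9-0.95 in the experiments above."""
    sigma_b = prior_perts.std(axis=0, ddof=1)   # prior (background) spread
    sigma_a = post_perts.std(axis=0, ddof=1)    # posterior (analysis) spread
    factor = (alpha * sigma_b + (1.0 - alpha) * sigma_a) / np.maximum(sigma_a, 1e-12)
    return post_perts * factor
```

With alpha near 1, the analysis spread is pulled almost entirely back to the background spread, which counteracts the spread deficiency that small localization radii and dense radar data tend to produce.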

