Data assimilation for deterministic prediction of vessel motion in real-time

2022 ◽  
Vol 244 ◽  
pp. 110356
Author(s):  
C.T. Liong ◽  
K.H. Chua
2010 ◽  
Author(s):  
Constantinos Evangelinos ◽  
Pierre F. Lermusiaux ◽  
Jinshan Xu ◽  
Patrick J. Haley Jr. ◽  
Christopher N. Hill ◽  
...  

2012 ◽  
Vol 12 (12) ◽  
pp. 3719-3732 ◽  
Author(s):  
L. Mediero ◽  
L. Garrote ◽  
A. Chavez-Jimenez

Abstract. Opportunities offered by high performance computing hold significant promise for enhancing the performance of real-time flood forecasting systems. In this paper, a real-time framework for probabilistic flood forecasting through data assimilation is presented. The distributed rainfall-runoff real-time interactive basin simulator (RIBS) model is selected to simulate the hydrological processes in the basin. Although the RIBS model is deterministic, it is run in a probabilistic way using the results of a calibration developed in previous work by the authors, which identified the probability distribution functions that best characterise the most relevant model parameters. Adaptive techniques improve flood forecasts because the model can be adjusted to observations in real time as new information becomes available. The new adaptive forecast model, which uses genetic programming as a data assimilation technique, is compared with the previously developed flood forecast model based on the calibration results. Both models are probabilistic, as they generate an ensemble of hydrographs that accounts for the different uncertainties inherent in any forecast process. The Manzanares River basin was selected as a case study; the process is computationally intensive because it requires simulating many replicas of the ensemble in real time.
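As a rough illustration of running a deterministic rainfall-runoff model in a probabilistic way, the Python sketch below draws an ensemble of parameter sets from assumed calibrated distributions and reweights the members against recent discharge observations. It uses a toy linear-reservoir model and a simple likelihood-style reweighting in place of RIBS and the genetic-programming assimilation described in the abstract; the parameter ranges, rainfall series, and noise levels are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def rainfall_runoff(params, rainfall):
    """Stand-in for a deterministic model such as RIBS: a toy linear
    reservoir driven by rainfall. params = (runoff_coeff, recession)."""
    runoff_coeff, recession = params
    q = np.zeros_like(rainfall)
    storage = 0.0
    for t, p in enumerate(rainfall):
        storage = recession * storage + runoff_coeff * p
        q[t] = storage
    return q

# Ensemble of parameter sets drawn from (assumed) calibrated distributions.
n_members = 100
params = np.column_stack([
    rng.uniform(0.2, 0.8, n_members),    # runoff coefficient
    rng.uniform(0.85, 0.99, n_members),  # recession constant
])

rainfall = rng.gamma(2.0, 3.0, size=48)  # hourly rainfall (mm), synthetic
ensemble = np.array([rainfall_runoff(p, rainfall) for p in params])

# Simple assimilation step: reweight members by their fit to the discharge
# "observed" over the first 24 h, then issue a weighted 24-48 h forecast.
obs = ensemble[0, :24] + rng.normal(0.0, 0.5, 24)  # synthetic observations
errors = ((ensemble[:, :24] - obs) ** 2).mean(axis=1)
weights = np.exp(-0.5 * errors / errors.min())
weights /= weights.sum()

forecast_mean = weights @ ensemble[:, 24:]
print("weighted 24-48 h forecast mean discharge:", forecast_mean.round(2))
```

The reweighting step is only meant to show how new observations can shift the ensemble in real time; the paper's adaptive model evolves the forecast with genetic programming rather than with this kind of weighting.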


2018 ◽  
Vol 33 (2) ◽  
pp. 599-607 ◽  
Author(s):  
John R. Lawson ◽  
John S. Kain ◽  
Nusrat Yussouf ◽  
David C. Dowell ◽  
Dustan M. Wheatley ◽  
...  

Abstract The Warn-on-Forecast (WoF) program, driven by advanced data assimilation and ensemble design of numerical weather prediction (NWP) systems, seeks to advance 0–3-h NWP to aid National Weather Service warnings for thunderstorm-induced hazards. An early prototype of the WoF prediction system is the National Severe Storms Laboratory (NSSL) Experimental WoF System for ensembles (NEWSe), which comprises 36 ensemble members with varied initial conditions and parameterization suites. In the present study, real-time 3-h quantitative precipitation forecasts (QPFs) during spring 2016 from NEWSe members are compared against those from two real-time deterministic systems: the operational High Resolution Rapid Refresh (HRRR, version 1) and an upgraded, experimental configuration of the HRRR. All three model systems were run at 3-km horizontal grid spacing and differ in initialization, particularly in the radar data assimilation methods. It is the impact of this difference that is evaluated herein using both traditional and scale-aware verification schemes. NEWSe, evaluated deterministically for each member, shows marked improvement over the two HRRR versions for 0–3-h QPFs, especially at higher thresholds and smaller spatial scales. This improvement diminishes with forecast lead time. The experimental HRRR model, which became operational as HRRR version 2 in August 2016, also provides added skill over HRRR version 1.
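The abstract does not name the specific scale-aware verification schemes used; as one common example of the idea, the sketch below computes the fractions skill score (FSS), a neighbourhood-based QPF metric that compares exceedance fractions rather than point matches, on synthetic 3-km precipitation fields at several neighbourhood widths. The grid size, threshold, and displaced-forecast construction are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def fractions_skill_score(forecast, observed, threshold, window):
    """Fractions skill score on 2-D precipitation fields: compare
    neighbourhood exceedance fractions instead of grid-point matches."""
    f_bin = (forecast >= threshold).astype(float)
    o_bin = (observed >= threshold).astype(float)
    f_frac = uniform_filter(f_bin, size=window, mode="constant")
    o_frac = uniform_filter(o_bin, size=window, mode="constant")
    mse = np.mean((f_frac - o_frac) ** 2)
    mse_ref = np.mean(f_frac ** 2) + np.mean(o_frac ** 2)
    return 1.0 - mse / mse_ref if mse_ref > 0 else np.nan

# Synthetic 3-h QPF fields on a 3-km grid (values in mm).
rng = np.random.default_rng(0)
obs = rng.gamma(0.5, 4.0, size=(200, 200))
fcst = np.roll(obs, shift=5, axis=1) + rng.normal(0, 1, obs.shape)  # displaced forecast

for window in (1, 5, 15, 41):  # neighbourhood width in grid points
    score = fractions_skill_score(fcst, obs, threshold=10.0, window=window)
    print(f"FSS at ~{3 * window:>3d} km scale: {score:.3f}")
```

Scores that improve as the neighbourhood widens indicate a forecast that places precipitation approximately, but not exactly, in the right location, which is the kind of behaviour scale-aware verification is designed to credit.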


Author(s):  
Duanfeng Han ◽  
Kuo Huang ◽  
Yingfei Zan ◽  
Lihao Yuan ◽  
Zhaohui Wu ◽  
...  

2020 ◽  
Vol 164 ◽  
pp. 03004
Author(s):  
Nikolay Ivanovskiy ◽  
Ivan Gorychev ◽  
Aleksandr Yashin ◽  
Sergey Bidenko

The paper considers the task of synthesising algorithms for identifying random parameters of a vessel, such as added masses and the moment of inertia, and for estimating the current characteristics of the vessel's motion from real-time measurements of onboard sensors. The aim of this synthesis is to determine (estimate) both the current parameters (added masses, moment of inertia) and the characteristics of the vessel's motion (position vector, velocity) from measurements of the vessel's motion, angular position, and angular rate of rotation.
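The synthesised algorithms themselves are not reproduced here; as a minimal sketch of one common approach to this kind of joint parameter and state estimation, the Python example below runs an augmented-state extended Kalman filter on a toy one-degree-of-freedom surge model, estimating an unknown added mass together with the vessel's velocity from noisy speed measurements. The dynamics, noise levels, and all numerical values are illustrative assumptions, not the model or algorithm from the paper.

```python
import numpy as np

# Toy 1-DOF surge model: (m + m_a) * dv/dt = thrust - d * v,
# where the added mass m_a is unknown and treated as a constant state.
dt, m, d, thrust = 0.1, 1.0e5, 8.0e3, 2.0e5   # s, kg, kg/s, N
m_a_true = 3.0e4                               # kg (unknown to the filter)

def step(v, m_a):
    return v + dt * (thrust - d * v) / (m + m_a)

# Simulate "onboard sensor" velocity measurements.
rng = np.random.default_rng(1)
v_true, meas = 0.0, []
for _ in range(600):
    v_true = step(v_true, m_a_true)
    meas.append(v_true + rng.normal(0.0, 0.05))

# Augmented-state EKF: x = [velocity, added mass].
x = np.array([0.0, 1.0e4])                     # poor initial guess for m_a
P = np.diag([1.0, (5.0e4) ** 2])
Q = np.diag([1e-4, 1.0])                       # small random walk on m_a
R = 0.05 ** 2
H = np.array([[1.0, 0.0]])

for z in meas:
    v, ma = x
    # Predict state and covariance with the linearised dynamics.
    x = np.array([step(v, ma), ma])
    F = np.array([[1.0 - dt * d / (m + ma),
                   -dt * (thrust - d * v) / (m + ma) ** 2],
                  [0.0, 1.0]])
    P = F @ P @ F.T + Q
    # Update with the velocity measurement.
    S = H @ P @ H.T + R
    K = P @ H.T / S
    x = x + (K * (z - x[0])).ravel()
    P = (np.eye(2) - K @ H) @ P

print(f"estimated added mass: {x[1]:.0f} kg (true {m_a_true:.0f} kg)")
```

Because the added mass only affects the transient approach to steady speed, identifiability comes from observing the vessel while it accelerates; the same augmented-state idea extends to full planar motion with the moment of inertia and angular-rate measurements included in the state and observation vectors.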


2016 ◽  
Vol 11 (2) ◽  
pp. 164-174 ◽  
Author(s):  
Shunichi Koshimura

A project titled "Establishing the advanced disaster reduction management system by fusion of real-time disaster simulation and big data assimilation" was launched as Core Research for Evolutional Science and Technology (CREST) by the Japan Science and Technology Agency (JST). Intended to save as many lives as possible in future national crises involving earthquake and tsunami disasters, the project works on a disaster mitigation system for the big data era, based on the cooperation of large-scale, high-resolution, real-time numerical simulations and the assimilation of real-time observation data. The world's most advanced specialists in disaster simulation, disaster management, mathematical science, and information science work together to create the world's first analysis platform for real-time simulation and big data that effectively processes, analyzes, and assimilates data obtained through various observations. Based on quantitative data, the platform designs proactive measures and supports disaster operations immediately after a disaster occurs. The project was launched in 2014 and is currently working on the following issues.

(1) Sophistication and fusion of simulations and damage prediction models using observational big data: development of a real-time simulation core system that predicts the time evolution of disaster effects by assimilating location information, fire information, and building collapse information obtained from mobile terminals, satellite images, aerial images, and other new observation data, in addition to sensing data obtained by the undersea high-density seismic observation network.

(2) Latent structure analysis and major disaster scenario creation based on a huge amount of simulation results: development of an analysis and extraction method for the latent structure of the huge number of disaster scenarios generated by simulation, and creation of severe scenarios with minimum "unexpectedness" by controlling disaster scenario explosion (an explosive increase in the number of predicted scenarios).

(3) Establishment of an earthquake and tsunami disaster mitigation big data analysis platform: development of a big data analysis platform that enables analysis of a huge number of disaster scenarios, speeds up data assimilation, and clarifies the requirements for operating the platform as a disaster mitigation system.

The project was launched in 2014 as a 5-year project. It consists of element technology development and system fusion; a feasibility study as a next-generation disaster mitigation system (validation with and without the introduction of the developed real-time simulation and big data analysis platform) in the areas affected by the Great East Japan Earthquake; and test operations in the anticipated affected areas of a Tokyo metropolitan earthquake and a Nankai Trough earthquake.
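As a minimal, hedged illustration of the kind of step involved in assimilating sparse real-time observations into a running simulation, the Python sketch below applies an optimal-interpolation-style update that blends a toy simulated tsunami-height transect with a few synthetic gauge observations. It is not the project's analysis platform; the grid, error statistics, and Gaussian correlation model are assumptions made for the example.

```python
import numpy as np

# Optimal-interpolation-style assimilation step: blend a simulated
# tsunami-height field with sparse offshore gauge observations,
# weighting by assumed background and observation errors.
rng = np.random.default_rng(3)

nx = 200                                             # 1-D coastal transect
x = np.arange(nx)
truth = 2.0 * np.exp(-((x - 120) / 15.0) ** 2)       # "true" wave (m)
background = 2.0 * np.exp(-((x - 105) / 15.0) ** 2)  # simulation, mislocated

obs_idx = np.array([40, 90, 110, 130, 170])          # gauge locations
obs = truth[obs_idx] + rng.normal(0.0, 0.05, obs_idx.size)

sigma_b, sigma_o, L = 0.3, 0.05, 10.0   # error std devs (m), corr. length
# Gaussian background-error correlations between grid points and gauges.
B = sigma_b**2 * np.exp(-0.5 * ((x[:, None] - obs_idx[None, :]) / L) ** 2)
S = sigma_b**2 * np.exp(-0.5 * ((obs_idx[:, None] - obs_idx[None, :]) / L) ** 2)
S += sigma_o**2 * np.eye(obs_idx.size)

gain = B @ np.linalg.inv(S)             # Kalman-style gain (nx x n_obs)
analysis = background + gain @ (obs - background[obs_idx])

print("max height  truth/background/analysis:",
      truth.max().round(2), background.max().round(2), analysis.max().round(2))
```

In an operational setting this kind of update would be applied repeatedly as new sensor data arrive, which is what makes the high-throughput analysis platform described above necessary.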

