Evaluation of real time and future global monitoring and forecasting systems at Mercator Océan

2012 ◽  
Vol 9 (2) ◽  
pp. 1123-1185 ◽  
Author(s):  
J.-M. Lellouche ◽  
O. Le Galloudec ◽  
M. Drévillon ◽  
C. Régnier ◽  
E. Greiner ◽  
...  

Abstract. Since December 2010, the global analysis and forecast component of the MyOcean system has consisted of the Mercator Océan NEMO global 1/4° configuration with a 1/12° "zoom" over the Atlantic and the Mediterranean Sea. The open boundaries of the zoom come from the global 1/4° configuration at 20° S and 80° N. The data assimilation uses a reduced-order Kalman filter with a 3-D multivariate modal decomposition of the forecast error; it includes an adaptive error scheme and a localization algorithm. A 3D-Var scheme corrects the slowly evolving large-scale biases in temperature and salinity. Altimeter data, satellite temperature and in situ temperature and salinity vertical profiles are jointly assimilated to estimate the initial conditions for the numerical ocean forecast. This paper describes the recent systems. The validation procedure is introduced and applied to the current and future systems, and it is shown how validation feeds back on system quality: quality control (in situ, drifters) and data sources (satellite temperature) matter as much as system design (model physics and assimilation parameters). The validation demonstrates the accuracy of the MyOcean global products, whose quality is stable in time. The future systems under development still suffer from a drift that could only be detected with a 5 yr hindcast of the systems. This emphasizes the need for continuous research efforts in building future versions of the MyOcean2 forecasting capacities.
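The analysis step sketched in this abstract, observations correcting a model forecast, can be illustrated with a generic textbook Kalman analysis update. This is a minimal sketch, not the reduced-order operational scheme described above; the state, operators, and covariances below are illustrative assumptions.

```python
import numpy as np

def kalman_analysis(x_f, P_f, y, H, R):
    """Generic Kalman analysis step: correct forecast x_f with observations y.

    x_f : forecast state (n,)
    P_f : forecast error covariance (n, n)
    y   : observations (m,)
    H   : observation operator (m, n)
    R   : observation error covariance (m, m)
    """
    S = H @ P_f @ H.T + R              # innovation covariance
    K = P_f @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_a = x_f + K @ (y - H @ x_f)      # analysis state
    P_a = (np.eye(len(x_f)) - K @ H) @ P_f
    return x_a, P_a

# Toy example: two-variable state (temperature, salinity),
# one observation of temperature only.
x_f = np.array([10.0, 35.0])
P_f = np.array([[1.0, 0.2], [0.2, 0.5]])  # cross-covariance couples T and S
H = np.array([[1.0, 0.0]])
R = np.array([[0.5]])
y = np.array([11.0])
x_a, P_a = kalman_analysis(x_f, P_f, y, H, R)
```

Because the forecast covariance is multivariate, the temperature innovation also nudges the unobserved salinity, which is the essence of a multivariate assimilation scheme.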

2017 ◽  
Vol 12 (4) ◽  
pp. 241-247 ◽  
Author(s):  
Karol Opara ◽  
Jan Zieliński

Modelling of the pavement temperature facilitates winter road maintenance: it is used to predict glaze formation and to schedule the spraying of de-icing brine. Road weather is commonly forecast by solving energy balance equations, which requires setting the initial vertical profile of pavement temperature, often obtained from Road Weather Information Stations. This paper proposes using the average air temperature from the seven preceding days as a pseudo-observation of the subsurface temperature. The road weather model is then run with an offset of a few days: it first uses recent historical weather data and then the available forecasts. This approach exploits the fact that energy balance models tend to "forget" their initial conditions and converge to a baseline solution. Experimental verification was conducted using the Model of the Environment and Temperature of Roads and data from a road weather station in Warsaw over a period of two years. The additional forecast error introduced by the proposed pseudo-observational initialization averages 1.2 °C in the first prediction hour and then decreases with time. The paper also discusses the use of Digital Surface Models to take shading effects into account, an essential source of forecast errors in urban areas. Limiting the use of in-situ sensors opens a perspective for an economical, large-scale implementation of road meteorological models.
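The pseudo-observation idea is simple to state: substitute the mean air temperature of the seven preceding days for the measured subsurface temperature. A minimal sketch (hypothetical input data, not the actual model interface):

```python
def pseudo_subsurface_temperature(hourly_air_temp_c):
    """Average air temperature over the preceding 7 days (168 hourly values),
    used as a pseudo-observation of the deep pavement temperature."""
    if len(hourly_air_temp_c) < 168:
        raise ValueError("need at least 7 days of hourly air temperature")
    last_week = hourly_air_temp_c[-168:]
    return sum(last_week) / len(last_week)

# Hypothetical example: a week of readings oscillating around 4 degC
week = [4.0 + (-1) ** h for h in range(168)]
t_init = pseudo_subsurface_temperature(week)  # -> 4.0
```

The model then runs forward from this initial profile over the offset period, during which the energy balance "forgets" the crude initialization before the forecast window begins.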


2015 ◽  
Vol 12 (3) ◽  
pp. 1145-1186 ◽  
Author(s):  
V. Turpin ◽  
E. Remy ◽  
P. Y. Le Traon

Abstract. Observing System Experiments (OSEs) are carried out over a one-year period to quantify the impact of Argo observations on the Mercator Océan 1/4° global ocean analysis and forecasting system. The reference simulation assimilates sea surface temperature (SST), SSALTO/DUACS altimeter data, and Argo and other in situ observations from the Coriolis data center. Two other simulations are carried out in which, respectively, all Argo data and half of the Argo data are withheld. Assimilating Argo observations has a significant impact on analyzed and forecast temperature and salinity fields at different depths. Without Argo data assimilation, large errors occur in the analyzed fields, as estimated from differences with in situ observations. For example, in the 0–300 m layer, RMS differences between analyzed fields and observations reach 0.25 psu and 1.25 °C in the western boundary currents and 0.1 psu and 0.75 °C in the open ocean. The impact of Argo data in reducing the observation-model forecast error is also significant from the surface down to a depth of 2000 m. Differences between independent observations and forecast fields are reduced by 20 % in the upper layers and by up to 40 % at a depth of 2000 m when Argo data are assimilated. At depth, the most affected regions of the global ocean are the Mediterranean outflow and the Labrador Sea. A significant degradation is observed when only half of the data are assimilated. All Argo observations thus matter, even at 1/4° model resolution. The main conclusion is that the performance of global data assimilation systems depends heavily on the availability of Argo data.
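The headline numbers above (a 20–40 % reduction in observation-minus-forecast differences) are root-mean-square statistics, and the computation is straightforward. A generic sketch with made-up difference values, not the study's data:

```python
import math

def rms(differences):
    """Root-mean-square of a list of obs-minus-forecast differences."""
    return math.sqrt(sum(d * d for d in differences) / len(differences))

def reduction_percent(rms_without, rms_with):
    """Percentage reduction of RMS error attributable to assimilation."""
    return 100.0 * (rms_without - rms_with) / rms_without

# Hypothetical obs-minus-forecast temperature differences (degC) at depth
diff_no_argo = [0.5, -0.4, 0.6, -0.5]   # simulation without Argo
diff_argo = [0.3, -0.2, 0.35, -0.3]     # simulation assimilating Argo
gain = reduction_percent(rms(diff_no_argo), rms(diff_argo))
```

In an OSE, the same statistic is computed against observations withheld from both runs, so that the comparison is independent of the assimilated data.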


2011 ◽  
Vol 139 (5) ◽  
pp. 1463-1491 ◽  
Author(s):  
Hiep Van Nguyen ◽  
Yi-Leng Chen

A model self-bogus vortex is constructed through cycle runs of the Weather Research and Forecasting (WRF) model to provide high-resolution initial conditions for tropical cyclone (TC) simulations. The vortex after 1 h of model simulation is used to construct the vortex structure of the initial conditions for the next cycle run. After about 80 cycle runs, the TC structure is well adapted to the model employed and well adjusted to the given large-scale conditions. Three simulations of Typhoon Morakot (2009) are performed with different initial conditions: the global analysis (CTRL), the bogus package from WRF (WB), and the new initialization package (NT). The NT scheme generates realistic vortex features, including sea level pressure, winds, a warm core, and the correct TC size, with the meteorological fields away from the observed TC center remaining consistent with the global analysis. The NT scheme also shows significant improvements in the simulated asymmetric structure, track, intensity, strength of low-level winds, radar reflectivity, and rainfall. In the WB and CTRL runs, the unbalanced initial vortex needs to adjust to the changing environment during the first 2–3 days of the simulation, which is likely to have negative impacts on the track, intensity, and rainfall forecasts in most cases. With all three initializations, the model is capable of simulating heavy orographic precipitation over southern Taiwan. However, owing to its better track forecast, only the NT run simulates the high-reflectivity band associated with the convergence zone between Morakot's circulation and the southwest monsoon off the southeast coast. In addition to Morakot's slow movement and relatively large size, Typhoons Goni and Etau were embedded within a moist monsoon gyre, and the combined circulations associated with the gyre and the tropical storms brought moisture-laden flow toward the western slopes of southern Taiwan.


Author(s):  
Chin-Hung Chen ◽  
Kao-Shen Chung ◽  
Shu-Chih Yang ◽  
Li-Hsin Chen ◽  
Pay-Liam Lin ◽  
...  

Abstract. A mesoscale convective system that occurred in southwestern Taiwan on 15 June 2008 is simulated using convection-allowing ensemble forecasts to investigate the forecast uncertainty associated with four microphysics schemes: the Goddard Cumulus Ensemble (GCE), Morrison (MOR), WRF single-moment 6-class (WSM6), and WRF double-moment 6-class (WDM6) schemes. First, the essential features of the convective structure, hydrometeor distribution, and microphysical tendencies of the different microphysics schemes are presented through deterministic forecasts. Second, ensemble forecasts with the same initial conditions are used to estimate the forecast uncertainty produced by ensembles with each fixed microphysics scheme. GCE has the largest spread in most state variables because it has the most efficient phase conversion between water species; by contrast, MOR produces the least spread. WSM6 and WDM6 have similar vertical spread structures because of their similar ice-phase formulae, but WDM6 produces more ensemble spread than WSM6 below the melting layer as a result of its double-moment treatment of warm-rain processes. Spectral analysis of the root-mean difference total energy (RMDTE) demonstrates upscale error growth in the simulations with all four microphysics schemes. The RMDTE results reveal that the GCE and WDM6 schemes are more sensitive to initial-condition uncertainty, whereas the MOR and WSM6 schemes are relatively less sensitive to it for this event. Overall, the diabatic heating and cooling processes connect the convective-scale cloud microphysical processes to the large-scale dynamic and thermodynamic fields, and they significantly affect the forecast error signatures in this multiscale weather system.
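The RMDTE diagnostic is built on the difference total energy between pairs of simulations; a commonly used form is DTE = ½(u′² + v′² + (c_p/T_r)·T′²). The sketch below uses this standard form with illustrative values; the exact weights and reference temperature used in the paper are assumptions here.

```python
import math

CP = 1004.0   # specific heat of dry air at constant pressure (J kg^-1 K^-1)
TR = 270.0    # reference temperature (K), a common choice

def difference_total_energy(du, dv, dt):
    """DTE = 0.5 * (du^2 + dv^2 + (CP/TR) * dt^2) at one grid point,
    where du, dv, dt are wind and temperature differences between two runs."""
    return 0.5 * (du * du + dv * dv + (CP / TR) * dt * dt)

def rmdte(pairs):
    """Root-mean DTE over a set of (u', v', T') member-pair differences."""
    return math.sqrt(sum(difference_total_energy(*p) for p in pairs) / len(pairs))

# Illustrative member-pair differences: (u', v', T')
spread = rmdte([(1.0, 0.5, 0.2), (0.8, -0.4, 0.1)])
```

Computing RMDTE per wavenumber band (via a spatial spectral decomposition of the difference fields) is what reveals the upscale error growth discussed above.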


2015 ◽  
Vol 17 (1) ◽  
pp. 345-352 ◽  
Author(s):  
Camille Garnaud ◽  
Stéphane Bélair ◽  
Aaron Berg ◽  
Tracy Rowlandson

Abstract This study explores the performance of Environment Canada’s Surface Prediction System (SPS) in comparison to in situ observations from the Brightwater Creek soil moisture observation network with respect to soil moisture and soil temperature. To do so, SPS is run at hyperresolution (100 m) over a small domain in southern Saskatchewan (Canada) during the summer of 2014. It is shown that with initial conditions and surface condition forcings based on observations, SPS can simulate soil moisture and soil temperature evolution over time with high accuracy (mean bias of 0.01 m3 m−3 and −0.52°C, respectively). However, the modeled spatial variability is generally much weaker than observed. This is likely related to the model’s use of uniform soil texture, the lack of small-scale orography, as well as a predefined crop growth cycle in SPS. Nonetheless, the spatial averages of simulated soil conditions over the domain are very similar to those observed, suggesting that both are representative of large-scale conditions. Thus, in the context of the National Aeronautics and Space Administration’s (NASA) Soil Moisture Active Passive (SMAP) project, this study shows that both simulated and in situ observations can be upscaled to allow future comparison with upcoming satellite data.


2014 ◽  
Vol 31 (2) ◽  
Author(s):  
Jose Antonio Moreira Lima

This paper is concerned with the planning, implementation, and some results of the Oceanographic Modeling and Observation Network (REMO) for Brazilian regional waters. Ocean forecasting has been an important scientific issue over the last decade, owing both to studies of climate change and to applications requiring short-range oceanic forecasts. The South Atlantic Ocean has a deficit of oceanographic measurements compared with other basins such as the North Atlantic and the North Pacific, and it is a challenge to design an ocean forecasting system for a region with such poor observational coverage of in-situ data. Fortunately, most ocean forecasting systems rely heavily on the assimilation of surface fields such as sea surface height anomaly (SSHA) or sea surface temperature (SST) acquired by environmental satellites, which can accurately constrain the major surface current systems and their mesoscale activity. An integrated approach is proposed here in which the large-scale circulation in the Atlantic Ocean is modeled in a first step and gradually nested into higher-resolution regional models able to resolve important processes such as the Brazil Current and its associated mesoscale variability, continental shelf waves, and local and remote wind forcing. This article presents the overall strategy for developing the models through a network of Brazilian institutions and their related expertise, along with international collaboration. This work shares goals with the international project Global Ocean Data Assimilation Experiment OceanView (GODAE OceanView).


2018 ◽  
Vol 23 (suppl_1) ◽  
pp. e16-e16
Author(s):  
Ahmed Moussa ◽  
Audrey Larone-Juneau ◽  
Laura Fazilleau ◽  
Marie-Eve Rochon ◽  
Justine Giroux ◽  
...  

Abstract BACKGROUND Transitions to new healthcare environments can negatively impact patient care and threaten patient safety. Immersive in situ simulation conducted in newly constructed single-family room (SFR) Neonatal Intensive Care Units (NICUs) prior to occupancy has been shown to be effective in testing new environments and identifying latent safety threats (LSTs). These simulations overlay human factors to identify LSTs as new and existing processes and systems are implemented in the new environment. OBJECTIVES We aimed to demonstrate that large-scale, immersive, in situ simulation prior to the transition to a new SFR NICU improves: 1) systems readiness, 2) staff preparedness, 3) patient safety, 4) staff comfort with simulation, and 5) staff attitude towards culture change. DESIGN/METHODS Multidisciplinary teams of neonatal healthcare providers (HCPs) and parents of former NICU patients participated in large-scale, immersive in-situ simulations conducted in the new NICU prior to occupancy. One eighth of the NICU was outfitted with equipment and mannequins, and staff performed in their native roles. Multidisciplinary debriefings, which included parents, were conducted immediately after the simulations to identify LSTs. Through an iterative process, issues were resolved and additional simulations were conducted. Debriefings were documented, transcripts were produced, and LSTs were classified using qualitative methods. To assess systems readiness and staff preparedness for the transition into the new NICU, HCPs completed surveys prior to transition, post-simulation, and post-transition. Systems readiness and staff preparedness were rated on a 5-point Likert scale. Average survey responses were analyzed using dependent-samples t-tests and repeated-measures ANOVAs. RESULTS One hundred and eight HCPs and 24 parents participated in six half-day simulation sessions.
A total of 75 LSTs were identified and categorized into eight themes: 1) work organization, 2) orientation and parent wayfinding, 3) communication devices/systems, 4) nursing and resuscitation equipment, 5) ergonomics, 6) parent comfort, 7) work processes, and 8) interdepartmental interactions. Prior to the transition to the new NICU, 76% of the LSTs were resolved. Survey response rates were 31%, 16%, and 7% for the baseline, post-simulation, and post-move surveys, respectively. Systems readiness at baseline was 1.3/5; post-simulation it was 3.5/5 (p = 0.0001) and post-transition 3.9/5 (p = 0.02). Staff preparedness at baseline was 1.4/5; post-simulation it was 3.3/5 (p = 0.006) and post-transition 3.9/5 (p = 0.03). CONCLUSION Large-scale, immersive in situ simulation is a feasible and effective methodology for identifying LSTs and improving systems readiness and staff preparedness in a new SFR NICU prior to occupancy. However, to optimize patient safety, identified LSTs must be mitigated prior to occupancy. Coordinating large-scale simulations is worth the time and cost investment necessary to optimize systems and ensure patient safety prior to the transition to a new SFR NICU.
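The survey comparisons above rely on dependent-samples (paired) t-tests. A minimal stdlib-only sketch of the statistic, using hypothetical Likert scores rather than the study's data:

```python
import math

def paired_t_statistic(before, after):
    """t = mean(d) / (sd(d) / sqrt(n)) for paired differences d = after - before."""
    n = len(before)
    d = [a - b for a, b in zip(after, before)]
    mean_d = sum(d) / n
    var_d = sum((x - mean_d) ** 2 for x in d) / (n - 1)  # sample variance
    return mean_d / math.sqrt(var_d / n)

# Hypothetical 5-point Likert readiness scores for five respondents
baseline = [1, 2, 1, 1, 2]
post_sim = [3, 4, 3, 4, 3]
t = paired_t_statistic(baseline, post_sim)  # positive: readiness increased
```

The p-value then comes from the t distribution with n − 1 degrees of freedom (e.g. via `scipy.stats.ttest_rel`, which computes both the statistic and the p-value in one call).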


2019 ◽  
Vol 491 (4) ◽  
pp. 5595-5620 ◽  
Author(s):  
Sanson T S Poon ◽  
Richard P Nelson ◽  
Seth A Jacobson ◽  
Alessandro Morbidelli

ABSTRACT NASA’s Kepler mission discovered ∼700 planets in multiplanet systems containing three or more transiting bodies, many of which are super-Earths and mini-Neptunes in compact configurations. Using N-body simulations, we examine the in situ, final-stage assembly of multiplanet systems via the collisional accretion of protoplanets. Our initial conditions are constructed using a subset of the Kepler five-planet systems as templates, and two different prescriptions for treating planetary collisions are adopted. The simulations address numerous questions: do the results depend on the accretion prescription? Do the resulting systems resemble the Kepler systems, and do they reproduce the observed distribution of planetary multiplicities when synthetically observed? Do collisions lead to significant modification of protoplanet compositions, or to stripping of gaseous envelopes? Do the eccentricity distributions agree with those inferred for the Kepler planets? We find that the accretion prescription is unimportant in determining the outcomes. The final planetary systems look broadly similar to the Kepler templates adopted, but the observed distributions of planetary multiplicities and eccentricities are not reproduced, because scattering does not excite the systems sufficiently. In addition, we find that ∼1 per cent of our final systems contain a co-orbital planet pair in horseshoe or tadpole orbits. Post-processing the collision outcomes suggests that they would not significantly change the ice fractions of initially ice-rich protoplanets, but significant stripping of gaseous envelopes appears likely. Hence, it may be difficult to reconcile the observation that many low-mass Kepler planets have H/He envelopes with an in situ formation scenario that involves giant impacts after dispersal of the gas disc.
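Synthetically "observing" a simulated system amounts to asking how many planets transit for a random observer. Under the usual geometric approximation, a planet on a circular orbit of semi-major axis a around a star of radius R★ transits when |cos i| < R★/a. A toy Monte Carlo sketch with a perfectly coplanar system and illustrative orbits (not the paper's templates or its observation pipeline):

```python
import random

R_STAR_AU = 0.00465  # solar radius in AU

def observed_multiplicity(semi_major_axes_au, cos_i_observer):
    """Number of coplanar planets that transit for a given observer inclination."""
    return sum(1 for a in semi_major_axes_au
               if abs(cos_i_observer) < R_STAR_AU / a)

def multiplicity_distribution(semi_major_axes_au, n_trials=100000, seed=1):
    """Histogram of transiting multiplicities over random observer directions."""
    rng = random.Random(seed)
    counts = {}
    for _ in range(n_trials):
        m = observed_multiplicity(semi_major_axes_au, rng.uniform(-1.0, 1.0))
        counts[m] = counts.get(m, 0) + 1
    return counts

# A hypothetical compact five-planet system (semi-major axes in AU)
counts = multiplicity_distribution([0.05, 0.08, 0.12, 0.18, 0.27])
```

Even for a flat system, most observer directions see no transits at all, which is why mutual inclinations (set here to zero for simplicity) dominate the shape of the observed multiplicity distribution.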


Sensors ◽  
2021 ◽  
Vol 21 (8) ◽  
pp. 2830
Author(s):  
Sili Wang ◽  
Mark P. Panning ◽  
Steven D. Vance ◽  
Wenzhan Song

Locating underground microseismic events is important for monitoring subsurface activity and understanding planetary subsurface evolution. Because of bandwidth limitations, especially in applications involving planetary-scale distributed sensor networks, the network should be designed to perform the localization algorithm in-situ, so that only the source location needs to be transmitted rather than the raw data. In this paper, we propose a decentralized Gaussian beam time-reverse imaging (GB-TRI) algorithm that can be incorporated into distributed sensors to detect and locate underground microseismic events with reduced usage of the computational resources and communication bandwidth of the network. After the in-situ distributed computation, the final real-time location result is generated and delivered. We used a real-time simulation platform to test the performance of the system, and we evaluated the stability and accuracy of the proposed GB-TRI localization algorithm through extensive experiments and tests.
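Full GB-TRI is beyond the scope of an abstract, but the underlying idea of locating a source by back-projecting arrivals can be illustrated with a toy grid search under a homogeneous-velocity assumption: the estimated source is the grid point whose predicted travel times best align the observed arrival times. This sketch is not the paper's algorithm, and all geometry below is hypothetical.

```python
import math

def locate(sensors, arrival_times, grid, velocity):
    """Pick the grid point minimizing the spread of back-propagated origin times.

    sensors       : list of (x, y) sensor positions
    arrival_times : observed arrival time at each sensor
    grid          : candidate (x, y) source positions
    velocity      : assumed homogeneous wave speed
    """
    best, best_score = None, float("inf")
    for gx, gy in grid:
        # Origin time implied by each sensor if the source were at (gx, gy)
        origins = [t - math.hypot(gx - sx, gy - sy) / velocity
                   for (sx, sy), t in zip(sensors, arrival_times)]
        mean = sum(origins) / len(origins)
        score = sum((o - mean) ** 2 for o in origins)  # variance-like misfit
        if score < best_score:
            best, best_score = (gx, gy), score
    return best

# Toy setup: true source at (3, 4), wave speed 2 units/s, four corner sensors
sensors = [(0, 0), (10, 0), (0, 10), (10, 10)]
true_src = (3, 4)
times = [math.hypot(true_src[0] - sx, true_src[1] - sy) / 2.0 for sx, sy in sensors]
grid = [(x, y) for x in range(11) for y in range(11)]
src = locate(sensors, times, grid, 2.0)  # recovers (3, 4)
```

In a decentralized design, each sensor would evaluate only part of this search and exchange compact partial results, so that only the final location, not the waveforms, leaves the network.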

