Ensemble generation of regional ocean physics and biogeochemical model uncertainties, empirical consistency and suitability for probabilistic forecasting

Author(s):  
Vassilios Vervatis ◽  
Pierre De Mey-Frémaux ◽  
Bénédicte Lemieux-Dudon ◽  
John Karagiorgos ◽  
Nadia Ayoub ◽  
...  

The study builds upon two Copernicus Marine projects, SCRUM and SCRUM2, which focus on operational ensemble forecasting capabilities to better serve coastal downscaling. Both projects provided coupled physics–biogeochemistry ensemble generation approaches and tools to strengthen CMEMS in the areas of ocean uncertainty modelling, empirical ensemble consistency and data assimilation, including methods to assess the suitability of ensembles for probabilistic forecasting. The study is conducted by performing short- to medium-range ensemble experiments in the Bay of Biscay, a subdomain of the IBI-MFC. Ensembles were generated using ocean stochastic modelling and by incorporating an atmospheric ensemble. Sentinel-3A data from CMEMS TACs and arrays were considered for empirical consistency, using innovation statistics and approaches that take correlated observations into account. Finally, several properties of the ensembles were estimated as components of known probabilistic skill scores: the Brier score (BS) and the continuous ranked probability score (CRPS). This was done for pseudo-observations (Quasi-Reliable test bed) and for real verifying observations in a coastal upwelling test case.
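The two skill scores named above can be illustrated numerically. The following is a minimal sketch, not the study's implementation; the event threshold, ensemble values and verifying observation are invented for illustration. It computes the Brier score for a binary event and the ensemble CRPS in its kernel form.

```python
import numpy as np

def brier_score(p, o):
    """Mean squared difference between forecast probabilities p
    and binary outcomes o (1 if the event occurred)."""
    p, o = np.asarray(p, float), np.asarray(o, float)
    return float(np.mean((p - o) ** 2))

def crps_ensemble(members, obs):
    """CRPS of one verifying observation against an ensemble,
    via the kernel form E|X - y| - 0.5 E|X - X'|."""
    x = np.asarray(members, float)
    term1 = np.mean(np.abs(x - obs))
    term2 = 0.5 * np.mean(np.abs(x[:, None] - x[None, :]))
    return float(term1 - term2)

# Invented example: event = "anomaly exceeds 0.5";
# forecast probability = fraction of members above the threshold.
ens = np.array([0.2, 0.4, 0.6, 0.7, 0.9])
p_event = float(np.mean(ens > 0.5))
print(brier_score([p_event], [1.0]))        # ≈ 0.16
print(crps_ensemble(ens, 0.55))             # ≈ 0.074
```

The kernel form of the CRPS is convenient for small ensembles because it avoids constructing the empirical CDF explicitly.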

2009 ◽  
Vol 6 (1) ◽  
pp. 85-102 ◽  
Author(s):  
G. Fischer ◽  
G. Karakaş

Abstract. The flux of materials to the deep sea is dominated by larger, organic-rich particles with sinking rates varying between a few meters and several hundred meters per day. Mineral ballast may regulate the transfer of organic matter and other components by determining the sinking rates, e.g. via particle density. We calculated particle sinking rates from mass flux patterns and alkenone measurements, applying the results of sediment trap experiments from the Atlantic Ocean. We found indications of higher particle sinking rates in carbonate-dominated production systems when considering both regional and seasonal data. During a summer coccolithophorid bloom in the Cape Blanc coastal upwelling off Mauritania, particle sinking rates reached almost 570 m per day, most probably due to the fast sedimentation of densely packed zooplankton fecal pellets, which transport high amounts of organic carbon associated with coccoliths to the deep ocean despite rather low production. During the recurring winter-spring blooms off NW Africa and in opal-rich production systems of the Southern Ocean, sinking rates of larger particles, most probably diatom aggregates, showed a tendency towards lower values. However, there is no straightforward relationship between carbonate content and particle sinking rates. This could be due to the unknown composition of the carbonate and/or the influence of particle size and shape on sinking rates. It also remains noticeable that the highest sinking rates occurred in dust-rich ocean regions off NW Africa, but this issue deserves further detailed field and laboratory investigation. We obtained increasing sinking rates with depth. By using a seven-compartment biogeochemical model, we showed that the deep ocean organic carbon flux at a mesotrophic sediment trap site off Cape Blanc can be captured fairly well using seasonally variable particle sinking rates.
Our model provides a total organic carbon flux of 0.29 Tg per year down to 3000 m off the NW African upwelling region between 5 and 35° N. Simple parameterisations of remineralisation and sinking rates in such models, however, limit their capability in reproducing the flux variation in the water column.
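Estimating sinking rates from flux patterns rests on a simple kinematic idea: an average rate follows from the trap depth divided by the lag between the surface bloom and the flux peak recorded at depth. A minimal sketch with invented numbers (not the paper's data):

```python
def sinking_rate(trap_depth_m, lag_days):
    """Average particle sinking rate (m per day) implied by the lag
    between a surface bloom and the flux peak in a trap at depth."""
    return trap_depth_m / lag_days

# Hypothetical example: a trap at 3000 m recording its flux peak
# ~7 days after the surface bloom implies a rate of the same order
# as the fast fecal-pellet export events described above.
print(sinking_rate(3000.0, 7.0))  # ≈ 428.6 m per day
```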


Author(s):  
Robert Dell ◽  
Runar Unnthorsson ◽  
C. S. Wei ◽  
William Foley

In small-scale power generation scenarios in industrial or remote settings, providing a viable electrical supply for security and monitoring systems is often problematic due to the variability of the energy sources and the instability of the power generated. These small-scale systems lack the advantages of a larger power grid, so peak power requirements can exceed the generator's capacity, necessitating energy storage such as batteries. The authors have developed and documented a reliable thermoelectric generator and a test bed. The generator was combined with a battery in order to meet peak power requirements beyond the unassisted range of the generator. This paper presents a test case in which the thermoelectric generator powers a complete web-accessible mobile robot system. The robot system can be used for monitoring, physical manipulation of the environment, routine maintenance, and emergencies.


2015 ◽  
Vol 30 (6) ◽  
pp. 1551-1570 ◽  
Author(s):  
Christopher D. Karstens ◽  
Greg Stumpf ◽  
Chen Ling ◽  
Lesheng Hua ◽  
Darrel Kingfield ◽  
...  

Abstract A proposed new method for hazard identification and prediction was evaluated with forecasters in the National Oceanic and Atmospheric Administration Hazardous Weather Testbed during 2014. This method combines hazard-following objects with forecaster-issued trends of exceedance probabilities to produce probabilistic hazard information, as opposed to the static, deterministic polygon and attendant text product methodology presently employed by the National Weather Service to issue severe thunderstorm and tornado warnings. Three components of the test bed activities are discussed: usage of the new tools, verification of storm-based warnings and probabilistic forecasts from a control–test experiment, and subjective feedback on the proposed paradigm change. Forecasters were able to quickly adapt to the new tools and concepts and ultimately produced probabilistic hazard information in a timely manner. The probabilistic forecasts from two severe hail events tested in a control–test experiment were more skillful than storm-based warnings and were found to have reliability in the low-probability spectrum. False alarm area decreased while the traditional verification metrics degraded with increasing probability thresholds. The latter finding is attributable to a limitation in applying the current verification methodology to probabilistic forecasts. Relaxation of on-the-fence decisions exposed a need to provide information for hazard areas below the decision-point thresholds of current warnings. Automated guidance information was helpful in combating potential workload issues, and forecasters raised a need for improved guidance and training to inform consistent and reliable forecasts.


2006 ◽  
Vol 3 (3) ◽  
pp. 251-269 ◽  
Author(s):  
X. Giraud

Abstract. A regional biogeochemical model is applied to the NW African coastal upwelling between 19° N and 27° N to investigate how a water temperature proxy, alkenones, is produced at the sea surface and recorded in the slope sediments. The biogeochemical model has two phytoplankton groups: an alkenone producer group, considered to be coccolithophores, and a group comprising other phytoplankton. The Regional Ocean Modelling System (ROMS) is used to simulate the ocean circulation and takes advantage of the Adaptive Grid Refinement in Fortran (AGRIF) package to set up an embedded gridding system. In the simulations the alkenone temperature records in the sediments are between 1.1 and 2.3°C colder than the annual mean SSTs. Despite the seasonality of the coccolithophore production, this temperature difference is not mainly due to a seasonal bias, nor to the lateral advection of phytoplankton and phytodetritus seaward from the cold near-shore waters, but to the production depth of the coccolithophores. If core-top alkenone temperatures are effectively recording the annual mean SSTs, the amount of alkenone produced must vary among the coccolithophores in the water column and depend on physiological factors (e.g. growth rate, nutrient stress).


2012 ◽  
Vol 225 ◽  
pp. 115-126 ◽  
Author(s):  
C.P. McDonald ◽  
V. Bennington ◽  
N.R. Urban ◽  
G.A. McKinley

2007 ◽  
Vol 22 (5) ◽  
pp. 1076-1088 ◽  
Author(s):  
Christopher A. T. Ferro

Abstract This article considers the Brier score for verifying ensemble-based probabilistic forecasts of binary events. New estimators for the effect of ensemble size on the expected Brier score, and associated confidence intervals, are proposed. An example with precipitation forecasts illustrates how these estimates support comparisons of the performances of competing forecasting systems with possibly different ensemble sizes.
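The ensemble-size effect discussed above can be illustrated with the "fair" Brier estimator popularized in the later verification literature; this is a sketch of the idea, not necessarily the paper's exact estimator. With i of m members forecasting the event and binary outcome o, the raw score is penalized for finite ensemble size, and subtracting an unbiasing term estimates the score an arbitrarily large ensemble from the same distribution would obtain:

```python
def brier_raw(i, m, o):
    """Brier score of the ensemble relative frequency i/m
    against binary outcome o."""
    return (i / m - o) ** 2

def brier_fair(i, m, o):
    """Raw score minus the finite-ensemble term i(m - i) / (m^2 (m - 1)),
    an unbiased estimate of the infinite-ensemble Brier score."""
    return (i / m - o) ** 2 - i * (m - i) / (m ** 2 * (m - 1))

# Invented example: 30 of 50 members forecast the event, which occurs.
print(brier_raw(30, 50, 1))   # ≈ 0.16
print(brier_fair(30, 50, 1))  # ≈ 0.155, slightly better than the raw score
```

The correction matters most for small ensembles, which is why comparing systems with different ensemble sizes on the raw score alone can be misleading.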


2012 ◽  
Vol 93 (12) ◽  
pp. 1833-1843
Author(s):  
Steven G. Decker

Calls for moving from a deterministic to a probabilistic view of weather forecasting have become increasingly urgent over recent decades, yet the primary national forecasting competition and many in-class forecasting games are wholly deterministic in nature. To counter these conflicting trends, a long-running forecasting game at Rutgers University has recently been modified to become probabilistic in nature. Students forecast high- and low-temperature intervals and probabilities of precipitation for two locations: one fixed at the Rutgers cooperative observing station, the other chosen for each forecast window to maximize difficulty. Precipitation errors are tabulated with a Brier score, while temperature errors contain a sharpness component dependent on the width of the forecast interval and an interval miss component dependent on the degree to which the verification falls within the interval. The inclusion of a probabilistic forecasting game allows for the creation of a substantial database of forecasts that can be analyzed using standard probabilistic approaches, such as reliability diagrams, relative operating characteristic curves, and histograms. Discussions of probabilistic forecast quality can be quite abstract for undergraduate students, but the use of a forecast database that students themselves help construct motivates these discussions and helps students make connections between their forecast process, their standing in class rankings, and the verification diagrams they use. Student feedback on the probabilistic game is also discussed.
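The temperature scoring described above, a sharpness term plus an interval-miss term, resembles the standard Winkler interval score. The class's exact weighting may differ, so the following is only an illustrative formalization, with an invented weight parameter:

```python
def interval_score(lo, hi, obs, alpha=0.2):
    """Winkler-style interval score: the interval width (sharpness)
    plus a (2/alpha)-weighted penalty for the distance by which
    the verifying observation falls outside the interval."""
    width = hi - lo
    miss = max(lo - obs, 0.0) + max(obs - hi, 0.0)
    return width + (2.0 / alpha) * miss

# Forecast interval 10-15 °C:
print(interval_score(10.0, 15.0, 12.0))  # hit: score = width = 5.0
print(interval_score(10.0, 15.0, 17.0))  # miss by 2 °C: ≈ 25.0
```

Lower scores are better: a narrow interval is rewarded only if the verification actually falls inside it, which captures the sharpness-versus-miss trade-off the students face.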


2021 ◽  
Vol 893 (1) ◽  
pp. 012037
Author(s):  
F Lubis ◽  
I J A Saragih

Abstract The onset of the rainy season is one of the forecast products issued regularly by the Indonesian Agency of Meteorology, Climatology, and Geophysics (BMKG), with deterministic information about the month in which the initial 10-day period (dasarian) of the rainy season will occur in each designated area. On the other hand, the state of the art in seasonal forecasting suggests that probabilistic forecast products are potentially better for decision making. A probabilistic forecast is also more suitable for Indonesia because of the large rainfall variability that adds to the uncertainty in climate model simulations, besides complex geographical factors. The research aims to determine the onset of the rainy season and monsoon over Java Island based on rainfall prediction by Constructed Analogue statistical downscaling of CFSv2 (Climate Forecast System version 2) model output. This research attempted to develop a method to produce a probabilistic forecast of the onset of the rainy season, as well as monsoon onset, by utilizing the freely available seasonal model output of CFSv2 operated by the US National Oceanic and Atmospheric Administration (NOAA). In this case, the output of the global model is downscaled using the modified Constructed Analogue (CA) method with an observational rainfall database from 26 BMKG stations and the TRMM 3B43 gridded dataset. This method was then applied to perform hindcasts using CFS-R (re-forecast) for the 2011-2014 period. The results show that downscaled CFS predictions initialized in September (lead-1) give sufficient accuracy, while those initialized in August (lead-2) have large errors for both onsets of the rainy season and monsoon. Further analysis of forecast skill using the Brier score indicates that the CA scheme used in this study showed good performance in predicting the onset of the rainy season, with a skill score in the range of 0.2.
The probabilistic skill scores indicate that the prediction for East Java is better than the West- and Central-Java regions. It is also found that the results of CA downscaling can capture year-to-year variations, including delays in the onset of the rainy season.


2021 ◽  
Author(s):  
Zouhair Lachkar ◽  
Michael Mehari ◽  
Alain De Verneil ◽  
Marina Lévy ◽  
Shafer Smith

Recent observations and modeling evidence indicate that the Arabian Sea (AS) is a net source of carbon to the atmosphere. Yet, the interannual variability modulating the air-sea CO₂ fluxes in the region, as well as their long-term trends, remain poorly known. Furthermore, while the rising atmospheric concentration of CO₂ is causing surface ocean pH to drop globally, little is known about local and regional acidification trends in the AS, a region hosting a major coastal upwelling system naturally prone to relatively low surface pH. Here, we simulate the evolution of air-sea CO₂ fluxes and reconstruct the progression of ocean acidification in the AS from 1982 through 2019 using an eddy-resolving ocean biogeochemical model covering the full Indian Ocean and forced with observation-based winds and heat and freshwater fluxes. Additionally, using a set of sensitivity simulations that vary in terms of atmospheric CO₂ levels and physical forcing, we quantify the variability of fluxes associated with both natural and anthropogenic CO₂ and disentangle the contributions of climate variability and that of atmospheric CO₂ concentrations to the long-term trends in air-sea CO₂ fluxes and acidification. Our analysis reveals a strong variability in the air-sea CO₂ fluxes and pH on a multitude of timescales ranging from the intra-seasonal to the decadal. Furthermore, a strong progression of ocean acidification with an important penetration into the thermocline is simulated locally near the upwelling regions. Our analysis also indicates that in addition to the increasing anthropogenic CO₂ concentrations in the atmosphere, recent warming and monsoon wind changes have substantially modulated these trends regionally.


2010 ◽  
Vol 138 (1) ◽  
pp. 203-211 ◽  
Author(s):  
Riccardo Benedetti

Abstract The problem of probabilistic forecast verification is approached from a theoretical point of view, starting from three basic desiderata: additivity, exclusive dependence on physical observations ("locality"), and strictly proper behavior. By imposing such requirements and only using elementary mathematics, a univocal measure of forecast goodness is demonstrated to exist. This measure is the logarithmic score, based on the relative entropy between the observed occurrence frequencies and the predicted probabilities for the forecast events. Information theory is then used as a guide to choose the scoring-scale offset for obtaining meaningful and fair skill scores. Finally the Brier score is assessed and, for single-event forecasts, its equivalence to the second-order approximation of the logarithmic score is shown. The larger part of the presented results is far from new or original; nevertheless, their use still meets with some resistance in the weather forecasting community. This paper aims at providing a clear presentation of the main arguments for using the logarithmic score.
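The second-order connection between the logarithmic score and the Brier score can be sketched via the relative entropy on which the logarithmic score is built. This is an illustrative expansion, not the paper's exact derivation. With observed frequency $f$ and forecast probability $p$ for a single binary event,

```latex
\[
  D(f \,\|\, p) \;=\; f \ln\frac{f}{p} \;+\; (1-f)\ln\frac{1-f}{1-p} .
\]
% D vanishes at p = f with zero first derivative, and
% D''(p)|_{p=f} = 1/f + 1/(1-f) = 1/(f(1-f)), so to second order in (p - f):
\[
  D(f \,\|\, p) \;\approx\; \frac{(p-f)^2}{2\,f(1-f)} \;+\; O\!\bigl((p-f)^3\bigr).
\]
```

To second order, the logarithmic score therefore penalizes the same squared error $(p-f)^2$ as the Brier score, up to a frequency-dependent scale factor.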

