In situ SST Quality Monitor (iQuam)

2014 ◽  
Vol 31 (1) ◽  
pp. 164-180 ◽  
Author(s):  
Feng Xu ◽  
Alexander Ignatov

Abstract The quality of in situ sea surface temperatures (SSTs) is critical for calibration and validation of satellite SSTs. In situ SSTs come from different countries, agencies, and platforms. As a result, their quality is often suboptimal, nonuniform, and measurement-type specific. This paper describes a system developed at the National Oceanic and Atmospheric Administration (NOAA), the in situ SST Quality Monitor (iQuam; www.star.nesdis.noaa.gov/sod/sst/iquam/). It performs three major functions with the Global Telecommunication System (GTS) data: 1) quality controls (QC) in situ SSTs, using Bayesian reference and buddy checks similar to those adopted in the Met Office, in addition to providing basic screenings, such as duplicate removal, plausibility, platform track, and SST spike checks; 2) monitors quality-controlled SSTs online, in near–real time; and 3) serves reformatted GTS SST data to NOAA and external users with quality flags appended. Currently, iQuam’s web page displays global monthly maps of measurement locations stratified by four in situ platform types (drifters, ships, and tropical and coastal moorings) as well as their corresponding “in situ minus reference” SST statistics. Time series of all corresponding SST and QC statistics are also trended. The web page user can also monitor individual in situ platforms. The current status of iQuam and ongoing improvements are discussed.
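Two of the basic screenings the abstract names, the plausibility and SST spike checks, can be sketched in a few lines. This is an illustrative sketch only; the thresholds and logic are assumptions for demonstration, not iQuam's actual values.

```python
# Hypothetical sketch of two basic iQuam-style screening checks:
# a plausibility check (SST within physically possible bounds) and a
# simple spike check (a single point far from both neighbors).
# Thresholds are illustrative assumptions, not iQuam's actual values.

def plausibility_check(sst_c, lo=-2.0, hi=45.0):
    """Return True for SSTs inside a plausible ocean range (deg C)."""
    return [lo <= t <= hi for t in sst_c]

def spike_check(sst_c, max_jump=3.0):
    """Return False for points differing from both neighbors by > max_jump."""
    flags = [True] * len(sst_c)
    for i in range(1, len(sst_c) - 1):
        if (abs(sst_c[i] - sst_c[i - 1]) > max_jump and
                abs(sst_c[i] - sst_c[i + 1]) > max_jump):
            flags[i] = False
    return flags

track = [15.2, 15.4, 25.9, 15.5, 15.3]   # one spurious spike
print(plausibility_check(track))
print(spike_check(track))
```

In practice these simple screens run before the statistically heavier Bayesian reference and buddy checks mentioned in the abstract.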

Author(s):  
Satinder Kaur ◽  
Sunil Gupta

Information plays a very important role in life, and nowadays the world largely depends on the World Wide Web to obtain it. The Web comprises websites of every discipline, and websites consist of web pages interlinked with each other by hyperlinks. The success of a website largely depends on the design of its web pages. Researchers have done considerable work to appraise web pages quantitatively. Keeping in mind the importance of the design aspects of a web page, this paper presents an automated evaluation tool that evaluates these aspects for any web page. The tool takes the HTML code of the web page as input, then extracts the HTML tags and checks them for uniformity. The tool comprises normalized modules that quantify the measures of the design aspects. As a demonstration, the tool has been applied to four web pages from distinct sites, and their design aspects are reported for comparison. The tool offers several advantages to web developers, who can predict the design quality of web pages and enhance it before and after implementation of a website, without user interaction.
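The paper's actual modules are not reproduced here, but the tag-uniformity idea it describes, parsing a page's HTML and flagging tags that are opened but never closed, can be sketched with Python's standard `html.parser`. Everything below is an illustrative assumption about one such check, not the tool itself.

```python
# Illustrative sketch of a tag-uniformity check: parse HTML and report
# tags opened but never closed. The real tool's modules and metrics are
# not public; this only demonstrates the idea with html.parser.
from html.parser import HTMLParser

VOID_TAGS = {"br", "img", "meta", "link", "hr", "input"}  # never closed

class TagBalanceChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.open_counts = {}

    def handle_starttag(self, tag, attrs):
        if tag not in VOID_TAGS:
            self.open_counts[tag] = self.open_counts.get(tag, 0) + 1

    def handle_endtag(self, tag):
        self.open_counts[tag] = self.open_counts.get(tag, 0) - 1

checker = TagBalanceChecker()
checker.feed("<html><body><p>Hello<p>World</p></body></html>")
unbalanced = {t: n for t, n in checker.open_counts.items() if n != 0}
print(unbalanced)  # the second <p> is never closed
```

A normalized module as described in the abstract would presumably turn counts like these into a bounded score that can be compared across pages.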


2020 ◽  
Vol 12 (16) ◽  
pp. 2642
Author(s):  
Stelios Mertikas ◽  
Achilleas Tripolitsiotis ◽  
Craig Donlon ◽  
Constantin Mavrocordatos ◽  
Pierre Féménias ◽  
...  

This work presents the latest calibration results for the Copernicus Sentinel-3A and -3B and the Jason-3 radar altimeters as determined by the Permanent Facility for Altimetry Calibration (PFAC) in west Crete, Greece. Radar altimeters are used to provide operational measurements for sea surface height, significant wave height and wind speed over oceans. To maintain Fiducial Reference Measurement (FRM) status, the stability and quality of altimetry products need to be continuously monitored throughout the operational phase of each altimeter. External and independent calibration and validation facilities provide an objective assessment of the altimeter’s performance by comparing satellite observations with ground-truth and in-situ measurements and infrastructures. Three independent methods are employed in the PFAC: Range calibration using a transponder, sea-surface calibration relying upon sea-surface Cal/Val sites, and crossover analysis. Procedures to determine FRM uncertainties for Cal/Val results have been demonstrated for each calibration. Biases for Sentinel-3A Passes No. 14, 278 and 335, Sentinel-3B Passes No. 14, 71 and 335, as well as for Jason-3 Passes No. 18 and No. 109 are given. Diverse calibration results by various techniques, infrastructure and settings are presented. Finally, upgrades to the PFAC in support of the Copernicus Sentinel-6 ‘Michael Freilich’, due to launch in November 2020, are summarized.
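The sea-surface Cal/Val method named above reduces, at its core, to a mean difference between satellite and ground-truth sea-surface heights at the site. The sketch below shows only that final step with invented numbers; real PFAC processing applies geophysical corrections and uncertainty budgets first.

```python
# Minimal sketch of the sea-surface Cal/Val idea: estimate the altimeter
# bias as the mean difference between satellite SSH and concurrent
# ground-truth SSH at the Cal/Val site. Values are invented; real
# processing applies geophysical corrections beforehand.
def altimeter_bias_mm(satellite_ssh_m, ground_truth_ssh_m):
    """Mean (satellite - ground truth) difference, in millimeters."""
    diffs = [s - g for s, g in zip(satellite_ssh_m, ground_truth_ssh_m)]
    return 1000.0 * sum(diffs) / len(diffs)

sat    = [1.031, 1.045, 1.027, 1.050]   # m, per overflight
ground = [1.025, 1.041, 1.020, 1.046]   # m, per overflight
print(f"bias: {altimeter_bias_mm(sat, ground):+.2f} mm")
```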


2009 ◽  
Vol 66 (7) ◽  
pp. 1467-1479 ◽  
Author(s):  
Sarah L. Hughes ◽  
N. Penny Holliday ◽  
Eugene Colbourne ◽  
Vladimir Ozhigin ◽  
Hedinn Valdimarsson ◽  
...  

Abstract Hughes, S. L., Holliday, N. P., Colbourne, E., Ozhigin, V., Valdimarsson, H., Østerhus, S., and Wiltshire, K. 2009. Comparison of in situ time-series of temperature with gridded sea surface temperature datasets in the North Atlantic. – ICES Journal of Marine Science, 66: 1467–1479. Analysis of the effects of climate variability and climate change on the marine ecosystem is difficult in regions where long-term observations of ocean temperature are sparse or unavailable. Gridded sea surface temperature (SST) products, based on a combination of satellite and in situ observations, can be used to examine variability and long-term trends because they provide better spatial coverage than the limited sets of long in situ time-series. SST data from three gridded products (Reynolds/NCEP OISST.v2., Reynolds ERSST.v3, and the Hadley Centre HadISST1) are compared with long time-series of in situ measurements from ICES standard sections in the North Atlantic and Nordic Seas. The variability and trends derived from the two data sources are examined, and the usefulness of the products as a proxy for subsurface conditions is discussed.
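The comparison the abstract describes boils down to correlating a long in situ series with a co-located series sampled from a gridded product and comparing their linear trends. The sketch below uses invented annual values purely to show the two statistics; it is not the paper's analysis.

```python
# Minimal sketch of the comparison: correlation and linear trend of an
# in situ annual SST series versus a co-located gridded-product series.
# All data below are invented for illustration.

def linear_trend(years, values):
    """Least-squares slope, in degrees per year."""
    n = len(years)
    my, mv = sum(years) / n, sum(values) / n
    num = sum((y - my) * (v - mv) for y, v in zip(years, values))
    den = sum((y - my) ** 2 for y in years)
    return num / den

def correlation(a, b):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

years = list(range(2000, 2010))
insitu  = [10.0, 10.1, 10.3, 10.2, 10.4, 10.5, 10.4, 10.6, 10.7, 10.8]
gridded = [10.1, 10.2, 10.3, 10.3, 10.4, 10.6, 10.5, 10.6, 10.8, 10.9]
print(f"in situ trend: {linear_trend(years, insitu):.3f} deg/yr")
print(f"correlation:   {correlation(insitu, gridded):.3f}")
```

A high correlation with consistent trends is what justifies using the gridded product as a proxy where long in situ records are missing.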


2020 ◽  
Author(s):  
Cunmin Guo ◽  
Weihua Fang

Strong winds over the sea surface induced by tropical cyclones (TCs) in the Northwest Pacific (NWP) basin pose great threats to maritime activities, and quantitative assessment of their hazard intensity is of great importance. In the past, most studies focused on modeling winds over land and major island areas, numerically or statistically. However, there has been no systematic assessment of TC wind hazard over the NWP basin with long-term wind time series based on wind-field modeling of historical TC events. In this study, the footprints of historical TC events during 1949~2019 were modeled with parametric models developed in previous studies, which simulate the winds of both the gradient layer and the planetary boundary layer. The historical TC track data were obtained from the China Meteorological Administration, and wind records from the Global Telecommunication System (GTS) were used for calibration and validation of the models. The spatial resolution of the model output is 1 km for winds over the sea surface. To reflect wind speed heterogeneity over the land of small islands, wind speeds there were modeled at 90-m resolution by considering local terrain effects and roughness heights, derived from 90-m SRTM DEM data and 30-m land-use data. Based on the simulated wind footprints of the 2384 TC events during 1949~2019, the relationship between wind intensity and frequency at each model pixel was analyzed and fitted with the Generalized Extreme Value (GEV) distribution. A series of wind hazard maps was produced, including wind speeds for return periods of 5, 10, 20, 50 and 100 years and the exceedance probabilities of wind scales from 10 to 17. These wind hazard maps are useful for the management of TC disaster risks in the NWP basin.
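The hazard-curve step, fitting an extreme-value distribution to per-pixel wind maxima and reading off return levels, can be sketched compactly. For simplicity the sketch uses the Gumbel distribution (the GEV with zero shape parameter) fitted by the method of moments, whereas the study fits the full three-parameter GEV; the wind speeds are invented.

```python
# Hedged sketch of the hazard-curve step: fit an extreme-value
# distribution to annual-maximum wind speeds at one pixel and compute
# return levels. Uses the Gumbel special case of the GEV, fitted by the
# method of moments; the study fits the full GEV. Data are invented.
import math

def gumbel_fit(annual_maxima):
    """Method-of-moments Gumbel parameters (location mu, scale beta)."""
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    var = sum((x - mean) ** 2 for x in annual_maxima) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi
    mu = mean - 0.5772156649 * beta   # Euler-Mascheroni constant
    return mu, beta

def return_level(mu, beta, period_years):
    """Wind speed exceeded on average once per `period_years` years."""
    p = 1.0 - 1.0 / period_years      # annual non-exceedance probability
    return mu - beta * math.log(-math.log(p))

maxima = [28.0, 31.5, 26.2, 35.0, 29.8, 33.1, 27.5, 38.2, 30.4, 32.6]
mu, beta = gumbel_fit(maxima)
for T in (5, 10, 20, 50, 100):
    print(f"{T:>3}-yr return level: {return_level(mu, beta, T):.1f} m/s")
```

Repeating this fit at every pixel and contouring the return levels yields hazard maps of the kind the abstract describes.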


2018 ◽  
Vol 35 (2) ◽  
pp. 281-297 ◽  
Author(s):  
Jinbo Wang ◽  
Lee-Lueng Fu ◽  
Bo Qiu ◽  
Dimitris Menemenlis ◽  
J. Thomas Farrar ◽  
...  

Abstract The wavenumber spectrum of sea surface height (SSH) is an important indicator of the dynamics of the ocean interior. While the SSH wavenumber spectrum has been well studied at mesoscale wavelengths and longer, using both in situ oceanographic measurements and satellite altimetry, it remains largely unknown for wavelengths less than ~70 km. The Surface Water Ocean Topography (SWOT) satellite mission aims to resolve the SSH wavenumber spectrum at 15–150-km wavelengths, which is specified as one of the mission requirements. The mission calibration and validation (CalVal) requires the ground truth of a synoptic SSH field to resolve the targeted wavelengths, but no existing observational network is able to fulfill the task. A high-resolution global ocean simulation is used to conduct an observing system simulation experiment (OSSE) to identify the suitable oceanographic in situ measurements for SWOT SSH CalVal. After fixing 20 measuring locations (the minimum number for resolving 15–150-km wavelengths) along the SWOT swath, four instrument platforms were tested: pressure-sensor-equipped inverted echo sounders (PIES), underway conductivity–temperature–depth (UCTD) sensors, instrumented moorings, and underwater gliders. In the context of the OSSE, PIES was found to be an unsuitable tool for the target region and for SSH scales 15–70 km; the slowness of a single UCTD leads to significant aliasing by high-frequency motions at short wavelengths below ~30 km; an array of station-keeping gliders may meet the requirement; and an array of moorings is the most effective system among the four tested instruments for meeting the mission’s requirement. The results shown here warrant a prelaunch field campaign to further test the performance of station-keeping gliders.
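The quantity at the center of this abstract, a 1-D SSH wavenumber spectrum, is just a periodogram of an along-track SSH profile. The sketch below computes one for a synthetic signal (two sine waves at 128-km and 32-km wavelengths, chosen as assumptions for illustration); real CalVal would use observed or simulated SSH.

```python
# Illustrative sketch of a 1-D SSH wavenumber spectrum: the periodogram
# of an along-track SSH profile. The signal is synthetic (two sine waves
# at 128 km and 32 km); real CalVal uses observed or simulated SSH.
import numpy as np

dx_km = 2.0                                   # along-track sample spacing
x = np.arange(0, 1024) * dx_km                # 2048-km track
ssh = (0.10 * np.sin(2 * np.pi * x / 128.0)   # 10 cm at 128-km wavelength
       + 0.02 * np.sin(2 * np.pi * x / 32.0)) # 2 cm at 32-km wavelength

spec = np.abs(np.fft.rfft(ssh)) ** 2          # raw periodogram
k = np.fft.rfftfreq(x.size, d=dx_km)          # wavenumber, cycles/km

peak_wavelength_km = 1.0 / k[np.argmax(spec[1:]) + 1]  # skip the mean (k=0)
print(f"dominant wavelength: {peak_wavelength_km:.0f} km")
```

The CalVal challenge the paper addresses is obtaining a ground-truth `ssh` array dense and synoptic enough to resolve the 15–150-km band this way.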


2012 ◽  
Vol 2 (3) ◽  
pp. 172-187 ◽  
Author(s):  
J. Reinking ◽  
A. Härting ◽  
L. Bastos

Abstract With the growing global effort to estimate the influence of civilization on climate change, it would be desirable to survey sea surface heights (SSH) not only by remote-sensing techniques such as satellite altimetry or Global Navigation Satellite System (GNSS) reflectometry but also by direct, in-situ measurements in the open ocean. In recent years, different groups have attempted to determine SSH from ship-based GNSS observations. Due to recent advances in kinematic GNSS Precise Point Positioning (PPP) analysis, it is already possible to derive GNSS antenna heights with an accuracy of a few centimeters. It is therefore foreseeable that this technique will be used more intensively in the future, with obvious advantages for sea positioning. To determine the actual SSH from GNSS-derived antenna heights aboard seagoing vessels, some essential hydrostatic and hydrodynamic corrections must be considered in addition to ocean dynamics and related corrections. Systematic influences of ship dynamics were intensively analyzed, and sophisticated techniques to precisely estimate the mandatory corrections were developed at Jade University over the last decades. In this paper we describe the required analyses and demonstrate their application with a case study from an experiment carried out on a cruise vessel in the Atlantic Ocean in March 2011.
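The basic relation behind the method is simple: the instantaneous SSH follows from the GNSS-derived antenna height above the reference surface minus the antenna's height above the waterline, with hydrodynamic corrections applied. The correction terms and all numbers below are illustrative placeholders, not values from the paper.

```python
# Hedged sketch of the basic SSH relation: ellipsoidal antenna height
# minus the antenna's height above the waterline, plus hydrodynamic
# corrections (e.g. squat). All terms and values are illustrative
# placeholders, not the paper's actual correction model.
def ssh_from_gnss(antenna_ellipsoidal_height_m,
                  antenna_above_waterline_m,
                  squat_correction_m=0.0):
    """Instantaneous sea surface height above the ellipsoid, in meters."""
    return (antenna_ellipsoidal_height_m
            - antenna_above_waterline_m
            + squat_correction_m)

print(f"SSH: {ssh_from_gnss(52.80, 35.20, squat_correction_m=0.15):.2f} m")
```

The paper's contribution lies precisely in estimating terms like the waterline offset and squat with centimeter-level care, which this one-liner only gestures at.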


2021 ◽  
Author(s):  
Xavier Perrot ◽  
Jacqueline Boutin ◽  
Jean Luc Vergely ◽  
Frédéric Rouffi ◽  
Adrien Martin ◽  
...  

This study is performed in the frame of the European Space Agency (ESA) Climate Change Initiative (CCI+) for Sea Surface Salinity (SSS), which aims at generating global SSS fields from all available satellite L-band radiometer measurements over the longest possible period with great stability. By combining SSS from the Soil Moisture and Ocean Salinity (SMOS), Aquarius, and Soil Moisture Active Passive (SMAP) missions, the CCI+SSS fields (Boutin et al. 2020) are the only ones to provide a 10-year time series of satellite salinity of such quality: the global rms difference of weekly 25x25 km2 CCI+SSS with respect to in situ Argo SSS is 0.17 pss, with a correlation coefficient of 0.97 (see https://pimep.ifremer.fr/diffusion/analyses/mdb-database/GO/cci-l4-esa-merged-oi-v2.31-7dr/argo/report/pimep-mdb-report_GO_cci-l4-esa-merged-oi-v2.31-7dr_argo_20201215.pdf). Nevertheless, we found that some systematic biases remained. In this presentation, we show how they are reduced in the next CCI+SSS version.

The key satellite mission ensuring the longest time period, since 2010, at global scale is SMOS. We implemented a reprocessing of the whole SMOS dataset, changing several key points. First, we replaced the Klein and Swift (1977) dielectric constant parametrization with the new one of Boutin et al. (2020). Second, we changed the reference dataset used to perform a vicarious calibration over the southeast Pacific Ocean (the so-called Ocean Target Transformation), using Argo interpolated fields (ISAS; Gaillard et al. 2016) contemporaneous with the satellite measurements instead of the World Ocean Atlas climatology. Third, the auxiliary data (wind, SST, atmospheric parameters) used as priors in the retrieval scheme, which in the original SMOS processing come from the ECMWF forecast model, were replaced by the ERA5 reanalysis.

Our results show a quantitative improvement in the stability of the SMOS CCI+SSS with respect to in situ measurements over the whole period, as well as a decrease in the spread of the difference between SMOS and in situ salinity measurements.

Bibliography:

J. Boutin et al. (2020), Correcting Sea Surface Temperature Spurious Effects in Salinity Retrieved From Spaceborne L-Band Radiometer Measurements, IEEE Transactions on Geoscience and Remote Sensing, doi: 10.1109/TGRS.2020.3030488.

F. Gaillard et al. (2016), In Situ–Based Reanalysis of the Global Ocean Temperature and Salinity with ISAS: Variability of the Heat Content and Steric Height, Journal of Climate, vol. 29, no. 4, pp. 1305-1323, doi: 10.1175/JCLI-D-15-0028.1.

L. Klein and C. Swift (1977), An improved model for the dielectric constant of sea water at microwave frequencies, IEEE Transactions on Antennas and Propagation, vol. 25, no. 1, pp. 104-111, doi: 10.1109/JOE.1977.1145319.

Data reference:

J. Boutin et al. (2020): ESA Sea Surface Salinity Climate Change Initiative (Sea_Surface_Salinity_cci): Weekly sea surface salinity product, v2.31, for 2010 to 2019. Centre for Environmental Data Analysis. https://catalogue.ceda.ac.uk/uuid/eacb7580e1b54afeaabb0fd2b0a53828


Author(s):  
B. M. Subraya

For many years, the World Wide Web (Web) functioned quite well without any concern about the quality of performance. Designers of Web pages, as well as users, were not much worried about performance attributes. The Web, in the initial stages of its development, was primarily meant to be an information provider rather than the medium for transacting business into which it has grown. Users' expectations were likewise limited to seeking the information available on the Web. Thanks to the ever-growing population of Web surfers (now in the millions), information found on the Web underwent a dimensional change in terms of nature, content, and depth.


2004 ◽  
Vol 4 (1) ◽  
Author(s):  
David Carabantes Alarcón ◽  
Carmen García Carrión ◽  
Juan Vicente Beneit Montesinos

Abstract Quality on the Internet is of great value, all the more so for a health-related web page such as a resource on drug dependence. This article reviews the most prominent estimators and systems of web quality in order to develop a specific system for assessing the quality of web resources on drug dependence. A feasibility test was carried out by analyzing the main web pages on this subject (n=60), gathering an assessment, from the user's point of view, of the quality of the resources. Aspects for improvement were detected regarding the accuracy and reliability of the information, authorship, and the development of descriptions and assessments of external links.


2012 ◽  
Vol 601 ◽  
pp. 394-400
Author(s):  
Taeh Wan Kim ◽  
Ho Cheol Jeon ◽  
Joong Min Choi

Document similarity search retrieves a ranked list of documents similar to a query document from a text corpus or a web page on the Web. However, most previous research on similar-document search focuses on classifying documents based on their contents. To address this problem, we propose a novel retrieval approach based on undirected graphs to represent each document in the corpus. In addition, this study considers a unified graph in conjunction with multiple graphs to improve the quality of the search for similar documents. Experimental results on the Reuters-21578 data demonstrate that the proposed system performs better than the traditional approach.
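The abstract does not give the paper's exact graph construction, but the general idea of graph-based document similarity can be sketched as follows: represent each document as an undirected graph whose edges connect consecutive words, then score similarity as the Jaccard overlap of the two edge sets. This construction is an assumption for illustration, not the paper's method.

```python
# Sketch of graph-based document similarity (an assumed construction,
# not the paper's): each document becomes an undirected graph whose
# edges link consecutive words; similarity is Jaccard overlap of edges.
def edge_set(text):
    """Undirected word-adjacency edges of a document."""
    words = text.lower().split()
    return {frozenset(pair) for pair in zip(words, words[1:])
            if pair[0] != pair[1]}

def graph_similarity(doc_a, doc_b):
    """Jaccard similarity of the two documents' edge sets."""
    a, b = edge_set(doc_a), edge_set(doc_b)
    return len(a & b) / len(a | b) if a | b else 0.0

d1 = "ocean temperature data quality control"
d2 = "ocean temperature data quality monitoring"
d3 = "graph based document retrieval"
print(graph_similarity(d1, d2))   # related pair: high overlap
print(graph_similarity(d1, d3))   # unrelated pair: no overlap
```

A unified graph, as mentioned in the abstract, would presumably merge such per-document graphs so that shared structure across the corpus can inform the ranking.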

