REGIONAL NETWORK FOR COASTAL ENGINEERING DATA

1976 ◽  
Vol 1 (15) ◽  
pp. 5
Author(s):  
Richard J. Seymour ◽  
Meredith H. Sessions

The California Department of Navigation and Ocean Development (DNOD), responsible for shoreline protection within the state, was particularly aware of the lack of coastal wave statistics to support their beach erosion program. As a direct result of the 1974 ASCE-sponsored New Orleans Conference on Ocean Wave Measurement and Analysis, discussion was initiated within DNOD and then with the Scripps Institution of Oceanography (SIO) at La Jolla, on the feasibility of establishing a regional wave monitoring network for California. The initial specification presented by DNOD was for a 200-station network reporting directional wave spectra twice daily for a period of ten years. SIO ocean engineering personnel responded with a system concept employing low-cost pressure transducers hardwired to shore with a dial-up telephone data gathering link to a central station. The initial cost estimates appeared attractive when compared with Corps of Engineers experience as reported in Peacock (1974). As a result, a small program was funded in February 1975 at Scripps to demonstrate critical hardware items through the breadboard stage. With the successful completion of this work, additional funds were allocated by DNOD as matching funds for a California Sea Grant Project. The first station in the network began operation on 3 December 1975 at Imperial Beach, California. A second station was added at Ocean Beach (San Diego) on 27 March 1976, a third at SIO (La Jolla) on 18 May 1976, and the fourth at Oceanside, California on 2 June 1976. The locations of these initial stations are shown in Figure 1. Considerable effort has been directed during the past 10 years toward the development of numerical models to predict deep-water wave conditions from meteorological data. Reasonable results have been obtained and sufficient accuracy achieved to allow routing of both commercial and military ship traffic.

Data in Brief ◽  
2021 ◽  
pp. 107127
Author(s):  
Jose M. Barcelo-Ordinas ◽  
Pau Ferrer-Cid ◽  
Jorge Garcia-Vidal ◽  
Mar Viana ◽  
Ana Ripoll

Atmosphere ◽  
2021 ◽  
Vol 12 (1) ◽  
pp. 91
Author(s):  
Santiago Lopez-Restrepo ◽  
Andres Yarce ◽  
Nicolás Pinel ◽  
O.L. Quintero ◽  
Arjo Segers ◽  
...  

The use of low-cost air quality networks to study urban pollution dynamics has been increasing in recent years. Here we present an evaluation of the operational Aburrá Valley low-cost network against the official monitoring network. The results show that the low-cost PM2.5 measurements are very close to those observed by the official network. Additionally, the low-cost network provides a higher spatial representation of the concentrations across the valley. We integrated the low-cost observations with the chemical transport model Long Term Ozone Simulation-European Operational Smog (LOTOS-EUROS) using data assimilation. Two different configurations of the low-cost network were assimilated: the whole low-cost network (255 sensors), and a high-quality selection using only the sensors with a correlation factor greater than 0.8 with respect to the official network (115 sensors). The official stations were also assimilated to compare the denser low-cost network's impact on model performance. Both simulations assimilating the low-cost network outperform the model without assimilation and the model assimilating the official network. Assimilating the low-cost network also improves the capability to issue warnings for pollution events relative to the other simulations. Finally, the simulation using the high-quality configuration has lower error values than the one using the complete low-cost network, showing that it is essential to consider sensor quality and location, not just the total number of sensors. Our results suggest that, with the current advances in low-cost sensors, it is possible to improve model performance with low-cost network data assimilation.
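As an illustration of the high-quality subset selection described above, the sketch below filters low-cost sensors by their Pearson correlation with a collocated official series. It is a minimal sketch: the pandas data layout, the function name select_high_quality, and the use of a single reference series are assumptions, not the study's actual pipeline; only the 0.8 threshold comes from the abstract.

```python
# Minimal sketch: keep only low-cost sensors whose PM2.5 series correlates
# strongly (Pearson r >= 0.8, per the abstract) with the official reference.
# Data layout (one column per sensor, shared time index) is an assumption.
import pandas as pd

def select_high_quality(lowcost: pd.DataFrame, official: pd.Series,
                        threshold: float = 0.8) -> pd.DataFrame:
    keep = [col for col in lowcost.columns
            if lowcost[col].corr(official) >= threshold]
    return lowcost[keep]
```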


2021 ◽  
Vol 13 (15) ◽  
pp. 8263
Author(s):  
Marius Bodor

An important aspect of air pollution analysis is the varied presence of particulate matter of different sizes in the analyzed air samples. In this respect, the present work presents a case study of the evolution over time of quantified particulate matter of different sizes. The study is based on data acquired at an indoor location already used in a previous particulate-matter article; it can thus be considered a continuation of that study, with the general aim of demonstrating the need to expand the existing pollution monitoring network. Besides particulate matter quantification, the results are also correlated against meteorological data acquired by the National Air Quality Monitoring Network. The conversion of the quantified PM data into mass per volume and a comparison with other results are also addressed.
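The count-to-mass conversion mentioned above is commonly done by assuming spherical particles of uniform density. The sketch below illustrates that standard approach; the density value and function name are illustrative assumptions, not the paper's parameters.

```python
# Sketch: convert binned particle counts (per m^3) to mass concentration
# (ug/m^3), assuming spherical particles of uniform density. The default
# density of 1.65 g/cm^3 is a common assumption for ambient aerosol.
import math

def counts_to_mass_ugm3(counts_per_m3, bin_diameters_um, density_g_cm3=1.65):
    total = 0.0
    for n, d in zip(counts_per_m3, bin_diameters_um):
        volume_um3 = math.pi / 6.0 * d ** 3           # particle volume in um^3
        mass_ug = volume_um3 * density_g_cm3 * 1e-6   # 1 um^3 at 1 g/cm^3 = 1e-6 ug
        total += n * mass_ug
    return total
```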


2020 ◽  
Vol 12 (24) ◽  
pp. 10677
Author(s):  
Ronghui Ye ◽  
Jun Kong ◽  
Chengji Shen ◽  
Jinming Zhang ◽  
Weisheng Zhang

Accurate salinity prediction can support water resources management decisions to mitigate the threat of insufficient freshwater supply in densely populated estuaries. Statistical methods are low-cost and less time-consuming than numerical and physical models for predicting estuarine salinity variations. This study proposes an alternative statistical model that can more accurately predict salinity series in estuaries. The model incorporates an autoregressive term to characterize the memory effect of salinity and includes the changes in salinity driven by river discharge and tides. Furthermore, a Gamma distribution function was introduced to correct for the hysteresis effects of river discharge, tides and salinity. On top of fixed corrections for long-term effects, dynamic corrections for short-term effects were added to weaken the hysteresis effects. Application of the model to the Pearl River Estuary yielded satisfactory agreement between predicted and measured salinity peaks, indicating the accuracy of the salinity forecasts. Cross-validation and weekly salinity prediction under small, medium and large river discharges were also conducted to further test the reliability of the model. The statistical model provides a good reference for predicting salinity variations in estuaries.
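A minimal sketch of the model structure described above: an autoregressive salinity term plus river-discharge and tidal drivers whose delayed (hysteresis) influence is weighted by a Gamma kernel. All coefficients, kernel parameters, and function names are illustrative placeholders, not the paper's fitted values.

```python
# Sketch: AR(1) salinity memory plus Gamma-weighted lagged drivers.
import numpy as np
from scipy.stats import gamma

def gamma_weights(n_lags, shape=2.0, scale=1.5):
    """Normalized Gamma-pdf weights over lags 1..n_lags (e.g. days)."""
    w = gamma.pdf(np.arange(1, n_lags + 1), a=shape, scale=scale)
    return w / w.sum()

def predict_salinity(s_prev, discharge_lags, tide_lags,
                     a=0.9, b=-0.01, c=0.5, n_lags=7):
    """One-step prediction; lag arrays are ordered most recent first."""
    w = gamma_weights(n_lags)
    q_eff = np.dot(w, discharge_lags[:n_lags])  # effective lagged discharge
    h_eff = np.dot(w, tide_lags[:n_lags])       # effective lagged tidal range
    return a * s_prev + b * q_eff + c * h_eff
```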


Sensors ◽  
2018 ◽  
Vol 18 (10) ◽  
pp. 3405 ◽  
Author(s):  
Manuel Espinosa-Gavira ◽  
Agustín Agüera-Pérez ◽  
Juan González de la Rosa ◽  
José Palomares-Salas ◽  
José Sierra-Fernández

Very short-term solar forecasts are gaining interest for their application to real-time control of photovoltaic systems. These forecasts are intimately related to the cloud motion that produces variations of the irradiance field on scales of seconds and meters, which particularly impact small photovoltaic systems. Very short-term forecast models must be supported by up-to-date information on the local irradiance field, and solar sensor networks are emerging as the most direct way to obtain these data. The development of solar sensor networks adapted to small-scale systems such as microgrids is subject to specific requirements: high updating frequency, high density of measurement points and low investment. This paper proposes a wireless sensor network able to provide snapshots of the irradiance field with an updating frequency of 2 Hz. The network comprised 16 motes regularly distributed over an area of 15 m × 15 m (4 motes × 4 motes, minimum intersensor distance of 5 m). The irradiance values were estimated from illuminance measurements acquired by lux-meters in the network motes. The estimated irradiances were validated against measurements from a secondary standard pyranometer, obtaining a mean absolute error of 24.4 W/m² and a standard deviation of 36.1 W/m². The network was able to capture the cloud motion and the main features of the irradiance field even with the reduced dimensions of the monitoring area. These results and the low cost of the measurement devices indicate that this concept of solar sensor network would be appropriate not only for photovoltaic plants in the MW range, but also for smaller systems such as those installed in microgrids.
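A minimal sketch of the illuminance-to-irradiance estimation and the validation metrics quoted above. The fixed luminous-efficacy constant (about 120 lm/W for daylight) is an assumption for illustration; the paper derives its own conversion from collocated measurements.

```python
# Sketch: estimate irradiance from lux readings, then score against a
# pyranometer using the mean absolute error and its standard deviation.
import numpy as np

def lux_to_irradiance(lux, efficacy_lm_per_w=120.0):
    """Rough conversion: illuminance (lux) -> global irradiance (W/m^2)."""
    return np.asarray(lux) / efficacy_lm_per_w

def validate(estimated, pyranometer):
    """Return (MAE, std of absolute error) in W/m^2."""
    abs_err = np.abs(np.asarray(estimated) - np.asarray(pyranometer))
    return abs_err.mean(), abs_err.std()
```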


Author(s):  
M. Abdelaziz ◽  
M. Elsayed

Underwater photogrammetry in archaeology in Egypt is a completely new experience, applied for the first time on the submerged archaeological site of the lighthouse of Alexandria, situated on the eastern extremity of the ancient island of Pharos at the foot of Qaitbay Fort at a depth of 2 to 9 metres. In 2009/2010, the CEAlex launched a 3D photogrammetry data-gathering programme for the virtual reassembly of broken artefacts. In 2013 and the beginning of 2014, with the support of the Honor Frost Foundation, methods were developed and refined to acquire manual photographic data of the entire underwater site of Qaitbay using a DSLR camera and simple, low-cost materials, to obtain a digital surface model (DSM) of the submerged site of the lighthouse, and also to create 3D models of the objects themselves, such as statues, bases of statues and architectural elements. In this paper we present the methodology used for underwater data acquisition, data processing and modelling in order to generate a DSM of the submerged site of Alexandria's ancient lighthouse. Until 2016, only about 7200 m² of the submerged site, which extends over more than 13000 m², had been covered. One of our main objectives in this project is to georeference the site, since this would allow for a very precise 3D model and for correcting the orientation of the site with respect to real-world space.


2014 ◽  
Vol 4 (4) ◽  
pp. 686-689 ◽  
Author(s):  
N. Baluch ◽  
Z. M. Udin ◽  
C. S. Abdullah

The world's most common alloy, steel, is the material of choice when it comes to making products as diverse as oil rigs, cars, planes and skyscrapers, simply because of its functionality, adaptability, machinability and strength. Newly developed grades of Advanced High Strength Steel (AHSS) significantly outperform competing materials for current and future automotive applications. This is a direct result of steel's performance flexibility, as well as of its many benefits, including low cost, weight reduction capability, safety attributes, reduced greenhouse gas emissions and superior recyclability. To improve crashworthiness and fuel economy, the automotive industry is increasingly using AHSS. Today, and in the future, automotive manufacturers must reduce the overall weight of their cars, and the most cost-efficient way to do this is with AHSS. However, several parameters determine which AHSS type to use; the most important are derived from the geometrical form of the component and the selection of forming and blanking methods. This paper describes the different types of AHSS, highlights their advantages for use in automotive metal stampings, and discusses the new challenges faced by stampers, particularly those serving the automotive industry.


2021 ◽  
Author(s):  
Sonu Kumar Jha ◽  
Mohit Kumar ◽  
Vipul Arora ◽  
Sachchida Nand Tripathi ◽  
Vidyanand Motiram Motghare ◽  
...  

Air pollution is a severe problem growing over time. A dense air-quality monitoring network is needed to keep people informed about the air pollution status in cities. A dense network based on low-cost sensor devices (LCSDs) is more viable than one based on continuous ambient air quality monitoring stations (CAAQMS). An in-field calibration approach is needed to improve the agreement of LCSDs with CAAQMS. The present work proposes a novel calibration method for the PM2.5 levels measured by LCSDs, using a domain adaptation technique to reduce the collocation duration of LCSDs and CAAQMS. The dataset used for the experimentation consists of PM2.5 values and other parameters (PM10, temperature, and humidity) at hourly resolution over a period of three months. We propose new features, built by combining PM2.5, PM10, temperature, and humidity, that significantly improve calibration performance. Further, the calibration model is adapted to the target location for a new LCSD with a collocation time of two days. The proposed model shows higher correlation coefficient values (R²) and significantly lower mean absolute percentage error (MAPE) than the baseline models. Thus, the proposed model helps in reducing the collocation time while maintaining high calibration performance.
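The sketch below illustrates the combined-feature calibration idea: raw LCSD signals plus simple interaction features, fitted against reference PM2.5 and scored with R² and MAPE. The specific interaction terms, the ridge regressor, and all names are assumptions for illustration; the paper's exact feature set and domain-adaptation step are not reproduced here.

```python
# Sketch: feature-combination calibration of low-cost PM2.5 against a
# reference, scored with R^2 and MAPE as in the abstract. Inputs are
# assumed to be aligned 1-D numpy arrays of hourly values.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.metrics import r2_score, mean_absolute_percentage_error

def build_features(pm25, pm10, temp, rh):
    """Raw signals plus simple combinations (interactions, a ratio)."""
    return np.column_stack([pm25, pm10, temp, rh,
                            pm25 * rh, pm10 * temp,
                            pm25 / np.maximum(pm10, 1e-6)])

def calibrate_and_score(pm25_lcsd, pm10, temp, rh, pm25_ref):
    X = build_features(pm25_lcsd, pm10, temp, rh)
    pred = Ridge().fit(X, pm25_ref).predict(X)
    return (r2_score(pm25_ref, pred),
            mean_absolute_percentage_error(pm25_ref, pred))
```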


The European Life project DYNAMAP has been devoted to providing a real image of the noise generated by vehicular traffic in urban and suburban areas, developing a dynamic acoustic map based on a limited number of low-cost permanent noise monitoring stations. In the urban area of Milan, the system has been implemented over the pilot area named Area 9. Traffic noise data collected by the monitoring stations, each one representative of a number of roads with similar characteristics (e.g. daily traffic flow), are used to build up a “real time” noise map. DYNAMAP has a statistical structure, which implies that the information captured by each sensor must be representative of an extended area and thus uncorrelated from the other stations. The study of the correlations among the sensors therefore represents a key point in designing the monitoring network. Another important aspect regards the “contemporaneity” of the noise fluctuations predicted by DYNAMAP with those actually measured at an arbitrary location. Integration times heavily affect the result, with correlation coefficients up to 0.8-0.9 for updating times of 1 h. Higher correlations are observed when averaging over groups of roads with similar traffic flow characteristics.
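As a sketch of the integration-time effect quoted above, the snippet below energy-averages two stations' sound-level series over a chosen updating window and then correlates them. The pandas layout and function names are assumptions; only the idea that longer integration times raise the correlation comes from the text.

```python
# Sketch: correlation between two stations' noise series as a function of
# the integration (updating) time. Levels in dB are energy-averaged (Leq);
# series are assumed to be pandas Series with a DatetimeIndex.
import numpy as np
import pandas as pd

def leq(levels_db: pd.Series, window: str = "1h") -> pd.Series:
    """Equivalent level over each window: 10*log10(mean(10^(L/10)))."""
    return 10 * np.log10((10 ** (levels_db / 10)).resample(window).mean())

def correlation_at_integration_time(a: pd.Series, b: pd.Series,
                                    window: str = "1h") -> float:
    return leq(a, window).corr(leq(b, window))
```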


Author(s):  
L. Marek ◽  
M. Campbell ◽  
M. Epton ◽  
M. Storer ◽  
S. Kingham

The opportunity of an emerging smart city in post-disaster Christchurch has been explored as a way to improve the quality of life of people suffering from Chronic Obstructive Pulmonary Disease (COPD), a progressive disease that affects respiratory function. It affects 1 in 15 New Zealanders and is the 4th largest cause of death, with significant costs to the health system. While cigarette smoking is the leading cause of COPD, long-term exposure to other lung irritants, such as air pollution, chemical fumes, or dust, can also cause and exacerbate it. Currently, we know little about what happens to patients with COPD after they leave a doctor's care. By learning more about patients' movements in space and time, we can better understand the impacts of both the environment and personal mobility on the disease. This research studies patients with COPD using GPS-enabled smartphones, combining data about their spatiotemporal movements with information about their actual medication usage in near real-time. We measure environmental data in the city, including air pollution, humidity and temperature, and how these may be associated with COPD symptoms. To improve the spatial scale of our analysis, we also deployed a series of low-cost Internet of Things (IoT) air quality sensors in addition to the existing air quality monitoring network. The study demonstrates how health devices, smartphones and IoT sensors are becoming part of a new health data ecosystem and how their usage could provide information about high-risk health hotspots, which, in the longer term, could lead to improvements in the quality of life of patients with COPD.

