The Use of Geoinformatics in Coastal Atmospheric Transport Phenomena: The Athens Experiment

2021 ◽  
Vol 9 (11) ◽  
pp. 1197
Author(s):  
Theodoros Nitis ◽  
Nicolas Moussiopoulos

The coastal environment, an area of abrupt transitions between land and sea, significantly affects the quality of life of a large portion of the Earth’s population. The wide range of phenomena observed in coastal areas must therefore be assessed reliably, with respect to both the data sets and the methods applied. In particular, modeling studies of coastal atmospheric transport phenomena, which affect a variety of activities in coastal areas, demand accurate estimates of a range of meteorological and climatological variables related to the planetary boundary layer. The accuracy of such estimates, however, cannot be taken for granted. Geoinformatics can fill this gap by providing the framework for the design, processing and implementation of accurate geo-databases. This paper aims to highlight the role of geoinformatics in coastal meteorology and climatology. More precisely, it examines the effect on the performance of a Mesoscale Meteorological Model when a new scheme for the input surface parameters is developed from satellite data and the application of Geographical Information Systems. The development of the proposed scheme is described and evaluated using the coastal Metropolitan Area of Athens, Greece as a case study. The results indicate a general improvement in model performance, based on the statistical evaluation of three meteorological parameters (temperature, wind speed and wind direction) using four appropriate indicators. The best performance was observed for temperature, followed by wind direction and then wind speed. The necessity of the proposed scheme is further discussed.
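The abstract does not name its four statistical indicators; bias, root mean square error and the index of agreement are typical choices in mesoscale model evaluation, and wind direction additionally requires a circular difference so that, e.g., 350° and 10° are treated as 20° apart. A minimal sketch under that assumption (the indicator set is illustrative, not taken from the paper):

```python
import math

def angular_diff(pred, obs):
    """Smallest signed difference between two directions, in degrees."""
    return (pred - obs + 180.0) % 360.0 - 180.0

def evaluate(pred, obs, circular=False):
    """Bias, RMSE and index of agreement (IOA) for paired
    model/observation series; circular=True handles wind direction."""
    diffs = [angular_diff(p, o) if circular else p - o
             for p, o in zip(pred, obs)]
    n = len(diffs)
    bias = sum(diffs) / n
    rmse = math.sqrt(sum(d * d for d in diffs) / n)
    mean_obs = sum(obs) / n
    # Willmott's index of agreement: 1 = perfect, 0 = no skill.
    denom = sum((abs(p - mean_obs) + abs(o - mean_obs)) ** 2
                for p, o in zip(pred, obs))
    ioa = 1.0 - sum(d * d for d in diffs) / denom if denom else 1.0
    return bias, rmse, ioa
```

The same routine can be run per parameter (temperature, wind speed, wind direction) before and after swapping in the new surface-parameter scheme, which is the comparison the abstract describes.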

Author(s):  
Verónica Lango-Reynoso ◽  
Karla Teresa González-Figueroa ◽  
Fabiola Lango-Reynoso ◽  
María del Refugio Castañeda-Chávez ◽  
Jesús Montoya-Mendoza

Objective: This article describes and analyzes the main concepts of coastal ecosystems that emerge from research on land-use change assessment in coastal areas. Design/Methodology/Approach: Scientific articles were searched using keywords in English and Spanish. Articles on land-use change assessment in coastal areas were selected, discarding those that, although concerned with coastal zones and geographic and soil identification, did not use a Geographic Information System (GIS). Results: A GIS is a computer-based tool for evaluating land-use change in coastal areas by quantifying variations between dates; its contributions are analyzed, highlighting the importance of constant monitoring. Limitations of the study/Implications: This research analyzes national and international scientific literature, published from 2007 to 2019, on land-use change in coastal areas quantified with GIS tools. Findings/Conclusions: GIS are useful tools for identifying and quantitatively evaluating changes in land use in coastal ecosystems, which require constant evaluation due to their high dynamism.
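The core quantification step a GIS performs in these studies — cross-tabulating two co-registered classified maps to measure how much area moved from one class to another — can be sketched as follows. The class names and cell size are illustrative, not drawn from the reviewed articles:

```python
from collections import Counter

def landuse_change(before, after, cell_area_ha=1.0):
    """Cross-tabulate class transitions between two co-registered
    land-use grids (given as flat lists of class labels, one per cell)
    and return the area of each 'from -> to' transition in hectares."""
    assert len(before) == len(after), "grids must be co-registered"
    transitions = Counter(zip(before, after))
    return {f"{a} -> {b}": n * cell_area_ha
            for (a, b), n in transitions.items()}
```

For example, with 30 m cells (0.09 ha each), two classified rasters from 2007 and 2019 would yield entries such as `"wetland -> urban"`, whose values sum to the total converted area; this transition matrix is what repeated GIS monitoring keeps up to date.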


Author(s):  
B. L. Turner II ◽  
D. R. Foster

Frontiers advance and retreat, both figuratively and literally. At this moment they are advancing in three ways relevant to the subject of this book and the ongoing project on which it is based. First, after more than a century of reductionist hegemony, various science communities worldwide increasingly recognize the need to improve complementary, synthesis understanding—a way of putting the reductionist pieces of the problem back together again in order to understand how the ‘whole’ system works and to identify the emergent properties that follow from the complex interactions of the pieces. Synthesis understanding is not, of course, new. In the late eighteenth century, Immanuel Kant argued for it as one of the pillars of science in the reorganization of knowledge in the European academy (Turner 2002a) and designated geography as one of the ‘synthesis sciences’. Its contemporary rediscovery, however, rests in the science of global environmental change (Lawton 2001; Steffen et al. 2002), especially efforts to model complex systems, such as those in ocean–atmosphere–land interactions, and has been expanded by emerging research agendas seeking to couple human and environment systems, often registered under the label of ‘sustainability science’ (e.g. Kates et al. 2001; NRC 1999). Second, within these developments land-use and land-cover change (or, simply, land change) is singled out because of its centrality to a wide range of environmental concerns, including global climate change, regional–local hydrological impacts, biodiversity, and, of course, human development and ecosystem integrity (e.g. Brookfield 1995; NRC 2000; Watson et al. 2001). The need to advance an integrated land-change science is also increasingly recognized, one in which human, ecological, and remote sensing and geographical information systems (GIS) sciences are intertwined in problem-solving (Liverman et al. 1998; Klepeis and Turner 2001; Turner 2002b).
And central to this effort is the need to advance geographically (spatially) explicit land-change models that can explain and project coupled human-ecological systems, and thus serve a wide range of research and assessment constituencies, from carbon to biodiversity to human vulnerability (IGBP 1999; Irwin and Geoghegan 2001; Kates et al. 2001; Liverman et al. 1998; Veldkamp and Lambin 2001). These two developments—synthesis science and integrated land science directed towards geographically explicit land-change models—constitute the broader intellectual and research frontiers to which this work contributes.


2018 ◽  
Vol 20 (1) ◽  
pp. 122-127

The development of methodologies for assessing water quality in coastal areas, including the mapping of eutrophication levels, is a research area of high interest. A wide range of methodological approaches can be found in the literature, including multivariate techniques, since marine eutrophication is a multi-parametric phenomenon. In this context, statistical analysis, and in particular Principal Component Analysis (PCA), has been widely applied. However, no attempt has so far been presented to map eutrophication levels based on information acquired from PCA results integrated with spatial analysis methods. The rapid development of Geographical Information Systems provides the appropriate framework for developing and applying methodologies that integrate statistical analysis, spatial analysis methods and mapping techniques. This paper proposes such a methodological approach for assessing sea water quality in coastal areas. The methodology is clearly described, and the Strait of Mytilene, east of the island of Lesvos in the NE Aegean Sea, Greece, is used as a case study.
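As a sketch of the statistical core of such a methodology (the paper's own implementation is not specified here), the leading principal component of standardized water-quality samples can be extracted with power iteration; the resulting station scores are what would then be interpolated and mapped in the GIS:

```python
import math

def standardize(cols):
    """Z-score each parameter column (population standard deviation)."""
    out = []
    for col in cols:
        n = len(col)
        mu = sum(col) / n
        sd = math.sqrt(sum((x - mu) ** 2 for x in col) / n)
        out.append([(x - mu) / sd for x in col])
    return out

def first_principal_component(samples, iters=500):
    """Leading PCA eigenvector for samples given as rows = stations,
    columns = water-quality parameters (e.g. chlorophyll-a, nutrients)."""
    cols = standardize([list(c) for c in zip(*samples)])
    p, n = len(cols), len(cols[0])
    # Covariance matrix; equals the correlation matrix after standardization.
    cov = [[sum(cols[i][k] * cols[j][k] for k in range(n)) / n
            for j in range(p)] for i in range(p)]
    # Power iteration converges to the eigenvector of the largest eigenvalue.
    v = [1.0] * p
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(p)) for i in range(p)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v  # station scores (to be mapped) are projections onto v
```

In practice a library PCA returning all components would be used; the point of the sketch is that each monitoring station receives a score along the dominant axis of variation, which can then be classified into eutrophication levels and interpolated spatially.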


2018 ◽  
Vol 11 (6) ◽  
pp. 3801-3814 ◽  
Author(s):  
Norman Wildmann ◽  
Nikola Vasiljevic ◽  
Thomas Gerz

Abstract. In the context of the Perdigão 2017 experiment, the German Aerospace Center (DLR) deployed three long-range scanning Doppler lidars with the dedicated purpose of investigating the wake of a single wind turbine at the experimental site. A novel method was tested for the first time to investigate wake properties with ground-based lidars over a wide range of wind directions. For this method, the three lidars, which were space- and time-synchronized using the WindScanner software, were programmed to measure with crossing beams at individual points up to 10 rotor diameters downstream of the wind turbine. Every half hour, the measurement points were adapted to the current wind direction to obtain a high availability of wake measurements in changing wind conditions. The linearly independent radial velocities where the lidar beams intersect allow the calculation of the wind vector at those points. Two approaches to estimating the prevailing wind direction were tested throughout the campaign. In the first approach, velocity azimuth display (VAD) scans of one of the lidars were used to calculate a 5 min average of wind speed and wind direction every half hour, whereas later in the experiment 5 min averages of sonic anemometer measurements of a meteorological mast close to the wind turbine became available in real time and were used for the scanning adjustment. Results of wind speed deficit measurements are presented for two measurement days with varying northwesterly winds, and it is evaluated how well the lidar beam intersection points match the actual wake location. The new method allowed wake measurements to be obtained over the whole measurement period, whereas a static scanning setup would only have captured short periods of wake occurrences.
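The retrieval described above — combining three linearly independent radial velocities into a wind vector — amounts to solving a small linear system b_i · v = vr_i, where b_i is the unit pointing vector of lidar i at the beam intersection point. A minimal sketch of that step (not the DLR processing chain itself):

```python
def solve3(A, b):
    """Gaussian elimination with partial pivoting for a 3x3 system."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    n = 3
    for i in range(n):
        piv = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[piv] = M[piv], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n + 1):
                M[r][c] -= f * M[i][c]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (M[i][n] - sum(M[i][c] * x[c]
                              for c in range(i + 1, n))) / M[i][i]
    return x

def wind_vector(beams, radial_velocities):
    """Each lidar measures the projection of the wind v onto its unit
    pointing vector b_i, i.e. b_i . v = vr_i.  With three linearly
    independent beams the 3-D wind vector is exactly determined."""
    return solve3([list(b) for b in beams], list(radial_velocities))
```

If the beams were coplanar the matrix would be singular, which is why the geometry of the three lidars (and hence the linear independence the abstract mentions) matters.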


2021 ◽  
pp. 36-55
Author(s):  
Karel Charvat ◽  
Runar Bergheim ◽  
Raitis Bērziņš ◽  
František Zadražil ◽  
Dailis Langovskis ◽  
...  

For the purpose of exploiting the potential of cloud connectivity in geographical information systems, the Map Whiteboard technology introduced in this article does for web mapping what Google Docs does for word processing: it creates a shared user interface where multiple parties can collaboratively develop maps and map data while seeing each other's work in real time. To develop the Map Whiteboard concept, we applied a methodology whereby we collected technical and functional requirements through a series of hackathons, implemented a prototype in several stages, and subjected it to rigorous testing in a lab environment and with selected users from relevant environments at intermediate scale. The work has resulted in a fully functional prototype that uses WebSockets via a cloud service to reflect map and data changes between multiple connected clients. The technology has demonstrated potential for use in a wide range of web GIS applications, facilitated by the interfaces already implemented for mainstream mapping frameworks such as OpenLayers and QGIS, two of the most popular frameworks for web GIS solutions. Further development and testing are required before operationalization in mission-critical environments. In conclusion, the Map Whiteboard concept offers a starting point for exploiting cloud connectivity within GIS to facilitate the digitalization of common processes within the government and private sectors. The technology is ready for early adopters and welcomes the contribution of interested parties.
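Stripped of the transport layer, the collaboration pattern described above is a shared feature store that re-broadcasts every accepted edit to all other connected clients. A minimal single-process sketch (class, method and client names are illustrative; in the real system the broadcast travels over WebSockets through a cloud service):

```python
import json

class MapWhiteboard:
    """Toy model of the shared-map pattern: each submitted edit is
    applied to a common feature store and pushed to every other
    connected client, so all participants converge on the same map."""

    def __init__(self):
        self.features = {}   # shared state: feature id -> GeoJSON-like dict
        self.clients = {}    # client id -> callback receiving broadcasts

    def connect(self, client_id, on_message):
        self.clients[client_id] = on_message

    def submit_edit(self, client_id, feature_id, feature):
        self.features[feature_id] = feature
        msg = json.dumps({"from": client_id, "id": feature_id,
                          "feature": feature})
        for cid, callback in self.clients.items():
            if cid != client_id:  # the sender already has the change locally
                callback(msg)
```

The essential design choice, as in Google Docs, is that the authoritative state lives server-side and clients render notifications, rather than each client holding a private copy that must later be merged.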


2018 ◽  
Author(s):  
Norman Wildmann ◽  
Nikola Vasiljevic ◽  
Thomas Gerz

Abstract. In the context of the Perdigão 2017 experiment, the German Aerospace Center (DLR) deployed three long-range scanning Doppler lidars with the dedicated purpose of investigating the wake of a single wind turbine at the experimental site. A novel method was established to investigate wake properties with ground-based lidars over a wide range of wind directions. For this method, the three lidars, which were space- and time-synchronized using the WindScanner software, were programmed to measure with crossing beams at individual points up to ten rotor diameters downstream of the wind turbine. Every half hour, the measurement points were adapted to the current wind direction to obtain a high availability of wake measurements in changing wind conditions. The linearly independent radial velocities where the lidar beams intersect allow the calculation of the wind vector at those points. Two approaches to estimating the prevailing wind direction were tested throughout the campaign. In the first approach, velocity azimuth display (VAD) scans of one of the lidars were used to calculate a five-minute average of wind speed and wind direction every half hour, whereas later in the experiment, five-minute averages of sonic anemometer measurements from a meteorological mast close to the wind turbine became available in real time and were used for the scanning adjustment. Results of wind speed deficit measurements are presented for two measurement days with varying westerly winds, and it is evaluated how well the lidar beam intersection points match the actual wake location. The new method allowed wake measurements to be obtained over the whole measurement period, whereas a static scanning setup would only have captured short periods of wake occurrences. The analysed cases reveal that state-of-the-art engineering models for wakes underestimate the actual wind speed deficit.


Energies ◽  
2021 ◽  
Vol 14 (16) ◽  
pp. 5095
Author(s):  
Ahmed Abdulkareem Ahmed Adulaimi ◽  
Biswajeet Pradhan ◽  
Subrata Chakraborty ◽  
Abdullah Alamri

This study estimates the equivalent continuous sound pressure level (Leq) during peak daily periods (‘rush hour’) along the New Klang Valley Expressway (NKVE) in Shah Alam, Malaysia, using a land use regression (LUR) model based on machine learning, statistical regression, and geographical information systems (GIS). The research utilises two types of soft computing methods, machine learning (decision tree and random forest algorithms) and statistical regression (linear regression and support vector regression algorithms), to determine the best approach for creating a predicted Leq map of the NKVE. The best algorithm was selected by considering the correlation coefficient, mean absolute error, mean square error, root mean square error, and mean absolute percentage error. Traffic noise levels were monitored using three sound level meters (TES 52A), and a traffic count was conducted to analyse the traffic flow. Wind speed was gauged using a wind speed meter. The study relied on a variety of noise predictors including wind speed, a digital elevation model, land use type (residential, industrial, or natural reserve), residential density, road type (expressway, primary, and secondary) and average traffic noise (Leq). These parameters were fed as inputs into the LUR model. Additional noise-influencing factors such as traffic lights, intersections, road toll gates, gas stations, and public transportation infrastructure (bus stops and bus lines) were also considered in this study. The models utilised parameters derived from LiDAR (Light Detection and Ranging) data, and various GIS layers were extracted to produce the prediction maps. The results highlighted the superior performance of the machine learning (random forest) models compared to the statistical regression-based models.
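The error measures used above to rank the candidate LUR models are standard and straightforward to reproduce; a minimal sketch of the listed error metrics, independent of any particular regression library:

```python
import math

def regression_metrics(pred, obs):
    """MAE, MSE, RMSE and MAPE for ranking candidate Leq models.
    MAPE assumes no observation is zero (safe for sound levels in dB)."""
    n = len(obs)
    errs = [p - o for p, o in zip(pred, obs)]
    mae = sum(abs(e) for e in errs) / n
    mse = sum(e * e for e in errs) / n
    rmse = math.sqrt(mse)
    mape = 100.0 * sum(abs(e / o) for e, o in zip(errs, obs)) / n
    return {"MAE": mae, "MSE": mse, "RMSE": rmse, "MAPE": mape}
```

Each candidate model (random forest, decision tree, linear regression, support vector regression) would be scored on held-out monitoring sites with this function, and the model with the lowest errors and highest correlation used to render the final Leq prediction map.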


Author(s):  
Mike Davies ◽  
Robert Murley ◽  
Ian Adsley

Traditional techniques for the assessment of pollutants in contaminated land, notably brownfield sites, may not yield the speed and accuracy now required for estimates of risk and remediation cost. Detailed site investigation is often limited by the time and cost of laboratory-based analysis techniques and time-consuming data collation phases. Thus, relatively straightforward technical issues, such as the mapping of priority areas of a site, can be unnecessarily delayed and expensive. The GROUNDHOG system was developed to address these problems and to provide a platform for the development of a range of techniques for the radiological survey of potentially contaminated land. The system brings together the best of well-established and recent technologies. Visualisation of the survey results is improved by the use of Geographical Information Systems, and database systems allow an audit trail to be maintained as part of a Quality Assurance programme. Development of the Groundhog system has continued: increasing its sensitivity for some applications, using gamma radiation spectrometry to provide qualitative measurements, and constructing ruggedised systems for surveying areas where the risks associated with manual surveys are deemed unacceptable. In recent years, ‘conventional’ Groundhog surveys have been performed on many nuclear and non-nuclear sites for a wide range of reasons: de-licensing nuclear facilities; pre- and post-remediation surveys of contaminated land; and, during remediation, reducing waste volume. Specialised versions of the system have been developed and used to locate discrete nuclear fuel ‘particles’ on beaches, sub-surface measurements have been made to estimate waste volume, and a submarine survey has been conducted. This paper describes some of the projects completed and the technologies used to perform the work.


2011 ◽  
Vol 26 (S1) ◽  
pp. s46-s46
Author(s):  
K.M. Simon-Agolory ◽  
K.Z. Watkins

It is common knowledge that having an individual or family disaster plan is vital for saving lives and property before, during and after a disaster. First responders have the daunting task of helping many people during a disaster, and their job would be easier if people had disaster plans in place beforehand. However, few people do: developing a plan takes time, the benefits of having one are not widely understood, and the process of drawing one up remains largely manual. Wilberforce University has designed a solution called Wilberforce's Information Library Boosting Emergency Recovery (WILBER), a customized online tool that quickly and automatically generates disaster plans to help save lives and property and mitigate the impacts of a potential disaster. WILBER utilizes an interdisciplinary approach to automatically generate a basic disaster preparedness plan. The system addresses a wide range of disasters but focuses on floods, earthquakes and technological disasters such as terrorism and nuclear incidents. WILBER intelligently processes locally relevant data, combining mathematical analysis; distributed computing; individual and business risk management; current and historical information from a comprehensive Geographical Information System (GIS) that includes imagery, infrastructure, demographic, and environmental data; and wireless sensors for real-time condition assessment. Not planning for a disaster only increases its potential magnitude. WILBER allows citizens to quickly establish immediate procedures in the event of an emergency, which in turn can lessen the burden on first responders and reduce the likelihood of loss of life.
This research is funded by the Department of Energy's National Nuclear Security Administration and conducted by the Wilberforce University Disaster Recovery Center in Wilberforce, Ohio, USA.


2011 ◽  
Vol 1 (2) ◽  
pp. 263-270 ◽  
Author(s):  
H. K. Watson ◽  
R. A. Diaz-Chavez

This paper synthesizes lessons learnt from research that aimed to identify land in the dryland regions of eight sub-Saharan African study countries where bioenergy feedstock production has a low risk of detrimental environmental and socio-economic effects. The methodology involved using geographical information systems (GIS) to interrogate a wide range of datasets, aerial-photograph and field verification, an extensive literature review, and information obtained from a wide range of stakeholders. The GIS work revealed that Africa's drylands potentially have substantial areas available and agriculturally suitable for bioenergy feedstock production. The other work showed that land-use and biomass dynamics in Africa's drylands are greatly influenced by the inherent ‘disequilibrium’ behaviour of these environments. This behaviour challenges the sustainability concept and prevailing perceptions of the drivers, nature and consequences of deforestation, land degradation and other factors. An assessment of the implications of this behaviour formed the basis for the practical guidance offered to bioenergy feedstock producers and bioenergy policy makers.

