The preliminary inventory of coseismic ground failures related to the December 2020 – January 2021 Petrinja earthquake series

2021
Vol 74 (2)
Author(s):
Davor Pollak
Vlatko Gulam
Tomislav Novosel
Radovan Avanić
...  

The most recent major earthquake series struck near Petrinja (December 29th 2020, M 6.2) and triggered extensive ground failures in the wider area of Petrinja, Sisak and Glina. Coseismic ground failures, including subsidence dolines, liquefaction and landslides, have been documented over a large area by various experts and teams. These data are stored in a newly created inventory, which is openly presented in this paper. The inventory is administered and updated by the Croatian Geological Survey and will be available online via a Web Map Service (WMS) (www.hgi-cgs.hr). The aim of the inventory is not only to provide data for the development of susceptibility maps and for more detailed exploration of possible remediation measures, but also to define priorities for immediate action. The earthquake triggered the rapid development of dropout dolines, which endanger the local populations of the villages of Mečenčani and Borojevići. This process is still ongoing in the vicinity of houses, and therefore in-situ exploration started immediately. Liquefaction related to the alluvial sediments of the Sava, Kupa and Glina rivers occurred almost exclusively in loose, pure sands and was accompanied by sand boils, subsidence and lateral spreading. Liquefaction also presents a greater hazard because it caused settlement of houses and river embankments. Lateral spreading caused failures of river flood embankments and natural river banks. According to the data known to date, the majority of the coseismic landslides were reactivated, with minor displacements. Despite that, it has been recognised that houses at the edge of, or within, landslide colluvium suffered greater damage than houses located outside the landslide impact zone.
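
Since the inventory is published through a standard OGC Web Map Service, it can be queried with ordinary WMS 1.3.0 requests. The sketch below is a minimal, hypothetical illustration of such a query: the abstract gives only the host (www.hgi-cgs.hr), so the endpoint path, layer name and bounding box used here are assumptions, and the real values would be read from the service's GetCapabilities response.

```python
# Minimal sketch of querying an OGC WMS with standard WMS 1.3.0 requests.
# The endpoint path, layer name and bounding box are hypothetical placeholders;
# the real values would come from the service's GetCapabilities document.
import requests

WMS_URL = "https://www.hgi-cgs.hr/geoserver/wms"  # assumed endpoint, for illustration only

# 1) Discover the layers the service advertises.
caps = requests.get(WMS_URL, params={
    "SERVICE": "WMS",
    "REQUEST": "GetCapabilities",
    "VERSION": "1.3.0",
})
print(caps.status_code, caps.headers.get("Content-Type"))

# 2) Render a map image of a (hypothetical) ground-failure inventory layer around Petrinja.
img = requests.get(WMS_URL, params={
    "SERVICE": "WMS",
    "REQUEST": "GetMap",
    "VERSION": "1.3.0",
    "LAYERS": "coseismic_ground_failures",  # assumed layer name
    "CRS": "EPSG:4326",
    "BBOX": "45.3,16.1,45.6,16.6",          # WMS 1.3.0 uses lat,lon axis order for EPSG:4326
    "WIDTH": "800",
    "HEIGHT": "600",
    "FORMAT": "image/png",
})
with open("petrinja_ground_failures.png", "wb") as f:
    f.write(img.content)
```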


Ocean Science
2011
Vol 7 (4)
pp. 445-454
Author(s):
A. L. Gemmell
R. M. Barciela
J. D. Blower
K. Haines
Q. Harpham
...  

Abstract. As part of a large European coastal operational oceanography project (ECOOP), we have developed a web portal for the display and comparison of model and in situ marine data. The distributed model and in situ datasets are accessed via an Open Geospatial Consortium Web Map Service (WMS) and Web Feature Service (WFS) respectively. These services were developed independently and readily integrated for the purposes of the ECOOP project, illustrating the ease of interoperability resulting from adherence to international standards. The key feature of the portal is the ability to display co-plotted timeseries of the in situ and model data and to quantify the misfits between the two. By using standards-based web technology we allow the user to quickly and easily explore over twenty model data feeds and compare these with dozens of in situ data feeds without being concerned with the low-level details of differing file formats or the physical location of the data. Scientific and operational benefits of this work include model validation, quality control of observations, data assimilation and decision support in near real time. In these areas it is essential to be able to bring different data streams together from often disparate locations.
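
The portal's central operation, co-plotting an in situ timeseries against a model timeseries and quantifying the misfit between the two, amounts to interpolating the model values to the observation times and computing summary statistics. The sketch below is an illustrative reconstruction under that reading, not the ECOOP portal code; the synthetic arrays stand in for data that would arrive through the WMS and WFS feeds.

```python
import numpy as np

def misfit_stats(obs_t, obs_v, mod_t, mod_v):
    """Interpolate model values to the observation times; return bias and RMSE."""
    mod_at_obs = np.interp(obs_t, mod_t, mod_v)  # model sampled at the in situ times
    diff = mod_at_obs - obs_v
    return {"bias": float(diff.mean()), "rmse": float(np.sqrt((diff ** 2).mean()))}

# Hypothetical hourly in situ temperatures and 3-hourly model output (degrees C).
obs_t = np.arange(0.0, 48.0, 1.0)
obs_v = 15.0 + 0.5 * np.sin(2 * np.pi * obs_t / 24.0) + 0.1 * np.random.randn(obs_t.size)
mod_t = np.arange(0.0, 48.0, 3.0)
mod_v = 15.2 + 0.5 * np.sin(2 * np.pi * mod_t / 24.0)

print(misfit_stats(obs_t, obs_v, mod_t, mod_v))
```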


2021
Vol 5 (1)
Author(s):
Mohsen Moazzami Gudarzi
Maryana Asaad
Boyang Mao
Gergo Pinter
Jianqiang Guo
...  

Abstract. The use of two-dimensional materials in bulk functional applications requires the ability to fabricate defect-free 2D sheets with large aspect ratios. Despite huge research efforts, current bulk exfoliation methods require a compromise between the quality of the final flakes and their lateral size, restricting the effectiveness of the product. In this work, we describe an intercalation-assisted exfoliation route which allows the production of high-quality graphene, hexagonal boron nitride, and molybdenum disulfide 2D sheets with average aspect ratios 30 times larger than those obtained via conventional liquid-phase exfoliation. The combination of chlorosulfuric acid intercalation with in situ pyrene sulfonate functionalisation produces a suspension of thin, large-area flakes which are stable in various polar solvents. The described method is simple and requires no special laboratory conditions. We demonstrate that these suspensions can be used for the fabrication of laminates and coatings with electrical properties suitable for a number of real-life applications.


Nanoscale
2021
Author(s):
Qiufan Wang
Jiaheng Liu
Guofu Tian
Daohong Zhang

The rapid development of human-machine interfaces and artificial intelligence depends on flexible and wearable soft devices such as sensors and energy storage systems. One of the key factors for...


Author(s):  
Yuan Gao
Souha Toukabri
Ye Yu
Andreas Richter
Robert Kirchner

2017
pp. 2485-2488
Author(s):
Christopher D. Michaelis
Daniel P. Ames
Keyword(s):
Web Map

2011
Vol 679-680
pp. 777-780
Author(s):
Shoji Ushio
Ayumu Adachi
Kazuhiro Matsuda
Noboru Ohtani
Tadaaki Kaneko

As a new graphene functionality applicable to post-implantation high-temperature annealing of SiC, a method for the in situ formation and removal of large-area epitaxial few-layer graphene on the 4H-SiC(0001) Si-face is proposed. It is demonstrated that the homogeneous graphene layer formed by Si sublimation can be preserved, without decomposition of the underlying SiC substrate, even in excess of 2000 °C in ultrahigh vacuum. This is due to the existence of the stable (6√3×6√3) buffer layer at the interface. To ensure this cap function, the homogeneity of the interface must be guaranteed, which requires precise control of the initial SiC surface flatness. Si-vapor etching is a simple and versatile SiC surface pre-/post-treatment method, in which the thermally decomposed SiC surface is compensated by a Si-vapor flux from a solid Si source in the same semi-closed TaC container. While Si-vapor etching allows precise control of the SiC etch depth and surface step-terrace structures, it also provides a "decap" function to remove the graphene layer. The surface properties after each process were characterized by AFM and Raman spectroscopy.


2009
Vol 9 (4)
pp. 17465-17494
Author(s):
D. B. Atkinson
P. Massoli
N. T. O'Neill
P. K. Quinn
S. Brooks
...  

Abstract. During the 2006 Texas Air Quality Study and Gulf of Mexico Atmospheric Composition and Climate Study (TexAQS-GoMACCS 2006), the optical, chemical and microphysical properties of atmospheric aerosols were measured on multiple mobile platforms and at ground-based stations. In situ measurements of the aerosol light extinction coefficient (σep) were performed by two multi-wavelength cavity ring-down (CRD) instruments, one located on board the NOAA R/V Ronald H. Brown (RHB) and the other located at the University of Houston, Moody Tower (UHMT). An AERONET sunphotometer was also located at the UHMT to measure the columnar aerosol optical depth (AOD). The σep data were used to extract the extinction Ångström exponent (åep), a measure of the wavelength dependence of σep. There was general agreement between the åep (and, to a lesser degree, the σep) measurements by the two spatially separated CRD instruments during multi-day periods, suggesting a regional-scale consistency of the sampled aerosols. Two spectral models are applied to the σep and AOD data to extract the fine mode fraction of extinction (η) and the fine mode effective radius (Reff,f). These two parameters are robust measures of the fine mode contribution to total extinction and the fine mode size distribution respectively. The results of the analysis are compared to Reff,f values extracted using AERONET V2 retrievals and calculated from in situ particle size measurements on the RHB and at UHMT. During a time period when fine mode aerosols dominated the extinction over a large area extending from Houston/Galveston Bay and out into the Gulf of Mexico, the various methods for obtaining Reff,f agree qualitatively (showing the same temporal trend) and quantitatively (pooled standard deviation = 28 nm).
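
The extinction Ångström exponent åep referred to above describes the power-law wavelength dependence σep(λ) ∝ λ^(−åep), so it can be estimated from multi-wavelength extinction measurements by a straight-line fit in log-log space. The following sketch illustrates that standard calculation with made-up numbers; it is not the processing code used in the study.

```python
import numpy as np

def angstrom_exponent(wavelengths_nm, sigma_ep):
    """Fit sigma_ep(lambda) ~ lambda**(-a) in log-log space and return a."""
    slope, _intercept = np.polyfit(np.log(wavelengths_nm), np.log(sigma_ep), 1)
    return -slope

# Hypothetical three-wavelength extinction coefficients (in Mm^-1).
wavelengths = np.array([355.0, 532.0, 1064.0])
sigma_ep = np.array([120.0, 70.0, 30.0])

print(f"extinction Angstrom exponent = {angstrom_exponent(wavelengths, sigma_ep):.2f}")
```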


The recent progress in the spatial resolution of remote sensing imagery has led to many types of Very High Resolution (VHR) satellite images; consequently, generally speaking, it is possible to prepare accurate base maps at scales larger than 1:10,000. One of these VHR satellites is the WorldView-3 sensor, launched in August 2014. Its resolution of 0.31 m makes WorldView-3 the highest-resolution commercial satellite in the world. In the current research, a pan-sharpened image of that type, covering an area in Giza Governorate in Egypt, is used to determine the suitable large-scale map that could be produced from such an image. To reach this objective, two different sources for acquiring Ground Control Points (GCPs) are used: firstly, very accurate field measurements using GPS, and secondly, a Web Map Service (WMS) server (in the current research, Google Earth), which is considered a good alternative when field-surveyed GCPs are not available. Accordingly, three scenarios are tested, using the same set of 16 Ground Control Points (GCPs) and 14 Check Points (CHKs), to evaluate the accuracy of the geometric correction of this type of image. The first approach uses GCP and CHK coordinates acquired by GPS; the second uses GCP coordinates acquired from Google Earth and CHK coordinates acquired by GPS; the third uses GCP and CHK coordinates from Google Earth. Results show that the first approach gives a Root Mean Square Error (RMSE) planimetric discrepancy of 0.45 m for the GCPs and 0.69 m for the CHKs; the second approach gives an RMSE of 1.10 m for the GCPs and 1.75 m for the CHKs; the third approach gives an RMSE of 1.10 m for the GCPs and 1.40 m for the CHKs. Taking a map accuracy specification of 0.5 mm at map scale, the worst CHK values (1.75 m and 1.40 m), which result from using Google Earth as a source, allow the production of a 1:5,000 large-scale map, compared with a 1:2,500 map for the best value (0.69 m). This means that, for the parameters of the current research, large-scale maps could be produced using Google Earth when GCPs cannot be obtained accurately by field surveying, which is very useful for many users.
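
The mapping from check-point RMSE to achievable map scale used in this abstract follows from the stated specification of 0.5 mm at map scale: dividing the RMSE (in metres) by 0.0005 m gives the smallest admissible scale denominator, which is then rounded up to the next standard scale. The sketch below reproduces that arithmetic for the three reported scenarios; the RMSE helper is generic and is not the authors' software.

```python
import math

STANDARD_SCALE_DENOMINATORS = [500, 1000, 2500, 5000, 10000, 25000]

def planimetric_rmse(residuals_m):
    """RMSE of 2-D check-point residuals given as (dE, dN) pairs in metres."""
    return math.sqrt(sum(de ** 2 + dn ** 2 for de, dn in residuals_m) / len(residuals_m))

def largest_safe_scale(rmse_m, spec_mm_at_scale=0.5):
    """Largest standard map scale whose 0.5 mm-at-scale tolerance still covers the RMSE."""
    min_denominator = rmse_m / (spec_mm_at_scale / 1000.0)  # e.g. 0.69 m / 0.0005 = 1380
    return next(d for d in STANDARD_SCALE_DENOMINATORS if d >= min_denominator)

# Check-point RMSE values reported in the abstract (metres).
for label, rmse in [("GPS GCPs + GPS CHKs", 0.69),
                    ("Google Earth GCPs + GPS CHKs", 1.75),
                    ("Google Earth GCPs + Google Earth CHKs", 1.40)]:
    print(f"{label}: RMSE {rmse:.2f} m -> map scale 1:{largest_safe_scale(rmse)}")
```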

