Discriminating among distinct source models of the 1908 Messina Straits earthquake by modelling intensity data through full wavefield seismograms

2014 ◽  
Vol 198 (1) ◽  
pp. 164-173 ◽  
Author(s):  
Vincenzo Convertito ◽  
Nicola Alessandro Pino

Author(s):  
Douglas L. Dorset

The quantitative use of electron diffraction intensity data for the determination of crystal structures represents the pioneering achievement in the electron crystallography of organic molecules, an effort largely begun by B. K. Vainshtein and his co-workers. However, despite numerous representative structure analyses yielding results consistent with X-ray determinations, this entire effort was viewed with considerable mistrust by many crystallographers. This was no doubt due to the rather high crystallographic R-factors reported for some structures and, more importantly, the failure to convince many skeptics that the measured intensity data were adequate for ab initio structure determinations.

We have recently demonstrated the utility of these data sets for structure analyses by direct phase determination based on the probabilistic estimate of three- and four-phase structure invariant sums. Examples include the structure of diketopiperazine using Vainshtein's 3D data, a similar 3D analysis of the room-temperature structure of thiourea, and a zonal determination of the urea structure, the latter also based on data collected by the Moscow group.
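As general background (not stated in this abstract), the probabilistic estimate of a three-phase structure invariant is usually expressed through the Cochran distribution: for normalized structure factors E, the triplet phase sum is most probably near zero, with a reliability that grows with the product of the |E| magnitudes.

```latex
% Three-phase (triplet) structure invariant, with h + k + (-h-k) = 0:
\Phi_3 = \varphi_{\mathbf{h}} + \varphi_{\mathbf{k}} + \varphi_{-\mathbf{h}-\mathbf{k}} \approx 0 ,
% with (for N equal atoms) the Cochran probability density
P(\Phi_3) = \frac{1}{2\pi I_0(\kappa)}\, e^{\kappa \cos \Phi_3},
\qquad
\kappa = \frac{2}{\sqrt{N}}\,\bigl|E_{\mathbf{h}} E_{\mathbf{k}} E_{-\mathbf{h}-\mathbf{k}}\bigr| .
```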


1990 ◽  
Vol 29 (04) ◽  
pp. 282-288 ◽  
Author(s):  
A. van Oosterom

Abstract
This paper introduces some levels at which the computer has been incorporated into research on the basis of electrocardiography. The emphasis lies on the modeling of the heart as an electrical current generator and of the properties of the body as a volume conductor, both playing a major role in the shaping of the electrocardiographic waveforms recorded at the body surface. It is claimed that the forward problem of electrocardiography is no longer a problem. Several source models of cardiac electrical activity are considered, one of which can be directly interpreted in terms of the underlying electrophysiology (the depolarization sequence of the ventricles). The importance of using tailored rather than textbook geometry in inverse procedures is stressed.


2014 ◽  
Vol 49 (3) ◽  
pp. 283-294
Author(s):  
Gyöngyvér Szanyi ◽  
Zoltán Gráczer ◽  
Erzsébet Győri

2021 ◽  
Vol 14 (5) ◽  
Author(s):  
Shaghayegh Karimzadeh ◽  
Aysegul Askan

2021 ◽  
Vol 112 (11-12) ◽  
pp. 3501-3513
Author(s):  
Yannik Lockner ◽  
Christian Hopmann

Abstract
The necessity of an abundance of training data commonly hinders the broad use of machine learning in the plastics processing industry. Induced network-based transfer learning is used to reduce the amount of injection molding process data necessary for training an artificial neural network, in order to conduct a data-driven machine parameter optimization for injection molding processes. As base learners, source models for the injection molding processes of 59 different parts are fitted to process data. A process for another part is chosen as the target process, to which transfer learning is applied. The models learn the relationship between 6 machine setting parameters and the part weight as the quality parameter. The considered machine parameters are the injection flow rate, holding pressure time, holding pressure, cooling time, melt temperature, and cavity wall temperature. For the right source domain, only 4 sample points of the new process need to be generated to train a model of the injection molding process with a coefficient of determination R² of 0.9 or higher. Significant differences in the transferability of the source models can be seen between different part geometries: the source models of injection molding processes for parts similar to the part of the target process achieve the best results. The transfer learning technique has the potential to significantly raise the relevance of AI methods for process optimization in the plastics processing industry.
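The parameter-transfer idea in the abstract can be sketched in a few lines. The following toy example uses synthetic data and a plain linear model instead of the paper's neural networks; all names and values are illustrative assumptions, not the authors' setup. It fits a source model on abundant data, then re-estimates only the bias term from 4 target samples:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the paper's setting: 6 machine parameters -> part weight.
# Source process: plenty of data; target process: only 4 sample points.
def make_process(offset, n):
    X = rng.uniform(0.0, 1.0, size=(n, 6))
    w_true = np.array([1.5, -0.8, 0.5, 0.3, 2.0, -1.2])
    y = X @ w_true + offset          # same physics, shifted operating point
    return X, y

Xs, ys = make_process(offset=0.0, n=500)   # source: 500 samples
Xt, yt = make_process(offset=3.0, n=4)     # target: only 4 samples

# 1) Fit the source model (ordinary least squares with a bias column).
A = np.c_[Xs, np.ones(len(Xs))]
theta = np.linalg.lstsq(A, ys, rcond=None)[0]

# 2) Transfer: keep the learned weights, re-estimate only the bias
#    from the 4 target samples (a crude form of parameter transfer).
w, b = theta[:-1], theta[-1]
b_new = np.mean(yt - Xt @ w)

# 3) Evaluate on held-out target data.
Xv, yv = make_process(offset=3.0, n=200)
pred = Xv @ w + b_new
r2 = 1 - np.sum((yv - pred) ** 2) / np.sum((yv - np.mean(yv)) ** 2)
print(f"target R^2 after transfer: {r2:.3f}")
```

Because only one parameter is re-fitted, 4 target samples suffice here; the paper's finding is the neural-network analogue of this effect.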


2013 ◽  
Vol 28 (S2) ◽  
pp. S481-S490
Author(s):  
Oriol Vallcorba ◽  
Anna Crespi ◽  
Jordi Rius ◽  
Carles Miravitlles

The viability of the direct-space strategy TALP (Vallcorba et al., 2012b) to solve crystal structures of molecular compounds from laboratory powder diffraction data is shown. The procedure exploits the accurate metric refined from a 'Bragg–Brentano' powder pattern to later extract the intensity data from a second 'texture-free' powder pattern with the DAJUST software (Vallcorba et al., 2012a). The experimental setup for collecting this second pattern consists of a circularly collimated X-ray beam and a 2D detector. The sample is placed between two thin Mylar® foils, which reduces or even eliminates preferred orientation. With the combination of the DAJUST and TALP software, a preliminary but rigorous structural study of organic compounds can be carried out at the laboratory level. In addition, the time-consuming filling of capillaries with diameters thinner than 0.3 mm is avoided.


2021 ◽  
Vol 13 (8) ◽  
pp. 1424
Author(s):  
Lucas Terres de Lima ◽  
Sandra Fernández-Fernández ◽  
João Francisco Gonçalves ◽  
Luiz Magalhães Filho ◽  
Cristina Bernardes

Sea-level rise is a problem increasingly affecting coastal areas worldwide. The existence of free and open-source models to estimate the sea-level impact can contribute to improved coastal management. This study aims to develop and validate two different models to predict the sea-level rise impact, supported by Google Earth Engine (GEE), a cloud-based platform for planetary-scale environmental data analysis. The first model is a Bathtub Model based on the uncertainty of projections of the sea-level rise impact module of the TerrSet Geospatial Monitoring and Modeling System software. The validation process performed in the Rio Grande do Sul coastal plain (S Brazil) resulted in correlations from 0.75 to 1.00. The second model implements the Bruun rule formula in GEE and can determine the coastline retreat of a profile by creating a simple vector line from topo-bathymetric data. The model shows a very high correlation (0.97) with a classical Bruun rule study performed on the Aveiro coast (NW Portugal). Therefore, the achieved results disclose that the GEE platform is suitable to perform these analyses. The models developed have been openly shared, enabling the continuous improvement of the code by the scientific community.
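For reference, the classical Bruun rule that the second model implements relates shoreline retreat R to sea-level rise S through the geometry of the active beach profile, R = S·L/(B + h). A minimal sketch follows; the numeric inputs are illustrative, not values from the study:

```python
def bruun_retreat(sea_level_rise, profile_width, berm_height, closure_depth):
    """Classical Bruun rule: R = S * L / (B + h), all lengths in metres."""
    return sea_level_rise * profile_width / (berm_height + closure_depth)

# Illustrative profile: 0.5 m of rise over a 1000 m active profile width,
# with a 2 m berm height and 8 m closure depth.
retreat = bruun_retreat(0.5, 1000.0, 2.0, 8.0)
print(f"predicted coastline retreat: {retreat:.1f} m")  # 50.0 m
```

The GEE implementation described in the abstract extracts L, B, and h along a user-drawn profile line from topo-bathymetric data before applying this formula.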


2021 ◽  
Vol 10 (5) ◽  
pp. 277
Author(s):  
Xiaoen Li ◽  
Yang Xiao ◽  
Fenzhen Su ◽  
Wenzhou Wu ◽  
Liang Zhou

For the sustainable development of marine fishery resources, it is essential to comprehensively, accurately, and objectively obtain the spatial characteristics and evolution law of fishing intensity. However, previous studies have focused more on single data sources, such as AIS (Automatic Identification System) and VBD (VIIRS boat detection), to obtain fishing intensity information and, as such, have encountered some problems, such as insufficiently comprehensive data coverage of ships, non-uniform spatial distribution of data signal acquisition, and insufficient accuracy of the obtained fishing intensity information. The development of big data and remote sensing Earth observation technology has provided abundant data sources and technical support for the acquisition of fishing intensity data for marine fisheries. Against this background, this paper proposes a framework that integrates fishing vessel data from two sources (AIS, with high space-time granularity, and VBD, with a short revisit cycle and high sensitivity), in order to obtain such information by closely matching and fusing the vector point data of ship positions. With the help of this framework, and by indirectly representing fishing intensity through the point density of the fused data, the spatial characteristics and rules of fishing intensity in typical seasons (February, April, September, and November) in the northern South China Sea in 2018 were systematically analyzed and investigated. The results revealed the following: (1) Matching and fusing AIS and VBD data provides a better perspective for producing robust and accurate marine fishery intensity data. The two types of data have a low proximity match rate (approximately 1.89% and 6.73% of their respective inputs), and the matching success rate for fishing vessels in the data was 49.42%. (2) Single-source AIS data can be used for nearshore (50 to 70 km) marine fishery analysis, while VBD data reflect marine fishing objectively in space, showing obvious complementarity with AIS. (3) The fishing intensity grid data obtained from the integrated data show that high-intensity fishing in the study area was concentrated in the coastal area of Maoming City, Guangdong (0–50 km); the coastal area of Beihai, Guangxi (10–70 km); around Hainan Island in Zhangzhou (10–30 km); and the Sanya nearshore area (0–50 km). However, fishing intensity did not always decay with increasing offshore distance, for example at the trans-Vietnamese boundary in the Beibu Gulf, near the China–Vietnam Common Fisheries Area (50 km), where high-intensity fishing areas occur. (4) The obtained fishing intensity data (AIS, VBD, and AIS + VBD) were quantitatively analyzed, showing that the CV (coefficient of variation) of the monthly averages after fusing the two types of data was 0.995, indicating that the distribution of the combined data was better than before fusion (AIS = 0.879, VBD = 1.642). Therefore, the integration of AIS and VBD can meet the need for a more effective, comprehensive, and accurate fishing intensity analysis of marine fishery resources.
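The CV comparison in point (4) is a simple dispersion metric: the ratio of standard deviation to mean of the monthly intensity values. A minimal sketch with made-up monthly counts (the numbers below are illustrative, not the study's data):

```python
import numpy as np

def coefficient_of_variation(values):
    """CV = population standard deviation / mean."""
    v = np.asarray(values, dtype=float)
    return v.std() / v.mean()

# Hypothetical monthly detection counts for each source and their fusion
monthly = {
    "AIS":       [120, 340, 90, 410],
    "VBD":       [15, 220, 5, 380],
    "AIS + VBD": [130, 380, 95, 450],
}
for name, counts in monthly.items():
    print(f"{name}: CV = {coefficient_of_variation(counts):.3f}")
```

A lower CV means the monthly intensity values are more evenly distributed relative to their mean, which is how the abstract compares the fused data set against each single source.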

