Hydrothermal Alteration at the San Vito Area of the Campi Flegrei Geothermal System in Italy: Mineral Review and Geochemical Modeling

Minerals ◽  
2021 ◽  
Vol 11 (8) ◽  
pp. 810
Author(s):  
Monica Piochi ◽  
Barbara Cantucci ◽  
Giordano Montegrossi ◽  
Gilda Currenti

The Campi Flegrei geothermal system is set in one of the most famous and hazardous volcanic calderas in the world. The geothermal dynamics are suspected to play a crucial role in the monitored unrest phases and in eruption triggering as well. Numerical models in the literature do not properly account for the geochemical effects of fluid-rock interaction on the hydrothermal circulation, and this gap limits a full understanding of the dynamics. This paper focuses on fluid-rock interaction effects at Campi Flegrei and presents the information required for reactive transport simulations. In particular, we provide: (1) an extensive review of available data and new petrographic analyses of the San Vito cores, rearranged into a conceptual model useful for defining representative geochemical and petrophysical parameters of rock formations suitable for numerical simulations, and (2) an implemented thermodynamic and kinetic data set calibrated for the San Vito 1 well area, central to the geothermal reservoir. A preliminary 0D geochemical model, run with different CO2 contributions at high (165 °C) and low (85 °C) temperatures, provides a first reproduction of the hydrothermal reactions over time in the Campanian Ignimbrite formation, the most important deposit in the case study area.
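The abstract does not give the model equations, but 0D geochemical kinetics of this kind are typically driven by transition-state-theory rate laws of the form rate = k(T) · A · (1 − Q/K). A minimal sketch, with illustrative (not paper-derived) parameter values:

```python
import numpy as np

def dissolution_rate(k25, Ea, T_K, area, Q_over_K):
    """Transition-state-theory rate law: rate = k(T) * A * (1 - Q/K).

    k25      -- rate constant at 25 degC [mol m-2 s-1]
    Ea       -- activation energy [J mol-1]
    T_K      -- temperature [K]
    area     -- reactive surface area [m2 per kg of water]
    Q_over_K -- saturation ratio (ion activity product / equilibrium constant)
    """
    R = 8.314  # gas constant [J mol-1 K-1]
    k_T = k25 * np.exp(-(Ea / R) * (1.0 / T_K - 1.0 / 298.15))  # Arrhenius scaling
    return k_T * area * (1.0 - Q_over_K)  # > 0 dissolves, < 0 precipitates

# Illustrative run at the two temperatures considered in the study
for T_C in (85.0, 165.0):
    r = dissolution_rate(k25=1e-12, Ea=60e3, T_K=T_C + 273.15,
                         area=10.0, Q_over_K=0.2)
    print(f"{T_C:6.1f} degC -> rate = {r:.3e} mol/s per kg water")
```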

2019 ◽  
Vol 16 (32) ◽  
pp. 108-118
Author(s):  
Marcos Antônio KLUNK ◽  
Sudipta DASGUPTA ◽  
Mohuli DAS ◽  
Paulo Roberto WANDER

Numerical modeling of transport and reaction was used to understand the evolution of diagenetic processes and their importance in characterizing and predicting oil reservoir quality. Geochemical models are represented by numerical equations based on the physical-chemical properties of minerals. Many software packages are available to simulate geochemical systems and reactions. According to the numerical method, the codes fall into three distinct categories: coupled reaction-transport, speciation modeling, and batch mode. Simple systems have clear connections between inputs and outputs. Complex systems have multiple factors that provide a probability distribution of data inputs interacting in specific functions; the resulting outputs are therefore impossible to predict with complete accuracy. Several research groups have developed numerical codes for geochemical modeling. The critical factors for the use of these systems are (i) verification of the simulation results against empirical data sets and (ii) sensitivity analysis of these results, to construct general models with predictive power. This last factor is particularly important because it establishes the qualitative and quantitative impact of each parameter in the simulations. Thus, with a complete numerical diagenetic model, it is possible to perform various simulations, modifying one parameter or another to test sensitivity in the construction of different geological scenarios. This parameter set includes mineral composition and texture, fluid composition, paragenetic sequence, and burial history. This work presents fundamental concepts related to this topic as well as an analysis of the commercial software available.
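The one-at-a-time sensitivity loop described above is easy to make concrete. A minimal sketch, in which `quartz_cement_volume` is a hypothetical stand-in for a real diagenetic model:

```python
# One-at-a-time sensitivity sketch: vary a single parameter while holding
# the rest at baseline, and record the change in a model output.

def quartz_cement_volume(temperature_C, surface_area, time_Ma):
    # Toy placeholder: real codes integrate kinetic rate laws over burial history.
    return 1e-3 * surface_area * time_Ma * 2 ** ((temperature_C - 80) / 10)

baseline = dict(temperature_C=100.0, surface_area=5.0, time_Ma=20.0)
ref = quartz_cement_volume(**baseline)

for name in baseline:
    for factor in (0.9, 1.1):  # +/- 10 % perturbation of one parameter
        params = dict(baseline)
        params[name] *= factor
        out = quartz_cement_volume(**params)
        print(f"{name:14s} x{factor:.1f} -> output changes {100 * (out / ref - 1):+6.1f} %")
```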


Author(s):  
Gabriele Pieke

Art history has its own demands for recording visual representations. Objectivity and authenticity are the twin pillars of recording artistic data. As such, techniques relevant to epigraphic study, such as making line drawings, may not always be the best approach to an art historical study, which addresses, for example, questions about natural context and materiality of the artwork, the semantic, syntactic, and chronological relation between image and text, work procedures, work zones, and workshop traditions, and interactions with formal structures and beholders. Issues critical to collecting data for an art historical analysis include recording all relevant information without overcrowding the data set, creating neutral (i.e., not subjective) photographic images, collecting accurate color data, and, most critically, firsthand empirical study of the original artwork. A call for greater communication in Egyptology between epigraphy/palaeography and art history is reinforced by drawing attention to images as tools of communication and the close connection between the written word and figural art in ancient Egypt.


2021 ◽  
pp. 016555152110184
Author(s):  
Gunjan Chandwani ◽  
Anil Ahlawat ◽  
Gaurav Dubey

Document retrieval plays an important role in knowledge management, as it enables us to discover relevant information in existing data. This article proposes a cluster-based inverted indexing algorithm for document retrieval. First, pre-processing removes unnecessary and redundant words from the documents. Then, the documents are indexed by the cluster-based inverted indexing algorithm, which is developed by integrating the piecewise fuzzy C-means (piFCM) clustering algorithm with inverted indexing. Once the documents are indexed, user queries are matched using the Bhattacharyya distance. Finally, query optimisation is performed with the Pearson correlation coefficient, and the relevant documents are retrieved. The performance of the proposed algorithm is analysed on the WebKB and Twenty Newsgroups data sets. The analysis shows that the proposed algorithm offers high performance, with a precision of 1, recall of 0.70 and F-measure of 0.8235. The proposed document retrieval system retrieves the most relevant documents and speeds up the storing and retrieval of information.
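The abstract does not spell out how the Bhattacharyya distance is applied; a plausible minimal sketch treats the query and each document as normalised term-frequency distributions over a shared vocabulary:

```python
import numpy as np

def bhattacharyya_distance(p, q):
    """Bhattacharyya distance between two discrete distributions:
    BC = sum(sqrt(p_i * q_i)), D_B = -ln(BC)."""
    p = np.asarray(p, float)
    q = np.asarray(q, float)
    p = p / p.sum()  # normalise term frequencies to distributions
    q = q / q.sum()
    bc = np.sum(np.sqrt(p * q))
    return -np.log(max(bc, 1e-12))  # guard against log(0) for disjoint vocabularies

# Toy term-frequency vectors over a shared five-word vocabulary
doc   = [3, 0, 1, 5, 0]
query = [1, 0, 0, 2, 0]
print(bhattacharyya_distance(doc, query))  # smaller = more similar
```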


2021 ◽  
Vol 12 (1) ◽  
Author(s):  
Elmar Kotter ◽  
Luis Marti-Bonmati ◽  
Adrian P. Brady ◽  
Nandita M. Desouza

Abstract. Blockchain can be thought of as a distributed database that allows tracing of the origin of data and of who has manipulated a given data set in the past. Medical applications of blockchain technology are emerging. Blockchain has many potential applications in medical imaging, typically making use of the tracking of radiological or clinical data. Clinical applications include documenting the contribution of different "authors", including AI algorithms, to multipart reports; documenting the use of AI algorithms towards the diagnosis; enhancing the accessibility of relevant information in electronic medical records; and giving users better control over their personal health records. Applications in research include better traceability of image data within clinical trials and of the contributions of image and annotation data to the training of AI algorithms, thus enhancing privacy and fairness, and potentially making imaging data for AI available in larger quantities. Blockchain also allows for dynamic consenting and has the potential to empower patients by giving them better control over who has accessed their health data. There are also many potential applications for administrative purposes, such as keeping track of learning achievements or the surveillance of medical devices. This article gives a brief introduction to the basic technology and terminology of blockchain and concentrates on its potential applications in medical imaging.
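Not from the article, but the tamper evidence that makes such tracing possible reduces to hash chaining; a minimal sketch with a hypothetical imaging audit trail:

```python
import hashlib, json, time

def block_hash(block):
    # Hash everything except the stored hash itself
    body = {k: v for k, v in block.items() if k != "hash"}
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def make_block(payload, prev_hash):
    """Each record embeds the hash of its predecessor, so altering any
    earlier record invalidates every later one."""
    block = {"payload": payload, "prev_hash": prev_hash, "ts": time.time()}
    block["hash"] = block_hash(block)
    return block

# Hypothetical audit trail for one imaging study
chain = [make_block({"event": "image acquired", "study": "CT-001"}, "0" * 64)]
chain.append(make_block({"event": "AI annotation added", "study": "CT-001"},
                        chain[-1]["hash"]))

# Verification: every stored hash must match a recomputation and the link
for prev, blk in zip(chain, chain[1:]):
    assert blk["prev_hash"] == prev["hash"] == block_hash(prev), "chain tampered"
```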


Paleobiology ◽  
10.1666/12050 ◽  
2013 ◽  
Vol 39 (4) ◽  
pp. 628-647 ◽  
Author(s):  
Leah J. Schneider ◽  
Timothy J. Bralower ◽  
Lee R. Kump ◽  
Mark E. Patzkowsky

The Paleocene-Eocene Thermal Maximum (PETM; ca. 55.8 Ma) is thought to coincide with a profound but entirely transient change among nannoplankton communities throughout the ocean. Here we explore the ecology of nannoplankton during the PETM using multivariate analyses of a global data set based upon the distribution of taxa in time and space. We use these results, coupled with stable isotope data and geochemical modeling, to reinterpret the ecology of key genera. The multivariate analyses suggest that the community was perturbed significantly at coastal and high-latitude sites compared to the open ocean, and that the relative influence of temperature and nutrient availability on the assemblage varies regionally. The open ocean became more stratified and less productive during the PETM, and the oligotrophic assemblage responded primarily to changes in nutrient availability. By contrast, assemblages at the equator and in the Southern Ocean responded to temperature more than to nutrient reduction. In addition, the assemblage change at the PETM was not merely transient: there is evidence of adaptation and a long-term change in the nannoplankton community that persists after the PETM and results in the disappearance of a high-latitude assemblage. The long-term effect on communities caused by transient warming during the PETM has implications for modern-day climate change, suggesting similarly permanent changes to nannoplankton community structure as the oceans warm.


Author(s):  
André L. C. Fujarra ◽  
Rodolfo T. Gonçalves ◽  
Fernando Faria ◽  
Marcos Cueva ◽  
Kazuo Nishimoto ◽  
...  

A great deal of work has been published on the Spar vortex-induced motion (VIM) issue. There are, however, very few published works concerning the VIM of monocolumn platforms, partly because the concept is fairly recent and the first unit was only installed last year. In this context, the present paper presents a meticulous study of VIM for this type of platform concept. Model test experiments were performed to check the influence of many factors on VIM, such as different headings, wave/current coexistence, different drafts, suppression elements, and the presence of risers. The results presented here include in-line and cross-flow motion amplitudes, ratios of actual oscillation periods to natural periods, and motions in the XY plane. This is, therefore, a very extensive and important data set for comparison and validation of theoretical and numerical models for VIM prediction.


2021 ◽  
Author(s):  
Willemijn Pauw ◽  
Remco Hageman ◽  
Joris van den Berg ◽  
Pieter Aalberts ◽  
Hironori Yamaji ◽  
...  

Abstract. The integrity of the mooring system is of high importance in the offshore industry; in-service assessment of the loads in the mooring lines is, however, very challenging. Direct monitoring of mooring line loads through load cells or inclinometers requires subsea installation work and continuous data transmission. Solutions based on GPS and motion monitoring have been presented to overcome these limitations [1]. Monitoring based on GPS and motion data offers good practical benefits, because it can be conducted from an accessible area. The procedure relies on accurate numerical models to capture the relation between global motions and the response of the mooring system. In this paper, validation of this monitoring approach is presented for a single unit: a turret-moored unit operating in Australia for which in-service measurements of motions, GPS and line tensions are available. A numerical time-domain model of the mooring system was created and used to simulate mooring line tensions due to the measured FPSO motions. Using the measured unit response avoids the uncertainty of predicting the hydrodynamic response. Measurements from load cells in various mooring lines were compared against the simulation results to validate the approach. Three periods, comprising a total of five weeks of data, were examined in more detail: two with mild weather conditions and different dominant wave directions, and a third featuring heavy weather conditions. The data set and the numerical model are presented, measured and numerically calculated mooring line forces are compared, and the differences between them are examined. This validation study shows that in-service monitoring of mooring line loads through GPS and motion data provides a new opportunity for mooring integrity assessment with reduced monitoring system complexity.
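The abstract does not state which agreement measures were used; a minimal sketch of the kind of metrics such a measured-versus-simulated comparison typically reports, with synthetic stand-in data:

```python
import numpy as np

def validation_metrics(measured, simulated):
    """Agreement between load-cell tensions and model output (same units,
    same sampling): bias, root-mean-square error and linear correlation."""
    measured = np.asarray(measured, float)
    simulated = np.asarray(simulated, float)
    err = simulated - measured
    return {
        "bias": err.mean(),
        "rmse": np.sqrt((err ** 2).mean()),
        "corr": np.corrcoef(measured, simulated)[0, 1],
    }

# Toy example with synthetic tension records [kN]
rng = np.random.default_rng(0)
t_meas = 900 + 50 * np.sin(np.linspace(0, 20, 500))
t_sim = t_meas + rng.normal(0, 15, 500)  # model output with random scatter
print(validation_metrics(t_meas, t_sim))
```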


2015 ◽  
Vol 12 (5) ◽  
pp. 1339-1356 ◽  
Author(s):  
N. S. Jones ◽  
A. Ridgwell ◽  
E. J. Hendy

Abstract. Calcification by coral reef communities is estimated to account for half of all carbonate produced in shallow-water environments and more than 25% of the total carbonate buried in marine sediments globally. Production of calcium carbonate by coral reefs is therefore an important component of the global carbon cycle; it is also threatened by future global warming and other global change pressures. Numerical models of reefal carbonate production are needed to understand how carbonate deposition responds to environmental conditions, including atmospheric CO2 concentrations, in the past and into the future. Before any projections can be made, however, the basic test is to establish model skill in recreating present-day calcification rates. Here we evaluate four published model descriptions of reef carbonate production in terms of their predictive power, at both local and global scales. We also compile available global data on reef calcification to produce an independent observation-based data set for evaluating modelled carbonate budgets. The four calcification models are based on functions sensitive to combinations of light availability, aragonite saturation (Ωa) and temperature, and were implemented within a specifically developed global framework, the Global Reef Accretion Model (GRAM). No model was able to reproduce independent rate estimates of whole-reef calcification, and the temperature-only approach was the only model whose output correlated significantly with coral calcification rate observations. The absence of predictive power for whole-reef systems, even when the models are consistent at the scale of individual corals, points to the overriding importance of coral cover estimates in the calculations. Our work highlights the need for an ecosystem modelling approach, accounting for population dynamics in terms of mortality and recruitment and hence calcifier abundance, in estimating global reef carbonate budgets. In addition, validation of reef carbonate budgets is severely hampered by limited and inconsistent methodology in reef-scale observations.
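The abstract names the drivers but not the functional forms; one widely used form for the saturation-state dependence is a rate law like G = k(Ωa − 1)^n. A minimal sketch with illustrative constants (not taken from the paper):

```python
import numpy as np

def calcification_rate(omega_a, k=10.0, n=1.0):
    """Saturation-state rate law G = k * (Omega_a - 1)^n, clipped to zero
    when seawater is undersaturated with respect to aragonite.

    omega_a : aragonite saturation state (dimensionless)
    k       : rate constant (sets output units, e.g. mmol CaCO3 m-2 day-1)
    n       : reaction order
    """
    omega_a = np.asarray(omega_a, float)
    return k * np.clip(omega_a - 1.0, 0.0, None) ** n

print(calcification_rate([0.8, 2.0, 3.5]))  # rate rises with saturation state
```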


2019 ◽  
Author(s):  
Lisa Schilhan ◽  
Christian Kaier

In times of ever-increasing information overload, Academic Search Engine Optimization (ASEO) supports the findability of relevant information and contributes to the FAIR principles. It enhances efficiency in literature and data search and therefore plays an increasing role in the research lifecycle. ASEO is an important aspect to consider when preparing a scientific manuscript for publication. Authors can increase the visibility of their papers in library catalogues, databases, repositories and search engines with simple measures such as choosing informative author keywords. The more (meta-)data these search algorithms can use, the higher the probability that a data set or paper will show up in a result list. ASEO enables search algorithms and readers to quickly and unambiguously identify relevant content, thus also helping institutions to increase research visibility. In addition, authors and publishers share an interest in describing content in a way that makes it easy to find. Librarians, with their extensive knowledge and wealth of experience in literature research and metadata management, such as keyword assignment, can provide valuable advice on the importance of metadata that is as correct and complete as possible, and on suitable keywords for search algorithms. For this reason, the Publication Services at Graz University Library have recently started offering training and workshops for authors. The presentation will provide an introduction to strategies for enhancing the visibility and findability of online content, such as research articles, with some theoretical background as well as practical examples.


2021 ◽  
Author(s):  
Kevin Bellinguer ◽  
Robin Girard ◽  
Guillaume Bontron ◽  
Georges Kariniotakis

In recent years, the share of photovoltaic (PV) power in Europe has grown: the installed capacity increased from around 10 GW in 2008 to nearly 119 GW in 2018 [1]. Due to the intermittent nature of PV generation, new challenges arise regarding economic profitability and the safe operation of the power network. To overcome these issues, a special effort is being made to develop efficient PV generation forecasting tools.

For short-term PV production forecasting, past production observations are typically the main drivers. In addition, spatio-temporal (ST) inputs such as Satellite-Derived Surface Irradiance (SDSI) provide relevant information about the weather situation in the vicinity of the farm. Moreover, the literature shows that Numerical Weather Predictions (NWPs) provide relevant information about weather trends.

NWPs can be integrated into the forecasting process in two different ways. The most straightforward approach considers NWPs as explanatory input variables to the forecasting models; the atmosphere dynamics are then carried directly by the NWPs. The alternative considers NWPs as state variables: weather information is used to filter the training data set and obtain a coherent subset of PV production observations measured under weather conditions similar to those of the period to be predicted. This approach is based on analog methods, so the weather dynamics are implicitly contained in the PV production observations. This conditioned-learning approach makes it possible to perform local regressions and is adaptive, in the sense that the model training is conditioned on the weather situation.

The specialized literature focuses on spot NWPs, which permit finding situations that evolve in the same way but do not preserve ST patterns; in this context, the addition of SDSI features cannot make the most of the conditioning process. Ref. [3] proposes to use geopotential fields, which are wind drivers, as analog predictors.

In this work, we propose the following contributions to the state of the art.

We investigate the influence of spot NWPs on the performance of an auto-regressive (AR) model and a random forest model according to the two above-mentioned approaches: as additional explanatory features and/or as analog features. The analogy score proposed by [2] is used to find similar weather situations, and the model is then trained on the associated PV production observations. The results highlight that the linear model performs better with the conditioned approach, while the non-linear model performs better when fed with explanatory features.

Then, the similarity score is extended to gridded NWP data through the use of a principal component analysis. This method allows the learning to be conditioned on large-scale weather information. A comparison between the spot- and gridded-NWP conditioned approaches applied with the AR model highlights that gridded NWPs improve the contribution of SDSI to forecasting performance.

The proposed approaches are evaluated using 9 PV plants in France and a testing period of 12 months.

References

[1] IRENA - https://www.irena.org/Statistics/Download-Data

[2] Alessandrini, Delle Monache, et al. An analog ensemble for short-term probabilistic solar power forecast. Applied Energy, 2015. https://doi.org/10.1016/j.apenergy.2015.08.011

[3] Bellinguer, Girard, Bontron, Kariniotakis. Short-term Forecasting of Photovoltaic Generation based on Conditioned Learning of Geopotential Fields. 2020, UPEC. https://doi.org/10.1109/UPEC49904.2020.9209858
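A minimal sketch of the conditioned-learning idea described above, assuming a simple Euclidean analogy score in standardized NWP space (the actual score of [2] weights several variables over a time window) and hypothetical array inputs:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def analog_forecast(nwp_hist, pv_hist, nwp_now, lagged_pv_now, k=50):
    """Pick the k historical timestamps whose NWP features are closest to the
    current forecast, then fit a local auto-regressive model on just those
    analog observations.

    nwp_hist      (n, d) historical NWP features (e.g. irradiance, temperature)
    pv_hist       (n,)   PV production measured at those timestamps
    nwp_now       (d,)   NWP features for the time to predict
    lagged_pv_now (p,)   recent PV observations used as AR inputs
    """
    # Analogy score: Euclidean distance in standardized NWP space
    mu, sd = nwp_hist.mean(0), nwp_hist.std(0) + 1e-9
    dist = np.linalg.norm((nwp_hist - mu) / sd - (nwp_now - mu) / sd, axis=1)
    analogs = np.argsort(dist)[:k]

    # Local AR regression trained only on the analog situations
    p = len(lagged_pv_now)
    X = np.stack([pv_hist[i - p:i] for i in analogs if i >= p])
    y = np.array([pv_hist[i] for i in analogs if i >= p])
    model = LinearRegression().fit(X, y)
    return model.predict(np.asarray(lagged_pv_now).reshape(1, -1))[0]

# Toy demo with synthetic data (hypothetical arrays, for illustration only)
rng = np.random.default_rng(1)
nwp = rng.normal(size=(1000, 3))  # e.g. irradiance, temperature, cloud cover
pv = np.clip(2 * nwp[:, 0] + 5 + rng.normal(0, 0.3, 1000), 0, None)
print(analog_forecast(nwp, pv, nwp[-1], pv[-4:-1], k=100))
```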

