GAMMA RAY INDEX–SHALE VOLUME TRANSFORMS

2021 ◽  
Author(s):  
David Kennedy ◽  

Although a relationship between gamma ray log response and shale volume had been recognized since the introduction of gamma ray logging in the late 1930s and early 1940s, the formula for gamma ray index, and the equating of gamma ray index to shale volume, apparently appeared in the late 1960s. Contemporaneously there appeared three similar, alternative, non-linear relationships in 1969, 1970, and 1971. These functions were based upon observations and empirical graphical functions. Subsequently, these graphical functions were fit using very dissimilar-looking formulas. Only the 1969 data set was published in support of the graphical functions. No attempt to link these functions with a single formula was ever made, and only vague verbal explanations have been offered for the non-linear functions. Further, the 1969 publication was in Russian, partly mistranslated, and the mistranslation never corrected. Consequently, two of the resulting formulas are misapplied. In this article I review the four standard non-linear functions (i.e., Larionov's two, Stieber's, and Clavier's), examine their similarities, and show that a single function would serve the same purpose as all four, thereby eliminating a source of confusion for formation evaluators. When these shale (or clay) volume versus gamma ray index transforms are inverted to functions of gamma ray index versus shale (or clay) fractional volume, a remarkable property is revealed: the increment of radioactivity per unit shale volume decreases with increasing fractional shale volume. In other words, if one unit of shale per unit volume produces a gamma ray intensity of 10 API units, we would think it strange if 10 units of shale per unit volume produced only, say, 60 API units of gamma radiation (instead of 100). Yet this is the message contained in these functions. The cause of this phenomenon has been speculated upon, but only briefly and not often. To remedy this lack of speculation, I propose a physical model and give it mathematical form. This model is intended as a challenge to theoretically minded petrophysicists to falsify it, improve it, or propose an alternative and more realistic model. I also provide (in Appendix C) a digital listing of all the published graphical data in the literature that support the introduction of the non-linear shale (and clay) fractional volume – gamma ray index transforms.
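For readers who want the transforms discussed above in computable form, the sketch below collects the versions most commonly quoted in the petrophysics literature (Larionov's Tertiary and older-rock equations, Stieber's, and Clavier's) together with the linear gamma ray index. It is a minimal illustration, not code from this article: the clean-sand and shale baselines of 20 and 120 API units are hypothetical, and the coefficients are the textbook values rather than ones derived from the data in Appendix C.

import numpy as np

def gamma_ray_index(gr, gr_clean, gr_shale):
    # Linear gamma ray index: I_GR = (GR - GR_clean) / (GR_shale - GR_clean), clipped to [0, 1]
    return np.clip((gr - gr_clean) / (gr_shale - gr_clean), 0.0, 1.0)

def vsh_larionov_tertiary(igr):
    # Larionov transform for Tertiary (unconsolidated) rocks
    return 0.083 * (2.0 ** (3.7 * igr) - 1.0)

def vsh_larionov_older(igr):
    # Larionov transform for older (consolidated) rocks
    return 0.33 * (2.0 ** (2.0 * igr) - 1.0)

def vsh_stieber(igr):
    # Stieber transform
    return igr / (3.0 - 2.0 * igr)

def vsh_clavier(igr):
    # Clavier transform
    return 1.7 - np.sqrt(3.38 - (igr + 0.7) ** 2)

# Compare the transforms at a few gamma ray readings (API units),
# using hypothetical baselines of 20 API (clean sand) and 120 API (shale).
gr = np.array([20.0, 45.0, 70.0, 95.0, 120.0])
igr = gamma_ray_index(gr, gr_clean=20.0, gr_shale=120.0)
for name, f in [("Larionov (Tertiary)", vsh_larionov_tertiary),
                ("Larionov (older rocks)", vsh_larionov_older),
                ("Stieber", vsh_stieber),
                ("Clavier", vsh_clavier)]:
    print(name, np.round(f(igr), 3))

All four curves lie below the linear index at intermediate values, which is exactly the property the abstract highlights: each additional unit of shale adds less gamma ray intensity than the previous one.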

Kybernetes ◽  
2015 ◽  
Vol 44 (5) ◽  
pp. 788-805 ◽  
Author(s):  
Francisco Javier Rondan-Cataluña ◽  
Jorge Arenas-Gaitán ◽  
Patricio Esteban Ramírez-Correa

Purpose – The purpose of this paper is to provide a complete and chronological view of the evolution of the main acceptance and use of technology models, from the 1970s to the present day. Design/methodology/approach – A comparison of partial least squares (linear model) and WarpPLS (non-linear model) has been run for each technology acceptance model: TRA, TAM0, TAM1, TAM2, TAM3, UTAUT, and UTAUT2. The data set contains information from mobile internet users. Findings – The authors conclude that the UTAUT2 model has greater explanatory power than the other technology acceptance models (TAMs) in the sample of mobile internet users. Furthermore, all models achieve greater explanatory power using non-linear relationships than with the traditional linear approach. Originality/value – The vast majority of research published to date with regard to the Theory of Reasoned Action (TRA), the Technology Acceptance Model (TAM), and the Unified Theory of Acceptance and Use of Technology (UTAUT) is based on structural equation models assuming linear relationships between variables. The originality of this study is that it incorporates non-linear relationships and compares the same models using both approaches.
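As a side note, the core comparison (does allowing curvature raise explained variance?) can be illustrated with ordinary regression on synthetic data. This is only a minimal sketch of the idea, not the authors' PLS/WarpPLS estimation of latent-variable models; the variable roles and the data are invented.

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Synthetic stand-ins for a predictor (e.g. performance expectancy on a 1-7 scale)
# and an outcome (e.g. behavioural intention) related through a saturating curve.
x = rng.uniform(1, 7, size=(300, 1))
y = 1.5 * np.log(x[:, 0]) + rng.normal(0, 0.3, size=300)

linear = LinearRegression().fit(x, y)
nonlinear = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(x, y)

print("linear R^2:    ", round(linear.score(x, y), 3))
print("non-linear R^2:", round(nonlinear.score(x, y), 3))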


2021 ◽  
Author(s):  
Rebekah Gelpi ◽  
Nayan Saxena ◽  
George Lifchits ◽  
Daphna Buchsbaum ◽  
Christopher G. Lucas

People are capable of learning diverse functional relationships from data; nevertheless, they are most accurate when learning linear relationships, and deviate further from estimating the true relationship when presented with non-linear functions. We investigate whether, when given the opportunity to learn actively, people choose samples in an efficient fashion, and whether better sampling policies improve their ability to learn linear and non-linear functions. We find that, across multiple different function families, people make informative sampling choices consistent with a simple, low-effort policy that minimizes uncertainty at extreme values without requiring adaptation to evidence. While participants were most accurate at learning linear functions, those who more closely adhered to the simple sampling strategy also made better predictions across all non-linear functions. We discuss how the use of this heuristic might reflect rational allocation of limited cognitive resources.
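One way to make the reported heuristic concrete is a policy that fixes its queries in advance, starting from the endpoints of the input range, rather than adapting to the data seen so far. The sketch below is a hypothetical rendering of such a policy for illustration only; the function name and details are not taken from the paper.

import numpy as np

def extremes_first_queries(x_min, x_max, n_queries):
    # Non-adaptive, low-effort sampling: query the two extremes first,
    # then fill in evenly spaced interior points.
    if n_queries <= 2:
        return np.array([x_min, x_max][:n_queries])
    interior = np.linspace(x_min, x_max, n_queries)[1:-1]
    return np.concatenate(([x_min, x_max], interior))

print(extremes_first_queries(0.0, 10.0, 6))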


2021 ◽  
Vol 23 (4) ◽  
pp. 543-564
Author(s):  
Bowo Setiyono ◽  
Ahmad Maulin Naufa

This study examines whether liquidity, as measured by the net stable funding ratio (NSFR), affects bank performance and risk. Based on an annual panel data set of 2,909 banks from 127 countries, we find that the NSFR reduces both performance and risk. The results differ in robustness analyses across various settings (non-linear relationships, high versus low NSFR, and conventional versus Islamic banks). Overall, NSFR implementation brings benefits to banks around the world in the form of risk reduction rather than performance improvement.
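For context not given in the abstract: under Basel III the NSFR is defined as the ratio of available stable funding (ASF) to required stable funding (RSF), with a regulatory floor of 100 per cent. For example, a bank with ASF of 120 and RSF of 100 (in the same currency units) has an NSFR of 1.2, i.e. 120 per cent, so a higher NSFR indicates a more stable funding structure.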


2021 ◽  
Vol 62 (3) ◽  
pp. 46-52
Author(s):  

The existence of shale has a major effect on reservoir quality because it reduces both the porosity and the permeability of the rock. There are several types of shale, and they can be distributed in the sand in four different ways: laminated, structural, dispersed, or any combination of these. Each has different features and physical properties. Therefore, shale volume estimation is one of the most important and challenging tasks in formation evaluation. Many equations have been proposed to calculate shale volume from the gamma ray log; however, none of them can be considered the best method applicable to all case studies. This study aims to propose a new approach to estimating shale volume from well-logging data. Gamma ray and other logs were used as input data for an artificial neural network (ANN) to predict the shale volume. We apply this technique to the 1143 data set of the Ocean Drilling Program (ODP) in the East Sea. The authors compared the results with core data and found that using several logs together with an ANN gives a better estimate than conventional methods: it is more accurate and better reflects the trend of the actual shale volume.
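The abstract does not specify the network architecture, so the sketch below is only a hypothetical illustration of the general workflow (standardized log curves feeding a small feed-forward regressor, evaluated against held-out core-calibrated shale volume). The curve names, network size, and synthetic data are assumptions, not the authors' setup.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
# Stand-ins for four log curves (e.g. gamma ray, bulk density, neutron porosity, sonic)
X = rng.normal(size=(1000, 4))
# Stand-in for core-calibrated shale volume in [0, 1]
vsh = 1.0 / (1.0 + np.exp(-X @ np.array([1.2, 0.6, 0.8, -0.4])))

X_train, X_test, y_train, y_test = train_test_split(X, vsh, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8),
                                   max_iter=2000, random_state=0))
model.fit(X_train, y_train)
print("held-out R^2:", round(model.score(X_test, y_test), 3))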


2020 ◽  
Vol 16 (8) ◽  
pp. 1088-1105
Author(s):  
Nafiseh Vahedi ◽  
Majid Mohammadhosseini ◽  
Mehdi Nekoei

Background: Poly(ADP-ribose) polymerases (PARPs) are a nuclear enzyme superfamily present in eukaryotes. Methods: In the present report, some efficient linear and non-linear methods, including multiple linear regression (MLR), support vector machine (SVM), and artificial neural networks (ANN), were successfully used to develop and establish quantitative structure-activity relationship (QSAR) models capable of predicting pEC50 values of tetrahydropyridopyridazinone derivatives as effective PARP inhibitors. Principal component analysis (PCA) was used for a rational division of the whole data set and selection of the training and test sets. A genetic algorithm (GA) variable selection method was employed to select, from the large pool of calculated descriptors, the optimal subset of descriptors that contribute most significantly to the overall inhibitory activity. Results: The accuracy and predictability of the proposed models were further confirmed using cross-validation, validation through an external test set, and Y-randomization (chance correlation) approaches. Moreover, an exhaustive statistical comparison was performed on the outputs of the proposed models. The results revealed that the non-linear modeling approaches, SVM and ANN, provide much greater predictive capability. Conclusion: Among the constructed models, and in terms of the root mean square error of prediction (RMSEP), the cross-validation coefficients (Q2 LOO and Q2 LGO), and the R2 and F-statistic values for the training set, the predictive power of the GA-SVM approach was better. However, compared with MLR and SVM, the statistical parameters for the test set were better with the GA-ANN model.
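A much-simplified version of this kind of workflow is sketched below: a descriptor matrix and pEC50 vector are split into training and test sets, an SVM regressor is assessed by cross-validation on the training set, and RMSEP is computed on the external test set. It is an illustration on invented data; it omits the PCA-based splitting and the GA descriptor selection described above, and uses 5-fold cross-validation as a lighter stand-in for the LOO/LGO statistics.

import numpy as np
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(2)
# Hypothetical descriptor matrix (rows: molecules, columns: descriptors) and pEC50 values
X = rng.normal(size=(80, 20))
pec50 = X[:, 0] - 0.5 * X[:, 3] + 0.2 * rng.normal(size=80)

X_tr, X_te, y_tr, y_te = train_test_split(X, pec50, test_size=0.25, random_state=0)
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))

q2 = cross_val_score(model, X_tr, y_tr, cv=5, scoring="r2").mean()  # cross-validated R^2
model.fit(X_tr, y_tr)
rmsep = float(np.sqrt(mean_squared_error(y_te, model.predict(X_te))))
print("cross-validated R^2:", round(q2, 3), " RMSEP:", round(rmsep, 3))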


2021 ◽  
Vol 503 (3) ◽  
pp. 4032-4049
Author(s):  
Antonio Ambrosone ◽  
Marco Chianese ◽  
Damiano F G Fiorillo ◽  
Antonio Marinelli ◽  
Gennaro Miele ◽  
...  

ABSTRACT Starburst galaxies, which are known as 'reservoirs' of high-energy cosmic rays, can represent an important high-energy neutrino 'factory' contributing to the diffuse neutrino flux observed by IceCube. In this paper, we revisit the constraints affecting the neutrino and gamma-ray hadronuclear emissions from this class of astrophysical objects. In particular, we go beyond the standard prototype-based approach leading to a simple power-law neutrino flux, and investigate a more realistic model based on a data-driven blending of spectral indices, thereby capturing the observed changes in the properties of individual emitters. We then perform a multi-messenger analysis considering the extragalactic gamma-ray background (EGB) measured by Fermi-LAT and different IceCube data samples: the 7.5-yr high-energy starting events (HESE) and the 6-yr high-energy cascade data. Along with starburst galaxies, we take into account the contributions from blazars and radio galaxies, as well as the secondary gamma rays from electromagnetic cascades. Remarkably, we find that, unlike the highly constrained prototype scenario, the spectral-index blending allows starburst galaxies to account for up to 40 per cent of the HESE events at 95.4 per cent CL, while satisfying the limit on the non-blazar EGB component. Moreover, values of O(100 PeV) for the maximal energy of cosmic rays accelerated by supernova remnants inside the starburst are disfavoured in our scenario. In broad terms, our analysis points out that better modelling of astrophysical sources could alleviate the tension between the interpretation of neutrino and gamma-ray data.
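The difference between a single prototype spectrum and a blend of spectral indices can be seen with a toy calculation; the sketch below is a schematic illustration with arbitrary normalisations and an assumed spread of indices, not the data-driven blending used in the paper.

import numpy as np

E = np.logspace(3, 7, 50)  # energy grid in GeV (illustrative)
gammas = np.random.default_rng(3).normal(2.2, 0.25, size=1000)  # assumed spread of source indices

prototype = E ** -2.2                                  # single power law at the mean index
blended = np.mean([E ** -g for g in gammas], axis=0)   # population-averaged spectrum

# The blended spectrum is harder at high energy than the mean-index power law,
# because the flatter members of the population dominate there.
print("ratio at highest energy:", round(float(blended[-1] / prototype[-1]), 2))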


1994 ◽  
Vol 47 (9) ◽  
pp. 1771 ◽  
Author(s):  
PK Kipkemboi ◽  
AJ Easteal

The empirical solvent polarity parameters E_NR and E_T for the solvatochromic compounds Nile Red (1) and pyridinium-N-phenoxide betaine (2), respectively, have been determined as a function of composition for water + t-butyl alcohol and water + t-butylamine binary mixtures, over the whole composition range at 298 K. For both systems the two parameters vary with composition in a strongly non-linear fashion, and the polarity of the mixture decreases with increasing proportion of the organic cosolvent. The non-linear variation of the polarity parameters is attributed to water-cosolvent hydrophobic interactions at low cosolvent contents, and to hydrogen-bonding interactions at higher cosolvent contents. Permittivity and refractive index have also been measured at 298 K for both systems, and both properties are strongly non-linear functions of composition.


1976 ◽  
Vol 18 (1) ◽  
pp. 51-61
Author(s):  
Yasuhiro Kobayashi ◽  
Masaaki Ohkita ◽  
Michio Inoue ◽  
Masao Nakamura