Towards a pragmatic use of statistics in ecology

PeerJ ◽  
2021 ◽  
Vol 9 ◽  
pp. e12090
Author(s):  
Leonardo Braga Castilho ◽  
Paulo Inácio Prado

Although null hypothesis testing (NHT) is the primary method for analyzing data in many natural sciences, it has been increasingly criticized. Recently, approaches based on information theory (IT) have become popular and are held by many to be superior because they enable researchers to properly assess the strength of the evidence that data provide for competing hypotheses. Many studies have compared IT and NHT in the context of model selection and stepwise regression, but a systematic comparison of the most basic uses of statistics by ecologists is still lacking. We used computer simulations to compare how both approaches perform in four basic test designs (t-test, ANOVA, correlation tests, and multiple linear regression). Performance was measured by the proportion of simulated samples for which each method provided the correct conclusion (power), the proportion of detected effects with a wrong sign (S-error), and the mean ratio of the estimated effect to the true effect (M-error). We also checked whether the p-value from significance tests correlated with a measure of strength of evidence, the Akaike weight. In general, both methods performed equally well. The concordance is explained by the monotonic relationship between p-values and evidence weights in simple designs, which agrees with analytic results. Our results show that researchers can agree on the conclusions drawn from a data set even when they are using different statistical approaches. By focusing on the practical consequences of inferences, such a pragmatic view of statistics can promote insightful dialogue among researchers on how to find common ground from different pieces of evidence. A less dogmatic view of statistical inference can also help to broaden the debate about the role of statistics in science to the entire path that leads from a research hypothesis to a statistical hypothesis.
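The monotonic link between p-values and Akaike weights reported above can be sketched with a toy simulation. This is an illustrative reconstruction, not the authors' code: the least-squares AIC formula AIC = n·ln(RSS/n) + 2k and the sample and effect sizes below are assumptions of the sketch.

```python
import math
import random

def t_squared(x, y):
    """Squared two-sample t statistic (pooled variance)."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    rss = sum((v - mx) ** 2 for v in x) + sum((v - my) ** 2 for v in y)
    sp2 = rss / (nx + ny - 2)
    return (mx - my) ** 2 / (sp2 * (1 / nx + 1 / ny))

def akaike_weight_alt(x, y):
    """Akaike weight of the two-means model vs. the common-mean model,
    using the least-squares form AIC = n*ln(RSS/n) + 2k."""
    n = len(x) + len(y)
    grand = (sum(x) + sum(y)) / n
    rss0 = sum((v - grand) ** 2 for v in x + y)   # null: one common mean
    mx, my = sum(x) / len(x), sum(y) / len(y)
    rss1 = sum((v - mx) ** 2 for v in x) + sum((v - my) ** 2 for v in y)
    aic0 = n * math.log(rss0 / n) + 2 * 2         # k = mean + variance
    aic1 = n * math.log(rss1 / n) + 2 * 3         # k = two means + variance
    return 1.0 / (1.0 + math.exp((aic1 - aic0) / 2.0))

random.seed(1)
samples = [([random.gauss(0, 1) for _ in range(20)],
            [random.gauss(d, 1) for _ in range(20)]) for d in (0.0, 0.3, 0.6, 1.0)]
stats = sorted((t_squared(x, y), akaike_weight_alt(x, y)) for x, y in samples)
# For a fixed sample size the weight is a monotone function of t^2
# (and hence of the p-value), which is what drives the concordance.
assert all(w1 <= w2 for (_, w1), (_, w2) in zip(stats, stats[1:]))
```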

2018 ◽  
Author(s):  
Leonardo Castilho ◽  
Paulo Inácio Prado

Abstract. Although null hypothesis testing (NHT) is the primary method for analyzing data in many natural sciences, it has been increasingly criticized. Recently, a new method based on information theory (IT) has become popular and is held by many to be superior for many reasons, not least because it enables researchers to properly assess the strength of the evidence that data provide for competing hypotheses. Many studies have compared IT and NHT in the context of model selection and stepwise regression, but a systematic comparison of the simplest, yet realistic, uses of statistics by ecologists is still lacking. We used computer simulations to compare how both methods perform in four basic test designs (t-test, ANOVA, correlation tests, and multiple linear regression). Performance was measured by the proportion of simulated samples for which each method provided the correct conclusion (power), the proportion of detected effects with a wrong sign (S-error), and the mean ratio of the estimated effect to the true effect (M-error). We also checked whether the p-value from significance tests correlated with a measure of strength of evidence, the Akaike weight. In most cases both methods performed equally well. The concordance is explained by the monotonic relationship between p-values and evidence weights in simple designs, which agrees with analytic results. Our results show that researchers can agree on the conclusions drawn from a data set even when they are using different statistical approaches. By focusing on the practical consequences of inferences, such a pragmatic view of statistics can promote insightful dialogue among researchers on how to find common ground from different pieces of evidence. A less dogmatic view of statistical inference can also help to broaden the debate about the role of statistics in science to the entire path that leads from a research hypothesis to a statistical hypothesis.


2015 ◽  
Vol 12 (1) ◽  
pp. 1-4 ◽  
Author(s):  
P. Blanc ◽  
C. Coulaud ◽  
L. Wald

Abstract. New Caledonia has experienced a decrease in surface solar irradiation since 2004. It is of the order of 4% of the mean yearly irradiation over the 10-year period 2004–2013 and amounts to −9 W m−2. The preeminent role of changes in cloud cover and, to a lesser extent, of changes in aerosol optical depth in the decrease in yearly irradiation is evidenced. The study highlights the role of data sets offering worldwide coverage, such as the ICOADS (International Comprehensive Ocean-Atmosphere Data Set) of NOAA and the MACC (Monitoring Atmosphere Composition and Climate) data sets combined with the McClear model, in understanding changes in solar radiation and in planning large solar energy plants.
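As a quick consistency check on the two figures quoted (simple arithmetic only, not part of the study's data processing), a decrease of 9 W m−2 that represents about 4% of the mean yearly irradiation implies a mean of roughly 225 W m−2:

```python
decrease_w_m2 = 9.0   # reported absolute decrease
fraction = 0.04       # reported relative decrease (~4%)
implied_mean_w_m2 = decrease_w_m2 / fraction
print(implied_mean_w_m2)  # 225.0 W m^-2
```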


1984 ◽  
Vol 58 (2) ◽  
pp. 123-127 ◽  
Author(s):  
Gary Smith

Abstract. Of the 2,652 Lymnaea truncatula collected from sites in Cumbria and Gwynedd during 1973–75, when the prevalence of fascioliasis in the primary host was low and declining, only 123 were infected with Fasciola hepatica. Dissection of these snails revealed that the proportion infected, the mean redial burden and the proportion of mature rediae in each snail increased with shell length. The results are compared with a similar data set acquired when the prevalence of infection in both the primary and intermediate hosts was uncharacteristically high. Although the results are qualitatively similar, there are important quantitative differences. The mean redial burden of infected snails at times of high disease prevalence was generally twice as high as that reported in the present study. It is suggested that differences in habitat microclimate could not account entirely for the observed differences in redial burden, and the role of multiple miracidial infection is discussed.
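The overall prevalence in the intermediate host implied by the counts above can be computed directly (a trivial arithmetic sketch, not part of the original analysis):

```python
collected = 2652  # L. truncatula dissected
infected = 123    # found carrying F. hepatica
prevalence_pct = 100.0 * infected / collected
print(round(prevalence_pct, 1))  # 4.6 (% of snails infected)
```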


2014 ◽  
Vol 31 (1) ◽  
pp. 91-124
Author(s):  
Michael Dorfman

In a series of works published over a period of twenty-five years, C.W. Huntington, Jr. has developed a provocative and radical reading of Madhyamaka (particularly Early Indian Madhyamaka) inspired by ‘the insights of post-Wittgensteinian pragmatism and deconstruction’ (1993, 9). This article examines the body of Huntington’s work through the filter of his seminal 2007 publication, ‘The Nature of the Mādhyamika Trick’, a polemic aimed at a quartet of other recent commentators on Madhyamaka (Robinson, Hayes, Tillemans and Garfield) who attempt ‘to read Nāgārjuna through the lens of modern symbolic logic’ (2007, 103), a project which is the ‘end result of a long and complex scholastic enterprise … [which] can be traced backwards from contemporary academic discourse to fifteenth century Tibet, and from there into India’ (2007, 111) and which Huntington sees as distorting the Madhyamaka project, which was not aimed at ‘command[ing] assent to a set of rationally grounded doctrines, tenets, or true conclusions’ (2007, 129). This article begins by explicating some disparate strands found in Huntington’s work, which I connect under a radicalized notion of ‘context’. These strands consist of a contextualist/pragmatic theory of truth (as opposed to a correspondence theory of truth), a contextualist epistemology (as opposed to one relying on foundationalist epistemic warrants), and a contextualist ontology where entities are viewed as necessarily relational (as opposed to possessing a context-independent essence). I then use these linked theories to find fault with Huntington’s own readings of Candrakīrti and Nāgārjuna, arguing that Huntington misreads the semantic context of certain key terms (tarka, dṛṣṭi, pakṣa and pratijñā) and fails to follow the implications of Nāgārjuna and Candrakīrti’s reliance on the role of the pramāṇas in constituting conventional reality. Thus, I find that Huntington’s imputation of a rejection of logic and rational argumentation to Nāgārjuna and Candrakīrti is unwarranted. Finally, I offer alternate readings of the four contemporary commentators selected by Huntington, using the conceptual apparatus developed earlier to dismiss Robinson’s and Hayes’s view of Nāgārjuna as a charlatan relying on logical fallacies, and to find common ground between Huntington’s project and the view of Nāgārjuna developed by Tillemans and Garfield as a thinker committed to using reason to reach, through rational analysis, ‘the limits of thought.’


2020 ◽  
Author(s):  
Marc Philipp Bahlke ◽  
Natnael Mogos ◽  
Jonny Proppe ◽  
Carmen Herrmann

Heisenberg exchange spin coupling between metal centers is essential for describing and understanding the electronic structure of many molecular catalysts, metalloenzymes, and molecular magnets with potential applications in information technology. We explore the machine-learnability of exchange spin coupling, which has not yet been studied. We employ Gaussian process regression since it can potentially deal with small training sets (as likely associated with the rather complex molecular structures required for exploring spin coupling) and since it provides uncertainty estimates (“error bars”) along with predicted values. We compare a range of descriptors and kernels for 257 small dicopper complexes and find that a simple descriptor based on chemical intuition, consisting only of copper-bridge angles and copper-copper distances, clearly outperforms several more sophisticated descriptors when it comes to extrapolating towards larger, experimentally relevant complexes. Exchange spin coupling turns out to be as easy to learn as the polarizability, while learning dipole moments is much harder. The strength of the sophisticated descriptors lies in their ability to linearize structure-property relationships, to the point that a simple linear ridge regression performs just as well as the kernel-based machine-learning model for our small dicopper data set. The superior extrapolation performance of the simple descriptor is unique to exchange spin coupling, reinforcing the crucial role of choosing a suitable descriptor and highlighting the interesting question of the role of chemical intuition versus systematic or automated feature selection for machine learning in chemistry and materials science.
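A minimal pure-Python sketch of the approach described above: Gaussian process regression with an RBF kernel over a simple angle/distance descriptor, returning a prediction together with an uncertainty estimate. The training descriptors and coupling values below are invented for illustration, and the kernel, length scale, and noise level are likewise assumptions, not the settings used in the study.

```python
import math

def rbf(a, b, length=1.0):
    """Squared-exponential (RBF) kernel between two descriptor vectors."""
    return math.exp(-sum((x - y) ** 2 for x, y in zip(a, b)) / (2.0 * length ** 2))

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def gp_predict(X, y, x_star, length=1.0, noise=1e-6):
    """GP regression mean and variance at x_star (zero prior mean)."""
    K = [[rbf(a, b, length) + (noise if i == j else 0.0)
          for j, b in enumerate(X)] for i, a in enumerate(X)]
    alpha = solve(K, y)                    # alpha = K^-1 y
    k_star = [rbf(x_star, a, length) for a in X]
    mean = sum(ks * al for ks, al in zip(k_star, alpha))
    v = solve(K, k_star)
    var = rbf(x_star, x_star, length) - sum(ks * vi for ks, vi in zip(k_star, v))
    return mean, max(var, 0.0)

# Hypothetical training set: descriptor = (Cu-bridge-Cu angle / 100 deg,
# Cu...Cu distance in Angstrom); couplings in cm^-1 are invented numbers.
X = [(0.90, 2.95), (1.00, 3.05), (1.05, 3.20)]
J = [-60.0, -5.0, 40.0]

mean, var = gp_predict(X, J, (0.95, 3.00), length=0.3)
# The GP returns a prediction plus an "error bar"; the variance grows as
# the query point moves away from the training data.
```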


2012 ◽  
pp. 66-77 ◽  
Author(s):  
I. A. Lavrinenko ◽  
O. V. Lavrinenko ◽  
D. V. Dobrynin

The satellite images show that the area of marshes in the Kolokolkova Bay was not stable during the period from 1973 to 2011. Until 2010 it varied from 357 to 636 ha. After a severe storm on July 24–25, 2010, the total area of marshes was reduced to 43–50 ha. The mean value of NDVI for the studied marshes, reflecting the green biomass, varied from 0.13 to 0.32 before the storm in 2010; after the storm the NDVI decreased to 0.10, and in 2011 to 0.03. A comparative analysis of the species composition and structure of plant communities described in 2002 and 2011 allowed us to evaluate the vegetation changes of marshes at the different topographic levels. They are the following: a total destruction of plant communities of the ass. Puccinellietum phryganodis and ass. Caricetum subspathaceae on low and middle marshes; an increasing role of halophytic species in plant communities of the ass. Caricetum glareosae vic. Calamagrostis deschampsioides subass. typicum on middle marshes; some changes in the species composition and structure of plant communities of the ass. Caricetum glareosae vic. Calamagrostis deschampsioides subass. festucetosum rubrae on high marshes and the ass. Parnassio palustris–Salicetum reptantis in the transition zone between marshes and tundra, without changes in their syntaxonomy; and the death of the moss cover in plant communities of the ass. Caricetum mackenziei var. Warnstorfia exannulata on brackish coastal bogs. The possible reasons for this dramatic vegetation dynamics are discussed. The dating of the storm makes it possible to observe the directions and rates of succession of the marsh vegetation.
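NDVI, used above as a proxy for green biomass, is the normalized difference of near-infrared and red reflectance. A minimal sketch, with reflectance values invented to reproduce the reported range:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from band reflectances."""
    return (nir - red) / (nir + red)

# Invented reflectances: a healthy marsh pixel near the pre-storm
# maximum (~0.32) and a devastated post-storm pixel (~0.03).
print(round(ndvi(0.33, 0.17), 2))    # 0.32
print(round(ndvi(0.206, 0.194), 2))  # 0.03
```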


2004 ◽  
Vol 35 (2) ◽  
pp. 119-137 ◽  
Author(s):  
S.D. Gurney ◽  
D.S.L. Lawrence

Seasonal variations in the stable isotopic composition of snow and meltwater were investigated in a sub-arctic, mountainous, but non-glacial, catchment at Okstindan in northern Norway based on analyses of δ18O and δD. Samples were collected during four field periods (August 1998, April 1999, June 1999 and August 1999) at three sites lying on an altitudinal transect (740–970 m a.s.l.). Snowpack data display an increase in the mean values of δ18O (from −13.51 to −11.49‰ between April and August), as well as a decrease in variability through the melt period. Comparison with a regional meteoric water line indicates that the slope of the δ18O–δD line for the snowpacks decreases over the same period, dropping from 7.49 to approximately 6.2. This change points to the role of evaporation in snowpack ablation and is confirmed by the vertical profile of deuterium excess. Snowpack seepage data, although limited, also suggest reduced values of δD, as might be associated with local evaporation during meltwater generation. In general, meltwaters were depleted in δ18O relative to the source snowpack at the peak of the melt (June), but later in the year (August) the difference between the two was not statistically significant. The diurnal pattern of isotopic composition indicates that the most depleted meltwaters coincide with the peak in temperature and, hence, meltwater production.
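Two standard quantities used above can be sketched directly: the local δ18O–δD slope via least squares, and the deuterium excess d = δD − 8·δ18O relative to the global meteoric water line. The δ values below are synthetic illustrations, not the catchment data:

```python
def lsq_slope(x, y):
    """Least-squares slope of y regressed on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))

def d_excess(d18o, dD):
    """Deuterium excess relative to the global meteoric water line."""
    return dD - 8.0 * d18o

# Synthetic snowpack values (per mil) placed on a line of slope 6.2,
# i.e. shallower than the ~8 of a meteoric water line, as after evaporation.
d18o = [-14.0, -13.0, -12.0, -11.0]
dD = [6.2 * v - 10.0 for v in d18o]
print(round(lsq_slope(d18o, dD), 2))          # 6.2
print(round(d_excess(-13.51, -98.0), 2))      # 10.08 (the dD value is invented)
```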


Author(s):  
Mitchell Green

We first correct some errors in Lepore and Stone’s discussion of speaker meaning and its relation to linguistic meaning. With a proper understanding of those notions and their relation, we may then motivate a liberalization of speaker meaning that includes overtly showing one’s psychological state. I then distinguish this notion from that of expression, which, although communicative, is less cognitively demanding than speaker meaning since it need not be overt. This perspective in turn enables us to address Lepore and Stone’s broadly Davidsonian view of figurative language, which rightly emphasizes the role of imagination and perspective-taking associated with such language, but mistakenly suggests it is sui generis relative to other types of pragmatic process, and beyond the realm of communication. Figurative utterances may influence conversational common ground, and may be assessed for their aptness; they also have a characteristically expressive role that a Davidsonian view lacks the resources to explain.


Author(s):  
Michael W. Pratt ◽  
M. Kyle Matsuba

Chapter 6 reviews research on the topic of vocational/occupational development in relation to the McAdams and Pals tripartite personality framework of traits, goals, and life stories. Distinctions between types of motivations for the work role (as a job, career, or calling) are particularly highlighted. The authors then turn to research from the Futures Study on work motivations and their links to personality traits, identity, generativity, and the life story, drawing on analyses and quotes from the data set. To illustrate the key concepts from this vocation chapter, the authors end with a case study on Charles Darwin’s pivotal turning point, his round-the-world voyage as naturalist for the HMS Beagle. Darwin was an emerging adult in his 20s at the time, and we highlight the role of this journey as a turning point in his adult vocational development.


Water ◽  
2021 ◽  
Vol 13 (13) ◽  
pp. 1787
Author(s):  
Leena J. Shevade ◽  
Franco A. Montalto

Green infrastructure (GI) is viewed as a sustainable approach to stormwater management that is being implemented rapidly, outpacing the ability of researchers to compare the effectiveness of alternative design configurations. This paper investigated inflow data collected at four GI inlets. The performance of these four GI inlets, all of which were engineered with the same inlet lengths and shapes, was evaluated through field monitoring. A forensic interpretation of the observed inlet performance was conducted using conclusions regarding the role of inlet clogging and inflow rate described in previously published work. The mean inlet efficiency (meanPE), which represents the percentage of tributary area runoff that enters the inlet, was 65% for the Nashville inlet, while at Happyland the NW inlet averaged 30%, the SW inlet 25%, and the SE inlet 10%, considering all recorded events during the monitoring periods. The analysis suggests that inlet clogging was the main reason for lower inlet efficiency at the SW and NW inlets, while for the SE inlet, performance was compromised by a reverse cross slope of the street. Spatial variability of rainfall, measurement uncertainty, uncertain tributary catchment area, and inlet depression characteristics are also correlated with inlet PE. The research suggests that placement of monitoring sensors should consider low-flow conditions and a strategy to measure them. Additional research on the role of various maintenance protocols in inlet hydraulics is recommended.
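Inlet efficiency (PE) as defined above is simply captured inflow divided by tributary-area runoff. A minimal sketch, where the runoff coefficient and the event numbers are illustrative assumptions rather than monitored values:

```python
def inlet_efficiency(inflow_m3, rainfall_m, area_m2, runoff_coeff=1.0):
    """Percentage of tributary-area runoff captured by the inlet."""
    runoff_m3 = rainfall_m * area_m2 * runoff_coeff
    return 100.0 * inflow_m3 / runoff_m3

# A 10 mm event over a 1000 m^2 tributary area yields 10 m^3 of runoff;
# if 6.5 m^3 enters the inlet, PE is 65% (a Nashville-like case).
print(inlet_efficiency(6.5, 0.010, 1000.0))  # 65.0
```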

