A Statistical Approach to Difference-Delay Equation Modelling in Ecology — Two Case Studies

Author(s):  
K. S. Lim ◽  
H. Tong


2016 ◽  
Vol 73 (8) ◽  
pp. 1251-1260 ◽  
Author(s):  
Suresh Andrew Sethi ◽  
Catherine Bradley

Missed counts are commonplace when enumerating fish passing a weir. Typically “connect-the-dots” linear interpolation is used to impute missed passage; however, this method fails to characterize uncertainty about estimates and cannot be implemented when the tails of a run are missed. Here, we present a statistical approach to imputing missing passage at weirs that addresses these shortcomings, consisting of a parametric run curve model to describe the smoothed arrival dynamics of a fish population and a process variation model to describe the likelihood of observed data. Statistical arrival models are fit in a Bayesian framework and tested with a suite of missing data simulation trials and against a selection of Pacific salmon (Oncorhynchus spp.) case studies from the Yukon River drainage, Alaska, USA. When compared against linear interpolation, statistical arrival models produced equivalent or better expected accuracy and a narrower range of bias outcomes. Statistical arrival models also successfully imputed missing passage counts for scenarios where the tails of a run were missed.
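The run-curve idea can be sketched in code: fit a smooth parametric arrival curve to the observed daily counts and read the missed days off the fitted curve, rather than interpolating between neighbouring observations. The sketch below is illustrative only; it uses simulated counts, a Gaussian-shaped run curve, and a maximum-likelihood Poisson fit in place of the Bayesian run curve and process variation models described in the paper.

```python
# Illustrative sketch (not the authors' model): impute missed weir counts
# from a Gaussian-shaped run curve fit by maximum likelihood to simulated data.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import poisson

rng = np.random.default_rng(1)
days = np.arange(1, 61)
# Simulated "true" run: roughly 5000 fish in total, peak passage on day 30.
true_curve = 5000 * np.exp(-0.5 * ((days - 30) / 7) ** 2) / (7 * np.sqrt(2 * np.pi))
counts = rng.poisson(true_curve).astype(float)
counts[:4] = np.nan       # missed early tail of the run
counts[25:32] = np.nan    # missed block of days mid-run

def neg_log_lik(params):
    """Poisson likelihood of the observed days under a Gaussian run curve."""
    log_total, peak_day, log_spread = params
    spread = np.exp(log_spread)
    curve = (np.exp(log_total)
             * np.exp(-0.5 * ((days - peak_day) / spread) ** 2)
             / (spread * np.sqrt(2.0 * np.pi)))
    curve = np.maximum(curve, 1e-9)        # guard against log(0) in the tails
    obs = ~np.isnan(counts)
    return -poisson.logpmf(counts[obs], curve[obs]).sum()

fit = minimize(neg_log_lik, x0=[np.log(3000.0), 25.0, np.log(10.0)],
               method="Nelder-Mead")
log_total, peak_day, log_spread = fit.x
spread = np.exp(log_spread)
fitted = (np.exp(log_total)
          * np.exp(-0.5 * ((days - peak_day) / spread) ** 2)
          / (spread * np.sqrt(2.0 * np.pi)))

# Impute missed days from the fitted curve instead of connecting the dots.
imputed = np.where(np.isnan(counts), fitted, counts)
print(f"estimated total passage: {imputed.sum():.0f}")
```

Because the curve is defined over the whole season, the same fit also fills in a missed run tail, which linear interpolation cannot do; a full Bayesian fit would additionally propagate parameter uncertainty into the imputed passage.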


2003 ◽  
Vol 69 ◽  
pp. 23-38 ◽  
Author(s):  
A. Blom

This article surveys corpus research into what might generally be referred to as conventionalism in language use, and discusses its implications for second language teaching methodology. After an introductory section with traditional and more recent views on the idiomatic and formulaic nature of language use, Section 2 presents a survey of corpus-based case studies into preselected types of lexical patterning like idioms and collocations. Section 3 presents the cruder statistical approach of automatically assessing the frequency of recurrent combinations in texts. The results suggest that conventionalism extends far beyond the traditionally recognised patterns, and might be the basic combinatory principle underlying the composition of written and oral texts. After a short section with some evidence that conventional sequencing is also a feature in learner output, the concluding section (5) argues that to do justice to the phenomenon of conventionalism in language use, teaching methodology should be essentially text-based.
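The "cruder" frequency approach of Section 3 amounts to counting every recurrent word n-gram in a corpus rather than searching for preselected idioms or collocations. A minimal sketch, using an invented toy text and an arbitrary recurrence threshold of two:

```python
# Illustrative sketch: count recurrent word n-grams in a toy text.
# The text and the threshold are invented for demonstration only.
from collections import Counter

text = ("on the other hand the results suggest that on the other hand "
        "the data show that the results suggest that conventional sequences recur")
tokens = text.lower().split()

def ngram_counts(tokens, n):
    """Count every contiguous n-word sequence in the token stream."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

for n in (2, 3, 4):
    recurrent = {" ".join(g): c for g, c in ngram_counts(tokens, n).items() if c >= 2}
    print(f"{n}-grams occurring at least twice: {recurrent}")
```

On real corpora such raw counts are typically normalised by corpus size and filtered with association measures, but even unfiltered frequencies show how much of a text is built from recurring sequences.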


2017 ◽  
Vol 8 (2) ◽  
pp. 69-77 ◽  
Author(s):  
Luca Santoro

Abstract. The aim of this work is to analyze latitude measurements typically used in historical geographical works through a statistical approach. We use two datasets of different ages as case studies: Ptolemy's Geography and Riccioli's work on geography. A statistical approach to historical latitude and longitude databases can reveal systematic errors in geographical georeferencing processes. On the other hand, with the right statistical analysis, this approach can also yield new information about the locations of ancient cities.
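One way to make the statistical approach concrete: pair each historical latitude with the modern value for the same identified city and examine the residuals. A consistent non-zero mean offset suggests a systematic error in the georeferencing process, while isolated outliers may instead point to misidentified city locations. The sketch below uses invented numbers, not values from Ptolemy or Riccioli:

```python
# Illustrative sketch: test for a systematic offset between historical and
# modern latitudes. The five city values are invented for demonstration.
import numpy as np
from scipy import stats

historical_lat = np.array([36.0, 41.0, 43.5, 45.9, 39.0])  # degrees, as recorded
modern_lat     = np.array([36.7, 41.9, 43.8, 45.4, 38.1])  # degrees, present-day

residuals = historical_lat - modern_lat
t_stat, p_value = stats.ttest_1samp(residuals, popmean=0.0)
print(f"mean offset = {residuals.mean():+.2f} deg, t = {t_stat:.2f}, p = {p_value:.3f}")
```

With a database of many cities, the same residual analysis can be broken down by region or by source to localise where the systematic error arises.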


2003 ◽  
Vol 9 (1) ◽  
pp. 2-11 ◽  
Author(s):  
Dexter Dunphy

ABSTRACT This paper addresses the issue of corporate sustainability. It examines why achieving sustainability is becoming an increasingly vital issue for society and organisations, defines sustainability, and then outlines a set of phases through which organisations can move to achieve increasing levels of sustainability. Case studies are presented of organisations at various phases, indicating the benefits that can be realised at each phase for the organisation and its stakeholders. Finally, the paper argues that there is a marked contrast between the competing philosophies of neo-conservatism (economic rationalism) and the emerging philosophy of sustainability. Management schools have been strongly influenced by economic rationalism, which underpins the traditional orthodoxies presented in such schools. Sustainability represents an urgent challenge for management schools to rethink these traditional orthodoxies and give sustainability a central place in the curriculum.


1978 ◽  
Vol 9 (4) ◽  
pp. 220-235
Author(s):  
David L. Ratusnik ◽  
Carol Melnick Ratusnik ◽  
Karen Sattinger

Short-form versions of the Screening Test of Spanish Grammar (Toronto, 1973) and the Northwestern Syntax Screening Test (Lee, 1971) were devised for use with bilingual Latino children while preserving the original normative data. Application of a multiple regression technique to data collected on 60 lower social status Latino children (four years and six months to seven years and one month) from Spanish Harlem and Yonkers, New York, yielded a small but powerful set of predictor items from the Spanish and English tests. Clinicians may make rapid and accurate predictions of STSG or NSST total screening scores from administration of substantially shortened versions of the instruments. Case studies of Latino children from Chicago and Miami serve to cross-validate the procedure outside the New York metropolitan area.
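The short-form logic can be illustrated with a small regression sketch: regress the full-test total score on a candidate subset of items and check how well the subset predicts the total. The data, item subset, and pass/fail scoring below are simulated for illustration; they are not the STSG or NSST items identified by the authors' multiple regression analysis.

```python
# Illustrative sketch: predict a full-test total score from a short subset of
# items via least-squares regression. All data here are simulated.
import numpy as np

rng = np.random.default_rng(0)
n_children, n_items = 60, 20
items = rng.integers(0, 2, size=(n_children, n_items))  # pass/fail item scores
total = items.sum(axis=1)                                # full-test total score

subset = [2, 5, 11, 17]                                  # hypothetical short-form items
X = np.column_stack([np.ones(n_children), items[:, subset]])
beta, *_ = np.linalg.lstsq(X, total, rcond=None)         # regression weights
predicted = X @ beta
r = np.corrcoef(predicted, total)[0, 1]
print(f"short-form prediction of total score: r = {r:.2f}")
```

In practice the predictor items would be chosen by the regression itself (for example, by stepwise selection) and the resulting weights validated on an independent sample, as the Chicago and Miami case studies do for the published short forms.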


2014 ◽  
Vol 23 (1) ◽  
pp. 42-54 ◽  
Author(s):  
Tanya Rose Curtis

As the field of telepractice grows, perceived barriers to service delivery must be anticipated and addressed in order to provide appropriate service delivery to individuals who will benefit from this model. When applying telepractice to the field of AAC, additional barriers are encountered when clients with complex communication needs are unable to speak, often present with severe quadriplegia and are unable to position themselves or access the computer independently, and/or may have cognitive impairments and limited computer experience. Some access methods, such as eye gaze, can also present technological challenges in the telepractice environment. These barriers can be overcome, and telepractice is not only practical and effective, but often a preferred means of service delivery for persons with complex communication needs.

