How robust are Bayesian posterior inferences based on a Ricker model with regards to measurement errors and prior assumptions about parameters?

2001 ◽  
Vol 58 (11) ◽  
pp. 2284-2297 ◽  
Author(s):  
E Rivot ◽  
E Prévost ◽  
E Parent

We present a Bayesian approach to Ricker stock-recruitment (S/R) analysis that accounts for measurement errors in S/R data. We assess the sensitivity of posterior inferences to (i) the choice of Ricker model parameterizations, with special regard to management-related ones, and (ii) prior parameter distributions. Closed forms for the Ricker parameter posterior distributions exist when S/R data are known without error. We use this property to develop a procedure based on the Rao–Blackwell formula. This procedure integrates measurement errors by averaging these closed forms over possible S/R data sets sampled from distributions derived from a stochastic model relating field data to the S and R variables. High-quality Bayesian estimates are obtained, and the analysis of the influence of different parameterizations and priors is made easier. We illustrate our methodological approach with a case study of Atlantic salmon (Salmo salar). Posterior distributions for S and R are computed from a mark–recapture stochastic model. Ignoring measurement errors underestimates parameter uncertainty and overestimates both stock productivity and density dependence. We warn against using management-related parameterizations because they embed the strong prior assumption of long-term sustainability of the stock. Posterior inferences are sensitive to the choice of prior. The use of informative priors as a remedy is discussed.
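
To make the Rao–Blackwell idea concrete, the following is a minimal numerical sketch rather than the authors' procedure: the log-linearized Ricker model log(R/S) = a - b*S has a closed-form flat-prior regression posterior, and measurement error is integrated out by averaging that posterior over resampled S/R data sets. The data values and the lognormal error levels below are hypothetical.

    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical observed spawner/recruit series and assumed lognormal error levels.
    S_obs = np.array([120., 340., 560., 800., 1020., 1250.])
    R_obs = np.array([450., 900., 1100., 1300., 1200., 1350.])
    cv_S, cv_R = 0.15, 0.20

    def posterior_draws(S, R, n=20):
        # Closed-form (flat-prior) posterior of the log-linear Ricker model
        # log(R/S) = a - b*S + eps, eps ~ N(0, sigma^2).
        y = np.log(R / S)
        X = np.column_stack([np.ones_like(S), -S])
        XtX_inv = np.linalg.inv(X.T @ X)
        beta_hat = XtX_inv @ (X.T @ y)
        resid = y - X @ beta_hat
        dof = len(y) - 2
        s2 = float(resid @ resid) / dof
        sigma2 = dof * s2 / rng.chisquare(dof, size=n)     # marginal draws for sigma^2
        return np.array([rng.multivariate_normal(beta_hat, v * XtX_inv) for v in sigma2])

    # Rao-Blackwell averaging: resample plausible "true" S/R sets from the
    # measurement-error model, evaluate the closed-form posterior for each,
    # and pool the draws into a single mixture posterior.
    pooled = []
    for _ in range(500):
        S_k = S_obs * rng.lognormal(0.0, cv_S, size=S_obs.size)
        R_k = R_obs * rng.lognormal(0.0, cv_R, size=R_obs.size)
        pooled.append(posterior_draws(S_k, R_k))
    pooled = np.vstack(pooled)
    print("posterior means of (log alpha, beta):", pooled.mean(axis=0))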

1995 ◽  
Vol 52 (5) ◽  
pp. 993-1006 ◽  
Author(s):  
Y. Chen ◽  
J. E. Paloheimo

Variations in environmental variables and (or) errors in measuring stock and recruitment often produce large and heterogeneous variation when fish stock–recruitment (SR) data are fitted to a regression model. This makes the commonly used least squares (LS) method inappropriate for estimating the SR relationship. Hence, we propose the following procedure: (i) identify possible outliers in fitting the data to a given SR model using the least median of squared orthogonal distances, which is not sensitive to atypical values and requires no assumption about the distribution of errors, and (ii) apply the LS method to the SR data with the identified outliers down-weighted. We show by simulation that the SR parameters of the Ricker model can be estimated with smaller estimation errors and biases using the proposed procedure than with the traditional LS approach. Examination of four sets of published field data leads us to suggest fitting fish SR data to suitable models with the proposed estimation method and interpreting the results with the assistance of knowledge of the relevant environmental variables and measurement errors. However, our interpretation should be viewed as a working hypothesis requiring dedicated studies to clarify the causal links between environmental variables and recruitment.
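
To illustrate the two-step procedure described above, here is a rough sketch under simplifying assumptions (random two-point candidate lines for the least-median search, a fixed down-weighting factor, and made-up data); it is not the authors' implementation:

    import numpy as np

    rng = np.random.default_rng(0)
    S = np.array([100., 250., 400., 600., 800., 1000., 1200., 1400.])
    R = np.array([380., 700., 950., 1150., 1250., 1200., 5000., 1100.])  # one atypical point

    # Log-linearized Ricker: y = log(R/S) = a - b*S
    x, y = S, np.log(R / S)

    def orth_dist2(a, b):
        # Squared orthogonal distance of each (x, y) point to the line y = a - b*x.
        return (y - a + b * x) ** 2 / (1.0 + b ** 2)

    # Least median of squares over random two-point candidate lines.
    best = (np.inf, 0.0, 0.0)
    for _ in range(2000):
        i, j = rng.choice(len(x), size=2, replace=False)
        if x[i] == x[j]:
            continue
        b = -(y[j] - y[i]) / (x[j] - x[i])
        a = y[i] + b * x[i]
        med = np.median(orth_dist2(a, b))
        if med < best[0]:
            best = (med, a, b)

    med, a_lms, b_lms = best
    scale = 1.4826 * np.sqrt(med)                 # robust residual scale
    weights = np.where(np.sqrt(orth_dist2(a_lms, b_lms)) > 2.5 * scale, 0.1, 1.0)

    # Weighted least squares with flagged outliers down-weighted.
    W = np.diag(weights)
    X = np.column_stack([np.ones_like(x), -x])
    coef = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    print("log(alpha), beta:", coef, "weights:", weights)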


2012 ◽  
Vol 4 (1) ◽  
pp. 91-100 ◽  
Author(s):  
K. Vaníček ◽  
L. Metelka ◽  
P. Skřivánková ◽  
M. Staněk

Abstract. Homogenized series of total ozone measurements taken by the regularly and well-calibrated Dobson and Brewer spectrophotometers at Hradec Králové (Czech Republic) and data from the ERA-40 and ERA-Interim re-analyses were merged and compared to investigate differences between the particular data sets originating in Central Europe, in the Northern Hemisphere (NH) mid-latitudes. A Dobson-to-Brewer transfer function and an algorithm for approximating the re-analysis data were developed, tested and applied to create an instrumentally consistent and complete total ozone data series covering the 50-yr observation period 1961–2010. This correction reduced the well-known seasonal differences between Dobson and Brewer data to below the 1% calibration limit of the spectrophotometers. Incorporation of ERA-40 and ERA-Interim total ozone data on days with missing measurements significantly improved the completeness and reliability of the data series, mainly in the first two decades of the period concerned. Consistent behaviour of the original and corrected/merged data sets was found in the pre-ozone-hole period (1961–1985). In the post-Pinatubo era (1994–2010) the data series show seasonal differences that can introduce uncertainty into estimates of ozone recovery, mainly in the winter-spring season when the effect of the Montreal Protocol and its Amendments is expected. All the data sets also confirm substantial depletion of ozone in the summer months, which raises the question of its origin. The merged and completed total ozone data series will be further analyzed to quantify chemical ozone losses and the contribution of natural atmospheric processes to ozone depletion over the region. This case study points out the importance of selecting and evaluating the quality and consistency of the input data sets used to estimate long-term ozone changes, including the recovery of the ozone layer over selected areas. Data are available from the PANGAEA database at doi:10.1594/PANGAEA.779819.
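
The abstract does not give the form of the Dobson-to-Brewer transfer function, so the sketch below is only an assumed stand-in: it fits an annual-harmonic model of the Brewer/Dobson ratio on coincident days and uses it to adjust Dobson values toward the Brewer level. The data are synthetic.

    import numpy as np

    def fit_transfer_function(doy, dobson, brewer):
        # Least-squares seasonal ratio model r(t) = c0 + c1*cos(w*t) + c2*sin(w*t).
        w = 2.0 * np.pi / 365.25
        X = np.column_stack([np.ones_like(doy), np.cos(w * doy), np.sin(w * doy)])
        ratio = brewer / dobson
        coef, *_ = np.linalg.lstsq(X, ratio, rcond=None)
        return coef

    def apply_transfer_function(doy, dobson, coef):
        w = 2.0 * np.pi / 365.25
        X = np.column_stack([np.ones_like(doy), np.cos(w * doy), np.sin(w * doy)])
        return dobson * (X @ coef)      # Dobson adjusted toward the Brewer level

    # Hypothetical coincident daily observations (Dobson units).
    doy = np.arange(1, 366, 5, dtype=float)
    dobson = 330 + 40 * np.cos(2 * np.pi * (doy - 80) / 365.25)
    brewer = dobson * (1.0 + 0.01 * np.cos(2 * np.pi * doy / 365.25))
    coef = fit_transfer_function(doy, dobson, brewer)
    dobson_adj = apply_transfer_function(doy, dobson, coef)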


Author(s):  
Tiziano Cattaneo ◽  
Roberto De Lotto ◽  
Elisabetta Maria Venco

Regional and urban planning, like design actions, usually involves different themes and disciplines, especially when the goal is to improve, restore, and re-functionalize existing minor settlements in a rural-urban context. For this reason it is necessary to define integrated methodologies able to address inter-scalar issues and interdisciplinary themes. The authors propose a framework for a decision support system based on the treatment of geographical data and on the integration of data sets with dissimilar origins, diverse formats (not necessarily digital), and different semantic value. This complete data set refers to various disciplines, and specific knowledge can be deduced from it through analytical passages and assessment steps. In the paper the authors describe: a methodological approach to support planning activities; the technical support needed to seek a (dynamic) balance between urban density and rural fragmentation; and a Best Practices database to support scenarios in the rural-urban context. The authors first present the application field, then the logical framework of the whole process, then describe some related spatial analysis applications, and finally introduce a comprehensive case study of the whole procedure.


2012 ◽  
Vol 5 (1) ◽  
pp. 445-473
Author(s):  
K. Vaníček ◽  
L. Metelka ◽  
P. Skřivánková ◽  
M. Staněk

Abstract. Homogenized series of total ozone measurements taken by the regularly and well-calibrated Dobson and Brewer spectrophotometers at Hradec Králové (Czech Republic) and data from the ERA-40 and ERA-Interim re-analyses were assimilated and combined to investigate differences between the particular data sets over Central Europe, in the NH mid-latitudes. A Dobson-to-Brewer transfer function and an algorithm for approximating the re-analysis data were developed, tested and applied to create an instrumentally consistent and complete total ozone data series covering the 50-yr observation period 1961–2010. The assimilation reduced the well-known seasonal differences between Dobson and Brewer data to below the 1% calibration limit of the spectrophotometers. Incorporation of ERA-40 and ERA-Interim total ozone data on days with missing measurements significantly improved the completeness and reliability of the data series, mainly in the first two decades of the period concerned. Consistent behaviour of the original and assimilated data sets was found in the pre-ozone-hole period (1961–1985). In the post-Pinatubo era (1994–2010) the data series show seasonal differences that can introduce uncertainty into estimates of ozone recovery, mainly in the winter-spring season when the effect of the Montreal Protocol and its Amendments is expected. All the data sets also confirm substantial depletion of ozone in the summer months, which raises the question of its origin. The assimilated and completed total ozone data series will be further analyzed to quantify chemical ozone losses and the contribution of natural atmospheric processes to ozone depletion over the region. This case study points out the importance of selecting and evaluating the quality and consistency of the input data sets used to estimate long-term ozone changes, including the recovery of the ozone layer over selected areas. Data are available from the PANGAEA database at http://dx.doi.org/10.1594/PANGAEA.779819.
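
A companion step described in both versions of this work is filling days with missing ground-based measurements from the re-analyses. A toy sketch of that idea, under the simplifying assumption of a single mean-ratio scaling and with a made-up ten-day series (not the published algorithm), is:

    import numpy as np

    def fill_missing(station, reanalysis):
        # station: daily series with NaN on missing days; reanalysis: same length.
        overlap = ~np.isnan(station)
        scale = np.mean(station[overlap] / reanalysis[overlap])   # simple mean ratio
        filled = station.copy()
        filled[~overlap] = scale * reanalysis[~overlap]
        return filled

    # Hypothetical example: ten days of total ozone with two gaps.
    station = np.array([320., 318., np.nan, 305., np.nan, 298., 301., 310., np.nan, 325.])
    reanalysis = np.array([316., 315., 309., 303., 300., 296., 299., 307., 314., 321.])
    print(fill_missing(station, reanalysis))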


Animals ◽  
2021 ◽  
Vol 11 (7) ◽  
pp. 1928
Author(s):  
Peter J. Wolf ◽  
Rachael E. Kreisler ◽  
Julie K. Levy

In a frequently cited 2005 paper, a Ricker model was used to assess the effectiveness of trap–neuter–return (TNR) programs for managing free-roaming domestic cat populations. The model (which was originally developed for application in the management of fisheries) used data obtained from two countywide programs, and the results indicated that any population reductions, if they existed, were at best modest. In the present study, we applied the same analysis methods to data from two long-term (i.e., >20 years) TNR programs for which significant population reductions have been documented. Our results revealed that the model cannot account for some key aspects of typical TNR programs, and the wild population swings it predicts do not correspond to the relative stability of free-roaming cat populations. A Ricker model is therefore inappropriate for use in assessing the effectiveness of TNR programs. A more recently developed, stochastic model, which accounts for the movement of cats in and out of a given area, is better suited for predicting the sterilization effort necessary to reduce free-roaming cat numbers through TNR programs.
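
The "wild population swings" referred to above are a property of the deterministic Ricker update N[t+1] = N[t] * exp(r * (1 - N[t]/K)) at high growth rates. The short sketch below, with illustrative parameter values that are not taken from either study, shows the contrast between a stable and an oscillating regime.

    import numpy as np

    def ricker_trajectory(n0, r, k, steps=40):
        # Iterate the deterministic Ricker map N[t+1] = N[t] * exp(r * (1 - N[t]/K)).
        n = np.empty(steps)
        n[0] = n0
        for t in range(steps - 1):
            n[t + 1] = n[t] * np.exp(r * (1.0 - n[t] / k))
        return n

    # Hypothetical parameters: a modest growth rate settles near the carrying
    # capacity, while a high growth rate produces large year-to-year swings.
    print(ricker_trajectory(n0=50, r=0.8, k=100)[-5:])   # stable near k
    print(ricker_trajectory(n0=50, r=2.8, k=100)[-5:])   # large oscillations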


2021 ◽  
Author(s):  
Colin Morice ◽  
John Kennedy ◽  
Nick Rayner ◽  
Jonathan Winn ◽  
Emma Hogan ◽  
...  

The new HadCRUT5 data set combines meteorological station air temperature records with sea-surface temperature measurements in a data set of near-surface temperature anomalies from the year 1850 to present. Major developments in HadCRUT5 include: updates to underpinning observation data holdings; use of an updated assessment of the impacts of changing marine measurement methods; and adoption of a statistical gridding method to extend estimates into sparsely observed regions of the globe, such as the Arctic. The data are presented as a 200-member ensemble that spans the assessed uncertainty associated with adjustments for long-term observational biases, observing platform measurement errors and the interaction of observational sampling with gridding methods. The impacts of methodological changes in HadCRUT5 on diagnostics of the global climate will be discussed and compared to results derived from other state-of-the-art global data sets.
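
As a purely illustrative note on working with an ensemble product of this kind, the sketch below summarizes member spread per year from a synthetic (members x years) array of global-mean anomalies standing in for the real HadCRUT5 files; the layout and values are assumptions, not those of the actual data set.

    import numpy as np

    rng = np.random.default_rng(0)
    years = np.arange(1850, 2021)
    # Synthetic stand-in: 200 ensemble members of annual global-mean anomalies.
    ensemble = 0.008 * (years - 1850) + rng.normal(0.0, 0.05, size=(200, years.size))

    median = np.median(ensemble, axis=0)                       # ensemble median per year
    lower, upper = np.percentile(ensemble, [2.5, 97.5], axis=0)  # 95% ensemble range
    print(years[-1], round(median[-1], 2), "(", round(lower[-1], 2), "-", round(upper[-1], 2), ")")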


2015 ◽  
Vol 15 (10) ◽  
pp. 2209-2225 ◽  
Author(s):  
M. P. Wadey ◽  
J. M. Brown ◽  
I. D. Haigh ◽  
T. Dolphin ◽  
P. Wisse

Abstract. The extreme sea levels and waves experienced around the UK's coast during the 2013/14 winter caused extensive coastal flooding and damage. Coastal managers seek to place such extremes in relation to the anticipated standards of flood protection, and the long-term recovery of the natural system. In this context, return periods are often used as a form of guidance. This paper provides these levels for the winter storms, and discusses their application to the given data sets for two UK case study sites: Sefton, northwest England, and Suffolk, east England. Tide gauge records and wave buoy data were used to compare the 2013/14 storms with return periods from a national data set, and also joint probabilities of sea level and wave heights were generated, incorporating the recent events. The 2013/14 high waters and waves were extreme due to the number of events, as well as the extremity of the 5 December 2013 "Xaver" storm, which had a high return period at both case study sites. The national-scale impact of this event was due to its coincidence with spring high tide at multiple locations. Given that this event is such an outlier in the joint probability analyses of these observed data sets, and that the season saw several events in close succession, coastal defences appear to have provided a good level of protection. This type of assessment could in the future be recorded alongside defence performance and upgrade. Ideally other variables (e.g. river levels at estuarine locations) would also be included, and with appropriate offsetting for local trends (e.g. mean sea-level rise) so that the storm-driven component of coastal flood events can be determined. This could allow long-term comparison of storm severity, and an assessment of how sea-level rise influences return levels over time, which is important for consideration of coastal resilience in strategic management plans.
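
For context on the return periods used above, the sketch below shows one common way to estimate return levels, fitting a generalized extreme value (GEV) distribution to annual maxima with scipy; the 40 synthetic annual maxima are stand-ins, not the Sefton or Suffolk tide gauge records.

    from scipy.stats import genextreme

    # Synthetic annual maximum sea levels (m) standing in for a tide gauge record.
    annual_maxima = 5.0 + genextreme.rvs(c=-0.1, loc=0.0, scale=0.15, size=40,
                                         random_state=42)

    shape, loc, scale = genextreme.fit(annual_maxima)    # fit the GEV to the maxima
    for T in (10, 50, 100):
        level = genextreme.isf(1.0 / T, shape, loc=loc, scale=scale)
        print(f"{T}-year return level: {level:.2f} m")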


Author(s):  
Eliana Trinaistic

In Canada, the non-profit organization (NPO) and settlement sectors are increasingly re-examining their responsibility for service delivery and service design. With a growing interest in understanding how to include design principles and an "innovation" mindset in addressing the long-term outcomes of social services, new instruments are being introduced as a way to experiment with different modes of engagement among the various stakeholders. The aim of community hackathons or civic hacks (a derivative of tech gatherings customized for public engagement) is to collaboratively rethink, redesign, and resolve a range of social and policy issues that communities face, from settlement and the environment to health and legal services. Although hackathons and civic hacks aspire to be democratic, relationship-driven instruments aligned with non-profit principles of inclusion and diversity, they are also risky propositions from the perspective of non-profit organizational culture in Canada in that they tend to lack solid structure, clear rules, and fixed outcomes. Despite the challenges, the promise of innovation is too attractive to be disregarded, and some non-profits are embarking (with or without the government's help) on incorporating hackathons into their toolkits. This case study presents a practitioner's perspective on the outcomes of two community hackathons, one exploring migration data sets and the other on language policy innovation, co-developed between 2016 and 2019 by MCIS Language Solutions, a Toronto-based not-for-profit social enterprise, in partnership with various partners. The case study examines how the hackathon as an instrument can help settlement sectors and governments foster non-profit innovation and rethink the trajectory of taking solutions to scale.


2015 ◽  
Vol 3 (4) ◽  
pp. 2665-2708 ◽  
Author(s):  
M. P. Wadey ◽  
J. M. Brown ◽  
I. D. Haigh ◽  
T. Dolphin ◽  
P. Wisse

Abstract. The extreme sea levels and waves experienced around the UK's coast during the 2013/2014 winter caused extensive coastal flooding and damage. In such circumstances, coastal managers seek to place such extremes in relation to the anticipated standards of flood protection, and the long-term recovery of the natural system. In this context, return periods are often used as a form of guidance. We therefore provide these levels for the winter storms, as well as discussing their application to the given data sets and case studies (two UK case study sites: Sefton, northwest England; and Suffolk, east England). We use tide gauge records and wave buoy data to compare the 2013/2014 storms with return periods from a national dataset, and also generate joint probabilities of sea level and waves, incorporating the recent events. The UK was hit at a national scale by the 2013/2014 storms, although the return periods differ with location. We also note that the 2013/2014 high water and waves were extreme due to the number of events, as well as the extremity of the 5 December 2013 "Xaver" storm, which had a very high return period at both case study sites. Our return period analysis shows that the national scale impact of this event is due to its coincidence with spring high tide at multiple locations as the tide and storm propagated across the continental shelf. Given that this event is such an outlier in the joint probability analyses of these observed data sets, and that the season saw several events in close succession, coastal defences appear to have provided a good level of protection. This type of assessment should be recorded alongside details of defence performance and upgrade, with other variables (e.g. river levels at estuarine locations) included and appropriate offsetting for linear trends (e.g. mean sea level rise) so that the storm-driven component of coastal flood events can be determined. Local offsetting of the mean trends in sea level allows long-term comparison of storm severity and also enables an assessment of how sea level rise is influencing return levels over time, which is important when considering long-term coastal resilience in strategic management plans.
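
The joint probabilities of sea level and wave height mentioned in both versions of this study can be illustrated with a simple empirical joint-exceedance count; the correlated synthetic series and the thresholds below are hypothetical and are not derived from the observed records.

    import numpy as np

    rng = np.random.default_rng(7)
    n = 2000                                   # e.g. hourly records for one season
    sea_level = rng.gumbel(4.5, 0.3, size=n)   # hypothetical still-water levels (m)
    wave_hs = 0.5 * (sea_level - 4.0) + rng.gumbel(1.5, 0.5, size=n)  # correlated Hs (m)

    def joint_exceedance_prob(level_thresh, hs_thresh):
        # Fraction of records in which both variables exceed their thresholds.
        both = (sea_level > level_thresh) & (wave_hs > hs_thresh)
        return both.mean()

    print(joint_exceedance_prob(5.2, 2.5))
    print(joint_exceedance_prob(5.5, 3.0))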


2019 ◽  
Vol 12 (1) ◽  
pp. 42 ◽  
Author(s):  
Timothy Nagle-McNaughton ◽  
Rónadh Cox

Repeat photogrammetry is increasingly the go-to tool for long-term geomorphic monitoring, but quantifying the differences between structure-from-motion (SfM) models is a developing field. Volumetric differencing software (such as the open-source package CloudCompare) provides an efficient mechanism for quantifying change in landscapes. In this case study, we apply this methodology to coastal boulder deposits on Inishmore, Ireland. Storm waves are known to move these rocks, but boulder transportation and evolution of the deposits are not well documented. We used two disparate SfM data sets for this analysis. The first model was built from imagery captured in 2015 using a GoPro Hero 3+ camera (fisheye lens) and the second used 2017 imagery from a DJI FC300X camera (standard digital single-lens reflex (DSLR) camera), and we used CloudCompare to measure the differences between them. This study produced two noteworthy findings. First, volumetric differencing reveals that short-term changes in boulder deposits can be larger than expected, and that frequent monitoring can reveal not only the scale but also the complexities of boulder transport in this setting. This is a valuable addition to our growing understanding of coastal boulder deposits. Second, SfM models generated by different imaging hardware can be successfully compared at sub-decimeter resolution, even when one of the camera systems has substantial lens distortion. This means that older image sets, which might not otherwise be considered of appropriate quality for co-analysis with more recent data, should not be ignored as data sources in long-term monitoring studies.
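
As a minimal stand-in for the cloud-to-cloud comparison performed in CloudCompare (this is not CloudCompare's own algorithm), the sketch below computes nearest-neighbour distances from one synthetic point cloud to another and counts points that moved more than a threshold; the clouds and the displaced patch are made up.

    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(5)
    cloud_2015 = rng.uniform(0, 10, size=(5000, 3))          # stand-in for the 2015 model
    cloud_2017 = cloud_2015 + rng.normal(0, 0.02, size=cloud_2015.shape)
    cloud_2017[:50] += np.array([0.4, 0.0, 0.1])             # a "moved boulder" patch

    tree = cKDTree(cloud_2015)
    dist, _ = tree.query(cloud_2017, k=1)                    # nearest 2015 point per 2017 point
    print("median C2C distance (m):", np.median(dist))
    print("points changed by > 0.1 m:", int((dist > 0.1).sum()))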

