Considerations of the Scale of Radiocarbon Offsets in the East Mediterranean, and Considering a Case for the Latest (Most Recent) Likely Date for the Santorini Eruption

Radiocarbon ◽  
2012 ◽  
Vol 54 (3-4) ◽  
pp. 449-474 ◽  
Author(s):  
Sturt W Manning ◽  
Bernd Kromer

The debate over the dating of the Santorini (Thera) volcanic eruption has seen sustained efforts to criticize or challenge the radiocarbon dating of this time horizon. We consider some of the relevant areas of possible movement in the 14C dating—and, in particular, any plausible mechanisms to support as late (most recent) a date as possible. First, we report and analyze data investigating the scale of apparent possible 14C offsets (growing season related) in the Aegean-Anatolia-east Mediterranean region (excluding the southern Levant and especially pre-modern, pre-dam Egypt, which is a distinct case), and find no evidence for more than very small possible offsets from several cases. This topic is thus not an explanation for current differences in dating in the Aegean and at best provides only a few years of latitude. Second, we consider some aspects of the accuracy and precision of 14C dating with respect to the Santorini case. While the existing data appear robust, we nonetheless speculate that examination of the frequency distribution of the 14C data on short-lived samples from the volcanic destruction level at Akrotiri on Santorini (Thera) may indicate that the average value of the overall data sets is not necessarily the most appropriate 14C age to use for dating this time horizon. We note the recent paper of Soter (2011), which suggests that in such a volcanic context some (small) age increment may be possible from diffuse CO2 emissions (the effect is hypothetical at this stage and has not been observed in the field), and that "if short-lived samples from the same stratigraphic horizon yield a wide range of 14C ages, the lower values may be the least altered by old CO2."
In this context, it might be argued that a substantive “low” grouping of 14C ages observable within the overall 14C data sets on short-lived samples from the Thera volcanic destruction level, centered about 3326–3328 BP, is perhaps more representative of the contemporary atmospheric 14C age (without any volcanic CO2 contamination). This is a subjective argument (since, in statistical terms, the existing studies using the weighted average remain valid) that looks to support as late a date as reasonable from the 14C data. The impact of employing this revised 14C age is discussed. In general, a late 17th century BC date range is found (to remain) to be most likely even if such a late-dating strategy is followed—a late 17th century BC date range is thus a robust finding from the 14C evidence even allowing for various possible variation factors. However, the possibility of a mid-16th century BC date (within ∼1593–1530 cal BC) is increased when compared against previous analyses if the Santorini data are considered in isolation.
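The pooling the abstract contrasts with the “low” grouping is the standard inverse-variance weighted mean of replicate 14C determinations, which can be sketched as below; the ages and errors are illustrative placeholders, not the published Akrotiri measurements.

```python
# Sketch of the inverse-variance weighted mean used to pool replicate
# 14C determinations. The BP values and 1-sigma errors below are
# hypothetical, purely for illustration.
def weighted_mean(ages, errors):
    """Pool 14C ages (BP) with 1-sigma errors via inverse-variance weights."""
    weights = [1.0 / e**2 for e in errors]
    mean = sum(w * a for w, a in zip(weights, ages)) / sum(weights)
    sigma = (1.0 / sum(weights)) ** 0.5  # pooled 1-sigma uncertainty
    return mean, sigma

ages = [3326, 3328, 3350, 3345]  # hypothetical short-lived sample ages, BP
errs = [12, 14, 15, 13]          # hypothetical 1-sigma errors
pooled_age, pooled_sigma = weighted_mean(ages, errs)
```

Higher-precision dates dominate the pooled value, which is why a distinct low cluster can be masked by the overall weighted average.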

2011 ◽  
Vol 76 (3) ◽  
pp. 547-572 ◽  
Author(s):  
Charles Perreault

I examine how our capacity to produce accurate culture-historical reconstructions changes as more archaeological sites are discovered, dated, and added to a data set. More precisely, I describe, using simulated data sets, how increases in the number of known sites impact the accuracy and precision of our estimations of (1) the earliest and (2) latest date of a cultural tradition, (3) the date and (4) magnitude of its peak popularity, as well as (5) its rate of spread and (6) disappearance in a population. I show that the accuracy and precision of inferences about these six historical processes are not affected in the same fashion by changes in the number of known sites. I also consider the impact of two simple taphonomic site destruction scenarios on the results. Overall, the results presented in this paper indicate that unless we are in possession of near-total samples of sites, and can be certain that there are no taphonomic biases in the universe of sites to be sampled, we will make inferences of varying precision and accuracy depending on the aspect of a cultural trait’s history in question.
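The core of such a simulation can be sketched as follows, assuming (as a simplification) that known sites are uniform random draws from a tradition's true time span and that the earliest date is estimated by the oldest sampled site; the bias of that estimator shrinks as more sites are known. All dates and sample sizes here are hypothetical.

```python
import random

# Toy sketch: sites are uniform draws from a tradition's true span
# [start, end] (arbitrary calendar units, smaller = older); the
# earliest-date estimate is the minimum sampled date, which is biased
# late when few sites are known.
def mean_earliest_estimate(start, end, n_known, trials=2000, seed=1):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        total += min(rng.uniform(start, end) for _ in range(n_known))
    return total / trials  # average estimated earliest date over trials
```

With 5 known sites the average estimate sits well after the true start, while with 50 it moves much closer, illustrating why different aspects of a trait's history are recovered with different accuracy as sample size grows.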


2015 ◽  
Vol 2015 ◽  
pp. 1-10 ◽  
Author(s):  
J. Zyprych-Walczak ◽  
A. Szabelska ◽  
L. Handschuh ◽  
K. Górczak ◽  
K. Klamecka ◽  
...  

High-throughput sequencing technologies, such as the Illumina HiSeq, are powerful new tools for investigating a wide range of biological and medical problems. The massive and complex data sets produced by the sequencers create a need for the development of statistical and computational methods that can tackle the analysis and management of the data. Data normalization is one of the most crucial steps of data processing, and it must be carefully considered, as it has a profound effect on the results of the analysis. In this work, we focus on a comprehensive comparison of five normalization methods related to sequencing depth, widely used for transcriptome sequencing (RNA-seq) data, and their impact on the results of gene expression analysis. Based on this study, we suggest a universal workflow that can be applied for the selection of the optimal normalization procedure for any particular data set. The described workflow includes calculation of the bias and variance values for the control genes, sensitivity and specificity of the methods, and classification errors as well as generation of the diagnostic plots. Combining the above information facilitates the selection of the most appropriate normalization method for the studied data sets and determines which methods can be used interchangeably.
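As one concrete example of a depth-related normalization of the kind compared in such studies, a DESeq-style median-of-ratios size factor can be sketched as below; the counts are illustrative, and the paper's five methods are not necessarily this one.

```python
import math

# Sketch of a depth-related RNA-seq normalization: median-of-ratios
# size factors. `counts` is a list of per-sample count lists sharing
# one gene order; values here are purely illustrative.
def size_factors(counts):
    n_genes = len(counts[0])
    # Log geometric mean per gene across samples; genes with any zero
    # count are skipped, as the log is undefined there.
    log_geo = []
    for g in range(n_genes):
        vals = [sample[g] for sample in counts]
        if all(v > 0 for v in vals):
            log_geo.append((g, sum(math.log(v) for v in vals) / len(vals)))
    factors = []
    for sample in counts:
        ratios = sorted(math.log(sample[g]) - lg for g, lg in log_geo)
        n = len(ratios)
        if n % 2:
            mid = ratios[n // 2]
        else:
            mid = 0.5 * (ratios[n // 2 - 1] + ratios[n // 2])
        factors.append(math.exp(mid))  # size factor = exp(median log-ratio)
    return factors
```

Dividing each sample's counts by its size factor removes the depth component before expression analysis; comparing bias and variance of such factors on control genes is exactly the kind of diagnostic the workflow above describes.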


Author(s):  
Marcel Bengs ◽  
Finn Behrendt ◽  
Julia Krüger ◽  
Roland Opfer ◽  
Alexander Schlaefer

Abstract Purpose Brain Magnetic Resonance Images (MRIs) are essential for the diagnosis of neurological diseases. Recently, deep learning methods for unsupervised anomaly detection (UAD) have been proposed for the analysis of brain MRI. These methods rely on healthy brain MRIs and eliminate the requirement of pixel-wise annotated data compared to supervised deep learning. While a wide range of methods for UAD have been proposed, these methods are mostly 2D and only learn from MRI slices, disregarding that brain lesions are inherently 3D and that the spatial context of MRI volumes remains unexploited. Methods We investigate whether using increased spatial context by using MRI volumes combined with spatial erasing leads to improved unsupervised anomaly segmentation performance compared to learning from slices. We evaluate and compare 2D variational autoencoders (VAEs) to their 3D counterparts, propose 3D input erasing, and systematically study the impact of the data set size on performance. Results Using two publicly available segmentation data sets for evaluation, 3D VAEs outperform their 2D counterparts, highlighting the advantage of volumetric context. Also, our 3D erasing methods allow for further performance improvements. Our best-performing 3D VAE with input erasing leads to an average DICE score of 31.40% compared to 25.76% for the 2D VAE. Conclusions We propose 3D deep learning methods for UAD in brain MRI combined with 3D erasing and demonstrate that 3D methods clearly outperform their 2D counterparts for anomaly segmentation. Also, our spatial erasing method allows for further performance improvements and reduces the requirement for large data sets.
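The idea of 3D input erasing can be sketched as zeroing a random cuboid of the volume, by analogy with 2D random erasing; the cuboid size and placement rule below are assumptions for illustration, not necessarily the exact scheme used in the paper.

```python
import numpy as np

# Hedged sketch of 3D input erasing: zero out a random cuboid of an MRI
# volume before it is fed to the autoencoder. Cuboid size and uniform
# placement are illustrative assumptions.
def erase_3d(volume, size=(8, 8, 8), rng=None):
    """Return a copy of a 3D array with a random cuboid of `size` zeroed."""
    rng = rng or np.random.default_rng(0)
    out = volume.copy()
    d, h, w = size
    z = rng.integers(0, volume.shape[0] - d + 1)
    y = rng.integers(0, volume.shape[1] - h + 1)
    x = rng.integers(0, volume.shape[2] - w + 1)
    out[z:z + d, y:y + h, x:x + w] = 0.0
    return out
```

Training a VAE to reconstruct the unerased volume from such inputs forces it to use surrounding volumetric context, which is the mechanism the paper credits for the improved anomaly segmentation.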


Accounting ◽  
2021 ◽  
pp. 609-614
Author(s):  
Vu Cam Nhung ◽  
Lai Cao Mai Phuong

This paper examines the impact of corruption on the efficiency of employee use in Vietnamese firms. The Generalized Least Squares (GLS) estimation method was applied to data sets surveyed from Vietnamese firms in 63 localities. The results show that unofficial costs within the industry, and total informal costs accounting for 10% or more of revenue, negatively affect the labor efficiency of these enterprises. For costs related to administrative procedures, businesses accept paying these fees to save waiting time, which contributes to increasing the efficiency of employee use. In addition to the corruption factor, the study also shows that the number of employees, the location of operation, the average value of fixed assets per employee, and the return on equity also affect the efficiency of employee use in Vietnamese enterprises.
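The GLS estimator itself solves (X'Ω⁻¹X)β = X'Ω⁻¹y, which can be sketched in a few lines; the design matrix and error covariance below are synthetic, not the firm survey data.

```python
import numpy as np

# Minimal GLS sketch: beta = (X' Omega^{-1} X)^{-1} X' Omega^{-1} y.
# X, y, and omega here are synthetic, purely illustrative.
def gls(X, y, omega):
    oi = np.linalg.inv(omega)  # inverse of the error covariance matrix
    return np.linalg.solve(X.T @ oi @ X, X.T @ oi @ y)
```

GLS down-weights observations with large error variance, which is why it is preferred over OLS when survey responses across localities are heteroskedastic.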


1998 ◽  
Vol 38 (7) ◽  
pp. 777 ◽  
Author(s):  
G. E. Rayment ◽  
K. I. Peverill ◽  
B. C. Shelley

Summary. In relatively few years, the Australian Soil and Plant Analysis Council Inc. (ASPAC) has conducted 2 inter-laboratory proficiency programs on plant material and 3 inter-laboratory proficiency programs on soils. The purpose of these performance-based programs is to enhance the quality of soil and plant analysis in Australasia, with guidance where necessary from the soil and plant expertise of ASPAC members. ASPAC’s inaugural ‘Accreditation Committee’ reviewed published standards and existing laboratory accreditation/proficiency programs in Australia and internationally before developing what is now in full operation. This historical perspective and the 12 principles that guide operations of ASPAC’s soil and plant proficiency programs are described, as are the numeric procedures used to determine satisfactory performance. Certificates are issued to successful laboratories on completion of each program. Moreover, these remain current until signed certificates from the next equivalent program are released. Wide variations in some data sets suggest there is considerable scope to improve laboratory accuracy, particularly for soil chemical tests. Some of these differences are sufficient to markedly affect the assessment of fertiliser requirements. The present ‘Accreditation Committee’ and the State Representatives serve as ‘points-of-contact’ for laboratories that require assistance to overcome problems with analytical accuracy and precision. ASPAC encourages its member laboratories to seek and maintain NATA (National Association of Testing Authorities, Australia) accreditation, in addition to participating regularly in the performance-based proficiency programs run by ASPAC.
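A common numeric criterion in inter-laboratory proficiency programs (though not necessarily ASPAC's exact procedure) is a robust z-score against the consensus median, which can be sketched as:

```python
import statistics

# Hedged sketch: a robust z-score compares a laboratory's result with the
# consensus median, scaled by a MAD-based spread; |z| <= 2 is often taken
# as satisfactory. ASPAC's actual procedure may differ.
def robust_z(result, all_results):
    med = statistics.median(all_results)
    mad = statistics.median(abs(x - med) for x in all_results)
    return (result - med) / (1.4826 * mad)  # 1.4826 scales MAD to a normal SD
```

Using the median and MAD rather than the mean and SD keeps one grossly erroneous laboratory from dragging the consensus value, which matters given the wide variations noted above.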


Author(s):  
Stephen Douglas

Abstract Body-worn cameras (BWCs) have been presented as a technological innovation to cultivate greater civility in police–citizen interactions. Attempts have been made to clarify the impact of BWCs upon various policing outcomes, but the effects of BWCs on assaults against police have received scant research attention. Existing studies have been limited to a handful of jurisdictions with limited generalizability to a broader range of police organizations. Combining a number of official data sets for the years 2011–13, the current study assesses the relationship between BWCs and police victimization by focusing on total assaults and firearm assaults against police officers in a sample of 516 police agencies. The results indicate that BWC usage is negatively associated with police victimization in both models. This suggests that BWCs can assist in preventing the occurrence of general and extreme violence against police in a wide range of law enforcement agencies in varied settings.


2017 ◽  
Vol 25 (03) ◽  
pp. 479-494
Author(s):  
MOSLEM MOHAMMADI-JENGHARA ◽  
HOSSEIN EBRAHIMPOUR-KOMLEH

Microarray technology is used as a source of data for a wide range of biology studies. Useful biological information can be extracted from the analysis of microarray data, namely, the impact of a particular gene expression on the expression of other genes, or the determination of expressed genes under different conditions. The purpose of this paper is to find co-behavioral genes in different data sets across different times and conditions. In other words, genes whose expression shows the same behavior (the same increases or decreases) under different medical, stress, and time conditions are identified. Multi-valued discretization of expression data was used for extracting genes with identical behavior. The algorithm proposed in this study is based on ensembles of data and methods. The data ensemble technique was used to extract candidate genes with identical behavior. Other methods were also applied to all the data sets; as a result, many co-behavioral candidate genes with different similarity and correlation values were identified. Finally, the ultimate output was created from the ensemble of the different methods. By applying the algorithm to yeast gene expression data, meaningful relations among genes were extracted.
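The multi-valued discretization step can be sketched as mapping successive expression changes to decrease/no-change/increase symbols; the 20% threshold below is an illustrative assumption, not the paper's stated value.

```python
# Sketch of multi-valued discretization of an expression time series into
# symbols: -1 (decrease), 0 (no change), +1 (increase). The relative
# change threshold of 0.2 is an illustrative assumption.
def discretize(series, threshold=0.2):
    symbols = []
    for prev, cur in zip(series, series[1:]):
        change = (cur - prev) / abs(prev) if prev else 0.0
        if change > threshold:
            symbols.append(1)
        elif change < -threshold:
            symbols.append(-1)
        else:
            symbols.append(0)
    return symbols
```

Two genes are then co-behavioral candidates when their symbol sequences match across the sampled conditions, which is what the data and method ensembles vote on.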


2020 ◽  
Vol 24 (7) ◽  
pp. 3725-3735
Author(s):  
Ali Fallah ◽  
Sungmin O ◽  
Rene Orth

Abstract. Precipitation is a crucial variable for hydro-meteorological applications. Unfortunately, rain gauge measurements are sparse and unevenly distributed, which substantially hampers the use of in situ precipitation data in many regions of the world. The increasing availability of high-resolution gridded precipitation products presents a valuable alternative, especially over poorly gauged regions. This study examines the usefulness of current state-of-the-art precipitation data sets in hydrological modeling. For this purpose, we force a conceptual hydrological model with multiple precipitation data sets in >200 European catchments to obtain runoff and evapotranspiration. We consider a wide range of precipitation products, which are generated via (1) the interpolation of gauge measurements (E-OBS and Global Precipitation Climatology Centre (GPCC) V.2018), (2) data assimilation into reanalysis models (ERA-Interim, ERA5, and Climate Forecast System Reanalysis – CFSR), and (3) a combination of multiple sources (Multi-Source Weighted-Ensemble Precipitation; MSWEP V2). Evaluation is done at the daily and monthly timescales during the period of 1984–2007. We find that simulated runoff values are highly dependent on the accuracy of precipitation inputs; in contrast, simulated evapotranspiration is generally much less influenced in our comparatively wet study region. We also find that the impact of precipitation uncertainty on simulated runoff increases towards wetter regions, while the opposite is observed in the case of evapotranspiration. Finally, we perform an indirect performance evaluation of the precipitation data sets by comparing the runoff simulations with streamflow observations. Thereby, E-OBS yields particularly strong agreement, while ERA5, GPCC V.2018, and MSWEP V2 show good performances. We further reveal climate-dependent performance variations of the considered data sets, which can be used to guide their future development.
The overall best agreement is achieved when using an ensemble mean generated from all the individual products. In summary, our findings highlight a climate-dependent propagation of precipitation uncertainty through the water cycle; while runoff is strongly impacted in comparatively wet regions, such as central Europe, there are increasing implications for evapotranspiration in drier regions.
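The ensemble-mean forcing that gave the best agreement amounts to averaging the individual gridded products at each time step, which can be sketched as below; the product names and values are placeholders, not actual product data.

```python
import numpy as np

# Sketch of the ensemble-mean precipitation forcing: average all
# individual products at each time step. Arrays here stand in for daily
# catchment series (mm/day) and are purely illustrative.
def ensemble_mean(products):
    """products: dict mapping product name -> 1D daily precipitation array."""
    stacked = np.stack(list(products.values()))  # shape (n_products, n_days)
    return stacked.mean(axis=0)
```

Averaging tends to cancel independent product errors, which is consistent with the ensemble mean outperforming each individual forcing in the runoff evaluation.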


2019 ◽  
Author(s):  
Andrew J. Page ◽  
Sarah Bastkowski ◽  
Muhammad Yasir ◽  
A. Keith Turner ◽  
Thanh Le Viet ◽  
...  

Abstract Background Bacteria have evolved over billions of years to survive in a wide range of environments. Currently, there is an incomplete understanding of the genetic basis for mechanisms underpinning survival in stressful conditions, such as the presence of anti-microbials. Transposon mutagenesis has been proven to be a powerful tool to identify genes and networks which are involved in survival and fitness under a given condition by simultaneously assaying the fitness of millions of mutants, thereby relating genotype to phenotype and contributing to an understanding of bacterial cell biology. A recent refinement of this approach allows the roles of essential genes in conditional stress survival to be inferred by altering their expression. These advancements, combined with the rapidly falling costs of sequencing, now allow comparisons between multiple experiments to identify commonalities in stress responses to different conditions. This capacity, however, poses a new challenge for the analysis of multiple data sets in conjunction. Results To address this analysis need, we have developed ‘AlbaTraDIS’, a software application for rapid large-scale comparative analysis of TraDIS experiments that predicts the impact of transposon insertions on nearby genes. AlbaTraDIS can identify genes which are up- or down-regulated, or inactivated, between multiple conditions, producing a filtered list of genes for further experimental validation as well as several accompanying data visualisations. We demonstrate the utility of our new approach by applying it to identify genes used by Escherichia coli to survive in a wide range of different concentrations of the biocide Triclosan. AlbaTraDIS automatically identified all well-characterised Triclosan resistance genes, including the primary target, fabI.
A number of new loci were also implicated in Triclosan resistance, and the predicted phenotypes for a selection of these were validated experimentally, with results showing high consistency with the predictions. Conclusions AlbaTraDIS provides a simple and rapid method to analyse multiple transposon mutagenesis data sets, allowing this technology to be used at large scale. To our knowledge, this is the only tool currently available that can perform these tasks. AlbaTraDIS is written in Python 3 and is available under the open source licence GNU GPL 3 from https://github.com/quadram-institute-bioscience/albatradis.


Author(s):  
Yu. A. Ezrokhi ◽  
E. A. Khoreva

The paper considers techniques to develop a mathematical model using the method of «parallel compressors». The model is intended to estimate the impact of air inlet distortion on the primary parameters of the aero-engine. The paper presents computed estimation results for a twin-spool turbofan design for two typical cruise flight modes of a supersonic passenger jet. In these estimates, the base value σbase and the average value of the inlet ram recovery σave remained invariable; parametrical calculations were thus performed for each chosen relative value of the area of the low-pressure region. The paper shows that the degree of impact of the inlet distortion on the engine thrust differs essentially between the two modes under consideration. In other words, while in the subsonic mode the impact assessment can be confined to taking into account only the influence of the decreased average value of the inlet total pressure, the use of such an assumption in the supersonic cruise mode may result in considerable errors. With invariable values of the pressure recovery factor at the engine intake corresponding to the flight speed for a typical external-compression air inlet, σbase, and the average value σave, the parameter Δσuneven has the main effect on the engine thrust, and the degree of this effect depends essentially on the difference between the σave and σbase values.
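The area-weighted averaging at the heart of the parallel-compressors idea can be sketched as below; the definition of the nonuniformity parameter here is one common choice and may differ from the paper's exact formulation of Δσuneven.

```python
# Hedged sketch of the parallel-compressors bookkeeping: the inlet face is
# split into a distorted (low-pressure) sector and an undistorted one, and
# the average ram recovery is the area-weighted mean. The nonuniformity
# parameter definition is an assumption, not necessarily the paper's.
def inlet_recovery(sigma_base, sigma_low, area_low_frac):
    sigma_ave = (1 - area_low_frac) * sigma_base + area_low_frac * sigma_low
    delta_uneven = (sigma_base - sigma_low) / sigma_ave
    return sigma_ave, delta_uneven
```

Each sector is then run through the compressor map separately, which is why the thrust impact grows with the gap between σave and σbase rather than with the average alone.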

