The Provenance of Philippians: A Response to the Analyses of Michael Flexsenhar, Heike Omerzu, Angela Standhartinger and Cédric Brélaz

2021
pp. 0142064X2199054
Author(s):  
Douglas A. Campbell

A critical synthesis of the arguments made by Flexsenhar, Omerzu, Standhartinger and Brélaz concerning the provenance of Paul’s letter to the Philippians suggests: (1) ‘the whole of the praetorium’ referenced in 1.13 is a group of people working in an official provincial building; hence (2) in view of Paul’s incarceration awaiting imminent trial, this is probably in a provincial capital, (3) where a group of imperial slaves, originally from Philippi and, following attested practice, identifying themselves as ‘a household of Caesar’ (4.22), have migrated to join the local congregation. Further critical consideration suggests, moreover, that although Ephesus is a plausible location on this evidence, Corinth is a still more powerful and economical explanation of these and related data points.

2021
Vol 13 (1)
Author(s):
Bingyin Hu
Anqi Lin
L. Catherine Brinson

Abstract
The inconsistency of polymer indexing caused by the lack of uniformity in the expression of polymer names is a major challenge for the widespread use of polymer-related data resources, and it limits the broad application of materials informatics for innovation across polymer science and polymer-based materials. The current solution of using a variety of different chemical identifiers has proven insufficient to address this challenge and is not intuitive for researchers. This work proposes a multi-algorithm mapping methodology, ChemProps, optimized to solve the polymer indexing issue with an easy-to-update design in both depth and breadth. A RESTful API is provided for lightweight data exchange and easy integration across data systems. A weight factor is assigned to each algorithm to generate scores for candidate chemical names; the weight factors are optimized to maximize the minimum score difference between the ground-truth chemical name and the other candidate names. Ten-fold validation on the 160 training data points is used to prevent overfitting. The resulting set of weight factors achieves 100% accuracy on the 54 test data points, and the weight factors will continue to evolve as ChemProps grows. With ChemProps, other polymer databases can remove duplicate entries and enable a more accurate “search by SMILES” function by using ChemProps as a common name-to-SMILES translator through API calls. ChemProps is also an excellent tool for auto-populating polymer properties thanks to its easy-to-update design.
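To make the scoring scheme concrete, here is a minimal sketch of how per-algorithm scores for candidate standard names might be combined with weight factors and ranked. The algorithm names, weights and scores are hypothetical placeholders for illustration, not the actual ChemProps implementation, and the paper's optimization of the weights (maximizing the minimum margin between the ground-truth name and the runners-up) is not reproduced here.

```python
# Minimal sketch of weighted multi-algorithm name scoring (hypothetical
# algorithms, weights and scores; not the actual ChemProps implementation).

def rank_candidates(candidate_scores, weights):
    """Combine per-algorithm scores with weight factors and rank candidates.

    candidate_scores: dict mapping candidate name -> dict of algorithm -> score
    weights: dict mapping algorithm -> weight factor
    """
    totals = {
        name: sum(weights[alg] * score for alg, score in scores.items())
        for name, scores in candidate_scores.items()
    }
    # Highest combined score wins; ties would need a tie-breaking rule.
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical per-algorithm scores for three candidate standard names.
candidates = {
    "poly(methyl methacrylate)": {"exact_match": 0.0, "fuzzy_match": 0.9, "abbreviation": 1.0},
    "poly(methyl acrylate)":     {"exact_match": 0.0, "fuzzy_match": 0.7, "abbreviation": 0.2},
    "polymethylpentene":         {"exact_match": 0.0, "fuzzy_match": 0.3, "abbreviation": 0.1},
}
weights = {"exact_match": 3.0, "fuzzy_match": 1.5, "abbreviation": 2.0}

for name, total in rank_candidates(candidates, weights):
    print(f"{total:5.2f}  {name}")
```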


2021
Author(s):
Mathias Sablé-Meyer
Janek Guerrini
Salvador Mascarenhas

We show that probabilistic decision-making behavior characteristic of reasoning by representativeness or typicality arises in minimalistic settings lacking many of the features previously thought to be necessary conditions for the phenomenon. Specifically, we develop a version of a classical experiment by Kahneman and Tversky (1973) on base-rate neglect, where participants have full access to the probabilistic distribution, conveyed entirely visually and without reliance on familiar stereotypes, rich descriptions, or individuating information. We argue that the notion of evidential support as studied in (Bayesian) confirmation theory offers a good account of our experimental findings, as has been proposed for related data points from the representativeness literature. In a nutshell, when faced with competing alternatives to choose from, humans are sometimes less interested in picking the option with the highest probability of being true (posterior probability), and instead choose the option best supported by available evidence. We point out that this theoretical avenue is descriptively powerful, but has an as-yet unclear explanatory dimension. Building on approaches to reasoning from linguistic semantics, we propose that the chief trigger of confirmation-theoretic mechanisms in deliberate reasoning is a linguistically-motivated tendency to interpret certain experimental setups as intrinsically contrastive, in a way best cashed out by modern linguistic semantic theories of questions. These questions generate pragmatic pressures for interpreting surrounding information as having been meant to help answer the question, which will naturally give rise to confirmation-theoretic effects, very plausibly as a byproduct of iterated Bayesian update as proposed by modern Bayesian theories of relevance-based reasoning in pragmatics. Our experiment provides preliminary but tantalizing evidence in favor of this hypothesis, as participants displayed significantly more confirmation-theoretic behavior in a condition that highlighted the question-like, contrastive nature of the task.
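The contrast the authors draw between posterior probability and evidential support can be illustrated with a small worked example. The numbers below are invented and are not taken from the experiment; they simply show how, when a base rate is lopsided, the hypothesis with the lower posterior can nonetheless be the one better supported by the evidence on a likelihood-ratio measure of confirmation.

```python
# Worked toy example (invented numbers): posterior probability vs. evidential
# support for two competing hypotheses H1 and H2 given evidence E.

# Base rates (priors) and likelihoods of the evidence under each hypothesis.
prior = {"H1": 0.95, "H2": 0.05}
likelihood = {"H1": 0.20, "H2": 0.90}   # P(E | H)

# Posteriors via Bayes' rule, normalising over the two hypotheses.
evidence = sum(prior[h] * likelihood[h] for h in prior)
posterior = {h: prior[h] * likelihood[h] / evidence for h in prior}

# A simple confirmation measure: the likelihood ratio P(E | H) / P(E | not-H),
# which with two exhaustive hypotheses reduces to comparing the likelihoods.
def likelihood_ratio(h):
    other = "H2" if h == "H1" else "H1"
    return likelihood[h] / likelihood[other]

for h in ("H1", "H2"):
    print(h, f"posterior = {posterior[h]:.3f}",
          f"likelihood ratio = {likelihood_ratio(h):.2f}")

# H1 keeps the higher posterior (its base rate dominates), but H2 is far better
# supported by E -- the pattern exploited in base-rate neglect tasks.
```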


2019
Vol 6 (1)
Author(s):
Marcel Schweiker
Michael Kleber
Andreas Wagner

Abstract
Data were collected in the field, from an office building located in Frankfurt, Germany, over a period of four years. The building was designed as a low-energy building and featured natural ventilation for individual control of air quality, as well as buoyancy-driven night ventilation in combination with a central atrium as a passive cooling strategy. The monitored data comprise 116 data points in total, covering outdoor and indoor environmental conditions, energy-related quantities, and occupancy and occupant behaviour. Data points representing a state were logged with the real timestamp of the event taking place; all other data points were recorded at 10-minute intervals. Data were collected in 17 cell offices of ~20 m² each, facing either east or west. Each office has one fixed and two operable windows, internal top-light windows between office and corridor (to allow for night ventilation into the atrium) and sun-protection elements (operated both manually and automatically). Each office is occupied by one or two persons.
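One practical consequence of this mixed logging scheme (state changes stamped at the real event time alongside readings at fixed 10-minute intervals) is that the two kinds of series usually have to be aligned onto a common time grid before analysis. The sketch below shows one way to do this with pandas; the file names, column names and office number are hypothetical and are not taken from the published dataset.

```python
# Sketch: align event-logged window states with 10-minute interval sensor data.
# File and column names are hypothetical, not those of the published dataset.
import pandas as pd

# Interval data, e.g. indoor air temperature recorded every 10 minutes.
temp = pd.read_csv("office_17_temperature.csv",
                   parse_dates=["timestamp"], index_col="timestamp")

# State data: window open/closed events logged at the real time of the event.
window = pd.read_csv("office_17_window_events.csv",
                     parse_dates=["timestamp"], index_col="timestamp")

# Take the last state seen in each 10-minute bin, carry it forward over bins
# without events, and join it onto the interval data's time grid.
window_10min = window["window_state"].resample("10min").last().ffill()
merged = temp.join(window_10min, how="left").ffill()

print(merged.head())
```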


Author(s):  
Zenji Horita
Ryuzo Nishimachi
Takeshi Sano
Minoru Nemoto

Absorption correction is often required in quantitative x-ray microanalysis of thin specimens using the analytical electron microscope. For such correction it is convenient to use the extrapolation method [1], because the thickness, density and mass absorption coefficient are not needed; the characteristic x-ray intensities measured for the analysis are the only requirement for the absorption correction. However, to carry out the extrapolation it is imperative to obtain more than two data points at different thicknesses of identical composition. The method therefore encounters difficulty when analyzing a region comparable in size to the beam, or a specimen of uniform thickness. The purpose of this study is to modify the method so that extrapolation becomes feasible under such limited conditions. The applicability of the new form is examined using a standard sample, and the form is then applied to the quantification of phases in a Ni-Al-W ternary alloy.

The earlier equation for the extrapolation method was formulated on the basis that the magnitude of x-ray absorption increases with increasing thickness and that the intensity of a characteristic x-ray exhibiting negligible absorption in the specimen can be used as a measure of thickness.
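The conventional extrapolation procedure referred to above can be illustrated numerically: intensity ratios measured at several thicknesses are regressed against the intensity of a characteristic line with negligible absorption (the thickness proxy) and extrapolated to zero intensity, i.e. zero thickness, where the absorption effect vanishes. The numbers below are invented for illustration only and do not reproduce the modified formulation developed in this study.

```python
# Illustrative sketch of the conventional extrapolation correction (invented
# numbers; not the modified formulation developed in the paper).
import numpy as np

# Intensity of a weakly absorbed characteristic line, used as thickness proxy.
i_thickness_proxy = np.array([1000., 2000., 3000., 4000., 5000.])
# Measured intensity ratio I_A / I_B at the corresponding thicknesses; the
# ratio drifts with thickness because of differential absorption.
k_ratio = np.array([0.512, 0.524, 0.537, 0.549, 0.561])

# Linear fit and extrapolation to zero proxy intensity (i.e. zero thickness),
# where absorption is negligible and the ratio is free of absorption effects.
slope, intercept = np.polyfit(i_thickness_proxy, k_ratio, 1)
k_zero_thickness = intercept

print(f"absorption-free intensity ratio ~ {k_zero_thickness:.3f}")
```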


Author(s):  
W.C. de Bruijn
A.A.W. de Jong
C.W.J. Sorber

One aspect of enzyme cytochemistry is whether all macrophage lysosomal hydrolytic enzymes are present in an active form, or are activated upon stimulation. Integrated morphometrical and chemical analysis has been chosen as a tool to elucidate that cytochemical problem. Mouse peritoneal resident macrophages have been used as a model for this complicated integration of morphometrical and element-related data. Only aldehyde-fixed cells were treated with three cytochemical reactions to detect different enzyme activities within one cell (for details see [1,2]). The enzyme-related precipitates anticipated to be differentiated were: (1) lysosomal barium and sulphur from aryl sulphatase activity, (2) lysosomal cerium and phosphate from acid phosphatase activity, and (3) the platinum/diaminobenzidine (DAB) complex from endogenous peroxidase activity.


1966
Vol 05 (02)
pp. 67-74
Author(s):
W. I. Lourie
W. Haenszel

Quality control of data collected in the United States by the Cancer End Results Program, utilizing punchcards prepared by participating registries in accordance with a Uniform Punchcard Code, is discussed. Existing arrangements decentralize responsibility for editing and related data processing to the local registries, with centralization of tabulating and statistical services in the End Results Section, National Cancer Institute. The most recent deck of punchcards represented over 600,000 cancer patients; approximately 50,000 newly diagnosed cases are added annually.

Mechanical editing and inspection of punchcards and field audits are the principal tools for quality control. Mechanical editing of the punchcards includes testing for blank entries and detection of inadmissible or inconsistent codes; highly improbable codes are subjected to special scrutiny. Field audits include the drawing of a 1-10 percent random sample of punchcards submitted by a registry; the charts are then reabstracted and recoded by an NCI staff member, and differences between the punchcard and the results of the independent review are noted.
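In present-day terms, the mechanical editing step described here amounts to record-level validation rules: tests for blank entries, rejection of inadmissible codes and flagging of inconsistent code combinations. The sketch below is a hypothetical modern rendering of such checks; the field names and code lists are invented and do not reflect the Uniform Punchcard Code.

```python
# Hypothetical modern rendering of the mechanical editing checks described
# above (blank entries, inadmissible codes, inconsistent combinations).
# Field names and code lists are invented, not the Uniform Punchcard Code.

ADMISSIBLE_SEX = {"1", "2"}               # e.g. 1 = male, 2 = female
ADMISSIBLE_SITE = {"140", "162", "185"}   # a few illustrative site codes

def edit_record(record):
    """Return a list of edit failures for one abstracted case record."""
    errors = []
    # Test for blank entries.
    for field in ("sex", "site", "year_of_diagnosis"):
        if not record.get(field):
            errors.append(f"blank entry: {field}")
    # Detect inadmissible codes.
    if record.get("sex") and record["sex"] not in ADMISSIBLE_SEX:
        errors.append("inadmissible code: sex")
    if record.get("site") and record["site"] not in ADMISSIBLE_SITE:
        errors.append("inadmissible code: site")
    # Detect inconsistent codes, e.g. a prostate site code recorded for a
    # patient coded as female (site 185 and sex 2 are illustrative values).
    if record.get("site") == "185" and record.get("sex") == "2":
        errors.append("inconsistent codes: site vs. sex")
    return errors

print(edit_record({"sex": "2", "site": "", "year_of_diagnosis": "1964"}))
# -> ['blank entry: site']
```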


1997
Vol 78 (02)
pp. 855-858
Author(s):
Armando Tripodi
Veena Chantarangkul
Marigrazia Clerici
Barbara Negri
Pier Mannuccio Mannucci

Summary
A key issue for the reliable use of new devices for the laboratory control of oral anticoagulant therapy with the INR is their conformity to the calibration model. In the past, their adequacy has mostly been assessed empirically, without reference to the calibration model or the use of International Reference Preparations (IRP) for thromboplastin. In this study we reviewed the requirements to be fulfilled and applied them to the calibration of a new near-patient testing device (TAS, Cardiovascular Diagnostics) which uses thromboplastin-containing test cards for determination of the INR. On each of 10 working days, citrated whole blood and plasma samples were obtained from 2 healthy subjects and 6 patients on oral anticoagulants. PT testing on whole blood and plasma was done with the TAS, with parallel testing of plasma by the manual technique with the IRP CRM 149S. Conformity to the calibration model was judged satisfactory if the following requirements were met: (i) there was a linear relationship between paired log-PTs (TAS vs CRM 149S); (ii) the regression line drawn through the patients’ data points passed through those of the normals; (iii) the precision of the calibration, expressed as the CV of the slope, was <3%. A good linear relationship was observed for the calibration plots for both plasma and whole blood (r = 0.98). Regression lines drawn through the patients’ data points passed through those of the normals. The CVs of the slope were 2.2% in both cases, and the ISIs were 0.965 and 1.000 for whole blood and plasma, respectively. In conclusion, our study shows that near-patient testing devices can be considered reliable tools to measure the INR in patients on oral anticoagulants, and it provides guidelines for their evaluation.
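Requirements (i) and (iii) can be checked numerically once paired log-PTs are available. The sketch below uses an ordinary least-squares fit and invented prothrombin times purely for illustration; the WHO calibration model actually prescribes an orthogonal regression against the reference thromboplastin, and requirement (ii), concerning the line passing through the normal donors' points, is not reproduced here.

```python
# Sketch of calibration checks (i) and (iii) on invented data. Ordinary least
# squares is used for brevity; the WHO model prescribes orthogonal regression.
import numpy as np

# Hypothetical paired prothrombin times in seconds: reference IRP vs. new device.
pt_reference = np.array([12.1, 13.0, 24.5, 30.2, 36.8, 42.0, 48.5, 55.1])
pt_device    = np.array([12.4, 13.3, 25.1, 31.0, 37.9, 43.5, 50.2, 57.0])

x, y = np.log(pt_reference), np.log(pt_device)

# Requirement (i): linearity of the paired log-PTs.
r = np.corrcoef(x, y)[0, 1]

# Slope and intercept of the calibration line.
slope, intercept = np.polyfit(x, y, 1)

# Requirement (iii): precision of the calibration, CV of the slope < 3%.
n = len(x)
residuals = y - (slope * x + intercept)
se_slope = np.sqrt(np.sum(residuals**2) / (n - 2) / np.sum((x - x.mean())**2))
cv_slope = 100 * se_slope / slope

# Under the WHO scheme the device ISI would follow as ISI_reference * slope.
print(f"r = {r:.3f}, slope = {slope:.3f}, CV(slope) = {cv_slope:.1f}%")
```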


Author(s):  
Uppuluri Sirisha
G. Lakshme Eswari

This paper briefly introduces the Internet of Things (IoT) as intelligent connectivity among physical objects or devices, which is producing large gains in efficiency, quality of life and business growth. The IoT is a global network that already interconnects around 46 million smart meters in the U.S. alone, generating 1.1 billion data points per day [1]. The total installed base of IoT-connected devices is projected to grow to 75.44 billion globally by 2025, bringing further gains in business, productivity, government efficiency, lifestyle, etc. This paper addresses the serious concerns of effective security and privacy needed to ensure confidentiality, integrity, authentication and access control among these devices.


2018
Vol 8 (3)
pp. 247-266
Author(s):  
Michelle L. Wilson

Initially, Oliver Twist (1839) might seem representative of the archetypal male social plot, following an orphan and finding him a place by discovering the father and settling the boy within his inheritance. But Agnes Fleming haunts this narrative, undoing its neat, linear transmission. This reconsideration of maternal inheritance and plot in the novel occurs against the backdrop of legal and social change. I extend the critical consideration of the novel's relationship to the New Poor Law by thinking about its reflection on the bastardy clauses. And here, of course, is where the mother enters. Under the bastardy clauses, the responsibility for economic maintenance of bastard children was, for the first time, legally assigned to the mother, relieving the father of any and all obligation. Oliver Twist manages to critique the bastardy clauses for their release of the father, while simultaneously embracing the placement of the mother at the head of the family line. Both Oliver and the novel thus suggest that it is the mother's story that matters, her name through which we find our own. And by containing both plots – that of the father and the mother – Oliver Twist reveals the violence implicit in traditional modes of inheritance in the novel and under the law.

