K-shell line ratios as a powerful constraint on plasma conditions in MagLIF experiments

2020 ◽  
Author(s):  
Patrick Knapp
2021 ◽  
Vol 17 (12) ◽  
Author(s):  
Kara L. Feilich ◽  
J. D. Laurence-Chasen ◽  
Courtney Orsbon ◽  
Nicholas J. Gidmark ◽  
Callum F. Ross

Three-dimensional (3D) tongue movements are central to performance of feeding functions by mammals and other tetrapods, but 3D tongue kinematics during feeding are poorly understood. Tongue kinematics were recorded during grape chewing by macaque primates using biplanar videoradiography. Complex shape changes in the tongue during chewing are dominated by a combination of flexion in the tongue's sagittal planes and roll about its long axis. As hypothesized for humans, in macaques during tongue retraction, the middle (molar region) of the tongue rolls to the chewing (working) side simultaneous with sagittal flexion, while the tongue tip flexes to the other (balancing) side. Twisting and flexion reach their maxima early in the fast close phase of chewing cycles, positioning the food bolus between the approaching teeth prior to the power stroke. Although 3D tongue kinematics undoubtedly vary with food type, the mechanical role of this movement—placing the food bolus on the post-canine teeth for breakdown—is likely to be a powerful constraint on tongue kinematics during this phase of the chewing cycle. The muscular drivers of these movements are likely to include a combination of intrinsic and extrinsic tongue muscles.



2018 ◽  
Vol 99 (3) ◽  
pp. 547-567 ◽  
Author(s):  
Annmarie G. Carlton ◽  
Joost de Gouw ◽  
Jose L. Jimenez ◽  
Jesse L. Ambrose ◽  
Alexis R. Attwood ◽  
...  

Abstract The Southeast Atmosphere Studies (SAS), which included the Southern Oxidant and Aerosol Study (SOAS); the Southeast Nexus (SENEX) study; and the Nitrogen, Oxidants, Mercury and Aerosols: Distributions, Sources and Sinks (NOMADSS) study, was deployed in the field from 1 June to 15 July 2013 in the central and eastern United States, and it overlapped with and was complemented by the Studies of Emissions, Atmospheric Composition, Clouds and Climate Coupling by Regional Surveys (SEAC4RS) campaign. SAS investigated atmospheric chemistry and the associated air quality and climate-relevant particle properties. Coordinated measurements from six ground sites, four aircraft, tall towers, balloon-borne sondes, existing surface networks, and satellites provide in situ and remotely sensed data on trace-gas composition, aerosol physicochemical properties, and local and synoptic meteorology. Selected SAS findings indicate 1) dramatically reduced NOx concentrations have altered ozone production regimes; 2) indicators of “biogenic” secondary organic aerosol (SOA), once considered part of the natural background, were positively correlated with one or more indicators of anthropogenic pollution; and 3) liquid water dramatically impacted particle scattering while biogenic SOA did not. SAS findings suggest that atmosphere–biosphere interactions modulate ambient pollutant concentrations through complex mechanisms and feedbacks not yet adequately captured in atmospheric models. The SAS dataset, now publicly available, is a powerful constraint for developing predictive capability that enhances model representation of the response of atmospheric composition, and its subsequent impacts, to changes in emissions, chemistry, and meteorology.



2020 ◽  
Vol 492 (4) ◽  
pp. 5930-5939 ◽  
Author(s):  
Shengdong Lu ◽  
Dandan Xu ◽  
Yunchong Wang ◽  
Shude Mao ◽  
Junqiang Ge ◽  
...  

ABSTRACT We investigate the Fundamental Plane (FP) evolution of early-type galaxies in the IllustrisTNG-100 simulation (TNG100) from redshift z = 0 to z = 2. We find that a tight plane relation already exists as early as z = 2. Its scatter stays as low as ∼0.08 dex across this redshift range. Both slope parameters b and c (where R ∝ σ^b I^c, with R, σ, and I being the typical size, velocity dispersion, and surface brightness) of the plane evolve mildly since z = 2, roughly consistent with observations. The FP residual Res (≡ a + b log σ + c log I − log R, where a is the zero-point of the FP) is found to strongly correlate with stellar age, indicating that stellar age can be used as a crucial fourth parameter of the FP. However, we find that 4c + b + 2 = δ, where δ ∼ 0.8 for FPs in TNG, rather than zero as is typically inferred from observations. This implies that a tight power-law relation between the dynamical mass-to-light ratio M_dyn/L and the dynamical mass M_dyn (where M_dyn ≡ 5σ²R/G, with G being the gravitational constant) is not present in the TNG100 simulation. Recovering such a relation requires proper mixing between dark matter and baryons, as well as star formation occurring with correct efficiencies at the right mass scales. This represents a powerful constraint on the numerical models, which has to be satisfied in future hydrodynamical simulations.
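As a rough illustration of the plane fit described above, the sketch below fits log R = a + b log σ + c log I to synthetic data by least squares, computes the residual scatter, and evaluates the combination 4c + b + 2. All numbers (b, c, the scatter, the sample) are invented for illustration; they are not the TNG100 results.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "galaxies": draw log sigma and log I, then build log R from an
# assumed plane (b = 1.3, c = -0.8 are illustrative values, not TNG's).
n = 500
log_sigma = rng.normal(2.2, 0.15, n)   # log10 velocity dispersion
log_I = rng.normal(2.5, 0.30, n)       # log10 surface brightness
a_true, b_true, c_true = 0.5, 1.3, -0.8
log_R = a_true + b_true * log_sigma + c_true * log_I + rng.normal(0, 0.08, n)

# Least-squares fit of the plane log R = a + b log sigma + c log I
X = np.column_stack([np.ones(n), log_sigma, log_I])
coef, *_ = np.linalg.lstsq(X, log_R, rcond=None)
a, b, c = coef

# FP residual Res = a + b log sigma + c log I - log R, and the combination
# 4c + b + 2 from the abstract (zero would imply a tight power-law
# M_dyn/L -- M_dyn relation)
res = X @ coef - log_R
print(f"b = {b:.2f}, c = {c:.2f}, scatter = {res.std():.3f} dex")
print(f"4c + b + 2 = {4*c + b + 2:.2f}")
```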



2014 ◽  
Vol 29 (36) ◽  
pp. 1450197 ◽  
Author(s):  
Chao-Jun Feng ◽  
Xin-Zhou Li

The measurement of the tensor-to-scalar ratio r, through detection of B-mode polarization, provides a very powerful constraint on theoretical inflation models. In this paper, we propose a single inflation model with an infinite power series potential, called the Infinite Power Series (IPS) inflation model, which is consistent with the latest observations from Planck and BICEP2. Furthermore, we find that in the IPS model the absolute value of the running of the spectral index increases as the tensor-to-scalar ratio becomes large; namely, both a large r ≈ 0.20 and a large running of the spectral index are realized in the IPS model. Meanwhile, the number of e-folds, N ≈ 50–60, is also large enough to solve the problems of Big Bang cosmology.
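The slow-roll quantities behind r and the spectral index can be sketched numerically. The snippet below uses an illustrative truncated power-series potential — the coefficients and field value are invented, not the IPS model's — and computes ε = (1/2)(V′/V)² and η = V″/V in reduced Planck units, then r = 16ε and n_s = 1 − 6ε + 2η at leading order.

```python
import numpy as np

# Illustrative truncated power-series potential V(phi) = sum_n c_n phi^n.
# Coefficients are made up for illustration; units: reduced Planck mass = 1.
coeffs = [0.0, 0.0, 0.5, 0.1, 0.02]          # c_0..c_4
V = np.polynomial.Polynomial(coeffs)
Vp = V.deriv()                               # V'
Vpp = Vp.deriv()                             # V''

def slow_roll(phi):
    eps = 0.5 * (Vp(phi) / V(phi)) ** 2      # epsilon = (1/2)(V'/V)^2
    eta = Vpp(phi) / V(phi)                  # eta = V''/V
    return eps, eta

phi = 15.0                                   # illustrative field value
eps, eta = slow_roll(phi)
r = 16 * eps                                 # tensor-to-scalar ratio
ns = 1 - 6 * eps + 2 * eta                   # scalar spectral index
print(f"epsilon = {eps:.4f}, eta = {eta:.4f}")
print(f"r = {r:.3f}, n_s = {ns:.3f}")
```

Scanning φ over the potential, or adding higher-order terms, changes how r trades off against the running; these toy numbers are not tuned to match Planck or BICEP2.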



Entropy ◽  
2018 ◽  
Vol 20 (12) ◽  
pp. 978 ◽  
Author(s):  
Steven Frank

The fundamental equations of various disciplines often seem to share the same basic structure. Natural selection increases information in the same way that Bayesian updating increases information. Thermodynamics and the forms of common probability distributions express maximum increase in entropy, which appears mathematically as loss of information. Physical mechanics follows paths of change that maximize Fisher information. The information expressions typically have analogous interpretations as the Newtonian balance between force and acceleration, representing a partition between the direct causes of change and the opposing changes in the frame of reference. This web of vague analogies hints at a deeper common mathematical structure. I suggest that the Price equation expresses that underlying universal structure. The abstract Price equation describes dynamics as the change between two sets. One component of dynamics expresses the change in the frequency of things, holding constant the values associated with things. The other component of dynamics expresses the change in the values of things, holding constant the frequency of things. The separation of frequency from value generalizes Shannon’s separation of the frequency of symbols from the meaning of symbols in information theory. The Price equation’s generalized separation of frequency and value reveals a few simple invariances that define universal geometric aspects of change. For example, the conservation of total frequency, although a trivial invariance by itself, creates a powerful constraint on the geometry of change. That constraint plus a few others seem to explain the common structural forms of the equations in different disciplines. From that abstract perspective, interpretations such as selection, information, entropy, force, acceleration, and physical work arise from the same underlying geometry expressed by the Price equation.
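The frequency/value separation described above is the standard Price equation, w̄Δz̄ = Cov(w, z) + E(wΔz): a selection term from changing frequencies with values held fixed, plus a transmission term from changing values with frequencies held fixed. A minimal numerical check, with made-up frequencies, fitnesses, and values:

```python
import numpy as np

# Toy population: three types with parent frequencies q, fitnesses w
# (which determine offspring frequencies), parent values z, offspring
# values z2. All numbers are illustrative.
q  = np.array([0.5, 0.3, 0.2])    # parent frequencies (sum to 1)
w  = np.array([1.2, 0.9, 0.6])    # fitnesses
z  = np.array([1.0, 2.0, 3.0])    # parent values
z2 = np.array([1.1, 2.0, 2.8])    # offspring values

wbar = np.sum(q * w)
q2 = q * w / wbar                  # offspring frequencies
dz = z2 - z

# Price equation: wbar * d(zbar) = Cov(w, z) + E(w * dz),
# expectations and covariance taken over parent frequencies q.
cov_wz = np.sum(q * w * z) - wbar * np.sum(q * z)   # selection term
e_wdz  = np.sum(q * w * dz)                          # transmission term

dzbar_direct = np.sum(q2 * z2) - np.sum(q * z)       # change computed directly
dzbar_price  = (cov_wz + e_wdz) / wbar               # change via the two terms
print(f"direct: {dzbar_direct:.4f}  Price: {dzbar_price:.4f}")
```

The conservation of total frequency mentioned in the abstract appears here as the normalization of q2 by w̄, which is what constrains the geometry of the frequency change.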



2019 ◽  
Author(s):  
Kaidi Wu ◽  
David Alan Dunning

We examine hypocognition, in which people lack a cognitive or linguistic representation of a concept necessary to identify, interpret, or remember instances of it. Six studies (N = 2,085) revealed that hypocognition degrades retention of fundamental information in everyday living, such as frequency of encounter. Hypocognitive participants reported encountering instances of a concept less often compared to those who knew the concept (Study 1). They failed to discern the presence, and to encode the frequency, of objects for which they were hypocognitive, as when American participants observed exotic fruits (Studies 2A & 3) and alphabetic letters rendered as unfamiliar symbols (Studies 2B & 5). Hypocognition occurs across cultures: British participants tracked the frequency of Asian dumplings less accurately than Chinese participants, who tracked the frequency of cheese less accurately than the British (Study 4). Lacking an underlying concept impedes remembering even when novel verbal labels are present (Study 5). Finite channels of conceptual knowledge impose a powerful constraint on what people identify, recognize, and remember in their everyday environment. The concepts that people lack impoverish their experience with the world.



2018 ◽  
Vol 63 ◽  
pp. 191-264 ◽  
Author(s):  
Antoine Amarilli ◽  
Michael Benedikt ◽  
Pierre Bourhis ◽  
Michael Vanden Boom

We consider entailment problems involving powerful constraint languages such as frontier-guarded existential rules, in which we impose additional semantic restrictions on a set of distinguished relations. We consider restricting a relation to be transitive, restricting a relation to be the transitive closure of another relation, and restricting a relation to be a linear order. We give some natural variants of guardedness that allow inference to be decidable in each case, and isolate the complexity of the corresponding decision problems. Finally, we show that slight changes in these conditions lead to undecidability.
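To make the transitive-closure restriction concrete: declaring "T is the transitive closure of R" forces T to be the smallest transitive relation containing R, so facts like T(a, d) can be entailed without any corresponding R-edge. A small sketch (with an invented toy relation) of that semantics:

```python
def transitive_closure(edges):
    """Smallest transitive relation containing `edges` (pairs over a finite domain)."""
    closure = set(edges)
    changed = True
    while changed:
        changed = False
        # Add (a, d) whenever (a, b) and (b, d) are already present
        new = {(a, d) for (a, b) in closure for (c, d) in closure if b == c}
        if not new <= closure:
            closure |= new
            changed = True
    return closure

# Base relation R and its forced transitive closure T
R = {("a", "b"), ("b", "c"), ("c", "d")}
T = transitive_closure(R)
print(sorted(T))
# ("a", "d") is entailed in T even though it is not an R-edge
```

The decidability questions in the article concern reasoning about such restrictions over *unbounded* (not fixed, finite) domains, which is where guardedness conditions become essential; this snippet only illustrates the intended semantics on a finite instance.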



2013 ◽  
Vol 47 ◽  
pp. 393-439 ◽  
Author(s):  
N. Taghipour ◽  
D. Fierens ◽  
J. Davis ◽  
H. Blockeel

Lifted probabilistic inference algorithms exploit regularities in the structure of graphical models to perform inference more efficiently. More specifically, they identify groups of interchangeable variables and perform inference once per group, as opposed to once per variable. The groups are defined by means of constraints, so the flexibility of the grouping is determined by the expressivity of the constraint language. Existing approaches for exact lifted inference use specific languages for (in)equality constraints, which often have limited expressivity. In this article, we decouple lifted inference from the constraint language. We define operators for lifted inference in terms of relational algebra operators, so that they operate on the semantic level (the constraints' extension) rather than on the syntactic level, making them language-independent. As a result, lifted inference can be performed using more powerful constraint languages, which provide more opportunities for lifting. We empirically demonstrate that this can improve inference efficiency by orders of magnitude, allowing exact inference where until now only approximate inference was feasible.
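The gain from grouping interchangeable variables can be sketched with a toy example: a factor over n exchangeable Boolean variables whose value depends only on how many are true. Ground inference sums over all 2^n assignments; a lifted computation sums over the n + 1 possible counts, weighting each by a binomial coefficient. (The factor and n here are made up for illustration and do not come from the article.)

```python
from math import comb
from itertools import product

# n interchangeable Boolean variables; the factor depends only on the
# count k of True variables, so all assignments with the same k are
# interchangeable and can be handled as one group.
n = 12

def factor(k):
    return 2.0 ** k        # illustrative potential on the count

# Ground partition function: enumerate all 2^n assignments
Z_ground = sum(factor(sum(x)) for x in product([0, 1], repeat=n))

# Lifted partition function: one term per count, n + 1 terms in total
Z_lifted = sum(comb(n, k) * factor(k) for k in range(n + 1))
print(Z_ground, Z_lifted)
```

The constraint language determines which groups of variables can be recognized as interchangeable; the article's point is that working on the constraints' extension, via relational algebra, decouples this grouping from any particular syntax.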



2020 ◽  
Author(s):  
James Annan ◽  
Julia Hargreaves ◽  
Thorsten Mauritsen ◽  
Bjorn Stevens

We examine what can be learnt about climate sensitivity from variability in the surface air temperature record over the instrumental period, from around 1880 to the present. While many previous studies have used the trend in the time series to constrain equilibrium climate sensitivity, it has recently been argued that temporal variability may also be a powerful constraint. We explore this question in the context of a simple, widely used energy balance model of the climate system. We consider two recently proposed summary measures of variability and also show how the full information content can be optimally used in this idealised scenario. We find that the constraint provided by variability is inherently skewed and its power is inversely related to the sensitivity itself, discriminating most strongly between low sensitivity values and weakening substantially for higher values. As a result, it is only when the sensitivity is very low that variability can provide a tight constraint. Our results support the analysis of variability as a potentially useful tool in helping to constrain equilibrium climate sensitivity, but suggest caution in the interpretation of precise results.
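A minimal sketch of the kind of energy balance model referred to above: a one-box model C dT/dt = −λT + ξ(t) driven by stochastic forcing, simulated for two feedback strengths. All parameter values are illustrative, not the paper's; the point is only the qualitative link between sensitivity (S = F_2x/λ) and the size and persistence of temperature variability.

```python
import numpy as np

rng = np.random.default_rng(1)

# One-box stochastic energy balance model: C dT/dt = -lambda*T + xi(t).
# Illustrative parameters; lower lambda means higher equilibrium climate
# sensitivity, longer memory, and larger temperature variance.
C = 8.0          # effective heat capacity, W yr m^-2 K^-1
F2x = 3.7        # forcing for doubled CO2, W m^-2
sigma_f = 0.5    # stochastic forcing amplitude, W m^-2
dt = 1.0 / 12    # monthly steps, in years
n = 12 * 140     # ~140-year record, roughly the instrumental period

def simulate(lam):
    T = np.zeros(n)
    for t in range(1, n):
        noise = sigma_f * rng.normal() / np.sqrt(dt)  # white-noise forcing
        T[t] = T[t - 1] + dt * (-lam * T[t - 1] + noise) / C
    return T

for lam in (2.5, 1.0):   # strong vs weak feedback (low vs high sensitivity)
    T = simulate(lam)
    ecs = F2x / lam
    r1 = np.corrcoef(T[:-1], T[1:])[0, 1]   # lag-1 autocorrelation
    print(f"ECS = {ecs:.1f} K: std = {T.std():.3f} K, lag-1 r = {r1:.3f}")
```

In this idealised setting the stationary variance scales like 1/(λC), which is why, as the abstract argues, variability discriminates sharply among low-sensitivity (large λ) values but saturates for high-sensitivity ones.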




