Predictive structure or paradigm size? Investigating the effects of i-complexity and e-complexity on the learnability of morphological systems

2021 ◽  
Vol 9 (1) ◽  
Author(s):  
Tamar Johnson ◽  
Kexin Gao ◽  
Kenny Smith ◽  
Hugh Rabagliati ◽  
Jennifer Culbertson

Research on cross-linguistic differences in morphological paradigms reveals a wide range of variation on many dimensions, including the number of categories expressed, the number of unique forms, and the number of inflectional classes. However, in an influential paper, Ackerman & Malouf (2013) argue that there is one dimension on which languages do not differ widely: in predictive structure. Predictive structure in a paradigm describes the extent to which forms predict each other, called i-complexity. Ackerman & Malouf (2013) show that although languages differ according to a measure of surface paradigm complexity, called e-complexity, they tend to have low i-complexity. They conclude that morphological paradigms have evolved under a pressure for low i-complexity, such that even paradigms with very high e-complexity are relatively easy to learn so long as they have low i-complexity. While this would potentially explain why languages are able to maintain large paradigms, recent work by Johnson et al. (submitted) suggests that both neural networks and human learners may actually be more sensitive to e-complexity than i-complexity. Here we build on this work, reporting a series of experiments under more realistic learning conditions which confirm that indeed, across a range of paradigms that vary in either e- or i-complexity, neural networks (LSTMs) are sensitive to both, but show a larger effect of e-complexity (and other measures associated with size and diversity of forms). In human learners, we fail to find any effect of i-complexity at all. Further, analysis of a large number of randomly generated paradigms shows that e- and i-complexity are negatively correlated: paradigms with high e-complexity necessarily show low i-complexity. These findings suggest that the observations made by Ackerman & Malouf (2013) for natural language paradigms may stem from the nature of these measures rather than learning pressures specially attuned to i-complexity.
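The contrast between the two measures can be made concrete on a toy paradigm. Following Ackerman & Malouf (2013), i-complexity can be approximated as the average conditional entropy between paradigm cells (how well one cell's form predicts another's), while e-complexity tracks surface facts such as the number of distinct forms and inflection classes. A minimal sketch, with uniform weighting of classes as a simplifying assumption (the literature typically weights by type frequency):

```python
from collections import Counter
from itertools import permutations
from math import log2

def conditional_entropy(pairs):
    """H(Y|X) in bits, estimated from a list of (x, y) observations."""
    n = len(pairs)
    joint = Counter(pairs)
    marg = Counter(x for x, _ in pairs)
    return -sum((c / n) * log2(c / marg[x]) for (x, _), c in joint.items())

def i_complexity(classes, cells):
    """Average H(cell_j | cell_i) over ordered pairs of paradigm cells.
    Each inflection class is a dict mapping cell -> exponent."""
    ents = [conditional_entropy([(cl[ci], cl[cj]) for cl in classes])
            for ci, cj in permutations(cells, 2)]
    return sum(ents) / len(ents)

# Hypothetical three-class, two-cell paradigm: knowing the singular exponent
# only partially predicts the plural, and vice versa.
classes = [{"sg": "-a", "pl": "-i"},
           {"sg": "-o", "pl": "-i"},
           {"sg": "-a", "pl": "-e"}]
print(i_complexity(classes, ["sg", "pl"]))  # ≈ 0.667 bits
```

On this toy system a crude e-complexity measure would simply count surface material, e.g. `len({f for cl in classes for f in cl.values()})` distinct exponents, which is the kind of size/diversity quantity the experiments above found learners most sensitive to.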


1958 ◽  
Vol 31 (3) ◽  
pp. 559-561
Author(s):  
E. M. Bevilacqua

Abstract Tetramethylthiuram disulfide (TMTD), together with a sufficient amount of zinc oxide, is the simplest and most efficient vulcanizing agent known for introduction of sulfur crosslinks into rubber. No other combination approaches it in efficiency over a wide range of temperatures, although under limited conditions sulfur cures can be obtained with approximately as high a crosslink density per mole of vulcanizing agent. The stoichiometry of vulcanization with this curative has been explored in some detail in recent work of Scheele and coworkers. They have confirmed the observation by Jarrijon that during vulcanization dithiocarbamate is formed equivalent to close to two-thirds of the TMTD taken. The generality of this result was established in an exhaustive series of experiments, covering a range of structures of thiuram disulfide and of polyolefin, of temperature, and of concentration of reactants.


Author(s):  
Gerald B. Feldewerth

In recent years an increasing emphasis has been placed on the study of high temperature intermetallic compounds for possible aerospace applications. One group of interest is the B2 aluminides. This group of intermetallics has a very high melting temperature, good high-temperature properties, and excellent specific strength. These qualities make it a candidate for applications such as turbine engines. The B2 aluminides exist over a wide range of compositions and also have a large solubility for third-element substitutional additions, which may allow alloying additions to overcome their major drawback, their brittle nature. One B2 aluminide currently being studied is cobalt aluminide. Optical microscopy of CoAl alloys produced at the University of Missouri-Rolla showed a dramatic decrease in the grain size, which affects the yield strength and flow stress of long-range ordered alloys, and a change in the grain shape with the addition of 0.5 % boron.


2004 ◽  
pp. 21-29
Author(s):  
G.V. Pyrog

In domestic scientific and public opinion, interest in religion as a new worldview paradigm is very high. Today's attention to the Christian religion in our society is connected, in our opinion, with the specificity of its value system, which distinguishes it from other forms of consciousness: the idea of God, the absolute, the eternity of moral norms. As a result, its historical forms are not accurately characterized and carry little weight in the mass consciousness. Modern religious beliefs do not always arise as a direct result of church preaching. Emerging religious values are absorbed within a wide range of philosophical, artistic, and ethical ideas, acting as compensation for what is generally defined as spirituality. At the same time, the appeal to Christian values has become very popular.


Alloy Digest ◽  
1993 ◽  
Vol 42 (2) ◽  

Abstract Durcomet 100 is an improved version of Alloy CD-4 MCu with better corrosion and wear resistance. The alloy is used in the annealed condition and possesses excellent corrosion resistance over a wide range of corrosive environments. Mechanical strength is also very high. This datasheet provides information on composition, physical properties, hardness, and tensile properties, as well as fracture toughness. It also includes information on corrosion resistance as well as heat treating and joining. Filing Code: SS-540. Producer or source: Duriron Company Inc.


2019 ◽  
pp. 28-34
Author(s):  
Margarita Castillo-Téllez ◽  
Beatriz Castillo-Téllez ◽  
Juan Carlos Ovando-Sierra ◽  
Luz María Hernández-Cruz

For millennia, humans have used hundreds of medicinal plants to treat diseases. Currently, many species with important characteristics are known to alleviate a wide range of health problems, mainly in rural areas, where the use of these resources is very high, even replacing scientific medicine almost completely. This paper presents the dehydration of medicinal plants grown in the State of Campeche using direct and indirect solar technologies, in order to evaluate the influence of air flow and temperature on the color of the final product via the CIE L*a*b* scale, analyzing water activity and humidity during the drying process. The experimental results showed that the direct solar dryer with forced convection produces only a slight color change in an average drying time of 400 min, preventing bacterial proliferation and reaching a final moisture content between 9 % and 11 %.
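Color change on the L*a*b* scale is conventionally summarized as a single ΔE value, the Euclidean distance between two measurements in L*a*b* space (the CIE76 formula); a ΔE of roughly 2 or less is commonly treated as barely perceptible. A minimal sketch, with the example coordinates below being hypothetical illustration values, not data from the study:

```python
from math import sqrt

def delta_e_cie76(lab1, lab2):
    """CIE76 colour difference: Euclidean distance between two
    (L*, a*, b*) triples."""
    return sqrt(sum((c1 - c2) ** 2 for c1, c2 in zip(lab1, lab2)))

# Hypothetical readings of a leaf sample before and after drying.
fresh = (52.1, -3.0, 18.4)
dried = (51.2, -2.6, 17.1)
print(delta_e_cie76(fresh, dried))
```

A dryer that "presents only a slight color change" would correspond to small ΔE values between the fresh and dried readings.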


Author(s):  
Svitlana Lobchenko ◽  
Tetiana Husar ◽  
Viktor Lobchenko

The article presents the results of studies of spermatozoa viability at different incubation times, concentrations, and diluents. Spermatozoa were diluted: 1) with their native plasma; 2) with medium 199; 3) with a mixture of equal volumes of plasma and medium 199. The experiment was designed to generate samples at sperm concentrations of 0.2, 0.1, 0.05, and 0.025 billion/ml, evaluated after 2, 4, 6, and 8 hours. Such a study opens a significant perspective, making it possible to examine various aspects of the subject over a wide range, and a series of experiments was accordingly conducted in this area. The data obtained were statistically processed, allowing the results of each stage of the study to be highlighted. In particular, this article identifies regularities between sperm viability, the type of diluent, and the degree of dilution, as evidenced by the data presented in the tables. Over the course of incubation, viability shows at least the strongest trend toward remaining high when sperm are diluted to a concentration of 0.1 billion/ml, regardless of the diluent used. For maintaining viability at this concentration, medium 199 is no better than native plasma, or than its mixture with an equal volume of plasma, at any incubation time. It is most often at this sperm concentration that viability shows the lowest coefficient of variation, regardless of the diluent used, which may indicate the greatest stability of the result under these conditions. Viability at a concentration of 0.1 billion/ml decreases statistically significantly only after 6 or even 8 hours of incubation.
If sperm are incubated for only 2 hours, the concentrations tested do not affect viability, regardless of the diluent used. Key words: boar, spermatozoa, sperm plasma, concentration, incubation, medium 199, activity, viability, dilution.
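The stability claim above rests on the coefficient of variation, the sample standard deviation expressed as a percentage of the mean, which allows spread to be compared across groups whose mean viabilities differ. A minimal sketch (the viability percentages are hypothetical illustration values, not data from the study):

```python
from statistics import mean, stdev

def coefficient_of_variation(values):
    """CV (%) = sample standard deviation / mean * 100.
    Lower CV means the measurements are more stable relative to their mean."""
    return stdev(values) / mean(values) * 100

# Hypothetical viability readings (%) for two dilution groups:
# similar means, but the second group is far more variable.
group_a = [78, 80, 82, 79, 81]
group_b = [60, 95, 70, 88, 72]
print(coefficient_of_variation(group_a))
print(coefficient_of_variation(group_b))
```

Because CV normalizes by the mean, a group can have a larger absolute spread yet still be judged more stable if its mean is proportionally higher.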


1994 ◽  
Vol 29 (3) ◽  
pp. 207-209 ◽  
Author(s):  
H. Puzicha

Effluents from point sources (industries, communities) and diffuse inputs introduce pollutants into the water of the river Rhine and cause a basic contaminant load. The aim is to establish a biological warning system to detect increased toxicity, complementing the already existing chemical-physical monitoring system. To cover a wide range of biocides, continuously operating biotests at different trophic levels (bacteria, algae, mussels, water fleas, fishes) have been developed and proven. These are tested for sensitivity to toxicants, reaction time, validity of data, and practical handling under field conditions at the river. Test-specific methods are found to differentiate between the normal range of variation and true alarm signals.


2020 ◽  
Vol 499 (3) ◽  
pp. 4418-4431 ◽  
Author(s):  
Sujatha Ramakrishnan ◽  
Aseem Paranjape

ABSTRACT We use the Separate Universe technique to calibrate the dependence of linear and quadratic halo bias b1 and b2 on the local cosmic web environment of dark matter haloes. We do this by measuring the response of halo abundances at fixed mass and cosmic web tidal anisotropy α to an infinite wavelength initial perturbation. We augment our measurements with an analytical framework developed in earlier work that exploits the near-lognormal shape of the distribution of α and results in very high precision calibrations. We present convenient fitting functions for the dependence of b1 and b2 on α over a wide range of halo mass for redshifts 0 ≤ z ≤ 1. Our calibration of b2(α) is the first demonstration to date of the dependence of non-linear bias on the local web environment. Motivated by previous results that showed that α is the primary indicator of halo assembly bias for a number of halo properties beyond halo mass, we then extend our analytical framework to accommodate the dependence of b1 and b2 on any such secondary property that has, or can be monotonically transformed to have, a Gaussian distribution. We demonstrate this technique for the specific case of halo concentration, finding good agreement with previous results. Our calibrations will be useful for a variety of halo model analyses focusing on galaxy assembly bias, as well as analytical forecasts of the potential for using α as a segregating variable in multitracer analyses.


2020 ◽  
Vol 6 (1) ◽  
Author(s):  
Malte Seemann ◽  
Lennart Bargsten ◽  
Alexander Schlaefer

Abstract Deep learning methods produce promising results when applied to a wide range of medical imaging tasks, including segmentation of artery lumen in computed tomography angiography (CTA) data. However, to perform sufficiently well, neural networks have to be trained on large amounts of high-quality annotated data. In the realm of medical imaging, annotations are not only quite scarce but also often not entirely reliable. To tackle both challenges, we developed a two-step approach for generating realistic synthetic CTA data for the purpose of data augmentation. In the first step, moderately realistic images are generated in a purely numerical fashion. In the second step, these images are improved by applying neural domain adaptation. We evaluated the impact of synthetic data on lumen segmentation via convolutional neural networks (CNNs) by comparing the resulting performances. Improvements of up to 5% in terms of Dice coefficient and 20% for Hausdorff distance represent a proof of concept that the proposed augmentation procedure can be used to enhance deep learning-based segmentation of artery lumen in CTA images.
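The Dice coefficient used to quantify the reported segmentation gain has a simple closed form, 2|A∩B| / (|A| + |B|), for a predicted and a reference binary mask. A minimal sketch for binary masks follows; the paper's other metric, the Hausdorff distance, is available in SciPy as `scipy.spatial.distance.directed_hausdorff`:

```python
import numpy as np

def dice_coefficient(pred, target):
    """Dice = 2|A ∩ B| / (|A| + |B|) for binary segmentation masks.
    Returns 1.0 when both masks are empty (perfect vacuous agreement)."""
    pred = np.asarray(pred, dtype=bool)
    target = np.asarray(target, dtype=bool)
    total = pred.sum() + target.sum()
    if total == 0:
        return 1.0
    return 2.0 * np.logical_and(pred, target).sum() / total

# Tiny hypothetical masks: prediction covers two pixels, ground truth one,
# and they overlap on one pixel -> Dice = 2*1 / (2+1) = 2/3.
print(dice_coefficient([[1, 1], [0, 0]], [[1, 0], [0, 0]]))
```

Dice rewards region overlap and is insensitive to the exact contour, which is why segmentation papers typically report a boundary-sensitive distance such as Hausdorff alongside it.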

