Vocal development in a large-scale crosslinguistic corpus

Author(s):  
Meg Cychosz ◽  
Alejandrina Cristia ◽  
Elika Bergelson ◽  
Marisa Casillas ◽  
Gladys Baudet ◽  
...  

This study evaluates whether early vocalizations develop in similar ways in children across diverse cultural contexts. We analyze data from daylong audio recordings of 49 children (1–36 months) from five different language/cultural backgrounds. Citizen scientists annotated these recordings to determine whether child vocalizations contained canonical transitions (e.g., "ba" versus "ee"). Results revealed that the proportion of clips reported to contain canonical transitions increased with age. Further, this proportion exceeded 0.15 by around 7 months, replicating and extending previous findings on canonical vocalization development, but using data from the natural environments of a culturally and linguistically diverse sample. This work explores how crowdsourcing can be used to annotate corpora, helping establish developmental milestones relevant to multiple languages and cultures. Lower inter-annotator reliability on the crowdsourcing platform, relative to more traditional in-lab expert annotators, means that a larger number of unique annotators and/or annotations is required, and that crowdsourcing may not be a suitable method for more fine-grained annotation decisions. The audio clips used for this project are compiled into a large-scale infant vocal corpus that is available for other researchers to use in future work.
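A minimal sketch of how such crowdsourced labels might be aggregated into a developmental milestone estimate, assuming a hypothetical table of per-annotator clip judgments; the file, column names, and majority-vote aggregation are illustrative assumptions, not the authors' pipeline:

```python
# Sketch: aggregate crowdsourced clip annotations and locate the age at which
# the proportion of canonical clips passes the 0.15 milestone threshold.
# Columns (clip_id, age_months, is_canonical) are hypothetical.
import pandas as pd

annotations = pd.read_csv("clip_annotations.csv")  # one row per annotator x clip

# Majority vote across annotators for each clip
clip_votes = (annotations
              .groupby(["clip_id", "age_months"])["is_canonical"]
              .mean()
              .gt(0.5)
              .reset_index(name="canonical"))

# Proportion of canonical clips per age (in months)
prop_by_age = clip_votes.groupby("age_months")["canonical"].mean()

# First age at which the proportion exceeds 0.15
milestone_age = prop_by_age[prop_by_age > 0.15].index.min()
print(f"Canonical proportion first exceeds 0.15 at ~{milestone_age} months")
```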

Author(s):  
Anil S. Baslamisli ◽  
Partha Das ◽  
Hoang-An Le ◽  
Sezer Karaoglu ◽  
Theo Gevers

Abstract. In general, intrinsic image decomposition algorithms interpret shading as one unified component that includes all photometric effects. Because shading transitions are generally smoother than reflectance (albedo) changes, these methods may fail to distinguish strong photometric effects from reflectance variations. Therefore, in this paper, we propose to decompose the shading component into direct (illumination) and indirect (ambient light and shadows) shading subcomponents. The aim is to distinguish strong photometric effects from reflectance variations. An end-to-end deep convolutional neural network (ShadingNet) is proposed that operates in a fine-to-coarse manner with a specialized fusion and refinement unit exploiting the fine-grained shading model. It is designed to learn reflectance cues separately from specific photometric effects so that its disentanglement capability can be analyzed. A large-scale dataset of scene-level synthetic images of outdoor natural environments is provided with fine-grained intrinsic image ground truths. Large-scale experiments show that our approach using fine-grained shading decompositions outperforms state-of-the-art algorithms utilizing unified shading on the NED, MPI Sintel, GTA V, IIW, MIT Intrinsic Images, 3DRMS, and SRD datasets.
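For reference, the fine-grained image formation model implied by the abstract can be written as image = albedo × (direct + indirect shading). The toy NumPy check below illustrates that composition only; it is not the ShadingNet architecture:

```python
# Toy check of the fine-grained intrinsic composition:
# image = albedo * (direct shading + indirect shading).
import numpy as np

def reconstruct(albedo: np.ndarray,
                direct: np.ndarray,
                indirect: np.ndarray) -> np.ndarray:
    """Recompose an RGB image from its fine-grained intrinsic components."""
    shading = direct + indirect   # unified shading = direct + ambient/shadows
    return albedo * shading       # element-wise intrinsic composition

# Random toy components of shape (H, W, 3)
rng = np.random.default_rng(0)
A, Sd, Si = (rng.random((4, 4, 3)) for _ in range(3))
I = reconstruct(A, Sd, Si)
assert np.allclose(I, A * (Sd + Si))  # decomposition is consistent by construction
```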


2019 ◽  
Vol 375 (1791) ◽  
pp. 20180522 ◽  
Author(s):  
Mante S. Nieuwland ◽  
Dale J. Barr ◽  
Federica Bartolozzi ◽  
Simon Busch-Moreno ◽  
Emily Darley ◽  
...  

Composing sentence meaning is easier for predictable words than for unpredictable words. Are predictable words genuinely predicted, or simply more plausible and therefore easier to integrate with sentence context? We addressed this persistent and fundamental question using data from a recent, large-scale (n = 334) replication study, by investigating the effects of word predictability and sentence plausibility on the N400, the brain's electrophysiological index of semantic processing. A spatio-temporally fine-grained mixed-effects multiple regression analysis revealed overlapping effects of predictability and plausibility on the N400, albeit with distinct spatio-temporal profiles. Our results challenge the view that the predictability-dependent N400 reflects the effects of either prediction or integration, and suggest that semantic facilitation of predictable words arises from a cascade of processes that activate and integrate word meaning with context into a sentence-level meaning. This article is part of the theme issue ‘Towards mechanistic models of meaning composition’.
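For readers unfamiliar with this kind of analysis, the sketch below fits a mixed-effects regression of single-trial N400 amplitude on predictability and plausibility with statsmodels; the file and variable names are assumptions, and the published analysis is far more spatio-temporally fine-grained (per time sample and electrode):

```python
# Sketch: mixed-effects regression of N400 amplitude on word predictability
# (cloze) and sentence plausibility, with a by-subject random intercept.
# Collapses to one time window and electrode cluster for illustration.
import pandas as pd
import statsmodels.formula.api as smf

trials = pd.read_csv("n400_single_trials.csv")  # hypothetical single-trial table

model = smf.mixedlm("n400_amplitude ~ predictability + plausibility",
                    data=trials, groups=trials["subject"])
result = model.fit()
print(result.summary())  # overlapping fixed effects of both predictors
```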


Author(s):  
S. Bhattacharya ◽  
C. Braun ◽  
U. Leopold

Abstract. In this paper, we address the curse of dimensionality and scalability issues in managing vast volumes of multidimensional raster data for renewable energy modeling in an appropriate spatial and temporal context. Tensor representation provides a convenient way to capture inter-dependencies along multiple dimensions. To this end, we propose a sophisticated way of handling large-scale multi-layered spatio-temporal data, adapted for raster-based geographic information systems (GIS). We chose TensorFlow, an open-source software library developed by Google that uses data-flow graphs and the tensor data structure. We provide a comprehensive performance evaluation of the proposed model against r.sun in GRASS GIS. Benchmarking shows that the tensor-based approach outperforms r.sun by up to 60% in overall execution time for high-resolution datasets and fine-grained time intervals when computing daily sums of solar irradiation [Wh·m⁻²·day⁻¹].
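To illustrate the core idea, the sketch below represents a raster time series as a TensorFlow tensor and reduces along the time axis to obtain daily irradiation sums in one vectorized operation; the shapes and the 15-minute sampling interval are assumptions, not the paper's setup:

```python
# Sketch: daily sums of solar irradiation from a spatio-temporal raster stack
# held as a single tensor, avoiding per-cell loops.
import tensorflow as tf

# Hypothetical irradiance rasters: (time_steps, rows, cols) in W/m^2,
# sampled every 15 minutes over one day (96 steps).
irradiance = tf.random.uniform((96, 512, 512), maxval=1000.0)

interval_h = 0.25  # hours per time step
# Energy per step (Wh/m^2), then reduce over the time dimension in one
# vectorized tensor operation.
daily_sum = tf.reduce_sum(irradiance * interval_h, axis=0)  # (rows, cols), Wh/m^2 per day
print(daily_sum.shape)
```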


2018 ◽  
Author(s):  
Mante S. Nieuwland ◽  
Dale J. Barr ◽  
Federica Bartolozzi ◽  
Simon Busch-Moreno ◽  
Emily Darley ◽  
...  

Abstract. Composing sentence meaning is easier for predictable words than for unpredictable words. Are predictable words genuinely predicted, or simply more plausible and therefore easier to integrate with sentence context? We addressed this persistent and fundamental question using data from a recent, large-scale (N = 334) replication study, by investigating the effects of word predictability and sentence plausibility on the N400, the brain’s electrophysiological index of semantic processing. A spatiotemporally fine-grained mixed-effects multiple regression analysis revealed overlapping effects of predictability and plausibility on the N400, albeit with distinct spatiotemporal profiles. Our results challenge the view that the predictability-dependent N400 reflects the effects of either prediction or integration, and suggest that semantic facilitation of predictable words arises from a cascade of processes that activate and integrate word meaning with context into a sentence-level meaning.


2020 ◽  
Author(s):  
Lana Ruck ◽  
P. Thomas Schoenemann

Abstract. Open data initiatives such as the UK Biobank and Human Connectome Project provide researchers with access to neuroimaging, genetic, and other data for large samples of left- and right-handed participants, allowing for more robust investigations of handedness than ever before. Handedness inventories are universal tools for assessing participant handedness in these large-scale neuroimaging contexts. These self-report measures are typically used to screen and recruit subjects, but they are also widely used as variables in statistical analyses of fMRI and other data. Recent investigations into the validity of handedness inventories, however, suggest that self-report data from these inventories might not reflect hand preference/performance as faithfully as previously thought. Using data from the Human Connectome Project, we assessed correspondence between three handedness measures – the Edinburgh Handedness Inventory (EHI), the Rolyan 9-hole pegboard, and grip strength – in 1179 healthy subjects. We show poor association between the different handedness measures, with roughly 10% of the sample having at least one behavioral measure indicating a hand-performance bias opposite to their EHI score, and over 65% of left-handers having one or more mismatched handedness scores. We discuss implications for future work, urging researchers to critically consider direction, degree, and consistency of handedness in their data.
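A hedged sketch of the mismatch check described above, with hypothetical column names standing in for the HCP measures (this is not the authors' code): a behavioral measure counts as mismatched when its laterality sign opposes the EHI score.

```python
# Sketch: flag subjects whose behavioral laterality opposes their EHI sign.
# Grip laterality (R - L)/(R + L) > 0 means right bias; for pegboard times,
# a faster (smaller) right-hand time yields a positive (right-bias) index.
import numpy as np
import pandas as pd

df = pd.read_csv("hcp_handedness.csv")  # hypothetical EHI + behavioral table

grip_li = (df["grip_R"] - df["grip_L"]) / (df["grip_R"] + df["grip_L"])
peg_li = (df["pegboard_L"] - df["pegboard_R"]) / (df["pegboard_L"] + df["pegboard_R"])

ehi_sign = np.sign(df["EHI"])
mismatch = (np.sign(grip_li) != ehi_sign) | (np.sign(peg_li) != ehi_sign)
print(f"{mismatch.mean():.1%} of subjects have >=1 mismatched measure")
```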


NASPA Journal ◽  
1998 ◽  
Vol 35 (4) ◽  
Author(s):  
Jackie Clark ◽  
Joan Hirt

The creation of small communities has been proposed as a way of enhancing the educational experience of students at large institutions. Drawing on survey data from students living in large and small residences at a public research university, this study finds no support for the common assumption that small-scale social environments are more conducive to positive community life than large-scale ones.


Author(s):  
Holly M. Smith

Consequentialists have long debated (as deontologists should) how to define an agent’s alternatives, given that (a) at any particular time an agent performs numerous “versions” of actions, (b) an agent may perform several independent co-temporal actions, and (c) an agent may perform sequences of actions. We need a robust theory of human action to provide an account of alternatives that avoids previously debated problems. After outlining Alvin Goldman’s action theory (which takes a fine-grained approach to act individuation) and showing that the agent’s alternatives must remain invariant across different normative theories, I address issue (a) by arguing that an alternative for an agent at a time is an entire “act tree” performable by her, rather than any individual act token. I argue further that both tokens and trees must possess moral properties, and I suggest principles governing how these are inherited among trees and tokens. These proposals open a path for future work addressing issues (b) and (c).


2019 ◽  
Vol 22 (3) ◽  
pp. 365-380 ◽  
Author(s):  
Matthias Olthaar ◽  
Wilfred Dolfsma ◽  
Clemens Lutz ◽  
Florian Noseleit

In a competitive business environment at the Bottom of the Pyramid, smallholders supplying global value chains may be thought to be at the whim of downstream large-scale players and local market forces, leaving no room for strategic entrepreneurial behavior. In such a context we test the relationship between the use of strategic resources and firm performance. We adopt resource-based theory and show that seemingly homogeneous smallholders deploy resources differently and, consequently, some outperform others. We argue that resource-based theory yields a more fine-grained understanding of smallholder performance than the approaches generally applied in agricultural economics. We develop a mixed-method approach that makes it possible to pinpoint relevant, industry-specific resources and to empirically identify the relative contribution of each resource to competitive advantage. The results show that proper use of quality labor, storage facilities, timing of selling, and availability of animals are key capabilities.


Author(s):  
Paul Oehlmann ◽  
Paul Osswald ◽  
Juan Camilo Blanco ◽  
Martin Friedrich ◽  
Dominik Rietzel ◽  
...  

Abstract. As industries push toward digitalized production and adapt to the expectations and increasing requirements of modern applications, additive manufacturing (AM) has moved to the forefront of Industry 4.0. In fact, AM is a main accelerator for digital production, with its possibilities in structural design (such as topology optimization), production flexibility, customization, and product development, to name a few. Fused Filament Fabrication (FFF) is a widespread and practical tool for rapid prototyping that also demonstrates the importance of AM technologies through its accessibility to the general public via cost-effective desktop solutions. An increasing integration of systems in an intelligent production environment also enables the generation of large-scale data to be used for process monitoring and process control. Deep learning, as a form of artificial intelligence (AI) and, more specifically, a method of machine learning (ML), is ideal for handling big data. This study uses a trained artificial neural network (ANN) model as a digital shadow to predict the force within the nozzle of an FFF printer, using filament speed and nozzle temperature as input data. After the ANN model was tested on data from a theoretical model, it was implemented to predict the behavior using real-time printer data. For this purpose, an FFF printer was equipped with sensors that collect real-time printer data during the printing process. The ANN model reflected the kinematics of melting and flow predicted by currently available models for various printing speeds. The model allows for a deeper understanding of the influencing process parameters, which ultimately results in the determination of the optimum combination of process speed and print quality.
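A minimal sketch of the digital-shadow idea, assuming a hypothetical sensor log and a small scikit-learn network standing in for the authors' ANN:

```python
# Sketch: feed-forward network mapping filament speed and nozzle temperature
# to nozzle force. File, column names, and architecture are assumptions.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

data = pd.read_csv("fff_sensor_log.csv")  # hypothetical real-time printer log
X = data[["filament_speed", "nozzle_temp"]]
y = data["nozzle_force"]

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Standardize inputs, then fit a small ANN as a surrogate for the melt model
ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                                 random_state=0))
ann.fit(X_train, y_train)
print("held-out R^2:", ann.score(X_test, y_test))
```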


Geosciences ◽  
2021 ◽  
Vol 11 (2) ◽  
pp. 41
Author(s):  
Tim Jurisch ◽  
Stefan Cantré ◽  
Fokke Saathoff

A variety of recent studies have demonstrated the applicability of different dried, fine-grained dredged materials as replacement material for erosion-resistant sea dike covers. In Rostock, Germany, a large-scale field experiment was conducted in which different dredged materials were tested with regard to installation technology, stability, turf development, infiltration, and erosion resistance. The infiltration experiments, designed to study the development of a seepage line in the dike body, showed unexpected measurement results. Due to the high complexity of the problem, standard geo-hydraulic models proved unable to explain these results. Therefore, different methods of inverse infiltration modeling were applied, such as the parameter estimation tool (PEST) and the AMALGAM algorithm. In this paper, the two approaches are compared and discussed. A sensitivity analysis confirmed the presumption of non-linear model behavior for the infiltration problem, and the eigenvalue ratio indicates that the dike infiltration is an ill-posed problem. Although this complicates the inverse modeling (e.g., termination in local minima), parameter sets close to an optimum were found with both the PEST and AMALGAM algorithms. Together with the field measurement data, this information supports the rating of the effective material properties of the dredged materials used as dike cover material.
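As an illustration of the inverse-modeling workflow (not PEST or AMALGAM themselves), the sketch below calibrates a deliberately simplified seepage response against synthetic observations with SciPy's local least-squares solver; the forward model and parameter names are illustrative assumptions:

```python
# Sketch: inverse estimation of seepage parameters by nonlinear least squares.
import numpy as np
from scipy.optimize import least_squares

# Synthetic "observed" seepage-line heights over 48 h (m), with noise
t_obs = np.linspace(0, 48, 25)
rng = np.random.default_rng(1)
h_obs = 0.8 * (1 - np.exp(-0.1 * t_obs)) + 0.02 * rng.normal(size=t_obs.size)

def forward(params, t):
    """Toy seepage-line response: height approaching h_max at rate k."""
    h_max, k = params
    return h_max * (1 - np.exp(-k * t))

def residuals(params):
    return forward(params, t_obs) - h_obs

# Local search; like PEST, it can terminate in local minima for ill-posed,
# non-linear problems, which motivates global methods such as AMALGAM.
fit = least_squares(residuals, x0=[1.0, 0.05], bounds=([0, 0], [5, 1]))
print("estimated (h_max, k):", fit.x)
```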

