processing component
Recently Published Documents


TOTAL DOCUMENTS

50
(FIVE YEARS 10)

H-INDEX

9
(FIVE YEARS 2)

2021 ◽  
Vol 24 (2) ◽  
pp. 1740-1747
Author(s):  
Anton Leuski ◽  
David Traum

NPCEditor is a system for building a natural language processing component for virtual humans capable of engaging a user in spoken dialog on a limited domain. It uses statistical language classification technology to map a user's text input to system responses. NPCEditor provides a user-friendly editor for creating effective virtual humans quickly, and it has been deployed as part of various virtual human systems in several applications.
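The core mapping the abstract describes — classifying a user's text input into the most likely system response — can be illustrated with a toy bag-of-words classifier. This is a minimal sketch, not NPCEditor's actual algorithm; the question-answer pairs and the cosine-style scoring are illustrative assumptions.

```python
from collections import Counter
import math

# Toy question-answer pairs standing in for a trained NPC domain.
TRAINING = {
    "hello there": "Hi! How can I help you?",
    "what is your name": "I am a virtual human.",
    "where are you from": "I was built at a research lab.",
}

def tokens(text):
    return text.lower().split()

def score(query, key):
    """Cosine-style overlap between the two bag-of-words vectors."""
    q, k = Counter(tokens(query)), Counter(tokens(key))
    overlap = sum(q[w] * k[w] for w in q)
    norm = (math.sqrt(sum(v * v for v in q.values()))
            * math.sqrt(sum(v * v for v in k.values())))
    return overlap / norm if norm else 0.0

def respond(query):
    """Return the response whose training question best matches the query."""
    best = max(TRAINING, key=lambda key: score(query, key))
    return TRAINING[best]
```

A statistical classifier of this kind degrades gracefully: even a paraphrased question such as "what is your name please" still lands on the closest trained input.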


2021 ◽  
Vol 3 ◽  
Author(s):  
Niccolò Marini ◽  
Sebastian Otálora ◽  
Damian Podareanu ◽  
Mart van Rijthoven ◽  
Jeroen van der Laak ◽  
...  

Algorithms proposed in computational pathology can automatically analyze digitized tissue samples of histopathological images to help diagnose diseases. Tissue samples are scanned at high resolution and usually saved as images with several magnification levels, namely whole slide images (WSIs). Convolutional neural networks (CNNs) represent the state-of-the-art computer vision methods for the analysis of histopathology images, targeting detection, classification, and segmentation. However, the development of CNNs that work with multi-scale images such as WSIs is still an open challenge. The image characteristics and the CNN properties impose architecture designs that are not trivial, so single-scale CNN architectures are still often used. This paper presents Multi_Scale_Tools, a library that aims to facilitate exploiting the multi-scale structure of WSIs. Multi_Scale_Tools currently includes four components: a pre-processing component, a scale detector, a multi-scale CNN for classification, and a multi-scale CNN for segmentation of the images. The pre-processing component includes methods to extract patches at several magnification levels. The scale detector identifies the magnification level of images that do not carry this information, such as images from the scientific literature. The multi-scale CNNs are trained by combining features and predictions that originate from different magnification levels. The components were developed using private datasets, including colon and breast cancer tissue samples, and tested on private and public external data sources, such as The Cancer Genome Atlas (TCGA). The results demonstrate the library's effectiveness and applicability. The scale detector accurately predicts multiple levels of image magnification and generalizes well to independent external data. The multi-scale CNNs outperform the single-magnification CNN for both classification and segmentation tasks.
The code is developed in Python and will be made publicly available upon publication. It aims to be easy to use and easy to extend with additional functions.
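The pre-processing idea — extracting patches from the same slide at several magnification levels of a resolution pyramid — can be sketched with a toy 2D array in place of a real WSI. This is not the Multi_Scale_Tools API; real pipelines read pyramid levels from the slide file (e.g. via OpenSlide) rather than downsampling in memory.

```python
# A toy stand-in for a whole slide image: a 2D grid of pixel values.
# Level 0 is full resolution; each higher level halves the resolution,
# mimicking the magnification pyramid stored in a WSI.

def downsample(image, level):
    """Return the image at the given pyramid level (factor 2 per level)."""
    step = 2 ** level
    return [row[::step] for row in image[::step]]

def extract_patch(image, level, x, y, size):
    """Extract a size x size patch at (x, y) from the given level."""
    scaled = downsample(image, level)
    return [row[x:x + size] for row in scaled[y:y + size]]

wsi = [[r * 16 + c for c in range(16)] for r in range(16)]
patch0 = extract_patch(wsi, level=0, x=0, y=0, size=2)  # high magnification
patch1 = extract_patch(wsi, level=1, x=0, y=0, size=2)  # lower magnification
```

A patch of fixed pixel size covers 2× more tissue per level, which is what lets a multi-scale CNN combine fine cellular detail with broader tissue context.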


2021 ◽  
pp. 089020702110076
Author(s):  
Marina Fiori ◽  
Shagini Udayar ◽  
Ashley Vesely Maillefer

The relationship between emotional intelligence (EI) and emotion information processing (EIP) has received surprisingly little attention in the literature. The present research addresses this gap by introducing a conceptualization of emotional intelligence as composed of two distinct components: (1) EIK, or emotion Knowledge component, captured by current ability emotional intelligence tests, related to top-down, higher-order reasoning about emotions, and depending more strongly on acquired and culture-bound knowledge about emotions; (2) EIP, or emotion information Processing component, measured with emotion information processing tasks, which requires faster processing and is based on bottom-up, attention-related responses to emotion information. In Study 1 (N = 349) we tested the factorial structure of this new EIP component within the nomological network of intelligence and current ability emotional intelligence. In Study 2 (N = 111) we tested the incremental validity of EIP in predicting both overall performance and the charisma of a presenter while presenting in a stressful situation. Results support the importance of acknowledging the role of emotion information processing in the emotional intelligence literature and point to the utility of introducing a new EI measure that would capture stable individual differences in how individuals process emotion information.


Author(s):  
Caitlin Coughler ◽  
Emily Michaela Hamel ◽  
Janis Oram Cardy ◽  
Lisa M. D. Archibald ◽  
David W. Purcell

Purpose Developmental language disorder (DLD), an unexplained problem using and understanding spoken language, has been hypothesized to have an underlying auditory processing component. Auditory feedback plays a key role in speech motor control. The current study examined whether auditory feedback is used to regulate speech production in a similar way by children with DLD and their typically developing (TD) peers. Method Participants aged 6–11 years completed tasks measuring hearing, language, first formant (F1) discrimination thresholds, partial vowel space, and responses to altered auditory feedback with F1 perturbation. Results Children with DLD tended to compensate more than TD children for the positive F1 manipulation and compensated less than TD children in the negative shift condition. Conclusion Our findings suggest that children with DLD make atypical use of auditory feedback.
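The compensation measure used in altered-feedback paradigms can be made concrete: a speaker who opposes an upward (positive) shift of their heard F1 lowers their produced F1, and vice versa. This is a minimal illustrative computation under assumed baseline and produced values, not the study's analysis pipeline.

```python
def compensation(baseline_f1, feedback_shift, produced_f1):
    """Compensation magnitude in Hz: positive = speaker opposed the shift.

    baseline_f1:    speaker's F1 with unaltered feedback (Hz).
    feedback_shift: Hz added to the speaker's heard F1 (+/-).
    produced_f1:    F1 actually produced under the perturbation (Hz).
    """
    response = produced_f1 - baseline_f1
    # Opposing a positive (upward) shift means lowering the produced F1,
    # so the sign of the response is flipped to report opposition as positive.
    return -response if feedback_shift > 0 else response
```

For example, a speaker with a 700 Hz baseline who hears a +100 Hz shift and produces 660 Hz has compensated by 40 Hz; producing 730 Hz under a -100 Hz shift is a 30 Hz compensation.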


10.36850/e2 ◽  
2020 ◽  
Vol 1 (1) ◽  
pp. 27-38 ◽  
Author(s):  
Juliane Traxler ◽  
Roxane V. Philips ◽  
Andreas von Leupoldt ◽  
Johan W. S. Vlaeyen

Pain can be considered as a signal of “bodily error”: Errors – discrepancies between the actual and optimal/targeted state – can put organisms at danger and activate behavioral defensive systems. If the error relates to the body, pain is the warning signal that motivates protective action such as avoidance behavior to safeguard our body’s integrity. Hence, pain shares the functionality of errors. On the neural level, an important error processing component is the error-related negativity (ERN), a negative deflection in the electroencephalographic (EEG) signal generated primarily in the anterior cingulate cortex within 100 ms after error commission. Despite compelling evidence that the ERN plays an important role in the development of various psychopathologies and is implicated in learning and adjustment of behavior, its relation to pain-related avoidance has not yet been examined. Based on findings from anxiety research, it seems conceivable that individuals with elevated ERN amplitudes are more prone to engage in pain-related avoidance behavior, which may, under certain conditions, be a risk factor for developing chronic pain. Consequently, this new line of research promises to contribute to our understanding of human pain. As in most novel research areas, a first crucial step for integrating the scientific fields of ERN and pain is developing a paradigm suited to address the needs from both fields. The present manuscript presents the development and piloting of an experimental task measuring both ERN and avoidance behavior in response to painful mistakes, as well as the challenges encountered herein. A total of 12 participants underwent one of six different task versions. We describe in detail each of these versions, including their results, shortcomings, our solutions, and subsequent steps. Finally, we provide some advice for researchers aiming at developing novel paradigms.
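The ERN described above is conventionally quantified as the error-minus-correct difference wave, averaged over a window shortly after the response. The following is a bare-bones sketch of that computation on toy epoch lists; real EEG analyses (e.g. with MNE-Python) add filtering, baseline correction, and artifact rejection, all omitted here.

```python
def average_epochs(epochs):
    """Point-by-point mean across trials (each epoch is a list of samples)."""
    n = len(epochs)
    return [sum(vals) / n for vals in zip(*epochs)]

def ern_amplitude(error_epochs, correct_epochs, window):
    """Mean of the error-minus-correct difference wave over a sample window.

    window: (start, stop) sample indices, e.g. covering 0-100 ms post-response.
    """
    err = average_epochs(error_epochs)
    cor = average_epochs(correct_epochs)
    lo, hi = window
    diff = [e - c for e, c in zip(err, cor)][lo:hi]
    return sum(diff) / len(diff)
```

A more negative value indicates a larger ERN, which is the individual-difference measure the proposed paradigm would relate to avoidance behavior.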


2020 ◽  
Vol 4 (1) ◽  
pp. 11 ◽  
Author(s):  
Angkush Kumar Ghosh ◽  
AMM Sharif Ullah ◽  
Akihiko Kubo ◽  
Takeshi Akamatsu ◽  
Doriana Marilena D’Addona

Industry 4.0 requires phenomenon twins to functionalize the relevant systems (e.g., cyber-physical systems). A phenomenon twin is a computable virtual abstraction of a real phenomenon. In order to systematize the construction process of a phenomenon twin, this study proposes a system defined as the phenomenon twin construction system. It consists of three components, namely the input, processing, and output components. Among these, the processing component is the most critical: it digitally models, simulates, and validates a given phenomenon, extracting information from the input component. What kind of modeling, simulation, and validation approaches should be used while constructing the processing component for a given phenomenon is a research question. This study answers this question using the case of surface roughness, a complex phenomenon associated with all material removal processes. Accordingly, this study shows that for modeling the surface roughness of a machined surface, the approach called semantic modeling is more effective than the conventional approach called the Markov chain. It is also found that to validate whether or not a simulated surface roughness resembles the expected roughness, the outcomes of possibility distribution-based computing and DNA-based computing are more effective than the outcomes of conventional computing wherein the arithmetic mean height of surface roughness is calculated. Thus, apart from the conventional computing approaches, leading-edge computational intelligence-based approaches can digitize manufacturing processes more effectively.
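The conventional baseline the abstract mentions — a Markov chain generating a surface profile, validated via the arithmetic mean height (Ra) — can be sketched as follows. The states, transition rule, and units are illustrative assumptions, not the paper's actual model.

```python
import random

# Toy Markov chain over discretized surface heights (micrometres).
STATES = [-2, -1, 0, 1, 2]

def next_state(current):
    """Move to a neighbouring height state (assumed uniform transitions)."""
    idx = STATES.index(current)
    choices = [i for i in (idx - 1, idx, idx + 1) if 0 <= i < len(STATES)]
    return STATES[random.choice(choices)]

def simulate_profile(n, seed=0):
    """Generate an n-point surface height profile from the chain."""
    random.seed(seed)
    profile, state = [], 0
    for _ in range(n):
        state = next_state(state)
        profile.append(state)
    return profile

def arithmetic_mean_height(profile):
    """Ra: mean absolute deviation from the profile's mean line."""
    mean = sum(profile) / len(profile)
    return sum(abs(h - mean) for h in profile) / len(profile)
```

Comparing the Ra of a simulated profile against the measured Ra is exactly the "conventional computing" validation the paper argues is weaker than possibility distribution-based or DNA-based computing.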


2019 ◽  
Vol 35 (10) ◽  
pp. 1-3 ◽  
Author(s):  
Mahdi Safa ◽  
Sylvia Baeza ◽  
Kelly Weeks

Purpose This study explores the use of Blockchain technology as a new solution to many current problems in construction information management. The study shows that Blockchain has the potential to address several issues such as confidentiality; provenance tracking; monitoring channel and ledger metrics; disintermediation; non-repudiation; change tracing; multiparty aggregation; traceability; inter-organizational recordkeeping; and data ownership. Design A systematic analysis of the published paper "Potentials of Blockchain Technology for Construction Management" is offered. The structured results are provided for the purpose of contributing to the discussion of the topic. Findings The results of this study show that the suitable position for the integration of Blockchain is the interface points of the transaction processing component of the Building Information Modeling server. This technology can also help in controlling and fingerprinting all information exchanges and communication. The conclusions drawn from the study provide a foundation from which further research can be developed. Originality and Value The findings of this study will help construction project managers and senior executives gain a deeper understanding of Blockchain technology and its long-term implications for the construction industry, coupled with knowledge of its relationship to other emerging technologies such as BIM. Propositions for smart contract deployment and further research are suggested.


Schwa deletion is an important factor in grapheme-to-phoneme conversion. In the Hindi language, each consonant carries a weak vowel, called the inherent schwa, and in some cases this schwa is deleted in pronunciation. In Indian languages the written and spoken forms differ, and schwa plays an important role in the spoken form: the deletion or retention of the weak vowel decides how a word is pronounced, with word morphology being a main factor affecting pronunciation. In this paper, we describe schwa handling, deletion, and retention rules. Based on these rules, we developed a schwa deletion algorithm, which has been tested on 6,000 high-frequency words and achieved accuracy of up to 80%. Based on this result, an application has been developed to provide a user interface for the text processing component of a text-to-speech system.
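Two commonly cited schwa deletion rules — drop a word-final inherent schwa, and drop a medial schwa flanked by a VC on the left and a CV on the right — can be sketched on romanized phoneme lists. This is a much-simplified illustration, not the paper's algorithm; real systems also consult morphology, which this sketch ignores.

```python
# Simplified rule-based schwa deletion on romanized Hindi phoneme lists.

VOWELS = set("aeiou")

def delete_schwas(phonemes):
    """Apply two common rules:
    1. Delete a word-final inherent schwa ('a').
    2. Delete a medial schwa in a VC_CV context, scanning right to left."""
    result = list(phonemes)
    # Rule 1: word-final schwa.
    if result and result[-1] == "a":
        result.pop()
    # Rule 2: VC_CV medial schwa.
    i = len(result) - 3
    while i >= 2:
        if (result[i] == "a"
                and result[i - 1] not in VOWELS   # C on the left
                and result[i - 2] in VOWELS        # V before that
                and result[i + 1] not in VOWELS    # C on the right
                and result[i + 2] in VOWELS):      # V after that
            del result[i]
        i -= 1
    return result
```

For instance, "dharakane" (a romanization of a form like dhaṛakane) loses its medial schwa to become "dharakne", while "rama" loses only its final schwa and becomes "ram".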


2019 ◽  
Vol 2019 ◽  
pp. 1-14
Author(s):  
Yaqing Shi ◽  
Song Huang ◽  
Changyou Zheng ◽  
Haijin Ji

The aggregate query of moving objects on road networks remains popular in the ITS research community. Existing methods often assume that the sampling frequency of positioning devices like GPS or roadside radar is dense enough to make the result's uncertainty negligible. However, such an assumption is not always tenable, especially in extreme occasions like wartime. To address this issue, a hybrid aggregate index framework is proposed in this paper to perform aggregate queries on massive trajectories that are sampled sparsely. First, the framework uses an offline batch processing component based on the UPBI-Sketch index to acquire each object's most likely position between two continuous sampling instants. Next, it introduces the AMH+-Sketch index to process the aggregate operation online, ensuring each object is counted only once in the result. The experimental results show that the hybrid framework can ensure query accuracy by adjusting the parameters L and U of the AMH+-Sketch index, and that its storage advantage becomes more pronounced as the data scale grows.
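The key guarantee above — each object counted only once even when its trajectory touches many road segments — is a distinct-count property. The toy index below shows that property with exact sets; the paper's UPBI-Sketch and AMH+-Sketch structures are not reproduced here, and real sketches trade this exactness for sub-linear space.

```python
from collections import defaultdict

class AggregateIndex:
    """Toy exact-count stand-in for a sketch-based aggregate index."""

    def __init__(self):
        self._segments = defaultdict(set)  # road segment -> object ids

    def record(self, segment, object_id, timestamp):
        # A sparsely sampled trajectory may hit the same segment repeatedly;
        # the set keeps the object counted once per segment. The timestamp
        # is unused in this toy but would drive the position inference in
        # the offline batch component.
        self._segments[segment].add(object_id)

    def count(self, segments):
        """Distinct objects seen on any of the given segments."""
        seen = set()
        for seg in segments:
            seen |= self._segments[seg]
        return len(seen)
```

Note that a car observed on two segments in the query region still contributes one to the aggregate, which is the deduplication the online AMH+-Sketch stage is responsible for.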

