Joint Representation and Truncated Inference Learning for Correlation Filter Based Tracking

Author(s): Yingjie Yao, Xiaohe Wu, Lei Zhang, Shiguang Shan, Wangmeng Zuo
2009, Vol 41 (1), pp. 44-52

Author(s): Zhi-Ya Liu, Lei Mo
2019, Vol 31 (5), pp. 792

Author(s): Zongmin Li, Hongjiao Fu, Yujie Liu, Hua Li
2021, Vol 436, pp. 273-282

Author(s): Youmin Yan, Xixian Guo, Jin Tang, Chenglong Li, Xin Wang
2021, Vol 438, pp. 94-106

Author(s): Shiyu Xuan, Shengyang Li, Zifei Zhao, Zhuang Zhou, Wanfeng Zhang, ...

Author(s): Nujud Aloshban, Anna Esposito, Alessandro Vinciarelli

Abstract: Depression is one of the most common mental health issues. (It affects more than 4% of the world's population, according to recent estimates.) This article shows that the joint analysis of linguistic and acoustic aspects of speech allows one to discriminate between depressed and nondepressed speakers with an accuracy above 80%. The approach is based on networks designed for sequence modeling (bidirectional Long Short-Term Memory networks) and on multimodal analysis methodologies (late fusion, joint representation, and gated multimodal units). The experiments were performed over a corpus of 59 interviews (roughly 4 hours of material) involving 29 individuals diagnosed with depression and 30 control participants. Beyond the overall accuracy, the results show that multimodal approaches perform better than unimodal ones owing to people's tendency to manifest their condition through one modality only, a source of diversity across unimodal approaches. Furthermore, the experiments show that it is possible to measure the "confidence" of the approach and automatically identify a subset of the test data on which the performance exceeds a predefined threshold. Overall, depression can be detected effectively with unobtrusive and inexpensive technologies based on the automatic analysis of speech and language.
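The abstract mentions gated multimodal units (GMUs) as one of the fusion methodologies. A minimal numpy sketch of a GMU forward pass is given below; the feature dimensions, weight initialization, and function names are illustrative assumptions, not the authors' actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gmu_fuse(x_acoustic, x_linguistic, W_a, W_l, W_z):
    """Fuse two modality vectors into one representation.

    h_m = tanh(W_m @ x_m) projects each modality into a shared space;
    z = sigmoid(W_z @ [x_a; x_l]) gates how much each modality contributes.
    """
    h_a = np.tanh(W_a @ x_acoustic)      # acoustic projection
    h_l = np.tanh(W_l @ x_linguistic)    # linguistic projection
    z = sigmoid(W_z @ np.concatenate([x_acoustic, x_linguistic]))
    return z * h_a + (1.0 - z) * h_l     # element-wise gated mixture

# Illustrative sizes: 40-d acoustic features, 300-d word embeddings,
# 64-d fused representation (hypothetical, not from the paper).
d_a, d_l, d_h = 40, 300, 64
W_a = rng.normal(scale=0.1, size=(d_h, d_a))
W_l = rng.normal(scale=0.1, size=(d_h, d_l))
W_z = rng.normal(scale=0.1, size=(d_h, d_a + d_l))

fused = gmu_fuse(rng.normal(size=d_a), rng.normal(size=d_l), W_a, W_l, W_z)
print(fused.shape)  # (64,)
```

Because the gate z lies in (0, 1) and both projections pass through tanh, the fused vector stays within [-1, 1] per dimension, which keeps the downstream classifier's inputs on a comparable scale across modalities.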

