acoustic modeling — Recently Published Documents

TOTAL DOCUMENTS: 734 (FIVE YEARS: 117)
H-INDEX: 31 (FIVE YEARS: 4)

2022 ◽ Vol 165 ◽ pp. 108307 ◽ Author(s): Gang Wang, Deyu Kong, Wenlong Li, Junfang Ni, Xianjie Shi

2022 ◽ Author(s): Gao Jun Wu, Tejal K. Shanbhag, Eduardo S. Molina, Sanjiva K. Lele, Juan J. Alonso

Bioprinting ◽ 2021 ◽ pp. e00186 ◽ Author(s): Ali A. Rostam-Alilou, Hamid Jafari, Ali Zolfagharian, Ahmad Serjouei, Mahdi Bodaghi

2021 ◽ Vol 11 (21) ◽ pp. 10475 ◽ Author(s): Xiao Zhou, Zhenhua Ling, Yajun Hu, Lirong Dai

An encoder–decoder with attention has become a popular approach to sequence-to-sequence (Seq2Seq) acoustic modeling for speech synthesis. To improve the robustness of the attention mechanism, methods that exploit the monotonic alignment between phone sequences and acoustic feature sequences have been proposed, such as stepwise monotonic attention (SMA). However, the phone sequences produced by grapheme-to-phoneme (G2P) conversion may not contain the pauses at phrase boundaries in utterances, which violates SMA's assumption of strictly stepwise alignment. This paper therefore proposes inserting hidden states into phone sequences to handle the case where pauses are not provided explicitly, and designs a semi-stepwise monotonic attention (SSMA) mechanism to model these inserted hidden states. The hidden states absorb the pause segments in utterances in an unsupervised way. Thus, the attention at each decoding frame has three options: moving forward to the next phone, staying at the same phone, or jumping to a hidden state. Experimental results show that SSMA achieves better naturalness of synthetic speech than SMA when phrase boundaries are not available. Moreover, the pause positions derived from the alignment paths of SSMA match the manually labeled phrase boundaries quite well.
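The three-way choice described above (stay on the current phone, advance to the next phone, or jump to an inserted pause state) can be sketched as a toy greedy decoder. Everything here — the function names, the hand-picked energy scores, and the simplification that a pause state is exited after exactly one frame — is an illustrative assumption, not the paper's implementation, which learns the transition probabilities jointly with the Seq2Seq model.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def ssma_alignment(energies):
    """Greedy sketch of a semi-stepwise monotonic alignment.

    `energies` is a (T, 3) array of hypothetical per-frame scores for
    the three allowed moves: [stay on current phone, advance to next
    phone, jump to the hidden pause state after the current phone].
    Returns the attended-state sequence, e.g. ['p0', 'pause0', 'p1'].
    A frame spent in a pause state advances to the next phone on the
    following frame, so the alignment stays monotonic.
    """
    path, phone, in_pause = [], 0, False
    for e in energies:
        if in_pause:                  # forced exit from the pause state
            phone += 1
            in_pause = False
            path.append(f"p{phone}")
            continue
        move = int(np.argmax(softmax(e)))  # greedy choice for illustration
        if move == 0:                 # stay on the same phone
            path.append(f"p{phone}")
        elif move == 1:               # step forward to the next phone
            phone += 1
            path.append(f"p{phone}")
        else:                         # absorb a pause segment
            in_pause = True
            path.append(f"pause{phone}")
    return path

energies = np.array([[2., 0., 0.],   # stay on p0
                     [0., 2., 0.],   # advance to p1
                     [0., 0., 2.],   # pause after p1
                     [0., 0., 0.]])  # in pause, so forced advance to p2
print(ssma_alignment(energies))      # -> ['p0', 'p1', 'pause1', 'p2']
```

Because the pause states are optional stops rather than mandatory steps, an utterance with no pause at a given boundary simply never jumps to that hidden state, which is what distinguishes this from a strictly stepwise scheme.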


2021 ◽ Vol 25 (5) ◽ pp. 18-26 ◽ Author(s): Juchan Son, Sumin Hong, Jeongjae Hwang, Min Kuk Kim, Daesik Kim

2021 ◽ Vol 150 (4) ◽ pp. A83-A83 ◽ Author(s): Brendan J. DeCourcy, Ying-Tsong Lin, Weifeng G. Zhang

2021 ◽ Vol 150 (4) ◽ pp. A317-A318 ◽ Author(s): Jessica Desrochers, Lora Van Uffelen, Sarah E. Webster, Matthew A. Dzieciuch, Peter F. Worcester
