Network language

2021 ◽  
pp. 54-90
Author(s):  
Zhou Yan
2014 ◽  
Vol 644-650 ◽  
pp. 6079-6082
Author(s):  
Xiao Hua Chen

Since the late 20th century, with the development of computer technology and the widespread application of the Internet, humanity has entered an age of digital, globally networked information. As networks became widely used and network communication increasingly prevalent, network language and network culture emerged, and network-based English learning arose along with them. English, as an international language, now enjoys unprecedented prosperity. However, English learning restricted to the school classroom or to training institutions can no longer meet people's needs, so autonomous English learning outside the classroom is urgently on the agenda. The network is the most important virtual venue for autonomous English learning: it concentrates a wealth of rich resources, stimulates learning interest, suits collaborative learning, realizes personalized learning to a large extent, and effectively promotes autonomous English learning.


2018 ◽  
Vol 28 (09) ◽  
pp. 1850007
Author(s):  
Francisco Zamora-Martinez ◽  
Maria Jose Castro-Bleda

Neural Network Language Models (NNLMs) are a successful approach to Natural Language Processing tasks such as Machine Translation. In this work we introduce a Statistical Machine Translation (SMT) system which fully integrates NNLMs in the decoding stage, breaking with the traditional approach based on n-best list rescoring. The neural net models (both language models (LMs) and translation models) are fully coupled in the decoding stage, allowing them to influence translation quality more strongly. Computational issues were solved by a novel idea based on memorization and smoothing of the softmax normalization constants to avoid their computation, which introduces a trade-off between LM quality and computational cost. These ideas were studied in a machine translation task with different combinations of neural networks used both as translation models and as target LMs, comparing phrase-based and n-gram-based systems, and showing that the integrated approach seems more promising for n-gram-based systems, even with non-full-quality NNLMs.
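The expensive step the abstract refers to is the softmax normalization sum over the whole vocabulary, which a decoder would otherwise recompute for every context. A minimal sketch of the memorization-and-smoothing idea is below; all names (`CachedSoftmaxLM`, the toy output layer, the running-mean smoothing) are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

# Hypothetical toy NNLM output layer: logits over the vocabulary given a
# hidden context vector. W, b, and sizes are placeholders for illustration.
rng = np.random.default_rng(0)
V, D = 1000, 32                      # vocabulary size, hidden size
W = rng.normal(size=(V, D))          # output embedding matrix
b = rng.normal(size=V)               # output bias

def logits(hidden):
    return W @ hidden + b

class CachedSoftmaxLM:
    """Memorizes the log-softmax normalization constant per context key.

    Seen contexts reuse their exact constant; unseen contexts may use a
    smoothed running mean instead of summing over the full vocabulary --
    the LM-quality vs. computational-cost trade-off the abstract describes.
    """
    def __init__(self):
        self.cache = {}       # context key -> exact log normalization constant
        self.mean_logz = 0.0  # smoothed constant for unseen contexts
        self.n = 0

    def log_prob(self, context_key, hidden, word_id, approximate=False):
        z = logits(hidden)
        if context_key in self.cache:
            log_z = self.cache[context_key]           # memorized exact value
        elif approximate and self.n > 0:
            log_z = self.mean_logz                    # smoothed, no full sum
        else:
            m = z.max()
            log_z = m + np.log(np.exp(z - m).sum())   # exact log-sum-exp
            self.cache[context_key] = log_z
            self.n += 1
            self.mean_logz += (log_z - self.mean_logz) / self.n
        return z[word_id] - log_z
```

In the approximate branch the returned scores are no longer exactly normalized log-probabilities, which is precisely why the quality of the NNLM degrades as more lookups avoid the full sum.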


2010 ◽  
Vol 13 (3) ◽  
pp. 307-341 ◽  
Author(s):  
Yintang Dai ◽  
Shiyong Zhang ◽  
Jidong Chen ◽  
Tianyuan Chen ◽  
Wei Zhang
