context switch
Recently Published Documents


TOTAL DOCUMENTS: 55 (FIVE YEARS: 10)

H-INDEX: 10 (FIVE YEARS: 1)

Entropy ◽  
2021 ◽  
Vol 23 (11) ◽  
pp. 1449
Author(s):  
Tianbo Ji ◽  
Chenyang Lyu ◽  
Zhichao Cao ◽  
Peng Cheng

Neural auto-regressive sequence-to-sequence models have been dominant in text generation tasks, especially the question generation task. However, neural generation models suffer from global and local semantic drift problems. Hence, we propose a hierarchical encoding–decoding mechanism that aims at encoding the rich structural information of the input passages and reducing the variance in the decoding phase. In the encoder, we hierarchically encode the input passages according to their structure at four granularity levels: word, chunk, sentence, and document. In the decoding phase, at each time step we progressively select the context vector from the document-level representations down to the word-level representations. We also propose a context switch mechanism that enables the decoder to reuse the context vector from the last step when generating the current word; it improves the stability of the text generation process when a set of consecutive words is generated. Additionally, we inject syntactic parsing knowledge to enrich the word representations. Experimental results show that our proposed model substantially improves performance and outperforms previous baselines according to both automatic and human evaluation. In addition, we present a deep and comprehensive analysis of the generated questions based on their types.
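The context switch idea from the abstract can be sketched in a few lines: at each decoding step, a gate decides whether to reuse the previous step's context vector or to recompute it with fresh attention over the encoder outputs. This is a minimal NumPy illustration; the function and parameter names (`decode_step`, `w_gate`) and the dot-product attention are assumptions for the sketch, not the authors' implementation.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention(query, enc_reps):
    # Dot-product attention over encoder representations (n_tokens, d).
    scores = enc_reps @ query          # (n_tokens,)
    weights = softmax(scores)
    return weights @ enc_reps          # context vector, shape (d,)

def decode_step(hidden, enc_reps, last_ctx, w_gate):
    # "Context switch": a scalar gate decides whether to reuse the
    # previous context vector (stabilizing consecutive words) or to
    # recompute it with fresh attention over the encoder outputs.
    gate = 1.0 / (1.0 + np.exp(-w_gate @ np.concatenate([hidden, last_ctx])))
    if gate > 0.5:
        return last_ctx, gate                    # reuse last context
    return attention(hidden, enc_reps), gate     # fresh attention

rng = np.random.default_rng(0)
d, n = 8, 5
enc = rng.normal(size=(n, d))      # word-level encoder representations
h = rng.normal(size=d)             # current decoder hidden state
ctx_prev = attention(h, enc)       # context from the previous step
ctx, gate = decode_step(h, enc, ctx_prev, rng.normal(size=2 * d))
```

In the paper the gate is learned end to end; here it is a random projection purely to show the control flow.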


2021 ◽  
Author(s):  
Bruno Dourado Miranda ◽  
Rômulo Silva De Oliveira ◽  
Andreu Carminati

Real-Time Operating Systems (RTOS) have their own modules that must execute to manage system resources, and these modules add overhead to task response times. FreeRTOS is used for the experiments since it is a widely used open-source RTOS. This work investigates two important sources of overhead: the Tick function, FreeRTOS's time marker, and the context switch between tasks. We also describe a model for reducing the pessimism of the Tick analysis due to its temporal variation. Experiments measuring the execution time of the Tick and the context switch on an ARM Cortex-M4 microprocessor were carried out to obtain the Best-Case Execution Time (BCET) and Worst-Case Execution Time (WCET) within a periodic task scenario. The measurements are used to validate the analytic models.
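The measurement methodology described above (timestamp a code region many times, then take the minimum and maximum as observed BCET and WCET) can be sketched on a host machine with Python's nanosecond timer. On the actual Cortex-M4 one would instead read a hardware cycle counter around the Tick handler or the context switch path; the `measure` helper here is an illustrative analogue, not the authors' instrumentation.

```python
import time

def measure(fn, runs=1000):
    # Collect per-invocation execution times; the observed minimum and
    # maximum estimate the BCET and WCET of the measured code region.
    samples = []
    for _ in range(runs):
        t0 = time.perf_counter_ns()
        fn()
        t1 = time.perf_counter_ns()
        samples.append(t1 - t0)
    return min(samples), max(samples)  # observed BCET, observed WCET (ns)

bcet, wcet = measure(lambda: sum(range(100)))
```

Note that measurement-based values only bound the true BCET from above and the true WCET from below, which is why the paper pairs measurements with analytic models rather than relying on them alone.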


2021 ◽  
Vol 102 ◽  
pp. 102753
Author(s):  
Xin Long ◽  
Jigang Wu ◽  
Yalan Wu ◽  
Long Chen ◽  
Yidong Li

Author(s):  
Chun-Feng Wu ◽  
Yuan-Hao Chang ◽  
Ming-Chang Yang ◽  
Tei-Wei Kuo
Keyword(s):  

Author(s):  
Md Enamul Haque ◽  
S. M. Zobaed ◽  
Muhammad Usama Islam ◽  
Faaiza Mohammad Areef
Keyword(s):  

2019 ◽  
Vol 40 (2) ◽  
pp. 34-45 ◽  
Author(s):  
Juan M. Rosas ◽  
James Byron Nelson

Abstract: Context dependence of information has been shown to be based, at least in part, on the attention contexts receive at the time of training. Recent research suggests that attention to irrelevant contexts may be a byproduct of the activation of a general exploratory attentional mechanism prompted by the high prediction errors associated with situations of uncertainty. Alternatively, low prediction errors may engage an attentional mechanism of exploitation in situations in which contexts play a relevant role. A selective review discusses the potential of this approach to explain context switch effects from an attentional perspective.

