A 0.1pJ/b/dB 1.62-to-10.8Gb/s Video Interface Receiver with Fully Adaptive Equalization Using Un-Even Data Level

Author(s):  
Jinhyung Lee ◽  
Kwangho Lee ◽  
Hyojun Kim ◽  
Byungmin Kim ◽  
Kwanseo Park ◽  
...  
2020 ◽  
Vol 55 (8) ◽  
pp. 2186-2195
Author(s):  
Jinhyung Lee ◽  
Kwangho Lee ◽  
Hyojun Kim ◽  
Byungmin Kim ◽  
Kwanseo Park ◽  
...  

2019 ◽  
Vol 28 (1) ◽  
pp. 224-231
Author(s):  
Randa S. Hammad ◽  
El-Sayed M. El-Rabaie ◽  
Fathi E. Abd El-Samie ◽  
Ibrahim M. El-Dokany

2014 ◽  
Vol 33 (4) ◽  
pp. 221-245 ◽  
Author(s):  
Alexander Kogan ◽  
Michael G. Alles ◽  
Miklos A. Vasarhelyi ◽  
Jia Wu

SUMMARY: This study develops a framework for a continuous data level auditing system and uses a large sample of procurement data from a major health care provider to simulate an implementation of this framework. In this framework, the first layer monitors compliance with deterministic business process rules and the second layer consists of analytical monitoring of business processes. A distinction is made between exceptions identified by the first layer and anomalies identified by the second one. The unique capability of continuous auditing to investigate (and possibly remediate) the identified anomalies in “pseudo-real time” (e.g., on a daily basis) is simulated and evaluated. Overall, evidence is provided that continuous auditing of complete population data can lead to superior results, but only when audit practices change to reflect the new reality of data availability. Data Availability: The data are proprietary. Please contact the authors for details.
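
To make the two-layer design concrete, the following is a minimal Python sketch of the idea, not the authors' system: the field names (amount, approved, po_match), the rules, the baseline, and the z-score threshold are all illustrative assumptions. Layer 1 flags transactions that violate deterministic business-process rules as exceptions; Layer 2 flags transactions whose amounts are statistically unusual relative to a historical baseline as anomalies, which could then be investigated in pseudo-real time (e.g., in a daily batch).

from statistics import mean, stdev

def layer1_exceptions(transactions, rules):
    # Layer 1: deterministic business-process rules; any violation is an "exception".
    return [t for t in transactions if any(not rule(t) for rule in rules)]

def layer2_anomalies(transactions, baseline_amounts, z_threshold=3.0):
    # Layer 2: analytical monitoring; flag amounts far from a historical baseline ("anomaly").
    mu, sigma = mean(baseline_amounts), stdev(baseline_amounts)
    return [t for t in transactions
            if sigma and abs(t["amount"] - mu) / sigma > z_threshold]

if __name__ == "__main__":
    baseline = [120.0, 95.0, 110.0, 130.0, 105.0, 98.0, 115.0]  # illustrative purchase history
    todays = [
        {"id": 1, "amount": 118.0, "approved": True,  "po_match": True},
        {"id": 2, "amount": 102.0, "approved": False, "po_match": True},   # rule violation
        {"id": 3, "amount": 9000.0, "approved": True, "po_match": True},   # statistical outlier
    ]
    rules = [lambda t: t["approved"], lambda t: t["po_match"]]
    print("exceptions:", [t["id"] for t in layer1_exceptions(todays, rules)])
    print("anomalies: ", [t["id"] for t in layer2_anomalies(todays, baseline)])

A transaction can of course fall into both buckets; keeping the two layers separate mirrors the distinction the study draws between rule-based exceptions and analytically detected anomalies.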


Author(s):  
Alexander Haberl ◽  
Dirk Praetorius ◽  
Stefan Schimanko ◽  
Martin Vohralík

Abstract: We consider a second-order elliptic boundary value problem with a strongly monotone and Lipschitz-continuous nonlinearity. We design and study its adaptive numerical approximation, interconnecting a finite element discretization, the Banach–Picard linearization, and a contractive linear algebraic solver. In particular, we identify stopping criteria for the algebraic solver that, on the one hand, do not request an overly tight tolerance but, on the other hand, are sufficient for the inexact (perturbed) Banach–Picard linearization to remain contractive. Similarly, we identify suitable stopping criteria for the Banach–Picard iteration that leave an amount of linearization error small enough not to prevent the residual a posteriori error estimate from reliably steering the adaptive mesh refinement. For the resulting algorithm, we prove contraction of the (doubly) inexact iterates after a certain number of mesh-refinement/linearization/algebraic-solver steps, leading to its linear convergence. Moreover, for usual mesh-refinement rules, we also prove that the overall error decays at the optimal rate with respect to the number of elements (degrees of freedom) added with respect to the initial mesh. Finally, we prove that our fully adaptive algorithm drives the overall error down at the same optimal rate also with respect to the overall algorithmic cost, expressed as the cumulative sum of the numbers of mesh elements over all mesh-refinement, linearization, and algebraic solver steps. Numerical experiments support these theoretical findings and illustrate the optimal overall algorithmic cost of the fully adaptive algorithm on several test cases.
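
For orientation, a standard form of the Banach–Picard (Zarantonello) iteration and of the kind of stopping criteria the abstract refers to can be written out as follows; the normalization, the damping parameter, and the adaptivity parameters $\lambda_{\mathrm{alg}}$, $\lambda_{\mathrm{lin}}$ below are generic assumptions, not necessarily those of the paper.

\[
  (u_h^{k+1} - u_h^{k},\, v_h)_V \;=\; \delta\,\bigl(F(v_h) - \langle A u_h^{k}, v_h\rangle\bigr)
  \quad \text{for all } v_h \in V_h,
  \qquad 0 < \delta < \tfrac{2\alpha}{L^2},
\]

where $\alpha$ is the strong-monotonicity constant and $L$ the Lipschitz constant of the nonlinear operator $A$; the exact iteration is then a contraction with factor $q = \sqrt{1 - \delta(2\alpha - \delta L^2)} < 1$. The linear problem for $u_h^{k+1}$ is itself solved only inexactly by a contractive algebraic solver, stopped by criteria of the type
\[
  \eta_{\mathrm{alg}} \;\le\; \lambda_{\mathrm{alg}}\, \bigl\| u_h^{k+1,j} - u_h^{k} \bigr\|_V
  \quad \text{(algebraic solver)},
  \qquad
  \bigl\| u_h^{k+1} - u_h^{k} \bigr\|_V \;\le\; \lambda_{\mathrm{lin}}\, \eta_h\bigl(u_h^{k+1}\bigr)
  \quad \text{(linearization)},
\]
where $\eta_{\mathrm{alg}}$ estimates the algebraic error of the inexact iterate $u_h^{k+1,j}$, $\eta_h$ is the residual a posteriori error estimator, and $\lambda_{\mathrm{alg}}, \lambda_{\mathrm{lin}} > 0$ are kept small so that the perturbed iteration remains contractive and the linearization error does not dominate the estimator.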

