Memory Hierarchy and Data Reuse Decision Exploration

Author(s): Francky Catthoor ◽ Sven Wuytack ◽ Eddy De Greef ◽ Florin Balasa ◽ Lode Nachtergaele ◽ ...

2012 ◽ Vol 2012 ◽ pp. 1-10
Author(s): Alba Sandyra Bezerra Lopes ◽ Ivan Saraiva Silva ◽ Luciano Volcan Agostini

Motion estimation is the most complex module in a video encoder, requiring high processing throughput and high memory bandwidth, especially for high-definition video. The throughput problem can be addressed by increasing the parallelism of the internal operations, while the external memory bandwidth can be reduced with a memory hierarchy. This work presents a memory hierarchy model for a full-search motion estimation core. The proposed model is based on a data reuse scheme that exploits the features of the full-search algorithm. The proposed memory hierarchy significantly reduces the external memory bandwidth required by the motion estimation process and provides the very high data throughput the ME core needs to reach real-time performance on high-definition video. In the worst-case bandwidth scenario, this memory hierarchy reduces the external memory bandwidth by a factor of 578. A case study of the proposed hierarchy, using a 32×32 search window and an 8×8 block size, was implemented and prototyped on a Virtex 4 FPGA. The results show that it is possible to reach 38 frames per second when processing full HD frames (1920×1080 pixels) while using nearly 299 Mbytes per second of external memory bandwidth.
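The scale of the data-reuse effect can be illustrated with a rough, back-of-the-envelope traffic count. The C sketch below compares the number of reference-pixel reads per frame for a naive full search that re-reads every candidate block from external memory against a design that buffers each 32×32 search window on chip; the parameters follow the case study above (8×8 blocks, 1920×1080 frames). This illustrates only one level of reuse and is not the paper's architecture; in particular it does not reproduce the 578× figure, which also exploits the overlap between search windows of neighbouring blocks.

#include <stdio.h>

int main(void) {
    const long long W = 1920, H = 1080;   /* frame size in pixels            */
    const long long B = 8;                /* block size                      */
    const long long S = 32;               /* search window size              */
    const long long blocks = (W / B) * (H / B);

    /* No reuse: every candidate position re-reads a BxB reference block
       from external memory. */
    const long long candidates = (S - B + 1) * (S - B + 1);
    const long long no_reuse   = blocks * candidates * B * B;

    /* Search-window buffering: each block loads its SxS window once and all
       candidates inside the window are served from on-chip memory. */
    const long long window_reuse = blocks * S * S;

    printf("reference-pixel reads per frame\n");
    printf("  no reuse         : %lld\n", no_reuse);
    printf("  window buffering : %lld (%.0fx fewer)\n",
           window_reuse, (double)no_reuse / (double)window_reuse);
    return 0;
}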


2019 ◽ Vol 5 (3) ◽ pp. 317-337
Author(s): B. Custers ◽ H. U Vrabec ◽ M. Friedewald

2018 ◽ Vol 175 ◽ pp. 02009
Author(s): Carleton DeTar ◽ Steven Gottlieb ◽ Ruizi Li ◽ Doug Toussaint

With recent developments in parallel supercomputing architecture, many-core, multi-core, and GPU processors are now commonplace, resulting in more levels of parallelism, deeper memory hierarchies, and greater programming complexity. It has been necessary to adapt the MILC code to these new processors, starting with NVIDIA GPUs and, more recently, the Intel Xeon Phi processors. We report on our efforts to port and optimize our code for the Intel Knights Landing architecture. We consider the performance of the MILC code with MPI and OpenMP, and optimizations with QOPQDP and QPhiX. For the latter approach, we concentrate on the staggered conjugate gradient and the gauge force. We also consider performance on recent NVIDIA GPUs using the QUDA library.
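As a concrete picture of the two levels of parallelism the abstract refers to, the C sketch below shows the generic hybrid MPI+OpenMP pattern used on many-core processors such as Knights Landing: the lattice is domain-decomposed across MPI ranks, OpenMP threads share the site loop within each rank, and a global reduction combines the per-rank partial results, as happens in every iteration of a staggered conjugate-gradient solve. This is a minimal illustration of the pattern only, not code from MILC, QOPQDP, QPhiX, or QUDA; the array names and sizes are placeholders. It would be built with an MPI compiler wrapper and the compiler's OpenMP flag, e.g. mpicc -fopenmp.

#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

#define LOCAL_SITES (1 << 18)   /* lattice sites owned by this rank (placeholder) */

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);
    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* Two local fields standing in for CG vectors on this rank's sub-lattice. */
    double *x = malloc(LOCAL_SITES * sizeof *x);
    double *y = malloc(LOCAL_SITES * sizeof *y);
    for (long i = 0; i < LOCAL_SITES; i++) { x[i] = 1.0; y[i] = 2.0; }

    /* Thread-parallel local dot product, the kind of reduction a
       conjugate-gradient iteration needs for residual norms and step sizes. */
    double local = 0.0;
    #pragma omp parallel for reduction(+:local)
    for (long i = 0; i < LOCAL_SITES; i++)
        local += x[i] * y[i];

    /* Global reduction across ranks completes the lattice-wide dot product. */
    double global = 0.0;
    MPI_Allreduce(&local, &global, 1, MPI_DOUBLE, MPI_SUM, MPI_COMM_WORLD);

    if (rank == 0)
        printf("global dot product = %.1f\n", global);

    free(x);
    free(y);
    MPI_Finalize();
    return 0;
}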


Author(s): Annabelle Cumyn ◽ Roxanne Dault ◽ Adrien Barton ◽ Anne-Marie Cloutier ◽ Jean-François Ethier

A survey was conducted to assess the attitudes of citizens, research ethics committee members, and researchers toward information and consent for the secondary use of health data for research within learning health systems (LHSs). The results show that the reuse of health data for research to advance knowledge and improve care is valued by all parties; that consent regarding health data reuse for research is of fundamental importance, particularly to citizens; and that all respondents considered it important to have a secure website supporting the information and consent processes. This survey was part of a larger project that aims to explore public perspectives on alternatives to the current consent models for health data reuse, taking into consideration the unique features of LHSs. The revised model will need to ensure that citizens are given the opportunity to be better informed about upcoming research and to have their say, when possible, in the use of their data.


2021 ◽ Vol 27 (1)
Author(s): Alex McKeown ◽ Miranda Mourby ◽ Paul Harrison ◽ Sophie Walker ◽ Mark Sheehan ◽ ...
Data platforms represent a new paradigm for carrying out health research. In the platform model, datasets are pooled for remote access and analysis, so novel insights for developing better stratified and/or personalised medicine approaches can be derived from their integration. If the integration of diverse datasets enables the development of more accurate risk indicators, prognostic factors, or better treatments and interventions, this provides a strong case for the sharing and reuse of data, and a platform-based approach is an appropriate model for facilitating it. Platform-based approaches thus require new thinking about consent. Here we defend an approach to meeting this challenge within the data platform model, grounded in: the notion of ‘reasonable expectations’ for the reuse of data; Waldron’s account of ‘integrity’ as a heuristic for managing disagreement about the ethical permissibility of the approach; and the element of the social contract that emphasises the importance of public engagement in embedding new norms of research consistent with changing technological realities. While a social contract approach may sound appealing, it is incoherent in the context at hand: the relationship in question is not recognisably contractual, so the social contract framing is misleading here. We instead defend a way forward guided by that part of the social contract which requires public approval for the proposal, and argue that we have moral reasons to endorse a wider presumption of data reuse. We conclude by stating four requirements on which the legitimacy of our proposal rests.


2017 ◽ Vol 60 (4) ◽ pp. 85-85
Author(s): Jonathan Ullman

1995 ◽ Vol 23 (3) ◽ pp. 28
Author(s): Daniel Tabak

2014 ◽ Vol 49 (6) ◽ pp. 65-76
Author(s): Kevin Stock ◽ Martin Kong ◽ Tobias Grosser ◽ Louis-Noël Pouchet ◽ Fabrice Rastello ◽ ...
