A tool to minimize the time costs of parallel computations through optimal processing power allocation

1990 ◽  
Vol 20 (3) ◽  
pp. 283-300 ◽  
Author(s):  
Bin Qin ◽  
Howard A. Sholl ◽  
Reda A. Ammar
Author(s):  
Alexandra Briasouli ◽  
Daniela Minkovska ◽  
Lyudmila Stoyanova

Big Data is being created from virtually everything around us, at all times. Every digital media interaction generates data, from web browsing and online retail to iTunes purchases and Facebook likes. This data is captured from multiple sources, at tremendous speed and in enormous volume and variety. But in order to extract substantial value from it, one must possess sufficient processing power, the appropriate analysis tools and, of course, the corresponding skills. The range of data collected by businesses today is almost unreal. According to IBM, more than 2.5 quintillion bytes of data are generated every day, and the amount of data is growing at such an astonishing rate that 90 % of it has been generated in just the last two years. Big Data has recently attracted substantial interest from both academics and practitioners. Big Data Analytics (BDA) is increasingly becoming a trending practice that many organizations adopt with the purpose of extracting valuable information from Big Data. The analytics process, including the deployment and use of BDA tools, is seen by organizations as a way to improve operational efficiency, though it also has the strategic potential to drive new revenue streams and gain competitive advantages over business rivals. However, there are different types of analytic applications to consider. This paper presents a view of the Big Data challenges and methods, to help readers understand the significance of using Big Data technologies. The article is based on a bibliographic review of texts published in scientific journals and of relevant research dealing with Big Data, which has exploded in recent years as it becomes increasingly linked to technology.
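
As an illustrative sketch only, not taken from the article: the kind of aggregation a BDA tool performs can be pictured with Apache Spark's Python API, assuming a hypothetical interactions.csv log of digital-media events with user_id, channel and event columns.

```python
# Minimal BDA sketch: aggregate a hypothetical event log with PySpark.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("bda-sketch").getOrCreate()

# Read the raw event log; in practice this would live in a distributed
# store (HDFS, S3, ...) rather than a local file.
events = spark.read.csv("interactions.csv", header=True, inferSchema=True)

# Count events and distinct users per channel (browsing, retail,
# social, ...) to surface where interaction volume concentrates.
per_channel = (
    events.groupBy("channel")
          .agg(F.count("*").alias("events"),
               F.countDistinct("user_id").alias("unique_users"))
          .orderBy(F.desc("events"))
)

per_channel.show()
spark.stop()
```

The same code runs unchanged on a single machine or a cluster, which is the practical sense in which such tools trade processing power for insight at scale.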


2009 ◽  
Vol 5 (2) ◽  
pp. 10 ◽  
Author(s):  
Jose Luis Zamorano ◽  

3D echocardiography (3DE) will gain increasing acceptance as a routine clinical tool as the technology evolves with advances in computer processing power. Images obtained with 3DE provide a more accurate assessment of complex cardiac anatomy and of sophisticated functional mechanisms than conventional 2D echocardiography (2DE), and are comparable to those achieved with magnetic resonance imaging. Many of the limitations associated with the early iterations of 3DE prevented its widespread clinical application. However, recent significant improvements in transducer and post-processing software technologies have addressed many of these issues. Furthermore, the most recent advances, the ability to image the entire heart in realtime and fully automated quantification, have positioned 3DE to become increasingly common in routine clinical practice. Realtime 3DE (RT3DE) systems offer further improvements in the diagnostic and treatment-planning capabilities of cardiac ultrasound. Innovations such as the ability to acquire non-stitched, realtime, full-volume 3D images of the heart in a single cardiac cycle promise to overcome some of the limitations of current RT3DE systems, which acquire images over four to seven cardiac cycles, with the need for gating and the potential for stitch artefacts.
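
As an illustrative sketch only, not drawn from the article: the contrast between multi-beat gated acquisition and single-beat full-volume imaging can be pictured as below, where a gated volume is assembled from per-beat subvolumes and is therefore vulnerable to seams (stitch artefacts) if the heart or probe moves between beats; all names and array shapes are hypothetical.

```python
# Conceptual sketch (not vendor code): assembling a full cardiac volume
# from ECG-gated subvolumes acquired over several beats.
import numpy as np

def stitch_gated_subvolumes(subvolumes):
    """Concatenate per-beat subvolumes along the azimuth axis.

    Each subvolume is a (depth, elevation, azimuth_slice) array acquired
    at the same phase of the cardiac cycle on successive beats; any
    inter-beat motion shows up as a discontinuity at the seams.
    """
    return np.concatenate(subvolumes, axis=2)

# Hypothetical example: a four-beat gated acquisition vs. a single-beat volume.
rng = np.random.default_rng(0)
gated = [rng.random((64, 64, 16)) for _ in range(4)]   # four gated subvolumes
stitched = stitch_gated_subvolumes(gated)              # (64, 64, 64), seams possible

single_beat = rng.random((64, 64, 64))                 # one-beat acquisition, no seams
print(stitched.shape, single_beat.shape)
```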


2011 ◽  
Vol E94-B (8) ◽  
pp. 2316-2327 ◽  
Author(s):  
Jong-Ho LEE ◽  
Oh-Soon SHIN
Keyword(s):  
