Interactive volume rendering of real-time three-dimensional ultrasound images

Author(s):  
Johnny Kuo ◽  
Gregory Bredthauer ◽  
John Castellucci ◽  
Olaf Von Ramm
2006 ◽  
Vol 51 (6) ◽  
pp. 304-310 ◽  
Author(s):  
V. F. Kravchenko ◽  
V. I. Ponomaryov ◽  
V. I. Pustovoĭt ◽  
R. Sansores-Pech

Author(s):  
Yanyang Zeng ◽  
Panpan Jia

Underwater acoustics is the primary and most effective method for underwater object detection, and the complex underwater acoustic battlefield environment can be described visually by a three-dimensional (3D) energy field. Traditionally, underwater acoustic volume data are obtained by solving 3D propagation models, but this requires a large amount of computation. In this paper, a novel modeling approach is proposed that transforms the wave equation into two-dimensional (2D) space and optimizes the energy-loss propagation model. In this way, little information is lost from the obtained volume data, while the data-processing requirements of real-time visualization are still met. For volume rendering, a 3D texture mapping method is used. The experimental results are evaluated in terms of data size and frame rate, showing that the proposed approach outperforms other approaches and achieves better real-time performance and visual effects.
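The rendering step here relies on 3D texture mapping. As a rough illustration of the compositing that slice-based texture rendering performs, the sketch below blends axis-aligned slices of a scalar energy field back to front in NumPy; the function name, the opacity factor, and the synthetic field are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def composite_volume(volume, opacity_scale=0.05):
    """Back-to-front alpha compositing of axis-aligned slices.

    CPU analogue of 3D-texture slice rendering: each z-slice of the
    scalar field is treated as a textured quad and blended over the
    image accumulated so far. `volume` is a (Z, Y, X) array of energy
    values normalised to [0, 1]; `opacity_scale` is an illustrative
    per-slice opacity factor, not a value from the paper.
    """
    img = np.zeros(volume.shape[1:], dtype=np.float32)   # accumulated intensity
    alpha_acc = np.zeros_like(img)                        # accumulated opacity
    for z in range(volume.shape[0] - 1, -1, -1):          # back to front
        slice_ = volume[z].astype(np.float32)
        alpha = np.clip(slice_ * opacity_scale, 0.0, 1.0)
        img = slice_ * alpha + img * (1.0 - alpha)        # "over" operator
        alpha_acc = alpha + alpha_acc * (1.0 - alpha)
    return img, alpha_acc

# Example: render a synthetic 3D acoustic energy field.
field = np.random.rand(64, 128, 128)
image, coverage = composite_volume(field)
```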


Author(s):  
Daniel Jie Yuan Chin ◽  
Ahmad Sufril Azlan Mohamed ◽  
Khairul Anuar Shariff ◽  
Mohd Nadhir Ab Wahab ◽  
Kunio Ishikawa

Three-dimensional reconstruction plays an important role in helping doctors and surgeons assess the healing progress of bone defects. Common three-dimensional reconstruction methods include surface rendering and volume rendering. Because the focus here is on the shape of the bone, volume rendering is omitted. Many improvements have been made to surface rendering methods such as Marching Cubes and Marching Tetrahedra, but few works target real-time or near real-time surface rendering for large medical images or study how different parameter settings affect these improvements. Hence, in this study, an attempt is made towards near real-time surface rendering for large medical images. Different parameter values are tested to study their effect on reconstruction accuracy, reconstruction and rendering time, and the number of vertices and faces. The proposed improvement, combining three-dimensional data smoothing with a Gaussian convolution kernel of size 0.5 and mesh simplification with a reduction factor of 0.1, is the parameter combination that best balances high reconstruction accuracy, low total execution time, and a low number of vertices and faces. It increases reconstruction accuracy by 0.0235%, decreases total execution time by 69.81%, and decreases the number of vertices and faces by 86.57% and 86.61%, respectively.
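The pipeline described above (Gaussian smoothing of the volume, Marching Cubes surface extraction, then mesh simplification) can be sketched with off-the-shelf libraries. The snippet below assumes SciPy, scikit-image, and Open3D, and interprets the reduction factor of 0.1 as the fraction of triangles retained; that interpretation, the function name, and the synthetic volume are assumptions rather than the authors' code.

```python
import numpy as np
import open3d as o3d
from scipy import ndimage
from skimage import measure

def reconstruct_surface(volume, iso_level, sigma=0.5, reduction=0.1):
    """Sketch of the smoothing -> Marching Cubes -> simplification pipeline.

    `sigma` is the Gaussian kernel size and `reduction` the simplification
    factor from the abstract; treating `reduction` as the fraction of
    triangles kept is an assumption, not the authors' definition.
    """
    # Smooth the scalar volume before surface extraction.
    smoothed = ndimage.gaussian_filter(volume.astype(np.float32), sigma=sigma)

    # Extract the isosurface with Marching Cubes.
    verts, faces, _, _ = measure.marching_cubes(smoothed, level=iso_level)

    # Build an Open3D mesh and simplify it by quadric decimation.
    mesh = o3d.geometry.TriangleMesh(
        o3d.utility.Vector3dVector(verts),
        o3d.utility.Vector3iVector(faces),
    )
    target = max(4, int(len(faces) * reduction))
    simplified = mesh.simplify_quadric_decimation(target_number_of_triangles=target)
    simplified.compute_vertex_normals()
    return simplified

# Example with a synthetic CT-like volume (hypothetical values).
ct = np.random.rand(128, 128, 128)
mesh = reconstruct_surface(ct, iso_level=0.5)
```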


2009 ◽  
Author(s):  
Emad M. Boctor ◽  
Mohammad Matinfar ◽  
Omar Ahmad ◽  
Hassan Rivaz ◽  
Michael Choti ◽  
...  

2001 ◽  
Vol 14 (S1) ◽  
pp. 202-204 ◽  
Author(s):  
Jinwoo Hwang ◽  
June Sic Kim ◽  
Jae Seok Kim ◽  
In Young Kim ◽  
Sun I. Kim

2019 ◽  
Vol 40 (Supplement_1) ◽  
Author(s):  
J D Kasprzak ◽  
M Kierepka ◽  
J Z Peruga ◽  
D Dudek ◽  
B Machura ◽  
...  

Abstract

Background: Three-dimensional (3D) echocardiographic data acquired from the transesophageal (TEE) window are commonly used in planning and during percutaneous structural cardiac interventions (PSCI).

Purpose: We hypothesized that an innovative, interactive mixed reality display can be integrated into the procedural PSCI workflow to improve perception and interpretation of 3D data representing cardiac anatomy.

Methods: 3D TEE datasets were acquired before, during, and after completion of PSCI in 8 patients (occluder implantations: 2 atrial appendage, 2 patent foramen ovale, and 3 atrial septal; and percutaneous mitral commissurotomy). Thirty Cartesian DICOM files were used to test the feasibility of mixed reality with a commercially available head-mounted device (overlaying a hologram of the 3D TEE data onto the real-world view) as a display for the interventional or imaging operator. Dedicated software was used for file conversion, 3D rendering of the data to the display device (in one case with real-time Wi-Fi streaming from the echocardiograph), and spatial manipulation of the hologram during PSCI. A custom viewer was used to perform volume rendering and adjustment (cropping, transparency, and shading control).

Results: Pre- and intraprocedural 3D TEE was performed in all 8 patients (5 women, age 40–83). Thirty selected 3D TEE datasets were successfully transferred and displayed in the mixed reality head-mounted device as a holographic image overlying the real-world view. The analysis was performed both before and during the procedure and compared with the flat-screen 2D display of the echocardiograph. In one case, real-time data transfer was successfully implemented during mitral balloon commissurotomy. The quality of visualization was judged as good, without loss of diagnostic content, in all (100%) datasets. Both target structures and additional anatomical details were clearly presented, including fenestrations of an atrial septal defect, a prominent Eustachian valve, and earlier cardiac implants. Volume-rendered views were manipulated touchlessly and displayed with a selection of intensity windows, transfer functions, and filters. Detail display was judged comparable to current 2D volume rendering on commercial workstations, and the touchless user interface was comfortable for optimizing views during PSCI.

Conclusions: Mixed reality display using a commercially available head-mounted device can be successfully integrated with the preparation and execution of PSCI. The benefits of this solution include touchless image control and an unobstructed real-world view facilitating intraprocedural use, showing superiority over virtual or enhanced reality solutions. Expected progress includes integration of color flow data and optimization of the real-time streaming option.
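The viewer adjustments mentioned in the Methods (cropping, transparency, and shading control) amount to simple per-voxel operations applied before the volume reaches the head-mounted display. Below is a minimal sketch, assuming the Cartesian 3D TEE export is a multi-frame DICOM readable with pydicom and that a gamma-style opacity curve stands in for the unspecified transfer function; the file name, parameter names, and helper are hypothetical, not the authors' viewer software.

```python
import numpy as np
import pydicom

def prepare_volume(dicom_path, crop=None, window=(0, 255), opacity_gamma=1.0):
    """Crop, window, and derive a simple opacity map from a 3D TEE DICOM volume.

    Assumes a multi-frame DICOM whose pixel_array is a (frames, rows, cols)
    block; the gamma-style opacity mapping is an illustrative stand-in for
    the transparency control described in the abstract.
    """
    ds = pydicom.dcmread(dicom_path)
    vol = ds.pixel_array.astype(np.float32)        # (frames, rows, cols)

    if crop is not None:                           # crop = ((z0, z1), (y0, y1), (x0, x1))
        (z0, z1), (y0, y1), (x0, x1) = crop
        vol = vol[z0:z1, y0:y1, x0:x1]

    lo, hi = window                                # intensity window for display
    vol = np.clip((vol - lo) / max(hi - lo, 1e-6), 0.0, 1.0)

    opacity = vol ** opacity_gamma                 # simple transparency transfer function
    return vol, opacity

# Example call (hypothetical file name and settings):
# intensity, alpha = prepare_volume("tee_3d.dcm",
#                                   crop=((10, 90), (0, 200), (0, 200)),
#                                   window=(20, 180), opacity_gamma=1.5)
```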


1996 ◽  
Vol 167 (3) ◽  
pp. 581-583 ◽  
Author(s):  
P T Johnson ◽  
D G Heath ◽  
D F Bliss ◽  
B Cabral ◽  
E K Fishman
