Radiomics for Everyone: A New Tool Simplifies Creating Parametric Maps for the Visualization and Quantification of Radiomics Features

Tomography ◽  
2021 ◽  
Vol 7 (3) ◽  
pp. 477-487
Author(s):  
Damon Kim ◽  
Laura J. Jensen ◽  
Thomas Elgeti ◽  
Ingo G. Steffen ◽  
Bernd Hamm ◽  
...  

The aim was to develop a user-friendly method for creating parametric maps that would provide a comprehensible visualization and allow immediate quantification of radiomics features. For this, a self-explanatory graphical user interface was designed, and for the proof of concept, maps were created for CT and MR images and their features were compared to those from conventional extractions. First-order features in particular were concordant between maps and conventional extractions, some even across all examples. Potential clinical applications were tested on CT and MR images for the differentiation of pulmonary lesions. In these sample applications, maps of Skewness enhanced the differentiation of non-malignant lesions from non-small cell lung carcinoma manifestations on CT images, and maps of Variance enhanced the differentiation of pulmonary lymphoma manifestations from fungal infiltrates on MR images. This new and simple method for creating parametric maps makes radiomics features visually perceivable, allows direct feature quantification by placing a region of interest, can improve the assessment of radiological images, and can increase the use of radiomics in clinical routine.
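A minimal sketch of the general idea of such a feature map, not the authors' tool: a first-order feature (here Skewness) is computed voxel-wise over a sliding window, and the resulting map can then be quantified directly by placing a region of interest. The window size, the feature choice, and the toy image are assumptions for illustration only.

```python
# Sketch: sliding-window "parametric map" of a first-order radiomics feature.
# Not the authors' software; window size and feature are illustrative choices.
import numpy as np
from scipy.stats import skew
from scipy.ndimage import generic_filter

def skewness_map(image: np.ndarray, window: int = 9) -> np.ndarray:
    """Voxel-wise skewness of grey values within a sliding window."""
    return generic_filter(image.astype(float), skew, size=window, mode="nearest")

if __name__ == "__main__":
    img = np.random.default_rng(0).normal(100, 20, size=(64, 64))  # toy image
    fmap = skewness_map(img, window=9)
    roi = fmap[20:30, 20:30]                 # hypothetical region of interest
    print("mean Skewness in ROI:", roi.mean())
```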

Author(s):  
Yannick van Hierden ◽  
Timo Dietrich ◽  
Sharyn Rundle-Thiele

In recent years, the relevance of eHealth interventions has become increasingly evident. However, a sequential procedure for cocreating eHealth interventions is currently lacking. This paper demonstrates the implementation of a participatory design (PD) process to inform the design of an eHealth intervention aiming to enhance well-being. Four PD sessions were conducted with a total of 57 participants. Within the sessions, participants experienced prototype activities, provided feedback, and designed program interventions. A 5-week eHealth well-being intervention focusing on lifestyle, habits, physical activity, and meditation was proposed, to be delivered through online workshops and online community interaction. A five-step PD process emerged, namely: (1) collecting best practices, (2) participatory discovery, (3) initial proof-of-concept, (4) participatory prototyping, and (5) pilot intervention proof-of-concept finalisation. Health professionals, behaviour change practitioners, and program planners can adopt this five-step process to ensure end-user cocreation; the process may also help to create user-friendly programs.


1994 ◽  
Vol 4 (3) ◽  
pp. 315-318 ◽  
Author(s):  
Richard C. Semelka ◽  
Ann S. Bagley ◽  
Elizabeth D. Brown ◽  
Mervyn A. Kroeker
Keyword(s):  

2014 ◽  
Vol 35 (4) ◽  
Author(s):  
Angshuman Majumdar ◽  
Satabdi Das ◽  
Sankar Gangopadhyay

Based on the simple power-series formulation of the fundamental mode developed by the Chebyshev formalism in the low-V region, we prescribe an analytical expression for the effective core area of graded-index fibers. Taking step-index and parabolic-index fibers as examples, we estimate the effective core areas as well as the effective refractive indices for different normalized frequencies (V numbers) of low value. We also show that our estimations match excellently with the available exact results. The predictions of our method require little computation; thus, this simple but accurate formalism should prove user friendly for system engineers.
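For orientation, a minimal numerical sketch of the standard effective-area definition, A_eff = 2*pi*(∫ psi^2 r dr)^2 / ∫ psi^4 r dr, evaluated for a radially symmetric fundamental mode. The paper's Chebyshev power-series field is not reproduced here; instead the LP01 mode is approximated by a Gaussian with Marcuse's spot-size formula for a step-index fiber, which is an assumption, as are the core radius and V value.

```python
# Sketch: numeric effective core area for a Gaussian-approximated LP01 mode.
# The Gaussian/Marcuse approximation is an assumption, not the paper's formalism.
import numpy as np

def marcuse_spot_size(a_um: float, V: float) -> float:
    """Gaussian spot size w (Marcuse approximation, step-index fibre)."""
    return a_um * (0.65 + 1.619 * V**-1.5 + 2.879 * V**-6)

def effective_area(psi: np.ndarray, r: np.ndarray) -> float:
    """A_eff = 2*pi*(int psi^2 r dr)^2 / int psi^4 r dr for a radial field."""
    dr = r[1] - r[0]
    num = (2 * np.pi * np.sum(psi**2 * r) * dr) ** 2
    den = 2 * np.pi * np.sum(psi**4 * r) * dr
    return num / den

a, V = 4.0, 1.8                          # core radius (um) and low V number
w = marcuse_spot_size(a, V)
r = np.linspace(0, 10 * a, 4000)
psi = np.exp(-((r / w) ** 2))            # Gaussian approximation of LP01
print(effective_area(psi, r), np.pi * w**2)   # numeric vs closed form pi*w^2
```

For a Gaussian field the integral reduces to A_eff = pi*w^2, so the printed values should agree, which serves as a quick sanity check of the quadrature.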


2018 ◽  
Vol 4 (1) ◽  
pp. 331-335
Author(s):  
David Schote ◽  
Tim Pfeiffer ◽  
Georg Rose

Computed tomography (CT) scans are frequently used intraoperatively, for example to control the positioning of implants during an intervention. Often, a full field of view is unnecessary to provide the required information. Instead, region-of-interest (ROI) imaging can be performed, allowing for a substantial reduction in the applied X-ray dose. However, ROI imaging leads to data inconsistencies caused by the truncation of the projections. This lack of information severely impairs the quality of the reconstructed images. This study presents a proof of concept for a new approach that combines the incomplete CT data with ultrasound data and time-of-flight measurements in order to restore some of the missing information. The routine is evaluated in a simulation study using the original Shepp-Logan phantom in ROI cases with different degrees of truncation. Image quality is assessed by means of the normalized root mean square error. The proposed method significantly reduces truncation artifacts in the reconstructions and achieves considerable reductions in radiation exposure.
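A minimal sketch of the normalized root mean square error used as the image-quality metric, computed against a Shepp-Logan ground truth. The normalization by the reference intensity range and the noisy image standing in for a truncated reconstruction are assumptions; the paper may normalize differently.

```python
# Sketch: NRMSE between a reference image and a reconstruction.
# Normalization convention (intensity range of the reference) is assumed.
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.util import random_noise

def nrmse(reference: np.ndarray, reconstruction: np.ndarray) -> float:
    rmse = np.sqrt(np.mean((reference - reconstruction) ** 2))
    return rmse / (reference.max() - reference.min())

phantom = shepp_logan_phantom()                  # ground-truth phantom
degraded = random_noise(phantom, var=0.01)       # stand-in for a truncated recon
print("NRMSE:", nrmse(phantom, degraded))
```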


2013 ◽  
Vol 647 ◽  
pp. 325-330 ◽  
Author(s):  
Yu Fan Zeng ◽  
Xue Jun Zhang ◽  
Wen Yan ◽  
Li Ling Long ◽  
Yu Kun Huang ◽  
...  

The fibrous texture of the liver is one of the important signs radiologists use when interpreting chronic liver disease. To investigate the usefulness of various computer-calculated texture features on hepatic magnetic resonance (MR) images, 15 texture features were calculated from the gray-level co-occurrence matrix (GLCM) within a region of interest (ROI) selected from MR images covering 6 stages of hepatic fibrosis. With different combinations of the 15 features as input vectors, the classifier performed differently in staging hepatic fibrosis. Each combination of texture features was tested with a Support Vector Machine (SVM) using the leave-one-case-out method. MR images from 173 patients covering the 6 stages of hepatic fibrosis were acquired over the preceding two years. The results showed that the optimal number of features lay between 3 and 7, as confirmed by investigating the classification accuracy between each stage/group. It is evident that angular second moment, entropy, sum average, and sum entropy played the most significant roles in the classification.
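A minimal sketch of the pipeline described above: GLCM-derived texture features from an ROI fed to an SVM evaluated with a leave-one-case-out scheme. The GLCM parameters (distance, angles, grey levels), the reduced feature set, and the synthetic ROIs are assumptions, not the paper's exact settings.

```python
# Sketch: GLCM texture features + SVM with leave-one-case-out evaluation.
# Parameters and synthetic data are illustrative assumptions.
import numpy as np
from skimage.feature import graycomatrix
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

def glcm_features(roi_uint8: np.ndarray) -> np.ndarray:
    glcm = graycomatrix(roi_uint8, distances=[1],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=256, symmetric=True, normed=True)
    p = glcm.mean(axis=3)[:, :, 0]                       # average over angles
    asm = np.sum(p ** 2)                                  # angular second moment
    entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
    i, j = np.indices(p.shape)
    p_sum = np.array([p[i + j == s].sum() for s in range(2 * 256 - 1)])
    sum_average = np.sum(np.arange(p_sum.size) * p_sum)   # 0-based index convention
    sum_entropy = -np.sum(p_sum[p_sum > 0] * np.log2(p_sum[p_sum > 0]))
    return np.array([asm, entropy, sum_average, sum_entropy])

def loco_accuracy(X, y, groups):
    """Leave-one-case-out accuracy: each case (patient) is held out once."""
    return cross_val_score(SVC(kernel="rbf"), X, y,
                           cv=LeaveOneGroupOut(), groups=groups).mean()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    rois = [rng.integers(0, 256, (32, 32), dtype=np.uint8) for _ in range(20)]
    X = np.array([glcm_features(r) for r in rois])
    y = np.array([0] * 10 + [1] * 10)    # fake fibrosis "stages" for illustration
    groups = np.arange(20)               # one case per ROI
    print("LOCO accuracy:", loco_accuracy(X, y, groups))
```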


1997 ◽  
Vol 38 (1) ◽  
pp. 165-172 ◽  
Author(s):  
G. Torheim ◽  
M. Lombardi ◽  
P. A. Rinck

Purpose: A computer system for the manual, semi-automatic, and automatic analysis of dynamic MR images was to be developed on UNIX and personal computer platforms. The system was to offer an integrated and standardized way of performing both image processing and analysis, independent of the MR unit used. Material and Methods: The system consists of modules that are easily adaptable to special needs. Data from MR units or from other diagnostic imaging modalities such as CT, ultrasonography, or nuclear medicine can be processed through the ACR-NEMA/DICOM standard file formats. A full set of functions is available, among them cine-loop visual analysis and the generation of time-intensity curves. Parameters such as cross-correlation coefficients, area under the curve, peak/maximum intensity, wash-in and wash-out slopes, time to peak, and relative signal intensity/contrast enhancement can be calculated. Other parameters can be extracted by fitting functions such as the gamma-variate function. Region-of-interest data and parametric values can easily be exported. Results and Conclusion: The system has been successfully tested in animal and patient examinations.
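A minimal sketch of one of the parameter-extraction steps mentioned above: fitting a gamma-variate function to a time-intensity curve and reading off peak intensity and time to peak. The parametrization s(t) = A*(t - t0)^alpha * exp(-(t - t0)/beta) is a common convention and is an assumption here, not necessarily the form used by the described system.

```python
# Sketch: gamma-variate fit to a synthetic time-intensity curve.
# The functional form and the synthetic data are assumptions.
import numpy as np
from scipy.optimize import curve_fit

def gamma_variate(t, A, t0, alpha, beta):
    dt = np.clip(t - t0, 0, None)                # zero before bolus arrival t0
    return A * dt**alpha * np.exp(-dt / beta)

t = np.linspace(0, 60, 120)                      # seconds
truth = gamma_variate(t, 50, 8, 2.5, 4.0)
signal = truth + np.random.default_rng(1).normal(0, 1.5, t.size)

popt, _ = curve_fit(gamma_variate, t, signal, p0=[40, 5, 2, 3],
                    bounds=(0, np.inf))
fit = gamma_variate(t, *popt)
print("peak intensity:", fit.max(), "time to peak:", t[np.argmax(fit)])
```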


Geophysics ◽  
2016 ◽  
Vol 81 (6) ◽  
pp. V403-V413 ◽  
Author(s):  
Julián L. Gómez ◽  
Danilo R. Velis

We developed a new and simple method for denoising seismic data, inspired by data-driven empirical mode decomposition (EMD) algorithms. The method, which can be applied either trace by trace or in the [Formula: see text] domain, replaces the cubic interpolation scheme required to calculate the mean envelopes of the signal and the residues with window averaging. The resulting strategy is not an EMD per se but a user-friendly version of EMD-based algorithms that attains, in a fraction of the time, the same level of noise cancellation as standard EMD implementations. Furthermore, the proposed method requires less user intervention and processes millions of traces in minutes rather than the hours required by conventional EMD-based techniques on a standard PC. We compared the performance of the new method against standard EMD methods in terms of computational cost and signal preservation, applying them to denoise synthetic and field (microseismic and poststack) data containing random, erratic, and coherent noise. The corresponding [Formula: see text] EMD implementations for lateral continuity enhancement were analyzed and compared against classical [Formula: see text] deconvolution to test the method.
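A minimal sketch loosely inspired by the idea above, not the authors' algorithm: an EMD-like sifting step in which the local mean of a trace is estimated by window averaging (a moving average) instead of spline-interpolated envelopes, and the oscillatory part around that mean is peeled off. Window length, the number of extracted components, and the synthetic trace are assumptions.

```python
# Sketch: EMD-like decomposition with a moving-average local mean.
# Illustrative only; parameters are not the authors' settings.
import numpy as np

def moving_average(x: np.ndarray, win: int) -> np.ndarray:
    return np.convolve(x, np.ones(win) / win, mode="same")

def window_average_decompose(trace: np.ndarray, win: int, n_modes: int = 2):
    """Peel off n_modes high-frequency components; return (modes, residue)."""
    residue, modes = trace.astype(float).copy(), []
    for _ in range(n_modes):
        local_mean = moving_average(residue, win)
        modes.append(residue - local_mean)   # oscillation around the local mean
        residue = local_mean                 # remainder after extraction
    return modes, residue

# Denoise a synthetic trace: discard the first (noise-dominated) modes
t = np.linspace(0, 1, 1000)
clean = np.sin(2 * np.pi * 30 * t) * np.exp(-((t - 0.5) / 0.1) ** 2)
noisy = clean + 0.5 * np.random.default_rng(0).standard_normal(t.size)
modes, residue = window_average_decompose(noisy, win=7, n_modes=2)
denoised = residue
```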

