AI-Based Radiological Imaging for HCC: Current Status and Future of Ultrasound

Diagnostics ◽  
2021 ◽  
Vol 11 (2) ◽  
pp. 292
Author(s):  
Hitoshi Maruyama ◽  
Tadashi Yamaguchi ◽  
Hiroaki Nagamatsu ◽  
Shuichiro Shiina

Hepatocellular carcinoma (HCC) is a common cancer worldwide. Recent international guidelines call for identification of the tumor stage and the patient's background/condition to support appropriate management decisions. Radiomics is a technology based on the quantitative extraction of image characteristics from radiological imaging modalities. Artificial intelligence (AI) algorithms are the principal axis of the radiomics procedure and may extract, from large data sets, results beyond the reach of conventional techniques. This review article focuses on the application of radiomics-related diagnosis of HCC using radiological imaging (computed tomography, magnetic resonance imaging, and ultrasound (B-mode, contrast-enhanced ultrasound, and elastography)), and discusses the current role, limitations, and future of ultrasound. Although the evidence shows a positive effect of AI-based ultrasound in the prediction of tumor characteristics and malignant potential, posttreatment response, and prognosis, a number of issues remain in the practical management of patients with HCC. It is highly expected that the wide range of AI applications for ultrasound will support further improvement of the diagnostic ability for HCC and provide great benefit to patients.
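
As a hedged, minimal illustration of the radiomics idea described above (not the authors' pipeline), the sketch below computes a few first-order features from a tumor region of interest in a grayscale image; the synthetic image, mask, and bin count are assumptions for demonstration only.

```python
# Minimal first-order radiomics sketch; image and mask are synthetic placeholders.
import numpy as np

def first_order_features(image, mask):
    roi = image[mask > 0].astype(float)           # pixel intensities inside the ROI
    counts, _ = np.histogram(roi, bins=64)
    p = counts[counts > 0] / counts.sum()         # normalized intensity histogram
    return {
        "mean": roi.mean(),
        "variance": roi.var(),
        "skewness": ((roi - roi.mean()) ** 3).mean() / (roi.std() ** 3 + 1e-12),
        "entropy": float(-np.sum(p * np.log2(p))),
    }

image = np.random.randint(0, 256, size=(128, 128))   # stand-in B-mode frame
mask = np.zeros_like(image)
mask[40:80, 50:90] = 1                               # stand-in tumor ROI
print(first_order_features(image, mask))
```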

Author(s):  
Fenxiao Chen ◽  
Yun-Cheng Wang ◽  
Bin Wang ◽  
C.-C. Jay Kuo

Abstract Research on graph representation learning has received great attention in recent years, since most data in real-world applications come in the form of graphs. High-dimensional graph data are often in irregular forms and are more difficult to analyze than image/video/audio data defined on regular lattices. Various graph embedding techniques have been developed to convert raw graph data into a low-dimensional vector representation while preserving the intrinsic graph properties. In this review, we first explain the graph embedding task and its challenges. Next, we review a wide range of graph embedding techniques with insights. Then, we evaluate several state-of-the-art methods against small and large data sets and compare their performance. Finally, potential applications and future directions are presented.
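
To make the embedding task concrete, here is a hedged sketch (not from the survey) of one classical technique, a Laplacian-eigenmap-style spectral embedding that maps each node of a small graph to a low-dimensional vector; the toy edge list and the 2-D target dimension are assumptions.

```python
# Illustrative spectral (Laplacian eigenmap) node embedding on a toy graph.
import numpy as np

edges = [(0, 1), (1, 2), (2, 0), (2, 3), (3, 4), (4, 5), (5, 3)]   # toy graph
n = 6
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0                 # undirected adjacency matrix

D = np.diag(A.sum(axis=1))
L = D - A                                   # unnormalized graph Laplacian
eigvals, eigvecs = np.linalg.eigh(L)        # eigenvectors sorted by eigenvalue
embedding = eigvecs[:, 1:3]                 # skip the trivial constant eigenvector
print(embedding)                            # one 2-D vector per node
```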


SLEEP ◽  
2020 ◽  
Vol 43 (Supplement_1) ◽  
pp. A24-A26
Author(s):  
J Hammarlund ◽  
R Anafi

Abstract
Introduction: We recently used unsupervised machine learning to order genome-scale data along a circadian cycle. CYCLOPS (Anafi et al., PNAS 2017) encodes high-dimensional genomic data onto an ellipse and offers the potential to identify circadian patterns in large data sets. This approach requires many samples spanning a wide range of circadian phases. Individual data sets often lack sufficient samples. Composite expression repositories vastly increase the available data. However, these agglomerated data sets also introduce technical (e.g., processing site) and biological (e.g., age or disease) confounders that may hamper circadian ordering.
Methods: Using the Flux machine learning library, we expanded the CYCLOPS network. We incorporated additional encoding and decoding layers that model the influence of labeled confounding variables. These layers feed into a fully connected autoencoder with a circular bottleneck that encodes the estimated phase of each sample. The expanded network simultaneously estimates the influence of confounding variables along with circadian phase. We compared the performance of the original and expanded networks using both real and simulated expression data. In a first test, we used time-labeled data from a single center describing human cortical samples obtained at autopsy. To generate a second, idealized processing center, we introduced gene-specific biases in expression along with a bias in sample collection time. In a second test, we combined human lung biopsy data from two medical centers.
Results: The performance of the original CYCLOPS network degraded with the introduction of increasing non-circadian confounds. The expanded network was able to assess circadian phase more accurately over a wider range of confounding influences.
Conclusion: The addition of labeled confounding variables to the network architecture improves circadian data ordering. The expanded network should facilitate the application of CYCLOPS to multi-center data and expand the data available for circadian analysis.
Support: This work was supported by the National Cancer Institute (1R01CA227485-01).
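
The following is a hedged sketch of the kind of architecture described above: a fully connected autoencoder with a circular bottleneck plus a one-hot confounder input. It is written in PyTorch for illustration (the authors used the Flux library), and all layer sizes, names, and the toy data are assumptions.

```python
# Sketch of a CYCLOPS-style autoencoder with a circular bottleneck and a
# confounder input; not the authors' code.
import torch
import torch.nn as nn

class CircularAutoencoder(nn.Module):
    def __init__(self, n_genes, n_confounders):
        super().__init__()
        self.encode = nn.Linear(n_genes + n_confounders, 2)   # project to 2-D
        self.decode = nn.Linear(2 + n_confounders, n_genes)   # reconstruct expression

    def forward(self, x, confounder_onehot):
        z = self.encode(torch.cat([x, confounder_onehot], dim=1))
        z = z / z.norm(dim=1, keepdim=True)                   # constrain to the unit circle
        x_hat = self.decode(torch.cat([z, confounder_onehot], dim=1))
        return x_hat, z

model = CircularAutoencoder(n_genes=5000, n_confounders=2)
x = torch.randn(8, 5000)                                      # toy expression matrix, 8 samples
site = torch.nn.functional.one_hot(torch.tensor([0, 0, 1, 1, 0, 1, 0, 1]), 2).float()
x_hat, z = model(x, site)
phase = torch.atan2(z[:, 1], z[:, 0])                         # estimated circadian phase
# training would minimize MSE(x_hat, x), e.g. with torch.optim.Adam
```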


2006 ◽  
Vol 2 (14) ◽  
pp. 592-592
Author(s):  
Paresh Prema ◽  
Nicholas A. Walton ◽  
Richard G. McMahon

Observational astronomy is entering an exciting new era, with large surveys delivering deep multi-wavelength data over a wide range of the electromagnetic spectrum. The last ten years have seen a growth in the study of high-redshift galaxies discovered with the method pioneered by Steidel et al. (1995) to identify galaxies at z > 1. The technique is designed to take advantage of the multi-wavelength data now available to astronomers, which can extend from X-rays to radio wavelengths. It is fast becoming a useful way to study large samples of objects at these high redshifts, and we are currently designing and implementing an automated technique to study such samples. However, large surveys produce data sets that have now reached terabytes in size (e.g., the Sloan Digital Sky Survey, <http://www.sdss.org>) and will reach petabytes over the next 10 yr (e.g., LSST, <http://www.lsst.org>). The Virtual Observatory is now providing a means to deal with this issue, and users are able to access many data sets in a quicker, more useful form.
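
As a hedged illustration of the selection method referred to above (not the authors' pipeline), the snippet below applies a simple Lyman-break-style colour cut to a photometric catalogue. The cut values are purely illustrative, roughly in the spirit of U-dropout selections, and the catalogue columns are hypothetical.

```python
# Toy colour-cut selection of candidate high-redshift "dropout" galaxies.
import numpy as np

def select_dropouts(u, g, r):
    """Return a boolean mask of candidate u-dropout galaxies (illustrative cuts)."""
    u_g = u - g                      # strong break blueward of the Lyman limit
    g_r = g - r                      # relatively flat colour redward of it
    return (u_g > 1.5) & (g_r < 1.2) & (u_g > 1.5 * g_r + 0.75)

mags = np.random.uniform(20, 26, size=(1000, 3))      # toy u, g, r magnitudes
mask = select_dropouts(mags[:, 0], mags[:, 1], mags[:, 2])
print(f"{mask.sum()} candidate dropouts out of {len(mask)}")
```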


2014 ◽  
Author(s):  
Hua Chen ◽  
Jody Hey ◽  
Montgomery Slatkin

Recent positive selection can increase the frequency of an advantageous mutant rapidly enough that a relatively long ancestral haplotype remains intact around it. We present a hidden Markov model (HMM) to identify such haplotype structures. With the HMM-identified haplotype structures, a population genetic model for the extent of ancestral haplotypes is then adopted to infer the selection intensity and the allele age. Simulations show that this method can detect selection under a wide range of conditions and has higher power than the existing frequency-spectrum-based method. In addition, it provides good estimates of the selection coefficients and allele ages for strong selection. The method analyzes large data sets in a reasonable amount of running time. This method is applied to HapMap III data for a genome scan, and identifies a list of candidate regions putatively under recent positive selection. It is also applied to several genes known to be under recent positive selection, including the LCT, KITLG, and TYRP1 genes in Northern Europeans and OCA2 in East Asians, to estimate their allele ages and selection coefficients.
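
To illustrate the kind of decoding such an HMM performs (this is a hedged toy, not the authors' model), the sketch below runs Viterbi over a two-state chain, "inside the ancestral haplotype" versus "background", with per-SNP match/mismatch observations; all probabilities and the observation vector are assumptions.

```python
# Toy two-state HMM decoded with the Viterbi algorithm.
import numpy as np

def viterbi(obs, log_start, log_trans, log_emit):
    """obs: array of 0/1 (mismatch/match); returns the most likely state path."""
    n, k = len(obs), log_start.shape[0]
    dp = np.zeros((n, k))
    ptr = np.zeros((n, k), dtype=int)
    dp[0] = log_start + log_emit[:, obs[0]]
    for t in range(1, n):
        scores = dp[t - 1][:, None] + log_trans        # k x k: previous state -> current state
        ptr[t] = scores.argmax(axis=0)
        dp[t] = scores.max(axis=0) + log_emit[:, obs[t]]
    path = np.zeros(n, dtype=int)
    path[-1] = dp[-1].argmax()
    for t in range(n - 2, -1, -1):
        path[t] = ptr[t + 1, path[t + 1]]
    return path

# state 0 = ancestral haplotype (matches likely), state 1 = background
log_start = np.log([0.5, 0.5])
log_trans = np.log([[0.95, 0.05], [0.01, 0.99]])
log_emit  = np.log([[0.05, 0.95], [0.50, 0.50]])       # P(obs | state)
obs = np.array([1, 1, 1, 0, 1, 1, 0, 0, 1, 0, 0, 0])
print(viterbi(obs, log_start, log_trans, log_emit))
```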


2000 ◽  
Vol 5 ◽  
pp. 39-52 ◽  
Author(s):  
Zoran Constantinescu

This paper overviews some aspects of using different levels of accuracy and complexity for the visualization of large data sets. The current status of the volume of data sets that can be generated is presented, together with some of the inherent problems due to such large data volumes, visualization requirements, and the display limitations of existing graphics hardware. Some methods for selecting, generating, and implementing different levels of detail are presented.
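
As a hedged, generic illustration of the level-of-detail idea (not a method from the paper), the sketch below subsamples a point data set more aggressively as its distance from the viewer grows; the distance thresholds and point budget are assumptions.

```python
# Minimal level-of-detail sketch: distance-dependent subsampling of a point cloud.
import numpy as np

def lod_subsample(points, camera_pos, near=1.0, far=100.0, max_points=100_000):
    dist = np.linalg.norm(points.mean(axis=0) - camera_pos)
    # linearly reduce the point budget between the near and far distances
    frac = np.clip((far - dist) / (far - near), 0.05, 1.0)
    n_keep = max(1, int(min(max_points, frac * len(points))))
    idx = np.random.choice(len(points), size=n_keep, replace=False)
    return points[idx]

cloud = np.random.rand(1_000_000, 3) * 50
coarse = lod_subsample(cloud, camera_pos=np.array([200.0, 0.0, 0.0]))
print(coarse.shape)
```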


2019 ◽  
Vol 97 (Supplement_3) ◽  
pp. 52-53
Author(s):  
Ignacy Misztal

Abstract Early application of genomic selection relied on SNP estimation with phenotypes or de-regressed proofs (DRP). Chips of 50k SNP seemed sufficient. The estimated breeding value was an index combining parent average with a deduction to eliminate double counting. Use of SNP selection or weighting increased accuracy with small data sets but gave little or no gain with large data sets. Use of DRP with female information required ad hoc modifications. As BLUP is biased by genomic selection, use of DRP under genomic selection required adjustments. Efforts to include potentially causative SNP derived from sequence analysis showed limited or no gain. Genomic selection was greatly simplified by single-step GBLUP (ssGBLUP), because the procedure automatically creates the index, can use any combination of male and female genotypes, and accounts for preselection. ssGBLUP requires careful scaling for compatibility between pedigree and genomic relationships to avoid biases, especially under strong selection. Large-data computations in ssGBLUP were solved by exploiting the limited dimensionality of SNP data due to limited effective population size. With such dimensionality ranging from about 4k in chickens to about 15k in Holsteins, the inverse of the GRM can be created directly (e.g., by the APY algorithm) at a linear cost. Due to its simplicity and accuracy, ssGBLUP is routinely used for genomic selection by major companies in chickens, pigs, and beef cattle. ssGBLUP can be used to derive SNP effects for indirect prediction and for GWAS, including computation of P-values. An alternative single-step method, called ssBR, exists that uses SNP effects instead of the GRM. As BLUP is affected by preselection, there is a need for new validation procedures unaffected by selection and for parameter estimation that accounts for all the genomic data used in selection. Another issue is reduced variances due to the Bulmer effect.
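
For concreteness, here is a hedged sketch (not from the abstract) of how a genomic relationship matrix of the kind used in ssGBLUP is commonly constructed, following VanRaden's first method; the toy genotype matrix is an assumption.

```python
# VanRaden method 1: genomic relationship matrix G from a 0/1/2 genotype matrix M.
import numpy as np

def vanraden_grm(M):
    """M: (n_animals, n_snps) counts of the reference allele (0, 1, 2)."""
    p = M.mean(axis=0) / 2.0                  # allele frequencies per SNP
    Z = M - 2.0 * p                           # center by twice the allele frequency
    denom = 2.0 * np.sum(p * (1.0 - p))       # scaling so G is analogous to A
    return Z @ Z.T / denom

M = np.random.randint(0, 3, size=(10, 500)).astype(float)
G = vanraden_grm(M)
print(G.shape, np.round(np.diag(G).mean(), 2))
```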


2016 ◽  
Vol 9 (2) ◽  
pp. 383-407 ◽  
Author(s):  
S. Hassinen ◽  
D. Balis ◽  
H. Bauer ◽  
M. Begoin ◽  
A. Delcloo ◽  
...  

Abstract. The three Global Ozone Monitoring Experiment-2 (GOME-2) instruments will provide unique and long data sets for atmospheric research and applications. The complete time period will be 2007–2022, covering the period of ozone depletion as well as the beginning of ozone layer recovery. Besides ozone chemistry, the GOME-2 products are important, e.g., for air quality studies, climate modelling, policy monitoring, and hazard warnings. The heritage of GOME-2 lies in the ERS/GOME and Envisat/SCIAMACHY instruments. The current Level 2 (L2) data cover a wide range of products, such as ozone and minor trace gas columns (NO2, BrO, HCHO, H2O, SO2), vertical ozone profiles at high and low spatial resolution, absorbing aerosol indices, a surface Lambertian-equivalent reflectivity database, clear-sky and cloud-corrected UV indices, surface UV fields with different weightings, and photolysis rates. The Satellite Application Facility on Ozone and Atmospheric Chemistry Monitoring (O3M SAF) processes and disseminates data 24/7. Data quality is ensured by detailed review processes for the algorithms, validation of the products, and continuous quality monitoring of the products and processing. This paper provides an overview of the O3M SAF project background, current status, and future plans for the utilisation of the GOME-2 data. An important focus is the provision of summaries of the GOME-2 products, including product principles and validation examples together with sample images. Furthermore, this paper collects references to the detailed product algorithm and validation papers.


MRS Bulletin ◽  
2009 ◽  
Vol 34 (10) ◽  
pp. 717-724 ◽  
Author(s):  
David N. Seidman ◽  
Krystyna Stiller

Abstract Atom-probe tomography (APT) is in the midst of a dynamic renaissance as a result of the development of well-engineered commercial instruments that are robust, ergonomic, and capable of collecting large data sets, hundreds of millions of atoms, in short time periods compared to their predecessor instruments. An APT setup involves a field-ion microscope coupled directly to a special time-of-flight (TOF) mass spectrometer that permits one to determine the mass-to-charge states of individual field-evaporated ions plus their x-, y-, and z-coordinates in a specimen in direct space with subnanoscale resolution. The three-dimensional (3D) data sets acquired are analyzed using increasingly sophisticated software programs running on high-end workstations, which permit one to handle continually larger data sets. APT has the unique ability to dissect a lattice, with subnanometer-scale spatial resolution, using either voltage or laser pulses, on an atom-by-atom and atomic plane-by-plane basis, and to reconstruct it in 3D with the chemical identity of each detected atom identified by TOF mass spectrometry. Employing pico- or femtosecond laser pulses, from visible (green or blue) to ultraviolet light, makes the analysis of metallic, semiconducting, ceramic, and organic materials practical, with different degrees of success. The utilization of dual-beam focused ion-beam microscopy for the preparation of microtip specimens from multilayer and surface films and semiconductor devices, and for producing site-specific specimens, greatly extends the capabilities of APT to a wider range of scientific and engineering problems than could previously be studied, for a wide range of materials: metals, semiconductors, ceramics, biominerals, and organic materials.
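
As a hedged illustration of the time-of-flight identification step mentioned above (not taken from the article), the snippet below evaluates the standard relation m/n = 2eV(t/L)^2 to convert a flight time into a mass-to-charge ratio; the voltage, flight length, and flight time are toy values.

```python
# Estimating mass-to-charge state from time of flight in a TOF mass spectrometer.
E_CHARGE = 1.602176634e-19      # elementary charge, C
AMU = 1.66053906660e-27         # atomic mass unit, kg

def mass_to_charge_amu(voltage_v, flight_time_s, flight_length_m):
    m_over_n = 2.0 * E_CHARGE * voltage_v * (flight_time_s / flight_length_m) ** 2
    return m_over_n / AMU       # in atomic mass units per charge state

# toy numbers: 5 kV effective voltage, 1 m flight path, ~5.3 microsecond flight time
print(round(mass_to_charge_amu(5e3, 5.3e-6, 1.0), 1))   # ~27.1, i.e. roughly 27 Da per charge
```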


2020 ◽  
Vol 98 (4) ◽  
Author(s):  
Ignacy Misztal ◽  
Daniela Lourenco ◽  
Andres Legarra

Abstract Early application of genomic selection relied on SNP estimation with phenotypes or de-regressed proofs (DRP). Chips of 50k SNP seemed sufficient for an accurate estimation of SNP effects. Genomic estimated breeding values (GEBV) were composed of an index with parent average, direct genomic value, and a deduction of a parental index to eliminate double counting. Use of SNP selection or weighting increased accuracy with small data sets but had minimal to no impact with large data sets. Efforts to include potentially causative SNP derived from sequence data or high-density chips showed limited or no gain in accuracy. After the implementation of genomic selection, EBV by BLUP became biased because of genomic preselection; DRP computed from EBV required adjustments, and the creation of DRP for females is difficult and subject to double counting. Genomic selection was greatly simplified by single-step genomic BLUP (ssGBLUP). This method, based on combining genomic and pedigree relationships, automatically creates an index with all sources of information, can use any combination of male and female genotypes, and accounts for preselection. To avoid biases, especially under strong selection, ssGBLUP requires that pedigree and genomic relationships are compatible. Because the inversion of the genomic relationship matrix (G) becomes costly with more than 100k genotyped animals, large-data computations in ssGBLUP were solved by exploiting the limited dimensionality of genomic data due to limited effective population size. With such dimensionality ranging from 4k in chickens to about 15k in cattle, the inverse of G can be created directly (e.g., by the algorithm for proven and young) at a linear cost. Due to its simplicity and accuracy, ssGBLUP is routinely used for genomic selection by the major chicken, pig, and beef industries. Single-step can be used to derive SNP effects for indirect prediction and for genome-wide association studies, including computation of P-values. Alternative single-step formulations exist that use SNP effects for genotyped or for all animals. Although genomics is the new standard in breeding and genetics, some problems still need to be solved. These include new validation procedures that are unaffected by selection, parameter estimation that accounts for all the genomic data used in selection, and strategies to address the reduction in genetic variances after genomic selection is implemented.
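
As a hedged numerical sketch of the "algorithm for proven and young" (APY) idea mentioned above (not the authors' implementation), the snippet below builds an approximate inverse of G from a core subset of genotyped animals; the core size and the toy G are assumptions.

```python
# APY-style approximate inverse of G, with cost linear in the number of noncore animals.
import numpy as np

def apy_ginverse(G, n_core):
    """Assumes animals are ordered with the n_core core animals first."""
    Gcc_inv = np.linalg.inv(G[:n_core, :n_core])
    Gnc = G[n_core:, :n_core]
    # diagonal residual variances for noncore animals (Schur-complement diagonal)
    m = np.diag(G)[n_core:] - np.einsum('ic,cd,id->i', Gnc, Gcc_inv, Gnc)
    W = np.vstack([-Gcc_inv @ Gnc.T, np.eye(G.shape[0] - n_core)])
    Ginv = np.zeros_like(G)
    Ginv[:n_core, :n_core] = Gcc_inv
    Ginv += W @ np.diag(1.0 / m) @ W.T
    return Ginv

# toy G from random genotypes (VanRaden scaling), with a small ridge for stability
M = np.random.randint(0, 3, size=(50, 1000)).astype(float)
p = M.mean(axis=0) / 2.0
Z = M - 2.0 * p
G = Z @ Z.T / (2.0 * np.sum(p * (1.0 - p))) + 0.01 * np.eye(50)
Ginv_apy = apy_ginverse(G, n_core=10)
print(Ginv_apy.shape)
```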


2021 ◽  
Vol 62 (12) ◽  
Author(s):  
Richard Miles ◽  
Arthur Dogariu ◽  
Laura Dogariu

Abstract Modern “non-intrusive” optical methods are providing revolutionary capabilities for diagnostics of hypersonic flow fields. They generate accurate information on the performance of ground test facilities and provide local time accurate measurements of near-wall and off-body flow fields surrounding hypersonic test articles. They can follow the true molecular motion of the flow and detect nonequilibrium states and gas mixtures. They can be used to capture a wide range of turbulent scales and can produce highly accurate velocity, temperature and density measurements as well as time-frozen images that provide intuitive understanding of flow phenomena. Recent review articles address many of these methods and their applications. The methods highlighted in this review are those that have been enabled or greatly improved by new, versatile laser systems, particularly including kHz rate femtosecond lasers and MHz rate pulse burst lasers. Although these methods can be applied to combusting environments, the focus of this review is on external high Mach number flows surrounding test articles and wind tunnel core flow properties. The high repetition rates enable rapid time evolving flows to be analyzed and enable the collection of large data sets necessary for statistical analysis. Future capabilities based on the use of atomic vapor filters and on frequency tunable, injection locked MHz rate lasers are promising.

