feature preservation
Recently Published Documents


TOTAL DOCUMENTS

59
(FIVE YEARS 6)

H-INDEX

9
(FIVE YEARS 0)

2021 ◽  
Author(s):  
Christine Park ◽  
Hyeon Ki Jeong ◽  
Ricardo Henao ◽  
Meenal K. Kheterpal

BACKGROUND: De-identifying facial images is critical for protecting patient anonymity in an era of increasingly capable tools for automatic image analysis in dermatology.

OBJECTIVE: To review the current literature on automatic facial de-identification algorithms.

METHODS: We conducted a systematic search using a combination of headings and keywords covering the concepts of facial de-identification and privacy preservation. The MEDLINE (via PubMed), Embase (via Elsevier), and Web of Science (via Clarivate) databases were queried from inception to 5/1/2021. Studies with ineligible designs or outcomes were excluded during screening and review.

RESULTS: A total of 18 studies reporting various facial de-identification methodologies were included in the final review. Each study's methods were rated individually for their utility in dermatology use cases with respect to skin color/pigmentation and texture preservation, data utility, and human detection. Most notable studies in the literature address feature preservation while sacrificing skin color and texture.

CONCLUSIONS: Existing facial de-identification algorithms are sparse and inadequate for preserving both facial features and skin pigmentation/texture quality in facial photographs. A novel approach is needed to ensure greater patient anonymity while increasing data access for automated image analysis in dermatology, for improved patient care.



2021 ◽  
Author(s):  
Vijay S. Mahadevan ◽  
Jorge E. Guerra ◽  
Xiangmin Jiao ◽  
Paul Kuberry ◽  
Yipeng Li ◽  
...  

Abstract. Strongly coupled nonlinear phenomena such as those described by Earth System Models (ESM) are composed of multiple component models with independent mesh topologies and scalable numerical solvers. A common operation in ESM is to remap or interpolate results from one component's computational mesh to another, e.g., from the atmosphere to the ocean, during the temporal integration of the coupled system. Several remapping schemes are currently in use or available for ESM. However, a unified approach to comparing the properties of these different schemes has not been attempted previously. We present a rigorous methodology for the evaluation and intercomparison of remapping methods through an independently implemented suite of metrics that measure the ability of a method to adhere to constraints such as grid independence, monotonicity, global conservation, and local extrema or feature preservation. A comprehensive set of numerical evaluations is conducted based on a progression of scalar fields, from idealized and smooth to more general climate data with strong discontinuities and strict bounds. We examine four remapping algorithms with distinct design approaches, namely ESMF Regrid, TempestRemap, the Generalized Moving-Least-Squares (GMLS) method with post-processing filters, and the Weighted-Least-Squares Essentially Non-oscillatory Remap (WLS-ENOR) method. By repeatedly applying the high-order remapping methods to the test fields, we verify the accuracy of each scheme in terms of its observed convergence order for smooth data and determine the bounded error propagation using the challenging, realistic field data on both uniform and regionally refined meshes. In addition to retaining high-order accuracy under idealized conditions, the methods also demonstrate robust remapping performance when dealing with non-smooth data.
The traditional L2-minimization approaches used in ESMF and TempestRemap fail to maintain monotonicity, in contrast to the stable recovery achieved through the nonlinear filters used in both the meshless (GMLS) and hybrid mesh-based (WLS-ENOR) schemes. Local feature preservation analysis indicates that high-order methods perform better than low-order dissipative schemes for all test cases. The behavior of these remappers remains consistent when applied on regionally refined meshes, indicating mesh-invariant implementations. The MIRA intercomparison protocol proposed in this paper and the detailed comparison of the four algorithms demonstrate that the new schemes, namely GMLS and WLS-ENOR, are competitive with standard conservative minimization methods requiring computation of mesh intersections. The work presented in this paper provides a foundation that can be extended to include complex field definitions, realistic mesh topologies, and spectral element discretizations, thereby allowing for a more complete analysis of production-ready remapping packages.
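Two of the constraints measured by the metric suite, global conservation and monotonicity, can be sketched in a few lines. The function names and the toy 1D linear-interpolation "remap" below are illustrative assumptions, not part of the MIRA protocol or any of the four packages:

```python
import numpy as np

def conservation_error(src_vals, src_areas, tgt_vals, tgt_areas):
    # Relative change in the area-weighted integral of the field;
    # a conservative remap keeps this near machine precision.
    src_integral = np.dot(src_vals, src_areas)
    tgt_integral = np.dot(tgt_vals, tgt_areas)
    return abs(tgt_integral - src_integral) / abs(src_integral)

def monotonicity_violation(src_vals, tgt_vals):
    # Largest overshoot/undershoot of the remapped field beyond the
    # source field's global bounds (zero for a monotone remap).
    lo, hi = src_vals.min(), src_vals.max()
    return max(0.0, lo - tgt_vals.min(), tgt_vals.max() - hi)

# Toy "remap": linear interpolation between two 1D grids.
src_x = np.linspace(0.0, 1.0, 11)
tgt_x = np.linspace(0.0, 1.0, 17)
src_vals = np.sin(2 * np.pi * src_x)
tgt_vals = np.interp(tgt_x, src_x, src_vals)

# Linear interpolation cannot exceed the source bounds:
assert monotonicity_violation(src_vals, tgt_vals) == 0.0
```

A high-order remap can trade this guarantee away: overshoot near discontinuities shows up directly as a positive `monotonicity_violation`, which is the failure mode reported above for the unfiltered L2-minimization schemes.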



Symmetry ◽  
2021 ◽  
Vol 13 (3) ◽  
pp. 399
Author(s):  
Miao Gong ◽  
Zhijiang Zhang ◽  
Dan Zeng

High-precision, high-density three-dimensional point cloud models usually contain redundant data, which implies extra time and hardware costs in the subsequent data processing stage. To analyze and extract data more effectively, the point cloud must be simplified before data processing. Because point cloud simplification must be sensitive to features so that more valid information can be retained, this paper proposes a new simplification algorithm for scattered point clouds with feature preservation, which reduces the amount of data while retaining its features. First, the Delaunay neighborhood of the point cloud is constructed, and the edge points of the point cloud are extracted from its edge distribution characteristics. Second, the moving least-squares method is used to obtain the normal vectors of the point cloud and the valley and ridge points of the model. Potential feature points are then further identified and retained based on the discrete gradient idea. Finally, non-feature points are extracted. Experimental results show that our method can be applied to models with different curvatures and effectively avoids the hole phenomenon during simplification. To further improve the robustness and anti-noise ability of the method, the neighborhood of the point cloud could be extended to multiple levels, and a balance between simplification speed and accuracy must be found.
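As a rough illustration of feature-sensitive simplification (not the paper's Delaunay/moving-least-squares pipeline), the sketch below uses a local-PCA "surface variation" as a curvature proxy, keeps all high-variation points as features, and uniformly subsamples the rest. All names, the brute-force neighbor search, and the thresholds are assumptions:

```python
import numpy as np

def surface_variation(points, k=8):
    # Per-point curvature proxy: lambda_min / (sum of eigenvalues) of the
    # local covariance (Pauly's surface variation). Brute-force k-NN,
    # adequate only for small clouds.
    n = len(points)
    var = np.empty(n)
    for i in range(n):
        d = np.linalg.norm(points - points[i], axis=1)
        nbrs = points[np.argsort(d)[:k + 1]]
        w = np.linalg.eigvalsh(np.cov(nbrs.T))  # ascending order
        var[i] = w[0] / max(w.sum(), 1e-12)
    return var

def simplify(points, keep_ratio=0.5, feature_q=0.9, k=8):
    # Keep every point above the feature_q curvature quantile, then
    # uniformly subsample the flat points to reach keep_ratio overall.
    var = surface_variation(points, k)
    thresh = np.quantile(var, feature_q)
    feat = np.where(var >= thresh)[0]
    flat = np.where(var < thresh)[0]
    budget = max(int(keep_ratio * len(points)) - len(feat), 0)
    step = max(len(flat) // max(budget, 1), 1)
    kept = np.concatenate([feat, flat[::step][:budget]])
    return points[np.sort(kept)]
```

Because feature points are exempt from subsampling, sharp regions keep their density while flat regions are thinned, which is the general idea behind avoiding holes at edges and ridges.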



2021 ◽  
Vol 22 (Supplement_1) ◽  
Author(s):  
G Delso ◽  
K Suryanarayanan ◽  
JT Ortiz-Perez ◽  
S Prat ◽  
A Doltra ◽  
...  

Abstract Funding Acknowledgements Type of funding sources: None. Introduction Myocardial delayed enhancement (MDE) MRI plays an important role in the identification of several cardiac conditions, both ischemic and non-ischemic (e.g., myocarditis, IDC, amyloidosis). 3D imaging offers increased resolution, full heart coverage, and better depiction of complex pathologies, but its image quality is limited by long acquisition times. Deep learning (DL) models enable advanced reconstruction algorithms that yield regularized images in practical computation times. In this study we evaluate a novel 3D DL reconstruction designed to overcome the trade-off between reconstruction quality and acquisition time on MDE data. Methods A group of 14 subjects referred for CMR (5 F / 9 M, 59 ± 11 y.o., 78 ± 13 kg) were scanned with a 3D MDE sequence prototype: SPGR with IR preparation, fat & spatial saturation, respiratory navigator, ARC 2x, FOV 40x40cm, ST 1.4-2.4mm, matrix 280²-320², FA 20deg, BW 62.5 kHz, TE 2.1 ± 0.1ms, TI based on a CINE IR scout. All datasets were retrospectively reconstructed using a 3D DL algorithm, trained on a database of over 700 datasets to produce high-quality images with adjustable noise reduction. The images were compared with the standard 3D Cartesian reconstruction by two experienced cardiologists to identify alterations in morphology or contrast distribution. Noise was estimated as the intensity standard deviation over a blood-pool ROI. Feature preservation was estimated using the structural similarity index (SSI). Results The new method improved perceived image quality without loss of structural information or resolution (fig 1). Quantitative analysis (fig 2) confirmed these results: the average coefficient of variation in the blood was 0.08 ± 0.02 for the reference and 0.05 ± 0.02 with the new method; given a target image noise level, DL reconstruction yielded up to 10% better SSI compared with anisotropic filtering.
The clinical review did not reveal diagnostically significant alterations of structure or uptake pattern. A perceived reduction of sharpness was initially reported, but individual examination of landmarks (e.g., pulmonary and coronary arteries) confirmed that no relevant features were lost with the new reconstruction. Discussion The 3D MDE images obtained with DL reconstruction improved the trade-off between image noise (estimated by the blood-pool intensity standard deviation) and feature preservation (estimated by SSI). Consistent improvement of image quality without morphological alterations of diagnostic relevance indicates that the new method can be considered for clinical practice. The next step in the validation process will be testing robustness over a large set of cases with heterogeneous acquisition settings. Conclusion We presented a preliminary evaluation of a deep learning reconstruction method on 3D myocardial delayed enhancement data. The results show a systematic improvement of overall image quality without loss of relevant diagnostic information. Abstract Figure.
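The two quantitative measures used in this evaluation, the blood-pool coefficient of variation and a structural-similarity score, can be sketched as follows. The single-window SSIM here is a deliberate simplification of the usual locally windowed index, and all function names are illustrative assumptions:

```python
import numpy as np

def coefficient_of_variation(image, roi):
    # Noise proxy: std/mean over a nominally homogeneous region of
    # interest (here, the blood pool). `roi` is a boolean mask.
    vals = image[roi]
    return vals.std() / vals.mean()

def global_ssim(x, y, data_range=1.0):
    # SSIM computed over the whole image as a single window, using the
    # standard stabilization constants C1 and C2.
    c1 = (0.01 * data_range) ** 2
    c2 = (0.03 * data_range) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))
```

Reporting both numbers together captures the trade-off described in the abstract: denoising lowers the coefficient of variation, while the similarity score guards against the loss of genuine structure.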



IEEE Access ◽  
2021 ◽  
Vol 9 ◽  
pp. 60201-60214
Author(s):  
Qiuchen Zhu ◽  
Tran Hiep Dinh ◽  
Manh Duong Phung ◽  
Quang Phuc Ha


2020 ◽  
Vol 27 (4) ◽  
pp. 417-435 ◽  
Author(s):  
Yaqian Liang ◽  
Fazhi He ◽  
Xiantao Zeng

Large-scale 3D models consume substantial computing and storage resources. To address this challenging problem, this paper proposes a new method to obtain optimal simplified 3D mesh models with minimum approximation error. First, we propose a feature-preserving edge collapse operation to maintain feature edges, in which the collapse cost is calculated in a novel way by combining Gaussian curvature and Quadric Error Metrics (QEM). Second, we introduce the edge splitting operation into the mesh simplification process and propose a hybrid 'undo/redo' mechanism that combines the edge splitting and edge collapse operations to reduce the number of long, narrow triangles. Third, the proposed 'undo/redo' mechanism can also reduce the approximation error; however, it is impossible to manually choose the operation sequence that yields the minimum approximation error. To solve this problem, we formulate the mesh simplification process as an optimization model in which the solution space consists of the possible combinations of operation sequences and the objective is to minimize the approximation error. Finally, we propose a novel optimization algorithm, WOA-DE, which replaces the exploration phase of the original Whale Optimization Algorithm (WOA) with the mutation and crossover operations of Differential Evolution (DE) to compute the optimal simplified mesh model more efficiently. We conduct numerous experiments to test the capabilities of the proposed method, and the results show that it outperforms previous methods in terms of geometric feature preservation, triangle quality, and approximation error.
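A minimal sketch of a curvature-weighted collapse cost in the spirit described above, mixing a QEM plane quadric with a Gaussian-curvature penalty so that feature edges are collapsed last. The exact combination rule (and the `alpha` weight) is an assumption for illustration, not the paper's formula:

```python
import numpy as np

def plane_quadric(p, n):
    # Fundamental error quadric K = q q^T for the plane through point p
    # with unit normal n, where q = [n, -n.p] in homogeneous form.
    q = np.append(n, -np.dot(n, p))
    return np.outer(q, q)

def collapse_cost(v, Q, gauss_curv, alpha=1.0):
    # QEM error of placing the merged vertex at v (v_h^T Q v_h), scaled
    # up where the absolute Gaussian curvature is high, so collapses on
    # flat regions are preferred over collapses across features.
    vh = np.append(v, 1.0)
    qem = float(vh @ Q @ vh)
    return qem * (1.0 + alpha * abs(gauss_curv))
```

In a full simplifier, each vertex accumulates the quadrics of its incident triangle planes and the edge with the smallest combined cost is collapsed first; the curvature factor simply reorders that queue in favor of feature preservation.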



2020 ◽  
Vol 40 (2) ◽  
pp. 752-763 ◽  
Author(s):  
Gopinath Palanisamy ◽  
Natarajan B. Shankar ◽  
Palanisamy Ponnusamy ◽  
Varun P. Gopi



