Shearlets as feature extractor for semantic edge detection: the model-based and data-driven realm

Author(s):  
Héctor Andrade-Loarca ◽  
Gitta Kutyniok ◽  
Ozan Öktem

Semantic edge detection has recently gained a lot of attention as an image-processing task, mainly because of its wide range of real-world applications. This is based on the fact that edges in images contain most of the semantic information. Semantic edge detection involves two tasks, namely pure edge detection and edge classification. These are in fact fundamentally distinct in terms of the level of abstraction that each task requires. This fact is known as the distracted supervision paradox and limits the possible performance of a supervised model in semantic edge detection. In this work, we present a novel hybrid method based on a combination of the model-based concept of shearlets, which provides provably optimally sparse approximations of a model class of images, and the data-driven method of a suitably designed convolutional neural network. We show that it avoids the distracted supervision paradox and achieves high performance in semantic edge detection. In addition, our approach requires significantly fewer parameters than a pure data-driven approach. Finally, we present several applications such as tomographic reconstruction and show that our approach significantly outperforms previous methods, thereby also indicating the value of such hybrid methods for biomedical imaging.

2021 ◽  
Vol 1 ◽  
pp. 61-70
Author(s):  
Ilia Iuskevich ◽  
Andreas-Makoto Hein ◽  
Kahina Amokrane-Ferka ◽  
Abdelkrim Doufene ◽  
Marija Jankovic

A user experience (UX)-focused business needs to survive and plan its new product development (NPD) activities in a highly turbulent environment. The latter is a function of volatile UX and technology trends, competition, unpredictable events, and uncertainty in user needs. To address this problem, the concept of design roadmapping has been proposed in the literature. It has been argued that tools built on the idea of design roadmapping must be highly flexible and data-driven (i.e., able to receive feedback from users in an iterative manner). At the same time, a model-based approach to roadmapping has emerged, promising to achieve such flexibility. In this work, we propose to incorporate design roadmapping into model-based roadmapping and to integrate it with various user testing approaches into a single tool that supports a flexible, data-driven NPD planning process.


Author(s):  
Pierpaolo De Filippi ◽  
Simone Formentin ◽  
Sergio M. Savaresi

The design of an active stability control system for two-wheeled vehicles is a fully open problem and constitutes a challenging task due to the complexity of two-wheeled vehicle dynamics and the strong interaction between the vehicle and the driver. This paper describes and compares two different methods, a model-based and a data-driven approach, for tuning a multi-input multi-output controller that enhances safety while guaranteeing a good driving feeling. The two strategies are tested on a multibody motorcycle simulator in challenging maneuvers such as kick-back and hard braking while cornering at high speed.


Author(s):  
Mohammed A. Alam ◽  
Michael H. Azarian ◽  
Michael Osterman ◽  
Michael Pecht

This paper presents the application of model-based and data-driven approaches for prognostics and health management (PHM) of embedded planar capacitors under elevated temperature and voltage conditions. An embedded planar capacitor is a thin laminate that serves both as a power/ground plane and as a parallel-plate capacitor in a multilayered printed wiring board (PWB). These capacitors are typically used for decoupling applications and have been found to reduce the required number of surface-mount capacitors. The capacitor laminate used in this study consisted of an epoxy-barium titanate (BaTiO3) composite dielectric sandwiched between Cu layers. Three electrical parameters, capacitance, dissipation factor, and insulation resistance, were monitored in situ once every hour during testing under elevated temperature and voltage aging conditions. The failure modes observed were a sharp drop in insulation resistance and a gradual decrease in capacitance. An approach to modeling the time-to-failure associated with these failure modes as a function of stress level is presented in this paper. Model-based PHM can be used to predict the time-to-failure associated with a single failure mode, consisting of a drop in either insulation resistance or capacitance. However, failure of an embedded capacitor could occur due to either of these two failure modes and was not captured by a single model. A combined model for both failure modes could be developed, but there was a large variance in the time-to-failure data for failures caused by a sharp drop in insulation resistance. Therefore, a data-driven approach, which utilizes the trend in and correlation between the parameters to predict remaining life, was investigated to perform PHM. The data-driven approach used in this paper is the Mahalanobis distance (MD) method, which reduces a multivariate data set to a single parameter by accounting for correlations among the parameters.
The Mahalanobis distance method was successful in predicting failures resulting from a gradual decrease in capacitance. However, predicting failures resulting from a drop in insulation resistance was generally challenging due to their sudden onset. An experimental approach to addressing such sudden failures is discussed, with the aim of identifying any trends in the parameters prior to failure.
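As an illustration of the MD reduction described in this abstract, the sketch below collapses three simulated health parameters into a single distance from a healthy baseline. This is our own toy example, not the authors' implementation; all baseline statistics and parameter values are invented.

```python
import numpy as np

# Toy Mahalanobis distance (MD) health indicator: reduce three monitored
# parameters (capacitance, dissipation factor, insulation resistance)
# to one distance from a healthy baseline. All numbers are invented.
rng = np.random.default_rng(0)

# Simulated healthy baseline: 200 hourly readings of the three parameters.
healthy = np.column_stack([
    rng.normal(10.0, 0.1, 200),    # capacitance (nF)
    rng.normal(0.02, 0.001, 200),  # dissipation factor
    rng.normal(1e9, 1e7, 200),     # insulation resistance (ohm)
])

mu = healthy.mean(axis=0)
sigma = healthy.std(axis=0)
z = (healthy - mu) / sigma                        # standardize each parameter
cov_inv = np.linalg.inv(np.cov(z, rowvar=False))  # captures correlations

def mahalanobis(reading):
    """Single-parameter health indicator for one multivariate reading."""
    d = (reading - mu) / sigma
    return float(np.sqrt(d @ cov_inv @ d))

baseline_md = mahalanobis(np.array([10.0, 0.02, 1e9]))  # near the baseline
degraded_md = mahalanobis(np.array([9.0, 0.02, 1e9]))   # capacitance drop
```

A threshold on the MD (e.g., relative to its healthy-baseline distribution) then flags the onset of degradation without choosing a single failure-mode model.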


2020 ◽  
Author(s):  
Daniel Bennett

We introduce an unobtrusive, computational method for measuring readiness-to-hand and task engagement during interaction. "Readiness-to-hand" is an influential concept describing fluid, intuitive tool use, with attention on the task rather than the tool; it has long been significant in HCI research, most recently via metrics of tool embodiment and immersion. We build on prior work in cognitive science which relates readiness-to-hand and task engagement to multifractality: a measure of complexity in behaviour. We conduct a replication study (N=28) and two new experiments (N=44, N=30), which show that multifractality correlates with task engagement and other features of readiness-to-hand overlooked in previous measures, including familiarity with the task. This is the first evaluation of multifractal measures of behaviour in HCI. Since multifractality occurs in a wide range of behaviours and input signals, we support future work by sharing scripts and data (https://osf.io/2hm9u/), and by introducing a new data-driven approach to parameter selection.
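Multifractality is commonly estimated with multifractal detrended fluctuation analysis (MFDFA): a wide spread of the generalized Hurst exponent h(q) across moment orders q indicates multifractal behaviour. The minimal sketch below is our own illustration of that idea, not the scripts the authors share; practical analyses also use negative q and more scales.

```python
import numpy as np

# Minimal MFDFA sketch: estimate generalized Hurst exponents h(q).
# A monofractal signal yields (nearly) the same h for every q;
# a multifractal signal yields a wide spread of h over q.
def mfdfa(x, scales, qs):
    y = np.cumsum(x - x.mean())  # signal profile
    hs = []
    for q in qs:
        logF = []
        for s in scales:
            n = len(y) // s
            segs = y[: n * s].reshape(n, s)
            t = np.arange(s)
            coef = np.polyfit(t, segs.T, 1)              # linear detrend per segment
            trend = np.outer(coef[0], t) + coef[1][:, None]
            rms = np.sqrt(((segs - trend) ** 2).mean(axis=1))
            logF.append(np.log((rms ** q).mean()) / q)   # q-th order fluctuation
        # slope of log F_q(s) vs log s gives h(q)
        hs.append(np.polyfit(np.log(scales), logF, 1)[0])
    return np.array(hs)

rng = np.random.default_rng(0)
noise = rng.standard_normal(5000)
h = mfdfa(noise, scales=[16, 32, 64, 128, 256], qs=[1.0, 2.0, 3.0])
# Gaussian white noise is monofractal: every h(q) should sit near 0.5.
```

Behavioural time series with genuine multifractality would instead show h(q) decreasing noticeably as q grows.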


2018 ◽  
Vol 10 (02) ◽  
pp. 1840001 ◽  
Author(s):  
Catherine M. Sweeney-Reed ◽  
Slawomir J. Nasuto ◽  
Marcus F. Vieira ◽  
Adriano O. Andrade

Empirical mode decomposition (EMD) provides an adaptive, data-driven approach to time–frequency analysis, yielding components from which local amplitude, phase, and frequency content can be derived. Since its initial introduction to electroencephalographic (EEG) data analysis, EMD has been extended to enable phase synchrony analysis and multivariate data processing. EMD has been integrated into a wide range of applications, with emphasis on denoising and classification. We review the methodological developments, providing an overview of the diverse implementations, ranging from artifact removal to seizure detection and brain–computer interfaces. Finally, we discuss limitations, challenges, and opportunities associated with EMD for EEG analysis.
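At the heart of EMD is the sifting procedure: repeatedly subtract the mean of the upper and lower extrema envelopes until an intrinsic mode function (IMF) remains. The sketch below is a bare-bones illustration of one sifting pass on a synthetic two-tone signal; production EMD implementations add boundary handling, stopping criteria, and residue extraction.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def sift_imf(x, t, n_sift=10):
    """Extract one intrinsic mode function (IMF) by sifting: repeatedly
    subtract the mean of the upper and lower extrema envelopes."""
    h = x.copy()
    for _ in range(n_sift):
        # locate local maxima and minima
        maxima = np.where((h[1:-1] > h[:-2]) & (h[1:-1] > h[2:]))[0] + 1
        minima = np.where((h[1:-1] < h[:-2]) & (h[1:-1] < h[2:]))[0] + 1
        if maxima.size < 4 or minima.size < 4:
            break
        # cubic-spline envelopes through the extrema
        upper = CubicSpline(t[maxima], h[maxima])(t)
        lower = CubicSpline(t[minima], h[minima])(t)
        h = h - (upper + lower) / 2.0
    return h

t = np.linspace(0.0, 1.0, 1000)
x = np.sin(2 * np.pi * 30 * t) + np.sin(2 * np.pi * 3 * t)  # fast + slow tone
imf1 = sift_imf(x, t)  # first IMF: approximately the 30 Hz component
```

The data-driven character noted above is visible here: no basis or band edges are chosen in advance; the extrema of the signal itself define the decomposition.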


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Peter Balogh ◽  
John Gounley ◽  
Sayan Roychowdhury ◽  
Amanda Randles

In order to understand the effect of cellular-level features on the transport of circulating cancer cells in the microcirculation, there has been an increasing reliance on high-resolution in silico models. Accurate simulation of cancer cells flowing with blood cells requires resolving cellular-scale interactions in 3D, which is a significant computational undertaking warranting a cancer cell model that is both computationally efficient and sufficiently complex to capture relevant behavior. Given that the characteristics of metastatic spread are known to depend on cancer type, it is crucial to account for mechanistic behavior representative of a specific cancer's cells. To address this gap, in the present work we develop and validate a means by which an efficient and popular membrane-model-based approach can be used to simulate deformable cancer cells and reproduce experimental data from specific cell lines. Here, cells are modeled using the immersed boundary method (IBM) within a lattice Boltzmann method (LBM) fluid solver, and the finite element method (FEM) is used to model cell membrane resistance to deformation. Through detailed comparisons with experiments, we (i) validate this model to represent cancer cells undergoing large deformation, (ii) outline a systematic approach to parameterize different cell lines to optimally fit experimental data over a range of deformations, and (iii) provide new insight into nucleated vs. non-nucleated cell models and their ability to match experiments. While many works have used the membrane-model-based method employed here to model generic cancer cells, no quantitative comparisons with experiments exist in the literature for specific cell lines undergoing large deformation. Here, we describe a phenomenological, data-driven approach that can not only yield good agreement for large deformations, but also explicitly detail how it can be used to represent different cancer cell lines.
This model is readily incorporated into cell-resolved hemodynamic transport simulations, and thus offers significant potential to complement experiments towards providing new insights into various aspects of cancer progression.
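A small, self-contained flavor of the IBM coupling named in this abstract (our illustration, not the authors' solver): membrane forces at Lagrangian nodes are spread onto the Eulerian fluid grid through a regularized delta function, commonly Peskin's 4-point kernel.

```python
import numpy as np

# Peskin's 4-point regularized delta function, the standard kernel used in
# the immersed boundary method (IBM) to exchange forces and velocities
# between Lagrangian membrane nodes and the Eulerian fluid grid.
def peskin_delta(r):
    r = abs(r)
    if r < 1.0:
        return (3.0 - 2.0 * r + np.sqrt(1.0 + 4.0 * r - 4.0 * r * r)) / 8.0
    if r < 2.0:
        return (5.0 - 2.0 * r - np.sqrt(-7.0 + 12.0 * r - 4.0 * r * r)) / 8.0
    return 0.0

def spread_force(f_node, x_node, grid_x):
    """Spread a scalar force at Lagrangian position x_node onto a 1D
    Eulerian grid with unit spacing; total force is conserved."""
    return np.array([f_node * peskin_delta(x - x_node) for x in grid_x])

grid = np.arange(0.0, 10.0)
forces = spread_force(1.0, 4.3, grid)
# The kernel is a partition of unity, so the spread values sum to the
# original nodal force regardless of where the node sits between grid points.
```

The same kernel is used in reverse to interpolate fluid velocity back to the membrane nodes, which is what couples the FEM membrane to the LBM fluid in solvers of this kind.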


Author(s):  
Mika P. Malila ◽  
Patrik Bohlinger ◽  
Susanne Støle-Hentschel ◽  
Øyvind Breivik ◽  
Gaute Hope ◽  
...  

We propose a methodology for despiking ocean surface wave time series based on a Bayesian approach to data-driven learning known as Gaussian process (GP) regression. We show that GP regression can be used both for robust detection of erroneous measurements and for interpolation over missing values, while also providing a measure of the uncertainty associated with these operations. In comparison with a recent dynamical phase-space-based despiking method, our data-driven approach is here shown to lead to improved wave signal correlation and spectral tail consistency, although at a significant increase in computational cost. Our results suggest that GP regression is thus especially suited for offline quality control requiring robust noise detection and replacement, where the subsequent analysis of the despiked data is sensitive to the accidental removal of extreme or rare events such as abnormal or rogue waves. We assess our methodology on measurements from an array of four co-located 5-Hz laser altimeters during a much-studied storm event in the North Sea covering a wide range of sea states.
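The despiking idea can be sketched compactly: fit a GP to the elevation series, flag samples with large robust residuals as spikes, and replace them with the GP posterior mean. The example below is our own minimal illustration with an invented signal and fixed hyperparameters; the actual method additionally optimizes hyperparameters, exploits the predictive uncertainty, and handles real altimeter records.

```python
import numpy as np

# Minimal GP-regression despiking sketch on a synthetic wave signal.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 10.0, 200)
truth = np.sin(2 * np.pi * 0.3 * t)
eta = truth + 0.05 * rng.standard_normal(t.size)
eta[50] += 5.0  # inject an artificial spike

def rbf(a, b, length=1.0):
    """Squared-exponential covariance."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length**2)

noise_var = 0.01  # assumed measurement-noise variance (fixed, not learned)
K = rbf(t, t)
# GP posterior mean at the observation times: K (K + sigma^2 I)^-1 y
mean = K @ np.linalg.solve(K + noise_var * np.eye(t.size), eta)

resid = eta - mean
mad = np.median(np.abs(resid - np.median(resid)))
spikes = np.abs(resid) > 4.0 * mad / 0.6745  # robust (MAD-based) outlier flag
eta_clean = np.where(spikes, mean, eta)      # interpolate over flagged samples
```

In practice one would iterate (refit after removing flagged points) so the spike does not contaminate the posterior mean used for replacement.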


2020 ◽  
Vol 6 (4) ◽  
pp. 1269-1282 ◽  
Author(s):  
Hossein S. Ghadikolaei ◽  
Hadi Ghauch ◽  
Gabor Fodor ◽  
Mikael Skoglund ◽  
Carlo Fischione
