Additive and Multiplicative Piecewise-Smooth Segmentation Models in a Functional Minimization Approach

2015 · pp. 339-356

2021 · Vol 22 (1)
Author(s):  
Arnaud Liehrmann
Guillem Rigaill
Toby Dylan Hocking

Abstract

Background: Histone modification constitutes a basic mechanism for the genetic regulation of gene expression. In the early 2000s, a powerful technique emerged that couples chromatin immunoprecipitation with high-throughput sequencing (ChIP-seq). This technique provides a direct survey of the DNA regions associated with these modifications. In order to realize the full potential of this technique, increasingly sophisticated statistical algorithms have been developed or adapted to analyze the massive amounts of data it generates. Many of these algorithms were built around natural assumptions, such as a Poisson distribution to model the noise in the count data. In this work we start from these natural assumptions and show that it is possible to improve upon them.

Results: Our comparisons on seven reference datasets of histone modifications (H3K36me3 and H3K4me3) suggest that the natural assumptions are not always realistic under application conditions. We show that an unconstrained multiple changepoint detection model with alternative noise assumptions and supervised learning of the penalty parameter accounts for the over-dispersion exhibited by the count data. These models, implemented in the R package CROCS (https://github.com/aLiehrmann/CROCS), detect peaks more accurately than algorithms that rely on the natural assumptions.

Conclusion: The segmentation models we propose can benefit researchers in the field of epigenetics by providing new, high-quality peak prediction tracks for the H3K36me3 and H3K4me3 histone modifications.
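
The model class referenced here can be sketched in a few lines. Below is a minimal, illustrative Python implementation of penalized multiple changepoint detection on count data under a Poisson segment cost, the kind of "natural assumption" the paper starts from. It is not the CROCS package (which is in R); the function names, penalty value, and toy data are assumptions chosen for illustration only.

```python
import numpy as np

def poisson_segment_cost(cumsum, i, j):
    """Negative Poisson log-likelihood (up to constants) of counts y[i:j]
    with a single mean parameter fitted to the segment."""
    length = j - i
    s = cumsum[j] - cumsum[i]          # total count on the segment
    if s == 0:
        return 0.0                      # limit of -s*log(mean) + length*mean as mean -> 0
    mean = s / length
    return -s * np.log(mean) + length * mean

def optimal_partitioning(y, penalty):
    """Exact penalized changepoint detection by optimal partitioning (O(n^2)).
    Returns segment end positions (exclusive), including the final index."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    cumsum = np.concatenate([[0.0], np.cumsum(y)])
    best = np.full(n + 1, np.inf)       # best[j] = optimal penalized cost of y[:j]
    best[0] = -penalty                  # so the first segment is not penalized
    last_cp = np.zeros(n + 1, dtype=int)
    for j in range(1, n + 1):
        for i in range(j):
            cost = best[i] + penalty + poisson_segment_cost(cumsum, i, j)
            if cost < best[j]:
                best[j] = cost
                last_cp[j] = i
    cps, j = [], n                      # backtrack the changepoints
    while j > 0:
        cps.append(j)
        j = last_cp[j]
    return sorted(cps)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # toy coverage profile: background, a peak, background
    counts = np.concatenate([rng.poisson(2, 100), rng.poisson(20, 30), rng.poisson(2, 100)])
    print(optimal_partitioning(counts, penalty=10.0))  # expect breaks roughly at 100 and 130
```

Swapping poisson_segment_cost for a Gaussian or negative-binomial cost is how alternative noise assumptions of the kind compared in the paper would enter such a sketch, and the penalty would be learned from labeled regions rather than fixed by hand.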


2021 · Vol 21 (1)
Author(s):  
Markus J. Ankenbrand
Liliia Shainberg
Michael Hock
David Lohr
Laura M. Schreiber

Abstract

Background: Image segmentation is a common task in medical imaging, e.g., for volumetry analysis in cardiac MRI. Artificial neural networks are used to automate this task with performance similar to that of manual operators. However, this performance is only achieved on the narrow tasks the networks are trained on, and it drops dramatically when the data characteristics differ from the training set properties. Moreover, neural networks are commonly considered black boxes, because it is hard to understand how they make decisions and why they fail. Therefore, it is also hard to predict whether they will generalize and work well on new data. Here we present a generic method for the interpretation of segmentation models. Sensitivity analysis is an approach in which the model input is modified in a controlled manner and the effect of these modifications on the model output is evaluated. This method yields insights into the sensitivity of the model to these alterations and, therefore, into the importance of certain image features for segmentation performance.

Results: We present an open-source Python library (misas) that facilitates the use of sensitivity analysis with arbitrary data and models. We show that this method is a suitable approach to answer practical questions regarding the use and functionality of segmentation models. We demonstrate this in two case studies on cardiac magnetic resonance imaging. The first case study explores the suitability of a published network for use on a public dataset the network has not been trained on. The second case study demonstrates how sensitivity analysis can be used to evaluate the robustness of a newly trained model.

Conclusions: Sensitivity analysis is a useful tool for deep learning developers as well as for users such as clinicians. It extends their toolbox, enabling and improving the interpretability of segmentation models. Enhancing our understanding of neural networks through sensitivity analysis also assists in decision making. Although demonstrated here only on cardiac magnetic resonance images, the approach and software are much more broadly applicable.
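
The core loop of such a sensitivity analysis is straightforward to sketch. The following Python outline is illustrative only and does not use the misas API; the rotation transform, the placeholder thresholding "model", and the Dice helper are assumptions chosen to show the idea of perturbing the input in a controlled way and scoring the resulting segmentations.

```python
import numpy as np

def dice_score(pred, truth):
    """Dice overlap between two binary masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    denom = pred.sum() + truth.sum()
    return 1.0 if denom == 0 else 2.0 * np.logical_and(pred, truth).sum() / denom

def rotate90(image, k):
    """Controlled perturbation: rotate the image by k * 90 degrees."""
    return np.rot90(image, k)

def sensitivity_curve(model, image, reference_mask, ks=(0, 1, 2, 3)):
    """Segment a series of rotated copies of the image and report how the
    Dice score against the (equally rotated) reference changes."""
    scores = {}
    for k in ks:
        pred = model(rotate90(image, k))
        scores[k * 90] = dice_score(pred, rotate90(reference_mask, k))
    return scores

if __name__ == "__main__":
    # dummy stand-ins: a synthetic image and a thresholding "model"
    image = np.zeros((64, 64)); image[20:40, 10:50] = 1.0
    reference = image > 0.5
    model = lambda img: img > 0.5       # placeholder for a trained segmentation network
    print(sensitivity_curve(model, image, reference))
```

In practice the placeholder model would be a trained network's forward pass, and the transform list would be extended to contrast, noise, cropping, or intensity perturbations, which is exactly the kind of controlled input modification the abstract describes.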


Author(s):  
S. Jelbart
K. U. Kristiansen
P. Szmolyan
M. Wechselberger

Abstract

Singular exponential nonlinearities of the form $$e^{h(x)\epsilon^{-1}}$$ with $$\epsilon > 0$$ small occur in many different applications. These terms have essential singularities at $$\epsilon = 0$$, leading to very different behaviour depending on the sign of h. In this paper, we consider two prototypical singularly perturbed oscillators with such exponential nonlinearities. We apply a suitable normalization to both systems such that the $$\epsilon \rightarrow 0$$ limit is a piecewise-smooth system. The convergence to this nonsmooth system is exponential due to the nonlinearities we study. Working on the two model systems, we use a blow-up approach to demonstrate that this exponential convergence can be harmless in some cases, while in other scenarios it can lead to further degeneracies. For our second model system, we deal with such degeneracies due to exponentially small terms by extending the space dimension, following the approach in Kristiansen (Nonlinearity 30(5): 2138-2184, 2017), and we prove, for both systems, the existence of (unique) limit cycles by perturbing away from singular cycles having desirable hyperbolicity properties.
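
A quick numerical check makes the sign dichotomy concrete. This is a minimal sketch assuming the simplest choice h(x) = x, which is not one of the paper's model systems: for h(x) < 0 the term e^{h(x)/ε} vanishes beyond all orders as ε → 0, while for h(x) > 0 it diverges, so the limit is piecewise smooth with an exponentially thin transition near h(x) = 0.

```python
import numpy as np

def exp_term(x, eps):
    """The singular nonlinearity e^{h(x)/eps} with the illustrative choice h(x) = x."""
    return np.exp(x / eps)

x = np.array([-0.5, -0.1, -0.01, 0.0, 0.01, 0.1, 0.5])
for eps in (0.1, 0.05, 0.01):
    print(f"eps = {eps}:", np.round(exp_term(x, eps), 6))
# For x < 0 the values collapse to 0 exponentially fast as eps shrinks; for x > 0 they
# diverge, so the eps -> 0 limit is a piecewise-smooth switch at x = 0.
```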

