Distributions of Hyper-Local Configuration Elements to Characterize, Compare, and Assess Landscape-Level Spatial Patterns

Entropy ◽  
2020 ◽  
Vol 22 (4) ◽  
pp. 420
Author(s):  
Tarmo K. Remmel

Even with considerable attention in recent decades, measuring and working with patterns remains a complex task due to the underlying dynamic processes that form these patterns, the influence of scales, and the many further implications stemming from their representation. This work scrutinizes binary classes mapped onto regular grids and counts the relative frequencies of all first-order configuration components and then converts these measurements into empirical probabilities of occurrence for either of the two landscape classes. The approach takes into consideration configuration explicitly and composition implicitly (in a common framework), while the construction of a frequency distribution provides a generic model of landscape structure that can be used to simulate structurally similar landscapes or to compare divergence from other landscapes. The technique is first tested on simulated data to characterize a continuum of landscapes across a range of spatial autocorrelations and relative compositions. Subsequent assessments of boundary prominence are explored, where outcomes are known a priori, to demonstrate the utility of this novel method. For a binary map on a regular grid, there are 32 possible configurations of first-order orthogonal neighbours. The goal is to develop a workflow that permits patterns to be characterized in this way and to offer an approach that identifies how relatively divergent observed patterns are, using the well-known Kullback–Leibler divergence.
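To make the counting scheme concrete, the 32-class distribution and the divergence comparison can be sketched as follows (a minimal illustration, not the author's implementation; the 5-bit encoding order focal–N–S–W–E is an assumption):

```python
import numpy as np

def config_distribution(grid):
    """Empirical distribution over the 32 focal-cell + 4-orthogonal-neighbour
    configurations of a binary map; interior cells only. Each configuration
    is encoded as a 5-bit integer (focal, N, S, W, E)."""
    g = np.asarray(grid, dtype=int)
    focal = g[1:-1, 1:-1]
    n = g[:-2, 1:-1]; s = g[2:, 1:-1]
    w = g[1:-1, :-2]; e = g[1:-1, 2:]
    codes = (focal << 4) | (n << 3) | (s << 2) | (w << 1) | e
    counts = np.bincount(codes.ravel(), minlength=32).astype(float)
    return counts / counts.sum()

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q); a small epsilon guards configuration classes that are
    empty in one of the two landscapes."""
    p = np.asarray(p) + eps; q = np.asarray(q) + eps
    p = p / p.sum(); q = q / q.sum()
    return float(np.sum(p * np.log(p / q)))
```

Two structurally similar landscapes yield a divergence near zero, while a uniform patch compared with its complement concentrates mass on codes 0 and 31 and diverges strongly.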

2015 ◽  
Author(s):  
Jonathan M. Koller ◽  
M. Jonathan Vachon ◽  
G. Larry Bretthorst ◽  
Kevin J. Black

Abstract We recently described rapid quantitative pharmacodynamic imaging, a novel method for estimating sensitivity of a biological system to a drug. We tested its accuracy in simulated biological signals with varying receptor sensitivity and varying levels of random noise, and presented initial proof-of-concept data from functional MRI (fMRI) studies in primate brain. However, the initial simulation testing used a simple iterative approach to estimate pharmacokinetic-pharmacodynamic (PKPD) parameters, an approach that was computationally efficient but returned parameters only from a small, discrete set of values chosen a priori. Here we revisit the simulation testing using a Bayesian method to estimate the PKPD parameters. This improved accuracy compared to our previous method, and noise without intentional signal was never interpreted as signal. We also reanalyze the fMRI proof-of-concept data. The success with the simulated data, and with the limited fMRI data, is a necessary first step toward further testing of rapid quantitative pharmacodynamic imaging.
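The contrast between the two estimators can be illustrated on a toy simulated signal (this is not the authors' PKPD model; the response curve, noise level, and grids below are assumptions):

```python
import numpy as np

# Toy response y(t) = 1 - exp(-k*t) with additive Gaussian noise;
# the task is to recover the rate parameter k.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 50)
k_true = 0.63
sigma = 0.05
y = 1 - np.exp(-k_true * t) + rng.normal(0.0, sigma, t.size)

def sse(k):
    return float(np.sum((y - (1 - np.exp(-k * t))) ** 2))

# Iterative approach: best value from a small, discrete a priori set.
coarse = [0.25, 0.5, 1.0, 2.0]
k_grid = min(coarse, key=sse)

# Bayesian-style estimate: posterior mean over a dense grid with a flat
# prior and Gaussian likelihood (known noise sd).
dense = np.linspace(0.05, 3.0, 600)
loglik = np.array([-sse(k) / (2 * sigma ** 2) for k in dense])
post = np.exp(loglik - loglik.max())
post /= post.sum()
k_bayes = float(np.sum(dense * post))
```

The discrete estimate can be no closer to the truth than the nearest grid value, whereas the posterior mean is unconstrained by the a priori set.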


Author(s):  
Kevin N. Otto

Abstract Among the many tasks designers must perform, evaluation of concepts based on performance criteria is fundamental. A formal evaluation of a design defines a measurement of the design: an assessment reflected by real-valued numbers. A measurement requires some basic a priori information from the designer. In particular, a base-point design is required from which the remaining designs are relatively measured. Also, a metric reference design is required to compare the deviation of each remaining design from the base-point design. Given these two reference designs, any other design can be evaluated numerically. Such engineering methods as concept selection charts, QFD, optimization, and many current research methods in engineering design use these measurement fundamentals to evaluate designs. Measurement theory provides a common framework to discuss solution evaluation within all of these methods. Further, the minimum formalization needed to make evaluations among design configurations is also demonstrated.
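The two-reference-design idea amounts to fixing an interval scale: the base point maps to 0 and the metric reference to 1, after which any design is measured relative to them (a minimal sketch of that reading, not the paper's formalism):

```python
def relative_measure(perf, base_point, metric_ref):
    """Interval-scale measurement of a raw performance value: the
    base-point design maps to 0.0, the metric reference design to 1.0,
    and every other design is placed relative to those two anchors."""
    if metric_ref == base_point:
        raise ValueError("metric reference must differ from the base point")
    return (perf - base_point) / (metric_ref - base_point)
```

For example, with a base-point mass of 0 kg and a metric reference of 10 kg, a 5 kg design measures 0.5 on this scale.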


2021 ◽  
Vol 22 (1) ◽  
Author(s):  
Camilo Broc ◽  
Therese Truong ◽  
Benoit Liquet

Abstract Background The increasing number of genome-wide association studies (GWAS) has revealed several loci that are associated with multiple distinct phenotypes, suggesting the existence of pleiotropic effects. Highlighting these cross-phenotype genetic associations could help to identify and understand common biological mechanisms underlying some diseases. Common approaches test the association between genetic variants and multiple traits at the SNP level. In this paper, we propose a novel gene- and pathway-level approach for the case where several independent GWAS on independent traits are available. The method is based on a generalization of the sparse group Partial Least Squares (sgPLS) to take into account groups of variables, and a Lasso penalization that links all independent data sets. This method, called joint-sgPLS, is able to convincingly detect signal at the variable level and at the group level. Results Our method has the advantage of proposing a globally readable model while coping with the architecture of the data. It can outperform traditional methods and provides wider insight in terms of a priori information. We compared the performance of the proposed method to other benchmark methods on simulated data and gave an example of application on real data with the aim of highlighting common susceptibility variants to breast and thyroid cancers. Conclusion The joint-sgPLS shows interesting properties for detecting a signal. As an extension of the PLS, the method is suited to data with a large number of variables. The choice of Lasso penalization copes with architectures of groups of variables and observation sets. Furthermore, although the method has been applied to a genetic study, its formulation is suited to any data with a large number of variables and a known a priori architecture in other application fields.
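Sparse group penalties of the sgPLS family act on whole groups of variables (e.g., all SNPs in a gene) at once. A standard building block, shown here as an illustration rather than the joint-sgPLS implementation itself, is the group soft-thresholding (proximal) operator:

```python
import numpy as np

def group_soft_threshold(v, lam):
    """Proximal operator of the group-lasso penalty lam * ||v||_2:
    shrinks the whole group of coefficients toward zero and zeroes it
    out entirely when the group norm falls below lam. This is what lets
    group-sparse methods drop an entire gene or pathway at once."""
    norm = np.linalg.norm(v)
    if norm <= lam:
        return np.zeros_like(v)
    return (1.0 - lam / norm) * v
```

A group with norm below the penalty is removed in one step; a strong group is only shrunk, preserving the relative weights of its variables.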


BMC Genomics ◽  
2020 ◽  
Vol 21 (S11) ◽  
Author(s):  
Adam Cornish ◽  
Shrabasti Roychoudhury ◽  
Krishna Sarma ◽  
Suravi Pramanik ◽  
Kishor Bhakat ◽  
...  

Abstract Background Single-cell sequencing enables us to better understand genetic diseases, such as cancer or autoimmune disorders, which are often affected by changes in rare cells. Currently, no existing software is aimed at identifying single nucleotide variations or micro (1–50 bp) insertions and deletions in single-cell RNA sequencing (scRNA-seq) data. Generating high-quality variant data is vital to the study of the aforementioned diseases, among others. Results In this study, we report the design and implementation of Red Panda, a novel method to accurately identify variants in scRNA-seq data. Variants were called on scRNA-seq data from human articular chondrocytes, mouse embryonic fibroblasts (MEFs), and simulated data stemming from the MEF alignments. Red Panda had the highest Positive Predictive Value at 45.0%, while other tools—FreeBayes, GATK HaplotypeCaller, GATK UnifiedGenotyper, Monovar, and Platypus—ranged from 5.8% to 41.53%. On the simulated data, Red Panda had the highest sensitivity at 72.44%. Conclusions We show that our method provides a novel and improved mechanism to identify variants in scRNA-seq data as compared to currently existing software. However, methods for identification of genomic variants using scRNA-seq data can still be improved.


2021 ◽  
Vol 11 (2) ◽  
pp. 582
Author(s):  
Zean Bu ◽  
Changku Sun ◽  
Peng Wang ◽  
Hang Dong

Calibration between multiple sensors is a fundamental procedure for data fusion. To address the problems of large errors and tedious operation, we present a novel method to conduct the calibration between light detection and ranging (LiDAR) and camera. We designed a calibration target: an arbitrary triangular pyramid with a chessboard pattern on each of its three planes. The target contains both 3D information and 2D information, which can be utilized to obtain intrinsic parameters of the camera and extrinsic parameters of the system. In the proposed method, the world coordinate system is established through the triangular pyramid. We extract the equations of the triangular pyramid planes to find the relative transformation between the two sensors. A single capture from the camera and the LiDAR is sufficient for calibration, and errors are reduced by minimizing the distance between points and planes. Accuracy can be further increased with additional captures. We carried out experiments on simulated data with varying degrees of noise and numbers of frames. Finally, the calibration results were verified on real data through incremental validation and analysis of the root mean square error (RMSE), demonstrating that our calibration method is robust and provides state-of-the-art performance.
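The point-to-plane objective at the core of such a calibration can be sketched in a few lines (a generic illustration under assumed plane parametrization n·x + d = 0, not the authors' code):

```python
import numpy as np

def point_to_plane_residuals(points, normal, d):
    """Signed distances from 3D points to the plane n.x + d = 0.
    The normal need not be unit length; distances are normalized by
    its norm. These are the residuals a calibration would minimize
    over the LiDAR-to-camera transformation."""
    n = np.asarray(normal, dtype=float)
    pts = np.asarray(points, dtype=float)
    return (pts @ n + d) / np.linalg.norm(n)

def rmse(residuals):
    """Root mean square error of the plane-fit residuals."""
    return float(np.sqrt(np.mean(np.square(residuals))))
```

In a full pipeline, the transformed LiDAR points on each chessboard face would be fed through this residual, and the RMSE minimized over the extrinsic parameters.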


Mathematics ◽  
2021 ◽  
Vol 9 (3) ◽  
pp. 222
Author(s):  
Juan C. Laria ◽  
M. Carmen Aguilera-Morillo ◽  
Enrique Álvarez ◽  
Rosa E. Lillo ◽  
Sara López-Taruella ◽  
...  

Over the last decade, regularized regression methods have offered alternatives for performing multi-marker analysis and feature selection in a whole genome context. The process of defining the list of genes that will characterize an expression profile remains unclear. It currently relies upon advanced statistics and can use an agnostic point of view or include some a priori knowledge, but overfitting remains a problem. This paper introduces a methodology to deal with the variable selection and model estimation problems in the high-dimensional setting, which can be particularly useful in the whole genome context. Results are validated using simulated data and a real dataset from a triple-negative breast cancer study.
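As a baseline for this family of methods, the lasso performs variable selection by shrinking most coefficients to exactly zero (a generic ISTA sketch, not the methodology the paper proposes; the simulated design below is an assumption):

```python
import numpy as np

def soft_threshold(x, lam):
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def lasso_ista(X, y, lam, n_iter=500):
    """Lasso via iterative soft-thresholding (ISTA):
    minimizes 0.5/n * ||y - Xb||^2 + lam * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    step = 1.0 / (np.linalg.norm(X, 2) ** 2 / n)  # 1 / Lipschitz constant
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y) / n
        b = soft_threshold(b - step * grad, step * lam)
    return b

# Simulated data: only features 0 and 5 carry signal.
rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 10))
b_true = np.zeros(10)
b_true[0], b_true[5] = 2.0, -1.5
y = X @ b_true + rng.normal(0.0, 0.5, 1000)
b_hat = lasso_ista(X, y, lam=0.2)
```

The fitted coefficients recover the two active features (with the usual lasso shrinkage bias of roughly the penalty size) while the noise features are driven to zero, which is the selection behaviour overfitting-prone whole-genome analyses rely on.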


2021 ◽  
Vol 4 (1) ◽  
pp. 251524592095492
Author(s):  
Marco Del Giudice ◽  
Steven W. Gangestad

Decisions made by researchers while analyzing data (e.g., how to measure variables, how to handle outliers) are sometimes arbitrary, without an objective justification for choosing one alternative over another. Multiverse-style methods (e.g., specification curve, vibration of effects) estimate an effect across an entire set of possible specifications to expose the impact of hidden degrees of freedom and/or obtain robust, less biased estimates of the effect of interest. However, if specifications are not truly arbitrary, multiverse-style analyses can produce misleading results, potentially hiding meaningful effects within a mass of poorly justified alternatives. So far, a key question has received scant attention: How does one decide whether alternatives are arbitrary? We offer a framework and conceptual tools for doing so. We discuss three kinds of a priori nonequivalence among alternatives—measurement nonequivalence, effect nonequivalence, and power/precision nonequivalence. The criteria we review lead to three decision scenarios: Type E decisions (principled equivalence), Type N decisions (principled nonequivalence), and Type U decisions (uncertainty). In uncertain scenarios, multiverse-style analysis should be conducted in a deliberately exploratory fashion. The framework is discussed with reference to published examples and illustrated with the help of a simulated data set. Our framework will help researchers reap the benefits of multiverse-style methods while avoiding their pitfalls.
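The mechanics of a multiverse analysis can be shown with a toy specification grid (the choices and data here are hypothetical, purely to illustrate the idea; a Type N judgment would exclude non-equivalent specifications before summarizing):

```python
import itertools
import statistics

def estimate_effect(data, outlier_rule, transform):
    """One cell of the multiverse: apply an outlier rule and a
    transformation, then compute the effect estimate (here, a mean)."""
    kept = [transform(v) for v in data if outlier_rule(v)]
    return statistics.mean(kept)

# Two hypothetical arbitrary decisions, two alternatives each.
outlier_rules = {
    "keep_all": lambda v: True,
    "drop_gt_3": lambda v: v <= 3,
}
transforms = {
    "identity": lambda v: v,
    "double": lambda v: 2 * v,
}

data = [1, 2, 3, 4]
multiverse = {
    (o_name, t_name): estimate_effect(data, rule, tr)
    for (o_name, rule), (t_name, tr) in itertools.product(
        outlier_rules.items(), transforms.items()
    )
}
# The spread across specifications is what a specification curve displays.
spread = max(multiverse.values()) - min(multiverse.values())
```

If the alternatives were truly equivalent (Type E), the spread would reflect only hidden degrees of freedom; here the "double" transform is clearly non-equivalent, showing how an unjustified specification can dominate the curve.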


2021 ◽  
Vol 9 (1) ◽  
pp. 81-89
Author(s):  
Robert Penner

Abstract Tools developed by Moderna, BioNTech/Pfizer, and Oxford/AstraZeneca, among others, provide universal solutions to previously problematic aspects of drug or vaccine delivery, uptake and toxicity, portending new tools across the medical sciences. A novel method is presented based on estimating protein backbone free energy via geometry to predict effective antiviral targets, antigens and vaccine cargos that are resistant to viral mutation. This method is reviewed and reformulated in light of the recent proliferation of structural data on the SARS-CoV-2 spike glycoprotein and its mutations in multiple lineages. Key findings include: collections of mutagenic residues reoccur across strains, suggesting cooperative convergent evolution; most mutagenic residues do not participate in backbone hydrogen bonds; metastability of the glycoprotein limits the change of free energy through mutation, thereby constraining selective pressure; and there are mRNA or virus-vector cargos targeting low free energy peptides proximal to conserved high free energy peptides, providing specific recipes for vaccines with greater specificity than the full-spike approach. These results serve to limit peptides in the spike glycoprotein with high mutagenic potential and thereby provide a priori constraints on viral and attendant vaccine evolution. Scientific and regulatory challenges to nucleic acid therapeutic and vaccine development and deployment are finally discussed.


2010 ◽  
Vol 2010 ◽  
pp. 1-39 ◽  
Author(s):  
Alessandro Morando ◽  
Paolo Secchi

We study the boundary value problem for a linear first-order partial differential system with characteristic boundary of constant multiplicity. We assume the problem to be “weakly” well posed, in the sense that a unique L2-solution exists, for sufficiently smooth data, and obeys an a priori energy estimate with a finite loss of tangential/conormal regularity. This is the case of problems that do not satisfy the uniform Kreiss-Lopatinskiĭ condition in the hyperbolic region of the frequency domain. Provided that the data are sufficiently smooth, we obtain the regularity of solutions, in the natural framework of weighted conormal Sobolev spaces.
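In schematic terms (the notation below is assumed for illustration, not taken from the paper), a weak well-posedness estimate with loss of one tangential derivative has the form

```latex
\|u\|_{L^2(\Omega)} + \|u\|_{L^2(\partial\Omega)}
\le C \left( \|F\|_{H^{1}_{\mathrm{tan}}(\Omega)} + \|g\|_{H^{1}(\partial\Omega)} \right),
```

where F is the interior source, g the boundary datum, and H^1_tan a tangential/conormal Sobolev space: the solution is controlled in L2 only by one extra derivative of the data, whereas under the uniform Kreiss-Lopatinskiĭ condition the same bound would hold with L2 norms on the right-hand side.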


2014 ◽  
Vol 2014 ◽  
pp. 1-5 ◽  
Author(s):  
Liang Zhao

This paper presents a novel abnormal-data detection algorithm based on the first-order difference method, which can be used to identify outliers in a building energy consumption platform in real time. The principle and criteria of the methodology are discussed in detail. The results show that outliers in cumulative power consumption can be detected by this method.
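The first-order difference criterion can be sketched in a few lines (a minimal illustration of the general idea; the specific threshold rule used in the paper is not reproduced here):

```python
def detect_outliers(cumulative, threshold):
    """Flag points in a cumulative consumption series whose first-order
    difference either jumps beyond `threshold` or goes negative, which a
    cumulative meter reading should never do. Returns the indices of the
    suspect readings."""
    diffs = [b - a for a, b in zip(cumulative, cumulative[1:])]
    return [i + 1 for i, d in enumerate(diffs) if d > threshold or d < 0]
```

For streaming use, the same test applies to each new reading against the previous one, so outliers can be flagged as data arrive.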

