Pattern Theory
Latest Publications


TOTAL DOCUMENTS: 19 (five years: 0)
H-INDEX: 0 (five years: 0)

Published by Oxford University Press
ISBN: 9780198505709, 9780191916564

Author(s): Ulf Grenander, Michael I. Miller

This chapter explores random sampling algorithms introduced for generating conditional expectations in hypothesis spaces containing a mixture of discrete, disconnected subsets. Random samples are generated via the direct simulation of a Markov process whose state moves through the hypothesis space, with the ergodic property that the transition distribution of the Markov process converges to the posterior distribution. This allows for the empirical generation of conditional expectations under the posterior. To accommodate the connected and disconnected nature of the state spaces, the Markov process is forced to satisfy jump–diffusion dynamics. Through the connected parts of the parameter space (Lie manifolds) the algorithm searches continuously, with sample paths corresponding to solutions of standard diffusion equations. Across the disconnected parts of parameter space the jump process determines the dynamics. The infinitesimal properties of these jump–diffusion processes are selected so that various sample statistics converge to their expectations under the posterior.
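A minimal sketch of such a sampler on a toy hybrid space (a binary label with one Gaussian component per label; all densities, weights, rates, and step sizes below are illustrative assumptions, not values from the chapter): within a component the state follows a discretized Langevin diffusion, and jumps across the disconnected components are accepted with a Metropolis–Hastings ratio.

```python
import math
import random

random.seed(0)

# Toy posterior over a hybrid space: a discrete label k in {0, 1}
# selects a Gaussian component. Means and weights are assumptions.
MEANS = {0: -2.0, 1: 3.0}
WEIGHTS = {0: 0.3, 1: 0.7}

def log_post(k, x):
    return math.log(WEIGHTS[k]) - 0.5 * (x - MEANS[k]) ** 2

def log_q(k, x):
    # Log-density (up to a constant) of the jump proposal x ~ N(MEANS[k], 1).
    return -0.5 * (x - MEANS[k]) ** 2

def jump_diffusion(n_steps, dt=0.01, jump_rate=0.1):
    k, x = 0, 0.0
    samples = []
    for _ in range(n_steps):
        if random.random() < jump_rate:
            # Jump across disconnected components: propose the other label
            # with a fresh x, accept via a Metropolis-Hastings ratio.
            k_new = 1 - k
            x_new = random.gauss(MEANS[k_new], 1.0)
            log_alpha = (log_post(k_new, x_new) + log_q(k, x)
                         - log_post(k, x) - log_q(k_new, x_new))
            if math.log(random.random()) < log_alpha:
                k, x = k_new, x_new
        else:
            # Langevin diffusion within the connected component: drift up
            # the gradient of the log-posterior plus Brownian noise.
            grad = -(x - MEANS[k])
            x += dt * grad + math.sqrt(2 * dt) * random.gauss(0.0, 1.0)
        samples.append((k, x))
    return samples

samples = jump_diffusion(50_000)
# Ergodic average: the fraction of time spent at label 1 approaches
# its posterior probability (0.7 in this toy model).
frac_k1 = sum(k for k, _ in samples) / len(samples)
```

The jump proposal draws the new continuous coordinate from a distribution matched to the target component, which keeps the cross-component acceptance rate healthy; this choice is a convenience of the sketch, not prescribed by the text.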


Author(s): Ulf Grenander, Michael I. Miller

Thus far, pattern theory has been combinatory, constructing complex patterns by connecting simpler ones via graphs. Patterns occurring in nature may be extremely complex and exhibit invariances. For example, spatial patterns may live in a space where the choice of coordinate system is irrelevant; temporal patterns may exist independently of where time is counted from; and so on. For this reason, matrix groups are introduced as transformations, these transformations often forming groups which act on the generators.
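The coordinate-frame invariance can be illustrated with a minimal sketch (the template configuration and rotation angle are illustrative assumptions): an element of the rotation group SO(2) acts on a set of point generators, and the pattern's pairwise distances are unchanged by the action.

```python
import math

def rotate(theta, p):
    # Action of a rotation (an element of SO(2)) on a generator,
    # here a point in the plane.
    c, s = math.cos(theta), math.sin(theta)
    return (c * p[0] - s * p[1], s * p[0] + c * p[1])

def pairwise_dists(pts):
    return sorted(math.dist(p, q)
                  for i, p in enumerate(pts) for q in pts[i + 1:])

template = [(0.0, 0.0), (1.0, 0.0), (0.0, 2.0)]
moved = [rotate(0.7, q) for q in template]

# The pattern is invariant to the choice of coordinate frame: the
# group action leaves all pairwise distances unchanged.
d0 = pairwise_dists(template)
d1 = pairwise_dists(moved)
```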


Author(s): Ulf Grenander, Michael I. Miller

Pattern theory is combinatory in spirit or, to use a fashionable term, connectionist: complex structures are built from simpler ones. To construct more general patterns, we will generalize from combinations of sites to combinations of primitives, termed generators, which are structured sets. The interactions between generators are imposed via directed and undirected graph structures, defining how the variables at the sites of the graph interact with their neighbors in the graph. Probabilistic structures on the representations allow for expressing the variation of natural patterns. Canonical representations are established, demonstrating a unified manner of viewing DAGs, MRFs, Gaussian random fields, and probabilistic formal languages.
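A minimal sketch of generators interacting over a graph (the chain graph and the compatibility rule are illustrative assumptions, not the book's formal definitions): generators sit at the sites, and an acceptor function on each edge decides whether neighboring generators bond; a configuration is regular when every bond is accepted.

```python
# A simple chain graph over four sites; integer-valued generators.
edges = [(0, 1), (1, 2), (2, 3)]

def acceptor(g1, g2):
    # Toy bond relation: neighboring generators may differ by at most 1.
    return abs(g1 - g2) <= 1

def regular(config, edges):
    # A configuration is regular when every bond is accepted.
    return all(acceptor(config[i], config[j]) for i, j in edges)

ok = regular([0, 1, 1, 2], edges)    # every neighboring pair bonds
bad = regular([0, 3, 1, 2], edges)   # the first bond is rejected
```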


Author(s): Ulf Grenander, Michael I. Miller

Computational Anatomy (CA) is the study of human anatomy I ∈ I = Iα ∘ G, an orbit under groups of diffeomorphisms G of anatomical exemplars Iα ∈ I. The observable images I^D ∈ I^D are the output of medical imaging devices. This chapter focuses on the third of the three components of CA: (iii) generation of probability laws of anatomical variation P(·) on the images I within the anatomical orbits, and inference of disease models. The basic random objects studied are the diffeomorphic transformations encoding the anatomical objects in the orbit; Gaussian random fields are constructed based on empirical observations of the transformations. Hypothesis testing on various neuromorphometric changes is studied.
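As a minimal sketch of such hypothesis testing (the populations, sample sizes, and effect size are simulated assumptions for the demo, not empirical values): model a single deformation coefficient as Gaussian in each group and test for a mean shift with a two-sample z-statistic.

```python
import math
import random

random.seed(1)

# Simulated deformation coefficients for two populations
# (means and variances are illustrative assumptions).
healthy = [random.gauss(0.0, 1.0) for _ in range(200)]
disease = [random.gauss(0.8, 1.0) for _ in range(200)]

def mean(xs):
    return sum(xs) / len(xs)

def var(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def z_stat(a, b):
    # Two-sample z-statistic for a mean shift between groups.
    return (mean(a) - mean(b)) / math.sqrt(var(a) / len(a) + var(b) / len(b))

z = z_stat(disease, healthy)  # large |z| suggests a group difference
```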


Author(s): Ulf Grenander, Michael I. Miller

Thus far we have only studied representations of the source. Now we add the channel, pushing us into the framework of estimation theory; we then examine estimation bounds for understanding rigid object recognition involving the low-dimensional matrix groups. Minimum mean-squared error bounds are derived for recognition and identification.
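A toy instance of source-plus-channel estimation (the landmark, true rotation, noise level, and angle grid are illustrative assumptions): a known template point is rotated by an unknown angle and observed through a Gaussian channel, and the minimum mean-squared error estimate, the posterior mean, is computed by direct numerical integration over a grid of angles.

```python
import math
import random

random.seed(2)

p = (1.0, 0.0)     # known template landmark (assumed)
theta_true = 0.6   # true rotation angle (assumed)
sigma = 0.1        # channel noise standard deviation (assumed)

def rotate(theta, q):
    c, s = math.cos(theta), math.sin(theta)
    return (c * q[0] - s * q[1], s * q[0] + c * q[1])

# Observation: the rotated landmark passed through a Gaussian channel.
obs = tuple(v + random.gauss(0.0, sigma) for v in rotate(theta_true, p))

# Uniform prior over a grid of angles; the MMSE estimate is the
# posterior mean.
grid = [-math.pi + i * 2 * math.pi / 1000 for i in range(1000)]
weights = []
for th in grid:
    q = rotate(th, p)
    err2 = (obs[0] - q[0]) ** 2 + (obs[1] - q[1]) ** 2
    weights.append(math.exp(-err2 / (2 * sigma ** 2)))

total = sum(weights)
theta_mmse = sum(th * w for th, w in zip(grid, weights)) / total
```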


Author(s): Ulf Grenander, Michael I. Miller

In this chapter the metric space structure of shape is developed by studying the action of the infinite-dimensional diffeomorphisms on the coordinate systems of shape. Riemannian manifolds allow us to develop metric distances between the group elements. We examine the natural analog of the finite-dimensional matrix groups corresponding to the infinite-dimensional diffeomorphisms, which are generated as flows of ordinary differential equations. We explore the construction of the metric structure of these diffeomorphisms and develop many of the properties which hold for the finite-dimensional matrix groups in this infinite-dimensional setting.
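A minimal one-dimensional sketch of a transformation generated as the flow of an ordinary differential equation (the velocity field and step counts are illustrative assumptions): Euler integration of dφ/dt = v(φ) produces a map that is order preserving, a hallmark of invertibility in 1D.

```python
import math

def flow(x0, v, n_steps=1000, T=1.0):
    # Euler integration of d(phi)/dt = v(phi) from phi(0) = x0.
    dt = T / n_steps
    x = x0
    for _ in range(n_steps):
        x += dt * v(x)
    return x

def v(x):
    return math.sin(x)  # a smooth, bounded velocity field (assumed)

xs = [i / 10 for i in range(1, 31)]
ys = [flow(x, v) for x in xs]

# In one dimension the flow map preserves order, a hallmark of an
# invertible (diffeomorphic) transformation.
monotone = all(a < b for a, b in zip(ys, ys[1:]))
```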


Author(s): Ulf Grenander, Michael I. Miller

To study shape we introduce manifolds and submanifolds, examined in the continuum, as the generators. Transformations are constructed which are built from the matrix groups and infinite products. This gives rise to many of the widely used structural models in image analysis, often termed active models, essentially the deformable templates. These deformations are studied both as diffeomorphisms and as immersions. A calculus is introduced based on transport theory for activating these deformable shapes by taking variations with respect to the matrix groups parameterizing them. Segmentation based on activating these manifolds is examined based on Gaussian random fields and variations with respect to the parameterizations.
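A minimal sketch of activating a deformable template by variation with respect to a group parameter (the template heights, target, and step size are illustrative assumptions; the group here is reduced to a single translation parameter): gradient descent on the squared matching error drives the template onto the data.

```python
# A 1D template and data equal to the template shifted by 0.7.
template = [0.0, 1.0, 2.0, 1.0, 0.0]
target = [x + 0.7 for x in template]

def cost_grad(t):
    # Derivative with respect to t of sum((template + t - target)^2).
    return 2 * sum(g + t - y for g, y in zip(template, target))

# Activate the template: descend the matching cost over the
# translation parameter t.
t = 0.0
for _ in range(100):
    t -= 0.05 * cost_grad(t)
# t converges to the true translation, 0.7
```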


Author(s): Ulf Grenander, Michael I. Miller

Probabilistic structures on the representations allow for expressing the variation of natural patterns. In this chapter the structure imposed through probabilistic directed graphs is studied. The essential probabilistic structure enforced through the directedness of the graphs is that sites are conditionally independent of their non-descendants given their parents. The entropies and combinatorics of these processes are examined as well. Focus is given to the classical Markov chain and branching process examples to illustrate the fundamentals of variability descriptions through probability and entropy.
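The Markov chain example can be sketched with a variability measure in code (the transition matrix is an illustrative assumption): the entropy rate H = -Σ_i π_i Σ_j P_ij log₂ P_ij of a two-state chain, using the closed-form stationary distribution.

```python
import math

# Two-state Markov chain; P[i][j] is the probability of moving
# from state i to state j (values are illustrative assumptions).
P = [[0.9, 0.1],
     [0.4, 0.6]]

def stationary(P):
    # Closed-form stationary distribution for a 2-state chain.
    a, b = P[0][1], P[1][0]
    return [b / (a + b), a / (a + b)]

def entropy_rate(P):
    # H = -sum_i pi_i sum_j P_ij log2(P_ij), in bits per step.
    pi = stationary(P)
    h = 0.0
    for i, row in enumerate(P):
        for p in row:
            if p > 0:
                h -= pi[i] * p * math.log2(p)
    return h

H = entropy_rate(P)
```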


Author(s): Ulf Grenander, Michael I. Miller

This book aims to be an accessible book on patterns, their representation, and inference. There are a small number of ideas and techniques that, when mastered, make the subject more accessible. The book has arisen from a ten-year research program which the authors have embarked upon, building on the more abstract developments of metric pattern theory developed by one of the authors during the 1970s and 1980s. The material has been taught over multiple semesters as part of a second-year graduate-level course in pattern theory, essentially an introduction for students interested in the representation of patterns observed in the natural world. The course has attracted students studying biomedical engineering, computer science, electrical engineering, and applied mathematics who are interested in speech recognition and computational linguistics, as well as in image analysis and computer vision.

The concept of patterns pervades the history of intellectual endeavor; it is one of the eternal followers in human thought. It appears again and again in science, taking on different forms in the various disciplines, and is made rigorous through mathematical formalization. But the concept also lives in a less stringent form in the humanities, in novels and plays, even in everyday language. We use it all the time without attributing a formal meaning to it, and yet with little risk of misunderstanding. So, what do we really mean by a pattern? Can we define it in strictly logical terms? And if we can, what use can we make of such a definition? These questions were answered by General Pattern Theory, a discipline initiated by Ulf Grenander in the late 1960s [1–5]. It has been an ambitious effort, growing from an originally sketchy program with few if any practical applications into mathematical maturity, with a multitude of applications having appeared in biology and medicine, in computer vision, in language theory, and in object recognition, to mention but a few.
Pattern theory attempts to provide an algebraic framework for describing patterns as structures regulated by rules, essentially a finite number of both local and global combinatory operations. Pattern theory takes a compositional view of the world, building more and more complex structures starting from simple ones. The basic rules for combining and building complex patterns from simpler ones are encoded via graphs and rules on transformation of these graphs.


Author(s): Ulf Grenander, Michael I. Miller

The parameter spaces of natural patterns are so complex that inference must often proceed compositionally, successively building up more and more complex structures, as well as back-tracking, creating simpler structures from more complex versions. Inference is transformational in nature. The philosophical approach studied in this chapter is that the posterior distribution describing the patterns contains all of the information about the underlying regular structure. Therefore, the transformations of inference are guided via the posterior, in the sense that the algorithm for changing the regular structures will correspond to the sample path of a Markov process. The Markov process is constructed to push towards the posterior distribution in which the information about the patterns is stored. This provides the deep connection between the transformational paradigm of regular structure creation and random sampling algorithms.
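A minimal sketch of this idea (the one-dimensional posterior, proposal width, and run length are illustrative assumptions; Metropolis dynamics serve as a stand-in for the general construction): the sample path of a Markov process built to push toward the posterior, whose ergodic averages recover posterior expectations.

```python
import math
import random

random.seed(3)

def log_post(x):
    # Illustrative 1D posterior: a Gaussian with mean 1.5.
    return -0.5 * (x - 1.5) ** 2

def metropolis(n_steps, step=1.0):
    # Sample path of a Markov process whose stationary law is the posterior.
    x, path = 0.0, []
    for _ in range(n_steps):
        prop = x + random.uniform(-step, step)
        if math.log(random.random()) < log_post(prop) - log_post(x):
            x = prop
        path.append(x)
    return path

path = metropolis(100_000)
est_mean = sum(path) / len(path)  # ergodic average -> posterior mean
```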

