Study on Information Compression via the Matching and Unification of Patterns: An Approach towards Mathematical Foundations

Author(s):  
J. Gerard Wolff
Author(s):  
Ye. Yi. Bidaibekov ◽  
V. V. Grinshkun ◽  
S. N. Koneva

The article deals with computer graphics tasks related to the work of the future informatics teacher under the fundamentalization of education. Training future informatics teachers in this context requires them to know the range of tasks related to computer graphics and to have the skills to solve them. In order to strengthen the fundamental component of computer graphics, methods are proposed that rely on interdisciplinary connections as well as on in-depth study of computer graphics. The authors conclude that the content of the computer graphics course should be enriched with its mathematical foundations and, as a result, updated with computer graphics algorithms. The basic principle for selecting the content of the proposed course is the principle of the fundamentalization of education. Since the scope of application of computer graphics is extensive, the authors regard a system of problems and tasks on computer graphics as of particular interest. A feature of this system is its orientation towards solving fundamental problems of computer graphics. The study also revealed that the tasks of the proposed system can be reduced to a certain sequence of stages, and the choice of stages for a given type of task determines the methods for solving it. Thus, the fundamental training of future informatics teachers in computer graphics requires them to know these stages and the methods for solving fundamental computer graphics tasks.


Author(s):  
V. V. Goncharova

The growing interest in abstracting as a type of analytical and synthetic information processing, driven by the globalization of science, is emphasized. The professionals who study this form of primary information compression are bibliographers, linguists, and information specialists. The author argues that modern professors and students must learn abstracting in accordance with the international standards for scientific, research, reference, and instructional works. The author points to the diversity of national lexicographical studies and, based on the index of abstracts obtained in her study, characterizes current trends in the abstracting of linguistic dictionaries. The key user groups are defined. Publishers' abstracts of dictionaries are discussed and represented. An example of analyzing dictionary abstracts found on the Internet is given (50 items). Based on the texts of the abstracts, the main factors that reduce the information value of this secondary information source are revealed: missing data essential for users, incomplete description of the target readership, etc. The author introduces a model plan for digital guides to Russian lexicographical works and complements the plan with a systematic aspect analysis. She concludes that abstracting is an intellectually intensive process, that it is underexplored as far as lexicographical works are concerned, and that it offers many possibilities for further study.


Author(s):  
Charles L. Epstein ◽  
Rafe Mazzeo

This book provides the mathematical foundations for the analysis of a class of degenerate elliptic operators defined on manifolds with corners, which arise in a variety of applications such as population genetics, mathematical finance, and economics. The results discussed in this book prove the uniqueness of the solution to the martingale problem and therefore the existence of the associated Markov process. The book uses an “integral kernel method” to develop mathematical foundations for the study of such degenerate elliptic operators and the stochastic processes they define. The precise nature of the degeneracies of the principal symbol for these operators leads to solutions of the parabolic and elliptic problems that display novel regularity properties. Dually, the adjoint operator allows for rather dramatic singularities, such as measures supported on high codimensional strata of the boundary. The book establishes the uniqueness, existence, and sharp regularity properties for solutions to the homogeneous and inhomogeneous heat equations, as well as a complete analysis of the resolvent operator acting on Hölder spaces. It shows that the semigroups defined by these operators have holomorphic extensions to the right half plane. The book also demonstrates precise asymptotic results for the long-time behavior of solutions to both the forward and backward Kolmogorov equations.
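To illustrate the kind of operator the book has in mind, consider (as a minimal one-dimensional example chosen here for illustration, not quoted from the book) the classical Wright–Fisher generator from population genetics, which degenerates at the endpoints of its domain:

\[
  L u(x) \;=\; \tfrac{1}{2}\, x(1-x)\, u''(x) \;+\; b(x)\, u'(x), \qquad x \in [0,1],
\]

where the second-order coefficient x(1-x) vanishes at x = 0 and x = 1, so the operator is elliptic in the interior but degenerate at the boundary; higher-dimensional analogues on simplices and on manifolds with corners exhibit the same boundary degeneracy that drives the regularity theory described above.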


Author(s):  
Yacine Aït-Sahalia ◽  
Jean Jacod

High-frequency trading is an algorithm-based computerized trading practice that allows firms to trade stocks in milliseconds. Over the last fifteen years, the use of statistical and econometric methods for analyzing high-frequency financial data has grown exponentially. This growth has been driven by the increasing availability of such data, the technological advancements that make high-frequency trading strategies possible, and the need for practitioners to analyze these data. This comprehensive book introduces readers to these emerging methods and tools of analysis. The book covers the mathematical foundations of stochastic processes, describes the primary characteristics of high-frequency financial data, and presents the asymptotic concepts that their analysis relies on. It also deals with estimation of the volatility portion of the model, including methods that are robust to market microstructure noise, and addresses estimation and testing questions involving the jump part of the model. As the book demonstrates, the practical importance and relevance of jumps in financial data are universally recognized, but only recently have econometric methods become available to rigorously analyze jump processes. The book approaches high-frequency econometrics with a distinct focus on the financial side of matters while maintaining technical rigor, which makes this book invaluable to researchers and practitioners alike.
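As a rough sketch of the modeling framework described above (standard notation in high-frequency econometrics, assumed here rather than quoted from the book), the log-price is typically modeled as an Itô semimartingale with a continuous volatility component and a jump component:

\[
  X_t \;=\; X_0 \;+\; \int_0^t b_s\,\mathrm{d}s \;+\; \int_0^t \sigma_s\,\mathrm{d}W_s \;+\; J_t,
\]

where W is a Brownian motion, sigma_s is the stochastic volatility, and J_t collects the jumps. Estimating the "volatility portion of the model" then amounts to estimating the integrated variance \(\int_0^t \sigma_s^2\,\mathrm{d}s\), while separate estimators and tests assess the contribution of J_t.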


Author(s):  
Amos Golan

In this chapter I provide additional rationalization for using the info-metrics framework. This time the justifications are in terms of the statistical, mathematical, and information-theoretic properties of the formalism. Specifically, in this chapter I discuss optimality, statistical and computational efficiency, sufficiency, the concentration theorem, the conditional limit theorem, and the concept of information compression. These properties, together with the other properties and measures developed in earlier chapters, provide logical, mathematical, and statistical justifications for employing the info-metrics framework.
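As a brief sketch of the formalism behind these properties (standard maximum-entropy notation, assumed here rather than quoted from the chapter), the core info-metrics problem chooses the distribution p that maximizes the Shannon entropy subject to observed moment constraints:

\[
  \max_{p}\; -\sum_i p_i \log p_i
  \quad \text{subject to} \quad
  \sum_i p_i f_j(x_i) = \bar{f}_j \;\; (j=1,\dots,m), \qquad \sum_i p_i = 1,
\]

whose solution takes the exponential form \(p_i \propto \exp\!\big(-\sum_j \lambda_j f_j(x_i)\big)\). Informally, the concentration and conditional limit theorems mentioned above say that distributions consistent with the constraints concentrate around this maximizer, which is one sense in which the framework achieves information compression.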

