separation theorems
Recently Published Documents


TOTAL DOCUMENTS: 114 (FIVE YEARS: 21)

H-INDEX: 12 (FIVE YEARS: 2)

Author(s):  
Christian Günther ◽  
Bahareh Khazayel ◽  
Christiane Tammer

Abstract: In vector optimization, it is of increasing interest to study problems where the image space (a real linear space) is preordered by a not necessarily solid (and not necessarily pointed) convex cone. It is well known that there are many examples where the ordering cone of the image space has an empty (topological/algebraic) interior, for instance in optimal control, approximation theory, and duality theory. Our aim is to consider Pareto-type solution concepts for such vector optimization problems based on the intrinsic core notion (a well-known generalized interiority notion). We propose a new Henig-type proper efficiency concept based on generalized dilating cones that are relatively solid (i.e., their intrinsic cores are nonempty). Using linear functionals from the dual cone of the ordering cone, we characterize the sets of (weakly, properly) efficient solutions under certain generalized convexity assumptions. Toward this end, we employ separation theorems that work in the considered setting.
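For orientation, the generalized interiority notion used in this abstract, the intrinsic core (also called the relative algebraic interior), has the following standard definition for a convex set $A$ in a real linear space $X$; this is the textbook formulation, not notation taken from the paper itself:

```latex
\operatorname{icor} A := \bigl\{\, a \in A \;\big|\;
  \forall x \in \operatorname{aff}(A)\ \exists \varepsilon > 0:\
  a + t\,(x - a) \in A \ \text{for all } t \in [0, \varepsilon] \,\bigr\}.
```

A convex cone $K$ is then called relatively solid when $\operatorname{icor} K \neq \emptyset$, which is the condition the dilating cones above are required to satisfy.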


2021 ◽  
Vol 26 (4) ◽  
pp. 542-549
Author(s):  
Adel Murtda Al-awci ◽  
Noori F. Al-Mayahi

The application of functional analysis to economics began with theoretical studies of the development and equilibrium of financial markets: building mathematical models on linear topological spaces, describing the economic equilibrium of the stock market in mathematical formulas and terms, and then using theorems of linear topological spaces, such as the Hahn-Banach theorem, separation theorems, the open mapping theorem, and the closed graph theorem, to establish necessary and sufficient conditions for the market model to achieve viability, admit no arbitrage, and admit no free lunches.
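The no-arbitrage condition mentioned above is, via a separating-hyperplane argument, equivalent to the existence of strictly positive state prices (the fundamental theorem of asset pricing). A minimal sketch of that certificate, assuming a one-period market with as many assets as states so the pricing system is square; the numbers are illustrative and not taken from the paper:

```python
import numpy as np

# One-period market: rows = assets, columns = future states.
payoffs = np.array([[1.0, 1.0],   # riskless bond pays 1 in both states
                    [2.0, 0.5]])  # stock pays 2 (up) or 0.5 (down)
prices = np.array([1.0, 1.0])     # today's prices of bond and stock

# Strictly positive state prices q with payoffs @ q = prices
# certify the absence of arbitrage (separating-hyperplane argument).
q = np.linalg.solve(payoffs, prices)
no_arbitrage = bool(np.all(q > 0))

print(q, no_arbitrage)  # state prices [1/3, 2/3], True
```

If some component of `q` were nonpositive (or no solution existed), a separating hyperplane would produce an arbitrage portfolio instead.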


2021 ◽  
pp. 026010792110334
Author(s):  
William P. Fisher

In 1959, Ragnar Frisch prompted Georg Rasch to formalise a separability theorem that continues today to serve as the basis of a wide range of theoretical and applied developments in psychological and social measurement. Previously unnoted are the influences on Rasch exerted by Frisch’s concerns for data autonomy, model identification and necessary and sufficient conditions. Although Rasch acknowledged Frisch’s prompting towards a separability theorem, he did not acknowledge any substantive, intellectual debt to him, nor to Irving Fisher, but only to Ronald Fisher. Rasch appears to have developed a special interest in sufficiency and identified models when studying with Frisch in 1935, and in 1947, when Rasch accompanied Tjalling Koopmans to the University of Chicago and the Cowles Commission for Research in Economics. I. Fisher’s separation theorem continues to be relevant in econometrics, and interest in Rasch’s separability theorem is growing as the measurement models based on it are adopted in metrological theory and practice. The extensive interrelations between measurement science, metrological standards and economics suggest paths towards lower transaction costs and more efficient markets for individualised exchanges of human, social and natural capital. Equally, if not more, surprising are the implications for a poetic art of complex, harmonised relationships played out via creative improvisations expressed using instruments tuned to shared scales. JEL: B41, C10, C13, C20, C42, D70, E60, H54, I11, I21, I31, P11


Entropy ◽  
2021 ◽  
Vol 23 (8) ◽  
pp. 1090
Author(s):  
Alexander N. Gorban ◽  
Bogdan Grechuk ◽  
Evgeny M. Mirkes ◽  
Sergey V. Stasenko ◽  
Ivan Y. Tyukin

This work is driven by a practical question: the correction of Artificial Intelligence (AI) errors. These corrections should be quick and non-iterative. To solve this problem without modification of a legacy AI system, we propose special 'external' devices, correctors. Elementary correctors consist of two parts: a classifier that separates the situations with a high risk of error from the situations in which the legacy AI system works well, and a new decision that should be recommended for situations with potential errors. Input signals for the correctors can be the inputs of the legacy AI system, its internal signals, and outputs. If the intrinsic dimensionality of the data is high enough, then the classifiers for correcting a small number of errors can be very simple. According to the blessing-of-dimensionality effects, even simple and robust Fisher's discriminants can be used for one-shot learning of AI correctors. Stochastic separation theorems provide the mathematical basis for this one-shot learning. However, as the number of correctors needed grows, the cluster structure of the data becomes important and a new family of stochastic separation theorems is required. We reject the classical hypothesis of the regularity of the data distribution and assume that the data can have a rich fine-grained structure with many clusters and corresponding peaks in the probability density. New stochastic separation theorems for data with fine-grained structure are formulated and proved. On the basis of these theorems, multi-correctors for granular data are proposed. The advantages of the multi-corrector technology are demonstrated by examples of correcting errors and learning new classes of objects by a deep convolutional neural network on the CIFAR-10 dataset.
The key problems of non-classical high-dimensional data analysis are reviewed, together with the basic preprocessing steps, including the correlation transformation, supervised Principal Component Analysis (PCA), semi-supervised PCA, transfer component analysis, and new domain adaptation PCA.
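The blessing-of-dimensionality effect behind these one-shot correctors can be checked empirically. A minimal sketch (not the authors' code; dimensions, sample size, and the threshold `alpha` are illustrative): in a few hundred dimensions, a single i.i.d. Gaussian point is, with overwhelming probability, separated from thousands of others by the simple linear functional y ↦ ⟨x, y⟩ with threshold α‖x‖²:

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 200, 1000                    # dimension and sample size (illustrative)
data = rng.standard_normal((n, d))  # points where the legacy system works well
x = rng.standard_normal(d)          # a single "error" point to cut off

# One-shot linear separator: flag y as an error iff <x, y> >= alpha * ||x||^2.
alpha = 0.5
scores = data @ x / (x @ x)         # normalized projections onto x
separated = bool(np.all(scores < alpha))

print(separated)  # True: x is separated from all n points by one hyperplane
```

With these parameters the normalized projections concentrate around zero with standard deviation of order 1/√d, so the threshold 0.5 lies many standard deviations away, which is exactly the stochastic-separability effect the theorems quantify.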


Author(s):  
Alexander N Gorban ◽  
Bogdan Grechuk ◽  
Evgeny M Mirkes ◽  
Sergey V Stasenko ◽  
Ivan Y Tyukin

This work is driven by a practical question: the correction of Artificial Intelligence (AI) errors. Systematic re-training of a large AI system is hardly possible. To solve this problem, special external devices, correctors, are developed. They should provide a quick, non-iterative fix without modification of the legacy AI system. A common universal part of an AI corrector is a classifier that separates undesired and erroneous behavior from normal operation. Training such classifiers is a grand challenge at the heart of one- and few-shot learning methods. The effectiveness of one- and few-shot methods rests on either significant dimensionality reduction or the blessing-of-dimensionality effects. Stochastic separability is a blessing-of-dimensionality phenomenon that allows one- and few-shot error correction: in high-dimensional datasets, under broad assumptions, each point can be separated from the rest of the set by a simple and robust linear discriminant. A hierarchical structure of the data universe is introduced, in which each data cluster has a granular internal structure, and so on. New stochastic separation theorems for data distributions with fine-grained structure are formulated and proved. Separation theorems in infinite-dimensional limits are proven under assumptions of compact embedding of patterns into the data space. New multi-correctors of AI systems are presented and illustrated with examples of predicting errors and learning new classes of objects by a deep convolutional neural network.
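A hedged sketch of the multi-corrector idea for clustered data (the cluster centres, routing rule, and threshold below are illustrative assumptions, not the paper's construction): route each query to its nearest cluster centre, then apply a per-cluster one-shot linear discriminant built around that centre:

```python
import numpy as np

rng = np.random.default_rng(1)
d = 100
centers = np.stack([np.full(d, 5.0), np.full(d, -5.0)])   # two far-apart clusters
clusters = [c + rng.standard_normal((500, d)) for c in centers]
errors = [c + rng.standard_normal(d) for c in centers]    # one "error" point each

alpha = 0.6
def corrector(y):
    """Route y to the nearest centre, then test that cluster's discriminant."""
    k = int(np.argmin(((centers - y) ** 2).sum(axis=1)))
    v = errors[k] - centers[k]            # direction of this cluster's error
    score = (y - centers[k]) @ v / (v @ v)
    return k, bool(score >= alpha)        # True => flag as error and correct

flagged_errors = [corrector(e)[1] for e in errors]
false_alarms = sum(corrector(y)[1] for cl in clusters for y in cl)
print(flagged_errors, false_alarms)       # with this seed: both flagged, no false alarms
```

Each per-cluster discriminant sees only the local, nearly regular data around its centre, which is how the granular structure restores the simple single-cluster separability picture.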


2021 ◽  
pp. 1-15
Author(s):  
Carole Bernard ◽  
Corrado De Vecchi ◽  
Steven Vanduffel
Keyword(s):  

2021 ◽  
Vol 138 ◽  
pp. 33-56
Author(s):  
Bogdan Grechuk ◽  
Alexander N. Gorban ◽  
Ivan Y. Tyukin

Author(s):  
Roman Badora

Abstract: The presented work summarizes the relationships between stability results and separation theorems. We prove the equivalence between different types of theorems on separation by an additive map and different types of stability results concerning the stability of the Cauchy functional equation.
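The prototype of the stability results referred to here is Hyers' theorem for the Cauchy equation; the statement below is the classical Banach-space version, included for orientation rather than taken from the paper:

```latex
% Hyers (1941): if f : X \to Y between Banach spaces satisfies
\|f(x+y) - f(x) - f(y)\| \le \varepsilon \quad \text{for all } x, y \in X,
% then there exists a unique additive map a : X \to Y with
\|f(x) - a(x)\| \le \varepsilon \quad \text{for all } x \in X,
% given by a(x) = \lim_{n \to \infty} f(2^n x)/2^n.
```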


2021 ◽  
Vol 7 (3) ◽  
pp. 3290-3302
Author(s):  
Ruini Li ◽  
Jianrong Wu

In this paper, we first study continuous linear functionals on a fuzzy quasi-normed space, obtain a characterization of continuous linear functionals, and point out that the set of all continuous linear functionals forms a convex cone and can be equipped with a weak fuzzy quasi-norm. Next, we prove a theorem of Hahn-Banach type and two separation theorems for convex subsets of fuzzy quasi-normed spaces.
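For comparison, the classical separation theorem that such fuzzy quasi-normed versions generalize can be stated as follows (the standard normed-space formulation, not the paper's fuzzy version):

```latex
% Hahn-Banach separation: if A, B are disjoint nonempty convex subsets
% of a normed space X with A open, then there exist f \in X^* and s \in \mathbb{R}
% such that
f(a) < s \le f(b) \quad \text{for all } a \in A,\ b \in B.
```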


2020 ◽  
Vol 76 (1) ◽  
pp. 11-34
Author(s):  
Antonio Boccuto

Abstract: We prove Hahn-Banach, sandwich, and extension theorems for vector lattice-valued operators, equivariant with respect to a given group G of homomorphisms. As applications and consequences, we present some Fenchel duality and separation theorems, a version of the Moreau-Rockafellar formula, and some Farkas and Kuhn-Tucker-type optimization results. Finally, we prove that the obtained results are equivalent to the amenability of G.
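The classical prototype of the Farkas-type results mentioned above is the Farkas lemma, itself a consequence of a separation theorem; the finite-dimensional statement (standard, not the paper's vector-lattice version) reads:

```latex
% Farkas' lemma: for A \in \mathbb{R}^{m \times n} and b \in \mathbb{R}^m,
% exactly one of the following two systems has a solution:
\exists\, x \in \mathbb{R}^n:\ Ax = b,\ x \ge 0
\qquad\text{or}\qquad
\exists\, y \in \mathbb{R}^m:\ A^{\mathsf T} y \ge 0,\ b^{\mathsf T} y < 0.
```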

