consistency condition
Recently Published Documents


TOTAL DOCUMENTS: 190 (five years: 30)
H-INDEX: 20 (five years: 1)

2021 ◽  
Vol 2021 (11) ◽  
pp. 053
Author(s):  
V.A. Berezin ◽  
V.I. Dokuchaev ◽  
Yu. N. Eroshenko ◽  
A.L. Smirnov

Abstract We investigated the possibility of constructing homogeneous and isotropic cosmological solutions in Weyl geometry. We derived the self-consistency condition that ensures the conformal invariance of the complete set of equations of motion. There is a special choice of gauge for the conformal factor in which the Weyl vector vanishes. In this gauge we found new vacuum cosmological solutions that are absent in General Relativity. We also found a new solution in Weyl geometry for the radiation-dominated universe with a cosmological term, corresponding to a constant curvature scalar in our special gauge. A possible relation of our results to the understanding of both dark matter and dark energy is discussed.


2021 ◽  
Vol 38 (5) ◽  
pp. 1521-1530
Author(s):  
Yanming Zhao ◽  
Hong Yang ◽  
Guoan Su

In traditional slow feature analysis (SFA), the polynomial basis-function expansion lacks support from primate visual-computing theories and cannot learn uniform, continuous long short-term features through a selective visual mechanism. To remedy these defects, this paper designs and implements a slow feature algorithm coupling visual selectivity with multiple long short-term memory networks (LSTMs). Inspired by the visual invariance theory of natural images, the principal component analysis (PCA) step of the traditional SFA algorithm is replaced with myTICA (TICA: topologically independent component analysis) to extract image-invariant Gabor basis functions and to initialize the basis-function space and series. Given the LSTM's ability to learn long- and short-term features, four LSTM networks were constructed to separately predict the long- and short-term visual selectivity features of the Gabor basis functions from the basis-function series and to combine them into a new basis function, thereby overcoming the limitations of polynomial prediction. In addition, a Lipschitz consistency condition was designed and used to develop an approximately orthogonal pruning technique, which optimizes the predicted basis functions and constructs a hyper-complete basis-function space. The algorithm's performance was evaluated by three metrics and by mySFA's classification method. The experimental results show that the algorithm achieved good prediction on the INRIA Holidays dataset and outperformed SFA, graph-based SFA (GSFA), TICA, and myTICA in accuracy and feasibility; at a threshold of 6, the recognition rate was 99.98%, and the false accept rate (FAR) and false reject rate (FRR) were both below 0.02%, indicating the strong classification ability of the approach.
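The proposed method builds on classical linear SFA. As background, here is a minimal sketch of the standard SFA recipe (whiten the signal, then take the slowest directions of the temporal-difference covariance); the toy data and function names are illustrative, not from the paper, and the myTICA/LSTM extensions are not reproduced:

```python
import numpy as np

def slow_feature_analysis(X, n_features=1):
    """Classical linear SFA (Wiskott & Sejnowski): project X (T x D)
    onto the directions whose outputs vary most slowly over time."""
    Xc = X - X.mean(axis=0)                    # center
    eigval, eigvec = np.linalg.eigh(np.cov(Xc, rowvar=False))
    Z = Xc @ (eigvec / np.sqrt(eigval))        # whiten via PCA
    # Slowest directions = smallest eigenvalues of the covariance
    # of the temporal differences of the whitened signal.
    dval, dvec = np.linalg.eigh(np.cov(np.diff(Z, axis=0), rowvar=False))
    return Z @ dvec[:, :n_features]

# Toy data: channel 0 mixes a slow and a fast sinusoid, channel 1 is slow.
t = np.linspace(0.0, 2.0 * np.pi, 1000)
X = np.column_stack([np.sin(t) + 0.3 * np.sin(20.0 * t), np.cos(t)])
slow = slow_feature_analysis(X, n_features=1)  # recovers the slow component
```

The paper's contribution replaces the whitening/expansion stage with myTICA-derived Gabor bases and predicts future bases with LSTMs, but the slowness objective above is the shared core.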


2021 ◽  
pp. 1-12
Author(s):  
Qingxian An ◽  
Ruiyi Zhang ◽  
Yongchang Shen

Data envelopment analysis (DEA) is widely used to evaluate the performance of a group of homogeneous decision making units (DMUs). To account for uncertainty, interval DEA has been introduced to handle a wider range of situations. In this paper, an interval efficiency method based on the slacks-based measure is proposed to solve uncertain problems in DEA. First, the maximum and minimum efficiency values of the evaluated DMU are calculated from the furthest and closest distances, respectively, from the evaluated DMU to its projection points on the Pareto-efficient frontier. Then, the AHP method is used for the full ranking of DMUs. The pairwise comparison relationship between each pair of DMUs is used to construct an interval multiplicative preference relations (IMPRs) matrix. If this matrix does not meet the consistency condition, a method for obtaining consistent IMPRs is introduced. From the consistent judgment matrix, the full ranking of DMUs is obtained. Finally, the method is applied to the performance evaluation of 12 tourist hotels in Taipei in 2019.
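The consistency condition for pairwise comparison matrices can be illustrated with the classical (crisp) AHP consistency ratio, a simpler relative of the interval (IMPR) condition used in the paper. This sketch follows Saaty's standard definitions and does not reproduce the paper's interval method:

```python
import numpy as np

# Saaty's random consistency index for matrix orders 1..5.
RANDOM_INDEX = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}

def ahp_consistency_ratio(A):
    """Consistency ratio (CR) of a positive reciprocal pairwise
    comparison matrix A; CR < 0.1 is the usual acceptance threshold."""
    n = A.shape[0]
    lam_max = np.linalg.eigvals(A).real.max()  # principal eigenvalue
    ci = (lam_max - n) / (n - 1)               # consistency index
    ri = RANDOM_INDEX[n]
    return 0.0 if ri == 0.0 else ci / ri

# A perfectly consistent matrix (a_ij = w_i / w_j) has CR ~ 0.
w = np.array([0.5, 0.3, 0.2])
consistent = np.outer(w, 1.0 / w)
# Breaking transitivity (a_13 != a_12 * a_23) raises the CR.
inconsistent = np.array([[1.0, 2.0, 9.0],
                         [0.5, 1.0, 1.0],
                         [1 / 9.0, 1.0, 1.0]])
```

In the interval setting, each entry is a range rather than a point, so the paper instead tests a consistency condition on the interval bounds and repairs the matrix when it fails; the crisp CR above conveys the underlying idea.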


Quantum ◽  
2021 ◽  
Vol 5 ◽  
pp. 524
Author(s):  
Veronika Baumann ◽  
Flavio Del Santo ◽  
Alexander R. H. Smith ◽  
Flaminia Giacomini ◽  
Esteban Castro-Ruiz ◽  
...  

The quantum measurement problem can be regarded as the tension between the two alternative dynamics prescribed by quantum mechanics: the unitary evolution of the wave function and the state-update rule (or "collapse") at the instant a measurement takes place. The notorious Wigner's friend gedankenexperiment constitutes the paradoxical scenario in which different observers (one of whom is observed by the other) describe one and the same interaction differently: the Friend via the state-update rule and Wigner unitarily. This can lead to Wigner and his Friend assigning different probabilities to the outcome of the same subsequent measurement. In this paper, we apply the Page-Wootters mechanism (PWM) as a timeless description of Wigner's-friend-like scenarios. We show that the standard rules for assigning two-time conditional probabilities within the PWM must be modified to deal with the Wigner's friend gedankenexperiment. We identify three main definitions of such modified rules, all of which reduce to standard quantum theory for non-Wigner's-friend scenarios. However, when applied to the Wigner's friend setup, each rule assigns different conditional probabilities, potentially resolving the probability-assignment paradox in a different manner. Moreover, one rule imposes strict limits on when a joint probability distribution for the measurement outcomes of Wigner and his Friend is well defined, which single out those cases where Wigner's measurement does not disturb the Friend's memory and such a probability has an operational meaning in terms of collectible statistics. Interestingly, the same limits guarantee that these measurement outcomes fulfill the consistency condition of the consistent histories framework.


Circuit World ◽  
2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
George Thiel ◽  
Flavio Griggio ◽  
Sanjay Tiku

Purpose
The purpose of this paper is to describe a novel methodology for predicting reliability for consumer electronics, or any other hardware system, that experiences a complex life-cycle environmental profile.

Design/methodology/approach
This Physics-of-Failure-based three-step methodology can be used to predict the degradation rate of a population using a Monte Carlo approach. The three steps are: (1) an empirical equation describing the degradation of a performance metric, (2) a degradation consistency condition and (3) a technique to account for cumulative degradation across multiple life-cycle stress conditions (e.g. temperature, voltage, mechanical load).

Findings
Two case studies are provided to illustrate the methodology, including one related to repeated touch-load-induced artifacts in displays.

Originality/value
This novel methodology can be applied to a wide range of applications, from mechanical systems to electrical circuits. The results can be fed into several stages of engineering validation to speed up product qualification.
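The cumulative-degradation idea behind such a Monte Carlo approach can be sketched generically: draw unit-to-unit degradation rates from a distribution and accumulate damage across life-cycle stages. The lognormal rate model, the stage parameters, and the failure threshold below are illustrative assumptions, not the paper's empirical model or its consistency condition:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_population(n_units, stages):
    """Monte Carlo sketch of cumulative degradation over life-cycle
    stages. Each stage is (median_rate, sigma, hours): every unit draws
    a lognormally distributed degradation rate, and damage accumulates
    linearly across stages."""
    total = np.zeros(n_units)
    for median_rate, sigma, hours in stages:
        rates = rng.lognormal(np.log(median_rate), sigma, n_units)
        total += rates * hours             # cumulative damage per unit
    return total

# Hypothetical stress stages: (median rate per hour, spread, duration).
stages = [(1e-4, 0.3, 500),    # storage / transport
          (5e-4, 0.5, 2000),   # normal field use
          (2e-3, 0.4, 100)]    # high-stress events
deg = simulate_population(10_000, stages)
fail_fraction = float(np.mean(deg > 1.5))  # units past a failure threshold
```

A real application would replace the linear accumulation with the paper's empirical degradation equation and enforce its consistency condition when combining stages.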


Author(s):  
Bo Yin ◽  
Johannes Storm ◽  
Michael Kaliske

Abstract The promising phase-field method has been intensively studied for crack approximation in brittle materials. The realistic representation of material degradation at a fully evolved crack remains one of the main challenges. Several energy-split formulations have been postulated to describe crack evolution physically. A recent approach based on the concept of representative crack elements (RCE) in Storm et al. (The concept of representative crack elements (RCE) for phase-field fracture: anisotropic elasticity and thermo-elasticity. Int J Numer Methods Eng 121:779–805, 2020) introduces a variational framework to derive kinematically consistent material degradation. The realistic material degradation is further tested using the self-consistency condition, in particular by comparison to a discrete crack model. This work extends brittle RCE phase-field modeling towards rate-dependent fracture evolution in a viscoelastic continuum. The novelty of this paper is that internal variables due to viscoelasticity are taken into account to determine the crack deformation state. A transient extension of Storm et al. (2020) is also considered. The model is derived in a thermodynamically consistent manner and implemented into the FE framework. Several representative numerical examples are investigated, and the resulting findings and potential perspectives are discussed to close the paper.


2021 ◽  
Vol 10 (1) ◽  
Author(s):  
Daniel Mayerson ◽  
Masaki Shigemori

We quantize the D1-D5-P microstate geometries known as superstrata directly in supergravity. We use Rychkov's consistency condition [hep-th/0512053], which was derived for the D1-D5 system; for superstrata, this condition turns out to be strong enough to fix the symplectic form uniquely. For the (1,0,n) superstrata, we further confirm this quantization by a bona fide explicit computation of the symplectic form using the semi-classical covariant quantization method in supergravity. We use the resulting quantizations to count the known supergravity superstrata states, finding agreement with previous counts that the number of these states grows parametrically more slowly than that of the corresponding black hole.


2021 ◽  
Vol 2021 (1) ◽  
Author(s):  
Kehan Si ◽  
Zhen Wu

Abstract This paper studies a controlled backward-forward linear-quadratic-Gaussian (LQG) large-population system in Stackelberg games. The leader agent has a backward state and the follower agents have forward states. The leader agent is dominant, as its state enters those of the follower agents. On the other hand, the state-average of all follower agents affects the cost functional of the leader agent. In reality, the leader and the followers may represent two typical types of participants involved in market price formation: the supplier and the producers. This differs from the standard mean-field game (MFG) literature and is mainly due to the Stackelberg structure here. By variational analysis, the consistency condition system can be represented by some fully coupled backward-forward stochastic differential equations (BFSDEs) with a high-dimensional block structure in an open-loop sense. Next, we discuss the well-posedness of such a BFSDE system by virtue of the contraction mapping method. Consequently, we obtain the decentralized strategies for the leader and follower agents, which are proved to satisfy the ε-Nash equilibrium property.


Author(s):  
Tangliu Wen ◽  
Jie Peng ◽  
Jinyun Xue ◽  
Zhen You ◽  
Lan Song

Linearizability is a commonly accepted consistency condition for concurrent objects. Filipović et al. showed that linearizability is equivalent to observational refinement. However, linearizability does not permit concurrent objects to share memory with their client programs. We show that linearizability (or observational refinement) can be broken even when a client program of an object accesses the shared memory without interleaving with the object's methods. In this paper, we present strict linearizability, which lifts this limitation and can ensure client-side trace and final-state equivalence even in a relaxed program model that allows clients to directly access the internal states of concurrent objects. We also investigate several important properties of strict linearizability. At a high level of abstraction, a concurrent object can be viewed as a concurrent implementation of an abstract data type (ADT). We further present a correctness criterion relating an ADT and its concurrent implementation, which combines linearizability with data abstraction and can ensure observational equivalence; we also investigate its relationship with strict linearizability.
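For a small history, linearizability can be checked by brute force: search for a sequential order of the operations that respects real-time precedence and the object's sequential specification. The sketch below does this for a single shared register; the representation and names are illustrative, not the formalism of the paper:

```python
from itertools import permutations

# An operation on a single shared register:
# (invoke_time, response_time, kind, value), kind in {"write", "read"}.
def linearizable(history):
    """Brute-force linearizability check: is there a sequential order of
    the operations that respects real-time precedence and the register's
    sequential specification (a read returns the latest written value)?"""
    def respects_real_time(order):
        # If b responded before a was invoked, a must not precede b.
        return not any(b[1] < a[0]
                       for i, a in enumerate(order)
                       for b in order[i + 1:])

    def legal(order):
        current = None                     # initial register value
        for _, _, kind, value in order:
            if kind == "write":
                current = value
            elif value != current:         # read must see the latest write
                return False
        return True

    return any(respects_real_time(p) and legal(p)
               for p in permutations(history))

# A write of 1 overlapping a read that returns 1: linearizable.
ok = linearizable([(0, 2, "write", 1), (1, 3, "read", 1)])
# The read returns 2, which was never written: not linearizable.
bad = linearizable([(0, 2, "write", 1), (1, 3, "read", 2)])
```

The scenario the paper targets is exactly what this model excludes: here clients interact with the register only through its operations, whereas strict linearizability also constrains clients that bypass the methods and read the object's internal state directly.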


2021 ◽  
Vol 31 (1) ◽  
pp. 271-290
Author(s):  
Aivaras Stepukonis

The article examines and criticizes Paul Karl Feyerabend's seminal work "How to Be a Good Empiricist—A Plea for Tolerance in Matters Epistemological", which persuasively argued for a pluralistic view of scientific knowledge and theoretical truth. Throughout the article, a number of polemical points, analytic elaborations, and broader philosophical concerns are raised regarding the notions of the consistency condition, meaning invariance, theoretical alternatives, and the very principle of theoretical pluralism. The article concludes that Feyerabend's call for a plurality of theories as the surest path to the progress of science is in need of numerous conceptual qualifications, provoking the reader into critical thinking about the deeper underpinnings of science while providing very few ready-made answers to the problems enunciated.

