original result
Recently Published Documents

TOTAL DOCUMENTS: 126 (FIVE YEARS: 73) ◽ H-INDEX: 9 (FIVE YEARS: 1)

2022 ◽ Vol 184 (1) ◽ pp. 1-47
Author(s): Pierre Ganty, Elena Gutiérrez, Pedro Valero

We provide new insights on the determinization and minimization of tree automata using congruences on trees. From this perspective, we study a Brzozowski-style minimization algorithm for tree automata. First, we prove this method correct, relying on the following fact: when the automata-based and the language-based congruences coincide, determinizing the automaton yields the minimal one. Such automata-based congruences, in the case of word automata, are defined using pre and post operators. We then extend these operators to tree automata, a task that is particularly challenging due to the reduced expressive power of deterministic top-down (or, equivalently, co-deterministic bottom-up) automata. Finally, we leverage our framework to offer an extension of Brzozowski's original result for word automata.
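For context, the word-automaton construction being extended minimizes by determinizing the reversal twice: the minimal DFA for L(A) is det(rev(det(rev(A)))), keeping only accessible states. A minimal Python sketch of this classical algorithm (the NFA representation as a (states, alphabet, delta, initials, finals) tuple is our choice, not the paper's):

```python
from itertools import chain

def reverse(nfa):
    """Reverse an NFA: flip every edge and swap initial/final states."""
    states, alphabet, delta, initials, finals = nfa
    rdelta = {}
    for (q, a), targets in delta.items():
        for t in targets:
            rdelta.setdefault((t, a), set()).add(q)
    return (states, alphabet, rdelta, set(finals), set(initials))

def determinize(nfa):
    """Subset construction, keeping only accessible subsets."""
    states, alphabet, delta, initials, finals = nfa
    start = frozenset(initials)
    dstates, ddelta, work = {start}, {}, [start]
    while work:
        S = work.pop()
        for a in alphabet:
            T = frozenset(chain.from_iterable(delta.get((q, a), ()) for q in S))
            ddelta[(S, a)] = {T}
            if T not in dstates:
                dstates.add(T)
                work.append(T)
    return (dstates, alphabet, ddelta, {start}, {S for S in dstates if S & finals})

def brzozowski_minimize(nfa):
    """Brzozowski's minimization: determinize the reversal, twice."""
    return determinize(reverse(determinize(reverse(nfa))))
```

The paper's contribution is precisely why the second determinization yields the minimal automaton once the automata-based and language-based congruences coincide, and how the pre/post operators behind those congruences can be extended to trees.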


2021 ◽ Vol 49 (4) ◽ pp. 24-62
Author(s): K. V. Lebedev, B. N. Filyushkin, N. G. Kozhelupova

Peculiarities of the spatial distribution of the Red Sea and Persian Gulf waters in the northwestern Indian Ocean have been investigated based on the Argo float measurement database; 27,128 temperature and salinity profiles were taken into account. To process these data, we used the Argo Model for Investigation of the Global Ocean (AMIGO). This technique allowed us, for the first time, to obtain a complete set of oceanographic characteristics down to a depth of 2000 m for different averaging intervals (month, season, years). A joint analysis of the variability of hydrological characteristics in the 0-500 m layer during the summer monsoon clearly showed the influence of the Somali Current on the water dynamics of this region: the formation of the largest anticyclone (the Great Whirl), coastal upwelling zones, and the redistribution of water masses in the Gulf of Oman and the Arabian Sea. The main influence on the formation of the temperature and salinity fields is exerted by the Persian Gulf waters. The same analysis for the 600-1000 m layer showed the year-round role of the outflow of Red Sea waters from the Gulf of Aden in the formation of deep waters in this area. Finally, at depths of 1000-1500 m a deep anticyclonic eddy is formed, whose southern branch, moving westward, reaches Africa near 7°N and turns south as a narrow stream of Red Sea waters, then crosses the equator and reaches 15°S. An original result was obtained for the temporal characteristics of the Somali Current: the time of its formation, its transport values, and its lifetime (according to model estimates of the data for seven years within the 1960–1996 period).


2021
Author(s): Philipp Schönegger, Steven Verheyen

Over the past decades, psychology and its cognate disciplines have undergone substantial reform, ranging from advances in statistical methodology to significant changes in academic norms. One aspect of experimental design that has received comparatively little attention is incentivisation, i.e. the way participants are monetarily rewarded for their participation. While incentive-compatible designs are in use in disciplines like economics, the majority of studies in psychology and experimental philosophy are constructed such that individuals' incentives to maximise their payoffs in many cases counteract their incentives to state their true preferences, partly because the subject matter is often self-report data about subjective topics. One mechanism that allows for an incentive-compatible design in such circumstances is the Bayesian Truth Serum (Prelec, 2004), which rewards participants based on how surprisingly common their answers are. Recently, Schoenegger (2021) applied this mechanism to Likert-scale self-reports, finding that its introduction significantly altered response behaviour. In this registered report, we further investigate this mechanism by (i) replicating the original result and (ii) teasing out whether the effect may be explained by an increase in expected earnings or by the addition of a prediction task. We take this project to help introduce incentivisation mechanisms into fields where they have not been widely used before.
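For concreteness, the BTS mechanism asks each participant for an answer and for a prediction of the population's answer distribution, then pays an information score, the log-ratio of how common the participant's answer actually is to how common it was collectively predicted to be (hence "surprisingly common"), plus a prediction score penalising inaccurate forecasts. A minimal NumPy sketch of the basic scoring rule from Prelec (2004) (variable names are ours):

```python
import numpy as np

def bts_scores(answers, predictions, alpha=1.0):
    """Bayesian Truth Serum scores (basic form of Prelec, 2004).

    answers:     length-n array of ints; answers[r] is respondent r's
                 chosen option in {0, ..., m-1}
    predictions: (n, m) array; predictions[r] is r's forecast of the
                 population's answer distribution (rows sum to 1)
    """
    answers = np.asarray(answers)
    predictions = np.asarray(predictions, dtype=float)
    n, m = predictions.shape
    eps = 1e-9  # guards against log(0)

    # Empirical answer frequencies and geometric-mean predicted frequencies.
    xbar = np.bincount(answers, minlength=m) / n
    log_ybar = np.log(predictions + eps).mean(axis=0)

    # Information score: how "surprisingly common" one's own answer is.
    info = np.log(xbar[answers] + eps) - log_ybar[answers]
    # Prediction score: relative-entropy penalty for inaccurate forecasts.
    pred = alpha * (xbar * (np.log(predictions + eps) - np.log(xbar + eps))).sum(axis=1)
    return info + pred
```

Under Prelec's assumptions, truth-telling maximises expected score, which is what makes the mechanism applicable even to subjective self-report items with no verifiable ground truth.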


Energies ◽ 2021 ◽ Vol 14 (23) ◽ pp. 7937
Author(s): Zenon Zwierzewicz, Lech Dorobczyński, Jarosław Artyszuk

This paper looks at a typical problem encountered when designing an automatic ship's course stabilisation system, using a relatively new methodology referred to as Active Disturbance Rejection Control (ADRC). The main advantage of this approach over classic PID-based autopilots, which are still in the majority, is that it eliminates the tuning problem and thus ensures much better average performance of the ship under the various speed, loading, nautical and weather conditions encountered during a voyage. All of these factors call for different, often dynamically varying autopilot parameters, which are difficult to assess, especially for the ship's crew or owner. The original result of this article is that the required controller parameters are approximated from a canonical model structure and an analysis of the hydrodynamic properties of a wide class of ships. Another novelty is the use of a fully verified, realistic numerical hydrodynamic model of the ship both as a simulation model and as a basis for deriving a simplified model structure suitable for controller design. The preliminary results indicate good performance of the proposed ADRC autopilot and offer prospects for its successful implementation on a real ship.
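To make the ADRC idea concrete: heading dynamics are treated as psi_ddot = f + b0*u, where f lumps all unmodelled hydrodynamics and external disturbances, an extended state observer (ESO) estimates f online, and the control law cancels it. A minimal sketch of a generic linear ADRC loop with bandwidth parameterisation (this is the textbook scheme, not the authors' ship-specific design; b0 and the two bandwidths are the only knobs):

```python
import numpy as np

def make_adrc(b0, w_obs, w_ctrl, dt):
    """Generic second-order linear ADRC for psi_ddot = f + b0*u.
    Observer poles placed at -w_obs, controller poles at -w_ctrl."""
    l1, l2, l3 = 3 * w_obs, 3 * w_obs**2, w_obs**3  # ESO gains
    kp, kd = w_ctrl**2, 2 * w_ctrl                  # PD gains
    z = np.zeros(3)  # estimates of [psi, psi_dot, f]

    def step(psi_meas, psi_ref, u_prev):
        e = psi_meas - z[0]
        # Euler-discretised extended state observer update.
        z[:] = z + dt * np.array([z[1] + l1 * e,
                                  z[2] + b0 * u_prev + l2 * e,
                                  l3 * e])
        # Cancel the estimated total disturbance z[2], then PD on the rest.
        u0 = kp * (psi_ref - z[0]) - kd * z[1]
        return (u0 - z[2]) / b0

    return step
```

The practical appeal, as the abstract notes, is that a rough gain estimate b0 and two bandwidths replace the speed- and load-dependent gain schedules a PID autopilot would need.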


Entropy ◽ 2021 ◽ Vol 23 (11) ◽ pp. 1436
Author(s): John C. Baez

Suppose we have n different types of self-replicating entity, with the population P_i of the ith type changing at a rate equal to P_i times the fitness f_i of that type. Suppose the fitness f_i is any continuous function of all the populations P_1, …, P_n. Let p_i be the fraction of replicators that are of the ith type. Then p = (p_1, …, p_n) is a time-dependent probability distribution, and we prove that its speed as measured by the Fisher information metric equals the variance in fitness. In rough terms, this says that the speed at which information is updated through natural selection equals the variance in fitness. This result can be seen as a modified version of Fisher's fundamental theorem of natural selection. We compare it to Fisher's original result as interpreted by Price, Ewens and Edwards.
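The identity follows in a few lines from the replicator equation and the Fisher metric on the probability simplex; a sketch of the computation (our transcription, using the setup above):

```latex
% Normalising the populations gives the replicator equation:
\dot p_i \;=\; \frac{d}{dt}\,\frac{P_i}{\sum_j P_j}
        \;=\; p_i\,\bigl(f_i - \bar f\bigr),
\qquad \bar f \;=\; \sum_j p_j f_j .
% The Fisher information metric assigns a tangent vector v the squared
% length \sum_i v_i^2 / p_i, so the squared speed of p(t) is
\sum_i \frac{\dot p_i^2}{p_i}
  \;=\; \sum_i p_i\,\bigl(f_i - \bar f\bigr)^2
  \;=\; \operatorname{Var}(f).
```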


2021 ◽ Vol 2021 (6)
Author(s): M. Bruno, M. T. Hansen

Abstract We discuss a method to construct hadronic scattering and decay amplitudes from Euclidean correlators, by combining the approach of a regulated inverse Laplace transform with the work of Maiani and Testa [1]. Revisiting the original result of ref. [1], we note that its key observation, namely that only threshold scattering information can be extracted at large separations, can be understood by interpreting the correlator as a spectral function, ρ(ω), convoluted with the Euclidean kernel, e^(−ωt), which is sharply peaked at threshold. We therefore consider a modification in which a smooth step function, equal to one above a target energy, is inserted in the spectral decomposition. This can be achieved either through Backus-Gilbert-like methods or more directly using the variational approach. The result is a shifted resolution function, such that the large-t limit projects onto scattering or decay amplitudes above threshold. The utility of this method is highlighted through large-t expansions of both three- and four-point functions that include leading terms proportional to the real and imaginary parts (separately) of the target observable. This work also presents new results relevant for the unmodified correlator at threshold, including expressions for extracting the Nπ scattering length from four-point functions and a new strategy for organizing the large-t expansion that exhibits better convergence than the expansion in powers of 1/t.
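Schematically, in our notation, the modification inserts a smooth step into the spectral representation of the correlator:

```latex
C(t) \;=\; \int_0^\infty \! d\omega\, \rho(\omega)\, e^{-\omega t}
\quad\longrightarrow\quad
C_\Theta(t) \;=\; \int_0^\infty \! d\omega\, \rho(\omega)\,
  \Theta_\sigma(\omega - \bar\omega)\, e^{-\omega t},
```

with the smooth step Θ_σ tending to one above the target energy ω̄ and to zero below it, so that the large-t limit of C_Θ is dominated by states just above ω̄ rather than by threshold.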


2021 ◽ Vol 2021 ◽ pp. 1-17
Author(s): Feng Jiang, Minghai Li, Jiayan Wen, Zedan Tan, Wenyun Zhou

In this work, a suitable volumetric efficiency is regarded as very important for a gasoline engine to achieve energy saving and emission reduction, so the intake system characteristics, such as intake manifold length and diameter, volumetric efficiency, and valve phase, are investigated in detail. To optimize the performance of the engine intake system, an optimization model of the intake system is developed in GT-Power coupled with MATLAB/Simulink and validated against experimental results under different full-load conditions. The engine power, torque, and brake-specific fuel consumption (BSFC) are defined as the result variables of the optimization model, and the length and diameter of the intake manifold are defined as its independent variables. The results show that the length of the intake manifold has little influence on engine power and BSFC overall, but a great impact on the performance indices at high speed. In addition, the engine volumetric efficiency is highest when the intake manifold length is in the range of 240-250 mm. The BSFC improvement obtained with variable valve timing is significant compared with the original result. Finally, suggestions for enhancing the performance of the gasoline engine are proposed.
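As an illustration of the setup only, the optimization couples the result variables (power, torque, BSFC) to the independent variables (manifold length and diameter). In the sketch below, engine_sim is a hypothetical toy surrogate standing in for the coupled GT-Power/Simulink model; all numbers and weights are illustrative, not the paper's:

```python
from scipy.optimize import minimize

def engine_sim(length_mm, diameter_mm):
    """Toy surrogate standing in for the GT-Power/Simulink model.
    Volumetric efficiency peaks near a 245 mm manifold (cf. the abstract);
    every coefficient here is made up for illustration."""
    ve = 0.95 - 1e-5 * (length_mm - 245.0)**2 - 5e-4 * (diameter_mm - 42.0)**2
    return 80.0 * ve, 150.0 * ve, 260.0 / ve  # power [kW], torque [Nm], BSFC [g/kWh]

def objective(x):
    power, torque, bsfc = engine_sim(*x)
    # Minimise fuel consumption while rewarding power and torque
    # (illustrative weights, not the paper's).
    return bsfc - 0.1 * power - 0.05 * torque

result = minimize(objective, x0=[220.0, 38.0],
                  bounds=[(150.0, 350.0), (25.0, 60.0)], method="L-BFGS-B")
print(result.x)  # optimised [length_mm, diameter_mm]
```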


Author(s): Francisco J. Palomo, Alfonso Romero

By means of a counter-example, we show that the Reilly theorem for the upper bound of the first non-trivial eigenvalue of the Laplace operator of a compact submanifold of Euclidean space (Reilly, 1977, Comment. Math. Helvetici, 52, 525–533) does not work for a (codimension ⩾ 2) compact spacelike submanifold of Lorentz–Minkowski spacetime. In the search for an alternative result, it should be noted that the original technique of Reilly (1977) is not applicable to a compact spacelike submanifold of Lorentz–Minkowski spacetime. In this paper, a new technique, based on an integral formula on a compact spacelike section of the light cone in Lorentz–Minkowski spacetime, is developed. The technique is genuine to our setting; that is, it cannot be extended to other semi-Euclidean spaces of higher index. As a consequence, a family of upper bounds for the first eigenvalue of the Laplace operator of a compact spacelike submanifold of Lorentz–Minkowski spacetime is obtained. The equality case of one of these inequalities is geometrically characterized: the eigenvalue achieves this upper bound if and only if the compact spacelike submanifold lies minimally in a hypersphere of a certain spacelike hyperplane. Along the way, Reilly's original result is reproved when a compact submanifold of Euclidean space is naturally seen as a compact spacelike submanifold of Lorentz–Minkowski spacetime through a spacelike hyperplane.
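For reference, the Euclidean statement being extended is Reilly's inequality (our transcription): for a compact n-dimensional submanifold M of Euclidean space with mean curvature vector H,

```latex
\lambda_1(M) \;\le\; \frac{n}{\operatorname{vol}(M)} \int_M |H|^2 \, dV,
```

with equality exactly when M is minimally immersed in a hypersphere, which is the case that the present paper's characterisation parallels in the Lorentzian setting.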


Author(s): Brian D. Earp

This chapter evaluates falsification. Contemporary philosophers of science tend to look down on falsifiability as overly simplistic. Nevertheless, among many practising scientists, the notion is still regarded as a useful, if imperfect, heuristic for judging the strength of a hypothesis in terms of its ability to generate new insights when combined with careful observation. Falsification also relates to self-correction in science. Often, erroneous findings make their way into the literature. If subsequent researchers conduct the same experiment as the original and it fails to yield the same finding, they are often described as having ‘falsified’ the original result, that is, as having shown it to be incorrect. In this way, mistakes, false alarms, and other non-reproducible outputs are thought to be identifiable and thus correctable. For self-correction in science through falsification, what is needed are ‘direct’ replications. The chapter then considers the importance of auxiliary assumptions.


2020 ◽ Vol 2 (2) ◽ pp. 109-114
Author(s): Nadia Fariza Rizky, Surya Darma Nasution, Fadlina Fadlina

Large file sizes are a problem not only for storage but also for communication between computers: data of a larger size takes longer to transfer than data of a smaller size. Data compression can overcome this problem. Compression that reduces the size of the data brings the advantages of reducing the use of external storage media and accelerating data transfer between storage media; the method used in this study is the Elias delta codes algorithm. Compression is a method used to reduce the bits of the original data to produce a new, smaller result; here it is realised with the Elias delta codes algorithm. After obtaining the compression results, the design is implemented in Microsoft Visual Studio 2008.
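For reference, the Elias delta code of a positive integer n writes the bit-length N of n in Elias gamma code (that is, the bit-length of N minus one zeros, then N in binary), followed by the bits of n without their leading 1. A small Python sketch (ours, not the paper's Visual Studio implementation):

```python
def elias_delta_encode(n: int) -> str:
    """Elias delta codeword for a positive integer n, as a bit string."""
    if n < 1:
        raise ValueError("Elias delta is defined for positive integers")
    bits = bin(n)[2:]       # binary representation of n
    length_bits = bin(len(bits))[2:]
    gamma = "0" * (len(length_bits) - 1) + length_bits  # gamma code of N
    return gamma + bits[1:]  # gamma(N), then n without its leading 1

def elias_delta_decode(code: str) -> int:
    """Inverse: read the gamma-coded length, then reconstruct n."""
    zeros = 0
    while code[zeros] == "0":
        zeros += 1
    length = int(code[zeros:2 * zeros + 1], 2)       # N, from the gamma code
    rest = code[2 * zeros + 1:2 * zeros + length]    # the N-1 remaining bits
    return int("1" + rest, 2)

# e.g. elias_delta_encode(10) == "00100010", and decoding round-trips it.
```

Small integers thus get short codewords, which is why the scheme compresses data dominated by small values.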

