Computing the Exact Number of Similarity Classes in the Longest Edge Bisection of Tetrahedra

Mathematics ◽  
2021 ◽  
Vol 9 (12) ◽  
pp. 1447
Author(s):  
Jose P. Suárez ◽  
Agustín Trujillo ◽  
Tania Moreno

Whether the longest-edge (LE) bisection of tetrahedral meshes degrades the stability condition remains an open problem, in part because of the cost of computing the similarity classes of millions of tetrahedra. We prove the existence of tetrahedra for which LE bisection introduces at most 37 similarity classes. This family of tetrahedra was roughly sketched by Adler in 1983; however, as far as we know, no evidence confirming its existence had been given until now. We also introduce a new data structure and algorithm for counting tetrahedral similarity classes, based on integer arithmetic and storing only the squares of the edge lengths. The algorithm lets us perform compact and efficient high-level similarity-class computations at a cost that depends only on the number of similarity classes.
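The abstract does not spell out the data structure, but the core idea of an integer, scale-free key for a similarity class can be sketched as follows. This is a minimal illustration assuming a tetrahedron is stored as the integer squared lengths of its six edges; the gcd normalization and the set of canonical keys are our own stand-ins, not the authors' algorithm.

```python
from itertools import permutations
from math import gcd
from functools import reduce

# A tetrahedron is stored by the integer squared lengths of its six edges,
# indexed by the vertex pair each edge joins.
EDGE_PAIRS = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]

def similarity_key(sq_edges):
    """Canonical key for the similarity class of a tetrahedron.

    Dividing by the gcd removes scale; minimizing over all 24 vertex
    relabelings removes the labeling, so two similar tetrahedra with
    integer squared edges receive the same key.
    """
    g = reduce(gcd, sq_edges.values())
    normalized = {p: v // g for p, v in sq_edges.items()}
    best = None
    for perm in permutations(range(4)):
        relabeled = tuple(
            normalized[tuple(sorted((perm[a], perm[b])))] for a, b in EDGE_PAIRS
        )
        if best is None or relabeled < best:
            best = relabeled
    return best

# Distinct classes seen during a bisection run would be collected in a set,
# so the bookkeeping grows with the number of classes, not of tetrahedra.
classes = set()
classes.add(similarity_key({p: 4 for p in EDGE_PAIRS}))  # a regular tetrahedron
print(len(classes))  # -> 1
```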

Algorithms ◽  
2021 ◽  
Vol 14 (3) ◽  
pp. 97
Author(s):  
Antoine Genitrini ◽  
Martin Pépin

In the context of combinatorial sampling, the so-called “unranking method” can be seen as a link between a total order over the objects and an effective way to construct an object of a given rank. The most classical order used in this context is the lexicographic order, which corresponds to the familiar word ordering in the dictionary. In this article, we propose a comparative study of four algorithms for the lexicographic unranking of combinations, three of which were introduced decades ago. We start the paper by introducing our new algorithm, whose computation strategy is based on the classical factorial number system (or factoradics). We then present the three other algorithms at a high level. For each algorithm, we analyze its average-case time complexity within a uniform framework and describe its strengths and weaknesses. For about 20 years, such algorithms have been implemented using big-integer rather than bounded-integer arithmetic, which makes the cost of computing some coefficients higher than previously stated. We propose improvements to all implementations that take this fact into account, and we give a detailed complexity analysis validated by experiments. We then show that, even though the algorithms are based on different strategies, they all perform very similar computations. Lastly, we extend our approach to the unranking of other classical combinatorial objects, such as families counted by multinomial coefficients and k-permutations.
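For a concrete point of reference, the classical binomial-counting approach to lexicographic unranking can be sketched in a few lines. This is a generic textbook version of one of the decades-old algorithms the paper revisits, not the new factoradic variant; the function name and 0-based rank convention are ours.

```python
from math import comb

def unrank_combination(n, k, rank):
    """Return the `rank`-th (0-based) k-combination of {0, ..., n-1}
    in lexicographic order."""
    if not 0 <= rank < comb(n, k):
        raise ValueError("rank out of range")
    combination = []
    x = 0                      # smallest candidate element still available
    for remaining in range(k, 0, -1):
        # Skip x while `rank` lies beyond the block of combinations
        # that start with x; each block has comb(n - x - 1, remaining - 1)
        # elements.
        while True:
            block = comb(n - x - 1, remaining - 1)
            if rank < block:
                break
            rank -= block
            x += 1
        combination.append(x)
        x += 1
    return combination

# The ten 3-combinations of {0, ..., 4} in lexicographic order:
print([unrank_combination(5, 3, r) for r in range(10)])
```

Note that the `comb(n - x - 1, remaining - 1)` coefficients are exactly the potentially large values whose big-integer cost the paper argues was previously understated.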


The productivity of land has often been discussed and deliberated by academia and policymakers as a way to understand agriculture; however, very few studies have focused on the productivity of agricultural workers when analyzing this sector. This study concentrates on the productivity of agricultural workers across the states, taking two time points into consideration. Agricultural worker productivity needs to be studied seriously and on a time-series basis, so that not only can the marginal productivity of workers be ascertained, but the dependency of workers on agriculture is also revealed. There is still disguised unemployment in all the states and a high level of labour migration, yet most states showed that this dependency has gone down. A state like Madhya Pradesh is doing very well in terms of income earned, but at the cost of an increased workforce in agriculture, as a result of which worker productivity has gone down. States like Mizoram, Meghalaya, Nagaland and Tripura, though small in size, showed remarkable growth in productivity, and all of them showed a positive trend of workers shifting away from agriculture. The traditional states that gained the most from the Green Revolution of the sixties are performing decently well, but they need the next major policy push to move to the next orbit of growth.


2020 ◽  
Vol 12 (7) ◽  
pp. 2767 ◽  
Author(s):  
Víctor Yepes ◽  
José V. Martí ◽  
José García

The optimization of the cost and CO2 emissions of earth-retaining walls is of relevance, since these structures are often used in civil engineering. The optimization of cost is essential for the competitiveness of a construction company, while the optimization of emissions matters for the environmental impact of construction. To address the optimization, a black hole metaheuristic was used, along with a discretization mechanism based on min-max normalization. The stability of the algorithm was evaluated with respect to the solutions obtained, and the steel and concrete quantities obtained in both optimizations were analyzed. Additionally, the geometric variables of the structure were compared. Finally, the results were compared with those of another algorithm that solved the problem. The results show a trade-off between the use of steel and concrete: solutions that minimize CO2 emissions rely on concrete more heavily than those that optimize cost. When comparing the geometric variables, most remain similar in both optimizations except for the distance between buttresses. The comparison with the other algorithm shows good performance of the black hole algorithm on this optimization.
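The abstract does not include the algorithm itself; below is a generic sketch of the black hole metaheuristic (after Hatamlou's 2013 formulation) together with a min-max-normalization discretization step. The objective function, parameter values, and names are illustrative stand-ins, not the paper's wall cost or emissions model.

```python
import random

def black_hole_minimize(objective, bounds, n_stars=20, iters=300):
    """Generic black hole metaheuristic: the best star acts as the black
    hole, pulls the others toward it, and swallows stars that cross the
    event horizon. `bounds` is a list of (low, high) pairs per variable."""
    rand_star = lambda: [random.uniform(lo, hi) for lo, hi in bounds]
    stars = [rand_star() for _ in range(n_stars)]
    best, best_cost = None, float("inf")
    for _ in range(iters):
        costs = [objective(s) for s in stars]
        i_bh = min(range(n_stars), key=costs.__getitem__)
        if costs[i_bh] < best_cost:
            best, best_cost = stars[i_bh][:], costs[i_bh]
        bh = stars[i_bh]
        horizon = costs[i_bh] / (sum(costs) + 1e-12)  # event-horizon radius
        for i in range(n_stars):
            if i == i_bh:
                continue
            # Pull the star toward the black hole.
            stars[i] = [x + random.random() * (b - x)
                        for x, b in zip(stars[i], bh)]
            # A star crossing the horizon is replaced by a fresh random one.
            dist = sum((x - b) ** 2 for x, b in zip(stars[i], bh)) ** 0.5
            if dist < horizon:
                stars[i] = rand_star()
    return best, best_cost

def to_discrete(x, lo, hi, options):
    """Min-max normalization of x in [lo, hi] onto a catalogue of discrete
    options (e.g. admissible bar diameters or wall thicknesses)."""
    t = (x - lo) / (hi - lo)
    return options[min(int(t * len(options)), len(options) - 1)]

# Toy run on a smooth test function standing in for the wall cost model.
sol, cost = black_hole_minimize(lambda v: sum(x * x for x in v),
                                [(-5.0, 5.0)] * 3)
print(sol, cost)
# e.g. to_discrete(sol[0], -5.0, 5.0, [12, 16, 20, 25]) picks a catalogue value
```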


Buildings ◽  
2019 ◽  
Vol 9 (3) ◽  
pp. 68
Author(s):  
Mankyu Sung

This paper proposes a graph-based algorithm for automatically constructing 3D Korean traditional houses using computer graphics techniques. In particular, we target the most popular traditional house type, the giwa house, whose roof is covered with a set of Korean traditional roof tiles called giwa. In our approach, we divide the whole design process into two parts. At a high level, we propose a special data structure called a ‘modeling graph’. A modeling graph consists of a set of nodes and edges: a node represents a particular component of the house, and an edge represents the connection between two components with all associated parameters, including an offset vector between the components. Users can easily add or delete nodes and connect them with an edge through a few mouse clicks. Once a modeling graph is built, it is interpreted and rendered component by component by traversing its nodes procedurally. At a low level, we specify all the parameters required for constructing the components. Among all the components, the most beautiful but most complicated part is the gently curved roof structure. To represent this sophisticated roof style, we introduce a spline-curve-based modeling technique that can create the curved silhouettes of three different roof styles. In this process, rather than just applying a simple texture image onto the roof, as is common in commercial software, we actually lay out 3D giwa tiles on the roof seamlessly, which produces a more realistic look. Through many experiments, we verified that the proposed algorithm can model and render a giwa house in real time.
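As an illustration of the high-level part, a minimal modeling graph might look like the sketch below: nodes hold component parameters, edges carry offset vectors, and a traversal places each component relative to its parent. All component names, parameters, and the API are hypothetical, not the paper's implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str      # e.g. "foundation", "pillar", "roof"
    params: dict   # component-specific parameters (size, curvature, ...)

@dataclass
class Edge:
    parent: str
    child: str
    offset: tuple  # (dx, dy, dz) offset vector between the two components

@dataclass
class ModelingGraph:
    nodes: dict = field(default_factory=dict)
    edges: list = field(default_factory=list)

    def add_node(self, name, **params):
        self.nodes[name] = Node(name, params)

    def connect(self, parent, child, offset=(0.0, 0.0, 0.0)):
        self.edges.append(Edge(parent, child, offset))

    def build(self, root, position=(0.0, 0.0, 0.0)):
        """Traverse from `root`, placing each component at its parent's
        position plus the edge offset (a stand-in for actual rendering)."""
        print(f"place {root} at {position} with {self.nodes[root].params}")
        for e in self.edges:
            if e.parent == root:
                child_pos = tuple(p + o for p, o in zip(position, e.offset))
                self.build(e.child, child_pos)

g = ModelingGraph()
g.add_node("foundation", width=10.0, depth=8.0)
g.add_node("pillar", height=3.0)
g.add_node("roof", style="hip", curve=0.35)
g.connect("foundation", "pillar", offset=(1.0, 1.0, 0.0))
g.connect("pillar", "roof", offset=(0.0, 0.0, 3.0))
g.build("foundation")
```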


2003 ◽  
Vol 792 ◽  
Author(s):  
V. Aubin ◽  
D. Caurant ◽  
D. Gourier ◽  
N. Baffier ◽  
S. Esnouf ◽  
...  

Progress on separating long-lived fission products from high-level radioactive liquid waste (HLW) has led to the development of specific host matrices, notably for the immobilization of cesium. Hollandite (nominally BaAl2Ti6O16), one of the main phases constituting Synroc, is receiving renewed interest as a specific Cs-host wasteform. The radioactive cesium isotopes comprise the short-lived 134Cs and 137Cs, of high activity, and the long-lived 135Cs, all decaying according to Cs+ → Ba2+ + e− (β−) + γ. Cs-host forms must therefore be resistant to both heat and (β, γ) radiation. The purpose of this study is to estimate the stability of single-phase hollandite under external β and γ radiation, simulating the decay of Cs. A hollandite ceramic of simple composition (Ba1.16Al2.32Ti5.68O16) was irradiated with 1 and 2.5 MeV electrons at different fluences to simulate the β particles emitted by cesium. The generation of point defects was then followed by electron paramagnetic resonance (EPR). All these electron irradiations generated defects of the same nature (oxygen centers and Ti3+ ions) but in proportions varying with electron energy and fluence. Annealing the irradiated samples led to the disappearance of these defects but gave rise to two other types of defects (aggregates of light elements and titanyl ions). Heating at a relatively high temperature (T = 800 °C) is necessary to recover an EPR spectrum similar to that of the pristine material. The stability of the hollandite phase under radioactive cesium irradiation during waste storage is discussed.


2019 ◽  
Vol 33 (6) ◽  
pp. 800-807 ◽  
Author(s):  
Graham W. Charles ◽  
Brian M. Sindel ◽  
Annette L. Cowie ◽  
Oliver G. G. Knox

Field studies were conducted over six seasons to determine the critical period for weed control (CPWC) in high-yielding cotton, using common sunflower as a mimic weed. Common sunflower was planted with or after cotton emergence at densities of 1, 2, 5, 10, 20, and 50 plants m−2. Common sunflower was added and removed at approximately 0, 150, 300, 450, 600, 750, and 900 growing degree days (GDD) after planting. Season-long interference resulted in no harvestable cotton at densities of five or more common sunflower plants m−2. High levels of intraspecific and interspecific competition occurred at the highest weed densities, with increases in weed biomass and reductions in crop yield not proportional to the changes in weed density. Using a 5% yield-loss threshold, the CPWC extended from 43 to 615 GDD and from 20 to 1,512 GDD for one and 50 common sunflower plants m−2, respectively. These results highlight the high level of weed control required in high-yielding cotton to ensure crop losses do not exceed the cost of control.


2021 ◽  
Vol 2 (2) ◽  
pp. 325-334
Author(s):  
Neda Javadi ◽  
Hamed Khodadadi Tirkolaei ◽  
Nasser Hamdan ◽  
Edward Kavazanjian

The stability (longevity of activity) of three crude urease extracts was evaluated in a laboratory study as part of an effort to reduce the cost of urease for applications that do not require high-purity enzyme. A low-cost, stable source of urease will greatly facilitate engineering applications of urease such as biocementation of soil. Inexpensive crude extracts of urease have been shown to be effective at hydrolyzing urea for carbonate precipitation. However, some studies have suggested that the activity of a crude extract may decrease with time, limiting the potential for its mass production for commercial applications. The stability of crude urease extracts shown to be effective for biocementation was studied. The crude extracts were obtained from jack beans via a simple extraction process, stored at room temperature and at 4 °C, and periodically tested to evaluate their stability. To facilitate storage and transportation of the extracted enzyme, the longevity of the enzyme following freeze drying (lyophilization) to reduce the crude extract to a powder and subsequent rehydration into an aqueous solution was evaluated. In an attempt to improve the shelf life of the lyophilized extract, dextran and sucrose were added during lyophilization. The stability of purified commercial urease following rehydration was also investigated. Results of the laboratory tests showed that the lyophilized crude extract maintained its activity during storage more effectively than either the crude extract solution or the rehydrated commercial urease. While incorporating 2% dextran (w/v) prior to lyophilization of the crude extract increased the overall enzymatic activity, it did not enhance the stability of the urease during storage.


Games ◽  
2021 ◽  
Vol 12 (3) ◽  
pp. 53
Author(s):  
Roberto Rozzi

We consider an evolutionary model of social coordination in a 2 × 2 game where two groups of players prefer to coordinate on different actions. Players can pay a cost to learn their opponent’s group; if they pay it, they can condition their action on that group. We assess the long-run stability of outcomes using stochastic stability analysis. We find that three elements matter for equilibrium selection: group size, strength of preferences, and the cost of information. If the cost is too high, players never learn their opponent’s group in the long run. If one group has stronger preferences for its favorite action than the other, or is sufficiently large compared to the other group, every player plays that group’s favorite action. If both groups have strong enough preferences, or if neither group is large enough, players play their own favorite actions and miscoordinate in inter-group interactions. Lower costs favor coordination: when the cost is low, players always coordinate on their favorite action in within-group interactions, while in inter-group interactions they coordinate on the favorite action of the group with stronger preferences or sufficient size.
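To make the three ingredients concrete, the sketch below compares the expected payoff of each strategy for a single group-1 player under uniform random matching. The payoff values, the specific conditioning rule for informed players, and the function itself are our own illustrative assumptions, not the paper's model.

```python
def expected_payoffs(n1, n2, a1, b1, c, p1_A, p2_A):
    """Expected payoffs for a group-1 player (favorite action A, a1 > b1)
    in a population where a share p1_A of group 1 and p2_A of group 2 play A.
    Strategy "info" pays cost c, observes the opponent's group, and plays A
    against group 1 and B against group 2 (one simple conditioning rule);
    miscoordination pays 0.
    """
    n = n1 + n2
    q1, q2 = n1 / n, n2 / n   # chance the opponent belongs to group 1 / 2
    pay_A = q1 * p1_A * a1 + q2 * p2_A * a1            # always play favorite A
    pay_B = q1 * (1 - p1_A) * b1 + q2 * (1 - p2_A) * b1  # always yield to B
    pay_info = q1 * p1_A * a1 + q2 * (1 - p2_A) * b1 - c
    return {"A": pay_A, "B": pay_B, "info": pay_info}

# Equal group sizes, each group mostly playing its own favorite action:
# paying the cost is worthwhile only if c is small enough.
print(expected_payoffs(n1=50, n2=50, a1=3.0, b1=2.0, c=0.2,
                       p1_A=0.9, p2_A=0.1))
```

Varying `n1/n2`, the gap `a1 - b1`, and `c` in this toy calculation reproduces the qualitative dependence on group size, preference strength, and information cost that the stochastic stability analysis identifies.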


2017 ◽  
Vol 41 (S1) ◽  
pp. S430-S431
Author(s):  
Y. Barylnik ◽  
S. Pakhomova ◽  
D. Samoylova ◽  
J. Abrosimova ◽  
E. Kolesnichenko ◽  
...  

Identifying the patterns of neurocognitive disorders in pubertal schizophrenia is a topical problem.

Methods
The Benton Visual Retention Test, forward and backward counting tasks, the Bourdon cancellation test, Wechsler subtests (subtest 11, “Coding”; subtest 12, “Mazes” 1–5), and the Trail Making Test Part A.

Results
All patients were divided into three groups. The first group (schizophrenia) and the second group (other psychiatric disorders) showed worse results than healthy subjects. Qualitative analysis of the Benton test results showed similar difficulties and types of errors in subjects of the first and second groups: ignoring the number of sides of a figure, as well as difficulties in structuring the corners of an image element. The forward and backward counting tasks demonstrated fatigue and attention instability. The Bourdon test showed a high stability index (K = 0.09). The Wechsler subtest “Coding B” yielded poor results, indicating a pathological decrease in visual-motor speed. During the “Mazes 1–5” subtest, subjects of the first and second groups exceeded the allowed time limit, but the pubertal schizophrenia patients of the first group made more gross errors (ignoring the walls of the maze, lifting the pencil despite the given instructions). Groups 1 and 2 performed well on the Trail Making Test Part A: the task caused no difficulties and was carried out in accordance with the instructions.

Conclusions
The observed neurocognitive disorders support the presence of morphological and functional brain changes in endogenous mental illness.

Disclosure of interest
The authors have not supplied their declaration of competing interest.


Author(s):  
Irfan Uddin

The microthreaded many-core architecture comprises multiple clusters of fine-grained multithreaded cores. The management of concurrency is supported in the instruction set architecture of the cores, and the computational work in an application is asynchronously delegated to different clusters of cores, where clusters are allocated dynamically. Computer architects are always interested in analyzing the complex interactions among dynamically allocated resources. Generally, a detailed, cycle-accurate simulation of the execution time is used. However, the cycle-accurate simulator for the microthreaded architecture executes at a rate of 100,000 instructions per second, divided over the number of simulated cores, which means that evaluating a complex application executing on a contemporary multi-core machine can be very slow. To perform efficient design space exploration, we present a co-simulation environment in which the detailed execution of instructions in the pipelines of microthreaded cores and the interactions among hardware components are abstracted. We evaluate the high-level simulation framework against the cycle-accurate simulation framework. The results show that the high-level simulator is faster and less complicated than the cycle-accurate simulator, at the cost of some accuracy.
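To illustrate the trade-off (not the paper's actual frameworks), the sketch below contrasts a per-instruction loop standing in for cycle-accurate simulation with a closed-form high-level estimate of the same workload. All names and constants are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Workload:
    instructions: int   # total instructions in the delegated task
    memory_ops: int     # instructions that stall on memory

def cycle_accurate_time(w, cores, cpi=1.0, mem_penalty=100):
    """Stand-in for a detailed simulator: conceptually iterates over every
    instruction, which is why the real tool runs at ~100k instructions/s."""
    per_core = w.instructions // cores
    mem_every = max(1, w.instructions // max(1, w.memory_ops))
    cycles = 0
    for i in range(per_core):               # the expensive per-instruction loop
        cycles += cpi + (mem_penalty if i % mem_every == 0 else 0)
    return cycles

def high_level_time(w, cores, cpi=1.0, mem_penalty=100):
    """Abstracted model: the same estimate in O(1), trading accuracy
    (no pipeline interactions) for simulation speed."""
    per_core = w.instructions / cores
    mem_per_core = w.memory_ops / cores
    return per_core * cpi + mem_per_core * mem_penalty

w = Workload(instructions=1_000_000, memory_ops=50_000)
print(cycle_accurate_time(w, cores=64), high_level_time(w, cores=64))
```

On this toy workload the two estimates nearly coincide, but only the detailed loop could capture effects such as pipeline interactions between concurrent threads, which is exactly the accuracy the abstraction gives up.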

